Sample records for process simulation possibilities

  1. Possibilities of Particle Finite Element Methods in Industrial Forming Processes

    NASA Astrophysics Data System (ADS)

    Oliver, J.; Cante, J. C.; Weyler, R.; Hernandez, J.

    2007-04-01

    The work investigates the possibilities offered by the particle finite element method (PFEM) in the simulation of forming problems involving large deformations, multiple contacts, and new boundaries generation. The description of the most distinguishing aspects of the PFEM, and its application to simulation of representative forming processes, illustrate the proposed methodology.

  2. Expanded Processing Techniques for EMI Systems

    DTIC Science & Technology

    2012-07-01

    possible to perform better target detection using physics-based algorithms and the entire data set, rather than simulating a simpler data set and mapping... [Figure 4.25 of the report: plots of simulated MetalMapper data for two oblate spheroidal targets.]

  3. Practical Unitary Simulator for Non-Markovian Complex Processes

    NASA Astrophysics Data System (ADS)

    Binder, Felix C.; Thompson, Jayne; Gu, Mile

    2018-06-01

    Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction for simulators for a large class of stochastic processes hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.

  4. A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.

    PubMed

    Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne

    2011-05-01

    To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.
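    The cascade logic described above lends itself to a compact Monte Carlo illustration. The sketch below is not the authors' calibrated model: the step count, violation probabilities, and cascade increment are invented placeholders. It only shows how repeated simulation of a violation chain yields an end-of-chain risk estimate and a count of distinct trajectories.

    ```python
    import random

    # Hypothetical violation probabilities for successive steps of a transfer
    # process (placeholders, not the calibrated values from the study).
    STEP_VIOLATION_P = [0.10, 0.15, 0.08, 0.20, 0.12]

    def simulate_transfer(rng):
        """One simulated transfer: a violation at any step raises the chance
        that downstream steps are also violated (a simple cascade)."""
        violations, carry = [], 0.0
        for p in STEP_VIOLATION_P:
            violated = rng.random() < min(1.0, p + carry)
            violations.append(violated)
            if violated:
                carry += 0.05          # assumed cascade increment
        return tuple(violations), violations[-1]   # trajectory, end-of-chain event

    rng = random.Random(42)
    runs = [simulate_transfer(rng) for _ in range(10_000)]
    risk = sum(1 for _, event in runs if event) / len(runs)
    unique = len({traj for traj, _ in runs})
    print(f"end-of-chain risk ~ {risk:.1%}; {unique} distinct trajectories")
    ```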

  5. Knowledge-based simulation for aerospace systems

    NASA Technical Reports Server (NTRS)

    Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.

    1988-01-01

    Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic-processing/numeric-processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and to be capable of ensuring that all possible behaviors are considered.

  6. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost of a large-scale simulation. To improve the computational efficiency of large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal code subregion in turn. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized to perform a country-wide simulation. Thus, this parallel algorithm enables the possibility of using ABM for large-scale simulation with limited computational resources.
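    As a rough illustration of splitting an agent population into subregions that can be processed in parallel, here is a minimal Python sketch. It is not the published SRA: the real algorithm slides across subregions so that boundary interactions are handled, whereas this toy treats subregions as fully independent, and the epidemic update rule is a placeholder.

    ```python
    import random
    from multiprocessing import Pool

    def simulate_subregion(args):
        """Toy within-region epidemic update, run independently per region."""
        region_id, n_agents, n_infected, years = args
        for _ in range(years):
            # hypothetical net growth of prevalent infections in the region
            n_infected = min(n_agents, int(n_infected * 1.03) + 1)
        return region_id, n_infected

    def run_parallel(regions, years=20, workers=4):
        """Map subregions onto worker processes, SRA-style minus the sliding window."""
        tasks = [(rid, n, i, years) for rid, (n, i) in regions.items()]
        with Pool(workers) as pool:
            return dict(pool.map(simulate_subregion, tasks))

    if __name__ == "__main__":   # guard required by multiprocessing
        random.seed(0)
        regions = {f"region-{i}": (random.randint(500, 5000), random.randint(1, 40))
                   for i in range(8)}   # postal-code-like subregions (made up)
        print(run_parallel(regions))
    ```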

  7. Evaluation of tocopherol recovery through simulation of molecular distillation process.

    PubMed

    Moraes, E B; Batistella, C B; Alvarez, M E Torres; Filho, Rubens Maciel; Maciel, M R Wolf

    2004-01-01

    The DISMOL simulator was used to determine the best possible operating conditions to guide future experimental work. This simulator needs several physical-chemical properties, which are often very difficult to determine because of the complexity of the components involved. They must be obtained through correlations and/or predictions in order to characterize the system and perform the calculations. The first step is to obtain simulation results for a system that can later be validated with experimental data. Implementing the necessary parameters of complex systems in the simulator is a difficult task. In this work, we aimed to determine these properties in order to evaluate tocopherol (vitamin E) recovery using the DISMOL simulator. The raw material used was the crude deodorizer distillate of soya oil. With this procedure, it is possible to determine the best operating conditions for experimental work and to evaluate the process in the separation of new systems, analyzing the profiles obtained from these simulations for the falling-film molecular distillator.

  8. Using Simulation for Launch Team Training and Evaluation

    NASA Technical Reports Server (NTRS)

    Peaden, Cary J.

    2005-01-01

    This document describes some of the history and uses of simulation systems and processes for the training and evaluation of Launch Processing, Mission Control, and Mission Management teams. It documents some of the types of simulations that are used at Kennedy Space Center (KSC) today and that could be utilized (and possibly enhanced) for future launch vehicles. This article is intended to provide an initial baseline for further research into simulation for launch team training in the near future.

  9. Simulating an Enactment Effect: Pronouns Guide Action Simulation during Narrative Comprehension

    ERIC Educational Resources Information Center

    Ditman, Tali; Brunye, Tad T.; Mahoney, Caroline R.; Taylor, Holly A.

    2010-01-01

    Recent research has suggested that reading involves the mental simulation of events and actions described in a text. It is possible however that previous findings did not tap into processes engaged during natural reading but rather those triggered by task demands. The present study examined whether readers spontaneously mentally simulate the…

  10. A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor

    NASA Technical Reports Server (NTRS)

    Rao, Hariprasad Nannapaneni

    1989-01-01

    The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.
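    The Virtual Time and rollback mechanism mentioned above can be sketched in a few lines. The toy logical process below is a bare simplification (anti-messages, GVT computation, and the RSIM transistor model are all omitted): it executes events optimistically, checkpoints its state, and rolls back when a straggler event arrives in its past.

    ```python
    import heapq

    class LogicalProcess:
        """Toy Time Warp-style logical process with state saving and rollback."""

        def __init__(self):
            self.lvt = 0                  # local virtual time
            self.state = 0
            self.pending = []             # min-heap of (timestamp, delta) events
            self.processed = []           # events already executed
            self.snapshots = [(0, 0)]     # (lvt, state) checkpoints

        def schedule(self, ts, delta):
            if ts < self.lvt:             # straggler arrived in our past
                self._rollback(ts)
            heapq.heappush(self.pending, (ts, delta))

        def _rollback(self, ts):
            while len(self.snapshots) > 1 and self.snapshots[-1][0] >= ts:
                self.snapshots.pop()      # discard undone checkpoints
            self.lvt, self.state = self.snapshots[-1]
            undone = [e for e in self.processed if e[0] >= ts]
            self.processed = [e for e in self.processed if e[0] < ts]
            for e in undone:              # re-enqueue events that must re-run
                heapq.heappush(self.pending, e)

        def run_available(self):
            while self.pending:
                ts, delta = heapq.heappop(self.pending)
                self.lvt, self.state = ts, self.state + delta
                self.processed.append((ts, delta))
                self.snapshots.append((self.lvt, self.state))

    lp = LogicalProcess()
    lp.schedule(10, +1); lp.schedule(20, +1)
    lp.run_available()
    lp.schedule(15, +5)                   # straggler: triggers rollback
    lp.run_available()
    print(lp.lvt, lp.state)               # -> 20 7
    ```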

  11. Simulation of the Processes of Formation of a Dust Cloud in a Vacuum and in the Absence of Gravitation

    NASA Astrophysics Data System (ADS)

    Avdeev, A. V.; Boreisho, A. S.; Ivakin, S. V.; Moiseev, A. A.; Savin, A. V.; Sokolov, E. I.; Smirnov, P. G.

    2018-01-01

    This article is devoted to the simulation of the processes of formation of dust clouds in the absence of gravitation, which is necessary for understanding the processes proceeding in dust clusters in outer space, in the upper planetary atmosphere, and on the surfaces of space objects, as well as for evaluating the possibilities of creating disperse structures with given properties. The chief aim of the simulation is to determine the general laws of the dynamics of the dust cloud at the initial stage of its formation. Using an original approach based on the particle-in-cell method, which permits investigating the mechanics of large ensembles of particles on contemporary computational platforms, we consider the mechanics of a dusty medium during its excitation in a closed container due to the vibration of the walls, and then during particle scattering when the container opens into outer space. The main mechanisms of dust cloud formation have been elucidated, and the possibilities of mathematical simulation for predicting the spatial and temporal characteristics of disperse structures have been shown.

  12. Representing the work of medical protocols for organizational simulation.

    PubMed Central

    Fridsma, D. B.

    1998-01-01

    Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to provide managers with prospective insight into problems of work process and organization design mismatch. Many of these simulation tools are designed for well-understood routine work processes in which there are few contingent tasks. In this paper, we describe theoretical extensions that make it possible to simulate medical protocols using an information-processing theory framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231

  13. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    NASA Astrophysics Data System (ADS)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability possibilities through a process involving mixing and splitting of material.

  14. Theoretical and computational foundations of management class simulation

    Treesearch

    Denie Gerold

    1978-01-01

    Investigations on complicated, complex, and not well-ordered systems are possible only with the aid of mathematical methods and electronic data processing. Simulation as a method of operations research is particularly suitable for this purpose. Theoretical and computational foundations of management class simulation must be integrated into the planning systems of...

  15. New method of processing heat treatment experiments with numerical simulation support

    NASA Astrophysics Data System (ADS)

    Kik, T.; Moravec, J.; Novakova, I.

    2017-08-01

    In this work, the benefits of combining modern software for numerical simulation of welding processes with laboratory research are described. A new method of processing heat-treatment experiments is proposed that yields relevant input data for numerical simulations of the heat treatment of large parts. By using experiments on small test samples, it is now possible to simulate cooling conditions comparable with the cooling of bigger parts. Results from this method of testing make the boundary conditions of the real cooling process more accurate, and can also be used to improve software databases and optimize computational models. The point is to make the computation of temperature fields for large hardening parts more precise, based on a new method for determining the temperature dependence of the heat transfer coefficient into the hardening medium for a particular material, a defined maximal thickness of the processed part, and given cooling conditions. The paper also presents a comparison of standard and modified (according to the newly suggested methodology) heat transfer coefficient data and their influence on the simulation results. It shows how even small changes mainly influence the distributions of temperature, metallurgical phases, hardness, and stresses. The experiment also provides not only input data and data enabling optimization of the computational model, but verification data at the same time. The greatest advantage of the described method is its independence of the type of cooling medium used.
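    As a much-simplified illustration of extracting a heat transfer coefficient from a small-sample cooling experiment, the sketch below fits a constant h to a lumped-capacitance cooling curve with SciPy. The paper's method goes further by resolving the temperature dependence of h; all material and geometry values here are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    RHO, C = 7850.0, 460.0   # steel density [kg/m^3], specific heat [J/(kg K)] (assumed)
    A, V = 1.5e-3, 2.5e-5    # sample surface area [m^2] and volume [m^3] (assumed)
    T0, TQ = 860.0, 40.0     # initial and quenchant temperatures [deg C] (assumed)

    def lumped_T(t, h):
        """Lumped-capacitance cooling: T(t) = TQ + (T0 - TQ) exp(-h A t / (rho V c))."""
        return TQ + (T0 - TQ) * np.exp(-h * A * t / (RHO * V * C))

    # Stand-in for a measured cooling curve (in practice: thermocouple data).
    t_meas = np.linspace(0.0, 120.0, 25)
    T_meas = lumped_T(t_meas, 650.0) + np.random.default_rng(1).normal(0.0, 3.0, t_meas.size)

    (h_fit,), _ = curve_fit(lumped_T, t_meas, T_meas, p0=[100.0])
    print(f"fitted heat transfer coefficient: {h_fit:.0f} W/(m^2 K)")
    ```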

  16. Experiences in using DISCUS for visualizing human communication

    NASA Astrophysics Data System (ADS)

    Groehn, Matti; Nieminen, Marko; Haho, Paeivi; Smeds, Riitta

    2000-02-01

    In this paper, we present further improvements to the DISCUS software, which can be used to record and analyze the flow and contents of discussion in business process simulation sessions. The tool was initially introduced at the 'Visual Data Exploration and Analysis IV' conference. The initial features of the tool enabled the visualization of discussion flow in business process simulation sessions and the creation of SOM analyses. The improvements to the tool consist of additional visualization possibilities that enable quick on-line analyses and improved graphical statistics. We have also created the very first interface to audio data and implemented two ways to visualize it. We also outline additional possibilities for using the tool in other application areas: these include usability testing and the possibility of using the tool for capturing design rationale in a product development process. The data gathered with DISCUS may be used in other applications, and further work may be done with data mining techniques.

  17. SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method.

    PubMed

    Bernal, Javier; Torres-Jimenez, Jose

    2015-01-01

    SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller's scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method, better suited for the nonquadratic nature of neural networks. Different aspects of the implementation of the training process in SAGRAD are discussed, such as the efficient computation of gradients and multiplication of vectors by Hessian matrices that are required by Møller's algorithm; the (re)initialization of weights with simulated annealing required to (re)start Møller's algorithm the first time and each time thereafter that it shows insufficient progress in reaching a possibly local minimum; and the use of simulated annealing when Møller's algorithm, after possibly making considerable progress, becomes stuck at a local minimum or flat area of weight space. Outlines of the scaled conjugate gradient algorithm, the simulated annealing procedure and the training process used in SAGRAD are presented together with results from running SAGRAD on two examples of training data.
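    A minimal sketch of this hybrid scheme, under stated assumptions: simulated annealing proposes starting weights, and SciPy's generic conjugate gradient routine (standing in for Møller's scaled variant) refines the best candidate on a toy 2-4-1 network. The task, architecture, and cooling schedule are illustrative, not SAGRAD's.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, (200, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float)   # XOR-like classification task

    def loss(w):
        """Mean squared error of a tiny 2-4-1 network with weights packed in w."""
        W1, b1, W2, b2 = w[:8].reshape(2, 4), w[8:12], w[12:16], w[16]
        h = np.tanh(X @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
        return float(np.mean((p - y) ** 2))

    def anneal(w, temp=1.0, steps=300):
        """Plain simulated annealing over the weight vector (geometric cooling)."""
        best, best_f = w.copy(), loss(w)
        cur, cur_f = w.copy(), best_f
        for _ in range(steps):
            cand = cur + rng.normal(0.0, 0.3, cur.size)
            f = loss(cand)
            if f < cur_f or rng.random() < np.exp(-(f - cur_f) / temp):
                cur, cur_f = cand, f
                if f < best_f:
                    best, best_f = cand.copy(), f
            temp *= 0.99
        return best

    w0 = anneal(rng.normal(0.0, 0.5, 17))       # SA picks a promising start
    res = minimize(loss, w0, method="CG")       # conjugate gradient refines it
    print(f"loss after SA: {loss(w0):.4f}, after CG: {res.fun:.4f}")
    ```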

  18. Signal Processing Studies of a Simulated Laser Doppler Velocimetry-Based Acoustic Sensor

    DTIC Science & Technology

    1990-10-17

    investigated using spectral correlation methods. Results indicate that it may be possible to extend demonstrated LDV-based acoustic sensor sensitivities using higher order processing techniques. (Author)

  19. Simulation of the effect of incline incident angle in DMD Maskless Lithography

    NASA Astrophysics Data System (ADS)

    Liang, L. W.; Zhou, J. Y.; Xiang, L. L.; Wang, B.; Wen, K. H.; Lei, L.

    2017-06-01

    The aim of this study is to provide a simulation method for investigating the intensity fluctuation caused by the inclined incident angle in DMD (digital micromirror device) maskless lithography. The simulation consists of eight main processes, involving the simplification of the DMD aperture function and light propagation using the non-parallel angular spectrum method. These processes make co-simulation in the spatial frequency domain possible, combining the microlens array and the DMD in the maskless lithography system. The simulation yields the spot shape and the illumination distribution, two parameters that are crucial in determining the exposure dose in the existing maskless lithography system.
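    For readers unfamiliar with the angular spectrum method named above, a minimal parallel-incidence version is sketched below using NumPy FFTs; the paper's non-parallel variant, which accounts for the inclined incident angle, adds further terms to the transfer function. The aperture standing in for an "on" block of DMD mirrors, the grid, and the wavelength are arbitrary choices.

    ```python
    import numpy as np

    def angular_spectrum_propagate(field, wavelength, dx, z):
        """Propagate a complex field a distance z (parallel angular spectrum)."""
        n = field.shape[0]
        fx = np.fft.fftfreq(n, d=dx)
        FX, FY = np.meshgrid(fx, fx)
        arg = 1.0 / wavelength**2 - FX**2 - FY**2     # squared z-spatial frequency
        kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
        H = np.exp(1j * kz * z) * (arg > 0)           # evanescent waves dropped
        return np.fft.ifft2(np.fft.fft2(field) * H)

    n, dx, lam = 512, 0.5e-6, 405e-9                  # grid, pitch [m], wavelength [m]
    field = np.zeros((n, n), dtype=complex)
    field[240:272, 240:272] = 1.0                     # square aperture (toy DMD block)
    intensity = np.abs(angular_spectrum_propagate(field, lam, dx, 50e-6)) ** 2
    print(f"peak relative intensity at z = 50 um: {intensity.max():.2f}")
    ```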

  20. The simulation of the geosynchronous Earth orbit plasma environment in Chamber A: An assessment of possible experimental investigations

    NASA Technical Reports Server (NTRS)

    Bernstein, W.

    1981-01-01

    The possible use of Chamber A for the replication or simulation of space plasma physics processes which occur in the geosynchronous Earth orbit (GEO) environment is considered. It is shown that replication is not possible and that scaling of the environmental conditions is required for study of the important instability processes. Rules for such experimental scaling are given. At the present time, it does not appear technologically feasible to satisfy these requirements in Chamber A. It is, however, possible to study and qualitatively evaluate the problem of vehicle charging at GEO. In particular, Chamber A is sufficiently large that a complete operational spacecraft could be irradiated by beams and charged to high potentials. Such testing would contribute to the assessment of the operational malfunctions expected at GEO and their possible correction. However, because of the many tabulated limitations of such a testing program, its direct relevance to conditions expected in the GEO environment remains questionable.

  1. A large-scale forest landscape model incorporating multi-scale processes and utilizing forest inventory data

    Treesearch

    Wen J. Wang; Hong S. He; Martin A. Spetich; Stephen R. Shifley; Frank R. Thompson III; David R. Larsen; Jacob S. Fraser; Jian Yang

    2013-01-01

    Two challenges confronting forest landscape models (FLMs) are how to simulate fine, stand-scale processes while making large-scale (i.e., >10^7 ha) simulation possible, and how to take advantage of extensive forest inventory data such as U.S. Forest Inventory and Analysis (FIA) data to initialize and constrain model parameters. We present the LANDIS PRO model that...

  2. Performance issues for domain-oriented time-driven distributed simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1987-01-01

    It has long been recognized that simulations form an interesting and important class of computations that may benefit from distributed or parallel processing. Since the point of parallel processing is improved performance, the recent proliferation of multiprocessors requires that we consider the performance issues that naturally arise when attempting to implement a distributed simulation. Three such issues are: (1) the problem of mapping the simulation onto the architecture, (2) the possibilities for performing redundant computation in order to reduce communication, and (3) the avoidance of deadlock due to distributed contention for message-buffer space. These issues are discussed in the context of a battlefield simulation implemented on a medium-scale multiprocessor message-passing architecture.

  3. Covert rapid action-memory simulation (CRAMS): a hypothesis of hippocampal-prefrontal interactions for adaptive behavior.

    PubMed

    Wang, Jane X; Cohen, Neal J; Voss, Joel L

    2015-01-01

    Effective choices generally require memory, yet little is known regarding the cognitive or neural mechanisms that allow memory to influence choices. We outline a new framework proposing that covert memory processing of hippocampus interacts with action-generation processing of prefrontal cortex in order to arrive at optimal, memory-guided choices. Covert, rapid action-memory simulation (CRAMS) is proposed here as a framework for understanding cognitive and/or behavioral choices, whereby prefrontal-hippocampal interactions quickly provide multiple simulations of potential outcomes used to evaluate the set of possible choices. We hypothesize that this CRAMS process is automatic, obligatory, and covert, meaning that many cycles of action-memory simulation occur in response to choice conflict without an individual's necessary intention and generally without awareness of the simulations, leading to adaptive behavior with little perceived effort. CRAMS is thus distinct from influential proposals that adaptive memory-based behavior in humans requires consciously experienced memory-based construction of possible future scenarios and deliberate decisions among possible future constructions. CRAMS provides an account of why hippocampus has been shown to make critical contributions to the short-term control of behavior, and it motivates several new experimental approaches and hypotheses that could be used to better understand the ubiquitous role of prefrontal-hippocampal interactions in situations that require adaptively using memory to guide choices. Importantly, this framework provides a perspective that allows for testing decision-making mechanisms in a manner that translates well across human and nonhuman animal model systems. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
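    A toy discrete-event model in the same spirit can be written with the open-source SimPy library (standing in here for the commercial Arena software used in the study). The two stages, staffing levels, and exponential times below are illustrative assumptions, not the calibrated inputs from the British Columbia Cancer Agency.

    ```python
    import random
    import simpy

    random.seed(1)
    env = simpy.Environment()
    oncologist = simpy.Resource(env, capacity=1)   # contouring stage
    planners = simpy.Resource(env, capacity=2)     # dose-planning stage
    planning_times = []

    def plan(env):
        start = env.now
        with oncologist.request() as req:
            yield req
            yield env.timeout(random.expovariate(1 / 4.0))    # contouring, mean 4 h
        with planners.request() as req:
            yield req
            yield env.timeout(random.expovariate(1 / 10.0))   # planning, mean 10 h
        planning_times.append(env.now - start)

    def arrivals(env):
        while True:
            yield env.timeout(random.expovariate(1 / 6.0))    # new case every ~6 h
            env.process(plan(env))

    env.process(arrivals(env))
    env.run(until=2000)
    print(f"mean planning time: {sum(planning_times) / len(planning_times):.1f} h")
    ```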

  5. Determination of the Order of Passes of an Austenitic Weld by Optimization of an Inversion Process of Ultrasound Data

    NASA Astrophysics Data System (ADS)

    Gueudré, C.; Marrec, L. Le; Chekroun, M.; Moysan, J.; Chassignole, B.; Corneloup, G.

    2011-06-01

    Multipass welds made of austenitic stainless steel, in the primary circuit of nuclear power plants with pressurized water reactors, are characterized by an anisotropic and heterogeneous structure that disturbs ultrasonic propagation and challenges ultrasonic non-destructive testing. Simulation in this type of structure is now possible thanks to the MINA code, which models the grain orientation taking the welding process into account, and the ATHENA code, which simulates the ultrasonic propagation exactly. We study the case where the order of the passes is unknown, to estimate the possibility of reconstructing this important parameter from ultrasound measurements. The first results are presented.

  6. THE SECOND GENERATION OF THE WASTE REDUCTION (WAR) ALGORITHM: A DECISION SUPPORT SYSTEM FOR GREENER CHEMICAL PROCESSES

    EPA Science Inventory

    chemical process designers using simulation software generate alternative designs for one process. One criterion for evaluating these designs is their potential for adverse environmental impacts due to waste generated, energy consumed, and possibilities for fugitive emissions. Co...

  7. Application of an interactive water simulation model in urban water management: a case study in Amsterdam.

    PubMed

    Leskens, J G; Brugnach, M; Hoekstra, A Y

    2014-01-01

    Water simulation models are available to support decision-makers in urban water management. To use current water simulation models, special expertise is required. Therefore, model information is prepared prior to work sessions, in which decision-makers weigh different solutions. However, this model information quickly becomes outdated when new suggestions for solutions arise and are therefore limited in use. We suggest that new model techniques, i.e. fast and flexible computation algorithms and realistic visualizations, allow this problem to be solved by using simulation models during work sessions. A new Interactive Water Simulation Model was applied for two case study areas in Amsterdam and was used in two workshops. In these workshops, the Interactive Water Simulation Model was positively received. It included non-specialist participants in the process of suggesting and selecting possible solutions and made them part of the accompanying discussions and negotiations. It also provided the opportunity to evaluate and enhance possible solutions more often within the time horizon of a decision-making process. Several preconditions proved to be important for successfully applying the Interactive Water Simulation Model, such as the willingness of the stakeholders to participate and the preparation of different general main solutions that can be used for further iterations during a work session.

  8. Simulation training tools for nonlethal weapons using gaming environments

    NASA Astrophysics Data System (ADS)

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role for evaluating new technologies and for developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstration of concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real-time. Computer games now allow for multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherent intensive calculations required for complex simulation scenarios. The main components of the leading game engines have been released for user modifications, enabling game enthusiasts and amateur programmers to advance the state-of-the-art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users: learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game engine, for the purpose of evaluating the latest non-lethal weapon devices, and for developing training tools for such devices.

  9. Numerical simulation of plasma processes driven by transverse ion heating

    NASA Technical Reports Server (NTRS)

    Singh, Nagendra; Chan, C. B.

    1993-01-01

    The plasma processes driven by transverse ion heating in a diverging flux tube are investigated with numerical simulation. The heating is found to drive a host of plasma processes, in addition to the well-known phenomenon of ion conics. The downward electric field near the reverse shock generates a double-streaming situation consisting of two upflowing ion populations with different average flow velocities. The electric field in the reverse shock region is modulated by the ion-ion instability driven by the multistreaming ions. The oscillating fields in this region have the possibility of heating electrons. These results from the simulations are compared with results from a previous study based on a hydrodynamical model. Effects of the spatial resolution provided by the simulations on the evolution of the plasma are discussed.

  10. A high-order language for a system of closely coupled processing elements

    NASA Technical Reports Server (NTRS)

    Feyock, S.; Collins, W. R.

    1986-01-01

    The research reported in this paper was occasioned by the requirements of the Real-Time Digital Simulator (RTDS) project under way at NASA Lewis Research Center. The RTDS simulation scheme employs a network of CPUs running lock-step cycles in the parallel computation of jet airplane simulations. The project's need for a high-order language (HOL) that would allow non-experts to write simulation applications, and that could be implemented on a possibly varying network, can best be fulfilled by using the programming language Ada. We describe how the simulation problems can be modeled in Ada, how to map a single multi-processing Ada program into code for individual processors regardless of network reconfiguration, and why some Ada language features are particularly well-suited to network simulations.

  11. Application of technology developed for flight simulation at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1991-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations including mathematical model computation and data input/output to the simulators must be deterministic and be completed in as short a time as possible. Personnel at NASA's Langley Research Center are currently developing the use of supercomputers for simulation mathematical model computation for real-time simulation. This, coupled with the use of an open systems software architecture, will advance the state-of-the-art in real-time flight simulation.

  12. Event-driven simulation of the state institution activity for the service provision based on business processes

    NASA Astrophysics Data System (ADS)

    Kataev, M. Yu.; Loseva, N. V.; Mitsel, A. A.; Bulysheva, L. A.; Kozlov, S. V.

    2017-01-01

    The paper presents an approach, based on business processes, to assessing and controlling the state of a state institution, the Social Insurance Fund. It describes the treatment of business processes as items with clear, measurable parameters that need to be determined, controlled, and changed for management purposes. An example of one of the business processes of the state institution is given, showing the ability to solve management tasks. The authors demonstrate the possibility of applying the mathematical apparatus of simulation modeling to such management tasks.

  13. Simulation of Simple Controlled Processes with Dead-Time.

    ERIC Educational Resources Information Center

    Watson, Keith R.; And Others

    1985-01-01

    The determination of closed-loop response of processes containing dead-time is typically not covered in undergraduate process control, possibly because the solution by Laplace transforms requires the use of Pade approximation for dead-time, which makes the procedure lengthy and tedious. A computer-aided method is described which simplifies the…
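    The computer-aided alternative is straightforward to demonstrate: simulate the closed loop in discrete time and represent the dead time exactly as a delay buffer, so no Pade approximation is needed. The first-order process parameters and PI gains below are illustrative.

    ```python
    import numpy as np

    K, TAU, THETA = 1.0, 5.0, 2.0    # process gain, time constant, dead time (assumed)
    KC, TI = 1.5, 5.0                # PI controller settings (assumed)
    DT, T_END = 0.01, 60.0

    n = int(T_END / DT)
    u_buffer = np.zeros(int(THETA / DT))   # inputs still "in transit" through the dead time
    y, integral, sp = 0.0, 0.0, 1.0        # output, integral of error, setpoint
    ys = np.zeros(n)

    for k in range(n):
        e = sp - y
        integral += e * DT
        u = KC * (e + integral / TI)               # PI control law
        u_delayed = u_buffer[0]                    # input applied THETA time units ago
        u_buffer = np.roll(u_buffer, -1)
        u_buffer[-1] = u
        y += DT * (-y + K * u_delayed) / TAU       # first-order process ODE
        ys[k] = y

    print(f"output after {T_END:.0f} time units: {ys[-1]:.3f} (setpoint {sp})")
    ```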

  14. A systematic petri net approach for multiple-scale modeling and simulation of biochemical processes.

    PubMed

    Chen, Ming; Hu, Minjie; Hofestädt, Ralf

    2011-06-01

    A method to exploit hybrid Petri nets for modeling and simulating biochemical processes in a systematic way is introduced. Both molecular biology and biochemical engineering aspects are handled. With discrete and continuous elements, hybrid Petri nets can easily handle biochemical factors such as metabolite concentrations and kinetic behaviors. It is possible to translate both molecular biological behavior and biochemical process workflows into hybrid Petri nets in a natural manner. As an example, a penicillin production bioprocess is modeled to illustrate the concepts of the methodology. The dynamics of the production parameters in the bioprocess were simulated and observed diagrammatically. Current problems and post-genomic perspectives are also discussed.
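    A bare-bones flavor of the hybrid discrete/continuous idea is sketched below: continuous "places" hold concentrations that evolve by rate equations, while a discrete transition consumes tokens to fire substrate-feed events. The Monod-style kinetics and all rate constants are invented placeholders, not the penicillin model from the paper.

    ```python
    DT = 0.1   # integration step for the continuous part [h] (assumed)

    # Continuous places (real-valued markings) and one discrete place (tokens).
    places = {"substrate": 10.0, "biomass": 0.5, "penicillin": 0.0, "feeds": 3}

    def continuous_step(p):
        """Rate transitions: Monod-type growth, substrate uptake, production."""
        mu = 0.3 * p["substrate"] / (1.0 + p["substrate"])
        growth = mu * p["biomass"]
        p["biomass"] += DT * growth
        p["substrate"] = max(0.0, p["substrate"] - DT * 2.0 * growth)
        p["penicillin"] += DT * 0.1 * p["biomass"]

    def discrete_feed(p):
        """Discrete transition: fires when substrate is low and tokens remain."""
        if p["substrate"] < 1.0 and p["feeds"] > 0:
            p["feeds"] -= 1            # consume one feed token
            p["substrate"] += 8.0      # substrate pulse added to the reactor

    t = 0.0
    while t < 60.0:
        continuous_step(places)
        discrete_feed(places)
        t += DT

    print({k: round(v, 2) for k, v in places.items()})
    ```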

  15. Observations and simulations of specularly reflected He++ at Earth's quasiperpendicular bow shock

    NASA Astrophysics Data System (ADS)

    Broll, J. M.; Fuselier, S. A.; Trattner, K. J.; Anderson, B. J.; Burch, J. L.; Giles, B. L.

    2016-12-01

    Specular reflection of protons at Earth's quasiperpendicular bow shock is an important process for supercritical shock dissipation. Previous studies have found evidence of He++ specular reflection from reduced particle distributions downstream from the shock, but confirmation of the process for heavier ions in the shock foot was not possible due to time resolution constraints. We present He++ distributions, observed by MMS in a quasiperpendicular bow shock crossing, that are consistent with specularly reflected He++. We also investigate the He++ dynamics with test-particle simulations in a simulated shock based on this crossing and we conduct wave analysis to determine what processes lead to separate gyrotropization timescales for the transmitted and reflected populations.

  16. Characterization and Evaluation of Lunar Regolith and Simulants

    NASA Technical Reports Server (NTRS)

    Cross, William M.; Murphy, Gloria A.

    2010-01-01

    A NASA-ESMD (National Aeronautics and Space Administration-Exploration Systems Mission Directorate) funded senior design project, "Mineral Separation Technology for Lunar Regolith Simulant Production", is directed toward designing processes to produce simulant materials as close to lunar regolith as possible. The eight undergraduate (junior and senior) students involved are taking a systems engineering design approach: identifying the most pressing concerns in simulant needs, then designing subsystems and processing strategies to meet these needs using terrestrial materials. This allows the students not only to learn the systems engineering design process, but also to make a significant contribution to an important NASA ESMD project. This paper is primarily focused on the implementation aspect, particularly the systems engineering process, of this NASA ESMD senior design project. In addition, a comparison of the NASA ESMD group's experience with the implementation of systems engineering practices in a group of existing design projects is given.

  17. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.

    PubMed

    Zuckerman, Daniel M; Chong, Lillian T

    2017-05-22

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling: the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes: protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.
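    To make the WE strategy concrete, here is a deliberately small sketch for a 1D drift-diffusion walker that must reach a rare target state. It uses a per-bin resampling variant of WE's split/merge step (total weight within each bin is conserved); the bin edges, walker counts, and dynamics are arbitrary choices, not anything from the WE software the review describes.

    ```python
    import random

    random.seed(0)
    PER_BIN = 4                       # target number of walkers per occupied bin

    def step(x):
        """Drift toward 0 plus diffusion; reaching x >= 1.0 is the rare event."""
        return max(0.0, x - 0.01 + random.gauss(0.0, 0.05))

    def bin_of(x):
        return min(int(x * 10), 9)    # 10 equal bins on [0, 1)

    walkers = [(0.0, 1.0 / 20) for _ in range(20)]   # (position, weight) pairs
    flux = 0.0
    for _ in range(500):
        moved = []
        for x, w in walkers:
            x = step(x)
            if x >= 1.0:              # arrival: record weight, recycle walker
                flux += w
                x = 0.0
            moved.append((x, w))
        walkers = []                  # resample within each occupied bin
        for b in range(10):
            group = [(x, w) for x, w in moved if bin_of(x) == b]
            if not group:
                continue
            total = sum(w for _, w in group)
            xs = [x for x, _ in group]
            ws = [w / total for _, w in group]
            picks = random.choices(xs, weights=ws, k=PER_BIN)
            walkers += [(x, total / PER_BIN) for x in picks]

    print(f"estimated arrival flux per iteration: {flux / 500:.2e}")
    ```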

  18. Quantitative modeling of soil genesis processes

    NASA Technical Reports Server (NTRS)

    Levine, E. R.; Knox, R. G.; Kerber, A. G.

    1992-01-01

    For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.

  19. Use of high performance networks and supercomputers for real-time flight simulation

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1993-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.

  1. Mono and multi-objective optimization techniques applied to a large range of industrial test cases using Metamodel assisted Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Fourment, Lionel; Ducloux, Richard; Marie, Stéphane; Ejday, Mohsen; Monnereau, Dominique; Massé, Thomas; Montmitonnet, Pierre

    2010-06-01

    The use of numerical simulation of material processing allows a strategy of trial and error to improve virtual processes without incurring material costs or interrupting production, and therefore saves a lot of money, but it requires user time to analyze the results, adjust the operating conditions, and restart the simulation. Automatic optimization is the perfect complement to simulation. An Evolutionary Algorithm coupled with metamodelling makes it possible to obtain industrially relevant results over a very large range of applications within a few tens of simulations and without any specific knowledge of automatic optimization techniques. Ten industrial partners have been selected to cover the different areas of the mechanical forging industry and to provide different examples of forming simulation tools. The large computational time is handled by a metamodel approach, which interpolates the objective function over the entire parameter space from the exact function values at a reduced number of "master points". Two algorithms are used: an evolution strategy combined with a Kriging metamodel, and a genetic algorithm combined with a Meshless Finite Difference Method. The latter approach is extended to multi-objective optimization: the set of solutions corresponding to the best possible compromises between the different objectives is computed in the same way. The population-based approach exploits the parallel capabilities of the computer with high efficiency. An optimization module, fully embedded within the Forge2009 user interface, makes it possible to cover all the defined examples, and the use of new multi-core hardware to run several simulations at the same time reduces the needed time dramatically. The presented examples demonstrate the method's versatility. They include billet shape optimization of a common rail, the cogging of a bar, and a wire drawing problem.
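    The metamodel-assisted loop can be illustrated compactly: fit a Kriging (Gaussian process) surrogate to a handful of expensive "master point" evaluations, let a simple evolution strategy search the cheap surrogate, and spend one true evaluation per generation on its best candidate. The toy 2D objective below stands in for a forming simulation; this is a sketch of the general idea, not the Forge2009 implementation.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def expensive(x):
        """Stand-in for a costly forming simulation (made-up objective)."""
        return float((x[0] - 0.3) ** 2 + (x[1] + 0.5) ** 2 + 0.1 * np.sin(8 * x[0]))

    X = rng.uniform(-1.0, 1.0, (8, 2))              # initial master points
    y = np.array([expensive(x) for x in X])

    for _ in range(15):
        gp = GaussianProcessRegressor(kernel=RBF(0.5), normalize_y=True).fit(X, y)
        pop = rng.uniform(-1.0, 1.0, (200, 2))      # evolution strategy on the surrogate
        for _ in range(20):
            elite = pop[np.argsort(gp.predict(pop))[:20]]
            pop = elite[rng.integers(0, 20, 200)] + rng.normal(0.0, 0.1, (200, 2))
        best = pop[np.argmin(gp.predict(pop))]
        X = np.vstack([X, best])                    # one exact evaluation per generation
        y = np.append(y, expensive(best))

    print(f"best design found: {X[np.argmin(y)]}, objective {y.min():.4f}")
    ```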

  2. Simulation of Stochastic Processes by Coupled ODE-PDE

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2008-01-01

    A document discusses the emergence of randomness in solutions of coupled, fully deterministic ODE-PDE (ordinary differential equation-partial differential equation) systems due to failure of the Lipschitz condition, as a new phenomenon. It is possible to exploit the special properties of ordinary differential equations (represented by an arbitrarily chosen dynamical system) coupled with the corresponding Liouville equations (used to describe the evolution of initial uncertainties in terms of a joint probability distribution) in order to simulate stochastic processes with prescribed probability distributions. The important advantage of the proposed approach is that the simulation does not require a random-number generator.

  3. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  4. Processing for spaceborne synthetic aperture radar imagery

    NASA Technical Reports Server (NTRS)

    Lybanon, M.

    1973-01-01

    The data handling and processing involved in using synthetic aperture radar as a satellite-borne earth resources remote sensor are considered. The discussion covers the nature of the problem, the theory, both conventional and potential advanced processing techniques, and a complete computer simulation. It is shown that digital processing is a real possibility, and some future directions for research are suggested.

  5. Radiation damage of biomolecules (RADAM) database development: current status

    NASA Astrophysics Data System (ADS)

    Denifl, S.; Garcia, G.; Huber, B. A.; Marinković, B. P.; Mason, N.; Postler, J.; Rabus, H.; Rixon, G.; Solov'yov, A. V.; Suraud, E.; Yakubovich, A. V.

    2013-06-01

    Ion beam therapy offers the possibility of excellent dose localization for treatment of malignant tumours, minimizing radiation damage in normal tissue while maximizing cell killing within the tumour. However, as the underlying interdependent physical, chemical and biological processes are too complex to treat on a purely analytical level, most of our current and future understanding will rely on computer simulations, based on mathematical equations, algorithms and, last but not least, on the available atomic and molecular data. The viability of the simulated output and the success of any computer simulation will be determined by these data, which are treated as the input variables in each computer simulation performed. The radiation research community lacks a complete database for the cross sections of all the different processes involved in ion beam induced damage: ionization and excitation cross sections for ions with liquid water and biological molecules, all the possible electron-medium interactions, dielectric response data, electron attachment to biomolecules, etc. In this paper we discuss current progress in the creation of such a database, outline the roadmap of the project and review plans for the exploitation of such a database in future simulations.

  6. Automatic Screening for Perturbations in Boolean Networks.

    PubMed

    Schwab, Julian D; Kestler, Hans A

    2018-01-01

    A common approach to address biological questions in systems biology is to simulate regulatory mechanisms using dynamic models. Among others, Boolean networks can be used to model the dynamics of regulatory processes in biology. Boolean network models allow simulating the qualitative behavior of the modeled processes. A central objective in the simulation of Boolean networks is the computation of their long-term behavior, the so-called attractors. These attractors are of special interest as they can often be linked to biologically relevant behaviors. Changing internal and external conditions can influence the long-term behavior of the Boolean network model. Perturbation of a Boolean network by stripping a component of the system or simulating a surplus of another element can lead to different attractors. Apparently, the number of possible perturbations and combinations of perturbations increases exponentially with the size of the network. Manually screening a set of possible components for combinations that have a desired effect on the long-term behavior can be very time consuming, if not impossible. We developed a method to automatically screen for perturbations that lead to a user-specified change in the network's functioning. This method is implemented in the visual simulation framework ViSiBool, utilizing satisfiability (SAT) solvers for fast exhaustive attractor search.
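    A minimal version of attractor search plus exhaustive perturbation screening fits in a few lines for a tiny synchronous Boolean network. The three-gene rules below are made up for illustration; ViSiBool itself relies on SAT solvers to make such searches scale to larger networks.

    ```python
    from itertools import product

    GENES = ["a", "b", "c"]
    RULES = {                     # synchronous update rules (made-up network)
        "a": lambda s: s["b"] and not s["c"],
        "b": lambda s: s["a"],
        "c": lambda s: not s["a"],
    }

    def attractors(rules):
        """Follow every start state to its cycle; return the set of attractors."""
        found = set()
        for bits in product([False, True], repeat=len(GENES)):
            state = dict(zip(GENES, bits))
            seen = []
            key = bits
            while key not in seen:
                seen.append(key)
                state = {g: rules[g](state) for g in GENES}
                key = tuple(state[g] for g in GENES)
            found.add(frozenset(seen[seen.index(key):]))   # states on the cycle
        return found

    # Screen every single-gene knockout (gene clamped to False) for a changed
    # attractor landscape; over-expression would clamp to True instead.
    baseline = attractors(RULES)
    for g in GENES:
        perturbed = {**RULES, g: lambda s: False}
        if attractors(perturbed) != baseline:
            print(f"knockout of {g} changes the long-term behavior")
    ```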

  7. Simulation of Martian surface conditions and dust transport

    NASA Astrophysics Data System (ADS)

    Nørnberg, P.; Merrison, J. P.; Finster, K.; Folkmann, F.; Gunnlaugsson, H. P.; Hansen, A.; Jensen, J.; Kinch, K.; Lomstein, B. Aa.; Mugford, R.

    2002-11-01

    The suspended atmospheric dust, which is also found deposited over most of the Martian globe, plays an important (possibly vital) role in shaping the surface environment. It affects the weather (solar flux), water transport, and possibly also the electrical properties at the surface. The simulation facilities at Aarhus provide excellent tools for studying the properties of this Martian environment. Much can be learned from such simulations, supporting and often inspiring new investigations of the planet. Electrical charging of a Mars analogue dust is being studied in a wind tunnel simulation of an aerosol. Here electric fields are used to extract dust from suspension. Although preliminary, the results indicate that a large fraction of the dust is charged to a high degree, sufficient to dominate adhesion/cohesion processes. A Mars analogue dust layer has been shown to be an excellent trap for moisture, causing increased humidity in the soil below. This allows the possibility for liquid water to be stable close to the surface (less than 10 cm deep). This is being investigated in an environment simulator where heat and moisture transport can be studied through layers of Mars analogue dust.

  8. Implementation of virtual models from sheet metal forming simulation into physical 3D colour models using 3D printing

    NASA Astrophysics Data System (ADS)

    Junk, S.

    2016-08-01

    Today the methods of numerical simulation of sheet metal forming offer a great diversity of possibilities for optimization in product development and in process design. However, the results from simulation are only available as virtual models. Because no forming tools are available during the early stages of product development, physical models that could represent the virtual results are lacking. Physical 3D models can be created using 3D printing and serve as an illustration, providing a better understanding of the simulation results. In this way, the results from the simulation can be made more “comprehensible” within a development team. This paper presents the possibilities of 3D colour printing, with particular consideration of the requirements for implementing sheet metal forming simulation. Using concrete examples of sheet metal forming, the manufacturing of 3D colour models is expounded upon on the basis of simulation results.

  9. Benchmarking nitrogen removal suspended-carrier biofilm systems using dynamic simulation.

    PubMed

    Vanhooren, H; Yuan, Z; Vanrolleghem, P A

    2002-01-01

    We are witnessing an enormous growth in biological nitrogen removal from wastewater. It presents specific challenges beyond traditional COD (carbon) removal. A possibility for optimised process design is the use of biomass-supporting media. In this paper, attached growth processes (AGP) are evaluated using dynamic simulations. The advantages of these systems, which were qualitatively described elsewhere, are validated quantitatively based on a simulation benchmark for activated sludge treatment systems. This simulation benchmark is extended with a biofilm model that allows for fast and accurate simulation of the conversion of different substrates in a biofilm. The economic feasibility of this system is evaluated using the data generated with the benchmark simulations. Capital savings due to volume reduction and reduced sludge production are weighed against increased aeration costs. In this evaluation, effluent quality is integrated as well.

  10. Dynamic Simulation of an Helium Refrigerator

    NASA Astrophysics Data System (ADS)

    Deschildre, C.; Barraud, A.; Bonnay, P.; Briend, P.; Girard, A.; Poncet, J. M.; Roussel, P.; Sequeira, S. E.

    2008-03-01

    A dynamic simulation of a large-scale existing refrigerator has been performed using the software Aspen Hysys®. The model comprises the typical equipment of a cryogenic system: heat exchangers, expanders, helium phase separators and cold compressors. It represents the 400 W @ 1.8 K Test Facility located at CEA Grenoble. This paper describes the model development and shows the possibilities and limitations of the dynamic module of Aspen Hysys®. Then, a comparison between simulation results and experimental data is presented; the simulation of the cooldown process was also performed.

  11. The application of additive technologies in creation a medical simulator-trainer of the human head operating field

    NASA Astrophysics Data System (ADS)

    Kashapov, L. N.; Kashapov, N. F.; Kashapov, R. N.; Pashaev, B. Y.

    2016-06-01

    The aim of the work was to determine the possible applications of additive manufacturing technology in producing medical simulator-trainers that are as close to reality as possible. Several additive manufacturing technologies were used: selective laser sintering (SLS), fused deposition modeling (FDM), and binder jetting. As a result, a prototype simulator-trainer of the human head operating field, based on the CT scan of a real patient, was manufactured and tested. It was found that the structure obtained with the ProJet 160 3D printer was the most appropriate and closest to the real properties of bone.

  12. Provably unbounded memory advantage in stochastic simulation using quantum mechanics

    NASA Astrophysics Data System (ADS)

    Garner, Andrew J. P.; Liu, Qing; Thompson, Jayne; Vedral, Vlatko; Gu, Mile

    2017-10-01

    Simulating the stochastic evolution of real quantities on a digital computer requires a trade-off between the precision to which these quantities are approximated, and the memory required to store them. The statistical accuracy of the simulation is thus generally limited by the internal memory available to the simulator. Here, using tools from computational mechanics, we show that quantum processors with a fixed finite memory can simulate stochastic processes of real variables to arbitrarily high precision. This demonstrates a provable, unbounded memory advantage that a quantum simulator can exhibit over its best possible classical counterpart.

  13. Study of changes in properties of solar sail materials from radiation exposure

    NASA Technical Reports Server (NTRS)

    Smith, T.

    1977-01-01

    Techniques for monitoring changes in the properties of solar sail materials resulting from space radiation simulation, stressing (e.g., thermal, mechanical) and exposure to terrestrial environments are developed. The properties of interest are: metallic coating deterioration, polymeric film deterioration, interfacial debonding and possible metallic coating diffusion into the polymeric film. Four accelerated tests were devised to simulate the possible degradation processes mentioned above. These four tests are: a thermal shock test to simulate the wide variation of temperature expected in space (260 C to -100 C), a cyclic temperature test to simulate the 6 minute temperature cycle anticipated in space, a mechanical vibration test to simulate mechanical bending, folding and handling, and a humidity test to simulate terrestrial environment effects. The techniques for monitoring property changes are: visual and microscopic examination, ellipsometry, surface potential difference (SPD), photoelectron emission (PEE), and water contact angles.

  14. Burnishing rolling process of the surface prepared in the turning process

    NASA Astrophysics Data System (ADS)

    Kulakowska, Agnieszka; Kukielka, Leon; Kaldunski, Pawel; Bohdal, Lukasz; Patyk, Radoslaw; Chodor, Jaroslaw; Kukielka, Krzysztof

    2018-05-01

    The aim of this article is to demonstrate the possibility of using the burnishing rolling process as a technology for product development. Experimental research was carried out, showing the ability to form a surface layer of the product with the desired properties. First, the surfaces of the samples were prepared by turning. Then, the surfaces were burnished. The influence of the turning process on the state of the surface layer parameters of C45 steel shafts is shown. Among the examined aspects, surface roughness, nano-roughness, material bearing, surface microstructure and metallographic structure were considered. Numerical simulations were also conducted. Conclusions from the experiments and simulations are given.

  15. Simulation and Analysis of One-time Forming Process of Automobile Steering Ball Head

    NASA Astrophysics Data System (ADS)

    Shi, Peicheng; Zhang, Xujun; Xu, Zengwei; Zhang, Rongyun

    2018-03-01

    Aiming at problems such as large machining allowance, low production efficiency and material waste during die forging of the ball pin, the cold extrusion process of the ball head was studied and the forming process was simulated using the finite element analysis software DEFORM-3D. Through analysis of the equivalent stress and strain, the velocity vector field and the load-displacement curve, the flow regularity of the metal during the cold extrusion of the ball pin was clarified, and possible defects during forming were predicted. The results showed that this process could solve the forming problem of the ball pin and provide a theoretical basis for actual production in enterprises.

  16. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage. In those cases, processes tend to be interrupted due to unexpected events in the software or the hardware, and the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is light-weight, fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running simulations becomes analogous to growing crops. With the development of SiMon we ease the technical burden of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
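
    As a loose illustration of the workflow SiMon automates, the sketch below launches several simulation runs, polls them, and restarts any that die before finishing. It is a generic monitor pattern under assumed job commands, not SiMon's actual interface or restart policy.

    ```python
    import subprocess
    import time

    jobs = {   # hypothetical run commands, one per simulation
        "run_a": ["python", "nbody.py", "--config", "a.cfg"],
        "run_b": ["python", "nbody.py", "--config", "b.cfg"],
    }
    procs = {name: subprocess.Popen(cmd) for name, cmd in jobs.items()}
    restarts_left = {name: 3 for name in jobs}

    while procs:
        for name, proc in list(procs.items()):
            code = proc.poll()
            if code is None:
                continue                          # still running
            if code == 0:
                print(f"{name}: finished")        # reached termination condition
                del procs[name]
            elif restarts_left[name] > 0:
                print(f"{name}: died (exit {code}), restarting")
                restarts_left[name] -= 1
                procs[name] = subprocess.Popen(jobs[name])
            else:
                print(f"{name}: giving up after repeated failures")
                del procs[name]
        time.sleep(10)                            # polling interval
    ```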

  17. Application of a neural network to simulate analysis in an optimization process

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Lamarsh, William J., II

    1992-01-01

    A new experimental software package called NETS/PROSSS, aimed at reducing the computing time required to solve a complex design problem, is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate the results of a finite element analysis program so that a near-optimal solution can be obtained quickly. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process, making it possible to converge to an optimum solution with significantly fewer iterations.
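
    The pattern the abstract describes, substituting a cheap approximator for the expensive analysis, can be sketched as follows. Here a polynomial fit stands in for the neural network and a toy function stands in for the finite element analysis; both are assumptions for illustration only.

    ```python
    import numpy as np

    def expensive_analysis(x):
        """Stand-in for a finite element run (assumed toy objective)."""
        return (x - 2.0) ** 2 + 0.5 * np.sin(5 * x)

    # 1. Sample the expensive analysis at a handful of design points.
    x_train = np.linspace(0.0, 4.0, 9)
    y_train = expensive_analysis(x_train)

    # 2. Train the surrogate (a cubic polynomial in place of the network).
    surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=3))

    # 3. Search the cheap surrogate for a near-optimal design ...
    x_grid = np.linspace(0.0, 4.0, 2001)
    x_near_opt = x_grid[np.argmin(surrogate(x_grid))]

    # 4. ... then hand it to the full optimization as an initial design,
    #    which should converge with far fewer expensive analyses.
    print(f"surrogate minimum near x = {x_near_opt:.3f}, "
          f"true objective there = {expensive_analysis(x_near_opt):.3f}")
    ```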

  18. Telerobotic Surgery: An Intelligent Systems Approach to Mitigate the Adverse Effects of Communication Delay. Chapter 4

    NASA Technical Reports Server (NTRS)

    Cardullo, Frank M.; Lewis, Harold W., III; Panfilov, Peter B.

    2007-01-01

    An extremely innovative approach has been presented, which is to have the surgeon operate through a simulator running in real time, enhanced with an intelligent controller component, to improve the safety and efficiency of a remotely conducted operation. The use of a simulator enables the surgeon to operate in a virtual environment free from the impediments of telecommunication delay. The simulator functions as a predictor, and periodically the simulator state is corrected with truth data. Three major research areas must be explored in order to achieve the objectives. They are: simulator as predictor, image processing, and intelligent control. Each is equally necessary for the success of the project, and each involves a significant intelligent component. These are diverse, interdisciplinary areas of investigation, thereby requiring a highly coordinated effort by all the members of our team to ensure an integrated system. The following is a brief discussion of those areas. Simulator as a predictor: The delays encountered in remote robotic surgery will be greater than any encountered in human-machine systems analysis, with the possible exception of remote operations in space. Therefore, novel compensation techniques will be developed. Included will be the development of the real-time simulator, which is at the heart of our approach. The simulator will present real-time, stereoscopic images and artificial haptic stimuli to the surgeon. Image processing: Because of the delay and the possibility of insufficient bandwidth, a high level of novel image processing is necessary. This image processing will include several innovative aspects, including image interpretation, video-to-graphical conversion, texture extraction, geometric processing, image compression and image generation at the surgeon station. Intelligent control: Since the approach we propose is in a sense predictor-based, albeit with a very sophisticated predictor, a controller which not only optimizes end-effector trajectory but also avoids error is essential. We propose to investigate two different approaches to the controller design. One approach employs an optimal controller based on modern control theory; the other involves soft computing techniques, i.e. fuzzy logic, neural networks, genetic algorithms and hybrids of these.

  19. The Clone Factory

    ERIC Educational Resources Information Center

    Stoddard, Beryl

    2005-01-01

    Have humans been cloned? Is it possible? Immediate interest is sparked when students are asked these questions. In response to their curiosity, the clone factory activity was developed to help them understand the process of cloning. In this activity, students reenact the cloning process, in a very simplified simulation. After completing the…

  20. Finite Element Analysis of Single Wheat Mechanical Response to Wind and Rain Loads

    NASA Astrophysics Data System (ADS)

    Liang, Li; Guo, Yuming

    One variety of wheat in the breeding process was chosen to determine wheat morphological traits and biomechanical properties. ANSYS was used to build a mechanical model of wheat under wind load, and the dynamic response of wheat to wind load was simulated. The maximum Von Mises stress is obtained using the calculation capabilities of ANSYS, and the changing stress and displacement of each node and finite element during the simulation can be output as displacement and stress contour plots. The load-bearing capability can be evaluated to predict wheat lodging. It is concluded that computer simulation technology has unique advantages, such as convenience and efficiency, in simulating the mechanical response of wheat stalks under wind and rain loads. In particular, it is possible to apply various load types to the model while observing the deformation process simultaneously.

  1. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    NASA Astrophysics Data System (ADS)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main place of the research study. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out their significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line is identified. All components are evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.

  2. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software

    PubMed Central

    Zuckerman, Daniel M.; Chong, Lillian T.

    2018-01-01

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling—the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes—protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation. PMID:28301772
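
    For readers unfamiliar with the WE strategy, the toy sketch below shows one splitting/merging iteration over weighted trajectories binned along a progress coordinate; total weight is conserved, which is what keeps the estimates unbiased. The bin width, target count per bin, and one-dimensional dynamics are illustrative assumptions, not part of the reviewed software.

    ```python
    import random

    def propagate(x):
        """Stand-in for the dynamics engine (one short trajectory segment)."""
        return x + random.gauss(0.0, 0.1)

    def resample_bin(walkers, target=4):
        """Split/merge (position, weight) walkers in one bin to `target` members."""
        while len(walkers) > target:                  # merge the two lightest
            walkers.sort(key=lambda w: w[1])
            (x1, w1), (x2, w2) = walkers[0], walkers[1]
            keep = x1 if random.random() < w1 / (w1 + w2) else x2
            walkers[:2] = [(keep, w1 + w2)]           # weight is conserved
        while len(walkers) < target:                  # split the heaviest
            walkers.sort(key=lambda w: -w[1])
            x, w = walkers[0]
            walkers[0] = (x, w / 2)
            walkers.append((x, w / 2))
        return walkers

    walkers = [(0.0, 1.0 / 8)] * 8                    # (position, weight) pairs
    for _ in range(100):                              # WE iterations
        walkers = [(propagate(x), w) for x, w in walkers]
        bins = {}
        for x, w in walkers:
            bins.setdefault(int(x * 10), []).append((x, w))   # 0.1-wide bins
        walkers = [w for b in bins.values() for w in resample_bin(b)]
    print("total weight:", sum(w for _, w in walkers))        # stays ~1.0
    ```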

  3. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  4. A Process for the Creation of T-MATS Propulsion System Models from NPSS data

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.; Guo, Ten-Huei

    2014-01-01

    A modular thermodynamic simulation package called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) has been developed for the creation of dynamic simulations. The T-MATS software is designed as a plug-in for Simulink (Math Works, Inc.) and allows a developer to create system simulations of thermodynamic plants (such as gas turbines) and controllers in a single tool. Creation of such simulations can be accomplished by matching data from actual systems, or by matching data from steady state models and inserting appropriate dynamics, such as the rotor and actuator dynamics for an aircraft engine. This paper summarizes the process for creating T-MATS turbo-machinery simulations using data and input files obtained from a steady state model created in the Numerical Propulsion System Simulation (NPSS). The NPSS is a thermodynamic simulation environment that is commonly used for steady state gas turbine performance analysis. Completion of all the steps involved in the process results in a good match between T-MATS and NPSS at several steady state operating points. Additionally, the T-MATS model extended to run dynamically provides the possibility of simulating and evaluating closed loop responses.

  7. NASA Lunar Regolith Simulant Program

    NASA Technical Reports Server (NTRS)

    Edmunson, J.; Betts, W.; Rickman, D.; McLemore, C.; Fikes, J.; Stoeser, D.; Wilson, S.; Schrader, C.

    2010-01-01

    Lunar regolith simulant production is absolutely critical to returning man to the Moon. Regolith simulant is used to test hardware exposed to the lunar surface environment, simulate health risks to astronauts, practice in situ resource utilization (ISRU) techniques, and evaluate dust mitigation strategies. Lunar regolith simulant design, production process, and management is a cooperative venture between members of the NASA Marshall Space Flight Center (MSFC) and the U.S. Geological Survey (USGS). The MSFC simulant team is a satellite of the Dust group based at Glenn Research Center. The goals of the cooperative group are to (1) reproduce characteristics of lunar regolith using simulants, (2) produce simulants as cheaply as possible, (3) produce simulants in the amount needed, and (4) produce simulants to meet users' schedules.

  8. Designing and Implementing an OVERFLOW Reader for ParaView and Comparing Performance Between Central Processing Units and Graphical Processing Units

    NASA Technical Reports Server (NTRS)

    Chawner, David M.; Gomez, Ray J.

    2010-01-01

    In the Applied Aerosciences and CFD branch at Johnson Space Center, computational simulations are run that face many challenges, two of which are the need to customize software for specialized needs and the need to run simulations as fast as possible. Many different tools are used for running these simulations, and each one has its own pros and cons. Once these simulations are run, software is needed that is capable of visualizing the results in an appealing manner. Some of this software is called open source, meaning that anyone can edit the source code to make modifications and distribute it to all other users in a future release. This is very useful, especially in this branch where many different tools are being used. File readers can be written to load any file format into a program, to ease the bridging from one tool to another. Programming such a reader requires knowledge of the file format that is being read as well as the equations necessary to obtain the derived values after loading. When running these CFD simulations, extremely large files are loaded and values are calculated from them. These simulations usually take a few hours to complete, even on the fastest machines. Graphics processing units (GPUs) are traditionally used to render graphics on computers; however, in recent years GPUs have been used for more generic applications because of the speed of these processors. Applications run on GPUs have been known to run up to forty times faster than they would on normal central processing units (CPUs). If these CFD programs were extended to run on GPUs, the amount of time they would require to complete would be much less. This would allow more simulations to be run in the same amount of time and possibly permit more complex computations.

  9. Simulation based analysis of laser beam brazing

    NASA Astrophysics Data System (ADS)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing, with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser-brazed joints is the seam's visual quality, which satisfies the highest requirements. However, the laser beam brazing process is very complex and the process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  10. Sensitivity Analysis for an Assignment Incentive Pay in the U.S. Navy Enlisted Personnel Assignment Process in a Simulation Environment

    DTIC Science & Technology

    2004-03-01

    (Only table-of-contents fragments of this record are recoverable: the assignment sub-process; possible improvements by a market-based approach; compensation strategy; market-based labor markets (Gates, 2001); and what a compensation system should do.)

  11. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach for modeling the activities, which ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
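
    A minimal script-level analogue of the spreadsheet logic, with rand()-style durations and first-in-first-served queueing, might look as follows; the activities and duration ranges are invented for illustration and do not correspond to the paper's 28 modeling elements.

    ```python
    import random

    ACTIVITIES = [              # (name, min_hours, max_hours) -- assumed values
        ("admission",  0.5,  1.5),
        ("surgery",    1.0,  4.0),
        ("recovery",  24.0, 72.0),
    ]

    def simulate(n_patients, arrival_gap_h=6.0):
        free_at = {name: 0.0 for name, _, _ in ACTIVITIES}    # one server each
        for i in range(n_patients):
            t = i * arrival_gap_h                             # arrival time
            for name, lo, hi in ACTIVITIES:
                start = max(t, free_at[name])                 # first-in-first-served
                duration = lo + random.random() * (hi - lo)   # like rand()
                t = start + duration
                free_at[name] = t
            print(f"patient {i}: discharged at t = {t:.1f} h")

    simulate(n_patients=5)
    ```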

  12. FERN - a Java framework for stochastic simulation and evaluation of reaction networks.

    PubMed

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-08-29

    Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either (a) do not provide the most efficient simulation algorithms and are difficult to extend, (b) cannot be easily integrated into other applications or (c) do not allow the user to monitor and intervene during the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms for both exact and approximate stochastic simulation and a simple interface for extending it with new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
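
    As a flavor of the exact algorithms such a framework provides, the following is a minimal Gillespie-style stochastic simulation of a single decay reaction A -> B. It is a generic sketch in Python, not FERN's Java API.

    ```python
    import random

    k = 0.5            # decay rate constant (assumed)
    a_count = 100      # initial number of A molecules
    t, t_end = 0.0, 20.0

    while a_count > 0 and t < t_end:
        propensity = k * a_count                # total reaction propensity
        t += random.expovariate(propensity)     # exponential waiting time
        a_count -= 1                            # fire the single reaction A -> B
        print(f"t = {t:6.3f}  A = {a_count}")
    ```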

  13. Technology, design, simulation, and evaluation for SEP-hardened circuits

    NASA Technical Reports Server (NTRS)

    Adams, J. R.; Allred, D.; Barry, M.; Rudeck, P.; Woodruff, R.; Hoekstra, J.; Gardner, H.

    1991-01-01

    This paper describes the technology, design, simulation, and evaluation for improvement of the Single Event Phenomena (SEP) hardness of gate-array and SRAM cells. Through the use of design and processing techniques, it is possible to achieve an SEP error rate of less than 1.0 × 10^-10 errors/bit-day for a 90 percent worst-case geosynchronous orbit environment.

  14. Optimization of Collision Detection in Surgical Simulations

    NASA Astrophysics Data System (ADS)

    Custură-Crăciun, Dan; Cochior, Daniel; Neagu, Corneliu

    2014-11-01

    Just as flight and spaceship simulators already represent a standard, we expect that soon enough surgical simulators will become a standard in medical applications. A simulation's quality is strongly related to image quality as well as to the degree of realism of the simulation. Increased quality requires increased resolution and increased rendering speed but, more importantly, a larger number of mathematical equations. To make this possible, we need not only more efficient computers but especially more optimization of the calculation process. A simulator executes one of its most complex sets of calculations each time it detects a contact between virtual objects; optimization of collision detection is therefore critical for the working speed of a simulator and hence for its quality.
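
    One standard way to optimize collision detection, shown here only as a generic illustration rather than the authors' specific method, is a cheap broad phase over axis-aligned bounding boxes that filters which object pairs need the expensive exact test.

    ```python
    from itertools import combinations

    def aabb(points):
        """Axis-aligned bounding box of a point cloud: (min_corner, max_corner)."""
        xs, ys, zs = zip(*points)
        return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

    def aabb_overlap(a, b):
        """Boxes overlap iff their intervals overlap on every axis."""
        (a_lo, a_hi), (b_lo, b_hi) = a, b
        return all(a_lo[i] <= b_hi[i] and b_lo[i] <= a_hi[i] for i in range(3))

    objects = {                                  # toy virtual objects
        "scalpel": [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)],
        "tissue":  [(0.5, 0.5, 0.5), (2.0, 2.0, 2.0)],
        "clamp":   [(5.0, 5.0, 5.0), (6.0, 6.0, 6.0)],
    }
    boxes = {name: aabb(pts) for name, pts in objects.items()}

    for a, b in combinations(objects, 2):
        if aabb_overlap(boxes[a], boxes[b]):     # cheap broad phase
            print(f"narrow-phase check needed: {a} vs {b}")
    ```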

  15. Design, Control and in Situ Visualization of Gas Nitriding Processes

    PubMed Central

    Ratajski, Jerzy; Olik, Roman; Suszko, Tomasz; Dobrodziej, Jerzy; Michalski, Jerzy

    2010-01-01

    The article presents a complex system for the design, in situ visualization and control of the commonly used surface treatment process: the gas nitriding process. In the computer design conception, analytical mathematical models and artificial intelligence methods were used. As a result, poly-optimization and poly-parametric simulations of the course of the process became possible, combined with a visualization of the changes in the process parameter values as a function of time, as well as the possibility to predict the properties of nitrided layers. For in situ visualization of the growth of the nitrided layer, computer procedures were developed which make use of the results of correlating the direct and differential voltage-time runs of the process result sensor (magnetic sensor) with the proper layer growth stage. The computer procedures make it possible to combine, during the process, the registered voltage-time runs with the models of the process. PMID:22315536

  16. Characterizing the role of the hippocampus during episodic simulation and encoding.

    PubMed

    Thakral, Preston P; Benoit, Roland G; Schacter, Daniel L

    2017-12-01

    The hippocampus has been consistently associated with episodic simulation (i.e., the mental construction of a possible future episode). In a recent study, we identified an anterior-posterior temporal dissociation within the hippocampus during simulation. Specifically, transient simulation-related activity occurred in relatively posterior portions of the hippocampus and sustained activity occurred in anterior portions. In line with previous theoretical proposals of hippocampal function during simulation, the posterior hippocampal activity was interpreted as reflecting a transient retrieval process for the episodic details necessary to construct an episode. In contrast, the sustained anterior hippocampal activity was interpreted as reflecting the continual recruitment of encoding and/or relational processing associated with a simulation. In the present study, we provide a direct test of these interpretations by conducting a subsequent memory analysis of our previously published data to assess whether successful encoding during episodic simulation is associated with the anterior hippocampus. Analyses revealed a subsequent memory effect (i.e., later remembered > later forgotten simulations) in the anterior hippocampus. The subsequent memory effect was transient and not sustained. Taken together, the current findings provide further support for a component process model of hippocampal function during simulation. That is, unique regions of the hippocampus support dissociable processes during simulation, which include the transient retrieval of episodic information, the sustained binding of such information into a coherent episode, and the transient encoding of that episode for later retrieval. © 2017 Wiley Periodicals, Inc.

  17. Simulation's Ensemble is Better Than Ensemble Simulation

    NASA Astrophysics Data System (ADS)

    Yan, X.

    2017-12-01

    Simulation's ensemble is better than ensemble simulation. Yan Xiaodong, State Key Laboratory of Earth Surface Processes and Resource Ecology (ESPRE), Beijing Normal University, 19 Xinjiekouwai Street, Haidian District, Beijing 100875, China. Email: yxd@bnu.edu.cn. A dynamical system is simulated from an initial state. However, initial state data carry great uncertainty, which leads to uncertainty in the simulation. Therefore, simulation based on multiple possible initial states has been widely used in atmospheric science and has indeed been proved to lower the uncertainty; this is named a simulation ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation ensemble compared with community-based simulation (most ecosystem models). In this talk, we will address the advantage of individual-based simulation and even its ensembles.

  18. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focussed on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  19. Simulating the phase partitioning of NH3, HNO3, and HCl with size-resolved particles over northern Colorado in winter

    EPA Science Inventory

    Numerical modeling of inorganic aerosol processes is useful in air quality management, but comprehensive evaluation of modeled aerosol processes is rarely possible due to the lack of comprehensive datasets. During the Nitrogen, Aerosol Composition, and Halogens on a Tall Tower (N...

  20. Assembly flow simulation of a radar

    NASA Technical Reports Server (NTRS)

    Rutherford, W. C.; Biggs, P. M.

    1994-01-01

    A discrete event simulation model has been developed to predict the assembly flow time of a new radar product. The simulation was the key tool employed to identify flow constraints. The radar, production facility, and equipment complement were designed, arranged, and selected to provide the most manufacturable assembly possible. A goal was to reduce the assembly and testing cycle time from twenty-six weeks to six. A computer software simulation package (SLAM 2) was utilized as the foundation for simulating the assembly flow time. FORTRAN subroutines were incorporated into the software to deal with unique flow circumstances that were not accommodated by the software. Detailed information relating to the assembly operations was provided by a team selected from the engineering, manufacturing management, inspection, and production assembly staff. The simulation verified that it would be possible to achieve the cycle time goal of six weeks. Equipment and manpower constraints were identified during the simulation process and adjusted as required to achieve the flow with a given monthly production requirement. The simulation is being maintained as a planning tool to be used to identify constraints in the event that monthly output is increased. 'What-if' studies have been conducted to identify the cost of reducing constraints caused by increases in output requirements.

  1. Translations from Kommunist, Number 13, September 1978

    DTIC Science & Technology

    1978-10-30

    programmed machine tool here is merely a component of a more complex reprogrammable technological system. This includes the robot machine tools with...sufficient possibilities for changing technological operations and processes and automated technological lines. The reprogrammable automated sets will...simulate the possibilities of such sets. A new technological level will be developed in industry related to reprogrammable automated sets, their design

  2. DEVELOPMENT OF AN INSOLUBLE SALT SIMULANT TO SUPPORT ENHANCED CHEMICAL CLEANING TESTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eibling, R

    The closure process for high level waste tanks at the Savannah River Site will require dissolution of the crystallized salts that are currently stored in many of the tanks. The insoluble residue from salt dissolution is planned to be removed by an Enhanced Chemical Cleaning (ECC) process. Development of a chemical cleaning process requires an insoluble salt simulant to support evaluation tests of different cleaning methods. The Process Science and Engineering section of SRNL has been asked to develop an insoluble salt simulant for use in testing potential ECC processes (HLE-TTR-2007-017). An insoluble salt simulant has been developed based upon the residues from salt dissolution of saltcake core samples from Tank 28F. The simulant was developed for use in testing SRS waste tank chemical cleaning methods. Based on the results of the simulant development process, the following observations were developed: (1) A composition based on the presence of 10.35 grams oxalate and 4.68 grams carbonate per 100 grams solids produces a sufficiently insoluble solids simulant. (2) Aluminum observed in the solids remaining from actual waste salt dissolution tests is probably precipitated from sodium aluminate due to the low hydroxide content of the saltcake. (3) In-situ generation of aluminum hydroxide (by use of aluminate as the Al source) appears to trap additional salts in the simulant in a manner similar to that expected for actual waste samples. (4) Alternative compositions are possible with higher oxalate levels and lower carbonate levels. (5) The maximum oxalate level is limited by the required Na content of the insoluble solids. (6) Periodic mixing may help to limit crystal growth in this type of salt simulant. (7) Long term storage of an insoluble salt simulant is likely to produce a material that cannot be easily removed from the storage container. Production of a relatively fresh simulant is best if pumping the simulant is necessary for testing purposes. The insoluble salt simulant described in this report represents the initial attempt to represent the material which may be encountered during final waste removal and tank cleaning. The final selected simulant was produced by heating and evaporation of a salt slurry sample to remove excess water and promote formation and precipitation of solids with solubility characteristics which are consistent with actual tank insoluble salt samples. The exact anion composition of the final product solids is not explicitly known since the chemical components in the final product are distributed between the solid and liquid phases. By combining the liquid phase analyses and total solids analysis with mass balance requirements a calculated composition of assumed simple compounds was obtained and is shown in Table 0-1. Additional improvements to and further characterization of the insoluble salt simulant are possible. During the development of these simulants it was recognized that: (1) Additional waste characterization on the residues from salt dissolution tests with actual waste samples to determine the amount of species such as carbonate, oxalate and aluminosilicate would allow fewer assumptions to be made in constructing an insoluble salt simulant. (2) The tank history will impact the amount and type of insoluble solids that exist in the salt dissolution solids. Varying the method of simulant production (elevated temperature processing time, degree of evaporation, amount of mixing (shear) during preparation, etc.) should be tested.

  3. Using Open Source Software in Visual Simulation Development

    DTIC Science & Technology

    2005-09-01

    increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a

  4. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    NASA Astrophysics Data System (ADS)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order of 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines, one that makes the simulation of sophisticated geometries possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online.
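
    The central trick behind such mean free path tables can be illustrated with the standard inverse-transform sampling of a photon's propagation length; the mean free path value below is an arbitrary assumption.

    ```python
    import math
    import random

    def sample_path_length(mean_free_path):
        """Inverse-transform sample of an exponentially distributed free path."""
        return -mean_free_path * math.log(1.0 - random.random())

    mfp = 2.5                        # mean free path, arbitrary units (assumed)
    paths = [sample_path_length(mfp) for _ in range(100_000)]
    print(f"sampled mean = {sum(paths) / len(paths):.3f} (expected {mfp})")
    ```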

  5. CFD simulation of reverse water-hammer induced by collapse of draft-tube cavity in a model pump-turbine during runaway process

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoxi; Cheng, Yongguang; Xia, Linsheng; Yang, Jiandong

    2016-11-01

    This paper reports the preliminary progress in the CFD simulation of the reverse water-hammer induced by the collapse of a draft-tube cavity in a model pump-turbine during the runaway process. Firstly, the Fluent customized 1D-3D coupling model for hydraulic transients and the Schnerr & Sauer cavitation model for cavity development are introduced. Then, the methods are validated by simulating the benchmark reverse water-hammer in a long pipe caused by instant valve closure. The simulated head history at the valve agrees well with the measured data in the literature. After that, the more complicated reverse water-hammer in the draft tube of a runaway model pump-turbine, which is installed in a model pumped-storage power plant, is simulated. The dynamic process of a vapor cavity, from generation, expansion and shrinkage to collapse, is shown. After the cavity collapses, a sudden increase of pressure can be clearly observed. The process is featured by a locally expanding and collapsing vapor cavity around the runner cone, which is different from the conventional picture of violent water-column separation. This work demonstrates the possibility of simulating the reverse water-hammer phenomenon in turbines by 3D CFD.
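
    As a rough analytical companion to such simulations, the Joukowsky relation delta_p = rho * a * delta_v estimates the pressure surge from a sudden velocity change; the numbers below are illustrative and are not taken from the paper's pump-turbine model.

    ```python
    rho = 1000.0       # water density, kg/m^3
    a = 1200.0         # pressure-wave speed in the conduit, m/s (assumed)
    delta_v = 3.0      # velocity change at cavity collapse, m/s (assumed)

    delta_p = rho * a * delta_v      # Joukowsky surge, Pa
    print(f"surge ~ {delta_p / 1e6:.1f} MPa ({delta_p / 1e5:.0f} bar)")
    ```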

  6. Processing biobased polymers using plasticizers: Numerical simulations versus experiments

    NASA Astrophysics Data System (ADS)

    Desplentere, Frederik; Cardon, Ludwig; Six, Wim; Erkoç, Mustafa

    2016-03-01

    In polymer processing, the use of biobased products offers many possibilities. For biobased materials, biodegradability is in most cases the most important issue. Next to this, biobased materials aimed at durable applications are gaining interest. Within this research, the influence of plasticizers on the processing of a biobased material is investigated. This work is done for an extrusion grade of PLA, NatureWorks PLA 2003D. Extrusion through a slit die equipped with pressure sensors is used to compare the experimental pressure values to numerical simulation results. Additional experimental data (temperature and pressure along the extrusion screw and die) are recorded on a Dr. Collin lab extruder producing a 25 mm diameter tube. All these experimental data are used to verify the appropriate functioning of the numerical simulation tool Virtual Extrusion Laboratory 6.7 for the simulation of both the industrially available extrusion grade PLA and the compound to which 15% of plasticizer is added. Adding the applied plasticizer resulted in a 40% lower pressure drop over the extrusion die. The combination of different experiments allowed the numerical simulation results to be fitted closely to the experimental values. Based on this experience, it is shown that numerical simulations can also be used for modified biobased materials if appropriate material and process data are taken into account.
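
    For orientation, the Newtonian slit-die relation delta_p = 12*eta*Q*L/(w*h^3) links the measured pressure drop to melt viscosity, so a plasticizer that lowers eta lowers delta_p proportionally. The sketch below uses assumed values, not the paper's PLA data.

    ```python
    eta = 800.0                     # melt viscosity, Pa.s (assumed)
    Q = 1.0e-6                      # volume flow rate, m^3/s (assumed)
    L, w, h = 0.05, 0.02, 0.002     # slit length, width, height, m (assumed)

    dp_neat = 12 * eta * Q * L / (w * h**3)
    dp_plasticized = 12 * (0.6 * eta) * Q * L / (w * h**3)   # ~40% lower viscosity
    print(f"neat: {dp_neat / 1e5:.1f} bar, "
          f"plasticized: {dp_plasticized / 1e5:.1f} bar")
    ```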

  7. Interactive Learning Environment: Web-based Virtual Hydrological Simulation System using Augmented and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2014-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities for various visualization and interaction modes.

  8. The Application of SNiPER to the JUNO Simulation

    NASA Astrophysics Data System (ADS)

    Lin, Tao; Zou, Jiaheng; Li, Weidong; Deng, Ziyan; Fang, Xiao; Cao, Guofu; Huang, Xingtao; You, Zhengyun; JUNO Collaboration

    2017-10-01

    JUNO (Jiangmen Underground Neutrino Observatory) is a multipurpose neutrino experiment which is designed to determine the neutrino mass hierarchy and precisely measure oscillation parameters. As one of the important systems, the JUNO offline software is being developed using the SNiPER software. In this proceeding, we focus on the requirements of the JUNO simulation and present the working solution based on SNiPER. The JUNO simulation framework is in charge of managing event data, detector geometries and materials, physics processes, simulation truth information, etc. It glues the physics generator, detector simulation and electronics simulation modules together to achieve a full simulation chain. In the implementation of the framework, many attractive characteristics of SNiPER have been used, such as dynamic loading, flexible flow control, multiple event management and Python binding. Furthermore, additional efforts have been made to make both detector and electronics simulation flexible enough to accommodate and optimize different detector designs. For the Geant4-based detector simulation, each sub-detector component is implemented as a SNiPER tool, which is a dynamically loadable and configurable plugin. So it is possible to select the detector configuration at runtime. The framework provides the event loop to drive the detector simulation and interacts with Geant4, which is implemented as a passive service. All levels of user actions are wrapped into different customizable tools, so that user functions can be easily extended by just adding new tools. The electronics simulation has been implemented by following an event-driven scheme. The SNiPER task component is used to simulate data processing steps in the electronics modules. The electronics and trigger are synchronized by triggered events containing possible physics signals. The JUNO simulation software has been released and is being used by the JUNO collaboration for detector design optimization, event reconstruction algorithm development and physics sensitivity studies.

  9. Controlled cooling technology for bar and rod mills -- Computer simulation and operational results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauk, P.J.; Kruse, M.; Plociennik, U.

    The Controlled Cooling Technology (CCT) developed by SMS to simulate the rolling process and automatically control the water cooling sections is presented. The Controlled Rolling and Cooling Technology (CRCT) model is a key part of the CCT system. It is used to simulate temperature management for the rolling stock on the computer before the actual rolling process takes place. This makes it possible to dispense with extensive rolling tests in the early stages of project planning and to greatly reduce the extent of such tests prior to the start of commercial production in a rolling mill. The CRCT model has been in use at Von Moos Stahl AG for three years. It demonstrates that, by targeted improvement of the set-up values in both the technology and the plant, it is possible to improve microstructure quality and achieve better geometrical parameters in the rolled products. Also, the results gained with the CCT system in practical operation at the Kia Steel Bar Mill, Kunsan, Korea, are presented.

  10. Numerical simulations of a nonequilibrium argon plasma in a shock-tube experiment

    NASA Technical Reports Server (NTRS)

    Cambier, Jean-Luc

    1991-01-01

    A code developed for the numerical modeling of nonequilibrium radiative plasmas is applied to the simulation of the propagation of strong ionizing shock waves in argon gas. The simulations attempt to reproduce a series of shock-tube experiments which will be used to validate the numerical models and procedures. The ability to perform unsteady simulations makes it possible to observe some fluctuations in the shock propagation, coupled to the kinetic processes. A coupling mechanism by pressure waves, reminiscent of oscillation mechanisms observed in detonation waves, is described. The effect of upper atomic levels is also briefly discussed.

  11. Numerical simulation of controlled directional solidification under microgravity conditions

    NASA Astrophysics Data System (ADS)

    Holl, S.; Roos, D.; Wein, J.

    The computer-assisted simulation of solidification processes influenced by gravity has gained increased importance in recent years in both ground-based and microgravity research. Depending on the specific needs of the investigator, the simulation model ideally covers a broad spectrum of applications. These primarily include the optimization of furnace design in interaction with selected process parameters to meet the desired crystallization conditions. Different approaches concerning the complexity of the simulation models as well as their dedicated applications will be discussed in this paper. Special emphasis will be put on the potential of software tools to increase the scientific quality and cost-efficiency of microgravity experimentation. The results gained so far in the context of TEXUS, FSLP, D-1 and D-2 (preparatory program) experiments will be discussed, highlighting their simulation-supported preparation and evaluation. An outlook will then be given on the possibilities of enhancing the efficiency of pre-industrial research in the Columbus era through the incorporation of suitable simulation methods and tools.

  12. BIOASPEN: System for technology development

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The public version of ASPEN was installed on the VAX 11/750 computer. To examine the idea of BIOASPEN, a test example (the manufacture of acetone, butanol, and ethanol through a biological route) was chosen for simulation. Previous reports on the BIOASPEN project revealed the limitations of ASPEN in modeling this process. To overcome some of the difficulties, modules were written for the acid and enzyme hydrolyzers, the fermentor, and a sterilizer. Information required for these modules was obtained from the literature whenever possible. Additional support modules necessary for interfacing with ASPEN were also written. Some of ASPEN's subroutines were themselves altered in order to ensure the correct running of the simulation program. After testing of these additions and changes was completed, the Acetone-Butanol-Ethanol (ABE) process was simulated. A release of ASPEN (which contained the Economic Subsystem) was obtained and installed. This subsystem was tested and numerous changes were made in the FORTRAN code. Capital investment and operating cost studies were performed on the ABE process. Some alternatives in certain steps of the ABE simulation were investigated in order to elucidate their effects on the overall economics of the process.

  13. Exploring the physical layer frontiers of cellular uplink: The Vienna LTE-A Uplink Simulator.

    PubMed

    Zöchmann, Erich; Schwarz, Stefan; Pratschner, Stefan; Nagel, Lukas; Lerch, Martin; Rupp, Markus

    Communication systems in practice are subject to many technical/technological constraints and restrictions. Multiple input, multiple output (MIMO) processing in current wireless communications, as an example, mostly employs codebook-based pre-coding to save computational complexity at the transmitters and receivers. In such cases, closed form expressions for capacity or bit-error probability are often unattainable; effects of realistic signal processing algorithms on the performance of practical communication systems rather have to be studied in simulation environments. The Vienna LTE-A Uplink Simulator is a 3GPP LTE-A standard compliant MATLAB-based link level simulator that is publicly available under an academic use license, facilitating reproducible evaluations of signal processing algorithms and transceiver designs in wireless communications. This paper reviews research results that have been obtained by means of the Vienna LTE-A Uplink Simulator, highlights the effects of single-carrier frequency-division multiplexing (as the distinguishing feature to LTE-A downlink), extends known link adaptation concepts to uplink transmission, shows the implications of the uplink pilot pattern for gathering channel state information at the receiver and completes with possible future research directions.

  14. Reinterpreting the cardiovascular system as a mechanical model

    NASA Astrophysics Data System (ADS)

    Lemos, Diogo; Machado, José; Minas, Graça; Soares, Filomena; Barros, Carla; Leão, Celina Pinto

    2013-10-01

    The simulation of the different physiological systems is very useful as a pedagogical tool, allowing a better understanding of the mechanisms and the functions of the processes. The observation of physiological phenomena through mechanical simulators represents a great asset. Furthermore, the development of these simulators allows reinterpreting physiological systems, with the advantage of using the same transducers and sensors that are commonly used in diagnostic and therapeutic cardiovascular procedures for the monitoring of system parameters. The cardiovascular system is one of the most important systems of the human body and has been the target of several biomedical studies. The present work describes a mechanical simulation of the cardiovascular system, in particular the systemic circulation, which can be described in terms of its hemodynamic variables. From the mechanical process and parameters, the physiological system's behavior was reproduced as accurately as possible.
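
    A common lumped description of the systemic circulation in terms of its hemodynamic variables is the two-element Windkessel model, C dP/dt = Q_in(t) - P/R. The sketch below integrates it with forward Euler; the resistance, compliance, and inflow waveform are illustrative values, not parameters from this work.

```python
import numpy as np

R, C = 1.0, 1.5          # peripheral resistance [mmHg*s/mL], compliance [mL/mmHg]
T, dt = 0.8, 1e-3        # cardiac period [s], time step [s]

def q_in(t):
    """Crude half-sine ventricular ejection during the first 0.3 s of each beat."""
    tau = t % T
    return 300.0 * np.sin(np.pi * tau / 0.3) if tau < 0.3 else 0.0

t_grid = np.arange(0.0, 10 * T, dt)
p = np.empty_like(t_grid)
p[0] = 80.0              # initial arterial pressure [mmHg]
for i in range(1, len(t_grid)):
    # Two-element Windkessel: C * dP/dt = Q_in(t) - P/R  (forward Euler step)
    p[i] = p[i - 1] + dt * (q_in(t_grid[i - 1]) - p[i - 1] / R) / C
```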

  15. Extending simulation modeling to activity-based costing for clinical procedures.

    PubMed

    Glick, N D; Blackmore, C C; Zelman, W N

    2000-04-01

    A simulation model was developed to measure costs in an Emergency Department setting for patients presenting with possible cervical-spine injury who needed radiological imaging. Simulation, a tool widely used to account for process variability but typically focused on utilization and throughput analysis, is introduced here as a realistic means to perform an activity-based costing (ABC) analysis, because traditional ABC methods have difficulty coping with process variation in healthcare. Though the study model has a very specific application, it can be generalized to other settings simply by changing the input parameters. In essence, simulation was found to be an accurate and viable means to conduct an ABC analysis; in fact, the output provides more complete information than could be achieved through other conventional analyses, which gives management more leverage with which to negotiate contractual reimbursements.
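
    The core idea, costing a stochastic process by simulating it, fits in a few lines. Below is a minimal Monte Carlo sketch in Python; the activity names, durations, and cost rates are hypothetical placeholders, not the study's Emergency Department data.

```python
import random

# Hypothetical activities: (name, mean duration [min], cost rate [$/min]).
ACTIVITIES = [("triage", 10, 2.0), ("physician exam", 15, 8.0),
              ("radiography", 25, 5.0), ("disposition", 10, 2.0)]

def simulate_episode():
    """Cost of one patient episode with gamma-distributed activity durations."""
    cost = 0.0
    for _name, mean, rate in ACTIVITIES:
        duration = random.gammavariate(4.0, mean / 4.0)  # preserves the mean
        cost += duration * rate
    return cost

costs = [simulate_episode() for _ in range(10_000)]
print(f"mean episode cost: ${sum(costs) / len(costs):.2f}")
```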

  16. The Development of a 3D LADAR Simulator Based on a Fast Target Impulse Response Generation Approach

    NASA Astrophysics Data System (ADS)

    Al-Temeemy, Ali Adnan

    2017-09-01

    A new laser detection and ranging (LADAR) simulator has been developed, using MATLAB and its graphical user interface, to simulate direct detection time-of-flight LADAR systems and to produce 3D simulated scanning images under a wide variety of conditions. This simulator models each stage from the laser source to data generation and can be considered an efficient simulation tool when developing LADAR systems and their data processing algorithms. The novel approach proposed for this simulator is to generate the actual target impulse response. This approach is fast and able to meet high scanning requirements without the loss of fidelity that usually accompanies gains in speed. This leads to a more efficient LADAR simulator and opens up the possibility of simulating LADAR beam propagation more accurately by using a large number of laser footprint samples. The approach is to select only the parts of the target that lie in the laser beam's angular field, by mathematically deriving the required equations and calculating the target's angular ranges. The performance of the new simulator has been evaluated under different scanning conditions, the results showing significant increases in processing speed in comparison to the conventional approaches, which serve as the baseline in this study. The results also show the simulator's ability to simulate phenomena related to the scanning process, for example, the type of noise, the scanning resolution and the laser beam width.
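
    The speed-up rests on discarding, before any response computation, every target sample outside the beam's angular field. A minimal Python sketch of that culling step is given below, assuming the sensor sits at the origin and the target is a point cloud; it illustrates the idea only, not the simulator's MATLAB code.

```python
import numpy as np

def cull_to_beam(points, beam_dir, half_angle_rad):
    """Keep only target samples inside the beam's angular field.

    points:   (N, 3) target surface samples, sensor assumed at the origin
    beam_dir: unit vector along the beam axis
    """
    directions = points / np.linalg.norm(points, axis=1, keepdims=True)
    cos_angle = directions @ beam_dir
    return points[cos_angle >= np.cos(half_angle_rad)]

# Usage: keep points within a 2 mrad half-angle of the +x axis.
pts = np.random.default_rng(1).uniform(-1, 1, (100_000, 3)) + [50.0, 0.0, 0.0]
visible = cull_to_beam(pts, np.array([1.0, 0.0, 0.0]), 2e-3)
```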

  17. Mono- and Di-Alkylation Processes of DNA Bases by Nitrogen Mustard Mechlorethamine.

    PubMed

    Larrañaga, Olatz; de Cózar, Abel; Cossío, Fernando P

    2017-12-06

    The reactivity of nitrogen mustard mechlorethamine (mec) with purine bases towards the formation of mono- (G-mec and A-mec) and dialkylated (AA-mec, GG-mec and AG-mec) adducts has been studied using density functional theory (DFT). To gain a complete overview of DNA-alkylation processes, direct chloride substitution and formation through activated aziridinium species were considered as possible reaction paths for adduct formation. Our results confirm that DNA alkylation by mec occurs via aziridine intermediates instead of direct substitution. Consideration of explicit water molecules in conjunction with a polarizable continuum model (PCM) was shown to be an adequate computational method for a proper representation of the system. Moreover, Runge-Kutta numerical kinetic simulations including the possible bisadducts have been performed. These simulations predicted a product ratio of 83:17 for the GG-mec and AG-mec diadducts, respectively. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
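
    Kinetic simulations of this kind integrate a set of coupled rate equations with a Runge-Kutta scheme. The toy Python sketch below does this for a drastically simplified activation-then-trapping scheme; the rate constants are hypothetical placeholders, not the DFT-derived values behind the 83:17 prediction.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical first-order rate constants [min^-1]: activation of the mustard
# to the aziridinium ion, then trapping by guanine (G) or adenine (A).
k_act, k_G, k_A = 0.05, 1.0, 0.25

def rhs(t, y):
    m, i, g_adduct, a_adduct = y
    return [-k_act * m,                     # mustard consumed by activation
            k_act * m - (k_G + k_A) * i,    # aziridinium formed, then trapped
            k_G * i,                        # G-mec monoadduct
            k_A * i]                        # A-mec monoadduct

sol = solve_ivp(rhs, (0.0, 500.0), [1.0, 0.0, 0.0, 0.0], rtol=1e-8)  # RK45
g, a = sol.y[2, -1], sol.y[3, -1]
print(f"G-mec : A-mec = {g / (g + a):.2f} : {a / (g + a):.2f}")
```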

  18. Experimental study of modification mechanism at a wear-resistant surfacing

    NASA Astrophysics Data System (ADS)

    Dema, R. R.; Amirov, R. N.; Kalugina, O. B.

    2018-01-01

    In this study, the crystallization process during deposition of near-eutectic alloys in the presence of inoculants was simulated in order to reveal how the inoculants and the parameters of the simulated surfacing mode affect the structure of the crystallization front as well as the nucleation rate and growth kinetics of the equiaxed primary-phase crystallites forming in the volume of the melt. A technique for simulating the primary crystallization of near-eutectic alloys in the presence of modifiers is offered. It is shown that a fully eutectic structure can be obtained during surfacing of nominally hypereutectic white-cast-iron-type alloys over a wide range of deviations from the nominal composition.

  19. Computational Analysis of Splash Occurring in the Deposition Process in Annular-Mist Flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Heng; Koshizuka, Seiichi; Oka, Yoshiaki

    2004-07-01

    The deposition process of a single droplet on the film is numerically simulated by the Moving Particle Semi-implicit (MPS) method to analyze the possibility and effect of splash occurring in the deposition process under BWR conditions. The model accounts for inertia, gravitation, viscosity and surface tension, and is validated by comparison with experimental results. A simple one-dimensional mixture model is developed to calculate the necessary parameters for the simulation of deposition under BWR conditions. The deposition process of a single droplet under BWR conditions is simulated, and the effects of the impact angle of the droplet and the velocity of the liquid film are analyzed. A film buffer model is developed to fit the simulation results for the critical value for splash. A correlation of the critical Weber number for splash under BWR conditions is obtained and used to analyze the effect of splash. It is found that splash plays an important role in the deposition and re-entrainment process under high-quality conditions in BWRs. The mass fraction of re-entrainment caused by splash under different quality conditions is also calculated. (authors)
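
    Splash criteria of this kind compare the droplet Weber number, We = ρv²d/σ, against a fitted critical value. A minimal Python sketch follows; the fluid properties and the critical value are illustrative placeholders, not the correlation obtained in the paper.

```python
def weber(rho, v, d, sigma):
    """Weber number of an impinging droplet: We = rho * v**2 * d / sigma."""
    return rho * v * v * d / sigma

WE_CRIT = 300.0  # hypothetical critical value, not the paper's correlation

# Saturated-water-like properties at roughly BWR pressure (illustrative only).
we = weber(rho=740.0, v=5.0, d=1e-4, sigma=0.018)
print(f"We = {we:.0f} -> {'splash' if we > WE_CRIT else 'deposition'}")
```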

  20. The Answering Process for Multiple-Choice Questions in Collaborative Learning: A Mathematical Learning Model Analysis

    ERIC Educational Resources Information Center

    Nakamura, Yasuyuki; Nishi, Shinnosuke; Muramatsu, Yuta; Yasutake, Koichi; Yamakawa, Osamu; Tagawa, Takahiro

    2014-01-01

    In this paper, we introduce a mathematical model for collaborative learning and the answering process for multiple-choice questions. The collaborative learning model is inspired by the Ising spin model and the model for answering multiple-choice questions is based on their difficulty level. An intensive simulation study predicts the possibility of…

  1. Microscopic transport model animation visualisation on KML base

    NASA Astrophysics Data System (ADS)

    Yatskiv, I.; Savrasovs, M.

    2012-10-01

    Classical literature on simulation theory notes that one of the great strengths of simulation is the ability to present the processes inside the system by animation. This gives the simulation model additional value when presenting simulation results to the public and to authorities who are not familiar enough with simulation. That is why most universal and specialised simulation tools are able to construct 2D and 3D representations of the model. The development of such a representation can, however, take much time, and considerable effort must be put into creating an adequate 3D representation of the model. For many years, such well-known microscopic traffic flow simulation software tools as VISSIM, AIMSUN and PARAMICS have been able to produce 2D and 3D animation, but creating a realistic 3D model of the place where traffic flows are simulated is a hard and time-consuming task even in these professional software tools. The goal of this paper is to describe the concept of using existing online geographical information systems for the visualisation of animation produced by simulation software. For demonstration purposes the following technologies and tools have been used: PTV VISION VISSIM, KML and Google Earth.
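
    The KML route works because Google Earth can animate any feature that carries a time primitive. Below is a minimal Python sketch that writes time-stamped placemarks to a .kml file; the coordinates and the helper name kml_track are illustrative, and the actual VISSIM-to-KML export used in the paper may differ.

```python
def kml_track(samples):
    """samples: iterable of (iso_time, lon, lat) vehicle positions."""
    placemarks = "\n".join(
        f"  <Placemark><TimeStamp><when>{t}</when></TimeStamp>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for t, lon, lat in samples)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            f'<Document>\n{placemarks}\n</Document>\n</kml>')

# Two illustrative samples; Google Earth animates them with its time slider.
doc = kml_track([("2012-10-01T12:00:00Z", 24.1052, 56.9496),
                 ("2012-10-01T12:00:05Z", 24.1060, 56.9500)])
with open("track.kml", "w") as f:
    f.write(doc)
```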

  2. The Use of Computer Simulation Methods to Reach Data for Economic Analysis of Automated Logistic Systems

    NASA Astrophysics Data System (ADS)

    Neradilová, Hana; Fedorko, Gabriel

    2016-12-01

    Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they allow increasing the efficiency and reliability of logistics processes. In terms of evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore and underestimate this area, which is not correct. One of the reasons why the economic aspect is overlooked is the fact that obtaining information for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining data for the implementation of a full-scale economic analysis.

  3. Evaluation of center-cut separations applying simulated moving bed chromatography with 8 zones.

    PubMed

    Santos da Silva, Francisco Vitor; Seidel-Morgenstern, Andreas

    2016-07-22

    Different multi-column options to perform continuous chromatographic separations of ternary mixtures have been proposed in order to overcome limitations of batch chromatography. One attractive option is given by simulated moving bed chromatography (SMB) with 8 zones, a process that offers uninterrupted production and, potentially, improved economy. As in other established ternary separation processes, the separation sequence is crucial for the performance of the process. This problem is addressed here by computing and comparing the optimal performances of the two possible separation sequences, assuming linear adsorption isotherms. The conclusions are presented in a decision tree which can be used to guide the selection of system configuration and operation. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Simulated annealing in networks for computing possible arrangements for red and green cones

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1987-01-01

    Attention is given to network models in which each of the cones of the retina is given a provisional color at random, and then the cones are allowed to determine the colors of their neighbors through an iterative process. A symmetric-structure spin-glass model has allowed arrays to be generated from completely random arrangements of red and green to arrays with approximately as much disorder as the parafoveal cones. Simulated annealing has also been added to the process in an attempt to generate color arrangements with greater regularity, and hence more revealing moiré patterns, than the arrangements yielded by quenched spin-glass processes. The perceptual implications of these results are also considered.
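
    The quench-versus-anneal distinction is easy to see in code: the same Metropolis update run under a slowly decreasing temperature yields more regular arrangements than an instant quench. A minimal Python sketch of such an annealed red/green lattice follows; the lattice size, coupling, and cooling schedule are illustrative, not the paper's model parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
spins = rng.choice([-1, 1], size=(N, N))   # -1 = red cone, +1 = green cone
J = 1.0                                    # like-colored neighbours cost energy

def local_energy(s, i, j):
    nb = s[(i-1) % N, j] + s[(i+1) % N, j] + s[i, (j-1) % N] + s[i, (j+1) % N]
    return J * s[i, j] * nb

for T in np.geomspace(3.0, 0.05, 40):      # slowly decreasing temperature
    for _ in range(N * N):
        i, j = rng.integers(N, size=2)
        dE = -2.0 * local_energy(spins, i, j)   # energy change if flipped
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1              # Metropolis acceptance
```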

  5. Using quantum theory to simplify input-output processes

    NASA Astrophysics Data System (ADS)

    Thompson, Jayne; Garner, Andrew J. P.; Vedral, Vlatko; Gu, Mile

    2017-02-01

    All natural things process and transform information. They receive environmental information as input, and transform it into appropriate output responses. Much of science is dedicated to building models of such systems-algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency-storing past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them.

  6. Analysis of large-scale tablet coating: Modeling, simulation and experiments.

    PubMed

    Boehling, P; Toschkoff, G; Knop, K; Kleinebudde, P; Just, S; Funke, A; Rehbaum, H; Khinast, J G

    2016-07-30

    This work concerns a tablet coating process in an industrial-scale drum coater. We set up a full-scale Design of Simulation Experiment (DoSE) using the Discrete Element Method (DEM) to investigate the influence of various process parameters (the spray rate, the number of nozzles, the rotation rate and the drum load) on the coefficient of inter-tablet coating variation (cv,inter). The coater was filled with up to 290 kg of material, which is equivalent to 1,028,369 tablets. To mimic the tablet shape, the glued-sphere approach was followed, and each modeled tablet consisted of eight spheres. We simulated the process via the eXtended Particle System (XPS), proving that it is possible to accurately simulate the tablet coating process on the industrial scale. The process time required to reach a uniform tablet coating was extrapolated from the simulated data and was in good agreement with experimental results. The results are provided at various levels of detail, from a thorough investigation of the influence that the process parameters have on the cv,inter and the number of tablets that visit the spray zone during the simulated 90 s, to the velocity in the spray zone and the spray and bed cycle times. It was found that increasing the number of nozzles and decreasing the spray rate had the largest influence on the cv,inter. Although increasing the drum load and the rotation rate increased the tablet velocity, it did not have a relevant influence on the cv,inter and the process time. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations

    NASA Astrophysics Data System (ADS)

    Niemeier, Wolfgang; Tengen, Dieter

    2017-06-01

    In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (the Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology, but rarely adopted within geodesy. The second step consists of Monte-Carlo simulations (MC simulations) for the complete processing chain from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
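
    The two-step idea reduces, in its simplest form, to drawing input realizations from their GUM-style distributions and pushing each through the processing chain. The Python sketch below does this for a single polar-survey point (normally distributed distance, rectangularly distributed direction); the numbers are illustrative and have nothing to do with the Metsähovi network.

```python
import numpy as np

rng = np.random.default_rng(42)
M = 100_000

# Step 1 (GUM-style input modelling): distance d ~ normal (type A), direction
# a with a rectangular (type B) contribution; all numbers are illustrative.
d = rng.normal(100.000, 0.002, M)                       # [m]
a = np.deg2rad(30.0) + rng.uniform(-1.0, 1.0, M) * np.deg2rad(5e-4)

# Step 2 (Monte Carlo through the processing chain): polar survey of a point.
xy = np.column_stack([d * np.cos(a), d * np.sin(a)])    # the MC "point cloud"
print(xy.mean(axis=0), np.sqrt(np.diag(np.cov(xy, rowvar=False))))
```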

  8. Design and simulation of programmable relational optoelectronic time-pulse coded processors as base elements for sorting neural networks

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2010-05-01

    In this paper we show that the biologically motivated concept of time-pulse encoding offers a set of advantages (a single methodological basis, universality, simplicity of tuning, learning, and programming, among others) in the creation and design of sensor systems with parallel input-output and processing for 2D-structure hybrid and next-generation neuro-fuzzy neurocomputers. We present design principles for programmable relational optoelectronic time-pulse coded processors based on continuous logic, order logic, and temporal wave processes. We consider a structure that performs analog signal extraction and the sorting of analog and time-pulse coded variables. We offer an optoelectronic realization of such a basic relational order-logic element, consisting of time-pulse coded photoconverters (pulse-width and pulse-phase modulators) with direct and complementary outputs, a sorting network of logical elements, and programmable commutation blocks. By simulation and experimental research, we estimate the technical parameters of devices and processors built on such base elements: optical input signal power of 0.2-20 uW, processing time of 1-10 us, supply voltage of 1-3 V, power consumption of 10-100 uW, extended functionality, and learning capability. We discuss possible rules and principles for learning and for programmable tuning to a required function or relational operation, and the realization of hardware blocks for modifications of such processors. We show that it is possible to create sorting machines, neural networks, and hybrid data-processing systems with nontraditional number systems and picture operands on the basis of such simple quasi-universal hardware blocks with flexible programmable tuning.

  9. Integrated orbit and attitude hardware-in-the-loop simulations for autonomous satellite formation flying

    NASA Astrophysics Data System (ADS)

    Park, Han-Earl; Park, Sang-Young; Kim, Sung-Woo; Park, Chandeok

    2013-12-01

    Development and experiment of an integrated orbit and attitude hardware-in-the-loop (HIL) simulator for autonomous satellite formation flying are presented. The integrated simulator system consists of an orbit HIL simulator for orbit determination and control, and an attitude HIL simulator for attitude determination and control. The integrated simulator involves four processes (orbit determination, orbit control, attitude determination, and attitude control), which interact with each other in the same way as actual flight processes do. Orbit determination is conducted by a relative navigation algorithm using double-difference GPS measurements based on the extended Kalman filter (EKF). Orbit control is performed by a state-dependent Riccati equation (SDRE) technique that is utilized as a nonlinear controller for the formation control problem. Attitude is determined from an attitude heading reference system (AHRS) sensor, and a proportional-derivative (PD) feedback controller is used to control the attitude HIL simulator using three momentum wheel assemblies. Integrated orbit and attitude simulations are performed for a formation reconfiguration scenario. By performing the four processes adequately, the desired formation reconfiguration from a baseline of 500-1000 m was achieved with meter-level position error and millimeter-level relative position navigation. This HIL simulation demonstrates the performance of the integrated HIL simulator and the feasibility of the applied algorithms in a real-time environment. Furthermore, the integrated HIL simulator system developed in the current study can be used as a ground-based testing environment to reproduce possible actual satellite formation operations.
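
    Of the four processes, the attitude control loop is the simplest to sketch: a proportional-derivative law commanding wheel torque from attitude error and body rate. Below is a single-axis toy version in Python; the gains and inertia are invented for illustration and are not the simulator's values.

```python
import numpy as np

KP, KD = 0.8, 2.5          # illustrative gains [N*m/rad], [N*m*s/rad]
J, dt = 5.0, 0.01          # body inertia [kg*m^2], control period [s]

def pd_torque(theta, theta_ref, omega):
    """Commanded wheel torque from attitude error and body rate."""
    return -KP * (theta - theta_ref) - KD * omega

theta, omega = np.deg2rad(10.0), 0.0   # initial pointing error
for _ in range(3000):                  # 30 s closed-loop toy integration
    tau = pd_torque(theta, 0.0, omega)
    omega += dt * tau / J              # J * omega_dot = tau
    theta += dt * omega
print(np.rad2deg(theta))               # settles near zero
```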

  10. Elementary process and meteor train spectra

    NASA Technical Reports Server (NTRS)

    Ovezgeldyev, O. G.

    1987-01-01

    Mechanisms of the excitation of individual spectral line radiation were studied experimentally and theoretically, and it was demonstrated that such processes as oxidation, resonant charge exchange, dissociative recombination and others play an important part in the chemistry of excited particles. The foundation was laid toward simulating the elementary processes of meteor physics. Having a number of advantages and possibilities, this method is likely to find wide use in the future.

  11. Advancements in Electromagnetic Wave Backscattering Simulations: Applications in Active Lidar Remote Sensing Involving Aerosols

    NASA Astrophysics Data System (ADS)

    Bi, L.

    2016-12-01

    Atmospheric remote sensing based on the Lidar technique fundamentally relies on knowledge of the backscattering of light by particulate matter in the atmosphere. This talk starts with a review of the current capabilities of electromagnetic wave scattering simulations to determine the backscattering optical properties of irregular particles, such as the backscatter coefficient and the depolarization ratio. This will be followed by a discussion of possible pitfalls in the relevant simulations. The talk will then be concluded with reports on the latest advancements in computational techniques. In addition, we summarize the laws of the backscattering optical properties of aerosols with respect to particle geometries, particle sizes, and mixing rules. These advancements will be applied to the analysis of Lidar observation data to reveal the state and possible microphysical processes of various aerosols.

  12. Hybrid-Lambda: simulation of multiple merger and Kingman gene genealogies in species networks and species trees.

    PubMed

    Zhu, Sha; Degnan, James H; Goldstien, Sharyn J; Eldon, Bjarki

    2015-09-15

    There has been increasing interest in coalescent models which admit multiple mergers of ancestral lineages, and in modeling hybridization and coalescence simultaneously. Hybrid-Lambda is a software package that simulates gene genealogies under multiple-merger and Kingman's coalescent processes within species networks or species trees. Hybrid-Lambda allows different coalescent processes to be specified for different populations, and allows time to be converted between generations and coalescent units by specifying a population size for each population. In addition, Hybrid-Lambda can generate simulated datasets, assuming the infinitely-many-sites mutation model, and compute the F_ST statistic. As an illustration, we apply Hybrid-Lambda to infer the time of subdivision of certain marine invertebrates under different coalescent processes. Hybrid-Lambda makes it possible to investigate biogeographic concordance among high-fecundity species exhibiting skewed offspring distributions.

  13. Processing of Lunar Soil Simulant for Space Exploration Applications

    NASA Technical Reports Server (NTRS)

    Sen, Subhayu; Ray, Chandra S.; Reddy, Ramana

    2005-01-01

    NASA's long-term vision for space exploration includes developing human habitats and conducting scientific investigations on planetary bodies, especially on the Moon and Mars. To reduce the required up-mass, the processing and utilization of planetary in-situ resources is recognized as an important element of this vision. Within this scope and context, we have undertaken a general effort aimed primarily at extracting and refining metals and developing glass, glass-ceramic, or traditional ceramic type materials using lunar soil simulants. In this paper we will present preliminary results of our effort on the carbothermal reduction of oxides for elemental extraction and on zone refining for obtaining high-purity metals. In addition, we will demonstrate the possibility of developing glasses from lunar soil simulant for fixing nuclear waste from potential nuclear power generators on planetary bodies. Compositional analysis, x-ray diffraction patterns and differential thermal analysis of processed samples will be presented.

  14. Simulation of Shock-Shock Interaction in Parsec-Scale Jets

    NASA Astrophysics Data System (ADS)

    Fromm, Christian M.; Perucho, Manel; Ros, Eduardo; Mimica, Petar; Savolainen, Tuomas; Lobanov, Andrei P.; Zensus, J. Anton

    The analysis of the radio light curves of the blazar CTA 102 during its 2006 flare revealed a possible interaction between a standing shock wave and a traveling one. In order to better understand this highly non-linear process, we used a relativistic hydrodynamic code to simulate the high-energy interaction and its related emission. The calculated synchrotron emission from these simulations showed an increase in the turnover flux density, Sm, and the turnover frequency, νm, during the interaction, and a decrease to their initial values after the passage of the traveling shock wave.

  15. Statistical palaeomagnetic field modelling and dynamo numerical simulation

    NASA Astrophysics Data System (ADS)

    Bouligand, C.; Hulot, G.; Khokhlov, A.; Glatzmaier, G. A.

    2005-06-01

    By relying on two numerical dynamo simulations for which such investigations are possible, we test the validity and sensitivity of a statistical palaeomagnetic field modelling approach known as the giant gaussian process (GGP) modelling approach. This approach is currently used to analyse palaeomagnetic data at times of stable polarity and infer some information about the way the main magnetic field (MF) of the Earth has been behaving in the past and has possibly been influenced by core-mantle boundary (CMB) conditions. One simulation has been run with homogeneous CMB conditions, the other with more realistic non-homogeneous symmetry-breaking CMB conditions. In both simulations, it is found that, as required by the GGP approach, the field behaves as a short-term memory process. Some severe non-stationarity is however found in the non-homogeneous case, leading to very significant departures of the Gauss coefficients from a Gaussian distribution, in contradiction with the assumptions underlying the GGP approach. A similar but less severe non-stationarity is found in the case of the homogeneous simulation, which happens to display a more Earth-like temporal behaviour than the non-homogeneous case. This suggests that a GGP modelling approach could nevertheless be applied to try and estimate the mean μ and covariance matrix γ(τ) (first- and second-order statistical moments) of the field produced by the geodynamo. A detailed study of both simulations is carried out to assess the possibility of detecting statistical symmetry-breaking properties of the underlying dynamo process by inspection of estimates of μ and γ(τ). As expected (because of the role of the rotation of the Earth in the dynamo process), those estimates reveal spherical symmetry-breaking properties. Equatorial symmetry-breaking properties are also detected in both simulations, showing that such symmetry-breaking properties can occur spontaneously under homogeneous CMB conditions. By contrast, axial symmetry breaking is detected only in the non-homogeneous simulation, testifying to the constraints imposed by the CMB conditions. The signature of this axial symmetry breaking is however found to be much weaker than the signature of equatorial symmetry breaking. We note that this could be the reason why only equatorial symmetry-breaking properties (in the form of the well-known axial quadrupole term in the time-averaged field) have unambiguously been found so far by analysing the real data. However, this could also be because those analyses have all assumed too simple a form for γ(τ) when attempting to estimate μ. Suggestions are provided to make sure future attempts at GGP modelling with real data are carried out in a more consistent and perhaps more efficient way.

  16. Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling

    NASA Astrophysics Data System (ADS)

    Vidovič, Luka; Milanič, Matija; Majaron, Boris

    2014-02-01

    Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of the bruise healing process. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying bruise healing processes is possible. The introduction of objective fitting enables an objective comparison between the simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and thus the inherent loss of information in the process. This approach enables us to determine the value of hemoglobin mass diffusivity, which is controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
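
    Objective fitting of this kind amounts to least-squares minimization of the residual between the simulated and measured signals over the model parameters. The Python sketch below shows the pattern with a deliberately fake forward model (pptr_model is a placeholder, not the paper's Monte Carlo plus heat-diffusion simulation).

```python
import numpy as np
from scipy.optimize import least_squares

def pptr_model(params, t):
    """Placeholder forward model; the real one couples Monte Carlo energy
    deposition with heat diffusion to produce the radiometric signal."""
    depth, amplitude = params
    return amplitude * np.exp(-t / (depth ** 2 + 0.05))

def residuals(params, t, measured):
    return pptr_model(params, t) - measured

t = np.linspace(0.01, 1.0, 200)
measured = pptr_model([0.3, 1.0], t) + 0.01 * np.random.default_rng(3).normal(size=t.size)
fit = least_squares(residuals, x0=[0.5, 0.8], args=(t, measured))
print(fit.x)   # recovered depth-like and amplitude parameters
```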

  17. Computer simulations and real-time control of ELT AO systems using graphical processing units

    NASA Astrophysics Data System (ADS)

    Wang, Lianqi; Ellerbroek, Brent

    2012-07-01

    The adaptive optics (AO) simulations at the Thirty Meter Telescope (TMT) have been carried out using the efficient, C-based multi-threaded adaptive optics simulator (MAOS, http://github.com/lianqiw/maos). By porting time-critical parts of MAOS to graphical processing units (GPU) using NVIDIA CUDA technology, we achieved a 10-fold speed-up for each GTX 580 GPU used, compared to a modern quad-core CPU. Each time step of a full-scale end-to-end simulation for the TMT narrow field infrared AO system (NFIRAOS) takes only 0.11 seconds on a desktop with two GTX 580s. We also demonstrate that the TMT minimum variance reconstructor can be assembled in matrix vector multiply (MVM) format in 8 seconds with 8 GTX 580 GPUs, meeting the TMT requirement for updating the reconstructor. Analysis shows that it is also possible to apply the MVM using 8 GTX 580s within the required latency.
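
    Once the reconstructor is assembled, applying it each frame is a single matrix-vector multiply, which is what makes GPUs attractive here. The Python sketch below shows the shape of that step with NumPy standing in for the GPU library (the same line runs on a GPU via, e.g., CuPy); the matrix dimensions are scaled-down placeholders, not NFIRAOS's actual sizes.

```python
import numpy as np

# Scaled-down stand-in sizes; the real reconstructor is far larger.
n_slopes, n_act = 10_000, 2_000
R = np.random.default_rng(7).standard_normal((n_act, n_slopes)).astype(np.float32)
s = np.random.default_rng(8).standard_normal(n_slopes).astype(np.float32)

a = R @ s   # DM actuator commands; on a GPU the same line runs via e.g. CuPy
```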

  18. Production Strategies for Production-Quality Parts for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Cawley, J. D.; Best, J. E.; Liu, Z.; Eckel, A. J.; Reed, B. D.; Fox, D. S.; Bhatt, R.; Levine, Stanley R. (Technical Monitor)

    2000-01-01

    Rapid prototyping processes (3D Systems' stereolithography and Sanders Prototyping's ModelMaker) are combined with gelcasting to produce high-quality silicon nitride components that were performance tested under simulated use conditions. Two types of aerospace components were produced: a low-force rocket thruster and a simulated airfoil section. The rocket was tested in a test stand using varying mixtures of H2 and O2, whereas the simulated airfoil was tested by subjecting it to a 0.3 Mach jet-fuel burner flame. Both parts performed successfully, demonstrating the usefulness of rapid prototyping in efforts to effect materials substitution. In addition, the simulated airfoil was used to explore the possibility of applying thermal/environmental barrier coatings and providing for internal cooling of ceramic parts. It is concluded that this strategy for processing offers the ceramic engineer all the flexibility normally associated with investment casting of superalloys.

  19. Waiting-time distributions of magnetic discontinuities: clustering or Poisson process?

    PubMed

    Greco, A; Matthaeus, W H; Servidio, S; Dmitruk, P

    2009-10-01

    Using solar wind data from the Advanced Composition Explorer spacecraft, with the support of Hall magnetohydrodynamic simulations, the waiting-time distributions of magnetic discontinuities have been analyzed. A possible phenomenon of clustering of these discontinuities is studied in detail. We perform a local Poisson analysis in order to establish whether these intermittent events are randomly distributed or not. Possible implications for the nature of solar wind discontinuities are discussed.

  20. Waiting-time distributions of magnetic discontinuities: Clustering or Poisson process?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greco, A.; Matthaeus, W. H.; Servidio, S.

    2009-10-15

    Using solar wind data from the Advanced Composition Explorer spacecraft, with the support of Hall magnetohydrodynamic simulations, the waiting-time distributions of magnetic discontinuities have been analyzed. A possible phenomenon of clustering of these discontinuities is studied in detail. We perform a local Poisson analysis in order to establish whether these intermittent events are randomly distributed or not. Possible implications for the nature of solar wind discontinuities are discussed.
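
    A local Poisson analysis of this kind asks whether the waiting times between successive discontinuities follow the exponential law of a memoryless process or show the heavy tail characteristic of clustering. A minimal Python sketch of that comparison follows; it is a generic illustration, not the authors' exact procedure.

```python
import numpy as np

def waiting_times(event_times):
    """Waiting times between successive discontinuities."""
    return np.diff(np.sort(np.asarray(event_times)))

def poisson_comparison(wt, bins=50):
    """Empirical waiting-time density vs. the exponential law rate*exp(-rate*t)
    expected for a homogeneous Poisson process; heavy tails suggest clustering."""
    rate = 1.0 / wt.mean()
    hist, edges = np.histogram(wt, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist, rate * np.exp(-rate * centers)
```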

  1. Experimental Simulations to Understand the Lunar and Martian Surficial Processes

    NASA Astrophysics Data System (ADS)

    Zhao, Y. Y. S.; Li, X.; Tang, H.; Li, Y.; Zeng, X.; Chang, R.; Li, S.; Zhang, S.; Jin, H.; Mo, B.; Li, R.; Yu, W.; Wang, S.

    2016-12-01

    In support of China's lunar and Mars exploration programs and beyond, our center is dedicated to understanding the surficial processes and environments of planetary bodies. Over the last several years, we have designed, built and optimized experimental simulation facilities and utilized them to test hypotheses and evaluate the relevant mechanisms under controlled conditions particularly relevant to the Moon and Mars. Among the fundamental questions to address, we emphasize five major areas: (1) Micrometeorite bombardment simulation to evaluate the formation mechanisms of np-Fe0, which has been found in lunar samples, and the possible sources of Fe. (2) Solar wind implantation simulation to evaluate alteration, amorphization, and OH or H2O formation on the surface of target minerals or rocks. (3) Dust mobility characteristics on the Moon and other planetary bodies, studied by exciting different types of dust particles and measuring their movements. (4) Mars basaltic soil simulant development (e.g., the Jining Martian Soil Simulant (JMSS-1)) and applications for scientific/engineering experiments. (5) Halogen (Cl and Br) and life-essential-element (C, H, O, N, P, and S) distribution and speciation on Mars during surficial processes such as sedimentary- and photochemical-related processes. Depending on the variables of interest, the simulation systems provide the flexibility to vary the source of energy, temperature, pressure, and ambient gas composition in the reaction chambers. Also, simulation products can be observed or analyzed in situ by various analyzer components inside the chamber, without interrupting the experimental conditions. In addition, the behavior of elements and isotopes during certain surficial processes (e.g., evaporation, dissolution, etc.) can be theoretically predicted by our theoretical geochemistry group with thermodynamic-kinetic calculation and modeling, which supports experiment design and result interpretation.

  2. High performance real-time flight simulation at NASA Langley

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1994-01-01

    In order to meet the stringent time-critical requirements of real-time man-in-the-loop flight simulation, computer processing operations must be deterministic and be completed in as short a time as possible. This includes simulation mathematical model computations and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, personnel at NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to a standard input/output system to provide for high-bandwidth, low-latency data acquisition and distribution. The Computer Automated Measurement and Control technology (IEEE standard 595) was extended to meet the performance requirements for real-time simulation. This technology extension increased the effective bandwidth by a factor of ten and increased the performance of modules necessary for simulator communications. This technology is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications of this technology are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC have completed the development of the use of supercomputers for simulation mathematical model computations to support real-time flight simulation. This includes the development of a real-time operating system and the development of specialized software and hardware for the CAMAC simulator network. This work, coupled with the use of an open systems software architecture, has advanced the state of the art in real-time flight simulation. The data acquisition technology innovation and experience with recent developments in this technology are described.

  3. Bioreactor tests preliminary to landfill in situ aeration: A case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raga, Roberto, E-mail: roberto.raga@unipd.it; Cossu, Raffaello

    Highlights: Carbon and nitrogen mass balances in aerated landfill simulation reactors; waste stabilization in aerated landfill simulation reactors; effect of temperature on biodegradation processes in aerated landfills. Abstract: Lab-scale tests in bioreactors were carried out in the framework of the characterization studies of a landfill where in situ aeration (possibly followed by landfill mining) had been proposed as part of a novel waste management strategy in a region in northern Italy. The tests were run to monitor the effects produced by aerobic conditions at different temperatures on waste sampled at different depths in the landfill, with a focus on carbon and nitrogen conversion during aeration. Temperatures ranging from 35 to 45 °C were chosen, in order to evaluate the possible inhibition of biodegradation processes (namely nitrification) at 45 °C in the landfill. The results obtained showed positive effects of the aeration on leachate quality and a significant reduction of waste biodegradability. Although a delay of biodegradation processes was observed in the reactor run at 45 °C, biodegradation rates increased after 2 months of aeration, providing very low values of the relevant parameters (as in the other aerated reactors) by the end of the study. Mass balances were carried out for TOC and N-NH₄⁺; the findings obtained were encouraging and provided evidence of the effectiveness of carbon and nitrogen conversion processes in the aerated landfill simulation reactors.

  4. Modelling the development of defects during composite reinforcements and prepreg forming

    PubMed Central

    Hamila, N.; Madeo, A.

    2016-01-01

    Defects in composite materials are created during manufacture to a large extent. To avoid them as much as possible, it is important that process simulations model the onset and the development of these defects. It is then possible to determine the manufacturing conditions that lead to the absence or to the controlled presence of such defects. Three types of defects that may appear during textile composite reinforcement or prepreg forming are analysed and modelled in this paper. Wrinkling is one of the most common flaws that occur during textile composite reinforcement forming processes. The influence of the different rigidities of the textile reinforcement is studied. The concept of ‘locking angle’ is questioned. A second type of unusual behaviour of fibrous composite reinforcements that can be seen as a flaw during their forming process is the onset of peculiar ‘transition zones’ that are directly related to the bending stiffness of the fibres. The ‘transition zones’ are due to the bending stiffness of fibres. The standard continuum mechanics of Cauchy is not sufficient to model these defects. A second gradient approach is presented that allows one to account for such unusual behaviours and to master their onset and development during forming process simulations. Finally, the large slippages that may occur during a preform forming are discussed and simulated with meso finite-element models used for macroscopic forming. This article is part of the themed issue ‘Multiscale modelling of the structural integrity of composite materials’. PMID:27242300

  5. Modelling the development of defects during composite reinforcements and prepreg forming.

    PubMed

    Boisse, P; Hamila, N; Madeo, A

    2016-07-13

    Defects in composite materials are created during manufacture to a large extent. To avoid them as much as possible, it is important that process simulations model the onset and the development of these defects. It is then possible to determine the manufacturing conditions that lead to the absence or to the controlled presence of such defects. Three types of defects that may appear during textile composite reinforcement or prepreg forming are analysed and modelled in this paper. Wrinkling is one of the most common flaws that occur during textile composite reinforcement forming processes. The influence of the different rigidities of the textile reinforcement is studied. The concept of 'locking angle' is questioned. A second type of unusual behaviour of fibrous composite reinforcements that can be seen as a flaw during their forming process is the onset of peculiar 'transition zones' that are directly related to the bending stiffness of the fibres. The 'transition zones' are due to the bending stiffness of fibres. The standard continuum mechanics of Cauchy is not sufficient to model these defects. A second gradient approach is presented that allows one to account for such unusual behaviours and to master their onset and development during forming process simulations. Finally, the large slippages that may occur during a preform forming are discussed and simulated with meso finite-element models used for macroscopic forming. This article is part of the themed issue 'Multiscale modelling of the structural integrity of composite materials'. © 2016 The Author(s).

  6. Integration of Modelling and Graphics to Create an Infrared Signal Processing Test Bed

    NASA Astrophysics Data System (ADS)

    Sethi, H. R.; Ralph, John E.

    1989-03-01

    The work reported in this paper was carried out as part of a contract with MoD (PE) UK. It considers the problems associated with realistic modelling of a passive infrared system in an operational environment. Ideally, all aspects of the system and environment should be integrated into a complete end-to-end simulation, but in the past limited computing power has prevented this. Recent developments in workstation technology and the increasing availability of parallel processing techniques make end-to-end simulation possible. However, the complexity and speed of such simulations mean difficulties for the operator in controlling the software and understanding the results. These difficulties can be greatly reduced by providing an extremely user-friendly interface and a very flexible, high-power, high-resolution colour graphics capability. Most system modelling is based on separate software simulation of the individual components of the system itself and its environment. These component models may have their own characteristic inbuilt assumptions and approximations, may be written in the language favoured by the originator, and may have a wide variety of input and output conventions and requirements. The models and their limitations need to be matched to the range of conditions appropriate to the operational scenario. A comprehensive set of databases needs to be generated by the component models, and these databases must be made readily available to the investigator. Performance measures need to be defined and displayed in some convenient graphics form. Some options are presented for combining available hardware and software to create an environment within which the models can be integrated, and which provides the required man-machine interface, graphics and computing power. The impact of massively parallel processing and artificial intelligence will be discussed. Parallel processing will make real-time end-to-end simulation possible and will greatly improve the graphical visualisation of the model output data. Artificial intelligence should help to enhance the man-machine interface.

  7. Development of an electromechanical principle for wet and dry milling

    NASA Astrophysics Data System (ADS)

    Halbedel, Bernd; Kazak, Oleg

    2018-05-01

    The paper presents a novel electromechanical principle for the wet and dry milling of different materials, in which the milling beads are moved by a time- and spatially varying magnetic field. A possible way to optimize the milling process in such a milling machine, by simulating the vector gradient distribution of the electromagnetic field in the process chamber, is presented. The mathematical model and simulation methods, based on standard software packages, are worked out. The results of numerical simulations and experimental measurements of the electromagnetic field in the working chamber of a developed and manufactured laboratory plant correlate well with each other. Using the obtained operating parameters, dry milling experiments with crushed cement clinker and wet milling experiments with organic agents were performed in the laboratory plant, and the results are discussed here.

  8. Considerations for Reporting Finite Element Analysis Studies in Biomechanics

    PubMed Central

    Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason; Tadepalli, Srinivas C.; Morrison, Tina M.

    2012-01-01

    Simulation-based medicine and the development of complex computer models of biological structures is becoming ubiquitous for advancing biomedical engineering and clinical research. Finite element analysis (FEA) has been widely used in the last few decades to understand and predict biomechanical phenomena. Modeling and simulation approaches in biomechanics are highly interdisciplinary, involving novice and skilled developers in all areas of biomedical engineering and biology. While recent advances in model development and simulation platforms offer a wide range of tools to investigators, the decision making process during modeling and simulation has become more opaque. Hence, reliability of such models used for medical decision making and for driving multiscale analysis comes into question. Establishing guidelines for model development and dissemination is a daunting task, particularly with the complex and convoluted models used in FEA. Nonetheless, if better reporting can be established, researchers will have a better understanding of a model’s value and the potential for reusability through sharing will be bolstered. Thus, the goal of this document is to identify resources and considerate reporting parameters for FEA studies in biomechanics. These entail various levels of reporting parameters for model identification, model structure, simulation structure, verification, validation, and availability. While we recognize that it may not be possible to provide and detail all of the reporting considerations presented, it is possible to establish a level of confidence with selective use of these parameters. More detailed reporting, however, can establish an explicit outline of the decision-making process in simulation-based analysis for enhanced reproducibility, reusability, and sharing. PMID:22236526

  9. How we remember what we can do

    PubMed Central

    Declerck, Gunnar

    2015-01-01

    According to the motor simulation theory, the knowledge we possess of what we can do is based on simulation mechanisms triggered by an off-line activation of the brain areas involved in motor control. Action capabilities memory does not work by storing some content, but consists in the capacity, rooted in sensory-motor systems, to reenact off-line action sequences exhibiting the range of our powers. In this paper, I present several arguments from cognitive neuropsychology, but also first-person analysis of experience, against this hypothesis. The claim that perceptual access to affordances is mediated by motor simulation processes rests on a misunderstanding of what affordances are, and comes up against a computational reality principle. Motor simulation cannot provide access to affordances because (i) the affordances we are aware of at each moment are too many for their realization to be simulated by the brain and (ii) affordances are not equivalent to currently or personally feasible actions. The explanatory significance of the simulation theory must then be revised downwards compared to what is claimed by most of its advocates. One additional challenge is to determine the prerequisite, in terms of cognitive processing, for the motor simulation mechanisms to work. To overcome the limitations of the simulation theory, I propose a new approach: the direct content specification hypothesis. This hypothesis states that, at least for the most basic actions of our behavioral repertoire, the action possibilities we are aware of through perception are directly specified by perceptual variables characterizing the content of our experience. The cognitive system responsible for the perception of action possibilities is consequently far more direct, in terms of cognitive processing, than what is stated by the simulation theory. To support this hypothesis I review evidence from current neuropsychological research, in particular data suggesting a phenomenon of ‘fossilization’ of affordances. Fossilization can be defined as a gap between the capacities that are treated as available by the cognitive system and the capacities this system really has at its disposal. These considerations do not mean that motor simulation cannot contribute to explain how we gain perceptual knowledge of what we can do based on the memory of our past performances. However, when precisely motor simulation plays a role and what it is for exactly currently remain largely unknown. PMID:26507953

  10. Laser Simulations of the Destructive Impact of Nuclear Explosions on Hazardous Asteroids

    NASA Astrophysics Data System (ADS)

    Aristova, E. Yu.; Aushev, A. A.; Baranov, V. K.; Belov, I. A.; Bel'kov, S. A.; Voronin, A. Yu.; Voronich, I. N.; Garanin, R. V.; Garanin, S. G.; Gainullin, K. G.; Golubinskii, A. G.; Gorodnichev, A. V.; Denisova, V. A.; Derkach, V. N.; Drozhzhin, V. S.; Ericheva, I. A.; Zhidkov, N. V.; Il'kaev, R. I.; Krayukhin, A. A.; Leonov, A. G.; Litvin, D. N.; Makarov, K. N.; Martynenko, A. S.; Malinov, V. I.; Mis'ko, V. V.; Rogachev, V. G.; Rukavishnikov, A. N.; Salatov, E. A.; Skorochkin, Yu. V.; Smorchkov, G. Yu.; Stadnik, A. L.; Starodubtsev, V. A.; Starodubtsev, P. V.; Sungatullin, R. R.; Suslov, N. A.; Sysoeva, T. I.; Khatunkin, V. Yu.; Tsoi, E. S.; Shubin, O. N.; Yufa, V. N.

    2018-01-01

    We present the results of preliminary experiments at laser facilities in which the processes of the undeniable destruction of stony asteroids (chondrites) in space by nuclear explosions on the asteroid surface are simulated based on the principle of physical similarity. We present the results of comparative gasdynamic computations of a model nuclear explosion on the surface of a large asteroid and computations of the impact of a laser pulse on a miniature asteroid simulator, confirming the similarity of the key processes in the full-scale and model cases. The technology of fabricating miniature mockups with mechanical properties close to those of stony asteroids is described. For mini-mockups 4-10 mm in size, differing in shape and impact conditions, we have made an experimental estimate of the energy threshold for the undeniable destruction of a mockup and investigated the parameters of its fragmentation at laser energies up to 500 J. The results obtained confirm the possibility of an experimental determination of the criteria for the destruction of asteroids of various types by a nuclear explosion in laser experiments. We show that the undeniable destruction of a large asteroid is possible at attainable nuclear explosion energies on its surface.

  11. Assessment of ECG and respiration recordings from simulated emergency landings of ultra light aircraft.

    PubMed

    Bruna, Ondřej; Levora, Tomáš; Holub, Jan

    2018-05-08

    Pilots of ultra light aircraft have limited training resources, but with the use of low-cost simulators it might be possible to carry out and assess parts of their training on the ground. The purpose of this paper is to examine the possibility of inducing stress on a low-cost flight simulator. Stress is assessed from the electrocardiogram and respiration. An engine failure during flight served as the stress-inducing stimulus. For one flight, pilots had access to an emergency navigation system. Some statistically significant changes were recorded in parameters related to breathing frequency. Although no significant change was observed in the ECG parameters, there appears to be an effect on the respiration parameters. Physiological signals processed with analysis of variance suggest that the moment of engine failure and the approach for landing affected the average breathing frequency. The presence of the navigation interface does not appear to have a significant effect on the pilots.

  12. Parallelizing Timed Petri Net simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1993-01-01

    The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPN's) was studied. It was recognized that complex system development tools often transform system descriptions into TPN's or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPN's be as automatic as possible, to admit the possibility of the parallelization being embedded in the system design tool. Later years of the grant were devoted to examining the problem of joint performance and reliability analysis, to explore whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of parallelizing TPN's automatically for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis are two-fold; it was shown that Monte Carlo simulation, with importance sampling, offers promise of joint analysis in the context of a single tool, and methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast, were developed. However, very much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.

  13. Requirements for future development of small scale rainfall simulators

    NASA Astrophysics Data System (ADS)

    Iserloh, Thomas; Ries, Johannes B.; Seeger, Manuel

    2013-04-01

    Rainfall simulation with small scale simulators is a method used worldwide to assess the generation of overland flow, soil erosion, infiltration and interrelated processes such as soil sealing, crusting, splash and redistribution of solids and solutes. Following the outcomes of the project "Comparability of simulation results of different rainfall simulators as input data for soil erosion modelling (Deutsche Forschungsgemeinschaft - DFG, Project No. Ri 835/6-1)" and the "International Rainfall Simulator Workshop 2011" in Trier, the necessity for further technical improvements of simulators and strategies towards an adaption of designs and methods becomes obvious. Uniform measurements of artificially generated rainfall and comparative measurements on a prepared bare fallow with rainfall simulators used by European research groups showed limitations of the comparability of the results. The following requirements, essential for small portable rainfall simulators, were identified: (I) Low and efficient water consumption for use in areas with water shortage, (II) easy handling and control of test conditions, (III) homogeneous spatial rainfall distribution, (IV) best possible drop spectrum (physically), (V) reproducibility and knowledge of spatial distribution and drop spectrum, (VI) easy and fast training of operators to obtain reproducible experiments and (VII) good mobility and easy installation for use in remote areas and in regions where highly erosive rainfall events are rare or irregular. The presentation discusses possibilities for a common use of identical plot designs, rainfall intensities and nozzles.

  14. Dispersal and fallout simulations for urban consequences management (u)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grinstein, Fernando F; Wachtor, Adam J; Nelson, Matt

    2010-01-01

    Hazardous chemical, biological, or radioactive releases from leaks, spills, fires, or blasts may occur (intentionally or accidentally) in urban environments during warfare or as part of terrorist attacks on military bases or other facilities. The associated contaminant dispersion is complex and semi-chaotic. Urban predictive simulation capabilities can have a direct impact in many threat-reduction areas of interest, including urban sensor placement and threat analysis, contaminant transport (CT) effects on the surrounding civilian population (dosages, evacuation, shelter-in-place), and education and training of rescue teams and services. Detailed simulations of the various processes involved are in principle possible, but generally not fast. Predicting urban airflow accompanied by CT presents extremely challenging requirements. Crucial technical issues include simulating turbulent fluid and particulate transport, initial and boundary condition modeling incorporating a consistent stratified urban boundary layer with realistic wind fluctuations, and post-processing of the simulation results for practical consequences management. Relevant fluid dynamic processes to be simulated include detailed energetic and contaminant sources, complex building vortex shedding and flows in recirculation zones, and modeling of particle distributions, including particulate fallout, as well as deposition, re-suspension and evaporation. Other issues include modeling building damage effects due to eventual blasts and addressing appropriate regional and atmospheric data reduction.

  15. Removal and Transformation of Estrogens During the Coagulation Process

    EPA Science Inventory

    Estrogenic compounds have been shown to be present in surface waters, leading to concerns over the possible presence of endocrine disrupting compounds in finished drinking waters. Bench-scale studies (jar tests) simulating coagulation were conducted to evaluate the ability of tw...

  16. Terrestrial ecosystem process model Biome-BGCMuSo v4.0: summary of improvements and new modeling possibilities

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Barcza, Zoltán; Marjanović, Hrvoje; Zorana Ostrogović Sever, Maša; Dobor, Laura; Gelybó, Györgyi; Fodor, Nándor; Pintér, Krisztina; Churkina, Galina; Running, Steven; Thornton, Peter; Bellocchi, Gianni; Haszpra, László; Horváth, Ferenc; Suyker, Andrew; Nagy, Zoltán

    2016-12-01

    The process-based biogeochemical model Biome-BGC was enhanced to improve its ability to simulate carbon, nitrogen, and water cycles of various terrestrial ecosystems under contrasting management activities. Biome-BGC version 4.1.1 was used as a base model. Improvements included addition of new modules such as the multilayer soil module, implementation of processes related to soil moisture and nitrogen balance, soil-moisture-related plant senescence, and phenological development. Vegetation management modules with annually varying options were also implemented to simulate management practices of grasslands (mowing, grazing), croplands (ploughing, fertilizer application, planting, harvesting), and forests (thinning). New carbon and nitrogen pools have been defined to simulate yield and soft stem development of herbaceous ecosystems. The model version containing all developments is referred to as Biome-BGCMuSo (Biome-BGC with multilayer soil module; in this paper, Biome-BGCMuSo v4.0 is documented). Case studies on a managed forest, cropland, and grassland are presented to demonstrate the effect of model developments on the simulation of plant growth as well as on carbon and water balance.

  17. Composite Study Of Aerosol Long-Range Transport Events From East Asia And North America

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Waliser, D. E.; Guan, B.; Xavier, P.; Petch, J.; Klingaman, N. P.; Woolnough, S.

    2011-12-01

    While the Madden-Julian Oscillation (MJO) exerts pronounced influences on global climate and weather systems, current general circulation models (GCMs) exhibit rather limited capability in representing this prominent tropical variability mode. Meanwhile, the fundamental physics of the MJO are still elusive. Given the central role of the diabatic heating for prevailing MJO theories and demands for reducing the model deficiencies in simulating the MJO, a global model inter-comparison project on diabatic processes and vertical heating structure associated with the MJO has been coordinated through a joint effort by the WCRP-WWRP/THORPEX YOTC MJO Task Force and GEWEX GASS Program. In this presentation, progress of this model inter-comparison project will be reported, with main focus on climate simulations from about 27 atmosphere-only and coupled GCMs. Vertical structures of heating and diabatic processes associated with the MJO based on multi-model simulations will be presented along with their reanalysis and satellite estimate counterparts. Key processes possibly responsible for a realistic simulation of the MJO, including moisture-convection interaction, gross moist stability, ocean coupling, and surface heat flux, will be discussed.

  18. Molecular recognition of naphthalene diimide ligands by telomeric quadruplex-DNA: the importance of the protonation state and mediated hydrogen bonds.

    PubMed

    Spinello, A; Barone, G; Grunenberg, J

    2016-01-28

    In-depth Monte Carlo conformational scans in combination with molecular dynamics (MD) simulations and electronic structure calculations were applied in order to study the molecular recognition process between tetrasubstituted naphthalene diimide (ND) guests and G-quadruplex (G4) DNA receptors. ND guests are a promising class of telomere stabilizers and are therefore used in novel anticancer therapeutics. Though several ND guests have been studied experimentally in the past, their protonation state under physiological conditions is still unclear. Based on chemical intuition, in the case of N-methyl-piperazine substitution, different protonation states are possible and might play a crucial role in the molecular recognition process by G4-DNA: depending on the proton concentration, different nitrogen atoms of the N-methyl-piperazine might (or might not) be protonated. This was considered in our simulations through a case-by-case analysis, since the process of molecular recognition is determined by the possible donor or acceptor positions. The results of our simulations show that the electrostatic interactions between the ND ligands and the G4 receptor are maximized when the terminal nitrogen atoms are protonated, forming compact ND-G4 complexes inside the grooves. The influence of the different protonation states on the ability to form hydrogen bonds with the sugar-phosphate backbone, as well as the importance of mediated vs. direct hydrogen bonding, was analyzed in detail by MD and relaxed force constant (compliance constant) simulations.

  19. A compositional reservoir simulator on distributed memory parallel computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rame, M.; Delshad, M.

    1995-12-31

    This paper presents the application of distributed memory parallel computers to field-scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general-purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results of the distributed memory computing performance of the parallel simulator are presented for field-scale applications such as a tracer flood and a polymer flood. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented.
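
    The following toy sketch illustrates the decomposition scheme described above in one dimension: the grid is split evenly across notional processors, each subdomain is padded with ghost cells, and ghost values are exchanged before every stencil update. All sizes and the stencil itself are illustrative, and the "communication" here is a serial stand-in for message passing.

    ```python
    import numpy as np

    # 1-D toy analogue of the described decomposition: split the grid
    # evenly among NP "processors", pad each subdomain with one ghost
    # cell per side, and exchange ghost values before each three-point
    # stencil (smoothing) update.
    NP, NSTEP = 4, 50
    u = np.sin(np.linspace(0.0, np.pi, 64))              # global field
    sub = [np.pad(c, 1) for c in np.array_split(u, NP)]  # add ghost cells

    for _ in range(NSTEP):
        # "communication" phase: copy neighbour edge values into ghosts
        for p in range(NP):
            sub[p][0]  = sub[p - 1][-2] if p > 0      else sub[p][1]
            sub[p][-1] = sub[p + 1][1]  if p < NP - 1 else sub[p][-2]
        # local compute phase: stencil update on interior cells only
        for p in range(NP):
            sub[p][1:-1] = 0.5 * sub[p][1:-1] + 0.25 * (sub[p][:-2] + sub[p][2:])

    u_new = np.concatenate([s[1:-1] for s in sub])       # reassembled field
    ```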

  20. Polishing tool and the resulting TIF for three variable machine parameters as input for the removal simulation

    NASA Astrophysics Data System (ADS)

    Schneider, Robert; Haberl, Alexander; Rascher, Rolf

    2017-06-01

    The trend in the optics industry shows that it is increasingly important to be able to manufacture complex lens geometries with a high level of precision. Beyond a certain limit in the required shape accuracy of optical workpieces, the processing changes from two-dimensional to point-shaped processing, and it is very important that the process is as stable as possible during point-shaped processing. To ensure stability, usually only one process parameter is varied during processing; commonly this parameter is the feed rate, which corresponds to the dwell time. In the research project ArenA-FOi (Application-oriented analysis of resource-saving and energy-efficient design of industrial facilities for the optical industry), a contacting point-shaped procedure is used, and it is examined closely whether changing several process parameters during processing is meaningful. The ADAPT tool in size R20 from Satisloh AG is used, which is commercially available. The behavior of the tool is tested under constant conditions in the MCP 250 CNC by OptoTech GmbH. A series of experiments should enable the TIF (tool influence function) to be determined using three variable parameters. Furthermore, the maximum error frequency that can be processed is calculated as an example for one parameter set and serves as an outlook for further investigations. The test results serve as the basis for the later removal simulation, which must be able to deal with a variable TIF. This topic has already been successfully implemented in another research project of the Institute for Precision Manufacturing and High-Frequency Technology (IPH), and thus this algorithm can be used. The next step is the useful implementation of the collected knowledge: the TIF must be selected on the basis of the measured data, and it is important to know the error frequencies in order to select the optimal TIF. Thus, it is possible to compare the simulated results with real measurement data and to carry out a revision. From this point onwards, it is possible to evaluate the potential of this approach; in the ideal case it will be researched further and later find its way into production.
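
    Downstream, removal simulations of this dwell-time type commonly model the predicted material removal as the convolution of the TIF with the dwell-time map. The sketch below illustrates that relation with a Gaussian stand-in for the measured ADAPT R20 footprint; all parameter values are assumed for illustration.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    # Predicted material removal = TIF (*) dwell-time map. A Gaussian
    # TIF stands in for the experimentally determined tool footprint;
    # grid size, TIF width and peak removal rate are assumed values.
    n, sigma, peak_rate = 64, 3.0, 0.5                  # grid, TIF width, rate
    y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
    tif = peak_rate * np.exp(-(x**2 + y**2) / (2 * sigma**2))

    dwell = np.random.default_rng(1).uniform(0.5, 1.5, (n, n))  # s per point
    removal = fftconvolve(dwell, tif, mode="same")              # removed depth
    print(removal.shape, removal.max())
    ```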

  1. Using parallel computing for the display and simulation of the space debris environment

    NASA Astrophysics Data System (ADS)

    Möckel, M.; Wiedemann, C.; Flegel, S.; Gelhaus, J.; Vörsmann, P.; Klinkrad, H.; Krag, H.

    2011-07-01

    Parallelism is becoming the leading paradigm in today's computer architectures. In order to take full advantage of this development, new algorithms have to be specifically designed for parallel execution while many old ones have to be upgraded accordingly. One field in which parallel computing has been firmly established for many years is computer graphics. Calculating and displaying three-dimensional computer generated imagery in real time requires complex numerical operations to be performed at high speed on a large number of objects. Since most of these objects can be processed independently, parallel computing is applicable in this field. Modern graphics processing units (GPUs) have become capable of performing millions of matrix and vector operations per second on multiple objects simultaneously. As a side project, a software tool is currently being developed at the Institute of Aerospace Systems that provides an animated, three-dimensional visualization of both actual and simulated space debris objects. Due to the nature of these objects it is possible to process them individually and independently from each other. Therefore, an analytical orbit propagation algorithm has been implemented to run on a GPU. By taking advantage of all its processing power a huge performance increase, compared to its CPU-based counterpart, could be achieved. For several years efforts have been made to harness this computing power for applications other than computer graphics. Software tools for the simulation of space debris are among those that could profit from embracing parallelism. With recently emerged software development tools such as OpenCL it is possible to transfer the new algorithms used in the visualization outside the field of computer graphics and implement them, for example, into the space debris simulation environment. This way they can make use of parallel hardware such as GPUs and Multi-Core-CPUs for faster computation. In this paper the visualization software will be introduced, including a comparison between the serial and the parallel method of orbit propagation. Ways of how to use the benefits of the latter method for space debris simulation will be discussed. An introduction to OpenCL will be given as well as an exemplary algorithm from the field of space debris simulation.
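
    A NumPy stand-in for the per-object parallelism described above: because each debris object propagates independently, one analytical propagation step is a pure array operation over the whole population (on a GPU, each object would map to one work item). The simplified circular-orbit propagation and population size are illustrative, not the tool's actual propagator.

    ```python
    import numpy as np

    # Propagate a large debris population in one vectorized step: every
    # object is independent, so advancing all mean anomalies at once is
    # a single array operation (the GPU analogue: one thread per object).
    MU = 398600.4418e9                        # Earth's GM, m^3/s^2
    rng = np.random.default_rng(2)
    N = 100_000
    a  = rng.uniform(6.7e6, 8.0e6, N)         # semi-major axes (circular orbits)
    M0 = rng.uniform(0, 2 * np.pi, N)         # initial mean anomalies

    def propagate(t):
        n = np.sqrt(MU / a**3)                # mean motions, rad/s
        M = M0 + n * t                        # advance all objects at once
        return a * np.cos(M), a * np.sin(M)   # in-plane positions

    x, y = propagate(3600.0)                  # whole population after one hour
    ```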

  2. Using parallel computing for the display and simulation of the space debris environment

    NASA Astrophysics Data System (ADS)

    Moeckel, Marek; Wiedemann, Carsten; Flegel, Sven Kevin; Gelhaus, Johannes; Klinkrad, Heiner; Krag, Holger; Voersmann, Peter

    Parallelism is becoming the leading paradigm in today's computer architectures. In order to take full advantage of this development, new algorithms have to be specifically designed for parallel execution while many old ones have to be upgraded accordingly. One field in which parallel computing has been firmly established for many years is computer graphics. Calculating and displaying three-dimensional computer generated imagery in real time requires complex numerical operations to be performed at high speed on a large number of objects. Since most of these objects can be processed independently, parallel computing is applicable in this field. Modern graphics processing units (GPUs) have become capable of performing millions of matrix and vector operations per second on multiple objects simultaneously. As a side project, a software tool is currently being developed at the Institute of Aerospace Systems that provides an animated, three-dimensional visualization of both actual and simulated space debris objects. Due to the nature of these objects it is possible to process them individually and independently from each other. Therefore, an analytical orbit propagation algorithm has been implemented to run on a GPU. By taking advantage of all its processing power a huge performance increase, compared to its CPU-based counterpart, could be achieved. For several years efforts have been made to harness this computing power for applications other than computer graphics. Software tools for the simulation of space debris are among those that could profit from embracing parallelism. With recently emerged software development tools such as OpenCL it is possible to transfer the new algorithms used in the visualization outside the field of computer graphics and implement them, for example, into the space debris simulation environment. This way they can make use of parallel hardware such as GPUs and Multi-Core-CPUs for faster computation. In this paper the visualization software will be introduced, including a comparison between the serial and the parallel method of orbit propagation. Ways of how to use the benefits of the latter method for space debris simulation will be discussed. An introduction to OpenCL will be given as well as an exemplary algorithm from the field of space debris simulation.

  3. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young

    2014-08-15

    In this study, we attempted to determine the possibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to determine the current at each coil in the 2-coil experiment. Based on the results, we could establish the feasibility of multiple ICP sources, owing to a direct change of impedance with current and a saturation of impedance due to the skin-depth effect. However, a helicon plasma source is difficult to adapt to multiple sources because of the continual change of real impedance due to mode transition and the low uniformity of the B-field confinement. As a result, it is expected that ICP can be adapted to multiple sources for large-area processes.

  4. Process simulation of modified dry grind ethanol plant with recycle of pretreated and enzymatically hydrolyzed distillers' grains.

    PubMed

    Kim, Youngmi; Mosier, Nathan; Ladisch, Michael R

    2008-08-01

    Distillers' grains (DG), a co-product of the dry grind ethanol process, are an excellent source of supplemental protein in livestock feed. Studies have shown that, due to their high polymeric sugar content and ease of hydrolysis, distillers' grains have potential as an additional source of fermentable sugars for ethanol fermentation. The benefit of processing the distillers' grains to extract fermentable sugars lies in an increased ethanol yield without significant modification of the current dry grind technology. Three potential configurations of process alternatives, in which pretreated and hydrolyzed distillers' grains are recycled for an enhanced overall ethanol yield, are proposed and discussed in this paper based on liquid hot water (LHW) pretreatment of the distillers' grains. Possible limitations of each proposed process are also discussed. This paper presents a compositional analysis of distillers' grains, as well as a simulation of the modified dry grind processes with recycle of distillers' grains. Simulated material balances for the modified dry grind processes are established based on the base-case assumptions. These balances are compared to the conventional dry grind process in terms of ethanol yield, composition of the co-products, and accumulation of fermentation inhibitors. Results show that a 14% higher ethanol yield is achievable by processing and hydrolyzing the distillers' grains for additional fermentable sugars, as compared to the conventional dry grind process. Accumulation of fermentation by-products and inhibitory components in the proposed process is predicted to be 2-5 times higher than in the conventional dry grind process. The impact of fermentation inhibitors is reviewed and discussed. The final eDDGS (enhanced dried distillers' grains) from the modified processes has 30-40% greater protein content per mass than DDGS, and its potential as a value-added co-product is also analyzed. While the case studies used to illustrate the process simulation are based on LHW-pretreated DG, the process simulation itself provides a framework for evaluating the impact of other pretreatments.

  5. Studies on thermal decomposition behaviors of polypropylene using molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Huang, Jinbao; He, Chao; Tong, Hong; Pan, Guiying

    2017-11-01

    Polypropylene (PP) is one of the main components of waste plastics. In order to understand the mechanism of PP thermal decomposition, the pyrolysis behaviour of PP was simulated from 300 to 1000 K under periodic boundary conditions by the molecular dynamics method, based on the AMBER force field. The simulation results show that the pyrolysis process of PP can be divided into three stages: a low-temperature pyrolysis stage, an intermediate-temperature stage and a high-temperature pyrolysis stage. PP pyrolysis proceeds mainly by random main-chain scission, and the possible formation mechanisms of the major pyrolysis products were analyzed.

  6. Modeling the rate-controlled sorption of hexavalent chromium

    USGS Publications Warehouse

    Grove, D.B.; Stollenwerk, K.G.

    1985-01-01

    Sorption of chromium VI on the iron-oxide- and hydroxide-coated surface of alluvial material was numerically simulated with rate-controlled reactions. Reaction kinetics and diffusional processes, in the form of film, pore, and particle diffusion, were simulated and compared with experimental results. The use of empirically calculated rate coefficients for diffusion through the reacting surface was found to simulate experimental data; pore or particle diffusion is believed to be a possible rate-controlling mechanism. The use of rate equations to predict conservative transport and rate- and local-equilibrium-controlled reactions was shown to be feasible.
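
    A minimal sketch of a rate-controlled sorption model of the kind described, assuming first-order exchange toward a linear Kd isotherm in a closed batch system; the coefficients are illustrative, not the study's fitted values.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Rate-limited sorption: solute C exchanges with sorbed mass S at
    # rate k toward the linear equilibrium S_eq = Kd * C; the batch
    # mass balance couples the two. All parameter values are assumed.
    k, Kd, rho_b, theta = 0.05, 2.0, 1.6, 0.35   # 1/h, L/kg, kg/L, porosity

    def rhs(t, y):
        C, S = y
        dS = k * (Kd * C - S)                    # rate-controlled exchange
        dC = -(rho_b / theta) * dS               # solute mass balance
        return [dC, dS]

    sol = solve_ivp(rhs, (0, 200), [1.0, 0.0], dense_output=True)
    print(sol.y[:, -1])                          # approach to equilibrium
    ```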

  7. Analysis of the possibilities and limits of the Moldflow method

    NASA Astrophysics Data System (ADS)

    Brierre, M.

    1982-01-01

    The Moldflow information and computation service is presented. Moldflow is a computer program and data bank available as a computer aid to dimensioning thermoplastic injection molding equipment and processes. It is based on the simultaneous solution of thermal and rheological equations and is intended to completely simulate the injection process. The Moldflow system is described and algorithms are discussed, based on Moldflow listings.

  8. Exemplifying the Effects of Parameterization Shortcomings in the Numerical Simulation of Geological Energy and Mass Storage

    NASA Astrophysics Data System (ADS)

    Dethlefsen, Frank; Tilmann Pfeiffer, Wolf; Schäfer, Dirk

    2016-04-01

    Numerical simulations of hydraulic, thermal, geomechanical, or geochemical (THMC) processes in the subsurface have been conducted for decades. Often, such simulations begin with a parameter set that is as realistic as possible; a base scenario is then calibrated against field observations, and finally scenario simulations can be performed, for instance to forecast the system behavior after varying input data. In the context of subsurface energy and mass storage, however, such model calibrations based on field data are often not available, as these storage operations have not yet been carried out. Consequently, the numerical models rely solely on the initially selected parameter set, and uncertainties arising from a lack of parameter values or of process understanding may be neither perceivable nor quantifiable. Therefore, conducting THMC simulations in the context of energy and mass storage deserves a particular review of the model parameterization and its input data, and such a review so far hardly exists to the required extent. Variability or aleatory uncertainty exists for geoscientific parameter values in general, and parameters for which numerous data points are available, such as aquifer permeabilities, may be described statistically, thereby exhibiting statistical uncertainty. In this case, sensitivity analyses can be conducted to quantify the uncertainty in the simulation resulting from varying the parameter. For other parameters, the lack of data quantity and quality implies a fundamental change in the ongoing processes when the parameter value is varied in numerical scenario simulations. As an example of such a scenario uncertainty, varying the capillary entry pressure, one of the multiphase flow parameters, can either allow or completely inhibit the penetration of an aquitard by gas. As a last example, the uncertainty of cap-rock fault permeabilities, and consequently of potential leakage rates of stored gases into shallow compartments, is regarded by the authors of this study as recognized ignorance, as no realistic approach exists to determine this parameter and values are best guesses only. In addition to these aleatory uncertainties, an equivalent classification is possible for rating epistemic uncertainties, which describe the degree to which processes are understood, such as the geochemical and hydraulic effects following potential gas intrusions from deeper reservoirs into shallow aquifers. As an outcome of this grouping of uncertainties, prediction errors of scenario simulations can be calculated by sensitivity analyses if the uncertainties are identified as statistical. However, if scenario uncertainties exist, or recognized ignorance has to be attested to a parameter or process in question, the outcomes of simulations depend mainly on the modeler's decisions in choosing parameter values or interpreting which processes occur. In that case, the informative value of numerical simulations is limited by ambiguous simulation results, which cannot be refined without improving the geoscientific database through laboratory or field studies on a longer-term basis, so that the effects of subsurface use may be predicted realistically. This discussion, amended by a compilation of available geoscientific data for parameterizing such simulations, will be presented in this study.

  9. GPU-Based Interactive Exploration and Online Probability Maps Calculation for Visualizing Assimilated Ocean Ensembles Data

    NASA Astrophysics Data System (ADS)

    Hoteit, I.; Hollt, T.; Hadwiger, M.; Knio, O. M.; Gopalakrishnan, G.; Zhan, P.

    2016-02-01

    Ocean reanalyses and forecasts are nowadays generated by combining ensemble simulations with data assimilation techniques. Most of these techniques resample the ensemble members after each assimilation cycle. Tracking behavior over time, such as all possible paths of a particle in an ensemble vector field, becomes very difficult, as the number of combinations rises exponentially with the number of assimilation cycles. In general a single possible path is not of interest, but only the probability that any point in space might be reached by a particle at some point in time. We present an approach using probability-weighted piecewise particle trajectories to allow for interactive probability mapping. This is achieved by binning the domain and splitting the tracing process into the individual assimilation cycles, so that particles that fall into the same bin after a cycle can be treated as a single particle with a larger probability as input for the next cycle. As a result we lose the possibility to track individual particles, but can create probability maps for any desired seed at interactive rates. The technique is integrated in an interactive visualization system that enables the visual analysis of the particle traces side by side with other forecast variables, such as the sea surface height, and their corresponding behavior over time. By harnessing the power of modern graphics processing units (GPUs) for visualization as well as computation, our system allows the user to browse through the simulation ensembles in real time, view specific parameter settings or simulation models, and move between different spatial or temporal regions without delay. In addition our system provides advanced visualizations to highlight the uncertainty, or to show the complete distribution of the simulations at user-defined positions over the complete time series of the domain.
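
    The following sketch illustrates the binning idea under simplified assumptions: in each cycle, the mass in every occupied bin is pushed through an ensemble of displacements and merged by destination bin, so the cost per cycle stays bounded while the probability map evolves. The random-walk displacements stand in for tracing one particle per ensemble member through that cycle's velocity fields.

    ```python
    import numpy as np

    # Binned probability-map propagation: after each assimilation cycle,
    # all probability mass landing in the same spatial bin is merged
    # into one weighted "particle" before the next cycle starts.
    rng = np.random.default_rng(3)
    NBIN, NCYCLE, NENS = 50, 10, 20
    prob = np.zeros(NBIN); prob[NBIN // 2] = 1.0      # seed point

    for _ in range(NCYCLE):
        new = np.zeros(NBIN)
        for b in np.flatnonzero(prob):                # occupied bins only
            # ensemble of displacements: proxy for tracing one particle
            # per ensemble member through this cycle
            dest = (b + rng.integers(-2, 3, NENS)) % NBIN
            np.add.at(new, dest, prob[b] / NENS)      # merge mass per bin
        prob = new                                    # map for next cycle

    print(prob.sum(), prob.max())                     # mass is conserved
    ```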

  10. Analysis of batch-related influences on injection molding processes viewed in the context of electro plating quality demands

    NASA Astrophysics Data System (ADS)

    Siepmann, Jens P.; Wortberg, Johannes; Heinzler, Felix A.

    2016-03-01

    The injection molding process is strongly influenced by the viscosity of the material, and the viscosity of the polymer changes from one material batch to another. Together with the processing parameters, the initial condition of the material defines the process and product quality. A high percentage of the technical polymers processed by injection molding are refined in a follow-up production step, for example electroplating. Processing optimized for electroplating often requires avoiding high shear stresses by using low injection speeds and pressures. Batch-to-batch differences in viscosity therefore matter especially in the quality-related low-shear-rate region. These differences and their quality-related influence can be investigated by detailed rheological analysis and by process simulation based on adapted material models. Differences in viscosity between batches can be detected by measurements with high-pressure capillary rheometers or, at low shear rates, with oscillatory rheometers; a combination of both measurement techniques is made possible by the Cox-Merz relation. The detected differences in the rheological behavior of the two batches are summarized in two material-describing model approaches and added to the simulation. In this paper the results of processing simulations with standard filling parameters are presented for two ABS batches. Part-quality-defining quantities such as temperature, pressure and shear stress are investigated, and the influence of batch variations is pointed out with respect to electroplating quality demands. Furthermore, the results of simulations with a new quality-related process control are presented and compared to standard processing.
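
    For reference, the Cox-Merz relation invoked above identifies the steady-shear viscosity with the magnitude of the complex viscosity at matching angular frequency, which is what allows the capillary (high-shear) and oscillatory (low-shear) data sets to be merged:

    ```latex
    % Cox-Merz rule used to merge the two measurement ranges
    \eta(\dot{\gamma}) \approx \lvert \eta^{*}(\omega) \rvert
      \Big|_{\omega = \dot{\gamma}},
    \qquad
    \lvert \eta^{*}(\omega) \rvert
      = \frac{\sqrt{G'^{2}(\omega) + G''^{2}(\omega)}}{\omega}
    ```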

  11. Observability of ionospheric space-time structure with ISR: A simulation study

    NASA Astrophysics Data System (ADS)

    Swoboda, John; Semeter, Joshua; Zettergren, Matthew; Erickson, Philip J.

    2017-02-01

    The sources of error from electronically steerable array (ESA) incoherent scatter radar (ISR) systems are investigated both theoretically and with the use of an open-source ISR simulator, developed by the authors, called Simulator for ISR (SimISR). The main sources of error incorporated in the simulator are the statistical uncertainty, which arises from the nature of the measurement mechanism, and the inherent space-time ambiguity of the sensor. SimISR can take a field of plasma parameters, parameterized by time and space, and create simulated ISR data at the scattered electric field (i.e., complex receiver voltage) level, subsequently processing these data to show possible reconstructions of the original parameter field. To demonstrate general utility, we show a number of simulation examples, with two cases using data from a self-consistent multifluid transport model. Results highlight the significant influence of the forward model of the ISR process and the resulting statistical uncertainty on plasma parameter measurements, and the core experiment design trade-offs that must be made when planning observations. These conclusions further underscore the utility of this class of measurement simulator as a design tool for more optimal experiment design efforts using flexible ESA-class ISR systems.

  12. Virtual Reality: A New Learning Environment.

    ERIC Educational Resources Information Center

    Ferrington, Gary; Loge, Kenneth

    1992-01-01

    Discusses virtual reality (VR) technology and its possible uses in military training, medical education, industrial design and development, the media industry, and education. Three primary applications of VR in the learning process--visualization, simulation, and construction of virtual worlds--are described, and pedagogical and moral issues are…

  13. TESSIM: a simulator for the Athena-X-IFU

    NASA Astrophysics Data System (ADS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; den Hartog, R. H.; Bandler, S. R.; de Plaa, J.; den Herder, J.-W. A.

    2016-07-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).

  14. TESSIM: A Simulator for the Athena-X-IFU

    NASA Technical Reports Server (NTRS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; Den Hartog, R. H.; Bandler, S. R.; De Plaa, J.; hide

    2016-01-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end to end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).

  15. Solar energetic particle transport and the possibility of wave generation by streaming electrons

    NASA Astrophysics Data System (ADS)

    Strauss, R. D. T.; le Roux, J. A.

    2017-12-01

    After being accelerated close to the Sun, solar energetic particles (SEPs) are transported (mainly) along the turbulent interplanetary magnetic field. In this study, we simulate the propagation of 100 keV electrons as they are scattered in the interplanetary medium. A consequence of these wave-particle interactions is the possible modification (either growth or damping) of the background turbulence by anisotropic SEP electron beams. This process was thought to be negligible, and therefore neglected in past modeling approaches. However, recent observations and modeling by Agueda and Lario (2016) suggest that wave generation may be significant and is therefore included and evaluated in our present model. Our results suggest that wave amplification by streaming SEP electrons is indeed possible and may even significantly alter the background turbulent field. However, the simulations show that this process is much too weak to produce observable effects at Earth's orbit, but such effects may well be observed in future by spacecraft closer to the Sun, presenting an intriguing observational opportunity for either the Solar Orbiter or the Parker Solar Probe spacecraft. Lastly, we note that the level of perpendicular diffusion may also play an important role in determining the effectiveness of the wave growth process. Reference: Agueda, N. and Lario, D. Release History and Transport Parameters of Relativistic Solar Electrons Inferred From Near-the-Sun In Situ Observations, ApJ, 829, 131, 2016.

  16. Molecular Dynamic Studies of Particle Wake Potentials in Plasmas

    NASA Astrophysics Data System (ADS)

    Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren

    2010-11-01

    Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (P^3M) code ddcMD to perform these simulations. As a starting point in our study, we examined the wake of a particle passing through a plasma. In this poster, we compare the wake observed in 3D ddcMD simulations with that predicted by Vlasov theory and those observed in the electrostatic PIC code BEPS where the cell size was reduced to 0.03 λD.

  17. Optimization of the production process using virtual model of a workspace

    NASA Astrophysics Data System (ADS)

    Monica, Z.

    2015-11-01

    Optimization of the production process is an element of the design cycle consisting of: problem definition, modelling, simulation, optimization and implementation. Without the use of simulation techniques, the only thing that can be achieved is a larger or smaller improvement of the process, not its optimization (i.e., the best result that can be obtained for the conditions under which the process operates). Optimization comprises management actions that ultimately bring savings in time, resources and raw materials and improve the performance of a specific process, whether a service or a manufacturing process. The savings are generated by improving and increasing the efficiency of the processes, and optimization consists primarily of organizational activities that require very little investment or rely solely on changing the organization of work. Modern companies operating in a market economy show a significant increase in interest in modern methods of production management and services. This trend is due to high competitiveness: companies that want to succeed are forced to continually modify the way they manage and to respond flexibly to changing demand. Modern methods of production management not only imply a stable position of the company in its sector, but also improve health and safety within the company and contribute to the implementation of more efficient rules for standardizing work. This is why the paper presents the application of an environment such as Siemens NX to create a virtual model of a production system and to simulate and optimize its operation. The analyzed system is a robotized workcell consisting of machine tools, industrial robots, conveyors, auxiliary equipment and buffers. The control program realizing the main task in the virtual workcell can be defined in the environment. Using this tool, it is possible to optimize both the object trajectory and the cooperation process.

  18. Estimation of Kubo number and correlation length of fluctuating magnetic fields and pressure in BOUT++ edge pedestal collapse simulation

    NASA Astrophysics Data System (ADS)

    Kim, Jaewook; Lee, W.-J.; Jhang, Hogun; Kaang, H. H.; Ghim, Y.-C.

    2017-10-01

    Stochastic magnetic fields are thought to be one of the possible mechanisms for anomalous transport of density, momentum and heat across the magnetic field lines. The Kubo number and the Chirikov parameter are quantifications of this stochasticity, and previous studies show that perpendicular transport strongly depends on the magnetic Kubo number (MKN). If the MKN is smaller than one, the diffusion process follows the Rechester-Rosenbluth model, whereas if it is larger than one, percolation theory dominates the diffusion process. Estimation of the Kubo number therefore plays an important role in understanding diffusion caused by stochastic magnetic fields. However, spatially localized experimental measurement of fluctuating magnetic fields in a tokamak is difficult, and we attempt to estimate MKNs using BOUT++ simulation data with pedestal collapse. In addition, we calculate the correlation lengths of the fluctuating pressure and the Chirikov parameters to investigate the variation of the correlation lengths in the simulation. We then discuss how one may experimentally estimate MKNs.
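
    For orientation, one commonly used definition of the magnetic Kubo number in this context combines the fluctuation strength with the ratio of parallel to perpendicular correlation lengths of the fluctuating field (we assume this form; the paper may use a variant):

    ```latex
    % Magnetic Kubo number: fluctuation amplitude times the ratio of
    % parallel to perpendicular correlation lengths of \delta B
    K_m = \frac{\delta B}{B_0}\,\frac{\lambda_{\parallel}}{\lambda_{\perp}}
    ```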

  19. Simulation models and designs for advanced Fischer-Tropsch technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, G.N.; Kramer, S.J.; Tam, S.S.

    1995-12-31

    Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry-bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.

  20. Brian hears: online auditory processing using vectorization over channels.

    PubMed

    Fontaine, Bertrand; Goodman, Dan F M; Benichoux, Victor; Brette, Romain

    2011-01-01

    The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in "Brian Hears," a library for the spiking neural network simulator package "Brian." This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations.
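
    A minimal sketch of the channel-vectorization idea described above: time advances in a scalar Python loop, but each step updates the state of all channels with one array operation, so the interpretation overhead is amortized over thousands of channels. A bank of one-pole low-pass filters with per-channel coefficients stands in for the gammatone-style cochlear filters.

    ```python
    import numpy as np

    # Vectorization over channels: the loop runs over time samples, but
    # every iteration updates all 3000 channel states in one vector op.
    # One-pole low-pass filters stand in for the cochlear filterbank.
    fs = 44_100
    cf = np.logspace(np.log10(20), np.log10(20_000), 3000)  # centre freqs, Hz
    alpha = np.exp(-2 * np.pi * cf / fs)                    # per-channel pole

    x = np.random.default_rng(4).standard_normal(1024)      # input sound
    y = np.zeros((len(x), len(cf)))                         # channel outputs
    state = np.zeros(len(cf))
    for n, xn in enumerate(x):                              # loop over samples
        state = alpha * state + (1 - alpha) * xn            # all channels at once
        y[n] = state
    ```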

  1. Parallel VLSI architecture emulation and the organization of APSA/MPP

    NASA Technical Reports Server (NTRS)

    Odonnell, John T.

    1987-01-01

    The Applicative Programming System Architecture (APSA) combines an applicative language interpreter with a novel parallel computer architecture that is well suited for Very Large Scale Integration (VLSI) implementation. The Massively Parallel Processor (MPP) can simulate VLSI circuits by allocating one processing element in its square array to an area on a square VLSI chip. As long as there are not too many long data paths, the MPP can simulate a VLSI clock cycle very rapidly. The APSA circuit contains a binary tree with a few long paths and many short ones. A skewed H-tree layout allows every processing element to simulate a leaf cell and up to four tree nodes, with no loss in parallelism. Emulation of a key APSA algorithm on the MPP resulted in performance 16,000 times faster than a Vax. This speed will make it possible for the APSA language interpreter to run fast enough to support research in parallel list processing algorithms.

  2. Using reinforcement learning to examine dynamic attention allocation during reading.

    PubMed

    Liu, Yanping; Reichle, Erik D; Gao, Ding-Guo

    2013-01-01

    A fundamental question in reading research concerns whether attention is allocated strictly serially, supporting lexical processing of one word at a time, or in parallel, supporting concurrent lexical processing of two or more words (Reichle, Liversedge, Pollatsek, & Rayner, 2009). The origins of this debate are reviewed. We then report three simulations to address this question using artificial reading agents (Liu & Reichle, 2010; Reichle & Laurent, 2006) that learn to dynamically allocate attention to 1-4 words to "read" as efficiently as possible. These simulation results indicate that the agents strongly preferred serial word processing, although they occasionally attended to more than one word concurrently. The reason for this preference is discussed, along with implications for the debate about how humans allocate attention during reading. Copyright © 2013 Cognitive Science Society, Inc.

  3. Virtual Control Systems Environment (VCSE)

    ScienceCinema

    Atkins, Will

    2018-02-14

    Will Atkins, a Sandia National Laboratories computer engineer discusses cybersecurity research work for process control systems. Will explains his work on the Virtual Control Systems Environment project to develop a modeling and simulation framework of the U.S. electric grid in order to study and mitigate possible cyberattacks on infrastructure.

  4. Numerical simulation of deformation and failure processes of a complex technical object under impact loading

    NASA Astrophysics Data System (ADS)

    Kraus, E. I.; Shabalin, I. I.; Shabalin, T. I.

    2018-04-01

    The main points in the development of numerical tools for the simulation of deformation and failure of complex technical objects under nonstationary conditions of extreme loading are presented. The possibility of extending the dynamic method for the construction of difference grids to the 3D case is shown. A 3D realization of the discrete-continuum approach to the deformation and failure of complex technical objects is carried out. The efficiency of the existing software package for 3D modelling is shown.

  5. Eddy current NDE performance demonstrations using simulation tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maurice, L.; Costan, V.; Guillot, E.

    2013-01-25

    To carry out performance demonstrations of the Eddy-Current NDE processes applied on French nuclear power plants, EDF is studying the possibility of using simulation tools as an alternative to measurements on steam generator tube mock-ups. This paper focuses on the strategy followed by EDF to assess and use code_Carmel3D and Civa in the case of Eddy-Current NDE of wear problems that may appear in the U-shape region of steam generator tubes due to the rubbing of anti-vibration bars.

  6. Critical threshold behavior for steady-state internal transport barriers in burning plasmas.

    PubMed

    García, J; Giruzzi, G; Artaud, J F; Basiuk, V; Decker, J; Imbeaux, F; Peysson, Y; Schneider, M

    2008-06-27

    Burning tokamak plasmas with internal transport barriers are investigated by means of integrated modeling simulations. The barrier sustainment in steady state, differently from the barrier formation process, is found to be characterized by a critical behavior, and the critical number of the phase transition is determined. Beyond a power threshold, alignment of self-generated and noninductively driven currents occurs and steady state becomes possible. This concept is applied to simulate a steady-state scenario within the specifications of the International Thermonuclear Experimental Reactor.

  7. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal is to utilize the hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load-imbalance issues. High-resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, and present new ideas and solutions that greatly aided our understanding. The software employed is based on Avizo Green, ParaView and SimVis, as well as our own software extensions.

  8. The Role of Molecular Dynamics Potential of Mean Force Calculations in the Investigation of Enzyme Catalysis.

    PubMed

    Yang, Y; Pan, L; Lightstone, F C; Merz, K M

    2016-01-01

    The potential of mean force simulations, widely applied in Monte Carlo or molecular dynamics simulations, are useful tools to examine the free energy variation as a function of one or more specific reaction coordinate(s) for a given system. Implementation of the potential of mean force in the simulations of biological processes, such as enzyme catalysis, can help overcome the difficulties of sampling specific regions on the energy landscape and provide useful insights to understand the catalytic mechanism. The potential of mean force simulations usually require many, possibly parallelizable, short simulations instead of a few extremely long simulations and, therefore, are fairly manageable for most research facilities. In this chapter, we provide detailed protocols for applying the potential of mean force simulations to investigate enzymatic mechanisms for several different enzyme systems. © 2016 Elsevier Inc. All rights reserved.
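
    At its core, a potential of mean force is obtained from the sampled distribution of the reaction coordinate via F(xi) = -kT ln P(xi). The toy sketch below applies this relation to unbiased samples; production protocols of the kind the chapter describes add window biases (umbrella sampling) and reweighting schemes such as WHAM, which are omitted here.

    ```python
    import numpy as np

    # PMF from sampled reaction-coordinate values: F(xi) = -kT ln P(xi).
    # Mock unbiased samples stand in for trajectory data; real workflows
    # combine biased windows via WHAM or similar reweighting.
    kT = 0.593                                   # kcal/mol near 298 K
    rng = np.random.default_rng(5)
    xi = rng.normal(1.5, 0.2, 200_000)           # mock coordinate samples

    hist, edges = np.histogram(xi, bins=80, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    F = -kT * np.log(hist, where=hist > 0,
                     out=np.full_like(hist, np.inf))
    F -= F.min()                                 # shift minimum to zero
    ```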

  9. Numerical Simulation of Hydrothermal Salt Separation Process and Analysis and Cost Estimating of Shipboard Liquid Waste Disposal

    DTIC Science & Technology

    2007-06-01

    possible means to improve a variety of processes: supercritical water in steam Rankine cycles (fossil-fuel powered plants), supercritical carbon ... dioxide and supercritical water in advanced nuclear power plants, and oxidation in supercritical water for use in destroying toxic military wastes and...destruction technologies are installed in a class of ship. Additionally, the properties of one waste water destruction medium, supercritical

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moryakov, A. V., E-mail: sailor@yauza.ru; Pylyov, S. S.

    This paper presents the formulation of the problem and the methodical approach for solving large systems of linear differential equations describing nonstationary processes with the use of CUDA technology; this approach is implemented in the ANGEL program. Results for a test problem on transport of radioactive products over loops of a nuclear power plant are given. The possibilities for the use of the ANGEL program for solving various problems that simulate arbitrary nonstationary processes are discussed.
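
    The problem class described is a large linear system dx/dt = Ax. The small CPU sketch below solves such a system via the matrix exponential; the point of the work above is mapping this kind of dense linear algebra onto the GPU with CUDA. Matrix, size, and initial inventory here are illustrative only.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Linear nonstationary process dx/dt = A x, solved exactly at time t
    # by x(t) = expm(A t) x0. A is made diagonally dominant so the
    # system is stable; values are illustrative.
    rng = np.random.default_rng(6)
    n = 200
    A = rng.normal(0, 0.1, (n, n))
    A -= np.diag(np.abs(A).sum(axis=1))      # enforce stability
    x0 = np.zeros(n); x0[0] = 1.0            # initial inventory in one node

    x_t = expm(A * 5.0) @ x0                 # state after t = 5 time units
    print(x_t[:5])
    ```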

  11. A new Scheme for ATLAS Trigger Simulation using Legacy Code

    NASA Astrophysics Data System (ADS)

    Galster, Gorm; Stelzer, Joerg; Wiedenmann, Werner

    2014-06-01

    Analyses at the LHC which search for rare physics processes or determine with high precision Standard Model parameters require accurate simulations of the detector response and the event selection processes. The accurate determination of the trigger response is crucial for the determination of overall selection efficiencies and signal sensitivities. For the generation and the reconstruction of simulated event data, the most recent software releases are usually used to ensure the best agreement between simulated data and real data. For the simulation of the trigger selection process, however, ideally the same software release that was deployed when the real data were taken should be used. This potentially requires running software dating many years back. Having a strategy for running old software in a modern environment thus becomes essential when data simulated for past years start to present a sizable fraction of the total. We examined the requirements and possibilities for such a simulation scheme within the ATLAS software framework and successfully implemented a proof-of-concept simulation chain. One of the greatest challenges was the choice of a data format which promises long term compatibility with old and new software releases. Over the time periods envisaged, data format incompatibilities are also likely to emerge in databases and other external support services. Software availability may become an issue, when e.g. the support for the underlying operating system might stop. In this paper we present the encountered problems and developed solutions, and discuss proposals for future development. Some ideas reach beyond the retrospective trigger simulation scheme in ATLAS as they also touch more generally aspects of data preservation.

  12. Characteristics of traffic flow at a non-signalized intersection in the framework of game theory

    NASA Astrophysics Data System (ADS)

    Fan, Hongqiang; Jia, Bin; Tian, Junfang; Yun, Lifen

    2014-12-01

    At a non-signalized intersection, some vehicles violate the traffic rules to pass the intersection as soon as possible. These behaviors may cause many traffic conflicts and even traffic accidents. In this paper, a simulation model is proposed to investigate the effects of these behaviors at a non-signalized intersection. Vehicle movement is simulated by a cellular automaton (CA) model, and game theory is introduced to simulate the intersection dynamics. Two types of driver participate in the game process: cooperators (C) and defectors (D). Cooperators obey the traffic rules, but defectors do not. A transition may occur while a cooperator is waiting before the intersection; the critical value of the waiting time follows a Weibull distribution. One transition regime is found in the phase diagram. The simulation results illustrate the applicability of the proposed model and reveal a number of interesting insights into intersection management, including that the existence of defectors benefits the capacity of the intersection but reduces its safety.
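
    A toy version of the behavioral rule described above: each waiting cooperator turns defector once its waiting time exceeds a personal threshold drawn from a Weibull distribution. Shape and scale values are assumed for illustration, not taken from the paper.

    ```python
    import numpy as np

    # Cooperator-to-defector transition: a driver defects once its
    # waiting time exceeds a Weibull-distributed patience threshold.
    rng = np.random.default_rng(7)
    N_DRIVERS, SHAPE, SCALE = 1000, 1.5, 20.0          # assumed values

    threshold = SCALE * rng.weibull(SHAPE, N_DRIVERS)  # per-driver patience, s
    is_defector = np.zeros(N_DRIVERS, dtype=bool)      # all start cooperative

    for t in range(60):                                # one minute of waiting
        is_defector |= threshold < t                   # C -> D transitions
        if t % 20 == 0:
            print(t, is_defector.mean())               # share of defectors
    ```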

  13. Studies of Particle Wake Potentials in Plasmas

    NASA Astrophysics Data System (ADS)

    Ellis, Ian; Graziani, Frank; Glosli, James; Strozzi, David; Surh, Michael; Richards, David; Decyk, Viktor; Mori, Warren

    2011-10-01

    Fast Ignition studies require a detailed understanding of electron scattering, stopping, and energy deposition in plasmas with variable values for the number of particles within a Debye sphere. Presently there is disagreement in the literature concerning the proper description of these processes. Developing and validating proper descriptions requires studying the processes using first-principle electrostatic simulations and possibly including magnetic fields. We are using the particle-particle particle-mesh (PPPM) code ddcMD and the particle-in-cell (PIC) code BEPS to perform these simulations. As a starting point in our study, we examine the wake of a particle passing through a plasma in 3D electrostatic simulations performed with ddcMD and with BEPS using various cell sizes. In this poster, we compare the wakes we observe in these simulations with each other and predictions from Vlasov theory. Prepared by LLNL under Contract DE-AC52-07NA27344 and by UCLA under Grant DE-FG52-09NA29552.

  14. Integrating Multibody Simulation and CFD: toward Complex Multidisciplinary Design Optimization

    NASA Astrophysics Data System (ADS)

    Pieri, Stefano; Poloni, Carlo; Mühlmeier, Martin

This paper describes the use of integrated multidisciplinary analysis and optimization of a race car model on a predefined circuit. The objective is the definition of the most efficient geometric configuration that can guarantee the lowest lap time. To carry out this study it was necessary to interface the design optimization software modeFRONTIER with the following software packages: CATIA v5, a three-dimensional CAD package, used for the definition of the parametric geometry; ADAMS/Motorsport, a multibody dynamic simulation package; IcemCFD, a mesh generator, for the automatic generation of the CFD grid; and CFX, a Navier-Stokes code, for the prediction of the fluid-dynamic forces. The process integration makes it possible to compute, for each geometric configuration, a set of aerodynamic coefficients that are then used in the multibody simulation to compute the lap time. Finally, an automatic optimization procedure is started and the lap time minimized. The whole process is executed on a Linux cluster running CFD simulations in parallel.
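
    The coupling pattern described, in which a geometry parameter is pushed through CFD to obtain aerodynamic coefficients that feed a lap-time model inside an optimization loop, can be sketched schematically. The functions below are hypothetical toy stand-ins for the CATIA/IcemCFD/CFX/ADAMS chain, with made-up coefficients; only the structure of the loop reflects the paper.

        # Schematic of the integration loop (placeholder functions stand in for
        # the CAD/mesh/CFD/multibody tools; the real chain is driven by
        # modeFRONTIER). Illustrative only.

        def run_cfd(wing_angle):
            # stand-in for mesh generation + Navier-Stokes solve: returns toy
            # downforce and drag coefficients as functions of the wing angle
            cl = 1.0 + 0.05 * wing_angle
            cd = 0.6 + 0.002 * wing_angle**2
            return cl, cd

        def lap_time(cl, cd):
            # stand-in for the multibody lap simulation: more downforce helps
            # cornering, more drag hurts the straights (toy trade-off)
            return 90.0 - 4.0 * cl + 12.0 * cd

        best = min(range(0, 21), key=lambda a: lap_time(*run_cfd(a)))
        print("best wing angle:", best, "deg,",
              "lap:", round(lap_time(*run_cfd(best)), 2), "s")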

  15. A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.

    1997-01-01

This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.

  16. Analysis of Operating Modes of Stand-Alone Series Controller of Power Flows for Overhead Power Transmission Lines

    NASA Astrophysics Data System (ADS)

    Astashev, M. G.; Panfilov, D. I.; Seregin, D. A.; Chernyshev, A. A.

    2017-12-01

The features of using a bridge voltage inverter in small-size stand-alone series controllers of power flows (PFSC) for overhead power transmission lines (OPTL) are examined. The basic processes in the converter during transient and steady-state modes are analyzed, and the basic relations for calculating the electromagnetic processes, both with and without accounting for energy losses in the circuit, are derived. A simulation model of the converter is proposed that makes it possible to study its operating modes during the formation of the reactance introduced into the overhead power transmission line. The results of simulating the operating modes of the PFSC are presented.

  17. Influence of the power law index on the fiber breakage during injection molding by numerical simulations

    NASA Astrophysics Data System (ADS)

    Desplentere, Frederik; Six, Wim; Bonte, Hilde; Debrabandere, Eric

    2013-04-01

In predictive engineering for polymer processes, the proper prediction of material microstructure from known processing conditions and constituent material properties is a critical step toward properly predicting bulk properties in the finished composite. Operating within the context of long-fiber thermoplastics (LFT, length > 15 mm), this investigation concentrates on the influence of the power law index on the final fiber length distribution within the injection molded part. To this end, the Autodesk Simulation Moldflow Insight Scandium 2013 software has been used, in which a fiber breakage algorithm is available from this release onward. Using virtual material data with realistic viscosity levels makes it possible to separate the influence of the power law index on fiber breakage from the other material and process parameters. Applying standard settings for the fiber breakage parameters results in an obvious influence on the fiber length distribution through the thickness of the part and also as a function of position in the part. Finally, the influence of the shear rate constant within the fiber breakage model has been investigated, illustrating the possibility of fitting the virtual fiber length distribution to experimental data where available.

  18. From cellulose to kerogen: molecular simulation of a geological process.

    PubMed

    Atmani, Lea; Bichara, Christophe; Pellenq, Roland J-M; Van Damme, Henri; van Duin, Adri C T; Raza, Zamaan; Truflandier, Lionel A; Obliger, Amaël; Kralert, Paul G; Ulm, Franz J; Leyssale, Jean-Marc

    2017-12-01

The process by which organic matter decomposes deep underground to form petroleum and its underlying kerogen matrix has so far remained a no man's land to theoreticians, largely because of the geological (millions of years) timescale associated with the process. Using reactive molecular dynamics and an accelerated simulation framework, the replica exchange molecular dynamics method, we simulate the full transformation of cellulose into kerogen and its associated fluid phase under prevailing geological conditions. We observe, in sequence, the fragmentation of the cellulose crystal and the production of water, the development of an unsaturated aliphatic macromolecular phase, and its aromatization. The composition of the solid residue along the maturation pathway strictly follows what is observed for natural type III kerogen and for artificially matured samples under confined conditions. After expulsion of the fluid phase, the obtained microporous kerogen possesses the structure, texture, density, porosity and stiffness observed for mature type III kerogen and for a microporous carbon obtained by saccharose pyrolysis at low temperature. As expected for this variety of precursor, the main resulting hydrocarbon is methane. The present work thus demonstrates that molecular simulations can now be used to assess, almost quantitatively, such complex chemical processes as petrogenesis in fossil reservoirs and, more generally, the possible conversion of any natural product into bio-sourced materials and/or fuel.

  19. Embodied simulation in exposure-based therapies for posttraumatic stress disorder—a possible integration of cognitive behavioral theories, neuroscience, and psychoanalysis

    PubMed Central

    Peri, Tuvia; Gofman, Mordechai; Tal, Shahar; Tuval-Mashiach, Rivka

    2015-01-01

    Exposure to the trauma memory is the common denominator of most evidence-based interventions for posttraumatic stress disorder (PTSD). Although exposure-based therapies aim to change associative learning networks and negative cognitions related to the trauma memory, emotional interactions between patient and therapist have not been thoroughly considered in past evaluations of exposure-based therapy. This work focuses on recent discoveries of the mirror-neuron system and the theory of embodied simulation (ES). These conceptualizations may add a new perspective to our understanding of change processes in exposure-based treatments for PTSD patients. It is proposed that during exposure to trauma memories, emotional responses of the patient are transferred to the therapist through ES and then mirrored back to the patient in a modulated way. This process helps to alleviate the patient's sense of loneliness and enhances his or her ability to exert control over painful, trauma-related emotional responses. ES processes may enhance the integration of clinical insights originating in psychoanalytic theories—such as holding, containment, projective identification, and emotional attunement—with cognitive behavioral theories of learning processes in the alleviation of painful emotional responses aroused by trauma memories. These processes are demonstrated through a clinical vignette from an exposure-based therapy with a trauma survivor. Possible clinical implications for the importance of face-to-face relationships during exposure-based therapy are discussed. PMID:26593097

  20. Embodied simulation in exposure-based therapies for posttraumatic stress disorder-a possible integration of cognitive behavioral theories, neuroscience, and psychoanalysis.

    PubMed

    Peri, Tuvia; Gofman, Mordechai; Tal, Shahar; Tuval-Mashiach, Rivka

    2015-01-01

    Exposure to the trauma memory is the common denominator of most evidence-based interventions for posttraumatic stress disorder (PTSD). Although exposure-based therapies aim to change associative learning networks and negative cognitions related to the trauma memory, emotional interactions between patient and therapist have not been thoroughly considered in past evaluations of exposure-based therapy. This work focuses on recent discoveries of the mirror-neuron system and the theory of embodied simulation (ES). These conceptualizations may add a new perspective to our understanding of change processes in exposure-based treatments for PTSD patients. It is proposed that during exposure to trauma memories, emotional responses of the patient are transferred to the therapist through ES and then mirrored back to the patient in a modulated way. This process helps to alleviate the patient's sense of loneliness and enhances his or her ability to exert control over painful, trauma-related emotional responses. ES processes may enhance the integration of clinical insights originating in psychoanalytic theories-such as holding, containment, projective identification, and emotional attunement-with cognitive behavioral theories of learning processes in the alleviation of painful emotional responses aroused by trauma memories. These processes are demonstrated through a clinical vignette from an exposure-based therapy with a trauma survivor. Possible clinical implications for the importance of face-to-face relationships during exposure-based therapy are discussed.

  1. Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Zhong; Lai, Ying-Cheng

    2018-03-01

Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real-world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulate its dynamical behavior with high precision.
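
    A toy version of the sparse-reconstruction step conveys the idea: if each node updates with a probability set by a local field that is linear in the states of its few neighbors, an L1-penalized regression of a node's next state on the current network state recovers that node's coupling row. The sketch below uses numpy and scikit-learn's Lasso as the sparse solver; the Glauber-style dynamics and all parameters are illustrative assumptions, not the SDBM construction itself.

        import numpy as np
        from sklearn.linear_model import Lasso

        # Toy illustration (not the authors' SDBM code): recover one node's
        # sparse coupling row from observed binary dynamics via sparse regression.
        rng = np.random.default_rng(0)
        n, T = 20, 400
        w_true = np.zeros(n)
        w_true[rng.choice(n, 3, replace=False)] = 1.0   # 3 hidden links

        s = rng.choice([-1.0, 1.0], size=(T, n))         # observed network states
        field = s @ w_true                               # local field on node i
        p_up = 1.0 / (1.0 + np.exp(-2.0 * field))        # Glauber-style update prob.
        s_next = np.where(rng.random(T) < p_up, 1.0, -1.0)

        # Regress the node's next state on the states; L1 enforces sparsity
        lasso = Lasso(alpha=0.05).fit(s, s_next)
        recovered = np.flatnonzero(np.abs(lasso.coef_) > 0.1)
        print("true links:", np.flatnonzero(w_true), "recovered:", recovered)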

  2. Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics.

    PubMed

    Chen, Yu-Zhong; Lai, Ying-Cheng

    2018-03-01

Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real-world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulate its dynamical behavior with high precision.

  3. Validation of landsurface processes in the AMIP models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, T J

The Atmospheric Model Intercomparison Project (AMIP) is a commonly accepted protocol for testing the performance of the world's atmospheric general circulation models (AGCMs) under common specifications of radiative forcings (in solar constant and carbon dioxide concentration) and observed ocean boundary conditions (Gates 1992, Gates et al. 1999). From the standpoint of land-surface specialists, the AMIP affords an opportunity to investigate the behaviors of a wide variety of land-surface schemes (LSS) that are coupled to their "native" AGCMs (Phillips et al. 1995, Phillips 1999). In principle, therefore, the AMIP permits consideration of an overarching question: "To what extent does an AGCM's performance in simulating continental climate depend on the representations of land-surface processes by the embedded LSS?" There are, of course, some formidable obstacles to satisfactorily addressing this question. First, there is the dilemma of how to effectively validate simulation performance, given the present dearth of global land-surface data sets. Even if this data problem were to be alleviated, some inherent methodological difficulties would remain: in the context of the AMIP, it is not possible to validate a given LSS per se, since the associated land-surface climate simulation is a product of the coupled AGCM/LSS system. Moreover, aside from the intrinsic differences in LSS across the AMIP models, the varied representations of land-surface characteristics (e.g. vegetation properties, surface albedos and roughnesses, etc.) and related variations in land-surface forcings further complicate such an attribution process. Nevertheless, it may be possible to develop validation methodologies/statistics that are sufficiently penetrating to reveal "signatures" of particular LSS representations (e.g. "bucket" vs. more complex parameterizations of hydrology) in the AMIP land-surface simulations.

  4. Quantifying induced effects of subsurface renewable energy storage

    NASA Astrophysics Data System (ADS)

    Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas

    2015-04-01

New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of storing large amounts of energy or substances. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex and interacting effects induced. This understanding is the basis for assessing the potential as well as the risks connected with a sustainable usage of these storage options, especially when considering possible mutual influences. To achieve this aim, synthetic scenarios for the use of the geological underground as an energy storage system are developed and parameterized in this work. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used; the latter is enhanced to include the required coupled hydraulic, thermal, geomechanical and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model application. The process implementations and numerical methods required for simulating the induced effects of subsurface storage are then detailed and explained. Application examples demonstrate the developed methods and quantify induced effects and storage sizes for the typical settings parameterized. This work is part of the ANGUS+ project, funded by the German Ministry of Education and Research (BMBF).

  5. The expected results method for data verification

    NASA Astrophysics Data System (ADS)

    Monday, Paul

    2016-05-01

The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly to the Standard File Format (SFF) defined by AMSAA, ARCIC has developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.

  6. The Monash University Interactive Simple Climate Model

    NASA Astrophysics Data System (ADS)

    Dommenget, D.

    2013-12-01

The Monash University interactive simple climate model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed science journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The Monash simple climate model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, to work through a number of tutorials on the interactions of physical processes in the climate system, and to solve some puzzles. By switching physical processes OFF/ON you can deconstruct the climate and learn how all the different processes interact to generate the observed climate and how the processes interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities for teaching students with this tool are.
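
    The energy-balance idea behind this class of models fits in a few lines. The sketch below is a zero-dimensional illustration of the model family GREB belongs to (GREB itself is spatially resolved and includes many more processes); the numbers are standard textbook values, not GREB parameters.

        # Zero-dimensional energy balance: absorbed solar = emitted infrared
        # at equilibrium. Time-stepped with a mixed-layer heat capacity.
        S0 = 1361.0        # solar constant, W/m^2
        albedo = 0.3       # planetary albedo
        sigma = 5.67e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)
        emissivity = 0.62  # crude greenhouse factor tuned toward ~288 K

        C = 4.0e8          # heat capacity of ~100 m ocean mixed layer, J/(m^2 K)
        dt = 86400.0       # one-day time step, s
        T = 250.0          # start cold, K
        for _ in range(365 * 50):                 # integrate 50 years
            absorbed = S0 / 4.0 * (1.0 - albedo)  # global-mean insolation
            emitted = emissivity * sigma * T**4   # outgoing longwave
            T += dt * (absorbed - emitted) / C
        print(round(T, 1), "K")                   # equilibrates near ~287 K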

  7. Design principles for simulation games for learning clinical reasoning: A design-based research approach.

    PubMed

    Koivisto, J-M; Haavisto, E; Niemi, H; Haho, P; Nylund, S; Multisilta, J

    2018-01-01

Nurses sometimes lack the competence needed for recognising deterioration in patient conditions, often because of poor clinical reasoning. There is a need to develop new possibilities for learning this crucial competence area. In addition, educators need to be future-oriented; they need to be able to design and adopt new pedagogical innovations. The purpose of the study is to describe the development process and to generate principles for the design of nursing simulation games. A design-based research methodology is applied in this study. Iterative cycles of analysis, design, development, testing and refinement were conducted via collaboration among researchers, educators, students, and game designers. The study facilitated the generation of reusable design principles for simulation games to guide future designers when designing and developing simulation games for learning clinical reasoning. This study makes a major contribution to research on simulation game development in the field of nursing education. The results provide important insights into the significance of involving nurse educators in the design and development process of educational simulation games for the purpose of nursing education.

  8. Creativity in Education: A Standard for Computer-Based Teaching.

    ERIC Educational Resources Information Center

    Schank, Roger C.; Farrell, Robert

    1988-01-01

    Discussion of the potential of computers in education focuses on the need for experiential learning and developing creativity in students. Learning processes are explained in light of artificial intelligence research, problems with current uses of computers in education are discussed, and possible solutions using intelligent simulation software…

  9. Validating and Optimizing the Effects of Model Progression in Simulation-Based Inquiry Learning

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton; Anjewierden, Anjo; Bollen, Lars

    2012-01-01

    Model progression denotes the organization of the inquiry learning process in successive phases of increasing complexity. This study investigated the effectiveness of model progression in general, and explored the added value of either broadening or narrowing students' possibilities to change model progression phases. Results showed that…

  10. RFI and SCRIMP Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Loos, Alfred C.; Sayre, Jay

    2000-01-01

Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies for manufacturing primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice. A computer simulation model of the VARTM process would provide a cost-effective tool for manufacturing composites with this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool, enabling the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from varying the process constraints in the modeling of several different composite panels, considering such factors as infiltration time, the number of vacuum ports, and possible areas of void entrapment.
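
    The fill times such simulations predict are governed by Darcy flow through the fiber preform. For orientation, the closed-form one-dimensional result for constant-pressure infusion, t = phi * mu * L^2 / (2 * K * dP), can be evaluated directly; the numbers below are illustrative, not values from the report.

        # One-dimensional Darcy estimate of resin infiltration time, the kind
        # of relation underlying RTM/VARTM fill simulations.
        phi = 0.5      # preform porosity
        mu = 0.2       # resin viscosity, Pa.s
        L = 0.5        # flow length, m
        K = 2e-10      # preform permeability, m^2
        dP = 1.0e5     # vacuum-driven pressure difference, Pa

        t_fill = phi * mu * L**2 / (2.0 * K * dP)
        print(f"estimated fill time: {t_fill / 60:.1f} min")   # ~10 min here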

  11. Simulating neutron star mergers as r-process sources in ultrafaint dwarf galaxies

    NASA Astrophysics Data System (ADS)

    Safarzadeh, Mohammadtaher; Scannapieco, Evan

    2017-10-01

To explain the high observed abundances of r-process elements in local ultrafaint dwarf (UFD) galaxies, we perform cosmological zoom simulations that include r-process production from neutron star mergers (NSMs). We model star formation stochastically and simulate two different haloes with total masses ≈10⁸ M⊙ at z = 6. We find that the final distribution of [Eu/H] versus [Fe/H] is relatively insensitive to the energy by which the r-process material is ejected into the interstellar medium, but strongly sensitive to the environment in which the NSM event occurs. In one halo, the NSM event takes place at the centre of the stellar distribution, leading to high levels of r-process enrichment such as seen in a local UFD, Reticulum II (Ret II). In a second halo, the NSM event takes place outside of the densest part of the galaxy, leading to a more extended r-process distribution. The subsequent star formation occurs in an interstellar medium with shallow levels of r-process enrichment that results in stars with low levels of [Eu/H] compared to Ret II stars even when the maximum possible r-process mass is assumed to be ejected. This suggests that the natal kicks of neutron stars may also play an important role in determining the r-process abundances in UFD galaxies, a topic that warrants further theoretical investigation.

  12. Effects of anthropogenic groundwater exploitation on land surface processes: A case study of the Haihe River Basin, Northern China

    NASA Astrophysics Data System (ADS)

    Xie, Z.; Zou, J.; Qin, P.; Sun, Q.

    2014-12-01

In this study, we incorporated a groundwater exploitation scheme into the land surface model CLM3.5 to investigate the effects of the anthropogenic exploitation of groundwater on land surface processes in a river basin. Simulations of the Haihe River Basin in northern China were conducted for the years 1965-2000 using the model. A control simulation without exploitation and three exploitation simulations with different water demands derived from socioeconomic data related to the Basin were conducted. The results showed that groundwater exploitation for human activities resulted in increased wetting and cooling effects at the land surface and reduced groundwater storage. A lowering of the groundwater table, increased upper soil moisture, reduced 2 m air temperature, and enhanced latent heat flux were detected by the end of the simulated period, and the changes at the land surface were related linearly to the water demands. To determine the possible responses of the land surface processes in extreme cases (i.e., in which the exploitation process either continued or ceased), additional hypothetical simulations for the coming 200 years with constant climate forcing were conducted, regardless of changes in climate. The simulations revealed that the local groundwater storage on the plains could not sustain high-intensity exploitation for long if the exploitation process continues at the current rate: changes attributable to groundwater exploitation would reach extreme values and then weaken within decades as groundwater resources are depleted and the exploitation process therefore ceases. However, if exploitation is stopped completely to allow groundwater to recover, drying and warming effects, such as increased temperature, reduced soil moisture, and reduced total runoff, would occur in the Basin within the early decades of the simulation period. The effects of exploitation would then gradually disappear, and the land surface variables would approach the natural state and stabilize at different rates. Simulations were also conducted for cases in which exploitation either continues or ceases using future climate scenario outputs from a general circulation model; the resulting trends were almost the same as those of the simulations with constant climate forcing.

  13. Effects of molecular dissociation on the hydrogen equation of state

    NASA Astrophysics Data System (ADS)

    Bonev, Stanimir; Schwegler, Eric; Galli, Giulia; Gygi, Francois

    2002-03-01

It has been suggested recently (François Gygi and G. Galli, submitted to Phys. Rev. Lett.) that the physical mechanism behind the larger compressibility of liquid deuterium observed in laser shock experiments as compared to ab initio simulations may be related to shock-induced electronic excitations. A possible result of such non-adiabatic processes is hindering of the molecular dissociation. This has motivated us to study the influence of molecular dissociation on the hydrogen equation of state. To this end, we have carried out ab initio molecular dynamics simulations of liquid deuterium in which intramolecular dissociation is prevented by the use of bond length constraints. Simulations at both fixed thermodynamic conditions and dynamical simulations of shocked deuterium will be discussed.

  14. Moving magnets in a micromagnetic finite-difference framework

    NASA Astrophysics Data System (ADS)

    Rissanen, Ilari; Laurson, Lasse

    2018-05-01

    We present a method and an implementation for smooth linear motion in a finite-difference-based micromagnetic simulation code, to be used in simulating magnetic friction and other phenomena involving moving microscale magnets. Our aim is to accurately simulate the magnetization dynamics and relative motion of magnets while retaining high computational speed. To this end, we combine techniques for fast scalar potential calculation and cubic b-spline interpolation, parallelizing them on a graphics processing unit (GPU). The implementation also includes the possibility of explicitly simulating eddy currents in the case of conducting magnets. We test our implementation by providing numerical examples of stick-slip motion of thin films pulled by a spring and the effect of eddy currents on the switching time of magnetic nanocubes.

  15. Development of the CELSS Emulator at NASA JSC

    NASA Technical Reports Server (NTRS)

    Cullingford, Hatice S.

    1989-01-01

    The Controlled Ecological Life Support System (CELSS) Emulator is under development at the NASA Johnson Space Center (JSC) with the purpose to investigate computer simulations of integrated CELSS operations involving humans, plants, and process machinery. This paper describes Version 1.0 of the CELSS Emulator that was initiated in 1988 on the JSC Multi Purpose Applications Console Test Bed as the simulation framework. The run module of the simulation system now contains a CELSS model called BLSS. The CELSS Emulator makes it possible to generate model data sets, store libraries of results for further analysis, and also display plots of model variables as a function of time. The progress of the project is presented with sample test runs and simulation display pages.

  16. Building Interactive Simulations in Web Pages without Programming.

    PubMed

    Mailen Kootsey, J; McAuley, Grant; Bernal, Julie

    2005-01-01

A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation, so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications is possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.

  17. A mathematical and experimental simulation of the hematological response to weightlessness

    NASA Technical Reports Server (NTRS)

    Kimzey, S. L.; Leonard, J. I.; Johnson, P. C.

    1979-01-01

A mathematical model of erythropoiesis control was used to simulate the effects of bedrest and zero-g on the circulating red cell mass. The model incorporates the best current understanding of the dynamics of red cell production and destruction and the associated feedback regulation. Specifically studied were the hemodynamic responses in a 28-day bedrest study devised to simulate the Skylab experience. The results support the hypothesis that red cell loss during supine bedrest is a normal physiological feedback response to hemoconcentration, enhanced tissue oxygenation, and suppression of red cell production. Model simulation suggested that this period was marked by some combination of increased oxygen-hemoglobin affinity, a small reduction in mean red cell life span, ineffective erythropoiesis, or abnormal reticulocytosis.
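
    The feedback loop described above (hemoconcentration raises oxygen delivery, which suppresses an erythropoietin-like production signal until red cell mass falls back toward a set-point) can be captured in a toy difference equation. The sketch below is illustrative only: it is not the paper's model, and every constant is an assumption.

        import math

        # Toy negative-feedback sketch of red cell mass regulation: production
        # falls when hematocrit (a proxy for tissue oxygenation) rises above
        # its set-point, so a bedrest-induced plasma loss suppresses production
        # until a lower red cell mass restores the set-point hematocrit.
        rcm, plasma = 2.0, 3.0     # red cell mass / plasma volume, arb. units

        def production(hct, set_point=0.40, gain=8.0):
            return 0.02 * math.exp(-gain * (hct - set_point))

        for day in range(120):
            if day == 1:
                plasma = 2.4        # bedrest: plasma volume contracts ~20%
            hct = rcm / (rcm + plasma)
            rcm += production(hct) - 0.01 * rcm   # production minus 1%/day loss

        print(f"hematocrit {rcm / (rcm + plasma):.3f}, "
              f"red cell mass {rcm:.2f} (was 2.00)")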

  18. Simulation study on characteristics of long-range interaction in randomly asymmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Zhao, Shi-Bo; Liu, Ming-Zhe; Yang, Lan-Ying

    2015-04-01

In this paper we theoretically investigate the dynamics of an asymmetric exclusion process on a one-dimensional lattice with long-range hopping and random update via Monte Carlo simulations. Particles in the model first try to hop over successive unoccupied sites with a probability q, which differs from previous exclusion process models; the probability q may represent the random access of particles. Stationary particle currents, density profiles, and phase diagrams are obtained numerically. There are three possible stationary phases in the system: the low-density (LD) phase, the high-density (HD) phase, and the maximal-current (MC) phase. Interestingly, the bulk density in the LD phase tends to zero, while the MC phase is governed by α, β, and q. The HD phase is nearly the same as in the normal TASEP, determined by the exit rate β. Theoretical analysis is in good agreement with the simulation results. The proposed model may provide a better understanding of random interaction dynamics in complex systems. Project supported by the National Natural Science Foundation of China (Grant Nos. 41274109 and 11104022), the Fund for Sichuan Youth Science and Technology Innovation Research Team (Grant No. 2011JTD0013), and the Creative Team Program of Chengdu University of Technology.
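
    A bare-bones Monte Carlo version of such a lattice gas is easy to write down. The sketch below adopts one plausible reading of the long-range rule (after the mandatory hop to the next empty site, the particle keeps moving across further empty sites, each with probability q) together with open boundaries at rates α and β; all parameter values are arbitrary.

        import random

        # Minimal random-sequential-update Monte Carlo of a TASEP variant with
        # long-range hopping (one plausible reading of the model; illustrative).
        L_sites, alpha, beta, q = 200, 0.3, 0.7, 0.5
        lattice = [0] * L_sites
        random.seed(2)

        for step in range(200000):
            i = random.randrange(-1, L_sites)      # -1 selects the entry move
            if i == -1:
                if lattice[0] == 0 and random.random() < alpha:
                    lattice[0] = 1                 # injection at rate alpha
            elif lattice[i] == 1:
                if i == L_sites - 1:
                    if random.random() < beta:     # exit from the last site
                        lattice[i] = 0
                elif lattice[i + 1] == 0:
                    j = i + 1                      # mandatory first hop
                    while (j + 1 < L_sites and lattice[j + 1] == 0
                           and random.random() < q):
                        j += 1                     # long-range continuation
                    lattice[i], lattice[j] = 0, 1

        print("bulk density:", sum(lattice[50:150]) / 100)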

  19. Disadvantages of interfragmentary shear on fracture healing--mechanical insights through numerical simulation.

    PubMed

    Steiner, Malte; Claes, Lutz; Ignatius, Anita; Simon, Ulrich; Wehner, Tim

    2014-07-01

The outcome of secondary fracture healing processes is strongly influenced by interfragmentary motion. Shear movement is assumed to be more disadvantageous than axial movement; however, experimental results are contradictory. Numerical fracture healing models allow simulation of the fracture healing process with variation of single input parameters and under comparable, normalized mechanical conditions. Thus, a comparison of the influence of different loading directions on the healing process is possible. In this study we simulated fracture healing under several axial compressive, translational shear, and torsional shear movement scenarios, and compared their respective healing times, using a calibrated numerical model for fracture healing in sheep. Numerous variations of movement amplitudes and musculoskeletal loads were simulated for the three loading directions. Our results show that isolated axial compression was more beneficial for fracture healing success than both isolated shearing conditions, for load and displacement magnitudes that were identical as well as physiologically different, and even for strain-based, normalized, comparable conditions. Additionally, torsional shear movements had less impeding effects than translational shear movements. Our findings therefore suggest that osteosynthesis implants can be optimized, in particular, to limit translational interfragmentary shear under musculoskeletal loading.

  20. Simulations of Polar Stratospheric Clouds and Denitrification Using Laboratory Freezing Rates

    NASA Technical Reports Server (NTRS)

    Drdla, Katja; Tabazadeh, Azadeh; Gore, Warren J. (Technical Monitor)

    2001-01-01

During the 1999-2000 Arctic winter, the SAGE (Stratospheric Aerosol and Gas Experiment) III Ozone Loss and Validation Experiment (SOLVE) provided evidence of widespread solid-phase polar stratospheric clouds (PSCs) accompanied by severe denitrification. Previous simulations have shown that a freezing process occurring at temperatures above the ice frost point is necessary to explain these observations. In this work, the nitric acid freezing rates measured by Salcedo et al. and discussed by Tabazadeh et al. have been examined. These freezing rates have been tested in winter-long microphysical simulations of the 1999-2000 Arctic vortex evolution in order to determine whether they can explain the observations. A range of cases has been explored, including whether the PSC particles are composed of nitric acid dihydrate or trihydrate, whether the freezing process is a bulk process or occurs only on the particle surfaces, and uncertainties in the derived freezing rates. Finally, the possibility that meteoritic debris enhances the freezing rate has also been examined. The results of these simulations have been compared with key PSC and denitrification measurements made during the SOLVE campaign. The cases that best reproduce the measurements will be highlighted, with a discussion of the implications for our understanding of PSCs.

  1. Distributed source model for the full-wave electromagnetic simulation of nonlinear terahertz generation.

    PubMed

    Fumeaux, Christophe; Lin, Hungyen; Serita, Kazunori; Withayachumnankul, Withawat; Kaufmann, Thomas; Tonouchi, Masayoshi; Abbott, Derek

    2012-07-30

    The process of terahertz generation through optical rectification in a nonlinear crystal is modeled using discretized equivalent current sources. The equivalent terahertz sources are distributed in the active volume and computed based on a separately modeled near-infrared pump beam. This approach can be used to define an appropriate excitation for full-wave electromagnetic numerical simulations of the generated terahertz radiation. This enables predictive modeling of the near-field interactions of the terahertz beam with micro-structured samples, e.g. in a near-field time-resolved microscopy system. The distributed source model is described in detail, and an implementation in a particular full-wave simulation tool is presented. The numerical results are then validated through a series of measurements on square apertures. The general principle can be applied to other nonlinear processes with possible implementation in any full-wave numerical electromagnetic solver.

  2. Implementation of jump-diffusion algorithms for understanding FLIR scenes

    NASA Astrophysics Data System (ADS)

    Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.

    1995-07-01

Our pattern-theoretic approach to the automated understanding of forward-looking infrared (FLIR) images brings the traditionally separate endeavors of detection, tracking, and recognition together into a unified jump-diffusion process. New objects are detected and object types are recognized through discrete jump moves; between jumps, the locations and orientations of objects are estimated via continuous diffusions. A hypothesized scene, simulated from the emissive characteristics of the hypothesized scene elements, is compared with the collected data by a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution, which the jump-diffusion process empirically generates. Both the diffusion and jump operations involve the simulation of a scene produced by a hypothesized configuration. Scene simulation is most effectively accomplished by pipelined rendering engines such as those from Silicon Graphics. We demonstrate the execution of our algorithm on a Silicon Graphics Onyx/RealityEngine.
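
    The jump/diffusion division of labor can be miniaturized to a one-dimensional toy: a discrete move proposes a different object type, and a continuous move refines the object's position against the data, both accepted by a Metropolis rule on the posterior. The sketch below is illustrative only; it substitutes a random-walk proposal for a true diffusion, and the two object "templates" are made up.

        import math, random

        # Toy jump-diffusion-style sampler (not the authors' FLIR system).
        random.seed(3)
        obs = 4.2                                   # observed object location
        templates = {"tank": 0.5, "truck": 1.5}     # per-type likelihood widths

        def log_post(obj_type, x):
            s = templates[obj_type]
            return -0.5 * ((x - obs) / s) ** 2 - math.log(s)

        obj, x = "truck", 0.0
        for it in range(5000):
            if it % 10 == 0:                        # jump move: switch type
                cand = "tank" if obj == "truck" else "truck"
                if math.log(random.random()) < log_post(cand, x) - log_post(obj, x):
                    obj = cand
            else:                                   # diffusion-like move: perturb x
                xc = x + random.gauss(0.0, 0.2)
                if math.log(random.random()) < log_post(obj, xc) - log_post(obj, x):
                    x = xc

        print(obj, round(x, 2))    # should settle near 'tank' at x ~ 4.2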

  3. System simulation application for determining the size of daily raw material purchases at PT XY

    NASA Astrophysics Data System (ADS)

    Napitupulu, H. L.

    2018-02-01

Every manufacturing company needs to implement green production, including PT XY, a marine-catch processing company in Sumatera Utara Province engaged in the processing of squid for export. The company's problem relates to the absence of a decision rule for the daily purchase amount of squid. Purchasing daily raw materials in varying quantities has caused the company to face either an excess or a shortage of raw materials. Low purchases of raw materials result in reduced productivity, while large purchases lead to increased cooling costs for storing the excess, as well as possible losses from damaged raw material. It is therefore necessary to determine the optimal amount of raw material to purchase each day, which can be done by applying simulation. Application of system simulation can provide the expected optimal amount of daily raw material purchases.

  4. Severe Nuclear Accident Program (SNAP) - a real time model for accidental releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saltbones, J.; Foss, A.; Bartnicki, J.

    1996-12-31

The Severe Nuclear Accident Program (SNAP) has been developed at the Norwegian Meteorological Institute (DNMI) in Oslo to provide decision makers and government officials with a real-time tool for simulating large accidental releases of radioactivity from nuclear power plants or other sources. SNAP is developed in the Lagrangian framework, in which the atmospheric transport of radioactive pollutants is simulated by emitting a large number of particles from the source. The main advantage of the Lagrangian approach is the possibility of precise parameterization of advection processes, especially close to the source. SNAP can be used to predict the transport and deposition of a radioactive cloud in the future (up to 48 hours in the present version) or to analyze the behavior of the cloud in the past. It is also possible to run the model in a mixed mode (partly analysis and partly forecast). In the routine run we assume unit (1 g s⁻¹) emission in each of three classes. This assumption is very convenient for the main user of the model output in case of emergency, the Norwegian Radiation Protection Agency: due to the linearity of the model equations, the user can test different emission scenarios as a post-processing task by assigning different weights to the concentration and deposition fields corresponding to each of the three emission classes. SNAP is fully operational and can be run by the meteorologist on duty at any time. The output from SNAP has two forms. First, on maps of Europe, or of selected parts of Europe, individual particles are shown during the simulation period. Second, immediately after the simulation, concentration/deposition fields can be shown every three hours of the simulation period as isoline maps for each emission class. In addition, concentration and deposition maps, as well as some meteorological data, are stored on a publicly accessible disk for further processing by the model users.
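
    The Lagrangian core of such a model reduces to advecting a cloud of particles with the met-model wind while adding a stochastic component for unresolved turbulence. The sketch below is a minimal illustration of that idea (constant wind, crude Gaussian diffusion), not SNAP itself.

        import random

        # Minimal Lagrangian transport sketch: particles released at the source
        # are advected by a wind field and spread by a random walk standing in
        # for turbulent diffusion. All values are illustrative.
        random.seed(4)

        def wind(x, y, t_hours):
            return 10.0, 2.0           # m/s; stand-in for met-model winds

        particles = [(0.0, 0.0)] * 1000
        dt = 600.0                     # 10-minute time step, s
        for step in range(6 * 48):     # 48-hour forecast
            t = step * dt / 3600.0
            particles = [
                (x + wind(x, y, t)[0] * dt + random.gauss(0, 200.0),
                 y + wind(x, y, t)[1] * dt + random.gauss(0, 200.0))
                for x, y in particles
            ]

        xs = [p[0] for p in particles]
        print(f"plume centre after 48 h: {sum(xs) / len(xs) / 1000:.0f} km downwind")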

  5. Understanding Atmospheric Anomalies Associated with Seasonal Pluvial-Drought Processes Using Southwest China as an Example

    NASA Astrophysics Data System (ADS)

    Liu, Z.; LU, G.; He, H.; Wu, Z.; He, J.

    2017-12-01

Seasonal pluvial-drought transition processes are unique natural phenomena. To explore possible mechanisms, we considered Southwest China (SWC) as the study region and comprehensively investigated the temporal evolution of large-scale and regional atmospheric variables with the simple method of Standardized Anomalies (SA). Some key results include: (1) The net vertical integral of water vapour flux (VIWVF) across the four boundaries may be a feasible indicator of pluvial-drought transition processes over SWC, because its SA-based index is almost consistent with process development. (2) The vertical SA-based patterns of regional horizontal divergence (D) and vertical motion (ω) also coincide well with the pluvial-drought transition processes, and the SA-based index of regional D shows relatively high correlation with the identified processes over SWC. (3) With respect to large-scale anomalies of circulation patterns, a well-organized Eurasian Pattern is one important feature during the pluvial-drought transition over SWC. (4) To explore the possibility of simulating drought development using previous pluvial anomalies, large-scale and regional atmospheric SA-based indices were used. On the whole, when SA-based indices of regional dynamic and water-vapor variables are introduced, drought development simulated with large-scale anomalies alone can be improved considerably. (5) Finally, pluvial-drought transition processes and the associated regional atmospheric anomalies over nine Chinese drought study regions were investigated. With respect to regional D, vertically single or double "upper-positive-lower-negative" and "upper-negative-lower-positive" patterns are the most common vertical SA-based patterns during the pluvial and drought parts of the transition processes, respectively.
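
    The SA method itself is simply a climatological z-score: each variable is centered on its calendar-period climatology and scaled by the corresponding climatological standard deviation. A minimal sketch with synthetic monthly data:

        import numpy as np

        # Standardized anomaly (SA): remove the climatological mean and scale
        # by the climatological standard deviation for each calendar month.
        # Synthetic data for illustration only.
        rng = np.random.default_rng(5)
        years, months = 30, 12
        x = 50 + 10 * rng.standard_normal((years, months))  # e.g. monthly VIWVF

        clim_mean = x.mean(axis=0)              # per-calendar-month mean
        clim_std = x.std(axis=0, ddof=1)        # per-calendar-month std
        sa = (x - clim_mean) / clim_std         # SA index, unitless

        print("month-1 SA of final year:", round(sa[-1, 0], 2))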

  6. Using in situ simulation to evaluate operational readiness of a children's hospital-based obstetrics unit.

    PubMed

    Ventre, Kathleen M; Barry, James S; Davis, Deborah; Baiamonte, Veronica L; Wentworth, Allen C; Pietras, Michele; Coughlin, Liza; Barley, Gwyn

    2014-04-01

    Relocating obstetric (OB) services to a children's hospital imposes demands on facility operations, which must be met to ensure quality care and a satisfactory patient experience. We used in situ simulations to prospectively and iteratively evaluate operational readiness of a children's hospital-based OB unit before it opened for patient care. This project took place at a 314-bed, university-affiliated children's hospital. We developed 3 full-scale simulation scenarios depicting a concurrent maternal and neonatal emergency. One scenario began with a standardized patient experiencing admission; the mannequin portrayed a mother during delivery. We ran all 3 scenarios on 2 dates scheduled several weeks apart. We ran 2 of the scenarios on a third day to verify the reliability of key processes. During the simulations, content experts completed equipment checklists, and participants identified latent safety hazards. Each simulation involved a unique combination of scheduled participants who were supplemented by providers from responding ancillary services. The simulations involved 133 scheduled participants representing OB, neonatology, and anesthesiology. We exposed and addressed operational deficiencies involving equipment availability, staffing, interprofessional communication, and systems issues such as transfusion protocol failures and electronic order entry challenges. Process changes between simulation days 1 to 3 decreased the elapsed time between transfusion protocol activation and blood arrival to the operating room and labor/delivery/recovery/postpartum setting. In situ simulations identified multiple operational deficiencies on the OB unit, allowing us to take corrective action before its opening. This project may guide other children's hospitals regarding care processes likely to require significant focus and possible modification to accommodate an OB service.

  7. Three-dimensional motor schema based navigation

    NASA Technical Reports Server (NTRS)

    Arkin, Ronald C.

    1989-01-01

Reactive schema-based navigation is possible in space domains by extending the methods developed for ground-based navigation within the Autonomous Robot Architecture (AuRA). Reformulating two-dimensional motor schemas for three-dimensional applications is a straightforward process. The manifold advantages of schema-based control persist, including modular development, amenability to distributed processing, and responsiveness to environmental sensing. Simulation results show the feasibility of this methodology for space docking operations in a cluttered work area.
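
    In a schema-based controller each active behavior contributes a velocity vector, and the vectors are simply summed; moving from 2D to 3D only changes the vector dimension. The sketch below shows a move-to-goal schema combined with a spherical avoid-obstacle schema; all gains and geometry are made-up illustrations, not values from the paper.

        import numpy as np

        # Schema-based control in 3D: each motor schema emits a velocity
        # vector and the active vectors are summed at every step.
        goal = np.array([10.0, 5.0, -3.0])
        obstacle = np.array([4.0, 2.5, -1.0])
        pos = np.array([0.0, 0.0, 0.0])

        def move_to_goal(p, gain=1.0):
            d = goal - p
            return gain * d / np.linalg.norm(d)          # unit pull toward goal

        def avoid_obstacle(p, gain=2.0, sphere=3.0):
            d = p - obstacle
            r = np.linalg.norm(d)
            if r >= sphere:
                return np.zeros(3)                       # outside influence sphere
            return gain * (sphere - r) / sphere * d / r  # repulsion inside it

        for _ in range(50):
            v = move_to_goal(pos) + avoid_obstacle(pos)  # schema summation
            pos = pos + 0.2 * v
        print(np.round(pos, 1))                          # ends near the goal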

  8. A knowledge based expert system for propellant system monitoring at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Jamieson, J. R.; Delaune, C.; Scarl, E.

    1985-01-01

The Lox Expert System (LES) is the first attempt to build a real-time expert system capable of simulating the thought processes of NASA system engineers with regard to fluids systems analysis and troubleshooting. An overview of the hardware and software describes the techniques used and possible applications to other process control systems. LES is now in the advanced development stage, with a full implementation planned for late 1985.

  9. Microsurgery Simulator of Cerebral Aneurysm Clipping with Interactive Cerebral Deformation Featuring a Virtual Arachnoid.

    PubMed

    Shono, Naoyuki; Kin, Taichi; Nomura, Seiji; Miyawaki, Satoru; Saito, Toki; Imai, Hideaki; Nakatomi, Hirofumi; Oyama, Hiroshi; Saito, Nobuhito

    2018-05-01

    A virtual reality simulator for aneurysmal clipping surgery is an attractive research target for neurosurgeons. Brain deformation is one of the most important functionalities necessary for an accurate clipping simulator and is vastly affected by the status of the supporting tissue, such as the arachnoid membrane. However, no virtual reality simulator implementing the supporting tissue of the brain has yet been developed. To develop a virtual reality clipping simulator possessing interactive brain deforming capability closely dependent on arachnoid dissection and apply it to clinical cases. Three-dimensional computer graphics models of cerebral tissue and surrounding structures were extracted from medical images. We developed a new method for modifiable cerebral tissue complex deformation by incorporating a nonmedical image-derived virtual arachnoid/trabecula in a process called multitissue integrated interactive deformation (MTIID). MTIID made it possible for cerebral tissue complexes to selectively deform at the site of dissection. Simulations for 8 cases of actual clipping surgery were performed before surgery and evaluated for their usefulness in surgical approach planning. Preoperatively, each operative field was precisely reproduced and visualized with the virtual brain retraction defined by users. The clear visualization of the optimal approach to treating the aneurysm via an appropriate arachnoid incision was possible with MTIID. A virtual clipping simulator mainly focusing on supporting tissues and less on physical properties seemed to be useful in the surgical simulation of cerebral aneurysm clipping. To our knowledge, this article is the first to report brain deformation based on supporting tissues.

  10. High fidelity studies of exploding foil initiator bridges, Part 3: ALEGRA MHD simulations

    NASA Astrophysics Data System (ADS)

    Neal, William; Garasi, Christopher

    2017-01-01

    Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiators (EFI), have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage, and in the case of EFIs, flyer velocity. Experimental methods have correspondingly generally been limited to the same parameters. With the advent of complex, first principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions, and predict a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately verified. In this third paper of a three part study, the experimental results presented in part 2 are compared against 3-dimensional MHD simulations. This improved experimental capability, along with advanced simulations, offer an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.

  11. [Use of driving simulators in psychological research].

    PubMed

    Andysz, Aleksandra; Waszkowska, Małgorzata; Merecz, Dorota; Drabek, Marcin

    2010-01-01

The history of simulators dates back to the first decades of the twentieth century. At the beginning they were used to train pilots; later they were used in the automotive industry for testing the strength of new vehicles and ergonomic solutions. With time, research institutions and technical universities outside the automotive industry have become more and more interested in simulators. The attractiveness of simulators for researchers rests on a number of important factors: they create the possibility of modeling, controlling and repeating different experimental situations while reducing the impact of confounding factors. Simulators have a great potential for data collection and processing. What is more, they are safe and ecological. These qualities make them an almost ideal research tool. The article presents a review of psychological studies using vehicle driving simulators. It also points to the advantages and disadvantages of these devices and outlines future prospects for experimental research.

  12. Hitchhiker mission operations: Past, present, and future

    NASA Technical Reports Server (NTRS)

    Anderson, Kathryn

    1995-01-01

    What is mission operations? Mission operations is an iterative process aimed at achieving the greatest possible mission success with the resources available. The process involves understanding of the science objectives, investigation of which system capabilities can best meet these objectives, integration of the objectives and resources into a cohesive mission operations plan, evaluation of the plan through simulations, and implementation of the plan in real-time. In this paper, the authors present a comprehensive description of what the Hitchhiker mission operations approach is and why it is crucial to mission success. The authors describe the significance of operational considerations from the beginning and throughout the experiment ground and flight systems development. The authors also address the necessity of training and simulations. Finally, the authors cite several examples illustrating the benefits of understanding and utilizing the mission operations process.

  13. Gender and speaker identification as a function of the number of channels in spectrally reduced speech

    NASA Astrophysics Data System (ADS)

    Gonzalez, Julio; Oliver, Juan C.

    2005-07-01

    Considerable research on speech intelligibility for cochlear-implant users has been conducted using acoustic simulations with normal-hearing subjects. However, some relevant topics about perception through cochlear implants remain scarcely explored. The present study examined the perception by normal-hearing subjects of the gender and identity of a talker as a function of the number of channels in spectrally reduced speech. Two simulation strategies were compared. They were implemented by two different processors that presented signals as either the sum of sine waves at the centers of the channels or the sum of noise bands. In Experiment 1, 15 subjects determined the gender of 40 talkers (20 males and 20 females) from a natural utterance processed through 3, 4, 5, 6, 8, 10, 12, and 16 channels with both processors. In Experiment 2, 56 subjects matched a natural sentence uttered by 10 talkers with the corresponding simulation replicas processed through 3, 4, 8, and 16 channels for each processor. In Experiment 3, 72 subjects performed the same task but different sentences were used for the natural and processed stimuli. A control Experiment 4 was conducted to equate the processing steps between the two simulation strategies. Results showed that gender and talker identification were better with the sine-wave processor, and that performance with the noise-band processor was more sensitive to the number of channels. Implications and possible explanations for the superiority of sine-wave simulations are discussed.
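
    As a concrete illustration of the noise-band strategy described above, the following Python sketch vocodes a signal by replacing each band's fine structure with band-limited noise. It is a minimal sketch only: the function name, log-spaced band layout, and filter order are assumptions of this example, not the processors used in the study.

      # Minimal n-channel noise-band vocoder (illustrative sketch).
      import numpy as np
      from scipy.signal import butter, sosfilt, hilbert

      def noise_vocoder(x, fs, n_channels=8, f_lo=100.0, f_hi=8000.0):
          """Replace the fine structure in each band with band-limited noise."""
          edges = np.geomspace(f_lo, f_hi, n_channels + 1)  # log-spaced band edges
          y = np.zeros(len(x))
          noise = np.random.randn(len(x))
          for lo, hi in zip(edges[:-1], edges[1:]):
              sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
              band = sosfilt(sos, x)                 # analysis band
              env = np.abs(hilbert(band))            # channel envelope
              carrier = sosfilt(sos, noise)          # band-limited noise carrier
              y += env * carrier
          return y / np.max(np.abs(y))               # normalize to unit peak

    A sine-wave processor would instead multiply each channel envelope by a sinusoid at the channel center frequency.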

  14. Status of the Correlation Process of the V-HAB Simulation with Ground Tests and ISS Telemetry Data

    NASA Technical Reports Server (NTRS)

    Ploetner, P.; Roth, C.; Zhukov, A.; Czupalla, M.; Anderson, M.; Ewert, M.

    2013-01-01

    The Virtual Habitat (V-HAB) is a dynamic Life Support System (LSS) simulation created for the investigation of future human spaceflight missions. It provides the capability to optimize LSSs during early design phases. The focal point of the paper is the correlation and validation of V-HAB against ground test and flight data. In order to use V-HAB to design an Environmental Control and Life Support System (ECLSS), it is important to know the accuracy, strengths, and weaknesses of its simulations. Therefore, simulations of real systems are essential. The modeling of the International Space Station (ISS) ECLSS, both as single technologies and as an integrated system, and its correlation against ground and flight test data are described. The results of the simulations make it possible to validate the approach taken by V-HAB.

  15. Atomic-level characterization of the structural dynamics of proteins.

    PubMed

    Shaw, David E; Maragakis, Paul; Lindorff-Larsen, Kresten; Piana, Stefano; Dror, Ron O; Eastwood, Michael P; Bank, Joseph A; Jumper, John M; Salmon, John K; Shan, Yibing; Wriggers, Willy

    2010-10-15

    Molecular dynamics (MD) simulations are widely used to study protein motions at an atomic level of detail, but they have been limited to time scales shorter than those of many biologically critical conformational changes. We examined two fundamental processes in protein dynamics, protein folding and conformational change within the folded state, by means of extremely long all-atom MD simulations conducted on a special-purpose machine. Equilibrium simulations of a WW protein domain captured multiple folding and unfolding events that consistently follow a well-defined folding pathway; separate simulations of the protein's constituent substructures shed light on possible determinants of this pathway. A 1-millisecond simulation of the folded protein BPTI revealed a small number of structurally distinct conformational states whose reversible interconversion is slower than local relaxations within those states by a factor of more than 1000.

  16. Floating gastroretentive drug delivery systems: Comparison of experimental and simulated dissolution profiles and floatation behavior.

    PubMed

    Eberle, Veronika A; Schoelkopf, Joachim; Gane, Patrick A C; Alles, Rainer; Huwyler, Jörg; Puchkov, Maxim

    2014-07-16

    Gastroretentive drug delivery systems (GRDDS) play an important role in the delivery of drug substances to the upper part of the gastrointestinal tract; they offer a possibility to overcome the limited gastric residence time of conventional dosage forms. The aim of the study was to understand the drug-release and floatation mechanisms of a floating GRDDS based on functionalized calcium carbonate (FCC). The inherently low apparent density of the excipient (approx. 0.6 g/cm³) enabled floatation. The high specific surface area of FCC (approx. 70 m²) gave the resulting compacts sufficient hardness. The floating mechanism of the GRDDS was simulated in silico under acidic and neutral conditions, and the results were compared to those obtained in vitro. United States Pharmacopeia (USP) dissolution methods are of limited usefulness for evaluating the floating behavior and drug release of floating dosage forms. Therefore, we developed a custom-built stomach model to simultaneously analyze floating characteristics and drug release. In silico dissolution and floatation profiles of the FCC-based tablet were simulated using a three-dimensional cellular-automata-based model. In simulated gastric fluid, the FCC-based tablets showed instant floatation. The compacts stayed afloat during the measurement in 0.1 N HCl and eroded completely while releasing the model drug substance. When water was used as the dissolution medium, the tablets had no floating lag time but sank during the measurement, resulting in a change of release kinetics. Floating dosage forms based on FCC appear promising. It was possible to manufacture floating tablets featuring a density of less than unity and sufficient hardness for further processing. In silico dissolution simulation offered a possibility to understand the floating behavior and drug-release mechanism. Copyright © 2014 Elsevier B.V. All rights reserved.
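
    The paper's dissolution model is a three-dimensional cellular automaton; as a rough illustration of the idea only, the toy two-dimensional Python sketch below erodes exposed boundary cells of a compact with a fixed probability per step and records the cumulative release fraction. Grid size and probability are invented for illustration.

      # Toy 2D cellular-automaton erosion sketch (illustrative only).
      import numpy as np

      rng = np.random.default_rng(0)
      solid = np.ones((50, 50), dtype=bool)      # compact cross-section
      p_dissolve = 0.3                           # per-step erosion probability
      total, released = solid.sum(), [0.0]

      for step in range(200):
          # A solid cell is exposed if any 4-neighbor is already dissolved.
          pad = np.pad(solid, 1, constant_values=False)
          all_solid = (pad[:-2, 1:-1] & pad[2:, 1:-1] &
                       pad[1:-1, :-2] & pad[1:-1, 2:])
          exposed = solid & ~all_solid
          solid &= ~(exposed & (rng.random(solid.shape) < p_dissolve))
          released.append(1.0 - solid.sum() / total)  # cumulative release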

  17. Effects of ground-water withdrawals on flow in the Sauk River Valley Aquifer and on streamflow in the Cold Spring area, Minnesota

    USGS Publications Warehouse

    Lindgren, R.J.

    2001-01-01

    The simulated contributing areas for selected water-supply wells in the Cold Spring area generally extend to, and possibly beyond, the model boundaries to the north and to the southeast. The contributing areas for the Gold'n Plump Poultry Processing Plant supply wells extend: (1) to the Sauk River, (2) north to and possibly beyond the northern model boundary, and (3) southeast to and possibly beyond the southeastern model boundary. The primary effects of the projected increase in ground-water withdrawals of 0.23 cubic feet per second (a 7.5-percent increase) were to: (1) decrease outflow from the Sauk River Valley aquifer through constant-head boundaries and (2) decrease leakage from the valley unit of the Sauk River Valley aquifer to the streams. No appreciable differences were discernible between the simulated steady-state contributing areas to wells with 1998 pumpage and those with the projected pumpage.

  18. Using numeric simulation in an online e-learning environment to teach functional physiological contexts.

    PubMed

    Christ, Andreas; Thews, Oliver

    2016-04-01

    Mathematical models are suitable for simulating complex biological processes by a set of non-linear differential equations. Such simulation models can be used as e-learning tools in medical education. However, in many cases these mathematical systems have to be treated numerically, which is computationally intensive. The aim of the study was to develop a system for numerical simulation to be used in an online e-learning environment. In the software system, the simulation is located on the server as a CGI application. The user (student) selects the boundary conditions for the simulation (e.g., properties of a simulated patient) in the browser. With these parameters the simulation is started on the server, and the result is returned to the browser. With this system, two example e-learning units were realized. The first uses a multi-compartment model of the glucose-insulin control loop to simulate the plasma glucose level after a simulated meal or during diabetes (including treatment by subcutaneous insulin application). The second simulates the ion transport leading to the resting and action potentials in nerves. The student can vary parameters systematically to explore the biological behavior of the system. The described system is able to simulate complex biological processes and offers the possibility of using these models in an online e-learning environment. As far as the underlying principles can be described mathematically, this type of system can be applied to a broad spectrum of biomedical or natural-scientific topics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
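
    The server-side computation in such a unit reduces to numerical integration of the model ODEs. The Python sketch below integrates a deliberately simplified two-variable glucose-insulin loop with an explicit Euler scheme; the equations and parameters are invented placeholders, not the multi-compartment model used by the authors.

      # Simplified glucose-insulin loop, explicit Euler (illustrative only).
      import numpy as np

      def simulate(meal=5.0, t_end=300.0, dt=0.1):
          g, i = 90.0, 10.0        # plasma glucose (mg/dL), insulin (uU/mL)
          gb, ib = 90.0, 10.0      # basal levels
          t, out = 0.0, []
          while t < t_end:
              absorption = meal * np.exp(-t / 30.0) / 30.0     # meal input
              dg = -0.01 * (g - gb) - 0.001 * i * g + absorption
              di = 0.05 * max(g - gb, 0.0) - 0.1 * (i - ib)    # secretion
              g, i = g + dt * dg, i + dt * di
              out.append((t, g, i))
              t += dt
          return out

    In the described architecture, a CGI wrapper would run such a function with the student's chosen parameters and return the trace to the browser for plotting.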

  19. Simulating fail-stop in asynchronous distributed systems

    NASA Technical Reports Server (NTRS)

    Sabel, Laura; Marzullo, Keith

    1994-01-01

    The fail-stop failure model appears frequently in the distributed systems literature. However, in an asynchronous distributed system, the fail-stop model cannot be implemented. In particular, it is impossible to reliably detect crash failures in an asynchronous system. In this paper, we show that it is possible to specify and implement a failure model that is indistinguishable from the fail-stop model from the point of view of any process within an asynchronous system. We give necessary conditions for a failure model to be indistinguishable from the fail-stop model, and derive lower bounds on the amount of process replication needed to implement such a failure model. We present a simple one-round protocol for implementing one such failure model, which we call simulated fail-stop.

  20. Organic Chemistry in Interstellar Ices: Connection to the Comet Halley Results

    NASA Technical Reports Server (NTRS)

    Schutte, W. A.; Agarwal, V. K.; deGroot, M. S.; Greenberg, J. M.; McCain, P.; Ferris, J. P.; Briggs, R.

    1997-01-01

    Mass spectroscopic measurements on the gas and dust in the coma of Comet Halley revealed the presence of considerable amounts of organic species. Greenberg (1973) proposed that, prior to the formation of the comet, UV processing of the ice mantles on grains in dense clouds could lead to the formation of complex organic molecules. Theoretical predictions of the internal UV field in dense clouds, as well as the discovery in interstellar ices of species like OCS and OCN- that have been formed in simulation experiments by photoprocessing of interstellar ice analogues, point to the importance of such processing. We undertook a laboratory simulation study of the formation of organic molecules in interstellar ices and their possible relevance to the Comet Halley results.

  1. First-principles quantum-mechanical investigations of biomass conversion at the liquid-solid interfaces

    NASA Astrophysics Data System (ADS)

    Dang, Hongli; Xue, Wenhua; Liu, Yingdi; Jentoft, Friederike; Resasco, Daniel; Wang, Sanwu

    2014-03-01

    We report first-principles density-functional calculations and ab initio molecular dynamics (MD) simulations for the reactions involving furfural, which is an important intermediate in biomass conversion, at the catalytic liquid-solid interfaces. The different dynamic processes of furfural at the water-Cu(111) and water-Pd(111) interfaces suggest different catalytic reaction mechanisms for the conversion of furfural. Simulations for the dynamic processes with and without hydrogen demonstrate the importance of the liquid-solid interface as well as the presence of hydrogen in possible catalytic reactions including hydrogenation and decarbonylation of furfural. Supported by DOE (DE-SC0004600). This research used the supercomputer resources of the XSEDE, the NERSC Center, and the Tandy Supercomputing Center.

  2. Qualitative simulation for process modeling and control

    NASA Technical Reports Server (NTRS)

    Dalle Molle, D. T.; Edgar, T. F.

    1989-01-01

    A qualitative model is developed for a first-order system with a proportional-integral controller, without precise knowledge of the process or controller parameters. Simulation of the qualitative model yields all of the solutions to the system equations. In developing the qualitative model, a necessary condition for the occurrence of oscillatory behavior is identified. Initializations that cannot exhibit oscillatory behavior produce a finite set of behaviors. When the phase-space representation of the oscillatory behavior is properly constrained, these initializations produce an infinite but comprehensible set of asymptotically stable behaviors. While the predictions include all possible behaviors of the real system, a class of spurious behaviors has been identified. When limited numerical information is included in the model, the number of predictions is significantly reduced.
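
    A quantitative counterpart makes the qualitative distinction concrete: for a first-order plant y' = (-y + u)/tau under PI control, the closed loop is second order and oscillates only for certain gain combinations. The Python sketch below (all parameter values illustrative) can be run with different kp and ki to reproduce both behavior classes.

      # First-order plant with PI controller, explicit Euler (illustrative).
      def simulate(kp, ki, tau=1.0, setpoint=1.0, dt=0.01, t_end=20.0):
          y, integral, t, trace = 0.0, 0.0, 0.0, []
          while t < t_end:
              error = setpoint - y
              integral += error * dt
              u = kp * error + ki * integral    # PI control law
              y += dt * (-y + u) / tau          # first-order plant
              trace.append((t, y))
              t += dt
          return trace

    The closed-loop characteristic equation tau*s^2 + (1 + kp)*s + ki = 0 shows the response is oscillatory exactly when (1 + kp)^2 < 4*tau*ki, which is the kind of condition the qualitative model identifies without numerical parameter values.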

  3. Generation of ordinary mode electromagnetic radiation near the upper hybrid frequency in the magnetosphere

    NASA Technical Reports Server (NTRS)

    Ashour-Abdalla, M.; Okuda, H.

    1984-01-01

    It is shown by means of plasma numerical simulations that long-wavelength ordinary mode electromagnetic radiation can be generated from short-wavelength electrostatic waves near the upper hybrid resonance frequency in an inhomogeneous plasma. A possible relation of this process to nonthermal continuum radiation in the magnetosphere is discussed.

  4. Representing climate, disturbance, and vegetation interactions in landscape models

    Treesearch

    Robert E. Keane; Donald McKenzie; Donald A. Falk; Erica A.H. Smithwick; Carol Miller; Lara-Karena B. Kellogg

    2015-01-01

    The prospect of rapidly changing climates over the next century calls for methods to predict their effects on myriad, interactive ecosystem processes. Spatially explicit models that simulate ecosystem dynamics at fine (plant, stand) to coarse (regional, global) scales are indispensable tools for meeting this challenge under a variety of possible futures. A special...

  5. Electro-peroxone pretreatment for enhanced simulated hospital wastewater treatment and antibiotic resistance genes reduction.

    PubMed

    Zheng, He-Shan; Guo, Wan-Qian; Wu, Qu-Li; Ren, Nan-Qi; Chang, Jo-Shu

    2018-06-01

    Hospital wastewater is one of the possible sources responsible for the spread of antibiotic-resistant bacteria into the environment. This study proposed a promising strategy, electro-peroxone (E-peroxone) pretreatment followed by a sequencing batch reactor (SBR), for simulated hospital wastewater treatment, aiming to enhance treatment performance and to reduce antibiotic resistance gene production simultaneously. The highest chemical oxygen demand (COD) and total organic carbon (TOC) removal efficiencies, of 94.3% and 92.8%, were obtained using the E-peroxone-SBR process. Microbial community analysis through high-throughput sequencing showed that E-peroxone pretreatment could preserve microbial richness and diversity in the SBR, reduce the microbial inhibition caused by antibiotics, and raise the abundance of nitrification and denitrification genera. Specifically, quantitative real-time PCR revealed that E-peroxone pretreatment could largely reduce the number and content of antibiotic resistance genes (ARGs) produced in the following biological treatment unit. This indicates that the E-peroxone-SBR process may provide an effective way for hospital wastewater treatment and possible ARG reduction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. A Tool to Simulate the Transmission, Reception, and Execution of Interactive TV Applications

    PubMed Central

    Kulesza, Raoni; Rodrigues, Thiago; Machado, Felipe A. L.; Santos, Celso A. S.

    2017-01-01

    The emergence of Interactive Digital Television (iDTV) opened a set of technological possibilities that go beyond those offered by conventional TV. Among these opportunities we can highlight interactive contents that run together with linear TV program (television service where the viewer has to watch a scheduled TV program at the particular time it is offered and on the particular channel it is presented on). However, developing interactive contents for this new platform is not as straightforward as, for example, developing Internet applications. One of the options to make this development process easier and safer is to use an iDTV simulator. However, after having investigated some of the existing iDTV simulation environments, we have found a limitation: these simulators mainly present solutions focused on the TV receiver, whose interactive content must be loaded in advance by the programmer to a local repository (e.g., Hard Drive, USB). Therefore, in this paper, we propose a tool, named BiS (Broadcast iDTV content Simulator), which makes possible a broader solution for the simulation of interactive contents. It allows simulating the transmission of interactive content along with the linear TV program (simulating the transmission of content over the air and in broadcast to the receivers). To enable this, we defined a generic and easy-to-customize communication protocol that was implemented in the tool. The proposed environment differs from others because it allows simulating reception of both linear content and interactive content while running Java applications to allow such a content presentation. PMID:28280770

  7. Application of welding simulation to block joints in shipbuilding and assessment of welding-induced residual stresses and distortions

    NASA Astrophysics Data System (ADS)

    Fricke, Wolfgang; Zacke, Sonja

    2014-06-01

    During ship design, welding-induced distortions are roughly estimated as a function of the size of the component as well as the welding process, and residual stresses are assumed to be locally in the range of the yield stress. Existing welding simulation methods are very complex and time-consuming and therefore not applicable to large structures like ships. Simplified methods for the estimation of welding effects were, and still are, the subject of several research projects, but mostly concerning smaller structures. The main goal of this paper is the application of a multi-layer welding simulation to the block joint of a ship structure. When welding block joints, high constraints occur due to the ship structure, which are assumed to result in correspondingly high residual stresses. Constraints measured during construction were realized in a test plant for small-scale welding specimens in order to investigate their effects, among others, on the residual stresses. Associated welding simulations were successfully performed with fine-mesh finite element models. Further analyses showed that a coarser mesh was also able to reproduce the welding-induced reaction forces, and hence the residual stresses, after some calibration. Based on the coarse modeling it was possible to perform the welding simulation at a block joint in order to investigate the influence of the resulting residual stresses on the behavior of the real structure, showing quite interesting stress distributions. Finally, it is discussed whether smaller, idealized models of definite areas of the block joint can be used to achieve the same results, offering possibilities to consider residual stresses in the design process.

  8. ENERGY DISSIPATION AND LANDAU DAMPING IN TWO- AND THREE-DIMENSIONAL PLASMA TURBULENCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Tak Chu; Howes, Gregory G.; Klein, Kristopher G.

    Plasma turbulence is ubiquitous in space and astrophysical plasmas, playing an important role in plasma energization, but the physical mechanisms leading to dissipation of the turbulent energy remain to be definitively identified. Kinetic simulations in two dimensions (2D) have been extensively used to study the dissipation process. How the limitation to 2D affects energy dissipation remains unclear. This work provides a model of comparison between two- and three-dimensional (3D) plasma turbulence using gyrokinetic simulations; it also explores the dynamics of distribution functions during the dissipation process. It is found that both 2D and 3D nonlinear gyrokinetic simulations of a low-beta plasma generate electron velocity-space structures with the same characteristics as those of the linear Landau damping of Alfvén waves in a 3D linear simulation. The continual occurrence of the velocity-space structures throughout the turbulence simulations suggests that the action of Landau damping may be responsible for the turbulent energy transfer to electrons in both 2D and 3D, and makes possible the subsequent irreversible heating of the plasma through collisional smoothing of the velocity-space fluctuations. Although in the 2D case, where variation along the equilibrium magnetic field is absent, it might be expected that Landau damping is not possible, a common trigonometric factor appears in the 2D resonant denominator, leaving the resonance condition unchanged from the 3D case. The evolution of the 2D and 3D cases is qualitatively similar. However, quantitatively, the nonlinear energy cascade and subsequent dissipation are significantly slower in the 2D case.

  9. Integrating Visualizations into Modeling NEST Simulations

    PubMed Central

    Nowke, Christian; Zielasko, Daniel; Weyers, Benjamin; Peyser, Alexander; Hentschel, Bernd; Kuhlen, Torsten W.

    2015-01-01

    Modeling large-scale spiking neural networks showing realistic biological behavior in their dynamics is a complex and tedious task. Since these networks consist of millions of interconnected neurons, their simulation produces an immense amount of data. In recent years it has become possible to simulate even larger networks. However, solutions to assist researchers in understanding the simulation's complex emergent behavior by means of visualization are still lacking. While developing tools to partially fill this gap, we encountered the challenge of integrating these tools easily into the neuroscientists' daily workflow. To understand what makes this so challenging, we looked into the workflows of our collaborators and analyzed how they use visualizations to solve their daily problems. We identified two major issues: first, the analysis process can rapidly change focus, which requires switching the visualization tool that assists in the current problem domain. Second, because of the heterogeneous data that result from simulations, researchers want to relate different data modalities in order to investigate them effectively. Since a monolithic application model, processing and visualizing all data modalities and reflecting all combinations of possible workflows in a holistic way, is most likely impossible to develop and maintain, a software architecture that offers specialized visualization tools that run simultaneously and can be linked together to reflect the current workflow is a more feasible approach. To this end, we have developed a software architecture that allows neuroscientists to integrate visualization tools more closely into their modeling tasks. In addition, it forms the basis for semantic linking of different visualizations to reflect the current workflow. In this paper, we present this architecture and substantiate the usefulness of our approach with common use cases we encountered in our collaborative work. PMID:26733860

  10. Micromagnetic simulation of exchange coupled ferri-/ferromagnetic heterostructures

    PubMed Central

    Oezelt, Harald; Kovacs, Alexander; Reichel, Franz; Fischbacher, Johann; Bance, Simon; Gusenbauer, Markus; Schubert, Christian; Albrecht, Manfred; Schrefl, Thomas

    2015-01-01

    Exchange coupled ferri-/ferromagnetic heterostructures are a possible material composition for future magnetic storage and sensor applications. In order to understand the driving mechanisms in the demagnetization process, we perform micromagnetic simulations by employing the Landau–Lifshitz–Gilbert equation. The magnetization reversal is dominated by pinning events within the amorphous ferrimagnetic layer and at the interface between the ferrimagnetic and the ferromagnetic layer. The shape of the computed magnetization reversal loop corresponds well with experimental data, if a spatial variation of the exchange coupling across the ferri-/ferromagnetic interface is assumed. PMID:25937693
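
    For reference, the Landau-Lifshitz-Gilbert equation employed in such simulations can be integrated for a single macrospin in a few lines. The Python sketch below is illustrative only: material constants and field values are invented, and real micromagnetics also discretizes space and computes a full effective field.

      # Single-macrospin LLG integration (illustrative sketch).
      import numpy as np

      def llg_step(m, h_eff, dt, gamma=2.211e5, alpha=0.1):
          """Explicit step of the LLG equation; m is kept at unit length."""
          prec = -gamma * np.cross(m, h_eff)                       # precession
          damp = -gamma * alpha * np.cross(m, np.cross(m, h_eff))  # damping
          m = m + dt * (prec + damp) / (1.0 + alpha**2)
          return m / np.linalg.norm(m)

      m = np.array([0.0, 0.1, 0.99]); m /= np.linalg.norm(m)
      h = np.array([0.0, 0.0, -8.0e4])     # reversed applied field (A/m)
      for _ in range(10000):
          m = llg_step(m, h, dt=1e-13)     # m relaxes toward -z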

  11. Numerical simulation of thermal stress distributions in Czochralski-grown silicon crystals

    NASA Astrophysics Data System (ADS)

    Kumar, M. Avinash; Srinivasan, M.; Ramasamy, P.

    2018-04-01

    Numerical simulation is one of the important tools in the investigation and optimization of single-crystal silicon grown by the Czochralski (Cz) method. A 2D steady global heat transfer model was used to investigate the temperature distribution and the thermal stress distributions at a particular crystal position during the Cz growth process. The computation determines thermal stress measures such as the von Mises stress and the maximum shear stress along the grown crystal and shows a possible reason for dislocation formation in Cz-grown single-crystal silicon.
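
    For reference, the two stress measures named above have standard definitions in terms of the principal stresses sigma_1 >= sigma_2 >= sigma_3 (textbook formulas, not specific to this paper), in LaTeX notation:

      \sigma_{vM} = \sqrt{\tfrac{1}{2}\left[(\sigma_1-\sigma_2)^2 + (\sigma_2-\sigma_3)^2 + (\sigma_3-\sigma_1)^2\right]},
      \qquad
      \tau_{\max} = \tfrac{1}{2}(\sigma_1 - \sigma_3)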

  12. Modeling of radiation damage recovery in particle detectors based on GaN

    NASA Astrophysics Data System (ADS)

    Gaubas, E.; Ceponis, T.; Pavlov, J.

    2015-12-01

    The pulsed characteristics of capacitor-type and PIN-diode-type detectors based on GaN have been simulated using dynamic and drift-diffusion models. The drift-diffusion current simulations were implemented using the commercial software package Synopsys TCAD Sentaurus. The bipolar drift regime has been analyzed. The possible internal gain in charge collection through carrier multiplication processes governed by impact ionization has been considered as a way to compensate for the carrier lifetime reduction caused by radiation defects introduced into the GaN material of the detector.
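
    For reference, drift-diffusion solvers of this kind evolve the standard current and continuity equations (textbook form, not taken from the paper), in LaTeX notation:

      \mathbf{J}_n = q\,\mu_n n \mathbf{E} + q D_n \nabla n, \qquad
      \mathbf{J}_p = q\,\mu_p p \mathbf{E} - q D_p \nabla p, \qquad
      \frac{\partial n}{\partial t} = \frac{1}{q}\,\nabla\cdot\mathbf{J}_n + G - R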

  13. Computer simulations in teaching physics: Development and implementation of a hypermedia system for high school teachers

    NASA Astrophysics Data System (ADS)

    da Silva, A. M. R.; de Macêdo, J. A.

    2016-06-01

    Motivated by current technological advances and by the difficulty students have in learning physics, this article describes the process of elaborating and implementing a hypermedia system for high school teachers that involves computer simulations for teaching basic concepts of electromagnetism, built with a free tool. With the completion and publication of the project, there is a new possibility for students and teachers to interact with this technology in the classroom and in labs.

  14. Brian Hears: Online Auditory Processing Using Vectorization Over Channels

    PubMed Central

    Fontaine, Bertrand; Goodman, Dan F. M.; Benichoux, Victor; Brette, Romain

    2011-01-01

    The human cochlea includes about 3000 inner hair cells which filter sounds at frequencies between 20 Hz and 20 kHz. This massively parallel frequency analysis is reflected in models of auditory processing, which are often based on banks of filters. However, existing implementations do not exploit this parallelism. Here we propose algorithms to simulate these models by vectorizing computation over frequency channels, which are implemented in “Brian Hears,” a library for the spiking neural network simulator package “Brian.” This approach allows us to use high-level programming languages such as Python, because with vectorized operations, the computational cost of interpretation represents a small fraction of the total cost. This makes it possible to define and simulate complex models in a simple way, while all previous implementations were model-specific. In addition, we show that these algorithms can be naturally parallelized using graphics processing units, yielding substantial speed improvements. We demonstrate these algorithms with several state-of-the-art cochlear models, and show that they compare favorably with existing, less flexible, implementations. PMID:21811453
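
    The core idea, vectorizing the per-sample filter update across all frequency channels, can be sketched in a few lines of Python/NumPy. This is an illustration only; the band layout and first-order bandpass design are assumptions of this example, not Brian Hears code.

      # Vectorized biquad filterbank: one section per channel, all
      # channels updated per sample (illustrative sketch).
      import numpy as np
      from scipy.signal import butter

      fs = 44100.0
      cf = np.geomspace(100.0, 8000.0, 32)       # 32 center frequencies
      sos = np.stack([butter(1, [f / 1.2, f * 1.2], btype="bandpass",
                             fs=fs, output="sos")[0] for f in cf])
      b, a = sos[:, :3], sos[:, 3:]              # per-channel coefficients

      def run(x):
          """Filter mono input x through all channels simultaneously."""
          z1, z2 = np.zeros(len(cf)), np.zeros(len(cf))
          y = np.empty((len(x), len(cf)))
          for n, xn in enumerate(x):             # loop over time only;
              yn = b[:, 0] * xn + z1             # channel math is vectorized
              z1 = b[:, 1] * xn - a[:, 1] * yn + z2
              z2 = b[:, 2] * xn - a[:, 2] * yn
              y[n] = yn
          return y

    The interpreter overhead is then paid once per sample rather than once per sample and channel, which is precisely the vectorization-over-channels argument made above.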

  15. Emerging CFD technologies and aerospace vehicle design

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael J.

    1995-01-01

    With the recent focus on the needs of design and applications CFD, research groups have begun to address the traditional bottlenecks of grid generation and surface modeling. Now, a host of emerging technologies promise to shortcut or dramatically simplify the simulation process. This paper discusses the current status of these emerging technologies. It argues that some tools are already available which can have a positive impact on portions of the design cycle. However, in most cases, these tools need to be integrated into specific engineering systems and process cycles to be used effectively. The rapidly maturing status of unstructured and Cartesian approaches for inviscid simulations suggests the possibility of highly automated Euler-boundary-layer simulations with application to loads estimation and even preliminary design. Similarly, technology is available to link block-structured mesh generation algorithms with topology libraries to avoid tedious re-meshing of topologically similar configurations. Work in algorithmic auto-blocking suggests that domain decomposition and point placement operations in multi-block mesh generation may be properly posed as problems in Computational Geometry, and following this approach may lead to robust algorithmic processes for automatic mesh generation.

  16. myPresto/omegagene: a GPU-accelerated molecular dynamics simulator tailored for enhanced conformational sampling methods with a non-Ewald electrostatic scheme.

    PubMed

    Kasahara, Kota; Ma, Benson; Goto, Kota; Dasgupta, Bhaskar; Higo, Junichi; Fukuda, Ikuo; Mashimo, Tadaaki; Akiyama, Yutaka; Nakamura, Haruki

    2016-01-01

    Molecular dynamics (MD) is a promising computational approach for investigating the dynamical behavior of molecular systems at the atomic level. Here, we present a new MD simulation engine named "myPresto/omegagene" that is tailored for enhanced conformational sampling methods with a non-Ewald electrostatic potential scheme. Our enhanced conformational sampling methods, e.g., the virtual-system-coupled multi-canonical MD (V-McMD) method, replace a multi-process parallelized run with multiple independent runs to avoid inter-node communication overhead. In addition, adopting the non-Ewald-based zero-multipole summation method (ZMM) makes it possible to eliminate the Fourier-space calculations altogether. The combination of these state-of-the-art techniques realizes efficient and accurate calculation of the conformational ensemble at an equilibrium state. By taking advantage of these techniques, myPresto/omegagene is specialized for single-process execution on a Graphics Processing Unit (GPU). We performed benchmark simulations for the 20-mer peptide Trp-cage with explicit solvent. One of the most thermodynamically stable conformations generated by the V-McMD simulation is very similar to the experimentally solved native conformation. Furthermore, the computation speed is four times faster than that of our previous simulation engine, myPresto/psygene-G. The new simulator, myPresto/omegagene, is freely available at the following URLs: http://www.protein.osaka-u.ac.jp/rcsfp/pi/omegagene/ and http://presto.protein.osaka-u.ac.jp/myPresto4/.

  17. Nonstationary envelope process and first excursion probability

    NASA Technical Reports Server (NTRS)

    Yang, J.

    1972-01-01

    A definition of the envelope of nonstationary random processes is proposed. The establishment of the envelope definition makes it possible to simulate the nonstationary random envelope directly. Envelope statistics, such as the density function, joint density function, moment function, and level-crossing rate, which are relevant to analyses of catastrophic failure, fatigue, and crack propagation in structures, are derived. Applications of the envelope statistics to the prediction of structural reliability under random loadings are discussed in detail.
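
    For orientation, the classical envelope of a stationary process, which definitions such as the one proposed here generalize to the nonstationary case, is written in terms of the Hilbert transform \hat{x}(t) (standard form, not the paper's definition), in LaTeX notation:

      a(t) = \sqrt{\,x^2(t) + \hat{x}^2(t)\,}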

  18. Understanding Atmospheric Anomalies Associated With Seasonal Pluvial-Drought Processes Using Southwest China as an Example

    NASA Astrophysics Data System (ADS)

    Liu, Zhenchen; Lu, Guihua; He, Hai; Wu, Zhiyong; He, Jian

    2017-11-01

    Seasonal pluvial-drought transition processes are unique natural phenomena. To explore possible mechanisms, we considered Southwest China (SWC) as the study region and comprehensively investigated the temporal evolution and spatial patterns of large-scale and regional atmospheric variables with the simple method of Standardized Anomalies (SA). Key procedures and results include the following: (1) Because regional atmospheric variables are more directly responsible for the transition processes, we investigate them in detail. The temporal evolution of the net vertical integral of water vapor flux (net VIWVF) across SWC, together with vertical SA-based patterns of regional horizontal divergence (D) and vertical motion (ω), coincides well with the pluvial-drought transition processes. (2) With respect to large-scale circulation patterns, a well-organized Eurasian (EU) pattern is one important feature during the pluvial-drought transitions over SWC. (3) Based on these large-scale and regional atmospheric anomalies, relevant SA-based indices were built to explore the possibility of simulating drought development from preceding pluvial anomalies. Overall, simulating drought development with SA-based indices of large-scale circulation patterns alone does not perform well; performance improves considerably when SA-based indices of regional D and net VIWVF are introduced. (4) The potential for drought prediction using pluvial anomalies, together with a deeper understanding of the physical mechanisms responsible for pluvial-drought transitions, needs to be further explored.
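
    The Standardized Anomaly of a variable x is conventionally computed against the climatological mean and standard deviation of the corresponding calendar window (standard definition, assumed here), in LaTeX notation:

      \mathrm{SA} = \frac{x - \bar{x}_{\mathrm{clim}}}{\sigma_{\mathrm{clim}}}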

  19. Method for construction of a biased potential for hyperdynamic simulation of atomic systems

    NASA Astrophysics Data System (ADS)

    Duda, E. V.; Kornich, G. V.

    2017-10-01

    An approach to constructing a biased potential for hyperdynamic simulation of atomic systems is considered. Using this approach, the diffusion of an atom adsorbed on the surface of a two-dimensional crystal and a vacancy in the bulk of the crystal are simulated. The influence of the variation in the potential barriers due to thermal vibrations of atoms on the results of calculations is discussed. It is shown that the bias of the potential in the hyperdynamic simulation makes it possible to obtain statistical samples of transitions of atomic systems between states, similar to those given by classical molecular dynamics. However, hyperdynamics significantly accelerates computations in comparison with molecular dynamics in the case of temperature-activated transitions and the associated processes in atomic systems.
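
    For context, bias-potential constructions of this kind build on Voter's hyperdynamics, in which the system evolves on the boosted surface V(x) + \Delta V(x), with \Delta V \geq 0 vanishing near the dividing surfaces, and the accelerated time accumulates as (standard formulation, not the specific construction of the paper):

      t_{\mathrm{hyper}} = \sum_{i=1}^{n} \Delta t_{\mathrm{MD}}\, e^{\beta\,\Delta V(\mathbf{x}_i)}, \qquad \beta = 1/k_B T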

  20. Accurate lithography simulation model based on convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Kimura, Taiki; Matsunawa, Tetsuaki; Nojima, Shigeki

    2017-07-01

    Lithography simulation is an essential technique for today's semiconductor manufacturing process. In order to calculate an entire chip in realistic time, a compact resist model is commonly used; the model is established for faster calculation. To obtain an accurate compact resist model, it is necessary to fit a complicated non-linear model function. However, it is difficult to choose an appropriate function manually because there are many options. This paper proposes a new compact resist model using CNNs (Convolutional Neural Networks), one of the deep learning techniques. The CNN model makes it possible to determine an appropriate model function and achieve accurate simulation. Experimental results show the CNN model can reduce CD prediction errors by 70% compared with the conventional model.
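
    A minimal PyTorch sketch of such a CNN-based compact resist model is shown below; the architecture, patch size, and output (a single resist response per aerial-image patch) are assumptions of this example, not the network described in the paper.

      # Minimal CNN resist-model sketch (illustrative architecture).
      import torch
      import torch.nn as nn

      class ResistCNN(nn.Module):
          """Map an aerial-image patch to a scalar resist response."""
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                  nn.AdaptiveAvgPool2d(1),
              )
              self.head = nn.Linear(32, 1)

          def forward(self, x):                  # x: (batch, 1, H, W)
              return self.head(self.features(x).flatten(1))

      model = ResistCNN()
      pred = model(torch.randn(4, 1, 64, 64))    # four 64x64 patches

    Training such a model against rigorous simulation or metrology data replaces the manual choice of a non-linear model function described above.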

  1. Investigating Small-Molecule Ligand Binding to G Protein-Coupled Receptors with Biased or Unbiased Molecular Dynamics Simulations

    PubMed Central

    Marino, Kristen A.; Filizola, Marta

    2017-01-01

    An increasing number of G protein-coupled receptor (GPCR) crystal structures provide important—albeit static—pictures of how small molecules or peptides interact with their receptors. These high-resolution structures represent a tremendous opportunity to apply molecular dynamics (MD) simulations to capture atomic-level dynamical information that is not easy to obtain experimentally. Understanding ligand binding and unbinding processes, as well as the related responses of the receptor, is crucial to the design of better drugs targeting GPCRs. Here, we discuss possible ways to study the dynamics involved in the binding of small molecules to GPCRs, using long timescale MD simulations or metadynamics-based approaches. PMID:29188572

  2. Investigating Small-Molecule Ligand Binding to G Protein-Coupled Receptors with Biased or Unbiased Molecular Dynamics Simulations.

    PubMed

    Marino, Kristen A; Filizola, Marta

    2018-01-01

    An increasing number of G protein-coupled receptor (GPCR) crystal structures provide important, albeit static, pictures of how small molecules or peptides interact with their receptors. These high-resolution structures represent a tremendous opportunity to apply molecular dynamics (MD) simulations to capture atomic-level dynamical information that is not easy to obtain experimentally. Understanding ligand binding and unbinding processes, as well as the related responses of the receptor, is crucial to the design of better drugs targeting GPCRs. Here, we discuss possible ways to study the dynamics involved in the binding of small molecules to GPCRs, using long timescale MD simulations or metadynamics-based approaches.

  3. syris: a flexible and efficient framework for X-ray imaging experiments simulation.

    PubMed

    Faragó, Tomáš; Mikulík, Petr; Ershov, Alexey; Vogelgesang, Matthias; Hänschke, Daniel; Baumbach, Tilo

    2017-11-01

    An open-source framework for conducting a broad range of virtual X-ray imaging experiments, syris, is presented. The simulated wavefield created by a source propagates through an arbitrary number of objects until it reaches a detector. The objects in the light path and the source are time-dependent, which enables simulations of dynamic experiments, e.g. four-dimensional time-resolved tomography and laminography. The high-level interface of syris is written in Python, and its modularity makes the framework very flexible. The computationally demanding parts behind this interface are implemented in OpenCL, which enables fast calculations on modern graphics processing units. The combination of flexibility and speed opens new possibilities for studying novel imaging methods and for systematically searching for optimal combinations of measurement conditions and data processing parameters. This can help to increase the success rate and efficiency of valuable synchrotron beam time. To demonstrate the capabilities of the framework, various experiments have been simulated and compared with real data. To show the use case of measurement and data-processing parameter optimization based on simulation, a virtual counterpart of a high-speed radiography experiment was created and the simulated data were used to select a suitable motion estimation algorithm; one of its parameters was optimized in order to achieve the best motion estimation accuracy when applied to the real data. syris was also used to simulate tomographic data sets under various imaging conditions which impact the tomographic reconstruction accuracy, and it is shown how the accuracy may guide the selection of imaging conditions for particular use cases.
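
    A core building block of such wavefield simulations is free-space propagation between object planes. The NumPy sketch below implements a standard Fresnel transfer-function propagator by FFT; it is a generic illustration under that assumption, not syris's actual API.

      # Free-space Fresnel propagation of a complex wavefield (sketch).
      import numpy as np

      def propagate(field, dx, wavelength, distance):
          """Propagate a square complex wavefield by `distance` (SI units)."""
          n = field.shape[0]
          f = np.fft.fftfreq(n, d=dx)            # spatial frequencies
          fx, fy = np.meshgrid(f, f)
          kernel = np.exp(-1j * np.pi * wavelength * distance
                          * (fx**2 + fy**2))     # Fresnel transfer function
          return np.fft.ifft2(np.fft.fft2(field) * kernel)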

  4. Modeling the influence of climate change on watershed systems: Adaptation through targeted practices

    NASA Astrophysics Data System (ADS)

    Dudula, John; Randhir, Timothy O.

    2016-10-01

    Climate change may influence the hydrologic processes of watersheds (IPCC, 2013), and increased runoff may cause flooding, eroded stream banks, widening of stream channels, increased pollutant loading, and consequently impairment of aquatic life. The goal of this study was to quantify the potential impacts of climate change on watershed hydrologic processes and to evaluate the scale and effectiveness of management practices for adaptation. We simulate baseline watershed conditions using the Hydrological Simulation Program Fortran (HSPF) model to examine the possible effects of changing climate on watershed processes. We also simulate the effects of adaptation and mitigation through specific best management strategies for various climatic scenarios. Because of its continuing low-flow conditions and vulnerability to climate change, the Ipswich watershed is the focus of this study. We quantify fluxes in runoff, evapotranspiration, infiltration, sediment load, and nutrient concentrations under baseline and climate change scenarios (near and far future). We model adaptation options for mitigating climate effects on watershed processes using bioretention/raingarden Best Management Practices (BMPs). It was observed that climate change has a significant impact on watershed runoff and that carefully designed and maintained BMPs at the subwatershed scale can be effective in mitigating some of the problems related to stormwater runoff. Policy options include implementation of BMPs through education and incentives for scale-dependent and site-specific bioretention units/raingardens to increase the resilience of the watershed system to current and future climate change.

  5. The application of virtual reality systems as a support of digital manufacturing and logistics

    NASA Astrophysics Data System (ADS)

    Golda, G.; Kampa, A.; Paprocka, I.

    2016-08-01

    Modern trends in the development of computer-aided techniques are heading toward the integration of the design of competitive products with so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle should be aided and managed by advanced product lifecycle management software packages: starting from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, and distribution to customers and after-sale service, up to recycling or utilization. This paper describes important problems in providing an efficient flow of materials in the supply chain management of the whole product lifecycle using computer simulation. The authors pay attention to the processes of acquiring the relevant information and correct data necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possible applications of virtual reality software for modeling and simulating production and logistics processes in an enterprise across different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics, show modeled and programmed examples and solutions, point to development trends, and show application options that go beyond the enterprise.

  6. Novel Web-based Education Platforms for Information Communication utilizing Gamification, Virtual and Immersive Reality

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2015-12-01

    Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting the teaching and learning of concepts in the atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain and weather information and water simulation, and it offers an environment in which students can learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display.

  7. Foundations of a query and simulation system for the modeling of biochemical and biological processes.

    PubMed

    Antoniotti, M; Park, F; Policriti, A; Ugel, N; Mishra, B

    2003-01-01

    The analysis of large amounts of data, produced as (numerical) traces of in vivo, in vitro, and in silico experiments, has become a central activity for many biologists and biochemists. Recent advances in the mathematical modeling and computation of biochemical systems have moreover increased the prominence of in silico experiments; such experiments typically involve the simulation of sets of Differential Algebraic Equations (DAE), e.g., Generalized Mass Action (GMA) systems and S-systems. In this paper we reason about the necessary theoretical and pragmatic foundations for a query and simulation system capable of analyzing large amounts of such trace data. To this end, we propose to combine in a novel way several well-known tools from numerical analysis (approximation theory), temporal logic and verification, and visualization. The result is a preliminary prototype system: simpathica/xssys. When dealing with simulation data, simpathica/xssys exploits the special structure of the underlying DAE and reduces the search space in an efficient way so as to facilitate queries about the traces. The proposed system is designed to give the user the possibility to systematically analyze and simultaneously query different possible timed evolutions of the modeled system.
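
    As a small example of the model class mentioned above, an S-system represents each production and degradation rate as a product of power laws. The Python sketch below integrates a two-variable instance with SciPy; the exponents and rate constants are invented for illustration and do not come from the paper.

      # Two-variable S-system integrated with SciPy (illustrative).
      from scipy.integrate import solve_ivp

      def s_system(t, x):
          x1, x2 = x
          dx1 = 2.0 * x2**0.5 - 1.0 * x1**0.8   # production - degradation
          dx2 = 1.5 * x1**0.2 - 2.0 * x2**0.9
          return [dx1, dx2]

      sol = solve_ivp(s_system, (0.0, 20.0), [0.5, 0.5], dense_output=True)
      trace = sol.sol([0.0, 5.0, 10.0, 20.0])    # sampled trajectory

    Traces of this kind are exactly the data a query system such as the one proposed here would analyze.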

  8. Cluster observations and simulations of He+ EMIC triggered emissions

    NASA Astrophysics Data System (ADS)

    Grison, B.; Shoji, M.; Santolik, O.; Omura, Y.

    2012-12-01

    EMIC triggered emissions have been reported in the inner magnetosphere at the nightside edge of the plasmapause [Pickett et al., 2010]. The generation mechanism proposed by Omura et al. [2010] is very similar to that of whistler-mode chorus emissions, and simulation results agree with observations and theory [Shoji and Omura, 2011]. The main characteristics of these emissions, generated in the magnetic equatorial region, are a frequency dispersion with time and a high level of coherence. The start frequency of the previously mentioned observations is above half of the proton gyrofrequency, which means that the emissions are generated on the proton branch. On the He+ branch, the generation of triggered emissions in the same region requires more energetic protons, and the triggering process starts below the He+ gyrofrequency, which makes their identification in Cluster data rather difficult. Recent simulation results confirm the possibility of EMIC triggered emissions on the He+ branch. In the present contribution we compare a Cluster event to simulation results in order to investigate whether the observations can be identified as a He+ triggered emission. The impact of the observed waves on particle precipitation is also investigated.

  9. Preferred Tempo and Low-Audio-Frequency Bias Emerge From Simulated Sub-cortical Processing of Sounds With a Musical Beat

    PubMed Central

    Zuk, Nathaniel J.; Carney, Laurel H.; Lalor, Edmund C.

    2018-01-01

    Prior research has shown that musical beats are salient at the level of the cortex in humans. Yet below the cortex there is considerable sub-cortical processing that could influence beat perception. Some biases, such as a tempo preference and an audio-frequency bias for beat timing, could result from sub-cortical processing. Here, we used models of the auditory nerve and of midbrain-level amplitude-modulation filtering to simulate sub-cortical neural activity in response to various beat-inducing stimuli, and we used the simulated activity to determine the tempo or beat frequency of the music. First, irrespective of the stimulus being presented, the preferred tempo was around 100 beats per minute, which is within the range of tempi where tempo discrimination and tapping accuracy are optimal. Second, sub-cortical processing predicted a stronger influence of lower audio frequencies on beat perception. However, the tempo identification algorithm that was optimized for simple stimuli often failed for recordings of music. For music, the most highly synchronized model activity occurred at a multiple of the beat frequency. Bottom-up processes alone are insufficient to produce beat-locked activity. Instead, a learned and possibly top-down mechanism that scales the synchronization frequency to derive the beat frequency greatly improves the performance of tempo identification. PMID:29896080

  10. A novel approach to simulate gene-environment interactions in complex diseases.

    PubMed

    Amato, Roberto; Pinelli, Michele; D'Andrea, Daniel; Miele, Gennaro; Nicodemi, Mario; Raiconi, Giancarlo; Cocozza, Sergio

    2010-01-05

    Complex diseases are multifactorial traits caused by both genetic and environmental factors. They represent the major part of human diseases and include those with the largest prevalence and mortality (cancer, heart disease, obesity, etc.). Despite the large amount of information that has been collected about both genetic and environmental risk factors, there are few examples of studies of their interactions in the epidemiological literature. One reason may be incomplete knowledge of the power of the statistical methods designed to search for risk factors and their interactions in these data sets. An improvement in this direction would lead to a better understanding and description of gene-environment interactions. To this aim, a possible strategy is to challenge the different statistical methods against data sets where the underlying phenomenon is completely known and fully controllable, for example simulated ones. We present a mathematical approach that models gene-environment interactions. By this method it is possible to generate simulated populations having gene-environment interactions of any form, involving any number of genetic and environmental factors, and also allowing non-linear interactions such as epistasis. In particular, we implemented a simple version of this model in a Gene-Environment iNteraction Simulator (GENS), a tool designed to simulate case-control data sets where a one gene-one environment interaction influences the disease risk. The main aim has been to allow the input of population characteristics through standard epidemiological measures and to implement constraints that make the simulator's behaviour biologically meaningful. With the multi-logistic model implemented in GENS it is possible to simulate case-control samples of complex diseases where gene-environment interactions influence the disease risk. The user has full control of the main characteristics of the simulated population, and a Monte Carlo process allows for random variability. A knowledge-based approach reduces the complexity of the mathematical model by using reasonable biological constraints and makes the simulation more understandable in biological terms. Simulated data sets can be used for the assessment of novel statistical methods or for the evaluation of statistical power when designing a study.
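
    The multi-logistic idea can be sketched compactly: genotype and exposure are drawn first, and disease status is then sampled from a logistic risk that includes an interaction term. The Python example below is a simplified stand-in for GENS with invented parameters, not the tool itself.

      # One gene-one environment case-control simulation (illustrative).
      import numpy as np

      rng = np.random.default_rng(42)
      n = 10000
      g = rng.binomial(2, 0.3, n)             # genotype: 0/1/2 risk alleles
      e = rng.normal(0.0, 1.0, n)             # standardized exposure

      b0, bg, be, bge = -2.0, 0.3, 0.4, 0.5   # bge is the GxE interaction
      logit = b0 + bg * g + be * e + bge * g * e
      risk = 1.0 / (1.0 + np.exp(-logit))
      disease = rng.random(n) < risk          # case/control labels

    Fitting a logistic regression with and without the interaction term to such data is one way to estimate the statistical power a study design would have.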

  11. Test/score/report: Simulation techniques for automating the test process

    NASA Technical Reports Server (NTRS)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and the repetition of specific test procedures for a ground data system become simpler tasks. Ideally, complete automation would mean plugging the operational deliverable into the simulator, pressing the start button, executing the test procedure, accumulating and analyzing the data, scoring the results, and reporting the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation together with test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational it should greatly reduce the time necessary to test a POCC/MOC software delivery, as well as improve the quality of the test process.

  12. Validated simulator for space debris removal with nets and other flexible tethers applications

    NASA Astrophysics Data System (ADS)

    Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil

    2016-12-01

    In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for the dynamics of net-shaped elastic bodies and their interactions with rigid bodies has been developed. Its main application is to aid net design and to test scenarios for space debris deorbitation. The simulator can model all phases of the debris capturing process: net launch, flight, and wrapping around the target. It handles coupled simulation of rigid- and flexible-body dynamics. Flexible bodies were implemented using the Cosserat rod model, which allows the simulation of flexible threads or wires with elasticity and damping for stretching, bending, and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented. Programmatic interaction with the simulation is possible, e.g. for implementing control. The underlying model has been experimentally validated; because of the significant influence of gravity, the experiment had to be performed in microgravity conditions. The validation experiment, performed in parabolic flight, was a downscaled version of the Envisat capture process. The prepacked net was launched towards the satellite model, expanded, hit the model, and wrapped around it. The whole process was recorded with two fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare the net dynamics to the respective simulations and thus to validate the simulation tool. The experiments were performed on board a Falcon-20 aircraft operated by the National Research Council in Ottawa, Canada. Validation results show that the model reflects the physics of the phenomenon accurately enough to be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases, and the methodology behind the validation. Results are presented and typical use cases are discussed, showing that the software may be used to design throw nets for space debris capture, but also to simulate the deorbitation process, the chaser control system, or general interactions between rigid and elastic bodies, all in a convenient and efficient way. The presented work was led by SKA Polska under an ESA contract, within the CleanSpace initiative.
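
    A heavily simplified stand-in for the rod model, useful only to convey the time integration of a flexible tether, is a stretch-only mass-spring chain; the Python sketch below uses semi-implicit Euler with invented material properties (the validated simulator itself uses full Cosserat rods with bending and torsion, which this sketch omits).

      # Stretch-only mass-spring tether, semi-implicit Euler (sketch).
      import numpy as np

      n, rest = 20, 0.5                        # nodes, rest length (m)
      pos = np.stack([np.linspace(0.0, (n - 1) * rest, n),
                      np.zeros(n), np.zeros(n)], axis=1)
      vel = np.zeros_like(pos)
      mass, k, c, dt = 0.05, 500.0, 0.05, 1e-4 # invented properties

      for _ in range(100000):
          force = -c * vel                     # simple damping
          seg = pos[1:] - pos[:-1]
          length = np.linalg.norm(seg, axis=1, keepdims=True)
          f = k * (length - rest) * seg / length  # springs along segments
          force[:-1] += f
          force[1:] -= f
          vel[1:] += dt * force[1:] / mass     # node 0 is a fixed anchor
          pos[1:] += dt * vel[1:]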

  13. Computer simulation of a space SAR using a range-sequential processor for soil moisture mapping

    NASA Technical Reports Server (NTRS)

    Fujita, M.; Ulaby, F. (Principal Investigator)

    1982-01-01

    The ability of a spaceborne synthetic aperture radar (SAR) to detect soil moisture was evaluated by means of a computer simulation technique. The computer simulation package includes coherent processing of the SAR data using a range-sequential processor, which can be set up through hardware implementations, thereby reducing the amount of telemetry involved. With such a processing approach, it is possible to monitor the earth's surface on a continuous basis, since data storage requirements can be easily met through the use of currently available technology. The development of the simulation package is described, followed by an examination of the application of the technique to actual environments. The results indicate that in estimating soil moisture content with a four-look processor, the difference between the assumed and estimated values of soil moisture is within ±20% of field capacity for 62% of the pixels for agricultural terrain and for 53% of the pixels for hilly terrain. The estimation accuracy for soil moisture may be improved by reducing the effect of fading through non-coherent averaging.
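
    The closing remark about non-coherent averaging can be illustrated numerically: single-look SAR intensity over a uniform target is exponentially distributed, and averaging N independent looks reduces the relative standard deviation by a factor of sqrt(N). A small sketch with illustrative numbers only:

```python
# Illustration of fading reduction by non-coherent multi-look averaging.
# Single-look SAR intensity over a uniform target follows an exponential
# distribution; averaging N independent looks cuts the relative standard
# deviation by a factor of sqrt(N). Numbers below are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
true_sigma0 = 1.0                      # mean backscatter of the uniform target
pixels = 100_000

for looks in (1, 4, 16):
    # each pixel: average of `looks` independent exponential intensity draws
    intensity = rng.exponential(true_sigma0, size=(pixels, looks)).mean(axis=1)
    rel_std = intensity.std() / intensity.mean()
    print(f"{looks:2d}-look: relative std = {rel_std:.3f} (theory {1/np.sqrt(looks):.3f})")
```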

  14. Assessment of a Hybrid Continuous/Discontinuous Galerkin Finite Element Code for Geothermal Reservoir Simulations

    DOE PAGES

    Xia, Yidong; Podgorney, Robert; Huang, Hai

    2016-03-17

    FALCON (“Fracturing And Liquid CONvection”) is a hybrid continuous/discontinuous Galerkin finite element geothermal reservoir simulation code based on the MOOSE (“Multiphysics Object-Oriented Simulation Environment”) framework, which is being developed and used for multiphysics applications. In the present work, a suite of verification and validation (“V&V”) test problems for FALCON was defined to meet the design requirements and solved in support of enhanced geothermal system (“EGS”) design. Furthermore, the intent of this test problem suite is to provide baseline comparison data that demonstrates the performance of the FALCON solution methods. The simulation problems vary in complexity from single mechanical or thermal processes to coupled thermo-hydro-mechanical processes in geological porous media. Numerical results obtained by FALCON agreed well with either the available analytical solutions or experimental data, indicating the verified and validated implementation of these capabilities in FALCON. Some form of solution verification has been attempted to identify sensitivities in the solution methods, where possible, and to suggest best practices when using the FALCON code.

  15. The R package 'RLumModel': Simulating charge transfer in quartz

    NASA Astrophysics Data System (ADS)

    Friedrich, Johannes; Kreutzer, Sebastian; Schmidt, Christoph

    2017-04-01

    Kinetic models of quartz luminescence have gained an important role in predicting experimental results and in understanding charge transfers in (natural) quartz as well as in other dosimetric materials, e.g., Al2O3:C. We present the R package 'RLumModel', offering an easy-to-use tool for simulating quartz luminescence signals (TL, OSL, LM-OSL and RF) based on five integrated and published parameter sets, as well as the possibility of using custom parameters. Simulation commands can be created (a) with the Risø Sequence Editor, (b) with a built-in SAR sequence generator, or (c) with self-explanatory keywords for customised sequences. Results can be analysed seamlessly using the R package 'Luminescence', along with a visualisation of the concentrations of electrons and holes in every trap/centre, as well as in the valence and conduction bands, during all stages of the simulation. Modelling luminescence signals can help in understanding the charge transfer processes occurring in nature or during measurements in the laboratory. This will lead to a better understanding of several processes concerning geoscientific questions, because quartz is the second most abundant mineral in the Earth's continental crust.
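
    The kind of kinetics such packages integrate can be shown with the simplest possible case: a single trap with first-order (Randall-Wilkins) detrapping under a linear heating ramp. The sketch below is in Python rather than R, and its trap depth, frequency factor, and heating rate are illustrative values, not one of the package's published parameter sets.

```python
# Minimal first-order (Randall-Wilkins) TL glow-curve sketch. 'RLumModel'
# itself implements much richer multi-trap/multi-centre models in R; this
# single-trap Python example only illustrates the kind of kinetics involved.
# Trap depth E, frequency factor s, and heating rate are illustrative values.
import numpy as np

E = 1.0             # trap depth [eV]
s = 1e12            # frequency factor [1/s]
k_B = 8.617e-5      # Boltzmann constant [eV/K]
beta = 5.0          # linear heating rate [K/s]
n = 1e7             # initial trapped-electron concentration [a.u.]

T = np.arange(300.0, 700.0, 0.01)      # temperature ramp [K]
dt = (T[1] - T[0]) / beta              # time spent per temperature step [s]
intensity = np.empty_like(T)
for i, temp in enumerate(T):
    p = s * np.exp(-E / (k_B * temp))  # escape rate per trapped electron [1/s]
    intensity[i] = n * p               # TL intensity = detrapping rate
    n *= np.exp(-p * dt)               # exact first-order decay over the step

print(f"glow peak at {T[np.argmax(intensity)]:.0f} K")  # ~400 K for these values
```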

  16. Atomistic minimal model for estimating profile of electrodeposited nanopatterns

    NASA Astrophysics Data System (ADS)

    Asgharpour Hassankiadeh, Somayeh; Sadeghi, Ali

    2018-06-01

    We develop a computationally efficient and methodologically simple approach to realize molecular dynamics simulations of electrodeposition. Our minimal model takes into account the nontrivial electric field due to a sharp electrode tip in order to simulate the controllable coating of a thin layer on a surface with atomic precision. On the atomic scale, highly site-selective electrodeposition of ions and charged particles by means of the sharp tip of a scanning probe microscope is possible. A better understanding of the microscopic process, obtained mainly from atomistic simulations, helps us to enhance the quality of this nanopatterning technique and to make it applicable to the fabrication of nanowires and nanocontacts. In the limit of screened inter-particle interactions, it is feasible to run very fast simulations of the electrodeposition process within the framework of the proposed model and thus to investigate how the shape of the overlayer depends on the tip-sample geometry and dielectric properties, electrolyte viscosity, etc. Our calculated results reveal that the sharpness of the profile of a nanoscale deposited overlayer is dictated by the component of the electric field normal to the sample surface underneath the tip.
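
    In the screened limit where ion-ion interactions drop out, the essence of such a model is Brownian dynamics of independent ions in the tip-induced field. The sketch below uses an effective point-charge tip with a grounded-plane image charge; the field model and every parameter are illustrative assumptions, not the paper's model.

```python
# Brownian-dynamics (overdamped Langevin) sketch of independent ions drifting
# in the inhomogeneous field of a sharp tip, ignoring ion-ion interactions
# (the screened limit). The tip is modeled as an effective point charge with
# a grounded-plane image; this field model and all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
tip = np.array([0.0, 0.0, 5.0])       # tip apex position [nm]
image = np.array([0.0, 0.0, -5.0])    # image charge below the sample plane z = 0
A = 50.0                              # effective drift strength (illustrative)
D = 0.1                               # diffusion coefficient [nm^2/s]
dt = 1e-3                             # time step [s]
ions = rng.uniform([-8.0, -8.0, 0.5], [8.0, 8.0, 4.5], size=(500, 3))
landed_xy = []

for _ in range(20000):
    r1, r2 = ions - tip, ions - image
    d1 = np.linalg.norm(r1, axis=1, keepdims=True)
    d2 = np.linalg.norm(r2, axis=1, keepdims=True)
    drift = A * (r1 / d1**3 - r2 / d2**3)          # repelled by tip, pulled to image
    ions = ions + drift * dt + np.sqrt(2 * D * dt) * rng.normal(size=ions.shape)
    hit = ions[:, 2] <= 0.0                        # ion reached the sample surface
    landed_xy.extend(ions[hit, :2])
    ions = ions[~hit]
    if len(ions) == 0:
        break

landed_xy = np.array(landed_xy)
spread = np.sqrt((landed_xy ** 2).sum(axis=1))
print(f"{len(landed_xy)} ions deposited; mean lateral distance from axis: {spread.mean():.2f} nm")
```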

  17. A framework for stochastic simulations and visualization of biological electron-transfer dynamics

    NASA Astrophysics Data System (ADS)

    Nakano, C. Masato; Byun, Hye Suk; Ma, Heng; Wei, Tao; El-Naggar, Mohamed Y.

    2015-08-01

    Electron transfer (ET) dictates a wide variety of energy-conversion processes in biological systems. Visualizing ET dynamics could provide key insight into understanding and possibly controlling these processes. We present a computational framework named VizBET to visualize biological ET dynamics, using an outer-membrane Mtr-Omc cytochrome complex in Shewanella oneidensis MR-1 as an example. Starting from X-ray crystal structures of the constituent cytochromes, molecular dynamics simulations are combined with homology modeling, protein docking, and binding free energy computations to sample the configuration of the complex as well as the change of the free energy associated with ET. This information, along with quantum-mechanical calculations of the electronic coupling, provides inputs to kinetic Monte Carlo (KMC) simulations of ET dynamics in a network of heme groups within the complex. Visualization of the KMC simulation results has been implemented as a plugin to the Visual Molecular Dynamics (VMD) software. VizBET has been used to reveal the nature of ET dynamics associated with novel nonequilibrium phase transitions in a candidate configuration of the Mtr-Omc complex due to electron-electron interactions.
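
    The KMC stage of such a framework can be illustrated in miniature: a single electron hopping along a chain of hemes with fixed forward and backward rates, using Gillespie-style sampling of waiting times. The rates below are made-up numbers, not computed couplings or free-energy differences.

```python
# Gillespie-style kinetic Monte Carlo sketch of one electron hopping along a
# chain of hemes, in the spirit of the KMC stage described above. The rates
# are made-up numbers, not computed couplings or free-energy differences.
import numpy as np

rng = np.random.default_rng(1)
k_fwd = np.array([1e7, 5e6, 8e6, 2e6, 1e7])  # forward hop rates between hemes [1/s]
k_bwd = 0.3 * k_fwd                          # backward hop rates [1/s]
n_heme = len(k_fwd) + 1

site, t, hops = 0, 0.0, 0
while site < n_heme - 1:                     # until the electron exits the chain
    rates = [(k_fwd[site], +1)]              # forward hop is always available here
    if site > 0:
        rates.append((k_bwd[site - 1], -1))  # backward hop toward the start
    total = sum(r for r, _ in rates)
    t += rng.exponential(1.0 / total)        # waiting time ~ Exp(total rate)
    u = rng.uniform(0.0, total)              # pick which hop fires
    acc = 0.0
    for rate, move in rates:
        acc += rate
        if u < acc:
            site += move
            break
    hops += 1

print(f"electron crossed {n_heme} hemes in {t * 1e9:.1f} ns after {hops} hops")
```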

  18. Global magnetohydrodynamic simulations on multiple GPUs

    NASA Astrophysics Data System (ADS)

    Wong, Un-Hong; Wong, Hon-Cheng; Ma, Yonghui

    2014-01-01

    Global magnetohydrodynamic (MHD) models play a major role in investigating the solar wind-magnetosphere interaction. However, the huge computational requirements of global MHD simulations remain the main problem to be solved. With the recent development of modern graphics processing units (GPUs) and the Compute Unified Device Architecture (CUDA), it is possible to perform global MHD simulations in a more efficient manner. In this paper, we present a global MHD simulator on multiple GPUs using CUDA 4.0 with GPUDirect 2.0. Our implementation is based on the modified leapfrog scheme, which is a combination of the leapfrog scheme and the two-step Lax-Wendroff scheme. GPUDirect 2.0 is used in our implementation to drive multiple GPUs. All data transfers and kernel processing are managed with the CUDA 4.0 API instead of MPI or OpenMP. Performance measurements are made on a multi-GPU system with eight NVIDIA Tesla M2050 (Fermi architecture) graphics cards. These measurements show that our multi-GPU implementation achieves a peak performance of 97.36 GFLOPS in double precision.
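
    One building block of the modified leapfrog scheme, the two-step Lax-Wendroff method, is easy to show in one dimension for scalar advection; the full simulator applies updates of this family to the 3D MHD equations across GPUs. A minimal CPU-only sketch with an illustrative grid and CFL number:

```python
# Two-step Lax-Wendroff sketch for 1D scalar advection (u_t + a u_x = 0) on a
# periodic grid. This is one ingredient of the modified leapfrog scheme; the
# real simulator applies such finite difference updates to the 3D MHD
# equations on GPUs. Grid size and CFL number are illustrative.
import numpy as np

nx, a, cfl = 200, 1.0, 0.8
dx = 1.0 / nx
dt = cfl * dx / abs(a)
x = np.arange(nx) * dx
u = np.exp(-200 * (x - 0.5) ** 2)           # initial Gaussian pulse

for _ in range(int(1.0 / (a * dt))):        # advect once around the domain
    # step 1: provisional half-step values at cell interfaces i+1/2
    u_half = 0.5 * (u + np.roll(u, -1)) - 0.5 * a * dt / dx * (np.roll(u, -1) - u)
    # step 2: full-step update from the half-step interface values
    u = u - a * dt / dx * (u_half - np.roll(u_half, 1))

print("peak after one period:", u.max())    # slightly below 1 due to dissipation
```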

  19. Interactive Exploration and Analysis of Large-Scale Simulations Using Topology-Based Data Segmentation.

    PubMed

    Bremer, Peer-Timo; Weber, Gunther; Tierny, Julien; Pascucci, Valerio; Day, Marcus S; Bell, John B

    2011-09-01

    Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications, these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that, in a single pass, extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a postprocessing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, and conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the evolution of the features over time. Our system provides a linked-view interface to explore the time-evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
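
    The core mechanic behind such merge-tree segmentations, sweeping a scalar field by value and merging connected components with union-find, can be sketched compactly. The toy below only counts superlevel-set components of a random 2D field at a few thresholds; a real merge tree records the whole hierarchy and per-feature attributes.

```python
# Sketch of the idea underlying merge-tree segmentation: connected components
# of superlevel sets {field >= threshold}, found with union-find on a 2D grid.
# A real merge tree stores the full hierarchy over all thresholds; this toy
# extracts components at a few fixed threshold choices only.
import numpy as np

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]       # path halving
        i = parent[i]
    return i

def superlevel_features(field, threshold):
    """Count connected components of {field >= threshold} on a 2D grid."""
    ny, nx = field.shape
    idx = lambda y, x: y * nx + x
    parent = np.arange(ny * nx)
    active = field >= threshold
    for y in range(ny):
        for x in range(nx):
            if not active[y, x]:
                continue
            for dy, dxn in ((0, -1), (-1, 0)):      # left and up neighbours
                yy, xx = y + dy, x + dxn
                if yy >= 0 and xx >= 0 and active[yy, xx]:
                    parent[find(parent, idx(y, x))] = find(parent, idx(yy, xx))
    roots = {find(parent, idx(y, x)) for y in range(ny) for x in range(nx) if active[y, x]}
    return len(roots)

rng = np.random.default_rng(7)
field = rng.random((64, 64))
for thr in (0.7, 0.9, 0.97):
    print(f"threshold {thr}: {superlevel_features(field, thr)} features")
```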

  20. 3D Printing of Polymer-Bonded Rare-Earth Magnets With a Variable Magnetic Compound Fraction for a Predefined Stray Field.

    PubMed

    Huber, Christian; Abert, Claas; Bruckner, Florian; Groenefeld, Martin; Schuschnigg, Stephan; Teliban, Iulian; Vogler, Christoph; Wautischer, Gregor; Windl, Roman; Suess, Dieter

    2017-08-25

    Additive manufacturing of polymer-bonded magnets is a recently developed technique for single-unit production and for structures that were previously impossible to manufacture. It also opens new possibilities for creating a specific stray field around the magnet. The current work presents a method to 3D print polymer-bonded magnets with a variable magnetic compound fraction distribution. This means the saturation magnetization can be adjusted during the printing process to obtain a required external field from the manufactured magnet. A low-cost, end-user 3D printer with a mixing extruder is used to mix permanent magnetic filaments with pure polyamide (PA12) filaments. The magnetic filaments are compounded, extruded, and characterized for the printing process. To assess the quality of the manufactured magnets with a variable magnetic compound fraction, an inverse stray field framework is developed. The effectiveness of the printing process and the simulation method is shown. The method can also be used to manufacture magnets that produce a predefined stray field in a given region, which opens new possibilities for magnetic sensor applications. This setup and simulation framework allows the design and manufacture of polymer-bonded permanent magnets that are impossible to create with conventional methods.

  1. The trade-off between morphology and control in the co-optimized design of robots.

    PubMed

    Rosendo, Andre; von Atzigen, Marco; Iida, Fumiya

    2017-01-01

    Conventionally, robot morphologies are developed through simulations and calculations, and different control methods are applied afterwards. Assuming that simulations and predictions are simplified representations of our reality, how sure can roboticists be that the chosen morphology is the most adequate for the possible control choices in the real world? Here we study the influence of the design parameters in the creation of a robot with a Bayesian morphology-control (MC) co-optimization process. A robot autonomously creates child robots from a set of possible design parameters and uses Bayesian optimization (BO) to infer the best locomotion behavior from real-world experiments. Then, we systematically change from an MC co-optimization to a control-only (C) optimization, which better represents the traditional way that robots are developed, to explore the trade-off between these two methods. We show that although C processes can greatly improve the behavior of poor morphologies, such agents are still outperformed by MC co-optimization results with as few as 25 iterations. Our findings, on the one hand, suggest that BO should be used in the design process of robots for both morphological and control parameters to reach optimal performance, and on the other hand, point to the downfall of current design methods in the face of new search techniques.
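
    The BO loop at the heart of such a study can be sketched with a Gaussian process surrogate and an upper-confidence-bound acquisition rule. The "experiment" below is a made-up noisy 1D function standing in for a real locomotion trial, and the kernel and acquisition constants are illustrative choices, not those of the paper (scikit-learn is assumed available).

```python
# Hedged sketch of a Bayesian-optimization loop: Gaussian process surrogate
# plus a UCB acquisition rule. The "robot experiment" is a made-up noisy 1D
# test function; kernel and acquisition constants are illustrative choices.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(3)

def run_experiment(x):
    """Stand-in for a real-world locomotion trial (noisy, unknown to the optimizer)."""
    return float(np.sin(3 * x) * (1 - x) + rng.normal(0, 0.05))

X = list(rng.uniform(0, 1, size=3))           # initial random design parameters
y = [run_experiment(x) for x in X]

for it in range(25):                          # the paper reports gains within ~25 iterations
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(np.array(X).reshape(-1, 1), y)
    cand = np.linspace(0, 1, 500).reshape(-1, 1)
    mu, sd = gp.predict(cand, return_std=True)
    x_next = float(cand[np.argmax(mu + 2.0 * sd)])   # UCB acquisition
    X.append(x_next)
    y.append(run_experiment(x_next))

best = int(np.argmax(y))
print(f"best design parameter {X[best]:.3f} with performance {y[best]:.3f}")
```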

  2. The trade-off between morphology and control in the co-optimized design of robots

    PubMed Central

    Iida, Fumiya

    2017-01-01

    Conventionally, robot morphologies are developed through simulations and calculations, and different control methods are applied afterwards. Assuming that simulations and predictions are simplified representations of our reality, how sure can roboticists be that the chosen morphology is the most adequate for the possible control choices in the real world? Here we study the influence of the design parameters in the creation of a robot with a Bayesian morphology-control (MC) co-optimization process. A robot autonomously creates child robots from a set of possible design parameters and uses Bayesian optimization (BO) to infer the best locomotion behavior from real-world experiments. Then, we systematically change from an MC co-optimization to a control-only (C) optimization, which better represents the traditional way that robots are developed, to explore the trade-off between these two methods. We show that although C processes can greatly improve the behavior of poor morphologies, such agents are still outperformed by MC co-optimization results with as few as 25 iterations. Our findings, on the one hand, suggest that BO should be used in the design process of robots for both morphological and control parameters to reach optimal performance, and on the other hand, point to the downfall of current design methods in the face of new search techniques. PMID:29023482

  3. Investigation of silicide-induced-dopant-activation for steep tunnel junction in tunnel field effect transistor (TFET)

    NASA Astrophysics Data System (ADS)

    Kim, Sihyun; Kwon, Dae Woong; Park, Euyhwan; Lee, Junil; Lee, Roongbin; Lee, Jong-Ho; Park, Byung-Gook

    2018-02-01

    Numerous studies on forming a steep tunnel junction within the tunnel field-effect transistor (TFET) have been conducted. One way to make an abrupt junction is source/drain silicidation, which uses the phenomenon often called silicide-induced dopant segregation. It is revealed that the silicide process not only helps dopants pile up adjacent to the metal-silicon alloy but also induces dopant activation, thereby making it possible to avoid an additional high-temperature process. In this report, the feasibility of dopant activation induced by the metal silicide process was thoroughly investigated by diode measurements and device simulation. Metal-silicon (MS) diodes having p+ and n+ silicon formed on a p- substrate exhibit the characteristics of ohmic and pn diodes, respectively, for samples both with and without high-temperature annealing. A device simulation for TFETs with a dopant-segregated source was also conducted, which verified the enhanced DC performance.

  4. Surfactant Based Enhanced Oil Recovery and Foam Mobility Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George J. Hirasaki; Clarence A. Miller; Gary A. Pope

    2005-07-01

    Surfactant flooding has the potential to significantly increase recovery over that of conventional waterflooding. The availability of a large number of surfactant structures makes it possible to conduct a systematic study of the relation between surfactant structure and its efficacy for oil recovery. A combination of two surfactants was found to be particularly effective for application in carbonate formations at low temperature. A formulation has been designed for a particular field application. The addition of an alkali such as sodium carbonate makes possible the in situ generation of surfactant and a significant reduction of surfactant adsorption. In addition to the reduction of interfacial tension to ultra-low values, surfactants and alkali can be designed to alter wettability to enhance oil recovery. The design of the process to maximize the region of ultra-low IFT is more challenging, since the ratio of soap to synthetic surfactant is a parameter in the conditions for optimal salinity. Compositional simulation of the displacement process demonstrates the interdependence of the various components for oil recovery. An alkaline surfactant process is designed to enhance spontaneous imbibition in fractured, oil-wet, carbonate formations. It is able to recover oil from dolomite core samples from which there was no oil recovery when placed in formation brine. Mobility control is essential for surfactant EOR. Foam is evaluated to improve the sweep efficiency of surfactant injected into fractured reservoirs. UTCHEM is a reservoir simulator specially designed for surfactant EOR. It has been modified to represent the effects of a change in wettability. Simulated case studies demonstrate the effects of wettability.

  5. High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6

    NASA Astrophysics Data System (ADS)

    Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; Senior, Catherine A.; Bellucci, Alessio; Bao, Qing; Chang, Ping; Corti, Susanna; Fučkar, Neven S.; Guemas, Virginie; von Hardenberg, Jost; Hazeleger, Wilco; Kodama, Chihiro; Koenigk, Torben; Leung, L. Ruby; Lu, Jian; Luo, Jing-Jia; Mao, Jiafu; Mizielinski, Matthew S.; Mizuta, Ryo; Nobre, Paulo; Satoh, Masaki; Scoccimarro, Enrico; Semmler, Tido; Small, Justin; von Storch, Jin-Song

    2016-11-01

    Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.

  6. A 2D modeling approach for fluid propagation during FE-forming simulation of continuously reinforced composites in wet compression moulding

    NASA Astrophysics Data System (ADS)

    Poppe, Christian; Dörr, Dominik; Henning, Frank; Kärger, Luise

    2018-05-01

    Wet compression moulding (WCM) provides large-scale production potential for continuously fiber-reinforced components as a promising alternative to resin transfer moulding (RTM). Lower cycle times are possible due to the parallelization of the process steps of draping, infiltration, and curing during moulding (viscous draping). Experimental and theoretical investigations indicate a strong mutual dependency between the physical mechanisms that occur during draping and mould filling (fluid-structure interaction). Thus, key process parameters, like fiber orientation, fiber volume fraction, cavity pressure, and the amount and viscosity of the resin, are physically coupled. To enable time- and cost-efficient product and process development throughout all design stages, accurate process simulation tools are desirable. Separate draping and mould-filling simulation models, as appropriate for the sequential RTM process, cannot be applied to the WCM process because of the physical couplings outlined above. Within this study, a two-dimensional Darcy-Propagation-Element (DPE-2D), based on a finite element formulation with additional control volumes (FE/CV), is presented, verified, and applied to the forming simulation of a generic geometry, as a first step towards a fluid-structure-interaction model taking into account simultaneous resin infiltration and draping. The model is implemented in the commercial FE solver Abaqus by means of several user subroutines considering simultaneous draping and 2D infiltration mechanisms. Darcy's equation is solved with respect to the local fiber orientation. Furthermore, the material model can access the local fluid domain properties to update the mechanical forming material parameters, which enables further investigations of the coupled physical mechanisms.
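
    The flow-front mechanics that Darcy's equation describes can be reduced to one dimension for intuition: under a constant pressure difference, the wetted length L(t) advances with dL/dt = K Δp / (μ φ L), giving the classic sqrt(t) fill law. A sketch with illustrative material values:

```python
# 1D sketch of Darcy flow-front propagation: constant-pressure infiltration
# where the wetted length L obeys dL/dt = K * dp / (mu * phi * L), giving the
# classic sqrt(t) fill law. All material values are illustrative.

K = 1e-10     # preform permeability [m^2]
mu = 0.1      # resin viscosity [Pa s]
phi = 0.5     # porosity (pore fraction to be filled)
dp = 1e5      # applied pressure difference [Pa]
dt = 1e-4     # time step [s]
L, t = 1e-3, 0.0                       # small initial wetted length avoids 1/0

while L < 0.1:                         # fill a 10 cm flow length
    L += dt * K * dp / (mu * phi * L)  # front speed = Darcy velocity / porosity
    t += dt

analytic = (0.1**2 - 1e-3**2) * mu * phi / (2 * K * dp)
print(f"numerical fill time {t:.1f} s vs analytic {analytic:.1f} s")
```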

  7. AmapSim: a structural whole-plant simulator based on botanical knowledge and designed to host external functional models.

    PubMed

    Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry

    2008-05-01

    AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment.

  8. AmapSim: A Structural Whole-plant Simulator Based on Botanical Knowledge and Designed to Host External Functional Models

    PubMed Central

    Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry

    2008-01-01

    Background and Aims AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. Methods The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Key Results Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. Conclusions The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment. PMID:17766310

  9. SIM_ADJUST -- A computer code that adjusts simulated equivalents for observations or predictions

    USGS Publications Warehouse

    Poeter, Eileen P.; Hill, Mary C.

    2008-01-01

    This report documents the SIM_ADJUST computer code. SIM_ADJUST surmounts an obstacle that is sometimes encountered when using universal model analysis computer codes such as UCODE_2005 (Poeter and others, 2005), PEST (Doherty, 2004), and OSTRICH (Matott, 2005; Fredrick and others, 2007). These codes often read simulated equivalents from a list in a file produced by a process model, such as MODFLOW, that represents a system of interest. At times, values needed by the universal code are missing or assigned default values because the process model could not produce a useful solution. SIM_ADJUST can be used to (1) read a file that lists expected observation or prediction names and possible alternatives for the simulated values; (2) read a file produced by a process model that contains space- or tab-delimited columns, including a column of simulated values and a column of related observation or prediction names; (3) identify observations or predictions that have been omitted or assigned a default value by the process model; and (4) produce an adjusted file that contains a column of simulated values and a column of associated observation or prediction names. The user may provide alternatives that are constant values or alternative simulated values. The user may also provide a sequence of alternatives. For example, the heads from a series of cells may be specified to ensure that a meaningful value is available to compare with an observation located in a cell that may become dry. SIM_ADJUST is constructed using modules from the JUPITER API and is intended for use on any computer operating system. SIM_ADJUST consists of algorithms programmed in Fortran90, which efficiently performs numerical calculations.
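
    The substitution logic that steps (1)-(4) describe is simple enough to sketch. The Python below is a re-implementation sketch of that idea (the real code is Fortran90 built on the JUPITER API); the dictionary "file formats" and the default sentinel value are simplified assumptions.

```python
# Python sketch of the adjustment logic steps (1)-(4) describe; the real
# SIM_ADJUST is Fortran90 built on the JUPITER API. The dictionary "file
# formats" and the default sentinel value here are simplified assumptions.

DEFAULT = -999.0   # sentinel a process model might write for a failed value

def adjust(expected, simulated):
    """For each observation, keep the simulated value or walk its alternatives.

    `expected` maps each observation name to a list of alternatives; each
    alternative is either another simulated-value name or a constant fallback.
    """
    adjusted = {}
    for name, alternatives in expected.items():
        value = simulated.get(name, DEFAULT)
        for alt in alternatives:                 # e.g. heads in neighbouring cells
            if value != DEFAULT:
                break
            value = simulated.get(alt, DEFAULT) if isinstance(alt, str) else alt
        adjusted[name] = value
    return adjusted

simulated = {"head_A": 12.3, "head_B": DEFAULT, "head_B_nbr": 11.8}
expected = {"head_A": [], "head_B": ["head_B_nbr", 10.0]}  # try neighbour, then constant
print(adjust(expected, simulated))  # {'head_A': 12.3, 'head_B': 11.8}
```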

  10. Impact of tool wear on cross wedge rolling process stability and on product quality

    NASA Astrophysics Data System (ADS)

    Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric

    2017-10-01

    Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetric shape with an accurate distribution of material. This preform is then forged into shape in a forging die. In order to improve CWR tool life cycle and product quality, it is essential to understand tool wear evolution and the physical phenomena in the CWR process that change as the tool geometry evolves with wear. Numerical simulations are necessary to understand CWR tool wear behavior; nevertheless, if the simulations are performed with the CAD geometry of the tool, the results are limited. To overcome this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower rolls) at two different states: (1) before starting the life cycle and (2) at the end of the life cycle. The tools were measured in 3D with a GOM® ATOS Triple Scan system using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of each tool. Each 3D point cloud was digitized and converted into STL format. The geometry of the tools in STL format was the input for the 3D simulations. The two simulations were compared, and the product defects obtained in simulation were compared with the main defects of products found industrially. The two main defects are: (a) surface defects on the preform that are not corrected in the die forging operation; and (b) a bent (no longer straight) preform, with two possible consequences: on the one hand, the robot cannot grab it to take it to the forging stage; on the other hand, an unfilled section results in the forging operation.

  11. A method for validation of finite element forming simulation on basis of a pointwise comparison of distance and curvature

    NASA Astrophysics Data System (ADS)

    Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank

    2016-10-01

    Thermoforming of continuously fiber-reinforced thermoplastics (CFRTP) is ideally suited to thin-walled and complex-shaped products. By means of forming simulation, an initial validation of the producibility of a specific geometry, an optimization of the forming process, and the prediction of fiber reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented which enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing the curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus/Explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared by means of the proposed validation method.
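
    A basic ingredient of such a pointwise comparison, the distance from each measured point to the nearest simulated node, can be sketched with a k-d tree. The data below are synthetic stand-ins for mesh nodes and coordinate-machine measurements:

```python
# Sketch of a pointwise distance error measure between simulated nodes and
# coordinate-machine measurements: for each measured point, the distance to
# the nearest simulated node, found with a k-d tree. Data are synthetic
# stand-ins, with coordinates assumed to be in metres.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
sim_nodes = rng.uniform(0.0, 1.0, size=(5000, 3))       # e.g. FE mesh node positions
measured = sim_nodes[::10] + 0.002 + rng.normal(0, 0.001, size=(500, 3))

tree = cKDTree(sim_nodes)
dist, _ = tree.query(measured)                          # nearest-neighbour distances

print(f"mean error {dist.mean() * 1e3:.2f} mm, max error {dist.max() * 1e3:.2f} mm")
```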

  12. Atomic quantum simulation of the lattice gauge-Higgs model: Higgs couplings and emergence of exact local gauge symmetry.

    PubMed

    Kasamatsu, Kenichi; Ichinose, Ikuo; Matsui, Tetsuo

    2013-09-13

    Recently, the possibility of quantum simulation of dynamical gauge fields was pointed out, using a system of cold atoms trapped on each link of an optical lattice. However, to implement exact local gauge invariance, fine-tuning of the interaction parameters among atoms is necessary. In the present Letter, we study the effect of violation of the U(1) local gauge invariance by relaxing the fine-tuning of the parameters, and we show that a wide variety of cold-atom systems remain faithful quantum simulators of a U(1) gauge-Higgs model containing a Higgs field sitting on the sites. The clarification of the dynamics of this gauge-Higgs model sheds some light upon various unsolved problems, including the inflation process of the early Universe. We study the phase structure of this model by Monte Carlo simulation and also discuss the atomic characteristics of the Higgs phase in each simulator.

  13. The effect of carrier gas flow rate and source cell temperature on low pressure organic vapor phase deposition simulation by direct simulation Monte Carlo method

    PubMed Central

    Wada, Takao; Ueda, Noriaki

    2013-01-01

    The process of low-pressure organic vapor phase deposition (LP-OVPD) controls the growth of amorphous organic thin films, where the source gases (Alq3 molecules, etc.) are introduced into a hot-wall reactor via an injection barrel using an inert carrier gas (N2 molecules). It is possible to control well such substrate properties as dopant concentration, deposition rate, and thickness uniformity of the thin film. In this paper, we present LP-OVPD simulation results using direct simulation Monte Carlo-Neutrals (the Particle-PLUS neutral module), commercial software adopting the direct simulation Monte Carlo method. By properly estimating the evaporation rate from experimental vaporization enthalpies, the calculated deposition rates on the substrate agree well with the experimental results, which depend on carrier gas flow rate and source cell temperature. PMID:23674843

  14. Investigating the Macrodispersion Experiment (MADE) site in Columbus, Mississippi, using a three‐dimensional inverse flow and transport model

    USGS Publications Warehouse

    Christiansen Barlebo, Heidi; Hill, Mary C.; Rosbjerg, Dan

    2004-01-01

    Flowmeter‐measured hydraulic conductivities from the heterogeneous MADE site have been used predictively in advection‐dispersion models. Resulting simulated concentrations failed to reproduce even major plume characteristics and some have concluded that other mechanisms, such as dual porosity, are important. Here an alternative possibility is investigated: that the small‐scale flowmeter measurements are too noisy and possibly too biased to use so directly in site‐scale models and that the hydraulic head and transport data are more suitable for site‐scale characterization. Using a calibrated finite element model of the site and a new framework to evaluate random and systematic model and measurement errors, the following conclusions are derived. (1) If variations in subsurface fluid velocities like those simulated in this work (0.1 and 2.0 m per day along parallel and reasonably close flow paths) exist, it is likely that classical advection‐dispersion processes can explain the measured plume characteristics. (2) The flowmeter measurements are possibly systematically lower than site‐scale values when the measurements are considered individually and using common averaging methods and display variability that obscures abrupt changes in hydraulic conductivities that are well supported by changes in hydraulic gradients and are important to the simulation of transport.

  15. Systematic reconstruction of TRANSPATH data into Cell System Markup Language

    PubMed Central

    Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru

    2008-01-01

    Background Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or regulation of transcription factors and miRNA. Unfortunately, it is difficult to directly use such information when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. Results We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphically representing and simulating biological processes. In the modeling rules, each Petri net element is incorporated with the Cell System Ontology (CSO) to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to convert TRANSPATH data into simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. Conclusion By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions. PMID:18570683

  16. Probing dissipation mechanisms in BL Lac jets through X-ray polarimetry

    NASA Astrophysics Data System (ADS)

    Tavecchio, F.; Landoni, M.; Sironi, L.; Coppi, P.

    2018-06-01

    The dissipation of energy flux in blazar jets plays a key role in the acceleration of relativistic particles. Two possibilities are commonly considered for the dissipation processes: magnetic reconnection (possibly triggered by instabilities in magnetically dominated jets) or shocks (for weakly magnetized flows). We consider the polarimetric features expected for the two scenarios by analyzing the results of state-of-the-art simulations. For the magnetic reconnection scenario we conclude, using results from global relativistic MHD simulations, that the emission likely occurs in turbulent regions with unstructured magnetic fields, although the simulations do not allow us to draw firm conclusions. On the other hand, with local particle-in-cell simulations we show that, for shocks with a magnetic field geometry suitable for particle acceleration, the self-generated magnetic field at the shock front is predominantly orthogonal to the shock normal and becomes quasi-parallel downstream. Based on this result we develop a simplified model to calculate the frequency-dependent degree of polarization, assuming that high-energy particles are injected at the shock and cool downstream. We apply our results to HBLs, blazars with the maximum of their synchrotron output at UV-soft X-ray energies. While in the optical band the predicted degree of polarization is low, in the X-ray emission it can ideally reach 50%, especially during active/flaring states. The comparison between measurements in the optical and in the X-ray band made during active states (feasible with the planned IXPE satellite) is expected to provide valuable constraints on the dissipation and acceleration processes.

  17. Systematic reconstruction of TRANSPATH data into cell system markup language.

    PubMed

    Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru

    2008-06-23

    Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or regulation of transcription factors and miRNA. Unfortunately, it is difficult to directly use such information when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphically representing and simulating biological processes. In the modeling rules, each Petri net element is incorporated with the Cell System Ontology (CSO) to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to convert TRANSPATH data into simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions.

  18. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process

    PubMed Central

    Fraser, Kirk A.; St-Georges, Lyne; Kiss, Laszlo I.

    2014-01-01

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), it is desirable to be able to model the process and determine the highest speed of advance that will not cause unwanted welding defects. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 work pieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach, an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time. PMID:28788627
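
    The pipeline of Monte-Carlo sampling, response-surface fitting, and exterior-penalty search can be sketched end to end. In the sketch below the finite difference heat solver is replaced by a made-up placeholder function, and the coefficients, temperature band, and penalty weight are illustrative assumptions, not values from the paper.

```python
# End-to-end sketch of the sampling / response-surface / exterior-penalty
# pipeline. The finite difference heat solver is replaced by a made-up
# placeholder; the coefficients, temperature band, and penalty weight are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(11)

def heat_model(v, w):
    """Placeholder for the transient FD solution: max weld temperature [C]."""
    return 300 + 0.9 * w - 150 * v + 0.02 * w * v + rng.normal(0, 2, size=np.shape(v))

V = rng.uniform(0.5, 3.0, 200)     # sampled advance speeds [mm/s]
W = rng.uniform(500, 1500, 200)    # sampled rotational speeds [rpm]
T = heat_model(V, W)               # Monte-Carlo sampling of the process window

# quadratic response surface T(v, w) fitted by least squares
A = np.column_stack([np.ones_like(V), V, W, V * W, V**2, W**2])
coef, *_ = np.linalg.lstsq(A, T, rcond=None)
surface = lambda v, w: np.array([1.0, v, w, v * w, v**2, w**2]) @ coef

# exterior penalty: maximize v subject to T_min <= T(v, w) <= T_max
T_min, T_max, mu = 900.0, 1000.0, 10.0
def objective(p):
    v, w = p
    t = surface(v, w)
    return -v + mu * (max(0.0, t - T_max) ** 2 + max(0.0, T_min - t) ** 2)

# a crude grid search stands in for a real optimizer
grid = [(v, w) for v in np.linspace(0.5, 3.0, 60) for w in np.linspace(500, 1500, 60)]
v_opt, w_opt = min(grid, key=objective)
print(f"optimum: advance {v_opt:.2f} mm/s at {w_opt:.0f} rpm, "
      f"predicted max T = {surface(v_opt, w_opt):.0f} C")
```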

  19. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process.

    PubMed

    Fraser, Kirk A; St-Georges, Lyne; Kiss, Laszlo I

    2014-04-30

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), it is desirable to be able to model the process and determine the highest speed of advance that will not cause unwanted welding defects. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 work pieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach, an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time.

  20. Spatially explicit simulation of hydrologically controlled carbon and nitrogen cycles and associated feedback mechanisms in a boreal ecosystem

    NASA Astrophysics Data System (ADS)

    Govind, Ajit; Chen, Jing Ming; Ju, Weimin

    2009-06-01

    Ecosystem models that simulate biogeochemical processes usually ignore hydrological controls that govern them. It is quite possible that topographically driven water fluxes significantly influence the spatial distribution of C sources and sinks because of their large contribution to the local water balance. To investigate this, we simulated biogeochemical processes along with the associated feedback mechanisms in a boreal ecosystem using a spatially explicit hydroecological model, boreal ecosystem productivity simulator (BEPS)-TerrainLab V2.0, that has a tight coupling of ecophysiological, hydrological, and biogeochemical processes. First, the simulated dynamics of snowpack, soil temperature, net ecosystem productivity (NEP), and total ecosystem respiration (TER) were validated with high-frequency measurements for 2 years. The model was able to explain 80% of the variability in NEP and 84% of the variability in TER. Further, we investigated the influence of topographically driven subsurface base flow on soil C and N cycling and on the spatiotemporal patterns of C sources and sinks using three hydrological modeling scenarios that differed in hydrological conceptualizations. In general, the scenarios that had nonexplicit hydrological representation overestimated NEP, as opposed to the scenario that had an explicit (realistic) representation. The key processes controlling the NEP differences were attributed to the combined effects of variations in photosynthesis (due to changes in stomatal conductance and nitrogen (N) availability), heterotrophic respiration, and autotrophic respiration, all of which occur simultaneously affecting NEP. Feedback relationships were also found to exacerbate the differences. We identified six types of NEP differences (biases), of which the most commonly found was due to an underestimation of the existing C sources, highlighting the vulnerability of regional-scale ecosystem models that ignore hydrological processes.

  1. [Simulation of administrative influence on a public health under conditions of changing the terminal values during the process of society transformation].

    PubMed

    Fed'ko, O A

    2010-01-01

    The article studies the interrelation between terminal values and subjective health in the context of community development, and models possible directions of administrative influence on public health that take this interrelation into account. A structural model of the influence of terminal values on subjective health is presented, and an assessment is given of the possible population-level changes in health following the introduction of the corresponding administrative interventions.

  2. Effects of anthropogenic groundwater exploitation on land surface processes: A case study of the Haihe River Basin, northern China

    NASA Astrophysics Data System (ADS)

    Zou, Jing; Xie, Zhenghui; Zhan, Chesheng; Qin, Peihua; Sun, Qin; Jia, Binghao; Xia, Jun

    2015-05-01

    In this study, we incorporated a groundwater exploitation scheme into the land surface model CLM3.5 to investigate the effects of the anthropogenic exploitation of groundwater on land surface processes in a river basin. Simulations of the Haihe River Basin in northern China were conducted for the years 1965-2000 using the model. A control simulation without exploitation and three exploitation simulations with different water demands derived from socioeconomic data related to the Basin were conducted. The results showed that groundwater exploitation for human activities resulted in increased wetting and cooling effects at the land surface and reduced groundwater storage. A lowering of the groundwater table, increased upper soil moisture, reduced 2 m air temperature, and enhanced latent heat flux were detected by the end of the simulated period, and the changes at the land surface were related linearly to the water demands. To determine the possible responses of the land surface processes in extreme cases (i.e., in which the exploitation process either continued or ceased), additional hypothetical simulations for the coming 200 years were conducted with constant climate forcing. The simulations revealed that the local groundwater storage on the plains could not sustain high-intensity exploitation for long if the exploitation process continued at the current rate. In that case, changes attributable to groundwater exploitation reach extreme values and then weaken within decades as groundwater resources are depleted, and the exploitation process is therefore forced to cease. However, if exploitation is stopped completely to allow groundwater to recover, drying and warming effects, such as increased temperature, reduced soil moisture, and reduced total runoff, would occur in the Basin within the early decades of the simulation period. The effects of exploitation will then gradually disappear, and the variables will approach the natural state and stabilize at different rates. Simulations were also conducted for cases in which exploitation either continues or ceases using future climate scenario outputs from a general circulation model. The resulting trends were almost the same as those of the simulations with constant climate forcing, despite differences in the climate data input. Therefore, a balance between slow groundwater restoration and rapid human development of the land must be achieved to maintain a sustainable water resource.

  3. Impregnation of Composite Materials: a Numerical Study

    NASA Astrophysics Data System (ADS)

    Baché, Elliott; Dupleix-Couderc, Chloé; Arquis, Eric; Berdoyes, Isabelle

    2017-12-01

    Oxide ceramic matrix composites are currently being developed for aerospace applications such as exhaust components, where the parts are subject to moderately high temperatures (≈ 700 °C) and oxidation. These composite materials are normally formed by, among other steps, impregnating a ceramic fabric with a slurry of ceramic particles. This impregnation process can be complex, with voids possibly forming in the fabric depending on the process parameters and material properties. Unwanted voids or macroporosity within the fabric can decrease the mechanical properties of the parts. In order to design an efficient manufacturing process able to impregnate the fabric well, numerical simulations may be used to design both the process and the slurry. In this context, a tool is created for modeling the different processes. Thétis, which solves the Navier-Stokes-Darcy-Brinkman equation using finite volumes, is expanded to take capillary pressures into account on the mesoscale. This formulation is more representative than Darcy's-law (homogeneous preform) simulations while avoiding the prohibitive simulation times of a full discretization of the constituent fibers at the representative elementary volume scale. The resulting tool is first used to investigate the effect of varying the slurry parameters on the impregnation evolution. Two different processes, open-bath impregnation and wet lay-up, are then studied with emphasis on varying their input parameters (e.g. inlet velocity).

  4. Modeling and Simulation of An Adaptive Neuro-Fuzzy Inference System (ANFIS) for Mobile Learning

    ERIC Educational Resources Information Center

    Al-Hmouz, A.; Shen, Jun; Al-Hmouz, R.; Yan, Jun

    2012-01-01

    With recent advances in mobile learning (m-learning), it is becoming possible for learning activities to occur everywhere. The learner model presented in our earlier work was partitioned into smaller elements in the form of learner profiles, which collectively represent the entire learning process. This paper presents an Adaptive Neuro-Fuzzy…

  5. Addressing spatial scales and new mechanisms in climate impact ecosystem modeling

    NASA Astrophysics Data System (ADS)

    Poulter, B.; Joetzjer, E.; Renwick, K.; Ogunkoya, G.; Emmett, K.

    2015-12-01

    Climate change impacts on vegetation distributions are typically addressed using either an empirical approach, such as a species distribution model (SDM), or process-based methods, for example, dynamic global vegetation models (DGVMs). Each approach has its own benefits and disadvantages. For example, an SDM is constrained by data and few parameters, but does not include adaptation or acclimation processes or other ecosystem feedbacks that may act to mitigate or enhance climate effects. Alternatively, a DGVM includes many mechanisms relating plant growth and disturbance to climate, but simulations are costly to perform at high spatial resolution and there remains large uncertainty in a variety of fundamental physical processes. To address these issues, we present here two DGVM-based case studies in which (i) high-resolution (1 km) simulations are being performed for vegetation in the Greater Yellowstone Ecosystem using a biogeochemical forest gap model, LPJ-GUESS, and (ii) new mechanisms for simulating tropical tree mortality are being introduced. High-resolution DGVM simulations require not only reorganized code and computing resources but also a consideration of how scaling affects vegetation dynamics and stochasticity, as well as disturbance and migration. New mechanisms for simulating forest mortality must consider hydraulic limitations and carbon reserves and their interactions in controlling source-sink dynamics and water potentials. Improving DGVM approaches by addressing spatial-scale challenges and integrating new approaches for estimating forest mortality will provide new insights more relevant for land management and possibly reduce uncertainty by representing physical processes in a way more directly comparable to experimental and observational evidence.

  6. Encapsulation materials research

    NASA Technical Reports Server (NTRS)

    Willis, P. B.

    1984-01-01

    Encapsulation materials for solar cells were investigated. The different phases consisted of: (1) identification and development of low-cost module encapsulation materials; (2) examination of materials reliability; and (3) process sensitivity and process development. It is found that outdoor photothermal aging devices (OPT) are the best accelerated aging method: they simulate worst-case field conditions, evaluate formulation and module performance, and offer a possibility for life assessment. Outdoor exposure of metallic copper should be avoided, self-priming formulations have good storage stability, stabilizers enhance performance, and soil-resistance treatment is still effective.

  7. Meteorite-asteroid spectral comparison - The effects of comminution, melting, and recrystallization

    NASA Technical Reports Server (NTRS)

    Clark, Beth E.; Fanale, Fraser P.; Salisbury, John W.

    1992-01-01

    The present laboratory simulation of possible spectral-alteration effects on the optical surface of ordinary chondrite parent bodies duplicated regolith processes through comminution of the samples to finer grain sizes. After reflectance spectra characterization, the comminuted samples were melted, crystallized, recomminuted, and again characterized. While individual spectral characteristics could be significantly changed by these processes, no combination of the alteration procedures appeared capable of affecting all relevant parameters in a way that improved the match between chondritic meteorites and S-class asteroids.

  8. Synthesis of Formamide and Related Organic Species in the Interstellar Medium via Chemical Dynamics Simulations

    NASA Astrophysics Data System (ADS)

    Spezia, Riccardo; Jeanvoine, Yannick; Hase, William L.; Song, Kihyung; Largo, Antonio

    2016-08-01

    We show, by means of direct dynamics simulations, how it is possible to identify possible reactants and mechanisms leading to the formation of formamide in the interstellar medium. In particular, different ion-molecule reactions in the gas phase were considered: NH3OH+, NH2OH2+, H2COH+, and NH4+ for the ions, and NH2OH, H2CO, and NH3 for the partner neutrals. These calculations were combined with high-level ab initio calculations to investigate the possible further evolution of the products observed. In particular, for formamide, we propose that the NH2OH2+ + H2CO reaction can produce an isomer, NH2OCH2+, that, after dissociative recombination, can produce neutral formamide, which has been observed in space. The direct dynamics do not pre-impose any reaction pathways; in the other reactions, we did not observe the formation of formamide or any possible precursor. On the other hand, we obtained other interesting reactions, such as the formation of NH2CH2+. Finally, some radiative association processes are proposed. All of the results obtained are discussed in light of the species observed in radioastronomy.

  9. Molecular dynamic approach to the study of the intense heat and mass transfer processes on the vapor-liquid interface

    NASA Astrophysics Data System (ADS)

    Levashov, V. Yu; Kamenov, P. K.

    2017-10-01

    The paper is devoted to the study of heat and mass transfer processes at the vapor-liquid interface. Such processes arise, for example, in metal tempering, in accidents at nuclear power stations in which corium is released into the coolant, and when hot magma enters water during volcanic eruptions. In all these examples a vapor film can form on the heated body surface. Here, the vapor film formation process is considered with the help of molecular dynamics simulation methods, with the main attention focused on the interactions of the fluid and vapor with the heater surface. Another direction of this work is the study of processes inside a droplet subjected to high-power laser radiation. Such an impact can lead to intensive evaporation and explosive destruction of the droplet, with the heat and mass transfer processes in the droplet substance lasting tens of femtoseconds. Thus, molecular dynamics simulation makes it possible to describe the heat and mass transfer processes in the droplet and the formation of the vapor phase.

  10. The ν process in the innermost supernova ejecta

    NASA Astrophysics Data System (ADS)

    Sieverding, Andre; Martínez Pinedo, Gabriel; Langanke, Karlheinz; Harris, J. Austin; Hix, W. Raphael

    2018-01-01

    The neutrino-induced nucleosynthesis (ν process) in supernova explosions of massive stars of solar metallicity with initial main sequence masses between 13 and 30 M⊙ has been studied with an analytic explosion model, using a new extensive set of neutrino-nucleus cross-sections and spectral properties that agree with modern supernova simulations. The production factors for the nuclei 7Li, 11B, 19F, 138La and 180Ta are still significantly enhanced but do not reproduce the full solar abundances. We study the possible contribution of the innermost supernova ejecta to the production of the light elements 7Li and 11B with tracer particles based on a 2D supernova simulation of a 12 M⊙ progenitor and conclude that a contribution exists but is negligible relative to the total yield for this explosion model.

  11. Research on the laser angle deception jamming technology of laser countermeasure

    NASA Astrophysics Data System (ADS)

    Ma, Shi-wei; Chen, Wen-jian; Gao, Wei; Duan, Yuan-yuan

    2015-10-01

    In recent years, laser-guided weapons have performed very well at destroying military targets in local wars, with the single-shot probability, effective range, and hit precision all improving; semi-active laser-guided weapons are the most widely used type. To improve the survivability of important military targets, it is necessary to study technologies for countering semi-active guided weapons. This paper first studies the working principle and the advantages and disadvantages of semi-active guided weapons, and analyzes the conditions under which a laser angle deception jamming system can operate. It then analyzes the working principle and process of laser angle deception jamming technology. Finally, it presents the design of a semi-physical simulation system for laser angle deception jamming, consisting of a semi-active laser-guided weapon simulation subsystem and a laser angle deception jamming subsystem; this simulation system demonstrates the working process of laser angle deception jamming. The paper provides a foundation for research on countermeasures against semi-active laser-guided weapons.

  12. Multifractal vector fields and stochastic Clifford algebra.

    PubMed

    Schertzer, Daniel; Tchiguirinskaia, Ioulia

    2015-12-01

    In the mid 1980s, the development of multifractal concepts and techniques was an important breakthrough for complex system analysis and simulation, in particular in turbulence and hydrology. Multifractals aimed to track and simulate the scaling singularities of the underlying equations instead of relying on numerical, scale-truncated simulations or on simplified conceptual models. However, this development has been largely limited to scalar fields, whereas most fields of interest are vector-valued or even manifold-valued. We show in this paper that the combination of stable Lévy processes with Clifford algebra is a good candidate to bridge the present gap between theory and applications. We show that it indeed defines a convenient framework to generate multifractal vector fields, and possibly multifractal manifold-valued fields, based on a few fundamental and complementary properties of Lévy processes and Clifford algebra. In particular, the vector structure of these algebras is much more tractable than the manifold structure of symmetry groups, while the Lévy stability grants a given statistical universality.
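
    The Lévy-Clifford construction itself does not fit in a few lines, but the underlying idea of building a multifractal field scale by scale can be illustrated with a scalar multiplicative cascade, a standard toy model; the snippet below is such an analogue, not the authors' vector-valued framework.

      import numpy as np

      rng = np.random.default_rng(0)

      def lognormal_cascade(levels, sigma=0.4):
          # Discrete multiplicative cascade: refine the grid and multiply by
          # independent unit-mean lognormal weights at every scale.
          field = np.ones(1)
          for _ in range(levels):
              field = np.repeat(field, 2)
              field *= rng.lognormal(-sigma**2 / 2, sigma, size=field.size)
          return field

      field = lognormal_cascade(levels=14)
      coarse = field.reshape(-1, 64).mean(axis=1)   # aggregate 64 cells per block
      for q in (1.0, 2.0, 3.0):
          # Higher-order moments grow with resolution: the multifractal signature.
          print(f"q={q}: fine-scale moment {np.mean(field**q):8.3f}, "
                f"coarse-scale moment {np.mean(coarse**q):8.3f}")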

  14. Color visual simulation applications at the Defense Mapping Agency

    NASA Astrophysics Data System (ADS)

    Simley, J. D.

    1984-09-01

    The Defense Mapping Agency (DMA) produces the Digital Landmass System data base to provide culture and terrain data in support of numerous aircraft simulators. In order to conduct data base and simulation quality control and requirements analysis, DMA has developed the Sensor Image Simulator, which can rapidly generate visual and radar static-scene digital simulations. The use of color in visual simulation allows the clear portrayal of both land cover and terrain data, whereas the initial black-and-white capabilities were restricted in this role and thus found limited use. Color visual simulation has many uses in analysis to help determine the applicability of current and prototype data structures to better meet user requirements. Color visual simulation is also significant in quality control, since anomalies can be more easily detected in natural-appearing forms of the data. The realism and efficiency possible with advanced processing and display technology, along with accurate data, make color visual simulation a highly effective medium for the presentation of geographic information. As a result, digital visual simulation is showing increasing potential as a special-purpose cartographic product. These applications are discussed and related simulation examples are presented.

  15. Choosing the best partition of the output from a large-scale simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Challacombe, Chelsea Jordan; Casleton, Emily Michele

    Data partitioning becomes necessary when a large-scale simulation produces more data than can be feasibly stored. The goal is to partition the data, typically so that every element belongs to one and only one partition, and to store summary information about each partition: either a representative value plus an estimate of the error, or a distribution. Once the partitions are determined and the summary information stored, the raw data are discarded. This process can be performed in situ, meaning while the simulation is running. When creating the partitions there are many decisions that researchers must make, for instance: how to determine when an adequate number of partitions has been created, how the partitions divide the data, and how many variables should be considered simultaneously. In addition, decisions must be made about how to summarize the information within each partition. Because of the combinatorial number of possible ways to partition and summarize the data, a method of comparing the different possibilities will help guide researchers toward a good partitioning and summarization scheme for their application.
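
    A minimal sketch of the partition-and-summarize idea, with root-mean-square reconstruction error as one possible comparison criterion (both the synthetic data and the criterion are illustrative assumptions, not the authors' method):

      import numpy as np

      rng = np.random.default_rng(1)
      # Stand-in for one output variable of a large simulation: smooth signal + noise.
      n = 1_000_000
      x = np.sin(np.linspace(0.0, 20.0 * np.pi, n)) + 0.1 * rng.normal(size=n)

      def summarize(x, n_parts):
          # Keep only a representative value and an error estimate per partition.
          return [(p.mean(), p.std(ddof=1) / np.sqrt(p.size))
                  for p in np.array_split(x, n_parts)]

      for n_parts in (10, 100, 1000):
          means = [m for m, _ in summarize(x, n_parts)]
          recon = np.concatenate([np.full(p.size, m) for p, m
                                  in zip(np.array_split(x, n_parts), means)])
          rms = np.sqrt(np.mean((x - recon) ** 2))
          print(f"{n_parts:5d} partitions -> RMS reconstruction error {rms:.4f}")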

  16. LPWA using supersonic gas jet with tailored density profile

    NASA Astrophysics Data System (ADS)

    Kononenko, O.; Bohlen, S.; Dale, J.; D'Arcy, R.; Dinter, M.; Erbe, J. H.; Indorf, G.; di Lucchio, L.; Goldberg, L.; Gruse, J. N.; Karstensen, S.; Libov, V.; Ludwig, K.; Martinez de La Ossa, A.; Marutzky, F.; Niroula, A.; Osterhoff, J.; Quast, M.; Schaper, L.; Schwinkendorf, J.-P.; Streeter, M.; Tauscher, G.; Weichert, S.; Palmer, C.; Horbatiuk, Taras

    2016-10-01

    Laser driven plasma wakefield accelerators have been explored as a potential compact, reproducible source of relativistic electron bunches, utilising an electric field of many GV/m. Control over injection of electrons into the wakefield is of crucial importance in producing stable, mono-energetic electron bunches. Density tailoring of the target, to control the acceleration process, can also be used to improve the quality of the bunch. By using gas jets to provide tailored targets it is possible to provide good access for plasma diagnostics while also producing sharp density gradients for density down-ramp injection. OpenFOAM hydrodynamic simulations were used to investigate the possibility of producing tailored density targets in a supersonic gas jet. Particle-in-cell simulations of the resulting density profiles modelled the effect of the tailored density on the properties of the accelerated electron bunch. Here, we present the simulation results together with preliminary experimental measurements of electron and x-ray properties from LPWA experiments using gas jet targets and a 25 TW, 25 fs Ti:Sa laser system at DESY.

  17. When Everybody Anticipates in a Different Way …

    NASA Astrophysics Data System (ADS)

    Kindler, Eugene

    2002-09-01

    The paper addresses the computer modeling of anticipatory systems that contain more than one anticipating individual, whose anticipations can mutually differ. In such a case we can meet four main cases: (1) the anticipating persons hold a dialogue to reach some agreement and can thereby optimize the anticipation; (2) one of the anticipating persons is a teacher of the others and can show them where their anticipation should have been better; (3) the anticipating persons compete, each expecting to make the best anticipation and wishing to apply it in order to weaken the others; (4) the anticipating persons do not communicate with one another. A human often anticipates by imagining possible future processes, performing a certain "mental simulation"; nowadays, a human can use computer simulation to replace that (insufficient) mental simulation. All the variants were simulated by transferring the human imagining to computer simulation, so that systems containing several simulating elements were themselves simulated. Experience with this "nested" simulation and its applications is described.

  18. Fully kinetic simulations of dense plasma focus Z-pinch devices.

    PubMed

    Schmidt, A; Tang, V; Welch, D

    2012-11-16

    Dense plasma focus Z-pinch devices are sources of copious high energy electrons and ions, x rays, and neutrons. The mechanisms through which these physically simple devices generate such high-energy beams in a relatively short distance are not fully understood. We now have, for the first time, demonstrated a capability to model these plasmas fully kinetically, allowing us to simulate the pinch process at the particle scale. We present here the results of the initial kinetic simulations, which reproduce experimental neutron yields (~10^7) and high-energy (MeV) beams for the first time. We compare our fluid, hybrid (kinetic ions and fluid electrons), and fully kinetic simulations. Fluid simulations predict no neutrons and do not allow for nonthermal ions, while hybrid simulations underpredict neutron yield by ~100x and exhibit an ion tail that does not exceed 200 keV. Only fully kinetic simulations predict MeV-energy ions and experimental neutron yields. A frequency analysis in a fully kinetic simulation shows plasma fluctuations near the lower hybrid frequency, possibly implicating lower hybrid drift instability as a contributor to anomalous resistivity in the plasma.

  19. The devil is in the details: Comparisons of episodic simulations of positive and negative future events.

    PubMed

    Puig, Vannia A; Szpunar, Karl K

    2017-08-01

    Over the past decade, psychologists have devoted considerable attention to episodic simulation, the ability to imagine specific hypothetical events. Perhaps one of the most consistent patterns of data to emerge from this literature is that positive simulations of the future are rated as more detailed than negative simulations of the future, a pattern of results that is commonly interpreted as evidence for a positivity bias in future thinking. In the present article, we demonstrate across two experiments that negative future events are consistently simulated in more detail than positive future events when the frequency of prior thinking is taken into account as a possible confounding variable and when the level of detail associated with simulated events is assessed using an objective scoring criterion. Our findings are interpreted in the context of the mobilization-minimization hypothesis of event cognition, which suggests that people are especially likely to devote cognitive resources to processing negative scenarios.

  20. Numerical simulations in the development of propellant management devices

    NASA Astrophysics Data System (ADS)

    Gaulke, Diana; Winkelmann, Yvonne; Dreyer, Michael

    Propellant management devices (PMDs) are used for positioning the propellant at the propellant port. It is important to provide propellant without gas bubbles, since gas bubbles can induce cavitation and, in the worst case, lead to system failures. Therefore, the reliable operation of such devices must be guaranteed. Testing these complex systems is a very intricate process, and in most cases only tests with downscaled geometries are possible. Numerical simulations are used here as an aid to optimize the tests and to predict certain results. Based on these simulations, parameters can be determined in advance and parts of the equipment can be adjusted in order to minimize the number of experiments. In turn, the simulations are validated against the test results. Furthermore, if the accuracy of the numerical prediction is verified, then numerical simulations can be used for validating the scaling of the experiments. This presentation demonstrates selected numerical simulations for the development of PMDs at ZARM.

  1. Conceptual Design of Simulation Models in an Early Development Phase of Lunar Spacecraft Simulator Using SMP2 Standard

    NASA Astrophysics Data System (ADS)

    Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    A conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programmes in European countries, a variety of simulation applications have been developed using the SMP2 (Simulation Model Portability 2) standard, which addresses the portability and reuse of simulation models by various model users. KARI not only has first-hand experience of developing an SMP-compatible simulation environment but also has an ongoing study on applying the SMP2 simulation model development process to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life-cycle, from software design to validation, through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion; in reality, the demonstrator prototype on the right-hand side of the image was built and tested in 2012. In an early phase of simulator development, prior to a kick-off in the near future, the target hardware to be modelled was investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at system level was performed, and an architecture with a hierarchical tree of models from the system down to parts at lower levels has been established. Finally, SMP documents such as the Catalogue, Assembly, Schedule and so on were converted using an XML (eXtensible Markup Language) converter. To benefit as far as possible from the approaches and design mechanisms suggested in the SMP2 standard, object-oriented and component-based design concepts were strictly followed throughout the whole model development process.

  2. Manipulating acoustic wave reflection by a nonlinear elastic metasurface

    NASA Astrophysics Data System (ADS)

    Guo, Xinxin; Gusev, Vitalyi E.; Bertoldi, Katia; Tournat, Vincent

    2018-03-01

    The acoustic wave reflection properties of a nonlinear elastic metasurface, derived from resonant nonlinear elastic elements, are theoretically and numerically studied. The metasurface is composed of a two degree-of-freedom mass-spring system with quadratic elastic nonlinearity. The possibility of converting, during the reflection process, most of the fundamental incoming wave energy into the second harmonic wave is shown, both theoretically and numerically, by means of a proper design of the nonlinear metasurface. The theoretical results from the harmonic balance method for a monochromatic source are compared with time domain simulations for a wave packet source. This protocol allows analyzing the dynamics of the nonlinear reflection process in the metasurface as well as exploring the limits of the operating frequency bandwidth. The reported methodology can be applied to a wide variety of nonlinear metasurfaces, thus possibly extending the family of exotic nonlinear reflection processes.
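
    A single-degree-of-freedom reduction of this mechanism can be simulated in a few lines: a mass-spring oscillator with quadratic stiffness, driven at w0, develops a spectral line at 2*w0. The sketch below uses hypothetical parameters and one mass instead of the paper's two-degree-of-freedom element.

      import numpy as np
      from scipy.integrate import solve_ivp

      m, c, k, beta = 1.0, 0.1, 1.0, 0.3   # mass, damping, linear & quadratic stiffness
      w0, F0 = 1.0, 0.05                   # drive frequency and amplitude

      def rhs(t, y):
          x, v = y
          return [v, (F0 * np.cos(w0 * t) - c * v - k * x - beta * x**2) / m]

      sol = solve_ivp(rhs, (0.0, 400.0), [0.0, 0.0], max_step=0.05, dense_output=True)

      t = np.linspace(200.0, 400.0, 4096)  # keep the steady state only
      x = sol.sol(t)[0]
      w = 2.0 * np.pi * np.fft.rfftfreq(t.size, d=t[1] - t[0])   # rad/s axis
      spec = np.abs(np.fft.rfft(x - x.mean()))
      for target in (w0, 2.0 * w0):        # fundamental and second harmonic
          print(f"spectral amplitude near {target:.0f} rad/s: "
                f"{spec[np.argmin(np.abs(w - target))]:.2f}")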

  3. Photo-induced free radicals on a simulated Martian surface

    NASA Technical Reports Server (NTRS)

    Tseng, S.-S.; Chang, S.

    1974-01-01

    Results of an electron spin resonance study of free radicals produced by ultraviolet irradiation of a simulated Martian surface suggest that the ultraviolet photolysis of CO or CO2, or a mixture of both, adsorbed on silica gel at -170 °C involves the formation of OH radicals and possibly of H atoms as the primary process, followed by the formation of CO2H radicals. It is concluded that the photochemical synthesis of organic compounds could occur on Mars if the siliceous surface dust contains enough silanol groups and/or adsorbed H2O in the form of bound water.

  4. Mathematical simulation of the process of condensing natural gas

    NASA Astrophysics Data System (ADS)

    Tastandieva, G. M.

    2015-01-01

    A two-dimensional unsteady model of heat transfer during the condensation of natural gas at low temperatures is presented. Calculations are performed of the heat and mass transfer of liquefied natural gas (LNG) in storage tanks of cylindrical shape, and the influence of the model parameters on the nature of the heat transfer is examined. Temperature regimes that eliminate evaporation by cooling the liquefied natural gas are defined, and the dependence of the mass rate of vapor condensation on the gas temperature is obtained. The possibility is identified of regulating the "cooling down" of liquefied natural gas under partial evaporation at low energy cost.

  5. Ab initio Simulation of Helium-Ion Microscopy Images: The Case of Suspended Graphene

    NASA Astrophysics Data System (ADS)

    Zhang, Hong; Miyamoto, Yoshiyuki; Rubio, Angel

    2012-12-01

    Helium ion microscopy (HIM), introduced in 2006 by Ward et al., provides nondestructive imaging of nanoscale objects with higher contrast than scanning electron microscopy. HIM measurement of suspended graphene under typical conditions is simulated by first-principles time-dependent density functional theory, and the 30 keV He+ collision is found to induce the emission of electrons in a manner dependent on the impact point. This finding suggests the possibility of obtaining a highly accurate image of the honeycomb pattern of suspended graphene by HIM. Comparison with a simulation of He0 at the same kinetic energy shows that the electron emission is governed by impact ionization rather than by the Auger process initiated by neutralization of He+.

  6. Convection links biomass burning to increased tropical ozone - However, models will tend to overpredict O3

    NASA Technical Reports Server (NTRS)

    Chatfield, Robert B.; Delany, Anthony C.

    1990-01-01

    Biomass burning throughout the inhabited portions of the tropics generates precursors which lead to significant local atmospheric ozone pollution. Several simulations show how this smog could be only an easily observed, local manifestation of a much broader increase in tropospheric ozone. The basic processes are illustrated with a one-dimensional time-dependent model that is closer to true meteorological motions than commonly used eddy diffusion models. Its application to a representative region of South America gives reasonable simulations of the local pollutants measured there. Three illustrative simulations indicate the importance of dilution, principally due to vertical transport, in increasing the efficiency of ozone production, possibly enough for high ozone to be apparent on a very large, intercontinental scale.

  7. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

    On February 1st, 2010, U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes at the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: continue the development of this program-of-record simulation while at the same time remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make it flexible enough to support possible future vehicles while at the same time being specific enough to support the program of record. The challenge was compounded by the fact that the model was being developed with the traditional DES process orientation, which lacks the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure: identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. This work resulted in a model that not only was ready to be easily modified to support future rocket programs but also was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides to builders of traditional process-oriented discrete event simulations.
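
    The modular, process-oriented structure described above can be illustrated with a minimal discrete event core in which each flow is a generator parameterized by its step durations, so the same module can represent a program-of-record vehicle or a future one. This is a hypothetical sketch, not the PAI/SAIC Constellation model.

      import heapq
      import itertools

      class Simulator:
          # Minimal process-oriented DES core: processes are generators that
          # yield the delay until their next event.
          def __init__(self):
              self.now, self._queue, self._ids = 0.0, [], itertools.count()

          def start(self, process, delay=0.0):
              heapq.heappush(self._queue, (self.now + delay, next(self._ids), process))

          def run(self):
              while self._queue:
                  self.now, _, process = heapq.heappop(self._queue)
                  try:
                      self.start(process, delay=next(process))  # resume the process
                  except StopIteration:
                      pass

      def vehicle_flow(sim, name, stack_time, pad_time):
          # Generic flow; step durations are parameters, keeping the module reusable.
          yield stack_time
          print(f"t={sim.now:6.1f}  {name}: stacking complete")
          yield pad_time
          print(f"t={sim.now:6.1f}  {name}: launch")

      sim = Simulator()
      sim.start(vehicle_flow(sim, "vehicle-A", stack_time=30.0, pad_time=12.0))
      sim.start(vehicle_flow(sim, "vehicle-B", stack_time=45.0, pad_time=10.0))
      sim.run()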

  8. Engineering workstation: Sensor modeling

    NASA Technical Reports Server (NTRS)

    Pavel, M.; Sweet, B.

    1993-01-01

    The purpose of the engineering workstation is to provide an environment for rapid prototyping and evaluation of fusion and image processing algorithms. Ideally, the algorithms are designed to optimize the extraction of information that is useful to a pilot for all phases of flight operations. Successful design of effective fusion algorithms depends on the ability to characterize both the information available from the sensors and the information useful to a pilot. The workstation is comprised of subsystems for simulation of sensor-generated images, image processing, image enhancement, and fusion algorithms. As such, the workstation can be used to implement and evaluate both short-term solutions and long-term solutions. The short-term solutions are being developed to enhance a pilot's situational awareness by providing information in addition to his direct vision. The long term solutions are aimed at the development of complete synthetic vision systems. One of the important functions of the engineering workstation is to simulate the images that would be generated by the sensors. The simulation system is designed to use the graphics modeling and rendering capabilities of various workstations manufactured by Silicon Graphics Inc. The workstation simulates various aspects of the sensor-generated images arising from phenomenology of the sensors. In addition, the workstation can be used to simulate a variety of impairments due to mechanical limitations of the sensor placement and due to the motion of the airplane. Although the simulation is currently not performed in real-time, sequences of individual frames can be processed, stored, and recorded in a video format. In that way, it is possible to examine the appearance of different dynamic sensor-generated and fused images.

  9. Parabolic flights as Earth analogue for surface processes on Mars

    NASA Astrophysics Data System (ADS)

    Kuhn, Nikolaus J.

    2017-04-01

    The interpretation of landforms and environmental archives on Mars with regard to habitability and the preservation of traces of life requires a quantitative understanding of the processes that shaped them. Commonly, qualitative similarities in sedimentary rocks between Earth and Mars are used as an analogue to reconstruct the environments in which they formed on Mars. However, flow hydraulics and sedimentation differ between Earth and Mars, requiring a recalibration of models describing runoff, erosion, transport and deposition. Simulation of these processes on Earth is limited because gravity cannot be changed, and the trade-off in adjusting e.g. fluid or particle density generates other mismatches, such as in fluid viscosity. Computational fluid dynamics offers an alternative, but would also require a certain degree of calibration or testing. Parabolic flights offer a possibility to amend the shortcomings of these approaches: parabolas with reduced gravity last up to 30 seconds, which allows the simulation of sedimentation processes and the measurement of flow hydraulics. This study summarizes the experience gathered during four campaigns of parabolic flights, aimed at identifying the potential and limitations of their use as an Earth analogue for surface processes on Mars.

  10. Analysis of roll-stamped light guide plate fabricated with laser-ablated stamper

    NASA Astrophysics Data System (ADS)

    Na, Hyunjun; Hong, Seokkwan; Kim, Jongsun; Hwang, Jeongho; Joo, Byungyun; Yoon, Kyunghwan; Kang, Jeongjin

    2017-12-01

    The LGP (light guide plate) is one of the major components of an LCD (liquid crystal display): it provides the surface illumination for the LCD backlight. The LGP is a transparent plastic plate usually produced by an injection molding process, and on its back there are micron-sized patterns for the extraction of light. Recently a roll-stamping process has achieved high mass productivity for thinner LGPs. To fabricate the optical patterns on LGPs, a fabricating tool called a stamper is used; micro patterns on metallic stampers are made by several micro-machining processes such as chemical etching, LIGA-reflow, and laser ablation. In this study, a roll-stamping process using a laser-ablated metallic stamper was investigated, with attention to the stamper's compatibility with the roll-stamping process. LGP fabrication tests were performed using the roll-stamping process at four different roll pressures. The pattern shapes on the laser-ablated stamper and the transcription ratios of the roll-stamping process were analyzed, and LGP luminance was evaluated. Based on this evaluation, an optical simulation model of the LGP was built and its accuracy evaluated; simulation results showed good agreement with the optical performance of the LGPs in brightness and uniformity. It was also shown that the roll-stamped LGP produced with the laser-ablated stamper has the potential for better optical performance than the conventional injection-molded LGP.

  11. Multiscale simulations of patchy particle systems combining Molecular Dynamics, Path Sampling and Green's Function Reaction Dynamics

    NASA Astrophysics Data System (ADS)

    Bolhuis, Peter

    Important reaction-diffusion processes, such as biochemical networks in living cells, or self-assembling soft matter, span many orders in length and time scales. In these systems, the reactants' spatial dynamics at mesoscopic length and time scales of microns and seconds is coupled to the reactions between the molecules at microscopic length and time scales of nanometers and milliseconds. This wide range of length and time scales makes these systems notoriously difficult to simulate. While mean-field rate equations cannot describe such processes, the mesoscopic Green's Function Reaction Dynamics (GFRD) method enables efficient simulation at the particle level provided the microscopic dynamics can be integrated out. Yet, many processes exhibit non-trivial microscopic dynamics that can qualitatively change the macroscopic behavior, calling for an atomistic, microscopic description. The recently developed multiscale Molecular Dynamics Green's Function Reaction Dynamics (MD-GFRD) approach combines GFRD for simulating the system at the mesoscopic scale where particles are far apart, with microscopic Molecular (or Brownian) Dynamics, for simulating the system at the microscopic scale where reactants are in close proximity. The association and dissociation of particles are treated with rare event path sampling techniques. I will illustrate the efficiency of this method for patchy particle systems. Replacing the microscopic regime with a Markov State Model avoids the microscopic regime completely. The MSM is then pre-computed using advanced path-sampling techniques such as multistate transition interface sampling. I illustrate this approach on patchy particle systems that show multiple modes of binding. MD-GFRD is generic, and can be used to efficiently simulate reaction-diffusion systems at the particle level, including the orientational dynamics, opening up the possibility for large-scale simulations of e.g. protein signaling networks.

  12. The calculation of the phase equilibrium of the multicomponent hydrocarbon systems

    NASA Astrophysics Data System (ADS)

    Molchanov, D. A.

    2018-01-01

    The development of simulations of filtration processes in hydrocarbon mixtures has led to the use of cubic equations of state of the van der Waals type to describe the thermodynamic properties of natural fluids under real thermobaric conditions. Binary hydrocarbon systems make it possible to simulate the fluids of different types of reservoirs qualitatively, which allows an experimental study of their filtration features. The exploitation of gas-condensate reservoirs shows that various two-phase filtration regimes can exist, including a self-oscillatory one, which occurs at certain values of the mixture composition, temperature and pressure drop. Plotting the phase diagram of the model mixture is required to determine these values. A software package has been created to calculate the vapor-liquid equilibrium of binary systems using a cubic equation of state of the van der Waals type. Phase diagrams of gas-condensate model mixtures have been calculated.
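
    As a small worked example of the kind of calculation such a package performs, the snippet below finds the liquid-like and vapor-like molar volumes of methane from the van der Waals equation written as a cubic in the compressibility factor Z. The critical constants are textbook values; the paper's actual equation of state and mixtures may differ.

      import numpy as np

      R = 8.314  # J/(mol K)

      def vdw_molar_volumes(T, P, Tc, Pc):
          # Van der Waals cubic in Z: Z^3 - (1+B) Z^2 + A Z - A B = 0,
          # with A = a P / (R T)^2 and B = b P / (R T).
          a = 27.0 * R**2 * Tc**2 / (64.0 * Pc)
          b = R * Tc / (8.0 * Pc)
          A = a * P / (R * T) ** 2
          B = b * P / (R * T)
          Z = np.roots([1.0, -(1.0 + B), A, -A * B])
          Z = np.sort(Z[np.abs(Z.imag) < 1e-10].real)
          return Z * R * T / P   # molar volumes [m^3/mol]

      # Methane near its saturation line: three real roots are expected.
      v = vdw_molar_volumes(T=150.0, P=1.0e6, Tc=190.6, Pc=4.599e6)
      if v.size == 3:
          print(f"liquid-like v = {v[0]:.3e}, vapor-like v = {v[2]:.3e} m^3/mol")
      else:
          print(f"single-phase v = {v[-1]:.3e} m^3/mol")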

  13. Lunar dust simulant charging and transport under UV irradiation in vacuum: Experiments and numerical modeling

    NASA Astrophysics Data System (ADS)

    Champlain, A.; Matéo-Vélez, J.-C.; Roussel, J.-F.; Hess, S.; Sarrailh, P.; Murat, G.; Chardon, J.-P.; Gajan, A.

    2016-01-01

    Recent high-altitude observations made by the Lunar Dust Experiment (LDEX) on board LADEE, orbiting the Moon, indicate that high-altitude (>10 km) dust particle densities are well correlated with interplanetary dust impacts. They show no evidence of the high dust densities suggested by Apollo 15 and 17 observations and possibly explained by electrostatic forces imposed by the plasma environment and photon irradiation. This paper deals with near-surface conditions, below the domain of observation of LDEX, where electrostatic forces could clearly be at play. The upper and lower limits of the cohesive force between dust grains are obtained by comparing experiments and numerical simulations of dust charging under ultraviolet irradiation in the presence of an electric field and mechanical vibrations. It is suggested that dust ejection by electrostatic forces is made possible by microscopic-scale amplifications due to soil irregularities. At low altitude, this process may be complementary to interplanetary dust impacts.

  14. Solar radio emissions: 2D full PIC simulations

    NASA Astrophysics Data System (ADS)

    Pierre, H.; Sgattoni, A.; Briand, C.; Amiranoff, F.; Riconda, C.

    2016-12-01

    Solar radio emissions are electromagnetic waves observed at the local plasma frequency and/or at twice the plasma frequency. To describe their origin, a multi-stage model was proposed by Ginzburg & Zhelezniakov (1958) and further developed by several authors, which considers a succession of non-linear three-wave interaction processes. Electron beams accelerated by solar flares travel in the interplanetary plasma and provide the free energy for the development of plasma instabilities; the model describes how part of the free energy of these beams can be transformed into a succession of plasma waves and eventually into electromagnetic waves. Following the work of Thurgood & Tsiklauri (2015), we performed several 2D particle-in-cell simulations. The simulations follow the entire set of processes, from the electron beam propagation in the background plasma to the generation of the electromagnetic waves, in particular the 2ωp emission, including the excitation of the low-frequency waves. As suggested by Thurgood & Tsiklauri (2015), it is possible to identify regimes where the radiation emission can be directly linked to the electron beams. Our attention was devoted to estimating the conversion efficiency from electron kinetic energy to electromagnetic energy, and the growth rates of the several processes that can be identified. We studied the emission angles of the 2ωp radiation and compared them with the theoretical predictions of Willes et al. (1995). We also show the role played by some numerical parameters, i.e. the size and shape of the simulation box. This work is a first step in preparing laser-plasma experiments. References: Ginzburg, V. L., & Zhelezniakov, V. V., On the Possible Mechanisms of Sporadic Solar Radio Emission (Radiation in an Isotropic Plasma), Soviet Astronomy, 2, 653 (1958). Thurgood, J. O., & Tsiklauri, D., Self-consistent particle-in-cell simulations of fundamental and harmonic plasma radio emission mechanisms, Astronomy & Astrophysics, 584, A83 (2015). Willes, A., Robinson, P., & Melrose, D., Second harmonic electromagnetic emission via Langmuir wave coalescence, Physics of Plasmas, 3(1), 149-159 (1995).

  15. Transferability of optimally-selected climate models in the quantification of climate change impacts on hydrology

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe

    2016-11-01

    Given the ever increasing number of climate change simulations being carried out, it has become impractical to use all of them to cover the uncertainty of climate change impacts. Various methods have been proposed to optimally select subsets of a large ensemble of climate simulations for impact studies. However, the behaviour of optimally-selected subsets of climate simulations for climate change impacts is unknown, since the transfer process from climate projections to the impact study world is usually highly non-linear. Consequently, this study investigates the transferability of optimally-selected subsets of climate simulations in the case of hydrological impacts. Two different methods were used for the optimal selection of subsets of climate scenarios, and both were found to be capable of adequately representing the spread of selected climate model variables contained in the original large ensemble. However, in both cases, the optimal subsets had limited transferability to hydrological impacts. To capture a similar variability in the impact model world, many more simulations have to be used than those that are needed to simply cover variability from the climate model variables' perspective. Overall, both optimal subset selection methods were better than random selection when small subsets were selected from a large ensemble for impact studies. However, as the number of selected simulations increased, random selection often performed better than the two optimal methods. To ensure adequate uncertainty coverage, the results of this study imply that selecting as many climate change simulations as possible is the best avenue. Where this was not possible, the two optimal methods were found to perform adequately.

  16. Numerical and analytical simulation of the production process of ZrO2 hollow particles

    NASA Astrophysics Data System (ADS)

    Safaei, Hadi; Emami, Mohsen Davazdah

    2017-12-01

    In this paper, the production process of hollow particles from agglomerated particles is addressed analytically and numerically. The important parameters affecting this process, in particular the initial porosity level of the particles and the plasma gun type, are investigated. The analytical model adopts a combination of quasi-steady thermal equilibrium and mechanical balance, and examines the possibility of a solid core existing in agglomerated particles; it considers a range of particle diameters (50 μm ≤ Dp0 ≤ 160 μm) and various initial porosities (0.2 ≤ p ≤ 0.7). The numerical model employs the VOF technique for two-phase compressible flows, simulating the production of hollow particles from agglomerated particles with an initial diameter of Dp0 = 60 μm and initial porosities of p = 0.3, p = 0.5, and p = 0.7. Results of the analytical model indicate that the solid core diameter is independent of the initial porosity, whereas the thickness of the particle shell depends strongly on the initial porosity. In both models, a hollow particle hardly develops at small initial porosity values (p < 0.3), while the particle disintegrates at high initial porosity values (p > 0.6).

  17. Integration agent-based models and GIS as a virtual urban dynamic laboratory

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Liu, Miaolong

    2007-06-01

    Based on agent-based models and a spatial data model, a tight-coupling method for integrating GIS and agent-based models (ABM) is discussed in this paper. The use of object orientation for both spatial data and spatial process models facilitates their integration, which can allow the exploration and explanation of spatio-temporal phenomena such as urban dynamics. In order to better understand how tight coupling might proceed, and to evaluate the possible functional and efficiency gains from such coupling, the agent-based model and the spatial data model are discussed first, followed by the relationships governing the interaction between spatial data models and agent-based process models. After that, a realistic crowd flow simulation experiment is presented. Using tools provided by general GIS systems and a few specific programming languages, a new software system integrating GIS and a multi-agent system (MAS) has been developed successfully as a virtual laboratory for simulating pedestrian flows in a crowd activity centre. In the environment supported by this software system, the dynamic evolution of pedestrian flows (the dispersal of spectators) in a crowd activity centre, the Shanghai Stadium, has been simulated successfully as an application case. At the end of the paper, some new research problems are pointed out for future work.

  18. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. Automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system-level failure possibility is acceptable. This optimisation process, referred to as possibility-based design optimisation, integrates the fuzzy finite element analysis applied for uncertainty treatment in crash simulations, and is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).
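
    The possibility-based treatment can be sketched with interval propagation over alpha-cuts of triangular fuzzy parameters (the vertex method, exact only for monotone responses). The crash model below is a hypothetical surrogate standing in for the fuzzy finite element analysis.

      import itertools

      def peak_force(thickness, yield_stress):
          # Hypothetical surrogate for the bumper crash response [kN].
          return 120.0 * thickness**1.5 * yield_stress / 400.0

      def alpha_cut(tri, alpha):
          # Interval of a triangular fuzzy number (low, mode, high) at level alpha.
          lo, mode, hi = tri
          return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

      # Triangular fuzzy inputs (hypothetical): thickness [mm], yield stress [MPa].
      params = [(1.8, 2.0, 2.3), (350.0, 400.0, 430.0)]

      for alpha in (0.0, 0.5, 1.0):
          corners = itertools.product(*(alpha_cut(p, alpha) for p in params))
          forces = [peak_force(th, sy) for th, sy in corners]
          print(f"alpha={alpha:.1f}: peak force in [{min(forces):6.1f}, {max(forces):6.1f}] kN")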

  19. Dynamic Monte Carlo description of thermal desorption processes

    NASA Astrophysics Data System (ADS)

    Weinketz, Sieghard

    1994-07-01

    The applicability of the dynamic Monte Carlo method of Fichthorn and Weinberg, in which the time evolution of a system is described in terms of the absolute number of possible microscopic events and their associated transition rates, is discussed for the case of thermal desorption simulations. It is shown that the definition of the time increment at each successful event leads naturally to the macroscopic differential equation of desorption in the case of simple first- and second-order processes in which the only possible events are desorption and diffusion. This equivalence is numerically demonstrated for a second-order case. Subsequently, the equivalence of this method with the Monte Carlo method of Sales and Zgrablich for more complex desorption processes, allowing for lateral interactions between adsorbates, is shown. The dynamic Monte Carlo method, however, does not carry their limitation of a rapid surface diffusion condition, and is thus able to describe a more complex "kinetics" of surface reactive processes and therefore be applied to a wider class of phenomena, such as surface catalysis.
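
    For a pure first-order desorption process, the Fichthorn-Weinberg time increment can be checked against the macroscopic rate law in a few lines (a minimal sketch with arbitrary units):

      import bisect
      import math
      import random

      random.seed(42)
      k_d, N0 = 1.0, 20000   # desorption rate constant, initial adsorbate count

      # Dynamic Monte Carlo: after each successful event, advance the clock by
      # dt = -ln(u) / (total rate of all currently possible events).
      N, t, event_times = N0, 0.0, []
      while N > 0:
          t += -math.log(1.0 - random.random()) / (N * k_d)
          N -= 1                         # one desorption event occurs
          event_times.append(t)

      for target in (0.5, 1.0, 2.0):
          desorbed = bisect.bisect_right(event_times, target)
          print(f"t={target}: simulated coverage {(N0 - desorbed) / N0:.4f} "
                f"vs exp(-k*t) = {math.exp(-k_d * target):.4f}")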

  20. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  1. Modeling a Glacial Lake Outburst Flood Process Chain: The Case of Lake Palcacocha and Huaraz, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel; Somos-Valenzuela, Marcelo; Rivas Gomez, Denny; McKinney, Daene C.; Portocarrero Rodriguez, Cesar

    2016-04-01

    One of the consequences of recent glacier recession in the Cordillera Blanca, Peru, is the risk of Glacial Lake Outburst Floods (GLOFs) from lakes that have formed at the base of retreating glaciers. GLOFs are often triggered by avalanches falling into glacial lakes, initiating a chain of processes that may culminate in significant inundation and destruction downstream. This paper presents simulations of all of the processes involved in a potential GLOF originating from Lake Palcacocha, the source of a catastrophic GLOF on December 13, 1941, that killed about 1800 people in the city of Huaraz, Peru. The chain of processes simulated here includes: (1) avalanches above the lake; (2) lake dynamics resulting from the avalanche impact, including wave generation, propagation, and run-up across the lake; (3) terminal moraine overtopping and dynamic moraine erosion simulations to determine the possibility of breaching; (4) flood propagation along downstream valleys; and (5) inundation of populated areas. The results of each process feed into simulations of subsequent processes in the chain, finally resulting in estimates of inundation in the city of Huaraz. The results of the inundation simulations were converted into flood intensity and hazard maps (based on an intensity-likelihood matrix) that may be useful for city planning and regulation. Three avalanche events with volumes ranging from 0.5-3 × 10⁶ m³ were simulated, and two scenarios of 15 m and 30 m lake lowering were simulated to assess the potential for mitigating the hazard level in Huaraz. For all three avalanche events, three-dimensional hydrodynamic models show large waves generated in the lake from the impact, resulting in overtopping of the damming moraine. Despite very high discharge rates (up to 63.4 × 10³ m³/s), the erosion from the overtopping wave did not result in failure of the damming moraine when simulated with a hydro-morphodynamic model using excessively conservative soil characteristics that provide very little erosion resistance. With the current lake level, all three avalanche events result in inundation in Huaraz, and the resulting hazard map shows a total affected area of 2.01 km², most of which is in the high-hazard category. Lowering the lake has the potential to reduce the affected area by up to 35%, resulting in a smaller portion of the inundated area in the high-hazard category.

  2. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    PubMed

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. Knowing the shapes of the PDFs, and using experimental data, different optimization schemes can be applied in order to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding and often not feasible. We show that, in the case where the experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of the optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10⁴). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated on an exactly solvable case and on the evaluation of branching fractions in the controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process; constrained systems are quite common, which makes the method useful for various applications.
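
    The branching-fraction case can be made concrete with a minimal sketch: for two competing first-order channels, the stochastic simulation estimate converges to the analytic value k1/(k1+k2), which is the kind of quantity the paper proposes to compute analytically instead of by Monte Carlo. The rates below are arbitrary.

      import math
      import random

      random.seed(7)
      k1, k2 = 0.7, 0.3          # competing channels: A -> B (k1), A -> C (k2)
      n_samples = 100_000

      hits_B = 0
      for _ in range(n_samples):
          total = k1 + k2
          tau = -math.log(1.0 - random.random()) / total  # SSA waiting time (unused here)
          if random.random() < k1 / total:                # SSA channel choice
              hits_B += 1

      print(f"SSA branching fraction to B: {hits_B / n_samples:.4f}")
      print(f"analytic k1 / (k1 + k2)    : {k1 / (k1 + k2):.4f}")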

  3. Development and evaluation of a profile negotiation process for integrating aircraft and air traffic control automation

    NASA Technical Reports Server (NTRS)

    Green, Steven M.; Denbraven, Wim; Williams, David H.

    1993-01-01

    The development and evaluation of the profile negotiation process (PNP), an interactive process between an aircraft and air traffic control (ATC) that integrates airborne and ground-based automation capabilities to determine conflict-free trajectories that are as close to an aircraft's preference as possible, are described. The PNP was evaluated in a real-time simulation experiment conducted jointly by NASA's Ames and Langley Research Centers. The Ames Center/TRACON Automation System (CTAS) was used to support the ATC environment, and the Langley Transport Systems Research Vehicle (TSRV) piloted cab was used to simulate a 4D Flight Management System (FMS) capable aircraft. Both systems were connected in real time by way of voice and data lines; digital datalink communications capability was developed and evaluated as a means of supporting the air/ground exchange of trajectory data. The controllers were able to consistently and effectively negotiate nominally conflict-free vertical profiles with the 4D-equipped aircraft. The actual profiles flown were substantially closer to the aircraft's preference than would have been possible without the PNP. However, there was a strong consensus among the pilots and controllers that the level of automation of the PNP should be increased to make the process more transparent. The experiment demonstrated the importance of an aircraft's ability to accurately execute a negotiated profile as well as the need for digital datalink to support advanced air/ground data communications. The concept of trajectory space is proposed as a comprehensive approach for coupling the processes of trajectory planning and tracking to allow maximum pilot discretion in meeting ATC constraints.

  4. Just-in-Time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of Earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete, and typically the simulation results are not analyzed until the model runs have ended. During the course of a simulation, the output may be processed periodically to ensure that the model is performing as expected, but most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near-real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server, located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software promises to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.

  5. Possible effects of two-phase flow pattern on the mechanical behavior of mudstones

    NASA Astrophysics Data System (ADS)

    Goto, H.; Tokunaga, T.; Aichi, M.

    2016-12-01

    To investigate the influence of two-phase flow pattern on the mechanical behavior of mudstones, laboratory experiments were conducted. In the experiment, air was injected from the bottom of a water-saturated Quaternary Umegase mudstone sample under hydrostatic external stress. Both the axial and circumferential strains at half the height of the sample and the volumetric discharge of water at the outlet were monitored during the experiment. Numerical simulation of the experiment was attempted using a simulator that solves coupled two-phase flow and poroelastic deformation, assuming extended-Darcian flow with relative permeability and capillary pressure as functions of the wetting-phase fluid saturation. The numerical simulation reproduced the volumetric discharge of water well, but not the strains. Three dimensionless numbers that characterize the two-phase flow pattern (Lenormand et al., 1988; Ewing and Berkowitz, 1998), i.e., the viscosity ratio, the capillary number, and the Bond number, were calculated to be 2×10⁻², 2×10⁻¹¹, and 7×10⁻¹¹, respectively, in the experiment. Because the Bond number was quite small, it was possible to apply the diagram of Lenormand et al. (1988) to evaluate the flow regime, which was identified as capillary fingering. In the numerical simulation, by contrast, air moved uniformly upward at quite low non-wetting-phase saturation because the fluid flow obeyed the two-phase Darcy's law. These different displacement patterns, developed in the experiment and assumed in the numerical simulation, are considered to be the reason why the deformation behavior observed in the experiment could not be reproduced numerically, suggesting that the two-phase flow pattern can affect how internal fluid pressure evolves during displacement. For further studies, quantitative analysis of the experimental results using a numerical simulator that couples two-phase flow through preferential flow paths with deformation of the porous medium is needed. References: Ewing R. P., and B. Berkowitz (1998), Water Resour. Res., 34, 611-622. Lenormand, R., E. Touboul, and C. Zarcone (1988), J. Fluid Mech., 189, 165-187.
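
    For orientation, the three dimensionless groups can be written as follows (a common set of definitions in the drainage literature; the symbols v for the Darcy velocity of the invading air, σ for interfacial tension, and L for a characteristic pore length are chosen here for illustration, not quoted from the paper):

        M = \mu_{nw} / \mu_{w}, \qquad
        \mathrm{Ca} = \mu_{nw} \, v / \sigma, \qquad
        \mathrm{Bo} = \Delta\rho \, g \, L^{2} / \sigma

    With M ~ 10⁻² and Ca, Bo both ~ 10⁻¹¹, the Lenormand et al. diagram places the displacement deep in the capillary-fingering regime, consistent with the authors' assessment.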

  6. Exploring the Dynamics of Cell Processes through Simulations of Fluorescence Microscopy Experiments

    PubMed Central

    Angiolini, Juan; Plachta, Nicolas; Mocskos, Esteban; Levi, Valeria

    2015-01-01

    Fluorescence correlation spectroscopy (FCS) methods are powerful tools for unveiling the dynamical organization of cells. For simple cases, such as molecules passively moving in a homogeneous medium, FCS analysis yields analytical functions that can be fitted to the experimental data to recover the phenomenological rate parameters. Unfortunately, many dynamical processes in cells do not follow these simple models, and in many instances it is not possible to obtain an analytical function through a theoretical analysis of a more complex model. In such cases, experimental analysis can be combined with Monte Carlo simulations to aid in interpretation of the data. In response to this need, we developed a method called FERNET (Fluorescence Emission Recipes and Numerical routines Toolkit) based on Monte Carlo simulations and the MCell-Blender platform, which was designed to treat the reaction-diffusion problem under realistic scenarios. This method enables us to set complex geometries of the simulation space, distribute molecules among different compartments, and define interspecies reactions with selected kinetic constants, diffusion coefficients, and species brightness. We apply this method to simulate single- and multiple-point FCS, photon-counting histogram analysis, raster image correlation spectroscopy, and two-color fluorescence cross-correlation spectroscopy. We believe that this new program could be very useful for predicting and understanding the output of fluorescence microscopy experiments. PMID:26039162
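
    As an example of the analytical functions available in the simple cases, free 3D diffusion of a single species through a Gaussian observation volume yields the standard FCS autocorrelation (a textbook result included here for orientation, not taken from the paper; N is the mean number of molecules in the volume and s = z_0/w_0 the axial-to-lateral aspect ratio):

        G(\tau) = \frac{1}{N} \left(1 + \frac{\tau}{\tau_D}\right)^{-1}
                  \left(1 + \frac{\tau}{s^{2}\tau_D}\right)^{-1/2},
        \qquad \tau_D = \frac{w_0^{2}}{4D}

    Simulations such as FERNET become necessary precisely when the dynamics deviate from this closed form.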

  7. CFD-based optimization in plastics extrusion

    NASA Astrophysics Data System (ADS)

    Eusterholz, Sebastian; Elgeti, Stefanie

    2018-05-01

    This paper presents novel ideas for the numerical design of mixing elements in single-screw extruders. The design task is reformulated as a shape optimization problem, starting from some functional, but possibly inefficient, initial design. Automatic optimization can thereby be incorporated, advancing the design process beyond the simulation-supported but still experience-based approach. This paper proposes concepts to extend a method that was developed and validated for die design to the design of mixing elements. For simplicity, it focuses on single-phase flows only. The method conducts forward simulations to predict the quasi-steady melt behavior in the relevant part of the extruder. The result of each simulation is used in a black-box optimization procedure based on an efficient low-order parameterization of the geometry. To minimize user interaction, an objective function is formulated that quantifies the product's quality based on the forward simulation. This paper covers two aspects: (1) it reviews the set-up of the optimization framework as discussed in [1], and (2) it details the extensions necessary for the optimization of mixing elements in single-screw extruders. It concludes with a presentation of first advances in the unsteady flow simulation of a metering and mixing section with the SSMUM [2] using the Carreau material model.

  8. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity Slim Fly flit-level model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model against the Slim Fly model results of Kathareios et al. at moderately sized network scales. We further scale the model size up to an unprecedented 1 million compute nodes; and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we gain insight into the network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster, achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.
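
    To illustrate the discrete-event paradigm underlying ROSS/CODES (though not its optimistic parallel synchronization, which is the hard part), a minimal sequential event loop for packets crossing a fixed-latency link might look like the sketch below; all names and latencies are illustrative.

        import heapq

        def simulate(initial_events, link_latency=10):
            """Minimal sequential discrete-event loop over (time, packet, node) events."""
            pq = list(initial_events)
            heapq.heapify(pq)
            delivered = []
            while pq:
                t, pkt, node = heapq.heappop(pq)      # always process the earliest event
                if node == "dest":
                    delivered.append((pkt, t))         # record arrival time
                else:
                    # forward one hop; a full model would route over the Slim Fly graph
                    heapq.heappush(pq, (t + link_latency, pkt, "dest"))
            return delivered

        print(simulate([(0, "p0", "src"), (3, "p1", "src")]))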

  9. Accelerating large-scale simulation of seismic wave propagation by multi-GPUs and three-dimensional domain decomposition

    NASA Astrophysics Data System (ADS)

    Okamoto, Taro; Takenaka, Hiroshi; Nakamura, Takeshi; Aoki, Takayuki

    2010-12-01

    We adopted the GPU (graphics processing unit) to accelerate the large-scale finite-difference simulation of seismic wave propagation. The simulation can benefit from the high memory bandwidth of the GPU because it is a "memory intensive" problem. In a single-GPU case we achieved a performance of about 56 GFlops, about 45-fold faster than that achieved by a single core of the host central processing unit (CPU). We confirmed that optimized use of fast shared memory and registers was essential for performance. In the multi-GPU case with three-dimensional domain decomposition, the non-contiguous memory alignment of the ghost zones was found to impose substantial data-transfer time between the GPU and the host node. This problem was solved by using contiguous memory buffers for the ghost zones. We achieved a performance of about 2.2 TFlops by using 120 GPUs and 330 GB of total memory: nearly (or more than) 2200 cores of host CPUs would be required to achieve the same performance. The weak scaling was nearly proportional to the number of GPUs. We therefore conclude that GPU computing for large-scale simulation of seismic wave propagation is a promising approach, as faster simulation is possible with reduced computational resources compared to CPUs.
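
    The ghost-zone fix described above amounts to packing strided slices into contiguous buffers before each transfer. A NumPy sketch conveys the idea (array shapes and the halo width are illustrative; in the actual code the pack and unpack steps would run as GPU kernels around the device-host copy):

        import numpy as np

        nx, ny, nz, halo = 64, 64, 64, 2
        field = np.random.rand(nx, ny, nz)

        ghost_view = field[:, :, -halo:]             # strided view: not contiguous
        assert not ghost_view.flags["C_CONTIGUOUS"]
        send_buf = np.ascontiguousarray(ghost_view)  # pack into one contiguous block
        # ... one large device-host transfer of send_buf instead of many small ones ...
        recv_buf = send_buf                          # stand-in for the neighbor's data
        field[:, :, :halo] = recv_buf                # unpack into the local ghost zone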

  10. Remembering the past and imagining the future: Identifying and enhancing the contribution of episodic memory

    PubMed Central

    Schacter, Daniel L; Madore, Kevin P

    2016-01-01

    Recent studies have shown that imagining or simulating future events relies on many of the same cognitive and neural processes as remembering past events. According to the constructive episodic simulation hypothesis (Schacter and Addis, 2007), such overlap indicates that both remembered past and imagined future events rely heavily on episodic memory: future simulations are built on retrieved details of specific past experiences that are recombined into novel events. An alternative possibility is that commonalities between remembering and imagining reflect the influence of more general, non-episodic factors such as narrative style or communicative goals that shape the expression of both memory and imagination. We consider recent studies that distinguish the contributions of episodic and non-episodic processes in remembering the past and imagining the future by using an episodic specificity induction – brief training in recollecting the details of a past experience – and also extend this approach to the domains of problem solving and creative thinking. We conclude by suggesting that the specificity induction may target a process of scene construction that contributes to episodic memory as well as to imagination, problem solving, and creative thinking. PMID:28163775

  11. Time and length scales within a fire and implications for numerical simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TIESZEN,SHELDON R.

    2000-02-02

    A partial non-dimensionalization of the Navier-Stokes equations is used to obtain order-of-magnitude estimates of the rate-controlling transport processes in the reacting portion of a fire plume as a function of length scale. Over continuum length scales, buoyant time scales vary as the square root of the length scale; advection time scales vary as the length scale; and diffusion time scales vary as the square of the length scale. Due to this variation with length scale, each process is dominant over a given range. The relationship of buoyancy and baroclinic vorticity generation is highlighted. For numerical simulation, first-principles solution of fire problems is not possible with foreseeable computational hardware in the near future. Filtered transport equations with subgrid modeling will be required, as only two to three decades of length scale can be captured by solution of discretized conservation equations. Whatever filtering process one employs, one must have humble expectations for the accuracy obtainable by numerical simulation of practical fire problems, which involve important multi-physics/multi-length-scale coupling spanning up to 10 orders of magnitude in length scale.
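
    The scaling statement can be made explicit. With characteristic length L, velocity U, gravitational acceleration g, and diffusivity D, the order-of-magnitude time scales are (a standard non-dimensionalization, reconstructed here rather than quoted from the report):

        t_{buoy} \sim \sqrt{L/g}, \qquad
        t_{adv} \sim L/U, \qquad
        t_{diff} \sim L^{2}/D

    so diffusion controls the smallest scales, buoyancy the largest, with advection in between.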

  12. Process fault detection and nonlinear time series analysis for anomaly detection in safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, T.L.; Mullen, M.F.; Wangen, L.E.

    In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions, in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios from the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
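
    The two residual tests compared in the study can be sketched as follows (a minimal version assuming the residual vectors are already available as rows of an array; the thresholds and the stand-in data are illustrative):

        import numpy as np

        def univariate_z(residuals, sigma, z_crit=3.0):
            """Flag any individual residual whose z-score exceeds z_crit."""
            return np.abs(residuals / sigma) > z_crit

        def mahalanobis_test(residuals, cov, d_crit=3.0):
            """Flag whole residual vectors by their Mahalanobis distance."""
            inv = np.linalg.inv(cov)
            d2 = np.einsum("ij,jk,ik->i", residuals, inv, residuals)
            return np.sqrt(d2) > d_crit

        r = np.random.randn(100, 3)   # stand-in residuals for three tank levels
        print(univariate_z(r, sigma=np.ones(3)).any(axis=1).mean())
        print(mahalanobis_test(r, cov=np.eye(3)).mean())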

  13. Modeling and simulating vortex pinning and transport currents for high temperature superconductors

    NASA Astrophysics Data System (ADS)

    Sockwell, K. Chad

    Superconductivity is a phenomenon characterized by two hallmark properties: zero electrical resistance and the Meissner effect. These properties give great promise to a new generation of resistance-free electronics and powerful superconducting magnets. However, this possibility is limited by the extremely low critical temperatures below which superconductors must operate, typically close to 0 K. The recent discovery of high temperature superconductors has brought the critical temperature closer to room temperature than ever before, making the realization of room temperature superconductivity a possibility. Simulations of superconducting technology and materials will be necessary to usher in the new wave of superconducting electronics. Unfortunately, these new materials come with new properties, such as effects from multiple electron bands, as is the case for magnesium diboride. Moreover, all high temperature superconductors are of the Type II variety, which possess magnetic tubes of flux known as vortices. These vortices interact with transport currents, creating an electrical resistance through a process known as flux flow. Thankfully, this process can be prevented by placing impurities in the superconductor, pinning the vortices, which makes vortex pinning a necessary aspect of our model. At this time there are no other models or simulations aimed at modeling vortex pinning by impurities in two-band materials. In this work we modify an existing Ginzburg-Landau model for two-band superconductors and add the ability to model normal inclusions (impurities) with a new approach unique to the two-band model. Simulations intended to model the material magnesium diboride are also presented; in particular, simulations of vortex pinning and transport currents using the modified model are shown. The qualitative properties of magnesium diboride are used to validate the model and its simulations. One main goal on the computational side is to enlarge the domain size to produce more realistic simulations that avoid boundary pinning effects. In this work we also implement the numerical software library Trilinos to parallelize the simulation and enlarge the domain size. Decoupling methods are investigated with the same goal of enlarging the domain size. The one-band Ginzburg-Landau model serves as a prototypical problem in this endeavor, and the methods shown to enlarge the domain size can be easily implemented in the two-band model.

  14. Objects Mental Rotation under 7 Days Simulated Weightlessness Condition: An ERP Study.

    PubMed

    Wang, Hui; Duan, Jiaobo; Liao, Yang; Wang, Chuang; Li, Hongzheng; Liu, Xufeng

    2017-01-01

    During spaceflight under weightlessness conditions, human brain function may be affected by the physiological changes that accompany the redistribution of blood and body fluids toward the head. This variation in brain function can influence the performance of astronauts and therefore pose a possible hazard to flight safety. This study employed 20 male subjects in a 7-day, -6° head-down tilt (HDT) bed rest model to simulate the physiological effects of weightlessness, and used behavioral and electrophysiological techniques to compare changes in mental rotation (MR) ability before and after this short-term simulated weightlessness. Behavioral results showed a significant linear relationship between the rotation angle of the stimuli and the reaction time, indicating that mental rotation did take place during the MR task in the simulated weightlessness state. In the first 3 days, the P300 component induced by object mental rotation followed a "down-up-down" pattern; in the following 4 days it changed randomly. On HDT D2, the mean amplitude of the P300 was the lowest, while it increased gently on HDT D3. No clear pattern in P300 amplitude was observed after 3 days of HDT. Simulated weightlessness does not change the basic process of mental rotation; its effect appears to be a neural mechanism of self-adaptation. MR ability did not return to its original level after the HDT test.

  15. Modeling of cobalt-based catalyst use during CSS for low-temperature heavy oil upgrading

    NASA Astrophysics Data System (ADS)

    Kadyrov, R.; Sitnov, S.; Gareev, B.; Batalin, G.

    2018-05-01

    One of the methods actively used on deposits of heavy oil in the Upper Kungurian (Ufimian) sandstones of the Republic of Tatarstan is cyclic steam stimulation (CSS). This method consists of three stages: injection, soaking, and production. Steam is injected into a well at a temperature of 300 to 340 °C for a period of weeks to months. The well is then allowed to sit for days to weeks so that the heat soaks into the formation. Finally, the hot oil is pumped out of the well for a period of weeks or months. Once the production rate falls off, the well is put through another cycle. Injecting a catalyst solution before the steam opens the possibility of upgrading the heavy oil by aquathermolysis directly in the reservoir. In this paper, the possibility of using a cobalt-based catalyst precursor for upgrading the hydrocarbons of this field during their extraction is presented. SARA analysis of oil-saturated sandstones shows an increase in the proportion of saturated hydrocarbons by 11.1%, due to the hydrogenation of aromatic hydrocarbons and their derivatives, while the contents of resins and asphaltenes remain practically unchanged. A new method for estimating the adsorption of the catalyst is proposed, based on the change in the concentration of the base metal before and after simulated catalyst injection under the thermobaric conditions of the reservoir. In the study of catalyst adsorption in the rock, when simulating the CSS process, it was found that almost 28% of the cobalt, the main element of the catalyst precursor, is retained in the rock.
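
    The proposed adsorption estimate reduces to a simple mass balance on the base metal: the retained fraction is the relative drop in cobalt concentration across the simulated injection. A worked example (the concentrations are hypothetical; only the ≈28% figure comes from the paper):

        # Hypothetical cobalt concentrations in solution before/after injection (mg/L)
        c_before = 100.0
        c_after = 72.0
        retained = (c_before - c_after) / c_before
        print(f"cobalt retained in the rock: {retained:.0%}")   # -> 28%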

  16. Sensitivity Analysis and Optimization of Enclosure Radiation with Applications to Crystal Growth

    NASA Technical Reports Server (NTRS)

    Tiller, Michael M.

    1995-01-01

    In engineering, simulation software is often used as a convenient means for carrying out experiments to evaluate physical systems. The benefit of using simulations as 'numerical' experiments is that the experimental conditions can be easily modified and repeated at much lower cost than the comparable physical experiment. The goal of these experiments is to 'improve' the process or result of the experiment. In most cases, the computational experiments employ the same trial-and-error approach as their physical counterparts. When using this approach for complex systems, the cause-and-effect relationships of the system may never be fully understood and efficient strategies for improvement never utilized. However, when running simulations it is possible to accurately and efficiently determine the sensitivity of the system results with respect to simulation parameters (e.g., initial conditions, boundary conditions, and material properties) by manipulating the underlying computations. This results in a better understanding of the system dynamics and gives us efficient means to improve processing conditions. We begin by discussing the steps involved in performing simulations. Then we consider how sensitivity information about simulation results can be obtained and ways this information may be used to improve the process or result of the experiment. Next, we discuss optimization and the efficient algorithms which use sensitivity information. We draw on all this information to propose a generalized approach for integrating simulation and optimization, with an emphasis on software programming issues. After discussing our approach to simulation and optimization, we consider an application involving crystal growth. This application is interesting because it includes radiative heat transfer. We discuss the computation of radiative view factors and the impact this mode of heat transfer has on our approach. Finally, we demonstrate the results of our optimization.
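
    The simplest way to obtain such sensitivities is to perturb each simulation parameter and difference the results; the paper manipulates the underlying computations directly, which is more accurate and efficient, but a black-box sketch conveys the idea (the function and parameter names are illustrative):

        def sensitivity(simulate, params, eps=1e-6):
            """Forward-difference sensitivity of a scalar simulation result
            with respect to each parameter."""
            base = simulate(params)
            return {name: (simulate(dict(params, **{name: value + eps})) - base) / eps
                    for name, value in params.items()}

        # Toy stand-in for a simulation: result depends on a boundary condition
        # and a material property.
        model = lambda p: p["T_wall"] ** 2 + 3.0 * p["conductivity"]
        print(sensitivity(model, {"T_wall": 300.0, "conductivity": 1.5}))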

  17. Implementing a warm cloud microphysics parameterization for convective clouds in NCAR CESM

    NASA Astrophysics Data System (ADS)

    Shiu, C.; Chen, Y.; Chen, W.; Li, J. F.; Tsai, I.; Chen, J.; Hsu, H.

    2013-12-01

    Most cumulus convection schemes use simple empirical approaches to convert cloud liquid mass to rain water or cloud ice to snow, e.g., using a constant autoconversion rate and dividing cloud liquid mass into cloud water and ice as a function of air temperature (e.g., the Zhang and McFarlane scheme in the NCAR CAM model). Few studies have tried to use cloud microphysical schemes to better simulate such precipitation processes in the convective schemes of global models (e.g., Lohmann [2008] and Song, Zhang, and Li [2012]). Here, a two-moment warm cloud parameterization (Chen and Liu [2004]) is implemented into the deep convection scheme of CAM5.2 of the CESM model to treat the conversion of cloud liquid water to rain water. Short-term AMIP-type global simulations are conducted to evaluate the possible impacts of this modified physical parameterization. Simulated results are further compared to observational results from the AMWG diagnostic package and CloudSat data sets. Several sensitivity tests regarding changes in cloud-top droplet concentration (as a rough test of aerosol indirect effects) and changes in the detrained cloud size of convective cloud ice are also carried out to understand their possible impacts on the cloud and precipitation simulations.
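
    The contrast drawn here can be summarized by two representative rate forms: a constant-rate (Kessler-type) conversion, as in many convection schemes, versus a two-moment rate that also depends on the droplet number concentration N_d (generic forms for orientation only; the actual Chen and Liu [2004] parameterization differs in its details):

        P_{auto} = k \, \max(0, \, q_c - q_{c,crit})
        \qquad \text{versus} \qquad
        P_{auto} \propto q_c^{\,a} \, N_d^{-b}, \quad a, b > 0

    The N_d dependence is what lets a two-moment scheme respond to aerosol-driven changes in droplet concentration.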

  18. Comparing simulations and test data of a radiation damaged charge-coupled device for the Euclid mission

    NASA Astrophysics Data System (ADS)

    Skottfelt, Jesper; Hall, David J.; Gow, Jason P. D.; Murray, Neil J.; Holland, Andrew D.; Prod'homme, Thibaut

    2017-04-01

    The visible imager instrument on board the Euclid mission is a weak-lensing experiment that depends on very precise shape measurements of distant galaxies obtained by a large charge-coupled device (CCD) array. Due to the harsh radiative environment outside the Earth's atmosphere, it is anticipated that the CCDs over the mission lifetime will be degraded to an extent that these measurements will be possible only through the correction of radiation damage effects. We have therefore created a Monte Carlo model that simulates the physical processes taking place when transferring signals through a radiation-damaged CCD. The software is based on Shockley-Read-Hall theory and is made to mimic the physical properties in the CCD as closely as possible. The code runs on a single electrode level and takes the three-dimensional trap position, potential structure of the pixel, and multilevel clocking into account. A key element of the model is that it also takes device specific simulations of electron density as a direct input, thereby avoiding making any analytical assumptions about the size and density of the charge cloud. This paper illustrates how test data and simulated data can be compared in order to further our understanding of the positions and properties of the individual radiation-induced traps.
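
    In Shockley-Read-Hall theory, the quantity that decides whether a trapped electron is released within the clocking sequence is the emission time constant; a standard form (reconstructed from SRH theory rather than quoted from the paper) is

        \tau_e = \frac{1}{\chi \, \sigma \, v_{th} \, N_c}
                 \exp\!\left(\frac{E_t}{k_B T}\right)

    with σ the trap capture cross-section, v_th the thermal velocity, N_c the effective density of states, E_t the trap energy depth, and χ an entropy factor. Comparing τ_e with the transfer timing determines which traps produce deferred charge.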

  19. The Shale Hills Critical Zone Observatory for Embedded Sensing and Simulation

    NASA Astrophysics Data System (ADS)

    Duffy, C.; Davis, K.; Kane, T.; Boyer, E.

    2009-04-01

    The future of environmental observing systems will utilize embedded sensor networks with continuous real-time measurement of hydrologic, atmospheric, biogeochemical, and ecological variables across diverse terrestrial environments. Embedded environmental sensors, benefitting from advances in information sciences, networking technology, materials science, computing capacity, and data synthesis methods, are undergoing revolutionary change. It is now possible to field spatially-distributed, multi-node sensor networks that provide density and spatial coverage previously accessible only via numerical simulation. At the same time, computational tools are advancing rapidly to the point where it is now possible to simulate the physical processes controlling individual parcels of water and solutes through the complete terrestrial water cycle. Our goal for the Penn State Critical Zone Observatory is to deploy and coordinate environmental sensor arrays and integrated hydrologic models at a testbed within the Penn State Experimental Forest. The NSF-funded CZO is designed to observe the detailed space and time complexities of the water and energy cycle for a watershed, and ultimately the river basin, for all physical states and fluxes (groundwater, soil moisture, temperature, streamflow, latent heat, snowmelt, chemistry, isotopes, etc.). Physical models are presently being developed that link the atmosphere-land-vegetation-subsurface system into a fully-coupled distributed system. During the last 5 years the Penn State Integrated Hydrologic Modeling System (PIHM) has been under development as an open-source community modeling project funded by NSF EAR/GEO and NSF CBET/ENG. PIHM represents a strategy for the formulation and solution of fully-coupled process equations at the watershed and river basin scales, and includes a tightly coupled GIS tool for data handling, domain decomposition, optimal unstructured grid generation, and model parameterization (PIHM; http://sourceforge.net/projects/pihmmodel/; http://sourceforge.net/projects/pihmgis/). The CZO sensor and simulation system is being developed to have the following elements: 1) extensive, spatially-distributed smart sensor networks to gather intensive soil, geologic, hydrologic, geochemical and isotopic data; 2) spatially-explicit multiphysics models/solutions of the land-subsurface-vegetation-atmosphere system; 3) parallel/distributed, adaptive algorithms for rapidly simulating the states of the watershed at high resolution; and 4) signal processing tools for data mining and parameter estimation. The proposed prototype sensor array and simulation system is demonstrated with preliminary results from our first year.

  20. Prototyping and Simulation of Robot Group Intelligence using Kohonen Networks.

    PubMed

    Wang, Zhijun; Mirdamadi, Reza; Wang, Qing

    2016-01-01

    Intelligent agents such as robots can form ad hoc networks and replace human beings in many dangerous scenarios, such as a complicated disaster relief site. This project prototypes and builds a computer simulator to simulate robot kinetics, unsupervised learning using Kohonen networks, and group intelligence when an ad hoc network is formed. Each robot is modeled as an object with a simple set of attributes and methods that define its internal states and the possible actions it may take under certain circumstances. As a result, simple, reliable, and affordable robots can be deployed to form the network. The simulator treats a group of robots as an unsupervised learning unit and tests the learning results under scenarios of varying complexity. The simulation results show that a group of robots can demonstrate highly collaborative behavior on complex terrain. This study could provide a software simulation platform for testing the individual and group capabilities of robots before the design and manufacturing of robots. The results of the project therefore have the potential to reduce the cost and improve the efficiency of robot design and building.
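
    The unsupervised learning step rests on the standard Kohonen update: the best-matching unit and its neighbours move toward each input. A minimal sketch (grid size, rates, and data are illustrative; the paper's robot-specific attributes are omitted):

        import numpy as np

        def train_som(data, rows=5, cols=5, epochs=50, lr=0.5, radius=1.5):
            """Minimal Kohonen self-organizing map: each input pulls the
            best-matching unit and, via a Gaussian neighbourhood, its neighbours."""
            rng = np.random.default_rng(0)
            w = rng.random((rows, cols, data.shape[1]))
            yy, xx = np.mgrid[0:rows, 0:cols]
            for _ in range(epochs):
                for x in data:
                    d = np.linalg.norm(w - x, axis=2)          # distance to every unit
                    bi, bj = np.unravel_index(d.argmin(), d.shape)
                    h = np.exp(-((yy - bi) ** 2 + (xx - bj) ** 2) / (2 * radius ** 2))
                    w += lr * h[..., None] * (x - w)           # neighbourhood update
            return w

        som = train_som(np.random.rand(200, 2))
        print(som.shape)   # (5, 5, 2)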

  1. Evaluation of WRF Model Against Satellite and Field Measurements During ARM March 2000 IOP

    NASA Astrophysics Data System (ADS)

    Wu, J.; Zhang, M.

    2003-12-01

    The mesoscale WRF model is employed to simulate the organization of clouds related to the cyclogenesis that occurred during March 1-4, 2000 over the ARM SGP CART site. Qualitative comparisons of simulated clouds with GOES-8 satellite images show that the WRF model can capture the main features of the clouds related to the cyclogenesis. The simulated precipitation patterns also match the radar reflectivity images well. Further evaluation of the simulated features on the GCM grid scale is conducted against ARM field measurements. The evaluation shows that the evolution of the simulated state fields such as temperature and moisture, the simulated wind fields, and the derived large-scale temperature and moisture tendencies closely follow the observed patterns. These results encourage us to use the mesoscale WRF model as a tool to verify the performance of GCMs in simulating cloud feedback processes related to frontal clouds, so that we can test and validate current cloud parameterizations in climate models and make possible improvements to their components.

  2. Prototyping and Simulation of Robot Group Intelligence using Kohonen Networks

    PubMed Central

    Wang, Zhijun; Mirdamadi, Reza; Wang, Qing

    2016-01-01

    Intelligent agents such as robots can form ad hoc networks and replace human beings in many dangerous scenarios, such as a complicated disaster relief site. This project prototypes and builds a computer simulator to simulate robot kinetics, unsupervised learning using Kohonen networks, and group intelligence when an ad hoc network is formed. Each robot is modeled as an object with a simple set of attributes and methods that define its internal states and the possible actions it may take under certain circumstances. As a result, simple, reliable, and affordable robots can be deployed to form the network. The simulator treats a group of robots as an unsupervised learning unit and tests the learning results under scenarios of varying complexity. The simulation results show that a group of robots can demonstrate highly collaborative behavior on complex terrain. This study could provide a software simulation platform for testing the individual and group capabilities of robots before the design and manufacturing of robots. The results of the project therefore have the potential to reduce the cost and improve the efficiency of robot design and building. PMID:28540284

  3. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods.

    PubMed

    Skjerdal, Taran; Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; de Cecare, Alessandra; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Trevisiani, Marcello; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real-life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as a starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes, quality scores, and vitamin C were placed in an interactive database. Customization of the models and settings was possible on the user interface. The simulation module outputs were provided as detailed curves or categorized as "good", "sufficient", or "corrective action needed" based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.

  4. Patterns of deoxygenation: sensitivity to natural and anthropogenic drivers

    NASA Astrophysics Data System (ADS)

    Oschlies, Andreas; Duteil, Olaf; Getzlaff, Julia; Koeve, Wolfgang; Landolfi, Angela; Schmidtko, Sunke

    2017-08-01

    Observational estimates and numerical models both indicate a significant overall decline in marine oxygen levels over the past few decades. Spatial patterns of oxygen change, however, differ considerably between observed and modelled estimates. Particularly in the tropical thermocline that hosts open-ocean oxygen minimum zones, observations indicate a general oxygen decline, whereas most of the state-of-the-art models simulate increasing oxygen levels. Possible reasons for the apparent model-data discrepancies are examined. In order to attribute observed historical variations in oxygen levels, we here study mechanisms of changes in oxygen supply and consumption with sensitivity model simulations. Specifically, the role of equatorial jets, of lateral and diapycnal mixing processes, of changes in the wind-driven circulation and atmospheric nutrient supply, and of some poorly constrained biogeochemical processes are investigated. Predominantly wind-driven changes in the low-latitude oceanic ventilation are identified as a possible factor contributing to observed oxygen changes in the low-latitude thermocline during the past decades, while the potential role of biogeochemical processes remains difficult to constrain. We discuss implications for the attribution of observed oxygen changes to anthropogenic impacts and research priorities that may help to improve our mechanistic understanding of oxygen changes and the quality of projections into a changing future. This article is part of the themed issue 'Ocean ventilation and deoxygenation in a warming world'.

  5. The STARTEC Decision Support Tool for Better Tradeoffs between Food Safety, Quality, Nutrition, and Costs in Production of Advanced Ready-to-Eat Foods

    PubMed Central

    Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch.; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine

    2017-01-01

    A prototype decision support IT tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real-life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as a starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes, quality scores, and vitamin C were placed in an interactive database. Customization of the models and settings was possible on the user interface. The simulation module outputs were provided as detailed curves or categorized as “good”; “sufficient”; or “corrective action needed” based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users. PMID:29457031

  6. A stochastic estimation procedure for intermittently-observed semi-Markov multistate models with back transitions.

    PubMed

    Aralis, Hilary; Brookmeyer, Ron

    2017-01-01

    Multistate models provide an important method for analyzing a wide range of life history processes including disease progression and patient recovery following medical intervention. Panel data consisting of the states occupied by an individual at a series of discrete time points are often used to estimate transition intensities of the underlying continuous-time process. When transition intensities depend on the time elapsed in the current state and back transitions between states are possible, this intermittent observation process presents difficulties in estimation due to intractability of the likelihood function. In this manuscript, we present an iterative stochastic expectation-maximization algorithm that relies on a simulation-based approximation to the likelihood function and implement this algorithm using rejection sampling. In a simulation study, we demonstrate the feasibility and performance of the proposed procedure. We then demonstrate application of the algorithm to a study of dementia, the Nun Study, consisting of intermittently-observed elderly subjects in one of four possible states corresponding to intact cognition, impaired cognition, dementia, and death. We show that the proposed stochastic expectation-maximization algorithm substantially reduces bias in model parameter estimates compared to an alternative approach used in the literature, minimal path estimation. We conclude that in estimating intermittently observed semi-Markov models, the proposed approach is a computationally feasible and accurate estimation procedure that leads to substantial improvements in back transition estimates.
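
    The E-step idea, approximating the conditional law of the latent path by simulating forward and keeping only paths consistent with the panel observation, can be sketched for a two-state toy chain (rates, times, and the Markov sojourns are illustrative simplifications of the paper's four-state semi-Markov setting):

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_path(rates, t_end, state=0):
            """Forward-simulate a two-state continuous-time chain until t_end."""
            t, path = 0.0, [(0.0, 0)]
            while True:
                t += rng.exponential(1.0 / rates[state])   # sojourn in current state
                if t >= t_end:
                    return path, state
                state = 1 - state
                path.append((t, state))

        def rejection_estep(rates, obs_time, obs_state, n=5000):
            """Keep only simulated paths that match the state observed at obs_time."""
            sims = (simulate_path(rates, obs_time) for _ in range(n))
            return [path for path, end_state in sims if end_state == obs_state]

        accepted = rejection_estep(rates=(0.5, 1.0), obs_time=2.0, obs_state=1)
        print(len(accepted), "accepted paths out of 5000")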

  7. Discrete Element Method and its application to materials failure problem on the example of Brazilian Test

    NASA Astrophysics Data System (ADS)

    Klejment, Piotr; Kosmala, Alicja; Foltyn, Natalia; Dębski, Wojciech

    2017-04-01

    The earthquake focus is the point where a rock under external stress starts to fracture. Understanding earthquake nucleation and earthquake dynamics thus requires understanding the fracturing of brittle materials. This, however, is a continuing problem and an enduring challenge for geoscience. In spite of significant progress, we still do not fully understand the failure of rock materials under the extreme stress concentrations found in natural conditions. One reason is that information about natural or induced seismic events is still not sufficient for a precise description of the physical processes in seismic foci. One possibility for improving this situation is numerical simulation - a powerful tool of contemporary physics. For this reason we used an advanced implementation of the Discrete Element Method (DEM). DEM's main task is to calculate the physical properties of materials represented as an assembly of a great number of particles interacting with each other. We analyze the possibility of using DEM to describe materials during the so-called Brazilian test, a testing method for obtaining the tensile strength of brittle materials. One of the primary reasons for conducting such simulations is to measure macroscopic parameters of the rock sample. We report our efforts to describe the fracturing process during the Brazilian test from the microscopic point of view and to give insight into the physical processes preceding material failure.
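
    For context, the macroscopic quantity a Brazilian test yields is the indirect tensile strength, computed from the failure load P and the disc diameter D and thickness t via the standard formula

        \sigma_t = \frac{2P}{\pi D t}

    so a DEM simulation is judged, in part, by whether the simulated failure load reproduces the measured σ_t.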

  8. Protein dynamics in a broad frequency range: Dielectric spectroscopy studies

    DOE PAGES

    Nakanishi, Masahiro; Sokolov, Alexei P.

    2014-09-17

    We present detailed dielectric spectroscopy studies of dynamics in two hydrated proteins, lysozyme and myoglobin. We emphasize the importance of explicitly accounting for possible Maxwell-Wagner (MW) polarization effects in protein powder samples. Combining our data with earlier literature results, we demonstrate the existence of three major relaxation processes in globular proteins. To understand the mechanisms of these relaxations we draw on literature data from neutron scattering, simulations, and NMR studies. The fastest process is ascribed to coupled protein-hydration water motions and has a relaxation time of ~10-50 ps at room temperature. The intermediate process is ~10²-10³ times slower than the fastest process and might be strongly affected by MW polarization. Based on the analysis of data obtained by different experimental techniques and simulations, we ascribe this process to large-scale domain-like motions of proteins. The slowest observed process is ~10⁶-10⁷ times slower than the fastest process and has an anomalously large dielectric amplitude Δε ~ 10²-10⁴. The microscopic nature of this process is not clear, but it seems to be related to the glass transition of hydrated proteins. The presented results suggest a general classification of the relaxation processes in hydrated proteins.
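
    Each such process is typically fitted with a relaxation function; the simplest is the Debye form for the complex permittivity, with Δε the relaxation strength and τ the relaxation time (a generic model given for orientation - fits to real protein spectra usually require broadened variants such as Havriliak-Negami):

        \varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\Delta\varepsilon}{1 + i\omega\tau}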

  9. Interactive, graphical processing unit-based evaluation of evacuation scenarios at the state scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Aaby, Brandon G; Yoginath, Srikanth B

    2011-01-01

    In large-scale scenarios, transportation modeling and simulation is severely constrained by simulation time. For example, few real-time simulators scale to evacuation traffic scenarios at the level of an entire state, such as Louisiana (approximately 1 million links) or Florida (2.5 million links). New simulation approaches are needed to overcome the severe computational demands of conventional (microscopic or mesoscopic) modeling techniques. Here, a new modeling and execution methodology is explored that holds the potential to provide a tradeoff among the level of behavioral detail, the scale of the transportation network, and real-time execution capability. A novel, field-based modeling technique and its implementation on graphical processing units are presented. Although additional research with input from domain experts is needed to refine and validate the models, the techniques reported here afford interactive experience at very large scales of multi-million road segments. Illustrative experiments on a few state-scale networks are described, based on an implementation of this approach in a software system called GARFIELD. Current modeling capabilities and implementation limitations are described, along with possible use cases and future research.

  10. Knowledge Assisted Integrated Design of a Component and Its Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Gautham, B. P.; Kulkarni, Nagesh; Khan, Danish; Zagade, Pramod; Reddy, Sreedhar; Uppaluri, Rohith

    Integrated design of a product and its manufacturing processes would significantly reduce the total cost of products as well as the cost of their development. However, this is possible only with a platform that links together the simulation tools used for product design, performance evaluation, and manufacturing processes in a closed loop. In addition, a comprehensive knowledge base providing systematic, knowledge-guided assistance to product or process designers, who may not possess in-depth design knowledge or in-depth knowledge of the simulation tools, would significantly speed up the end-to-end design process. In this paper, we propose a process for achieving integrated product and manufacturing process design, assisted by knowledge support for the user to make decisions at various stages, and illustrate it with a case. We take transmission component design as an example. The example illustrates the design of a gear for its geometry, material selection, and its manufacturing processes, particularly carburizing-quenching and tempering, feeding the material properties predicted during heat treatment into performance estimation in a closed loop. It also identifies and illustrates various decision stages in the integrated life cycle and discusses the use of knowledge engineering tools, such as rule-based guidance, to assist the designer in making informed decisions. Simulation tools developed on various commercial and open-source platforms, as well as in-house tools, are linked with knowledge engineering tools to build a framework with appropriate navigation through user-friendly interfaces. This is illustrated through examples in this paper.

  11. The Use of Microgravity Simulators for Space Research

    NASA Technical Reports Server (NTRS)

    Zhang, Ye; Richards, Stephanie E.; Richards, Jeffrey T.; Levine, Howard G.

    2016-01-01

    The spaceflight environment is known to influence biological processes ranging from stimulation of cellular metabolism to possible impacts on cellular damage repair, suppression of immune functions, and bone loss in astronauts. Microgravity is one of the most significant stress factors experienced by living organisms during spaceflight, and therefore, understanding cellular responses to altered gravity at the physiological and molecular level is critical for expanding our knowledge of life in space. Since opportunities to conduct experiments in space are scarce, various microgravity simulators and analogues have been widely used in space biology ground studies. Even though simulated microgravity conditions have produced some, but not all of the biological effects observed in the true microgravity environment, they provide test beds that are effective, affordable, and readily available to facilitate microgravity research. Kennedy Space Center (KSC) provides ground microgravity simulator support to offer a variety of microgravity simulators and platforms for Space Biology investigators. Assistance will be provided by both KSC and external experts in molecular biology, microgravity simulation, and engineering. Comparisons between the physical differences in microgravity simulators, examples of experiments using the simulators, and scientific questions regarding the use of microgravity simulators will be discussed.

  12. The Use of Microgravity Simulators for Space Research

    NASA Technical Reports Server (NTRS)

    Zhang, Ye; Richards, Stephanie E.; Wade, Randall I.; Richards, Jeffrey T.; Fritsche, Ralph F.; Levine, Howard G.

    2016-01-01

    The spaceflight environment is known to influence biological processes ranging from stimulation of cellular metabolism to possible impacts on cellular damage repair, suppression of immune functions, and bone loss in astronauts. Microgravity is one of the most significant stress factors experienced by living organisms during spaceflight, and therefore, understanding cellular responses to altered gravity at the physiological and molecular level is critical for expanding our knowledge of life in space. Since opportunities to conduct experiments in space are scarce, various microgravity simulators and analogues have been widely used in space biology ground studies. Even though simulated microgravity conditions have produced some, but not all of the biological effects observed in the true microgravity environment, they provide test beds that are effective, affordable, and readily available to facilitate microgravity research. A Micro-g Simulator Center is being developed at Kennedy Space Center (KSC) to offer a variety of microgravity simulators and platforms for Space Biology investigators. Assistance will be provided by both KSC and external experts in molecular biology, microgravity simulation, and engineering. Comparisons between the physical differences in microgravity simulators, examples of experiments using the simulators, and scientific questions regarding the use of microgravity simulators will be discussed.

  13. 3D FEM Simulation of Flank Wear in Turning

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio

    2011-05-01

    This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces, and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion, and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, etc. In some cases these wear mechanisms are described by analytical models as functions of process variables (temperature, pressure, and sliding velocity along the cutting surface). Such analytical models are suitable for implementation in FEM codes and can be utilized to simulate tool wear. In the present paper a commercial 3D FEM software has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was extended with a subroutine that modifies the tool geometry on the basis of the estimated tool wear as the simulation proceeds. Since, for the considered tool-workpiece material pair, the main wear-generating phenomena are abrasion and diffusion, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and the Takeyama and Murata model. A comparison between experimental and simulated flank wear curves is reported, demonstrating that it is possible to simulate tool wear development.
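
    Usui's model, one of the two combined in the subroutine, expresses the wear rate in terms of the normal stress σ_n, sliding velocity V_s, and absolute temperature T at the tool surface (standard form; the constants A and B are calibrated to the tool-workpiece pair):

        \frac{dW}{dt} = A \, \sigma_n \, V_s \exp\!\left(-\frac{B}{T}\right)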

  14. How much expert knowledge is it worth to put in conceptual hydrological models?

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Zappa, Massimiliano

    2017-04-01

    Both modellers and experimentalists agree on using expert knowledge to improve conceptual hydrological simulations in ungauged basins. However, they use expert knowledge differently, both for hydrologically mapping the landscape and for parameterising a given hydrological model. Modellers generally use very simplified (e.g. topography-based) mapping approaches and put most of their knowledge into constraining the model by defining parameter and process relational rules. In contrast, experimentalists tend to invest all their detailed, qualitative process knowledge to obtain a spatial distribution of areas with different dominant runoff generation processes (DRPs) that is as realistic as possible, and to define plausibly narrow value ranges for each model parameter. Since the modelling goal is usually just to simulate runoff at a specific site, even strongly simplified hydrological classifications can lead to satisfying results, owing to the equifinality of hydrological models, overfitting problems, and the numerous sources of uncertainty affecting runoff simulations. Therefore, to test to what extent expert knowledge can improve simulation results under uncertainty, we applied a typical modellers' framework, relying on parameter and process constraints defined from expert knowledge, to several catchments on the Swiss Plateau. To map the spatial distribution of the DRPs, mapping approaches with increasing involvement of expert knowledge were used. Simulation results highlighted the potential added value of using all the expert knowledge available on a catchment. Combinations of event types and landscapes where even a simplified mapping approach can lead to satisfying results were also identified. Finally, the uncertainty originating from the different mapping approaches was compared with that linked to meteorological input data and catchment initial conditions.

  15. Pinhole induced efficiency variation in perovskite solar cells

    NASA Astrophysics Data System (ADS)

    Agarwal, Sumanshu; Nair, Pradeep R.

    2017-10-01

    Process-induced efficiency variation is a major concern for all thin film solar cells, including the emerging perovskite-based solar cells. In this article, we address the effect of pinholes, i.e., process-induced incomplete surface coverage, on the efficiency of such solar cells through detailed numerical simulations. Interestingly, we find that the pinhole size distribution affects the short-circuit current and open-circuit voltage in contrasting manners. Specifically, while J_SC depends heavily on the pinhole size distribution, V_OC is, surprisingly, only nominally affected by it. Further, our simulations indicate that, with appropriate interface engineering, it is indeed possible to design a nanostructured device with efficiencies comparable to those of ideal planar structures. Additionally, we propose a simple technique based on terminal I-V characteristics to estimate the surface coverage in perovskite solar cells.

  16. Optimization of droplets for UV-NIL using coarse-grain simulation of resist flow

    NASA Astrophysics Data System (ADS)

    Sirotkin, Vadim; Svintsov, Alexander; Zaitsev, Sergey

    2009-03-01

    A mathematical model and numerical method are described that make it possible to simulate the ultraviolet ("step and flash") nanoimprint lithography (UV-NIL) process adequately, even on standard personal computers. The model is derived from the 3D Navier-Stokes equations, exploiting the fact that the resist motion is largely directed along the substrate surface and characterized by ultra-low values of the Reynolds number. For the numerical approximation of the model, a special finite-difference (coarse-grain) method is applied. A coarse-grain modeling tool for detailed analysis of resist spreading in UV-NIL at the structure-scale level is tested. The obtained results demonstrate the tool's ability to calculate the optimal droplet dispensing for a given stamp design and process parameters. This dispensing provides uniformly filled areas and a homogeneous residual layer thickness in UV-NIL.
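
    Reductions of this kind typically collapse the 3D Navier-Stokes equations to a thin-film (lubrication) equation for the resist thickness h(x, y, t) and pressure p, exploiting exactly the properties noted above - flow mostly parallel to the substrate at very low Reynolds number. A generic form (stated for orientation, not quoted from the paper) is

        \frac{\partial h}{\partial t} = \nabla \cdot \left( \frac{h^{3}}{12\mu} \, \nabla p \right)

    which is cheap enough to solve repeatedly when optimizing droplet placement.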

  17. The ν process in the innermost supernova ejecta

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sieverding, Andre; Martínez-Pinedo, Gabriel; Langanke, Karlheinz

    2017-12-01

    The neutrino-induced nucleosynthesis (ν process) in supernova explosions of massive stars of solar metallicity, with initial main sequence masses between 13 and 30 M⊙, has been studied with an analytic explosion model using a new extensive set of neutrino-nucleus cross-sections and spectral properties that agree with modern supernova simulations. The production factors for the nuclei 7Li, 11B, 19F, 138La and 180Ta are still significantly enhanced but do not reproduce the full solar abundances. We study the possible contribution of the innermost supernova ejecta to the production of the light elements 7Li and 11B with tracer particles based on a 2D supernova simulation of a 12 M⊙ progenitor and conclude that a contribution exists but is negligible for the total yield in this explosion model.

  18. Development of analysis technique to predict the material behavior of blowing agent

    NASA Astrophysics Data System (ADS)

    Hwang, Ji Hoon; Lee, Seonggi; Hwang, So Young; Kim, Naksoo

    2014-11-01

    In order to numerically simulate the foaming behavior of mastic sealer containing a blowing agent, foaming and driving-force models are needed that incorporate the foaming characteristics. An elastic stress model is also required to represent the material behavior of the co-existing liquid and cured-polymer phases. It is important to determine thermal properties such as thermal conductivity and specific heat, because foaming behavior is heavily influenced by temperature change. In this study, three models are proposed to explain the foaming process and the material behavior during and after the process. To obtain the material parameters in each model, the following experiments and corresponding numerical simulations were performed: a thermal test, a simple shear test and a foaming test. Error functions are defined as the differences between the experimental measurements and the numerical simulation results, and the parameters are then determined by minimizing these error functions. To ensure the validity of the obtained parameters, a confirmation simulation for each model is conducted with the determined parameters. Cross-verification is performed by measuring the foaming/shrinkage force, and its results tended to follow the experimental results. Notably, it was possible to estimate the micro-deformation occurring in the automobile roof surface by applying the proposed model to an oven process analysis. The developed analysis technique will contribute to designs with minimized micro-deformation.
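
    The identification strategy described here, defining error functions between measurements and simulation outputs and minimizing them over the model parameters, can be sketched generically. The toy "foaming" model, its parameters and the synthetic data below are assumptions for illustration, not the authors' models.

```python
# Inverse parameter identification by least squares: fit the parameters of
# a toy expansion-ratio model to (synthetic) experimental measurements.
import numpy as np
from scipy.optimize import least_squares

def simulate_foaming(params, t):
    """Toy stand-in for a foaming simulation: logistic expansion ratio."""
    k, r_max = params
    return r_max / (1.0 + (r_max - 1.0) * np.exp(-k * t))

t_exp = np.linspace(0.0, 10.0, 25)
r_exp = simulate_foaming([0.8, 4.0], t_exp)            # synthetic "experiment"
r_exp += np.random.default_rng(0).normal(0.0, 0.05, t_exp.size)

def error_function(params):
    """Residual vector: simulation minus measurement."""
    return simulate_foaming(params, t_exp) - r_exp

fit = least_squares(error_function, x0=[0.3, 2.0],
                    bounds=([0.0, 1.0], [10.0, 10.0]))
print("identified parameters (k, r_max):", fit.x)
```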

  19. Simulating the formation of cosmic structure.

    PubMed

    Frenk, C S

    2002-06-15

    A timely combination of new theoretical ideas and observational discoveries has brought about significant advances in our understanding of cosmic evolution. Computer simulations have played a key role in these developments by providing the means to interpret astronomical data in the context of physical and cosmological theory. In the current paradigm, our Universe has a flat geometry, is undergoing accelerated expansion and is gravitationally dominated by elementary particles that make up cold dark matter. Within this framework, it is possible to simulate in a computer the emergence of galaxies and other structures from small quantum fluctuations imprinted during an epoch of inflationary expansion shortly after the Big Bang. The simulations must take into account the evolution of the dark matter as well as the gaseous processes involved in the formation of stars and other visible components. Although many unresolved questions remain, a coherent picture for the formation of cosmic structure is now beginning to emerge.

  20. Preliminary Dynamic Feasibility and Analysis of a Spherical, Wind-Driven (Tumbleweed), Martian Rover

    NASA Technical Reports Server (NTRS)

    Flick, John J.; Toniolo, Matthew D.

    2005-01-01

    The process and findings are presented from a preliminary feasibility study examining the dynamic characteristics of a spherical wind-driven (or Tumbleweed) rover intended for exploration of the Martian surface. The results of an initial feasibility study involving several worst-case mobility situations that a Tumbleweed rover might encounter on the surface of Mars are discussed. Additional topics include the evaluation of several commercially available analysis software packages that were examined as possible platforms for the development of a Monte Carlo Tumbleweed mission simulation tool. This evaluation led to the development of the Mars Tumbleweed Monte Carlo Simulator (or Tumbleweed Simulator) using the Vortex physics software package from CM-Labs, Inc. Discussions regarding the development and evaluation of the Tumbleweed Simulator, as well as the results of a preliminary analysis using the tool, are also presented. Finally, a brief conclusions section is given.

  1. Earth System Modeling 2.0: A Blueprint for Models That Learn From Observations and Targeted High-Resolution Simulations

    NASA Astrophysics Data System (ADS)

    Schneider, Tapio; Lan, Shiwei; Stuart, Andrew; Teixeira, João.

    2017-12-01

    Climate projections continue to be marred by large uncertainties, which originate in processes that need to be parameterized, such as clouds, convection, and ecosystems. But rapid progress is now within reach. New computational tools and methods from data assimilation and machine learning make it possible to integrate global observations and local high-resolution simulations in an Earth system model (ESM) that systematically learns from both and quantifies uncertainties. Here we propose a blueprint for such an ESM. We outline how parameterization schemes can learn from global observations and targeted high-resolution simulations, for example, of clouds and convection, through matching low-order statistics between ESMs, observations, and high-resolution simulations. We illustrate learning algorithms for ESMs with a simple dynamical system that shares characteristics of the climate system; and we discuss the opportunities the proposed framework presents and the challenges that remain to realize it.
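
    The central idea of learning parameters from low-order statistics rather than from instantaneous states can be illustrated on a toy chaotic system of the kind the authors mention. The sketch below, an assumption-laden illustration rather than the proposed ESM framework, recovers the forcing parameter of a Lorenz-96 model by matching its long-run mean.

```python
# Match a long-run statistic of a Lorenz-96 model to a "target" produced
# by a hidden true forcing, mimicking parameter learning from statistics.
import numpy as np

def rhs(x, F):
    """Lorenz-96 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def long_run_mean(F, n=36, steps=20_000, dt=0.01, seed=0):
    """Time- and space-averaged state after discarding spin-up (RK4 steps)."""
    x = F * (1.0 + 0.01 * np.random.default_rng(seed).standard_normal(n))
    total, count = 0.0, 0
    for i in range(steps):
        k1 = rhs(x, F); k2 = rhs(x + 0.5 * dt * k1, F)
        k3 = rhs(x + 0.5 * dt * k2, F); k4 = rhs(x + dt * k3, F)
        x = x + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        if i >= steps // 2:
            total += x.mean(); count += 1
    return total / count

target = long_run_mean(F=8.0)                  # the "observed" statistic
candidates = np.arange(6.0, 10.5, 0.5)
best = min(candidates, key=lambda F: abs(long_run_mean(F) - target))
print(f"target mean = {target:.3f}, recovered forcing F = {best}")
```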

  2. Cloud-resolving model intercomparison of an MC3E squall line case: Part I-Convective updrafts: CRM Intercomparison of a Squall Line

    DOE PAGES

    Fan, Jiwen; Han, Bin; Varble, Adam; ...

    2017-09-06

    An intercomparison study of a midlatitude mesoscale squall line is performed using the Weather Research and Forecasting (WRF) model at 1 km horizontal grid spacing with eight different cloud microphysics schemes to investigate processes that contribute to the large variability in simulated cloud and precipitation properties. All simulations tend to produce a wider area of high radar reflectivity (Z_e > 45 dBZ) than observed but a much narrower stratiform area. Furthermore, the magnitude of the virtual potential temperature drop associated with the gust front passage is similar in simulations and observations, while the pressure rise and peak wind speed are smaller than observed, possibly suggesting that simulated cold pools are shallower than observed. Most of the microphysics schemes overestimate vertical velocity and Z_e in convective updrafts as compared with observational retrievals. Simulated precipitation rates and updraft velocities have significant variability across the eight schemes, even in this strongly dynamically driven system. Differences in simulated updraft velocity correlate well with differences in simulated buoyancy and low-level vertical perturbation pressure gradient, which appears related to cold pool intensity that is controlled by the evaporation rate. Simulations with stronger updrafts have a more optimal convective state, with stronger cold pools, ambient low-level vertical wind shear, and rear-inflow jets. We found that updraft velocity variability between schemes is mainly controlled by differences in simulated ice-related processes, which impact the overall latent heating rate, whereas surface rainfall variability increases in no-ice simulations mainly because of scheme differences in collision-coalescence parameterizations.

  3. Application of evolutionary games to modeling carcinogenesis.

    PubMed

    Swierniak, Andrzej; Krzeslak, Michal

    2013-06-01

    We review a quite large volume of literature concerning mathematical modelling of processes related to carcinogenesis and the growth of cancer cell populations based on the theory of evolutionary games. This review, although partly idiosyncratic, covers such major areas of cancer-related phenomena as production of cytotoxins, avoidance of apoptosis, production of growth factors, motility and invasion, and intra- and extracellular signaling. We discuss the results of other authors and append to them some additional results of our own simulations dealing with the possible dynamics and/or spatial distribution of the processes discussed.

  4. Design and optimization of a fiber optic data link for new generation on-board SAR processing architectures

    NASA Astrophysics Data System (ADS)

    Ciminelli, Caterina; Dell'Olio, Francesco; Armenise, Mario N.; Iacomacci, Francesco; Pasquali, Franca; Formaro, Roberto

    2017-11-01

    A fiber optic digital link for on-board data handling is modeled, designed and optimized in this paper. Design requirements and constraints relevant to the link, which is conceived within the framework of novel on-board processing architectures, are discussed. Two possible link configurations are investigated, showing their advantages and disadvantages. An accurate mathematical model of each link component and of the entire system is reported, and results of link simulations based on those models are presented. Finally, some details on the optimized design are provided.

  5. Towards Real Time Diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy Mcjunkin; Dennis C. Kunerth; Corrie Nichol

    2013-07-01

    Methods are currently being developed towards a more robust system for real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of the proposed real-time techniques on weld samples are presented, along with concepts for applying the techniques concurrently to the weld process. Consideration of the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  6. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.

    2014-02-18

    Methods are currently being developed towards a more robust system for real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of the proposed real-time techniques on weld samples are presented, along with concepts for applying the techniques concurrently to the weld process. Consideration of the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  7. Towards real time diagnostics of Hybrid Welding Laser/GMAW

    NASA Astrophysics Data System (ADS)

    McJunkin, T. R.; Kunerth, D. C.; Nichol, C. I.; Todorov, E.; Levesque, S.

    2014-02-01

    Methods are currently being developed towards a more robust system for real-time feedback in the high-throughput process combining laser welding with gas metal arc welding. A combination of ultrasonic, eddy current, electronic monitoring, and visual techniques is being applied to the welding process. Initial simulation and bench-top evaluation of the proposed real-time techniques on weld samples are presented, along with concepts for applying the techniques concurrently to the weld process. Consideration of the eventual code acceptance of the methods and system is also being researched as a component of this project. The goal is to detect defects or precursors to defects and correct them when possible during the weld process.

  8. Simulations for designing and interpreting intervention trials in infectious diseases.

    PubMed

    Halloran, M Elizabeth; Auranen, Kari; Baird, Sarah; Basta, Nicole E; Bellan, Steven E; Brookmeyer, Ron; Cooper, Ben S; DeGruttola, Victor; Hughes, James P; Lessler, Justin; Lofgren, Eric T; Longini, Ira M; Onnela, Jukka-Pekka; Özler, Berk; Seage, George R; Smith, Thomas A; Vespignani, Alessandro; Vynnycky, Emilia; Lipsitch, Marc

    2017-12-29

    Interventions in infectious diseases can have both direct effects on the individuals who receive the intervention and indirect effects in the population. In addition, intervention combinations can have complex interactions at the population level, which are often difficult to assess adequately with standard study designs and analytical methods. Herein, we urge the adoption of a new paradigm for the design and interpretation of intervention trials in infectious diseases, particularly with regard to emerging infectious diseases, one that more accurately reflects the dynamics of the transmission process. In an increasingly complex world, simulations can explicitly represent transmission dynamics, which are critical for proper trial design and interpretation. Certain ethical aspects of a trial can also be quantified using simulations. Further, after a trial has been conducted, simulations can be used to explore the possible explanations for the observed effects. Much is to be gained through a multidisciplinary approach that builds collaborations among experts in infectious disease dynamics, epidemiology, statistical science, economics, simulation methods, and the conduct of clinical trials.

  9. High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6

    DOE PAGES

    Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; ...

    2016-11-22

    Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, and the connection with the other CMIP6-endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the broad CMIP6 questions, "what are the origins and consequences of systematic model biases?"; we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.

  10. Simulation of Oil Palm Shell Pyrolysis to Produce Bio-Oil with Self-Pyrolysis Reactor

    NASA Astrophysics Data System (ADS)

    Fika, R.; Nelwan, L. O.; Yulianto, M.

    2018-05-01

    A new self-pyrolysis reactor was designed to reduce the use of an electric heater, saving energy in the production of bio-oil from oil palm shell. The yield of the bio-oil was then evaluated with the mathematical model developed by Sharma [1], using the characteristics of oil palm shell [2]. During the simulation, the temperature in the combustion chamber at the release of the bio-oil was used to determine the volatile composition from the combustion of the oil palm shell as fuel. The mass flow was assumed constant across the three experiments. The model showed a significant difference between the simulated and experimental bio-oil yields: the bio-oil yields from the simulation were 22.01, 16.36, and 21.89% (d.b.), while the experimental yields were 10.23, 9.82, and 8.41% (d.b.). The char yield varied from 30.7% (d.b.) in the simulation to 40.9% (d.b.) in the experiment. This discrepancy arose because the development of the process temperature over time was not considered an influential factor in producing volatile matter in the simulation model, whereas the real experiments depended strongly on the process conditions (reactor type, temperature over time, gas flow). Gasification may also have occurred inside the reactor, which would explain why the liquid yield was not as high as simulated. Further simulation research will be needed to predict the optimum conditions and the temperature development in the new self-pyrolysis reactor.

  11. A Crack Growth Evaluation Method for Interacting Multiple Cracks

    NASA Astrophysics Data System (ADS)

    Kamaya, Masayuki

    When stress corrosion cracking or corrosion fatigue occurs, multiple cracks are frequently initiated in the same area. According to Section XI of the ASME Boiler and Pressure Vessel Code, multiple cracks are considered as a single combined crack in crack growth analysis if the specified conditions are satisfied. During crack growth, however, this code gives no prescription for the interference between multiple cracks. The JSME Post-Construction Code, issued in May 2000, prescribes the conditions of crack coalescence during crack growth. This study aimed to extend this prescription to more general cases. A simulation model was applied to simulate the crack growth process, taking into account the interference between two cracks. This model made it possible to analyze multiple crack growth behaviors for many cases (e.g., different relative positions and lengths) that could not be studied by experiment alone. Based on these analyses, a new crack growth analysis method is suggested that takes into account the interference between multiple cracks.
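
    A proximity-based coalescence rule of the general type discussed above can be sketched as follows; the threshold factor and the combination rule are illustrative assumptions, not the exact ASME XI or JSME prescriptions.

```python
# Treat two coplanar cracks as one combined crack once the ligament between
# them shrinks below a fraction of the smaller crack's length.
def effective_half_length(c1, c2, gap, factor=0.5):
    """
    c1, c2 : half-lengths of the two cracks
    gap    : ligament distance between the inner crack tips
    factor : proximity threshold (assumed value, code-dependent)
    Returns the half-length to use in the growth analysis.
    """
    if gap <= factor * 2.0 * min(c1, c2):    # proximity condition met
        return c1 + c2 + gap / 2.0           # single combined crack
    return max(c1, c2)                       # cracks grow independently

print(effective_half_length(5.0, 3.0, 1.0))    # combined  -> 8.5
print(effective_half_length(5.0, 3.0, 10.0))   # separate  -> 5.0
```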

  12. A model for simulating the grinding and classification cyclic system of waste PCBs recycling production line.

    PubMed

    Yang, Deming; Xu, Zhenming

    2011-09-15

    Crushing and separating technology is widely used in the recycling of waste printed circuit boards (PCBs). An automatic line with no negative environmental impact for recycling waste PCBs has been applied at industrial scale. The grinding and classification cyclic system for crushed waste PCB particles is the most important part of this automatic production line, and it determines the efficiency of the whole line. In this paper, a model for computing the process of the system was established, and a matrix analysis method was adopted. The results showed good agreement between the simulation model and the actual production line, and that the system is robust against disturbances. This model can provide a basis for the automatic process control of waste PCB production lines. With this model, many engineering problems can be reduced, such as insufficient dissociation of metals and nonmetals, over-pulverizing of particles, incomplete comminution, material plugging and equipment overheating. Copyright © 2011 Elsevier B.V. All rights reserved.
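
    The matrix description of such a closed grinding/classification circuit can be sketched as a small population balance; the breakage and classification matrices below are illustrative assumptions, not the paper's calibrated values.

```python
# Closed grinding/classification circuit as a matrix population balance:
# mill product = B @ mill feed; the classifier recycles coarse material.
import numpy as np

feed = np.array([0.7, 0.2, 0.1, 0.0])      # mass fractions, coarse -> fine

# Breakage matrix B: column j is the size distribution produced by
# grinding class j (lower-triangular, columns sum to one).
B = np.array([[0.3, 0.0, 0.0, 0.0],
              [0.4, 0.4, 0.0, 0.0],
              [0.2, 0.4, 0.6, 0.0],
              [0.1, 0.2, 0.4, 1.0]])

# Classifier: fraction of each size class returned to the mill.
recycle = np.diag([0.9, 0.6, 0.2, 0.0])

mill_feed = feed.copy()
for _ in range(200):                        # iterate circuit to steady state
    ground = B @ mill_feed                  # mill product
    back = recycle @ ground                 # recycled coarse stream
    product = ground - back                 # stream leaving the circuit
    mill_feed = feed + back                 # fresh feed plus recycle

print("steady-state product distribution:",
      np.round(product / product.sum(), 3))
```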

  13. OpenWorm: an open-science approach to modeling Caenorhabditis elegans.

    PubMed

    Szigeti, Balázs; Gleeson, Padraig; Vella, Michael; Khayrulin, Sergey; Palyanov, Andrey; Hokanson, Jim; Currie, Michael; Cantarelli, Matteo; Idili, Giovanni; Larson, Stephen

    2014-01-01

    OpenWorm is an international collaboration with the aim of understanding how the behavior of Caenorhabditis elegans (C. elegans) emerges from its underlying physiological processes. The project has developed a modular simulation engine to create computational models of the worm. The modularity of the engine makes it possible to easily modify the model, incorporate new experimental data and test hypotheses. The modeling framework incorporates both biophysical neuronal simulations and a novel fluid-dynamics-based soft-tissue simulation for physical environment-body interactions. The project's open-science approach is aimed at overcoming the difficulties of integrative modeling within a traditional academic environment. In this article the rationale is presented for creating the OpenWorm collaboration, the tools and resources developed thus far are outlined and the unique challenges associated with the project are discussed.

  14. Optical CAD Utilization for the Design and Testing of a LED Streetlamp.

    PubMed

    Jafrancesco, David; Mercatelli, Luca; Fontani, Daniela; Sansoni, Paola

    2017-08-24

    The design and testing of LED lamps are vital steps toward broader use of LED lighting for outdoor illumination and traffic signalling. The characteristics of LED sources, in combination with the need to limit light pollution and power consumption, require a precise optical design. In particular, in every step of the process it is important to closely compare theoretical or simulated results with measured data (obtained from a prototype). This work examines the various possibilities for using an optical CAD program (Lambda Research TracePro) to design and check an LED lamp for outdoor use. This analysis includes the simulations and testing on a prototype as an example; data acquired by measurement are inserted into the same simulation software, making it easy to compare theoretical and actual results.

  15. Enhanced centrifuge-based approach to powder characterization

    NASA Astrophysics Data System (ADS)

    Thomas, Myles Calvin

    Many types of manufacturing processes involve powders and are affected by powder behavior. It is highly desirable to implement tools that allow the behavior of bulk powder to be predicted based on the behavior of only small quantities of powder. Such descriptions can enable engineers to significantly improve the performance of powder processing and formulation steps. In this work, an enhancement of the centrifuge technique is proposed as a means of powder characterization. This enhanced method uses specially designed substrates with hemispherical indentations within the centrifuge. The method was tested using simulations of the momentum balance at the substrate surface. Initial simulations were performed with an ideal powder containing smooth, spherical particles distributed on substrates designed with indentations. The van der Waals adhesion between the powder, whose size distribution was based on an experimentally determined distribution from a commercial silica powder, and the indentations was calculated and compared to the removal force created in the centrifuge. This provided a way to relate the powder size distribution to the rotational speed required for particle removal for various indentation sizes. Due to the distinct form of the data from these simulations, the cumulative size distribution of the powder and the Hamaker constant for the system could be extracted. After establishing adhesion force characterization for an ideal powder, the same proof-of-concept procedure was followed for a more realistic system with a simulated rough powder modeled as spheres with sinusoidal protrusions and intrusions around the surface. From these simulations, it was discovered that an equivalent powder of smooth spherical particles could be used to describe the adhesion behavior of the rough spherical powder by establishing a size-dependent 'effective' Hamaker constant distribution. This development made it possible to describe the surface roughness effects of the entire powder through one adjustable parameter that was linked to the size distribution. It is important to note that when the engineered substrates (hemispherical indentations) were applied, it was possible to extract both powder size distribution and effective Hamaker constant information from the simulated centrifuge adhesion experiments. Experimental validation of the simulated technique was performed with a silica powder dispersed onto a stainless steel substrate with no engineered surface features. Though the proof-of-concept work was accomplished for indented substrates, non-ideal, relatively flat (non-indented) substrates were used experimentally to demonstrate that the technique can be extended to this case. The experimental data were then used within the newly developed simulation procedure to show its application to real systems. In the absence of engineered features on the substrates, it was necessary to specify the size distribution of the powder as an input to the simulator. With this information, it was possible to extract an effective Hamaker constant distribution, and when this distribution was applied in conjunction with the size distribution, the observed adhesion force distribution was described precisely. An equation relating the normalized effective Hamaker constants (normalized by the particle diameter) to the particle diameter was formulated from the effective Hamaker constant distribution. It was shown, by application of this equation, that the adhesion behavior of an ideal (smooth, spherical) powder with an experimentally validated effective Hamaker constant distribution could effectively represent that of a realistic powder. Thus, the roughness effects and size variations of a real powder are captured in this one distributed parameter (the effective Hamaker constant distribution), which provides a substantial improvement to the existing technique. This can lead to better optimization of powder processing by enhancing powder behavior models.
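
    The force balance underlying the centrifuge technique can be sketched directly: a smooth sphere detaches from a flat substrate once the centrifugal force exceeds the sphere-plane van der Waals adhesion. The Hamaker constant, separation distance, density and rotor radius below are assumed values for illustration only.

```python
# Critical rotational speed for particle detachment:
#   F_vdw = A*d / (12*z0^2)   (sphere-plane van der Waals adhesion)
#   F_c   = m * omega^2 * r   (centrifugal removal force)
import numpy as np

A = 6.5e-20        # Hamaker constant, silica/steel across air [J] (assumed)
z0 = 4.0e-10       # minimum separation distance [m] (common assumption)
rho = 2200.0       # particle density [kg/m^3]
r_rotor = 0.05     # radial position of the substrate [m]

def critical_speed_rpm(d):
    """Speed at which a smooth sphere of diameter d [m] detaches."""
    f_adh = A * d / (12.0 * z0**2)
    mass = rho * np.pi * d**3 / 6.0
    omega = np.sqrt(f_adh / (mass * r_rotor))
    return omega * 60.0 / (2.0 * np.pi)

for d in (1e-6, 5e-6, 20e-6):
    print(f"d = {d*1e6:5.1f} um  ->  {critical_speed_rpm(d):10.0f} rpm")
```

    Because the adhesion force scales with d while the particle mass scales with d^3, the required speed falls steeply with particle size, which is why the centrifuge method resolves a size distribution into a spectrum of detachment speeds.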

  16. An Investigation of the Reverse Water Gas Shift Process and Operating Alternatives

    NASA Technical Reports Server (NTRS)

    Whitlow, Jonathan E.

    2002-01-01

    The Reverse Water Gas Shift (RWGS) process can produce water and ultimately oxygen through electrolysis. This technology is being investigated for possible use in the exploration of Mars as well as a potential process to aid in the regeneration of oxygen from carbon dioxide. The initial part of this report summarizes the results obtained from operation of the RWGS process at Kennedy Space Center during May and June of this year. It has been demonstrated that close to complete conversion can be achieved with the RWGS process under certain operating conditions. The report also presents results obtained through simulation for an alternative staged configuration for RWGS which eliminates the recycle compressor. This configuration looks promising and hence seems worthy of experimental investigation.
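
    The single-pass equilibrium limit of the RWGS reaction, CO2 + H2 ⇌ CO + H2O, can be sketched with an empirical water-gas-shift equilibrium correlation; the Moe-type fit and the equimolar-feed balance below are assumptions for illustration, not the report's flowsheet calculations.

```python
# Equilibrium CO2 conversion for the reverse water gas shift reaction.
import numpy as np

def K_rwgs(T):
    """RWGS equilibrium constant, taken as the reciprocal of an assumed
    Moe-type water-gas-shift fit, K_wgs ~ exp(4577.8/T - 4.33)."""
    return 1.0 / np.exp(4577.8 / T - 4.33)

def equilibrium_conversion(T):
    """Equimolar CO2/H2 feed: K = x^2/(1-x)^2 -> x = sqrt(K)/(1+sqrt(K))."""
    root_k = np.sqrt(K_rwgs(T))
    return root_k / (1.0 + root_k)

for T in (600.0, 800.0, 1000.0):   # temperature in kelvin
    print(f"T = {T:6.1f} K  ->  x_eq = {equilibrium_conversion(T):.2f}")
```

    Since this single-pass equilibrium conversion stays well below unity even at high temperature, recycle or staged configurations, such as the staged arrangement simulated in the report, are what push the overall conversion toward completion.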

  17. Modern trends in industrial technology of production of optical polymeric components for night vision devices

    NASA Astrophysics Data System (ADS)

    Goev, A. I.; Knyazeva, N. A.; Potelov, V. V.; Senik, B. N.

    2005-06-01

    The present paper presents in detail a comprehensive approach to creating an industrial technology for the production of polymeric optical components: information is given on optical polymeric materials, automatic injection-moulding machines, and the capabilities of the Moldflow system (the AB "Universal" company) used for mathematical simulation of the injection-moulding process and for making the moulds.

  18. Simulation and Calculation of the APEX Attitude

    DTIC Science & Technology

    1992-07-29

    attitude computation. As a by-product, several interesting features that may be present in the APEX attitude behavior are noted. The APEX satellite... DEFINITION OF THE ATTITUDE: Generally speaking, it is possible to define the spacecraft attitude in several ways, so long as the process of computation and... actual APEX attitude behavior. However, it is not the purpose of this work to assess the probable degree of attitude

  19. 3D FE simulation of semi-finishing machining of Ti6Al4V additively manufactured by direct metal laser sintering

    NASA Astrophysics Data System (ADS)

    Imbrogno, Stano; Rinaldi, Sergio; Raso, Antonio; Bordin, Alberto; Bruschi, Stefania; Umbrello, Domenico

    2018-05-01

    Additive manufacturing techniques are gaining more and more interest in various industrial fields due to the possibility of drastically reducing material waste during production, revolutionizing the standard schemes and strategies of manufacturing processes. However, the shapes of the metal parts produced frequently do not satisfy the tolerance and surface quality requirements. During the design phase, finite element simulation is a fundamental tool that helps engineers choose the most suitable process parameters in order to produce high-quality products. The aim of this work is to develop a 3D finite element model of a semi-finishing turning operation on Ti6Al4V produced via Direct Metal Laser Sintering (DMLS). A customized user subroutine was built to model the mechanical behavior of the material under machining and to predict the fundamental variables such as cutting forces and temperature. Moreover, machining-induced alterations are also studied with the developed finite element model.

  20. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    NASA Astrophysics Data System (ADS)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

    Improving the resolution of tomographic images is crucial to answer important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, where seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are two main choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards while OpenCL was adopted mainly by AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and to generate optimized source code for both CUDA and OpenCL, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.

  1. A dynamic population model to investigate effects of climate and climate-independent factors on the lifecycle of the tick Amblyomma americanum (Acari: Ixodidae)

    USGS Publications Warehouse

    Ludwig, Antoinette; Ginsberg, Howard; Hickling, Graham J.; Ogden, Nicholas H.

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause as engorged larvae; 2) molted adults undergo behavioral diapause during the nymph-to-adult transition; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that the current climate in parts of southern Canada is suitable for survival and reproduction of this tick.
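
    The interplay between temperature-driven development and day-length-gated diapause that the model explores can be sketched with a toy transition rule; the rates, thresholds and climate forcing below are illustrative assumptions, not the paper's fitted parameters.

```python
# Accumulate temperature-dependent development, but suspend it
# (behavioural diapause) on short, shortening autumn days.
import math

def day_length_hours(doy, lat_deg=40.0):
    """Approximate astronomical day length [h] for a given day of year."""
    decl = math.radians(23.44) * math.sin(2.0 * math.pi * (doy - 81) / 365.0)
    cos_h = -math.tan(math.radians(lat_deg)) * math.tan(decl)
    return 24.0 * math.acos(min(1.0, max(-1.0, cos_h))) / math.pi

def daily_temp(doy):
    """Toy sinusoidal annual temperature cycle [deg C]."""
    return 14.0 + 12.0 * math.sin(2.0 * math.pi * (doy - 105) / 365.0)

def days_to_next_stage(start_doy, threshold_h=13.0):
    """Days until a lifecycle transition completes, or None if it never does."""
    development = 0.0
    for i in range(730):                                   # up to two years
        doy = (start_doy + i) % 365
        short = day_length_hours(doy) < threshold_h
        shortening = day_length_hours(doy) < day_length_hours((doy - 1) % 365)
        if short and shortening:
            continue                                       # diapause
        development += max(0.0, 0.004 * (daily_temp(doy) - 10.0))
        if development >= 1.0:
            return i
    return None

for start in (120, 240):                                   # spring vs autumn
    print(f"start DOY {start}: transition after "
          f"{days_to_next_stage(start)} days")
```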

  2. A Dynamic Population Model to Investigate Effects of Climate and Climate-Independent Factors on the Lifecycle of Amblyomma americanum (Acari: Ixodidae).

    PubMed

    Ludwig, Antoinette; Ginsberg, Howard S; Hickling, Graham J; Ogden, Nicholas H

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause as engorged larvae; 2) molted adults undergo behavioral diapause during the nymph-to-adult transition; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that the current climate in parts of southern Canada is suitable for survival and reproduction of this tick. © Crown copyright 2015.

  3. Analysis of l-glutamic acid fermentation by using a dynamic metabolic simulation model of Escherichia coli

    PubMed Central

    2013-01-01

    Background Understanding the process of amino acid fermentation as a comprehensive system is a challenging task. Previously, we developed a literature-based dynamic simulation model, which included transcriptional regulation, transcription, translation, and enzymatic reactions related to glycolysis, the pentose phosphate pathway, the tricarboxylic acid (TCA) cycle, and the anaplerotic pathway of Escherichia coli. During simulation, cell growth was defined so as to reproduce the experimental cell growth profile of fed-batch cultivation in jar fermenters. However, to confirm the biological appropriateness of our model, sensitivity analysis and experimental validation were required. Results We constructed an l-glutamic acid fermentation simulation model by removing sucAB, a gene encoding α-ketoglutarate dehydrogenase. We then performed systematic sensitivity analysis for l-glutamic acid production; the results of this process corresponded with previous experimental data regarding l-glutamic acid fermentation. Furthermore, it allowed us to predict the possibility that accumulation of 3-phosphoglycerate in the cell would regulate the carbon flux into the TCA cycle and lead to an increase in the yield of l-glutamic acid via fermentation. We validated this hypothesis through a fermentation experiment involving a model l-glutamic acid-production strain, E. coli MG1655 ΔsucA, in which the phosphoglycerate kinase gene had been amplified to cause accumulation of 3-phosphoglycerate. The observed increase in l-glutamic acid production verified the biologically meaningful predictive power of our dynamic metabolic simulation model. Conclusions In this study, dynamic simulation using a literature-based model was shown to be useful for elucidating the precise mechanisms involved in fermentation processes inside the cell. Further exhaustive sensitivity analysis will facilitate identification of novel factors involved in the metabolic regulation of amino acid fermentation. PMID:24053676
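
    The systematic sensitivity analysis described above can be sketched generically: perturb one parameter at a time and record the normalized change in the simulated yield. The two-variable "fermentation" ODE and its parameters below are assumptions for illustration, not the authors' E. coli model.

```python
# One-at-a-time sensitivity of a toy fed-batch yield to model parameters.
import numpy as np
from scipy.integrate import solve_ivp

def fermentation(t, y, mu, q_p):
    """Toy model: logistic biomass growth X and growth-coupled product P."""
    X, P = y
    return [mu * X * (1.0 - X), q_p * X]

base = {"mu": 0.5, "q_p": 0.3}

def yield_at_24h(params):
    sol = solve_ivp(fermentation, (0.0, 24.0), [0.01, 0.0],
                    args=(params["mu"], params["q_p"]), rtol=1e-8)
    return sol.y[1, -1]                      # product titre at 24 h

y0 = yield_at_24h(base)
for name in base:
    bumped = dict(base, **{name: base[name] * 1.01})   # +1% perturbation
    s = (yield_at_24h(bumped) - y0) / y0 / 0.01        # logarithmic sensitivity
    print(f"d(ln yield)/d(ln {name}) = {s:+.2f}")
```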

  4. Analysis of L-glutamic acid fermentation by using a dynamic metabolic simulation model of Escherichia coli.

    PubMed

    Nishio, Yousuke; Ogishima, Soichi; Ichikawa, Masao; Yamada, Yohei; Usuda, Yoshihiro; Masuda, Tadashi; Tanaka, Hiroshi

    2013-09-22

    Understanding the process of amino acid fermentation as a comprehensive system is a challenging task. Previously, we developed a literature-based dynamic simulation model, which included transcriptional regulation, transcription, translation, and enzymatic reactions related to glycolysis, the pentose phosphate pathway, the tricarboxylic acid (TCA) cycle, and the anaplerotic pathway of Escherichia coli. During simulation, cell growth was defined so as to reproduce the experimental cell growth profile of fed-batch cultivation in jar fermenters. However, to confirm the biological appropriateness of our model, sensitivity analysis and experimental validation were required. We constructed an L-glutamic acid fermentation simulation model by removing sucAB, a gene encoding α-ketoglutarate dehydrogenase. We then performed systematic sensitivity analysis for L-glutamic acid production; the results of this process corresponded with previous experimental data regarding L-glutamic acid fermentation. Furthermore, it allowed us to predict the possibility that accumulation of 3-phosphoglycerate in the cell would regulate the carbon flux into the TCA cycle and lead to an increase in the yield of L-glutamic acid via fermentation. We validated this hypothesis through a fermentation experiment involving a model L-glutamic acid-production strain, E. coli MG1655 ΔsucA, in which the phosphoglycerate kinase gene had been amplified to cause accumulation of 3-phosphoglycerate. The observed increase in L-glutamic acid production verified the biologically meaningful predictive power of our dynamic metabolic simulation model. In this study, dynamic simulation using a literature-based model was shown to be useful for elucidating the precise mechanisms involved in fermentation processes inside the cell. Further exhaustive sensitivity analysis will facilitate identification of novel factors involved in the metabolic regulation of amino acid fermentation.

  5. Principles of VCSEL designing

    NASA Astrophysics Data System (ADS)

    Nakwaski, W.

    2008-03-01

    Comprehensive computer simulations are currently the most efficient and least expensive methods of designing and optimising semiconductor device structures. Ideally they should be as exact as possible, but in practice it is well known that the most exact approaches are also the most involved and most time-consuming, and need powerful computers. In some cases, cheaper, somewhat simplified modelling is sufficiently accurate. Therefore, an appropriate modelling approach should be chosen as a compromise between our needs and our possibilities. Modelling the operation and designing the structures of vertical-cavity surface-emitting lasers (VCSELs) requires an appropriate mathematical description of the physical processes crucial for device operation, i.e., the various optical, electrical, thermal, recombination and sometimes also mechanical phenomena taking place within their volumes. Equally important are the mutual interactions between these individual processes, usually strongly non-linear and creating a real network of inter-relations. A chain is as strong as its weakest link; analogously, a model is as exact as its least exact part. It is therefore useless to improve the exactness of the more accurate parts of a model while neglecting the less exact ones: all model parts should exhibit similar accuracy. In any individual case, a reasonable compromise should be reached between high modelling fidelity and practical convenience, depending on the main modelling goal, the importance and urgency of the expected results, the available equipment and financial constraints. In the present paper, some simplifications used in VCSEL modelling are discussed and their impact on the exactness of VCSEL designing is analysed.

  6. State-transfer simulation in integrated waveguide circuits

    NASA Astrophysics Data System (ADS)

    Latmiral, L.; Di Franco, C.; Mennea, P. L.; Kim, M. S.

    2015-08-01

    Spin-chain models have been widely studied in terms of quantum information processes, for instance for the faithful transmission of quantum states. Here, we investigate the limitations of mapping this process to an equivalent one through a bosonic chain. In particular, we keep in mind experimental implementations, which the progress in integrated waveguide circuits could make possible in the very near future. We consider the feasibility of exploiting the higher dimensionality of the Hilbert space of the chain elements for the transmission of a larger amount of information, and the effects of unwanted excitations during the process. Finally, we exploit the information-flux method to provide bounds to the transfer fidelity.

  7. Three-dimensional numerical simulation of a continuously rotating detonation in the annular combustion chamber with a wide gap and separate delivery of fuel and oxidizer

    NASA Astrophysics Data System (ADS)

    Frolov, S. M.; Dubrovskii, A. V.; Ivanov, V. S.

    2016-07-01

    The possibility of integrating the Continuous Detonation Chamber (CDC) in a gas turbine engine (GTE) is demonstrated by means of three-dimensional (3D) numerical simulations, i.e., the feasibility of the operation process in an annular combustion chamber with a wide gap and with separate feeding of fuel (hydrogen) and oxidizer (air) is proved computationally. The CDC with an upstream isolator damping pressure disturbances propagating towards the compressor is shown to exhibit a gain in total pressure of 15% as compared with the same combustion chamber operating in the deflagration mode.

  8. The formation of fragments at corotation in isothermal protoplanetary disks

    NASA Astrophysics Data System (ADS)

    Durisen, Richard H.; Hartquist, Thomas W.; Pickett, Megan K.

    2008-09-01

    Numerical hydrodynamics simulations have established that disks which are evolved under the condition of local isothermality will fragment into small dense clumps due to gravitational instabilities when the Toomre stability parameter Q is sufficiently low. Because fragmentation through disk instability has been suggested as a gas giant planet formation mechanism, it is important to understand the physics underlying this process as thoroughly as possible. In this paper, we offer analytic arguments for why, at low Q, fragments are most likely to form first at the corotation radii of growing spiral modes, and we support these arguments with results from 3D hydrodynamics simulations.
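
    The stability parameter referred to above is the standard Toomre criterion for a thin gas disk; in the paper's terms, fragmentation at corotation becomes possible where Q is driven sufficiently low.

```latex
% Toomre stability parameter for a thin gas disk:
%   c_s   : sound speed of the gas
%   kappa : epicyclic frequency
%   Sigma : disk surface density
\[
  Q \;=\; \frac{c_s\,\kappa}{\pi G \Sigma},
  \qquad
  Q \lesssim 1 \;\Longrightarrow\;
  \text{gravitational instability (possible fragmentation)}.
\]
```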

  9. Simulation of the Press Hardening Process and Prediction of the Final Mechanical Material Properties

    NASA Astrophysics Data System (ADS)

    Hochholdinger, Bernd; Hora, Pavel; Grass, Hannes; Lipp, Arnulf

    2011-08-01

    Press hardening is a well-established production process in the automotive industry today. The current trend in this process technology points towards the manufacturing of parts with tailored properties. Since knowledge of the mechanical properties of a structural part after forming and quenching is essential for evaluating, for example, its crash performance, a virtual assessment of the production process that is as accurate as possible is more necessary than ever. To achieve this, reliable input parameters and boundary conditions must be defined for the thermo-mechanically coupled simulation of the process steps. One of the most important input parameters, especially regarding the final properties of the quenched material, is the contact heat transfer coefficient (CHTC). The CHTC depends on the effective pressure or the gap distance between part and tool, and is determined at different contact pressures and gap distances through inverse parameter identification. Furthermore, a simulation strategy for the subsequent steps of the press hardening process, as well as adequate modeling approaches for part and tools, are discussed. For the prediction of the yield curves of the material after press hardening, a phenomenological model is presented. This model requires knowledge of the microstructure within the part. By post-processing the nodal temperature history with a CCT diagram, the quantitative distribution of the phase fractions martensite, bainite, ferrite and pearlite after press hardening is determined. The model itself is based on a Hockett-Sherby approach, with the Hockett-Sherby parameters defined as functions of the phase fractions and a characteristic cooling rate.
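
    The phenomenological flow-curve model described above can be sketched as a Hockett-Sherby curve whose parameters are interpolated from the local phase fractions. The per-phase parameter values and the linear mixture rule below are assumptions for illustration, not the authors' calibration, which also involves a characteristic cooling rate.

```python
# Hockett-Sherby flow curve with phase-fraction-weighted parameters:
#   sigma(eps) = s_sat - (s_sat - s_i) * exp(-N * eps**p)
import numpy as np

# Assumed (s_i [MPa], s_sat [MPa], N, p) per transformation product.
PHASES = {
    "martensite": (1000.0, 1600.0,  8.0, 0.6),
    "bainite":    ( 600.0,  950.0, 10.0, 0.6),
    "ferrite":    ( 250.0,  480.0, 12.0, 0.7),
}

def flow_stress(eps, fractions):
    """Flow stress [MPa] at plastic strain eps for given phase fractions."""
    weighted = sum(f * np.array(PHASES[ph]) for ph, f in fractions.items())
    s_i, s_sat, N, p = weighted / sum(fractions.values())
    return s_sat - (s_sat - s_i) * np.exp(-N * eps**p)

mix = {"martensite": 0.70, "bainite": 0.25, "ferrite": 0.05}
for eps in (0.0, 0.05, 0.10):
    print(f"eps = {eps:4.2f}  ->  sigma = {flow_stress(eps, mix):7.1f} MPa")
```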

  10. Large Scale Geologic Controls on Hydraulic Stimulation

    NASA Astrophysics Data System (ADS)

    McLennan, J. D.; Bhide, R.

    2014-12-01

    When simulating hydraulic fracturing, the analyst has historically prescribed a single planar fracture. Originally (in the 1950s through the 1970s) this was necessitated by computational restrictions. In the latter part of the twentieth century, hydraulic fracture simulation evolved to incorporate vertical propagation controlled by modulus, fluid loss, and the minimum principal stress. With improvements in software, computational capacity, and the recognition that in-situ discontinuities are relevant, fully three-dimensional hydraulic simulation is now becoming possible. Advances in simulation capabilities enable coupling structural geologic data (three-dimensional representations of stresses, natural fractures, and stratigraphy) with decision-making processes for stimulation - volumes, rates, fluid types, completion zones. Without this interaction between simulation capabilities and geological information, low-permeability formation exploitation may linger on the fringes of real economic viability. Comparative simulations have been undertaken in varying structural environments where the stress contrast and the frequency of natural discontinuities cause varying patterns of multiple, hydraulically generated or reactivated flow paths. Stress conditions and the nature of the discontinuities are selected as variables and are used to simulate how fracturing can vary in different structural regimes. The basis of the simulations is commercial distinct element software (Itasca Corporation's 3DEC).

  11. Influence of the track quality and of the properties of the wheel-rail rolling contact on vehicle dynamics

    NASA Astrophysics Data System (ADS)

    Suarez, Berta; Felez, Jesus; Lozano, José Antonio; Rodriguez, Pablo

    2013-02-01

    This work describes an analytical approach to determine what degree of accuracy is required in the definition of the rail vehicle models used for dynamic simulations. This way it would be possible to know in advance how the results of simulations may be altered due to the existence of errors in the creation of rolling stock models, whilst also identifying their critical parameters. This would make it possible to maximise the time available to enhance dynamic analysis and focus efforts on factors that are strictly necessary. In particular, the parameters related both to the track quality and to the rolling contact were considered in this study. With this aim, a sensitivity analysis was performed to assess their influence on the vehicle dynamic behaviour. To do this, 72 dynamic simulations were performed modifying, one at a time, the track quality, the wheel-rail friction coefficient and the equivalent conicity of both new and worn wheels. Three values were assigned to each parameter, and two wear states were considered for each type of wheel, one for new wheels and another one for reprofiled wheels. After processing the results of these simulations, it was concluded that all the parameters considered show very high influence, though the friction coefficient shows the highest influence. Therefore, it is recommended to undertake any future simulation job with measured track geometry and track irregularities, measured wheel profiles and normative values of the wheel-rail friction coefficient.

  12. On the relevance of modeling viscoelastic bending behavior in finite element forming simulation of continuously fiber reinforced thermoplastics

    NASA Astrophysics Data System (ADS)

    Dörr, Dominik; Schirmaier, Fabian J.; Henning, Frank; Kärger, Luise

    2017-10-01

    Finite Element (FE) forming simulation offers the possibility of a detailed analysis of the deformation behavior of multilayered thermoplastic blanks during forming, considering material behavior and process conditions. Rate-dependent bending behavior is a material characteristic which is so far not considered in FE forming simulation of pre-impregnated, continuously fiber-reinforced polymers (CFRPs). Therefore, an approach for modeling viscoelastic bending behavior in FE composite forming simulation is presented in this work. The presented approach accounts for the distinctly rate-dependent bending behavior of, e.g., thermoplastic CFRPs at process conditions. The approach is based on a Voigt-Kelvin (VK) and a generalized Maxwell (GM) approach, implemented within an FE forming simulation framework in several user subroutines of the commercially available FE solver Abaqus. The VK, GM, and purely elastic bending modeling approaches are parameterized according to dynamic bending characterization results for a PA6-CF UD tape. It is found that only the GM approach is capable of representing the bending deformation characteristic for all of the considered bending deformation rates. The parameterized bending modeling approaches are applied to a hemisphere test and to a generic geometry. A comparison of the forming simulation results for the generic geometry with experimental tests shows good agreement between simulation and experiment. Furthermore, the simulation results reveal that correct modeling of the initial bending stiffness is especially relevant for the prediction of wrinkling behavior, as a similar onset of wrinkles is observed for the GM, the VK and an elastic approach fitted to the stiffness observed in the dynamic rheometer test at low curvatures. Hence, characterization and modeling of rate-dependent bending behavior is crucial for FE forming simulation of thermoplastic CFRPs.
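
    The rate dependence that distinguishes the generalized Maxwell approach from a single Voigt-Kelvin element can be sketched through its Prony-series relaxation modulus; the moduli and relaxation times below are illustrative assumptions, not the PA6-CF parameterization.

```python
# Prony-series relaxation modulus of a generalized Maxwell model:
#   E(t) = E_inf + sum_i E_i * exp(-t / tau_i)
import numpy as np

E_INF = 5.0                                        # long-term modulus [MPa]
PRONY = [(40.0, 0.01), (15.0, 0.1), (5.0, 1.0)]    # (E_i [MPa], tau_i [s])

def relaxation_modulus(t):
    return E_INF + sum(E * np.exp(-t / tau) for E, tau in PRONY)

# Shorter loading times (faster bending) see a stiffer response, which is
# the behaviour a single-relaxation-time element cannot reproduce
# across all rates.
for t in (1e-3, 1e-2, 1e-1, 1.0):
    print(f"t = {t:6.3f} s  ->  E(t) = {relaxation_modulus(t):6.2f} MPa")
```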

  13. Computational study of noise in a large signal transduction network.

    PubMed

    Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena

    2011-06-21

    Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
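
    The exact Gillespie algorithm used in the study can be sketched on a toy birth-death process, with the system volume entering through the propensities; the reaction system and rate constants below are assumptions for illustration, not the authors' signaling network.

```python
# Gillespie SSA for 0 -> X (rate k_prod*V) and X -> 0 (rate k_deg*N):
# draw the time to the next reaction, then pick which reaction fired.
import numpy as np

rng = np.random.default_rng(1)

def gillespie_birth_death(k_prod, k_deg, volume, t_end):
    t, n = 0.0, 0
    counts = [0]
    while t < t_end:
        a = np.array([k_prod * volume, k_deg * n])   # propensities
        a0 = a.sum()
        t += rng.exponential(1.0 / a0)               # exponential waiting time
        n += 1 if rng.random() < a[0] / a0 else -1   # birth or death
        counts.append(n)
    return np.array(counts)

# Relative fluctuations shrink as the system volume grows.
for V in (1.0, 10.0, 100.0):
    n = gillespie_birth_death(k_prod=10.0, k_deg=1.0, volume=V, t_end=50.0)
    tail = n[len(n) // 2:]                           # discard the transient
    print(f"V = {V:6.1f}: mean = {tail.mean():7.1f}, "
          f"CV = {tail.std() / tail.mean():.3f}")
```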

  14. Virtual reality case-specific rehearsal in temporal bone surgery: a preliminary evaluation.

    PubMed

    Arora, Asit; Swords, Chloe; Khemani, Sam; Awad, Zaid; Darzi, Ara; Singh, Arvind; Tolley, Neil

    2014-01-01

    1. To investigate the feasibility of performing case-specific surgical rehearsal using a virtual reality temporal bone simulator. 2. To identify potential clinical applications in temporal bone surgery. Prospective assessment study. St Mary's Hospital, Imperial College NHS Trust, London UK. Sixteen participants consisting of a trainer and trainee group. Twenty-four cadaver temporal bones were CT-scanned and uploaded onto the Voxelman simulator. Sixteen participants performed a 90-min temporal bone dissection on the generic simulation model followed by 3 dissection tasks on the case simulation and cadaver models. Case rehearsal was assessed for feasibility. Clinical applications and usefulness were evaluated using a 5-point Likert-type scale. The upload process required a semi-automated system. Average time for upload was 20 min. Suboptimal reconstruction occurred in 21% of cases arising when the mastoid process and ossicular chain were not captured (n = 2) or when artefact was generated (n = 3). Case rehearsal rated highly (Likert score >4) for confidence (75%), facilitating planning (75%) and training (94%). Potential clinical applications for case rehearsal include ossicular chain surgery, cochlear implantation and congenital anomalies. Case rehearsal of cholesteatoma surgery is not possible on the current platform due to suboptimal soft tissue representation. The process of uploading CT data onto a virtual reality temporal bone simulator to perform surgical rehearsal is feasible using a semi-automated system. Further clinical evaluation is warranted to assess the benefit of performing patient-specific surgical rehearsal in selected procedures. Copyright © 2013 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  15. Nonlinear relaxation algorithms for circuit simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saleh, R.A.

    Circuit simulation is an important Computer-Aided Design (CAD) tool in the design of Integrated Circuits (IC). However, the standard techniques used in programs such as SPICE result in very long computer run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits. The results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLaughlin, E.; Gupta, S.

    This project mainly involves a molecular dynamics and Monte Carlo study of the effect of molecular shape on thermophysical properties of bulk fluids with an emphasis on the aromatic hydrocarbon liquids. In this regard we have studied the modeling, simulation methodologies, and predictive and correlating methods for thermodynamic properties of fluids of nonspherical molecules. In connection with modeling we have studied the use of anisotropic site-site potentials, through a modification of the Gay-Berne Gaussian overlap potential, to successfully model the aromatic rings after adding the necessary electrostatic moments. We have also shown these interaction sites should be located at the geometric centers of the chemical groups. In connection with predictive methods, we have shown two perturbation type theories to work well for fluids modeled using one-center anisotropic potentials and the possibility exists for extending these to anisotropic site-site models. In connection with correlation methods, we have studied, through simulations, the effect of molecular shape on the attraction term in the generalized van der Waals equation of state for fluids of nonspherical molecules and proposed a possible form which is to be studied further. We have successfully studied the vector and parallel processing aspects of molecular simulations for fluids of nonspherical molecules.

  17. Numerical Simulation on a Possible Formation Mechanism of Interplanetary Magnetic Cloud Boundaries

    NASA Astrophysics Data System (ADS)

    Fan, Quan-Lin; Wei, Feng-Si; Feng, Xue-Shang

    2003-08-01

    The formation mechanism of the interplanetary magnetic cloud (MC) boundaries is numerically investigated by simulating the interactions between an MC of some initial momentum and a local interplanetary current sheet. The compressible 2.5D MHD equations are solved. Results show that the magnetic reconnection process is a possible formation mechanism when an MC interacts with a surrounding current sheet. A number of interesting features are found. For instance, the front boundary of the MCs is a magnetic reconnection boundary that could be caused by a driven reconnection ahead of the cloud, and the tail boundary might be caused by the driving of the entrained flow as a result of the Bernoulli principle. Analysis of the magnetic field and plasma data demonstrates that these two boundaries exhibit large values of the plasma parameter β, a clear increase of plasma temperature and density, a distinct decrease of magnetic field magnitude, and a transition of the magnetic field direction of about 180 degrees. The outcome of the present simulation agrees qualitatively with the observational results on MC boundaries inferred from IMP-8, etc. The project was supported by the National Natural Science Foundation of China under Grant Nos. 40104006, 49925412, and 49990450.

  18. Use of the AHP methodology in system dynamics: Modelling and simulation for health technology assessments to determine the correct prosthesis choice for hernia diseases.

    PubMed

    Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela

    2018-05-01

    Health technology assessments (HTAs) are often difficult to conduct because of the decisive procedures of the HTA algorithm, which are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. However, the multi-criteria analysis is implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision-makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely, the evaluation of two biological prostheses for incisional infected hernias, will be analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
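
    For readers unfamiliar with the AHP step of the method, the following Python sketch shows the standard eigenvector-based priority calculation from a pairwise comparison matrix; the 3×3 judgments are illustrative assumptions, not the paper's clinical data.

        import numpy as np

        # Illustrative pairwise comparison matrix on Saaty's 1-9 scale (assumed values).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                 # principal eigenvalue (Perron root)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                    # normalized priority vector

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
        cr = ci / ri                                # consistency ratio; < 0.1 is acceptable
        print(weights, cr)

    A simulation layer of the kind the paper describes can then take priority vectors like this one as inputs and explore how the ranking of the two prostheses responds to perturbed judgments before a decision is committed.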

  19. Remote control system for high-performance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems and other, often expensive, complex computer systems. Access to such resources is often limited, unstable and accompanied by various administrative problems. In addition, the variety of software and settings of different computing clusters sometimes does not allow researchers to use a unified program code; the code has to be adapted for each configuration of the computer complex. The practical experience of the authors has shown that the creation of a special control system for computing, with the possibility of remote use, can greatly simplify the implementation of simulations and increase the performance of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.

  20. Spatial structure of the arc in a pulsed GMAW process

    NASA Astrophysics Data System (ADS)

    Kozakov, R.; Gött, G.; Schöpp, H.; Uhrlandt, D.; Schnick, M.; Häßler, M.; Füssel, U.; Rose, S.

    2013-06-01

    A pulsed gas metal arc welding (GMAW) process of steel under argon shielding gas in the globular mode is investigated by measurements and simulation. The analysis is focussed on the spatial structure of the arc during the current pulse. To this end, the radial profiles of the temperature, the metal vapour species and the electric conductivity are determined at different heights above the workpiece by optical emission spectroscopy (OES). It is shown that in the presence of metal vapour the temperature minimum occurs at the centre of the arc. This minimum is preserved at different axial positions up to 1 mm above the workpiece. In addition, estimations of the electric field in the arc from the measurements are given. All these results are compared with magneto-hydrodynamic simulations which include the evaporation of the wire material and, in particular, the change of the plasma properties due to the metal vapour admixture. The experimental method and the simulation model are validated by means of the satisfactory correspondence between the results. Possible reasons for the remaining deviations, and improvements of the methods which should be pursued, are discussed.

  1. A Case Study Using Modeling and Simulation to Predict Logistics Supply Chain Issues

    NASA Technical Reports Server (NTRS)

    Tucker, David A.

    2007-01-01

    Optimization of critical supply chains to deliver thousands of parts, materials, sub-assemblies, and vehicle structures as needed is vital to the success of the Constellation Program. Thorough analysis needs to be performed on the integrated supply chain processes to plan, source, make, deliver, and return critical items efficiently. Process modeling provides simulation technology-based, predictive solutions for supply chain problems which enable decision makers to reduce costs, accelerate cycle time and improve business performance. For example, United Space Alliance, LLC utilized this approach in late 2006 to build simulation models that recreated shuttle orbiter thruster failures and predicted the potential impact of thruster removals on logistics spare assets. The main objective was the early identification of possible problems in providing thruster spares for the remainder of the Shuttle Flight Manifest. After extensive analysis the model results were used to quantify potential problems and led to improvement actions in the supply chain. Similarly the proper modeling and analysis of Constellation parts, materials, operations, and information flows will help ensure the efficiency of the critical logistics supply chains and the overall success of the program.

  2. Forward flux sampling calculation of homogeneous nucleation rates from aqueous NaCl solutions.

    PubMed

    Jiang, Hao; Haji-Akbari, Amir; Debenedetti, Pablo G; Panagiotopoulos, Athanassios Z

    2018-01-28

    We used molecular dynamics simulations and the path sampling technique known as forward flux sampling to study homogeneous nucleation of NaCl crystals from supersaturated aqueous solutions at 298 K and 1 bar. Nucleation rates were obtained for a range of salt concentrations for the Joung-Cheatham NaCl force field combined with the Extended Simple Point Charge (SPC/E) water model. The calculated nucleation rates are significantly lower than the available experimental measurements. The estimates for the nucleation rates in this work do not rely on classical nucleation theory, but the pathways observed in the simulations suggest that the nucleation process is better described by classical nucleation theory than an alternative interpretation based on Ostwald's step rule, in contrast to some prior simulations of related models. In addition to the size of the NaCl nucleus, we find that the crystallinity of a nascent cluster plays an important role in the nucleation process. Nuclei with high crystallinity were found to have higher growth probability and longer lifetimes, possibly because they are less exposed to hydration water.

  3. Forward flux sampling calculation of homogeneous nucleation rates from aqueous NaCl solutions

    NASA Astrophysics Data System (ADS)

    Jiang, Hao; Haji-Akbari, Amir; Debenedetti, Pablo G.; Panagiotopoulos, Athanassios Z.

    2018-01-01

    We used molecular dynamics simulations and the path sampling technique known as forward flux sampling to study homogeneous nucleation of NaCl crystals from supersaturated aqueous solutions at 298 K and 1 bar. Nucleation rates were obtained for a range of salt concentrations for the Joung-Cheatham NaCl force field combined with the Extended Simple Point Charge (SPC/E) water model. The calculated nucleation rates are significantly lower than the available experimental measurements. The estimates for the nucleation rates in this work do not rely on classical nucleation theory, but the pathways observed in the simulations suggest that the nucleation process is better described by classical nucleation theory than an alternative interpretation based on Ostwald's step rule, in contrast to some prior simulations of related models. In addition to the size of the NaCl nucleus, we find that the crystallinity of a nascent cluster plays an important role in the nucleation process. Nuclei with high crystallinity were found to have higher growth probability and longer lifetimes, possibly because they are less exposed to hydration water.
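
    As background to both records above, the forward flux sampling rate is conventionally written as the flux through the first interface multiplied by the conditional crossing probabilities; this is the textbook form of the method, not an expression quoted from the paper:

        k_{AB} = \Phi_{A,0} \prod_{i=0}^{n-1} P(\lambda_{i+1} \mid \lambda_i)

    Here \Phi_{A,0} is the flux of trajectories leaving basin A through the first interface \lambda_0, and P(\lambda_{i+1} \mid \lambda_i) is the probability that a trajectory which has just crossed \lambda_i reaches \lambda_{i+1} before falling back to A; the interfaces \lambda_i are level sets of an order parameter such as nucleus size.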

  4. A molecular dynamics study of the complete binding process of meropenem to New Delhi metallo-β-lactamase 1.

    PubMed

    Duan, Juan; Hu, Chuncai; Guo, Jiafan; Guo, Lianxian; Sun, Jia; Zhao, Zuguo

    2018-02-28

    The mechanism of substrate hydrolysis of New Delhi metallo-β-lactamase 1 (NDM-1) has been reported, but the process in which NDM-1 captures and transports the substrate into its active center remains unknown. In this study, we investigated the process of substrate entry into the NDM-1 active center through long unguided molecular dynamics simulations using meropenem as the substrate. A total of 550 individual simulations were performed, each run for 200 ns, and 110 of them showed enzyme-substrate binding events. The results reveal three categories of relatively persistent and noteworthy enzyme-substrate binding configurations, which we call configurations A, B, and C. We performed binding free energy calculations of the enzyme-substrate complexes of different configurations using the molecular mechanics Poisson-Boltzmann surface area method. The role of each residue of the active site in binding the substrate was investigated using energy decomposition analysis. The simulated trajectories provide a continuous atomic-level view of the entire binding process, revealing potentially valuable regions where the enzyme and the substrate interact persistently and five possible pathways of the substrate entering the active center, which were validated using well-tempered metadynamics. These findings provide important insights into the binding mechanism of meropenem to NDM-1, which may provide new prospects for the design of novel metallo-β-lactamase inhibitors and enzyme-resistant antibiotics.
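
    For context, the molecular mechanics Poisson-Boltzmann surface area estimate used above is conventionally decomposed as follows (the standard textbook form, not the paper's specific parameterization):

        \Delta G_{bind} \approx \langle \Delta E_{MM} \rangle + \langle \Delta G_{PB} \rangle + \langle \Delta G_{SA} \rangle - T \Delta S

    where \Delta E_{MM} collects the bonded, van der Waals, and electrostatic terms, \Delta G_{PB} is the polar solvation free energy from the Poisson-Boltzmann equation, \Delta G_{SA} is the nonpolar term proportional to the solvent-accessible surface area, and T\Delta S is the (often omitted) entropic contribution; per-residue decomposition of the same terms underlies the analysis of the active-site residues.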

  5. NREL: News - Advisor 2002-A Powerful Vehicle Simulation Tool Gets Better

    Science.gov Websites

    Advisor 2002 - A Powerful Vehicle Simulation Tool Gets Better. Golden, Colo., June 11, 2002. Powerful analysis is made possible by co-simulation links to Avant!'s Saber and Ansoft's SIMPLORER®. Transient air conditioning system analysis is possible by co-simulation with C&R Technologies' SINDA/FLUINT.

  6. In Vitro Model Simulating Gastro-Intestinal Digestion in the Pediatric Population (Neonates and Young Infants).

    PubMed

    Kamstrup, Danna; Berthelsen, Ragna; Sassene, Philip Jonas; Selen, Arzu; Müllertz, Anette

    2017-02-01

    The focus on drug delivery for the pediatric population has been steadily increasing in the last decades. In developing in vitro models that simulate characteristics of the targeted pediatric population, with the purpose of predicting drug product performance, it is important to simulate the gastro-intestinal conditions and processes the drug will encounter upon oral administration. When a drug is administered in the fed state, which is commonly the case for neonates, as they are typically fed every 3 h, the digestion of the milk will affect the composition of the fluid available for drug dissolution/solubilization. Therefore, in order to predict the solubilized amount of drug available for absorption, an in vitro model simulating digestion in the gastro-intestinal tract should be utilized. In order to simulate the digestion process and the drug solubilization taking place in vivo, the following aspects should be considered: physiologically relevant media, media volume, use of physiological enzymes in proper amounts, as well as correct pH and addition of relevant co-factors, e.g., bile salts and co-enzymes. Furthermore, physiological transit times and appropriate mixing should be considered and mimicked as closely as possible. This paper presents a literature review on physiological factors relevant for digestion and drug solubilization in neonates. Based on the available literature data, a novel in vitro digestion model simulating digestion and drug solubilization in the neonate and young infant pediatric population (2 months old and younger) was designed.

  7. An efficient spectral method for the simulation of dynamos in Cartesian geometry and its implementation on massively parallel computers

    NASA Astrophysics Data System (ADS)

    Stellmach, Stephan; Hansen, Ulrich

    2008-05-01

    Numerical simulations of the process of convection and magnetic field generation in planetary cores still fail to reach geophysically realistic control parameter values. Future progress in this field depends crucially on efficient numerical algorithms which are able to take advantage of the newest generation of parallel computers. Desirable features of simulation algorithms include (1) spectral accuracy, (2) an operation count per time step that is small and roughly proportional to the number of grid points, (3) memory requirements that scale linearly with resolution, (4) an implicit treatment of all linear terms including the Coriolis force, (5) the ability to treat all kinds of common boundary conditions, and (6) reasonable efficiency on massively parallel machines with tens of thousands of processors. So far, algorithms for fully self-consistent dynamo simulations in spherical shells do not achieve all these criteria simultaneously, resulting in strong restrictions on the possible resolutions. In this paper, we demonstrate that local dynamo models in which the process of convection and magnetic field generation is only simulated for a small part of a planetary core in Cartesian geometry can achieve the above goal. We propose an algorithm that fulfills the first five of the above criteria and demonstrate that a model implementation of our method on an IBM Blue Gene/L system scales impressively well for up to O(10⁴) processors. This allows for numerical simulations at rather extreme parameter values.
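
    A minimal one-dimensional analogue of criteria (1)-(4) above is sketched below in Python: a Fourier pseudo-spectral step for a toy Burgers equation that treats the linear diffusion term implicitly and the nonlinear term explicitly. It illustrates the algorithmic idea only and is in no way the authors' dynamo code.

        import numpy as np

        def imex_step(u, dt, nu, k):
            """One semi-implicit Fourier step of u_t + u*u_x = nu*u_xx (periodic)."""
            u_hat = np.fft.fft(u)
            u_x = np.real(np.fft.ifft(1j * k * u_hat))       # spectral derivative
            nonlin_hat = np.fft.fft(u * u_x)                 # explicit nonlinear term
            # Implicit treatment of the linear term: spectrally accurate, stable,
            # and O(N log N) work per step, in the spirit of criteria (1)-(2).
            u_hat = (u_hat - dt * nonlin_hat) / (1.0 + dt * nu * k**2)
            return np.real(np.fft.ifft(u_hat))

        N = 256
        x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
        k = np.fft.fftfreq(N, d=1.0 / N)                     # integer wavenumbers
        u = np.sin(x)
        for _ in range(1000):
            u = imex_step(u, dt=1e-3, nu=0.05, k=k)

    In the full three-dimensional problem the same structure carries over, with the Coriolis force joining the implicitly treated linear terms, which is criterion (4) above.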

  8. Objects Mental Rotation under 7 Days Simulated Weightlessness Condition: An ERP Study

    PubMed Central

    Wang, Hui; Duan, Jiaobo; Liao, Yang; Wang, Chuang; Li, Hongzheng; Liu, Xufeng

    2017-01-01

    During spaceflight under weightlessness conditions, human brain function may be affected by the physiological changes that accompany the redistribution of blood and body fluids toward the head. This variation of brain function will influence the performance of astronauts and may therefore create possible harm to flight safety. This study employed 20 male subjects in a 7-day −6° head-down tilt (HDT) bed rest model to simulate the physiological effects of weightlessness, and used behavioral and electrophysiological techniques to compare the changes of mental rotation (MR) ability before and after a short-term simulated weightlessness state. Behavioral results suggested that a significant linear relationship existed between the rotation angle of the stimuli and the reaction time, which means that a mental rotation process does happen during the MR task in the simulated weightlessness state. In the first 3 days, the P300 component induced by object mental rotation followed a “down-up-down” pattern; in the following 4 days it changed randomly. On HDT D2 the mean amplitude of the P300 was the lowest, while it increased gently on HDT D3. There was no obvious changing pattern of the amplitude of P300 observed after 3 days of HDT. Simulated weightlessness does not change the basic process of mental rotation; its effect reflects a neural mechanism of self-adaptation. MR ability did not return to the original level after the HDT test. PMID:29270115

  9. Computer simulations of structural transitions in large ferrofluid aggregates

    NASA Astrophysics Data System (ADS)

    Yoon, Mina; Tomanek, David

    2003-03-01

    We have developed a quaternion molecular dynamics formalism to study structural transitions in systems of ferrofluid particles in colloidal suspensions. Our approach takes advantage of the viscous damping provided by the surrounding liquid and enables us to study the time evolution of these systems over millisecond time periods as a function of the number of particles, initial geometry, and an externally applied magnetic field. Our computer simulations for aggregates containing tens to hundreds of ferrofluid particles suggest that these systems relax to the global optimum structure in a step-wise manner. During the relaxation process, the potential energy decreases by two mechanisms, which occur on different time scales. Short time periods associated with structural relaxations within a given morphology are followed by much slower processes that generally lead to a simpler morphology. We discuss possible applications of these externally driven structural transitions for targeted medication delivery.

  10. A unified analysis of solidification in Bridgman crystal growth

    NASA Astrophysics Data System (ADS)

    Lu, Ming-Fang

    2012-04-01

    The simulation of a multiphase solidification process can be handled by combining the VOF (Volume of Fluid) transport equation, in which a continuum mechanics model is used to simulate the melt/solid interface, with the conservation of mass, momentum, and energy. Because the melt phase, the solid phase, and the melt/solid interface are governed by a single control equation, and the enthalpy model based on the porosity concept represents the phase transformation range, it is possible to solve the phase-change problem in the same way as the single-phase problem. Once the enthalpy energy field is resolved for each time step, the position of the interface can be precisely calculated with the use of the VOF equation. This type of novel VOF method can be applied to find the conditions of vertical Bridgman crystal growth on Earth or under microgravity.

  11. A unified analysis of solidification in Bridgman crystal growth

    NASA Astrophysics Data System (ADS)

    Lu, Ming-Fang

    2011-11-01

    The simulation of a multiphase solidification process can be handled by combining the VOF (Volume of Fluid) transport equation, in which a continuum mechanics model is used to simulate the melt/solid interface, with the conservation of mass, momentum, and energy. Because the melt phase, the solid phase, and the melt/solid interface are governed by a single control equation, and the enthalpy model based on the porosity concept represents the phase transformation range, it is possible to solve the phase-change problem in the same way as the single-phase problem. Once the enthalpy energy field is resolved for each time step, the position of the interface can be precisely calculated with the use of the VOF equation. This type of novel VOF method can be applied to find the conditions of vertical Bridgman crystal growth on Earth or under microgravity.
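
    Since the two records above describe the enthalpy formulation only in words, the following Python sketch shows the idea on a one-dimensional Stefan problem: a single enthalpy field is updated everywhere, and temperature, liquid fraction, and hence the interface position are recovered from it. All material constants are illustrative assumptions.

        import numpy as np

        # Enthalpy H (J/kg) measured relative to solid at the melting point Tm = 0 C.
        L_heat, cp, k, rho = 334e3, 2000.0, 2.0, 1000.0   # assumed material constants
        N, dx, dt = 100, 1e-3, 0.01
        H = np.full(N, -cp * 10.0)       # initially solid at -10 C
        H[0] = cp * 10.0 + L_heat        # hot boundary: fully melted, +10 C

        def temperature(H):
            """Temperature from enthalpy; the mushy zone (0 <= H <= L) sits at Tm."""
            T = np.where(H < 0.0, H / cp, 0.0)
            return np.where(H > L_heat, (H - L_heat) / cp, T)

        for _ in range(5000):            # explicit diffusion update of the enthalpy field
            T = temperature(H)
            H[1:-1] += dt * (k / rho) * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2

        liquid_fraction = np.clip(H / L_heat, 0.0, 1.0)
        # The melt/solid interface lies in the cells where 0 < liquid_fraction < 1.

    One governing update handles melt, solid, and interface alike, which is exactly what makes the phase-change problem solvable in the same way as a single-phase problem; a VOF transport step would then sharpen the interface location.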

  12. Aeroelastic tailoring and integrated wing design

    NASA Technical Reports Server (NTRS)

    Love, Mike H.; Bohlmann, Jon

    1989-01-01

    Much has been learned from the TSO optimization code over the years in determining aeroelastic tailoring's place in the integrated design process. Indeed, it has become apparent that aeroelastic tailoring is, and should be, deeply embedded in design. Aeroelastic tailoring can have tremendous effects on the design loads, and design loads affect every aspect of the design process. While optimization enables the evaluation of design sensitivities, these sensitivities are only as valid as the computational simulations behind them. The aircraft maneuvers simulated must adequately cover the intended flight envelope, realistic design criteria must be included, and models among the various disciplines must be calibrated among themselves and with any hard data (e.g., wind tunnel measurements) available. The information gained and benefits derived from aeroelastic tailoring provide a focal point for the various disciplines to become involved and communicate with one another to reach the best design possible.

  13. Adsorption of charged protein residues on an inorganic nanosheet: Computer simulation of LDH interaction with ion channel

    NASA Astrophysics Data System (ADS)

    Tsukanov, Alexey A.; Psakhie, Sergey G.

    2016-08-01

    Quasi-two-dimensional and hybrid nanomaterials based on layered double hydroxides (LDH), cationic clays, layered oxyhydroxides and hydroxides of metals possess large specific surface area and strong electrostatic properties with permanent or pH-dependent electric charge. Such nanomaterials may impact cellular electrostatics, changing the ion balance, pH and membrane potential. Selective ion adsorption/exchange may alter the transmembrane electrochemical gradient, disrupting potential-dependent cellular processes. Cellular proteins as a rule have charged residues which can be effectively adsorbed on the surface of layered hydroxide based nanomaterials. The aim of this study is to shed some light on the possibility and mechanisms of protein "adhesion" to an LDH nanosheet, and to propose a new direction in anticancer medicine based on physical impact and strong electrostatics. An unbiased molecular dynamics simulation was performed and the combined process free energy estimation (COPFEE) approach was used.

  14. Large-Angle Scattering of Multi-GeV Muons on Thin Lead Targets

    NASA Astrophysics Data System (ADS)

    Longhin, A.; Paoloni, A.; Pupilli, F.

    2015-10-01

    The probability of large-angle scattering for multi-GeV muons in lead targets with a thickness of O(10⁻¹) radiation lengths is studied. The new estimates presented here are based both on simulation programs (GEANT4 libraries) and theoretical calculations. In order to validate the results provided by simulation, a comparison is drawn with experimental data from the literature. This study is particularly relevant when applied to muons originating from νμ CC interactions of CNGS beam neutrinos. In that circumstance the process under study represents the dominant background for the νμ → ντ search in the τ→ μ channel for the OPERA experiment at LNGS. Finally we also investigate, in the CNGS context, possible contributions from the muon photo-nuclear process which might in principle also produce a large-angle muon scattering signature in the detector.

  15. An analytical probabilistic model of the quality efficiency of a sewer tank

    NASA Astrophysics Data System (ADS)

    Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare

    2009-12-01

    The assessment of the efficiency of a storm water storage facility devoted to sewer overflow control in urban areas strictly depends on the ability to model the main features of the rainfall-runoff routing process and the related wet weather pollution delivery. In this paper, the possibility of applying the analytical probabilistic approach to develop a tank design method, whose potential is similar to that of continuous simulations, is demonstrated. The quality behaviour of such devices is incorporated in the model derivation. The formulation is based on a Weibull probabilistic model of the main characteristics of the rainfall process and on a power law describing the relationship between the dimensionless storm water cumulative runoff volume and the dimensionless cumulative pollutograph. Following this approach, efficiency indexes were established. The proposed model was verified by comparing its results to those obtained by continuous simulations; satisfactory agreement is shown for the proposed efficiency indexes.
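
    The two probabilistic building blocks named above take the following generic forms (the symbols here are chosen for illustration and are not the paper's notation):

        f(v) = \frac{\beta}{\theta} \left(\frac{v}{\theta}\right)^{\beta - 1} e^{-(v/\theta)^{\beta}}   (Weibull density of a storm characteristic v)

        m(V) = V^{b}, \quad 0 < b < 1   (dimensionless cumulative pollutograph as a power law of the dimensionless cumulative runoff volume V)

    Efficiency indexes then follow analytically by integrating the fraction of m(V) captured by a tank of given storage volume against the Weibull-distributed event statistics, which is what allows the closed-form method to stand in for long continuous simulations.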

  16. Metallurgical Plant Optimization Through the use of Flowsheet Simulation Modelling

    NASA Astrophysics Data System (ADS)

    Kennedy, Mark William

    Modern metallurgical plants typically have complex flowsheets and operate on a continuous basis. Real-time interactions within such processes can be complex, and the impacts of streams such as recycles on process efficiency and stability can be highly unexpected prior to actual operation. Current desktop computing power, combined with state-of-the-art flowsheet simulation software like Metsim, allows for thorough analysis of designs to explore the interaction between operating rate, heat and mass balances and, in particular, the potential negative impact of recycles. Plant information systems make it possible to combine real plant data with simple steady-state models through dynamic data exchange links, allowing near real-time de-bottlenecking of operations. Accurate analytical results can also be combined with detailed unit operations models to allow for feed-forward model-based control. This paper will explore some examples of the application of Metsim to real world engineering and plant operational issues.

  17. Physics-Based Modeling of Electric Operation, Heat Transfer, and Scrap Melting in an AC Electric Arc Furnace

    NASA Astrophysics Data System (ADS)

    Opitz, Florian; Treffinger, Peter

    2016-04-01

    Electric arc furnaces (EAF) are complex industrial plants whose actual behavior depends upon numerous factors. Due to its energy-intensive operation, the EAF process has always been subject to optimization efforts. For these reasons, several models have been proposed in the literature to analyze and predict different modes of operation, most of them focused on the processes inside the vessel itself. The present paper introduces a dynamic, physics-based model of a complete EAF plant which consists of the four subsystems vessel, electric system, electrode regulation, and off-gas system. Furthermore, the solid phase is not treated as homogeneous; instead, a simple spatial discretization is employed. Hence it is possible to simulate the energy input by electric arcs and fossil fuel burners depending on the state of the melting progress. The model is implemented in the object-oriented, equation-based language Modelica. The simulation results are compared to literature data.

  18. Implementation of an interactive liver surgery planning system

    NASA Astrophysics Data System (ADS)

    Wang, Luyao; Liu, Jingjing; Yuan, Rong; Gu, Shuguo; Yu, Long; Li, Zhitao; Li, Yanzhao; Li, Zhen; Xie, Qingguo; Hu, Daoyu

    2011-03-01

    Liver tumor, one of the most wide-spread diseases, has a very high mortality in China. To improve the success rate of liver surgeries and the quality of life of such patients, we implemented an interactive liver surgery planning system based on contrast-enhanced liver CT images. The system consists of five modules: pre-processing, segmentation, modeling, quantitative analysis and surgery simulation. The Graph Cuts method is utilized to automatically segment the liver, based on the anatomical prior knowledge that the liver is the biggest organ and has an almost homogeneous gray value. The system supports users in building patient-specific liver segment and sub-segment models using interactive portal vein branch labeling, and in performing anatomical resection simulation. It also provides several tools to simulate atypical resection, including resection plane, sphere and curved surface. To match actual surgery resections well and simulate the process flexibly, we extend our work to develop a virtual scalpel model and simulate the scalpel movement in the hepatic tissue using multi-plane continuous resection. In addition, the quantitative analysis module makes it possible to assess the risk of a liver surgery. The preliminary results show that the system has the potential to offer an accurate 3D delineation of the liver anatomy, as well as the tumors' location in relation to vessels, and to facilitate liver resection surgeries. Furthermore, we are testing the system in a full-scale clinical trial.
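
    For reference, segmentation by Graph Cuts of the kind used in the system's segmentation module minimizes an energy of the standard form (generic formulation, not the paper's exact terms):

        E(L) = \sum_{p} D_p(L_p) + \lambda \sum_{(p,q) \in N} V_{p,q}(L_p, L_q)

    where L_p \in \{liver, background\} is the label of voxel p, the data term D_p scores how well the voxel's gray value fits each label (this is where the near-homogeneous intensity prior enters), and the smoothness term V_{p,q} penalizes label changes between neighboring voxels; the globally optimal labeling is obtained with a min-cut/max-flow algorithm.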

  19. Creating and Testing Simulation Software

    NASA Technical Reports Server (NTRS)

    Heinich, Christina M.

    2013-01-01

    The goal of this project is to learn about the software development process, specifically the process of testing and fixing components of the software. The paper will cover the techniques of testing code and the benefits of using one style of testing over another. It will also discuss the overall software design and development lifecycle, and how code testing plays an integral role in it. Code is notorious for needing to be debugged due to coding errors or faulty program design. Writing tests, either before or during program creation, that cover all aspects of the code provides a relatively easy way to locate and fix errors, which will in turn decrease the necessity of fixing a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring the information from the models to the model Programmable Logic Controllers (PLCs), basic computers that are used for very simple tasks.
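
    As a small illustration of the write-tests-early practice described above, here is a Python sketch in the pytest style; the function under test and its expected behavior are invented for illustration and are unrelated to the actual PLCIF code.

        # test_scaling.py -- run with `pytest`
        import pytest

        def scale_sensor_value(raw, lo, hi):
            """Map a raw 0-1023 ADC reading onto the engineering range [lo, hi]."""
            if not 0 <= raw <= 1023:
                raise ValueError("raw reading out of range")
            return lo + (hi - lo) * raw / 1023.0

        def test_endpoints():
            assert scale_sensor_value(0, 0.0, 100.0) == 0.0
            assert scale_sensor_value(1023, 0.0, 100.0) == 100.0

        def test_midpoint():
            assert scale_sensor_value(511.5, 0.0, 100.0) == pytest.approx(50.0)

        def test_rejects_out_of_range_input():
            with pytest.raises(ValueError):
                scale_sensor_value(-1, 0.0, 100.0)

    Tests like these document the intended behavior, catch regressions immediately, and provide exactly the kind of coverage that reduces post-release debugging.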

  20. Integrated surface/subsurface permafrost thermal hydrology: Model formulation and proof-of-concept simulations

    DOE PAGES

    Painter, Scott L.; Coon, Ethan T.; Atchley, Adam L.; ...

    2016-08-11

    The need to understand potential climate impacts and feedbacks in Arctic regions has prompted recent interest in modeling of permafrost dynamics in a warming climate. A new fine-scale integrated surface/subsurface thermal hydrology modeling capability is described and demonstrated in proof-of-concept simulations. The new modeling capability combines a surface energy balance model with recently developed three-dimensional subsurface thermal hydrology models and new models for nonisothermal surface water flows and snow distribution in the microtopography. Surface water flows are modeled using the diffusion wave equation extended to include energy transport and phase change of ponded water. Variation of snow depth in the microtopography, physically the result of wind scour, is also modeled heuristically with a diffusion wave equation. The multiple surface and subsurface processes are implemented by leveraging highly parallel community software. Fully integrated thermal hydrology simulations on the tilted open book catchment, an important test case for integrated surface/subsurface flow modeling, are presented. Fine-scale 100-year projections of the integrated permafrost thermal hydrological system on an ice wedge polygon at Barrow, Alaska in a warming climate are also presented. Finally, these simulations demonstrate the feasibility of microtopography-resolving, process-rich simulations as a tool to help understand possible future evolution of the carbon-rich Arctic tundra in a warming climate.
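
    The diffusion wave equation referred to above, before its extension with energy transport, is commonly written as (a standard form given here for context, not quoted from the paper):

        \frac{\partial h}{\partial t} = \nabla \cdot \left( \frac{h^{5/3}}{n \sqrt{|\nabla(h+z)|}} \nabla (h + z) \right) + Q

    where h is the ponded water depth, z the surface elevation, n the Manning roughness coefficient, and Q a source/sink term. The paper extends this with advected energy and phase change of the ponded water, and reuses the same diffusive form heuristically for wind-driven redistribution of snow depth in the microtopography.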

  1. Investigation of a Mercury-Argon Hot Cathode Discharge

    NASA Astrophysics Data System (ADS)

    Wamsley, Robert Charles

    Classical absorption and laser-induced fluorescence (LIF) experiments are used to investigate processes in the cathode region of a Hg-Ar hot cathode discharge. The absorption and LIF measurements are used to test the qualitative understanding and develop a quantitative model of a hot cathode discharge. The main contribution of this thesis is a model of the negative glow region that demonstrates the importance of Penning ionization to the ionization balance in the negative glow. We modeled the excited argon balance equation using a Monte Carlo simulation. In this simulation we used the trapped radiative decay rate of the resonance levels and the Penning ionization rate as the dominant loss terms in the balance equation. The simulated data are compared to, and found to agree with, absolute excited argon densities measured in a classical absorption experiment. We found that the primary production rate per unit volume of excited Ar atoms in the simulation is sharply peaked near the cathode hot spot. We used the ion production rate from this simulation and a Green's function solution to the ambipolar diffusion equation to calculate the contribution of Penning ionization to the total ion density. We compared the results of this calculation to our experimental values of the Hg^+ densities in the negative glow. We found that Penning ionization is an important and possibly the dominant ionization process in the negative glow.
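
    The excited-argon balance described above can be summarized in a generic form, written here for illustration rather than copied from the thesis:

        \frac{\partial n^{*}}{\partial t} = S(\mathbf{r}) - \frac{n^{*}}{\tau_{trap}} - k_{P}\, n^{*}\, n_{Hg}

    where n^{*} is the excited argon density, S the local production rate, \tau_{trap} the trapped radiative decay time of the resonance levels, and k_{P} the Penning ionization rate coefficient; each Penning event Ar* + Hg -> Ar + Hg^+ + e^- contributes to the Hg^+ density that the measurements probe.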

  2. Bivalves: From individual to population modelling

    NASA Astrophysics Data System (ADS)

    Saraiva, S.; van der Meer, J.; Kooijman, S. A. L. M.; Ruardij, P.

    2014-11-01

    An individual-based population model for bivalves was designed, built and tested in a 0D approach to simulate the population dynamics of a mussel bed located in an intertidal area. The processes at the individual level were simulated following the dynamic energy budget theory, whereas initial egg mortality, background mortality, food competition, and predation (including cannibalism) were additional population processes. Model properties were studied through the analysis of theoretical scenarios and by simulation of different mortality parameter combinations in a realistic setup, imposing environmental measurements. Realistic criteria were applied to narrow down the possible combinations of parameter values. Field observations obtained in a long-term and multi-station monitoring program were compared with the model scenarios. The realistically selected modeling scenarios were able to reproduce reasonably well the timing of some peaks in the individual abundances in the mussel bed and its size distribution, but the number of individuals was not well predicted. The results suggest that mortality in the early life stages (egg and larvae) plays an important role in population dynamics, whether through initial egg mortality, larvae dispersion, settlement failure or shrimp predation. Future steps include the coupling of the population model with a hydrodynamic and biogeochemical model to improve the simulation of egg/larvae dispersion, settlement probability and food transport, and also to simulate the feedback of the organisms' activity on the water column properties, which will result in an improved characterization of food quantity and quality.

  3. Robotic Attention Processing And Its Application To Visual Guidance

    NASA Astrophysics Data System (ADS)

    Barth, Matthew; Inoue, Hirochika

    1988-03-01

    This paper describes a method of real-time visual attention processing for robots performing visual guidance. This robot attention processing is based on a novel vision processor, the multi-window vision system that was developed at the University of Tokyo. The multi-window vision system is unique in that it only processes visual information inside local area windows. These local area windows are quite flexible in their ability to move anywhere on the visual screen, change their size and shape, and alter their pixel sampling rate. By using these windows for specific attention tasks, it is possible to perform high speed attention processing. The primary attention skills of detecting motion, tracking an object, and interpreting an image are all performed at high speed on the multi-window vision system. A basic robotic attention scheme using the attention skills was developed. The attention skills involved detection and tracking of salient visual features. The tracking and motion information thus obtained was utilized in producing the response to the visual stimulus. The response of the attention scheme was quick enough to be applicable to the real-time vision processing tasks of playing a video 'pong' game, and later using an automobile driving simulator. By detecting the motion of a 'ball' on a video screen and then tracking the movement, the attention scheme was able to control a 'paddle' in order to keep the ball in play. The response was faster than a human's, allowing the attention scheme to play the video game at higher speeds. Further, in the application to the driving simulator, the attention scheme was able to control both direction and velocity of a simulated vehicle following a lead car. These two applications show the potential of local visual processing in its use for robotic attention processing.

  4. Microphysical processing of aerosol particles in orographic clouds

    NASA Astrophysics Data System (ADS)

    Pousse-Nottelmann, S.; Zubler, E. M.; Lohmann, U.

    2015-01-01

    An explicit and detailed treatment of cloud-borne particles allowing for the consideration of aerosol cycling in clouds has been implemented in the regional weather forecast and climate model COSMO. The effects of aerosol scavenging, cloud microphysical processing and regeneration upon cloud evaporation on the aerosol population and on subsequent cloud formation are investigated. For this, two-dimensional idealized simulations of moist flow over two bell-shaped mountains were carried out, varying the treatment of aerosol scavenging and regeneration processes for a warm-phase and a mixed-phase orographic cloud. The results made it possible to identify different aerosol cycling mechanisms. In the simulated non-precipitating warm-phase cloud, aerosol mass is incorporated into cloud droplets by activation scavenging and released back to the atmosphere upon cloud droplet evaporation. In the mixed-phase cloud, a first cycle comprises cloud droplet activation and evaporation via the Wegener-Bergeron-Findeisen process. A second cycle includes below-cloud scavenging by precipitating snow particles and snow sublimation, and is connected to the first cycle via the riming process, which transfers aerosol mass from cloud droplets to snowflakes. In the simulated mixed-phase cloud, only a negligible part of the total aerosol mass is incorporated into ice crystals. Sedimenting snowflakes reaching the surface remove aerosol mass from the atmosphere. The results show that aerosol processing and regeneration lead to a vertical redistribution of aerosol mass and number. However, the processes not only impact the total aerosol number and mass, but also the shape of the aerosol size distributions by enhancing the internally mixed/soluble accumulation mode and generating coarse mode particles. Concerning subsequent cloud formation at the second mountain, accounting for aerosol processing and regeneration increases the cloud droplet number concentration, with possible implications for the ice crystal number concentration.

  5. Gene tree rooting methods give distributions that mimic the coalescent process.

    PubMed

    Tian, Yuan; Kubatko, Laura S

    2014-01-01

    Multi-locus phylogenetic inference is commonly carried out via models that incorporate the coalescent process to model the possibility that incomplete lineage sorting leads to incongruence between gene trees and the species tree. An interesting question that arises in this context is whether data "fit" the coalescent model. Previous work (Rosenfeld et al., 2012) has suggested that rooting of gene trees may account for variation in empirical data that has been previously attributed to the coalescent process. We examine this possibility using simulated data. We show that, in the case of four taxa, the distribution of gene trees observed from rooting estimated gene trees with either the molecular clock or with outgroup rooting can be closely matched by the distribution predicted by the coalescent model with specific choices of species tree branch lengths. We apply commonly-used coalescent-based methods of species tree inference to assess their performance in these situations. Copyright © 2013 Elsevier Inc. All rights reserved.
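
    The coalescent prediction being mimicked has a simple closed form in the rooted three-taxon case, quoted here as standard background (the paper itself works with four taxa):

        P(\text{concordant}) = 1 - \tfrac{2}{3} e^{-t}, \qquad P(\text{each discordant topology}) = \tfrac{1}{3} e^{-t}

    where t is the internal branch length of the species tree in coalescent units. Because a whole family of branch-length choices can reproduce a given gene-tree frequency spectrum, distributions created by rooting artifacts can closely match distributions generated by genuine incomplete lineage sorting, which is the ambiguity the simulations probe.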

  6. Process integration possibilities for biodiesel production from palm oil using ethanol obtained from lignocellulosic residues of oil palm industry.

    PubMed

    Gutiérrez, Luis F; Sánchez, Oscar J; Cardona, Carlos A

    2009-02-01

    In this paper, integration possibilities for the production of biodiesel and bioethanol using a single source of biomass as a feedstock (oil palm) were explored through process simulation. The oil extracted from Fresh Fruit Bunches was considered as the feedstock for biodiesel production. An extractive reaction process is proposed for the transesterification reaction using in situ produced ethanol, which is obtained from two types of lignocellulosic residues of the palm industry (Empty Fruit Bunches and Palm Press Fiber). Several ways of integration were analyzed. The integration of material flows between the ethanol and biodiesel production lines allowed a reduction in unit energy costs of up to 3.4%, whereas combined material and energy integration led to a 39.8% decrease in those costs. The proposed integrated configuration is an important option once the technology for ethanol production from biomass reaches such a degree of maturity that its production costs become comparable with those of grain or cane ethanol.

  7. Modeling a glacial lake outburst flood process chain: the case of Lake Palcacocha and Huaraz, Peru

    NASA Astrophysics Data System (ADS)

    Somos-Valenzuela, Marcelo A.; Chisolm, Rachel E.; Rivas, Denny S.; Portocarrero, Cesar; McKinney, Daene C.

    2016-07-01

    One of the consequences of recent glacier recession in the Cordillera Blanca, Peru, is the risk of glacial lake outburst floods (GLOFs) from lakes that have formed at the base of retreating glaciers. GLOFs are often triggered by avalanches falling into glacial lakes, initiating a chain of processes that may culminate in significant inundation and destruction downstream. This paper presents simulations of all of the processes involved in a potential GLOF originating from Lake Palcacocha, the source of a previously catastrophic GLOF on 13 December 1941, killing about 1800 people in the city of Huaraz, Peru. The chain of processes simulated here includes (1) avalanches above the lake; (2) lake dynamics resulting from the avalanche impact, including wave generation, propagation, and run-up across lakes; (3) terminal moraine overtopping and dynamic moraine erosion simulations to determine the possibility of breaching; (4) flood propagation along downstream valleys; and (5) inundation of populated areas. The results of each process feed into simulations of subsequent processes in the chain, finally resulting in estimates of inundation in the city of Huaraz. The results of the inundation simulations were converted into flood intensity and preliminary hazard maps (based on an intensity-likelihood matrix) that may be useful for city planning and regulation. Three avalanche events with volumes ranging from 0.5 to 3 × 10⁶ m³ were simulated, and two scenarios of 15 and 30 m lake lowering were simulated to assess the potential of mitigating the hazard level in Huaraz. For all three avalanche events, three-dimensional hydrodynamic models show large waves generated in the lake from the impact resulting in overtopping of the damming moraine. Despite very high discharge rates (up to 63.4 × 10³ m³ s⁻¹), the erosion from the overtopping wave did not result in failure of the damming moraine when simulated with a hydro-morphodynamic model using excessively conservative soil characteristics that provide very little erosion resistance. With the current lake level, all three avalanche events result in inundation in Huaraz due to wave overtopping, and the resulting preliminary hazard map shows a total affected area of 2.01 km², most of which is in the high hazard category. Lowering the lake has the potential to reduce the affected area by up to 35 %, resulting in a smaller portion of the inundated area in the high hazard category.

  8. Testing Land-Vegetation retrieval algorithms for the ICESat-2 mission

    NASA Astrophysics Data System (ADS)

    Zhou, T.; Popescu, S. C.

    2017-12-01

    The upcoming spaceborne satellite, the Ice, Cloud and land Elevation Satellite 2 (ICESat-2), will provide topography and canopy profiles at the global scale using photon counting LiDAR. To prepare for the mission launch, the aim of this research is to develop a framework for retrieving ground and canopy height in different forest types and noise levels using data from two ICESat-2 testbeds: MABEL (Multiple Altimeter Beam Experimental Lidar) and simulated ICESat-2 data. The first step of the framework is to reduce as many noise photons as possible through grid statistical methods and cluster analysis. Subsequently, we employed overlapping moving windows and estimated quantile heights in each window to characterize the probable ground and canopy top using the filtered photons. Both MABEL and simulated ICESat-2 data generated satisfactory results with reasonable accuracy, and the results for simulated ICESat-2 data were better than those for MABEL data, with smaller root mean square errors (RMSEs). For example, the RMSEs of canopy top identification in various vegetation types using simulated ICESat-2 data were less than 3.78 m, compared to 6.48 m for the MABEL data. It is anticipated that the methodology will advance data processing of the ICESat-2 mission and expand potential applications of ICESat-2 data, such as mapping vegetation canopy heights, once the data are available.
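
    A minimal sketch of the overlapping moving-window quantile step described above is given below in Python; the window size, step, photon-count threshold, and quantile levels are illustrative assumptions, not the authors' settings.

        import numpy as np

        def window_profiles(along_track, height, width=20.0, step=10.0,
                            q_ground=0.05, q_canopy=0.95):
            """Estimate ground and canopy-top height in overlapping along-track windows."""
            starts = np.arange(along_track.min(), along_track.max() - width, step)
            centers, ground, canopy = [], [], []
            for s in starts:
                m = (along_track >= s) & (along_track < s + width)
                if m.sum() < 10:        # skip windows with too few filtered photons
                    continue
                centers.append(s + width / 2.0)
                ground.append(np.quantile(height[m], q_ground))   # low quantile ~ ground
                canopy.append(np.quantile(height[m], q_canopy))   # high quantile ~ canopy top
            return np.array(centers), np.array(ground), np.array(canopy)

    The function assumes the noise-reduction step has already been applied, so that the low and high quantiles of the remaining photon heights track the ground and canopy surfaces.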

  9. Controlling protein molecular dynamics: How to accelerate folding while preserving the native state

    NASA Astrophysics Data System (ADS)

    Jensen, Christian H.; Nerukh, Dmitry; Glen, Robert C.

    2008-12-01

    The dynamics of peptides and proteins generated by classical molecular dynamics (MD) is described by using a Markov model. The model is built by clustering the trajectory into conformational states and estimating transition probabilities between the states. Assuming that it is possible to influence the dynamics of the system by varying simulation parameters, we show how to use the Markov model to determine the parameter values that preserve the folded state of the protein and, at the same time, reduce the folding time in the simulation. We investigate this by applying the method to two systems. The first system is an imaginary peptide described by given transition probabilities with a total folding time of 1 μs. We find that only small changes in the transition probabilities are needed to accelerate (or decelerate) the folding. This implies that folding times for slowly folding peptides and proteins calculated using MD cannot be meaningfully compared to experimental results. The second system is a four-residue peptide, valine-proline-alanine-leucine, in water. We control the dynamics of the transitions by varying the temperature and the atom masses. The simulation results show that it is possible to find combinations of parameter values that accelerate the dynamics and at the same time preserve the native state of the peptide. A method for accelerating larger systems without performing simulations for the whole folding process is outlined.
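
    The Markov-model construction described above reduces, in its simplest form, to counting transitions between conformational states; the following Python sketch is a generic illustration, with the state labels assumed to come from a prior clustering of the MD trajectory.

        import numpy as np

        def transition_matrix(labels, n_states, lag=1):
            """Row-stochastic transition matrix estimated from a discrete state sequence."""
            counts = np.zeros((n_states, n_states))
            for a, b in zip(labels[:-lag], labels[lag:]):
                counts[a, b] += 1.0
            rows = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

        # Toy 3-state label sequence; real labels come from clustering conformations.
        labels = np.array([0, 0, 1, 1, 2, 2, 2, 1, 0, 0, 1, 2])
        T = transition_matrix(labels, n_states=3)

        # Stationary distribution: left eigenvector of T with eigenvalue 1.
        w, v = np.linalg.eig(T.T)
        pi = np.abs(v[:, np.argmax(w.real)].real)
        pi /= pi.sum()

    In the control setting the paper describes, varying simulation parameters such as temperature or atom masses perturbs the entries of T, and the question becomes which perturbations shorten the slowest relaxation time while leaving the stationary weight of the folded state intact.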

  10. Modelling the Mont Terri HE-D experiment for the Thermal–Hydraulic–Mechanical response of a bedded argillaceous formation to heating

    DOE PAGES

    Garitte, B.; Nguyen, T. S.; Barnichon, J. D.; ...

    2017-05-09

    Coupled thermal–hydrological–mechanical (THM) processes in the near field of deep geological repositories can influence several safety features of the engineered and geological barriers. Among those features are: the possibility of damage in the host rock, the time for re-saturation of the bentonite, and the perturbations in the hydraulic regime in both the rock and engineered seals. Within the international cooperative code-validation project DECOVALEX-2015, eight research teams developed models to simulate an in situ heater experiment, called HE-D, in Opalinus Clay at the Mont Terri Underground Research Laboratory in Switzerland. The models were developed from the theory of poroelasticity in order to simulate the coupled THM processes that prevailed during the experiment and thereby to characterize the in situ THM properties of Opalinus Clay. The modelling results for the evolution of temperature, pore water pressure, and deformation at different points are consistent among the research teams and compare favourably with the experimental data in terms of trends and absolute values. The models were able to reproduce the main physical processes of the experiment. In particular, most teams simulated temperature and thermally induced pore water pressure well, including spatial variations caused by inherent anisotropy due to bedding.

  11. Modelling the Mont Terri HE-D experiment for the Thermal–Hydraulic–Mechanical response of a bedded argillaceous formation to heating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garitte, B.; Nguyen, T. S.; Barnichon, J. D.

    Coupled thermal–hydrological–mechanical (THM) processes in the near field of deep geological repositories can influence several safety features of the engineered and geological barriers. Among those features are: the possibility of damage in the host rock, the time for re-saturation of the bentonite, and the perturbations in the hydraulic regime in both the rock and engineered seals. Within the international cooperative code-validation project DECOVALEX-2015, eight research teams developed models to simulate an in situ heater experiment, called HE-D, in Opalinus Clay at the Mont Terri Underground Research Laboratory in Switzerland. The models were developed from the theory of poroelasticity in order to simulate the coupled THM processes that prevailed during the experiment and thereby to characterize the in situ THM properties of Opalinus Clay. The modelling results for the evolution of temperature, pore water pressure, and deformation at different points are consistent among the research teams and compare favourably with the experimental data in terms of trends and absolute values. The models were able to reproduce the main physical processes of the experiment. In particular, most teams simulated temperature and thermally induced pore water pressure well, including spatial variations caused by inherent anisotropy due to bedding.

  12. The resilience and functional role of moss in boreal and arctic ecosystems.

    PubMed

    Turetsky, M R; Bond-Lamberty, B; Euskirchen, E; Talbot, J; Frolking, S; McGuire, A D; Tuittila, E-S

    2012-10-01

    Mosses in northern ecosystems are ubiquitous components of plant communities, and strongly influence nutrient, carbon and water cycling. We use literature review, synthesis and model simulations to explore the role of mosses in ecological stability and resilience. Moss community responses to disturbance showed all possible responses (increases, decreases, no change) within most disturbance categories. Simulations from two process-based models suggest that northern ecosystems would need to experience extreme perturbation before mosses were eliminated. But simulations with two other models suggest that loss of moss will reduce soil carbon accumulation primarily by influencing decomposition rates and soil nitrogen availability. It seems clear that mosses need to be incorporated into models as one or more plant functional types, but more empirical work is needed to determine how to best aggregate species. We highlight several issues that have not been adequately explored in moss communities, such as functional redundancy and singularity, relationships between response and effect traits, and parameter vs conceptual uncertainty in models. Mosses play an important role in several ecosystem processes that play out over centuries - permafrost formation and thaw, peat accumulation, development of microtopography - and there is a need for studies that increase our understanding of slow, long-term dynamical processes. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.

  13. Theories of Spoken Word Recognition Deficits in Aphasia: Evidence from Eye-Tracking and Computational Modeling

    PubMed Central

    Mirman, Daniel; Yee, Eiling; Blumstein, Sheila E.; Magnuson, James S.

    2011-01-01

    We used eye tracking to investigate lexical processing in aphasic participants by examining the fixation time course for rhyme (e.g., carrot – parrot) and cohort (e.g., beaker – beetle) competitors. Broca’s aphasic participants exhibited larger rhyme competition effects than age-matched controls. A reanalysis of previously reported data (Yee, Blumstein, & Sedivy, 2008) confirmed that Wernicke’s aphasic participants exhibited larger cohort competition effects. Individual-level analyses revealed a negative correlation between rhyme and cohort competition effect size across both groups of aphasic participants. Computational model simulations were performed to examine which of several accounts of lexical processing deficits in aphasia might account for the observed effects. Simulation results revealed that slower deactivation of lexical competitors could account for increased cohort competition in Wernicke’s aphasic participants; auditory perceptual impairment could account for increased rhyme competition in Broca's aphasic participants; and a perturbation of a parameter controlling selection among competing alternatives could account for both patterns, as well as the correlation between the effects. In light of these simulation results, we discuss theoretical accounts that have the potential to explain the dynamics of spoken word recognition in aphasia and the possible roles of anterior and posterior brain regions in lexical processing and cognitive control. PMID:21371743

  14. The resilience and functional role of moss in boreal and arctic ecosystems

    USGS Publications Warehouse

    Turetsky, M.; Bond-Lamberty, B.; Euskirchen, E.S.; Talbot, J. J.; Frolking, S.; McGuire, A.D.; Tuittila, E.S.

    2012-01-01

    Mosses in northern ecosystems are ubiquitous components of plant communities, and strongly influence nutrient, carbon and water cycling. We use literature review, synthesis and model simulations to explore the role of mosses in ecological stability and resilience. Moss community responses to disturbance showed all possible responses (increases, decreases, no change) within most disturbance categories. Simulations from two process-based models suggest that northern ecosystems would need to experience extreme perturbation before mosses were eliminated. But simulations with two other models suggest that loss of moss will reduce soil carbon accumulation primarily by influencing decomposition rates and soil nitrogen availability. It seems clear that mosses need to be incorporated into models as one or more plant functional types, but more empirical work is needed to determine how to best aggregate species. We highlight several issues that have not been adequately explored in moss communities, such as functional redundancy and singularity, relationships between response and effect traits, and parameter vs conceptual uncertainty in models. Mosses play an important role in several ecosystem processes that play out over centuries – permafrost formation and thaw, peat accumulation, development of microtopography – and there is a need for studies that increase our understanding of slow, long-term dynamical processes.

  15. Modern methods for the quality management of high-rate melt solidification

    NASA Astrophysics Data System (ADS)

    Vasiliev, V. A.; Odinokov, S. A.; Serov, M. M.

    2016-12-01

    The quality management of high-rate melt solidification requires a combined solution, obtained by methods and approaches adapted to the specific situation. A technological audit is recommended to estimate the capabilities of the process. Statistical methods are proposed, together with the choice of key parameters. Of particular importance are numerical methods, which can be used to perform simulation under multifactor technological conditions and to increase the quality of decisions.

  16. Sensor fusion for synthetic vision

    NASA Technical Reports Server (NTRS)

    Pavel, M.; Larimer, J.; Ahumada, A.

    1991-01-01

    Display methodologies are explored for fusing images gathered by millimeter wave sensors with images rendered from an on-board terrain database to facilitate visually guided flight and ground operations in low visibility conditions. An approach to fusion based on multiresolution image representation and processing is described, which facilitates the fusion of images that differ in resolution both within and between images. To investigate possible fusion methods, a workstation-based simulation environment is being developed.

  17. Comment on "The relative concentrations of radon daughter products in surface air and the significance of their ratios" by C. Rangarajan, S. Gopalakrishnan, V. R. Chandrasekaran, and C. D. Eapen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marenco, A.; Fontan, J.

    1975-12-20

    Measurement of the ratio between the short-lived radon daughters and ²¹⁰Pb in order to determine the aerosol residence time in the troposphere is discussed. It is concluded that the various residence time values obtained experimentally with radioactive elements make it possible to determine parameters representing the processes of vertical exchange and of scavenging which prevail on a large scale in the troposphere, thus making it possible to use numerical simulation models for calculating the tropospheric residence time of any other element.

  18. Technologies for Decreasing Mining Losses

    NASA Astrophysics Data System (ADS)

    Valgma, Ingo; Väizene, Vivika; Kolats, Margit; Saarnak, Martin

    2013-12-01

    In the case of stratified deposits, such as the oil shale deposit in Estonia, mining losses depend on the mining technology. Current research focuses on the extraction and separation possibilities of mineral resources. Selective mining, selective crushing, and separation tests have been performed, showing possibilities for decreasing mining losses. Rock crushing and screening process simulations were used for optimizing rock fractions. In addition, mine backfilling, fine separation, and optimized drilling and blasting have been analyzed. All tested methods show potential and depend on mineral usage, which in turn depends on the utilization technology. Questions such as the stability of the material flow and the influence of quality fluctuations on the final yield are raised.

  19. Etching of semiconductor cubic crystals: Determination of the dissolution slowness surfaces

    NASA Astrophysics Data System (ADS)

    Tellier, C. R.

    1990-03-01

    Equations of the representative surface of dissolution slowness for cubic crystals are determined in the framework of a tensorial approach to the orientation-dependent etching process. The independent dissolution constants are deduced from symmetry considerations. Using previous data on the chemical etching of germanium and gallium arsenide crystals, some possible polar diagrams of the dissolution slowness are proposed. A numerical and graphical simulation method is used to obtain the derived dissolution shapes. The influence of extrema in the dissolution slowness on the successive dissolution shapes is also examined. A graphical construction of the limiting shapes of etched crystals appears possible using the tensorial representation of the dissolution slowness.

  20. The effect of thermal and ultrasonic treatment on amino acid composition, radical scavenging and reducing potential of hydrolysates obtained from simulated gastrointestinal digestion of cowpea proteins.

    PubMed

    Quansah, Joycelyn K; Udenigwe, Chibuike C; Saalia, Firibu K; Yada, Rickey Y

    2013-03-01

    The effect of thermal and ultrasonic treatment of cowpea proteins (CP) on the amino acid composition, radical scavenging and reducing potential of hydrolysates (CPH) obtained from in vitro simulated gastrointestinal digestion of CP was evaluated. Hydrolysis of native and treated CP with gastrointestinal pepsin and pancreatin yielded CPH that displayed antioxidant activities based on oxygen radical scavenging capacity (ORAC), ferric reducing antioxidant power (FRAP) and superoxide radical scavenging activity (SRSA). CPH derived from the treated CP yielded higher ORAC values than CPH from untreated proteins. However, significantly lower FRAP and SRSA values were observed for these samples compared to untreated CPH (p < 0.05). Amino acid analysis indicated that CP processing decreased the total sulphur-containing amino acids in the hydrolysates, particularly cysteine. The amount of cysteine appeared to be positively related to the FRAP and SRSA values of the CPH samples, but not to ORAC. The results indicated that thermal and ultrasonic processing of CP can reduce the radical scavenging and reducing potential of the enzymatic hydrolysates, possibly due to the decreased amounts of cysteine. Since the hydrolysates were generated with gastrointestinal enzymes, it is possible that the resulting compounds exert some health functions during normal consumption of cowpea.

  1. Building and assessing anatomically relevant phantoms for neonatal transcranial ultrasound

    NASA Astrophysics Data System (ADS)

    Memoli, G.; Gatto, M.; Sadhoo, N.; Gélat, P.; Harris, R. A.; Shaw, A.

    2011-02-01

    This study describes the design and construction of a clinically relevant phantom to survey the temperature increase caused by ultrasound equipment, as currently used in neonatal head-scanning in the UK. The phantom is an ellipsoid of bone-mimic material, filled with brain-mimic; a circular hole in the external surface mimics the fontanel, through which most clinically relevant scans are made. Finite-element simulations were used to identify possible hot spots and decide the most effective thermocouple positions within the phantom to investigate temperature rise during a typical scan. Novel materials were purposively designed to simulate key acoustic and thermal properties. Three Dimensional Printing (3DP) was employed for the fabrication of the skull phantom, and a specific strategy was successfully pursued to embed a thermocouple within the 3DP skull phantom during the manufacturing process. An in-process Non-Destructive Analysis (NDA) was used to assess the correct position of the deposited thermocouple inside the fabricated skull phantom. The temperature increase in the phantom for a typical trans-fontanellar scan is also presented here. The current phantom will be used in a hospital survey in the UK and, in its final design, will allow for a more reliable evaluation of ultrasound heating than is currently possible.

  2. Room-temperature preparation of trisilver-copper-sulfide/polymer based heterojunction thin film for solar cell application

    NASA Astrophysics Data System (ADS)

    Lei, Yan; Yang, Xiaogang; Gu, Longyan; Jia, Huimin; Ge, Suxiang; Xiao, Pin; Fan, Xiaoli; Zheng, Zhi

    2015-04-01

    Solar cell devices based on inorganic/polymer heterojunctions can be a possible solution for harvesting solar energy and converting it to electric energy with high efficiency through cost-effective fabrication. The solution-process method can be easily used to produce large-area devices. Moreover, due to the intrinsically different charge separation, diffusion or recombination in various semiconductors, the interfaces between the components may strongly influence the inorganic/polymer heterojunction performance. Here we prepared an n-type Ag3CuS2 (Eg = 1.25 eV) nanostructured film through a room-temperature element reaction process, which was confirmed to be a direct-bandgap semiconductor through density functional theory simulation. This Ag3CuS2 film was spin-coated with an organic semiconducting poly(3-hexylthiophene) (P3HT) or polythieno[3,4-b]-thiophene-co-benzodithiophene (PTB7) film, which formed an inorganic/polymer heterojunction. After constructing solar cell devices, power conversion efficiencies of 0.79% and 0.31% were achieved under simulated solar illumination for Ag3CuS2/P3HT and Ag3CuS2/PTB7, respectively. A possible mechanism is discussed, and we show that charge separation at the interface of the inorganic and polymer semiconductors plays an important role.

  3. Simulation of possible regolith optical alteration effects on carbonaceous chondrite meteorites

    NASA Technical Reports Server (NTRS)

    Clark, Beth E.; Fanale, Fraser P.; Robinson, Mark S.

    1993-01-01

    As the spectral reflectance search continues for links between meteorites and their parent body asteroids, the effects of optical surface alteration processes need to be considered. We present the results of an experimental simulation of the melting and recrystallization that occurs to a carbonaceous chondrite meteorite regolith powder upon heating. As done for the ordinary chondrite meteorites, we show the effects of possible parent-body regolith alteration processes on reflectance spectra of carbonaceous chondrites (CC's). For this study, six CC's of different mineralogical classes were obtained from the Antarctic Meteorite Collection: two CM meteorites, two CO meteorites, one CK, and one CV. Each sample was ground with a ceramic mortar and pestle to powders with maximum grain sizes of 180 and 90 microns. The reflectance spectra of these powders were measured at RELAB (Brown University) from 0.3 to 2.5 microns. Following comminution, the 90 micron grain size was melted in a nitrogen controlled-atmosphere fusion furnace at an approximate temperature of 1700 C. The fused sample was immediately held above a flow of nitrogen at 0 C for quenching. Following melting and recrystallization, the samples were reground to powders, and the reflectance spectra were remeasured. The effects on spectral reflectance for a sample of the CM carbonaceous chondrite called Murchison are shown.

  4. Simulating pathways of subsurface oil in the Faroe-Shetland Channel using an ocean general circulation model.

    PubMed

    Main, C E; Yool, A; Holliday, N P; Popova, E E; Jones, D O B; Ruhl, H A

    2017-01-15

    Little is known about the fate of subsurface hydrocarbon plumes from deep-sea oil well blowouts and their effects on processes and communities. As deepwater drilling expands in the Faroe-Shetland Channel (FSC), oil well blowouts are a possibility, and the unusual ocean circulation of this region presents challenges to understanding possible subsurface oil pathways in the event of a spill. Here, an ocean general circulation model was used with a particle tracking algorithm to assess the temporal variability of the oil-plume distribution from a deep-sea oil well blowout in the FSC. The drift of particles was first tracked for one year following release. Then, ambient model temperatures were used to simulate temperature-mediated biodegradation, truncating the trajectories of particles accordingly. The release depth of the modeled subsurface plumes affected both their direction of transport and the distance travelled from their release location, and there was considerable interannual variability in transport. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
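
    The record does not spell out the biodegradation law used; as a rough illustration of how temperature-mediated decay can truncate a particle trajectory, the following Python sketch assumes a hypothetical Q10-type first-order rate, with invented constants and an invented 1% cutoff.

        import numpy as np

        rng = np.random.default_rng(0)

        # Assumed Q10-style first-order biodegradation rate (per day) as a
        # function of ambient temperature; constants are illustrative only.
        def decay_rate(temp_c, k_ref=0.05, q10=2.0, t_ref=10.0):
            return k_ref * q10 ** ((temp_c - t_ref) / 10.0)

        temps = 4.0 + 2.0 * rng.standard_normal(365)   # synthetic daily temperatures

        fraction = 1.0                          # oil remaining on the particle
        for day, t in enumerate(temps):
            fraction *= np.exp(-decay_rate(t))  # one day of first-order decay
            if fraction < 0.01:                 # truncate trajectory at 99% loss
                print("trajectory truncated on day", day)
                break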

  5. Modelling Management Practices in Viticulture while Considering Resource Limitations: The Dhivine Model

    PubMed Central

    Martin-Clouaire, Roger; Rellier, Jean-Pierre; Paré, Nakié; Voltz, Marc; Biarnès, Anne

    2016-01-01

    Many farming-system studies have investigated the design and evaluation of crop-management practices with respect to economic performance and reduction in environmental impacts. In contrast, little research has been devoted to analysing these practices in terms of matching the recurrent context-dependent demand for resources (labour in particular) with those available on the farm. This paper presents Dhivine, a simulation model of operational management of grape production at the vineyard scale. Particular attention focuses on representing a flexible plan, which organises activities temporally, the resources available to the vineyard manager and the process of scheduling and executing the activities. The model relies on a generic production-system ontology used in several agricultural production domains. The types of investigations that the model supports are briefly illustrated. The enhanced realism of the production-management situations simulated makes it possible to examine and understand properties of resource-constrained work-organisation strategies and possibilities for improving them. PMID:26990089

  6. Ram Pressure Stripping of Galaxy JO201

    NASA Astrophysics Data System (ADS)

    Zhong, Greta; Tonnesen, Stephanie; Jaffé, Yara; Bellhouse, Callum; Bianca Poggianti

    2017-01-01

    Despite the discovery of the morphology-density relation more than 30 years ago, the process driving the evolution of spiral galaxies into S0s in clusters is still widely debated. Ram pressure stripping--the removal of a galaxy's interstellar medium by the pressure of the intracluster medium through which it orbits--may help explain galactic evolution and quenching in clusters. MUSE (Multi Unit Spectroscopic Explorer) observational data of galaxy JO201 in cluster Abell 85 reveal it to be a jellyfish galaxy--one with an H-alpha emitting gas tail on only one side. We model the possible orbits for this galaxy, constrained by the cluster mass profile, line of sight velocity, and projected distance from the cluster center. Using Enzo, an adaptive mesh refinement hydrodynamics code, we simulate effects of ram pressure on this galaxy for a range of possible orbits. We present comparisons of both the morphology and velocity structure of our simulated galaxy to the observations of H-alpha emission.

  7. Simulated learning environment experience in nursing students for paediatric practice.

    PubMed

    Mendoza-Maldonado, Yessy; Barría-Pailaquilén, René Mauricio

    The training of health professionals requires the acquisition of clinical skills in a safe and efficient manner, which is facilitated by a simulated learning environment (SLE). An SLE is also an efficient alternative when there are limitations on clinical practice in certain areas. This paper shows the work undertaken in a Chilean university to implement paediatric practice using an SLE. Over eight days, the care of a hospitalized infant was studied by applying the nursing process. A paediatrician, a resident physician, a nursing technician, and a simulated user participated, in addition to the use of a simulation mannequin and equipment. The simulation of care was comprehensive, covered interaction with the child and family, and was conducted in groups of six students led by a teacher. The different phases of the simulation methodology were developed from a pedagogical point of view. The possibility of implementing paediatric clinical practice in an efficient and safe way was confirmed. The experience in the SLE was highly valued by the students, allowing them to develop, through simulation, the different skills and abilities required for paediatric nursing. Copyright © 2018 Elsevier España, S.L.U. All rights reserved.

  8. Ground simulation of wide frequency band angular vibration for Lander's optic sensors

    NASA Astrophysics Data System (ADS)

    Xing, Zhigang; Xiang, Jianwei; Zheng, Gangtie

    2017-11-01

    To guide a Moon or Mars exploration lander during its descent onto a desired site, optic sensors, including optic cameras and laser distance meters, have been chosen for the task. However, such optic sensors are sensitive to vibrations, especially angular vibrations, of the lander. To reduce the risk of malfunction and ensure the performance of the optic sensors, ground simulations are necessary. More importantly, the simulations can be used as a method for examining sensor performance and finding possible improvements to the sensor design. In the present paper, we propose an angular vibration simulation method for the landing stage. This simulation method has been realized as a product and applied to optic sensor tests for the moon lander. The simulator can generate random angular vibration in a frequency range from 0 to 2000 Hz with a control precision of ±1 dB, and its linear translational speed can be set to the required descent speed. The operation and data processing methods of the developed simulator are the same as those of a normal shake table. The analysis and design methods are studied in the present paper, and test results are also provided.

  9. Modeling Neutron stars as r-process sources in Ultra Faint Dwarf galaxies

    NASA Astrophysics Data System (ADS)

    Safarzadeh, Mohammadtaher; Scannapieco, Evan

    2018-06-01

    To explain the high observed abundances of r-process elements in local ultrafaint dwarf (UFD) galaxies, we perform cosmological zoom simulations that include r-process production from neutron star mergers (NSMs). We model star formation stochastically and simulate two different haloes with total masses ≈10⁸ M⊙ at z = 6. We find that the final distribution of [Eu/H] versus [Fe/H] is relatively insensitive to the energy by which the r-process material is ejected into the interstellar medium, but strongly sensitive to the environment in which the NSM event occurs. In one halo, the NSM event takes place at the centre of the stellar distribution, leading to high levels of r-process enrichment such as seen in a local UFD, Reticulum II (Ret II). In a second halo, the NSM event takes place outside of the densest part of the galaxy, leading to a more extended r-process distribution. The subsequent star formation occurs in an interstellar medium with shallow levels of r-process enrichment that results in stars with low levels of [Eu/H] compared to Ret II stars even when the maximum possible r-process mass is assumed to be ejected. This suggests that the natal kicks of neutron stars may also play an important role in determining the r-process abundances in UFD galaxies, a topic that warrants further theoretical investigation.

  10. Models for discrete-time self-similar vector processes with application to network traffic

    NASA Astrophysics Data System (ADS)

    Lee, Seungsin; Rao, Raghuveer M.; Narasimha, Rajesh

    2003-07-01

    The paper defines self-similarity for vector processes by employing the discrete-time continuous-dilation operation which has successfully been used previously by the authors to define 1-D discrete-time stochastic self-similar processes. To define self-similarity of vector processes, it is required to consider the cross-correlation functions between different 1-D processes as well as the autocorrelation function of each constituent 1-D process in it. System models to synthesize self-similar vector processes are constructed based on the definition. With these systems, it is possible to generate self-similar vector processes from white noise inputs. An important aspect of the proposed models is that they can be used to synthesize various types of self-similar vector processes by choosing proper parameters. Additionally, the paper presents evidence of vector self-similarity in two-channel wireless LAN data and applies the aforementioned systems to simulate the corresponding network traffic traces.
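
    As a hedged illustration of the kind of self-similarity analysis such traces invite, the following Python sketch estimates a Hurst exponent with the aggregated-variance method: for a self-similar process, the variance of the m-aggregated means scales as m^(2H-2). Applied to white noise, as here, it should return H close to 0.5; the trace length and block sizes are arbitrary choices, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(size=2**16)                 # stand-in trace (white noise)

        ms = np.array([2**k for k in range(1, 9)]) # aggregation levels m
        var = [np.var(x[: len(x) // m * m].reshape(-1, m).mean(axis=1)) for m in ms]

        # Slope of log Var(X^(m)) versus log m equals 2H - 2 for self-similarity.
        H = 1.0 + np.polyfit(np.log(ms), np.log(var), 1)[0] / 2.0
        print("estimated Hurst exponent:", round(H, 3))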

  11. Irdis: A Digital Scene Storage And Processing System For Hardware-In-The-Loop Missile Testing

    NASA Astrophysics Data System (ADS)

    Sedlar, Michael F.; Griffith, Jerry A.

    1988-07-01

    This paper describes the implementation of a Seeker Evaluation and Test Simulation (SETS) Facility at Eglin Air Force Base. This facility will be used to evaluate imaging infrared (IIR) guided weapon systems by performing various types of laboratory tests. One such test is termed Hardware-in-the-Loop (HIL) simulation (Figure 1), in which the actual flight of a weapon system is simulated as closely as possible in the laboratory. As shown in the figure, there are four major elements in the HIL test environment: the weapon/sensor combination, an aerodynamic simulator, an imagery controller, and an infrared imagery system. The paper concentrates on the approaches and methodologies used in the imagery controller and infrared imaging system elements for generating scene information. For procurement purposes, these two elements have been combined into an Infrared Digital Injection System (IRDIS), which provides scene storage, processing, and an output interface to drive a radiometric display device or to directly inject digital video into the weapon system (bypassing the sensor). The paper describes in detail how standard and custom image processing functions have been combined with off-the-shelf mass storage and computing devices to produce a system which provides high sample rates (greater than 90 Hz), a large terrain database, high weapon rates of change, and multiple independent targets. A photo-based approach has been used to maximize terrain and target fidelity, thus providing a rich and complex scene for weapon/tracker evaluation.

  12. Geant4 simulations of a wide-angle x-ray focusing telescope

    NASA Astrophysics Data System (ADS)

    Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Zhang, Shuangnan; Willingale, Richard; Ling, Zhixing

    2017-06-01

    The rapid development of X-ray astronomy has been made possible by widely deploying X-ray focusing telescopes on board many X-ray satellites. Geant4 is a very powerful toolkit for Monte Carlo simulations and has remarkable abilities to model complex geometrical configurations. However, the library of physical processes available in Geant4 lacks a description of the reflection of X-ray photons at a grazing incident angle which is the core physical process in the simulation of X-ray focusing telescopes. The scattering of low-energy charged particles from the mirror surfaces is another noteworthy process which is not yet incorporated into Geant4. Here we describe a Monte Carlo model of a simplified wide-angle X-ray focusing telescope adopting lobster-eye optics and a silicon detector using the Geant4 toolkit. With this model, we simulate the X-ray tracing, proton scattering and background detection. We find that: (1) the effective area obtained using Geant4 is in agreement with that obtained using Q software with an average difference of less than 3%; (2) X-rays are the dominant background source below 10 keV; (3) the sensitivity of the telescope is better by at least one order of magnitude than that of a coded mask telescope with the same physical dimensions; (4) the number of protons passing through the optics and reaching the detector by Firsov scattering is about 2.5 times that of multiple scattering for the lobster-eye telescope.

  13. Device and circuit analysis of a sub 20 nm double gate MOSFET with gate stack using a look-up-table-based approach

    NASA Astrophysics Data System (ADS)

    Chakraborty, S.; Dasgupta, A.; Das, R.; Kar, M.; Kundu, A.; Sarkar, C. K.

    2017-12-01

    In this paper, we explore the possibility of mapping devices designed in a TCAD environment to their modeled versions developed in the Cadence Virtuoso environment using a look-up table (LUT) approach. Circuit simulation of newly designed devices in a TCAD environment is a very slow and tedious process involving complex scripting. Hence, the LUT-based modeling approach has been proposed as a faster and easier alternative in the Cadence environment. The LUTs are prepared by extracting data from the device characteristics obtained from device simulation in TCAD. A comparative study between the TCAD simulation and the LUT-based alternative demonstrates the accuracy of the modeled devices. Finally, the look-up table approach is used to evaluate the performance of circuits implemented using a 14 nm nMOSFET.
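
    A minimal sketch of the LUT idea, assuming hypothetical tabulated I-V data standing in for TCAD output: the bias grid, the toy current expression, and the query point below are illustrative, not the paper's 14 nm device data.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Stand-in for TCAD output: drain current tabulated on a (Vgs, Vds) grid.
        vgs = np.linspace(0.0, 0.8, 9)
        vds = np.linspace(0.0, 0.8, 9)
        VG, VD = np.meshgrid(vgs, vds, indexing="ij")
        id_grid = 1e-3 * np.maximum(VG - 0.3, 0.0) ** 2 * np.tanh(5.0 * VD)

        # The LUT 'model': interpolation replaces device simulation inside the
        # circuit simulator, which is what makes the approach fast.
        lut = RegularGridInterpolator((vgs, vds), id_grid)

        print(lut([[0.6, 0.4]]))   # drain current at an off-grid bias point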

  14. Bias of cylinder diameter estimation from ground-based laser scanners with different beam widths: A simulation study

    NASA Astrophysics Data System (ADS)

    Forsman, Mona; Börlin, Niclas; Olofsson, Kenneth; Reese, Heather; Holmgren, Johan

    2018-01-01

    In this study we have investigated why diameters of tree stems, which are approximately cylindrical, are often overestimated by mobile laser scanning. This paper analyzes the physical processes when using ground-based laser scanning that may contribute to a bias when estimating cylinder diameters using circle-fit methods. A laser scanner simulator was implemented and used to evaluate various properties, such as distance, cylinder diameter, and beam width of a laser scanner-cylinder system to find critical conditions. The simulation results suggest that a positive bias of the diameter estimation is expected. Furthermore, the bias follows a quadratic function of one parameter - the relative footprint, i.e., the fraction of the cylinder width illuminated by the laser beam. The quadratic signature opens up a possibility to construct a compensation model for the bias.
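
    The quadratic signature suggests a simple compensation model. The following Python sketch is illustrative only: the calibration pairs of relative footprint and bias are invented numbers, not results from the study.

        import numpy as np

        # Hypothetical calibration data: relative footprint r (beam width divided
        # by cylinder width) and the observed diameter bias, e.g. from simulation.
        r = np.array([0.05, 0.10, 0.20, 0.30, 0.40, 0.50])
        bias_mm = np.array([0.4, 1.6, 6.3, 14.1, 25.2, 39.8])

        # One-parameter least-squares fit of the quadratic model bias = a * r^2.
        a = np.sum(bias_mm * r**2) / np.sum(r**4)

        def compensate(d_est_mm, beam_width_mm):
            """Correct a circle-fit diameter estimate with the fitted bias model."""
            r_est = beam_width_mm / d_est_mm      # footprint from the raw estimate
            return d_est_mm - a * r_est**2        # subtract the predicted bias

        print(compensate(d_est_mm=320.0, beam_width_mm=32.0))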

  15. A Novel Method for Characterization of Superconductors: Physical Measurements and Modeling of Thin Films

    NASA Technical Reports Server (NTRS)

    Kim, B. F.; Moorjani, K.; Phillips, T. E.; Adrian, F. J.; Bohandy, J.; Dolecek, Q. E.

    1993-01-01

    A method for characterization of granular superconducting thin films has been developed which encompasses both the morphological state of the sample and its fabrication process parameters. The broad scope of this technique is due to the synergism between experimental measurements and their interpretation using numerical simulation. Two novel technologies form the substance of this system: the magnetically modulated resistance method for characterizing superconductors; and a powerful new computer peripheral, the Parallel Information Processor card, which provides enhanced computing capability for PC computers. This enhancement allows PC computers to operate at speeds approaching that of supercomputers. This makes atomic scale simulations possible on low cost machines. The present development of this system involves the integration of these two technologies using mesoscale simulations of thin film growth. A future stage of development will incorporate atomic scale modeling.

  16. Defect characterization by inductive heated thermography

    NASA Astrophysics Data System (ADS)

    Noethen, Matthias; Meyendorf, Norbert

    2012-05-01

    During inductive-thermographic inspection, an eddy current of high intensity is induced into the inspected material and the thermal response is detected by an infrared camera. Anomalies in the surface temperature during and after inductive heating correspond to inhomogeneities in the material. A finite element simulation of the surface crack detection process using active thermography with inductive heating has been developed. The simulation model is based on the finite element software ANSYS. The simulation tool was tested and used for investigations on steel components with different longitudinally oriented cracks, varying in shape, width and height. This paper focuses on surface-connected, longitudinally oriented cracks in austenitic steel. The results show that the temperature distribution of the material under test differs depending on the excitation frequency, and a possible way to measure the depth of the crack is discussed.

  17. Cadmium: Simulation of environmental control strategies to reduce exposure

    NASA Astrophysics Data System (ADS)

    Yost, K. J.; Miles, L. J.; Greenkorn, R. A.

    1981-07-01

    The effects of selected environmental control strategies on human dietary and respiratory exposure to environmental cadmium (Cd) have been simulated. For each control strategy, mean Cd dietary and respiratory exposures are presented for a twenty-year simulation period. Human exposures related to cadmium are associated with both process waste disposal and product disposal. Dietary exposure is by far the dominant mechanism for Cd intake. Dietary exposure related to aqueous discharges is primarily a result of municipal sludge landspreading, whereas that associated with emissions to the atmosphere derives mainly from the deposition on cropland of airborne particulates from product incineration. Only relatively small dietary exposure reductions are possible through restrictions on any single Cd use. Combinations of waste management and environmental control measures promise greater reductions in dietary and respiratory exposure than those achievable through use restrictions.

  18. Failure prediction for the optimization of stretch forming aluminium-polymer laminate foils used for pharmaceutical packaging

    NASA Astrophysics Data System (ADS)

    Müller, Simon; Weygand, Sabine M.

    2018-05-01

    Axisymmetric stretch forming processes of aluminium-polymer laminate foils (e.g. consisting of PA-Al-PVC layers) are analyzed numerically, by finite element modeling of the multi-layer material, as well as experimentally, in order to identify a suitable damage initiation criterion. A simple ductile fracture criterion is proposed to predict the forming limits. The corresponding material constants are determined from tensile tests and then applied in forming simulations with different punch geometries. A comparison between the simulations and the experimental results shows that the failure constants determined in this way are not applicable. Therefore, one forming experiment was selected, and in the corresponding simulation the failure constant was fitted to the measured maximum stretch. With this approach it is possible to predict the forming limit of the laminate foil with satisfactory accuracy for different punch geometries.

  19. Manufacturing of ArF chromeless hard shifter for 65-nm technology

    NASA Astrophysics Data System (ADS)

    Park, Keun-Taek; Dieu, Laurent; Hughes, Greg P.; Green, Kent G.; Croffie, Ebo H.; Taravade, Kunal N.

    2003-12-01

    For logic design, the chrome-less phase shift mask is one of the possible solutions for defining small geometries with a low MEF (mask enhancement factor) for the 65 nm node. There have been many dedicated studies on PCO (Phase Chrome Off-axis) mask technology, and several design approaches have been proposed, including grating backgrounds and chrome patches (or chrome shields), for applying PCO to line/space and contact patterns. In this paper, we studied the feasibility of the grating design for line and contact patterns. The design of the grating pattern was provided by the EM simulation software (TEMPEST) and the aerial image simulation software. AIMS measurements with high-NA annular illumination were performed. Resist images were taken of the designed patterns at different focus settings. Simulations and AIMS measurements are compared to verify the consistency of the process with the wafer-printed performance.

  20. A Study of Umbilical Communication Interface of Simulator Kernel to Enhance Visibility and Controllability

    NASA Astrophysics Data System (ADS)

    Koo, Cheol Hea; Lee, Hoon Hee; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    In the aerospace research and development area, the use of simulation in software development, component design, and system operation has steadily increased, and the rate of that increase is itself growing. This trend stems from the ease of handling simulations and the power of their output. Simulation brings many benefits through several of its characteristics: it is easy to handle (it is never broken or damaged by mistake); it never wears out (it never gets old); and it is cost effective (once built, it can be distributed to 100-1000 people). GenSim (Generic Simulator), which is being developed by KARI and is compatible with the ESA SMP standard, provides such a simulation platform to support flight software validation and mission operation verification. The user interface of GenSim is shown in Figure 1 [1,2]. As most simulation platforms typically do, GenSim provides a GRD (Graphical Display) and an AND (Alpha Numeric Display). But actual system validation, for example mission operation, frequently requires more complex and powerful handling of the simulated data. In Figure 2, a system simulation result of COMS (Communication, Ocean, and Meteorological Satellite, launched on June 28, 2008) is drawn by the Celestia 3D program. In this case, the data needed by Celestia are provided by one of the simulation models resident in the system simulator, through a UDP network connection. But the required display format, data size, and communication rate vary, so the developer has to manage the connection protocol manually for each case. This brings chaos to the simulation model design and development, and ultimately leads to performance issues. A performance issue arises when the required data volume exceeds the capacity of the simulation kernel to process the required data safely. The problem is that the data sent to a visualization tool such as Celestia are provided by a simulation model, not by the kernel. Because the simulation model has no way of knowing the load status of the simulation kernel in processing simulation events, the model sends the data as frequently as needed. This may create many potential problems, such as lack of response, missed deadlines, and data integrity problems with the model data during the simulation. SIMSAT and EuroSim give a warning message if a user request, such as printing a log, cannot be processed as planned or requested. As a consequence, the requested event will be delayed or not processed at all, which means that this behaviour may violate the planned deadline. In most soft real-time simulations this can be neglected and merely inconveniences the user. But it should be noted that if user requests are not managed properly in some critical situation, the simulation results may end in a mess. Having traced the disadvantages of letting a simulation model service such user requests, we conclude that the simulation model is not the appropriate place to provide this service, and this kind of work should be minimized as much as possible.
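
    As a sketch of the direction argued for here - routing visualization data through a sender that enforces a rate budget, rather than letting each model transmit as often as it likes - consider the following Python fragment. The host, port, message format, and 50 Hz budget are assumptions for illustration, not GenSim's actual interface.

        import socket
        import time

        class UmbilicalSender:
            """Rate-limited UDP sender standing in for a kernel-side service."""

            def __init__(self, host="127.0.0.1", port=5555, max_rate_hz=50.0):
                self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
                self.addr = (host, port)
                self.min_interval = 1.0 / max_rate_hz
                self.last_sent = 0.0

            def send(self, payload: bytes) -> bool:
                """Send only if the rate budget allows; report whether data went out."""
                now = time.monotonic()
                if now - self.last_sent < self.min_interval:
                    return False      # over budget: drop rather than pile up events
                self.sock.sendto(payload, self.addr)
                self.last_sent = now
                return True

        sender = UmbilicalSender()
        sender.send(b"sat_pos 7000.0 0.0 0.0")   # e.g. a position sample for Celestia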

  1. Using HPC within an operational forecasting configuration

    NASA Astrophysics Data System (ADS)

    Jagers, H. R. A.; Genseberger, M.; van den Broek, M. A. F. H.

    2012-04-01

    Various natural disasters are caused by high-intensity events: for example, extreme rainfall can in a short time cause major damage in river catchments, and storms can cause havoc in coastal areas. To assist emergency response teams in operational decisions, it is important to have reliable information and predictions as soon as possible. This starts before the event, by providing early warnings about imminent risks and estimated probabilities of possible scenarios. In the context of various applications worldwide, Deltares has developed an open and highly configurable forecasting and early warning system: Delft-FEWS. Finding the right balance between simulation time (and hence prediction lead time) and simulation accuracy and detail is challenging. Model resolution may be crucial to capture certain critical physical processes. Uncertainty in forcing conditions may require running large ensembles of models; data assimilation techniques may require additional ensembles and repeated simulations. The computational demand is steadily increasing and data streams are becoming bigger. Using HPC resources is a logical step; in different settings Delft-FEWS has been configured to take advantage of the distributed computational resources available to improve and accelerate the forecasting process (e.g. Montanari et al., 2006). We will illustrate the system by means of a couple of practical applications, including the real-time dynamic forecasting of wind-driven waves, flow of water, and wave overtopping at dikes of Lake IJssel and neighboring lakes in the center of The Netherlands. Montanari et al., 2006. Development of an ensemble flood forecasting system for the Po river basin, First MAP D-PHASE Scientific Meeting, 6-8 November 2006, Vienna, Austria.

  2. Simulating the elimination of sleeping sickness with an agent-based model.

    PubMed

    Grébaut, Pascal; Girardin, Killian; Fédérico, Valentine; Bousquet, François

    2016-01-01

    Although Human African Trypanosomiasis is largely considered to be in the process of extinction today, the persistence of human and animal reservoirs, as well as the vector, necessitates a laborious elimination process. In this context, modeling could be an effective tool to evaluate the ability of different public health interventions to control the disease. Using the Cormas® system, we developed HATSim, an agent-based model capable of simulating the possible endemic evolutions of sleeping sickness and the ability of National Control Programs to eliminate the disease. This model takes into account the analysis of epidemiological, entomological, and ecological data from field studies conducted during the last decade, making it possible to predict the evolution of the disease within this area over a 5-year span. In this article, we first present HATSim according to the Overview, Design concepts, and Details (ODD) protocol that is classically used to describe agent-based models; then, in a second part, we present predictive results concerning the evolution of Human African Trypanosomiasis in the village of Lambi (Cameroon), in order to illustrate the usefulness of such a tool. Our results are consistent with what was observed in the field by the Cameroonian National Control Program (CNCP). Our simulations also revealed that regular screening can be sufficient, although vector control applied to all areas with human activities could be significantly more efficient. © P. Grébaut et al., published by EDP Sciences, 2016.

  3. Optimal control of laser-induced spin-orbit mediated ultrafast demagnetization

    NASA Astrophysics Data System (ADS)

    Elliott, P.; Krieger, K.; Dewhurst, J. K.; Sharma, S.; Gross, E. K. U.

    2016-01-01

    Laser-induced ultrafast demagnetization is the process whereby the magnetic moment of a ferromagnetic material drops significantly, on a timescale of tens to hundreds of femtoseconds, due to the application of a strong laser pulse. If this phenomenon can be harnessed for future technology, it offers the possibility of devices operating at speeds several orders of magnitude faster than at present. A key component of a successful transfer of such a process to technology is the controllability of the process, i.e. that it can be tuned in order to overcome the practical and physical limitations imposed on the system. In this paper, we demonstrate that the spin-orbit mediated form of ultrafast demagnetization recently investigated (Krieger et al 2015 J. Chem. Theory Comput. 11 4870) by ab initio time-dependent density functional theory (TDDFT) can be controlled. To do so we use quantum optimal control theory (OCT) to couple our TDDFT simulations to the optimization machinery of OCT. We show that a laser pulse can be found which maximizes the loss of moment within a given time interval while subject to several practical and physical constraints. Furthermore, we also include a constraint on the fluence of the laser pulses and find the optimal pulse that combines significant demagnetization with a desire for less powerful pulses. These calculations demonstrate that optimal control is possible for spin-orbit mediated ultrafast demagnetization and lay the foundation for future optimizations/simulations which can incorporate even more constraints.

  4. Predictable turn-around time for post tape-out flow

    NASA Astrophysics Data System (ADS)

    Endo, Toshikazu; Park, Minyoung; Ghosh, Pradiptya

    2012-03-01

    A typical post-tape-out flow data path at the IC fabrication site has the following major components of software-based processing: Boolean operations before the application of resolution enhancement techniques (RET) and optical proximity correction (OPC); the RET and OPC step [etch retargeting, sub-resolution assist feature (SRAF) insertion, and OPC]; post-OPC/RET Boolean operations; and sometimes, in the same flow, simulation-based verification. There are two objectives that an IC fabrication tapeout flow manager wants to achieve with the flow - predictable completion time and the fastest turn-around time (TAT) - and at times they may compete. There have been studies in the literature modeling the turnaround time from historical data for runs with the same recipe and later using that model to derive the resource allocation for subsequent runs [3]. This approach is more feasible in predominantly simulation-dominated tools, but for an edge-operation-dominated flow it may not be possible, especially if processing acceleration methods like pattern matching or hierarchical processing are involved. In this paper, we suggest an alternative method of providing a target turnaround time and managing the priority of jobs while not doing any upfront resource modeling and resource planning. The methodology then systematically either meets the turnaround time target or lets the user know as soon as possible that it will not. This builds on top of the Calibre Cluster Management (CalCM) resource management work previously published [1][2]. The paper describes the initial demonstration of the concept.

  5. Laboratory Simulations of Martian and Venusian Aeolian Processes

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald

    1999-01-01

    With the flyby of the Neptune system by Voyager, the preliminary exploration of the Solar System was accomplished. Data have been returned for all major planets and satellites except the Pluto system. Results show that the surfaces of terrestrial planets and satellites have been subjected to a wide variety of geological processes. On solid-surface planetary objects having an atmosphere, aeolian processes are important in modifying their surfaces through the redistribution of fine-grained material by the wind. Bedrock may be eroded to produce particles and the particles transported by wind for deposition in other areas. This process operates on Earth today and is evident throughout the geological record. Aeolian processes also occur on Mars, Venus, and possibly Titan and Triton, both of which are outer planet satellites that have atmospheres. Mariner 9 and Viking results show abundant wind-related landforms on Mars, including dune fields and yardangs (wind-eroded hills). On Venus, measurements made by the Soviet Venera and Vega spacecraft and extrapolations from the Pioneer Venus atmospheric probes show that surface winds are capable of transporting particulate materials and suggest that aeolian processes may operate on that planet as well. Magellan radar images of Venus show abundant wind streaks in some areas, as well as dune fields and a zone of possible yardangs. The study of planetary aeolian processes must take into account diverse environments, from the cold, low-density atmosphere of Mars to the extremely hot, high-density Venusian atmosphere. Factors such as threshold wind speeds (minimum wind velocity needed to move particles), rates of erosion and deposition, trajectories of windblown particles, and aeolian flow fields over various landforms are all important aspects of the problem. In addition, study of aeolian terrains on Earth using data analogous to planetary data-collection systems is critical to the interpretation of spacecraft information and places constraints on results from numerical models and laboratory simulations.

  6. Some aspects of modeling hydrocarbon oxidation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gal, D.; Botar, L.; Danoczy, E.

    1981-01-01

    A modeling procedure for the study of hydrocarbon oxidation is suggested, and its effectiveness for the oxidation of ethylbenzene is demonstrated. As a first step in modeling, systematization involves the compilation of possible mechanisms. Then, by introduction of the concept of kinetic communication, the chaotic set of possible mechanisms is systematized into a network. Experimentation serves both as feedback to the systematic arrangement of information and as a source of new information. Kinetic treatment of the possible mechanism has been accomplished by two different approaches: by classical inductive calculations starting with a small mechanism and using kinetic approximations, and by computer simulation. The authors have compiled a so-called Main Contributory Mechanism, involving processes - within the possible mechanism - which contribute basically to the formation and consumption of the intermediates, to the consumption of the starting compounds and to the formation of the end products.
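
    As an illustration of the computer-simulation branch of such kinetic treatments, a small mechanism can be integrated directly as a system of rate equations. The three-step chain below (initiation, propagation, termination) and its rate constants are invented for illustration; they are not the ethylbenzene mechanism of the study.

        import numpy as np
        from scipy.integrate import solve_ivp

        k1, k2, k3 = 1e-4, 1.0, 1e2    # illustrative rate constants

        def rhs(t, y):
            rh, r, rooh = y            # hydrocarbon, radical pool, hydroperoxide
            return [-k1 * rh - k2 * r * rh,       # RH lost to initiation, propagation
                    k1 * rh - 2.0 * k3 * r * r,   # radicals born, lost to termination
                    k2 * r * rh]                  # product formed by propagation

        sol = solve_ivp(rhs, (0.0, 3600.0), [1.0, 0.0, 0.0])
        print(sol.y[:, -1])            # concentrations after one hour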

  7. Influence of computational domain size on the pattern formation of the phase field crystals

    NASA Astrophysics Data System (ADS)

    Starodumov, Ilya; Galenko, Peter; Alexandrov, Dmitri; Kropotin, Nikolai

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method represents one of the important directions of modern computational materials science. This method makes it possible to study the formation of stable or metastable crystal structures. In this paper, we study the effect of the computational domain size on the crystal pattern formation obtained as a result of computer simulation by the PFC method. In the current report, we show that if the size of the computational domain is changed, the result of modeling may be a structure in a metastable phase instead of the pure stable state. The authors present a possible theoretical justification for the observed effect and provide explanations of a possible modification of the PFC method to account for this phenomenon.

  8. Superlative TINs

    NASA Technical Reports Server (NTRS)

    Chamberlin, R.

    2002-01-01

    TIN is short for 'triangulated irregular network,' which is a piecewise planar model of a surface. If properly constructed, a TIN can be more than 30 times as efficient as a regular triangulation. In our project (a ground combat simulation to support U.S. Army training exercises), the TIN is used to represent the Earth's surface and is used primarily to determine whether line of sight is blocked by terrain. High efficiency requires accurate identification of ridgelines with as few triangles as possible. The work currently in progress is the implementation of a TINning process that we hope will produce superlative TINs. This presentation describes that process.
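
    A hedged sketch of the line-of-sight query such a surface model serves, written here against a gridded height model rather than an actual TIN (the triangle-walk details of a real TIN query are omitted, and all names and values are illustrative):

        import numpy as np

        def line_of_sight(height, a, b, eye=2.0, samples=200):
            """Blocked if terrain along the ray rises above the sight line."""
            (r0, c0), (r1, c1) = a, b
            rows = np.linspace(r0, r1, samples).round().astype(int)
            cols = np.linspace(c0, c1, samples).round().astype(int)
            terrain = height[rows, cols]            # terrain profile under the ray
            sight = np.linspace(terrain[0] + eye, terrain[-1] + eye, samples)
            return bool(np.all(sight[1:-1] >= terrain[1:-1]))

        dem = np.random.default_rng(0).uniform(0.0, 10.0, (100, 100))
        print(line_of_sight(dem, (5, 5), (90, 80)))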

  9. Polymerization of amino acids under primitive earth conditions.

    NASA Technical Reports Server (NTRS)

    Flores, J. J.; Ponnamperuma, C.

    1972-01-01

    Small amounts of peptides were obtained when equal amounts of methane and ammonia were reacted with vaporized aqueous solutions of C14-labeled glycine, L-alanine, L-aspartic acid, L-glutamic acid and L-threonine in the presence of a continuous spark discharge in a 24-hr cyclic process. The experiment was designed to demonstrate the possibility of peptide synthesis under simulated primeval earth conditions. It is theorized that some dehydration-condensation processes may have taken place, with ammonium cyanide, the hydrogen cyanide tetramer or aminonitriles as intermediate products, during the early chemical evolution of the earth.

  10. Neural pulse frequency modulation of an exponentially correlated Gaussian process

    NASA Technical Reports Server (NTRS)

    Hutchinson, C. E.; Chon, Y.-T.

    1976-01-01

    The effect of NPFM (Neural Pulse Frequency Modulation) on a stationary Gaussian input, namely an exponentially correlated Gaussian input, is investigated with special emphasis on the determination of the average number of pulses in unit time, known also as the average frequency of pulse occurrence. For some classes of stationary input processes where the formulation of the appropriate multidimensional Markov diffusion model of the input-plus-NPFM system is possible, the average impulse frequency may be obtained by a generalization of the approach adopted. The results are approximate and numerical, but are in close agreement with Monte Carlo computer simulation results.
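
    A Monte Carlo sketch in the spirit of the comparison described, assuming one common integrate-and-fire formulation of NPFM (emit a pulse and reset when the magnitude of the integrated input reaches a threshold). The exact discrete-time update generates the exponentially correlated Gaussian input; all constants are illustrative, not the paper's parameters.

        import numpy as np

        rng = np.random.default_rng(0)

        # Exponentially correlated Gaussian input: an Ornstein-Uhlenbeck process
        # with correlation time tau, via its exact one-step update.
        tau, sigma, dt, n = 1.0, 1.0, 1e-3, 500_000
        rho = np.exp(-dt / tau)
        x = np.empty(n)
        x[0] = rng.normal(0.0, sigma)
        noise = rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2), n)
        for k in range(1, n):
            x[k] = rho * x[k - 1] + noise[k]

        # Integrate-and-fire NPFM: pulse and reset at threshold crossings.
        r, integral, pulses = 0.5, 0.0, 0
        for xk in x:
            integral += xk * dt
            if abs(integral) >= r:
                pulses += 1
                integral = 0.0

        print("average pulse frequency:", pulses / (n * dt), "pulses per unit time")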

  11. Parameter extraction with neural networks

    NASA Astrophysics Data System (ADS)

    Cazzanti, Luca; Khan, Mumit; Cerrina, Franco

    1998-06-01

    In semiconductor processing, the modeling of the process is becoming more and more important. While the ultimate goal is that of developing a set of tools for designing a complete process (Technology CAD), it is also necessary to have modules to simulate the various technologies and, in particular, to optimize specific steps. This need is particularly acute in lithography, where the continuous decrease in CD forces the technologies to operate near their limits. In the development of a 'model' for a physical process, we face several levels of challenges. First, it is necessary to develop a 'physical model,' i.e. a rational description of the process itself on the basis of known physical laws. Second, we need an 'algorithmic model' to represent in a virtual environment the behavior of the 'physical model.' After a 'complete' model has been developed and verified, it becomes possible to do performance analysis. In many cases the input parameters are poorly known or not accessible directly to experiment. It would be extremely useful to obtain the values of these 'hidden' parameters from experimental results by comparing model to data. The problem is particularly severe because the complexity and costs associated with semiconductor processing make a simple 'trial-and-error' approach infeasible and cost-inefficient. Even when computer models of the process already exist, obtaining data through simulations may be time consuming. Neural networks (NN) are powerful computational tools to predict the behavior of a system from an existing data set. They are able to adaptively 'learn' input/output mappings and to act as universal function approximators. In this paper we use artificial neural networks to build a mapping from the input parameters of the process to output parameters which are indicative of the performance of the process. Once the NN has been 'trained,' it is also possible to observe the process 'in reverse,' and to extract the values of the inputs which yield outputs with desired characteristics. Using this method, we can extract optimum values for the parameters and determine the process latitude very quickly.
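
    A compact sketch of the forward-then-reverse use of a neural network described above, with a toy analytic function standing in for the process simulator; the function, parameter ranges, network size, and target value are all assumptions for illustration.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Toy stand-in for the process simulator: inputs -> performance output.
        def process(x):
            return 1.0 + 0.5 * x[:, 0] - 0.3 * x[:, 1] ** 2

        X = rng.uniform(-1.0, 1.0, (500, 2))        # sampled input parameters
        y = process(X)                              # 'simulated' outputs

        net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000).fit(X, y)

        # Reverse use: search the trained forward map for inputs that yield a
        # desired output, i.e. extract parameter values without new simulations.
        target = 1.2
        grid = np.stack(np.meshgrid(np.linspace(-1, 1, 101),
                                    np.linspace(-1, 1, 101)), -1).reshape(-1, 2)
        best = grid[np.argmin(np.abs(net.predict(grid) - target))]
        print("inputs giving output closest to", target, ":", best)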

  12. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zuwei; Zhao, Haibo, E-mail: klinsmannzhb@163.com; Zheng, Chuguang

    2015-01-15

    This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of the particle size distribution with low statistical noise over the full size range and, as far as possible, to reduce the number of time loopings. Here three coagulation rules are highlighted, and it is found that constructing an appropriate coagulation rule provides a route to attain the compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering the two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates used for acceptance–rejection processes by single-looping over all particles, and meanwhile the mean time-step of the coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is greatly reduced, becoming proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by multi-cores on a GPU that can implement massively threaded data-parallel tasks to obtain a remarkable speedup ratio (compared with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC are demonstrated in a physically realistic Brownian coagulation case. The computational accuracy is validated against the benchmark solution of the discrete-sectional method. The simulation results show that the comprehensive approach can attain a very favorable improvement in cost without sacrificing computational accuracy.
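
    A stripped-down, equal-weight sketch of the acceptance-rejection step in a single cell is given below in Python. It uses the continuum-regime Brownian kernel, which depends only on the volume ratio, so evaluating it at the smallest and largest particles gives an exact majorant; the differential weighting, weighted majorant kernel, and GPU parallelism of the paper are omitted.

        import numpy as np

        rng = np.random.default_rng(1)

        def kernel(v1, v2):
            # Continuum-regime Brownian kernel (up to a constant factor); it
            # depends only on the volume ratio, so the min/max pair bounds all.
            return (v1 ** (1 / 3) + v2 ** (1 / 3)) * (v1 ** (-1 / 3) + v2 ** (-1 / 3))

        vols = rng.uniform(1.0, 2.0, 1000)      # initial particle volumes
        t, t_end, cell_volume = 0.0, 1.0, 1.0

        while t < t_end and len(vols) > 1:
            n = len(vols)
            k_hat = kernel(vols.min(), vols.max())          # exact majorant here
            rate = 0.5 * n * (n - 1) * k_hat / cell_volume  # majorant total rate
            t += rng.exponential(1.0 / rate)                # Markov jump waiting time
            i, j = rng.choice(n, size=2, replace=False)     # candidate pair
            if rng.random() < kernel(vols[i], vols[j]) / k_hat:   # accept/reject
                vols[i] += vols[j]                          # coagulate the pair
                vols = np.delete(vols, j)

        print(len(vols), "particles left; mean volume:", vols.mean())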

  13. FPGA based charge acquisition algorithm for soft x-ray diagnostics system

    NASA Astrophysics Data System (ADS)

    Wojenski, A.; Kasprowicz, G.; Pozniak, K. T.; Zabolotny, W.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Krawczyk, R. D.; Zienkiewicz, P.; Chernyshova, M.; Czarski, T.

    2015-09-01

    Soft X-ray (SXR) measurement systems working on tokamaks or with laser-generated plasma can expect high photon fluxes. It is therefore necessary to focus on data processing algorithms with the best possible efficiency in terms of processed photon events per second. This paper describes a recently designed algorithm and data flow for the implementation of charge data acquisition in an FPGA. The algorithms are currently at the implementation stage for the soft X-ray diagnostics system. In addition to the charge processing algorithm, the paper also describes a general firmware overview, data storage methods and other key components of the measurement system. The simulation section presents the algorithm's performance and the expected maximum photon rate.
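
    The abstract does not detail the FPGA algorithm, so the following is only a generic software analogue of charge acquisition, assuming leading-edge triggering and charge-by-summation; all signal parameters are invented.

    ```python
    # Hedged sketch of a generic threshold-based charge acquisition step:
    # detect pulses above threshold, integrate each pulse's charge.
    import numpy as np

    rng = np.random.default_rng(2)
    baseline, threshold = 0.0, 5.0
    samples = rng.normal(baseline, 1.0, 10_000)
    for pos in rng.choice(10_000 - 10, 50, replace=False):  # inject pulses
        samples[pos:pos + 10] += 20.0 * np.exp(-np.arange(10) / 3.0)

    events, i = [], 0
    while i < len(samples):
        if samples[i] > threshold:                 # leading-edge trigger
            j = i
            while j < len(samples) and samples[j] > threshold:
                j += 1
            events.append(samples[i:j].sum())      # charge = integrated area
            i = j
        else:
            i += 1

    print(f"{len(events)} events, mean charge {np.mean(events):.1f}")
    ```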

  14. Application of computational methods to analyse and investigate physical and chemical processes of high-temperature mineralizing of condensed substances in gas stream

    NASA Astrophysics Data System (ADS)

    Markelov, A. Y.; Shiryaevskii, V. L.; Kudrinskiy, A. A.; Anpilov, S. V.; Bobrakov, A. N.

    2017-11-01

    A computational method for analysing the physical and chemical processes of high-temperature mineralization of low-level radioactive waste in a gas stream during plasma treatment of radioactive waste in shaft furnaces is introduced. It is shown that the thermodynamic simulation method describes fairly adequately the changes in the composition of the pyrogas withdrawn from the shaft furnace under different waste treatment regimes. This offers the possibility of developing environmentally and economically viable technologies and small, low-cost facilities for plasma treatment of radioactive waste at currently operating nuclear power plants.

  15. [Logistic and production process in a regional blood center: modeling and analysis].

    PubMed

    Baesler, Felipe; Martínez, Cristina; Yaksic, Eduardo; Herrera, Claudia

    2011-09-01

    The blood supply chain is a complex system in which different interconnected elements have to be synchronized correctly to satisfy, in quality and quantity, the final patient requirements. The aim of this work was to determine the blood center's maximum production capacity and the changes necessary for a future expansion of that capacity. The work was carried out in the Blood Center of Concepción, Chile, where operations management tools were applied to model the production process and to propose improvement alternatives. The use of simulation is highlighted, which permitted replicating the center's behavior and evaluating expansion alternatives. The results show that a 100% increase in blood demand can be absorbed without major changes or investments in the production process. It was also possible to determine the subsequent steps, in terms of investments in equipment and human resources, for a future expansion of the center's coverage. The techniques used to model the production process allowed us to analyze how the center operates, to detect bottlenecks, and to support decision making for a future expansion of its capacity.
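
    A minimal sketch of the kind of discrete-event model such a study builds, here in SimPy rather than the (unnamed) tool used by the authors; the stages, staffing levels and service times are invented for illustration.

    ```python
    # Hedged sketch: a donation-to-release line as a discrete-event model.
    import simpy, random

    random.seed(3)
    PROCESS_MIN = {"screening": 10, "collection": 25, "testing": 45}

    def donation(env, staff, released):
        for stage, minutes in PROCESS_MIN.items():
            with staff[stage].request() as req:
                yield req                                   # wait for a tech
                yield env.timeout(random.expovariate(1.0 / minutes))
        released.append(env.now)

    def arrivals(env, staff, released):
        while True:
            yield env.timeout(random.expovariate(1.0 / 15))  # donor ~15 min
            env.process(donation(env, staff, released))

    env = simpy.Environment()
    staff = {s: simpy.Resource(env, capacity=2) for s in PROCESS_MIN}
    released = []
    env.process(arrivals(env, staff, released))
    env.run(until=8 * 60)                                    # one 8-hour shift
    print(f"units released in shift: {len(released)}")
    ```

    Increasing a stage's `capacity` and re-running is the simulation analogue of the capacity-expansion questions the study answers.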

  16. Implementation of Lean System on Erbium Doped Fibre Amplifier Manufacturing Process to Reduce Production Time

    NASA Astrophysics Data System (ADS)

    Maneechote, T.; Luangpaiboon, P.

    2010-10-01

    The manufacturing process of erbium-doped fibre amplifiers is complicated. It must meet customers' requirements in an economic climate in which products need to be shipped as soon as possible after purchase orders are placed. This research aims to study and improve the processes and production lines of erbium-doped fibre amplifiers using lean manufacturing systems via computer simulation. Three scenarios of lean tool-box systems were selected via the expert system. In the first, a production schedule based on shipment date is combined with a first-in-first-out control system. The second scenario focuses on a redesigned flow-process plant layout. The third combines the flow-process plant layout with the shipment-date-based production schedule and the first-in-first-out control system. Computer simulation with the limited data, via expected values, is used to observe the performance of all scenarios. The most preferable lean tool-box systems from the simulation were then implemented in the real production process. A comparison was carried out to determine the actual performance via an analysis of variance of the response, the production time per unit, achieved in each scenario. The adequacy of the linear statistical model was checked through the experimental errors, or residuals, for normality, constant variance and independence. The results show that the hybrid scenario combining first-in-first-out control with the flow-process plant layout statistically leads to better performance in terms of the mean and variance of production times.
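
    A hedged sketch of the reported analysis step: a one-way ANOVA on production time per unit followed by the residual checks named in the abstract, run on synthetic data.

    ```python
    # Hedged sketch of the ANOVA and model-adequacy checks; the
    # production-time samples below are synthetic, not from the study.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    scenarios = {                       # production time per unit (minutes)
        "fifo_schedule": rng.normal(52, 4, 30),
        "flow_layout":   rng.normal(50, 4, 30),
        "hybrid":        rng.normal(46, 3, 30),
    }

    f_stat, p_val = stats.f_oneway(*scenarios.values())
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

    # Residuals = observations minus their scenario mean; then check the
    # usual adequacy assumptions named in the abstract.
    residuals = np.concatenate([x - x.mean() for x in scenarios.values()])
    print("normality (Shapiro-Wilk) p =", stats.shapiro(residuals).pvalue)
    print("equal variance (Levene)  p =",
          stats.levene(*scenarios.values()).pvalue)
    ```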

  17. Risk Reduction and Training using Simulation Based Tools - 12180

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Irin P.

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps in the direction of using simulation based tools in the nuclear domain, the applications described here represent only a small cross section of possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domain, this may mean the creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors.

  18. Simulation of complex pharmacokinetic models in Microsoft Excel.

    PubMed

    Meineke, Ingolf; Brockmöller, Jürgen

    2007-12-01

    With the arrival of powerful personal computers in the office, numerical methods are accessible to everybody. Simulation of complex processes has therefore become an indispensable tool in research and education. In this paper, Microsoft EXCEL is used as a platform for a universal differential equation solver. The software is designed as an add-in, aiming at a minimum of required user input to perform a given task. Four examples are included to demonstrate both the simplicity of use and the versatility of possible applications. While the layout of the program is admittedly geared to the needs of pharmacokineticists, it can be used in any field where sets of differential equations are involved. The software package is available upon request.
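
    The paper's solver is an EXCEL add-in; as a language-neutral illustration of the kind of system it integrates, here is a one-compartment pharmacokinetic model with first-order absorption in Python (parameter values are invented).

    ```python
    # Hedged sketch of a standard PK ODE system; values are illustrative.
    import numpy as np
    from scipy.integrate import solve_ivp

    ka, ke, V, dose = 1.2, 0.25, 40.0, 500.0   # 1/h, 1/h, L, mg

    def pk(t, y):
        gut, central = y
        return [-ka * gut,                      # absorption from the gut
                ka * gut - ke * central]        # appearance minus elimination

    sol = solve_ivp(pk, (0, 24), [dose, 0.0], dense_output=True)
    t = np.linspace(0, 24, 7)
    conc = sol.sol(t)[1] / V                    # plasma concentration, mg/L
    for ti, ci in zip(t, conc):
        print(f"t = {ti:4.1f} h  C = {ci:5.2f} mg/L")
    ```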

  19. Remarks on the maximum luminosity

    NASA Astrophysics Data System (ADS)

    Cardoso, Vitor; Ikeda, Taishi; Moore, Christopher J.; Yoo, Chul-Moon

    2018-04-01

    The quest for fundamental limitations on physical processes is old and venerable. Here, we investigate the maximum possible power, or luminosity, that any event can produce. We show, via full nonlinear simulations of Einstein's equations, that there exist initial conditions which give rise to arbitrarily large luminosities. However, the requirement that there is no past horizon in the spacetime seems to limit the luminosity to below the Planck value, L_P = c^5/G. Numerical relativity simulations of critical collapse yield the largest luminosities observed to date, ≈ 0.2 L_P. We also present an analytic solution to the Einstein equations which seems to give an unboundedly large luminosity; this will guide future numerical efforts to investigate super-Planckian luminosities.
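
    A quick numeric check of the Planck luminosity quoted above:

    ```python
    # L_P = c^5 / G evaluated from CODATA constants.
    from scipy.constants import c, G

    L_P = c ** 5 / G
    print(f"L_P = {L_P:.3e} W")          # ~3.63e52 W
    print(f"0.2 L_P = {0.2 * L_P:.3e} W")
    ```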

  20. Humanoid Flight Metabolic Simulator Project

    NASA Technical Reports Server (NTRS)

    Ross, Stuart

    2015-01-01

    NASA's Evolvable Mars Campaign (EMC) has identified several areas of technology that will require significant improvements in terms of performance, capacity, and efficiency, in order to make a manned mission to Mars possible. These include crew vehicle Environmental Control and Life Support System (ECLSS), EVA suit Portable Life Support System (PLSS) and Information Systems, autonomous environmental monitoring, radiation exposure monitoring and protection, and vehicle thermal control systems (TCS). (MADMACS) in a Suit can be configured to simulate human metabolism, consuming crew resources (oxygen) in the process. In addition to providing support for testing Life Support on unmanned flights, MADMACS will also support testing of suit thermal controls, and monitor radiation exposure, body zone temperatures, moisture, and loads.

  1. Phantom-based interactive simulation system for dental treatment training.

    PubMed

    Sae-Kee, Bundit; Riener, Robert; Frey, Martin; Pröll, Thomas; Burgkart, Rainer

    2004-01-01

    In this paper, we propose a new interactive simulation system for dental treatment training. The system comprises a virtual reality environment and a force-torque measuring device that enhance the capabilities of a passive phantom of tooth anatomy in dental treatment training. The measuring device is connected to the phantom and provides the essential input data for generating graphic animations of physical behaviors such as drilling and bleeding. The animation methods for those physical behaviors are also presented. The system not only enhances the interactivity and accessibility of training compared with conventional methods, but also provides possibilities for recording, evaluating, and verifying the training results.

  2. A cellular automata model of SARS epidemic spreading

    NASA Astrophysics Data System (ADS)

    Xu, Tian; Zhang, Peipei; Su, Beibei; Jiang, Yumai; He, Da-Ren

    2004-03-01

    We suggest a cellular automata model for simulating the process of SARS spreading in Beijing. Suppose a number of people are located on a two-dimensional lattice, in which a certain portion are immune and the others are susceptible. In every time step, each susceptible person may become ill with a certain probability if one of his 8 neighbors is a SARS patient. At the same time, all the people may change their positions with some probability. Each patient recovers or dies after a different number of days, and a recovered patient becomes immune. Numerical simulation with this model yields results in good agreement with the practical statistical data.
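
    A minimal sketch of such a lattice model, assuming invented probabilities and treating recovery and death identically as removal; the position-changing step is omitted for brevity.

    ```python
    # Hedged sketch: susceptible / infected (day counter) / immune states
    # on a 2D lattice with 8-neighbor (Moore) infection.
    import numpy as np

    rng = np.random.default_rng(5)
    N, p_infect, days_ill = 100, 0.05, 14
    SUSCEPTIBLE, IMMUNE = 0, -1
    grid = np.where(rng.random((N, N)) < 0.2, IMMUNE, SUSCEPTIBLE)
    grid[N // 2, N // 2] = days_ill          # positive values = days left ill

    for step in range(120):
        infected = grid > 0
        # Count infected among the 8 neighbors via rolled copies.
        nbrs = sum(np.roll(np.roll(infected, dx, 0), dy, 1)
                   for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                   if (dx, dy) != (0, 0))
        catch = (grid == SUSCEPTIBLE) & \
                (rng.random((N, N)) < 1 - (1 - p_infect) ** nbrs)
        grid[infected] -= 1                    # one more day of illness
        grid[infected & (grid <= 0)] = IMMUNE  # recovered/removed -> immune
        grid[catch] = days_ill

    print("currently infected:", int((grid > 0).sum()))
    ```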

  3. Mathematical simulation of the amplification of 1790-nm laser radiation in a nuclear-excited He - Ar plasma containing nanoclusters of uranium compounds

    NASA Astrophysics Data System (ADS)

    Kosarev, V. A.; Kuznetsova, E. E.

    2014-02-01

    The possibility of applying dusty active media in nuclear-pumped lasers has been considered. The amplification of 1790-nm radiation in a nuclear-excited dusty He - Ar plasma is studied by mathematical simulation. The influence of nanoclusters on the component composition of the medium and the kinetics of the processes occurring in it is analysed using a specially developed kinetic model including 72 components and more than 400 reactions. An analysis of the results indicates that amplification can in principle be achieved in an active He - Ar laser medium containing 10-nm nanoclusters of metallic uranium and uranium dioxide.

  4. Characterization and modeling of radiation effects in NASA/MSFC semiconductor devices

    NASA Technical Reports Server (NTRS)

    Kerns, D. V., Jr.; Cook, K. B., Jr.

    1978-01-01

    A literature review of the near-Earth trapped radiation of the Van Allen Belts, the radiation within the solar system resulting from the solar wind, and the cosmic radiation levels of deep space showed that a reasonable simulation of space radiation, particularly of the Earth orbital environment, could be achieved in the laboratory by proton bombardment. A 3 MeV proton accelerator was used to irradiate CMOS integrated circuits fabricated with three different processes. The drain current and output voltage of three inverters were recorded as the input voltage was swept from zero to ten volts after each successive irradiation. Device parameters were extracted. Possible damage mechanisms are discussed and recommendations for improved radiation hardness are suggested.

  5. Use of EPANET solver to manage water distribution in Smart City

    NASA Astrophysics Data System (ADS)

    Antonowicz, A.; Brodziak, R.; Bylka, J.; Mazurkiewicz, J.; Wojtecki, S.; Zakrzewski, P.

    2018-02-01

    This paper presents a method of using the EPANET solver to support the management of a water distribution system in a Smart City. The main task is the development of an application that allows remote access to a simulation model of the water distribution network developed in the EPANET environment. The application can perform both single and cyclic simulations with a specified step for changing the values of selected process variables. The architecture of the application is presented. The application supports the selection of the best device control algorithm using optimization methods; optimization is possible with the following methods: brute force, SLSQP (Sequential Least SQuares Programming) and the modified Powell method. The article is supplemented by an example of using the developed tool.
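
    A hedged sketch of the optimization loop such an application might run, using scipy's SLSQP (one of the methods named above) over a hypothetical stand-in for an EPANET pressure query; `run_epanet_pressure` is not a real API, only a placeholder.

    ```python
    # Hedged sketch: SLSQP tuning a pump speed against a stand-in model.
    from scipy.optimize import minimize

    TARGET_PRESSURE = 40.0          # desired node pressure, m of head

    def run_epanet_pressure(pump_speed):
        """Hypothetical stand-in for a call into an EPANET model."""
        return 25.0 + 22.0 * pump_speed - 6.0 * pump_speed ** 2

    def cost(x):
        return (run_epanet_pressure(x[0]) - TARGET_PRESSURE) ** 2

    res = minimize(cost, x0=[1.0], method="SLSQP",
                   bounds=[(0.5, 1.5)])     # allowed relative pump speed
    print(f"best pump speed: {res.x[0]:.3f}, cost: {res.fun:.2e}")
    ```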

  6. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
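
    A minimal sketch of the Monte Carlo idea, assuming a simple rule-of-mixtures ply model and invented property distributions; the study's actual micromechanics are more detailed.

    ```python
    # Hedged sketch: sample uncertain constituent properties and propagate
    # them through a rule-of-mixtures estimate of the ply modulus.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 10_000
    Ef = rng.normal(230.0, 10.0, n)    # fiber modulus, GPa
    Em = rng.normal(3.5, 0.2, n)       # matrix modulus, GPa
    Vf = rng.normal(0.60, 0.03, n)     # fiber volume fraction
    Vv = rng.uniform(0.00, 0.02, n)    # void volume fraction

    # Longitudinal ply modulus by rule of mixtures, degraded by voids.
    E1 = (Ef * Vf + Em * (1 - Vf)) * (1 - Vv)

    print(f"E1 mean = {E1.mean():.1f} GPa, std = {E1.std():.1f} GPa")
    print(f"5th-95th percentile: {np.percentile(E1, [5, 95]).round(1)} GPa")
    ```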

  7. Bridgman growth of semiconductors

    NASA Technical Reports Server (NTRS)

    Carlson, F. M.

    1985-01-01

    The purpose of this study was to improve the understanding of the transport phenomena which occur in the directional solidification of alloy semiconductors. In particular, emphasis was placed on the strong role of convection in the melt. Analytical solutions were not deemed possible for such an involved problem. Accordingly, a numerical model of the process was developed which simulated the transport. This translates into solving the partial differential equations of energy, mass, species, and momentum transfer subject to various boundary and initial conditions. A finite element method with simple elements was initially chosen. This simulation tool will enable the crystal grower to systematically identify and modify the important design factors within her control to produce better crystals.

  8. Cascade Defect Evolution Processes: Comparison of Atomistic Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Haixuan; Stoller, Roger E; Osetskiy, Yury N

    2013-11-01

    Determining the defect evolution beyond the molecular dynamics (MD) time scale is critical in bridging the gap between atomistic simulations and experiments. The recently developed self-evolving atomistic kinetic Monte Carlo (SEAKMC) method provides new opportunities to simulate long-term defect evolution with MD-like fidelity. In this study, SEAKMC is applied to investigate cascade defect evolution in bcc iron. First, the evolution of a vacancy-rich region is simulated and compared with results obtained using autonomous basin climbing (ABC)+KMC and kinetic activation-relaxation technique (kART) simulations. Previously, it was found that kART is orders of magnitude faster than ABC+KMC. The results obtained from SEAKMC are similar to kART, but the predicted time is about one order of magnitude faster than kART. The fidelity of SEAKMC is confirmed by statistically relevant MD simulations at multiple higher temperatures, which proves that the saddle point sampling is close to complete in SEAKMC. The second application is the irradiation-induced formation of C15 Laves phase nano-size defect clusters. In contrast to previous studies, which claim the defects can grow by capturing self-interstitials, we found these highly stable clusters can transform to a <111> glissile configuration on a much longer time scale. Finally, cascade-annealing simulations using SEAKMC are compared with the traditional object KMC (OKMC) method. SEAKMC predicts substantially fewer surviving defects compared with OKMC. The possible origin of this difference is discussed and a possible way to improve the accuracy of OKMC based on SEAKMC results is outlined. These studies demonstrate the atomistic fidelity of SEAKMC in comparison with other on-the-fly KMC methods and provide new information on long-term defect evolution in iron.
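
    For readers unfamiliar with the method family, below is a minimal residence-time kinetic Monte Carlo step, the core that SEAKMC, kART and OKMC variants share; the barriers and prefactor are placeholders for saddle-point-derived values.

    ```python
    # Hedged sketch of the residence-time (BKL/Gillespie) KMC step.
    import numpy as np

    rng = np.random.default_rng(7)
    kB_T = 0.025                                  # eV, ~room temperature
    barriers = np.array([0.30, 0.45, 0.62, 0.70]) # eV, per available event
    nu0 = 1e13                                    # attempt frequency, 1/s

    t = 0.0
    for step in range(5):
        rates = nu0 * np.exp(-barriers / kB_T)    # harmonic TST rates
        total = rates.sum()
        event = rng.choice(len(rates), p=rates / total)  # pick one event
        t += rng.exponential(1.0 / total)          # advance the clock
        print(f"step {step}: event {event}, t = {t:.3e} s")
    ```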

  9. Petri net based model of the body iron homeostasis.

    PubMed

    Formanowicz, Dorota; Sackmann, Andrea; Formanowicz, Piotr; Błazewicz, Jacek

    2007-10-01

    Body iron homeostasis is a complex process that is not fully understood. Although some components of this process have been described in the literature, a complete model of the whole process has not been proposed. In this paper, a Petri net based model of body iron homeostasis is presented. Recently, Petri nets have been used for describing and analyzing various biological processes, since they allow the system under consideration to be modeled very precisely. The main result presented in the paper is twofold: an informal description of the main part of the iron homeostasis process is given, and this description is then formulated in the formal language of Petri net theory. The model allows for simulation of the process, since Petri net theory provides many established analysis techniques.
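
    A minimal sketch of the Petri net mechanics such a model builds on: places hold tokens, and a transition fires when every input place is sufficiently marked. The place and transition names below are invented, not from the paper's net, which is far larger.

    ```python
    # Hedged sketch of basic place/transition Petri net firing.
    import random

    marking = {"plasma_iron": 2, "transferrin": 1, "cell_iron": 0}
    transitions = {
        "uptake": ({"plasma_iron": 1, "transferrin": 1},   # consumed tokens
                   {"cell_iron": 1, "transferrin": 1}),    # produced tokens
        "export": ({"cell_iron": 1}, {"plasma_iron": 1}),
    }

    def enabled(pre):
        return all(marking[p] >= n for p, n in pre.items())

    random.seed(8)
    for step in range(6):
        ready = [t for t, (pre, _) in transitions.items() if enabled(pre)]
        if not ready:
            break
        t = random.choice(ready)                 # nondeterministic choice
        pre, post = transitions[t]
        for p, n in pre.items():
            marking[p] -= n
        for p, n in post.items():
            marking[p] += n
        print(step, t, marking)
    ```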

  10. Dissociation of a Dynamic Protein Complex Studied by All-Atom Molecular Simulations.

    PubMed

    Zhang, Liqun; Borthakur, Susmita; Buck, Matthias

    2016-02-23

    The process of protein complex dissociation remains to be understood at the atomic level of detail. Computers now allow microsecond timescale molecular-dynamics simulations, which make the visualization of such processes possible. Here, we investigated the dissociation process of the EphA2-SHIP2 SAM-SAM domain heterodimer complex using unrestrained all-atom molecular-dynamics simulations. Previous studies on this system have shown that alternate configurations are sampled, that their interconversion can be fast, and that the complex is dynamic by nature. Starting from different NMR-derived structures, mutants were designed to stabilize a subset of configurations by swapping ion pairs across the protein-protein interface. We focused on two mutants, K956D/D1235K and R957D/D1223R, with attenuated binding affinity compared with the wild-type proteins. In contrast to calculations on the wild-type complexes, the majority of simulations of these mutants showed protein dissociation within 2.4 μs. During the separation process, we observed domain rotation and pivoting as well as a translation and simultaneous rolling, typically to alternate and weaker binding interfaces. Several unsuccessful recapturing attempts occurred once the domains were moderately separated. An analysis of protein solvation suggests that the dissociation process correlates with a progressive loss of protein-protein contacts. Furthermore, an evaluation of internal protein dynamics using quasi-harmonic and order parameter analyses indicates that changes in protein internal motions are expected to contribute significantly to the thermodynamics of protein dissociation. Considering protein association as the reverse of the separation process, the initial role of charged/polar interactions is emphasized, followed by changes in protein and solvent dynamics. The trajectories show that protein separation does not follow a single distinct pathway, but suggest that the mechanism of dissociation is common in that it initially involves transitions to surfaces with fewer, less favorable contacts compared with those seen in the fully formed complex.

  11. ATP concentration as possible marker of liver damage at leukaemia treatment: confocal microscopy-based experimental study and numerical simulations

    NASA Astrophysics Data System (ADS)

    Malashchenko, V.; Zyubin, A.; Babak, S.; Lavrova, A.

    2017-04-01

    We consider confocal microscopy as a convenient instrument for determining chemical compounds in biological tissues and cells. In particular, we study the dynamics of adenosine triphosphate (ATP) concentration, which could be used as a bio-marker of energy metabolism pathologies during the treatment of acute lymphoblastic leukaemia (ALL). On the basis of the data obtained by confocal microscopy, the values of ATP concentration were calculated for each case. Possible correlations with other characteristics of the pathology, obtained from the plasma of leukaemia patients, suggest that the ATP value could be a prognostic factor for treatment success. The role of ATP in drug metabolism switching is also discussed in the context of kinetic modelling of the metabolic processes leading to the production of 6-thioguanosine monophosphate, the principal acting agent in this chemotherapy.

  12. Carbon membranes for efficient water-ethanol separation.

    PubMed

    Gravelle, Simon; Yoshida, Hiroaki; Joly, Laurent; Ybert, Christophe; Bocquet, Lydéric

    2016-09-28

    We demonstrate, on the basis of molecular dynamics simulations, the possibility of an efficient water-ethanol separation using nanoporous carbon membranes, namely, carbon nanotube membranes, nanoporous graphene sheets, and multilayer graphene membranes. While these carbon membranes are in general permeable to both pure liquids, they exhibit a counter-intuitive "self-semi-permeability" to water in the presence of water-ethanol mixtures. This originates in a preferred ethanol adsorption in nanoconfinement that prevents water molecules from entering the carbon nanopores. An osmotic pressure is accordingly expressed across the carbon membranes for the water-ethanol mixture, which agrees with the classic van't Hoff type expression. This suggests a robust and versatile membrane-based separation, built on a pressure-driven reverse-osmosis process across these carbon-based membranes. In particular, the recent development of large-scale "graphene-oxide" like membranes then opens an avenue for a versatile and efficient ethanol dehydration using this separation process, with possible application for bio-ethanol fabrication.
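
    A quick numeric illustration of the van't Hoff-type estimate mentioned above, Pi = cRT, with an invented solute concentration:

    ```python
    # Van't Hoff osmotic pressure for an illustrative concentration.
    R, T = 8.314, 300.0          # J/(mol K), K
    c = 1.0e3                    # solute concentration, mol/m^3 (= 1 mol/L)
    Pi = c * R * T               # osmotic pressure, Pa
    print(f"Pi = {Pi/1e5:.1f} bar")   # ~24.9 bar
    ```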

  14. Modeling antigen-antibody nanoparticle bioconjugates and their polymorphs

    NASA Astrophysics Data System (ADS)

    Desgranges, Caroline; Delhommelle, Jerome

    2018-03-01

    The integration of nanomaterials with biomolecules has recently led to the development of new ways of designing biosensors, and through their assembly, to new hybrid structures for novel and exciting applications. In this work, we develop a coarse-grained model for nanoparticles grafted with antibody molecules and their binding with antigens. In particular, we isolate two possible states for antigen-antibody pairs during the binding process, termed recognition and anchoring states. Using molecular simulation, we calculate the thermodynamic and structural features of three possible crystal structures or polymorphs, the body-centered cubic, simple cubic, and face-centered cubic phases, and of the melt. This leads us to determine the domain of stability of the three solid phases. In particular, the role played by the switching process between anchoring and recognition states during melting is identified, shedding light on the complex microscopic mechanisms in these systems.

  15. Combustion distribution control using the extremum seeking algorithm

    NASA Astrophysics Data System (ADS)

    Marjanovic, A.; Krstic, M.; Djurovic, Z.; Kvascev, G.; Papic, V.

    2014-12-01

    Quality regulation of the combustion process inside the furnace is central to the demands for increased robustness, safety and efficiency of thermal power plants. The paper considers the possibility of controlling the spatial temperature distribution inside the boiler by correcting the distribution of coal over the mills. Such a control system keeps the flame focus away from the walls of the boiler, which preserves the equipment and reduces the possibility of ash slagging. At the same time, uniform heat dissipation across the mills enhances the energy efficiency of the boiler while reducing the pollution of the system. A constrained multivariable extremum seeking algorithm is proposed as a tool for combustion process optimization, with the main objective of centralizing the flame in the furnace. Simulations are conducted on a model corresponding to the 350 MW boiler of the Nikola Tesla Power Plant in Obrenovac, Serbia.
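
    A minimal sketch of a basic single-parameter extremum seeking loop; the paper uses a constrained multivariable version, and the plant map here is invented.

    ```python
    # Hedged sketch: perturb, demodulate, integrate - the classic
    # extremum seeking recipe, shown for one parameter.
    import math

    def plant(u):                        # unknown map with a maximum at u = 2
        return -(u - 2.0) ** 2

    a, omega, gain, dt = 0.1, 5.0, 0.8, 0.01
    u_hat, t = 0.0, 0.0
    for _ in range(20_000):
        d = a * math.sin(omega * t)      # probing perturbation
        y = plant(u_hat + d)             # measured objective
        u_hat += gain * y * math.sin(omega * t) * dt  # demodulate + integrate
        t += dt

    print(f"estimated optimum: {u_hat:.3f} (true: 2.000)")
    ```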

  16. Constructing event trees for volcanic crises

    USGS Publications Warehouse

    Newhall, C.; Hoblitt, R.

    2002-01-01

    Event trees are useful frameworks for discussing probabilities of possible outcomes of volcanic unrest. Each branch of the tree leads from a necessary prior event to a more specific outcome, e.g., from an eruption to a pyroclastic flow. Where volcanic processes are poorly understood, probability estimates might be purely empirical, utilizing observations of past and current activity and an assumption that the future will mimic the past or follow a present trend. If processes are better understood, probabilities might be estimated from a theoretical model, either subjectively or by numerical simulations. Use of Bayes' theorem aids in the estimation of how fresh unrest raises (or lowers) the probabilities of eruptions. Use of event trees during volcanic crises can help volcanologists to critically review their analysis of hazard, and help officials and individuals to compare volcanic risks with more familiar risks. Trees also emphasize the inherently probabilistic nature of volcano forecasts, with multiple possible outcomes.
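
    A hedged numeric illustration of the Bayes' theorem step mentioned above; all probabilities are invented for the example.

    ```python
    # How observed unrest updates an eruption probability (toy numbers).
    p_eruption = 0.10                # prior: eruption within a year
    p_unrest_if_eruption = 0.90      # unrest usually precedes an eruption
    p_unrest_if_quiet = 0.20         # but unrest also occurs without one

    p_unrest = (p_unrest_if_eruption * p_eruption
                + p_unrest_if_quiet * (1 - p_eruption))
    posterior = p_unrest_if_eruption * p_eruption / p_unrest
    print(f"P(eruption | unrest) = {posterior:.2f}")   # 0.09/0.27 = 0.33
    ```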

  17. Simulating eroded soil organic carbon with the SWAT-C model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xuesong

    Soil erosion and the associated lateral movement of eroded carbon (C) have been identified as a possible mechanism explaining the elusive terrestrial C sink of ca. 1.7-2.6 PgC yr(-1). Here we evaluate the SWAT-C model for simulating long-term soil erosion and the associated eroded C yields. Our method couples the CENTURY carbon cycling processes with the Modified Universal Soil Loss Equation (MUSLE) to estimate C losses associated with soil erosion. The results show that SWAT-C simulates long-term average eroded C yields well and correctly estimates the relative magnitude of eroded C yields across crop rotations. We also evaluated three methods of calculating the C enrichment ratio in mobilized sediments, and found that errors associated with enrichment-ratio estimation represent a significant uncertainty in SWAT-C simulations. Furthermore, we discuss limitations and future development directions for SWAT-C to advance C cycling modeling and assessment.
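
    A hedged sketch of a MUSLE-style event sediment-yield estimate, using the form commonly cited for SWAT, sed = 11.8 (Q_surf · q_peak · A)^0.56 · K · C · P · LS · CFRG; all input values are invented and the eroded-C step is only indicative.

    ```python
    # Hedged sketch: MUSLE sediment yield, then a crude eroded-C estimate.
    q_surf = 12.0    # surface runoff volume, mm H2O
    q_peak = 0.8     # peak runoff rate, m^3/s
    area = 25.0      # drainage-unit area, ha
    K, C, P, LS, CFRG = 0.28, 0.20, 1.0, 1.2, 1.0   # USLE-type factors

    sed = 11.8 * (q_surf * q_peak * area) ** 0.56 * K * C * P * LS * CFRG
    print(f"sediment yield ~ {sed:.1f} t")   # metric tons for the event

    # Eroded organic C scales with sediment, soil C content and an
    # enrichment ratio (eroded fines are C-enriched); values invented.
    soil_c_frac, enrichment = 0.02, 1.8
    print(f"eroded C ~ {sed * soil_c_frac * enrichment:.2f} t")
    ```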

  18. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  19. Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics

    DOE PAGES

    Laney, Daniel; Langer, Steven; Weber, Christopher; ...

    2014-01-01

    This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal-processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time step to approximate the effects of tightly coupled compression, and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
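
    A hedged sketch of the evaluation idea: apply a toy lossy scheme (mantissa-bit truncation, not the compressors used in the paper) and judge it with a physics-motivated metric such as total energy rather than a generic error norm.

    ```python
    # Hedged sketch: lossy round-trip by zeroing low-order mantissa bits,
    # judged by the relative error in a physics-motivated quantity.
    import numpy as np

    rng = np.random.default_rng(9)
    field = rng.normal(0.0, 1.0, (256, 256)).astype(np.float64)

    def lossy_roundtrip(x, keep_bits):
        """Toy lossy scheme: truncate the float64 mantissa (52 bits)."""
        as_int = x.view(np.int64)
        mask = ~np.int64((1 << (52 - keep_bits)) - 1)
        return (as_int & mask).view(np.float64)

    energy = np.sum(field ** 2)
    for keep_bits in (16, 8, 4):
        approx = lossy_roundtrip(field, keep_bits)
        rel_err = abs(np.sum(approx ** 2) - energy) / energy
        print(f"{keep_bits:2d} mantissa bits kept: "
              f"relative energy error = {rel_err:.2e}")
    ```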

  20. High fidelity studies of exploding foil initiator bridges, Part 1: Experimental method

    NASA Astrophysics Data System (ADS)

    Bowden, Mike; Neal, William

    2017-01-01

    Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) detonators and Exploding Foil Initiators (EFI), have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage and, in the case of EFIs, flyer velocity. Correspondingly, experimental methods have in general been limited to the same parameters. With the advent of complex, first-principles magnetohydrodynamic codes such as ALEGRA and ALE-MHD, it is now possible to simulate these components in three dimensions, predicting a much greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately validated. In this first paper of a three-part study, the experimental method for determining the current, voltage, flyer velocity and multi-dimensional profile of detonator components is presented. This improved capability, along with high-fidelity simulations, offers an opportunity to gain a greater understanding of the processes behind the functioning of EBW and EFI detonators.
