Aerodynamic study of a turbulent jet impinging on a concave wall
NASA Astrophysics Data System (ADS)
LeBlanc, Benoit
Given the growing demand for high temperatures in the combustion chambers of aerospace propulsion systems (turboshaft engines, jet engines, etc.), interest in impinging-jet cooling has grown. Cooling the turbine blades allows a higher combustion temperature, which translates into higher combustion efficiency and therefore better fuel economy. Heat transfer in the blading is influenced by the aerodynamics of jet cooling, particularly for turbulent flows. A poor understanding of the aerodynamics inside these confined spaces can lead to unexpected changes in heat transfer, which increases the risk of creep. It is therefore of interest to both the aerospace industry and academia to pursue research into the aerodynamics of turbulent jets impinging on curved walls. Jets impinging on curved surfaces have already been the subject of numerous studies. However, oscillatory conditions observed in the laboratory have proven difficult to reproduce numerically, since the flow structures of jets impinging on concave walls depend strongly on turbulence and unsteady effects. An experimental study was carried out at the PPRIME Institute at the Université de Poitiers to observe the oscillation phenomenon in the jet. A series of tests examined laminar and turbulent flow conditions; however, the cost of the experiments allowed only a glimpse of the overall phenomenon. A second series of tests was carried out numerically at the Université de Moncton with OpenFOAM, for laminar, two-dimensional flow conditions.
The goal of this study is therefore to continue the investigation of the oscillatory aerodynamics of jets impinging on curved walls, but for a transitional, turbulent, three-dimensional flow regime. The Reynolds numbers used in the numerical study, based on the diameter of the slot jet observed, are Red = 3333 and 6667, considered to be in transition to turbulence. In this study, a numerical setup is constructed: the mesh, the numerical scheme, the boundary conditions and the discretization are discussed and chosen. The results are then validated against experimental turbulence data. In numerical turbulence modeling, Reynolds-Averaged Navier-Stokes (RANS) models struggle with unsteady flows in the transitional regime. Large Eddy Simulation (LES) provides a more accurate solution, but at a cost still out of reach for this study. The method employed here is Detached Eddy Simulation (DES), a hybrid of the two (RANS and LES). To analyze the flow topology, proper orthogonal decomposition (POD) was also performed on the numerical results. The study first demonstrated the relatively high computation time associated with DES runs needed to keep the Courant number low. The numerical results nevertheless succeeded in correctly reproducing the asynchronous flapping observed in the experiments. The observed flapping appears to be caused by transitional effects, which would explain the difficulty RANS models have in correctly reproducing the flow aerodynamics. The jet flow, in turn, is three-dimensional and turbulent most of the time, except for short periods when it is stable and independent of the third dimension.
The topological study of the flow also allowed the identification of the main underlying structures that were blurred by the turbulence. Keywords: impinging jet, concave wall, turbulence, transitional, detached eddy simulation (DES), OpenFOAM.
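As an illustration of the snapshot POD used to extract dominant flow structures from simulation data, here is a minimal sketch; the snapshot matrix, grid size and synthetic oscillation below are hypothetical, not data from the thesis:

```python
import numpy as np

def snapshot_pod(snapshots):
    """Snapshot POD: rows are spatial points, columns are time snapshots.
    Returns spatial modes, singular values, and relative modal energies."""
    mean = snapshots.mean(axis=1, keepdims=True)
    fluct = snapshots - mean                      # subtract the mean field
    modes, sigma, vt = np.linalg.svd(fluct, full_matrices=False)
    energy = sigma**2 / np.sum(sigma**2)          # relative energy of each mode
    return modes, sigma, energy

# Hypothetical data: 500 grid points, 64 snapshots of a synthetic oscillation
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 500)[:, None]
t = np.linspace(0, 2 * np.pi, 64)[None, :]
field = np.sin(2 * np.pi * x) * np.cos(t) + 0.01 * rng.standard_normal((500, 64))
modes, sigma, energy = snapshot_pod(field)
print(f"mode 1 captures {energy[0]:.1%} of the fluctuation energy")
```

A single dominant mode emerging from a noisy field, as here, is exactly the kind of "underlying structure blurred by turbulence" the topological study refers to.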
NASA Astrophysics Data System (ADS)
Goyette, Stephane
1995-11-01
The subject of this thesis is regional climate numerical modeling. The main objective is to develop a regional climate model capable of simulating mesoscale phenomena. Our study area is the North American West Coast, which drew our attention because of the complexity of its relief and the control it exerts on the climate. The motivations for this study are manifold: on the one hand, we cannot, in practice, increase the coarse spatial resolution of general circulation models (GCMs) of the atmosphere without drastically increasing integration costs; on the other hand, environmental management increasingly demands regional climate data at finer spatial resolution. Until now, GCMs have been the models most valued for their ability to simulate the climate and global climate change. However, fine-scale climate phenomena still elude GCMs because of their coarse spatial resolution. Moreover, the socio-economic repercussions of possible climate change are closely tied to phenomena imperceptible to current GCMs. To circumvent certain resolution-related problems, a practical approach is to take a limited spatial domain of a GCM and nest within it another numerical model with a high-resolution grid. This nesting process then entails a new numerical simulation. This "retro-simulation" is guided over the restricted domain by information supplied by the GCM and forced by mechanisms handled solely by the nested model. Thus, to refine the spatial precision of large-scale climate predictions, we develop here a numerical model called FIZR, which yields regional climate information valid at fine spatial scales.
This new class of nested "intelligent" interpolator models belongs to the family of so-called "driven" models. The guiding hypothesis of our study is that fine-scale climate is often governed by surface forcings rather than by large-scale atmospheric transport. The technique we propose therefore guides FIZR with the sampled Dynamics of a GCM and forces it with the GCM's Physics as well as a mesoscale orographic forcing, at each node of the fine computational grid. To validate the robustness and accuracy of our regional climate model, we chose the West Coast region of the North American continent, notably characterized by a geographic distribution of precipitation and temperature strongly influenced by the underlying relief. The results of a one-month January simulation with FIZR show that we can simulate precipitation and screen-level temperature fields much closer to climate observations than those simulated by a GCM. This performance is clearly attributable to the mesoscale orographic forcing as well as to the surface characteristics determined at fine scale. A model similar to FIZR can, in principle, be implemented on any GCM, so any research organization involved in global large-scale numerical modeling could equip itself with such a regionalization tool.
1980-11-21
defensive, and both the question and the answer seemed to generate supporting reactions from the audience. Discrete Event Simulation: the session on… R. Toscano / A. Maceri / F. Maceri (Italy), "Numerical analysis of some contact problems in membrane theory"; 3:40-4:00 p.m. coffee break… (Switzerland), "Shallow heat storage: finite element simulation"; 3:40-4:00 p.m. A. Rizk Abu El-Wafa / M. Tawfik / M.S. Mansour (Egypt), "Digital…
Turbomachinery Design Using CFD (La Conception des Turbomachines par l’Aerodynamique Numerique).
1994-05-01
"Method for Flow Calculations in Turbomachines", Vrije Univ. Brussel, Dienst Stromingsmechanica, VUB-STR…; Thompkins, W.T., 1981, "A Fortran Program for Calculating…"; "…Model Equation for Simulating Flows in Multistage Turbomachinery", MBB-Bericht Nr. UFE 1352, 1977; ASME paper 85-GT-226, Houston, March
NASA Astrophysics Data System (ADS)
Bel Hadj Kacem, Mohamed Salah
All hydrological processes are affected by the spatial variability of the physical parameters of the watershed, and also by human intervention on the landscape. The water outflow from a watershed depends strictly on the spatial and temporal variability of these physical parameters. It is now apparent that integrating mathematical models into GISs can benefit both GIS and three-dimensional environmental models: a true modeling capability can help the modeling community bridge the gap between planners, scientists, decision-makers and end-users. The main goal of this research is to design a practical tool to simulate surface runoff using Geographic Information Systems and simulation of the hydrological behavior by the Finite Element Method.
NASA Astrophysics Data System (ADS)
Rebaine, Ali
1997-08-01
This work consists of the numerical simulation of two-dimensional laminar and turbulent compressible internal flows, with particular interest in flows in supersonic ejectors. The Navier-Stokes equations are written in conservative form and use as independent variables the so-called enthalpic variables: static pressure, momentum, and specific total enthalpy. A stable variational formulation of the Navier-Stokes equations is used, based on the SUPG (Streamline Upwind Petrov-Galerkin) method with a shock-capturing operator for strong gradients. A turbulence model for simulating ejector flows is developed. It separates two distinct regions: a region near the solid wall, where the Baldwin-Lomax model is used, and the region far from the wall, where a new formulation based on Schlichting's model for jets is proposed. A technique for computing the turbulent viscosity on an unstructured mesh is implemented. The spatial discretization of the variational form is carried out with the finite element method using a mixed approximation: quadratic for the momentum and velocity components and linear for the remaining variables. The temporal discretization uses a finite difference method with the implicit Euler scheme. The matrix system resulting from the space-time discretization is solved with the GMRES algorithm using a diagonal preconditioner. Numerical validations were carried out on several types of nozzles and ejectors. The main validation is the simulation of the flow in the ejector tested at the NASA Lewis research center.
The results obtained compare very well with those of previous work and are clearly superior for turbulent flows in ejectors.
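The linear-solver stage described above (GMRES with a diagonal preconditioner) can be sketched on a toy system; the matrix below is a small nonsymmetric convection-diffusion operator invented for illustration, not the thesis's discretized system:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
# Toy 1-D convection-diffusion matrix: tridiagonal and nonsymmetric,
# the kind of system GMRES is designed for
main = 2.0 * np.ones(n)
lower = -1.2 * np.ones(n - 1)       # convection skews the off-diagonals
upper = -0.8 * np.ones(n - 1)
A = sp.diags([lower, main, upper], [-1, 0, 1], format="csr")
b = np.ones(n)

# Diagonal (Jacobi) preconditioner: M^{-1} v = v / diag(A)
Minv = spla.LinearOperator((n, n), matvec=lambda v: v / A.diagonal())

x, info = spla.gmres(A, b, M=Minv, restart=30, maxiter=1000)
print("converged:", info == 0, " residual:", np.linalg.norm(b - A @ x))
```

The diagonal preconditioner costs one vector division per iteration, which is why it remains attractive even for large unstructured-mesh systems.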
Algorithms for Robust Identification and Control of Large Space Structures. Phase 1.
1988-05-14
Variate Analysis," Proc. Amer. Control Conf., San Francisco, pp. 445-451. Lecrique, J., Rault, A., Tessier, M., and Testud, J.L. (1978), "Multivariable…; …Rault, J.L. Testud, and J. Papon (1978), "Model Predictive Heuristic Control: Applications to Industrial Processes," Automatica, Vol. 14, pp. 413-…; …Control Conference, Minneapolis, MN, June. Testud, J.L. (1979), "Commande Numerique Multivariable du Ballon de Recuperation de Vapeur," Adersa/Gerbios
Thermal-hydraulic study of the moderator flow in the CANDU-6 reactor
NASA Astrophysics Data System (ADS)
Mehdi Zadeh, Foad
Given the size (6.0 m x 7.6 m) and the multiply connected domain that characterize the calandria vessel of CANDU-6 reactors (380 channels in the vessel), the physics governing the behavior of the moderator fluid is still poorly understood today. Sampling data in an operating reactor would require modifying the vessel configuration to insert probes, and the intense radiation zone precludes the use of ordinary sensors. Consequently, the moderator flow must be studied with either an experimental or a numerical model. As for the experimental route, building and operating such facilities is very expensive; moreover, the scaling parameters required to build a reduced-scale experimental model are contradictory. Numerical modeling therefore remains an important alternative. Currently, the nuclear industry uses a numerical approach known as the porous-medium approach, which approximates the domain as a continuous medium in which the tube bank is replaced by distributed hydraulic resistances. This model can describe the macroscopic phenomena of the flow but does not account for local effects that have an impact on the global flow, such as the temperature and velocity distributions near the tubes and hydrodynamic instabilities. In the context of nuclear safety, the local effects around the calandria tubes are of interest. Indeed, simulations with this approach predict that the flow can take several hydrodynamic configurations, in some of which the flow behaves asymmetrically within the vessel. This can cause boiling of the moderator on the channel walls.
Under such conditions, the reactivity coefficient can vary significantly, translating into an increase in reactor power, which can have major consequences for nuclear safety. A detailed CFD (Computational Fluid Dynamics) model accounting for local effects is therefore necessary. The goal of this research is to model the complex behavior of the moderator flow within the vessel of a CANDU-6 nuclear reactor, particularly near the calandria tubes. These simulations serve to identify the possible flow configurations in the calandria. The study thus aims to formulate the theoretical basis of the macroscopic instabilities of the moderator, i.e., the asymmetric motions that can cause the moderator to boil. The challenge of the project is to determine the impact of these flow configurations on the reactivity of the CANDU-6 reactor.
NASA Astrophysics Data System (ADS)
LeBlanc, Luc R.
Composite materials are increasingly used in fields such as aerospace, high-performance cars and sporting equipment, to name a few. Studies have shown that exposure to moisture degrades the strength of composites by promoting the initiation and propagation of delamination. Of these studies, very few address the effect of moisture on delamination initiation under mixed-mode I/II loading, and none addresses the effect of moisture on the mixed-mode I/II delamination growth rate in a composite. The first part of this thesis determines the effects of moisture on delamination growth under mixed-mode I/II loading. Specimens of a unidirectional carbon/epoxy composite (G40-800/5276-1) were immersed in a distilled water bath at 70°C until saturation. Quasi-static tests over a range of mode I/II mixities (0%, 25%, 50%, 75% and 100%) were carried out to determine the effects of moisture on the delamination resistance of the composite. Fatigue tests were performed over the same range of mode I/II mixities to determine the effect of moisture on delamination initiation and growth rate. The quasi-static results showed that moisture reduces the delamination resistance of a carbon/epoxy composite over the entire range of mode I/II mixities, except in mode I, where delamination resistance increases after moisture exposure. Under fatigue loading, moisture accelerates delamination initiation and increases the growth rate for all mode I/II mixities.
The experimental data collected were used to determine which of the static delamination criteria and mixed-mode I/II fatigue delamination growth models proposed in the literature best represent delamination in the composite studied. A regression curve was used to determine the best fit between the experimental data and the static delamination criteria studied; a regression surface was used for the fatigue growth-rate models. According to these fits, the best static delamination criterion is the B-K criterion and the best fatigue growth model is the Kenane-Benzeggagh model. To predict delamination during the design of complex parts, numerical models can be used. Predicting the delamination length of a part under fatigue loading is very important to ensure that an interlaminar crack will not grow excessively and cause failure before the end of the design life. Following the recent trend, these models are often based on the cohesive-zone approach with a finite element formulation. In the work presented in this thesis, the fatigue delamination growth model of Landry & LaPlante (2012) was improved by adding the treatment of mixed-mode I/II loading and by modifying the algorithm for computing the maximum delamination driving force. The cohesive-zone parameters were calibrated from the quasi-static mode I and mode II experiments. Numerical simulations of the quasi-static mixed-mode I/II tests, with dry and wet specimens, were compared with the experiments.
Fatigue simulations were also carried out and compared with the experimental delamination growth rates. The numerical results of the quasi-static and fatigue tests showed good correlation with the experimental results over the entire range of mode I/II mixities studied.
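The B-K (Benzeggagh-Kenane) criterion mentioned above interpolates the critical energy release rate Gc between pure mode I and pure mode II as Gc = GIc + (GIIc - GIc)(GII/G)^eta. A minimal sketch, with illustrative toughness values and exponent rather than the measured G40-800/5276-1 data:

```python
def bk_criterion(g1c, g2c, mode_mix, eta):
    """B-K criterion: critical energy release rate Gc as a function of
    the mode mixity GII/(GI+GII), with material exponent eta."""
    return g1c + (g2c - g1c) * mode_mix ** eta

# Illustrative (hypothetical) toughness values in J/m^2 and exponent
G1C, G2C, ETA = 250.0, 900.0, 1.8
for mix in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"GII/G = {mix:.2f} -> Gc = {bk_criterion(G1C, G2C, mix, ETA):.0f} J/m^2")
```

The single exponent eta is what the regression fit in the thesis would determine from the five mixity levels tested.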
Vectored Thrust Digital Flight Control for Crew Escape. Volume 2.
1985-12-01
no. 24. Lecrique, J., A. Rault, M. Tessier and J.L. Testud (1978), "Multivariable Regulation of a Thermal Power Plant Steam Generator," presented…; "…and Extended Kalman Observers," presented at the Conf. Decision and Control, San Diego, CA. Testud, J.L. (1977), "Commande Numerique Multivariable du…
Finite-temperature quantum cluster methods applied to the Hubbard model
NASA Astrophysics Data System (ADS)
Plouffe, Dany
Since their discovery in the 1980s, high-temperature superconductors have attracted much interest in solid-state physics. Understanding the origin of the phases observed in these materials, such as superconductivity, has been one of the great challenges of theoretical solid-state physics over the past 25 years. One of the mechanisms proposed to explain these phenomena is strong electron-electron interaction, and the Hubbard model is one of the simplest models that accounts for such interactions. Despite its apparent simplicity, some of its characteristics, including its phase diagram, are still not well established, despite several theoretical advances in recent years. This study is devoted to analyzing numerical methods for computing various properties of the Hubbard model as a function of temperature. We describe methods (VCA and CPT) that allow the approximate calculation of the finite-temperature Green function of an infinite system from the Green function computed on a cluster of finite size. To compute these Green functions, we use methods that considerably reduce the numerical effort required for thermodynamic averages by sharply reducing the space of states to be considered in those averages. Although this study aims first to develop cluster methods for solving the Hubbard model at finite temperature in a general way and to study the basic properties of the model, we apply it to conditions approaching those of high-temperature superconductors. The methods presented here allow us to trace a phase diagram for antiferromagnetism and superconductivity that shows several similarities with that of the high-temperature superconductors.
Keywords: Hubbard model, thermodynamics, antiferromagnetism, superconductivity, numerical methods, large matrices
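Cluster methods of the CPT/VCA kind start from exact finite-temperature averages on a small cluster. As a toy illustration of that underlying step (not the thesis's algorithm), here is a two-site Hubbard model diagonalized in its full 16-state Fock space; the parameters t, U and beta are hypothetical:

```python
import numpy as np

t, U, beta = 1.0, 4.0, 2.0          # hypothetical hopping, repulsion, inverse T
mu = U / 2                           # particle-hole symmetric point (half filling)
norb = 4                             # orbital index p = 2*site + spin

def jw_sign(state, p):
    """Fermionic sign from occupied orbitals with index below p."""
    return -1 if bin(state & ((1 << p) - 1)).count("1") % 2 else 1

dim = 1 << norb
H = np.zeros((dim, dim))
for s in range(dim):
    H[s, s] -= mu * bin(s).count("1")                    # -mu * N
    for site in (0, 1):                                  # U * n_up * n_dn
        if (s >> 2 * site) & 1 and (s >> (2 * site + 1)) & 1:
            H[s, s] += U
    for spin in (0, 1):                                  # -t hopping, both spins
        p, q = spin, 2 + spin
        for a, b in ((p, q), (q, p)):                    # c_a^dag c_b
            if (s >> b) & 1 and not (s >> a) & 1:
                s2 = s & ~(1 << b)
                H[s2 | (1 << a), s] -= t * jw_sign(s, b) * jw_sign(s2, a)

E, V = np.linalg.eigh(H)
w = np.exp(-beta * (E - E.min()))                        # Boltzmann weights

def thermal(diag_op):
    """Thermal average of an operator diagonal in the occupation basis."""
    op_in_eigenbasis = (V**2 * diag_op[:, None]).sum(axis=0)
    return (w * op_in_eigenbasis).sum() / w.sum()

N_op = np.array([bin(s).count("1") for s in range(dim)], float)
D_op = np.array([sum((s >> 2 * i) & 1 and (s >> (2 * i + 1)) & 1 for i in (0, 1))
                 for s in range(dim)], float)
N_avg, docc = thermal(N_op), thermal(D_op)
print(f"<N> = {N_avg:.6f}, double occupancy = {docc:.4f}")
```

At mu = U/2 particle-hole symmetry pins the average filling at exactly two electrons, a useful sanity check on the diagonalization; the interacting thermal averages themselves are what a cluster method feeds into the lattice Green function.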
2003-03-01
…modern combat aircraft and business jets. E. Garrigues, Th. Percheron, Dassault Aviation, DGT/DTA/IAP, F-92214 Saint-Cloud Cedex, France. 1. Introduction… flight [parameters], rigid-body accelerations and structural responses (strain gauges and accelerations). [Figure 4: structural predictions, adjustments, flight tests]
1993-11-01
…are characterized by continuous schlieren imaging of the part of the mixing layer located under the jet (figure 3) as well as by tomoscopy of… characterize the waves). These waves seem to originate from the region of the ejector, just… a Mach disk; in figure 4, one observes the trace of the…
NASA Astrophysics Data System (ADS)
Mejdi, Abderrazak
Aircraft fuselages are generally made of aluminum or of composite reinforced with longitudinal stiffeners (stringers) and transverse stiffeners (frames). The stiffeners may be metallic or composite. During the various phases of flight, aircraft structures are subjected to airborne excitations on the outer skin (turbulent boundary layer: TBL; diffuse acoustic field: DAF), whose acoustic energy is transmitted into the cabin. The engines, mounted on the structure, produce significant structure-borne excitation. The objectives of this project are to develop and implement modeling strategies for aircraft fuselages subjected to airborne and structure-borne excitations. First, a review of existing TBL models is presented in the second chapter in order to classify them better. The vibro-acoustic response properties of finite and infinite flat structures are analyzed. In the third chapter, the assumptions underlying existing models of orthogonally stiffened metallic structures under mechanical, DAF and TBL excitations are first reexamined. A detailed and reliable model of these structures is then developed. The model is validated numerically using the finite element method (FEM) and the boundary element method (BEM). Experimental validation tests are carried out on aircraft panels supplied by aeronautical companies. In the fourth chapter, an extension to composite structures reinforced by composite stiffeners of complex shapes is established. A simple analytical model is also implemented and validated numerically. In the fifth chapter, the modeling of periodic stiffened composite structures is further refined by taking into account the coupling between in-plane and transverse displacements.
The finite-size effect of periodic structures is also taken into account. The developed models made it possible to conduct several parametric studies of the vibro-acoustic properties of aircraft structures, easing the designers' task. As part of this thesis, one article was published in the Journal of Sound and Vibration and three others were submitted, respectively to the Journal of the Acoustical Society of America, the International Journal of Solid Mechanics, and the Journal of Sound and Vibration. Keywords: stiffened structures, composites, vibro-acoustics, transmission loss.
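As a back-of-the-envelope companion to the transmission-loss studies above, the classic normal-incidence mass law for an unstiffened limp panel gives TL = 10 log10(1 + (pi f m / (rho0 c0))^2); the skin thickness and surface density below are illustrative, and the thesis's stiffened-panel models are of course far more detailed:

```python
import math

def mass_law_tl(f_hz, surface_density, rho0=1.21, c0=343.0):
    """Normal-incidence mass-law transmission loss (dB) of a limp panel."""
    x = math.pi * f_hz * surface_density / (rho0 * c0)
    return 10.0 * math.log10(1.0 + x * x)

# Illustrative: ~2 mm aluminum skin, surface density ~ 5.4 kg/m^2
for f in (125, 500, 2000, 8000):
    print(f"{f:5d} Hz : TL = {mass_law_tl(f, 5.4):.1f} dB")
```

The characteristic ~6 dB gain per doubling of frequency (or of mass) is the baseline against which stiffener and periodicity effects are usually judged.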
Methods for characterizing the thermomechanical properties of a martensitic steel
NASA Astrophysics Data System (ADS)
Ausseil, Lucas
The goal of this study is to develop methods for measuring the thermomechanical properties of a martensitic steel during rapid heating. These data feed existing finite element models with experimental inputs. For this, 4340 steel is used. This steel, used notably in gear wheels, has very attractive mechanical properties, which can be modified through heat treatments. The Gleeble 3800 thermomechanical simulator is used; it can, in theory, reproduce all the conditions present in manufacturing processes. With the dilatometry tests carried out in this project, the exact austenite and martensite transformation temperatures are obtained. Tensile tests also allowed the yield strength of the material to be deduced in the austenitic range from 850 °C to 1100 °C. The effect of strain on the transformation start temperature is shown qualitatively. A numerical simulation is also carried out to understand the phenomena occurring during the tests.
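Transformation temperatures are typically read from a dilatometry curve as the point where strain departs from linear thermal expansion. A minimal sketch on synthetic data; the expansion coefficient, step size and transformation temperature here are invented for illustration, not the 4340 measurements:

```python
import numpy as np

# Synthetic dilatometry curve: linear expansion plus a contraction step
# at a hypothetical transformation temperature of 750 C
T = np.linspace(20, 1000, 981)
alpha = 1.4e-5                                          # illustrative CTE, 1/K
strain = alpha * (T - 20)
strain = strain - 2e-3 / (1 + np.exp(-(T - 750) / 5))   # transformation step

def transformation_start(T, strain, fit_below=600.0, tol=1e-4):
    """Fit the linear segment below `fit_below`, then return the first
    temperature where the curve deviates from that line by more than tol."""
    mask = T < fit_below
    slope, intercept = np.polyfit(T[mask], strain[mask], 1)
    deviation = np.abs(strain - (slope * T + intercept))
    return T[np.argmax(deviation > tol)]

Ts = transformation_start(T, strain)
print(f"detected transformation start near {Ts:.0f} C")
```

The same deviation-from-linearity logic applies whether the step is the austenite transformation on heating or the martensite transformation on cooling.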
Sustaining Tunisian SMEs' Competitiveness in the Knowledge Society
NASA Astrophysics Data System (ADS)
Del Vecchio, Pasquale; Elia, Gianluca; Secundo, Giustina
The paper aims to contribute to the debate about the knowledge and digital divide affecting countries' competitiveness in the knowledge society. A survey based on qualitative and quantitative data collection was performed to analyze the level of ICT and e-business adoption among Tunisian SMEs. The results show that increasing SME competitiveness requires investment in all the components of intellectual capital: human capital (the knowledge, skills and abilities of people using ICTs), structural capital (supportive infrastructure such as buildings, software, processes, patents, trademarks and proprietary databases) and social capital (relations and collaboration inside and outside the company). To this end, the LINCET project ("Laboratoire d'Innovation Numerique pour la Competitivité de l'Entreprise Tunisienne") is finally proposed as a coherent proposition to foster the growth of all the components of intellectual capital for the benefit of the competitiveness of Tunisian SMEs.
1991-06-01
intensive systems, including the use of onboard digital computers. Topics include: measurements that are digital in origin, sampling, encoding, transmitting… [helping] individuals charged with designing aircraft measuring systems to become better acquainted with new solutions to their requirements. This volume is concerned with aircraft measuring systems as related to flight test and flight research: measurements that are digital in origin or that must be…
1994-08-01
volume II. The report is accompanied by a set of diskettes containing the appropriate data for all the test cases; these diskettes are available… GERMANY. PURPOSE OF THE TEST: the tests are part of a larger effort to establish a database of experimental measurements for missile configurations.
1992-02-01
CONCLUDING REMARKS: …secondary flow pattern. Probably both factors are influential. Unfortunately, the present study has examined the… the secondary… Panels, which are composed of experts appointed by the National Delegates, the Consultant and Exchange Programme and the Aerospace Applications Studies… AGARD CP 352, September 1983; Combustion Problems in Turbine Engines, AGARD CP 353, January 1984; Hazard Studies for Solid Propellant Rocket Motors, AGARD CP…
1994-01-01
The Mission of AGARD: According to its Charter, the mission of AGARD is to bring together the leading personalities of the NATO nations in the… advances in the aerospace sciences relevant to strengthening the common defence posture; improving the co-operation among member nations in aerospace… for the physical principles. To construct the relevant equations for a fluid gas consisting of pseudo-particles, … the internal energy due to motion…
1994-08-01
[garbled figure residue] …The technique was modified to calculate the drag using the non-intrusive LDV and sidewall pressure measurements rather…
NASA Astrophysics Data System (ADS)
Ait Hammou, Zouhair
This study concerns the design of a hybrid heat-exchanger/storage unit (AECH) for the simultaneous management of solar and electric energy. A mathematical model based on the energy conservation equations is presented. It is developed to test different storage materials, among others phase change materials (solid/liquid) and sensible storage materials. A computer code is implemented and then validated against analytical and numerical results from the literature. In parallel, a reduced-scale experimental prototype was built in the laboratory to validate the code. Simulations are carried out to study the effects of the design parameters and storage materials on the thermal behavior of the AECH and on electric energy consumption. The simulation results over four winter months show that n-octadecane paraffin and capric acid are two desirable candidates for energy storage intended for space heating. Using these two materials in the AECH reduces electric energy consumption by 32% and flattens the peak-demand problem, since 90% of the electric energy is consumed during off-peak hours. Moreover, with a preferential tariff, the calculation of electricity costs shows that a consumer adopting this system benefits from a 50% reduction in the electricity bill.
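The advantage of phase change materials over sensible storage can be sketched with a simple enthalpy balance (sensible heat of the solid, latent heat of melting, sensible heat of the liquid). The property values below are approximate handbook figures for n-octadecane and water, used for illustration only, not the thesis's measured data:

```python
def stored_energy_kj(mass_kg, cp_s, cp_l, latent, t_melt, t_lo, t_hi):
    """Energy (kJ) stored when heating a PCM from t_lo to t_hi through melting."""
    assert t_lo <= t_melt <= t_hi
    sensible_solid = mass_kg * cp_s * (t_melt - t_lo)
    latent_heat = mass_kg * latent
    sensible_liquid = mass_kg * cp_l * (t_hi - t_melt)
    return sensible_solid + latent_heat + sensible_liquid

# Approximate handbook values for n-octadecane: cp ~ 2.0/2.2 kJ/(kg K),
# latent heat ~ 244 kJ/kg, melting point ~ 28 C
e_pcm = stored_energy_kj(100, 2.0, 2.2, 244.0, 28.0, 20.0, 40.0)
# Sensible-only storage (water, cp ~ 4.18 kJ/(kg K)) over the same 20-40 C range
e_water = 100 * 4.18 * (40.0 - 20.0)
print(f"PCM: {e_pcm:.0f} kJ vs sensible water: {e_water:.0f} kJ")
```

Over a narrow heating-comfort temperature window the latent term dominates, which is why a PCM of equal mass can store several times more energy than water.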
Finite element modeling of striated muscle
NASA Astrophysics Data System (ADS)
Leonard, Mathieu
This research project created a finite element model of human striated muscle in order to study the mechanisms causing traumatic muscle injuries. The model is a numerical platform capable of discerning the influence of the mechanical properties of the fasciae and of the muscle cell on the dynamic behavior of the muscle during an eccentric contraction, notably the Young's modulus and shear modulus of the connective tissue layer, the orientation of the collagen fibers of this membrane, and the Poisson's ratio of the muscle. In-vitro experimental characterization of these parameters at high strain rates on active human striated muscle is essential for the study of traumatic muscle injuries. The numerical model developed represents muscle contraction as a phase transition of the muscle cell, through a change of stiffness and volume, using the material constitutive laws predefined in the LS-DYNA software (v971, Livermore Software Technology Corporation, Livermore, CA, USA). The project thus introduces a physiological phenomenon that could explain common muscle injuries (cramps, soreness, strains, etc.), but also diseases or disorders affecting connective tissue, such as collagen diseases and muscular dystrophy. The predominance of muscle injuries during eccentric contractions is also discussed. The model developed in this project thus brings to the forefront the concept of phase transition, opening the door to new technologies for muscle activation in people with paraplegia, or to compact artificial muscles for prostheses or exoskeletons. Keywords: striated muscle, muscle injury, fascia, eccentric contraction, finite element model, phase transition
NASA Astrophysics Data System (ADS)
Bejaoui, Najoua
Pressurized water reactors (PWRs) form the largest fleet of nuclear reactors in operation around the world. Although these reactors have been studied extensively by designers and operators using efficient numerical methods, some calculation weaknesses remain unresolved, owing to the geometric complexity of the core, such as the analysis of the neutron flux behaviour at the core-reflector interface. The standard calculation scheme is a two-step process. In the first step, a detailed calculation at the assembly level with reflective boundary conditions provides homogenized cross sections for the assemblies, condensed to a reduced number of energy groups; this step is called the lattice calculation. The second step uses the homogenized properties of each assembly to calculate reactor properties at the core level; this step is called the full-core (or whole-core) calculation. This decoupling of the two calculation steps is the origin of methodological bias, particularly at the core-reflector interface: the periodicity hypothesis used to generate the cross-section libraries becomes less pertinent for assemblies adjacent to the reflector, which is generally represented by one of two models, an equivalent reflector or albedo matrices. The reflector slows down neutrons leaving the core and returns them to it. This effect produces two fission peaks in the fuel assemblies located at the core-reflector interface, the fission rate increasing because of the greater proportion of re-entrant neutrons. This change in the neutron spectrum reaches deep inside the fuel on the periphery of the core. To remedy this, we simulated a peripheral assembly reflected with a TMI-PWR reflector and developed an advanced calculation scheme that takes into account the environment of the peripheral assemblies and generates equivalent neutronic properties for the reflector.
This scheme was tested on a core without control mechanisms and loaded with fresh fuel. The results of this study show that the explicit representation of the reflector and the calculation of peripheral assemblies with our advanced scheme correct the energy spectrum at the core interface and increase the peripheral power by up to 12% compared with the reference scheme.
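The lattice-step homogenization described above is conventionally a flux-weighted average; schematically (a standard textbook form, not the thesis's own notation), for a homogenization region $V$ and coarse group $g$:

```latex
\bar{\Sigma}_{g} \;=\;
\frac{\displaystyle\int_{V}\!\int_{E \in g} \Sigma(\vec{r},E)\,\phi(\vec{r},E)\,\mathrm{d}E\,\mathrm{d}V}
     {\displaystyle\int_{V}\!\int_{E \in g} \phi(\vec{r},E)\,\mathrm{d}E\,\mathrm{d}V}
```

The periodicity hypothesis enters through the weighting flux $\phi$: in the standard scheme it is the infinite-lattice assembly flux, which is precisely what becomes inaccurate for assemblies next to the reflector.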
NASA Astrophysics Data System (ADS)
Boissonneault, Maxime
Circuit quantum electrodynamics is a promising architecture for quantum computing and for studying quantum optics. In this architecture, one or more superconducting qubits playing the role of atoms are coupled to one or more resonators playing the role of optical cavities. In this thesis, I study the interaction between a single superconducting qubit and a single resonator, while allowing the qubit to have more than two levels and the resonator to have a Kerr nonlinearity. I am particularly interested in reading out the qubit state and improving that readout, in the back-action of the measurement process on the qubit, and in probing the quantum properties of the resonator with the qubit. To do so, I use a reduced analytical model that I derive from the full description of the system, mainly through unitary transformations and an adiabatic elimination. I also use an in-house numerical library that efficiently simulates the evolution of the full system. I compare the predictions of the reduced analytical model and the results of numerical simulations with experimental results obtained by the quantronics group at CEA Saclay. These results are the spectroscopy of a superconducting qubit coupled to a driven nonlinear resonator. In the low-spectroscopy-power regime, the reduced model correctly predicts the position and width of the qubit line. The line position undergoes the Lamb and Stark shifts, and its width is dominated by measurement-induced dephasing. I show that, for typical circuit QED parameters, quantitative agreement requires a nonlinear-response model of the intra-resonator field, such as the one developed here.
In the high-spectroscopy-power regime, sidebands appear, caused by quantum fluctuations of the intra-resonator electromagnetic field around its equilibrium value. These fluctuations arise from squeezing of the field due to the resonator nonlinearity, and observing their effect through qubit spectroscopy is a first. Following the quantitative success of the reduced model, I show that two parameter regimes marginally improve the dispersive measurement of a qubit with a linear resonator, and significantly improve a bifurcation measurement with a nonlinear resonator. I explain the operation of a qubit measurement in a linear resonator developed by an experimental group at Yale University. This measurement, which exploits the qubit-induced nonlinearities, has high fidelity but uses very high power and is destructive. In all these cases, the multilevel structure of the qubit proves crucial to the measurement. By suggesting ways to improve the measurement of superconducting qubits, and by quantitatively describing the physics of a multilevel system coupled to a driven nonlinear resonator, the results presented in this thesis are relevant both to the use of the circuit QED architecture for quantum computing and to quantum optics. Keywords: circuit quantum electrodynamics, quantum computing, measurement, superconducting qubit, transmon, Kerr nonlinearity
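For orientation, the dispersive readout discussed here is usually introduced, in the simplest two-level, linear-resonator limit, through the standard dispersive Hamiltonian (a textbook form, not the multilevel Kerr model of the thesis):

```latex
H/\hbar \;=\; \omega_r\, a^\dagger a \;+\; \frac{\omega_a}{2}\,\sigma_z \;+\; \chi\, a^\dagger a\, \sigma_z
```

The resonator frequency is pulled by $\pm\chi$ depending on the qubit state, which is the basis of the dispersive measurement, while the qubit frequency acquires the ac-Stark shift $2\chi\langle a^\dagger a\rangle$ and a Lamb shift; these are the effects whose nonlinear generalization the reduced model captures.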
NASA Astrophysics Data System (ADS)
Filali, Bilai
Graphene, as an advanced carbon nanostructure, has recently attracted a deluge of scholarly interest because of its outstanding mechanical, electrical, and thermal properties. There are several practical ways to synthesize graphene, such as mechanical exfoliation, chemical vapor deposition (CVD), and anodic arc discharge. This thesis discusses a method of graphene synthesis in plasma that relies on erosion of the anode material. It is among the most practical methods, offering a high production rate. High-purity graphene flakes were synthesized with the anodic arc method at a pressure of about 500 torr. Raman spectroscopy, scanning electron microscopy (SEM), atomic force microscopy (AFM), and transmission electron microscopy (TEM) were used to characterize the synthesis products. Arc-produced graphene and commercially available graphene were compared with these instruments; the differences lie in the number of layers, the thickness of each layer, and the shape of the structure itself. The temperature dependence of the synthesis procedure was also studied. It was found that graphene can be produced on a copper-foil substrate at temperatures near the melting point of copper; however, decreasing the substrate temperature transforms the synthesized graphene into amorphous carbon. A glow discharge was used to functionalize the graphene; SEM and EDS observations indicated an increase in the oxygen content of the graphene after its exposure to the glow discharge.
Conductivity in the two-dimensional Hubbard model at weak coupling
NASA Astrophysics Data System (ADS)
Bergeron, Dominic
The two-dimensional (2D) Hubbard model is often regarded as the minimal model for copper-oxide high-temperature superconductors (cuprates). On a square lattice, the model exhibits the phases common to all cuprates: the antiferromagnetic phase, the superconducting phase, and the so-called pseudogap phase. It has no exact solution, but several approximate methods allow its properties to be studied numerically. Optical and transport properties are well characterized in the cuprates and are therefore good candidates for validating a theoretical model and for better understanding the physics of these materials. This thesis is devoted to computing those properties for the 2D Hubbard model at weak to intermediate coupling. The method used is the two-particle self-consistent (TPSC) approach, which is non-perturbative and includes the effect of spin and charge fluctuations at all wavelengths. The complete derivation of the conductivity expression within the TPSC approach is presented. This expression contains the so-called vertex corrections, which account for correlations between quasiparticles. To make these corrections numerically tractable, algorithms using, among other things, fast Fourier transforms and cubic splines are developed. Calculations are performed on the square lattice with nearest-neighbour hopping, around the antiferromagnetic critical point. At dopings below the critical point, the optical conductivity exhibits a mid-infrared bump at low temperature, as observed in several cuprates. In the temperature dependence of the resistivity, the pseudogap regime appears insulating when vertex corrections are neglected and metallic when they are included.
Near the critical point, the resistivity is linear in T at low temperature and gradually becomes proportional to T² at high doping. A few results with longer-range hopping are also presented. Keywords: Hubbard, quantum critical point, conductivity, vertex corrections
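The conductivity in such calculations follows from the Kubo formula of linear response; in a common schematic notation (not the thesis's own equations):

```latex
\sigma_{xx}(\omega) \;=\;
\frac{\langle -k_x \rangle \;-\; \Lambda_{xx}(\mathbf{q}=0,\omega)}{i\,(\omega + i0^{+})}
```

where $\langle -k_x\rangle$ is the diamagnetic (kinetic-energy) term and $\Lambda_{xx}$ the retarded current-current correlation function; the vertex corrections mentioned above are the contributions to $\Lambda_{xx}$ beyond the bare particle-hole bubble.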
Invariant recognition of 3-D objects and SONG correlation
NASA Astrophysics Data System (ADS)
Roy, Sebastien
This thesis proposes solutions to two problems in automatic pattern recognition: the invariant recognition of three-dimensional objects from intensity images, and recognition robust to disjoint noise. A system using angular scanning of images and a feature-space trajectory classifier achieves invariant recognition of three-dimensional objects. Robustness to disjoint noise is achieved with the SONG correlation. We achieved recognition invariant to translation, rotation, and scale changes of three-dimensional objects from segmented intensity images, using angular scanning and a feature-space trajectory classifier. To obtain translation invariance, the centre of the angular scan coincides with the geometric centre of the image. The angular scan produces a feature vector that is invariant to scale changes of the image and converts rotations about an axis parallel to the line of sight into translations of the signal. The feature-space trajectory classifier represents a rotation about an axis perpendicular to the line of sight as a curve in feature space. Classification is performed by measuring the distance from the feature vector of the image to be recognized to the trajectories stored in that space. Our numerical results show a classification rate reaching 98% on an image bank of 5 military vehicles. The sliced orthogonal nonlinear generalized (SONG) correlation treats the grey levels present in an image independently: it sums the linear correlations of the binary images sharing the same grey level. This correlation is equivalent to counting the number of pixels located at the same relative positions and having the same intensities in two images.
We present an opto-electronic implementation of the SONG correlation, based on the joint transform correlator. Numerical and optical experiments show that disjoint noise does not impair the SONG correlation.
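The zero-shift property stated above (SONG counts pixel positions with identical intensities) is easy to check numerically; the following is an illustrative reimplementation using FFT-based circular correlations, not the thesis's opto-electronic system:

```python
import numpy as np

def song_correlation(img_a, img_b, levels=256):
    """SONG correlation as described in the text: sum the linear
    (here circular, FFT-based) correlations of the binary slices that
    share the same grey level.  Its zero-shift value equals the number
    of pixel positions where the two images have identical intensities."""
    total = np.zeros(img_a.shape)
    for g in range(levels):
        sa = (img_a == g).astype(float)   # binary slice of image A at level g
        sb = (img_b == g).astype(float)   # binary slice of image B at level g
        total += np.real(np.fft.ifft2(np.fft.fft2(sa) * np.conj(np.fft.fft2(sb))))
    return total

# Two 2x2 images agreeing at three of four pixel positions:
a = np.array([[0, 1], [2, 3]])
b = np.array([[0, 1], [2, 0]])
print(int(round(song_correlation(a, b, levels=4)[0, 0])))  # 3
```

The circular correlation is used only for compactness; a zero-padded linear correlation would match the thesis's description away from zero shift as well.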
Development of non-intrusive diagnostic techniques based on optical tomography
NASA Astrophysics Data System (ADS)
Dubot, Fabien
In industrial processes as well as in medical imaging, the last two decades have seen a growing development of optical diagnostic techniques. The appeal of these methods rests mainly on the fact that they are completely non-invasive, use radiation sources harmless to humans and the environment, and are relatively inexpensive and easy to implement compared with other imaging techniques. One of these techniques is Diffuse Optical Tomography (DOT). This three-dimensional imaging method characterizes the radiative properties of a semi-transparent medium from near-infrared optical measurements acquired with a set of sources and detectors located on the boundary of the probed domain. It relies on a forward model of light propagation in the medium, which provides the predictions, and on an algorithm minimizing a cost function combining predictions and measurements, which reconstructs the parameters of interest. In this work, the forward model is the diffuse approximation of the radiative transfer equation in the frequency domain, while the parameters of interest are the spatial distributions of the absorption and reduced scattering coefficients. This thesis is devoted to developing a robust inverse method for solving the frequency-domain DOT problem. To that end, the work is structured in three parts, which form the main axes of the thesis. First, a comparison of the damped Gauss-Newton and Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithms is presented for the two-dimensional case.
Two regularization methods are combined for each algorithm: mesh-based reduction of the control-space dimension together with Tikhonov penalization for the damped Gauss-Newton algorithm; and mesh-based regularization together with Sobolev gradients, uniform or spatially dependent, in the extraction of the cost-function gradient for the BFGS method. Numerical results indicate that the BFGS algorithm outperforms damped Gauss-Newton in reconstruction quality, computation time, and ease of selecting the regularization parameter. Second, a study of the quasi-independence of the optimal Tikhonov penalization parameter with respect to the control-space dimension in inverse problems estimating spatially dependent functions is carried out. This study follows an observation made in the first part of the work, where the Tikhonov parameter determined by the L-curve method turned out to be independent of the control-space dimension in the under-determined case. This hypothesis is demonstrated theoretically and then verified numerically, first on a linear inverse heat-conduction problem and then on the nonlinear inverse DOT problem. The numerical verification rests on determining an optimal Tikhonov parameter, defined as the one minimizing the discrepancy between targets and reconstructions. The theoretical demonstration relies on Morozov's discrepancy principle in the linear case, while in the nonlinear case it rests essentially on the assumption that the radiative functions to be reconstructed are normally distributed random variables.
In conclusion, the thesis shows that the Tikhonov parameter can be determined using a control-variable parameterization on a coarse mesh, in order to reduce computation time. Third, a wavelet-based multiscale inverse method combined with the BFGS algorithm is developed. This method, which reformulates the original inverse problem as a sequence of inverse subproblems from the largest scale to the smallest using the wavelet transform, copes with the local-convergence property of the optimizer and with the many local minima of the cost function. Numerical results show that the proposed method is more stable with respect to the initial estimate of the radiative properties and yields more accurate final reconstructions than the ordinary BFGS algorithm, while requiring similar computation times. The results of this work are presented in the thesis as four articles. The first was accepted in the International Journal of Thermal Sciences, the second in Inverse Problems in Science and Engineering, the third in the Journal of Computational and Applied Mathematics, and the fourth was submitted to the Journal of Quantitative Spectroscopy & Radiative Transfer. Ten other articles were published in peer-reviewed conference proceedings. These articles are available in PDF format on the t3e research chair website (www.t3e.info).
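As a minimal sketch of the zeroth-order Tikhonov regularization and L-curve selection discussed above (illustrative only; the thesis works with mesh-based parameterizations and damped Gauss-Newton/BFGS iterations, not this closed form):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Zeroth-order Tikhonov: minimise ||A x - b||^2 + lam ||x||^2
    via the normal equations (A^T A + lam I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def l_curve_corner(A, b, lams):
    """Crude L-curve heuristic: pick the lambda whose point
    (log residual norm, log solution norm), after shifting both axes
    to start at zero, lies closest to the origin."""
    pts = []
    for lam in lams:
        x = tikhonov_solve(A, b, lam)
        pts.append((np.log10(np.linalg.norm(A @ x - b) + 1e-30),
                    np.log10(np.linalg.norm(x) + 1e-30)))
    pts = np.array(pts)
    pts -= pts.min(axis=0)            # shift both axes to the positive quadrant
    return lams[int(np.argmin((pts ** 2).sum(axis=1)))]
```

In practice the L-curve corner is located more carefully (e.g. by maximum curvature), but the trade-off it encodes, residual norm against solution norm, is the one the thesis exploits.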
NASA Astrophysics Data System (ADS)
Bergeron, Alain
This research addresses the optical implementation of neural networks. Two different architectures are proposed. The first is an associative memory that associates an arbitrary output with any object while preserving position information. The second, a neural classifier for robotic control, identifies an input and sorts it into different categories; its output is compatible with standard digital systems. A modular approach is favoured for implementing these architectures, with the correlator as the basic module. Additional modules are introduced to properly implement the neural operations. The first is an optoelectronic threshold implementing a nonlinear function, an essential element of neural networks. The second is an opto-digital encoder, useful for object classification. The problem of recording the memory is addressed through global iterative encoding.
NASA Astrophysics Data System (ADS)
Xing, Jacques
The dielectric barrier discharge (DBD) plasma actuator is a device proposed for active flow control, to improve the performance of aircraft and turbomachines. Essentially, these actuators consist of two electrodes separated by a layer of dielectric material and convert electricity directly into flow momentum. Because of the high cost of experiments under realistic operating conditions, there is a need for a robust numerical model that can predict the plasma body force and the effects of various parameters on it. This body force can be affected by atmospheric conditions (temperature, pressure, and humidity), by the velocity of the neutral flow, by the applied voltage (amplitude, frequency, and waveform), and by the actuator geometry. The purpose of this thesis is therefore to implement a plasma model for the DBD actuator with the potential to account for the effects of these parameters. Two types of approach are commonly proposed for DBD actuator modelling: low-order (phenomenological) and high-order (scientific) modelling. A critical analysis presented in this thesis shows that phenomenological models are not robust enough to predict the plasma body force without artificial calibration for each specific case; moreover, they are based on erroneous assumptions. Hence, the approach selected to model the plasma body force is a scientific drift-diffusion model with four chemical species (electrons, positive ions, negative ions, and neutrals). This model was chosen because it gives numerical results consistent with experimental data. It also has great potential to include the effects of temperature, pressure, and humidity on the plasma body force, while requiring only a reasonable computational time. The model was independently implemented in the C++ programming language and validated on several test cases.
The model was then used to simulate the effect of the plasma body force on laminar-turbulent transition over an airfoil, in order to assess its performance in a practical CFD simulation. Numerical results show that it predicts the effect of the plasma on the fluid flow better than a phenomenological model for a practical aerospace case.
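A drift-diffusion model of the kind selected couples, schematically, a continuity equation for each charged species to Poisson's equation (generic form; the mobilities, diffusivities, and source terms are model-specific):

```latex
\frac{\partial n_k}{\partial t} + \nabla\!\cdot\Gamma_k = S_k,
\qquad
\Gamma_k = \operatorname{sgn}(q_k)\,\mu_k\, n_k\,\mathbf{E} - D_k \nabla n_k,
\qquad
\nabla\!\cdot\!\left(\varepsilon\,\nabla\phi\right) = -\sum_k q_k n_k,
\qquad
\mathbf{E} = -\nabla\phi
```

The time-averaged body force transferred to the neutral gas is then $\mathbf{f} = \sum_k q_k n_k \mathbf{E}$, summed over the electrons and the positive and negative ions.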
VHDL/FPGA implementation of a digital video display (AVN) for aerospace applications
NASA Astrophysics Data System (ADS)
Pelletier, Sebastien
The objective of this project is to develop a video controller in the VHDL language to replace the specialized component currently used at CMC Electronique. A thorough survey of trends and of current practice in the field of video controllers was conducted to define the system specifications. The image storage and display techniques are explained in order to carry the project through. The new controller is developed on an electronic platform with an FPGA, a VGA port, and memory for data storage. It is programmable and occupies little space in an FPGA, allowing it to fit into any new low-cost mass-market technology. It adapts quickly to any display resolution since it is modular and configurable. In the short term, this project will allow improved control of the specifications and quality standards tied to avionics constraints.
NASA Astrophysics Data System (ADS)
Chastenay, Pierre
Since the Quebec Education Program came into effect in 2001, Quebec classrooms have again been teaching astronomy. Unfortunately, schools are ill-equipped to teach complex astronomical concepts, most of which involve phenomena occurring outside school hours and over long periods of time. Furthermore, many astronomical phenomena involve celestial objects travelling through three-dimensional space, which we cannot access from our geocentric point of view. The lunar phases, a concept prescribed in secondary cycle one, fall into that category. Fortunately, schools can count on support from the planetarium, a science museum dedicated to presenting ultra-realistic simulations of astronomical phenomena in fast time and at any hour of the day. But what type of planetarium will support schools? Planetariums recently underwent their own revolution: they switched from analogue to digital, replacing geocentric opto-mechanical projectors with video projectors that offer the possibility of travelling virtually through a completely immersive simulation of the three-dimensional Universe. Although research into planetarium education has paid little attention to this new paradigm, some of its conclusions, based on the study of analogue planetariums, can help us develop a rewarding teaching intervention in these new digital simulators. Other sources of inspiration are also drawn upon, primarily science education, which views learning no longer as the transfer of knowledge but rather as the construction of knowledge by the learners themselves, with and against their initial conceptions. The design and use of constructivist learning environments, of which the digital planetarium is a fine example, and the use of simulations in astronomy complete our theoretical framework and lead to the design of a teaching intervention on the lunar phases in a digital planetarium, targeting students aged 12 to 14.
This teaching intervention was first tested as part of development research (didactic engineering) aimed at improving it, both theoretically and practically, through multiple iterations in its "natural" environment, in this case an inflatable digital planetarium six metres in diameter. We present the results of our first iteration, carried out with six children aged 12 to 14 (four boys and two girls) whose conceptions of the lunar phases were recorded before, during, and after the intervention through group interviews, questionnaires, group exercises, and recordings made throughout the activity. The evaluation was essentially qualitative, based on the traces obtained throughout the session, in particular within the planetarium itself. This material was then analyzed to validate the theoretical concepts that guided the design of the teaching intervention and to reveal possible improvements. We found that the intervention indeed changed most participants' conceptions of the lunar phases, and we identified ways to boost its effectiveness in the future.
Numerical simulation of ice accretion on a wind turbine blade
NASA Astrophysics Data System (ADS)
Fernando, Villalpando
The wind energy industry is growing steadily, and an excellent place for the construction of wind farms is northern Quebec. This region has huge wind energy production potential, as the cold temperatures increase air density and with it the available wind energy. However, some issues associated with arctic climates cause production losses on wind farms. Icing conditions occur frequently, as high air humidity and freezing temperatures cause ice to build up on the blades, resulting in wind turbines operating suboptimally. One of the negative consequences of ice accretion is degradation of the blade's aerodynamics, in the form of a decrease in lift and an increase in drag. Also, the ice grows unevenly, which unbalances the blades and induces vibration. This reduces the expected life of some of the turbine components. If the ice accretion continues, the ice can reach a mass that endangers the wind turbine structure, and operation must be suspended in order to prevent mechanical failure. To evaluate the impact of ice on the profits of wind farms, it is important to understand how ice builds up and how much it can affect blade aerodynamics. In response, researchers in the wind energy field have attempted to simulate ice accretion on airfoils in refrigerated wind tunnels. Unfortunately, this is an expensive endeavor, and researchers' budgets are limited. However, ice accretion can be simulated more cost-effectively and with fewer limitations on airfoil size and air speed using numerical methods. Numerical simulation is an approach that can help researchers acquire knowledge in the field of wind energy more quickly. For years, the aviation industry has invested time and money developing computer codes to simulate ice accretion on aircraft wings. Nearly all these codes are restricted to use by aircraft developers, and so they are not accessible to researchers in the wind engineering field. 
Moreover, these codes were developed to meet aeronautical industry specifications, which differ from those of the wind energy industry. Among the differences are the following: wind turbines operate at subsonic speeds; the chords and angles of attack of wind turbine blades are smaller than those of aircraft wings; and a wind turbine can operate with a larger ice mass on its blades than an aircraft can. It is therefore important to provide wind energy researchers with tools specifically validated against the operating parameters of a wind turbine. The main goal of this work is to develop a methodology to simulate ice accretion in 2D using Fluent and Matlab, commercial software programs available at nearly all research institutions. In this study, we used Gambit, previously the companion meshing tool of Fluent (since replaced by ICEM), for mesh generation; we stayed with Gambit because we were already deeply involved with the meshing procedure for our ice-accretion simulation when Gambit was withdrawn from the market. We validate the methodology against experimental data consisting of iced-airfoil contours obtained in a refrigerated wind tunnel using actual icing conditions recorded in northern Quebec. The methodology consists of four steps: airfoil meshing, droplet trajectory calculation, thermodynamic model application, and airfoil contour updating. The total simulation time is divided into several time steps, and the four steps are performed for each until the total time has elapsed; the time-step length depends on the icing conditions. (Abstract shortened by UMI.)
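The four-step time-marching loop described above can be sketched as follows; the stage functions are hypothetical placeholders injected as callables, not the Fluent/Matlab implementations of the thesis:

```python
def simulate_ice_accretion(contour, total_time, dt,
                           mesh_airfoil, compute_droplet_trajectories,
                           apply_thermo_model, update_contour):
    """Skeleton of the four-step icing loop: for each time step,
    remesh, compute droplet impingement, apply the thermodynamic
    model, and update the iced contour."""
    n_steps = int(round(total_time / dt))
    for _ in range(n_steps):
        mesh = mesh_airfoil(contour)                        # step 1: airfoil meshing
        impingement = compute_droplet_trajectories(mesh)    # step 2: droplet trajectories
        growth = apply_thermo_model(mesh, impingement)      # step 3: thermodynamic model
        contour = update_contour(contour, growth)           # step 4: contour update
    return contour

# Toy run: a scalar "contour" thickness and constant growth per step.
final = simulate_ice_accretion(
    0.0, total_time=1.0, dt=0.25,
    mesh_airfoil=lambda c: c,
    compute_droplet_trajectories=lambda m: 1.0,
    apply_thermo_model=lambda m, imp: 0.1 * imp,
    update_contour=lambda c, g: c + g)
print(final)  # close to 0.4 after four steps
```

Remeshing inside the loop reflects the point made in the text: the iced contour changes at every step, so the grid cannot be built once and reused.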
NASA Astrophysics Data System (ADS)
Lallier-Daniels, Dominic
La conception de ventilateurs est souvent basée sur une méthodologie « essais/erreurs » d'amélioration de géométries existantes ainsi que sur l'expérience de design et les résultats expérimentaux cumulés par les entreprises. Cependant, cette méthodologie peut se révéler coûteuse en cas d'échec; même en cas de succès, des améliorations significatives en performance sont souvent difficiles, voire impossibles à obtenir. Le projet présent propose le développement et la validation d'une méthodologie de conception basée sur l'emploi du calcul méridien pour la conception préliminaire de turbomachines hélico-centrifuges (ou flux-mixte) et l'utilisation du calcul numérique d'écoulement fluides (CFD) pour la conception détaillée. La méthode de calcul méridien à la base du processus de conception proposé est d'abord présentée. Dans un premier temps, le cadre théorique est développé. Le calcul méridien demeurant fondamentalement un processus itératif, le processus de calcul est également présenté, incluant les méthodes numériques de calcul employée pour la résolution des équations fondamentales. Une validation du code méridien écrit dans le cadre du projet de maîtrise face à un algorithme de calcul méridien développé par l'auteur de la méthode ainsi qu'à des résultats de simulation numérique sur un code commercial est également réalisée. La méthodologie de conception de turbomachines développée dans le cadre de l'étude est ensuite présentée sous la forme d'une étude de cas pour un ventilateur hélico-centrifuge basé sur des spécifications fournies par le partenaire industriel Venmar. La méthodologie se divise en trois étapes: le calcul méridien est employé pour le pré-dimensionnement, suivi de simulations 2D de grilles d'aubes pour la conception détaillée des pales et finalement d'une analyse numérique 3D pour la validation et l'optimisation fine de la géométrie. 
The meridional calculation results are also compared with simulation results for the 3D geometry in order to validate the use of meridional analysis as a preliminary sizing tool.
NASA Astrophysics Data System (ADS)
Lavergne, Catherine
Geological formations in the Montreal area consist mostly of limestones. The usual design approach is based on rock mass classification systems that treat the rock mass as an equivalent continuous and isotropic material. For shallow excavations, however, stability is generally controlled by geological structures, which in Montreal are bedding planes that give the rock mass strong stress and strain anisotropy. The objectives of this research are to build a numerical model that accounts for the anisotropy of sedimentary rocks and to determine the influence of design parameters on displacements, stresses, and failure around unsupported underground metro excavations. The geotechnical data used in this study come from a metro extension project and were made available to the author. The excavation geometries analyzed are a tunnel, a station, and a garage consisting of three (3) parallel tunnels, for rock cover between 4 and 16 m. The numerical modeling was done with the FLAC software, which represents a continuous medium, using the ubiquitous-joint constitutive model to simulate the strength anisotropy of sedimentary rock masses. The model accounts for gravity stresses in an anisotropic material and for pore pressures. In total, eleven (11) design parameters were analyzed. Results show that the unconfined compressive strength of intact rock, fault zones, and pore pressures in soils have an important influence on the stability of the numerical model. The excavation geometry, the thickness of rock cover, the RQD, Poisson's ratio, and the horizontal tectonic stresses have a moderate influence. Finally, the ubiquitous-joint parameters, pore pressures in the rock mass, the width of the garage pillars, and the damage linked to the excavation method have a low impact. The FLAC results were compared with those of UDEC, a software package that uses the distinct element method. Similar conclusions were obtained on displacements, stress state, and failure modes.
However, the UDEC model gives slightly less conservative results than FLAC. This study is distinguished by its local character and by the large amount of geotechnical data available to determine the parameters of the numerical model. The results led to recommendations for laboratory tests that can be applied to characterize more specifically the anisotropy of sedimentary rocks.
NASA Astrophysics Data System (ADS)
Pham, Trinh Hung
Monitoring the hydrological behavior of a large tropical watershed following a forest cover change plays an important role in water resource management planning as well as in sustainable forest management. Traditional methods in forest hydrology studies are experimental watersheds, upstream-downstream comparisons, experimental plots, regional statistical analysis, and watershed simulation. These methods have limitations for large watersheds concerning the monitoring time, the lack of input data (especially on forest cover), and the ability to extrapolate results accurately to large watersheds. Moreover, there is still a scientific debate in forest ecology on the relation between water and forest. This debate arises from geographical differences among published studies in their study zones, experimental watershed sizes, and applied methods, which lead to differing conclusions on how tropical forest cover change influences outlet flows and yearly runoff in large watersheds. To overcome the limitations of current methods, to address the difficulty of acquiring forest cover data, and to better understand the relation between tropical forest cover change and the evolution of the hydrological behavior of a large watershed, it is necessary to develop a new approach based on digital remote sensing. We used the Dong Nai watershed as a case study. Results show that fusing TM and ETM+ Landsat image series with hydro-meteorological data allows us to observe and detect flooding trends and flood peaks after an intensive forest cover change from 16% to 20%. Flood frequency and flood peaks clearly decreased as the forest cover increased from 1983 to 1990. The influence of tropical forest cover on hydrological behavior varies with the geographical location of the watershed.
There is a significant relation between forest cover evolution and environmental factors such as the runoff coefficient (R = 0.87) and the yearly precipitation (R = 0.93).
Navigation of an Autonomous Vehicle around an Asteroid
NASA Astrophysics Data System (ADS)
Dionne, Karine
Planetary exploration missions use spacecraft to acquire the scientific data that advance our knowledge of the solar system. Since the 1990s, these missions have targeted not only planets but also smaller celestial bodies such as asteroids. These bodies pose a particular challenge for navigation systems because their dynamic environment is complex. A space probe must react quickly to the gravitational perturbations present, otherwise its safety could be compromised. Since communication delays with Earth can often reach several tens of minutes, software enabling greater operational autonomy must be developed for this type of mission. This thesis presents an autonomous navigation system that determines the position and velocity of a satellite in orbit around an asteroid. It is a three-degree-of-freedom adaptive extended Kalman filter. The proposed system relies on optical imagery to detect previously mapped "landmarks", which may be craters, boulders, or any physical feature discernible by the camera. The research focuses on the state estimation techniques specific to autonomous navigation; suitable software performing the image processing functions is therefore assumed to exist. The main research contribution is the inclusion, at each estimation cycle, of a range measurement to improve navigation performance. An adaptive state estimator is needed to process these measurements because their accuracy varies over time due to pointing error. Secondary research contributions relate to the observability analysis of the system and to a sensitivity analysis of six main design parameters.
Simulation results show that adding one range measurement per update cycle yields a significant improvement in navigation performance. This approach reduces the estimation error and the periods of non-observability, in addition to countering the dilution of precision of the measurements. The sensitivity analyses confirm the contribution of the range measurements to the overall reduction of the estimation error over a wide range of design parameters. They also indicate that the mapping error is a critical parameter for the performance of the developed navigation system. Keywords: state estimation, adaptive Kalman filter, optical navigation, lidar, asteroid, numerical simulations
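The benefit of fusing an extra range measurement into each estimation cycle can be illustrated with a minimal scalar Kalman update. This is a toy sketch, not the thesis's three-degree-of-freedom adaptive EKF, and all numbers are invented:

```python
# Minimal 1-D Kalman measurement update (illustrative only): fusing a second,
# more precise "range" measurement after an "optical" one shrinks the state
# variance further. All values below are made up.

def kalman_update(x, P, z, R):
    """Scalar Kalman update: state x, variance P, measurement z, noise R."""
    K = P / (P + R)          # Kalman gain
    x = x + K * (z - x)      # corrected state estimate
    P = (1.0 - K) * P        # reduced estimate variance
    return x, P

x, P = 0.0, 100.0                              # prior estimate and variance
x, P = kalman_update(x, P, z=10.0, R=4.0)      # optical landmark measurement
P_optical_only = P
x, P = kalman_update(x, P, z=9.5, R=1.0)       # extra range measurement
assert P < P_optical_only                      # the range fix tightens the estimate
```

In the adaptive filter of the thesis, the range-measurement noise R would itself vary over time with pointing error, which is what motivates the adaptive estimator.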
Launch Site Computer Simulation and its Application to Processes
NASA Technical Reports Server (NTRS)
Sham, Michael D.
1995-01-01
This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.
An intelligent processing environment for real-time simulation
NASA Technical Reports Server (NTRS)
Carroll, Chester C.; Wells, Buren Earl, Jr.
1988-01-01
The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA-type hardware configurations that support improved simulation are investigated. Three direct-execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.
Virtual Collaborative Simulation Environment for Integrated Product and Process Development
NASA Technical Reports Server (NTRS)
Gulli, Michael A.
1997-01-01
Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.
LISP based simulation generators for modeling complex space processes
NASA Technical Reports Server (NTRS)
Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing
1987-01-01
The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a Global Positioning System (GPS) simulation is an important activity in the qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a personal computer (PC), where the post-processing of the receiver's logged measurements-and-solutions data and of the simulated data is performed. Typically, post-processing is accomplished using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their general-purpose functions are notoriously slow and more often than not are the bottleneck even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use the previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and one of the most important considerations. An approach is developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck interpolation algorithm using a priori information that is specific to the GPS simulation application. The presented post-processing scheme was used in support of several successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is described.
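The kind of a-priori speedup the abstract alludes to can be illustrated generically: GPS simulation logs are time-ordered, so each query's bracketing interval can be found by binary search instead of a general-purpose scan over the whole table. The data and function below are illustrative, not the paper's implementation:

```python
import numpy as np

# Sketch: linear interpolation that exploits the a priori fact that the
# time axis is sorted, locating each bracketing interval by binary search.
# The time tags and values below are synthetic.

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # simulator time tags (sorted)
v = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # simulated truth values

def interp_sorted(tq, t, v):
    """Vectorized linear interpolation using binary search on sorted times."""
    i = np.searchsorted(t, tq, side="right") - 1   # left bracket index
    i = np.clip(i, 0, len(t) - 2)                  # stay inside the table
    w = (tq - t[i]) / (t[i + 1] - t[i])            # interpolation weight
    return v[i] + w * (v[i + 1] - v[i])

queries = np.array([0.5, 1.25, 3.9])
result = interp_sorted(queries, t, v)
```

Each lookup costs O(log n) rather than O(n), which is where an order-of-magnitude gain over a generic per-query scan can come from on long logs.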
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a GPS receiver test in a GPS simulator and test facility is an important step towards qualifying a receiver for space flight. Although the GPS simulator provides all the parameters needed to analyze a simulation, as well as excellent analysis tools on the simulator workstation, post-processing is not a GPS simulator or receiver function alone, and it must be planned as a separate pre-flight test program requirement. A GPS simulator is a critical resource, and it is desirable to move the pertinent test data off the simulator as soon as a test is completed. The receiver and simulator databases are used to extract the test data files for post-processing. These files are then usually moved from the simulator and receiver systems to a personal computer (PC) platform, where post-processing is typically done using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their functions are notoriously slow and more often than not are the bottleneck even for short-duration simulator-based tests. There is a need to do post-processing faster, within an hour of test completion, including all required operations on the simulator and receiver to prepare and move off the post-processing files. This is especially significant in order to use the previous test's feedback for the next simulation setup, or to run near back-to-back simulation scenarios. Solving the post-processing timing problem is critical to the success of a pre-flight test program. Towards this goal, an approach was developed that speeds up post-processing by an order of magnitude. It is based on improving the algorithm of the post-processing bottleneck function using a priori information that is specific to a GPS simulation application, and on using only the necessary volume of truth data. The presented post-processing scheme was used in support of several successful space flight missions carrying GPS receivers.
Practical Unitary Simulator for Non-Markovian Complex Processes
NASA Astrophysics Data System (ADS)
Binder, Felix C.; Thompson, Jayne; Gu, Mile
2018-06-01
Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction of simulators for a large class of stochastic processes, hence directly opening the possibility of experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.
Process Modeling and Dynamic Simulation for EAST Helium Refrigerator
NASA Astrophysics Data System (ADS)
Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing
2016-06-01
In this paper, process modeling and dynamic simulation for the EAST helium refrigerator have been completed. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by the PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. Validation of the process model has been confirmed against EAST experimental data during the 300-80 K cool-down process. Simulation results indicate that this process simulator is able to reproduce the dynamic behavior of the EAST helium refrigerator very well for the operation of long-pulsed plasma discharge. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with the long-pulsed heat loads in the future. Supported by National Natural Science Foundation of China (No. 51306195) and Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408)
Viscoelastic properties of chalcogenide glasses and the simulation of their molding processes
NASA Astrophysics Data System (ADS)
Liu, Weiguo; Shen, Ping; Jin, Na
In order to simulate the precision molding process, the viscoelastic properties of chalcogenide glasses at high temperatures were investigated. Thermomechanical analyses were performed to measure and analyze the thermomechanical properties of chalcogenide glasses, and the creep responses of the glasses at different temperatures were obtained. Finite element analysis was applied to simulate the molding processes. The simulation results were consistent with previously reported experimental results. Stress concentration and evolution during the molding processes were also described using the simulation results.
A Low Cost Microcomputer System for Process Dynamics and Control Simulations.
ERIC Educational Resources Information Center
Crowl, D. A.; Durisin, M. J.
1983-01-01
Discusses a video simulator microcomputer system used to provide real-time demonstrations to strengthen students' understanding of process dynamics and control. Also discusses hardware/software and simulations developed using the system. The four simulations model various configurations of a process liquid level tank system. (JN)
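The liquid-level tank configurations such a simulator animates follow simple first-order dynamics, e.g. a gravity-drained tank with dh/dt = (q_in - c*sqrt(h))/A. A minimal explicit-Euler sketch (all parameter values illustrative, not from the ERIC record):

```python
import math

# Toy real-time-style simulation of a gravity-drained tank level:
#   dh/dt = (q_in - c * sqrt(h)) / A
# integrated with explicit Euler. Parameters are illustrative only.

def simulate_tank(h0, q_in, area, c, dt, steps):
    """Return the level history of a gravity-drained tank."""
    h = h0
    history = [h]
    for _ in range(steps):
        h = max(0.0, h + dt * (q_in - c * math.sqrt(h)) / area)
        history.append(h)
    return history

levels = simulate_tank(h0=0.0, q_in=2.0, area=1.0, c=1.0, dt=0.1, steps=500)
# the level settles near the steady state h* = (q_in / c)**2
```

Stepping such a loop at fixed wall-clock intervals and redrawing the tank is essentially what a real-time demonstration system of this kind does.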
Application of simulation models for the optimization of business processes
NASA Astrophysics Data System (ADS)
Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří
2016-06-01
The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially to optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete event simulation and enables the creation of visual models of production and distribution processes.
Traversari, Roberto; Goedhart, Rien; Schraagen, Jan Maarten
2013-01-01
The objective is evaluation of a traditionally designed operating room using simulation of various surgical workflows. A literature search showed that there is no evidence for an optimal operating room layout regarding the position and size of an ultraclean ventilation (UCV) canopy with a separate preparation room for laying out instruments and in which patients are induced in the operating room itself. Neither was literature found reporting on process simulation being used for this application. Many technical guidelines and designs have mainly evolved over time, and there is no evidence on whether the proposed measures are also effective for the optimization of the layout for workflows. The study was conducted by applying observational techniques to simulated typical surgical procedures. Process simulations which included complete surgical teams and equipment required for the intervention were carried out for four typical interventions. Four observers used a form to record conflicts with the clean area boundaries and the height of the supply bridge. Preferences for particular layouts were discussed with the surgical team after each simulated procedure. We established that a clean area measuring 3 × 3 m and a supply bridge height of 2.05 m was satisfactory for most situations, provided a movable operation table is used. The only cases in which conflicts with the supply bridge were observed were during the use of a surgical robot (Da Vinci) and a surgical microscope. During multiple trauma interventions, bottlenecks regarding the dimensions of the clean area will probably arise. The process simulation of four typical interventions has led to significantly different operating room layouts than were arrived at through the traditional design process. Evidence-based design, human factors, work environment, operating room, traditional design, process simulation, surgical workflowsPreferred Citation: Traversari, R., Goedhart, R., & Schraagen, J. M. (2013). 
Process simulation during the design process makes the difference: Process simulations applied to a traditional design. Health Environments Research & Design Journal 6(2), pp 58-76.
Design of penicillin fermentation process simulation system
NASA Astrophysics Data System (ADS)
Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi
2011-10-01
Real-time monitoring of batch processes is attracting increasing attention; it can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.
Virtual milk for modelling and simulation of dairy processes.
Munir, M T; Zhang, Y; Yu, W; Wilson, D I; Young, B R
2016-05-01
The modeling of dairy processing using a generic process simulator suffers from shortcomings, given that many simulators do not contain milk components in their component libraries. Recently, pseudo-milk components for a commercial process simulator were proposed for simulation and the current work extends this pseudo-milk concept by studying the effect of both total milk solids and temperature on key physical properties such as thermal conductivity, density, viscosity, and heat capacity. This paper also uses expanded fluid and power law models to predict milk viscosity over the temperature range from 4 to 75°C and develops a succinct regressed model for heat capacity as a function of temperature and fat composition. The pseudo-milk was validated by comparing the simulated and actual values of the physical properties of milk. The milk thermal conductivity, density, viscosity, and heat capacity showed differences of less than 2, 4, 3, and 1.5%, respectively, between the simulated results and actual values. This work extends the capabilities of the previously proposed pseudo-milk and of a process simulator to model dairy processes, processing different types of milk (e.g., whole milk, skim milk, and concentrated milk) with different intrinsic compositions, and to predict correct material and energy balances for dairy processes. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
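The general shape of the two property models named above can be sketched generically. The functional forms are standard (an Ostwald-de Waele power law for viscosity and a linear regression for heat capacity), but every coefficient below is invented for illustration and is not one of the paper's regressed values:

```python
# Illustrative property-model shapes for a pseudo-milk component.
# All coefficients are made up; they are NOT the paper's regressed values.

def viscosity_power_law(shear_rate, K, n):
    """Power-law (Ostwald-de Waele) apparent viscosity: eta = K * gamma^(n-1)."""
    return K * shear_rate ** (n - 1.0)

def heat_capacity(temp_c, fat_frac, a=3.85, b=0.0025, c=-1.3):
    """Toy linear regression: cp (kJ/kg.K) vs temperature (C) and fat fraction."""
    return a + b * temp_c + c * fat_frac

eta = viscosity_power_law(shear_rate=100.0, K=0.01, n=0.8)   # shear-thinning
cp = heat_capacity(temp_c=20.0, fat_frac=0.04)               # whole-milk-like input
```

A process simulator's pseudo-component would expose functions of this kind (plus density and thermal conductivity) so that material and energy balances close for different milk types.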
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.; McCorkle, D.; Yang, C.
Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator, wrapped by the CASI library developed by Reaction Engineering International, to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.
Collaborative simulation method with spatiotemporal synchronization process control
NASA Astrophysics Data System (ADS)
Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian
2016-10-01
When designing a complex mechatronics system, such as a high speed train, it is relatively difficult to simulate the entire system's dynamic behavior effectively because it involves multi-disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupled simulations of the subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for the coupled simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction among subsystems, and 2) a simulation process control algorithm to realize the coupled simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can certainly be used to simulate the subsystems' interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupled simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high speed train design and development processes, demonstrating that it can be applied to a wide range of engineering system design and simulation tasks with improved efficiency and effectiveness.
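The temporal-synchronization idea behind interface-based coupling can be sketched as a generic lockstep loop: every subsystem advances by the same macro step, and interface values are exchanged only at synchronization points. The toy subsystems below are illustrative and are not the paper's method:

```python
# Generic lockstep co-simulation sketch: two subsystems exchange interface
# values only at synchronized macro-step boundaries. Toy dynamics only.

def cosimulate(step_a, step_b, xa, xb, t_end, dt):
    """Advance both subsystems in lockstep, freezing interface values per step."""
    t = 0.0
    while t < t_end:
        ya, yb = xa, xb              # interface values frozen at the sync point
        xa = step_a(xa, yb, dt)      # subsystem A sees B's last output
        xb = step_b(xb, ya, dt)      # subsystem B sees A's last output
        t += dt
    return xa, xb

# toy coupled relaxation: each state decays toward the other's output,
# so both converge to the average of the initial states (0.5 here)
step = lambda x, u, dt: x + dt * (u - x)
xa, xb = cosimulate(step, step, 1.0, 0.0, t_end=10.0, dt=0.01)
```

Real couplers add spatial interface mapping and per-subsystem solvers on different platforms, but the synchronization skeleton is the same.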
ISPE: A knowledge-based system for fluidization studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine whether all specified "goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information), and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER and the C language is shown in Figure 1.
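The prepare-run-analyze-revise cycle described above can be sketched generically. The helper names and the toy "simulator" below are hypothetical, standing in for ASPEN input preparation and execution, and are not part of IPSE:

```python
# Generic iterate-until-goals loop (hypothetical sketch of the multistep
# process in the abstract; not IPSE's or ASPEN's actual interface).

def run_until_goals(params, run_simulation, goals_met, revise, max_iter=50):
    """Prepare input -> run -> check goals -> revise input, until goals are met."""
    for _ in range(max_iter):
        results = run_simulation(params)   # (2) execute the simulation
        if goals_met(results):             # (3) analyze the results
            return params, results
        params = revise(params, results)   # (1') modify the input data and repeat
    raise RuntimeError("goals not met within the iteration budget")

# toy stand-in "simulator": squares its input; goal: output near 2.0
params, results = run_until_goals(
    1.0,
    run_simulation=lambda p: p * p,
    goals_met=lambda r: abs(r - 2.0) < 1e-6,
    revise=lambda p, r: p - (r - 2.0) / (2.0 * p),  # Newton step on p*p - 2
)
```

An intelligent assistant automates exactly the `revise` step: encoding the rules an experienced modeler would apply to the input file when a goal is missed.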
Temporal Gillespie Algorithm: Fast Simulation of Contagion Processes on Time-Varying Networks
Vestergaard, Christian L.; Génois, Mathieu
2015-01-01
Stochastic simulations are one of the cornerstones of the analysis of dynamical processes on complex networks, and are often the only accessible way to explore their behavior. The development of fast algorithms is paramount to allow large-scale simulations. The Gillespie algorithm can be used for fast simulation of stochastic processes, and variants of it have been applied to simulate dynamical processes on static networks. However, its adaptation to temporal networks remains non-trivial. We here present a temporal Gillespie algorithm that solves this problem. Our method is applicable to general Poisson (constant-rate) processes on temporal networks, stochastically exact, and up to multiple orders of magnitude faster than traditional simulation schemes based on rejection sampling. We also show how it can be extended to simulate non-Markovian processes. The algorithm is easily applicable in practice, and as an illustration we detail how to simulate both Poissonian and non-Markovian models of epidemic spreading. Namely, we provide pseudocode and its implementation in C++ for simulating the paradigmatic Susceptible-Infected-Susceptible and Susceptible-Infected-Recovered models and a Susceptible-Infected-Recovered model with non-constant recovery rates. For empirical networks, the temporal Gillespie algorithm is here typically from 10 to 100 times faster than rejection sampling. PMID:26517860
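As a hedged sketch of the algorithm's core idea, here is a minimal SI (susceptible-infected) version on a synthetic three-node contact sequence: a unit-rate exponential "clock" is consumed across snapshots in cumulated-rate units, and an event fires whenever the snapshot's rate budget crosses it. This follows the published idea for constant-rate processes but is not the authors' C++ implementation:

```python
import random

# Minimal temporal Gillespie sketch for an SI process on a temporal network
# given as a sequence of edge-list snapshots, each active for dt time units.
# beta is the constant per-SI-contact infection rate. Synthetic data only.

def temporal_gillespie_si(snapshots, dt, beta, infected, rng):
    infected = set(infected)
    tau = rng.expovariate(1.0)            # unit-rate exponential waiting "time"
    for edges in snapshots:
        remaining = dt
        while True:
            si = [e for e in edges
                  if (e[0] in infected) != (e[1] in infected)]
            rate = beta * len(si)
            if rate * remaining < tau:    # no event in the rest of this snapshot
                tau -= rate * remaining
                break
            remaining -= tau / rate       # advance to the event time
            u, w = rng.choice(si)         # all SI contacts carry equal rate
            infected.add(u)
            infected.add(w)
            tau = rng.expovariate(1.0)    # redraw the clock for the next event
    return infected

rng = random.Random(1)
snaps = [[(0, 1)], [(0, 1), (1, 2)], [(1, 2)]]
final = temporal_gillespie_si(snaps, dt=1.0, beta=5.0, infected={0}, rng=rng)
```

Unlike rejection sampling, no time steps are wasted when nothing happens: each snapshot costs one rate evaluation plus one unit of work per event that actually fires.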
NASA Astrophysics Data System (ADS)
Bednar, Earl; Drager, Steven L.
2007-04-01
The objective of quantum information processing is to harness the paradigm shift offered by quantum computing to solve classically hard, computationally challenging problems. Some of our computationally challenging problems of interest include: rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that quantum computers are difficult to realize due to poor scalability and high error rates. We have therefore supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.
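At bottom, classically simulating gate-level quantum circuits, as both tools above do in their different ways, means applying unitary matrices to a state vector. A minimal self-contained illustration (not tied to either simulator's API):

```python
import numpy as np

# Minimal state-vector simulation: apply a Hadamard gate to |0> and read
# out the measurement probabilities. Illustrative only.

H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate (unitary)

state = np.array([1.0, 0.0])                 # the |0> basis state
state = H @ state                            # equal superposition (|0>+|1>)/sqrt(2)
probs = np.abs(state) ** 2                   # Born-rule measurement probabilities
```

The scalability problem the abstract mentions is visible here: an n-qubit state vector has 2**n amplitudes, which is why structured representations such as QuIDD Pro's decision diagrams are attractive.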
NASA Astrophysics Data System (ADS)
Painter, S.; Moulton, J. D.; Berndt, M.; Coon, E.; Garimella, R.; Lewis, K. C.; Manzini, G.; Mishra, P.; Travis, B. J.; Wilson, C. J.
2012-12-01
The frozen soils of the Arctic and subarctic regions contain vast amounts of stored organic carbon. This carbon is vulnerable to release to the atmosphere as temperatures warm and permafrost degrades. Understanding the response of the subsurface and surface hydrologic system to degrading permafrost is key to understanding the rate, timing, and chemical form of potential carbon releases to the atmosphere. Simulating the hydrologic system in degrading permafrost regions is challenging because of the potential for topographic evolution and associated drainage network reorganization as permafrost thaws and massive ground ice melts. The critical process models required for simulating hydrology include subsurface thermal hydrology of freezing/thawing soils, thermal processes within ice wedges, mechanical deformation processes, overland flow, and surface energy balances including snow dynamics. A new simulation tool, the Arctic Terrestrial Simulator (ATS), is being developed to simulate these coupled processes. The computational infrastructure must accommodate fully unstructured grids that track evolving topography, allow accurate solutions on distorted grids, provide robust and efficient solutions on highly parallel computer architectures, and enable flexibility in the strategies for coupling among the various processes. The ATS is based on Amanzi (Moulton et al. 2012), an object-oriented multi-process simulator written in C++ that provides much of the necessary computational infrastructure. Status and plans for the ATS including major hydrologic process models and validation strategies will be presented. Highly parallel simulations of overland flow using high-resolution digital elevation maps of polygonal patterned ground landscapes demonstrate the feasibility of the approach. Simulations coupling three-phase subsurface thermal hydrology with a simple thaw-induced subsidence model illustrate the strong feedbacks among the processes. D. Moulton, M. Berndt, M. Day, J. Meza, et al., High-Level Design of Amanzi, the Multi-Process High Performance Computing Simulator, Technical Report ASCEM-HPC-2011-03-1, DOE Environmental Management, 2012.
An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process
NASA Technical Reports Server (NTRS)
Carter, M. C.; Madison, M. W.
1973-01-01
The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary processes involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to simulate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, for a given autocorrelation function, the mean and variance of the number of overshoots and a frequency distribution for overshoots can be estimated.
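The kind of experiment described can be reproduced with a simple discretely sampled stationary Gaussian process, an AR(1) chain (whose autocorrelation is phi**k), and a count of level upcrossings. This is a generic sketch, not the authors' simulation setup, and the helper names are my own:

```python
import math
import random

def simulate_ar1(n, phi, sigma, rng):
    """Stationary Gaussian AR(1): x[t] = phi*x[t-1] + e[t], started from
    its stationary distribution N(0, sigma^2 / (1 - phi^2)).  Its
    autocorrelation function is phi**k, a simple selectable form."""
    x = [rng.gauss(0.0, sigma / math.sqrt(1.0 - phi * phi))]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x

def count_overshoots(x, level):
    """Count upcrossings of `level`, i.e. entries into the overshoot
    region, by scanning consecutive sample pairs."""
    return sum(1 for a, b in zip(x, x[1:]) if a <= level < b)
```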
Simulation Methods for Poisson Processes in Nonstationary Systems.
1978-08-01
for simulation of nonhomogeneous Poisson processes is stated with log-linear rate function. The method is based on an identity relating the...and relatively efficient new method for simulation of one-dimensional and two-dimensional nonhomogeneous Poisson processes is described. The method is
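The truncated snippet does not specify the identity-based method, but a standard technique for the stated problem, simulating a nonhomogeneous Poisson process with a log-linear rate, is Lewis-Shedler thinning. A minimal one-dimensional sketch, with illustrative names:

```python
import math
import random

def nhpp_thinning(a, b, t_end, rng):
    """Simulate a nonhomogeneous Poisson process with log-linear rate
    lambda(t) = exp(a + b*t) on [0, t_end] by thinning (Lewis-Shedler).
    Returns the event times in increasing order."""
    # Dominating homogeneous rate: the maximum of lambda(t) on [0, t_end].
    lam_max = math.exp(a + b * t_end) if b > 0 else math.exp(a)
    times, t = [], 0.0
    while True:
        # Candidate event from the homogeneous process at rate lam_max.
        t += -math.log(rng.random()) / lam_max
        if t > t_end:
            return times
        # Accept the candidate with probability lambda(t) / lam_max.
        if rng.random() <= math.exp(a + b * t) / lam_max:
            times.append(t)
```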
40 CFR Appendix C to Part 75 - Missing Data Estimation Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
... certification of a parametric, empirical, or process simulation method or model for calculating substitute data... available process simulation methods and models. 1.2Petition Requirements Continuously monitor, determine... desulfurization, a corresponding empirical correlation or process simulation parametric method using appropriate...
Simulation in Metallurgical Processing: Recent Developments and Future Perspectives
NASA Astrophysics Data System (ADS)
Ludwig, Andreas; Wu, Menghuai; Kharicha, Abdellah
2016-08-01
This article briefly addresses the most important topics concerning numerical simulation of metallurgical processes, namely, multiphase issues (particle and bubble motion and flotation/sedimentation of equiaxed crystals during solidification), multiphysics issues (electromagnetic stirring, electro-slag remelting, Cu-electro-refining, fluid-structure interaction, and mushy zone deformation), process simulations on graphical processing units, integrated computational materials engineering, and automatic optimization via simulation. The present state-of-the-art as well as requirements for future developments are presented and briefly discussed.
NASA Technical Reports Server (NTRS)
Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.
2005-01-01
Space Shuttle Processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained - Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation Modeling is a useful tool in estimation of the makespan of the overall process. However, simulation requires a model to be developed, which itself is a labor and time consuming effort. With such a dynamic process, often the model would potentially be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. Integration of TEAMS model enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and schedule efforts and available data.
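The makespan-estimation role simulation plays here can be illustrated with a generic Monte Carlo sketch for a stochastic task network under precedence constraints. This is not the TEAMS-based model from the paper; the task graph is assumed acyclic and listed in topological order, and all names are illustrative:

```python
import random

def makespan_monte_carlo(tasks, n_runs, rng):
    """Monte Carlo estimate of expected project makespan.  `tasks` maps
    name -> (predecessor_names, duration_sampler); each run samples every
    task duration, schedules each task after its predecessors finish, and
    records the latest finish time.  Returns the mean makespan."""
    total = 0.0
    for _ in range(n_runs):
        finish = {}
        for name, (preds, sample) in tasks.items():
            start = max((finish[p] for p in preds), default=0.0)
            finish[name] = start + sample(rng)
        total += max(finish.values())
    return total / n_runs
```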
Simulation of salt production process
NASA Astrophysics Data System (ADS)
Muraveva, E. A.
2017-10-01
This paper proposes an approach that uses the iThink simulation software to model a salt production system. The dynamic processes of the original system are substituted by processes simulated in the abstract model, in compliance with the basic rules of the original system, which accelerates the research and reduces its cost. As a result, a stable, workable simulation model was obtained that can display the rate of salt exhaustion and many other parameters important for business planning.
Why a simulation system doesn't match the plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sowell, R.
1998-03-01
Process simulations, or mathematical models, are widely used by plant engineers and planners to obtain a better understanding of a particular process. These simulations are used to answer questions such as how can feed rate be increased, how can yields be improved, how can energy consumption be decreased, or how should the available independent variables be set to maximize profit? Although current process simulations are greatly improved over those of the '70s and '80s, there are many reasons why a process simulation doesn't match the plant. Understanding these reasons can assist in using simulations to maximum advantage. The reasons simulations do not match the plant may be placed in three main categories: simulation effects or inherent error, sampling and analysis effects or measurement error, and misapplication effects or set-up error.
Wieland, Birgit; Ropte, Sven
2017-01-01
The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results.
Effects of Thinking Style on Design Strategies: Using Bridge Construction Simulation Programs
ERIC Educational Resources Information Center
Sun, Chuen-Tsai; Wang, Dai-Yi; Chang, Yu-Yeh
2013-01-01
Computer simulation users can freely control operational factors and simulation results, repeat processes, make changes, and learn from simulation environment feedback. The focus of this paper is on simulation-based design tools and their effects on student learning processes in a group of 101 Taiwanese senior high school students. Participants…
Parkinson, William J.
1987-01-01
A fossil fuel furnace reactor is provided for simulating a continuous processing plant with a batch reactor. An internal reaction vessel contains a batch of shale oil, with the vessel having a relatively thin wall thickness for a heat transfer rate effective to simulate a process temperature history in the selected continuous processing plant. A heater jacket is disposed about the reactor vessel and defines a number of independent controllable temperature zones axially spaced along the reaction vessel. Each temperature zone can be energized to simulate a time-temperature history of process material through the continuous plant. A pressure vessel contains both the heater jacket and the reaction vessel at an operating pressure functionally selected to simulate the continuous processing plant. The process yield from the oil shale may be used as feedback information to software simulating operation of the continuous plant to provide operating parameters, i.e., temperature profiles, ambient atmosphere, operating pressure, material feed rates, etc., for simulation in the batch reactor.
Exact simulation of max-stable processes.
Dombry, Clément; Engelke, Sebastian; Oesting, Marco
2016-06-01
Max-stable processes play an important role as models for spatial extreme events. Their complex structure as the pointwise maximum over an infinite number of random functions makes their simulation difficult. Algorithms based on finite approximations are often inexact and computationally inefficient. We present a new algorithm for exact simulation of a max-stable process at a finite number of locations. It relies on the idea of simulating only the extremal functions, that is, those functions in the construction of a max-stable process that effectively contribute to the pointwise maximum. We further generalize the algorithm by Dieker & Mikosch (2015) for Brown-Resnick processes and use it for exact simulation via the spectral measure. We study the complexity of both algorithms, prove that our new approach via extremal functions is always more efficient, and provide closed-form expressions for their implementation that cover most popular models for max-stable processes and multivariate extreme value distributions. For simulation on dense grids, an adaptive design of the extremal function algorithm is proposed.
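As an illustration of the finite approximations the authors improve upon (not their exact extremal-functions algorithm), a truncated simulation of the classical Smith max-stable model on a one-dimensional grid might look like the following. The points (U_i, S_i) of a Poisson process with intensity u^-2 du ds are generated via cumulative exponential sums, and the sum is cut off after a fixed number of terms, which is exactly the inexactness the abstract mentions; all names are illustrative:

```python
import math
import random

def smith_model_approx(grid, n_terms, bandwidth, window, rng):
    """Finite approximation of the Smith max-stable process on a 1-D grid:
    Z(s) = max_i U_i * phi((s - S_i)/bandwidth), with (U_i, S_i) points of
    a Poisson process of intensity u^-2 du ds on (0, inf) x window.
    Truncating at n_terms makes the simulation inexact."""
    lo, hi = window
    vol = hi - lo
    gamma = 0.0
    z = [0.0] * len(grid)
    for _ in range(n_terms):
        gamma += -math.log(rng.random())   # Gamma_i = E_1 + ... + E_i
        u = vol / gamma                    # maps to intensity u^-2 on (0, inf)
        s = rng.uniform(lo, hi)            # storm center, uniform on window
        for k, x in enumerate(grid):
            d = (x - s) / bandwidth
            w = math.exp(-0.5 * d * d) / (bandwidth * math.sqrt(2 * math.pi))
            z[k] = max(z[k], u * w)        # pointwise maximum over storms
    return z
```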
Towards Automatic Processing of Virtual City Models for Simulations
NASA Astrophysics Data System (ADS)
Piepereit, R.; Schilling, A.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.
2016-10-01
Especially in the field of numerical simulations, such as flow and acoustic simulations, the interest in using virtual 3D models to optimize urban systems is increasing. The few instances in which simulations have already been carried out in practice involved an extremely high manual, and therefore uneconomical, effort to process the models. The use of different model representations in Geographic Information Systems (GIS) and Computer Aided Engineering (CAE) further increases the already very high complexity of the processing. To obtain virtual 3D models suitable for simulation, we developed a tool for automatic processing with the goal of establishing ties between the worlds of GIS and CAE. In this paper we introduce a way to use Coons surfaces for the automatic processing of building models in LoD2, and investigate ways to simplify LoD3 models in order to reduce unnecessary information for a numerical simulation.
Progress in Unsteady Turbopump Flow Simulations
NASA Technical Reports Server (NTRS)
Kiris, Cetin C.; Chan, William; Kwak, Dochan; Williams, Robert
2002-01-01
This viewgraph presentation discusses unsteady flow simulations for a turbopump intended for a reusable launch vehicle (RLV). The simulation process makes use of computational grids and parallel processing. The architecture of the parallel computers used is discussed, as is the scripting of turbopump simulations.
ISPE: A knowledge-based system for fluidization studies. 1990 Annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reddy, S.
1991-01-01
Chemical engineers use mathematical simulators to design, model, optimize and refine various engineering plants/processes. This procedure requires the following steps: (1) preparation of an input data file according to the format required by the target simulator; (2) executing the simulation; and (3) analyzing the results of the simulation to determine if all "specified goals" are satisfied. If the goals are not met, the input data file must be modified and the simulation repeated. This multistep process is continued until satisfactory results are obtained. This research was undertaken to develop a knowledge-based system, IPSE (Intelligent Process Simulation Environment), that can enhance the productivity of chemical engineers/modelers by serving as an intelligent assistant to perform a variety of tasks related to process simulation. ASPEN, a simulator widely used by the US Department of Energy (DOE) at the Morgantown Energy Technology Center (METC), was selected as the target process simulator in the project. IPSE, written in the C language, was developed using a number of knowledge-based programming paradigms: object-oriented knowledge representation that uses inheritance and methods, rule-based inferencing (including processing and propagation of probabilistic information) and data-driven programming using demons. It was implemented using the knowledge-based environment LASER. The relationship of IPSE with the user, ASPEN, LASER and the C language is shown in Figure 1.
Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu
2015-09-15
UV irradiation and advanced oxidation processes have been recently regarded as promising solutions in removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation and process control. These deficiencies can drastically hinder their application in shipping and offshore petroleum industries, which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiment was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast the removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, genetic algorithm and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost compared with traditional single-stage process optimization. The developed approach and its concept/framework have high potential of applicability in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control.
Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M
2018-04-01
Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation of and procurement processes for simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of this SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations predicted successfully the preferred simulator with good (87%) sensitivity, whereas the sensitivity of variations in cost and customer service and cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the process of simulator purchase using a standardized framework. Sensitivity of the tool improved when factors extend beyond traditionally targeted factors. We propose the tool will facilitate discussion amongst simulation professionals dealing with simulation, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed.
Simulation Framework for Teaching in Modeling and Simulation Areas
ERIC Educational Resources Information Center
De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan
2008-01-01
Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…
Operational coupled atmosphere - ocean - ice forecast system for the Gulf of St. Lawrence, Canada
NASA Astrophysics Data System (ADS)
Faucher, M.; Roy, F.; Desjardins, S.; Fogarty, C.; Pellerin, P.; Ritchie, H.; Denis, B.
2009-09-01
A fully interactive coupled atmosphere-ocean-ice forecasting system for the Gulf of St. Lawrence (GSL) has been running in experimental mode at the Canadian Meteorological Centre (CMC) for the last two winter seasons. The goal of this project is to provide more accurate weather and sea ice forecasts over the GSL and adjacent coastal areas by including atmosphere-ocean-ice interactions in the CMC operational forecast system using a formal coupling strategy between two independent modeling components. The atmospheric component is the Canadian operational GEM model (Côté et al. 1998) and the oceanic component is the ocean-ice model for the Gulf of St. Lawrence developed at the Maurice Lamontagne Institute (IML) (Saucier et al. 2003, 2004). The coupling between those two models is achieved by exchanging surface fluxes and variables through MPI communication. The re-gridding of the variables is done with a package developed at the Recherche en Prevision Numerique centre (RPN, Canada). Coupled atmosphere-ocean-ice forecasts are issued once a day based on 00GMT data. Results for the past two years have demonstrated that the coupled system produces improved forecasts in and around the GSL during all seasons, proving that atmosphere-ocean-ice interactions are indeed important even for short-term Canadian weather forecasts. This has important implications for other coupled modeling and data assimilation partnerships that are in progress involving EC, the Department of Fisheries and Oceans (DFO) and the Department of National Defence (DND). Following this experimental phase, it is anticipated that this GSL system will be the first fully interactive coupled system to be implemented at CMC.
Using Simulation Module, PCLAB, for Steady State Disturbance Sensitivity Analysis in Process Control
ERIC Educational Resources Information Center
Ali, Emad; Idriss, Arimiyawo
2009-01-01
Recently, chemical engineering education has moved towards utilizing simulation software to enhance the learning process, especially in the field of process control. These training simulators provide interactive learning through visualization and practicing, which will bridge the gap between the theoretical abstraction of textbooks and the…
Hydrological and water quality processes simulation by the integrated MOHID model
NASA Astrophysics Data System (ADS)
Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel
2016-04-01
Different modelling approaches have been used in recent decades to study the water quality degradation caused by non-point source pollution. In this study, the MOHID fully distributed and physics-based model has been employed to simulate hydrological processes and nitrogen dynamics in a nitrate vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results of this study indicate that the MOHID code is suitable for simulating hydrological processes at the watershed scale, as the model performs satisfactorily at simulating the discharge (with NSE: 0.74 and 0.76 during calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation in the model. Furthermore, the nitrogen exportation also shows satisfactory performance (with NSE: 0.64 and 0.69 during calibration and validation periods, respectively). While the lack of field measurements does not allow an in-depth evaluation of the nutrient cycling processes, it has been observed that the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model has coherently simulated the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions: the highest rates are localized near the discharge zone of the aquifer and where the aquifer thickness is low. These results evidence the strength of this model for simulating watershed-scale hydrological processes as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient exportation and nutrient cycling processes).
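The NSE values quoted are Nash-Sutcliffe efficiencies. A minimal implementation of that metric, with illustrative names:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance about the
    observed mean.  NSE = 1 is a perfect fit; NSE <= 0 means the model
    predicts no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_tot
```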
Plasma Processing of Lunar Regolith Simulant for Diverse Applications
NASA Technical Reports Server (NTRS)
Schofield, Elizabeth C.; Sen, Subhayu; O'Dell, J. Scott
2008-01-01
Versatile manufacturing technologies for extracting resources from the moon are needed to support future space missions. Of particular interest is the production of gases and metals from lunar resources for life support, propulsion, and in-space fabrication. Deposits made from lunar regolith could yield highly emissive coatings and near-net shaped parts for replacement or repair of critical components. Equally important is development of high fidelity lunar simulants for ground based validation of potential lunar surface operations. Described herein is an innovative plasma processing technique for in-situ production of gases, metals, coatings, and deposits from lunar regolith, and synthesis of high fidelity lunar simulant from NASA-issued lunar simulant JSC-1. Initial plasma reduction trials of JSC-1 lunar simulant have indicated production of metallic iron and magnesium. Evolution of carbon monoxide has been detected subsequent to reduction of the simulant using the plasma process. Plasma processing of the simulant has also resulted in glassy phases resembling the volcanic glass and agglutinates found in lunar regolith. Complete and partial glassy phase deposits have been obtained by varying the plasma process variables. Experimental techniques, product characterization, and process gas analysis will be discussed.
Grace: A cross-platform micromagnetic simulator on graphics processing units
NASA Astrophysics Data System (ADS)
Zhu, Ru
2015-12-01
A micromagnetic simulator running on graphics processing units (GPUs) is presented. Different from the GPU implementations of other research groups, which predominantly run on NVidia's CUDA platform, this simulator is developed with C++ Accelerated Massive Parallelism (C++ AMP) and is hardware platform independent. It runs on GPUs from vendors including NVidia, AMD and Intel, and achieves significant performance boost as compared to previous central processing unit (CPU) simulators, up to two orders of magnitude. The simulator paved the way for running large size micromagnetic simulations on both high-end workstations with dedicated graphics cards and low-end personal computers with integrated graphics cards, and is freely available to download.
A Simplified Finite Element Simulation for Straightening Process of Thin-Walled Tube
NASA Astrophysics Data System (ADS)
Zhang, Ziqian; Yang, Huilin
2017-12-01
Finite element simulation is an effective way to study thin-walled tubes in the two-cross-roll straightening process. To determine the accurate radius of curvature of the roll profile more efficiently, a simplified finite element model based on the technical parameters of an actual two-cross-roll straightening machine was developed to simulate the complex straightening process. A dynamic simulation was then carried out using the ANSYS LS-DYNA program. The results implied that the simplified finite element model is adequate for simulating the two-cross-roll straightening process, and that the radius of curvature of the roll profile can be obtained for a tube straightness of 2 mm/m.
PLYMAP : a computer simulation model of the rotary peeled softwood plywood manufacturing process
Henry Spelter
1990-01-01
This report documents a simulation model of the plywood manufacturing process. Its purpose is to enable a user to make quick estimates of the economic impact of a particular process change within a mill. The program was designed to simulate the processing of plywood within a relatively simplified mill design. Within that limitation, however, it allows a wide range of...
NASA Astrophysics Data System (ADS)
Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong
2016-11-01
In order to satisfy real-time and generality requirements, a laser target simulator in a semi-physical simulation system based on the RTX + LabWindows/CVI platform is proposed in this paper. Compared with the upper/lower-computer simulation platform architecture used in most current real-time systems, this system has better maintainability and portability. The system runs on the Windows platform, using the Windows RTX real-time extension subsystem to ensure real-time performance, combined with a reflective memory network to complete real-time tasks such as calculating the simulation model, transmitting the simulation data, and maintaining real-time communication. The real-time tasks of the simulation system run under the RTSS process. At the same time, we use LabWindows/CVI to build a graphical interface and to handle non-real-time tasks in the simulation process, such as man-machine interaction and the display and storage of simulation data, which run under the Win32 process. Through the design of RTX shared memory and a task scheduling algorithm, data interaction between the real-time RTSS process and the non-real-time Win32 process is achieved. The experimental results show that this system has strong real-time performance, high stability, and high simulation accuracy, as well as good human-computer interaction.
A Process for Comparing Dynamics of Distributed Space Systems Simulations
NASA Technical Reports Server (NTRS)
Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.
2009-01-01
The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
Using a simulation assistant in modeling manufacturing systems
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.
1988-01-01
Numerous simulation languages exist for modeling discrete event processes, and many have been ported to microcomputers. Graphics and animation capabilities have been added to many of these languages to help users build models and evaluate the simulation results. Despite these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.
Synchronization Of Parallel Discrete Event Simulations
NASA Technical Reports Server (NTRS)
Steinman, Jeffrey S.
1992-01-01
Breathing Time Buckets is an adaptive synchronization algorithm for parallel discrete-event simulation, developed in the Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. The algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while the simulation is in progress, combining the best features of optimistic and conservative synchronization strategies while avoiding their major disadvantages. It is well suited for modeling communication networks, large-scale war games, simulated aircraft flights, simulations of computer equipment, mathematical modeling, interactive engineering simulations, and depictions of information flows.
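The cycle structure behind this algorithm can be conveyed in a few lines. The following is a toy, single-process sketch of the Breathing Time Buckets idea (function names and the handler interface are hypothetical; the real SPEEDES algorithm runs across processors and includes rollback machinery): events are processed optimistically until the "event horizon", the earliest timestamp of any event generated during the current cycle, after which the cycle commits.

```python
import heapq

def breathing_time_buckets(initial_events, handler, t_end):
    """Toy single-process sketch: process events optimistically until the
    'event horizon' (earliest timestamp of any event generated during the
    current cycle), then commit the cycle and begin the next one."""
    pending = list(initial_events)            # (time, id) tuples
    heapq.heapify(pending)
    committed = []
    while pending:
        horizon = float('inf')                # no new events seen yet
        generated = []
        # optimistic phase: stop as soon as an event crosses the horizon
        while pending and pending[0][0] < horizon:
            event = heapq.heappop(pending)
            for new_event in handler(event):  # handler may create events
                horizon = min(horizon, new_event[0])
                generated.append(new_event)
            committed.append(event)
        # commit phase: newly generated events join the queue
        for new_event in generated:
            if new_event[0] <= t_end:
                heapq.heappush(pending, new_event)
    return committed
```

Because each cycle commits only events that precede anything generated within it, causal order is preserved without global lockstep; in the parallel algorithm the horizon is taken as the minimum over all processors.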
Designing a SCADA system simulator for fast breeder reactor
NASA Astrophysics Data System (ADS)
Nugraha, E.; Abdullah, A. G.; Hakim, D. L.
2016-04-01
A SCADA (Supervisory Control and Data Acquisition) system simulator is Human Machine Interface-based software that can visualize the processes of a plant. This study describes the design of a SCADA system simulator that aims to assist the operator in monitoring, controlling, handling alarms, and accessing historical data and historical trends in a Nuclear Power Plant (NPP) of the Fast Breeder Reactor (FBR) type. The research simulates the Kalpakkam FBR NPP in India. The simulator was developed using Wonderware InTouch 10 software and is equipped with a main menu, plant overview, area graphics, control displays, set-point displays, an alarm system, real-time trending, historical trending and a security system. It properly reproduces the energy flow and energy conversion processes of an FBR NPP and can be used as a training medium for prospective FBR operators.
Process simulations for manufacturing of thick composites
NASA Astrophysics Data System (ADS)
Kempner, Evan A.
The availability of manufacturing simulations for composites can significantly reduce the costs associated with process development. Simulations provide a tool for evaluating the effect of processing conditions on the quality of parts produced without requiring numerous experiments. This is especially significant in parts that have troublesome features such as large thickness. The development of simulations for thick-walled composites has been approached by examining the mechanics of resin flow and fiber deformation during processing, applying these analyses to develop simulations, and evaluating the simulations against experimental results. A unified analysis is developed to describe the three-dimensional resin flow and fiber preform deformation during processing regardless of the manufacturing process used. It is shown how the generic governing equations in the unified analysis can be applied to autoclave molding, compression molding, pultrusion, filament winding, and resin transfer molding. A comparison is provided with earlier models derived individually for these processes. The equations described for autoclave curing were used to produce a one-dimensional cure simulation for autoclave curing of thick composites. The simulation consists of an analysis of heat transfer and resin flow in the composite as well as in the bleeder plies used to absorb resin removed from the part. Experiments were performed in a hot press to approximate curing in an autoclave. Graphite/epoxy laminates of 3 cm and 5 cm thickness were cured while monitoring the laminate thickness and the temperatures at several points inside the laminate. The simulation predicted temperatures fairly closely, but difficulties were encountered in correlating the thickness results. This simulation was also used to study the effects of prepreg aging on processing of thick composites. An investigation was also performed on filament winding with prepreg tow. Cylinders of approximately 12 mm thickness were hoop wound, with pressure gages at the mandrel-composite interface, at tensions ranging from 13 N to 34 N. An analytical model was developed to calculate the change in stress due to relaxation during winding. Although compressive circumferential stresses occurred throughout each of the cylinders, their magnitude was fairly low.
NASA Technical Reports Server (NTRS)
Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga
2005-01-01
Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
NASA Astrophysics Data System (ADS)
Abustan, M. S.; Rahman, N. A.; Gotoh, H.; Harada, E.; Talib, S. H. A.
2016-07-01
In Malaysia, little research on crowd evacuation simulation has been reported. Hence, the development of numerical crowd evacuation models that take into account people's behavioral patterns and psychological characteristics is crucial for Malaysia. At the same time, tsunami disasters began to gain the attention of Malaysian citizens after the 2004 Indian Ocean Tsunami, which demands a rapid evacuation process. In view of these circumstances, we have conducted simulations of the tsunami evacuation process at Miami Beach on Penang Island using a Distinct Element Method (DEM)-based crowd behavior simulator. The main objectives are to investigate and reproduce the current conditions of the evacuation process at that location under different hypothetical scenarios in order to study the efficiency of the evacuation. Sim-1 represents the initial evacuation plan, while sim-2 improves on it by adding a new evacuation area. According to the simulation results, sim-2 achieved a shorter evacuation time than sim-1: the evacuation time was reduced by 53 seconds. The effect of the additional evacuation place is confirmed by the decrease in the evacuation completion time. These results suggest that numerical simulation can serve as an effective tool for studying crowd evacuation processes.
The use of discrete-event simulation modelling to improve radiation therapy planning processes.
Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven
2009-07-01
The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Rui, E-mail: Sunsr@hit.edu.cn; Ismail, Tamer M., E-mail: temoil@aucegypt.edu; Ren, Xiaohan
Highlights: • The effects of moisture content on the burning process of MSW are investigated. • A two-dimensional mathematical model was built to simulate the combustion process. • Temperature distributions, process rates, and gas species were measured and simulated. • The conversion ratios of C/CO and N/NO in MSW are inverse to moisture content. - Abstract: In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady-state model and an experimental study were employed to investigate the combustion process in a fixed bed reactor of municipal solid waste (MSW). Conservation equations for the waste bed were implemented to describe the incineration process. The gas-phase turbulence was modeled using the k–ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste property characteristics. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulated solid temperature, gas species and process rates in the bed agree with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The overall bed combustion process slows greatly as MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW.
Dynamic Simulation of a Helium Liquefier
NASA Astrophysics Data System (ADS)
Maekawa, R.; Ooba, K.; Nobutoki, M.; Mito, T.
2004-06-01
Dynamic behavior of a helium liquefier has been studied in detail with a Cryogenic Process REal-time SimulaTor (C-PREST) at the National Institute for Fusion Science (NIFS). C-PREST is being developed to integrate large-scale helium cryogenic plant design, operation and maintenance for optimum process establishment. As a first step, a simulation of cooldown to 4.5 K with the helium liquefier model was conducted, which provides a plant-process validation platform. The helium liquefier consists of seven heat exchangers, a liquid-nitrogen (LN2) precooler, two expansion turbines and a liquid-helium (LHe) reservoir. Process simulations are carried out with sequence programs, which were implemented in C-PREST based on an existing liquefier operation. The interactions of a JT valve, a JT-bypass valve and a reservoir-return valve have been dynamically simulated. The paper discusses various aspects of refrigeration process simulation, including difficulties such as the balance between the complexity of the adopted models and CPU time.
Bencala, Kenneth E.
1984-01-01
Solute transport in streams is determined by the interaction of physical and chemical processes. Data from an injection experiment for chloride and several cations indicate significant influence of solute-streambed processes on transport in a mountain stream. These data are interpreted in terms of transient storage processes for all tracers and sorption processes for the cations. Process parameter values are estimated with simulations based on coupled quasi-two-dimensional transport and first-order mass transfer sorption. Comparative simulations demonstrate the relative roles of the physical and chemical processes in determining solute transport. During the first 24 hours of the experiment, chloride concentrations were attenuated relative to expected plateau levels. Additional attenuation occurred for the sorbing cation strontium. The simulations account for these storage processes. Parameter values determined by calibration compare favorably with estimates from other studies in mountain streams. Without further calibration, the transport of potassium and lithium is adequately simulated using parameters determined in the chloride-strontium simulation and with measured cation distribution coefficients.
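The coupled transport and first-order mass-transfer picture described here can be illustrated with a deliberately simple numerical sketch (all parameter values below are hypothetical placeholders, not the calibrated ones from the study): the stream is treated as a chain of mixed reaches, each exchanging solute with an immobile storage zone at a first-order rate.

```python
import numpy as np

def transient_storage(c_in, n_reach, dt, n_steps,
                      q=1.0, v=10.0, vs=5.0, alpha=0.05):
    """Explicit-Euler sketch of transient-storage transport: a chain of
    mixed reaches (volume v, flow q) each exchanging solute with an
    immobile storage zone (volume vs) at first-order rate alpha.
    Parameter values are illustrative only."""
    c = np.zeros(n_reach)       # stream concentration per reach
    cs = np.zeros(n_reach)      # storage-zone concentration per reach
    for k in range(n_steps):
        # upstream boundary feeds reach 0; each reach feeds the next
        upstream = np.concatenate(([c_in(k * dt)], c[:-1]))
        c_new = c + dt * ((q / v) * (upstream - c) + alpha * (cs - c))
        cs = cs + dt * (alpha * (v / vs) * (c - cs))
        c = c_new
    return c, cs
```

With a step input, the storage zone lags the stream and then releases solute back, which is the mechanism behind the attenuated plateau concentrations described above.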
A Computer Simulation of Bacterial Growth During Food-Processing
1974-11-01
Technical report by Edward W. Ross, Jr.; Army Natick Laboratories, Natick, Massachusetts. Approved for public release.
Software-Engineering Process Simulation (SEPS) model
NASA Technical Reports Server (NTRS)
Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.
1992-01-01
The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
A Software Development Simulation Model of a Spiral Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Malone, Linda
2007-01-01
There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.
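A discrete-event model of a phased process can be built on a very small engine. The sketch below is illustrative only (the phase names and fixed durations are invented, not taken from the paper's model): it runs two spiral cycles of plan/build/evaluate phases through a time-ordered event queue.

```python
import heapq

class DES:
    """Minimal discrete-event engine: a clock plus a time-ordered queue
    of (time, tiebreak, action) events."""
    def __init__(self):
        self.clock = 0.0
        self._queue = []
        self._count = 0                 # tiebreaker for equal event times
    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.clock + delay, self._count, action))
        self._count += 1
    def run(self):
        while self._queue:
            self.clock, _, action = heapq.heappop(self._queue)
            action(self)

# Hypothetical spiral process: each cycle runs three phases in sequence.
PHASES = [("plan", 2.0), ("build", 5.0), ("evaluate", 1.0)]
log = []

def start_cycle(k, n_cycles):
    def action(sim):
        t = 0.0
        for name, duration in PHASES:
            t += duration               # phase-completion time offsets
            sim.schedule(t, lambda s, nm=name: log.append((s.clock, nm)))
        if k + 1 < n_cycles:
            sim.schedule(t, start_cycle(k + 1, n_cycles))
    return action

sim = DES()
sim.schedule(0.0, start_cycle(0, 2))
sim.run()
```

In a cost and schedule study, the fixed durations would be replaced by sampled random variates and the event log would feed schedule statistics across many replications.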
Architectural Improvements and New Processing Tools for the Open XAL Online Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M
The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.
Reduced order model based on principal component analysis for process simulation and optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lang, Y.; Malacina, A.; Biegler, L.
2009-01-01
It is well-known that distributed parameter computational fluid dynamics (CFD) models provide more accurate results than conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. By considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within well-sampled input domain and the CPU time is significantly reduced. Typically, it takes at most several CPU seconds to evaluate the ROM compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using our proposed ROM methodology for process simulation and optimization.
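The PCA-based ROM strategy can be miniaturized as follows (pure NumPy; the data are synthetic stand-ins for CFD snapshots, and the affine regression onto mode scores is one simple choice of response surface, not necessarily the one used in the study): compress the high-dimensional outputs to a few principal components, then fit a cheap map from inputs to the component scores.

```python
import numpy as np

def build_rom(X_in, Y_out, n_modes):
    """PCA-based reduced-order model sketch: SVD of the centered outputs
    gives principal modes; a least-squares affine map predicts mode
    scores from the inputs; prediction reconstructs outputs from the
    predicted scores."""
    y_mean = Y_out.mean(axis=0)
    _, _, Vt = np.linalg.svd(Y_out - y_mean, full_matrices=False)
    modes = Vt[:n_modes]                    # (n_modes, n_dof)
    scores = (Y_out - y_mean) @ modes.T     # low-dimensional coordinates
    A = np.hstack([X_in, np.ones((len(X_in), 1))])
    coef, *_ = np.linalg.lstsq(A, scores, rcond=None)
    def predict(x):
        a = np.append(x, 1.0) @ coef        # predicted mode scores
        return y_mean + a @ modes
    return predict

# Synthetic stand-in for expensive CFD runs: 30-dof outputs, 2 inputs.
rng = np.random.default_rng(0)
B = rng.normal(size=(2, 30))
offset = rng.normal(size=30)
X = rng.normal(size=(50, 2))
Y = X @ B + offset
predict = build_rom(X, Y, n_modes=2)
```

Evaluating `predict` costs only a couple of small matrix products, which illustrates why a ROM takes CPU seconds where the full CFD model takes hours.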
A simulation framework for mapping risks in clinical processes: the case of in-patient transfers.
Dunn, Adam G; Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne
2011-05-01
To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols.
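The end-of-chain risk idea can be conveyed with a stripped-down Monte Carlo sketch (the step probabilities, the threshold, and the independence assumption are all illustrative; the paper's agent-based model is calibrated from observed transfers and its agents interact):

```python
import random

def simulate_transfers(step_probs, threshold, n_runs, seed=0):
    """Toy cascade model: each step of an in-patient transfer is
    independently violated with its own probability; an end-of-chain
    event occurs when at least `threshold` barriers fail. Returns the
    estimated end-of-chain risk and the number of unique trajectories."""
    rng = random.Random(seed)
    trajectories = set()
    end_of_chain = 0
    for _ in range(n_runs):
        traj = tuple(rng.random() < p for p in step_probs)
        trajectories.add(traj)              # each traj is one work-as-done path
        if sum(traj) >= threshold:
            end_of_chain += 1
    return end_of_chain / n_runs, len(trajectories)
```

Counting unique trajectories alongside the risk mirrors the paper's observation that divergence from prescribed practice, not any single violation, dominates the process.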
Llorens, Esther; Saaltink, Maarten W; Poch, Manel; García, Joan
2011-01-01
The performance and reliability of the CWM1-RETRASO model for simulating processes in horizontal subsurface flow constructed wetlands (HSSF CWs), and the relative contribution of different microbial reactions to organic matter (COD) removal in an HSSF CW treating urban wastewater, were evaluated. Various approaches with different influent configurations were simulated. According to the simulations, anaerobic processes were more widespread in the simulated wetland and contributed to a higher COD removal rate [72-79%] than anoxic [0-1%] and aerobic reactions [20-27%] did. In all the cases tested, the reaction that most contributed to COD removal was methanogenesis [58-73%]. All results provided by the model were consistent with the literature and experimental field observations, suggesting good performance and reliability of CWM1-RETRASO. Given these good simulation predictions, CWM1-RETRASO is the first mechanistic model able to successfully simulate the processes described by the CWM1 model in HSSF CWs.
ERIC Educational Resources Information Center
Clarke, Matthew A.; Giraldo, Carlos
2009-01-01
Chemical process simulation is one of the most fundamental skills that is expected from chemical engineers, yet relatively few graduates have the opportunity to learn, in depth, how a process simulator works, from programming the unit operations to the sequencing. The University of Calgary offers a "hands-on" postgraduate course in…
ERIC Educational Resources Information Center
Duffy, Melissa C.; Azevedo, Roger; Sun, Ning-Zi; Griscom, Sophia E.; Stead, Victoria; Crelinsten, Linda; Wiseman, Jeffrey; Maniatis, Thomas; Lachapelle, Kevin
2015-01-01
This study examined the nature of cognitive, metacognitive, and affective processes among a medical team experiencing difficulty managing a challenging simulated medical emergency case by conducting in-depth analysis of process data. Medical residents participated in a simulation exercise designed to help trainees to develop medical expertise,…
Modeling and Simulation of Quenching and Tempering Process in steels
NASA Astrophysics Data System (ADS)
Deng, Xiaohu; Ju, Dongying
Quenching and tempering (Q&T) is a combined heat treatment process used to achieve maximum toughness and ductility at a specified hardness and strength. It is important to develop a mathematical model of the quenching and tempering process so that mechanical-property requirements can be satisfied at low cost. This paper presents a modified model to predict structural evolution and hardness distribution during the quenching and tempering of steels. The model takes into account tempering parameters, carbon content, and isothermal and non-isothermal transformations. Moreover, the precipitation of transition carbides, the decomposition of retained austenite and the precipitation of cementite can each be simulated. Hardness distributions of the quenched and tempered workpiece are predicted by an experimental regression equation. To validate the model, it was employed to predict the tempering of 80MnCr5 steel. The predicted precipitation dynamics of transition carbides and cementite are consistent with previous experimental and simulated results from the literature. The model was then implemented within the framework of the developed simulation code COSMAP to simulate microstructure, stress and distortion in the heat-treated component, and applied to simulate the Q&T process of J55 steel. The calculated results show good agreement with the experimental ones, indicating that the model is effective for simulating the Q&T process of steels.
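Tempering models of this kind typically fold temperature and time into a single severity measure. As a minimal illustration (generic textbook material, not the paper's modified model), the classic Hollomon-Jaffe tempering parameter can be computed as follows; the constant C ≈ 20 is a common assumption for many steels:

```python
import math

def hollomon_jaffe(T_celsius, t_hours, C=20.0):
    """Hollomon-Jaffe tempering parameter: P = T * (C + log10(t)), with
    T in kelvin and t in hours. Equal P values correspond to roughly
    equivalent tempering severity (and hence similar hardness loss)."""
    T = T_celsius + 273.15
    return T * (C + math.log10(t_hours))
```

The parameter makes the time-temperature tradeoff explicit: holding longer at a lower temperature can reach the same severity as a short hold at a higher one.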
The numerical modelling and process simulation for the fault diagnosis of rotary kiln incinerator.
Roh, S D; Kim, S W; Cho, W S
2001-10-01
The numerical modelling and process simulation for the fault diagnosis of a rotary kiln incinerator were accomplished. In the numerical modelling, two models are applied to the kiln: a combustion chamber model, including the mass and energy balance equations for the two combustion chambers, and a 3D thermal model. The combustion chamber model predicts the temperature within the kiln, the flue gas composition, the flux and the heat of combustion. Using the combustion chamber model and the 3D thermal model, production rules for the process simulation can be obtained through an analysis of the interrelations between control and operation variables. The process simulation of the kiln is operated with the production rules for automatic operation. The process simulation aims to provide fundamental solutions to problems in the incineration process by introducing an online expert control system that brings integrity to process control and management. Knowledge-based expert control systems use symbolic logic and heuristic rules to find solutions for various types of problems. The system was implemented as a hybrid intelligent expert control system by connecting it with the process control systems, giving it the capability of process diagnosis, analysis and control.
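A production-rule layer of the kind described can be sketched in a few lines (rule names and thresholds below are invented for illustration; the actual rule base would come from the interrelation analysis between control and operation variables):

```python
def diagnose(readings, rules):
    """Minimal forward-chaining sketch: each rule is (name, condition,
    recommendation); conditions are predicates over sensor readings,
    and every firing rule contributes its recommendation."""
    return [rec for name, cond, rec in rules if cond(readings)]

# Hypothetical rule base; thresholds are placeholders, not the paper's.
RULES = [
    ("overtemp", lambda r: r["kiln_temp"] > 1100, "reduce fuel feed"),
    ("low_O2",   lambda r: r["flue_o2"] < 3.0,    "increase air flow"),
    ("high_CO",  lambda r: r["flue_co"] > 200,    "check combustion"),
]
```

Real expert control shells add conflict resolution and chaining on derived facts, but the condition/action structure is the same.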
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
Discrete-Event Simulation in Chemical Engineering.
ERIC Educational Resources Information Center
Schultheisz, Daniel; Sommerfeld, Jude T.
1988-01-01
Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)
Fuzzy simulation in concurrent engineering
NASA Technical Reports Server (NTRS)
Kraslawski, A.; Nystrom, L.
1992-01-01
Concurrent engineering is becoming a very important practice in manufacturing. A problem in concurrent engineering is the uncertainty associated with the values of the input variables and operating conditions. The problem discussed in this paper concerns the simulation of processes where the raw materials and the operational parameters possess fuzzy characteristics. The processing of fuzzy input information is performed by the vertex method and the commercial simulation packages POLYMATH and GEMS. Examples are presented to illustrate the usefulness of the method in the simulation of chemical engineering processes.
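The vertex method itself is compact enough to show directly (a generic sketch, independent of the POLYMATH/GEMS packages): at each alpha-cut a fuzzy input becomes an interval, and the output interval at that cut is obtained by evaluating the model at every corner of the input box, which is exact when the model is monotonic in each variable over the box.

```python
from itertools import product

def vertex_method(f, intervals):
    """Vertex method for interval (alpha-cut) propagation: evaluate f at
    every corner of the input hyper-box and take the min/max as the
    output interval. Exact for functions monotonic in each variable."""
    values = [f(*corner) for corner in product(*intervals)]
    return min(values), max(values)
```

Sweeping the alpha level from 0 to 1 and repeating this evaluation reconstructs the fuzzy membership function of the simulation output.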
Simulation of SiO2 etching in an inductively coupled CF4 plasma
NASA Astrophysics Data System (ADS)
Xu, Qing; Li, Yu-Xing; Li, Xiao-Ning; Wang, Jia-Bin; Yang, Fan; Yang, Yi; Ren, Tian-Ling
2017-02-01
Plasma etching technology is an indispensable processing method in the manufacture of semiconductor devices. Because of its high fluorine/carbon ratio, CF4 gas is often used for etching SiO2. The commercial software ESI-CFD is used to simulate the plasma etching process with an inductively coupled plasma model: CFD-ACE is used to simulate the chamber, and CFD-TOPO is used to simulate the surface of the sample. The effects of chamber pressure, bias voltage and ICP power on the reactant particles were investigated, and the etching profiles of SiO2 were obtained. Simulation can be used to predict the effects of reaction conditions on the density, energy and angular distributions of the reactant particles, which helps guide the etching process.
Numerical simulation study on rolling-chemical milling process of aluminum-lithium alloy skin panel
NASA Astrophysics Data System (ADS)
Huang, Z. B.; Sun, Z. G.; Sun, X. F.; Li, X. Q.
2017-09-01
Single-curvature parts such as aircraft fuselage skin panels are usually manufactured by a rolling-chemical milling process, which often suffers from geometric inaccuracy caused by springback. In most cases, manual adjustment and multiple roll bending are used to control or eliminate the springback. However, these methods increase product cost and cycle time, and lead to material performance degradation. It is therefore important to precisely control the springback of the rolling-chemical milling process. In this paper, combining experiments with numerical simulation of the rolling-chemical milling process, a simulation model for the rolling-chemical milling of 2060-T8 aluminum-lithium alloy skin was established and validated by comparing the numerical and experimental results. Then, based on the numerical simulation model, the process parameters that influence the curvature of the skin panel were analyzed. Finally, springback prediction and compensation can be realized by controlling the process parameters.
Simulation Learning: PC-Screen Based (PCSB) versus High Fidelity Simulation (HFS)
2012-08-01
Compares methods for the use of simulation for teaching clinical skills to military and civilian clinicians; high-fidelity simulation is an expensive method. [Remainder of abstract fragmentary: IRB study-modification boilerplate and residue of a C-Collar simulation algorithm flowchart (Pathway A, Scenario A: spinal stabilization sub-processes).]
Use of high performance networks and supercomputers for real-time flight simulation
NASA Technical Reports Server (NTRS)
Cleveland, Jeff I., II
1993-01-01
In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be consistent in processing time and be completed in as short a time as possible. These operations include simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to the Computer Automated Measurement and Control (CAMAC) technology which resulted in a factor of ten increase in the effective bandwidth and reduced latency of modules necessary for simulator communication. This technology extension is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC are completing the development of the use of supercomputers for mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and development of specialized software and hardware for the simulator network. This paper describes the data acquisition technology and the development of supercomputing for flight simulation.
Simulation methods supporting homologation of Electronic Stability Control in vehicle variants
NASA Astrophysics Data System (ADS)
Lutz, Albert; Schick, Bernhard; Holzmann, Henning; Kochem, Michael; Meyer-Tuve, Harald; Lange, Olav; Mao, Yiqin; Tosolin, Guido
2017-10-01
Vehicle simulation has a long tradition in the automotive industry as a powerful supplement to physical vehicle testing. In the field of Electronic Stability Control (ESC) systems, simulation is well established to support ESC development and application by suppliers and Original Equipment Manufacturers (OEMs). The latest regulation of the United Nations Economic Commission for Europe, UN/ECE-R 13, also allows for simulation-based homologation. This extends the usage of simulation from ESC development to homologation. This paper gives an overview of simulation methods, as well as processes and tools used for the homologation of ESC in vehicle variants. The paper first describes the generic homologation process according to the European regulations (UN/ECE-R 13H, UN/ECE-R 13/11) and U.S. Federal Motor Vehicle Safety Standard (FMVSS 126). Subsequently, the ESC system is explained, as well as the generic application and release process on the supplier and OEM side. For simulation-based work, the ESC development and application process needs to be adapted to virtual vehicles. The simulation environment, consisting of the vehicle model, the ESC model and the simulation platform, is explained in detail with some exemplary use-cases. In the final section, examples of simulation-based ESC homologation in vehicle variants are shown for passenger cars, light trucks, heavy trucks and trailers. This paper aims to give a state-of-the-art account of the simulation methods supporting the homologation of ESC systems in vehicle variants. The described approach and the lessons learned can also serve in the future as a reference for extended usage of simulation-supported releases of the ESC system, up to the development and release of driver assistance systems.
Knowledge-based simulation for aerospace systems
NASA Technical Reports Server (NTRS)
Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.
1988-01-01
Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic-processing/numeric-processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and capable of ensuring that all possible behaviors are considered.
JIMM: the next step for mission-level models
NASA Astrophysics Data System (ADS)
Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.
2001-09-01
The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product are done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of Department of Defense programs utilizing SBA. Through its generic way of representing simulation entities, its data analysis capability, and its robust configuration management process, JIMM can support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). An MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.
Rausch, Alexander M; Küng, Vera E; Pobel, Christoph; Markl, Matthias; Körner, Carolin
2017-09-22
The resulting properties of parts fabricated by powder bed fusion additive manufacturing processes are determined by their porosity, local composition, and microstructure. The objective of this work is to examine the influence of the stochastic powder bed on the process window for dense parts by means of numerical simulation. The investigations demonstrate the unique capability of simulating macroscopic domains in the range of millimeters with a mesoscopic approach, which resolves the powder bed and the hydrodynamics of the melt pool. A simulated process window reveals the influence of the stochastic powder layer. The numerical results are verified with an experimental process window for selective electron beam-melted Ti-6Al-4V. Furthermore, the influence of the powder bulk density is investigated numerically. The simulations predict an increase in porosity and surface roughness for samples produced with lower powder bulk densities. Due to its higher probability for unfavorable powder arrangements, the process stability is also decreased. This shrinks the actual parameter range in a process window for producing dense parts.
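The shrinking of the process window at lower powder bulk density can be caricatured with a scalar surrogate. The energy-density thresholds, the 5% porosity criterion, and the surrogate model itself are made-up assumptions for illustration, not the mesoscopic melt-pool simulation used in the paper:

```python
import numpy as np

def porosity(power, speed, bulk_density=0.6):
    """Toy surrogate: line energy E = P/v; porosity appears when E is too
    low (lack of fusion) or too high (keyholing), and a looser powder bed
    raises the lack-of-fusion threshold."""
    e = power / speed
    lack_of_fusion = max(0.0, 1.0 / bulk_density - e)
    keyholing = max(0.0, e - 4.0)
    return lack_of_fusion + keyholing

powers = np.linspace(0.5, 4.0, 8)
speeds = np.linspace(0.5, 2.0, 4)

def process_window(bulk_density):
    """Parameter pairs predicted to yield a dense part (porosity < 5%)."""
    return [(p, v) for p in powers for v in speeds
            if porosity(p, v, bulk_density) < 0.05]

window_nominal = process_window(0.6)
window_loose = process_window(0.5)   # lower powder bulk density
```

Even this caricature reproduces the qualitative finding: the loose-powder window is a strict subset of the nominal one.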
NASA Astrophysics Data System (ADS)
Daïf, A.; Ali Chérif, A.; Bresson, J.; Sarh, B.
1995-10-01
The vaporization of one or two multi-component fuel droplets in a hot air stream is presented. A thermal wind tunnel with an experimental test section has been designed for the experiments. First, a comparison between experimental results and numerical data is presented for the case of an isolated multi-component droplet. The numerical method is based on the resolution of the heat and mass transfer equations between the droplet and the gas stream. The model includes the effect of Stefan flow, the effect of variable thermophysical properties of the components in both phases, and a non-unity Lewis number in the gas film. Beyond the deeper analysis afforded by the comparison between computation and experiment, the experimental results show the micro-explosion phenomenon observed inside the liquid phase of a multi-component droplet at low temperature. The experimental case of two interacting droplets, whether of pure fuel or of a mixture, is also presented.
When teams shift among processes: insights from simulation and optimization.
Kennedy, Deanna M; McComb, Sara A
2014-09-01
This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zacarro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research. PsycINFO Database Record (c) 2014 APA, all rights reserved.
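The idea of virtual experiments over shift timings can be sketched with a toy performance model. The saturating planning-quality curve, the 20-period horizon, and the assumption that performance is planning quality times remaining action time are invented for illustration, not taken from the authors' simulations:

```python
def team_performance(shift_time, horizon=20):
    """Toy model: transition (planning) quality saturates with time spent,
    and output accrues only during the remaining action phase."""
    planning_quality = 1 - 0.7 ** shift_time   # diminishing returns
    action_time = horizon - shift_time
    return planning_quality * action_time

# exhaustive "virtual experiment" over candidate shift points
best_shift = max(range(1, 20), key=team_performance)
```

Under these assumptions the optimum is an interior shift point: shifting too early wastes planning quality, shifting too late wastes action time, which mirrors the article's interest in intervention timing.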
Mattsson, Sofia; Sjöström, Hans-Erik; Englund, Claire
2016-06-25
Objective. To develop and implement a virtual tablet machine simulation to aid distance students' understanding of the processes involved in tablet production. Design. A tablet simulation was created enabling students to study the effects different parameters have on the properties of the tablet. Once results were generated, students interpreted and explained them on the basis of current theory. Assessment. The simulation was evaluated using written questionnaires and focus group interviews. Students appreciated the exercise and considered it to be motivational. Students commented that they found the simulation, together with the online seminar and the writing of the report, was beneficial for their learning process. Conclusion. According to students' perceptions, the use of the tablet simulation contributed to their understanding of the compaction process.
Probabilistic simulation of concurrent engineering of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Technology readiness and the available infrastructure are assessed for timely computational simulation of concurrent engineering for propulsion systems. Results for initial coupled multidisciplinary, fabrication-process, and system simulators are presented, including uncertainties inherent in various facets of the engineering process. An approach is outlined for computationally formalizing the concurrent engineering process from cradle to grave via discipline-dedicated workstations linked through a common database.
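Propagating fabrication-process uncertainties through coupled simulators can be sketched with plain Monte Carlo sampling. The thrust response function, the input distributions, and all their numbers are hypothetical stand-ins for the discipline simulators mentioned above:

```python
import random

def thrust(burn_rate, nozzle_eff):
    """Hypothetical propulsion response standing in for a discipline simulator."""
    return 1000.0 * burn_rate * nozzle_eff

random.seed(0)
samples = []
for _ in range(10_000):
    burn = random.gauss(1.0, 0.05)   # fabrication-process uncertainty
    eff = random.gauss(0.95, 0.02)   # nozzle-efficiency uncertainty
    samples.append(thrust(burn, eff))

mean = sum(samples) / len(samples)
std = (sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
```

The output spread, not just the nominal value, is what a probabilistic concurrent-engineering workflow carries from one discipline workstation to the next.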
ERIC Educational Resources Information Center
Cohen, Edward Charles
2013-01-01
Design based research was utilized to investigate how students use a greenhouse effect simulation in order to derive best learning practices. During this process, students recognized the authentic scientific process involving computer simulations. The simulation used is embedded within an inquiry-based technology-mediated science curriculum known…
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Numerical Simulation of Cast Distortion in Gas Turbine Engine Components
NASA Astrophysics Data System (ADS)
Inozemtsev, A. A.; Dubrovskaya, A. S.; Dongauser, K. A.; Trufanov, N. A.
2015-06-01
In this paper, the investment-casting process for manufacturing multiple airfoil vanes is considered. A mathematical model of the full contact problem is built to determine the stress-strain state in the casting during solidification. The studies are carried out in a viscoelastoplastic formulation. Numerical simulation of the process is implemented with the ProCAST software package. The simulation results are compared with the real production process. By means of computer analysis, the process parameters are optimized in order to eliminate wall-thickness variation defects in the casting.
Modeling and simulation of offshore wind farm O&M processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joschko, Philip, E-mail: joschko@informatik.uni-hamburg.de; Widok, Andi H., E-mail: a.widok@htw-berlin.de; Appel, Susanne, E-mail: susanne.appel@hs-bremen.de
2015-04-15
This paper describes a holistic approach to operation and maintenance (O&M) processes in the domain of offshore wind farm power generation. The acquisition and process visualization is followed by a risk analysis of all relevant processes. Hereafter, a tool was designed which is able to model the defined processes in BPMN 2.0 notation, as well as connect and simulate them. Furthermore, the notation was enriched with new elements representing other relevant factors that were, to date, only displayable with much higher effort. In that regard, a variety of more complex situations were integrated, such as new process interactions depending on different weather influences, in which case a stochastic weather generator was combined with the business simulation, or other wind farm aspects important to the smooth running of the offshore wind farms. In addition, the choices of methodologies, such as the simulation framework or the business process notation, are presented and elaborated depending on the impact they had on the development of the approach and the software solution. - Highlights: • Analysis of operation and maintenance processes of offshore wind farms • Process modeling with BPMN 2.0 • Domain-specific simulation tool.
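The coupling of a stochastic weather generator to an O&M process simulation can be reduced to a few lines. The calm-day probability, the one-repair-per-weather-window rule, and the function name are simplifying assumptions, not the BPMN-based tool described above:

```python
import random

def downtime_days(n_failed, p_calm, days=60, seed=1):
    """Count days until all failed turbines are repaired, assuming the
    service vessel can complete one repair per calm-weather day."""
    rng = random.Random(seed)
    pending = n_failed
    for day in range(days):
        if pending == 0:
            return day
        if rng.random() < p_calm:   # stochastic weather window
            pending -= 1
    return days
```

With the same weather seed, a higher calm-day probability can only shorten the downtime; exposing exactly this kind of sensitivity is what the weather-coupled process simulation is for.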
Macro Level Simulation Model Of Space Shuttle Processing
NASA Technical Reports Server (NTRS)
2000-01-01
The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.
A Digital Sensor Simulator of the Pushbroom Offner Hyperspectral Imaging Spectrometer
Tao, Dongxing; Jia, Guorui; Yuan, Yan; Zhao, Huijie
2014-01-01
Sensor simulators can be used to forecast the imaging quality of a new hyperspectral imaging spectrometer and to generate simulated data for the development and validation of data processing algorithms. This paper presents a novel digital sensor simulator for the pushbroom Offner hyperspectral imaging spectrometer, which is widely used in hyperspectral remote sensing. Based on the imaging process, the sensor simulator consists of a spatial response module, a spectral response module, and a radiometric response module. To enhance the simulation accuracy, spatial interpolation-resampling, implemented before the spatial degradation, is developed to balance the direction error and the extra aliasing effect. Instead of using the spectral response function (SRF) directly, the dispersive imaging characteristics of the Offner convex-grating optical system are accurately modeled by its configuration parameters. Non-uniformity characteristics, such as the keystone and smile effects, are simulated in the corresponding modules. In this work, the spatial, spectral and radiometric calibration processes are simulated to provide the modulation transfer function (MTF), SRF and radiometric calibration parameters of the sensor simulator. Some uncertainty factors (the stability and bandwidth of the monochromator for the spectral calibration, and the integrating-sphere uncertainty for the radiometric calibration) are considered in the simulation of the calibration process. With the calibration parameters, several experiments were designed to validate the spatial, spectral and radiometric response of the sensor simulator, respectively. The experimental results indicate that the sensor simulator is valid. PMID:25615727
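The spectral-response part of such a simulator boils down to weighting a high-resolution input spectrum by each band's SRF. The Gaussian SRF shape, the 10 nm FWHM, and the 550 nm band centre below are generic assumptions; the paper instead derives the response from the Offner grating configuration parameters:

```python
import numpy as np

def gaussian_srf(wl, center, fwhm):
    """Normalized Gaussian spectral response function on wavelength grid wl (nm)."""
    sigma = fwhm / 2.3548                      # FWHM -> standard deviation
    r = np.exp(-0.5 * ((wl - center) / sigma) ** 2)
    return r / r.sum()

wl = np.linspace(400.0, 700.0, 3001)   # 0.1 nm input grid
radiance = np.ones_like(wl)            # flat test spectrum
srf = gaussian_srf(wl, 550.0, 10.0)
band_value = srf @ radiance            # simulated band reading
band_centroid = srf @ wl               # effective band centre
```

Smile would be modeled by letting the band centre drift with the across-track pixel index, shifting `band_centroid` row by row.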
Modeling and Simulation of Metallurgical Process Based on Hybrid Petri Net
NASA Astrophysics Data System (ADS)
Ren, Yujuan; Bao, Hong
2016-11-01
In order to achieve the energy-saving and emission-reduction goals of iron and steel enterprises, an increasing number of modeling and simulation technologies are used to study and analyse the metallurgical production process. In this paper, the basic principles of Hybrid Petri nets are used to model and analyse the metallurgical process. Firstly, the definition of the Hybrid Petri Net System of Metallurgical Process (MPHPNS) and its modeling theory are proposed. Secondly, a model of MPHPNS based on material flow is constructed. The dynamic flow of materials and the real-time change of each technological state in the metallurgical process are simulated vividly using this model. The simulation implements interaction between the continuous-event dynamic system and the discrete-event dynamic system at the same level, and plays a positive role in production decision-making.
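The discrete side of such a Petri net model is easy to make concrete: a transition fires by consuming tokens from its pre-set places and producing tokens in its post-set places. The two-stage hot-metal example below is a hypothetical fragment for illustration, not the MPHPNS model itself:

```python
def enabled(marking, pre):
    """A transition is enabled when every pre-place holds enough tokens."""
    return all(marking[p] >= w for p, w in pre.items())

def fire(marking, pre, post):
    """Fire one discrete transition: consume pre-set tokens, produce post-set."""
    assert enabled(marking, pre)
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# hypothetical flow: hot metal -> converter blow -> continuous casting
m0 = {"hot_metal": 2, "converter_free": 1, "steel": 0, "slab": 0}
blow = ({"hot_metal": 1, "converter_free": 1}, {"steel": 1})
cast = ({"steel": 1}, {"slab": 1, "converter_free": 1})
m1 = fire(m0, *blow)
m2 = fire(m1, *cast)
```

In a hybrid net, continuous places (e.g. liquid-steel mass) would evolve by rates between such discrete firings.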
Simulating the decentralized processes of the human immune system in a virtual anatomy model.
Sarpe, Vladimir; Jacob, Christian
2013-01-01
Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments namely, (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, that can communicate across multiple, local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. 
We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.
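At their core, the decentralized cell-virus interactions described above reduce to agents moving and interacting by proximity. The movement range, kill radius, and clearing rule below are invented parameters for a minimal sketch, not LINDSAY's simulated physics:

```python
import random

def step_agents(viruses, cells, kill_radius=1.0, rng=random.Random(2)):
    """One tick of a toy decentralized interaction: immune-cell agents
    random-walk, and any virus agent within reach of a cell is cleared."""
    cells = [tuple(c + rng.uniform(-0.5, 0.5) for c in cell) for cell in cells]
    survivors = [v for v in viruses
                 if all(sum((a - b) ** 2 for a, b in zip(v, c)) > kill_radius ** 2
                        for c in cells)]
    return survivors, cells
```

Running such ticks independently in separate compartments (tissue, lymph node) and exchanging agents between them mirrors the asynchronous, communicating environments the abstract describes.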
NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML
NASA Technical Reports Server (NTRS)
Stueber, Thomas J.; Paxson, Daniel E.
2014-01-01
The work presented in this paper promotes research leading to a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based process is similar to a precursor one-dimensional combustor simulation that was written as FORTRAN 77 source code. The previous simulation process required modifying the FORTRAN 77 source code, compiling, and linking whenever a new combustor simulation executable file was created. The MATLAB-based simulation does not require changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment, or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details on how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation and includes results for a default simulation included with the source code.
NASA Astrophysics Data System (ADS)
Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji
Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.
Numerical Simulation of Sintering Process in Ceramic Powder Injection Moulded Components
NASA Astrophysics Data System (ADS)
Song, J.; Barriere, T.; Liu, B.; Gelin, J. C.
2007-05-01
A phenomenological model based on a viscoplastic constitutive law is presented to describe the sintering process of ceramic components obtained by powder injection moulding. The parameters entering the model are identified through sintering experiments in a dilatometer with the proposed optimization method. Finite element simulations are carried out to predict the density variations and dimensional changes of the components during sintering. A simulation example on the sintering of an alumina hip implant has been conducted. The simulation results have been compared with the experimental ones, and good agreement is obtained.
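Identifying sintering parameters from dilatometer curves is, in miniature, a least-squares fit. The exponential densification law, the density end-points, and the rate constant below are assumed for illustration rather than taken from the paper's viscoplastic model:

```python
import math

def density(t, k, rho0=0.6, rhof=0.98):
    """Phenomenological densification: exponential approach to final density."""
    return rhof - (rhof - rho0) * math.exp(-k * t)

# synthetic "dilatometer" data generated with a known rate constant
k_true = 0.03
times = range(0, 200, 10)
data = [density(t, k_true) for t in times]

def sse(k):
    """Sum of squared errors between model and measured densities."""
    return sum((density(t, k) - d) ** 2 for t, d in zip(times, data))

# identify k by minimizing the error over a simple parameter sweep
k_fit = min((0.001 * i for i in range(1, 101)), key=sse)
```

A real identification would sweep several coupled viscoplastic parameters with a proper optimizer, but the structure, forward model plus misfit minimization, is the same.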
Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold
1991-01-01
A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...
2014-10-01
offer a practical solution to calculating the grain-scale heterogeneity present in the deformation field. Consequently, crystal plasticity models...process/performance simulation codes (e.g., crystal plasticity finite element method). 15. SUBJECT TERMS ICME; microstructure informatics; higher...iii) protocols for direct and efficient linking of materials models/databases into process/performance simulation codes (e.g., crystal plasticity
Tools for 3D scientific visualization in computational aerodynamics
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for visualization of computational aerodynamics, for example visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high-speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high-speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively while it runs. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed, as well as other hardware for digital video and film recording.
Shim, Sung J; Kumar, Arun; Jiao, Roger
2016-01-01
A hospital is considering deploying a radiofrequency identification (RFID) system and setting up a new "discharge lounge" to improve the patient discharge process. This study uses computer simulation to model and compare the current process and the new process, and it assesses the impact of the RFID system and the discharge lounge on the process in terms of resource utilization and time taken in the process. The simulation results regarding resource utilization suggest that the RFID system can slightly relieve the burden on all resources, whereas the RFID system and the discharge lounge together can significantly mitigate the nurses' tasks. The simulation results in terms of the time taken demonstrate that the RFID system can shorten patient wait times, staff busy times, and bed occupation times. The results of the study could prove helpful to others who are considering the use of an RFID system in the patient discharge process in hospitals or similar processes.
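The headline effect of a discharge lounge, releasing the bed during discharge paperwork, can be shown with simple arithmetic on bed-hours. The stay lengths and the 3-hour paperwork figure are invented numbers, not the hospital's data or the study's simulation model:

```python
def total_bed_hours(stays, paperwork_h, lounge):
    """Toy model: without a discharge lounge the patient occupies the bed
    during discharge paperwork; with one, the bed is released at once."""
    per_patient = 0.0 if lounge else paperwork_h
    return sum(stay + per_patient for stay in stays)

stays = [24.0, 48.0, 36.0]   # hypothetical lengths of stay, hours
before = total_bed_hours(stays, paperwork_h=3.0, lounge=False)
after = total_bed_hours(stays, paperwork_h=3.0, lounge=True)
```

A discrete-event simulation like the study's adds queueing for nurses and transport on top of this basic bed-release accounting.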
Simulation of beam-induced plasma in gas-filled rf cavities
Yu, Kwangmin; Samulyak, Roman; Yonehara, Katsuya; ...
2017-03-07
Processes occurring in a radio-frequency (rf) cavity, filled with high-pressure gas and interacting with proton beams, have been studied via advanced numerical simulations. Simulations support the experimental program on the hydrogen gas-filled rf cavity in the MuCool Test Area (MTA) at Fermilab, and broader research on the design of muon cooling devices. SPACE, a 3D electromagnetic particle-in-cell (EM-PIC) code with atomic physics support, was used in the simulation studies. Plasma dynamics in the rf cavity, including the process of neutral gas ionization by proton beams, plasma loading of the rf cavity, and atomic processes in plasma such as electron-ion and ion-ion recombination and electron attachment to dopant molecules, have been studied. Through comparison with experiments in the MTA, simulations quantified several uncertain plasma properties such as effective recombination rates and the attachment time of electrons to dopant molecules. Simulations have achieved very good agreement with experiments on plasma loading and related processes. The experimentally validated code SPACE is capable of predictive simulations of muon cooling devices.
Comparative Implementation of High Performance Computing for Power System Dynamic Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Shuangshuang; Huang, Zhenyu; Diao, Ruisheng
Dynamic simulation for transient stability assessment is one of the most important, but most computationally intensive, tasks in power system planning and operation. Present commercial software is mainly designed for sequential computation to run a single simulation, which is very time-consuming on a single processor. The application of High Performance Computing (HPC) to dynamic simulations is very promising: it can accelerate the computing process by parallelizing kernel algorithms while maintaining the same level of computational accuracy. This paper describes the comparative implementation of four parallel dynamic simulation schemes in two state-of-the-art HPC environments: Message Passing Interface (MPI) and Open Multi-Processing (OpenMP). These implementations serve to match the application with dedicated multi-processor computing hardware and to maximize the utilization and benefits of HPC during the development process.
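The task-parallel idea behind such schemes can be sketched in a few lines: contingency cases are independent, so they map directly onto parallel workers. The sketch below uses Python threads as a stand-in for the paper's MPI/OpenMP kernels, and a one-machine swing equation with invented per-unit constants as a stand-in for its dynamic simulation solver.

```python
# Task-parallel dynamic simulation sketch: independent contingency cases
# are farmed out to workers, the same decomposition the paper implements
# with MPI (distributed memory) and OpenMP (shared memory).  The
# one-machine swing-equation model and its constants are illustrative.
from concurrent.futures import ThreadPoolExecutor
import math

def swing_simulation(case):
    """Forward-Euler integration of a one-machine swing equation (per unit)."""
    delta, omega = case["delta0"], 0.0
    H, D, Pm, Pmax, dt = 5.0, 1.0, 0.9, case["pmax"], 0.001
    for _ in range(5000):                    # 5 s of simulated time
        Pe = Pmax * math.sin(delta)          # electrical power output
        omega += (Pm - Pe - D * omega) / (2 * H) * dt
        delta += omega * dt
    return case["name"], delta

cases = [{"name": f"ctg-{i}", "delta0": 0.5, "pmax": p}
         for i, p in enumerate([1.2, 1.5, 1.8, 2.1])]

with ThreadPoolExecutor(max_workers=4) as pool:  # one case per worker
    results = dict(pool.map(swing_simulation, cases))
print(results)
```

A stiffer network (larger `pmax`) pulls the rotor angle toward a smaller equilibrium, so the cases diverge in a physically plausible way even in this toy model.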
SiMon: Simulation Monitor for Computational Astrophysics
NASA Astrophysics Data System (ADS)
Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming
2017-09-01
Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage: processes tend to be interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive sets of simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
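The "simulation farming" idea can be sketched as a monitor loop that polls tasks, requeues any that were interrupted before reaching their termination condition (resuming from the last checkpoint), and stops when all are complete. The task representation and crash model below are deliberate simplifications, not SiMon's actual API.

```python
# Toy sketch of simulation farming: a monitor requeues tasks that crash
# before their termination condition, resuming from checkpointed state.
# The SimTask class and its crash model are illustrative assumptions.
import random

class SimTask:
    def __init__(self, name, target_steps):
        self.name, self.target, self.done_steps = name, target_steps, 0

    def run(self, rng):
        """Advance from the last checkpoint; may 'crash' partway through."""
        while self.done_steps < self.target:
            if rng.random() < 0.1:         # unexpected software/hardware interrupt
                return False
            self.done_steps += 1           # progress survives as a checkpoint
        return True

def monitor(tasks, seed=42):
    rng = random.Random(seed)
    restarts = 0
    pending = list(tasks)
    while pending:
        task = pending.pop(0)
        if not task.run(rng):              # crashed: requeue instead of asking a human
            restarts += 1
            pending.append(task)
    return restarts

tasks = [SimTask(f"sim-{i}", 50) for i in range(4)]
n_restarts = monitor(tasks)
print("all simulations finished; automatic restarts:", n_restarts)
```

The point of the sketch is the control loop, replacing the manual, error-prone handling of interrupts that the abstract describes.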
Finite-element simulation of ceramic drying processes
NASA Astrophysics Data System (ADS)
Keum, Y. T.; Jeong, J. H.; Auh, K. H.
2000-07-01
A finite-element simulation for the drying process of ceramics is performed. The heat and moisture movements in green ceramics caused by the temperature gradient, moisture gradient, conduction, convection and evaporation are considered. The finite-element formulation for solving the temperature and moisture distributions, which not only change the volume but also induce the hygro-thermal stress, is carried out. Employing the internally discontinuous interface elements, the numerical divergence problem arising from sudden changes in heat capacity in the phase zone is solved. In order to verify the reliability of the formulation, the drying process of a coal and the wetting process of a graphite epoxy are simulated and the results are compared with the analytical solution and another investigator's result. Finally, the drying process of a ceramic electric insulator is simulated.
Parallel Signal Processing and System Simulation using aCe
NASA Technical Reports Server (NTRS)
Dorband, John E.; Aburdene, Maurice F.
2003-01-01
Recently, networked and cluster computation have become very popular for both signal processing and system simulation. The aCe language is well suited to parallel signal processing applications and system simulation because it allows the programmer to explicitly express the computations that can be performed concurrently. In addition, this new C-based parallel language for architecture-adaptive programming lets programmers implement algorithms and system simulation applications on parallel architectures with the assurance that future parallel architectures will be able to run their applications with minimal modification. In this paper, we focus on some fundamental features of aCe C and present a signal processing application (FFT).
NASA Astrophysics Data System (ADS)
Martin, Ffion A.; Warrior, Nicholas A.; Simacek, Pavel; Advani, Suresh; Hughes, Adrian; Darlington, Roger; Senan, Eissa
2018-03-01
Very short manufacture cycle times are required if continuous carbon fibre and epoxy composite components are to be economically viable solutions for high volume composite production for the automotive industry. Here, a manufacturing process variant of resin transfer moulding (RTM) targets a reduction of in-mould manufacture time by reducing the time to inject and cure components. The process involves two stages: resin injection followed by compression. A flow simulation methodology using an RTM solver for the process has been developed. This paper compares the simulation prediction to experiments performed using industrial equipment. The issues encountered during manufacturing are included in the simulation, and their sensitivity to the process is explored.
Darkwah, Kwabena; Nokes, Sue E; Seay, Jeffrey R; Knutson, Barbara L
2018-05-22
Process simulations of batch fermentations with in situ product separation traditionally decouple these interdependent steps by simulating separate "steady state" continuous fermentation and separation units. In this study, an integrated batch fermentation and separation process was simulated for a model system of acetone-butanol-ethanol (ABE) fermentation with in situ gas stripping, such that the fermentation kinetics are linked in real time to the gas stripping process. Time-dependent cell growth, substrate utilization, and product production are translated to an Aspen Plus batch reactor. This approach capitalizes on the phase equilibria calculations of Aspen Plus to predict the effect of stripping on the ABE fermentation kinetics. The product profiles of the integrated fermentation and separation are shown to be sensitive to gas flow rate, unlike separate steady state fermentation and separation simulations. This study demonstrates the importance of coupled fermentation and separation simulation approaches for the systematic analyses of unsteady state processes.
Representing the work of medical protocols for organizational simulation.
Fridsma, D. B.
1998-01-01
Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to give managers prospective insight into mismatches between work processes and organization design. Many of these simulation tools are designed for well-understood routine work processes in which there are few contingent tasks. In this paper, we describe theoretical extensions that make it possible to simulate medical protocols within an information-processing theory framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231
NASA Astrophysics Data System (ADS)
Wu, Longtao; Wong, Sun; Wang, Tao; Huffman, George J.
2018-01-01
Simulation of moist convective processes is critical for accurately representing the interaction among tropical wave activities, atmospheric water vapor transport, and clouds associated with the Indian monsoon Intraseasonal Oscillation (ISO). In this study, we apply the Weather Research and Forecasting (WRF) model to simulate the Indian monsoon ISO with three different treatments of moist convective processes: (1) the Betts-Miller-Janjić (BMJ) adjustment cumulus scheme without explicit simulation of moist convective processes; (2) the New Simplified Arakawa-Schubert (NSAS) mass-flux scheme with simplified moist convective processes; and (3) explicit simulation of moist convective processes at convection-permitting scale (Nest). Results show that the BMJ experiment is unable to properly reproduce the equatorial Rossby wave activities and the corresponding phase relationship between moisture advection and dynamical convergence during the ISO. These features associated with the ISO are approximately captured in the NSAS experiment. The simulation with resolved moist convective processes significantly improves the representation of the ISO evolution and agrees well with the observations. This study is the first attempt to investigate the Indian monsoon at convection-permitting scale.
Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh
2010-02-01
Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires the knowledge of various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three-phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in application of this model to full-scale landfill operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsui, H.; Koike, Makoto; Kondo, Yutaka
Organic aerosol (OA) simulations using the volatility basis-set approach were made for East Asia and its outflow region. Model simulations were evaluated through comparisons with OA measured by aerosol mass spectrometers in and around Tokyo (at Komaba and Kisai in summer 2003 and 2004) and over the outflow region in East Asia (at Fukue and Hedo in spring 2009). The simulations with aging processes of organic vapors reasonably well reproduced mass concentrations, temporal variations, and formation efficiency of observed OA at all sites. As OA mass was severely underestimated in the simulations without the aging processes, the oxidations of organic vapors are essential for reasonable OA simulations over East Asia. By considering the aging processes, simulated OA concentrations considerably increased from 0.24 to 1.28 µg m-3 in the boundary layer over the whole of East Asia. OA formed from the interaction of anthropogenic and biogenic sources was also enhanced by the aging processes. The fraction of controllable OA was estimated to be 87 % of total OA over the whole of East Asia, showing that most of the OA in our simulations formed anthropogenically (controllable). A large portion of biogenic secondary OA (78 % of biogenic secondary OA) formed through the influence of anthropogenic sources. The high fraction of controllable OA in our simulations is likely because anthropogenic emissions are dominant over East Asia and OA formation is enhanced by anthropogenic sources and their aging processes. Both the amounts (from 0.18 to 1.12 µg m-3) and the fraction (from 75 % to 87 %) of controllable OA were increased by aging processes of organic vapors over East Asia.
NASA Astrophysics Data System (ADS)
Harvey, Jean-Philippe
In this work, the possibility of calculating and evaluating with a high degree of precision the Gibbs energy of complex multiphase equilibria, for which chemical ordering is explicitly and simultaneously considered in the thermodynamic description of solid (short-range and long-range order) and liquid (short-range order) metallic phases, is studied. The cluster site approximation (CSA) and the cluster variation method (CVM) are implemented in a new technique for minimizing the Gibbs energy of multicomponent, multiphase systems in order to describe the thermodynamic behaviour of metallic solid solutions showing strong chemical ordering. The modified quasichemical model in the pair approximation (MQMPA) is also implemented in the new minimization algorithm presented in this work to describe the thermodynamic behaviour of metallic liquid solutions. The constrained minimization technique implemented in this work is a sequential quadratic programming method based on an exact Newton's method (i.e. the use of exact second derivatives in the determination of the Hessian of the objective function), combined with a line search to identify a direction of sufficient decrease of the merit function. The implementation of a new algorithm to perform the constrained minimization of the Gibbs energy is justified by the difficulty of identifying, in specific cases, the correct multiphase assemblage of a system where the thermodynamic behaviour of the equilibrium phases is described by one of the previously quoted models using the FactSage software (e.g. solid_CSA + liquid_MQMPA; solid1_CSA + solid2_CSA).
After a rigorous validation of the constrained Gibbs energy minimization algorithm using several assessed binary and ternary systems found in the literature, the CVM and CSA models used to describe the energetic behaviour of metallic solid solutions in systems with key industrial applications, such as the Cu-Zr and Al-Zr systems, are parameterized using fully consistent thermodynamic and structural data generated from a Monte Carlo (MC) simulator also implemented in the framework of this project. In this MC simulator, the modified embedded atom model in the second-nearest-neighbour formalism (MEAM-2NN) is used to describe the cohesive energy of each studied structure. A new Al-Zr MEAM-2NN interatomic potential, needed to evaluate the cohesive energy of the condensed phases of this system, is presented in this work. The thermodynamic integration (TI) method implemented in the MC simulator allows the evaluation of the absolute Gibbs energy of the considered solid or liquid structures. The original implementation of the TI method allowed us to evaluate theoretically, for the first time, all the thermodynamic mixing contributions (i.e., mixing enthalpy and mixing entropy) of metallic liquids (Cu-Zr and Al-Zr) and of a solid solution (the face-centered cubic (FCC) Al-Zr solid solution) described by the MEAM-2NN. Thermodynamic and structural data obtained from MC and molecular dynamics simulations are then used to parameterize the CVM for the Al-Zr FCC solid solution and the MQMPA for the Al-Zr and Cu-Zr liquid phases, respectively. The extended thermodynamic study of these systems allows the introduction of a new type of configuration-dependent excess parameter in the definition of the thermodynamic functions of solid solutions described by the CVM or the CSA. These parameters greatly improve the precision of these thermodynamic models, based on experimental evidence found in the literature.
A new parameterization approach for the MQMPA model of metallic liquid solutions is presented throughout this work. In this new approach, calculated pair fractions obtained from MC/MD simulations are taken into account, as well as configuration-independent volumetric relaxation effects (regular-solution-like excess parameters), in order to parameterize precisely the Gibbs energy function of metallic melts. The generation of a complete set of fully consistent thermodynamic, physical and structural data for the solid, liquid, and stoichiometric compounds, and the subsequent parameterization of their respective thermodynamic models, leads to the first description of the complete Al-Zr phase diagram in the range of composition [0 ≤ XZr ≤ 5/9] based on theoretical and fully consistent thermodynamic properties. MC and MD simulations are performed for the Al-Zr system to define for the first time the precise thermodynamic behaviour of the amorphous phase over its entire range of composition. Finally, the thermodynamic models for the liquid phase, the FCC solid solution and the amorphous phase are used to define conditions, based on thermodynamic and volumetric considerations, that favor the amorphization of Al-Zr alloys.
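The core problem, minimizing total Gibbs energy at fixed overall composition, can be illustrated with a binary regular solution in place of the CVM/CSA/MQMPA models, and a coarse grid search in place of the exact-Newton SQP algorithm: for ω > 2RT the model has a miscibility gap, and the minimizing phase split recovers the common-tangent (binodal) compositions.

```python
# Hedged sketch of constrained Gibbs-energy minimization for a binary
# regular solution (a stand-in for the thesis's CVM/CSA/MQMPA models).
# A coarse grid search over two-phase splits replaces the exact-Newton
# SQP algorithm; all constants are illustrative.
import numpy as np

R, T, omega = 8.314, 600.0, 15_000.0     # J/mol; omega > 2RT -> miscibility gap

def g_molar(x):
    """Molar Gibbs energy of mixing of a regular solution (J/mol)."""
    return omega * x * (1 - x) + R * T * (x * np.log(x) + (1 - x) * np.log(1 - x))

x0 = 0.5                                  # fixed overall composition
xs = np.linspace(0.001, 0.999, 999)

# Lever rule: total G of a split into phases at xa < x0 < xb.
xa, xb = np.meshgrid(xs[xs < x0], xs[xs > x0], indexing="ij")
frac = (xb - x0) / (xb - xa)              # mole fraction of the xa phase
G = frac * g_molar(xa) + (1 - frac) * g_molar(xb)

ia, ib = np.unravel_index(np.argmin(G), G.shape)
x_alpha, x_beta = xa[ia, ib], xb[ia, ib]
print(f"equilibrium phases near x = {x_alpha:.3f} and {x_beta:.3f}")
```

Minimizing the lever-rule total G over all candidate splits is equivalent to finding the common tangent of the G(x) curve, which is exactly what a constrained minimizer does at each temperature when computing a phase diagram.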
Mota, J.P.B.; Esteves, I.A.A.C.; Rostam-Abadi, M.
2004-01-01
A computational fluid dynamics (CFD) software package has been coupled with the dynamic process simulator of an adsorption storage tank for methane fuelled vehicles. The two solvers run as independent processes and handle non-overlapping portions of the computational domain. The codes exchange data on the boundary interface of the two domains to ensure continuity of the solution and of its gradient. A software interface was developed to dynamically suspend and activate each process as necessary and to handle data exchange and process synchronization. This hybrid computational tool has been successfully employed to accurately simulate the discharge of a new tank design and evaluate its performance. The case study presented here shows that CFD and process simulation are highly complementary computational tools, and that there are clear benefits to be gained from a close integration of the two. © 2004 Elsevier Ltd. All rights reserved.
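The coupling strategy, two solvers owning non-overlapping sub-domains and exchanging interface data every step to keep the solution continuous, can be sketched with a pair of 1-D explicit diffusion solvers. The physics and the interface averaging rule are illustrative assumptions, not the CFD and adsorption codes of the paper.

```python
# Minimal co-simulation sketch: two solvers own adjacent sub-domains and
# exchange an interface value each time step, mimicking the paper's
# coupling of a CFD code and a process simulator.  The 1-D diffusion
# physics and the interface-averaging rule are illustrative assumptions.
import numpy as np

def step_diffusion(u, alpha, dx, dt, left_bc, right_bc):
    """One explicit finite-difference step of 1-D diffusion on a sub-domain."""
    u = u.copy()
    u[0], u[-1] = left_bc, right_bc          # boundary/interface values
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

def cosimulate(n_steps=500, n=21, alpha=1.0, dx=0.05, dt=1e-4):
    a = np.zeros(n)                          # left sub-domain (hot wall at x=0)
    b = np.zeros(n)                          # right sub-domain (cold far wall)
    for _ in range(n_steps):
        iface = 0.5 * (a[-2] + b[1])         # data exchange at the interface
        a = step_diffusion(a, alpha, dx, dt, left_bc=1.0, right_bc=iface)
        b = step_diffusion(b, alpha, dx, dt, left_bc=iface, right_bc=0.0)
    return a, b

a, b = cosimulate()
print(f"interface values: {a[-1]:.4f} (left solver) vs {b[0]:.4f} (right solver)")
```

Because both solvers receive the same exchanged interface value each step, the composite solution stays continuous across the sub-domain boundary, the property the paper's software interface is designed to guarantee.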
General simulation algorithm for autocorrelated binary processes.
Serinaldi, Francesco; Lombardo, Federico
2017-02-01
The apparent ubiquity of binary random processes in physics and many other fields has attracted considerable attention from the modeling community. However, generation of binary sequences with prescribed autocorrelation is a challenging task owing to the discrete nature of the marginal distributions, which makes the application of classical spectral techniques problematic. We show that such methods can effectively be used if we focus on the parent continuous process of beta distributed transition probabilities rather than on the target binary process. This change of paradigm results in a simulation procedure effectively embedding a spectrum-based iterative amplitude-adjusted Fourier transform method devised for continuous processes. The proposed algorithm is fully general, requires minimal assumptions, and can easily simulate binary signals with power-law and exponentially decaying autocorrelation functions corresponding, for instance, to Hurst-Kolmogorov and Markov processes. An application to rainfall intermittency shows that the proposed algorithm can also simulate surrogate data preserving the empirical autocorrelation.
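A minimal sketch of the parent-process idea: generate a correlated continuous (Gaussian AR(1)) parent and clip it at a quantile to obtain a binary sequence whose autocorrelation is inherited from the parent. This is a simplified relative of the paper's spectrum-based iterative amplitude-adjusted Fourier transform algorithm, not the algorithm itself.

```python
# Simplified illustration of simulating an autocorrelated binary process
# via a continuous parent process: an AR(1) Gaussian parent is clipped
# at a quantile, so the binary marginal is Bernoulli(p) while the
# autocorrelation is inherited from the parent.  This is not the paper's
# full spectrum-based iterative algorithm.
import numpy as np

def binary_sequence_from_gaussian(n, rho, p=0.5, seed=0):
    rng = np.random.default_rng(seed)
    g = np.empty(n)
    g[0] = rng.standard_normal()
    for t in range(1, n):    # AR(1) parent: g_t = rho*g_{t-1} + scaled noise
        g[t] = rho * g[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
    threshold = np.quantile(g, 1 - p)        # clip so that P(x = 1) ≈ p
    return (g >= threshold).astype(int)

x = binary_sequence_from_gaussian(100_000, rho=0.8)
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(f"mean = {x.mean():.3f}, lag-1 autocorrelation = {lag1:.3f}")
```

Clipping attenuates the parent's correlation (for a median clip, the binary lag-1 correlation is 2·arcsin(ρ)/π ≈ 0.59 for ρ = 0.8), which is precisely why methods like the paper's must iterate on the parent's spectrum to hit a prescribed binary autocorrelation.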
Ontological simulation for educational process organisation in a higher educational institution
NASA Astrophysics Data System (ADS)
Berestneva, O. G.; Marukhina, O. V.; Bahvalov, S. V.; Fisochenko, O. N.; Berestneva, E. V.
2017-01-01
Following new-generation standards requires forming a list of tasks connected with planning and organizing an academic process and with forming the structure and content of degree programmes. Even when planning the structure and content of an academic process, problems arise concerning the need to assess how well degree programmes correspond to the demands of educational and professional standards, and to consider the demands of today's job market and students. The paper presents examples of ontological simulations for solving educational process organization problems in a higher educational institution and describes the development of the models. Two examples are presented: ontological simulation for planning an educational process in a higher educational institution, and ontological simulation for describing the competences of an IT specialist. The paper concludes that ontology application is promising for formalizing the organization of the educational process in a higher educational institution.
Neural Processing of Musical and Vocal Emotions Through Cochlear Implants Simulation.
Ahmed, Duha G; Paquette, Sebastian; Zeitouni, Anthony; Lehmann, Alexandre
2018-05-01
Cochlear implants (CIs) partially restore the sense of hearing in the deaf. However, the ability to recognize emotions in speech and music is reduced due to the implant's electrical signal limitations and the patient's altered neural pathways. Electrophysiological correlates of these limitations are not yet well established. Here we aimed to characterize the effect of CIs on auditory emotion processing and, for the first time, directly compare vocal and musical emotion processing through a CI-simulator. We recorded 16 normal hearing participants' electroencephalographic activity while listening to vocal and musical emotional bursts in their original form and in a degraded (CI-simulated) condition. We found prolonged P50 latency and reduced N100-P200 complex amplitude in the CI-simulated condition. This points to a limitation in encoding sound signals processed through CI simulation. When comparing the processing of vocal and musical bursts, we found a delay in latency with the musical bursts compared to the vocal bursts in both conditions (original and CI-simulated). This suggests that despite the cochlear implants' limitations, the auditory cortex can distinguish between vocal and musical stimuli. In addition, it adds to the literature supporting the complexity of musical emotion. Replicating this study with actual CI users might lead to characterizing emotional processing in CI users and could ultimately help develop optimal rehabilitation programs or device processing strategies to improve CI users' quality of life.
Kim, Sunghee; Shin, Gisoo
2016-02-01
Since previous studies on simulation-based education have been focused on fundamental nursing skills for nursing students in South Korea, there is little research available that focuses on clinical nurses in simulation-based training. Further, there is a paucity of research literature related to the integration of the nursing process into simulation training particularly in the emergency nursing care of high-risk maternal and neonatal patients. The purpose of this study was to identify the effects of nursing process-based simulation on knowledge, attitudes, and skills for maternal and child emergency nursing care in clinical nurses in South Korea. Data were collected from 49 nurses, 25 in the experimental group and 24 in the control group, from August 13 to 14, 2013. This study was an equivalent control group pre- and post-test experimental design to compare the differences in knowledge, attitudes, and skills for maternal and child emergency nursing care between the experimental group and the control group. The experimental group was trained by the nursing process-based simulation training program, while the control group received traditional methods of training for maternal and child emergency nursing care. The experimental group was more likely to improve knowledge, attitudes, and skills required for clinical judgment about maternal and child emergency nursing care than the control group. Among five stages of nursing process in simulation, the experimental group was more likely to improve clinical skills required for nursing diagnosis and nursing evaluation than the control group. These results will provide valuable information on developing nursing process-based simulation training to improve clinical competency in nurses. Further research should be conducted to verify the effectiveness of nursing process-based simulation with more diverse nurse groups on more diverse subjects in the future. Copyright © 2015 Elsevier Ltd. All rights reserved.
Numerical simulation of the SAGD process coupled with geomechanical behavior
NASA Astrophysics Data System (ADS)
Li, Pingke
Canada has vast oil sand resources. While a large portion of this resource can be recovered by surface mining techniques, a majority is located at depths requiring the application of in situ recovery technologies. Although a number of in situ recovery technologies exist, the steam assisted gravity drainage (SAGD) process has emerged as one of the most promising technologies to develop the in situ oil sands resources. During the SAGD operations, saturated steam is continuously injected into the oil sands reservoir, which induces pore pressure and stress variations. As a result, reservoir parameters and processes may also vary, particularly when tensile and shear failure occur. This geomechanical effect is obvious for oil sands material because oil sands have the in situ interlocked fabric. The conventional reservoir simulation generally does not take this coupled mechanism into consideration. Therefore, this research is to improve the reservoir simulation techniques of the SAGD process applied in the development of oil sands and heavy oil reservoirs. The analyses of the decoupled reservoir geomechanical simulation results show that the geomechanical behavior in SAGD has obvious impact on reservoir parameters, such as absolute permeability. The issues with the coupled reservoir geomechanical simulations of the SAGD process have been clarified and the permeability variations due to geomechanical behaviors in the SAGD process investigated. A methodology of sequentially coupled reservoir geomechanical simulation technique was developed based on the reservoir simulator, EXOTHERM, and the geomechanical simulator, FLAC. In addition, a representative geomechanical model of oil sands material was summarized in this research. Finally, this reservoir geomechanical simulation methodology was verified with the UTF Phase A SAGD project and applied in a SAGD operation with gas-over-bitumen geometry. 
Based on this methodology, the geomechanical effect on the SAGD production performance can be quantified. This research program involves the analyses of laboratory testing results obtained from the literature. However, no laboratory testing was conducted in the process of this research.
Thermo-mechanical simulation of liquid-supported stretch blow molding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimmer, J.; Stommel, M.
2015-05-22
Stretch blow molding is the well-established plastics forming method to produce polyethylene terephthalate (PET) bottles. An injection molded preform is heated up above the PET glass transition temperature (Tg∼85°C) and subsequently inflated by pressurized air into a closed cavity. In the follow-up filling process, the resulting bottle is filled with the final product. A recently developed modification of the process combines the blowing and filling stages by directly using the final liquid product to inflate the preform. In a previously published paper, a mechanical simulation and successful evaluation of this liquid-driven stretch blow molding process was presented. In this way, a realistic process-parameter-dependent simulation of the preform deformation throughout the forming process was enabled, whereas the preform temperature evolution during forming was neglected. However, the formability of the preform is highly reduced when the temperature drops below Tg during forming. Experimental investigations show temperature-induced failure cases due to the fast heat transfer between the hot preform and the cold liquid. Therefore, in this paper, a process-dependent simulation of the temperature evolution during processing to avoid preform failure is presented. For this purpose, the previously developed mechanical model is used to extract the time-dependent thickness evolution. This information serves as input for the heat transfer simulation. The required material parameters are calibrated from preform cooling experiments recorded with an infrared camera. Furthermore, the high deformation ratios during processing lead to strain-induced crystallization. This exothermal reaction is included in the simulation by extracting data from preform measurements at different stages of deformation via Differential Scanning Calorimetry (DSC). Finally, the thermal simulation model is evaluated by free forming experiments, recorded by a high-speed infrared camera.
Ogata, Yuma; Ohnishi, Takashi; Moriya, Takahiro; Inadama, Naoko; Nishikido, Fumihiko; Yoshida, Eiji; Murayama, Hideo; Yamaya, Taiga; Haneishi, Hideaki
2014-01-01
The X'tal cube is a next-generation DOI detector for PET that we are developing to offer higher resolution and higher sensitivity than is available with present detectors. It is constructed from a cubic monolithic scintillation crystal and silicon photomultipliers which are coupled at various positions on the six surfaces of the cube. A laser-processing technique is applied to produce 3D optical boundaries composed of micro-cracks inside the monolithic scintillator crystal. The current configuration is based on an empirical trial of a laser-processed boundary. There is room to improve the spatial resolution by optimizing the setting of the laser-processed boundary. In fact, the laser-processing technique has high freedom in setting the parameters of the boundary such as size, pitch, and angle. Computer simulation can effectively optimize such parameters. In this study, to design optical characteristics properly for the laser-processed crystal, we developed a Monte Carlo simulator which can model arbitrary arrangements of laser-processed optical boundaries (LPBs). The optical characteristics of the LPBs were measured by use of a setup with a laser and a photo-diode, and then modeled in the simulator. The accuracy of the simulator was confirmed by comparison of position histograms obtained from the simulation and from experiments with a prototype detector composed of a cubic LYSO monolithic crystal with 6 × 6 × 6 segments and multi-pixel photon counters. Furthermore, the simulator was accelerated by parallel computing with general-purpose computing on a graphics processing unit. The calculation speed was about 400 times faster than that with a CPU.
Controlling Ethylene for Extended Preservation of Fresh Fruits and Vegetables
2008-12-01
into a process simulation to determine the effects of key design parameters on the overall performance of the system. Integrating process simulation... [flattened table fragment, apparently listing ethylene production/sensitivity and decay for commodities: Asian pears (high/high), avocados (high/high), bananas (moderate/high), cantaloupe (high/moderate), cherimoya (very high/high)] ...ozonolysis. Process simulation was subsequently used to understand the effect of key system parameters on EEU performance. Using this modeling work
NASA Astrophysics Data System (ADS)
Lee, C. H.; Yang, D. Y.; Lee, S. R.; Chang, I. G.; Lee, T. W.
2011-08-01
The shielded slot plate, which has a sheared corrugated trapezoidal pattern, is a component of the metallic bipolar plate for the molten carbonate fuel cell (MCFC). In order to increase the efficiency of the fuel cell, the unit cell of the shielded slot plate should have a relatively large upper area. Additionally, defects from the forming process should be minimized. In order to simulate the slitting process, whereby sheared corrugated patterns are formed, ductile fracture criteria based on the histories of stress and strain are employed. The user material subroutine VUMAT is employed for implementation of the material and ductile fracture criteria in the commercial FEM software ABAQUS. The variables of the ductile fracture criteria were determined by comparing the simulation results and the experimental results of the tension test and the shearing test. Parametric studies were conducted to determine the critical value of the ductile fracture criterion. Employing these ductile fracture criteria, the three dimensional forming process of the shielded slot plate was numerically simulated. The effects of the slitting process in the forming process of the shielded slot plate were analyzed through a FEM simulation and experimental studies. Finally, experiments involving microscopic and macroscopic observations were conducted to verify the numerical simulations of the 3-step forming process.
Potential application of artificial intelligence concepts to aerodynamic simulation
NASA Technical Reports Server (NTRS)
Kutler, P.; Mehta, U. B.; Andrews, A.
1984-01-01
The concept of artificial intelligence as it applies to computational fluid dynamics simulation is investigated. How expert systems can be adapted to speed the numerical aerodynamic simulation process is also examined. A proposed expert grid generation system is briefly described which, given flow parameters, configuration geometry, and simulation constraints, uses knowledge about the discretization process to determine grid point coordinates, computational surface information, and zonal interface parameters.
ROMI-3: Rough-Mill Simulator Version 3.0: User's Guide
Joel M. Weiss; R. Edward Thomas; R. Edward Thomas
2005-01-01
ROMI-3 Rough-Mill Simulator is a software package that simulates current industrial practices for rip-first and chop-first lumber processing. This guide shows the user how to set up and examine the results of simulations of current or proposed mill practices. ROMI-3 accepts cutting bills with as many as 600 combined solid and/or panel part sizes. Plots of processed...
Janice K. Wiedenbeck; Philip A. Araman
1995-01-01
We've been telling the wood industry about our process simulation modeling research and development work for several years. We've demonstrated our crosscut-first and rip-first rough mill simulation and animation models. We've advised companies on how they could use simulation modeling to help make critically important, pending decisions related to mill layout...
ERIC Educational Resources Information Center
Neely, Pat; Tucker, Jan
2013-01-01
Purpose: Simulations are designed as activities which imitate real world scenarios and are often used to teach and enhance skill building. The purpose of this case study is to examine the decision making process and outcomes of a faculty committee tasked with examining simulations in the marketplace to determine if the simulations could be used as…
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. 
When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
Visualization Methods for Viability Studies of Inspection Modules for the Space Shuttle
NASA Technical Reports Server (NTRS)
Mobasher, Amir A.
2005-01-01
An effective simulation of an object, process, or task must be similar to that object, process, or task. A simulation could consist of a physical device, a set of mathematical equations, a computer program, a person, or some combination of these. There are many reasons for the use of simulators. Although some of the reasons are unique to a specific situation, there are many general reasons and purposes for using simulators. These include, but are not limited to: (1) safety, (2) scarce resources, (3) teaching/education, (4) additional capabilities, (5) flexibility, and (6) cost. Robot simulators are in use for all of these reasons. Virtual environments such as simulators eliminate physical contact with humans and hence increase the safety of the work environment. Corporations with limited funding and resources may utilize simulators to accomplish their goals while saving manpower and money. A computer simulation is safer than working with a real robot. Robots are typically a scarce resource: schools typically don't have a large number of robots, if any, and factories don't want robots taken away from useful work unless absolutely necessary. Robot simulators are useful in teaching robotics; a simulator gives a student hands-on experience, if only with a simulator. The simulator is also more flexible: a user can quickly change the robot configuration, workcell, or even replace the robot with a different one altogether. In order to be useful, a robot simulator must create a model that accurately performs like the real robot. A powerful simulator is usually thought of as a combination of a CAD package with simulation capabilities. Computer Aided Design (CAD) techniques are used extensively by engineers in virtually all areas of engineering. Parts are designed interactively, aided by the graphical display of both wireframe and more realistic shaded renderings.
Once a part's dimensions have been specified to the CAD package, designers can view the part from any direction to examine how it will look and perform in relation to other parts. If changes are deemed necessary, the designer can easily make the changes and view the results graphically. However, a complex process of moving parts intended for operation in a complex environment can only be fully understood through the process of animated graphical simulation. A CAD package with simulation capabilities allows the designer to develop geometrical models of the process being designed, as well as the environment in which the process will be used, and then test the process in graphical animation much as the actual physical system would be run. By being able to operate the system of moving and stationary parts, the designer is able to see in simulation how the system will perform under a wide variety of conditions. If, for example, undesired collisions occur between parts of the system, design changes can be easily made without the expense or potential danger of testing the physical system.
Simulation modeling for the health care manager.
Kennedy, Michael H
2009-01-01
This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
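The Monte Carlo approach described above can be illustrated with a minimal sketch: a single-server clinic where interarrival and service times are drawn from exponential probability distributions and the average patient wait is estimated by simulation. All numbers (10-minute mean interarrival, 8-minute mean service) are illustrative assumptions, not taken from the article or from any particular software package.

```python
import random

def simulate_clinic_wait(n_patients=1000, mean_interarrival=10.0,
                         mean_service=8.0, seed=42):
    """Monte Carlo estimate of the average patient wait (minutes) in a
    single-server clinic with exponential arrivals and service times.
    All parameter values are illustrative, not from the article."""
    rng = random.Random(seed)
    clock = 0.0           # arrival time of the current patient
    server_free_at = 0.0  # time at which the single server next becomes idle
    total_wait = 0.0
    for _ in range(n_patients):
        clock += rng.expovariate(1.0 / mean_interarrival)
        wait = max(0.0, server_free_at - clock)
        total_wait += wait
        server_free_at = clock + wait + rng.expovariate(1.0 / mean_service)
    return total_wait / n_patients

avg_wait = simulate_clinic_wait()
```

Replacing the exponential draws with empirical distributions fitted to facility data is how such a sketch would grow toward the staffing and patient-flow studies the article discusses.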
A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor
NASA Technical Reports Server (NTRS)
Rao, Hariprasad Nannapaneni
1989-01-01
The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.
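The Virtual Time and rollback mechanism the abstract refers to can be sketched in a few lines: events are processed optimistically, state snapshots are saved, and a straggler (an event with a timestamp below the local virtual time) forces the state to roll back and the affected events to be re-executed. This is a single-process toy with an invented counter for state; it omits anti-messages, global virtual time, and the RSIM transistor model entirely.

```python
def run_time_warp(arrivals):
    """Process timestamped events optimistically; on a straggler, roll
    back state and re-execute the affected events (a sketch of Virtual
    Time / rollback, without anti-messages or GVT)."""
    lvt = 0.0        # local virtual time
    state = 0        # toy state: number of events applied
    history = []     # (event time, state snapshot before the event)
    log = []         # order in which events were actually executed
    pending = list(arrivals)
    while pending:
        t = pending.pop(0)
        if t < lvt:  # straggler: undo every later event
            redo = []
            while history and history[-1][0] > t:
                undone_t, snapshot = history.pop()
                state = snapshot
                redo.append(undone_t)
            lvt = history[-1][0] if history else 0.0
            pending = [t] + sorted(redo) + pending  # re-execute in order
            continue
        history.append((t, state))
        state += 1
        lvt = t
        log.append(t)
    return state, log

state, log = run_time_warp([1.0, 3.0, 5.0, 2.0])  # 2.0 is the straggler
```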
Simulation of a master-slave event set processor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Comfort, J.C.
1984-03-01
Event set manipulation may consume a considerable amount of the computation time spent in performing a discrete-event simulation. One way of minimizing this time is to allow event set processing to proceed in parallel with the remainder of the simulation computation. The paper describes a multiprocessor simulation computer, in which all non-event set processing is performed by the principal processor (called the host). Event set processing is coordinated by a front end processor (the master) and actually performed by several other functionally identical processors (the slaves). A trace-driven simulation program modeling this system was constructed, and was run with trace output taken from two different simulation programs. Output from this simulation suggests that a significant reduction in run time may be realized by this approach. Sensitivity analysis was performed on the significant parameters to the system (number of slave processors, relative processor speeds, and interprocessor communication times). A comparison between actual and simulation run times for a one-processor system was used to assist in the validation of the simulation. 7 references.
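The event set whose manipulation dominates run time is, in a sequential simulator, typically a priority queue ordered by event time. A minimal single-process sketch using the standard-library heap (event names are invented for illustration; the paper's master/slave machine distributes exactly this kind of structure across processors):

```python
import heapq

class EventSet:
    """Minimal future-event set for a discrete-event simulation,
    kept as a binary heap ordered by event time."""
    def __init__(self):
        self._heap = []
        self._count = 0  # tie-breaker: preserves insertion order at equal times
    def schedule(self, time, event):
        heapq.heappush(self._heap, (time, self._count, event))
        self._count += 1
    def next_event(self):
        time, _, event = heapq.heappop(self._heap)
        return time, event
    def __len__(self):
        return len(self._heap)

es = EventSet()
es.schedule(5.0, "depart")
es.schedule(2.0, "arrive")
es.schedule(9.0, "end")
t, ev = es.next_event()  # always yields the earliest-scheduled event
```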
ASPEN simulation of a fixed-bed integrated gasification combined-cycle power plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, K.R.
1986-03-01
A fixed-bed integrated gasification combined-cycle (IGCC) power plant has been modeled using the Advanced System for Process ENgineering (ASPEN). The ASPEN simulation is based on a conceptual design of a 509-MW IGCC power plant that uses British Gas Corporation (BGC)/Lurgi slagging gasifiers and the Lurgi acid gas removal process. The 39.3-percent thermal efficiency of the plant that was calculated by the simulation compares very favorably with the 39.4 percent that was reported by EPRI. The simulation addresses only thermal performance and does not calculate capital cost or process economics. Portions of the BGC-IGCC simulation flowsheet are based on the SLAGGER fixed-bed gasifier model (Stefano May 1985), and the Kellogg-Rust-Westinghouse (KRW) IGCC and Texaco-IGCC simulations (Stone July 1985) that were developed at the Department of Energy (DOE), Morgantown Energy Technology Center (METC). The simulation runs in 32 minutes of Central Processing Unit (CPU) time on the VAX-11/780. The BGC-IGCC simulation was developed to give accurate mass and energy balances and to track coal tars and environmental species such as SOx and NOx for a fixed-bed, coal-to-electricity system. This simulation is the third in a series of three IGCC simulations that represent fluidized-bed, entrained-flow, and fixed-bed gasification processes. Alternate process configurations can be considered by adding, deleting, or rearranging unit operation blocks. The gasifier model is semipredictive; it can properly respond to a limited range of coal types and gasifier operating conditions. However, some models in the flowsheet are based on correlations that were derived from the EPRI study, and are therefore limited to coal types and operating conditions that are reasonably close to those given in the EPRI design. 4 refs., 7 figs., 2 tabs.
NASA Astrophysics Data System (ADS)
Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred
2018-01-01
Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
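The Bayesian calibration idea can be sketched with a toy stand-in for Biome-BGC: a one-parameter "simulator" whose FLNR-like parameter is recovered by a Metropolis random walk against synthetic GPP observations. The linear model, the uniform prior bounds, and the noise level are all illustrative assumptions; the actual study calibrates six parameters of a full process simulator against flux tower data.

```python
import math, random

def toy_gpp(flnr, light):
    # stand-in simulator: GPP responds linearly to an FLNR-like parameter
    return flnr * light

def log_posterior(flnr, obs, light, sigma=0.5):
    if not (0.0 < flnr < 1.0):  # uniform prior on (0, 1)
        return float("-inf")
    sse = sum((o - toy_gpp(flnr, l)) ** 2 for o, l in zip(obs, light))
    return -sse / (2.0 * sigma ** 2)

def metropolis(obs, light, n_iter=5000, step=0.05, seed=0):
    """Random-walk Metropolis sampling of the parameter posterior."""
    rng = random.Random(seed)
    flnr = 0.5
    lp = log_posterior(flnr, obs, light)
    samples = []
    for _ in range(n_iter):
        cand = flnr + rng.gauss(0.0, step)
        lp_cand = log_posterior(cand, obs, light)
        if lp_cand >= lp or rng.random() < math.exp(lp_cand - lp):
            flnr, lp = cand, lp_cand
        samples.append(flnr)
    return samples

light = [4.0, 6.0, 8.0, 10.0]
obs = [0.2 * l for l in light]       # synthetic "flux tower" observations
samples = metropolis(obs, light)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])  # discard burn-in
```

The posterior spread around the recovered value plays the role of the parameter uncertainty the study reports alongside the calibrated GPP.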
IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
William M. Bond; Salih Ersayin
2007-03-30
This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. 
Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern Minnesota, and future proposals are pending with non-taconite mineral processing applications.
On the Importance of Periodic Orbits: Detection and Applications
NASA Astrophysics Data System (ADS)
Doyon, Bernard
The set of Unstable Periodic Orbits (UPOs) of a chaotic system is intimately related to its dynamical properties. From the (in principle infinite) set of UPOs hidden in phase space, one can obtain important dynamical quantities such as the Lyapunov exponents, the invariant measure, the topological entropy, and the fractal dimension. In quantum chaos (i.e., the study of quantum systems whose classical limit is chaotic), these same UPOs provide the bridge between the classical and quantum behaviour of non-integrable systems. Locating these fundamental cycles is a difficult problem. This thesis first addresses the detection of UPOs in chaotic systems. A comparative study of two recent algorithms is presented. We examine both methods in depth in order to apply them to various systems, including dissipative and conservative continuous flows. An analysis of the convergence rate of the algorithms is also carried out to identify the strengths and limits of these numerical schemes. The detection methods we use rely on a particular transformation of the initial dynamics. This trick inspired an alternative method for targeting and stabilizing an arbitrary periodic orbit in a chaotic system. Targeting is generally combined with control methods to stabilize a given cycle quickly, and in general one must know the position and stability of the cycle in question. The new targeting method we present does not require a priori knowledge of the position and stability of the periodic orbits. It could serve as a complementary tool to current targeting and control methods.
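As a concrete toy example of UPO detection (far simpler than the transformed-dynamics algorithms studied in the thesis), a Newton search can locate points on unstable periodic orbits of the logistic map by finding roots of g(x) = f^p(x) - x. The map, starting guesses, and numerical-derivative step are illustrative choices, not taken from the thesis.

```python
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def iterate(x, n, r=4.0):
    for _ in range(n):
        x = logistic(x, r)
    return x

def find_upo(x0, period, r=4.0, tol=1e-12, max_iter=100, h=1e-7):
    """Newton search for a point on a period-`period` orbit of the
    logistic map, i.e. a root of g(x) = f^period(x) - x, using a
    central-difference derivative.  A sketch, not the thesis algorithms."""
    x = x0
    for _ in range(max_iter):
        g = iterate(x, period, r) - x
        if abs(g) < tol:
            return x
        dg = (iterate(x + h, period, r) - iterate(x - h, period, r)) / (2 * h) - 1.0
        x -= g / dg
    return x

fixed = find_upo(0.7, 1)  # unstable fixed point x* = 3/4
p2 = find_upo(0.9, 2)     # a point on the period-2 orbit, (5 + sqrt(5))/8
```

Both orbits are unstable (|(f^p)'| > 1 at the root), which is exactly why a root-finder on g, rather than forward iteration, is needed to locate them.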
Transverse Nonlinear Effects in Planar Waveguides
NASA Astrophysics Data System (ADS)
Dumais, Patrick
Transverse nonlinear effects due to the non-resonant optical Kerr effect are studied in two types of planar-geometry waveguides. First (Chapter 2), the emission of spatial solitons from a channel waveguide is studied historically, analytically, and numerically, with a view to designing and fabricating the device in AlGaAs, in the spectral region below half the bandgap of this material, i.e. around 1.5 microns. The component, as designed, includes a multiple-quantum-well structure. Local disordering of this structure allows a local variation of the Kerr coefficient in the guide, which leads to the emission of a spatial soliton above a threshold optical power. The experimental observation of an intensity-dependent change in the field profile at the output of the fabricated guide is presented. Second (Chapter 3), a technique for measuring the Kerr coefficient in a planar waveguide is presented. The technique consists of measuring the change in transmission through a mask placed at the guide output as a function of the peak intensity at the input of the planar guide. A method for determining the optimal conditions for measurement sensitivity is presented and illustrated with several examples. Finally, the realization of an optical parametric oscillator based on a periodically poled lithium niobate crystal is presented. The theory of optical parametric oscillators is set out with an emphasis on the generation of intense pulses at wavelengths around 1.5 microns from a Ti:sapphire laser, with the aim of obtaining a source for the experiments on soliton emission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.
Simulation of Triple Oxidation Ditch Wastewater Treatment Process
NASA Astrophysics Data System (ADS)
Yang, Yue; Zhang, Jinsong; Liu, Lixiang; Hu, Yongfeng; Xu, Ziming
2010-11-01
This paper presented the modeling mechanism and method of a sewage treatment system. A triple oxidation ditch process of a WWTP was simulated based on activated sludge model ASM2D with GPS-X software. In order to identify the adequate model structure to be implemented in the GPS-X environment, the oxidation ditch was divided into several completely stirred tank reactors depending on the distribution of aeration devices and dissolved oxygen concentration. The removal efficiencies of COD, ammonia nitrogen, total nitrogen, total phosphorus, and SS were simulated by GPS-X software with influent quality data of this WWTP from June to August 2009, to investigate the differences between the simulated results and the actual results. The results showed that the simulated values could well reflect the actual condition of the triple oxidation ditch process. The mathematical modeling method was appropriate for effluent quality prediction and process optimization.
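The division of the ditch into completely stirred tank reactors can be sketched with a toy tanks-in-series model: each tank removes a single lumped substrate by first-order kinetics instead of the full ASM2D reaction set, and all flows, volumes, and rate constants below are illustrative assumptions rather than values from the plant.

```python
def simulate_ditch(n_tanks=5, volume=500.0, flow=100.0,
                   s_in=200.0, k=0.05, t_end=200.0, dt=0.1):
    """Steady-flow tanks-in-series sketch of an oxidation ditch: each
    completely stirred tank obeys ds/dt = (Q/V)(s_prev - s) - k*s,
    integrated by explicit Euler.  All numbers are illustrative."""
    s = [s_in] * n_tanks  # substrate concentration in each tank (mg/L)
    for _ in range(int(t_end / dt)):
        upstream = s_in
        new = []
        for conc in s:
            dcdt = (flow / volume) * (upstream - conc) - k * conc
            new.append(conc + dt * dcdt)
            upstream = conc
        s = new
    return s

profile = simulate_ditch()  # concentration profile along the ditch
```

At steady state each tank attenuates the substrate by the factor (Q/V)/(Q/V + k), so the profile decays geometrically along the ditch; swapping the first-order sink for ASM2D process rates is what GPS-X does for the real plant.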
Optimal segmentation and packaging process
Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.
1999-01-01
A process for improving packaging efficiency uses three dimensional, computer simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation and sequence of the segmentation and packaging of the contaminated items is determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.
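As a toy illustration of the packaging-density side of the problem, a first-fit-decreasing heuristic packs segmented items into containers of fixed capacity. The patented process uses 3D models and additional constraints (cut count, radiation exposure), so this one-dimensional stand-in with invented item sizes is only a sketch of the optimization flavor involved.

```python
def pack_items(lengths, container_capacity):
    """First-fit-decreasing packing: place each item (largest first)
    into the first container with room, opening a new container only
    when none fits.  A classic bin-packing heuristic, not the patent's
    3D algorithm."""
    containers = []  # remaining capacity per open container
    plan = []        # contents of each container
    for item in sorted(lengths, reverse=True):
        for i, remaining in enumerate(containers):
            if item <= remaining:
                containers[i] -= item
                plan[i].append(item)
                break
        else:  # no open container fits: open a new one
            containers.append(container_capacity - item)
            plan.append([item])
    return plan

plan = pack_items([4.0, 3.0, 3.0, 2.0, 2.0, 2.0], container_capacity=6.0)
```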
Ludwig, T; Kern, P; Bongards, M; Wolf, C
2011-01-01
The optimization of relaxation and filtration times of submerged microfiltration flat modules in membrane bioreactors used for municipal wastewater treatment is essential for efficient plant operation. However, the optimization and control of such plants and their filtration processes is a challenging problem due to the underlying highly nonlinear and complex processes. This paper presents the use of genetic algorithms for this optimization problem in conjunction with a fully calibrated simulation model, as computational intelligence methods are perfectly suited to the nonconvex multi-objective nature of the optimization problems posed by these complex systems. The simulation model is developed and calibrated using membrane modules from the wastewater simulation software GPS-X based on the Activated Sludge Model No.1 (ASM1). Simulation results have been validated at a technical reference plant. They clearly show that filtration process costs for cleaning and energy can be reduced significantly by intelligent process optimization.
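A genetic algorithm for filtration/relaxation scheduling can be sketched against a hypothetical cost function; the real study evaluates candidates with a calibrated GPS-X membrane model, so the fouling and lost-throughput terms below, and all bounds, are invented purely for illustration.

```python
import random

def cost(filtration_min, relaxation_min):
    """Hypothetical plant cost: long filtration fouls the membrane
    (cleaning/energy cost grows quadratically), long relaxation wastes
    throughput.  A stand-in for the simulation model, not from the paper."""
    fouling = 0.02 * filtration_min ** 2
    lost_throughput = 25.0 * relaxation_min / (filtration_min + relaxation_min)
    return fouling + lost_throughput

def genetic_optimize(pop_size=30, generations=60, seed=3):
    """Truncation-selection GA: keep the better half, refill with
    averaged-parent children plus Gaussian mutation, clamp to bounds."""
    rng = random.Random(seed)
    # individual = (filtration minutes, relaxation minutes)
    pop = [(rng.uniform(1, 30), rng.uniform(0.5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: cost(*ind))
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            f = (a[0] + b[0]) / 2 + rng.gauss(0.0, 0.5)  # crossover + mutation
            r = (a[1] + b[1]) / 2 + rng.gauss(0.0, 0.1)
            children.append((min(max(f, 1.0), 30.0), min(max(r, 0.5), 5.0)))
        pop = survivors + children
    return min(pop, key=lambda ind: cost(*ind))

best = genetic_optimize()
```

The multi-objective version in the paper replaces the scalar `cost` with competing cleaning-cost and energy objectives, but the population mechanics are the same in spirit.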
Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation
Campbell, Robert James; Gantt, Laura; Congdon, Tamara
2009-01-01
This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533
A framework for service enterprise workflow simulation with multi-agents cooperation
NASA Astrophysics Data System (ADS)
Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun
2013-11-01
Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used for dynamic analysis of service business processes. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, a flexible scheduling for activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
NASA Astrophysics Data System (ADS)
Fedulov, Boris N.; Safonov, Alexander A.; Sergeichev, Ivan V.; Ushakov, Andrey E.; Klenin, Yuri G.; Makarenko, Irina V.
2016-10-01
Using composites in the construction of subway brackets is a very effective approach to extending their lifetime. However, this approach involves the necessity to prevent process-induced distortions of the bracket due to thermal deformation and chemical shrinkage. In the present study, a process simulation was carried out to support the design of the production tooling. The simulation was based on the application of a viscoelastic model for the resin. Simulation results were verified by comparison with results of manufacturing experiments. To optimize the bracket structure, a strength analysis was carried out as well.
Modeling and simulation: A key to future defense technology
NASA Technical Reports Server (NTRS)
Muccio, Anthony B.
1993-01-01
The purpose of this paper is to express the rationale for continued technological and scientific development of the modeling and simulation process for the defense industry. The defense industry, along with a variety of other industries, is currently being forced into making sacrifices in response to the current economic hardships. These sacrifices, which must not compromise the safety of our nation nor jeopardize our current standing as the world's peace officer, must be concentrated in areas that will withstand the needs of the changing world. Therefore, the need for cost-effective alternatives on defense issues must be examined. This paper provides support that the modeling and simulation process is an economically feasible process which will ensure our nation's safety as well as provide for and keep up with the future technological developments and demands required by the defense industry. The outline of this paper is as follows: introduction, which defines and describes the modeling and simulation process; discussion, which details the purpose and benefits of modeling and simulation and provides specific examples of how the process has been successful; and conclusion, which summarizes the specifics of modeling and simulation of defense issues and lends support for its continued use in the defense arena.
Integration of High-resolution Data for Temporal Bone Surgical Simulations
Wiet, Gregory J.; Stredney, Don; Powell, Kimerly; Hittle, Brad; Kerwin, Thomas
2016-01-01
Purpose To report on the state of the art in obtaining high-resolution 3D data of the microanatomy of the temporal bone and to process that data for integration into a surgical simulator. Specifically, we report on our experience in this area and discuss the issues involved to further the field. Data Sources Current temporal bone image acquisition and image processing established in the literature as well as in-house methodological development. Review Methods We reviewed the current English literature for the techniques used in computer-based temporal bone simulation systems to obtain and process anatomical data for use within the simulation. Search terms included “temporal bone simulation, surgical simulation, temporal bone.” Articles were chosen and reviewed that directly addressed data acquisition and processing/segmentation and enhancement, with emphasis given to computer-based systems. We present the results from this review in relationship to our approach. Conclusions High-resolution CT imaging (≤100 μm voxel resolution), along with unique image processing and rendering algorithms and structure-specific enhancement, is needed for high-level training and assessment using temporal bone surgical simulators. Higher-resolution clinical scanning and automated processes that run in efficient time frames are needed before these systems can routinely support pre-surgical planning. Additionally, protocols such as that provided in this manuscript need to be disseminated to increase the number and variety of virtual temporal bones available for training and performance assessment. PMID:26762105
Sludge batch 9 simulant runs using the nitric-glycolic acid flowsheet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, D. P.; Williams, M. S.; Brandenburg, C. H.
Testing was completed to develop a Sludge Batch 9 (SB9) nitric-glycolic acid chemical process flowsheet for the Defense Waste Processing Facility’s (DWPF) Chemical Process Cell (CPC). CPC simulations were completed using SB9 sludge simulant, Strip Effluent Feed Tank (SEFT) simulant and Precipitate Reactor Feed Tank (PRFT) simulant. Ten sludge-only Sludge Receipt and Adjustment Tank (SRAT) cycles, four SRAT/Slurry Mix Evaporator (SME) cycles, and one actual SB9 sludge SRAT/SME cycle were completed. As has been demonstrated in over 100 simulations, the replacement of formic acid with glycolic acid virtually eliminates the CPC’s largest flammability hazards, hydrogen and ammonia. Recommended processing conditions are summarized in Section 3.5.1. Testing demonstrated that the interim chemistry and Reduction/Oxidation (REDOX) equations are sufficient to predict the composition of DWPF SRAT product and SME product. Additional reports will finalize the chemistry and REDOX equations. Additional testing developed an antifoam strategy to minimize the hexamethyldisiloxane (HMDSO) peak at boiling while controlling foam, based on testing with simulant and actual waste. Implementation of the nitric-glycolic acid flowsheet in DWPF is recommended. This flowsheet not only eliminates the hydrogen and ammonia hazards but will lead to shorter processing times, higher elemental mercury recovery, and more concentrated SRAT and SME products. The steady pH profile is expected to provide flexibility in processing the high volume of strip effluent expected once the Salt Waste Processing Facility starts up.
Stochastic simulation of spatially correlated geo-processes
Christakos, G.
1987-01-01
In this study, developments in the theory of stochastic simulation are discussed. The unifying element is the notion of Radon projection in Euclidean spaces. This notion provides a natural way of reconstructing the real process from a corresponding process observable on a reduced dimensionality space, where analysis is theoretically easier and computationally tractable. Within this framework, the concept of space transformation is defined and several of its properties, which are of significant importance within the context of spatially correlated processes, are explored. The turning bands operator is shown to follow from this. This strengthens considerably the theoretical background of the geostatistical method of simulation, and some new results are obtained in both the space and frequency domains. The inverse problem is solved generally and the applicability of the method is extended to anisotropic as well as integrated processes. Some ill-posed problems of the inverse operator are discussed. Effects of the measurement error and impulses at origin are examined. Important features of the simulated process as described by geomechanical laws, the morphology of the deposit, etc., may be incorporated in the analysis. The simulation may become a model-dependent procedure and this, in turn, may provide numerical solutions to spatial-temporal geologic models. Because the spatial simulation may be technically reduced to unidimensional simulations, various techniques of generating one-dimensional realizations are reviewed. To link theory and practice, an example is computed in detail. © 1987 International Association for Mathematical Geology.
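Since the paper reduces spatial simulation to unidimensional simulations, one standard way to generate a one-dimensional correlated Gaussian realization is circulant embedding via the FFT. The sketch below is illustrative only; the exponential covariance and all parameter values are assumptions, not the space-transformation or turning-bands operators derived in the study.

```python
import numpy as np

def simulate_1d_gaussian(n, corr_len, dx=1.0, seed=0):
    """Stationary Gaussian realization with exponential covariance
    C(h) = exp(-|h|/corr_len), generated by circulant embedding."""
    rng = np.random.default_rng(seed)
    m = 2 * n                                    # double-length embedding
    lags = dx * np.minimum(np.arange(m), m - np.arange(m))
    cov = np.exp(-lags / corr_len)               # first row of the circulant
    eig = np.fft.fft(cov).real                   # circulant eigenvalues
    eig = np.clip(eig, 0.0, None)                # guard round-off negatives
    z = rng.normal(size=m) + 1j * rng.normal(size=m)
    field = np.fft.fft(np.sqrt(eig / m) * z)
    return field[:n].real                        # unit-variance realization

x = simulate_1d_gaussian(1024, corr_len=10.0)
print(x.shape, round(float(x.std()), 2))
```

The double-length embedding makes the covariance matrix circulant, so its eigenvalues come from a single FFT of the first row; sampling in the eigenbasis then costs O(n log n) per realization.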
NASA Astrophysics Data System (ADS)
Amran, M. A. M.; Idayu, N.; Faizal, K. M.; Sanusi, M.; Izamshah, R.; Shahir, M.
2016-11-01
In this study, the main objective is to determine the percentage difference of part weight between experimental and simulation work. The effect of process parameters on the weight of the plastic part is also investigated. The process parameters involved were mould temperature, melt temperature, injection time and cooling time. Autodesk Simulation Moldflow software was used to run the simulation of the plastic part, and the Taguchi method was selected as the Design of Experiment approach. The simulation result was then validated against the experimental result. It was found that the minimum and maximum percentage differences of part weight between simulation and experimental work are 0.35 % and 1.43 % respectively. In addition, the parameter with the most significant effect on part weight is the mould temperature, followed by melt temperature, injection time and cooling time.
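The headline comparison is a simple percentage difference between simulated and measured part weight; a minimal sketch of that calculation, with hypothetical weights rather than the study's data:

```python
def pct_difference(simulated, experimental):
    """Percentage difference of part weight, relative to experiment."""
    return abs(simulated - experimental) / experimental * 100.0

# Hypothetical part weights in grams (not the study's data):
print(round(pct_difference(10.1, 10.0), 2))
```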
Quantitative computer simulations of extraterrestrial processing operations
NASA Technical Reports Server (NTRS)
Vincent, T. L.; Nikravesh, P. E.
1989-01-01
The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.
NASA Astrophysics Data System (ADS)
Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao
2017-01-01
Multi-scale high-resolution modeling of rock failure process is a powerful means in modern rock mechanics studies to reveal the complex failure mechanism and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation, damage to failure, has raised high requirements on the design, implementation scheme and computation capacity of the numerical software system. This study is aimed at developing the parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator that is capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties, deal with and represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows - Linux interactive platform. A numerical model is built to test the parallel performance of FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computation efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In laboratory-scale simulation, the well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In field-scale simulation, the formation process of net fracture spacing from initiation, propagation to saturation can be revealed completely. In engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. 
It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.
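The reported "speedup and computation efficiency" are conventionally the parallel metrics S = T1/Tp and E = S/p; a minimal sketch with hypothetical timings (the abstract does not give these numbers):

```python
def speedup_and_efficiency(t_serial, t_parallel, n_procs):
    """Classical parallel-performance metrics: speedup S = T1/Tp,
    efficiency E = S / p for p processors."""
    s = t_serial / t_parallel
    return s, s / n_procs

# Hypothetical wall-clock times for one FEM solve step (seconds):
s, e = speedup_and_efficiency(t_serial=3600.0, t_parallel=300.0, n_procs=16)
print(s, e)
```

Efficiency below 1.0 reflects communication and load-imbalance overheads of the parallel FEM solver.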
NASA Astrophysics Data System (ADS)
Singh, Swadesh Kumar; Kumar, D. Ravi
2005-08-01
Hydro-mechanical deep drawing is a process for producing cup shaped parts with the assistance of a pressurized fluid. In the present work, numerical simulation of the conventional and counter pressure deep drawing processes has been done with the help of a finite element method based software. Simulation results were analyzed to study the improvement in drawability by using hydro-mechanical processes. The thickness variations in the drawn cups were analyzed and also the effect of counter pressure and oil gap on the thickness distribution was studied. Numerical simulations were also used for the die design, which combines both drawing and ironing processes in a single operation. This modification in the die provides high drawability, facilitates smooth material flow, gives more uniform thickness distribution and corrects the shape distortion.
Simulation of a Start-Up Manufacturing Facility for Nanopore Arrays
ERIC Educational Resources Information Center
Field, Dennis W.
2009-01-01
Simulation is a powerful tool in developing and troubleshooting manufacturing processes, particularly when considering process flows for manufacturing systems that do not yet exist. Simulation can bridge the gap in terms of setting up full-scale manufacturing for nanotechnology products if limited production experience is an issue. An effective…
Modelling and Simulation as a Recognizing Method in Education
ERIC Educational Resources Information Center
Stoffa, Veronika
2004-01-01
Computer animation-simulation models of complex processes and events, which are the method of instruction, can be an effective didactic device. Gaining deeper knowledge about objects modelled helps to plan simulation experiments oriented on processes and events researched. Animation experiments realized on multimedia computers can aid easier…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-10
... qualification process as an important tool for the assessment of vehicle performance. These simulations are... qualification process, simulations would be conducted using both a measured track geometry segment... on the results of simulation studies designed to identify track geometry irregularities associated...
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and on interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.
VARTM Process Modeling of Aerospace Composite Structures
NASA Technical Reports Server (NTRS)
Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.
2003-01-01
A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to inaccurate preform permeability values used in the simulation.
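Infiltration-time predictions of this kind are often sanity-checked against the classical one-dimensional Darcy estimate t = φμL²/(2KΔP), which makes explicit how strongly the fill time depends on the permeability K the abstract flags as inaccurate. The sketch below uses that simplified relation with hypothetical material values, not the paper's 3-D VARTM model:

```python
def darcy_fill_time(length, permeability, viscosity, porosity, delta_p):
    """1-D Darcy estimate of resin infiltration time:
    t = porosity * viscosity * L**2 / (2 * K * delta_p)."""
    return porosity * viscosity * length**2 / (2.0 * permeability * delta_p)

# Hypothetical values: 0.5 m flow length, K = 1e-10 m^2, resin
# viscosity 0.1 Pa*s, preform porosity 0.5, ~1 atm vacuum drive.
t = darcy_fill_time(0.5, 1e-10, 0.1, 0.5, 101325.0)
print(round(t / 60.0, 1))  # minutes
```

Because t scales as 1/K, even a factor-of-two error in measured preform permeability doubles the predicted fill time, consistent with the mismatch the abstract reports.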
Simulation based analysis of laser beam brazing
NASA Astrophysics Data System (ADS)
Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael
2016-03-01
Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.
Business process study simulation for resource management in an emergency department.
Poomkothammal, Velusamy
2006-01-01
Alexandra Hospital conducted a business process reengineering exercise for all its main processes in order to further improve their efficiency, with the ultimate aim of providing a higher level of service to patients. The goal of the Department of Emergency Medicine (DEM) is to manage an anticipated increase in patient volume without much increase in resources. As a start, the DEM studied its AS-IS process and designed and implemented a new TO-BE process. As part of this continuous improvement effort, staff from Nanyang Polytechnic (NYP) were assigned the task of applying engineering and analytical techniques to simulate the new process. The simulations were conducted to inform process management and resource planning.
Chen, P P; Tsui, N Tk; Fung, A Sw; Chiu, A Hf; Wong, W Cw; Leong, H T; Lee, P Sf; Lau, J Yw
2017-08-01
The implementation of a new clinical service is associated with anxiety and challenges that may prevent smooth and safe execution of the service. Unexpected issues may not be apparent until the actual clinical service commences. We present a novel approach to test the new clinical setting before actual implementation of our endovascular aortic repair service. In-situ simulation at the new clinical location would enable identification of potential process and system issues prior to implementation of the service. After preliminary planning, a simulation test utilising a case scenario with actual simulation of the entire care process was carried out to identify any logistic, equipment, settings or clinical workflow issues, and to trial a contingency plan for a surgical complication. All patient care including anaesthetic, surgical, and nursing procedures and processes were simulated and tested. Overall, 17 vital process and system issues were identified during the simulation as potential clinical concerns. They included difficult patient positioning, draping pattern, unsatisfactory equipment setup, inadequate critical surgical instruments, blood products logistics, and inadequate nursing support during crisis. In-situ simulation provides an innovative method to identify critical deficiencies and unexpected issues before implementation of a new clinical service. Life-threatening and serious practical issues can be identified and corrected before formal service commences. This article describes our experience with the use of simulation in pre-implementation testing of a clinical process or service. We found the method useful and would recommend it to others.
Simulation Assessment Validation Environment (SAVE). Software User’s Manual
2000-09-01
requirements and decisions are made. The integration is leveraging work from other DoD organizations so that high-end results are attainable much faster than...planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high ...technologies. This tool is also used to perform “high level” factory process simulation prior to full CAD model development and help define feasible
Logistics of Trainsets Creation with the Use of Simulation Models
NASA Astrophysics Data System (ADS)
Sedláček, Michal; Pavelka, Hynek
2016-12-01
This paper focuses on the train formation operational process problem in rail transport, using computer simulations. The problem has been solved using SIMUL8 and applied to a specific train formation station in the Czech Republic. The paper describes a proposed simulation model of the train formation work. Experimental modeling, with an assessment of the results achieved and a design solution for optimizing the train formation operational process, is also presented.
USING SIMULATION FOR POLLUTION PREVENTION
The ability to design or modify chemical processes in a way that minimizes the formation of unwanted by-products is an ongoing goal for process engineers. Two simulation and design methods are discussed here: Process Integration (PI) developed by El-Halwagi and Manousiouthakis a...
An application of sedimentation simulation in Tahe oilfield
NASA Astrophysics Data System (ADS)
Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He
2017-12-01
The braided river delta developed in the Triassic lower oil formation in block 9 of the Tahe oilfield, but its sedimentary evolution process is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, initial sedimentation environment, relative lake-level change and accommodation change, source supply, and sedimentary transport pattern. The simulation result shows that the error between simulated and actual strata thickness is small, and the single-well analysis result of the simulation is highly consistent with the actual analysis, which proves that the model is reliable. The study area underwent a braided river delta retrogradation evolution process, which provides a favorable basis for fine reservoir description and prediction.
Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; ...
2017-06-09
Lattice based Kinetic Monte Carlo (KMC) simulations offer a powerful simulation technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of “KMC stiffness” (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelmed by very short time-steps during KMC simulations, with the simulation spending an inordinate amount of KMC steps/CPU time simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed-ranks based on event frequencies has been designed and implemented with the intent of decreasing the probability of FFP events and increasing the probability of slow process events, allowing rate-limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. As shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.
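The rank-based throttling idea can be illustrated in toy form: bin each process into a decade ("rank") above the slowest one and divide its rate by the corresponding decade factor, so fast frivolous processes stop dominating the step count. This is a deliberate simplification, not the full SQERTSS algorithm (which staggers ranks and verifies quasi-equilibrium); all rates below are hypothetical.

```python
import math
import random

def rank_throttle(rates, base=10.0):
    """Toy rank-based throttling: assign each process a decade rank
    above the slowest process and divide its rate by base**rank,
    compressing a rate spectrum spanning orders of magnitude."""
    k_min = min(rates)
    return [k / base ** int(math.log10(k / k_min) + 1e-9) for k in rates]

def kmc_select(rates, rng):
    """Standard rejection-free KMC selection: process i fires with
    probability rates[i]/sum(rates); time advances exponentially."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            return i, -math.log(rng.random()) / total
    return len(rates) - 1, -math.log(rng.random()) / total

rng = random.Random(42)
rates = [1.0, 1e3, 1e6]            # six decades between slowest and fastest
throttled = rank_throttle(rates)   # all three compressed to the same decade
counts = [0, 0, 0]
for _ in range(3000):
    i, _dt = kmc_select(throttled, rng)
    counts[i] += 1
print(counts)
```

Without throttling, the slow process would fire roughly once per million KMC steps; after rank compression the three processes are sampled about equally, which is the behavior the abstract describes (at the cost of distorted time-step sizes).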
Characterizing the role of the hippocampus during episodic simulation and encoding.
Thakral, Preston P; Benoit, Roland G; Schacter, Daniel L
2017-12-01
The hippocampus has been consistently associated with episodic simulation (i.e., the mental construction of a possible future episode). In a recent study, we identified an anterior-posterior temporal dissociation within the hippocampus during simulation. Specifically, transient simulation-related activity occurred in relatively posterior portions of the hippocampus and sustained activity occurred in anterior portions. In line with previous theoretical proposals of hippocampal function during simulation, the posterior hippocampal activity was interpreted as reflecting a transient retrieval process for the episodic details necessary to construct an episode. In contrast, the sustained anterior hippocampal activity was interpreted as reflecting the continual recruitment of encoding and/or relational processing associated with a simulation. In the present study, we provide a direct test of these interpretations by conducting a subsequent memory analysis of our previously published data to assess whether successful encoding during episodic simulation is associated with the anterior hippocampus. Analyses revealed a subsequent memory effect (i.e., later remembered > later forgotten simulations) in the anterior hippocampus. The subsequent memory effect was transient and not sustained. Taken together, the current findings provide further support for a component process model of hippocampal function during simulation. That is, unique regions of the hippocampus support dissociable processes during simulation, which include the transient retrieval of episodic information, the sustained binding of such information into a coherent episode, and the transient encoding of that episode for later retrieval. © 2017 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Johnson, Donald R.; Lenzen, Allen J.; Zapotocny, Tom H.; Schaack, Todd K.
2000-11-01
A challenge common to weather, climate, and seasonal numerical prediction is the need to simulate accurately reversible isentropic processes in combination with appropriate determination of sources/sinks of energy and entropy. Ultimately, this task includes the distribution and transport of internal, gravitational, and kinetic energies, the energies of water substances in all forms, and the related thermodynamic processes of phase changes involved with clouds, including condensation, evaporation, and precipitation processes. All of the processes noted above involve the entropies of matter, radiation, and chemical substances, conservation during transport, and/or changes in entropies by physical processes internal to the atmosphere. With respect to the entropy of matter, a means to study a model's accuracy in simulating internal hydrologic processes is to determine its capability to simulate the appropriate conservation of potential and equivalent potential temperature as surrogates of dry and moist entropy under reversible adiabatic processes in which clouds form, evaporate, and precipitate. In this study, a statistical strategy utilizing the concept of “pure error” is set forth to assess the numerical accuracies of models to simulate reversible processes during 10-day integrations of the global circulation corresponding to the global residence time of water vapor. During the integrations, the sums of squared differences between the equivalent potential temperature θe numerically simulated by the governing equations of mass, energy, water vapor, and cloud water and a proxy equivalent potential temperature θte numerically simulated as a conservative property are monitored.
Inspection of the differences of θe and θte in time and space and the relative frequency distribution of the differences details bias and random errors that develop from nonlinear numerical inaccuracies in the advection and transport of potential temperature and water substances within the global atmosphere. A series of nine global simulations employing various versions of Community Climate Models CCM2 and CCM3 (all Eulerian spectral numerics, all semi-Lagrangian numerics, and mixed Eulerian spectral and semi-Lagrangian numerics) and the University of Wisconsin-Madison (UW) isentropic-sigma gridpoint model provides an interesting comparison of numerical accuracies in the simulation of reversibility. By day 10, large bias and random differences were identified in the simulation of reversible processes in all of the models except the UW isentropic-sigma model. The CCM2 and CCM3 simulations yielded systematic differences that varied zonally, vertically, and temporally. Within the comparison, the UW isentropic-sigma model was superior in transporting water vapor and cloud water/ice and in simulating reversibility involving the conservation of dry and moist entropy. The only relative frequency distribution of differences that appeared optimal, in that the distribution remained unbiased and equilibrated with minimal variance as it remained statistically stationary, was the distribution from the UW isentropic-sigma model. All other distributions revealed nonstationary characteristics with spreading and/or shifting of the maxima as the biases and variances of the numerical differences of θe and θte amplified.
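The "pure error" diagnostic reduces, at its core, to splitting the differences between the prognostic and proxy equivalent potential temperatures into a bias (mean) and a random component (spread about the mean). A minimal sketch on synthetic data; the offset and noise magnitudes are illustrative assumptions, not the paper's results:

```python
import numpy as np

def bias_and_random_error(theta_e, theta_te):
    """Split prognostic-minus-proxy differences into bias (mean)
    and a random component (standard deviation about that bias)."""
    diff = np.asarray(theta_e) - np.asarray(theta_te)
    return float(diff.mean()), float(diff.std(ddof=1))

# Synthetic fields: proxy plus a 0.5 K systematic offset and 0.2 K
# random numerical noise (illustrative values only).
rng = np.random.default_rng(1)
theta_te = 330.0 + rng.normal(0.0, 5.0, size=10000)
theta_e = theta_te + 0.5 + rng.normal(0.0, 0.2, size=10000)
bias, rand_err = bias_and_random_error(theta_e, theta_te)
print(round(bias, 2), round(rand_err, 2))
```

A nonstationary distribution of these differences, as reported for the spectral models, shows up as a bias and variance that grow over the 10-day integration rather than staying fixed.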
Three Dimensional Transient Turbulent Simulations of Scramjet Fuel Injection and Combustion
NASA Astrophysics Data System (ADS)
Bahbaz, Marwane
2011-11-01
A scramjet is a propulsion system suited to hypersonic flight (M > 5). The main objective of the simulation is to understand both the mixing and the combustion process of air flow with hydrogen fuel in a high-speed environment. This understanding is used to determine the number of fuel injectors required to increase combustion efficiency and energy transfer. Due to the complexity of this simulation, multiple software tools are used to achieve this objective. First, SolidWorks is used to draw a scramjet combustor with accurate measurements. The second tool, Gambit, is used to make several types of meshes for the scramjet combustor. Finally, OpenFOAM and CFD++ are used to process and post-process the scramjet combustor. At this stage, the simulation is divided into two categories. The cold-flow category is a series of simulations of subsonic and supersonic turbulent air flow across the combustor channel with fuel interaction from one or more injectors. The second category is the combustion simulations, which involve fluid flow and fuel mixing with ignition. The simulation and modeling of the scramjet combustor will assist in investigating and understanding the combustion process and energy transfer in a hypersonic environment.
NASA Astrophysics Data System (ADS)
Hur, Min Young; Verboncoeur, John; Lee, Hae June
2014-10-01
Particle-in-cell (PIC) simulations offer higher fidelity than fluid simulations for plasma devices that require transient kinetic modeling. They make fewer approximations to the plasma kinetics, but require many particles and grid cells to obtain meaningful results, so the simulation time grows in proportion to the number of particles. PIC simulation therefore needs high performance computing. In this research, a graphics processing unit (GPU) is adopted for high performance PIC simulation of low-temperature discharge plasmas. GPUs have many-core processors and high memory bandwidth compared with a central processing unit (CPU). NVIDIA GeForce GPUs with hundreds of cores were used for the test, offering cost-effective performance. The PIC algorithm is divided into two modules: a field solver and a particle mover. The particle mover is further divided into four routines, named move, boundary, Monte Carlo collision (MCC), and deposit. Overall, the GPU code solves particle motion as well as the electrostatic potential in two-dimensional geometry almost 30 times faster than a single-CPU code. This work was supported by the Korea Institute of Science and Technology Information.
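The four particle-mover routines named above (move, boundary, MCC, deposit) can be sketched in serial toy form. This 1-D sketch shows only the loop structure: the field solve and field kick are omitted, the collision model is a stand-in, and all parameters are hypothetical, so it is an illustration of the decomposition rather than the GPU implementation described.

```python
import random

def move(x, v, dt):
    """Drift step of the particle push (field kick omitted here)."""
    return [xi + vi * dt for xi, vi in zip(x, v)]

def boundary(x, v, length):
    """Absorbing walls: discard particles outside [0, length]."""
    kept = [(xi, vi) for xi, vi in zip(x, v) if 0.0 <= xi <= length]
    return [k[0] for k in kept], [k[1] for k in kept]

def mcc(v, p_collide, rng):
    """Toy Monte Carlo collision: with probability p_collide the
    1-D velocity is reversed (stand-in for real cross sections)."""
    return [-vi if rng.random() < p_collide else vi for vi in v]

def deposit(x, n_cells, length):
    """Nearest-grid-point charge deposition onto the mesh."""
    rho = [0.0] * n_cells
    dx = length / n_cells
    for xi in x:
        rho[min(int(xi / dx), n_cells - 1)] += 1.0
    return rho

rng = random.Random(0)
length, dt = 1.0, 0.01
x = [rng.random() for _ in range(1000)]
v = [rng.uniform(-1.0, 1.0) for _ in range(1000)]
for _ in range(10):              # a field solve per step would go here
    x = move(x, v, dt)
    x, v = boundary(x, v, length)
    v = mcc(v, 0.05, rng)
rho = deposit(x, 32, length)
print(len(x), len(rho))
```

On a GPU, move and mcc parallelize trivially over particles, while deposit needs atomic or tiled accumulation to avoid write conflicts on shared grid cells, which is one reason the routines are separated.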
Neurological evidence linguistic processes precede perceptual simulation in conceptual processing.
Louwerse, Max; Hutchinson, Sterling
2012-01-01
There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky - ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes.
Hafnium transistor process design for neural interfacing.
Parent, David W; Basham, Eric J
2009-01-01
A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology combines 1-D analytical equations for threshold voltage specification and doping profiles with 1-D MIS Technology Computer Aided Design (TCAD) to design a process implementing a specific threshold voltage, which minimized simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide films (HfO) were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
A chemical EOR benchmark study of different reservoir simulators
NASA Astrophysics Data System (ADS)
Goudarzi, Ali; Delshad, Mojdeh; Sepehrnoori, Kamy
2016-09-01
Interest in chemical EOR processes has intensified in recent years due to the advancements in chemical formulations and injection techniques. Injecting Polymer (P), surfactant/polymer (SP), and alkaline/surfactant/polymer (ASP) are techniques for improving sweep and displacement efficiencies with the aim of improving oil production in both secondary and tertiary floods. There has been great interest in chemical flooding recently for different challenging situations. These include high temperature reservoirs, formations with extreme salinity and hardness, naturally fractured carbonates, and sandstone reservoirs with heavy and viscous crude oils. More oil reservoirs are reaching maturity where secondary polymer floods and tertiary surfactant methods have become increasingly important. This significance has added to the industry's interest in using reservoir simulators as tools for reservoir evaluation and management to minimize costs and increase the process efficiency. Reservoir simulators with special features are needed to represent coupled chemical and physical processes present in chemical EOR processes. The simulators need to be first validated against well controlled lab and pilot scale experiments to reliably predict the full field implementations. The available data from laboratory scale include 1) phase behavior and rheological data; and 2) results of secondary and tertiary coreflood experiments for P, SP, and ASP floods under reservoir conditions, i.e. chemical retentions, pressure drop, and oil recovery. Data collected from corefloods are used as benchmark tests comparing numerical reservoir simulators with chemical EOR modeling capabilities such as STARS of CMG, ECLIPSE-100 of Schlumberger, REVEAL of Petroleum Experts. The research UTCHEM simulator from The University of Texas at Austin is also included since it has been the benchmark for chemical flooding simulation for over 25 years. 
The results of this benchmark comparison will be utilized to improve chemical design for field-scale studies using commercial simulators. The benchmark tests illustrate the potential of commercial simulators for chemical flooding projects and provide a comprehensive table of strengths and limitations of each simulator for a given chemical EOR process. Mechanistic simulations of chemical EOR processes will provide predictive capability and can aid in optimization of the field injection projects. The objective of this paper is not to compare the computational efficiency and solution algorithms; it only focuses on the process modeling comparison.
Multi-scale Modeling of Arctic Clouds
NASA Astrophysics Data System (ADS)
Hillman, B. R.; Roesler, E. L.; Dexheimer, D.
2017-12-01
The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect between the scales at which these models are formulated and the scale of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud system resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded into each gridcell of a traditional GCM to replace the cloud and convective parameterizations to explicitly simulate more of these important processes. This approach is attractive in that it allows for more explicit simulation of small-scale processes while also allowing for interaction between the small and large-scale processes. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore the limitations of such a framework using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partitioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.
NASA Technical Reports Server (NTRS)
Wang, Yansen; Tao, W.-K.; Lau, K.-M.; Wetzel, Peter J.
2003-01-01
The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured basic signatures of the monsoon onset processes and associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall results than a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indochina peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level wind field. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and associated wind simulation.
The development of an industrial-scale fed-batch fermentation simulation.
Goldrick, Stephen; Ştefan, Andrei; Lovett, David; Montague, Gary; Lennox, Barry
2015-01-10
This paper describes a simulation of an industrial-scale fed-batch fermentation that can be used as a benchmark in process systems analysis and control studies. The simulation was developed using a mechanistic model and validated using historical data collected from an industrial-scale penicillin fermentation process. Each batch was carried out in a 100,000 L bioreactor that used an industrial strain of Penicillium chrysogenum. The manipulated variables recorded during each batch were used as inputs to the simulator and the predicted outputs were then compared with the on-line and off-line measurements recorded in the real process. The simulator adapted a previously published structured model to describe the penicillin fermentation and extended it to include the main environmental effects of dissolved oxygen, viscosity, temperature, pH and dissolved carbon dioxide. In addition the effects of nitrogen and phenylacetic acid concentrations on the biomass and penicillin production rates were also included. The simulated model predictions of all the on-line and off-line process measurements, including the off-gas analysis, were in good agreement with the batch records. The simulator and industrial process data are available to download at www.industrialpenicillinsimulation.com and can be used to evaluate, study and improve on the current control strategy implemented on this facility.
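As a sketch of the kind of mechanistic balance such a fed-batch simulator builds on, the following integrates Monod growth with feed dilution and non-growth-associated product formation using explicit Euler steps. All parameter values are hypothetical illustrations, not the published structured penicillin model:

```python
def fed_batch(t_end=100.0, dt=0.1):
    """Minimal fed-batch fermentation sketch: Monod growth, substrate feed
    with dilution, and first-order product formation. Units: h, L, g/L."""
    mu_max, Ks, Yxs, qp = 0.1, 0.2, 0.5, 0.005   # assumed kinetic constants
    F, Sf = 0.05, 400.0                          # feed rate (L/h), feed conc (g/L)
    V, X, S, P = 1000.0, 1.0, 10.0, 0.0          # volume, biomass, substrate, product
    for _ in range(int(t_end / dt)):
        mu = mu_max * S / (Ks + S)               # Monod specific growth rate
        dX = mu * X - (F / V) * X                # growth minus dilution
        dS = -(mu / Yxs) * X + (F / V) * (Sf - S)
        dP = qp * X - (F / V) * P
        V += F * dt
        X += dX * dt
        S = max(S + dS * dt, 0.0)
        P += dP * dt
    return V, X, S, P
```

A real industrial simulator layers on the environmental effects the abstract lists (dissolved oxygen, viscosity, temperature, pH, CO2), but the dilution/kinetics skeleton is the same.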
Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Johansson, Gerd
2014-01-01
To demonstrate the use of visualization and simulation tools in order to involve stakeholders and inform the process in hospital change processes, illustrated by an empirical study from a children's emergency clinic. Reorganization and redevelopment of a hospital is a complex activity that involves many stakeholders and demands. Visualization and simulation tools have proven useful for involving practitioners and eliciting relevant knowledge. More knowledge is desired about how these tools can be implemented in practice for hospital planning processes. A participatory planning process including practitioners and researchers was executed over a 3-year period to evaluate a combination of visualization and simulation tools to involve stakeholders in the planning process and to elicit knowledge about needs and requirements. The initial clinic proposal from the architect was discarded as a result of the empirical study. Much general knowledge about the needs of the organization was extracted by means of the adopted tools. Some of the tools proved to be more accessible than others for the practitioners participating in the study. The combination of tools added value to the process by presenting information in alternative ways and eliciting questions from different angles. Visualization and simulation tools inform a planning process (or other types of change processes) by providing the means to see beyond present demands and current work structures. Long-term involvement in combination with accessible tools is central for creating a participatory setting where the practitioners' knowledge guides the process.
Chen, Weiliang; De Schutter, Erik
2017-01-01
Stochastic, spatial reaction-diffusion simulations have been widely used in systems biology and computational neuroscience. However, the increasing scale and complexity of models and morphologies have exceeded the capacity of any serial implementation. This led to the development of parallel solutions that benefit from the boost in performance of modern supercomputers. In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies. Simulation results indicate that our implementation is capable of achieving super-linear speedup for balanced loading simulations with reasonable molecule density and mesh quality. In the best scenario, a parallel simulation with 2,000 processes runs more than 3,600 times faster than its serial SSA counterpart, and achieves more than 20-fold speedup relative to parallel simulation with 100 processes. In a more realistic scenario with dynamic calcium influx and data recording, the parallel simulation with 1,000 processes and no load balancing is still 500 times faster than the conventional serial SSA simulation. PMID:28239346
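The operator-splitting scheme described above alternates diffusion and reaction updates each time step. A toy serial version on a 1D periodic voxel chain, with binomial hop counts for diffusion and a first-order decay reaction (all rates illustrative, not from the paper, which uses SSA on tetrahedral meshes across MPI ranks), might look like:

```python
import numpy as np

def split_step(counts, p_hop, k_decay, dt, rng):
    """One operator-splitting step for a stochastic reaction-diffusion toy:
    diffuse (each molecule hops left/right with probability p_hop each,
    requiring p_hop < 0.5), then react (first-order decay)."""
    # diffusion: split each voxel's molecules into left-hoppers, right-hoppers, stayers
    left = rng.binomial(counts, p_hop)
    right = rng.binomial(counts - left, p_hop / (1.0 - p_hop))
    new = counts - left - right
    new += np.roll(left, -1) + np.roll(right, 1)   # periodic neighbours
    # reaction: each molecule survives dt with probability exp(-k*dt)
    return rng.binomial(new, np.exp(-k_decay * dt))
```

Because each voxel's update touches only its neighbours, the voxel array can be partitioned across processes with only halo exchange at partition boundaries, which is what makes the MPI parallelization effective.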
NASA Astrophysics Data System (ADS)
Rock, Gilles; Fischer, Kim; Schlerf, Martin; Gerhards, Max; Udelhoven, Thomas
2017-04-01
The development and optimization of image processing algorithms requires the availability of datasets depicting every step from the earth's surface to the sensor's detector. The lack of ground truth data makes it necessary to develop algorithms on simulated data. The simulation of hyperspectral remote sensing data is a useful tool for a variety of tasks such as the design of systems, the understanding of the image formation process, and the development and validation of data processing algorithms. An end-to-end simulator has been set up consisting of a forward simulator, a backward simulator and a validation module. The forward simulator derives radiance datasets based on laboratory sample spectra, applies atmospheric contributions using radiative transfer equations, and simulates the instrument response using configurable sensor models. This is followed by the backward simulation branch, consisting of an atmospheric correction (AC), a temperature and emissivity separation (TES) or a hybrid AC and TES algorithm. An independent validation module allows the comparison between input and output datasets and the benchmarking of different processing algorithms. In this study, hyperspectral thermal infrared scenes of a variety of surfaces were simulated to analyze existing AC and TES algorithms. The ARTEMISS algorithm was optimized and benchmarked against the original implementations. The errors in TES were found to be related to incorrect water vapor retrieval. The atmospheric characterization could be optimized, resulting in increased accuracies in temperature and emissivity retrieval. Airborne datasets of different spectral resolutions were simulated from terrestrial HyperCam-LW measurements. The simulated airborne radiance spectra were subjected to atmospheric correction and TES and further used for a plant species classification study analyzing effects related to noise and mixed pixels.
NASA Astrophysics Data System (ADS)
Gigan, Olivier; Chen, Hua; Robert, Olivier; Renard, Stephane; Marty, Frederic
2002-11-01
This paper is dedicated to the fabrication and technological aspects of a silicon microresonator sensor. The entire project includes the fabrication processes, the system modelling/simulation, and the electronic interface. The mechanical model of such a resonator is presented, including a description of the frequency stability and hysteresis behaviour of the electrostatically driven resonator. A numerical model and FEM simulations are used to simulate the system's dynamic behaviour. The complete fabrication process is based on standard microelectronics technology with specific MEMS technological steps. The key steps are described: micromachining on SOI by Deep Reactive Ion Etching (DRIE), specific release processes to prevent sticking (resist and HF-vapour release processes) and collective vacuum encapsulation by Silicon Direct Bonding (SDB). The complete process has been validated and prototypes have been fabricated. An ASIC was designed to interface the sensor and to control the vibration amplitude. This electronic interface was simulated and designed to work up to 200°C and implemented in a standard 0.6 μm CMOS technology. Characterization of sensor prototypes was done both mechanically and electrostatically. These measurements showed good agreement with theory and FEM simulations.
Learning-Testing Process in Classroom: An Empirical Simulation Model
ERIC Educational Resources Information Center
Buda, Rodolphe
2009-01-01
This paper presents an empirical micro-simulation model of the teaching and the testing process in the classroom (Programs and sample data are available--the actual names of pupils have been hidden). It is a non-econometric micro-simulation model describing informational behaviors of the pupils, based on the observation of the pupils'…
ERIC Educational Resources Information Center
Keskitalo, Tuulikki
2012-01-01
Expectations for simulations in healthcare education are high; however, little is known about healthcare students' expectations of the learning process in virtual reality (VR) and simulation-based learning environments (SBLEs). This research aims to describe first-year healthcare students' (N=97) expectations regarding teaching, studying, and…
Furniture rough mill costs evaluated by computer simulation
R. Bruce Anderson
1983-01-01
A crosscut-first furniture rough mill was simulated to evaluate processing and raw material costs on an individual part basis. Distributions representing the real-world characteristics of lumber, equipment feed speeds, and processing requirements are programed into the simulation. Costs of parts from a specific cutting bill are given, and effects of lumber input costs...
Simulation-Based Learning: The Learning-Forgetting-Relearning Process and Impact of Learning History
ERIC Educational Resources Information Center
Davidovitch, Lior; Parush, Avi; Shtub, Avy
2008-01-01
The results of empirical experiments evaluating the effectiveness and efficiency of the learning-forgetting-relearning process in a dynamic project management simulation environment are reported. Sixty-six graduate engineering students performed repetitive simulation-runs with a break period of several weeks between the runs. The students used a…
10 CFR 434.517 - HVAC systems and equipment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... simulation, except that excess capacity provided to meet process loads need not be modeled unless the process... Reference Buildings. The zones in the simulation shall correspond to the zones provided by the controls in... simulation. Table 517.4.1—HVAC System Description for Prototype and Reference Buildings 1,2 HVAC component...
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
An introduction to digital stochastic simulations for modeling a variety of physical and chemical processes is presented. Despite the importance of stochastic simulations in chemistry, the prevalence of turn-key software solutions can impose a layer of abstraction between the user and the underlying approach obscuring the methodology being…
USDA-ARS?s Scientific Manuscript database
A three-dimensional water quality model was developed for simulating temporal and spatial variations of phytoplankton, nutrients, and dissolved oxygen in freshwater bodies. Effects of suspended and bed sediment on the water quality processes were simulated. A formula was generated from field measure...
FEA Simulation of Free-Bending - a Preforming Step in the Hydroforming Process Chain
NASA Astrophysics Data System (ADS)
Beulich, N.; Craighero, P.; Volk, W.
2017-09-01
High-strength steel and aluminum alloys are essential for developing innovative, lightweight space-frame concepts. The intended design is built from car body parts with high geometrical complexity and reduced material thickness. Over the past few years, many complex car body parts have been produced using hydroforming. To increase the accuracy of hydroforming for prospective car concepts, the virtual manufacturing of forming processes becomes more important. As a part of process digitalization, it is necessary to develop a simulation model for the hydroforming process chain. The preforming of longitudinally welded tubes is therefore implemented using three-dimensional free-bending. This technique can reproduce complex deflection curves in combination with innovative low-thickness material design for hydroforming processes. As a first step toward the complete process simulation, this paper deals with the development of a finite element simulation model for the free-bending process with 6 degrees of freedom. A mandrel built from spherical segments connected by a steel rope is located inside the tube to prevent geometrical instability. Critical parameters for the result of the bending process are evaluated and optimized. The simulation model is verified by surface measurements of a two-dimensional bending test.
COMPUTERIZED TRAINING OF CRYOSURGERY – A SYSTEM APPROACH
Keelan, Robert; Yamakawa, Soji; Shimada, Kenji; Rabin, Yoed
2014-01-01
The objective of the current study is to provide the foundation for a computerized training platform for cryosurgery. Consistent with clinical practice, the training process targets the correlation of the frozen region contour with the target region shape, using medical imaging and accepted criteria for clinical success. The current study focuses on system design considerations, including a bioheat transfer model, simulation techniques, optimal cryoprobe layout strategy, and a simulation core framework. Two fundamentally different approaches were considered for the development of a cryosurgery simulator, based on a finite-elements (FE) commercial code (ANSYS) and a proprietary finite-difference (FD) code. Results of this study demonstrate that the FE simulator is superior in terms of geometric modeling, while the FD simulator is superior in terms of runtime. Benchmarking results further indicate that the FD simulator is superior in terms of usage of memory resources, pre-processing, parallel processing, and post-processing. It is envisioned that future integration of a human-interface module and clinical data into the proposed computer framework will make computerized training of cryosurgery a practical reality. PMID:23995400
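The FD simulator benchmarked above iterates a bioheat kernel over a grid. One explicit finite-difference step of a 1D Pennes-type bioheat equation (diffusion plus blood-perfusion term) can be sketched as follows; the parameter values are illustrative, not the study's tissue properties:

```python
import numpy as np

def bioheat_step(T, dx, dt, alpha=1.4e-7, w_b=5e-4, T_b=37.0):
    """One explicit FD step of dT/dt = alpha * d2T/dx2 + w_b * (T_b - T).
    alpha: thermal diffusivity (m^2/s), w_b: perfusion rate (1/s),
    T_b: arterial blood temperature (C). Stability needs dt < dx^2/(2*alpha)."""
    lap = (np.roll(T, 1) - 2.0 * T + np.roll(T, -1)) / dx**2
    lap[0] = lap[-1] = 0.0   # hold the domain boundaries fixed
    return T + dt * (alpha * lap + w_b * (T_b - T))
```

A cryoprobe is modeled by pinning grid nodes at the probe temperature each step; the simulator then tracks the advancing freezing front against the target contour.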
Automated Simulation For Analysis And Design
NASA Technical Reports Server (NTRS)
Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.
1992-01-01
Design Assistant Workstation (DAWN) software is being developed to facilitate simulation of qualitative and quantitative aspects of the behavior of a life-support system in a spacecraft, a chemical-processing plant, the heating and cooling system of a large building, or any of a variety of systems involving interacting process streams and processes. It is used to analyze alternative design scenarios or specific designs of such systems. The expert system will automate part of the design analysis: it reasons independently by simulating design scenarios and returns to the designer with overall evaluations and recommendations.
NASA Astrophysics Data System (ADS)
Dwivany, Fenny Martha; Esyanti, Rizkita R.; Prapaisie, Adeline; Puspa Kirana, Listya; Latief, Chunaeni; Ginaldi, Ari
2016-11-01
The objective of the research was to determine the effect of microgravity simulation by a 3D clinostat on the Cavendish banana (Musa acuminata AAA group) ripening process. In this study, physical and physiological changes, as well as gene expression, were analysed. The results showed that under simulated microgravity the ripening process in banana was delayed, and the expression of the MaACO1, MaACS1 and MaACS5 genes was affected.
Seasonal changes in the atmospheric heat balance simulated by the GISS general circulation model
NASA Technical Reports Server (NTRS)
Stone, P. H.; Chow, S.; Helfand, H. M.; Quirk, W. J.; Somerville, R. C. J.
1975-01-01
Tests of the ability of numerical general circulation models to simulate the atmosphere have focussed so far on simulations of the January climatology. These models generally prescribe boundary conditions such as sea surface temperature, but this does not prevent testing their ability to simulate the seasonal changes in atmospheric processes that accompany prescribed seasonal changes in boundary conditions. Experiments to simulate changes in the zonally averaged heat balance are discussed, since many simplified models of climatic processes are based solely on this balance.
Simulation of transient flow in a shock tunnel and a high Mach number nozzle
NASA Technical Reports Server (NTRS)
Jacobs, P. A.
1991-01-01
A finite volume Navier-Stokes code was used to simulate the shock reflection and nozzle starting processes in an axisymmetric shock tube and a high Mach number nozzle. The simulated nozzle starting processes were found to match the classical quasi-1-D theory and some features of the experimental measurements. The shock reflection simulation illustrated a new mechanism for the driver gas contamination of the stagnated test gas.
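The classical quasi-1-D theory used for comparison rests on the isentropic area-Mach relation. A small sketch of that relation and its inversion by bisection (standard gas-dynamics formulas, not code from the report):

```python
import math

def area_ratio(M, gamma=1.4):
    """Isentropic quasi-1-D area ratio A/A* as a function of Mach number M."""
    t = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * M * M)
    return t ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / M

def mach_from_area(ratio, gamma=1.4, supersonic=True):
    """Invert A/A* for M by bisection; the relation is double-valued, so
    pick the supersonic (M > 1) or subsonic (M < 1) branch explicitly."""
    lo, hi = (1.0, 50.0) if supersonic else (1e-6, 1.0)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        # on the supersonic branch A/A* increases with M; subsonic, it decreases
        if (area_ratio(mid, gamma) < ratio) == supersonic:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

During nozzle starting, the flow at a given station relaxes toward the steady Mach number this relation predicts, which is how the simulation was checked against quasi-1-D theory.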
Hydro turbine governor’s power control of hydroelectric unit with sloping ceiling tailrace tunnel
NASA Astrophysics Data System (ADS)
Fu, Liang; Wu, Changli; Tang, Weiping
2018-02-01
The primary frequency regulation and load regulation transient processes of a hydropower unit with a sloping ceiling tailrace tunnel, with the hydro turbine governor operating in power mode, are analysed by field test and numerical simulation in this paper. A simulation method based on a “three-zone model” is proposed to simulate the small-fluctuation transient process of the sloping ceiling tailrace. The simulation model of the governor's power mode is established through identification of the governor's PLC program and parameter measurement, and the model is verified against the test. A slow-fast-slow “three-stage regulation” method is proposed that improves the dynamic quality of the governor's power mode. The power regulation strategy and its parameters are optimized by numerical simulation, and the performance of the primary frequency regulation and load regulation transient processes under power mode is improved significantly.
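As an illustration of the slow-fast-slow idea only, a cubic smoothstep gives a setpoint trajectory whose ramp rate starts low, peaks mid-transition, and eases off near the target. This is a hypothetical shaping function, not the paper's actual governor logic or parameters:

```python
def three_stage_setpoint(p0, p1, t, t_total):
    """Slow-fast-slow power setpoint between p0 and p1 over t_total seconds,
    using the cubic smoothstep s^2 * (3 - 2s), whose slope is zero at both
    ends and maximal at the midpoint."""
    s = min(max(t / t_total, 0.0), 1.0)
    return p0 + (p1 - p0) * s * s * (3.0 - 2.0 * s)
```

Feeding the governor a shaped setpoint like this limits pressure and tailrace surges at the start and end of a load change while still completing the transition quickly in the middle.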
King, Gillian; Shepherd, Tracy A; Servais, Michelle; Willoughby, Colleen; Bolack, Linda; Strachan, Deborah; Moodie, Sheila; Baldwin, Patricia; Knickle, Kerry; Parker, Kathryn; Savage, Diane; McNaughton, Nancy
2016-10-01
To describe the creation and validation of six simulations concerned with effective listening and interpersonal communication in pediatric rehabilitation. The simulations involved clinicians from various disciplines, were based on clinical scenarios related to client issues, and reflected core aspects of listening/communication. Each simulation had a key learning objective, thus focusing clinicians on specific listening skills. The article outlines the process used to turn written scenarios into digital video simulations, including steps taken to establish content validity and authenticity, and to establish a series of videos based on the complexity of their learning objectives, given contextual factors and associated macrocognitive processes that influence the ability to listen. A complexity rating scale was developed and used to establish a gradient of easy/simple, intermediate, and hard/complex simulations. The development process exemplifies an evidence-based, integrated knowledge translation approach to the teaching and learning of listening and communication skills.
Understanding Islamist political violence through computational social simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
Managing complexity in simulations of land surface and near-surface processes
Coon, Ethan T.; Moulton, J. David; Painter, Scott L.
2016-01-12
Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
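The dependency-graph idea, where evaluating a variable automatically pulls in and caches its dependencies, can be sketched in a few lines of recursive resolution. This is a toy model of the concept, not the Arcos API:

```python
def evaluate(var, deps, compute, cache=None):
    """Evaluate `var` through a dependency graph: `deps` maps each variable
    to the variables it depends on, `compute` maps it to a function of those
    dependencies' values. Results are cached so shared dependencies are
    computed once per evaluation."""
    cache = {} if cache is None else cache
    if var in cache:
        return cache[var]
    args = [evaluate(d, deps, compute, cache) for d in deps.get(var, [])]
    cache[var] = compute[var](*args)
    return cache[var]
```

Because each process declares what it needs rather than when to run, adding or swapping a process model only changes graph edges, which is what keeps the coupled system modular.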
Simulation and flavor compound analysis of dealcoholized beer via one-step vacuum distillation.
Andrés-Iglesias, Cristina; García-Serna, Juan; Montero, Olimpio; Blanco, Carlos A
2015-10-01
The coupled operation of a lab-scale vacuum distillation process to produce alcohol-free beer and Aspen HYSYS simulation software was studied to define the chemical changes in the aroma profiles of 2 different lager beers during the dealcoholization process. In the lab-scale process, 2 different parameter sets were chosen to dealcoholize the beer samples: 102 mbar at 50 °C and 200 mbar at 67 °C. Samples taken at different steps of the process were analyzed by HS-SPME-GC-MS, focusing on the concentration of 7 flavor compounds: 5 alcohols and 2 esters. For the simulation, the EoS parameters of the Wilson-2 property package were adjusted to the experimental data, and one more pressure was tested (60 mbar). Simulation methods represent a viable alternative for predicting the volatile compound composition of a final dealcoholized beer. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Elliott, Thomas J.; Gu, Mile
2018-03-01
Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.
More Than One Way to Debrief: A Critical Review of Healthcare Simulation Debriefing Methods.
Sawyer, Taylor; Eppich, Walter; Brett-Fleegler, Marisa; Grant, Vincent; Cheng, Adam
2016-06-01
Debriefing is a critical component in the process of learning through healthcare simulation. This critical review examines the timing, facilitation, conversational structures, and process elements used in healthcare simulation debriefing. Debriefing occurs either after (postevent) or during (within-event) the simulation. The debriefing conversation can be guided by either a facilitator (facilitator-guided) or the simulation participants themselves (self-guided). Postevent facilitator-guided debriefing may incorporate several conversational structures. These conversational structures break the debriefing discussion into a series of 3 or more phases to help organize the debriefing and ensure the conversation proceeds in an orderly manner. Debriefing process elements are an array of techniques to optimize reflective experience and maximize the impact of debriefing. These are divided here into the following 3 categories: essential elements, conversational techniques/educational strategies, and debriefing adjuncts. This review provides both novice and advanced simulation educators with an overview of various methods of conducting healthcare simulation debriefing. Future research will investigate which debriefing methods are best for which contexts and for whom, and also explore how lessons from simulation debriefing translate to debriefing in clinical practice.
Simulation software: engineer processes before reengineering.
Lepley, C J
2001-01-01
People make decisions all the time using intuition. But what happens when you are asked: "Are you sure your predictions are accurate? How much will a mistake cost? What are the risks associated with this change?" Once a new process is engineered, it is difficult to analyze what would have been different if other options had been chosen. Simulating a process can help senior clinical officers solve complex patient flow problems and avoid wasted efforts. Simulation software can give you the data you need to make decisions. The author introduces concepts, methodologies, and applications of computer aided simulation to illustrate their use in making decisions to improve workflow design.
Spatio-Temporal Process Simulation of Dam-Break Flood Based on SPH
NASA Astrophysics Data System (ADS)
Wang, H.; Ye, F.; Ouyang, S.; Li, Z.
2018-04-01
On the basis of introducing the SPH (Smoothed Particle Hydrodynamics) simulation method, solutions were given to the key research problems, which were the spatial and temporal scales adapting to GIS (Geographical Information System) applications, the boundary condition equations combined with the underlying surface, and the kernel function and parameters applicable to dam-break flood simulation. In this regard, a calculation method of spatio-temporal process emulation with elaborate particles for dam-break floods was proposed. Moreover, the spatio-temporal process was dynamically simulated using GIS modelling and visualization. The results show that the method yields richer information and a more objective, realistic representation of the flood process.
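The kernel function the abstract refers to can be sketched with one common SPH choice, the 2-D cubic spline, together with the standard density summation. This is an assumption for illustration; the paper's actual kernel and parameter choices are not given in the abstract.

```python
import math

def cubic_spline_kernel(r, h):
    """Standard 2-D cubic-spline SPH kernel W(r, h), with smoothing
    length h and compact support of radius 2h."""
    sigma = 10.0 / (7.0 * math.pi * h * h)   # 2-D normalization constant
    q = r / h
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    elif q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0                               # outside the support

def density(i, positions, masses, h):
    """SPH density summation: rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    xi, yi = positions[i]
    return sum(m * cubic_spline_kernel(math.hypot(xi - x, yi - y), h)
               for (x, y), m in zip(positions, masses))
```

In a dam-break simulation this kernel also weights the pressure and viscosity forces between neighbouring particles; only the density step is shown here.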
PROCESS SIMULATION TOOLS FOR POLLUTION PREVENTION: NEW METHODS REDUCE THE MAGNITUDE OF WASTE STREAMS
Growing environmental concerns have spurred considerable interest in pollution prevention. In most instances, pollution prevention involves introducing radical changes to the design of processes so that waste generation is minimized. Process simulators can be effective tools in a...
Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation
2018-01-01
ARL-TR-8284 ● JAN 2018 US Army Research Laboratory Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation
Virtual tryout planning in automotive industry based on simulation metamodels
NASA Astrophysics Data System (ADS)
Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.
2016-11-01
Deep-drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations of process and material properties often lead to robustness problems. Therefore, numerical simulations are used to detect the critical regions. To enhance the agreement with real process conditions, the material data are acquired through a variety of experiments, and the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point as well as to adjust process settings in case the process becomes unstable. Furthermore, tool tryout time can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, save time, and recognize complex relationships.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCorkle, D.; Yang, C.; Jordan, T.
2007-06-01
Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot-scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy's (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.
Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang
2017-01-01
Although wafer-level camera lenses are a very promising technology, problems such as warpage with time and non-uniform thickness of products still exist. In this study, finite element simulation was performed to simulate the compression molding process for acquiring the pressure distribution on the product on completion of the process and predicting the deformation with respect to the pressure distribution. Results show that the single-gate compression molding process significantly increases the pressure at the center of the product, whereas the multi-gate compression molding process can effectively distribute the pressure. This study evaluated the non-uniform thickness of the product and changes in the process parameters through computer simulations, which could help to improve the compression molding process. PMID:28617315
Defense Waste Processing Facility Simulant Chemical Processing Cell Studies for Sludge Batch 9
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Tara E.; Newell, J. David; Woodham, Wesley H.
The Savannah River National Laboratory (SRNL) received a technical task request from Defense Waste Processing Facility (DWPF) and Saltstone Engineering to perform simulant tests to support the qualification of Sludge Batch 9 (SB9) and to develop the flowsheet for SB9 in the DWPF. These efforts pertained to the DWPF Chemical Process Cell (CPC). CPC experiments were performed using SB9 simulant (SB9A) to qualify SB9 for sludge-only and coupled processing using the nitric-formic flowsheet in the DWPF. Two simulant batches were prepared, one representing SB8 Tank 40H and another representing SB9 Tank 51H. The simulant used for SB9 qualification testing was prepared by blending the SB8 Tank 40H and SB9 Tank 51H simulants. The blended simulant is referred to as SB9A. Eleven CPC experiments were run with an acid stoichiometry ranging between 105% and 145% of the Koopman minimum acid equation (KMA), which is equivalent to 109.7% and 151.5% of the Hsu minimum acid factor. Three runs were performed in the 1L laboratory scale setup, whereas the remainder were in the 4L laboratory scale setup. Sludge Receipt and Adjustment Tank (SRAT) and Slurry Mix Evaporator (SME) cycles were performed on nine of the eleven. The other two were SRAT cycles only. One coupled flowsheet and one extended run were performed for SRAT and SME processing. Samples of the condensate, sludge, and off-gas were taken to monitor the chemistry of the CPC experiments.
Numerical simulation and optimization of casting process for complex pump
NASA Astrophysics Data System (ADS)
Liu, Xueqin; Dong, Anping; Wang, Donghong; Lu, Yanling; Zhu, Guoliang
2017-09-01
The cast pump body has a large, complicated structure and uniform wall thickness, which easily gives rise to casting defects. The numerical simulation software ProCAST was used to simulate the initial top-gating process, after analyzing the material and structural characteristics of the high-pressure pump. The filling process was smooth overall, with no misrun (incomplete filling) phenomenon. However, circular shrinkage defects appeared at the bottom of the casting during the solidification process. The casting parameters were then optimized, and a chill (cold iron) was added at the bottom. The shrinkage weight was reduced from 0.00167 g to 0.0005 g, and the porosity volume from 1.39 cm³ to 0.41 cm³. The optimized scheme was simulated and verified experimentally; the defects were significantly reduced.
Communication Systems Simulation Laboratory (CSSL): Simulation Planning Guide
NASA Technical Reports Server (NTRS)
Schlesinger, Adam
2012-01-01
The simulation process, milestones and inputs are unknowns to first-time users of the CSSL. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.
Systems Engineering Simulator (SES) Simulator Planning Guide
NASA Technical Reports Server (NTRS)
McFarlane, Michael
2011-01-01
The simulation process, milestones and inputs are unknowns to first-time users of the SES. The Simulator Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.
Numerical investigation of coupled density-driven flow and hydrogeochemical processes below playas
NASA Astrophysics Data System (ADS)
Hamann, Enrico; Post, Vincent; Kohfahl, Claus; Prommer, Henning; Simmons, Craig T.
2015-11-01
Numerical modeling approaches with varying complexity were explored to investigate coupled groundwater flow and geochemical processes in saline basins. Long-term model simulations of a playa system gain insights into the complex feedback mechanisms between density-driven flow and the spatiotemporal patterns of precipitating evaporites and evolving brines. Using a reactive multicomponent transport model approach, the simulations reproduced, for the first time in a numerical study, the evaporite precipitation sequences frequently observed in saline basins ("bull's eyes"). Playa-specific flow, evapoconcentration, and chemical divides were found to be the primary controls for the location of evaporites formed, and the resulting brine chemistry. Comparative simulations with the computationally far less demanding surrogate single-species transport models showed that these were still able to replicate the major flow patterns obtained by the more complex reactive transport simulations. However, the simulated degree of salinization was clearly lower than in reactive multicomponent transport simulations. For example, in the late stages of the simulations, when the brine becomes halite-saturated, the nonreactive simulation overestimated the solute mass by almost 20%. The simulations highlight the importance of the consideration of reactive transport processes for understanding and quantifying geochemical patterns, concentrations of individual dissolved solutes, and evaporite evolution.
Vibronic coupling simulations for linear and nonlinear optical processes: Simulation results
NASA Astrophysics Data System (ADS)
Silverstein, Daniel W.; Jensen, Lasse
2012-02-01
A vibronic coupling model based on a time-dependent wavepacket approach is applied to simulate linear optical processes, such as one-photon absorbance and resonance Raman scattering, and nonlinear optical processes, such as two-photon absorbance and resonance hyper-Raman scattering, for a series of small molecules. Simulations employing both the long-range-corrected approach in density functional theory and coupled cluster theory are compared, and also examined against available experimental data. Although many of the small molecules are prone to anharmonicity in their potential energy surfaces, the harmonic approach performs adequately. A detailed discussion of non-Condon effects is illustrated by the molecules presented in this work. Linear and nonlinear Raman scattering simulations allow for the quantification of interference between the Franck-Condon and Herzberg-Teller terms for different molecules.
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions were computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using values for the mean and standard deviation predicted by the method of moments, the distribution parameters were estimated. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
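The kind of experiment described here, simulating a stationary Gaussian process and estimating how often it exceeds a crossing level, can be sketched with a discrete AR(1) surrogate, which has the exponential autocorrelation rho(k) = phi**k. This is an illustrative stand-in; the report's exact autocorrelation functions and estimators are not specified in the abstract.

```python
import math
import random

def simulate_ar1(n, phi, seed=0):
    """Simulate n samples of a stationary Gaussian AR(1) process with
    autocorrelation parameter phi and unit marginal variance."""
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)              # draw from the stationary marginal
    out = [x]
    s = math.sqrt(1.0 - phi * phi)       # innovation scale keeping Var = 1
    for _ in range(n - 1):
        x = phi * x + s * rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def exceedance_fraction(samples, level):
    """Fraction of time the process spends above the crossing level."""
    return sum(v > level for v in samples) / len(samples)

path = simulate_ar1(100_000, phi=0.9)
# For a standard normal marginal, the long-run fraction above 1.0
# should be near P(X > 1) ~ 0.159, regardless of phi.
frac = exceedance_fraction(path, 1.0)
```

Estimating the probability of staying above the level for a given duration, as the report does, would additionally require measuring the lengths of the excursions above the level.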
Development of IR imaging system simulator
NASA Astrophysics Data System (ADS)
Xiang, Xinglang; He, Guojing; Dong, Weike; Dong, Lu
2017-02-01
To overcome the disadvantages of traditional semi-physical simulation and injection simulation equipment in the performance evaluation of the infrared imaging system (IRIS), a low-cost and reconfigurable IRIS simulator, which can simulate the realistic physical process of infrared imaging, is proposed to test and evaluate the performance of the IRIS. According to the theoretical simulation framework and the theoretical models of the IRIS, the architecture of the IRIS simulator is constructed. The 3D scenes are generated and the infrared atmospheric transmission effects are simulated in real time on the computer using OGRE technology. The physical effects of the IRIS are classified as the signal response characteristic, modulation transfer characteristic and noise characteristic, and they are simulated in real time on a single-board signal processing platform based on the core processor FPGA using a high-speed parallel computation method.
Compound simulator IR radiation characteristics test and calibration
NASA Astrophysics Data System (ADS)
Li, Yanhong; Zhang, Li; Li, Fan; Tian, Yi; Yang, Yang; Li, Zhuo; Shi, Rui
2015-10-01
Hardware-in-the-loop simulation can reproduce, in the testing room, the physical radiation of the target/interference and the interception during a product's flight process. In particular, simulating the environment is more difficult for high radiation energy and complicated interference models. Here, a development in IR scene generation produced by a fiber-array imaging transducer with circumferential lamp spot sources is introduced. The IR simulation capability includes effective simulation of aircraft signatures and point-source IR countermeasures. Two point sources acting as interference can move in random two-dimensional directions. To simulate the interference-release process, the radiation and motion characteristics were tested. Through zero calibration of the simulator's optical axis, the radiation can be well projected onto the product detector. The test and calibration results show that the new compound simulator can be used in hardware-in-the-loop simulation trials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, J.; Mowrey, J.
1995-12-01
This report describes the design, development and testing of process controls for selected system operations in the Browns Ferry Nuclear Plant (BFNP) Reactor Water Cleanup System (RWCU) using a Computer Simulation Platform which simulates the RWCU System and the BFNP Integrated Computer System (ICS). This system was designed to demonstrate the feasibility of the soft control (video touch screen) of nuclear plant systems through an operator console. The BFNP Integrated Computer System, which has recently been installed at BFNP Unit 2, was simulated to allow for operator control functions of the modeled RWCU system. The BFNP Unit 2 RWCU system was simulated using the RELAP5 Thermal/Hydraulic Simulation Model, which provided the steady-state and transient RWCU process variables and simulated the response of the system to control system inputs. Descriptions of the hardware and software developed are also included in this report. The testing and acceptance program and results are also detailed in this report. A discussion of potential installation of an actual RWCU process control system in BFNP Unit 2 is included. Finally, this report contains a section on industry issues associated with installation of process control systems in nuclear power plants.
NASA Astrophysics Data System (ADS)
Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang
2010-11-01
This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to adapt to multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a satellite-mounted TDI-CCD contains four processes: 1) atmosphere-induced degradation, 2) optical-system-induced degradation, 3) electronic-system-induced degradation of the TDI-CCD and the re-sampling process, 4) data integration. Processes 1) to 3) utilize diverse data-intensive algorithms such as FFT, convolution and Lagrange interpolation, which require powerful CPUs. Even using an Intel Xeon X5550 processor, the regular serial processing method takes more than 30 hours for a simulation whose resulting image size is 1500 × 1462. A literature study found no mature distributed HPC framework in this field. Here we developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF[1], which uses a Client/Server (C/S) layer and invokes free CPU resources in the LAN. The server pushes the tasks of processes 1) to 3) to that free computing capacity, ultimately delivering HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, this framework reduced simulation time by about 74%; adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can provide virtually unlimited computation capacity, provided that the network and task-management server are affordable, and it offers a new HPC solution for TDI-CCD imaging simulation and similar applications.
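The server-pushes-tasks architecture described above can be sketched generically: split the image into independent tiles, run the three degradation stages as a pipeline on each tile, and farm the tiles out to workers. This is a local thread-pool stand-in for the paper's WCF-based LAN framework; the stage functions and tile structure are invented placeholders, not the authors' algorithms.

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder degradation stages (the real ones are FFT / convolution /
# interpolation kernels); here each stage just tags the tile it processed.
def atmosphere(tile):
    return ("atm", tile)

def optics(stage1):
    return ("opt", stage1)

def electronics(stage2):
    return ("ele", stage2)

def simulate_tile(tile):
    # Processes 1) to 3) run as a pipeline on one independent image tile.
    return electronics(optics(atmosphere(tile)))

def integrate(tiles, workers=4):
    # Process 4): dispatch tiles to free workers and gather results in order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_tile, tiles))
```

In the real framework the pool of workers is a set of idle machines on the LAN rather than local threads, but the task-pushing and result-integration structure is the same.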
NASA Technical Reports Server (NTRS)
Tao, W.-K.; Lau, W.; Baker, R.
2004-01-01
The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured basic signatures of the monsoon onset processes and associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall results than that of a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level wind field. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and associated wind simulation. The model results will be compared to the simulation of the 6-7 May 2000 Missouri flash flood event. In addition, the impact of model initialization and land surface treatment on timing, intensity, and location of extreme precipitation will be examined.
Development and training of a learning expert system in an autonomous mobile robot via simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spelt, P.F.; Lyness, E.; DeSaussure, G.
1989-11-01
The Center for Engineering Systems Advanced Research (CESAR) conducts basic research in the area of intelligent machines. Recently at CESAR a learning expert system was created to operate on board an autonomous robot working at a process control panel. The authors discuss the two-computer simulation system used to create, evaluate and train this learning system. The simulation system has a graphics display of the current status of the process being simulated, and the same program which does the simulating also drives the actual control panel. Simulation results were validated on the actual robot. The speed and safety advantages of using a computerized simulator to train a learning computer, and future uses of the simulation system, are discussed.
Hybrid neuro-heuristic methodology for simulation and control of dynamic systems over time interval.
Woźniak, Marcin; Połap, Dawid
2017-09-01
Simulation and positioning are very important aspects of computer-aided engineering. Both can be handled with traditional methods or with intelligent techniques; the difference lies in the way they process information. In the first case, to simulate an object in a particular state of action, we need to perform an entire process to read the parameter values, which is inconvenient for objects whose simulation takes a long time, i.e. when the mathematical calculations are complicated. In the second case, an intelligent solution can efficiently support a dedicated mode of simulation, which enables us to simulate the object only in the situations necessary for the development process. We present research results on a developed intelligent simulation and control model of an electric-drive engine vehicle. For a dedicated simulation method based on intelligent computation, where an evolutionary strategy simulates the states of the dynamic model, an intelligent system based on a devoted neural network is introduced to control co-working modules during motion over a time interval. The presented experimental results show the implemented solution in a situation where a vehicle transports goods over an area with many obstacles, which provokes sudden changes in stability that may lead to destruction of the load. The applied neural network controller therefore prevents the load from destruction by positioning characteristics like pressure, acceleration, and stiffness voltage to absorb the adverse changes of the ground. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Sepehrnoori, K.
1994-09-01
The objective of this research is to develop cost-effective surfactant flooding technology by using surfactant simulation studies to evaluate and optimize alternative design strategies taking into account reservoir characteristics, process chemistry, and process design options such as horizontal wells. Task 1 is the development of an improved numerical method for our simulator that will enable us to solve a wider class of these difficult simulation problems accurately and affordably. Task 2 is the application of this simulator to the optimization of surfactant flooding to reduce its risk and cost. The goal of Task 2 is to understand and generalize the impact of both process and reservoir characteristics on the optimal design of surfactant flooding. We have studied the effect of process parameters such as salinity gradient, surfactant adsorption, surfactant concentration, surfactant slug size, pH, polymer concentration and well constraints on surfactant floods. In this report, we show three-dimensional field-scale simulation results to illustrate the impact of one important design parameter, the salinity gradient. Although the use of a salinity gradient to improve the efficiency and robustness of surfactant flooding has been studied and applied for many years, this is the first time that we have evaluated it using stochastic simulations rather than simulations using the traditional layered reservoir description. The surfactant flooding simulations were performed using The University of Texas chemical flooding simulator called UTCHEM.
A finite element simulation of biological conversion processes in landfills.
Robeck, M; Ricken, T; Widmann, R
2011-04-01
Landfills are the most common way of waste disposal worldwide. Biological processes convert the organic material into an environmentally harmful landfill gas, which has an impact on the greenhouse effect. After the depositing of waste has stopped, the conversion processes continue and emissions last for several decades, even up to 100 years and longer. A good prediction of these processes is of high importance for landfill operators as well as for authorities, but suitable models for a realistic description of landfill processes are scarce. In order to take the strongly coupled conversion processes into account, a constitutive three-dimensional model based on the multiphase Theory of Porous Media (TPM) has been developed at the University of Duisburg-Essen. The theoretical formulations are implemented in the finite element code FEAP. With the presented calculation concept we are able to simulate the coupled processes that occur in an actual landfill. The model's theoretical background and the results of the simulations, as well as a successfully performed simulation of a real landfill body, are shown in the following. Copyright © 2010 Elsevier Ltd. All rights reserved.
Visualizing human communication in business process simulations
NASA Astrophysics Data System (ADS)
Groehn, Matti; Jalkanen, Janne; Haho, Paeivi; Nieminen, Marko; Smeds, Riitta
1999-03-01
In this paper a description of business process simulation is given. A crucial part of simulating business processes is the analysis of social contacts between the participants. We introduce a tool for collecting log data and show how this log data can be effectively analyzed using two different kinds of methods: discussion flow charts and self-organizing maps. Discussion flow charts revealed the communication patterns, and self-organizing maps proved a very effective way of clustering the participants into development groups.
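The clustering step described above can be sketched with a minimal self-organizing map. This is an illustrative implementation of the standard SOM algorithm, not the authors' tool: the grid size, decay schedules, and feature vectors are arbitrary choices made for the sketch.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map; return its (h, w, dim) weight grid."""
    rng = np.random.default_rng(seed)
    h, w = grid
    dim = data.shape[1]
    weights = rng.normal(size=(h, w, dim))
    # (h, w, 2) array of each cell's grid coordinates
    coords = np.dstack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"))
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighborhood radius
        for x in data[rng.permutation(len(data))]:
            # best-matching unit: cell whose weight vector is closest to x
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # pull every cell toward x, weighted by grid distance to the BMU
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    """Grid coordinates of the best-matching unit for sample x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)
```

After training, participants whose feature vectors map to the same (or neighboring) grid cells form a candidate development group.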
DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION
The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...
ERIC Educational Resources Information Center
Peng, Jacob; Abdullah, Ira
2018-01-01
The emphases of student involvement and meaningful engagement in the learner-centered education model have created a new paradigm in an effort to generate a more engaging learning environment. This study examines the success of using different simulation platforms in creating a market simulation to teach business processes in the accounting…
Artistic understanding as embodied simulation.
Gibbs, Raymond W
2013-04-01
Bullot & Reber (B&R) correctly include historical perspectives into the scientific study of art appreciation. But artistic understanding always emerges from embodied simulation processes that incorporate the ongoing dynamics of brains, bodies, and world interactions. There may not be separate modes of artistic understanding, but a continuum of processes that provide imaginative simulations of the artworks we see or hear.
Optimal segmentation and packaging process
Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.
1999-08-10
A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D and D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items are determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are then segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded. 3 figs.
Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.
Zuckerman, Daniel M; Chong, Lillian T
2017-05-22
The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling: the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes: protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.
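The splitting-and-merging idea behind weighted ensemble can be sketched in a few lines. This toy resampling step is an illustration, not the implementation in any real WE package: `bin_of` and `target` stand in for whatever binning scheme and per-bin walker count a real setup would use. The key invariant is that total statistical weight is conserved exactly.

```python
import random

def we_resample(walkers, bin_of, target=4):
    """One weighted-ensemble resampling step.

    walkers: list of (state, weight) pairs; bin_of(state) -> bin index.
    Returns a new walker list with `target` walkers per occupied bin
    and total weight exactly conserved.
    """
    bins = {}
    for state, w in walkers:
        bins.setdefault(bin_of(state), []).append((state, w))
    out = []
    for members in bins.values():
        # split: replicate until the bin holds `target` walkers,
        # always splitting the heaviest walker into two half-weight copies
        while len(members) < target:
            members.sort(key=lambda sw: sw[1])
            s, w = members.pop()
            members += [(s, w / 2), (s, w / 2)]
        # merge: combine the two lightest walkers, keeping one of the two
        # states with probability proportional to its weight
        while len(members) > target:
            members.sort(key=lambda sw: sw[1])
            (s1, w1), (s2, w2) = members.pop(0), members.pop(0)
            keep = s1 if random.random() < w1 / (w1 + w2) else s2
            members.append((keep, w1 + w2))
        out.extend(members)
    return out
```

Between resampling steps, each walker would be advanced independently by the underlying dynamics engine; rate constants are then read off from the weight flux into target bins.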
Kumar, Sameer
2011-01-01
It is increasingly recognized that hospital operation is an intricate system with limited resources and many interacting sources of both positive and negative feedback. The purpose of this study is to design a surgical delivery process in a county hospital in the U.S. where patient flow through a surgical ward is optimized. System simulation modeling is used to address questions of capacity planning, throughput management, and interacting resources, which constitute the constantly changing complexity that characterizes designing a contemporary surgical delivery process in a hospital. The steps in building a system simulation model are demonstrated using the example of building a county hospital in a small city in the U.S. The example illustrates modular system simulation modeling of patient surgery process flows. The system simulation model will show planners and designers how they can build overall efficiencies into a healthcare facility through optimal bed capacity for peak patient flow of emergency and routine patients.
Manufacturing Process Simulation of Large-Scale Cryotanks
NASA Technical Reports Server (NTRS)
Babai, Majid; Phillips, Steven; Griffin, Brian
2003-01-01
NASA's Space Launch Initiative (SLI) is an effort to research and develop the technologies needed to build a second-generation reusable launch vehicle. It is required that this new launch vehicle be 100 times safer and 10 times cheaper to operate than current launch vehicles. Part of the SLI includes the development of reusable composite and metallic cryotanks. The size of these reusable tanks is far greater than anything ever developed and exceeds the design limits of current manufacturing tools. Several design and manufacturing approaches have been formulated, but many factors must be weighed during the selection process. Among these factors are tooling reachability, cycle times, feasibility, and facility impacts. The manufacturing process simulation capabilities available at NASA's Marshall Space Flight Center have played a key role in down-selecting between the various manufacturing approaches. By creating 3-D manufacturing process simulations, the various approaches can be analyzed in a virtual world before any hardware or infrastructure is built. This analysis can detect and eliminate costly flaws in the various manufacturing approaches. The simulations check for collisions between devices, verify that design limits on joints are not exceeded, and provide cycle times which aid in the development of an optimized process flow. In addition, new ideas and concerns are often raised after seeing the visual representation of a manufacturing process flow. The output of the manufacturing process simulations allows cost and safety comparisons to be performed between the various manufacturing approaches. This output helps determine which manufacturing process options reach the safety and cost goals of the SLI. As part of the SLI, The Boeing Company was awarded a basic period contract to research and propose options for both a metallic and a composite cryotank.
Boeing then entered into a task agreement with the Marshall Space Flight Center to provide manufacturing simulation support. This paper highlights the accomplishments of this task agreement, while also introducing the capabilities of simulation software.
Payload crew training complex simulation engineer's handbook
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1984-01-01
The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.
Simulant Basis for the Standard High Solids Vessel Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Reid A.; Fiskum, Sandra K.; Suffield, Sarah R.
The Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant and a non-Newtonian simulant be developed that would represent the Most Adverse Design Conditions (in development) with respect to mixing performance as specified by WTP. The majority of the simulant requirements are specified in 24590-PTF-RPT-PE-16-001, Rev. 0. The first step in this process is to develop the basis for these simulants. This document describes the basis for the properties of these two simulant types. The simulant recipes that meet this basis will be provided in a subsequent document.
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of the numerical simulation are therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.
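One classical way to estimate the numerical approximation error mentioned above is Richardson extrapolation: compare solutions computed at two grid spacings and use the scheme's formal order of accuracy. A minimal sketch of that idea (not the more general methodology the authors propose):

```python
import math

def richardson_error_estimate(f_h, f_h2, p):
    """Estimate the discretization error of the fine-grid value f_h from
    a coarse value f_h2 computed with the step doubled, for a scheme of
    formal order p:  error(f_h) ~ (f_h - f_h2) / (2**p - 1)."""
    return (f_h - f_h2) / (2 ** p - 1)

def central_diff(f, x, h):
    """Second-order (p = 2) central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)
```

For example, applying `central_diff` to `math.sin` at two step sizes and feeding both values to `richardson_error_estimate` with `p=2` recovers the fine-grid error to leading order.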
NASA Astrophysics Data System (ADS)
Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.
2017-01-01
The article describes a method for the simulation of transient combustion processes in a rocket engine operating on gaseous propellants: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. The reaction mechanisms were taken from several sources and verified. The method for converting ozone properties from the Shomate equation to the NASA-polynomial format is described in detail. A way of obtaining quick CFD results with intermediate combustion components using an EDM model was found. Modeling difficulties with the Finite Rate Chemistry combustion model, associated with a large scatter of reference data, were identified and described. The way to generate the Flamelet library with CFX-RIF is described. The reaction mechanisms formulated and verified at steady state were also tested in transient simulation. The Flamelet combustion model was found to be adequate for the transient mode, with integral parameters close to the values obtained in the stationary simulation. A cyclic irregularity of the temperature field, caused by precession of the vortex core, was detected in the chamber with the proposed simulation technique. The investigation of unsteady rocket engine processes, including ignition, is proposed as the area of application for the described simulation technique.
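The Shomate-to-NASA-polynomial conversion mentioned above can be illustrated by least-squares fitting the five Cp coefficients of a NASA-7 polynomial to heat capacities generated from the Shomate form. This is a sketch of the general idea only, not the authors' procedure (a full conversion must also match the enthalpy and entropy terms), and the Shomate coefficients used below are illustrative numbers, not NIST data for ozone.

```python
import numpy as np

R = 8.314462618  # universal gas constant, J/(mol K)

def shomate_cp(T, A, B, C, D, E):
    """Heat capacity Cp(T) from the Shomate equation (t = T/1000)."""
    t = T / 1000.0
    return A + B * t + C * t**2 + D * t**3 + E / t**2

def fit_nasa7_cp(T, cp):
    """Least-squares fit of the five Cp coefficients of a NASA-7
    polynomial:  Cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4."""
    M = np.vander(T, 5, increasing=True)  # columns 1, T, T^2, T^3, T^4
    a, *_ = np.linalg.lstsq(M, cp / R, rcond=None)
    return a

def nasa7_cp(T, a):
    """Evaluate Cp(T) from NASA-7 coefficients a1..a5."""
    return R * sum(ai * T**i for i, ai in enumerate(a))
```

Fitting over the temperature range of interest and checking the residual indicates whether a single NASA-7 range suffices or the range must be split, as is common practice for thermodynamic databases.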
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2005-07-01
The goal of this research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables, and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 covers the validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.
Modeling and FE Simulation of Quenchable High Strength Steels Sheet Metal Hot Forming Process
NASA Astrophysics Data System (ADS)
Liu, Hongsheng; Bao, Jun; Xing, Zhongwen; Zhang, Dejin; Song, Baoyu; Lei, Chengxi
2011-08-01
The high strength steel (HSS) sheet metal hot forming process is investigated by means of numerical simulations. For a reliable numerical process design, knowledge of the thermal and thermo-mechanical properties is essential. In this article, tensile tests are performed to examine the flow stress of the HSS material 22MnB5 at different strains, strain rates, and temperatures. A constitutive model based on a phenomenological approach is developed to describe the thermo-mechanical properties of 22MnB5 by fitting the experimental data. A 2D coupled thermo-mechanical finite element (FE) model is developed to simulate the HSS sheet metal hot forming process for a U-channel part. The ABAQUS/Explicit model is used to conduct the hot forming stage simulations, and the ABAQUS/Implicit model is used for accurately predicting the springback which occurs at the end of the hot forming stage. Material modeling and FE numerical simulations are carried out to investigate the effect of the processing parameters on the hot forming process. The processing parameters have a significant influence on the microstructure of the U-channel part. The springback after the hot forming stage is the main factor impairing the shape precision of the hot-formed part. A mechanism of springback is proposed and verified through numerical simulations and tensile loading-unloading tests. Creep strain is found in the tensile loading-unloading test under isothermal conditions and has a distinct effect on springback. According to the numerical and experimental results, it can be concluded that springback is mainly caused by different cooling rates and the nonhomogeneous shrinkage of material during the hot forming process, with creep strain being the main factor influencing the amount of springback.
Improving surgeon utilization in an orthopedic department using simulation modeling
Simwita, Yusta W; Helgheim, Berit I
2016-01-01
Purpose Worldwide more than two billion people lack appropriate access to surgical services due to the mismatch between existing human resources and patient demands. Improving utilization of the existing workforce capacity can reduce the gap between surgical demand and available workforce capacity. In this paper, the authors use discrete event simulation to explore the care process at an orthopedic department. Our main focus is improving the utilization of surgeons while minimizing patient wait time. Methods The authors collaborated with orthopedic department personnel to map the current operations of the orthopedic care process in order to identify factors that influence poor surgeon utilization and high patient waiting time. The authors used an observational approach to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors developed a proposal scenario to show how to improve surgeon utilization. Results The simulation results showed that if ancillary services could be performed before the start of clinic examination services, the orthopedic care process could be greatly improved, that is, surgeon utilization improved and patient waiting time reduced. Simulation results demonstrate that with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding the current waiting time at this clinic, thus improving patient access to health care services. Conclusion This study shows how simulation modeling can be used to improve health care processes. This study was limited to a single care process; however, the findings can be applied to improve other orthopedic care processes with similar operational characteristics. PMID:29355193
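The core of a discrete event model like the one described can be sketched as a single-server queue: patients arrive, wait for the surgeon, and are served first-in-first-out. This toy is not the authors' model (which covers ancillary services and multiple interacting resources); arrival and service times are simply supplied by the caller.

```python
def simulate_clinic(arrivals, service_times):
    """Single-surgeon FIFO sketch.

    arrivals: sorted arrival times; service_times: matching durations.
    Returns (surgeon_utilization, mean_patient_wait), with the clock
    running from t = 0 to the last patient's departure.
    """
    waits, busy_until, busy_time = [], 0.0, 0.0
    for arrive, service in zip(arrivals, service_times):
        start = max(arrive, busy_until)   # wait if the surgeon is busy
        waits.append(start - arrive)
        busy_until = start + service
        busy_time += service
    return busy_time / busy_until, sum(waits) / len(waits)
```

Experiments like the paper's "perform ancillary services earlier" scenario then amount to shifting the arrival times and comparing the two output metrics.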
Analysis of large-scale tablet coating: Modeling, simulation and experiments.
Boehling, P; Toschkoff, G; Knop, K; Kleinebudde, P; Just, S; Funke, A; Rehbaum, H; Khinast, J G
2016-07-30
This work concerns a tablet coating process in an industrial-scale drum coater. We set up a full-scale Design of Simulation Experiment (DoSE) using the Discrete Element Method (DEM) to investigate the influence of various process parameters (the spray rate, the number of nozzles, the rotation rate, and the drum load) on the coefficient of inter-tablet coating variation (cv,inter). The coater was filled with up to 290 kg of material, which is equivalent to 1,028,369 tablets. To mimic the tablet shape, the glued-sphere approach was followed, and each modeled tablet consisted of eight spheres. We simulated the process via the eXtended Particle System (XPS), proving that it is possible to accurately simulate the tablet coating process on the industrial scale. The process time required to reach a uniform tablet coating was extrapolated based on the simulated data and was in good agreement with experimental results. The results are provided at various levels of detail, from a thorough investigation of the influence that the process parameters have on cv,inter and the number of tablets that visit the spray zone during the simulated 90 s, to the velocity in the spray zone and the spray and bed cycle times. It was found that increasing the number of nozzles and decreasing the spray rate had the largest influence on cv,inter. Although increasing the drum load and the rotation rate increased the tablet velocity, they did not have a relevant influence on cv,inter or the process time. Copyright © 2015 Elsevier B.V. All rights reserved.
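The response variable of the study, the coefficient of inter-tablet coating variation, is the relative standard deviation of coating mass across tablets. A minimal sketch, assuming the sample standard deviation (the paper may normalize differently):

```python
import statistics

def cv_inter(coating_masses):
    """Coefficient of inter-tablet coating variation: sample standard
    deviation of per-tablet coating mass divided by the batch mean."""
    mean = statistics.fmean(coating_masses)
    return statistics.stdev(coating_masses) / mean
```

A perfectly uniform batch gives cv,inter = 0; a wider spread of per-tablet coating mass drives it up, which is why more nozzles (wider spray coverage per pass) lower it.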
Dynamic Biological Functioning Important for Simulating and Stabilizing Ocean Biogeochemistry
NASA Astrophysics Data System (ADS)
Buchanan, P. J.; Matear, R. J.; Chase, Z.; Phipps, S. J.; Bindoff, N. L.
2018-04-01
The biogeochemistry of the ocean exerts a strong influence on the climate by modulating atmospheric greenhouse gases. In turn, ocean biogeochemistry depends on numerous physical and biological processes that change over space and time. Accurately simulating these processes is fundamental for accurately simulating the ocean's role within the climate. However, our simulation of these processes is often simplistic, despite a growing understanding of underlying biological dynamics. Here we explore how new parameterizations of biological processes affect simulated biogeochemical properties in a global ocean model. We combine 6 different physical realizations with 6 different biogeochemical parameterizations (36 unique ocean states). The biogeochemical parameterizations, all previously published, aim to more accurately represent the response of ocean biology to changing physical conditions. We make three major findings. First, oxygen, carbon, alkalinity, and phosphate fields are more sensitive to changes in the ocean's physical state. Only nitrate is more sensitive to changes in biological processes, and we suggest that assessment protocols for ocean biogeochemical models formally include the marine nitrogen cycle to assess their performance. Second, we show that dynamic variations in the production, remineralization, and stoichiometry of organic matter in response to changing environmental conditions benefit the simulation of ocean biogeochemistry. Third, dynamic biological functioning reduces the sensitivity of biogeochemical properties to physical change. Carbon and nitrogen inventories were 50% and 20% less sensitive to physical changes, respectively, in simulations that incorporated dynamic biological functioning. These results highlight the importance of a dynamic biology for ocean properties and climate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Provost, G.; Zitney, S.; Turton, R.
2009-01-01
To meet increasing demand for education and experience with commercial-scale, coal-fired, integrated gasification combined cycle (IGCC) plants with CO2 capture, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) is leading a project to deploy a generic, full-scope, real-time IGCC dynamic plant simulator for use in establishing a world-class research and training center, and to promote and demonstrate IGCC technology to power industry personnel. The simulator, being built by Invensys Process Systems (IPS), will be installed at two separate sites, at NETL and West Virginia University (WVU), and will combine a process/gasification simulator with a power/combined-cycle simulator in a single dynamic simulation framework for use in engineering research studies and training applications. The simulator, scheduled to be launched in mid-year 2010, will have the following capabilities: a high-fidelity, dynamic model of the process side (gasification and gas cleaning with CO2 capture) and the power-block side (combined cycle) for a generic IGCC plant fueled by coal and/or petroleum coke; a highly flexible configuration that allows concurrent training on separate gasification and combined cycle simulators, or on up to two IGCC simulators; the ability to enhance and modify the plant model to facilitate studies of changes in plant configuration, equipment, and control strategies to support future R&D efforts; and training capabilities including startup, shutdown, load following and shedding, response to fuel and ambient condition variations, control strategy analysis (turbine vs. gasifier lead, etc.), representative malfunctions/trips, alarms, scenarios, trending, snapshots, a data historian, etc. To support this effort, process descriptions and control strategies were developed for key sections of the plant as part of the detailed functional specification, which is serving as the basis of the simulator development.
In this paper, we highlight the contents of the detailed functional specification for the simulator. We also describe the engineering, design, and expert testing process that the simulator will undergo in order to ensure that maximum fidelity is built into the generic simulator. Future applications and training programs associated with gasification, combined cycle, and IGCC simulations are discussed, including plant operation and control demonstrations, as well as education and training services.
Katsuo, Shigeharu; Langel, Christian; Sandré, Anne-Laure; Mazzotti, Marco
2011-12-30
One of the modified simulated moving bed (SMB) processes, the intermittent SMB (I-SMB) process, has recently been analyzed theoretically [1], and its superior performance compared to the conventional SMB process has been demonstrated at a rather low total feed concentration through experiments and simulations [2]. This work shows that the I-SMB process outperforms the conventional SMB process also at high feed concentration, where the species are clearly subject to a nonlinear adsorption isotherm. For the separation of the enantiomers of Tröger's base in ethanol on ChiralPak AD, the two processes operated in a six-column 1-2-2-1 configuration (one column in sections 1 and 4 and two columns in sections 2 and 3) and in a four-column 1-1-1-1 configuration (one column in each section) are compared at high feed concentration through both experiments and simulations. Even under nonlinear conditions the four-column I-SMB process can successfully separate the two enantiomers, achieving purity levels as high as those of the two six-column processes and exhibiting better productivity. Copyright © 2011 Elsevier B.V. All rights reserved.
The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing
NASA Technical Reports Server (NTRS)
Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.
2010-01-01
The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.
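To give a flavor of what simulating transit data involves (ETEM itself models spacecraft and pixel-level effects far beyond this), a toy box-shaped transit model dims a star's flux periodically; all parameter names here are illustrative, not ETEM's interface.

```python
import numpy as np

def box_transit(times, period, t0, duration, depth):
    """Toy box-shaped transit model: unit flux that dips by `depth`
    whenever the phase-folded time falls inside the transit window
    centered on t0 with the given period and duration."""
    phase = ((times - t0 + period / 2) % period) - period / 2
    flux = np.ones_like(times)
    flux[np.abs(phase) < duration / 2] -= depth
    return flux
```

Adding photometric noise to such a series and checking whether a detection pipeline recovers the injected period and depth is the basic pattern behind end-to-end testing with simulated data.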
Kim, Youngmi; Mosier, Nathan; Ladisch, Michael R
2008-08-01
Distillers' grains (DG), a co-product of the dry grind ethanol process, are an excellent source of supplemental proteins in livestock feed. Studies have shown that, due to their high polymeric sugar content and ease of hydrolysis, distillers' grains have potential as an additional source of fermentable sugars for ethanol fermentation. The benefit of processing the distillers' grains to extract fermentable sugars lies in an increased ethanol yield without significant modification of the current dry grind technology. Three different potential configurations of process alternatives, in which pretreated and hydrolyzed distillers' grains are recycled for an enhanced overall ethanol yield, are proposed and discussed in this paper based on the liquid hot water (LHW) pretreatment of distillers' grains. Possible limitations of each proposed process are also discussed. This paper presents a compositional analysis of distillers' grains, as well as a simulation of the modified dry grind processes with recycle of distillers' grains. Simulated material balances for the modified dry grind processes are established based on the base case assumptions. These balances are compared to the conventional dry grind process in terms of ethanol yield, composition of the co-products, and accumulation of fermentation inhibitors. Results show that a 14% higher ethanol yield is achievable by processing and hydrolyzing the distillers' grains for additional fermentable sugars, as compared to the conventional dry grind process. Accumulation of fermentation by-products and inhibitory components in the proposed process is predicted to be 2-5 times higher than in the conventional dry grind process. The impact of fermentation inhibitors is reviewed and discussed. The final eDDGS (enhanced dried distillers' grains) from the modified processes has 30-40% greater protein content per mass than DDGS, and its potential as a value-added co-product is also analyzed.
While the case studies used to illustrate the process simulation are based on LHW pretreated DG, the process simulation itself provides a framework for evaluation of the impact of other pretreatments.
Sub-half-micron contact window design with 3D photolithography simulator
NASA Astrophysics Data System (ADS)
Brainerd, Steve K.; Bernard, Douglas A.; Rey, Juan C.; Li, Jiangwei; Granik, Yuri; Boksha, Victor V.
1997-07-01
In state-of-the-art IC design and manufacturing, certain lithography layers have unique requirements. Latitudes and tolerances that apply to contacts and polysilicon gates are tight for such critical layers. Industry experts are discussing the most cost-effective ways to use feature-oriented equipment and materials already developed for these layers. Such requirements introduce new dimensions into the traditionally challenging task of the photolithography engineer, who must consider various combinations of multiple factors to optimize and control the process. In addition, he or she faces a rapidly increasing cost of experiments, limited time, and scarce access to equipment to conduct them. All the reasons presented above support simulation as an ideal method to satisfy these demands. However, lithography engineers may be easily dissatisfied with a simulation tool upon discovering disagreement between the simulation and experimental data. The problem is that several parameters used in photolithography simulation are very process specific. Calibration, i.e., matching experimental and simulation data using a specific set of procedures, allows one to use the simulation tool effectively. We present results of a simulation-based approach to optimize photolithography processes for sub-0.5 micron contact windows. Our approach consists of: (1) 3D simulation to explore different lithographic options, and (2) calibration to a range of process conditions with extensive use of specifically developed optimization techniques. The choice of a 3D simulator is essential because of the 3D nature of the contact window design problem. We use DEPICT 4.1. This program performs fast aerial image simulation as presented before. For 3D exposure the program uses an extension to three dimensions of the high numerical aperture model combined with Fast Fourier Transforms for maximum performance and accuracy.
We use the Kim (U.C. Berkeley) model and the fast marching Level Set method, respectively, for the calculation of resist development rates and resist surface movement during the development process. Calibration efforts were aimed at matching experimental results on contact windows obtained after exposure of a binary mask. Additionally, simulation was applied to conduct quantitative analysis of PSM design capabilities, optical proximity correction, and stepper parameter optimization. Extensive experiments covered exposure (ASML 5500/100D stepper), pre- and post-exposure bake, and development (2.38% TMAH, puddle process) of JSR IX725D2G and TOK iP3500 photoresist films on 200 mm test wafers. 'Aquatar' was used as the top antireflective coating. SEM pictures of developed patterns were analyzed and compared with simulation results for different values of defocus, exposure energy, numerical aperture, and partial coherence.
Development of Partial Discharging Simulation Test Equipment
NASA Astrophysics Data System (ADS)
Kai, Xue; Genghua, Liu; Yan, Jia; Ziqi, Chai; Jian, Lu
2017-12-01
In partial discharge training, recruits who lack on-site work experience risk electric shock and damage to the test equipment because of their limited skill level and improper operation. The partial discharge simulation tester uses simulation technology to reproduce the partial discharge test process and its results realistically, so that operators can become familiar with the test procedure and equipment in the classroom. The teacher configures the instrument to display different partial discharge waveforms so that the trainees can analyze the test results of different partial discharge types.
Physics-based interactive volume manipulation for sharing surgical process.
Nakao, Megumi; Minato, Kotaro
2010-05-01
This paper presents a new set of techniques by which surgeons can interactively manipulate patient-specific volumetric models for sharing surgical process. To handle physical interaction between the surgical tools and organs, we propose a simple surface-constraint-based manipulation algorithm to consistently simulate common surgical manipulations such as grasping, holding and retraction. Our computation model is capable of simulating soft-tissue deformation and incision in real time. We also present visualization techniques in order to rapidly visualize time-varying, volumetric information on the deformed image. This paper demonstrates the success of the proposed methods in enabling the simulation of surgical processes, and the ways in which this simulation facilitates preoperative planning and rehearsal.
Process Simulation of Gas Metal Arc Welding Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murray, Paul E.
2005-09-06
ARCWELDER is a Windows-based application that simulates gas metal arc welding (GMAW) of steel and aluminum. The software simulates the welding process in an accurate and efficient manner, provides menu items for process parameter selection, and includes a graphical user interface with the option to animate the process. The user enters the base and electrode material, open circuit voltage, wire diameter, wire feed speed, welding speed, and standoff distance. The program computes the size and shape of a square-groove or V-groove weld in the flat position. The program also computes the current, arc voltage, arc length, electrode extension, transfer of droplets, heat input, filler metal deposition, base metal dilution, and centerline cooling rate, in English or SI units. The simulation may be used to select welding parameters that lead to desired operation conditions.
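The heat input quantity listed among the computed outputs is conventionally obtained from the entered voltage, current, and welding speed. A minimal sketch of that standard relation follows; the function name and the 0.8 arc efficiency are illustrative assumptions, not ARCWELDER's internals.

```python
def gmaw_heat_input(voltage_V, current_A, travel_speed_mm_s, efficiency=0.8):
    """Arc heat input per unit length of weld, in J/mm.

    Standard welding relation: HI = eta * V * I / v, where eta is the
    arc efficiency (0.8 is a typical GMAW value, assumed here).
    """
    return efficiency * voltage_V * current_A / travel_speed_mm_s

# 24 V, 200 A arc traveling at 8 mm/s deposits 0.8*24*200/8 = 480 J/mm
hi = gmaw_heat_input(24.0, 200.0, 8.0)
```

Slower travel speed or higher current raises the heat input, which in turn drives the weld size and centerline cooling rate the program reports.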
The Use of Particle/Substrate Material Models in Simulation of Cold-Gas Dynamic-Spray Process
NASA Astrophysics Data System (ADS)
Rahmati, Saeed; Ghaei, Abbas
2014-02-01
Cold spray is a coating deposition method in which solid particles are accelerated toward the substrate by a low-temperature supersonic gas flow. Many numerical studies have been carried out in the literature in order to study this process in more depth. Despite the inability of the Johnson-Cook plasticity model to predict material behavior at high strain rates, it is the model most frequently used in simulations of cold spray. Therefore, this research was devoted to comparing the performance of different material models in the simulation of the cold spray process. Six different material models, appropriate for high strain-rate plasticity, were employed in finite element simulations of the cold spray process for copper. The results showed that the material model had a considerable effect on the predicted deformed shapes.
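For reference, the Johnson-Cook flow stress discussed above has the well-known form sigma = (A + B*eps^n)(1 + C*ln(epsdot/epsdot0))(1 - T*^m). A sketch with illustrative copper-like constants (not the calibrated parameter sets compared in the paper):

```python
import math

def johnson_cook_stress(strain, strain_rate, T,
                        A=90e6, B=292e6, n=0.31, C=0.025, m=1.09,
                        eps0=1.0, T_ref=298.0, T_melt=1356.0):
    """Johnson-Cook flow stress in Pa.

    strain: equivalent plastic strain; strain_rate: 1/s; T: K.
    A, B, n, C, m are illustrative copper values; T* is the
    homologous temperature (T - T_ref)/(T_melt - T_ref).
    """
    hardening = A + B * strain ** n
    rate = 1.0 + C * math.log(max(strain_rate, eps0) / eps0)
    T_star = (T - T_ref) / (T_melt - T_ref)
    thermal = 1.0 - max(T_star, 0.0) ** m
    return hardening * rate * thermal

# Stress rises with strain rate and softens toward the melting point
low = johnson_cook_stress(0.2, 1e3, 400.0)
high = johnson_cook_stress(0.2, 1e7, 400.0)
hot = johnson_cook_stress(0.2, 1e7, 1200.0)
```

The weak logarithmic rate term is precisely why the model struggles at cold-spray impact rates (10^8-10^9 1/s), motivating the comparison with other high-rate models.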
Lattice Boltzmann simulations of immiscible displacement process with large viscosity ratios
NASA Astrophysics Data System (ADS)
Rao, Parthib; Schaefer, Laura
2017-11-01
Immiscible displacement is a key physical mechanism involved in enhanced oil recovery and carbon sequestration processes. This multiphase flow phenomenon involves a complex interplay of viscous, capillary, inertial, and wettability effects. The lattice Boltzmann (LB) method is an accurate and efficient technique for modeling and simulating multiphase/multicomponent flows, especially in complex flow configurations and media. In this presentation we report numerical simulation results for the displacement process in thin, long channels. The results are based on a new pseudo-potential multicomponent LB model with a multiple-relaxation-time (MRT) collision model and an explicit forcing scheme. We demonstrate that the proposed model is capable of accurately simulating displacement processes involving fluids with a wide range of viscosity ratios (>100), which also leads to viscosity-independent interfacial tension and a reduction of some important numerical artifacts.
Rapid Automated Aircraft Simulation Model Updating from Flight Data
NASA Technical Reports Server (NTRS)
Brian, Geoff; Morelli, Eugene A.
2011-01-01
Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
Statistical error in simulations of Poisson processes: Example of diffusion in solids
NASA Astrophysics Data System (ADS)
Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V.
2016-08-01
Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in ion conductivity obtained in such simulations. The error expression is not restricted to any particular computational method, but is valid in the context of simulation of Poisson processes in general. This analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations.
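The generic behavior such an error expression quantifies can be checked with a toy experiment: for a Poisson process with N expected events, the count fluctuates with standard deviation sqrt(N), so the relative statistical error scales as 1/sqrt(N). A small sketch (rate, time, and trial count are arbitrary illustrative values):

```python
import random

def poisson_relative_error(rate, t_sim, trials=1000, seed=1):
    """Empirical relative error of the event count of a Poisson
    process of given rate observed for a window t_sim, estimated
    over many independent trials."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        # Sample exponential waiting times until the window closes
        t, n = 0.0, 0
        while True:
            t += rng.expovariate(rate)
            if t > t_sim:
                break
            n += 1
        counts.append(n)
    avg = sum(counts) / trials
    var = sum((c - avg) ** 2 for c in counts) / trials
    return var ** 0.5 / avg

# With rate*t_sim = 100 expected events, relative error ~ 1/sqrt(100) = 0.1
err = poisson_relative_error(5.0, 20.0)
```

For a kinetic Monte Carlo run that records only a few hundred diffusion events, this 1/sqrt(N) floor is exactly why the conductivity estimate carries a large statistical uncertainty.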
Real-Time Visualization of an HPF-based CFD Simulation
NASA Technical Reports Server (NTRS)
Kremenetsky, Mark; Vaziri, Arsi; Haimes, Robert; Chancellor, Marisa K. (Technical Monitor)
1996-01-01
Current time-dependent CFD simulations produce very large multi-dimensional data sets at each time step. The visual analysis of computational results are traditionally performed by post processing the static data on graphics workstations. We present results from an alternate approach in which we analyze the simulation data in situ on each processing node at the time of simulation. The locally analyzed results, usually more economical and in a reduced form, are then combined and sent back for visualization on a graphics workstation.
Automated Classification of Phonological Errors in Aphasic Language
Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.
1984-01-01
Using heuristically guided state-space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represents a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification; it provides a prototype simulation tool for neurolinguistic research; and it forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.
Using Simulation for Launch Team Training and Evaluation
NASA Technical Reports Server (NTRS)
Peaden, Cary J.
2005-01-01
This document describes some of the history and uses of simulation systems and processes for the training and evaluation of Launch Processing, Mission Control, and Mission Management teams. It documents some of the types of simulations that are used at Kennedy Space Center (KSC) today and that could be utilized (and possibly enhanced) for future launch vehicles. This article is intended to provide an initial baseline for further research into simulation for launch team training in the near future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foley, M.G.; Petrie, G.M.; Baldwin, A.J.
1982-06-01
This report contains the input data and computer results for the Geologic Simulation Model. This model is described in detail in the following report: Petrie, G.M., et al. 1981. Geologic Simulation Model for a Hypothetical Site in the Columbia Plateau, Pacific Northwest Laboratory, Richland, Washington. The Geologic Simulation Model is a quasi-deterministic process-response model which simulates, for a million years into the future, the development of the geologic and hydrologic systems of the ground-water basin containing the Pasco Basin. Effects of natural processes on the ground-water hydrologic system are modeled principally by rate equations. The combined effects and synergistic interactions of different processes are approximated by linear superposition of their effects during discrete time intervals in a stepwise-integration approach.
DEVELOPMENT OF AN INSOLUBLE SALT SIMULANT TO SUPPORT ENHANCED CHEMICAL CLEANING TESTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eibling, R
The closure process for high level waste tanks at the Savannah River Site will require dissolution of the crystallized salts that are currently stored in many of the tanks. The insoluble residue from salt dissolution is planned to be removed by an Enhanced Chemical Cleaning (ECC) process. Development of a chemical cleaning process requires an insoluble salt simulant to support evaluation tests of different cleaning methods. The Process Science and Engineering section of SRNL has been asked to develop an insoluble salt simulant for use in testing potential ECC processes (HLE-TTR-2007-017). An insoluble salt simulant has been developed based upon the residues from salt dissolution of saltcake core samples from Tank 28F. The simulant was developed for use in testing SRS waste tank chemical cleaning methods. Based on the results of the simulant development process, the following observations were developed: (1) A composition based on the presence of 10.35 grams oxalate and 4.68 grams carbonate per 100 grams solids produces a sufficiently insoluble solids simulant. (2) Aluminum observed in the solids remaining from actual waste salt dissolution tests is probably precipitated from sodium aluminate due to the low hydroxide content of the saltcake. (3) In-situ generation of aluminum hydroxide (by use of aluminate as the Al source) appears to trap additional salts in the simulant in a manner similar to that expected for actual waste samples. (4) Alternative compositions are possible with higher oxalate levels and lower carbonate levels. (5) The maximum oxalate level is limited by the required Na content of the insoluble solids. (6) Periodic mixing may help to limit crystal growth in this type of salt simulant. (7) Long term storage of an insoluble salt simulant is likely to produce a material that cannot be easily removed from the storage container. Production of a relatively fresh simulant is best if pumping the simulant is necessary for testing purposes.
The insoluble salt simulant described in this report represents the initial attempt to represent the material which may be encountered during final waste removal and tank cleaning. The final selected simulant was produced by heating and evaporation of a salt slurry sample to remove excess water and promote formation and precipitation of solids with solubility characteristics which are consistent with actual tank insoluble salt samples. The exact anion composition of the final product solids is not explicitly known since the chemical components in the final product are distributed between the solid and liquid phases. By combining the liquid phase analyses and total solids analysis with mass balance requirements, a calculated composition of assumed simple compounds was obtained and is shown in Table 0-1. Additional improvements to and further characterization of the insoluble salt simulant are possible. During the development of these simulants it was recognized that: (1) Additional waste characterization on the residues from salt dissolution tests with actual waste samples to determine the amount of species such as carbonate, oxalate and aluminosilicate would allow fewer assumptions to be made in constructing an insoluble salt simulant. (2) The tank history will impact the amount and type of insoluble solids that exist in the salt dissolution solids. Varying the method of simulant production (elevated temperature processing time, degree of evaporation, amount of mixing (shear) during preparation, etc.) should be tested.
Study on wet scavenging of atmospheric pollutants in south Brazil
NASA Astrophysics Data System (ADS)
Wiegand, Flavio; Pereira, Felipe Norte; Teixeira, Elba Calesso
2011-09-01
The present paper presents a study of in-cloud and below-cloud SO₂ and SO₄²⁻ scavenging processes by applying numerical models in the Candiota region, located in the state of Rio Grande do Sul, South Brazil. The BRAMS (Brazilian Regional Atmospheric Modeling System) model was applied to simulate the vertical structure of the clouds, and the B.V.2 (Below-Cloud Beheng Version 2) scavenging model was applied to simulate in-cloud and below-cloud scavenging processes of the pollutants SO₂ and SO₄²⁻. Five events in 2004 were selected for this study and were sampled at the Candiota Airport station. The concentrations of SO₂ and SO₄²⁻ sampled in the air and the simulated meteorological parameters of rainfall episodes were used as input data in the B.V.2, which simulates raindrop interactions associated with the scavenging process. Results for the Candiota region showed that in-cloud scavenging processes were more significant than below-cloud scavenging processes for two of the five events studied, with a contribution of approximately 90-100% of SO₂ and SO₄²⁻ concentrations in rainwater. A few adjustments to the original version of B.V.2 were made to allow simulation of scavenging processes in several types of clouds, not only cumulus humilis and cumulus congestus.
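Below-cloud scavenging of an air pollutant is often summarized by a first-order washout law, dC/dt = -Lambda*C, where the scavenging coefficient Lambda bundles the raindrop-pollutant interactions that a model like B.V.2 resolves explicitly. A minimal sketch of that relation; the Lambda value is an illustrative order of magnitude, not taken from the paper:

```python
import math

def washout(c0, scavenging_coeff, t):
    """Air concentration remaining after washout for time t (s):
    C(t) = C0 * exp(-Lambda * t), with Lambda in 1/s."""
    return c0 * math.exp(-scavenging_coeff * t)

# SO2 at 10 ug/m3 after 30 min of rain with an assumed Lambda = 1e-4 1/s
c = washout(10.0, 1e-4, 1800.0)
```

Everything removed from the air column by this decay appears in rainwater, which is the budget the in-cloud versus below-cloud comparison above is partitioning.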
Conforti, Patrick F; Prasad, Manish; Garrison, Barbara J
2008-08-01
Laser ablation harnesses photon energy to remove material from a surface. Although applications such as laser-assisted in situ keratomileusis (LASIK) surgery, lithography, and nanoscale device fabrication take advantage of this process, a better understanding of the underlying mechanism of ablation in polymeric materials remains much sought after. Molecular simulation is a particularly attractive technique to study the basic aspects of ablation because it allows control over specific process parameters and enables observation of microscopic mechanistic details. This Account describes a hybrid molecular dynamics-Monte Carlo technique to simulate laser ablation in poly(methyl methacrylate) (PMMA). It also discusses the impact of thermal and chemical excitation on the ensuing ejection processes. We used molecular dynamics simulation to study the molecular interactions in a coarse-grained PMMA substrate following photon absorption. To ascertain the role of chemistry in initiating ablation, we embedded a Monte Carlo protocol within the simulation framework. These calculations permit chemical reactions to occur probabilistically during the molecular dynamics calculation using predetermined reaction pathways and Arrhenius rates. With this hybrid scheme, we can examine thermal and chemical pathways of decomposition separately. In the simulations, we observed distinct mechanisms of ablation for each type of photoexcitation pathway. Ablation via thermal processes is governed by a critical number of bond breaks following the deposition of energy. For the case in which an absorbed photon directly causes a bond scission, ablation occurs following the rapid chemical decomposition of material. A detailed analysis of the processes shows that a critical energy for ablation can describe this complex series of events. The simulations show a decrease in the critical energy with a greater amount of photochemistry.
Additionally, the simulations demonstrate the effects of the energy deposition rate on the ejection mechanism. When the energy is deposited rapidly, not allowing for mechanical relaxation of the sample, the formation of a pressure wave and subsequent tensile wave dominates the ejection process. This study provides insight into the influence of thermal, chemical, and mechanical processes in PMMA and facilitates greater understanding of the complex nature of polymer ablation. These simulations complement experiments that have used chemical design to harness the photochemical properties of materials to enhance laser ablation. We successfully fit the results of the simulations to established analytical models of both photothermal and photochemical ablation and demonstrate their relevance. Although the simulations are for PMMA, the mechanistic concepts are applicable to a large range of systems and provide a conceptual foundation for interpretation of experimental data.
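The Monte Carlo step embedded in the molecular dynamics loop can be sketched as follows: each candidate reaction fires during a timestep with a probability derived from its Arrhenius rate. All numerical values here (prefactor, activation energy, temperature) are illustrative, not the PMMA reaction parameters used in the Account.

```python
import math
import random

def scission_probability(E_a, T, dt, A=1e13, k_B=8.617e-5):
    """Probability that a bond breaks during one timestep dt (s),
    from an Arrhenius rate k = A * exp(-E_a / (k_B * T)).
    E_a in eV, T in K; A (1/s) and E_a are assumed values."""
    k = A * math.exp(-E_a / (k_B * T))
    return 1.0 - math.exp(-k * dt)

def monte_carlo_sweep(bonds, E_a, T, dt, rng):
    """One MC sweep: each intact bond breaks independently with the
    Arrhenius probability; returns the surviving bonds."""
    p = scission_probability(E_a, T, dt)
    return [b for b in bonds if rng.random() >= p]

rng = random.Random(0)
bonds = list(range(10000))
# A hot region after photon absorption loses a few percent of its
# bonds per picosecond sweep at these assumed parameters
survivors = monte_carlo_sweep(bonds, E_a=1.0, T=2000.0, dt=1e-12, rng=rng)
```

Alternating such sweeps with ordinary MD integration is what lets the hybrid scheme switch chemistry on or off and so separate the thermal and photochemical ablation pathways.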
FY13 GLYCOLIC-NITRIC ACID FLOWSHEET DEMONSTRATIONS OF THE DWPF CHEMICAL PROCESS CELL WITH SIMULANTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, D.; Zamecnik, J.; Best, D.
Savannah River Remediation is evaluating changes to its current Defense Waste Processing Facility flowsheet to replace formic acid with glycolic acid in order to improve processing cycle times and decrease by approximately 100x the production of hydrogen, a potentially flammable gas. Higher throughput is needed in the Chemical Processing Cell since the installation of the bubblers into the melter has increased melt rate. Due to the significant maintenance required for the safety significant gas chromatographs and the potential for production of flammable quantities of hydrogen, eliminating the use of formic acid is highly desirable. Previous testing at the Savannah River National Laboratory has shown that replacing formic acid with glycolic acid allows the reduction and removal of mercury without significant catalytic hydrogen generation. Five back-to-back Sludge Receipt and Adjustment Tank (SRAT) cycles and four back-to-back Slurry Mix Evaporator (SME) cycles were successful in demonstrating the viability of the nitric/glycolic acid flowsheet. The testing was completed in FY13 to determine the impact of process heels (approximately 25% of the material is left behind after transfers). In addition, back-to-back experiments might identify longer-term processing problems. The testing was designed to be prototypic by including sludge simulant, Actinide Removal Product simulant, nitric acid, glycolic acid, and Strip Effluent simulant containing Next Generation Solvent in the SRAT processing and SRAT product simulant, decontamination frit slurry, and process frit slurry in the SME processing. A heel was produced in the first cycle and each subsequent cycle utilized the remaining heel from the previous cycle. Lower SRAT purges were utilized due to the low hydrogen generation. Design basis addition rates and boilup rates were used so the processing time was shorter than current processing rates.
The Development of a 3D LADAR Simulator Based on a Fast Target Impulse Response Generation Approach
NASA Astrophysics Data System (ADS)
Al-Temeemy, Ali Adnan
2017-09-01
A new laser detection and ranging (LADAR) simulator has been developed, using MATLAB and its graphical user interface, to simulate direct-detection time-of-flight LADAR systems and to produce 3D simulated scanning images under a wide variety of conditions. This simulator models each stage from the laser source to data generation and can be considered an efficient simulation tool to use when developing LADAR systems and their data processing algorithms. The novel approach proposed for this simulator is to generate the actual target impulse response. This approach is fast and able to meet high scanning requirements without the loss of fidelity that usually accompanies increases in speed. This leads to a more efficient LADAR simulator and opens up the possibility of simulating LADAR beam propagation more accurately by using a large number of laser footprint samples. The approach is to select only the parts of the target that lie within the laser beam's angular field, by mathematically deriving the required equations and calculating the target angular ranges. The performance of the new simulator has been evaluated under different scanning conditions; the results show significant increases in processing speed in comparison to the conventional approaches used in this study as a point of comparison. The results also show the simulator's ability to simulate phenomena related to the scanning process, for example, type of noise, scanning resolution and laser beam width.
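The selection step described above, keeping only target samples inside the beam's angular field, can be sketched as a simple cone test. The function name and the dot-product formulation are illustrative assumptions, not the paper's derived equations.

```python
import math

def in_beam(points, origin, direction, half_angle_rad):
    """Return the target samples whose line of sight from the sensor
    lies within half_angle_rad of the beam axis (a cone test)."""
    dx, dy, dz = direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    cos_limit = math.cos(half_angle_rad)
    kept = []
    for (px, py, pz) in points:
        vx, vy, vz = px - origin[0], py - origin[1], pz - origin[2]
        vnorm = math.sqrt(vx * vx + vy * vy + vz * vz)
        # Keep the sample if the angle to the beam axis is small enough
        if vnorm > 0 and (vx * dx + vy * dy + vz * dz) / vnorm >= cos_limit:
            kept.append((px, py, pz))
    return kept

# A narrow 10 mrad beam along +z sees the on-axis sample only
samples = [(0.0, 0.0, 10.0), (5.0, 0.0, 10.0)]
visible = in_beam(samples, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.01)
```

Culling samples this way before computing the impulse response is what keeps the per-pulse cost proportional to the footprint rather than to the whole target.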
The VIIRS Ocean Data Simulator Enhancements and Results
NASA Technical Reports Server (NTRS)
Robinson, Wayne D.; Patt, Fredrick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.
2011-01-01
The VIIRS Ocean Science Team (VOST) has been developing an Ocean Data Simulator to create realistic VIIRS SDR datasets based on MODIS water-leaving radiances. The simulator is helping to assess instrument performance and scientific processing algorithms. Several changes were made in the last two years to complete the simulator and broaden its usefulness. The simulator is now fully functional and includes all sensor characteristics measured during prelaunch testing, including electronic and optical crosstalk influences, polarization sensitivity, and relative spectral response. Also included is the simulation of cloud and land radiances to make more realistic data sets and to understand their important influence on nearby ocean color data. The atmospheric tables used in the processing, including aerosol and Rayleigh reflectance coefficients, have been modeled using VIIRS relative spectral responses. The capabilities of the simulator were expanded to work in an unaggregated sample mode and to produce scans with additional samples beyond the standard scan. These features improve the capability to realistically add artifacts which act upon individual instrument samples prior to aggregation and which may originate from beyond the actual scan boundaries. The simulator was expanded to simulate all 16 M-bands and the EDR processing was improved to use these bands to make an SST product. The simulator is being used to generate global VIIRS data from and in parallel with the MODIS Aqua data stream. Studies have been conducted using the simulator to investigate the impact of instrument artifacts. This paper discusses the simulator improvements and results from the artifact impact studies.
Computer simulation of the NASA water vapor electrolysis reactor
NASA Technical Reports Server (NTRS)
Bloom, A. M.
1974-01-01
The water vapor electrolysis (WVE) reactor is a spacecraft waste reclamation system for extended-mission manned spacecraft. The WVE reactor's raw material is water; its product is oxygen. A computer simulation of the WVE operational processes provided the data required for an optimal design of the WVE unit. The simulation process was implemented with the aid of a FORTRAN IV routine.
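The stoichiometric core of any water electrolysis model is Faraday's law: 2 H₂O → 2 H₂ + O₂ transfers four electrons per O₂ molecule. A minimal sketch of that generic relation (the current value is an illustrative assumption, not the WVE unit's operating point):

```python
def o2_production_rate(current_A, faraday=96485.0, M_o2=0.032):
    """Oxygen mass production rate in kg/s for a water electrolysis
    cell, via Faraday's law: rate = I * M / (n * F) with n = 4
    electrons per O2 molecule. M_o2 is the O2 molar mass in kg/mol."""
    return current_A * M_o2 / (4.0 * faraday)

# An assumed 10 A cell yields roughly 0.07 kg of O2 per day
daily = o2_production_rate(10.0) * 86400.0
```

A full reactor simulation layers cell voltage, thermal, and water-transport models on top of this fixed stoichiometric relation.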
NASA Astrophysics Data System (ADS)
Ma, K.; Thomassey, S.; Zeng, X.
2017-10-01
In this paper we propose a central order processing system under a resource-sharing strategy for demand-driven garment supply chains, intended to increase supply chain performance. We examined this system using simulation technology. Simulation results showed that significant improvements in various performance indicators were obtained in the new collaborative model with the proposed system.
USDA-ARS?s Scientific Manuscript database
Computer simulation is a useful tool for benchmarking the electrical and fuel energy consumption and water use in a fluid milk plant. In this study, a computer simulation model of the fluid milk process based on high temperature short time (HTST) pasteurization was extended to include models for pr...
Mathematical modeling of high-pH chemical flooding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhuyan, D.; Lake, L.W.; Pope, G.A.
1990-05-01
This paper describes a generalized compositional reservoir simulator for high-pH chemical flooding processes. This simulator combines the reaction chemistry associated with these processes with the extensive physical- and flow-property modeling schemes of an existing micellar/polymer flood simulator, UTCHEM. Application of the model is illustrated for cases from a simple alkaline preflush to surfactant-enhanced alkaline-polymer flooding.
Sun, Rui; Ismail, Tamer M; Ren, Xiaohan; Abd El-Salam, M
2015-05-01
In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady-state model and an experimental study were employed to investigate the combustion process of municipal solid waste (MSW) in a fixed-bed reactor. Conservation equations for the waste bed were implemented to describe the incineration process. The gas-phase turbulence was modeled using the k-ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste property characteristics. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulated solid temperatures, gas species and process rates in the bed accord with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The whole bed combustion process slows greatly as MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW.
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Considering the complex content of virtual simulation experiments and their lack of guidance, the key-event technology from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging," based on biological morphology, was taken as an example, and many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.
NASA Astrophysics Data System (ADS)
Forouzan, Mehdi M.; Chao, Chien-Wei; Bustamante, Danilo; Mazzeo, Brian A.; Wheeler, Dean R.
2016-04-01
The fabrication process of Li-ion battery electrodes plays a prominent role in the microstructure and corresponding cell performance. Here, a mesoscale particle dynamics simulation is developed to relate the manufacturing process of a cathode containing Toda NCM-523 active material to physical and structural properties of the dried film. Particle interactions are simulated with shifted-force Lennard-Jones and granular Hertzian functions. LAMMPS, a freely available particle simulator, is used to generate particle trajectories and resulting predicted properties. To make simulations of the full film thickness feasible, the carbon binder domain (CBD) is approximated with μm-scale particles, each representing about 1000 carbon black particles and associated binder. Metrics for model parameterization and validation are measured experimentally and include the following: slurry viscosity, elasticity of the dried film, shrinkage ratio during drying, volume fraction of phases, slurry and dried film densities, and microstructure cross sections. Simulation results are in substantial agreement with experiment, showing that the simulations reasonably reproduce the relevant physics of particle arrangement during fabrication.
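The shifted-force Lennard-Jones interaction mentioned above modifies the plain 12-6 potential so that both the energy and the force vanish smoothly at the cutoff, removing the force discontinuity of a simple truncation. A sketch in reduced units (epsilon = sigma = 1); the actual parameter values for the NCM and carbon-binder particles are not reproduced here.

```python
def lj_energy(r, eps=1.0, sigma=1.0):
    """Standard 12-6 Lennard-Jones pair energy."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

def lj_force(r, eps=1.0, sigma=1.0):
    """Radial LJ force, F = -dU/dr (positive = repulsive)."""
    sr6 = (sigma / r) ** 6
    return 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r

def lj_shifted_force(r, rc=2.5, eps=1.0, sigma=1.0):
    """Shifted-force LJ energy and force: U_sf(r) = U(r) - U(rc)
    + (r - rc) * F(rc) and F_sf(r) = F(r) - F(rc), so both go to
    zero continuously at the cutoff rc."""
    if r >= rc:
        return 0.0, 0.0
    u = (lj_energy(r, eps, sigma) - lj_energy(rc, eps, sigma)
         + (r - rc) * lj_force(rc, eps, sigma))
    f = lj_force(r, eps, sigma) - lj_force(rc, eps, sigma)
    return u, f

u_contact, f_contact = lj_shifted_force(1.0)   # repulsive at r = sigma
u_edge, f_edge = lj_shifted_force(2.4999999)   # ~0 just inside the cutoff
```

Avoiding the force jump at the cutoff matters for particle-dynamics drying simulations like this one, since spurious impulses at the cutoff would contaminate the slow settling and shrinkage of the film.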
NASA Astrophysics Data System (ADS)
Lindstrom, Erik Vilhelm Mathias
Gasification of black liquor (BLG) could drastically increase the flexibility and improve the profit potential of a mature industry. The completed work was focused on research around the economics and benefits of its implementation, utilizing laboratory pulping experiments and process simulation. The separation of sodium and sulfur achieved through gasification of recovered black liquor can be utilized in processes like modified continuous cooking, split sulfidity and green liquor pretreatment pulping, and polysulfide-anthraquinone (PSAQ) pulping, to improve pulp yield and properties. Laboratory pulping protocols have been developed for these modified pulping technologies, and different process options have been evaluated. The process simulation work around BLG has led to the development of a WinGEMS module for the low-temperature MTCI steam reforming process, and case studies comparing a simulated conventional kraft process to different process options built around the implementation of a BLG unit operation into the kraft recovery cycle. Pulp yield increases of 1-3% points with improved product quality, and the potential for capital and operating cost savings relative to the conventional kraft process, have been demonstrated. Process simulation work has shown that the net variable operating cost for a pulping process using BLG with combined-cycle power generation (BLGCC) is highly dependent on the cost of lime kiln fuel and the selling price of green power to the grid. Under the assumptions taken in the performed case study, the BLGCC process combined with split sulfidity or PSAQ pulping operations had a net variable operating cost 2-4% greater than the kraft reference. The influence of the sales price of power to the grid is the most significant cost factor. If a sales price increase to 6 ¢/kWh for green power could be achieved, cost savings of about $40/ODtP could be realized in all investigated BLG processes.
Other alternatives to improve the process economics around BLG would be to modify or eliminate the lime kiln unit operations, utilizing high sulfidity green liquor pretreatment, PSAQ with auto-causticization, or converting the process to mini-sulfide sulfite-AQ.
A Process for the Creation of T-MATS Propulsion System Models from NPSS Data
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
A modular thermodynamic simulation package called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) has been developed for the creation of dynamic simulations. The T-MATS software is designed as a plug-in for Simulink (Math Works, Inc.) and allows a developer to create system simulations of thermodynamic plants (such as gas turbines) and controllers in a single tool. Creation of such simulations can be accomplished by matching data from actual systems, or by matching data from steady state models and inserting appropriate dynamics, such as the rotor and actuator dynamics for an aircraft engine. This paper summarizes the process for creating T-MATS turbo-machinery simulations using data and input files obtained from a steady state model created in the Numerical Propulsion System Simulation (NPSS). The NPSS is a thermodynamic simulation environment that is commonly used for steady state gas turbine performance analysis. Completion of all the steps involved in the process results in a good match between T-MATS and NPSS at several steady state operating points. Additionally, the T-MATS model extended to run dynamically provides the possibility of simulating and evaluating closed loop responses.
Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.
Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo
2016-12-13
The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high-strength variants, exhibit limited formability at room temperature, and high-temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.
Experimental Simulations to Understand the Lunar and Martian Surficial Processes
NASA Astrophysics Data System (ADS)
Zhao, Y. Y. S.; Li, X.; Tang, H.; Li, Y.; Zeng, X.; Chang, R.; Li, S.; Zhang, S.; Jin, H.; Mo, B.; Li, R.; Yu, W.; Wang, S.
2016-12-01
In support of China's lunar and Mars exploration programs and beyond, our center is dedicated to understanding the surficial processes and environments of planetary bodies. Over the past several years, we have designed, built and optimized experimental simulation facilities and utilized them to test hypotheses and evaluate the underlying mechanisms under controlled conditions particularly relevant to the Moon and Mars. Among the fundamental questions to address, we emphasize five major areas: (1) micrometeorite bombardment simulation to evaluate the formation mechanisms of np-Fe0, which is found in lunar samples, and the possible sources of Fe; (2) solar wind implantation simulation to evaluate alteration/amorphization/OH or H2O formation on the surface of target minerals or rocks; (3) dust mobility characteristics on the Moon and other planetary bodies, studied by exciting different types of dust particles and measuring their movements; (4) Mars basaltic soil simulant development (e.g., Jining Martian Soil Simulant (JMSS-1)) and applications for scientific/engineering experiments; and (5) the distribution and speciation of halogens (Cl and Br) and life-essential elements (C, H, O, N, P, and S) on Mars during surficial processes such as sedimentary- and photochemical-related processes. Depending on the variables of interest, the simulation systems provide the flexibility to vary the source of energy, temperature, pressure, and ambient gas composition in the reaction chambers. Also, simulation products can be observed or analyzed in situ by various analyzer components inside the chamber, without interrupting the experimental conditions. In addition, the behavior of elements and isotopes during certain surficial processes (e.g., evaporation, dissolution, etc.) can be theoretically predicted by our theoretical geochemistry group with thermodynamic-kinetic calculation and modeling, which supports experiment design and result interpretation.
State of the evidence on simulation-based training for laparoscopic surgery: a systematic review.
Zendejas, Benjamin; Brydges, Ryan; Hamstra, Stanley J; Cook, David A
2013-04-01
Summarize the outcomes and best practices of simulation training for laparoscopic surgery. Simulation-based training for laparoscopic surgery has become a mainstay of surgical training. Much new evidence has accrued since previous reviews were published. We systematically searched the literature through May 2011 for studies evaluating simulation, in comparison with no intervention or an alternate training activity, for training health professionals in laparoscopic surgery. Outcomes were classified as satisfaction; knowledge; skills (in a test setting), comprising time (to perform the task), process (e.g., performance rating), and product (e.g., knot strength); and behaviors when caring for patients. We used random-effects meta-analysis to pool effect sizes. From 10,903 articles screened, we identified 219 eligible studies enrolling 7138 trainees, including 91 (42%) randomized trials. For comparisons with no intervention (n = 151 studies), the pooled effect size (ES) favored simulation for outcomes of knowledge (1.18; N = 9 studies), skills time (1.13; N = 89), skills process (1.23; N = 114), skills product (1.09; N = 7), behavior time (1.15; N = 7), behavior process (1.22; N = 15), and patient effects (1.28; N = 1), all P < 0.05. When compared with nonsimulation instruction (n = 3 studies), results significantly favored simulation for outcomes of skills time (ES, 0.75) and skills process (ES, 0.54). Comparisons between different simulation interventions (n = 79 studies) clarified best practices. For example, in comparison with virtual reality, box trainers have similar effects for process skills outcomes and seem to be superior for outcomes of satisfaction and skills time. Simulation-based laparoscopic surgery training of health professionals has large benefits when compared with no intervention and is moderately more effective than nonsimulation instruction.
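The random-effects pooling used in reviews like the one above can be sketched with the generic DerSimonian-Laird estimator. The function and the effect sizes fed to it are illustrative, not the review's actual analysis code:

```python
def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate of effect sizes."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)         # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
```

When the studies agree exactly, the between-study variance tau2 collapses to zero and the estimate reduces to the fixed-effect weighted mean.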
Predictive displays for a process-control schematic interface.
Yin, Shanqing; Wickens, Christopher D; Helander, Martin; Laberge, Jason C
2015-02-01
Our objective was to examine the extent to which increasing the precision of predictive (rate-of-change) information in process control improves performance on a simulated process-control task. Predictive displays have been found to be useful in process control (as well as in the aviation and maritime industries). However, prior research has examined neither the extent to which predictive value increases with predictor resolution, nor how potential improvements relate to changes in process-control strategy. Fifty nonprofessional participants each controlled a simulated chemical mixture process (a honey-mixer simulation) that reproduced the operations found in process control. Participants in each of five groups controlled the process with either no predictor or a predictor, with the resolution of the prediction varying across groups. Increasing resolution generally increased the benefit of prediction over the control condition, although not monotonically. The best overall performance, combining quality and predictive ability, was obtained with the display of intermediate resolution. The two displays with the lowest resolution were clearly inferior. Predictors with higher resolution are of value but may trade off enhanced sensitivity to variable change (lower-resolution discrete-state predictor) against smoother control action (higher-resolution continuous predictors). The research provides guidelines to the process-control industry regarding displays that can most improve operator performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gigley, H.M.
1982-01-01
An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy in contrast to ATN and production system strategies is explained. A first pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension, constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool. Its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validation of the simulation results are raised, and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are discussed based on the current state of research.
Watershed Simulation of Nutrient Processes
In this presentation, nitrogen processes simulated in watershed models were reviewed and compared. Furthermore, current research on nitrogen losses from agricultural fields was also reviewed. Finally, applications of those models were reviewed and selected successful and u...
Simulative design and process optimization of the two-stage stretch-blow molding process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopmann, Ch.; Rasche, S.; Windeck, C.
2015-05-22
The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.
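The iterative idea in the abstract (simulate the blown bottle, score it against an objective, vary the preform, repeat) can be caricatured with a one-parameter loop. The thickness model below is a hypothetical stand-in for the 3D process simulation, and all numbers are illustrative:

```python
def wall_thickness(preform_thickness_mm, stretch_ratio):
    """Hypothetical stand-in for the process simulation: thinning by stretch."""
    return preform_thickness_mm / stretch_ratio

def optimize_preform(stretch_ratio=4.0, t_min_mm=0.25):
    """Return the lightest (thinnest) candidate preform whose blown wall
    still meets the minimum required wall thickness."""
    best = None
    for t_preform in [x * 0.1 for x in range(5, 31)]:   # candidates 0.5..3.0 mm
        if wall_thickness(t_preform, stretch_ratio) >= t_min_mm:
            if best is None or t_preform < best:
                best = t_preform                        # lighter feasible design
    return best
```

A real optimization would vary the full preform geometry and process parameters and call the coupled simulations inside the loop, but the structure (simulate, check the objective, keep the best feasible design) is the same.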
Simulation Based Low-Cost Composite Process Development at the US Air Force Research Laboratory
NASA Technical Reports Server (NTRS)
Rice, Brian P.; Lee, C. William; Curliss, David B.
2003-01-01
Low-cost composite research in the US Air Force Research Laboratory, Materials and Manufacturing Directorate, Organic Matrix Composites Branch has focused on the theme of affordable performance. Practically, this means that we take a very broad view when considering the affordability of composites. Factors such as material costs, labor costs, and recurring and nonrecurring manufacturing costs are balanced against performance to arrive at the relative affordability-versus-performance measure of merit. The research efforts discussed here are two projects focused on affordable processing of composites. The first is the use of a neural network scheme to model cure reaction kinetics and then use the kinetics, coupled with simple heat transport models, to predict future exotherms in real time and control them. The neural network scheme is demonstrated to be very robust and a much more efficient method than the mechanistic cure modeling approach. This enables very practical low-cost processing of thick composite parts. The second project is liquid composite molding (LCM) process simulation. LCM processing of large 3D integrated composite parts has been demonstrated to be a very cost-effective way to produce large integrated aerospace components. Specific examples of LCM processes are resin transfer molding (RTM), vacuum-assisted resin transfer molding (VARTM), and other similar approaches. LCM process simulation is a critical part of developing an LCM process approach. Flow simulation enables the development of the most robust approach to introducing resin into complex preforms. Furthermore, LCM simulation can be used in conjunction with flow-front sensors to control the LCM process in real time to account for preform or resin variability.
Development of an alkaline/surfactant/polymer compositional reservoir simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhuyan, D.
1989-01-01
The mathematical formulation of a generalized three-dimensional compositional reservoir simulator for high-pH chemical flooding processes is presented in this work. The model assumes local thermodynamic equilibrium with respect to both reaction chemistry and phase behavior and calculates equilibrium electrolyte and phase compositions as a function of time and position. The reaction chemistry considers aqueous electrolytic chemistry, precipitation/dissolution of minerals, ion exchange reactions on matrix surface, reaction of acidic components of crude oil with the bases in the aqueous solution and cation exchange reactions with the micelles. The simulator combines this detailed reaction chemistry associated with these processes with the extensive physical and flow property modeling schemes of an existing chemical flood simulator (UTCHEM) to model the multiphase, multidimensional displacement processes. The formulation of the chemical equilibrium model is quite general and is adaptable to simulate a variety of chemical descriptions. In addition to its use in the simulation of high-pH chemical flooding processes, the model will find application in the simulation of other reactive flow problems like the ground water contamination, reinjection of produced water, chemical waste disposal, etc. in one, two or three dimensions and under multiphase flow conditions. In this work, the model is used to simulate several hypothetical cases of high-pH chemical floods, which include cases from a simple alkaline preflush of a micellar/polymer flood to surfactant enhanced alkaline-polymer flooding and the results are analyzed. Finally, a few published alkaline, alkaline-polymer and surfactant-alkaline-polymer corefloods are simulated and compared with the experimental results.
A Multiagent Modeling Environment for Simulating Work Practice in Organizations
NASA Technical Reports Server (NTRS)
Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron
2004-01-01
In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in the work practices studied by the design team played a significant role in how work actually got done: actual, lived work. Multi-tasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm.
In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to represent the relations of people, locations, systems, artifacts, communication and information content.
Simulation of Ejecta Production and Mixing Process of Sn Sample under shock loading
NASA Astrophysics Data System (ADS)
Wang, Pei; Chen, Dawei; Sun, Haiquan; Ma, Dongjun
2017-06-01
Ejection may occur when a strong shock wave releases at the free surface of a metal, forming high-speed particulate ejecta that subsequently mix with the surrounding gas. Ejecta production and the ensuing mixing process remain among the most difficult unresolved problems in shock physics, and they have many important engineering applications in implosion compression science. The present paper introduces a methodology for the theoretical modeling and numerical simulation of the complex ejection and mixing process. The ejecta production is decoupled from the particle mixing process: the ejecta state is obtained by direct numerical simulation of the evolution of initial defects on the metal surface, and the particle mixing process is then simulated with a two-phase gas-particle model that uses the aforementioned ejecta state as its initial condition. A preliminary ejecta experiment on a planar Sn sample has validated the feasibility of the proposed methodology.
Flight Dynamic Simulation of Fighter In the Asymmetric External Store Release Process
NASA Astrophysics Data System (ADS)
Safi’i, Imam; Arifianto, Ony; Nurohman, Chandra
2018-04-01
In fighter design, it is important to evaluate and analyze the flight dynamics of the aircraft early in the development process. One such case is the dynamics of the external store release process. A simulation tool can be used to analyze the fighter/external-store system's dynamics in the preliminary design stage. This paper reports the flight dynamics of the Jet Fighter Experiment (JF-1 E) in the asymmetric Advanced Medium Range Air-to-Air Missile (AMRAAM) release process through simulations. The JF-1 E and AIM-120 AMRAAM models are built using the Advanced Aircraft Analysis (AAA) and Missile Datcom software. With these tools, the aerodynamic stability and control derivatives can be obtained and used to model the dynamic characteristics of the fighter and the external store. The dynamic system is modeled in MATLAB/Simulink. With this software, both the fighter/external-store integration and the external store release process are simulated, and the dynamics of the system can be analyzed.
Software Framework for Advanced Power Plant Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Widmann; Sorin Munteanu; Aseem Jain
2010-08-01
This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
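The reduced-order-model idea (train a cheap surrogate on a handful of expensive CFD runs, then evaluate it directly inside the flowsheet) can be illustrated with the simplest possible surrogate. The sample data here are synthetic, and a real ROM would be far richer than a quadratic through three points:

```python
def quadratic_rom(samples):
    """Build a callable surrogate: the quadratic through three (x, y) samples."""
    (x0, y0), (x1, y1), (x2, y2) = samples
    def rom(x):
        # Lagrange form: microseconds to evaluate vs. hours for a CFD run
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))
    return rom

# "Train" on three synthetic CFD results, then reuse the surrogate cheaply.
rom = quadratic_rom([(0.0, 0.0), (1.0, 1.0), (2.0, 4.0)])
```

The trained callable can then stand in for the CFD unit operation wherever the flowsheet would otherwise have to invoke the full simulation.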
An Aerodynamic Simulation Process for Iced Lifting Surfaces and Associated Issues
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Vickerman, Mary B.; Hackenberg, Anthony W.; Rigby, David L.
2003-01-01
This paper discusses technologies and software tools that are being implemented in a software toolkit currently under development at NASA Glenn Research Center. Its purpose is to help study the effects of icing on airfoil performance and to assist with the aerodynamic simulation process, which consists of characterization and modeling of ice geometry, application of block topology and grid generation, and flow simulation. Tools and technologies for each task have been carefully chosen based on their contribution to the overall process. For geometry characterization and modeling, we have chosen an interactive rather than an automatic process in order to handle the numerous ice shapes. An appendix presents features of a software toolkit developed to support the interactive process. Approaches taken for the generation of block topology and grids, and for flow simulation, though not yet implemented in the software, are discussed along with the reasons why particular methods were chosen. Some of the issues that need to be addressed and discussed by the icing community are also included.
In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway
NASA Astrophysics Data System (ADS)
Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun
2016-12-01
HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.
Jürgensen, Lars; Ehimen, Ehiaze Augustine; Born, Jens; Holm-Nielsen, Jens Bo
2015-02-01
This study aimed to investigate the feasibility of substitute natural gas (SNG) generation using biogas from anaerobic digestion and hydrogen from renewable energy systems. Using thermodynamic equilibrium analysis, kinetic reactor modeling and transient simulation, an integrated approach for the operation of a biogas-based Sabatier process was put forward, which was then verified using a lab-scale heterogeneous methanation reactor. The process simulation using a kinetic reactor model demonstrated the feasibility of producing SNG at gas grid standards using a single reactor setup. The Wobbe index, CO2 content and calorific value were found to be controllable by the H2/CO2 ratio fed to the methanation reactor. An optimal H2/CO2 ratio of 3.45-3.7 was seen to result in a product gas with high calorific value and Wobbe index. The dynamic reactor simulation verified that process start-up was feasible within several minutes, facilitating the use of surplus electricity from renewable energy systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
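The role of the H2/CO2 feed ratio can be illustrated with simple Sabatier stoichiometry (CO2 + 4 H2 → CH4 + 2 H2O). The following sketch is illustrative only, not the paper's kinetic model: it assumes an idealized biogas feed (60/40 CH4/CO2), complete conversion of the limiting reactant, and condensation of all product water.

```python
def sabatier_product(ch4_in, co2_in, h2_per_co2, conversion=1.0):
    """Dry product composition (mole fractions) for CO2 + 4 H2 -> CH4 + 2 H2O.

    ch4_in, co2_in: moles of biogas components; h2_per_co2: feed ratio.
    A sub-stoichiometric ratio (< 4) leaves CO2 in the product gas.
    """
    h2_in = h2_per_co2 * co2_in
    # reaction extent limited by either CO2 or H2 supply
    extent = conversion * min(co2_in, h2_in / 4.0)
    ch4 = ch4_in + extent
    co2 = co2_in - extent
    h2 = h2_in - 4.0 * extent
    total = ch4 + co2 + h2  # water assumed removed by condensation
    return {k: v / total for k, v in (("CH4", ch4), ("CO2", co2), ("H2", h2))}

# 1 mol of 60/40 biogas at H2/CO2 = 3.6, inside the reported 3.45-3.7 window
comp = sabatier_product(ch4_in=0.6, co2_in=0.4, h2_per_co2=3.6)
print(comp)
```

Under these idealized assumptions a ratio just below 4 leaves essentially no H2 in the dry product and only a few percent CO2, consistent with the qualitative finding that the product gas quality is controllable by the feed ratio.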
FACE-IT. A Science Gateway for Food Security Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montella, Raffaele; Kelly, David; Xiong, Wei
Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
ERIC Educational Resources Information Center
Kaup, Barbara; Ludtke, Jana; Maienborn, Claudia
2010-01-01
In two experiments using the action-sentence-compatibility paradigm we investigated the simulation processes that readers undertake when processing state descriptions with adjectives (e.g., "Die Schublade ist offen/zu". ["The drawer is open/shut"]) or adjectival passives (e.g., "Die Schublade ist…
Taplay, Karyn; Jack, Susan M; Baxter, Pamela; Eva, Kevin; Martin, Lynn
2015-01-01
The aim of this study is to explain the process of adopting and incorporating simulation as a teaching strategy in undergraduate nursing programs, define uptake, and discuss potential outcomes. In many countries, simulation is increasingly adopted as a common teaching strategy. However, there is a dearth of knowledge related to the process of adoption and incorporation. We used an interpretive, constructivist approach to grounded theory to guide this research study. The study was conducted in Ontario, Canada, during 2011-2012. The development of this theory was informed by multiple data sources, including in-depth interviews (n = 43) and a review of key organizational documents, such as mission and vision statements (n = 67), from multiple nursing programs (n = 13). The adoption and uptake of mid- to high-fidelity simulation equipment is a multistep iterative process involving various organizational levels within the institution that entails a seven-phase process: (a) securing resources, (b) nursing leaders working in tandem, (c) getting it out of the box, (d) learning about simulation and its potential for teaching, (e) finding a fit, (f) trialing the equipment, and (g) integrating into the curriculum. These findings could assist nursing programs in Canada and internationally that wish to adopt or further incorporate simulation into their curricula, and they highlight potential organizational and program level outcomes. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
Just, Sarah; Toschkoff, Gregor; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes; Knop, Klaus; Kleinebudde, Peter
2013-03-01
Coating of solid dosage forms is an important unit operation in the pharmaceutical industry. In recent years, numerical simulations of drug manufacturing processes have been gaining interest as process analytical technology tools. The discrete element method (DEM) in particular is suitable to model tablet-coating processes. For the development of accurate simulations, information on the material properties of the tablets is required. In this study, the mechanical parameters Young's modulus, coefficient of restitution (CoR), and coefficients of friction (CoF) of gastrointestinal therapeutic systems (GITS) and of active-coated GITS were measured experimentally. The dynamic angle of repose of these tablets in a drum coater was investigated to revise the CoF. The resulting values were used as input data in DEM simulations to compare simulation and experiment. A mean value of Young's modulus of 31.9 MPa was determined by the uniaxial compression test. The CoR was found to be 0.78. For both tablet-steel and tablet-tablet friction, active-coated GITS showed a higher CoF compared with GITS. According to the values of the dynamic angle of repose, the CoF was adjusted to obtain consistent tablet motion in the simulation and in the experiment. On the basis of this experimental characterization, mechanical parameters are integrated into DEM simulation programs to perform numerical analysis of coating processes.
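The measured coefficient of restitution (CoR = 0.78) is the kind of parameter a DEM contact model consumes directly. A minimal toy illustration of what this value implies, not part of the study's DEM setup: rebound speed after impact is e times impact speed, so successive rebound heights scale by e².

```python
def rebound_heights(h0, cor, n_bounces):
    """Successive rebound heights for a tablet-like body dropped from h0.

    With coefficient of restitution e, rebound speed = e * impact speed,
    so each rebound height is e**2 times the previous one.
    """
    heights = []
    h = h0
    for _ in range(n_bounces):
        h *= cor ** 2
        heights.append(h)
    return heights

# CoR = 0.78 as measured for the GITS tablets; drop height 0.10 m is illustrative
print(rebound_heights(h0=0.10, cor=0.78, n_bounces=3))
```

A CoR of 0.78 means roughly 39% of kinetic energy is lost per impact (1 − 0.78² ≈ 0.39), which is what damps tablet motion in the simulated drum coater.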
Simulation of aerobic and anaerobic biodegradation processes at a crude oil spill site
Essaid, Hedeff I.; Bekins, Barbara A.; Godsy, E. Michael; Warren, Ean; Baedecker, Mary Jo; Cozzarelli, Isabelle M.
1995-01-01
A two-dimensional, multispecies reactive solute transport model with sequential aerobic and anaerobic degradation processes was developed and tested. The model was used to study the field-scale solute transport and degradation processes at the Bemidji, Minnesota, crude oil spill site. The simulations included the biodegradation of volatile and nonvolatile fractions of dissolved organic carbon by aerobic processes, manganese and iron reduction, and methanogenesis. Model parameter estimates were constrained by published Monod kinetic parameters, theoretical yield estimates, and field biomass measurements. Despite the considerable uncertainty in the model parameter estimates, results of simulations reproduced the general features of the observed groundwater plume and the measured bacterial concentrations. In the simulation, 46% of the total dissolved organic carbon (TDOC) introduced into the aquifer was degraded. Aerobic degradation accounted for 40% of the TDOC degraded. Anaerobic processes accounted for the remaining 60% of degradation of TDOC: 5% by Mn reduction, 19% by Fe reduction, and 36% by methanogenesis. Thus anaerobic processes account for more than half of the removal of DOC at this site.
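The Monod kinetics constraining the model parameters can be sketched as a simple substrate-biomass system. This is a generic illustration with made-up parameter values, not the paper's calibrated multispecies transport model: an explicit-Euler integration of growth-limited substrate consumption.

```python
def monod_step(s, x, dt, mu_max, ks, yield_coeff, decay):
    """One explicit-Euler step of Monod-kinetics biodegradation.

    s: substrate (mg/L), x: biomass (mg/L). Specific growth rate is
    mu = mu_max * s / (ks + s); substrate is consumed at mu * x / Y.
    """
    mu = mu_max * s / (ks + s)
    ds = -(mu / yield_coeff) * x * dt
    dx = (mu - decay) * x * dt
    return max(s + ds, 0.0), max(x + dx, 0.0)

# Illustrative (not site-specific) parameters: mu_max = 2/day, Ks = 1 mg/L,
# Y = 0.5, first-order biomass decay 0.05/day; integrate 10 days at dt = 0.01 day
s, x = 10.0, 0.1
for _ in range(1000):
    s, x = monod_step(s, x, 0.01, mu_max=2.0, ks=1.0, yield_coeff=0.5, decay=0.05)
print(round(s, 3), round(x, 3))
```

The characteristic Monod behavior — near-exponential consumption while substrate is abundant, tapering as s falls below Ks — is what lets field biomass measurements constrain the degradation fractions reported above.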
Monte Carlo simulations of safeguards neutron counter for oxide reduction process feed material
NASA Astrophysics Data System (ADS)
Seo, Hee; Lee, Chaehun; Oh, Jong-Myeong; An, Su Jung; Ahn, Seong-Kyu; Park, Se-Hwan; Ku, Jeong-Hoe
2016-10-01
One of the options for spent-fuel management in Korea is pyroprocessing, whose main process flow is the head-end process followed by oxide reduction, electrorefining, and electrowinning. In the present study, a well-type passive neutron coincidence counter, namely, the ACP (Advanced spent fuel Conditioning Process) safeguards neutron counter (ASNC), was redesigned for safeguards of a hot-cell facility related to the oxide reduction process. To this end, first, the isotopic composition, gamma/neutron emission yield, and energy spectrum of the feed material (i.e., the UO2 porous pellet) were calculated using the OrigenARP code. Then, the proper thickness of the gamma-ray shield was determined, both by irradiation testing at a standard dosimetry laboratory and by MCNP6 simulations using the parameters obtained from the OrigenARP calculation. Finally, the neutron coincidence counter's calibration curve for 100- to 1000-g porous pellets, in consideration of the process batch size, was determined through simulations. Based on these simulation results, the neutron counter is currently under construction. In the near future, it will be installed in a hot cell and tested with spent fuel materials.
GPU-based efficient realistic techniques for bleeding and smoke generation in surgical simulators.
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-12-01
In actual surgery, smoke and bleeding due to cauterization processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, visual updates must be performed at least at 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available Central Processing Unit (CPU) resources. In this study we developed a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators, which outsources the computations to the graphical processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. The smoke and bleeding simulations were implemented as part of a laparoscopic adjustable gastric banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale).
Based on the performance results and subject study, both bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. Copyright © 2010 John Wiley & Sons, Ltd.
Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes
NASA Astrophysics Data System (ADS)
Cropper, A. E.; Wang, Z.
1995-08-01
Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability made possible through a process involving mixing and splitting of material.
A general software reliability process simulation technique
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1991-01-01
The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
Model-Based Verification and Validation of the SMAP Uplink Processes
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun
2013-01-01
This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead-time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis has become a critical business segment in the GM math-based die engineering process. As simulation has become one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures for stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support fast VDP and tooling readiness. Since 1997, General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
Computer Based Simulation of Laboratory Experiments.
ERIC Educational Resources Information Center
Edward, Norrie S.
1997-01-01
Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…
Teaching Process Simulation in Eleven Easy Lessons Using Excel and Its Tools
NASA Astrophysics Data System (ADS)
Morris, Arthur E.
The primary market driver for improving process technology is innovation, which requires a skilled and educated workforce. However, many Materials Science and Engineering departments have eliminated extractive metallurgy and chemical thermodynamics from their curricula, yet these topics contain the necessary fundamentals for process innovation. As a result, most MS&E students are ill-prepared for careers in processing. The dearth of process-oriented MS&E curricula has prompted some universities to develop a "shared" effort to offer distance education between multiple institutions [1]. A shared process simulation course would not only benefit students, but could also serve as the basis for an on-line course for practicing engineers faced with new or changing career choices. To fill the gap, the basics of a process simulation course were developed in abbreviated form as a series of eleven articles and Excel workbooks published in Industrial Heating magazine between July 2012 and July 2013.
Modeling the Gas Nitriding Process of Low Alloy Steels
NASA Astrophysics Data System (ADS)
Yang, M.; Zimmerman, C.; Donahue, D.; Sisson, R. D.
2013-07-01
The effort to simulate the nitriding process has been ongoing for the last 20 years. Most of the work has been done to simulate the nitriding of pure iron. In the present work a series of experiments was performed to understand the effects of nitriding process parameters such as the nitriding potential, temperature, and time, as well as surface condition, on the gas nitriding of steels. A compound layer growth model has been developed to simulate the nitriding of AISI 4140 steel. In this paper the fundamentals of the model are presented and discussed, including the kinetics of compound layer growth and the determination of the nitrogen diffusivity in the diffusion zone. Excellent agreement between the experimental data and simulation results was achieved for both as-washed and pre-oxidized nitrided AISI 4140. The nitrogen diffusivity in the diffusion zone was found to be constant, depending only on the nitriding temperature (~5 × 10⁻⁹ cm²/s at 548 °C). This supports the concept of applying the compound layer growth model to other steels, whose nitriding processes can thus be modeled and predicted in the future.
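With a constant diffusivity, the order of magnitude of the diffusion-zone depth can be sketched with the classic parabolic relation x ≈ √(2Dt). This is a back-of-the-envelope check using the reported diffusivity, not the paper's compound layer growth model, which treats the layer kinetics in more detail.

```python
import math

def diffusion_depth_um(d_cm2_s, hours):
    """Characteristic diffusion depth x = sqrt(2 * D * t), in micrometers."""
    t_s = hours * 3600.0
    return math.sqrt(2.0 * d_cm2_s * t_s) * 1e4  # cm -> um

# D ~ 5e-9 cm^2/s at 548 C, as reported for the diffusion zone;
# treatment times are illustrative
for h in (4, 16, 64):
    print(h, "h ->", round(diffusion_depth_um(5e-9, h), 1), "um")
```

The √t scaling is why quadrupling the nitriding time only doubles the characteristic case depth, a practical constraint in scheduling long nitriding cycles.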
Advancing Nucleosynthesis in Core-Collapse Supernovae Models Using 2D CHIMERA Simulations
NASA Astrophysics Data System (ADS)
Harris, J. A.; Hix, W. R.; Chertkow, M. A.; Bruenn, S. W.; Lentz, E. J.; Messer, O. B.; Mezzacappa, A.; Blondin, J. M.; Marronetti, P.; Yakunin, K.
2014-01-01
The deaths of massive stars as core-collapse supernovae (CCSN) serve as a crucial link in understanding galactic chemical evolution since the birth of the universe via the Big Bang. We investigate CCSN in polar axisymmetric simulations using the multidimensional radiation hydrodynamics code CHIMERA. Computational costs have traditionally constrained the evolution of the nuclear composition in CCSN models to, at best, a 14-species α-network. However, the limited capacity of the α-network to accurately evolve detailed composition, the neutronization and the nuclear energy generation rate has fettered the ability of prior CCSN simulations to accurately reproduce the chemical abundances and energy distributions as known from observations. These deficits can be partially ameliorated by "post-processing" with a more realistic network. Lagrangian tracer particles placed throughout the star record the temporal evolution of the initial simulation and enable the extension of the nuclear network evolution by incorporating larger systems in post-processing nucleosynthesis calculations. We present post-processing results of the four ab initio axisymmetric CCSN 2D models of Bruenn et al. (2013) evolved with the smaller α-network, and initiated from stellar metallicity, non-rotating progenitors of mass 12, 15, 20, and 25 M⊙ from Woosley & Heger (2007). As a test of the limitations of post-processing, we provide preliminary results from an ongoing simulation of the 15 M⊙ model evolved with a realistic 150 species nuclear reaction network in situ. With more accurate energy generation rates and an improved determination of the thermodynamic trajectories of the tracer particles, we can better unravel the complicated multidimensional "mass-cut" in CCSN simulations and probe for less energetically significant nuclear processes like the νp-process and the r-process, which require still larger networks.
Failure Analysis of a Sheet Metal Blanking Process Based on Damage Coupling Model
NASA Astrophysics Data System (ADS)
Wen, Y.; Chen, Z. H.; Zang, Y.
2013-11-01
In this paper, a blanking process of sheet metal is studied by the methods of numerical simulation and experimental observation. The effects of varying technological parameters on the quality of products are investigated. An elastoplastic constitutive equation accounting for isotropic ductile damage is implemented into the finite element code ABAQUS with a user-defined material subroutine UMAT. Simulations of the damage evolution and ductile fracture in a sheet metal blanking process have been carried out by the FEM. In order to guarantee computational accuracy and avoid numerical divergence during large plastic deformation, a specified remeshing technique is successively applied when severe element distortion occurs. In the simulation, the evolution of damage at different stages of the blanking process has been evaluated, and the damage distributions obtained from simulation are in good agreement with the experimental results.
NASA Astrophysics Data System (ADS)
Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.
2016-04-01
Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint were measured, and different acquisition windows were analysed. The subjective perception of the quality of the simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. Also, the basic processes implemented allowed for the acquisition of three signal channels for commanding the simulation. Finally, the communication between different applications proved reasonably efficient, although it depends on the amount of data to be sent.
NASA Astrophysics Data System (ADS)
Yan, Xuewei; Wang, Run'nan; Xu, Qingyan; Liu, Baicheng
2017-04-01
Mathematical models for dynamic heat radiation and convection boundary in directional solidification processes are established to simulate the temperature fields. Cellular automaton (CA) method and Kurz-Giovanola-Trivedi (KGT) growth model are used to describe nucleation and growth. Primary dendritic arm spacing (PDAS) and secondary dendritic arm spacing (SDAS) are calculated by the Ma-Sham (MS) and Furer-Wunderlin (FW) models respectively. The mushy zone shape is investigated based on the temperature fields, for both high-rate solidification (HRS) and liquid metal cooling (LMC) processes. The evolution of the microstructure and crystallographic orientation are analyzed by simulation and electron back-scattered diffraction (EBSD) technique, respectively. Comparison of the simulation results from PDAS and SDAS with experimental results reveals a good agreement with each other. The results show that LMC process can provide both dendritic refinement and superior performance for castings due to the increased cooling rate and thermal gradient.
Design of a high-speed digital processing element for parallel simulation
NASA Technical Reports Server (NTRS)
Milner, E. J.; Cwynar, D. S.
1983-01-01
A prototype of a custom designed computer to be used as a processing element in a multiprocessor based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real time simulations are needed for closed loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by: prefetching the next instruction while the current one is executing, transporting data using high speed data busses, and using state of the art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.
Characterization and Evaluation of Lunar Regolith and Simulants
NASA Technical Reports Server (NTRS)
Cross, William M.; Murphy, Gloria A.
2010-01-01
A NASA-ESMD (National Aeronautics and Space Administration-Exploration Systems Mission Directorate) funded senior design project, "Mineral Separation Technology for Lunar Regolith Simulant Production," is directed toward designing processes to produce simulant materials as close to lunar regolith as possible. The eight undergraduate (junior and senior) students involved are taking a systems engineering design approach to identifying the most pressing concerns in simulant needs, then designing subsystems and processing strategies to meet these needs using terrestrial materials. This allows the students not only to learn the systems engineering design process, but also to make a significant contribution to an important NASA ESMD project. This paper primarily focuses on the implementation aspect, particularly as related to the systems engineering process, of this NASA ESMD senior design project. In addition, a comparison of the NASA ESMD group experience with the implementation of systems engineering practices in a group of existing design projects is given.
Accelerated simulation of stochastic particle removal processes in particle-resolved aerosol models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, J.H.; Michelotti, M.D.; Riemer, N.
2016-10-01
Stochastic particle-resolved methods have proven useful for simulating multi-dimensional systems such as composition-resolved aerosol size distributions. While particle-resolved methods have substantial benefits for highly detailed simulations, these techniques suffer from high computational cost, motivating efforts to improve their algorithmic efficiency. Here we formulate an algorithm for accelerating particle removal processes by aggregating particles of similar size into bins. We present the Binned Algorithm for particle removal processes and analyze its performance with application to the atmospherically relevant process of aerosol dry deposition. We show that the Binned Algorithm can dramatically improve the efficiency of particle removals, particularly for low removal rates, and that computational cost is reduced without introducing additional error. In simulations of aerosol particle removal by dry deposition in atmospherically relevant conditions, we demonstrate an approximately 50-fold increase in algorithm efficiency.
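The idea of bounding removal rates per size bin can be sketched with a generic thinning (accept-reject) scheme. This is an illustration of the general technique, not the paper's exact Binned Algorithm: particles are grouped into diameter bins, each bin's maximum rate gives a cheap upper-bound removal probability, and the expensive per-particle rate is only evaluated for candidates that pass the bound.

```python
import math
import random

def binned_removal(particles, rate_fn, dt, n_bins=10, rng=random):
    """Remove particles over one time step using per-bin rate bounds.

    particles: list of diameters. rate_fn(d): removal rate (1/s).
    Thinning keeps the sampling statistically exact: removal probability
    for a particle is p_max * (p_d / p_max) = p_d = 1 - exp(-rate * dt).
    """
    d_min, d_max = min(particles), max(particles)
    width = (d_max - d_min) / n_bins or 1.0
    bins = [[] for _ in range(n_bins)]
    for d in particles:
        bins[min(int((d - d_min) / width), n_bins - 1)].append(d)

    survivors = []
    for members in bins:
        if not members:
            continue
        r_max = max(rate_fn(d) for d in members)   # per-bin rate bound
        p_max = 1.0 - math.exp(-r_max * dt)        # upper-bound removal prob.
        for d in members:
            if rng.random() < p_max:
                # candidate: thin the bound down to the true rate
                p_d = 1.0 - math.exp(-rate_fn(d) * dt)
                if rng.random() < p_d / p_max:
                    continue                        # removed
            survivors.append(d)
    return survivors

random.seed(0)
parts = [random.uniform(0.1, 10.0) for _ in range(10000)]
# hypothetical size-proportional removal rate, ~5% mean removal per step
kept = binned_removal(parts, rate_fn=lambda d: 0.01 * d, dt=1.0)
print(len(kept))
```

The efficiency gain comes from the first cheap test: at low removal rates p_max is small, so the per-particle rate is rarely evaluated, which is consistent with the paper's observation that the speedup is largest for low removal rates.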
Simulation of mass storage systems operating in a large data processing facility
NASA Technical Reports Server (NTRS)
Holmes, R.
1972-01-01
A mass storage simulation program was written to aid system designers in the design of a data processing facility. It acts as a tool for measuring the overall effect on the facility of on-line mass storage systems, and it provides the means of measuring and comparing the performance of competing mass storage systems. The performance of the simulation program is demonstrated.
Robert M. Scheller; James B. Domingo; Brian R. Sturtevant; Jeremy S. Williams; Arnold Rudy; Eric J. Gustafson; David J. Mladenoff
2007-01-01
We introduce LANDIS-II, a landscape model designed to simulate forest succession and disturbances. LANDIS-II builds upon and preserves the functionality of previous LANDIS forest landscape simulation models. LANDIS-II is distinguished by the inclusion of variable time steps for different ecological processes; our use of a rigorous development and testing process used...
Lau, Nathan; Jamieson, Greg A; Skraaning, Gyrd
2016-03-01
The Process Overview Measure is a query-based measure developed to assess operator situation awareness (SA) from monitoring process plants. A companion paper describes how the measure has been developed according to process plant properties and operator cognitive work. The Process Overview Measure demonstrated practicality, sensitivity, validity and reliability in two full-scope simulator experiments investigating dramatically different operational concepts. Practicality was assessed based on qualitative feedback of participants and researchers. The Process Overview Measure demonstrated sensitivity and validity by revealing significant effects of experimental manipulations that corroborated with other empirical results. The measure also demonstrated adequate inter-rater reliability and practicality for measuring SA in full-scope simulator settings based on data collected on process experts. Thus, full-scope simulator studies can employ the Process Overview Measure to reveal the impact of new control room technology and operational concepts on monitoring process plants. Practitioner Summary: The Process Overview Measure is a query-based measure that demonstrated practicality, sensitivity, validity and reliability for assessing operator situation awareness (SA) from monitoring process plants in representative settings.
NASA Astrophysics Data System (ADS)
Xie, Z.; Zou, J.; Qin, P.; Sun, Q.
2014-12-01
In this study, we incorporated a groundwater exploitation scheme into the land surface model CLM3.5 to investigate the effects of the anthropogenic exploitation of groundwater on land surface processes in a river basin. Simulations of the Haihe River Basin in northern China were conducted for the years 1965-2000 using the model. A control simulation without exploitation and three exploitation simulations with different water demands derived from socioeconomic data related to the Basin were conducted. The results showed that groundwater exploitation for human activities resulted in increased wetting and cooling effects at the land surface and reduced groundwater storage. A lowering of the groundwater table, increased upper soil moisture, reduced 2 m air temperature, and enhanced latent heat flux were detected by the end of the simulated period, and the changes at the land surface were related linearly to the water demands. To determine the possible responses of the land surface processes in extreme cases (i.e., in which the exploitation process either continued or ceased), additional hypothetical simulations for the coming 200 years with constant climate forcing were conducted, regardless of changes in climate. The simulations revealed that if exploitation continued at the current rate, the local groundwater storage on the plains could not sustain high-intensity exploitation for long: changes attributable to groundwater exploitation would reach extreme values and then weaken within decades as groundwater resources were depleted, forcing the exploitation process to cease. However, if exploitation were stopped completely to allow groundwater to recover, drying and warming effects, such as increased temperature, reduced soil moisture, and reduced total runoff, would occur in the Basin within the early decades of the simulation period.
The effects of exploitation will then gradually disappear, and the land surface variables will approach the natural state and stabilize at different rates. Simulations were also conducted for cases in which exploitation either continues or ceases using future climate scenario outputs from a general circulation model. The resulting trends were almost the same as those of the simulations with constant climate forcing.
A cascading failure analysis tool for post processing TRANSCARE simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
This is a MATLAB-based tool to post-process simulation results from the EPRI software TRANSCARE for massive cascading failure analysis following severe disturbances. The tool provides several key modules: (1) automatically creating a contingency list to run TRANSCARE simulations, including substation outages above a certain kV threshold and N-k (1, 2, or 3) generator and branch outages; (2) reading in and analyzing a CKO file of PCG definition, an initiating event list, and a CDN file; (3) post-processing all simulation results saved in a CDN file and performing critical event corridor analysis; (4) providing a summary of TRANSCARE simulations; (5) identifying the most frequently occurring event corridors in the system; and (6) ranking the contingencies using a user-defined security index to quantify consequences in terms of total load loss, total number of cascades, etc.
CPAS Preflight Drop Test Analysis Process
NASA Technical Reports Server (NTRS)
Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.
2015-01-01
Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.
NASA Astrophysics Data System (ADS)
Cao, Duc; Moses, Gregory; Delettrez, Jacques; Collins, Timothy
2014-10-01
A design process is presented for the nonlocal thermal transport iSNB (implicit Schurtz, Nicolai, and Busquet) model to provide reliable nonlocal thermal transport in polar-drive ICF simulations. Results from the iSNB model are known to be sensitive to changes in the SNB "mean free path" formula, and the latter's original form required modification to obtain realistic preheat levels. In the presented design process, SNB mean free paths are first modified until the model can match temperatures from Goncharov's thermal transport model in 1D temperature relaxation simulations. Afterwards the same mean free paths are tested in a 1D polar-drive surrogate simulation to match adiabats from Goncharov's model. After passing the two previous steps, the model can then be run in a full 2D polar-drive simulation. This research is supported by the University of Rochester Laboratory for Laser Energetics.
NASA Astrophysics Data System (ADS)
Yang, Yuansheng; Zhao, Fuze; Feng, Xiaohui
2017-10-01
The dispersion of carbon nanotubes (CNTs) in AZ91D melt by ultrasonic processing and the microstructure formation of CNTs/AZ91D composite were studied using numerical and physical simulations. The sound field and acoustic streaming were predicted using the finite element method. Meanwhile, the optimal immersion depth of the ultrasonic probe and a suitable ultrasonic power were obtained. A single-bubble model was used to predict ultrasonic cavitation in AZ91D melt. The relationship between sound pressure amplitude and ultrasonic cavitation was established. Physical simulations of acoustic streaming and ultrasonic cavitation agreed well with the numerical simulations. It was confirmed that the dispersion of carbon nanotubes was remarkably improved by ultrasonic processing. The microstructure formation of CNTs/AZ91D composite was numerically simulated using the cellular automaton method. In addition, grain refinement was achieved and the growth of dendrites was changed due to the uniform dispersion of CNTs.
Simulation of Mechanical Behavior of Agglutinates
NASA Technical Reports Server (NTRS)
Nakagawa, Masami; Moon, Tae-Hyun
2005-01-01
Due to the lack of "real" lunar soil or even lunar simulant, it is difficult to characterize the interaction between lunar soil (or simulant) and the different surfaces involved in excavation and processing machinery. One unique feature possessed by lunar soil is the agglutinates produced by repeated high-speed micrometeoroid impacts and subsequent pulverization [1 and 2]. The large particles are impacted by micrometeoroids [Fig. 1] and pulverized to produce finer particles. This process continues until there are no more "large" particles left on the surface of the moon. Due to the high impact speed, the impact melting process fuses fines to make agglutinates such as shown in Fig. 2. We will present a series of simulation results, and movies will be shown to illustrate the brittle behavior of each individual agglutinate as well as compressibility charts similar to those shown by Carrier et al. [3]. Fig. 3 shows our preliminary result of the simulated oedometer tests.
Keane, R E; Ryan, K C; Running, S W
1996-03-01
A mechanistic, biogeochemical succession model, FIRE-BGC, was used to investigate the role of fire on long-term landscape dynamics in northern Rocky Mountain coniferous forests of Glacier National Park, Montana, USA. FIRE-BGC is an individual-tree model, created by merging the gap-phase process-based model FIRESUM with the mechanistic ecosystem biogeochemical model FOREST-BGC, that has mixed spatial and temporal resolution in its simulation architecture. Ecological processes that act at a landscape level, such as fire and seed dispersal, are simulated annually from stand and topographic information. Stand-level processes, such as tree establishment, growth and mortality, organic matter accumulation and decomposition, and undergrowth plant dynamics are simulated both daily and annually. Tree growth is mechanistically modeled based on the ecosystem process approach of FOREST-BGC, where carbon is fixed daily by forest canopy photosynthesis at the stand level. Carbon allocated to the tree stem at the end of the year generates the corresponding diameter and height growth. The model also explicitly simulates fire behavior and effects on landscape characteristics. We simulated the effects of fire on ecosystem characteristics of net primary productivity, evapotranspiration, standing crop biomass, nitrogen cycling and leaf area index over 200 years for the 50,000-ha McDonald Drainage in Glacier National Park. Results show increases in net primary productivity and available nitrogen when fires are included in the simulation. Standing crop biomass and evapotranspiration decrease under a fire regime. Shade-intolerant species dominate the landscape when fires are excluded. Model tree increment predictions compared well with field data.
NASA Astrophysics Data System (ADS)
Hieu, Nguyen Huu
2017-09-01
Pervaporation is a potential process for the final step of ethanol biofuel production. In this study, a mathematical model was developed based on the resistance-in-series model, and a simulation was carried out using the specialized simulation software COMSOL Multiphysics to describe a tubular-type pervaporation module with membranes for the dehydration of ethanol solution. The permeance of the membranes, operating conditions, and feed conditions in the simulation were taken from experimental data previously reported in the literature. Accordingly, the simulated temperature and density profiles of pure water and the ethanol-water mixture were validated against existing published data.
Simulating complex intracellular processes using object-oriented computational modelling.
Johnson, Colin G; Goldman, Jacki P; Gullick, William J
2004-11-01
The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide
NASA Technical Reports Server (NTRS)
Khayat, Michael A.
2011-01-01
The simulation process, milestones and inputs are unknown to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.
Simulation of the effect of incline incident angle in DMD Maskless Lithography
NASA Astrophysics Data System (ADS)
Liang, L. W.; Zhou, J. Y.; Xiang, L. L.; Wang, B.; Wen, K. H.; Lei, L.
2017-06-01
The aim of this study is to provide a simulation method for investigation of the intensity fluctuation caused by the inclined incident angle in DMD (digital micromirror device) maskless lithography. The simulation consists of eight main processes involving the simplification of the DMD aperture function and light propagation utilizing the non-parallel angular spectrum method. These processes provide a possibility of co-simulation in the spatial frequency domain, which combines the microlens array and DMD in the maskless lithography system. The simulation provided the spot shape and illumination distribution. These two parameters are crucial in determining the exposure dose in the existing maskless lithography system.
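The light-propagation step named above relies on the angular spectrum method, which propagates a sampled complex field by filtering its 2-D Fourier transform with a free-space transfer function. The sketch below illustrates the basic (parallel-plane) form of the method on a hypothetical square aperture; it is not the paper's co-simulation code, and the grid size, pixel pitch, and wavelength are assumed values.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z between parallel
    planes using the angular spectrum method (illustrative sketch)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)              # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Toy aperture: a square "micromirror" opening on a 256x256 grid.
n, dx, wl = 256, 1e-6, 500e-9                 # assumed sampling/wavelength
aperture = np.zeros((n, n), dtype=complex)
aperture[96:160, 96:160] = 1.0
out = angular_spectrum_propagate(aperture, wl, dx, 50e-6)
intensity = np.abs(out)**2                    # illumination distribution
```

Because the transfer function has unit modulus on propagating components, total energy is conserved in this toy case, which is a convenient sanity check for the frequency-domain bookkeeping.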
Viscous and thermal modelling of thermoplastic composites forming process
NASA Astrophysics Data System (ADS)
Guzman, Eduardo; Liang, Biao; Hamila, Nahiene; Boisse, Philippe
2016-10-01
Thermoforming thermoplastic prepregs is a fast manufacturing process. It is suitable for automotive composite parts manufacturing. The simulation of thermoplastic prepreg forming is achieved by alternate thermal and mechanical analyses. The thermal properties are obtained from a mesoscopic analysis and a homogenization procedure. The forming simulation is based on a viscous-hyperelastic approach. The thermal simulations define the coefficients of the mechanical model that depend on the temperature. The forming simulations modify the boundary conditions and the internal geometry of the thermal analyses. The comparison of the simulation with an experimental thermoforming of a part representative of automotive applications shows the efficiency of the approach.
NASA Astrophysics Data System (ADS)
Park, Han-Earl; Park, Sang-Young; Kim, Sung-Woo; Park, Chandeok
2013-12-01
Development and experiment of an integrated orbit and attitude hardware-in-the-loop (HIL) simulator for autonomous satellite formation flying are presented. The integrated simulator system consists of an orbit HIL simulator for orbit determination and control, and an attitude HIL simulator for attitude determination and control. The integrated simulator involves four processes (orbit determination, orbit control, attitude determination, and attitude control), which interact with each other in the same way as actual flight processes do. Orbit determination is conducted by a relative navigation algorithm using double-difference GPS measurements based on the extended Kalman filter (EKF). Orbit control is performed by a state-dependent Riccati equation (SDRE) technique that is utilized as a nonlinear controller for the formation control problem. Attitude is determined from an attitude heading reference system (AHRS) sensor, and a proportional-derivative (PD) feedback controller is used to control the attitude HIL simulator using three momentum wheel assemblies. Integrated orbit and attitude simulations are performed for a formation reconfiguration scenario. By performing the four processes adequately, the desired formation reconfiguration from a baseline of 500-1000 m was achieved with meter-level position error and millimeter-level relative position navigation. This HIL simulation demonstrates the performance of the integrated HIL simulator and the feasibility of the applied algorithms in a real-time environment. Furthermore, the integrated HIL simulator system developed in the current study can be used as a ground-based testing environment to reproduce possible actual satellite formation operations.
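The relative navigation step above is built on the extended Kalman filter. A minimal sketch of one EKF measurement update is given below; the state, measurement model, and numbers are hypothetical stand-ins, not the double-difference GPS formulation used in the paper.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update: fuse measurement z into state x with
    covariance P, given measurement function h, Jacobian H, and noise R."""
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy 1-D relative-position example with a direct position measurement.
x = np.array([0.0, 0.0])                # [relative position (m), velocity]
P = np.eye(2) * 10.0                    # assumed prior covariance
H = np.array([[1.0, 0.0]])              # we measure position only
R = np.array([[0.01]])                  # assumed measurement noise
x, P = ekf_update(x, P, np.array([500.0]), lambda s: H @ s, H, R)
```

After the update, the position estimate moves close to the 500 m measurement and its variance shrinks, which is the qualitative behavior the HIL navigation loop depends on.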
Application of ICME Methods for the Development of Rapid Manufacturing Technologies
NASA Astrophysics Data System (ADS)
Maiwald-Immer, T.; Göhler, T.; Fischersworring-Bunk, A.; Körner, C.; Osmanlic, F.; Bauereiß, A.
Rapid manufacturing technologies have lately been gaining interest as an alternative manufacturing method. Due to the large parameter sets applicable in these manufacturing methods and their impact on achievable material properties and quality, support of the manufacturing process development by the use of simulation is highly attractive. This is especially true for aerospace applications with their high quality demands and controlled scatter in the resulting material properties. The simulation techniques applicable to these manufacturing methods are manifold. The paper will focus on the melt pool simulation for an SLM (selective laser melting) process, which was originally developed for EBM (electron beam melting). It will be discussed in the overall context of a multi-scale simulation within a virtual process chain.
Designing Scenarios for Controller-in-the-Loop Air Traffic Simulations
NASA Technical Reports Server (NTRS)
Kupfer, Michael; Mercer, Joey S.; Cabrall, Christopher; Callantine, Todd
2013-01-01
Well prepared traffic scenarios contribute greatly to the success of controller-in-the-loop simulations. This paper describes each stage in the design process of realistic scenarios based on real-world traffic, to be used in the Airspace Operations Laboratory for simulations within the Air Traffic Management Technology Demonstration 1 effort. The steps from the initial analysis of real-world traffic, to the editing of individual aircraft records in the scenario file, until the final testing of the scenarios before the simulation conduct, are all described. The iterative nature of the design process and the various efforts necessary to reach the required fidelity, as well as the applied design strategies, challenges, and tools used during this process are also discussed.
NASA Astrophysics Data System (ADS)
Ammouri, Aymen; Ben Salah, Walid; Khachroumi, Sofiane; Ben Salah, Tarek; Kourda, Ferid; Morel, Hervé
2014-05-01
Design of integrated power converters needs prototype-less approaches. Specific simulations are required for the investigation and validation process. Simulation relies on active and passive device models. Models of planar devices, for instance, are still not available in power simulator tools. There is, thus, a specific limitation during the simulation process of integrated power systems. The paper focuses on the development of a physically-based planar inductor model and its validation inside a power converter during transient switching. The planar inductor remains a complex device to model, particularly when the skin, proximity and parasitic capacitance effects are taken into account. A heterogeneous simulation scheme, including circuit and device models, is successfully implemented in the VHDL-AMS language and simulated in the Simplorer platform. The mixed simulation results were tested and compared favorably with practical measurements. It is found that the multi-domain simulation results and measurement data are in close agreement.
Two-step simulation of velocity and passive scalar mixing at high Schmidt number in turbulent jets
NASA Astrophysics Data System (ADS)
Rah, K. Jeff; Blanquart, Guillaume
2016-11-01
Simulation of a passive scalar in the high Schmidt number turbulent mixing process requires a higher computational cost than that of the velocity fields, because the scalar is associated with smaller length scales than velocity. Thus, full simulation of both velocity and passive scalar at high Sc for a practical configuration is difficult to perform. In this work, a new approach to simulate velocity and passive scalar mixing at high Sc is suggested to reduce the computational cost. First, the velocity fields are resolved by Large Eddy Simulation (LES). Then, by extracting the velocity information from LES, the scalar inside a moving fluid blob is simulated by Direct Numerical Simulation (DNS). This two-step simulation method is applied to a turbulent jet and provides a new way to examine a scalar mixing process in a practical application with smaller computational cost. Supported by NSF and a Samsung Scholarship.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com; Suprijadi; Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132
Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the codes based on the MC algorithm that is widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study was aimed at investigating the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial condition, and on two GPUs with 384 cores and 2304 cores. In the GPU simulation, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulations on the GPU were significantly accelerated compared to the CPU. The simulations on the 2304-core GPU were performed about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were performed about 20-31 times faster than on a single core of the CPU. Another result shows that the optimum image quality from the simulation was obtained with the number of photon histories starting from 10^8 and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is relatively the same.
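The core idea behind such photon-transport codes, sampling many independent photon histories, can be illustrated with a deliberately stripped-down example: estimating slab transmission by sampling exponential free paths. This is a toy sketch, not MC-GPU's physics (which includes scattering and full interaction cross sections); the attenuation coefficient and thickness are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_transmission(n_photons, mu, thickness):
    """Toy Monte Carlo: fraction of photons crossing a slab without any
    interaction, by sampling exponential free-path lengths. Each photon
    history is independent, which is what makes the method map well to
    one-photon-per-core GPU execution."""
    paths = rng.exponential(1.0 / mu, size=n_photons)
    return np.mean(paths > thickness)

mu = 0.2           # assumed attenuation coefficient, 1/cm
t = 5.0            # assumed slab thickness, cm
est = mc_transmission(100_000, mu, t)
exact = np.exp(-mu * t)   # Beer-Lambert reference for this toy case
```

With 10^5 histories the estimate agrees with the analytic Beer-Lambert value to well under a percent, illustrating why image quality in the abstract improves with history count.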
NASA Astrophysics Data System (ADS)
Hashim, N. A.; Mudalip, S. K. Abdul; Harun, N.; Che Man, R.; Sulaiman, S. Z.; Arshad, Z. I. M.; Shaarani, S. M.
2018-05-01
Mahkota Dewa (Phaleria Macrocarpa), a good source of saponin, flavonoid, polyphenol, alkaloid, and mangiferin, has an extensive range of medicinal effects. Intermolecular interactions between solute and solvents, such as hydrogen bonding, are considered an important factor affecting the extraction of bioactive compounds. In this work, molecular dynamics simulation was performed to elucidate the hydrogen bonding that exists between Mahkota Dewa extracts and water during the subcritical extraction process. A bioactive compound in the Mahkota Dewa extract, namely mangiferin, was selected as a model compound. The simulation was performed at 373 K and 4.0 MPa using the COMPASS force field and Ewald summation method available in the Material Studio 7.0 simulation package. The radial distribution functions (RDF) between mangiferin and water signify the presence of hydrogen bonding in the extraction process. The simulation of the binary mixture of mangiferin:water shows that strong hydrogen bonding was formed. It is suggested that the intermolecular interaction OH2O···HMR4(OH1) is responsible for the mangiferin extraction process.
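The RDF analysis mentioned above can be sketched for a generic particle system: g(r) compares the observed pair-distance histogram against the count expected for an ideal gas at the same density. The implementation below is a minimal O(N²) version with the minimum-image convention (production MD codes use neighbour lists); it is an illustration of the quantity, not Material Studio's analysis tool.

```python
import numpy as np

def rdf(positions, box, r_max, n_bins):
    """Radial distribution function g(r) for particles in a cubic
    periodic box (minimal sketch). Returns bin centers and g(r)."""
    n = len(positions)
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)                  # minimum-image convention
    r = np.sqrt((d**2).sum(-1))[np.triu_indices(n, k=1)]
    hist, edges = np.histogram(r, bins=n_bins, range=(0.0, r_max))
    rho = n / box**3
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    ideal = rho * shell * n / 2                   # expected pair counts
    return 0.5 * (edges[1:] + edges[:-1]), hist / ideal

# Uniform random particles should give g(r) ~ 1 (no structure).
rng = np.random.default_rng(42)
centers, g = rdf(rng.random((500, 3)) * 10.0, box=10.0, r_max=4.0, n_bins=20)
```

A sharp peak in g(r) at a donor-acceptor distance near 1.8-2.0 Å is the usual signature of hydrogen bonding that such studies look for.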
Study on Roadheader Cutting Load at Different Properties of Coal and Rock
2013-01-01
The mechanism of the cutting process of a roadheader cutting head was researched, and the influence of coal and rock properties on the cutting load was analyzed in depth. Because the traditional calculation method for the cutting load cannot fully express the complex cutting process of the cutting head, finite element simulation was proposed to model the dynamic cutting process. To capture the characteristics of coal and rock that affect the cutting load, repeated simulations with different firmness coefficients were performed, and the relationship between the three-axis force and the firmness coefficient was derived. A comparative analysis of the cutting pick load between simulation results and the theoretical formula was carried out, and good consistency was achieved. A simulation of the cutting process with the complete cutting head was then carried out on this basis. The results show that the simulation analysis not only provides a reliable guarantee for the accurate calculation of the cutting head load and improves the efficiency of cutting head tests, but also offers a basis for the selection of cutting heads for different geological conditions of coal or rock. PMID:24302866
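Deriving a force-versus-firmness relationship from repeated simulations, as described above, amounts to a regression over the simulation outputs. The sketch below fits a linear model with `numpy.polyfit`; the firmness coefficients and forces are purely illustrative numbers, not data from the paper.

```python
import numpy as np

# Hypothetical simulation outputs: firmness coefficient f vs. mean
# resultant cutting force (kN). Values are illustrative only.
f = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
force = np.array([8.1, 15.9, 24.2, 31.8, 40.1])

# Least-squares linear fit: force ~ slope * f + intercept.
slope, intercept = np.polyfit(f, force, 1)

# Interpolate the expected load for an intermediate firmness coefficient.
predicted = slope * 3.5 + intercept
```

Once fitted, such a relationship lets a designer estimate cutting loads for geological conditions that were never simulated directly, which is the selection use case the abstract points to.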
Is Moving More Memorable than Proving? Effects of Embodiment and Imagined Enactment on Verb Memory
Sidhu, David M.; Pexman, Penny M.
2016-01-01
Theories of embodied cognition propose that sensorimotor information is simulated during language processing (e.g., Barsalou, 1999). Previous studies have demonstrated that differences in simulation can have implications for word processing; for instance, lexical processing is facilitated for verbs that have relatively more embodied meanings (e.g., Sidhu et al., 2014). Here we examined the effects of these differences on memory for verbs. We observed higher rates of recognition (Experiments 1a-2a) and recall accuracy (Experiments 2b-3b) for verbs with a greater amount of associated bodily information (i.e., an embodiment effect). We also examined how this interacted with the imagined enactment effect: a memory benefit for actions that one imagines performing (e.g., Ditman et al., 2010). We found that these two effects did not interact (Experiment 3b), suggesting that the memory benefits of automatic simulation (i.e., the embodiment effect) and deliberate simulation (i.e., the imagined enactment effect) are distinct. These results provide evidence for the role of simulation in language processing, and its effects on memory. PMID:27445956
BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.
Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing
2012-03-01
BioNetSim, a Petri net-based software for modeling and simulating biochemical processes, is developed, and its design and implementation are presented in this paper, including the logic construction, real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes), and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Similar to other Petri net-based software, BioNetSim performs well in graphical presentation and mathematical construction. Moreover, it offers several notable advantages. (1) It creates models in a database. (2) It realizes real-time access to KEGG and BioModel and transfers the data to Petri nets. (3) It provides qualitative analysis, such as computation of constants. (4) It generates graphs for tracing the concentration of every molecule during the simulation processes.
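The place/transition firing rule at the heart of any Petri-net simulator can be sketched in a few lines: places hold token counts, and a transition fires only when every input place holds enough tokens. This is a minimal illustration of the formalism itself, not BioNetSim's API; the reaction and arc weights are invented for the example.

```python
# Minimal Petri net semantics: a marking maps place names to token counts;
# pre/post maps give a transition's input and output arc weights.
def enabled(marking, pre):
    """A transition is enabled when all input places are marked enough."""
    return all(marking[p] >= w for p, w in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume input tokens, produce output tokens."""
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# Toy biochemical reaction A + B -> C encoded as one transition.
m0 = {"A": 2, "B": 1, "C": 0}
pre, post = {"A": 1, "B": 1}, {"C": 1}
m1 = fire(m0, pre, post) if enabled(m0, pre) else m0
```

Mapping molecular species to places and reactions to transitions in this way is what lets Petri-net tools trace concentrations (token counts) through a simulated pathway.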
Simulation Modeling of Software Development Processes
NASA Technical Reports Server (NTRS)
Calavaro, G. F.; Basili, V. R.; Iazeolla, G.
1996-01-01
A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
Dependence of Snowmelt Simulations on Scaling of the Forcing Processes (Invited)
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Marks, D. G.; Gurney, R. J.
2009-12-01
The spatial organization and scaling relationships of snow distribution in mountain environs is ultimately dependent on the controlling processes. These processes include interactions between weather, topography, vegetation, snow state, and seasonally-dependent radiation inputs. In large scale snow modeling it is vital to know these dependencies to obtain accurate predictions while reducing computational costs. This study examined the scaling characteristics of the forcing processes and the dependency of distributed snowmelt simulations to their scaling. A base model simulation characterized these processes with 10m resolution over a 14.0 km2 basin with an elevation range of 1474 - 2244 masl. Each of the major processes affecting snow accumulation and melt - precipitation, wind speed, solar radiation, thermal radiation, temperature, and vapor pressure - were independently degraded to 1 km resolution. Seasonal and event-specific results were analyzed. Results indicated that scale effects on melt vary by process and weather conditions. The dependence of melt simulations on the scaling of solar radiation fluxes also had a seasonal component. These process-based scaling characteristics should remain static through time as they are based on physical considerations. As such, these results not only provide guidance for current modeling efforts, but are also well suited to predicting how potential climate changes will affect the heterogeneity of mountain snow distributions.
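Degrading a forcing field from 10 m to 1 km resolution, as in the study above, is commonly done by block averaging. The sketch below shows that operation with a numpy reshape; the grid size is an assumed toy value, and this is an illustration of aggregation in general, not the study's exact procedure.

```python
import numpy as np

def coarsen(field, factor):
    """Degrade a fine 2-D field to coarser resolution by averaging
    non-overlapping factor x factor blocks (simple aggregation sketch)."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0
    return field.reshape(ny // factor, factor,
                         nx // factor, factor).mean(axis=(1, 3))

# Toy forcing field: 100 x 100 cells at 10 m -> one 1 km cell.
fine = np.random.default_rng(1).random((100, 100))
coarse = coarsen(fine, 100)
```

Block averaging preserves the domain mean exactly, so any change in simulated melt comes from the loss of sub-kilometre heterogeneity rather than from a bias in the mean forcing, which is the point the scaling experiments exploit.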
Validation of mathematical model for CZ process using small-scale laboratory crystal growth furnace
NASA Astrophysics Data System (ADS)
Bergfelds, Kristaps; Sabanskis, Andrejs; Virbulis, Janis
2018-05-01
This work focuses on the modelling of a small-scale laboratory NaCl-RbCl crystal growth furnace. First steps towards fully transient simulations are taken in the form of stationary simulations that deal with the optimization of material properties to match the model to experimental conditions. For this purpose, simulation software primarily used for the modelling of the industrial-scale silicon crystal growth process was successfully applied. Finally, transient simulations of the crystal growth are presented, showing sufficient agreement with experimental results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimminau, G; Nagler, B; Higginbotham, A
2008-06-19
Calculations of the x-ray diffraction patterns from shocked crystals derived from the results of Non-Equilibrium-Molecular-Dynamics (NEMD) simulations are presented. The atomic coordinates predicted by the NEMD simulations combined with atomic form factors are used to generate a discrete distribution of electron density. A Fast-Fourier-Transform (FFT) of this distribution provides an image of the crystal in reciprocal space, which can be further processed to produce quantitative simulated data for direct comparison with experiments that employ picosecond x-ray diffraction from laser-irradiated crystalline targets.
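The FFT step described above can be illustrated with a bare-bones analogue: place point-like "atoms" on a lattice in a discrete density grid and Fourier transform it; Bragg peaks appear at the reciprocal-lattice points. This 2-D toy is only a sketch of the post-processing idea, not the paper's NEMD pipeline, and it omits atomic form factors.

```python
import numpy as np

# Discrete electron density: point atoms on a square lattice.
n, a = 64, 8                       # grid size and lattice spacing (cells)
density = np.zeros((n, n))
density[::a, ::a] = 1.0            # (n // a)**2 = 64 atoms

# Reciprocal-space image: the FFT of the density. Intensity peaks sit at
# multiples of n // a = 8 along each frequency axis (the Bragg condition).
F = np.fft.fft2(density)
pattern = np.abs(F)**2             # simulated diffraction intensity
```

In the shocked-crystal case, lattice compression shifts these peaks, which is how the simulated pattern becomes directly comparable with picosecond x-ray diffraction data.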
Implementing a high-fidelity simulation program in a community college setting.
Tuoriniemi, Pamela; Schott-Baer, Darlene
2008-01-01
Despite their relatively high cost, there is heightened interest by faculty in undergraduate nursing programs to implement high-fidelity simulation (HFS) programs. High-fidelity simulators are appealing because they allow students to experience high-risk, low-volume patient problems in a realistic setting. The decision to purchase a simulator is the first step in the process of implementing and maintaining an HFS lab. Knowledge, technical skill, commitment, and considerable time are needed to develop a successful program. The process, as experienced by one community college nursing program, is described.
Multithreaded Stochastic PDES for Reactions and Diffusions in Neurons.
Lin, Zhongwei; Tropper, Carl; Mcdougal, Robert A; Patoary, Mohammand Nazrul Ishlam; Lytton, William W; Yao, Yiping; Hines, Michael L
2017-07-01
Cells exhibit stochastic behavior when the number of molecules is small. Hence a stochastic reaction-diffusion simulator capable of working at scale can provide a more accurate view of molecular dynamics within the cell. This paper describes a parallel discrete event simulator, Neuron Time Warp-Multi Thread (NTW-MT), developed for the simulation of reaction-diffusion models of neurons. To the best of our knowledge, this is the first parallel discrete event simulator oriented towards stochastic simulation of chemical reactions in a neuron. The simulator was developed as part of the NEURON project. NTW-MT is optimistic and thread-based, and attempts to capitalize on the multi-core architectures used in high performance machines. It makes use of a multi-level queue for the pending event set and a single roll-back message in place of individual anti-messages to disperse contention and decrease the overhead of processing rollbacks. Global Virtual Time is computed asynchronously both within and among processes to eliminate the overhead of synchronizing threads. Memory usage is managed in order to avoid locking and unlocking when allocating and de-allocating memory and to maximize cache locality. We verified our simulator on a calcium buffer model. We examined its performance on a calcium wave model, comparing it to the performance of a process-based optimistic simulator and a threaded simulator which uses a single priority queue for each thread. Our multi-threaded simulator is shown to achieve superior performance to these simulators. Finally, we demonstrated the scalability of our simulator on a larger CICR model and a more detailed CICR model.
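The pending-event set named above is, at its simplest, a priority queue keyed on event timestamps. The sketch below shows a minimal sequential discrete-event loop with Python's `heapq`; NTW-MT distributes this structure across threads with a multi-level queue and optimistic rollback, none of which is reproduced here, and the event names and times are invented.

```python
import heapq

# Pending event set: (timestamp, description) tuples in a binary heap.
events = []
heapq.heappush(events, (0.5, "reaction at compartment 1"))
heapq.heappush(events, (0.2, "diffusion 1 -> 2"))
heapq.heappush(events, (0.9, "reaction at compartment 2"))

# Core discrete-event loop: always process the earliest pending event.
log = []
now = 0.0
while events:
    now, what = heapq.heappop(events)
    log.append((now, what))
```

In an optimistic parallel simulator, threads process events from such queues speculatively and roll back when an earlier event arrives late, which is why rollback-message overhead matters so much to performance.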
High performance real-time flight simulation at NASA Langley
NASA Technical Reports Server (NTRS)
Cleveland, Jeff I., II
1994-01-01
In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be deterministic and be completed in as short a time as possible. This includes simulation mathematical model computations and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, personnel at NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to a standard input/output system to provide high-bandwidth, low-latency data acquisition and distribution. The Computer Automated Measurement and Control technology (IEEE standard 595) was extended to meet the performance requirements for real-time simulation. This technology extension increased the effective bandwidth by a factor of ten and increased the performance of modules necessary for simulator communications. This technology is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications of this technology are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC have completed the development of the use of supercomputers for simulation mathematical model computations to support real-time flight simulation. This includes the development of a real-time operating system and the development of specialized software and hardware for the CAMAC simulator network. This work, coupled with the use of an open systems software architecture, has advanced the state of the art in real-time flight simulation. The data acquisition technology innovation and experience with recent developments in this technology are described.
Kinetic Theory and Simulation of Single-Channel Water Transport
NASA Astrophysics Data System (ADS)
Tajkhorshid, Emad; Zhu, Fangqiang; Schulten, Klaus
Water translocation between various compartments of a system is a fundamental process in biology of all living cells and in a wide variety of technological problems. The process is of interest in different fields of physiology, physical chemistry, and physics, and many scientists have tried to describe the process through physical models. Owing to advances in computer simulation of molecular processes at an atomic level, water transport has been studied in a variety of molecular systems ranging from biological water channels to artificial nanotubes. While simulations have successfully described various kinetic aspects of water transport, offering a simple, unified model to describe trans-channel translocation of water turned out to be a nontrivial task.
A Modified Isotropic-Kinematic Hardening Model to Predict the Defects in Tube Hydroforming Process
NASA Astrophysics Data System (ADS)
Jin, Kai; Guo, Qun; Tao, Jie; Guo, Xun-zhong
2017-11-01
Numerical simulations of the tube hydroforming process for hollow crankshafts were conducted using the finite element method. A modified model integrating an isotropic-kinematic hardening model with a ductile fracture criterion was used to more accurately optimize process parameters such as internal pressure, feed distance and friction coefficient. Hydroforming experiments were then performed based on the simulation results. The comparison between experimental and simulation results indicated that the prediction of tube deformation, cracking and wrinkling was quite accurate for the tube hydroforming process. Finally, hollow crankshafts with high thickness uniformity were obtained, and the thickness distributions from the numerical and experimental results agreed well.
Kinetic Monte Carlo (kMC) simulation of carbon co-implant on pre-amorphization process.
Park, Soonyeol; Cho, Bumgoo; Yang, Seungsu; Won, Taeyoung
2010-05-01
We report our kinetic Monte Carlo (kMC) study of the effect of carbon co-implantation on the pre-amorphization implant (PAI) process. We employed the Binary Collision Approximation (BCA) to obtain the initial as-implanted dopant profile and the kMC method to simulate diffusion during the annealing process. The simulation results imply that carbon co-implantation suppresses boron diffusion through recombination with interstitials. By computing the carbon-interstitial reactions, we also compared boron diffusion with carbon diffusion, and found that boron diffusion depends on the carbon co-implant energy through enhanced trapping of interstitials.
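The rejection-free event-selection loop at the heart of a kMC diffusion study can be sketched as follows. This is a generic Gillespie/BKL step, not the paper's calibrated model; the two event channels at the end (an interstitial hop vs. capture at a carbon trap) and their rates are hypothetical stand-ins.

```python
import math
import random

def kmc_step(rates, rng=random.random):
    """One rejection-free kinetic Monte Carlo step: choose an event with
    probability proportional to its rate, then advance the clock by an
    exponentially distributed waiting time with mean 1/sum(rates)."""
    total = sum(rates)
    r = rng() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            break
    dt = -math.log(rng()) / total
    return i, dt

# Hypothetical channels: 0 = interstitial hop, 1 = capture at a carbon trap.
# With rates 2:1, roughly two thirds of events should be hops.
random.seed(0)
hops = sum(1 for _ in range(10000) if kmc_step([2.0, 1.0])[0] == 0)
```

The same loop scales to a full defect-reaction catalogue by extending the rate list; suppression of boron diffusion then shows up as the trap channel consuming interstitials that would otherwise mediate hops.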
ERIC Educational Resources Information Center
Saraswat, Satya Prakash; Anderson, Dennis M.; Chircu, Alina M.
2014-01-01
This paper describes the development and evaluation of a graduate level Business Process Management (BPM) course with process modeling and simulation as its integral component, being offered at an accredited business university in the Northeastern U.S. Our approach is similar to that found in other Information Systems (IS) education papers, and…
Fabricating optical phantoms to simulate skin tissue properties and microvasculatures
NASA Astrophysics Data System (ADS)
Sheng, Shuwei; Wu, Qiang; Han, Yilin; Dong, Erbao; Xu, Ronald
2015-03-01
This paper introduces novel methods to fabricate optical phantoms that simulate the morphologic, optical, and microvascular characteristics of skin tissue. The multi-layer skin-simulating phantom was fabricated by a light-cured 3D printer that mixed and printed the colorless light-curable ink with the absorption and the scattering ingredients for the designated optical properties. The simulated microvascular network was fabricated by a soft lithography process to embed microchannels in polydimethylsiloxane (PDMS) phantoms. The phantoms also simulated vascular anomalies and hypoxia commonly observed in cancer. A dual-modal multispectral and laser speckle imaging system was used for oxygen and perfusion imaging of the tissue-simulating phantoms. The light-cured 3D printing technique and the soft lithography process may enable freeform fabrication of skin-simulating phantoms that embed microvessels for imaging and drug delivery applications.
NASA Astrophysics Data System (ADS)
Robinson, Wayne D.; Patt, Frederick S.; Franz, Bryan A.; Turpie, Kevin R.; McClain, Charles R.
2009-08-01
One of the roles of the VIIRS Ocean Science Team (VOST) is to assess the performance of the instrument and scientific processing software that generates ocean color parameters such as normalized water-leaving radiances and chlorophyll. A VIIRS data simulator is being developed to aid in this work. The simulator will create a sufficient set of simulated Sensor Data Records (SDR) so that the ocean component of the VIIRS processing system can be tested. It will also have the ability to study the impact of instrument artifacts on the derived parameter quality. The simulator will use existing resources available to generate the geolocation information and to transform calibrated radiances to geophysical parameters and vice versa. In addition, the simulator will be able to introduce land features, cloud fields, and expected VIIRS instrument artifacts. The design of the simulator and its progress will be presented.
Manufacturing Process Simulation of Large-Scale Cryotanks
NASA Technical Reports Server (NTRS)
Babai, Majid; Phillips, Steven; Griffin, Brian; Munafo, Paul M. (Technical Monitor)
2002-01-01
NASA's Space Launch Initiative (SLI) is an effort to research and develop the technologies needed to build a second-generation reusable launch vehicle. It is required that this new launch vehicle be 100 times safer and 10 times cheaper to operate than current launch vehicles. Part of the SLI includes the development of reusable composite and metallic cryotanks. The size of these reusable tanks is far greater than anything ever developed and exceeds the design limits of current manufacturing tools. Several design and manufacturing approaches have been formulated, but many factors must be weighed during the selection process. Among these factors are tooling reachability, cycle times, feasibility, and facility impacts. The manufacturing process simulation capabilities available at NASA's Marshall Space Flight Center have played a key role in down-selecting between the various manufacturing approaches. By creating 3-D manufacturing process simulations, the varying approaches can be analyzed in a virtual world before any hardware or infrastructure is built. This analysis can detect and eliminate costly flaws in the various manufacturing approaches. The simulations check for collisions between devices, verify that design limits on joints are not exceeded, and provide cycle times, which aid in the development of an optimized process flow. In addition, new ideas and concerns are often raised after seeing the visual representation of a manufacturing process flow. The output of the manufacturing process simulations allows for cost and safety comparisons to be performed between the various manufacturing approaches. This output helps determine which manufacturing process options reach the safety and cost goals of the SLI.
Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.
Höhna, Sebastian
2013-06-01
Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but play an influential role in the assessment of model fit. In the present article, I consider as the model a global time-dependent birth-death process where each species has the same rates but rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to simulate efficiently reconstructed phylogenetic trees when conditioning on the number of species, the time of the process or both. I show the usability of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied on a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
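For the constant-rate special case, the i.i.d. property of the speciation times makes simulation a matter of inverting a single CDF. The sketch below uses the standard speciation-time distribution of a reconstructed constant-rate birth-death tree conditioned on its age; this is an illustration of the principle only, and the time-dependent formulas implemented in the TESS package generalize it.

```python
import math
import random

def sample_speciation_times(n, t, lam, mu, rng=random.random):
    """Draw n i.i.d. speciation times for a reconstructed constant-rate
    birth-death tree of age t (speciation rate lam > extinction rate mu),
    by numerically inverting the per-speciation-time CDF via bisection."""
    r = lam - mu
    def cdf(s):
        # standard conditioned-reconstructed-process CDF, normalized so cdf(t) = 1
        num = (1 - math.exp(-r * s)) / (lam - mu * math.exp(-r * s))
        den = (1 - math.exp(-r * t)) / (lam - mu * math.exp(-r * t))
        return num / den
    times = []
    for _ in range(n):
        u, lo, hi = rng(), 0.0, t
        for _ in range(60):              # bisection on the monotone CDF
            mid = 0.5 * (lo + hi)
            if cdf(mid) < u:
                lo = mid
            else:
                hi = mid
        times.append(0.5 * (lo + hi))
    return sorted(times)
```

Because each draw is independent, conditioning on the number of species only fixes how many times are drawn, which is the efficiency gain the abstract describes over forward simulation.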
NASA Astrophysics Data System (ADS)
Cruz Inclán, Carlos M.; González Lazo, Eduardo; Rodríguez Rodríguez, Arturo; Guzmán Martínez, Fernando; Abreu Alfonso, Yamiel; Piñera Hernández, Ibrahin; Leyva Fabelo, Antonio
2017-09-01
The present work deals with the numerical simulation of gamma- and electron-radiation damage processes under high-brightness, high-particle-fluency irradiation, with regard to two new radiation-induced atom displacement processes. These combine Monte Carlo simulation of atom displacement events arising from gamma and electron interactions and transport in a solid matrix with atom displacement threshold energies calculated by molecular dynamics methodologies. The two new radiation damage processes considered here under high-brightness, high-fluency irradiation conditions are: 1) radiation-induced atom displacement due to a single primary knock-on atom excitation in a defective target crystal matrix, whose defect concentrations (vacancies, interstitials and Frenkel pairs) increase as severe, progressive radiation damage accumulates; and 2) atom displacements related to multiple primary knock-on atom excitations, of the same or different atomic species, in a perfect target crystal matrix, caused by subsequent electron elastic atomic scattering in the same atomic neighborhood within a crystal-lattice relaxation time. A review of numerical simulation attempts on these two radiation damage processes is presented, starting from previously developed algorithms and codes for Monte Carlo simulation of atom displacements induced by electron and gamma in
Computer Simulation in Predicting Biochemical Processes and Energy Balance at WWTPs
NASA Astrophysics Data System (ADS)
Drewnowski, Jakub; Zaborowska, Ewa; Hernandez De Vega, Carmen
2018-02-01
Nowadays, the use of mathematical models and computer simulation allows analysis of many different technological solutions, as well as testing of various scenarios, in a short time and at low cost, in order to simulate the behaviour of the real system under typical conditions and help find the best solution in the design or operation process. The aim of the study was to evaluate different concepts of biochemical process and energy balance modelling using the simulation platform GPS-x and the comprehensive model Mantis2. The paper presents an example of the calibration and validation process for the biological reactor, as well as scenarios showing the influence of operational parameters on the WWTP energy balance. The results of batch tests and a full-scale campaign obtained in earlier work were used to predict biochemical and operational parameters in a newly developed plant model. The model was extended with sludge treatment devices, including an anaerobic digester. Primary sludge removal efficiency was found to be a significant factor determining biogas production and hence renewable energy production in cogeneration. Water and wastewater utilities, which run and control WWTPs, are interested in optimizing the process in order to protect the environment, save their budget and decrease pollutant emissions to water and air. In this context, computer simulation can be the easiest and a very useful tool to improve efficiency without interfering with the actual process performance.
Holistic Nursing Simulation: A Concept Analysis.
Cohen, Bonni S; Boni, Rebecca
2018-03-01
Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns have been raised regarding simulation's integration of the nursing process and its recognition of the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise to administer that care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by identifying its antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation, developing a language and model to provide direction for educators in the design and development of holistic nursing simulation.
Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla
2016-11-01
Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost of large-scale simulation. To improve the computational efficiency of large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, processing each postal-code subregion in turn. The second simulation processed the entire population simultaneously. The parallelizable SRA achieved computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized to perform a country-wide simulation. Thus, this parallel algorithm makes it possible to use ABMs for large-scale simulation with limited computational resources.
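The core idea, updating each subregion's agents independently so that the work parallelizes, can be sketched as below. This is a schematic of the partitioning step only; the actual SRA slides overlapping regions to handle cross-region interactions, which this toy version ignores, and the function names are illustrative, not the authors' API.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def simulate_step(agents, region_of, update_region):
    """One parallel time step of a region-partitioned ABM: group agents by
    subregion (e.g. postal code), update every subregion independently in a
    thread pool, and merge the results back into one population."""
    regions = defaultdict(list)
    for agent in agents:
        regions[region_of(agent)].append(agent)
    with ThreadPoolExecutor() as pool:
        updated_groups = pool.map(update_region, regions.values())
    return [agent for group in updated_groups for agent in group]
```

Because `update_region` only ever sees agents from one subregion, the per-region work is embarrassingly parallel; the accuracy question the paper addresses is whether ignoring (or sliding over) cross-region contacts changes the epidemic trajectory.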
Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah
Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.
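One of the simplest statistics such instrumentation yields is event efficiency: the fraction of processed events that are never rolled back. The metric itself is standard in the Time Warp literature; the counter names below are illustrative, not ROSS's actual field names.

```python
def rollback_efficiency(committed, rolled_back):
    """Event efficiency of an optimistic simulation run: committed events as
    a fraction of all processed events. Values near 1.0 mean optimism is
    paying off; low values suggest tuning is needed to curb rollbacks."""
    processed = committed + rolled_back
    return committed / processed if processed else 1.0
```

Tracking this per processing element, as the visualization described above does at several granularities, localizes which parts of a model are generating the rollback traffic.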
Baseline process description for simulating plutonium oxide production for PreCalc Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pike, J. A.
Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.
Do Simulations Enhance Student Learning? An Empirical Evaluation of an IR Simulation
ERIC Educational Resources Information Center
Shellman, Stephen M.; Turan, Kursad
2006-01-01
There is a nascent literature on the question of whether active learning methods, and in particular simulation methods, enhance student learning. In this article, the authors evaluate the utility of an international relations simulation in enhancing learning objectives. Student evaluations provide evidence that the simulation process enhances…
NASA Astrophysics Data System (ADS)
Danáčová, Michaela; Valent, Peter; Výleta, Roman
2017-12-01
Nowadays, rainfall simulators are used by many researchers in field and laboratory experiments. The main objective of most of these experiments is to better understand the underlying runoff generation processes, and to use the results in the calibration and validation of hydrological models. Many research groups have assembled their own rainfall simulators, which comply with their understanding of rainfall processes and the requirements of their experiments. The existing rainfall simulators differ mainly in the size of the irrigated area and the way they generate rain drops; they can be characterized by the accuracy with which they produce a rainfall of a given intensity, the size of the irrigated area, and the rain-drop generating mechanism. Rainfall simulation experiments can provide valuable information about the genesis of surface runoff, the infiltration of water into soil, and rainfall erodibility. Apart from the impact of the physical properties of soil, its moisture and compaction on the generation of surface runoff and the amount of eroded particles, some studies also investigate the impact of the vegetation cover of the area of interest. In this study, the rainfall simulator was used to investigate the impact of the slope gradient of the irrigated area on the amount of generated runoff and sediment yield. In order to eliminate the impact of external factors and to improve the reproducibility of the initial conditions, the experiments were conducted under laboratory conditions. The laboratory experiments were carried out using a commercial rainfall simulator connected to an external peristaltic pump. The pump maintained a constant and adjustable inflow of water, which made it possible to exceed the maximum simulated-precipitation volume of 2.3 l imposed by the construction of the rainfall simulator, while maintaining constant characteristics of the simulated precipitation.
In this study a 12-minute rainfall with a constant intensity of 5 mm/min was used to irrigate a disturbed soil sample. The experiment was undertaken for several different slopes, with no vegetation cover. The results of the rainfall simulation experiment confirmed the expected strong relationship between the slope gradient and the amount of surface runoff generated. The experiments with higher slope gradients were characterised by larger volumes of generated surface runoff and by shorter times to its onset. Experiments with rainfall simulators in both laboratory and field conditions play an important role in better understanding runoff generation processes. The results of such small-scale experiments can be used to estimate some of the parameters of complex hydrological models, which are used to model rainfall-runoff and erosion processes at catchment scale.
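The mass balance behind such experiments is simple: 1 mm of rain falling on 1 m² is 1 litre of water, so a plot's runoff coefficient follows directly from the measured runoff volume. The sketch below uses the study's 5 mm/min, 12-minute rainfall; the plot area (0.09 m²) and runoff volume are hypothetical numbers for illustration, not measurements from the paper.

```python
def runoff_coefficient(runoff_l, intensity_mm_min, duration_min, area_m2):
    """Runoff coefficient C = runoff volume / rainfall volume.
    Since 1 mm of rain over 1 m^2 equals 1 litre, rainfall volume in litres
    is intensity (mm/min) * duration (min) * area (m^2)."""
    rainfall_l = intensity_mm_min * duration_min * area_m2
    return runoff_l / rainfall_l

# Hypothetical plot: 0.09 m^2 irrigated area, 2.7 l of collected runoff
# under the paper's 5 mm/min rainfall applied for 12 minutes.
c = runoff_coefficient(2.7, 5.0, 12.0, 0.09)
```

Comparing C across slope gradients quantifies the slope-runoff relationship the experiment demonstrates.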
Nonlinear intrinsic variables and state reconstruction in multiscale simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dsilva, Carmeline J., E-mail: cdsilva@princeton.edu; Talmon, Ronen, E-mail: ronen.talmon@yale.edu; Coifman, Ronald R., E-mail: coifman@math.yale.edu
2013-11-14
Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.
Nonlinear intrinsic variables and state reconstruction in multiscale simulations
NASA Astrophysics Data System (ADS)
Dsilva, Carmeline J.; Talmon, Ronen; Rabin, Neta; Coifman, Ronald R.; Kevrekidis, Ioannis G.
2013-11-01
Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.
Bochmann, Esther S; Steffens, Kristina E; Gryczke, Andreas; Wagner, Karl G
2018-03-01
Simulation of hot-melt extrusion (HME) processes is a valuable tool for increased process understanding and ease of scale-up. However, the experimental determination of all required input parameters is tedious, in particular the melt rheology of the amorphous solid dispersion (ASD) in question. Hence, a procedure to simplify the application of HME simulation for forming ASDs is presented. The commercial 1D simulation software Ludovic® was used to conduct (i) simulations using a full experimental data set of all input variables, including melt rheology, and (ii) simulations using model-based melt viscosity data derived from the ASD's glass transition and the physical properties of the polymeric matrix only. Both types of HME computation were further compared to experimental HME results. Variations in physical properties (e.g. heat capacity, density) and several process characteristics of HME (residence time distribution, energy consumption) among the simulations and experiments were evaluated. The model-based melt viscosity was calculated from the glass transition temperature (Tg) of the investigated blend and the melt viscosity of the polymeric matrix by means of a Tg-viscosity correlation. The results of measured and model-based melt viscosity were similar, with only a few exceptions, leading to similar HME simulation outcomes. In the end, the experimental effort prior to HME simulation could be minimized, and the procedure provides a good starting point for the rational development of ASDs by means of HME. As model excipients, vinylpyrrolidone-vinyl acetate copolymer (COP) in combination with various APIs (carbamazepine, dipyridamole, indomethacin, and ibuprofen) or polyethylene glycol (PEG 1500) as plasticizer were used to form the ASDs. Copyright © 2017 Elsevier B.V. All rights reserved.
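A Tg-based melt-viscosity model of the kind described can be sketched with the Williams-Landel-Ferry (WLF) equation and its classic "universal" constants. This is a generic illustration of the idea, assuming WLF with Tg as the reference temperature; the paper's actual Tg-viscosity correlation may take a different form.

```python
def wlf_viscosity(T, Tg, eta_at_Tg, c1=17.44, c2=51.6):
    """Estimate melt viscosity at temperature T (same units as Tg, in K or
    degC offsets) from the glass transition temperature via the WLF shift:
        log10(eta(T) / eta(Tg)) = -c1 * (T - Tg) / (c2 + T - Tg)
    c1, c2 are the classic 'universal' WLF constants referenced at Tg."""
    shift = -c1 * (T - Tg) / (c2 + (T - Tg))
    return eta_at_Tg * 10.0 ** shift
```

With only Tg and one reference viscosity, the whole processing-temperature window can be filled in, which is exactly the experimental effort the model-based route avoids.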
Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason P.; Modenese, Luca; Reinbolt, Jeffrey A.; Thelen, Darryl G.; Umberger, Brian R.
2016-01-01
Objective The overall goal of this document is to demonstrate that dissemination of models and analyses for assessing the reproducibility of simulation results can be incorporated in the scientific review process in biomechanics. Methods As part of a special issue on model sharing and reproducibility in IEEE Transactions on Biomedical Engineering, two manuscripts on computational biomechanics were submitted: A. Rajagopal et al., IEEE Trans. Biomed. Eng., 2016 and A. Schmitz and D. Piovesan, IEEE Trans. Biomed. Eng., 2016. Models used in these studies were shared with the scientific reviewers and the public. In addition to the standard review of the manuscripts, the reviewers downloaded the models and performed simulations that reproduced results reported in the studies. Results There was general agreement between simulation results of the authors and those of the reviewers. Discrepancies were resolved during the necessary revisions. The manuscripts and instructions for download and simulation were updated in response to the reviewers’ feedback; changes that may otherwise have been missed if explicit model sharing and simulation reproducibility analysis were not conducted in the review process. Increased burden on the authors and the reviewers, to facilitate model sharing and to repeat simulations, were noted. Conclusion When the authors of computational biomechanics studies provide access to models and data, the scientific reviewers can download and thoroughly explore the model, perform simulations, and evaluate simulation reproducibility beyond the traditional manuscript-only review process. Significance Model sharing and reproducibility analysis in scholarly publishing will result in a more rigorous review process, which will enhance the quality of modeling and simulation studies and inform future users of computational models. PMID:28072567
Simulation of Magnetic Field Assisted Finishing (MFAF) Process Utilizing Smart MR Polishing Tool
NASA Astrophysics Data System (ADS)
Barman, Anwesa; Das, Manas
2017-02-01
Magnetic field assisted finishing is an advanced finishing process capable of producing nanometer-level surface finish. In this process a magnetic field is applied to control the finishing forces exerted by a magnetorheological (MR) polishing medium. In the current study, a permanent magnet provides the required magnetic field in the finishing zone. The working gap between the workpiece and the magnet is filled with MR fluid, which acts as the polishing brush that removes surface undulations from the top surface of the workpiece. In this paper, the distribution of magnetic flux density on the workpiece surface and the behavior of the MR polishing medium during finishing are analyzed using commercial finite element packages (Ansys Maxwell® and Comsol®). The role of the magnetic force in the indentation of abrasive particles into the workpiece surface is studied. A two-dimensional simulation of the steady, laminar, incompressible MR fluid flow during finishing is carried out. Material removal and surface roughness models of the finishing process are also presented, and the indentation force exerted by a single active abrasive particle on the workpiece surface is modelled in the simulation. The velocity profile of the MR fluid with and without an applied magnetic field is plotted; the fluid shows non-Newtonian behavior even without the application of a magnetic field. The total material displacement due to one abrasive particle is then plotted. The simulated roughness profile is in good agreement with the experimental results. The study helps in understanding the fluid behavior and the mechanism of finishing, and the modelling and simulation of the process will help in achieving better finishing performance.
Simpson, Robin; Devenyi, Gabriel A; Jezzard, Peter; Hennessy, T Jay; Near, Jamie
2017-01-01
To introduce a new toolkit for simulation and processing of magnetic resonance spectroscopy (MRS) data, and to demonstrate some of its novel features. The FID appliance (FID-A) is an open-source, MATLAB-based software toolkit for simulation and processing of MRS data. The software is designed specifically for processing data with multiple dimensions (eg, multiple radiofrequency channels, averages, spectral editing dimensions). It is equipped with functions for importing data in the formats of most major MRI vendors (eg, Siemens, Philips, GE, Agilent) and for exporting data into the formats of several common processing software packages (eg, LCModel, jMRUI, Tarquin). This paper introduces the FID-A software toolkit and uses examples to demonstrate its novel features, namely 1) the use of a spectral registration algorithm to carry out useful processing routines automatically, 2) automatic detection and removal of motion-corrupted scans, and 3) the ability to perform several major aspects of the MRS computational workflow from a single piece of software. This latter feature is illustrated through both high-level processing of in vivo GABA-edited MEGA-PRESS MRS data, as well as detailed quantum mechanical simulations to generate an accurate LCModel basis set for analysis of the same data. All of the described processing steps resulted in a marked improvement in spectral quality compared with unprocessed data. Fitting of MEGA-PRESS data using a customized basis set resulted in improved fitting accuracy compared with a generic MEGA-PRESS basis set. The FID-A software toolkit enables high-level processing of MRS data and accurate simulation of in vivo MRS experiments. Magn Reson Med 77:23-33, 2017. © 2015 Wiley Periodicals, Inc.
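The spectral registration idea, aligning each transient to a reference by finding the frequency and phase shift that minimizes the time-domain least-squares error, can be illustrated with a coarse grid search. FID-A fits these parameters by nonlinear least squares in MATLAB; the Python sketch below is a toy version of the same principle, and all names are invented for the example.

```python
import cmath
import math

def spectral_register(fid, ref, dt, freqs, phases):
    """Toy spectral registration: over a grid of candidate frequency (Hz)
    and phase (rad) corrections, apply each correction to the time-domain
    FID and return the pair minimizing the least-squares distance to the
    reference transient."""
    best = None
    for f in freqs:
        for ph in phases:
            err = 0.0
            for n, x in enumerate(fid):
                y = x * cmath.exp(1j * (2 * math.pi * f * n * dt + ph))
                err += abs(y - ref[n]) ** 2
            if best is None or err < best[0]:
                best = (err, f, ph)
    return best[1], best[2]

# Demo: a decaying synthetic FID, detuned by 5 Hz and 0.5 rad from the reference.
N, dt = 64, 0.001
ref = [cmath.exp(-0.01 * n) for n in range(N)]
fid = [ref[n] * cmath.exp(-1j * (2 * math.pi * 5.0 * n * dt + 0.5)) for n in range(N)]
f_hat, ph_hat = spectral_register(fid, ref, dt,
                                  freqs=[0.0, 2.5, 5.0, 7.5],
                                  phases=[0.0, 0.25, 0.5, 0.75])
```

Aligning every average this way before summation is what produces the improvement in spectral quality the abstract reports.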
NASA Astrophysics Data System (ADS)
Zhang, Xiaoxi; Cheng, Yongguang; Xia, Linsheng; Yang, Jiandong
2016-11-01
This paper reports the preliminary progress in the CFD simulation of the reverse water-hammer induced by the collapse of a draft-tube cavity in a model pump-turbine during the runaway process. Firstly, the Fluent customized 1D-3D coupling model for hydraulic transients and the Schnerr & Sauer cavitation model for cavity development are introduced. Then, the methods are validated by simulating the benchmark reverse water-hammer in a long pipe caused by an instant valve closure. The simulated head history at the valve agrees well with the measured data in the literature. After that, the more complicated reverse water-hammer in the draft-tube of a runaway model pump-turbine, which is installed in a model pumped-storage power plant, is simulated. The dynamic process of a vapor cavity, from generation, expansion and shrinkage to collapse, is shown. After the cavity collapses, a sudden increase of pressure can be evidently observed. The process is featured by a locally expanding and collapsing vapor cavity around the runner cone, which differs from the conventional picture of violent water-column separation. This work demonstrates the feasibility of simulating the reverse water-hammer phenomenon in turbines by 3D CFD.
Processing biobased polymers using plasticizers: Numerical simulations versus experiments
NASA Astrophysics Data System (ADS)
Desplentere, Frederik; Cardon, Ludwig; Six, Wim; Erkoç, Mustafa
2016-03-01
In polymer processing, the use of biobased products shows many possibilities. For biobased materials, biodegradability is in most cases the most important issue; next to this, biobased materials aimed at durable applications are gaining interest. Within this research, the influence of plasticizers on the processing of a biobased material is investigated. This work is done for an extrusion grade of PLA, NatureWorks PLA 2003D. Extrusion through a slit die equipped with pressure sensors is used to compare the experimental pressure values to numerical simulation results. Additional experimental data (temperature and pressure along the extrusion screw and die) are recorded on a Dr. Collin lab extruder producing a 25 mm diameter tube. All these experimental data are used to verify the proper functioning of the numerical simulation tool Virtual Extrusion Laboratory 6.7 for the simulation of both the industrially available extrusion-grade PLA and the compound to which 15% of plasticizer is added. Adding the plasticizer resulted in a 40% lower pressure drop over the extrusion die. The combination of different experiments allowed the numerical simulation results to be fitted closely to the experimental values. Based on this experience, it is shown that numerical simulations can also be used for modified biobased materials if appropriate material and process data are taken into account.
NASA Astrophysics Data System (ADS)
Chicea, Anca-Lucia
2015-09-01
The paper presents the process of building geometric and kinematic models of technological equipment used in the manufacture of devices. First, the process of building the model for a six-axis industrial robot is presented. In the second part of the paper, the process of building the model for a five-axis CNC milling machining center is shown. Both models can be used for accurate simulation of cutting processes on complex parts, such as prosthetic devices.
Exploring empirical rank-frequency distributions longitudinally through a simple stochastic process.
Finley, Benjamin J; Kilkki, Kalevi
2014-01-01
The frequent appearance of empirical rank-frequency laws, such as Zipf's law, in a wide range of domains reinforces the importance of understanding and modeling these laws and rank-frequency distributions in general. In this spirit, we utilize a simple stochastic cascade process to simulate several empirical rank-frequency distributions longitudinally. We focus especially on limiting the process's complexity to increase accessibility for non-experts in mathematics. The process provides a good fit for many empirical distributions because the stochastic multiplicative nature of the process leads to an often observed concave rank-frequency distribution (on a log-log scale) and the finiteness of the cascade replicates real-world finite-size effects. Furthermore, we show that repeated trials of the process can roughly simulate the longitudinal variation of empirical ranks. However, we find that the empirical variation is often less than the average simulated process variation, likely due to longitudinal dependencies in the empirical datasets. Finally, we discuss the process limitations and practical applications.
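The cascade idea can be illustrated in a few lines: starting from a fixed total mass, repeatedly split every mass into two unequal parts with a random fraction; after a finite number of levels, the sorted leaf masses form a skewed, heavy-tailed rank-frequency profile. This is a hedged sketch; the split distribution and depth are arbitrary choices, not the parameters used in the paper.

```python
import random

def cascade(total=10000.0, depth=12, seed=1):
    # finite multiplicative cascade: each mass splits into two parts with a
    # random uneven fraction; the leaves play the role of item frequencies
    random.seed(seed)
    masses = [total]
    for _ in range(depth):
        nxt = []
        for m in masses:
            f = random.uniform(0.5, 0.95)   # uneven split keeps the result skewed
            nxt.append(m * f)
            nxt.append(m * (1.0 - f))
        masses = nxt
    return sorted(masses, reverse=True)     # rank-frequency order

freqs = cascade()
# on a log-log plot of rank vs. freqs the curve is concave, and the finite
# depth (2**12 leaves) reproduces real-world finite-size effects
```

Re-running with different seeds gives the "repeated trials" view of longitudinal rank variation described in the abstract.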
Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes
NASA Technical Reports Server (NTRS)
Williams, Colin P.
1999-01-01
Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types of stochastic processes, such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(√N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.
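The quantum routine itself cannot be reproduced here, but the classical baseline it is compared against is easy to state: estimating moments of an N-step stochastic process by direct simulation costs O(N) per sample path. A minimal sketch of that classical baseline, using a symmetric random walk as a stand-in process:

```python
import random
import statistics

def walk_moments(n_steps=100, n_trials=2000, seed=7):
    # classical baseline: estimate descriptive statistics (moments) of an
    # N-step stochastic process by simulating whole paths; each sample path
    # costs O(N), which is the cost the quantum algorithm cuts to O(sqrt(N))
    random.seed(seed)
    finals = []
    for _ in range(n_trials):
        x = 0
        for _ in range(n_steps):
            x += random.choice((-1, 1))
        finals.append(x)
    return statistics.mean(finals), statistics.variance(finals)

mean, var = walk_moments()
# for a symmetric walk the mean is near 0 and the variance near n_steps
```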
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities in the various visualization and interaction modes.
Huang, J; Loeffler, M; Muehle, U; Moeller, W; Mulders, J J L; Kwakman, L F Tz; Van Dorp, W F; Zschech, E
2018-01-01
A Ga focused ion beam (FIB) is often used in transmission electron microscopy (TEM) sample preparation. In the case of a crystalline Si sample, an amorphous near-surface layer is formed by the FIB process. In order to optimize the FIB recipe by minimizing the amorphization, it is important to predict the amorphous layer thickness from simulation. Molecular Dynamics (MD) simulation has been used to describe the amorphization; however, it is limited by computational power for a realistic FIB process simulation. On the other hand, Binary Collision Approximation (BCA) simulation can treat the ion-solid interaction process at a realistic scale and has been used to do so. In this study, a Point Defect Density approach is introduced into a dynamic BCA simulation that accounts for dynamic ion-solid interactions. We used this method to predict the c-Si amorphization caused by FIB milling on Si. To validate the method, dedicated TEM studies were performed, showing that the amorphous layer thickness predicted by the numerical simulation is consistent with the experimental data. In summary, the thickness of the near-surface Si amorphization layer caused by FIB milling can be well predicted using the Point Defect Density approach within the dynamic BCA model. Copyright © 2017 Elsevier B.V. All rights reserved.
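The point-defect-density idea can be sketched independently of any BCA code: each incident ion deposits a depth profile of point defects, defect densities accumulate with dose, and a depth bin is counted as amorphous once its density crosses a critical value. All numbers below (profile shape, defect yield, critical density) are illustrative assumptions, not fitted Si/Ga values.

```python
import math

def amorphous_thickness(dose_cm2, n_bins=60, bin_nm=0.5, mean_nm=8.0,
                        sigma_nm=4.0, defects_per_ion=100.0, n_crit=5e22):
    # each ion deposits `defects_per_ion` defects along a Gaussian depth
    # profile; a depth bin counts as amorphous once its defect density
    # (cm^-3) exceeds the critical density n_crit
    weights = [math.exp(-((i * bin_nm - mean_nm) / sigma_nm) ** 2 / 2)
               for i in range(n_bins)]
    norm = sum(weights)
    thickness = 0.0
    for w in weights:
        density = dose_cm2 * defects_per_ion * (w / norm) / (bin_nm * 1e-7)
        if density >= n_crit:
            thickness += bin_nm           # this bin has gone amorphous
    return thickness                      # nm

low = amorphous_thickness(1e13)    # low dose: no bin reaches n_crit
high = amorphous_thickness(1e15)   # high dose: a layer forms
```

The sketch reproduces the qualitative behavior that matters for recipe optimization: below a threshold dose no amorphous layer forms, and above it the layer saturates near the width of the defect profile rather than growing without bound.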
Process-Oriented Diagnostics of Tropical Cyclones in Global Climate Models
NASA Astrophysics Data System (ADS)
Moon, Y.; Kim, D.; Camargo, S. J.; Wing, A. A.; Sobel, A. H.; Bosilovich, M. G.; Murakami, H.; Reed, K. A.; Vecchi, G. A.; Wehner, M. F.; Zarzycki, C. M.; Zhao, M.
2017-12-01
Simulating tropical cyclone (TC) activity with global climate models (GCMs) remains a challenging problem. While some GCMs are able to simulate TC activity that is in good agreement with the observations, many other models exhibit strong biases. Decreasing the horizontal grid spacing of GCM simulations tends to improve the characteristics of simulated TCs, but this enhancement alone does not necessarily lead to greater skill in simulating TC activity. This study uses process-based diagnostics to identify model characteristics that could explain why some GCM simulations produce more realistic TC activity than others. The diagnostics examine how convection, moisture, clouds and related processes are coupled at individual grid points, which yields useful insight into how convective parameterizations interact with resolved model dynamics. These diagnostics share similarities with those originally developed to examine the Madden-Julian Oscillation in climate models. This study examines TCs in eight different GCM simulations performed at NOAA/GFDL, NCAR and NASA that differ in horizontal resolution and ocean coupling. Preliminary results suggest that stronger TCs are closely associated with greater rainfall, and thus greater diabatic heating, in the inner-core regions of the storms, which is consistent with previous theoretical studies. Other storm characteristics that can be used to infer why GCM simulations with comparable horizontal grid spacings produce different TC activity will also be examined.
Mathematical modeling and SAR simulation multifunction SAR technology efforts
NASA Technical Reports Server (NTRS)
Griffin, C. R.; Estes, J. M.
1981-01-01
Orbital SAR (synthetic aperture radar) simulation data were used in several simulation efforts directed toward advanced SAR development. Efforts toward simulating an operational radar, simulating antenna polarization effects, and simulating SAR images at several different wavelengths are discussed. Avenues for improvement of the orbital SAR simulation and its application to the development of advanced digital radar data processing schemes are indicated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackiewicz-Ludtka, G.; Sebright, J.
2007-12-15
The primary goal of this Cooperative Research and Development Agreement (CRADA) between UT-Battelle (Contractor) and Caterpillar Inc. (Participant) was to develop the plasma arc lamp (PAL) infrared (IR) thermal processing technology (1) to enhance surface coating performance by improving the interfacial bond strength between selected coatings and substrates, and (2) to extend this technology base for transitioning of the arc lamp processing to the industrial Participant. Completion of the following three key technical tasks was necessary in order to accomplish this goal. First, thermophysical property data sets were successfully determined for composite coatings applied to 1010 steel substrates, with a more limited data set successfully measured for free-standing coatings. These data are necessary for the computer modeling simulations and parametric studies (a) to simulate PAL IR processing, facilitating the development of the initial processing parameters, and (b) to help develop a better understanding of the basic PAL IR fusing process fundamentals, including predicting the influence of melt pool stirring and heat transfer characteristics introduced during plasma arc lamp infrared (IR) processing. Second, a methodology and a set of procedures were successfully developed, and the plasma arc lamp (PAL) power profiles were successfully mapped as a function of PAL power level for the ORNL PAL. The latter data are also necessary input for the computer model to accurately simulate PAL processing during process modeling simulations, and to facilitate a better understanding of the fusing process fundamentals. Third, several computer modeling codes were evaluated as to their capabilities and accuracy in capturing and simulating the convective mixing that may occur during PAL thermal processing. The results from these evaluation efforts are summarized in this report.
The intention of this project was to extend the technology base and provide for transitioning of the arc lamp processing to the industrial Participant.
Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude
2016-01-01
Background: Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education for assessing clinical reasoning skills has rarely been reported. More specifically, it is unclear whether clinical reasoning is better acquired if the instructor's input occurs entirely after the scenario or is integrated during it. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. Objective: The aim of this study is to evaluate the effectiveness of simulation with iterative discussions versus the classical approach of simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. Methods: This will be a prospective, exploratory, randomized study conducted at Sainte-Justine Hospital in Montreal, QC, between January and March 2016. All postgraduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio-video-recorded, complex high-fidelity simulation (SID or classical) covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use the SPSS and NVivo software. Results: This study is in its preliminary stages and the results are expected to be made available by April 2016.
Conclusions: This will be the first study to explore a new simulation approach designed to enhance clinical reasoning. By assessing reasoning processes more closely throughout a simulation session, we believe that simulation with iterative discussions will be an interesting and more effective approach for students. The findings of the study will benefit medical educators, education programs, and medical students. PMID:26888076
Pennaforte, Thomas; Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude
2016-02-17
Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education for assessing clinical reasoning skills has rarely been reported. More specifically, it is unclear whether clinical reasoning is better acquired if the instructor's input occurs entirely after the scenario or is integrated during it. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. The aim of this study is to evaluate the effectiveness of simulation with iterative discussions versus the classical approach of simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. This will be a prospective, exploratory, randomized study conducted at Sainte-Justine Hospital in Montreal, QC, between January and March 2016. All postgraduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio-video-recorded, complex high-fidelity simulation (SID or classical) covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use the SPSS and NVivo software. This study is in its preliminary stages and the results are expected to be made available by April 2016.
This will be the first study to explore a new simulation approach designed to enhance clinical reasoning. By assessing more closely reasoning processes throughout a simulation session, we believe that Simulation with Iterative Discussions will be an interesting and more effective approach for students. The findings of the study will benefit medical educators, education programs, and medical students.
NASA Astrophysics Data System (ADS)
Ribes Bertomeu, Josep
Wastewater treatment requires the execution of many conversion processes simultaneously and/or consecutively, making it a complex object of study. Furthermore, the complexity of treatment processes is increasing, not only because of the more stringent effluent standards required, but also because of the new trends towards sustainable development, which in this field are mainly focused on energy saving and nutrient recovery from wastewater in order to improve its life cycle. For this reason it becomes necessary to use simulation tools which are able to represent all these processes by means of a suitable mathematical model. They can help in determining and predicting the behaviour of the different treatment schemes. These simulators have become essential for the design, control and optimization of wastewater treatment plants (WWTPs). Settling processes play a significant role in meeting effluent standards and in the correct operation of the plant. However, many models that are currently employed for WWTP design and simulation either do not take settling processes into account or handle them in a very simple way, neglecting the biochemical processes that can occur during sedimentation. The CALAGUA research group has focused its efforts on a new philosophy of simulating treatment plants, based on the use of a single model to represent all physical, chemical and biological processes taking place in WWTPs. Within this research topic, the group has worked on the development of a general quality model that considers the biological conversion processes carried out by different microorganism groups, the acid-base chemical interactions affecting the pH value in the system, and gas-liquid transfer processes. However, a generalized use of such a quality model requires its combination with a flux model, principally for those processes where complete mixing cannot be assumed, such as settlers and thickeners in WWTPs.
The main objective of this work has been the development and validation of a general settling model that allows simulating the main settling operations taking place in a WWTP, considering primary and secondary settlers as well as thickeners. It consists of a one-dimensional model, based on the flux theory of Kynch and the double-exponential settling function of Takacs, that takes into account flocculation, hindered settling and compression processes. The model has been applied to the simulation of settlers and thickeners by splitting the system into several horizontal layers, all of them treated as completely mixed reactors interconnected by the mass fluxes obtained from the settling model. In order to simulate the conversion processes taking place during sedimentation, the general quality model BNRM1 has been added, and an iterative procedure has been proposed for solving the equations in each layer into which the settler has been divided. The validation of the settling flux model, together with the quality model, has been carried out by applying them to the simulation of a primary sludge fermentation-elutriation process. This process has been studied on a pilot plant located in the Carraixet WWTP in Alboraia (Valencia). In order to simulate the observed decrease in solids separation efficiency in the studied fermentation-elutriation process, the quality model has been modified with the addition of a new process called "disintegration of complex particulate material". This process influences the settleability of the sludge because the disintegrated solids are considered to become non-settleable solids. This modification implies the addition of two new kinetic parameters (the specific disintegration velocity for volatile particulate material and the specific disintegration velocity for non-volatile particulate material).
However, the settling parameter that represents the non-settleable fraction of total suspended solids is eliminated from the model and transformed into an experimental variable which is quite easy to analyze. The result of this modification is a more general model, applicable to the fermentation-elutriation process under any operating condition. Finally, the behaviour and capabilities of the developed model have been tested by simulating a complete WWTP in the DESASS simulation software, developed by the research group. This example includes the most important processes that can be used in a WWTP: biological nutrient removal, primary sludge fermentation and sludge digestion. The model allows considering both settling processes and biochemical processes as a whole (denitrification in secondary settlers, primary sludge fermentation and VFA elutriation, phosphorus release in thickeners due to PAO decay, etc.). The developed model represents an important advance in the study of new wastewater treatment processes because it allows dealing with global process optimization problems by means of full-plant simulation. It is very useful for studying the effects of a modification in the operating conditions of one element on the operation of the rest of the elements of the WWTP. (Abstract shortened by UMI.)
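The layered settler described above can be sketched compactly: the Takacs double-exponential function gives a hindered-settling velocity from the local solids concentration, and the flux between adjacent layers is limited by what the layer below can accept, following Kynch's flux theory. This is a minimal sketch with textbook-style parameter values, not the calibrated DESASS model, and it omits the flocculation, compression, and BNRM1 conversion terms.

```python
import math

def takacs_velocity(x, v0=474.0, v_max=250.0, rh=5.76e-4, rp=2.86e-3,
                    x_min=20.0):
    # double-exponential settling function (m/d); x is solids conc. (g/m^3)
    xs = max(0.0, x - x_min)
    v = v0 * (math.exp(-rh * xs) - math.exp(-rp * xs))
    return max(0.0, min(v_max, v))

def settle_step(layers, dz=0.4, dt=1e-4):
    # one explicit time step of a 1D layered settler: the settling flux
    # leaving a layer is limited by what the layer below can accept
    flux = []
    for i in range(len(layers) - 1):
        own = takacs_velocity(layers[i]) * layers[i]
        below = takacs_velocity(layers[i + 1]) * layers[i + 1]
        flux.append(min(own, below) if layers[i + 1] > layers[i] else own)
    new = list(layers)
    for i, j in enumerate(flux):
        new[i] -= j * dt / dz
        new[i + 1] += j * dt / dz
    return new

profile = [3000.0] * 10          # ten layers, uniform blanket (g/m^3)
for _ in range(200):
    profile = settle_step(profile)
# solids migrate downward: the bottom layer thickens while mass is conserved
```

In the full model each layer would additionally run the BNRM1 conversion processes, which is how denitrification in settlers or phosphorus release in thickeners emerges.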
NASA Technical Reports Server (NTRS)
Strahan, Susan E.; Douglass, Anne R.
2003-01-01
The Global Modeling Initiative has integrated two 35-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI(sub GCM)) showed a very good residual circulation in the tropics and northern hemisphere. The simulation with input from a data assimilation system (GMI(sub DAS)) performed better in the midlatitudes than at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI(sub GCM) has greater fidelity throughout the stratosphere than the GMI(sub DAS).
An effective online data monitoring and saving strategy for large-scale climate simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin
Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
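One concrete instance of such a storage-constrained strategy is to keep, at any moment, only the k most extreme values seen so far, so each incoming simulation output costs O(log k) and memory stays fixed. The sketch below illustrates the general idea, not the selection criterion actually used in the paper:

```python
import heapq
import random

def record_extremes(stream, budget):
    # online monitor under a fixed storage budget: keep only the `budget`
    # largest values seen so far; a min-heap makes each update O(log budget)
    kept = []
    for value in stream:
        if len(kept) < budget:
            heapq.heappush(kept, value)
        elif value > kept[0]:
            heapq.heapreplace(kept, value)   # evict the current smallest
    return sorted(kept, reverse=True)

random.seed(0)
# stand-in for a high-rate simulation output, e.g. surface temperatures
stream = [random.gauss(15.0, 5.0) for _ in range(100000)]
extremes = record_extremes(stream, budget=100)
# identical to sorting the full stream and keeping the top 100, but
# without ever holding more than 100 values in memory
```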
NASA Astrophysics Data System (ADS)
Kushima, A.; Eapen, J.; Li, Ju; Yip, S.; Zhu, T.
2011-08-01
Atomistic simulation methods are known for timescale limitations in resolving slow dynamical processes. Two well-known scenarios of slow dynamics are viscous relaxation in supercooled liquids and creep deformation in stressed solids. In both phenomena the challenge to theory and simulation is to sample the transition state pathways efficiently and follow the dynamical processes on long timescales. We present a perspective based on the biased molecular simulation methods such as metadynamics, autonomous basin climbing (ABC), strain-boost and adaptive boost simulations. Such algorithms can enable an atomic-level explanation of the temperature variation of the shear viscosity of glassy liquids, and the relaxation behavior in solids undergoing creep deformation. By discussing the dynamics of slow relaxation in two quite different areas of condensed matter science, we hope to draw attention to other complex problems where anthropological or geological-scale time behavior can be simulated at atomic resolution and understood in terms of micro-scale processes of molecular rearrangements and collective interactions. As examples of a class of phenomena that can be broadly classified as materials ageing, we point to stress corrosion cracking and cement setting as opportunities for atomistic modeling and simulations.
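The flavor of these biasing methods can be shown on a toy landscape: relax into a minimum, deposit a Gaussian penalty there, and relax again on the penalized surface, so that repeated cycles push the system over barriers it would cross only on very long unbiased timescales. A minimal one-dimensional sketch in the spirit of metadynamics/ABC follows; all parameters are arbitrary, and real implementations work in high-dimensional collective variables rather than a scalar coordinate.

```python
import math

def potential(x):
    # toy double-well landscape with minima at x = -1 and x = +1
    return (x ** 2 - 1.0) ** 2

def minimize(x, energy, lr=0.01, steps=1500, h=1e-5):
    # plain gradient descent with a numerical derivative
    for _ in range(steps):
        grad = (energy(x + h) - energy(x - h)) / (2.0 * h)
        x -= lr * grad
    return x

def basin_climb(x0=-1.2, n_penalties=30, w=0.5, sigma=0.4):
    # ABC/metadynamics-style loop: after relaxing into a minimum, deposit a
    # Gaussian penalty there so a later relaxation climbs out of the basin
    penalties = []

    def biased(x):
        return potential(x) + sum(
            w * math.exp(-((x - c) / sigma) ** 2 / 2.0) for c in penalties)

    x, visited = x0, []
    for _ in range(n_penalties):
        x = minimize(x + 0.05, biased)   # small kick off the penalty peak
        visited.append(x)
        penalties.append(x)
    return visited

minima = basin_climb()
# the walker starts in the x = -1 well and, once enough bias accumulates,
# crosses the barrier and visits the x = +1 well
```

In the methods cited above, the sequence of penalties doubles as a record of the visited basins, from which relaxation times or creep rates can be reconstructed.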
NASA Astrophysics Data System (ADS)
Michalik, Peter; Mital, Dusan; Zajac, Jozef; Brezikova, Katarina; Duplak, Jan; Hatala, Michal; Radchenko, Svetlana
2016-10-01
The article deals with the use of intelligent relays and PLC systems in practice, their architecture, and the principles of programming and simulation for the education process at all types of schools, from secondary schools to universities. The aim of the article is to propose simple example applications that demonstrate a programming methodology on real, simple practical examples and show the use of selected instructions. The practical part describes the process of creating schematics and function blocks, covering the methodology of creating programs and simulating output reactions to changing inputs for intelligent relays.
NASA Astrophysics Data System (ADS)
Liu, Huihui; He, Xiongwei; Guo, Peng
2017-04-01
Three factors (pouring temperature, injection speed and mold temperature) were selected, each at three levels, to form an L9(3^3) orthogonal experiment, and the semi-solid die-casting process of a magnesium matrix composite was then simulated with the Flow-3D software. The stress distribution, temperature field and defect distribution during the filling process were analyzed to find the optimized processing parameters with the help of the orthogonal experiment. The results showed that semi-solid die-casting offers the advantages of a well-proportioned stress and temperature field, with fewer defects, concentrated at the surface. The simulation results agreed with the experimental results.
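An L9(3^3) design and its range analysis are simple enough to reproduce directly: nine runs cover three factors at three levels so that every pair of factor levels appears exactly once, and the spread of per-level mean responses ranks the factors' influence. A sketch with made-up response values; the paper's actual measurements are not reproduced here.

```python
# standard L9 orthogonal array restricted to three 3-level factors
# (columns stand in for pouring temperature, injection speed, mold temperature)
L9 = [
    (0, 0, 0), (0, 1, 1), (0, 2, 2),
    (1, 0, 1), (1, 1, 2), (1, 2, 0),
    (2, 0, 2), (2, 1, 0), (2, 2, 1),
]

def range_analysis(results):
    # for each factor: mean response at each level, the level minimizing the
    # response, and the range R = max(mean) - min(mean); the factor with the
    # largest R has the strongest effect on the response
    best_levels, ranges = [], []
    for factor in range(3):
        means = []
        for level in range(3):
            vals = [r for row, r in zip(L9, results) if row[factor] == level]
            means.append(sum(vals) / len(vals))
        best_levels.append(means.index(min(means)))
        ranges.append(max(means) - min(means))
    return best_levels, ranges

# hypothetical responses (e.g. defect volume) for the nine simulated runs
runs = [5.2, 4.1, 6.3, 3.9, 4.4, 5.0, 4.8, 3.5, 4.6]
best_levels, R = range_analysis(runs)
```

With nine runs instead of the 27 of a full factorial, the analysis still attributes an effect size to each factor, which is what "finding the optimized processing parameters with the help of the orthogonal experiment" amounts to.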
NASA Technical Reports Server (NTRS)
Krosel, S. M.; Milner, E. J.
1982-01-01
The application of predictor-corrector integration algorithms developed for the digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented, and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
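A serial reference point for such algorithms is the classic two-step predictor-corrector pair: an Adams-Bashforth predictor followed by a trapezoidal (Adams-Moulton) corrector. The sketch below applies it to a first-order lag, a stand-in for one state of a linear engine model; it shows the method itself, not the report's parallel partitioning.

```python
def pc_step(f, t, x, f_prev, dt):
    # predict with 2nd-order Adams-Bashforth, correct with the trapezoidal
    # (Adams-Moulton) rule evaluated at the predicted point (PECE scheme)
    fn = f(t, x)
    pred = x + dt * (1.5 * fn - 0.5 * f_prev)
    corr = x + 0.5 * dt * (fn + f(t + dt, pred))
    return corr, fn

def integrate(f, x0, dt, n):
    t, x = 0.0, x0
    f_prev = f(t, x)          # start-up: reuse f at t=0 for the first step
    history = [x]
    for _ in range(n):
        x, f_prev = pc_step(f, t, x, f_prev, dt)
        t += dt
        history.append(x)
    return history

# first-order lag dx/dt = -x, integrated over one time constant
decay = integrate(lambda t, x: -x, x0=1.0, dt=0.01, n=100)
# decay[-1] is close to exp(-1) ~ 0.3679
```

The start-up line illustrates the "algorithm startup" issue the abstract mentions: a multistep method needs past derivative values that do not exist at t=0. In the parallel setting, the predictor and corrector derivative evaluations are what get distributed across processors, and step size then trades accuracy against real-time deadlines.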
NASA Astrophysics Data System (ADS)
Reich, Rebecca D.; Eddington, Donald
2002-05-01
Signal processing in a cochlear implant (CI) is primarily designed to convey speech and environmental sounds, and can cause distortion of musical timbre. Systematic investigation of musical instrument identification through a CI has not yet revealed how timbre is affected by the implant's processing. In this experiment, the bandpass filtering, rectification, and low-pass filtering of an implant are simulated in MATLAB. Synthesized signals representing 12 common instruments, each performing a major scale, are processed by simulations using up to 8 analysis channels. The unprocessed recordings, together with the 8 simulation conditions for 12 instruments, are presented in random order to each of the subjects. The subject's task is to identify the instrument represented by each item. The subjects also subjectively score each item based on similarity and pleasantness. We anticipate performance using the simulation will be worse than the unprocessed condition because of the limited information delivered by the envelopes of the analysis channels. These results will be analyzed as a confusion matrix and provide a basis for contrasting the information used by subjects listening to the unprocessed and processed materials. Understanding these differences should aid in the development of new processing strategies to better represent music for cochlear implant users.
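The channel processing chain being simulated (bandpass filtering, rectification, low-pass filtering) reduces each band to a slowly varying envelope, and it is this discarding of temporal fine structure that degrades timbre. A stripped-down sketch of one channel's envelope extraction on a synthetic band signal; the sample rate, cutoff, and moving-average filter are illustrative choices, not the MATLAB simulation's actual filter designs or channel count.

```python
import math

RATE = 8000   # assumed sample rate (Hz)

def lowpass_ma(x, cutoff_hz):
    # crude low-pass: moving average whose window spans one cutoff period
    w = max(1, int(RATE / cutoff_hz))
    out, acc = [], 0.0
    for i, v in enumerate(x):
        acc += v
        if i >= w:
            acc -= x[i - w]
        out.append(acc / min(i + 1, w))
    return out

def channel_envelope(band_signal, cutoff_hz=400.0):
    # one analysis channel of a CIS-style strategy: full-wave rectify the
    # band-limited signal, then low-pass to keep only the slow envelope
    return lowpass_ma([abs(v) for v in band_signal], cutoff_hz)

# synthetic "band" output: a 1 kHz tone whose amplitude ramps from 0 to 1
tone = [(i / RATE) * math.sin(2 * math.pi * 1000 * i / RATE)
        for i in range(RATE)]
env = channel_envelope(tone)
# env tracks the loudness ramp; the 1 kHz fine structure is gone, which is
# the information loss hypothesized to impair instrument identification
```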
Improved simulation of poorly drained forests using Biome-BGC.
Bond-Lamberty, Ben; Gower, Stith T; Ahl, Douglas E
2007-05-01
Forested wetlands and peatlands are important in boreal and terrestrial biogeochemical cycling, but most general-purpose forest process models are designed and parameterized for upland systems. We describe changes made to Biome-BGC, an ecophysiological process model, that improve its ability to simulate poorly drained forests. Model changes allowed for: (1) lateral water inflow from a surrounding watershed, and variable surface and subsurface drainage; (2) adverse effects of anoxic soil on decomposition and nutrient mineralization; (3) closure of leaf stomata in flooded soils; and (4) growth of nonvascular plants (i.e., bryophytes). Bryophytes were treated as ectohydric broadleaf evergreen plants with zero stomatal conductance, whose cuticular conductance to CO(2) was dependent on plant water content. Individual model changes were parameterized with published data, and ecosystem-level model performance was assessed by comparing simulated output to field data from the northern BOREAS site in Manitoba, Canada. The simulation of the poorly drained forest model exhibited reduced decomposition and vascular plant growth (-90%) compared with that of the well-drained forest model; the integrated bryophyte photosynthetic response accorded well with published data. Simulated net primary production, biomass and soil carbon accumulation broadly agreed with field measurements, although simulated net primary production was higher than observed data in well-drained stands. Simulated net primary production in the poorly drained forest was most sensitive to oxygen restriction on soil processes, and secondarily to stomatal closure in flooded conditions. The modified Biome-BGC remains unable to simulate true wetlands that are subject to prolonged flooding, because it does not track organic soil formation, water table changes, soil redox potential or anaerobic processes.
The simulation study on optical target laser active detection performance
NASA Astrophysics Data System (ADS)
Li, Ying-chun; Hou, Zhao-fei; Fan, Youchen
2014-12-01
Based on the working principle of laser active detection systems, this paper establishes an optical-target laser active detection simulation system and carries out a simulation study of the system's detection process and detection performance. Performance models are built for laser emission, laser propagation in the atmosphere, reflection from the optical target, the receiver detection system, and signal processing and recognition. We focus on analyzing and modeling the relationship between the laser emission angle, the defocus amount, and the "cat-eye" effect echo in the reflection from the optical target. Performance indices such as operating range, SNR, and detection probability are then simulated. Parameters including the laser emission characteristics, the reflection of the optical target, and the atmospheric propagation strongly influence the performance of the system. Finally, using object-oriented software design methods, an open, fully functional laser active detection simulation platform is implemented that simulates the detection and recognition of an optical target, performs the performance simulation of each subsystem, and generates data reports and graphs. The visible simulation process makes the performance models more intuitive, and the simulation data provide a reference for adjusting system parameters as well as theoretical and technical support for top-level design and performance-index optimization of optical-target laser active detection systems.
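The dependence of the echo on range, target reflectivity, and atmospheric transmission can be illustrated with a toy link budget. All parameters below are illustrative assumptions, not values from the paper; the scattering is modeled crudely as spreading over a hemisphere on the return path.

```python
import math

def echo_power(p_tx, reflectivity, rx_aperture_area, rng_m, atm_transmission):
    """Toy laser active detection link budget: transmitted power is
    attenuated on the outbound leg, scattered by the target, spread over
    a hemisphere on the return leg, attenuated again, and collected by
    the receiver aperture. Purely illustrative."""
    scattered = p_tx * reflectivity * atm_transmission           # outbound leg
    return scattered * atm_transmission * rx_aperture_area / (2 * math.pi * rng_m ** 2)

p_1km = echo_power(1.0, 0.3, 0.005, 1000.0, 0.8)
p_2km = echo_power(1.0, 0.3, 0.005, 2000.0, 0.8)
```

Doubling the range quarters the received power in this model, which is why operating range and SNR are the central performance indices the simulation sweeps.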
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.
Building team adaptive capacity: the roles of sensegiving and team composition.
Randall, Kenneth R; Resick, Christian J; DeChurch, Leslie A
2011-05-01
The current study draws on motivated information processing in groups theory to propose that leadership functions and composition characteristics provide teams with the epistemic and social motivation needed for collective information processing and strategy adaptation. Three-person teams performed a city management decision-making simulation (N=74 teams; 222 individuals). Teams first managed a simulated city that was newly formed and required growth strategies and were then abruptly switched to a second simulated city that was established and required revitalization strategies. Consistent with hypotheses, external sensegiving and team composition enabled distinct aspects of collective information processing. Sensegiving prompted the emergence of team strategy mental models (i.e., cognitive information processing); psychological collectivism facilitated information sharing (i.e., behavioral information processing); and cognitive ability provided the capacity for both the cognitive and behavioral aspects of collective information processing. In turn, team mental models and information sharing enabled reactive strategy adaptation.
Absorptivity Measurements and Heat Source Modeling to Simulate Laser Cladding
NASA Astrophysics Data System (ADS)
Wirth, Florian; Eisenbarth, Daniel; Wegener, Konrad
The laser cladding process is gaining importance, as it allows not only the application of surface coatings but also the additive manufacturing of three-dimensional parts. In both cases, process simulation can contribute to process optimization. Heat source modeling is one of the main issues for an accurate model and simulation of the laser cladding process. While the laser beam intensity distribution is readily known, the other two main effects on the process heat input, namely the absorptivity of the applied materials and the powder attenuation, are non-trivial to determine. Therefore, calorimetry measurements were carried out. The measurement method and the measurement results for laser cladding of Stellite 6 on structural steel S 235 and for the processing of Inconel 625 are presented, using both a CO2 laser and a high power diode laser (HPDL). Additionally, a heat source model is deduced.
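A common form for such a heat source model is a Gaussian surface flux scaled by the measured absorptivity. The sketch below shows that form; the power, beam radius, and absorptivity values are illustrative assumptions, not the calorimetry results of the paper.

```python
import math

def absorbed_flux(x, y, power, beam_radius, absorptivity):
    """Absorbed surface heat flux (W/m^2) for a Gaussian beam scaled by
    an absorptivity factor. In a cladding simulation the absorptivity
    would come from calorimetry; the values used here are illustrative."""
    peak = 2.0 * power / (math.pi * beam_radius ** 2)   # Gaussian peak intensity
    return absorptivity * peak * math.exp(-2.0 * (x * x + y * y) / beam_radius ** 2)

q_center = absorbed_flux(0.0, 0.0, 1000.0, 0.002, 0.35)   # flux at beam center
q_edge = absorbed_flux(0.002, 0.0, 1000.0, 0.002, 0.35)   # flux at r = beam radius
```

Integrating this flux over the surface returns absorptivity times the laser power, which is exactly the quantity a calorimetric measurement constrains.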
Long-term archiving of the annotated three-dimensional digital mock-up
NASA Astrophysics Data System (ADS)
Kheddouci, Fawzi
The use of engineering drawings in the development of mechanical products, including the exchange of engineering data as well as for archiving, is common industry practice. Traditionally, paper has been the means of meeting those needs. However, these practices have evolved in favour of computerized tools and methods for the creation, diffusion and preservation of data involved in the process of developing aeronautical products characterized by life cycles that can exceed 70 years. Therefore, it is necessary to redefine how to maintain this data in a context whereby engineering drawings are being replaced by the 3D annotated digital mock-up. This thesis addresses the issue of long-term archiving of 3D annotated digital mock-ups, which include geometric and dimensional tolerances as well as other notes and specifications, in compliance with the requirements formulated by the aviation industry, including regulatory and legal requirements. First, we review the requirements imposed by the aviation industry in the context of long-term archiving of 3D annotated digital mock-ups. We then consider alternative solutions. We begin by identifying the theoretical approach behind the choice of a conceptual model for digital long-term archiving. Then we evaluate, among the proposed alternatives, an archiving format that will guarantee the preservation of the integrity of the 3D annotated model (geometry, tolerances and other metadata) and its sustainability. The evaluation of 3D PDF PRC as a potential archiving format is carried out on a sample of 185 3D CATIA V5 models (parts and assemblies) provided by industrial partners. This evaluation is guided by a set of criteria including the transfer of geometry, 3D annotations, views, captures and part positioning in assemblies. The results indicate that the exact geometry is successfully maintained when transferring CATIA V5 models to 3D PDF PRC.
Concerning the transfer of 3D annotations, we observed degradation associated with their display on the 3D model. This problem can, however, be solved by first converting the native model to STEP, and then to 3D PDF PRC. Given current tools, 3D PDF PRC is considered a potential solution for long-term archiving of 3D annotated models of individual parts. However, this solution is currently not deemed adequate for archiving assemblies; the practice of 2D drawing will thus remain, in the short term, for assemblies.
The finite element simulation analysis research of 38CrSi cylindrical power spinning
NASA Astrophysics Data System (ADS)
Liang, Wei; Lv, Qiongying; Zhao, Yujuan; Lv, Yunxia
2018-01-01
In order to explore the influence of the main cylindrical spinning process parameters on the spinning process, this paper combines a real tube power spinning process with the ABAQUS finite element analysis software to simulate the tube power spinning of 38CrSi steel. Through analysis of the stress and strain during forming, the influence of the thickness reduction and the feed rate on the forming process is examined, together with the variation of the spinning force, and a reasonable combination of the main spinning process parameters is finally determined.
Technology for Transient Simulation of Vibration during Combustion Process in Rocket Thruster
NASA Astrophysics Data System (ADS)
Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.
2018-01-01
The article describes a technology for simulating transient combustion processes in a rocket thruster in order to determine the vibration frequencies that occur during combustion. The engine operates on gaseous propellants: oxygen and hydrogen. Combustion was simulated using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. A way of obtaining quick CFD results with intermediate combustion components using an EDM model was found, and the generation of the Flamelet library with CFX-RIF is described. A technique for modeling transient combustion processes in the rocket thruster, based on the Flamelet library, is proposed. A cyclic irregularity of the temperature field, resembling vortex core precession, was detected in the chamber, and the frequency of the flame precession was obtained with the proposed simulation technique.
Numerical simulation of plasma processes driven by transverse ion heating
NASA Technical Reports Server (NTRS)
Singh, Nagendra; Chan, C. B.
1993-01-01
The plasma processes driven by transverse ion heating in a diverging flux tube are investigated with numerical simulation. The heating is found to drive a host of plasma processes in addition to the well-known phenomenon of ion conics. The downward electric field near the reverse shock generates a double-streaming situation consisting of two upflowing ion populations with different average flow velocities. The electric field in the reverse shock region is modulated by the ion-ion instability driven by the multistreaming ions. The oscillating fields in this region can heat electrons. These simulation results are compared with results from a previous study based on a hydrodynamical model. Effects of the spatial resolution provided by the simulations on the evolution of the plasma are discussed.
A fortran program for Monte Carlo simulation of oil-field discovery sequences
Bohling, Geoffrey C.; Davis, J.C.
1993-01-01
We have developed a program for performing Monte Carlo simulation of oil-field discovery histories. A synthetic parent population of fields is generated as a finite sample from a distribution of specified form. The discovery sequence then is simulated by sampling without replacement from this parent population in accordance with a probabilistic discovery process model. The program computes a chi-squared deviation between synthetic and actual discovery sequences as a function of the parameters of the discovery process model, the number of fields in the parent population, and the distributional parameters of the parent population. The program employs the three-parameter log gamma model for the distribution of field sizes and employs a two-parameter discovery process model, allowing the simulation of a wide range of scenarios.
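The core sampling step can be sketched as size-biased sampling without replacement: larger fields are more likely to be found first. This is a generic illustration of such a discovery-process model, not the paper's Fortran implementation; the field sizes and the exponent `beta` are illustrative, and the paper's parent population follows a log-gamma law rather than the fixed list used here.

```python
import random

def simulate_discovery_sequence(field_sizes, beta=1.0, seed=42):
    """Sample fields without replacement with probability proportional to
    size**beta -- a simple size-biased discovery-process model."""
    rng = random.Random(seed)
    remaining = list(field_sizes)
    sequence = []
    while remaining:
        weights = [s ** beta for s in remaining]          # size bias
        pick = rng.choices(range(len(remaining)), weights=weights)[0]
        sequence.append(remaining.pop(pick))              # discover and remove
    return sequence

seq = simulate_discovery_sequence([100, 50, 20, 10, 5])
```

Repeating this over many synthetic parent populations and comparing each synthetic sequence with the actual one is what drives the chi-squared fitting the abstract describes.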
Performance issues for domain-oriented time-driven distributed simulations
NASA Technical Reports Server (NTRS)
Nicol, David M.
1987-01-01
It has long been recognized that simulations form an interesting and important class of computations that may benefit from distributed or parallel processing. Since the point of parallel processing is improved performance, the recent proliferation of multiprocessors requires that we consider the performance issues that naturally arise when attempting to implement a distributed simulation. Three such issues are: (1) the problem of mapping the simulation onto the architecture, (2) the possibilities for performing redundant computation in order to reduce communication, and (3) the avoidance of deadlock due to distributed contention for message-buffer space. These issues are discussed in the context of a battlefield simulation implemented on a medium-scale multiprocessor message-passing architecture.
Space-filling designs for computer experiments: A review
Joseph, V. Roshan
2016-01-29
Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time consuming and, therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
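The basic building block of most space-filling designs is the Latin hypercube: each dimension is cut into n bins and each bin receives exactly one point. The sketch below shows that construction (maximum projection designs refine the point placement further by optimizing a projection criterion, which is not done here); the sizes are illustrative.

```python
import random

def latin_hypercube(n, d, seed=0):
    """n points in [0,1]^d with exactly one point per axis-aligned bin in
    each dimension -- the basic (unoptimized) Latin hypercube design."""
    rng = random.Random(seed)
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)                                  # bin order per dimension
        cols.append([(p + rng.random()) / n for p in perm])  # jitter within bin
    return [tuple(col[i] for col in cols) for i in range(n)]

design = latin_hypercube(8, 2)
```

Such a design guarantees good one-dimensional coverage for the same budget of simulator runs, which is why it is the standard starting point for computer experiments.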
A New Numerical Simulation technology of Multistage Fracturing in Horizontal Well
NASA Astrophysics Data System (ADS)
Cheng, Ning; Kang, Kaifeng; Li, Jianming; Liu, Tao; Ding, Kun
2017-11-01
Horizontal multi-stage fracturing is recognized as an effective development technology for unconventional oil resources. Geomechanics occupies a very important position in the numerical simulation of hydraulic fracturing: by accounting for geomechanical effects, the new numerical simulation can optimize fracturing designs and evaluate post-fracturing production more effectively than conventional numerical simulation. This study builds on a three-dimensional stress and rock-physics parameter model and uses the latest fluid-solid coupling numerical simulation technology to trace the propagation of fractures and describe the change of the stress field during fracturing, and finally to predict production.
A high-order language for a system of closely coupled processing elements
NASA Technical Reports Server (NTRS)
Feyock, S.; Collins, W. R.
1986-01-01
The research reported in this paper was occasioned by the requirements of the Real-Time Digital Simulator (RTDS) project under way at NASA Lewis Research Center. The RTDS simulation scheme employs a network of CPUs running lock-step cycles in the parallel computation of jet airplane simulations. The need for a high-order language (HOL) that would allow non-experts to write simulation applications and that could be implemented on a possibly varying network can best be fulfilled by using the programming language Ada. We describe how the simulation problems can be modeled in Ada, how to map a single, multi-processing Ada program into code for individual processors regardless of network reconfiguration, and why some Ada language features are particularly well-suited to network simulations.
Finite element simulation and Experimental verification of Incremental Sheet metal Forming
NASA Astrophysics Data System (ADS)
Kaushik Yanamundra, Krishna; Karthikeyan, R., Dr.; Naranje, Vishal, Dr
2018-04-01
Incremental sheet metal forming is now a proven manufacturing technique that can be employed to obtain application-specific, customized, symmetric or asymmetric shapes required by the automobile or biomedical industries, such as car body parts, dental implants or knee implants. Finite element simulation of the metal forming process is performed successfully using explicit dynamics analysis in commercial FE software; such simulation is mainly useful for optimizing the process and designing the final product. This paper focuses on simulating the incremental sheet metal forming process in ABAQUS and validating the results experimentally. The test shapes are trapezoidal, dome and elliptical; their G-codes are written and fed into a CNC milling machine fitted with a forming tool with a hemispherical tip. The same pre-generated coordinates are used to simulate similar machining conditions in ABAQUS, yielding the tool forces, stresses and strains in the workpiece as output data. The forces were recorded experimentally using a dynamometer. The experimental and simulated results were then compared and conclusions drawn.
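The pre-generated tool coordinates mentioned above are typically a helical path that spirals the tool inward and downward one increment per revolution. The sketch below generates such a path for a dome-like shape; the geometry, step sizes, and the cosine radius profile are illustrative assumptions, not the paper's actual G-code.

```python
import math

def dome_spiral_path(radius, depth_per_rev, n_revs, pts_per_rev=90):
    """Illustrative helical tool path for incremental forming of a
    dome-like shape: the tool descends depth_per_rev per revolution
    while the radius shrinks toward the apex."""
    path = []
    total = n_revs * pts_per_rev
    for i in range(total):
        theta = 2 * math.pi * i / pts_per_rev
        z = -depth_per_rev * i / pts_per_rev                 # steady descent
        r = radius * math.cos(0.5 * math.pi * i / total)     # shrink toward apex
        path.append((r * math.cos(theta), r * math.sin(theta), z))
    return path

path = dome_spiral_path(20.0, 0.5, 10)
```

The same coordinate list can be emitted as G01 moves for the CNC machine and as a displacement boundary condition for the forming tool in the FE model, which keeps experiment and simulation aligned.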
Passive coherent location system simulation and evaluation
NASA Astrophysics Data System (ADS)
Slezák, Libor; Kvasnička, Michael; Pelant, Martin; Vávra, Jiří; Plšek, Radek
2006-02-01
Passive Coherent Location (PCL) is becoming an important and promising technique for the passive location of non-cooperative and stealth targets. It works with illuminators of opportunity. PCL is intended to be part of the mobile Air Command and Control System (ACCS) as a Deployable ACCS Component (DAC). The company ERA has been working on a PCL system parameter verification program, based on the development of a complete PCL simulator, since 2003, with financial participation from the Czech DoD. The simulator currently covers moving-target scenarios, RCS calculation by the method of moments, ground clutter scattering, and the signal processing method (the bottleneck of PCL). The digital signal processing (DSP) algorithms are applied both to simulated data and to real data measured by the NATO C3 Agency in their Hague experiment. The Institute of Information Theory and Automation of the Academy of Sciences of the Czech Republic is taking part in the implementation of the DSP algorithms in FPGAs. The paper describes the structure of the simulator and of the signal processing, and presents results on both simulated and measured data.
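The basic measurement a PCL system extracts is the echo's extra delay relative to the direct-path signal from the illuminator, which fixes the bistatic range sum and hence an ellipse of possible target positions. A minimal sketch of that geometry (coordinates and values are illustrative):

```python
import math

def bistatic_range_sum(tx, rx, target):
    """Range sum transmitter -> target -> receiver, the quantity a PCL
    system measures via the echo delay."""
    return math.dist(tx, target) + math.dist(target, rx)

def range_sum_from_delay(extra_delay_s, baseline_m, c=3.0e8):
    """Convert the echo's measured extra delay (relative to the direct
    signal) into the bistatic range sum; the target lies on the ellipse
    with the transmitter and receiver at its foci."""
    return baseline_m + c * extra_delay_s

tx, rx = (0.0, 0.0), (100.0, 0.0)
rs = bistatic_range_sum(tx, rx, (50.0, 50.0))
rs2 = range_sum_from_delay((rs - 100.0) / 3.0e8, 100.0)
```

Intersecting ellipses from several illuminator-receiver pairs (plus Doppler and bearing, where available) is what turns these delay measurements into a target position.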
Integrated modeling and heat treatment simulation of austempered ductile iron
NASA Astrophysics Data System (ADS)
Hepp, E.; Hurevich, V.; Schäfer, W.
2012-07-01
The integrated modeling and simulation of the casting and heat treatment processes for producing austempered ductile iron (ADI) castings is presented. The focus is on describing different models to simulate the austenitization, quenching and austempering steps during ADI heat treatment. The starting point for the heat treatment simulation is the simulated microstructure after solidification and cooling. The austenitization model considers the transformation of the initial ferrite-pearlite matrix into austenite as well as the dissolution of graphite in austenite to attain a uniform carbon distribution. The quenching model is based on measured CCT diagrams. Measurements have been carried out to obtain these diagrams for different alloys with varying Cu, Ni and Mo contents. The austempering model includes nucleation and growth kinetics of the ADI matrix. The model of ADI nucleation is based on experimental measurements made for varied Cu, Ni, Mo contents and austempering temperatures. The ADI kinetic model uses a diffusion controlled approach to model the growth. The models have been integrated in a tool for casting process simulation. Results are shown for the optimization of the heat treatment process of a planetary carrier casting.
Physical Processes and Applications of the Monte Carlo Radiative Energy Deposition (MRED) Code
NASA Astrophysics Data System (ADS)
Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Fleetwood, Daniel M.; Warren, Kevin M.; Sierawski, Brian D.; King, Michael P.; Schrimpf, Ronald D.; Auden, Elizabeth C.
2015-08-01
MRED is a Python-language scriptable computer application that simulates radiation transport. It is the computational engine for the on-line tool CRÈME-MC. MRED is based on c++ code from Geant4 with additional Fortran components to simulate electron transport and nuclear reactions with high precision. We provide a detailed description of the structure of MRED and the implementation of the simulation of physical processes used to simulate radiation effects in electronic devices and circuits. Extensive discussion and references are provided that illustrate the validation of models used to implement specific simulations of relevant physical processes. Several applications of MRED are summarized that demonstrate its ability to predict and describe basic physical phenomena associated with irradiation of electronic circuits and devices. These include effects from single particle radiation (including both direct ionization and indirect ionization effects), dose enhancement effects, and displacement damage effects. MRED simulations have also helped to identify new single event upset mechanisms not previously observed by experiment, but since confirmed, including upsets due to muons and energetic electrons.
Badal, Andreu; Badano, Aldo
2009-11-01
It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
A new lumped-parameter model for flow in unsaturated dual-porosity media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimmerman, Robert W.; Hadgu, Teklu; Bodvarsson, Gudmundur S.
A new lumped-parameter approach to simulating unsaturated flow processes in dual-porosity media such as fractured rocks or aggregated soils is presented. Fluid flow between the fracture network and the matrix blocks is described by a non-linear equation that relates the imbibition rate to the local difference in liquid-phase pressure between the fractures and the matrix blocks. Unlike a Warren-Root-type equation, this equation is accurate in both the early and late time regimes. The fracture/matrix interflow equation has been incorporated into an existing unsaturated flow simulator, to serve as a source/sink term for fracture gridblocks. Flow processes are then simulated using only fracture gridblocks in the computational grid. This new lumped-parameter approach has been tested on two problems involving transient flow in fractured/porous media, and compared with simulations performed using explicit discretization of the matrix blocks. The new procedure seems to accurately simulate flow processes in unsaturated fractured rocks, and typically requires an order of magnitude less computational time than do simulations using fully-discretized matrix blocks.
An intersubject variable regional anesthesia simulator with a virtual patient architecture.
Ullrich, Sebastian; Grottke, Oliver; Fried, Eduard; Frommen, Thorsten; Liao, Wei; Rossaint, Rolf; Kuhlen, Torsten; Deserno, Thomas M
2009-11-01
The main purpose is to provide an intuitive VR-based training environment for regional anesthesia (RA). The research question is how to process subject-specific datasets, organize them in a meaningful way, and perform the simulation for peripheral regions. We propose a flexible virtual patient architecture and methods to process the datasets. Image acquisition, image processing (especially segmentation), interactive nerve modeling and permutations (nerve instantiation) are described in detail. Simulating electrical impulse stimulation and the corresponding responses is essential for the training of peripheral RA; this is solved by an approach based on the electric distance. We have created an XML-based virtual patient database with several subjects. Prototypes of the simulation are implemented and run on multimodal VR hardware (e.g., a stereoscopic display and a haptic device). A first pilot user study has confirmed our approach. The virtual patient architecture enables support for arbitrary scenarios on different subjects, and this concept can also be used for other simulators. In future work, we plan to extend the simulation and conduct further evaluations in order to provide a tool for routine RA training.
Abdominal surgery process modeling framework for simulation using spreadsheets.
Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja
2015-08-01
We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. A simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connection with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain durations of the activities are modeled using the function "rand()". Patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
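The spreadsheet logic described (uncertain durations via rand(), first-in-first-served routing via nested if()) maps directly onto a small procedural simulation. The sketch below mirrors that logic in Python; the activity means, uniform-duration model, and fixed 10-minute arrival spacing are illustrative assumptions, not parameters of the clinic model.

```python
import random

def simulate_patient_flow(n_patients, activity_means, seed=1):
    """First-in-first-served flow through a sequence of single-server
    activities with uncertain (uniform) durations, mirroring the
    spreadsheet's rand()/if() mechanics."""
    rng = random.Random(seed)
    free_at = [0.0] * len(activity_means)   # when each activity is next free
    finish_times = []
    arrival = 0.0
    for _ in range(n_patients):
        t = arrival
        for i, mean in enumerate(activity_means):
            start = max(t, free_at[i])               # wait if activity is busy
            duration = rng.uniform(0.5, 1.5) * mean  # uncertain duration
            free_at[i] = start + duration
            t = start + duration
        finish_times.append(t)
        arrival += 10.0   # next patient arrives 10 minutes later (illustrative)
    return finish_times

times = simulate_patient_flow(5, [15.0, 30.0, 10.0])
```

Because each activity is a single FIFO server, patients finish in arrival order, which is the property the spreadsheet's nested if() tracking relies on.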
Electrical Storm Simulation to Improve the Learning Physics Process
ERIC Educational Resources Information Center
Martínez Muñoz, Miriam; Jiménez Rodríguez, María Lourdes; Gutiérrez de Mesa, José Antonio
2013-01-01
This work is part of a research project whose main objective is to understand the impact that the use of Information and Communication Technology (ICT) has on the teaching and learning process on the subject of Physics. We will show that, with the use of a storm simulator, physics students improve their learning process on one hand they understand…
1983-06-01
constrained at each step. Use of discrete simulation can be a powerful tool in this process if its role is carefully planned. The gross behavior of the...by projecting: the arrival of units of work at SPLICE processing facilities (workload analysis), and the amount of processing resources consumed in
Random Process Simulation for stochastic fatigue analysis. Ph.D. Thesis - Rice Univ., Houston, Tex.
NASA Technical Reports Server (NTRS)
Larsen, Curtis E.
1988-01-01
A simulation technique is described which directly synthesizes the extrema of a random process and is more efficient than the Gaussian simulation method. Such a technique is particularly useful in stochastic fatigue analysis because the required stress range moment, E[R^m], is a function only of the extrema of the random stress process. The family of autoregressive moving average (ARMA) models is reviewed and an autoregressive model is presented for modeling the extrema of any random process which has a unimodal power spectral density (psd). The proposed autoregressive technique is found to produce rainflow stress range moments which compare favorably with those computed by the Gaussian technique and to average 11.7 times faster than the Gaussian technique. The autoregressive technique is also adapted for processes having bimodal psds. The adaptation involves using two autoregressive processes to simulate the extrema due to each mode and the superposition of these two extrema sequences. The proposed autoregressive superposition technique is 9 to 13 times faster than the Gaussian technique and produces comparable values of E[R^m] for bimodal psds.
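The simplest member of the autoregressive family the abstract draws on is the AR(1) recursion x[t] = phi*x[t-1] + e[t]. The sketch below simulates it; the coefficient and noise level are illustrative, and the thesis fits such models to the extrema of the stress process rather than simulating a raw sequence as done here.

```python
import random

def simulate_ar1(phi, sigma, n, seed=7):
    """Simulate an AR(1) sequence x[t] = phi*x[t-1] + e[t] with Gaussian
    innovations -- the simplest autoregressive (ARMA-family) model."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)   # AR(1) recursion
        out.append(x)
    return out

xs = simulate_ar1(0.8, 1.0, 5000)
```

For AR(1), the stationary variance is sigma^2 / (1 - phi^2) and the lag-1 autocorrelation equals phi, so a long simulated sequence can be checked against these closed-form values.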
NASA Astrophysics Data System (ADS)
Green, Tim; Faulkner, Andrew; Rosen, Stuart; Macherey, Olivier
2005-07-01
Standard continuous interleaved sampling processing, and a modified processing strategy designed to enhance temporal cues to voice pitch, were compared on tests of intonation perception, and vowel perception, both in implant users and in acoustic simulations. In standard processing, 400 Hz low-pass envelopes modulated either pulse trains (implant users) or noise carriers (simulations). In the modified strategy, slow-rate envelope modulations, which convey dynamic spectral variation crucial for speech understanding, were extracted by low-pass filtering (32 Hz). In addition, during voiced speech, higher-rate temporal modulation in each channel was provided by 100% amplitude-modulation by a sawtooth-like wave form whose periodicity followed the fundamental frequency (F0) of the input. Channel levels were determined by the product of the lower- and higher-rate modulation components. Both in acoustic simulations and in implant users, the ability to use intonation information to identify sentences as question or statement was significantly better with modified processing. However, while there was no difference in vowel recognition in the acoustic simulation, implant users performed worse with modified processing both in vowel recognition and in formant frequency discrimination. It appears that, while enhancing pitch perception, modified processing harmed the transmission of spectral information.
Simulation of Water Gas Shift Zeolite Membrane Reactor
NASA Astrophysics Data System (ADS)
Makertiharta, I. G. B. N.; Rizki, Z.; Zunita, Megawati; Dharmawijaya, P. T.
2017-07-01
The search for alternative energy sources keeps growing from time to time. Various alternatives have been introduced to reduce the use of fossil fuel, including hydrogen. Many pathways can be used to produce hydrogen. Among all of those, the Water Gas Shift (WGS) reaction is the most common pathway to produce high purity hydrogen. The WGS technique faces a downstream processing challenge due to the removal of hydrogen from the product stream, since the stream contains a mixture of hydrogen, carbon dioxide and also the excess reactants. An integrated process using a zeolite membrane reactor has been introduced to improve the performance of the process by selectively separating the hydrogen whilst boosting the conversion. Furthermore, the zeolite membrane reactor can be further improved by optimizing the process conditions. This paper discusses the simulation of a Zeolite Membrane Water Gas Shift Reactor (ZMWGSR) with variation of process conditions to achieve an optimum performance. The process can be simulated as two consecutive mechanisms: the reaction, followed by the permeation of gases through the zeolite membrane. This paper is focused on the optimization of the process parameters (e.g. temperature, initial concentration) and also membrane properties (e.g. pore size) to achieve an optimum product specification (concentration, purity).
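The reaction half of such a simulation reduces, at equilibrium, to solving a single nonlinear equation for the CO conversion. A minimal sketch, assuming the textbook Moe correlation for the WGS equilibrium constant and a feed of CO and steam only (the membrane/permeation stage is omitted):

```python
import math

def wgs_keq(T):
    # Moe correlation for CO + H2O <-> CO2 + H2 (T in kelvin).
    return math.exp(4577.8 / T - 4.33)

def equilibrium_conversion(T, steam_ratio):
    # Solve Keq = x^2 / ((1 - x) * (s - x)) for the CO conversion x
    # by bisection; the left-hand side is monotone increasing in x
    # on (0, min(1, s)).
    keq = wgs_keq(T)
    s = steam_ratio
    lo, hi = 0.0, min(1.0, s) - 1e-12
    for _ in range(100):
        x = 0.5 * (lo + hi)
        f = x * x / ((1.0 - x) * (s - x)) - keq
        if f < 0.0:
            lo = x
        else:
            hi = x
    return x

print(equilibrium_conversion(600.0, 2.0))
```

Scanning temperature with such a routine reproduces the qualitative trade-off the paper optimizes: WGS is exothermic, so lower temperature raises equilibrium conversion while slowing kinetics, and the membrane shifts the balance further by removing H2 from the product side.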
NASA Astrophysics Data System (ADS)
Safarzadeh, Mohammadtaher; Scannapieco, Evan
2018-06-01
The history of r-process enrichment in our galaxy is modeled through a novel set of zoom cosmological simulations of a Milky Way-type galaxy. r-process sources are assumed to be neutron star mergers with a distribution of natal kicks and merger times. We model turbulent mixing to estimate the pristine gas fraction in each simulation cell, which we use to determine Pop III star formation, with carbon-rich ejecta assigned when these stars go off as SNe. We follow the formation of Carbon-Enhanced Metal-Poor (CEMP) stars and the statistics of different classes of r-process enhanced stars. The simulations underpredict the frequency of CEMP/MP stars by a factor of 2-4. Likewise, the MP-rI/MP, MP-rII/MP, and CEMP-r/CEMP cumulative ratios are all underpredicted by 1-2 orders of magnitude. Our results show that NS binaries by themselves fall short of explaining the observed frequency of r-process enhanced stars, and other sources of r-process enrichment at high redshifts are needed to fill the gap.
Simulations of ecosystem hydrological processes using a unified multi-scale model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Liu, Chongxuan; Fang, Yilin
2015-01-01
This paper presents a unified multi-scale model (UMSM) that we developed to simulate hydrological processes in an ecosystem containing both surface water and groundwater. The UMSM approach modifies the Navier–Stokes equation by adding a Darcy force term to formulate a single set of equations to describe fluid momentum, and uses a generalized equation to describe fluid mass balance. The advantage of the approach is that this single set of equations can describe hydrological processes in both surface water and groundwater, where different models are traditionally required to simulate fluid flow. This feature of the UMSM significantly facilitates modelling of hydrological processes in ecosystems, especially at locations where soil/sediment may be frequently inundated and drained in response to precipitation and regional hydrological and climate changes. In this paper, the UMSM was benchmarked against WASH123D, a model commonly used for simulating coupled surface water and groundwater flow. The Disney Wilderness Preserve (DWP) site at Kissimmee, Florida, where active field monitoring and measurements are ongoing to understand hydrological and biogeochemical processes, was then used as an example to illustrate the UMSM modelling approach. The simulation results demonstrated that the DWP site is subject to frequent changes in soil saturation, in the geometry and volume of surface water bodies, and in groundwater and surface water exchange. All the hydrological phenomena in the surface water and groundwater components, including inundation and draining, river bank flow, groundwater table change, soil saturation, hydrological interactions between groundwater and surface water, and the migration of surface water and groundwater interfaces, can be simultaneously simulated using the UMSM.
Overall, the UMSM offers a cross-scale approach that is particularly suitable for simulating coupled surface and ground water flow in ecosystems with strong surface water and groundwater interactions.
NASA Astrophysics Data System (ADS)
Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.
2012-12-01
Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline development of a coastal analysis toolbox. The simulations focus on the particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability. Figure 1. Isosurfaces representing the evolution of the shoreline and a z=4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
MJO simulation in CMIP5 climate models: MJO skill metrics and process-oriented diagnosis
NASA Astrophysics Data System (ADS)
Ahn, Min-Seop; Kim, Daehyun; Sperber, Kenneth R.; Kang, In-Sik; Maloney, Eric; Waliser, Duane; Hendon, Harry
2017-12-01
The Madden-Julian Oscillation (MJO) simulation diagnostics developed by the MJO Working Group and the process-oriented MJO simulation diagnostics developed by the MJO Task Force are applied to 37 Coupled Model Intercomparison Project phase 5 (CMIP5) models in order to assess model skill in representing the amplitude, period, and coherent eastward propagation of the MJO, and to establish a link between MJO simulation skill and parameterized physical processes. Process-oriented diagnostics include the Relative Humidity Composite based on Precipitation (RHCP), Normalized Gross Moist Stability (NGMS), and the Greenhouse Enhancement Factor (GEF). Numerous scalar metrics are developed to quantify the results. Most CMIP5 models underestimate MJO amplitude, especially when outgoing longwave radiation (OLR) is used in the evaluation, and exhibit too fast a phase speed while lacking coherence between the eastward propagation of precipitation/convection and the wind field. The RHCP metric, indicative of the sensitivity of simulated convection to low-level environmental moisture, and the NGMS metric, indicative of the efficiency of a convective atmosphere in exporting moist static energy out of the column, show robust correlations with a large number of MJO skill metrics. The GEF metric, indicative of the strength of the column-integrated longwave radiative heating due to cloud-radiation interaction, is also correlated with the MJO skill metrics, but shows relatively lower correlations than the RHCP and NGMS metrics. Our results suggest that modifications to processes associated with moisture-convection coupling and the gross moist stability might be the most fruitful for improving simulations of the MJO. Though the GEF metric exhibits lower correlations with the MJO skill metrics, the longwave radiation feedback is highly relevant for simulating the weak precipitation anomaly regime that may be important for the establishment of shallow convection and the transition to deep convection.
Advances in Integrated Computational Materials Engineering "ICME"
NASA Astrophysics Data System (ADS)
Hirsch, Jürgen
The methods of Integrated Computational Materials Engineering that were developed and successfully applied for Aluminium have been constantly improved. The main aspects and recent advances of integrated material and process modeling are simulations of material properties, such as strength and forming properties, and of the specific microstructure evolution during processing (rolling, extrusion, annealing) under the influence of material constitution and process variations, through the production process down to the final application. Examples are discussed for the through-process simulation of microstructures and related properties of Aluminium sheet, including DC ingot casting, pre-heating and homogenization, hot and cold rolling, and final annealing. New results on the simulation of solution annealing and age hardening of 6xxx alloys for automotive applications are included. Physically based quantitative descriptions and computer-assisted evaluation methods are new ICME approaches for integrating new simulation tools, also for customer applications such as heat-affected zones in the welding of age-hardening alloys. The aspects of estimating the effect of specific elements due to growing recycling volumes, requested also for high-end Aluminium products, are discussed as well, being of special interest in the Aluminium producing industries.
Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda
2002-01-01
A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch SEC process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.
Simulations Build Efficacy: Empirical Results from a Four-Week Congressional Simulation
ERIC Educational Resources Information Center
Mariani, Mack; Glenn, Brian J.
2014-01-01
This article describes a four-week congressional committee simulation implemented in upper level courses on Congress and the Legislative process at two liberal arts colleges. We find that the students participating in the simulation possessed high levels of political knowledge and confidence in their political skills prior to the simulation. An…
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models shall be monitored in context all along the design phases to build confidence that the goals of modelling and simulation are achieved. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
A simulation study on garment manufacturing process
NASA Astrophysics Data System (ADS)
Liong, Choong-Yeun; Rahim, Nur Azreen Abdul
2015-02-01
The garment industry is an important industry and continues to evolve in order to meet consumers' high demands. Therefore, elements of innovation and improvement are important. In this work, research studies were conducted at a local company in order to model the sewing process of clothes manufacturing by using simulation modeling. Clothes manufacturing at the company involves 14 main processes: connecting the pattern, center sewing and side neatening, pocket sewing, backside sewing, attaching the front and back, sleeve preparation, attaching the sleeves and overlocking, collar preparation, collar sewing, bottom-edge sewing, buttonhole sewing, removing excess thread, marking buttons, and button cross sewing. These fourteen processes are operated by only six tailors; the last four processes are done by a single tailor. Data collection was conducted by on-site observation, and the probability distribution of processing time for each process was determined using @Risk's BestFit. A simulation model was then developed using Arena software based on the data collected. An animated simulation model was developed in order to facilitate understanding and to verify that the model represents the actual system. With such a model, what-if analyses and different scenarios of operations can be experimented with virtually. The animation and improvement models will be presented in further work.
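The serial sewing line described above can be sketched as a small flow-shop simulation. The station list and triangular time parameters below are hypothetical placeholders (the study fits real distributions with @Risk and models the full 14 steps in Arena):

```python
import random

# Hypothetical (min, mode, max) processing times in minutes for a
# shortened four-station line; each station handles one garment at a time.
STATIONS = [
    ("connect pattern", (2.0, 3.0, 5.0)),
    ("pocket sewing", (3.0, 4.0, 6.0)),
    ("attach sleeves", (4.0, 5.0, 8.0)),
    ("button work", (5.0, 7.0, 10.0)),
]

def simulate_line(n_garments, seed=0):
    rng = random.Random(seed)
    free_at = {name: 0.0 for name, _ in STATIONS}  # when each station frees up
    makespan = 0.0
    for _ in range(n_garments):
        t = 0.0  # this garment's clock
        for name, (lo, mode, hi) in STATIONS:
            start = max(t, free_at[name])          # queue for the station
            t = start + rng.triangular(lo, hi, mode)
            free_at[name] = t                      # station busy until t
        makespan = max(makespan, t)
    return makespan

print(simulate_line(20))
```

Rerunning with different station assignments or extra tailors is the what-if analysis the abstract mentions: compare makespans across scenarios without touching the real line.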
An earth imaging camera simulation using wide-scale construction of reflectance surfaces
NASA Astrophysics Data System (ADS)
Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk
2013-10-01
Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
Refined Simulation of Satellite Laser Altimeter Full Echo Waveform
NASA Astrophysics Data System (ADS)
Men, H.; Xing, Y.; Li, G.; Gao, X.; Zhao, Y.; Gao, X.
2018-04-01
The return waveform of a satellite laser altimeter plays a vital role in satellite parameter design, data processing and applications. In this paper, a method for refined full-waveform simulation is proposed based on the reflectivity of the ground target, the true emission waveform and the Laser Profile Array (LPA). ICESat/GLAS data are used as the validation data. We then evaluated the simulation accuracy with the correlation coefficient. It was found that the accuracy of echo simulation can be significantly improved by considering the reflectivity of the ground target and the emission waveform. However, the laser intensity distribution recorded by the LPA has little effect on the echo simulation accuracy when compared with the distribution of the simulated laser energy. Finally, we propose a refinement idea based on the experimental results, in the hope of providing a reference for the waveform data simulation and processing of the GF-7 satellite in the future.
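At its core, such a waveform simulation is a discrete convolution of the emitted pulse with a target response, scored against a reference with a correlation coefficient. A minimal sketch with a hypothetical Gaussian pulse and a single flat reflector (no LPA, noise, or footprint model):

```python
import math

def gaussian_pulse(n, center, width):
    # Idealized emitted pulse; real work would use the recorded waveform.
    return [math.exp(-0.5 * ((i - center) / width) ** 2) for i in range(n)]

def convolve(signal, response):
    # Full discrete convolution: echo = emitted pulse * surface response.
    out = [0.0] * (len(signal) + len(response) - 1)
    for i, s in enumerate(signal):
        for j, r in enumerate(response):
            out[i + j] += s * r
    return out

def correlation(a, b):
    # Pearson correlation, used here as the simulation-accuracy measure.
    n = min(len(a), len(b))
    a, b = a[:n], b[:n]
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

# Hypothetical flat target with reflectivity 0.4 at a 10-sample delay:
# the echo is a scaled, delayed copy of the emitted pulse.
tx = gaussian_pulse(64, center=20, width=3.0)
echo = convolve(tx, [0.0] * 10 + [0.4])
print(correlation(tx[:40], echo[10:50]))
```

Replacing the single-spike response with a sampled height/reflectivity profile of the footprint is what turns this toy into the refined simulation the paper describes.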
Simulating the flow of entangled polymers.
Masubuchi, Yuichi
2014-01-01
To optimize automation for polymer processing, attempts have been made to simulate the flow of entangled polymers. In industry, fluid dynamics simulations with phenomenological constitutive equations have been practically established. However, to account for molecular characteristics, a method to obtain the constitutive relationship from the molecular structure is required. Molecular dynamics simulations with atomic description are not practical for this purpose; accordingly, coarse-grained models with reduced degrees of freedom have been developed. Although the modeling of entanglement is still a challenge, mesoscopic models with a priori settings to reproduce entangled polymer dynamics, such as tube models, have achieved remarkable success. To use the mesoscopic models as staging posts between atomistic and fluid dynamics simulations, studies have been undertaken to establish links from the coarse-grained model to the atomistic and macroscopic simulations. Consequently, integrated simulations from materials chemistry to predict the macroscopic flow in polymer processing are forthcoming.
Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Shuangshuang; Chen, Yousu; Wu, Di
2015-12-09
Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by protective branch switching operations. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using single-processor based dynamic simulation solutions. High-performance computing (HPC) based parallel computing is a very promising technology to speed up the computation and facilitate the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation, using Open Multi-Processing (OpenMP) on a shared-memory platform and Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performances for running parallel dynamic simulation are compared and demonstrated.
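The decomposition itself can be sketched independently of OpenMP or MPI: partition the generators across workers, integrate each subset, and gather the results. A toy Python sketch, where threads stand in for the shared-memory workers, the swing-equation parameters are hypothetical, and only the partitioning pattern (not HPC performance) is illustrated:

```python
import math
from concurrent.futures import ThreadPoolExecutor

def integrate_generator(state, dt=0.001, steps=2000):
    # Toy damped swing-equation update for one generator:
    # d(delta)/dt = omega, d(omega)/dt = Pm - D*omega - sin(delta).
    delta, omega = state
    for _ in range(steps):
        delta += omega * dt
        omega += (0.5 - 0.1 * omega - math.sin(delta)) * dt
    return delta, omega

def simulate_parallel(states, workers=4):
    # Data-parallel step: each worker advances a disjoint subset of
    # generators, mirroring the OpenMP/MPI work partitioning.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(integrate_generator, states))

initial = [(0.1 * i, 0.0) for i in range(8)]
print(simulate_parallel(initial)[0])
```

In a real solver the generators are coupled through the network algebraic equations, so each parallel step must exchange boundary values; that communication is exactly where the shared-memory and message-passing implementations diverge.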
RFI and SCRIMP Model Development and Verification
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Sayre, Jay
2000-01-01
Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done on efforts to reduce the costly trial-and-error methods of VARTM processing currently in practice. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible in all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed.
This configuration evolved from the variation of the process constraints in the modeling of several different composite panels. The configuration was proposed by considering such factors as: infiltration time, the number of vacuum ports, and possible areas of void entrapment.
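In the simplest 1-D constant-pressure case, the infiltration times such a model predicts follow directly from Darcy's law. A back-of-the-envelope sketch; the property values below are hypothetical, and the real model is three-dimensional with capillary and race-tracking effects:

```python
def infiltration_time(length, porosity, viscosity, permeability, delta_p):
    # 1-D Darcy flow front under constant driving pressure:
    # t = porosity * mu * L^2 / (2 * K * dP)
    return porosity * viscosity * length ** 2 / (2.0 * permeability * delta_p)

# Hypothetical VARTM-like numbers: 0.3 m flow length, resin at 0.3 Pa.s,
# preform permeability 2e-10 m^2, ~1 atm vacuum drive.
print(infiltration_time(0.3, 0.5, 0.3, 2e-10, 1.0e5))
```

The quadratic dependence on flow length is why injection-port placement and high-permeability distribution media dominate the process constraints the model is used to explore.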
Stanley, Claire; Lindsay, Sally; Parker, Kathryn; Kawamura, Anne; Samad Zubairi, Mohammad
2018-05-09
We previously reported that experienced clinicians find that the process of collectively building and participating in simulations provides (1) a unique reflective opportunity; (2) a venue to identify different perspectives through discussion and action in a group; and (3) a safe environment for learning. No studies have assessed the value of collaborating with standardized patients (SPs) and patient facilitators (PFs) in the process. In this work, we describe this collaboration in building a simulation and the key elements that facilitate reflection. Three simulation scenarios surrounding communication were built by teams of clinicians, a PF, and SPs. Six build sessions were audio recorded, transcribed, and thematically analyzed through an iterative process to (1) describe the steps of building a simulation scenario and (2) identify the key elements involved in the collaboration. The five main steps to build a simulation scenario were (1) storytelling and reflection; (2) defining objectives and brainstorming ideas; (3) building a stem and creating a template; (4) refining the scenario with feedback from SPs; and (5) mock run-throughs with follow-up discussion. During these steps, the PF shared personal insights, challenging participants to reflect deeper to better understand and consider the patient's perspective. The SPs provided a unique outside perspective to the group. In addition, the interaction between the SPs and the PF helped refine character roles. A collaborative approach incorporating feedback from PFs and SPs to create a simulation scenario is a valuable method to enhance reflective practice for clinicians.
Numerical propulsion system simulation: An interdisciplinary approach
NASA Technical Reports Server (NTRS)
Nichols, Lester D.; Chamis, Christos C.
1991-01-01
The tremendous progress being made in computational engineering and the rapid growth in computing power that is resulting from parallel processing now make it feasible to consider the use of computer simulations to gain insights into the complex interactions in aerospace propulsion systems and to evaluate new concepts early in the design process before a commitment to hardware is made. Described here is a NASA initiative to develop a Numerical Propulsion System Simulation (NPSS) capability.
2017-06-01
designed experiment to model and explore a ship-to-shore logistics process supporting dispersed units via three types of ULSs, which vary primarily in... Keywords: systems, simulation, discrete event simulation, design of experiments, data analysis, simplekit, nearly orthogonal and balanced designs.
Dynamic simulation of Static Var Compensators in distribution systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koessler, R.J.
1992-08-01
This paper is a system study guide for the correction of voltage dips due to large motor startups with Static Var Compensators (SVCs). The method utilizes time simulations, which are an important aid in equipment design and specification. The paper illustrates the process of setting up a computer model and performing time simulations. The study process is demonstrated through an example, the Shawnee feeder in the Niagara Mohawk Power Corporation service area.
3D numerical simulation of transient processes in hydraulic turbines
NASA Astrophysics Data System (ADS)
Cherny, S.; Chirkov, D.; Bannikov, D.; Lapin, V.; Skorospelov, V.; Eshkunova, I.; Avdushenko, A.
2010-08-01
An approach for numerical simulation of 3D hydraulic turbine flows in transient operating regimes is presented. The method is based on a coupled solution of incompressible RANS equations, runner rotation equation, and water hammer equations. The issue of setting appropriate boundary conditions is considered in detail. As an illustration, the simulation results for runaway process are presented. The evolution of vortex structure and its effect on computed runaway traces are analyzed.
Machine learning in sentiment reconstruction of the simulated stock market
NASA Astrophysics Data System (ADS)
Goykhman, Mikhail; Teimouri, Ali
2018-02-01
In this paper we continue the study of the simulated stock market framework defined by the driving sentiment processes. We focus on the market environment driven by the buy/sell trading sentiment process of the Markov chain type. We apply the methodology of the Hidden Markov Models and the Recurrent Neural Networks to reconstruct the transition probabilities matrix of the Markov sentiment process and recover the underlying sentiment states from the observed stock price behavior. We demonstrate that the Hidden Markov Model can successfully recover the transition probabilities matrix for the hidden sentiment process of the Markov Chain type. We also demonstrate that the Recurrent Neural Network can successfully recover the hidden sentiment states from the observed simulated stock price time series.
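The Markov-chain sentiment machinery described above can be illustrated with a minimal sketch: simulate a two-state buy/sell chain and recover its transition matrix by counting observed transitions. All numbers here (states, probabilities, sample size) are invented for illustration; the paper itself goes further, recovering hidden states with HMMs and recurrent networks:

```python
import random

def simulate_markov(P, n_steps, seed=0):
    """Simulate a two-state Markov chain with row-stochastic transition matrix P."""
    rng = random.Random(seed)
    state, states = 0, [0]
    for _ in range(n_steps - 1):
        state = 0 if rng.random() < P[state][0] else 1
        states.append(state)
    return states

def estimate_transitions(states):
    """Recover the transition matrix by counting observed state transitions."""
    counts = [[0.0, 0.0], [0.0, 0.0]]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1.0
    return [[c / sum(row) for c in row] for row in counts]

# A hypothetical sentiment chain: state 0 = "buy", state 1 = "sell"
P_true = [[0.9, 0.1], [0.2, 0.8]]
P_est = estimate_transitions(simulate_markov(P_true, 20000))
```

With 20,000 steps the counting estimator lands close to the true matrix; when the states are hidden behind observed prices, the counting step is replaced by Baum-Welch-style inference.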
Splitting algorithm for numerical simulation of Li-ion battery electrochemical processes
NASA Astrophysics Data System (ADS)
Iliev, Oleg; Nikiforova, Marina A.; Semenov, Yuri V.; Zakharov, Petr E.
2017-11-01
In this paper we present a splitting algorithm for the numerical simulation of Li-ion battery electrochemical processes. A Li-ion battery consists of three domains: anode, cathode, and electrolyte. The mathematical model of the electrochemical processes is described on a microscopic scale and contains nonlinear equations for concentration and potential in each domain. On the interface between the electrodes and the electrolyte, lithium-ion intercalation and deintercalation take place, described by the nonlinear Butler-Volmer equation. For the spatial approximation we use the finite element method with discontinuous Galerkin elements. To simplify the numerical simulations we develop a splitting algorithm, which splits the original problem into three independent subproblems. We investigate the numerical convergence of the algorithm on a 2D model problem.
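The interface kinetics named in the abstract above can be made concrete with a minimal sketch of the standard Butler-Volmer relation. The exchange current density, transfer coefficients, and temperature below are illustrative placeholders, not values from the paper:

```python
import math

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)

def butler_volmer(eta, j0=1.0, alpha_a=0.5, alpha_c=0.5, T=298.15):
    """Butler-Volmer current density for overpotential eta (V).
    j0 and the transfer coefficients are illustrative placeholders."""
    return j0 * (math.exp(alpha_a * F * eta / (R * T))
                 - math.exp(-alpha_c * F * eta / (R * T)))
```

At zero overpotential the anodic and cathodic branches cancel, and for equal transfer coefficients the curve is antisymmetric about the origin; it is this exponential nonlinearity at the electrode-electrolyte interface that motivates splitting the coupled problem into per-domain subproblems.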
Modeling laser velocimeter signals as triply stochastic Poisson processes
NASA Technical Reports Server (NTRS)
Mayo, W. T., Jr.
1976-01-01
Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
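A nonhomogeneous Poisson process like the one described above can be sampled by the standard thinning method: draw candidate events from a homogeneous process at the peak rate, then accept each with probability proportional to the instantaneous intensity. The burst-shaped intensity below is a made-up stand-in loosely evoking a Doppler burst envelope, not the paper's model:

```python
import math
import random

def thinning_poisson(rate_fn, rate_max, t_end, seed=0):
    """Sample a nonhomogeneous Poisson process on [0, t_end] by thinning:
    candidates arrive at the homogeneous rate rate_max, and each is kept
    with probability rate_fn(t) / rate_max (requires rate_fn <= rate_max)."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)
        if t > t_end:
            return events
        if rng.random() < rate_fn(t) / rate_max:
            events.append(t)

# Illustrative burst-like intensity peaking at t = 1.0
burst = lambda t: 50.0 * math.exp(-((t - 1.0) ** 2) / 0.02)
times = thinning_poisson(burst, 50.0, 2.0)
```

The compound/doubly stochastic structure in the paper would make `rate_fn` itself a realization of a slower random process, but the thinning step is unchanged.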
Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design
NASA Astrophysics Data System (ADS)
Koga, Tsuyoshi; Aoyama, Kazuhiro
This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.
Beltrán, F R; Lorenzo, V; Acosta, J; de la Orden, M U; Martínez Urreaga, J
2018-06-15
The aim of this work is to study the effects of different simulated mechanical recycling processes on the structure and properties of PLA. A commercial grade of PLA was melt compounded and compression molded, then subjected to two different recycling processes. The first recycling process consisted of accelerated ageing and a second melt processing step, while the other included accelerated ageing, a demanding washing process, and a second melt processing step. The intrinsic viscosity measurements indicate that both recycling processes degrade the PLA, more markedly in the sample subjected to the washing process. DSC results suggest an increase in the mobility of the polymer chains in the recycled materials; however, the degree of crystallinity of PLA seems unchanged. The optical, mechanical, and gas barrier properties of PLA do not seem to be greatly affected by the degradation suffered during the different recycling processes. These results suggest that, despite the degradation of PLA, the impact of the different simulated mechanical recycling processes on the final properties is limited; thus, the potential use of recycled PLA in packaging applications is not jeopardized.
NASA Astrophysics Data System (ADS)
Zwickl, Titus; Carleer, Bart; Kubli, Waldemar
2005-08-01
In the past decade, sheet metal forming simulation became a well-established tool for predicting the formability of parts. In the automotive industry, this has enabled significant reductions in the cost and time of vehicle design and development, and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as production quality fluctuations, continue to plague manufacturing cost and time. The focus has therefore shifted in recent times beyond mere feasibility to the robustness of the product and process being engineered. Ensuring robustness is the next big challenge for virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process, in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep-drawing parts may be tracked down, identified, and assigned to the influential parameters. With this knowledge, defects can be eliminated or springback compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.
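The core of the systematic stochastic simulation idea above is to scatter the influencing parameters around their nominal values and observe the spread of the response. The sketch below is a generic Monte Carlo stand-in; the response function and noise levels are invented examples, not a forming-simulation model:

```python
import random

def monte_carlo_spread(f, nominal, spread, n=2000, seed=0):
    """Scatter inputs around their nominal values with Gaussian noise and
    report the mean and standard deviation of the scalar response f(x).
    A toy stand-in for systematic stochastic process simulations."""
    rng = random.Random(seed)
    outs = []
    for _ in range(n):
        x = [v + rng.gauss(0.0, s) for v, s in zip(nominal, spread)]
        outs.append(f(x))
    mean = sum(outs) / n
    std = (sum((o - mean) ** 2 for o in outs) / (n - 1)) ** 0.5
    return mean, std

# e.g. a scalar "quality" response to two noisy process parameters
mean, std = monte_carlo_spread(lambda x: x[0] * x[1], [2.0, 3.0], [0.1, 0.1])
```

The reported standard deviation is the quantity a robustness study tries to drive down: a process is capable when the response spread stays inside tolerance despite uncontrollable input noise.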
LPJ-GUESS Simulated North America Vegetation for 21-0 ka Using the TraCE-21ka Climate Simulation
NASA Astrophysics Data System (ADS)
Shafer, S. L.; Bartlein, P. J.
2016-12-01
Transient climate simulations that span multiple millennia (e.g., TraCE-21ka) have become more common as computing power has increased, allowing climate models to complete long simulations in relatively short periods of time (i.e., months). These climate simulations provide information on the potential rate, variability, and spatial expression of past climate changes. They also can be used as input data for other environmental models to simulate transient changes for different components of paleoenvironmental systems, such as vegetation. Long, transient paleovegetation simulations can provide information on a range of ecological processes, describe the spatial and temporal patterns of changes in species distributions, and identify the potential locations of past species refugia. Paleovegetation simulations also can be used to fill in spatial and temporal gaps in observed paleovegetation data (e.g., pollen records from lake sediments) and to test hypotheses of past vegetation change. We used the TraCE-21ka transient climate simulation for 21-0 ka from CCSM3, a coupled atmosphere-ocean general circulation model. The TraCE-21ka simulated temperature, precipitation, and cloud data were regridded onto a 10-minute grid of North America. These regridded climate data, along with soil data and atmospheric carbon dioxide concentrations, were used as input to LPJ-GUESS, a general ecosystem model, to simulate North America vegetation from 21-0 ka. LPJ-GUESS simulates many of the processes controlling the distribution of vegetation (e.g., competition), although some important processes (e.g., dispersal) are not simulated. We evaluate the LPJ-GUESS-simulated vegetation (in the form of plant functional types and biomes) for key time periods and compare the simulated vegetation with observed paleovegetation data, such as data archived in the Neotoma Paleoecology Database. 
In general, vegetation simulated by LPJ-GUESS reproduces the major North America vegetation patterns (e.g., forest, grassland) with regional areas of disagreement between simulated and observed vegetation. We describe the regions and time periods with the greatest data-model agreement and disagreement, and discuss some of the strengths and weaknesses of both the simulated climate and simulated vegetation data.
A simplified computational memory model from information processing.
Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang
2016-11-23
This paper proposes a computational model of memory from the view of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, a meta-memory is defined to represent a neuron or brain cortical region on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is consistent with memory phenomena from an information processing view.
A low-cost fabrication method for sub-millimeter wave GaAs Schottky diode
NASA Astrophysics Data System (ADS)
Jenabi, Sarvenaz; Deslandes, Dominic; Boone, Francois; Charlebois, Serge A.
2017-10-01
In this paper, a submillimeter-wave Schottky diode is designed and simulated, and the effect of Schottky layer thickness on cut-off frequency is studied. A novel microfabrication process is proposed and implemented. The presented process avoids electron-beam (e-beam) lithography, which reduces the cost; it also provides more flexibility in the selection of design parameters and allows a significant reduction in the device parasitic capacitance. A key feature of the process is that the Schottky contact, the air-bridges, and the transmission lines are fabricated in a single lift-off step. The process relies on a planarization method that is suitable for trenches 1-10 μm deep and is tolerant to end-point variations. The fabricated diode is measured and the results are compared with simulations; very good agreement between simulation and measurement is observed.
Two-dimensional simulation of holographic data storage medium for multiplexed recording.
Toishi, Mitsuru; Takeda, Takahiro; Tanaka, Kenji; Tanaka, Tomiji; Fukumoto, Atsushi; Watanabe, Kenjiro
2008-02-18
In this paper, we propose a new analysis model for photopolymer recording processes that calculates the two-dimensional refractive index distribution of multiplexed holograms. To simulate the photopolymer medium, the time evolution of monomer diffusion and polymerization must be calculated simultaneously, since these processes induce the distribution of the refractive index inside the medium. By evaluating the refractive index pattern on each layer, the diffraction beams from the multiplexed hologram can be read out by the beam propagation method (BPM). This is the first paper to determine the diffraction beam from a multiplexed hologram in a simulated photopolymer medium process. We analyze the time response of the multiplexed hologram recording processes in the photopolymer, and estimate the degradation of diffraction efficiency with multiplexed recording. This work can greatly contribute to understanding the process of hologram recording.
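The coupled monomer-diffusion/polymerization evolution described above can be sketched in one dimension with a toy explicit scheme for dm/dt = D d2m/dx2 - k I m: monomer is consumed where the illumination I is bright and diffuses in from dark regions. All coefficients and the grid are illustrative assumptions, not the paper's calibrated photopolymer model:

```python
def step_photopolymer(m, I, D=0.1, k=1.0, dx=1.0, dt=0.1):
    """One explicit time step of 1D monomer diffusion plus photo-polymerization,
    dm/dt = D * d2m/dx2 - k * I * m, with zero-flux boundaries.
    Coefficients are illustrative placeholders."""
    n = len(m)
    new = []
    for i in range(n):
        left = m[i - 1] if i > 0 else m[i]        # zero-flux boundary
        right = m[i + 1] if i < n - 1 else m[i]
        lap = (left - 2.0 * m[i] + right) / dx ** 2
        new.append(m[i] + dt * (D * lap - k * I[i] * m[i]))  # consume where lit
    return new

m = [1.0] * 10                # uniform initial monomer concentration
I = [1.0] * 5 + [0.0] * 5     # a bright interference fringe over the left half
for _ in range(50):
    m = step_photopolymer(m, I)
```

The converted (polymerized) fraction tracks where monomer was consumed, and in the full model it is that conversion pattern which raises the local refractive index read out by the BPM.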
Simulation of SEU Cross-sections using MRED under Conditions of Limited Device Information
NASA Technical Reports Server (NTRS)
Lauenstein, J. M.; Reed, R. A.; Weller, R. A.; Mendenhall, M. H.; Warren, K. M.; Pellish, J. A.; Schrimpf, R. D.; Sierawski, B. D.; Massengill, L. W.; Dodd, P. E.;
2007-01-01
This viewgraph presentation reviews the simulation of single event upset (SEU) cross sections using the Monte Carlo Radiative Energy Deposition (MRED) tool, using "best guess" assumptions about the process and geometry together with direct-ionization, low-energy beam test results. This work will also simulate SEU cross sections including angular and high-energy responses and compare the simulated results with beam test data for validation of the model. Using MRED, we produced a reasonably accurate upset response model of a low-critical-charge SRAM without detailed information about the circuit, device geometry, or fabrication process.
Simulating Local Area Network Protocols with the General Purpose Simulation System (GPSS)
1990-03-01
[Extraction residue from the report's contents and figure list: frame generation; frame delivery; model artifices; model variables; simulation results; external procedures used in simulation; Token Ring frame generation and delivery processes; mean transfer delay vs. mean throughput. A body fragment notes that timing parameters assumed to be zero were replaced by the maximum values specified in the ANSI 802.3 standard (viz. &MI=6, &M2=3, &M3=17, &D1=18, &D2=3, &D4=4, &D7=3).]
NASA Astrophysics Data System (ADS)
Xu, Ziwei; Yan, Tianying; Liu, Guiwu; Qiao, Guanjun; Ding, Feng
2015-12-01
To explore the mechanism of graphene chemical vapor deposition (CVD) growth on a catalyst surface, a molecular dynamics (MD) simulation of carbon atom self-assembly on a Ni(111) surface based on a well-designed empirical reactive bond order potential was performed. We simulated single layer graphene with recorded size (up to 300 atoms per super-cell) and reasonably good quality by MD trajectories up to 15 ns. Detailed processes of graphene CVD growth, such as carbon atom dissolution and precipitation, formation of carbon chains of various lengths, polygons and small graphene domains were observed during the initial process of the MD simulation. The atomistic processes of typical defect healing, such as the transformation from a pentagon into a hexagon and from a pentagon-heptagon pair (5|7) to two adjacent hexagons (6|6), were revealed as well. The study also showed that higher temperature and longer annealing time are essential to form high quality graphene layers, which is in agreement with experimental reports and previous theoretical results. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr06016h
Exploring Empirical Rank-Frequency Distributions Longitudinally through a Simple Stochastic Process
Finley, Benjamin J.; Kilkki, Kalevi
2014-01-01
The frequent appearance of empirical rank-frequency laws, such as Zipf's law, in a wide range of domains reinforces the importance of understanding and modeling these laws and rank-frequency distributions in general. In this spirit, we utilize a simple stochastic cascade process to simulate several empirical rank-frequency distributions longitudinally. We focus especially on limiting the process's complexity to increase accessibility for non-experts in mathematics. The process provides a good fit for many empirical distributions because the stochastic multiplicative nature of the process leads to an often observed concave rank-frequency distribution (on a log-log scale) and the finiteness of the cascade replicates real-world finite size effects. Furthermore, we show that repeated trials of the process can roughly simulate the longitudinal variation of empirical ranks. However, we find that the empirical variation is often less than the average simulated process variation, likely due to longitudinal dependencies in the empirical datasets. Finally, we discuss the process limitations and practical applications. PMID:24755621
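One plausible reading of a "simple stochastic cascade process" is a finite binary multiplicative cascade: unit mass is repeatedly split with random proportions, and the leaf masses, sorted, play the role of a rank-frequency distribution. The sketch below is an illustration of that reading, not the paper's exact process:

```python
import random

def cascade(depth, seed=0):
    """Finite binary multiplicative cascade: split each mass into two parts
    with a uniformly random proportion; after `depth` levels the sorted
    leaf masses serve as simulated item frequencies."""
    rng = random.Random(seed)
    masses = [1.0]
    for _ in range(depth):
        nxt = []
        for mass in masses:
            p = rng.random()
            nxt.append(mass * p)
            nxt.append(mass * (1.0 - p))
        masses = nxt
    return sorted(masses, reverse=True)

ranked = cascade(10)   # 2**10 = 1024 ranked "frequencies"
```

The multiplicative splits make leaf masses roughly log-normal, which produces the concave log-log rank-frequency shape the abstract mentions, while the fixed depth supplies the finite-size cutoff.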
Redox Control For Hanford HLW Feeds VSL-12R2530-1, REV 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruger, A. A.; Matlack, Keith S.; Pegg, Ian L.
2012-12-13
The principal objectives of this work were to investigate the effects of processing simulated Hanford HLW at the estimated maximum concentrations of nitrates and oxalates and to identify strategies to mitigate any processing issues resulting from high concentrations of nitrates and oxalates. This report provides results for a series of tests that were performed on the DM10 melter system with simulated C-106/AY-102 HLW. The tests employed simulated HLW feeds containing variable amounts of nitrates and waste organic compounds corresponding to maximum concentrations projected for Hanford HLW streams in order to determine their effects on glass production rate, processing characteristics, glass redox conditions, melt pool foaming, and the tendency to form secondary phases. Such melter tests provide information on key process factors such as feed processing behavior, dynamic effects during processing, processing rates, off-gas amounts and compositions, foaming control, etc., that cannot be reliably obtained from crucible melts.