INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT
A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...
SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE
The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designated Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...
Atmospheric Model Evaluation Tool for meteorological and air quality simulations
The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.
NASA Astrophysics Data System (ADS)
Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David
2018-05-01
As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result, the tool deformations can affect the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of the tool deformations on the final part quality is generally neglected and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped on the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently so that tool deformations can be included in real time in the drawing simulation without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented which demonstrates the effect of tool deformations on the final part quality.
Water Network Tool for Resilience v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-09
WNTR is a python package designed to simulate and analyze resilience of water distribution networks. The software includes: - Pressure driven and demand driven hydraulic simulation - Water quality simulation to track concentration, trace, and water age - Conditional controls to simulate power outages - Models to simulate pipe breaks - A wide range of resilience metrics - Analysis and visualization tools
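As a rough orientation, a minimal usage sketch of the package's Python API is given below; the network file name is a placeholder, and the calls follow the commonly documented WNTR interface rather than any guaranteed behavior of the specific release above.

import wntr

# Load an EPANET-format network model (file name is hypothetical)
wn = wntr.network.WaterNetworkModel('example_network.inp')

# Run a hydraulic simulation with the WNTR (pressure-driven capable) solver
sim = wntr.sim.WNTRSimulator(wn)
results = sim.run_sim()

# Node pressures over time, a typical input to resilience metrics
pressure = results.node['pressure']
print(pressure.head())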
Simulation techniques in hyperthermia treatment planning
Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC
2013-01-01
Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances radiotherapy and chemotherapy effectiveness (1). Driven by the developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and has become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia, and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453
Accurate estimation of short read mapping quality for next-generation genome sequencing
Ruffalo, Matthew; Koyutürk, Mehmet; Ray, Soumya; LaFramboise, Thomas
2012-01-01
Motivation: Several software tools specialize in the alignment of short next-generation sequencing reads to a reference sequence. Some of these tools report a mapping quality score for each alignment—in principle, this quality score tells researchers the likelihood that the alignment is correct. However, the reported mapping quality often correlates weakly with actual accuracy and the qualities of many mappings are underestimated, encouraging researchers to discard correct mappings. Further, these low-quality mappings tend to correlate with variations in the genome (both single nucleotide and structural), and such mappings are important in accurately identifying genomic variants. Approach: We develop a machine learning tool, LoQuM (LOgistic regression tool for calibrating the Quality of short read mappings), to assign reliable mapping quality scores to mappings of Illumina reads returned by any alignment tool. LoQuM uses statistics on the read (base quality scores reported by the sequencer) and the alignment (number of matches, mismatches and deletions, mapping quality score returned by the alignment tool, if available, and number of mappings) as features for classification and uses simulated reads to learn a logistic regression model that relates these features to actual mapping quality. Results: We test the predictions of LoQuM on an independent dataset generated by the ART short read simulation software and observe that LoQuM can ‘resurrect’ many mappings that are assigned zero quality scores by the alignment tools and are therefore likely to be discarded by researchers. We also observe that the recalibration of mapping quality scores greatly enhances the precision of called single nucleotide polymorphisms. Availability: LoQuM is available as open source at http://compbio.case.edu/loqum/. Contact: matthew.ruffalo@case.edu. PMID:22962451
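The calibration approach described here can be illustrated with a small, hedged sketch: a logistic regression over per-mapping features, trained on simulated reads whose true origins are known, then converted back to a Phred-scaled quality. The feature values and the scikit-learn usage are illustrative, not LoQuM's actual code.

import numpy as np
from sklearn.linear_model import LogisticRegression

# One row per mapping; columns are illustrative features:
# mean base quality, matches, mismatches, aligner MAPQ, number of mappings
X_train = np.array([[35, 98, 2, 60, 1],
                    [20, 80, 20, 0, 5],
                    [30, 95, 5, 37, 2],
                    [12, 70, 30, 0, 8]])
# Labels from simulated reads: 1 = mapping agrees with the simulated origin
y_train = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

# Recalibrated mapping quality on the Phred scale: -10*log10(P(incorrect))
p_correct = model.predict_proba(X_train)[:, 1]
phred = -10 * np.log10(np.clip(1 - p_correct, 1e-10, None))
print(phred.round(1))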
Impact of tool wear on cross wedge rolling process stability and on product quality
NASA Astrophysics Data System (ADS)
Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric
2017-10-01
Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetrical shape with an accurate distribution of material. This preform is forged into shape in a forging die. In order to improve CWR tool lifecycle and product quality, it is essential to understand tool wear evolution and the physical phenomena in the CWR process that change due to the geometry the tool takes on as it wears. Numerical simulations are necessary to understand CWR tool wear behavior; nevertheless, if the simulations are performed with the CAD geometry of the tool, results are limited. To overcome this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower roll) at two different states: (1) before the start of the lifecycle and (2) at the end of the lifecycle. The tools were measured in 3D with the ATOS Triple Scan system by GOM® using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of the tool. Each 3D point cloud was digitized and converted into STL format. The geometry of the tools in STL format was the input for the 3D simulations. Both simulations were compared. Defects of products obtained in simulation were compared to the main defects of products found industrially. The two main defects are: (a) surface defects on the preform that are not fixed in the die forging operation; and (b) a bent (no longer straight) preform, with two possible impacts: on the one hand, the robot cannot grab it to take it to the forging stage; on the other hand, a section remains unfilled in the forging operation.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2008-08-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2010-06-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
Tools for surveying and improving the quality of life: people with special needs in focus.
Hoyningen-Süess, Ursula; Oberholzer, David; Stalder, René; Brügger, Urs
2012-01-01
This article seeks to describe online tools for surveying and improving quality of life for people with disabilities living in assisted living centers and special education service organizations. Ensuring a decent quality of life for disabled people is an important welfare state goal. Using well-accepted quality of life conceptions, online diagnostic and planning tools were developed during an Institute for Education, University of Zurich, research project. The diagnostic tools measure, evaluate and analyze disabled people's quality of life. The planning tools identify factors that can affect their quality of life and suggest improvements. Instrument validity and reliability are not tested according to the standard statistical procedures. This will be done at a more advanced stage of the project. Instead, the tool is developed, refined and adjusted in cooperation with practitioners who are constantly judging it according to best practice standards. The tools support staff in assisted living centers and special education service organizations. These tools offer comprehensive resources for surveying, quantifying, evaluating, describing and simulating quality of life elements.
The USEPA has developed the Watershed Deposition Tool (WDT) to calculate nitrogen, sulfur, and mercury deposition rates to watersheds and their sub-basins from Community Multiscale Air Quality (CMAQ) model output. The CMAQ model simulates from first principles the transport, ...
USDA-ARS?s Scientific Manuscript database
Simulation models are increasingly used to assess water quality constituent losses from agricultural systems. Misuse often gives irrelevant or erroneous answers. The Agricultural Policy Environmental Extender (APEX) model is emerging as one of the premier modeling tools for fields, farms, and agr...
Johnston, Maximilian J; Arora, Sonal; Pucher, Philip H; Reissis, Yannis; Hull, Louise; Huddy, Jeremy R; King, Dominic; Darzi, Ara
2016-03-01
To develop and provide validity and feasibility evidence for the QUality of Information Transfer (QUIT) tool. Prompt escalation of care in the setting of patient deterioration can prevent further harm. Escalation and information transfer skills are not currently measured in surgery. This study comprised 3 phases: the development (phase 1), validation (phase 2), and feasibility analysis (phase 3) of the QUIT tool. Phase 1 involved identification of core skills needed for successful escalation of care through literature review and 33 semistructured interviews with stakeholders. Phase 2 involved the generation of validity evidence for the tool using a simulated setting. Thirty surgeons assessed a deteriorating postoperative patient in a simulated ward and escalated their care to a senior colleague. The face and content validity were assessed using a survey. Construct and concurrent validity of the tool were determined by comparing performance scores using the QUIT tool with those measured using the Situation-Background-Assessment-Recommendation (SBAR) tool. Phase 3 was conducted using direct observation of escalation scenarios on surgical wards in 2 hospitals. A 7-category assessment tool consisting of 24 items was developed from phase 1. Twenty-one of 24 items had excellent content validity (content validity index >0.8). All 7 categories and 18 of 24 items (P < 0.05) demonstrated construct validity. The correlation between the QUIT and SBAR tools was strong, indicating concurrent validity (r = 0.694, P < 0.001). Real-time scoring of escalation referrals was feasible and indicated that doctors currently have better information transfer skills than nurses when faced with a deteriorating patient. A validated tool to assess information transfer for deteriorating surgical patients was developed and tested using simulation and real-time clinical scenarios. It may improve the quality and safety of patient care on the surgical ward.
A Monte Carlo analysis of breast screening randomized trials.
Zamora, Luis I; Forastero, Cristina; Guirado, Damián; Lallena, Antonio M
2016-12-01
To analyze breast screening randomized trials with a Monte Carlo simulation tool. A simulation tool previously developed to simulate breast screening programmes was adapted for this purpose. The history of women participating in the trials was simulated, including a model for survival after local treatment of invasive cancers. Distributions of the time gained by screening detection over symptomatic detection and the overall screening sensitivity were used as inputs. Several randomized controlled trials were simulated. Except for the age range of the women involved, all simulations used the same population characteristics, which permitted analysis of their external validity. The relative risks obtained were compared to those quoted for the trials, whose internal validity was addressed by further investigating the reasons for the disagreements observed. The Monte Carlo simulations produce results that are in good agreement with most of the randomized trials analyzed, thus indicating their methodological quality and external validity. A reduction of breast cancer mortality of around 20% appears to be a reasonable value according to the results of the trials that are methodologically correct. Discrepancies observed with the Canada I and II trials may be attributed to low mammography quality and some methodological problems. The Kopparberg trial appears to show a low methodological quality. Monte Carlo simulations are a powerful tool to investigate breast screening randomized controlled trials, helping to establish those whose results are reliable enough to be extrapolated to other populations, to design trial strategies and, eventually, to adapt them during their development. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
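A heavily simplified sketch of the simulation logic described above (two cohort arms, with screen detection shifting mortality) is given below; every rate in it is a hypothetical placeholder, not a value from the paper.

import random

def simulate_arm(n, screened, incidence=0.05, sensitivity=0.85,
                 mortality_symptomatic=0.30, mortality_benefit=0.20):
    # Toy cohort: count breast cancer deaths (all parameters illustrative)
    deaths = 0
    for _ in range(n):
        if random.random() < incidence:            # develops breast cancer
            early = screened and random.random() < sensitivity
            m = mortality_symptomatic * (1 - mortality_benefit) if early \
                else mortality_symptomatic
            if random.random() < m:
                deaths += 1
    return deaths

random.seed(1)
n = 200_000
rr = simulate_arm(n, True) / simulate_arm(n, False)  # relative risk, study vs control
print(f"simulated relative risk: {rr:.2f}")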
3D FEM Simulation of Flank Wear in Turning
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio
2011-05-01
This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, and so on. In some cases these wear mechanisms are described by analytical models as functions of process variables (temperature, pressure and sliding velocity along the cutting surface). Such analytical models are suitable for implementation in FEM codes and can be utilized to simulate tool wear. In the present paper a commercial 3D FEM software has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was extended with a suitable subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation proceeds. Since for the considered tool-workpiece material pair the main wear-generating phenomena are abrasion and diffusion, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate tool wear development.
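The abstract does not reproduce the combined wear-rate model; a plausible form, following the cited Usui (adhesive/abrasive) and Takeyama-Murata (diffusive) rate equations and stated here as an assumption rather than the paper's exact expression, is

\[ \frac{dW}{dt} = A\,\sigma_n v_s \exp\!\left(-\frac{B}{T}\right) + D \exp\!\left(-\frac{E}{RT}\right) \]

where \(\sigma_n\) is the normal contact pressure, \(v_s\) the sliding velocity, \(T\) the absolute contact temperature, \(R\) the gas constant, and \(A, B, D, E\) constants calibrated from wear experiments.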
NASA Astrophysics Data System (ADS)
Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.
2013-09-01
Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying Lateral Boundary Conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2000-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complemented by a tool for configuring the global results as inputs to regional scale models (e.g., Community Multiscale Air Quality or Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite retrieved ozone vertical profiles. The results show performance is largely within uncertainty estimates for the Tropospheric Emission Spectrometer (TES), with some exceptions. The major difference is a high bias in the upper troposphere along the southern boundary in January. This publication documents the global simulation database, the tool for conversion to LBC, and the fidelity of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.
A YEAR-LONG MM5 EVALUATION USING A MODEL EVALUATION TOOLKIT
Air quality modeling has expanded in both sophistication and application over the past decade. Meteorological and air quality modeling tools are being used for research, forecasting, and regulatory related emission control strategies. Results from air quality simulations have far...
An agent based simulation tool for scheduling emergency department physicians.
Jones, Spencer S; Evans, R Scott
2008-11-06
Emergency department overcrowding is a problem that threatens the public health of communities and compromises the quality of care given to individual patients. The Institute of Medicine recommends that hospitals employ information technology and operations research methods to reduce overcrowding. This paper describes the development of an agent-based simulation tool designed to evaluate the impact of various physician staffing configurations on patient waiting times in the emergency department. We evaluate the feasibility of this tool at a single hospital emergency department.
Technical Highlight: NREL Improves Building Energy Simulation Programs Through Diagnostic Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polly, B.
2012-01-09
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market.
AIR QUALITY SIMULATION MODEL PERFORMANCE FOR ONE-HOUR AVERAGES
If a one-hour standard for sulfur dioxide were promulgated, air quality dispersion modeling in the vicinity of major point sources would be an important air quality management tool. Would currently available dispersion models be suitable for use in demonstrating attainment of suc...
New tools for sculpting cranial implants in a shared haptic augmented reality environment.
Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary
2006-01-01
New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.
A New Simulation Framework for Autonomy in Robotic Missions
NASA Technical Reports Server (NTRS)
Flueckiger, Lorenzo; Neukom, Christian
2003-01-01
Autonomy is a key factor in remote robotic exploration and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test the autonomy algorithms. While industrial robotics benefits from a variety of high quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.
USDA-ARS?s Scientific Manuscript database
Evaluating the effectiveness of conservation practices (CPs) is an important step to achieving efficient and successful water quality management. Watershed-scale simulation models can provide useful and convenient tools for this evaluation, but simulated conservation practice effectiveness should be...
Modeling Applications and Tools
The U.S. EPA's Air Quality Modeling Group (AQMG) conducts modeling analyses to support policy and regulatory decisions in OAR and provides leadership and direction on the full range of air quality models and other mathematical simulation techniques used in
Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs
NASA Astrophysics Data System (ADS)
Ringenburg, Michael F.
Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
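As a rough illustration of the low-cost online monitoring idea (a generic Python sketch under assumed names, not the tooling or OCaml extension described in the thesis), an approximate function can be spot-checked against an exact reference on a small sample of calls, so the monitoring overhead stays far below the energy saved by approximation:

import random

random.seed(0)

def monitored(approx_fn, exact_fn, sample_rate=0.05):
    # Wrap an approximate function; occasionally compare with the exact one
    errors = []
    def wrapper(*args):
        y = approx_fn(*args)
        if random.random() < sample_rate:      # cheap: check only ~5% of calls
            errors.append(abs(y - exact_fn(*args)))
        return y
    wrapper.errors = errors
    return wrapper

approx_square = monitored(
    lambda x: x * x * (1 + random.uniform(-0.01, 0.01)),  # simulated approximation
    lambda x: x * x)

for i in range(1, 1001):
    approx_square(i)
print(f"mean sampled error: {sum(approx_square.errors) / len(approx_square.errors):.3f}")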
Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J
2016-08-01
Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On its own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent, and vice versa. This paper describes the initial phase of coupling CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation, which allows data to be shared between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification, which provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify that the co-simulation functions as expected, and an investigation of a two-zone natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
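The data exchange at the heart of such a coupling can be sketched schematically; the loop below is a plain-Python stand-in with hypothetical stub solvers, not the FMI API or either program's actual interface.

# Each iteration: the airflow side (CONTAM-like) computes flows from the
# current zone temperatures, then the thermal side (EnergyPlus-like)
# advances its heat balance using those flows.

class AirflowStub:
    def step(self, temps, dt):
        # toy buoyancy-driven flow proportional to the temperature difference
        return {"z1->z2": 0.001 * (temps["z1"] - temps["z2"])}

class ThermalStub:
    def __init__(self):
        self.temps = {"z1": 22.0, "z2": 18.0}
    def step(self, flows, dt):
        # toy heat balance: the interzone flow mixes the two zone temperatures
        q = flows["z1->z2"] * (self.temps["z1"] - self.temps["z2"]) * dt
        self.temps["z1"] -= q
        self.temps["z2"] += q
        return dict(self.temps)

airflow, thermal = AirflowStub(), ThermalStub()
temps, t, dt = dict(thermal.temps), 0.0, 60.0
while t < 3600.0:                      # co-simulate one hour
    flows = airflow.step(temps, dt)    # airflow model consumes temperatures
    temps = thermal.step(flows, dt)    # thermal model consumes airflows
    t += dt
print(temps)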
NASA Astrophysics Data System (ADS)
Hol, J.; Wiebenga, J. H.; Carleer, B.
2017-09-01
In the stamping of automotive parts, friction and lubrication play a key role in achieving high quality products. In the development process of new automotive parts, it is therefore crucial to accurately account for these effects in sheet metal forming simulations. This paper presents a selection of results considering friction and lubrication modelling in sheet metal forming simulations of a front fender product. For varying lubrication conditions, the front fender can show either wrinkling or fractures. The front fender is modelled using different lubrication amounts, tool roughnesses and sheet coatings to show the strong influence of friction on both part quality and the overall production stability. For this purpose, the TriboForm software is used in combination with the AutoForm software. The results demonstrate that the TriboForm software enables the simulation of friction behaviour for varying lubrication conditions, resulting in a generally applicable approach for friction characterization under industrial sheet metal forming process conditions.
Wasson, Katherine; Parsi, Kayhan; McCarthy, Michael; Siddall, Viva Jo; Kuczewski, Mark
2016-06-01
The American Society for Bioethics and Humanities has created a quality attestation (QA) process for clinical ethics consultants; the pilot phase of reviewing portfolios has begun. One aspect of the QA process which is particularly challenging is assessing the interpersonal skills of individual clinical ethics consultants. We propose that using case simulation to evaluate clinical ethics consultants is an approach that can meet this need provided clear standards for assessment are identified. To this end, we developed the Assessing Clinical Ethics Skills (ACES) tool, which identifies and specifies specific behaviors that a clinical ethics consultant should demonstrate in an ethics case simulation. The aim is for the clinical ethics consultant or student to use a videotaped case simulation, along with the ACES tool scored by a trained rater, to demonstrate their competence as part of their QA portfolio. The development and piloting of the tool is described.
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
Dynamic simulation of the effect of soft toric contact lenses movement on retinal image quality.
Niu, Yafei; Sarver, Edwin J; Stevenson, Scott B; Marsack, Jason D; Parker, Katrina E; Applegate, Raymond A
2008-04-01
To report the development of a tool designed to dynamically simulate the effect of soft toric contact lens movement on retinal image quality, initial findings on three eyes, and the next steps to be taken to improve the utility of the tool. Three eyes of two subjects wearing soft toric contact lenses were cyclopleged with 1% cyclopentolate and 2.5% phenylephrine. Four hundred wavefront aberration measurements over a 5-mm pupil were recorded during soft contact lens wear at 30 Hz using a complete ophthalmic analysis system aberrometer. Each wavefront error measurement was input into Visual Optics Laboratory (version 7.15, Sarver and Associates, Inc.) to generate a retinal simulation of a high contrast logMAR visual acuity chart. The individual simulations were combined into a single dynamic movie using a custom MATLAB PsychToolbox program. Visual acuity was measured for each eye reading the movie with best cycloplegic spectacle correction through a 3-mm artificial pupil to minimize the influence of the eyes' uncorrected aberrations. The simulated acuity was compared to values recorded while the subject read unaberrated charts with contact lenses through a 5-mm artificial pupil. For one study eye, average acuity was the same as in the natural contact lens viewing condition. For the other two study eyes, visual acuity of the best simulation was more than one line worse than under natural viewing conditions. Dynamic simulation of retinal image quality, although not yet perfect, is a promising technique for visually illustrating the optical effects on image quality of the movements of alignment-sensitive corrections.
MASTOS: Mammography Simulation Tool for design Optimization Studies.
Spyrou, G; Panayiotakis, G; Tzanakos, G
2000-01-01
Mammography is a high quality imaging technique for the detection of breast lesions, which requires dedicated equipment and optimum operation. The design parameters of a mammography unit have to be decided and evaluated before the construction of such a high-cost apparatus. The optimum operational parameters must also be defined well before the real breast examination. MASTOS is a software package, based on Monte Carlo methods, that is designed to be used as a simulation tool in mammography. The input consists of the parameters that have to be specified when using a mammography unit, along with the parameters specifying the shape and composition of the breast phantom. In addition, the input may specify parameters needed in the design of a new mammographic apparatus. The main output of the simulation is a mammographic image together with calculations of various factors that describe the image quality. The Monte Carlo simulation code is PC-based and is driven by an outer shell of a graphical user interface. The entire software package is a simulation tool for mammography and can be applied in basic research and/or in training in the fields of medical physics and biomedical engineering, as well as in the performance evaluation of new designs of mammography units and in the determination of optimum standards for the operational parameters of a mammography unit.
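The core Monte Carlo ingredient of such a simulator, sampling photon free paths from the exponential attenuation law, can be sketched in a few lines; the attenuation coefficient and breast thickness below are illustrative placeholders, not MASTOS parameters.

import math, random

random.seed(2)
mu = 0.8          # linear attenuation coefficient, 1/cm (illustrative)
thickness = 4.5   # compressed breast thickness, cm (illustrative)

n, transmitted = 100_000, 0
for _ in range(n):
    # Sample a free path length from the exponential attenuation law
    path = -math.log(random.random()) / mu
    if path > thickness:
        transmitted += 1   # photon reaches the detector without interacting

print(f"primary transmission: {transmitted / n:.3f} "
      f"(analytic exp(-mu*t) = {math.exp(-mu * thickness):.3f})")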
Skjerdal, Taran; Gefferth, Andras; Spajic, Miroslav; Estanga, Edurne Gaston; de Cecare, Alessandra; Vitali, Silvia; Pasquali, Frederique; Bovo, Federica; Manfreda, Gerardo; Mancusi, Rocco; Trevisiani, Marcello; Tessema, Girum Tadesse; Fagereng, Tone; Moen, Lena Haugland; Lyshaug, Lars; Koidis, Anastasios; Delgado-Pando, Gonzalo; Stratakos, Alexandros Ch; Boeri, Marco; From, Cecilie; Syed, Hyat; Muccioli, Mirko; Mulazzani, Roberto; Halbert, Catherine
2017-01-01
A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes , quality scores, and vitamin C, were placed in an interactive database. Customization of the models and settings was possible on the user-interface. The simulation module outputs were provided as detailed curves or categorized as "good"; "sufficient"; or "corrective action needed" based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.
Desktop microsimulation: a tool to improve efficiency in the medical office practice.
Montgomery, James B; Linville, Beth A; Slonim, Anthony D
2013-01-01
Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
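Discrete-event logic of the kind used here can be sketched with the open-source simpy library (an assumption for illustration; the study used a commercial discrete-event package, and all rates below are invented):

import random
import simpy

random.seed(3)
wait_times = []

def patient(env, exam_rooms):
    arrival = env.now
    with exam_rooms.request() as req:      # queue for an exam room
        yield req
        wait_times.append(env.now - arrival)
        yield env.timeout(random.expovariate(1 / 20))   # ~20 min visit

def arrivals(env, exam_rooms):
    while True:
        yield env.timeout(random.expovariate(1 / 10))   # ~1 arrival per 10 min
        env.process(patient(env, exam_rooms))

env = simpy.Environment()
rooms = simpy.Resource(env, capacity=3)    # scenario knob: number of exam rooms
env.process(arrivals(env, rooms))
env.run(until=8 * 60)                      # one 8-hour clinic day, in minutes

print(f"patients seen: {len(wait_times)}, mean wait: "
      f"{sum(wait_times) / len(wait_times):.1f} min")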
Assessment of tools for protection of quality of water: Uncontrollable discharges of pollutants.
Dehghani Darmian, Mohsen; Hashemi Monfared, Seyed Arman; Azizyan, Gholamreza; Snyder, Shane A; Giesy, John P
2018-06-06
Selecting an appropriate crisis management plan during uncontrollable loading of pollution into water systems is crucial. In this research, the quality of water resources subject to uncontrollable pollution is protected by the use of suitable tools. The case study chosen in this investigation was a river-reservoir system. Analytical and numerical solutions of the pollutant transport equation were considered as the simulation strategy for calculating the efficient tools that protect water quality. These practical instruments are dilution flow and a new tool, called detention time, which is proposed and simulated for the first time in this study. For an uncontrollable pollution discharge that was approximately 130% of the river's assimilation capacity, as long as the duration of contact (Tc) was considered as a constraint, the unallowable pollution could be treated by releasing 30% of the base flow of the river from the upstream dilution reservoir. Moreover, when the affected distance (Xc) was selected as a constraint, the required detention time for which the rubber dam should detain the water to be treated was equal to 187% of the initial duration of contact. Copyright © 2018 Elsevier Inc. All rights reserved.
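The pollutant transport equation referred to above is typically the one-dimensional advection-dispersion equation with first-order decay; the form below is the standard textbook version, assumed here rather than quoted from the paper:

\[ \frac{\partial C}{\partial t} + u\,\frac{\partial C}{\partial x} = D_x\,\frac{\partial^2 C}{\partial x^2} - kC \]

where \(C\) is the pollutant concentration, \(u\) the river velocity, \(D_x\) the longitudinal dispersion coefficient, and \(k\) the first-order decay rate. Dilution flow acts by increasing \(u\) and reducing the inflow concentration, while detention time enters through the boundary condition imposed at the rubber dam.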
Patient simulation: a literary synthesis of assessment tools in anesthesiology.
Edler, Alice A; Fanning, Ruth G; Chen, Michael I; Claure, Rebecca; Almazan, Dondee; Struyk, Brian; Seiden, Samuel C
2009-12-20
High-fidelity patient simulation (HFPS) has been hypothesized as a modality for assessing competency of knowledge and skill in patient simulation, but uniform methods for HFPS performance assessment (PA) have not yet been completely achieved. Anesthesiology as a field founded the HFPS discipline and also leads in its PA. This project reviews the types, quality, and designated purpose of HFPS PA tools in anesthesiology. Using the systematic review method, we systematically reviewed anesthesiology literature referenced in PubMed to assess the quality and reliability of available PA tools in HFPS. Of 412 articles identified, 50 met our inclusion criteria. Seventy-seven percent of the studies have been published since 2000; more recent studies demonstrated higher quality. Investigators reported a variety of test construction and validation methods. The most commonly reported test construction methods included "modified Delphi techniques" for item selection, reliability measurement using inter-rater agreement, and intra-class correlations between test items or subtests. Modern test theory, in particular generalizability theory, was used in nine (18%) of the studies. Test score validity has been addressed in multiple investigations and has shown a significant improvement in reporting accuracy. However, the assessment of predictive validity has been low across the majority of studies. Usability and practicality of testing occasions and tools were only anecdotally reported. To more completely comply with the gold standards for PA design, both shared experience of experts and recognition of test construction standards, including reliability and validity measurements, instrument piloting, rater training, and explicit identification of the purpose and proposed use of the assessment tool, are required.
Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes
NASA Astrophysics Data System (ADS)
Cropper, A. E.; Wang, Z.
1995-08-01
Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have applications in planning, scheduling, and quality assurance, the latter being a result of the traceability possibilities in a process involving mixing and splitting of material.
Fiorina, E; Ferrero, V; Pennazio, F; Baroni, G; Battistoni, G; Belcari, N; Cerello, P; Camarlinghi, N; Ciocca, M; Del Guerra, A; Donetti, M; Ferrari, A; Giordanengo, S; Giraudo, G; Mairani, A; Morrocchi, M; Peroni, C; Rivetti, A; Da Rocha Rolo, M D; Rossi, S; Rosso, V; Sala, P; Sportelli, G; Tampellini, S; Valvo, F; Wheadon, R; Bisogni, M G
2018-05-07
Hadrontherapy is a method for treating cancer with very targeted dose distributions and enhanced radiobiological effects. To fully exploit these advantages, in vivo range monitoring systems are required. These devices measure, preferably during the treatment, the secondary radiation generated by beam-tissue interactions. However, since the correlation of the secondary radiation distribution with the dose is not straightforward, Monte Carlo (MC) simulations are very important for treatment quality assessment. The INSIDE project constructed an in-beam PET scanner to detect the signals generated by positron-emitting isotopes resulting from projectile-target fragmentation. In addition, a FLUKA-based simulation tool was developed to predict the corresponding reference PET images using a detailed scanner model. The INSIDE in-beam PET was used to monitor two consecutive proton treatment sessions on a patient at the Italian Center for Oncological Hadrontherapy (CNAO). The reconstructed PET images were updated every 10 s, providing near real-time quality assessment. By halfway through the treatment, the statistics of the measured PET images were already significant enough to be compared with the simulations, with average differences in the activity range of less than 2.5 mm along the beam direction. Without taking into account any preferential direction, differences within 1 mm were found. In this paper, the INSIDE MC simulation tool is described and the results of the first in vivo agreement evaluation are reported. These results have justified a clinical trial, in which the MC simulation tool will be used on a daily basis to study the compliance tolerances between measured and simulated PET images. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.
Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana
2016-01-01
The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
Integration of visual and motion cues for simulator requirements and ride quality investigation
NASA Technical Reports Server (NTRS)
Young, L. R.
1976-01-01
Practical tools which can extend the state of the art of moving base flight simulation for research and training are developed. Main approaches to this research effort include: (1) application of the vestibular model for perception of orientation based on motion cues; (2) optimum simulator motion controls; and (3) visual cues in landing.
NASA Astrophysics Data System (ADS)
Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.
2014-02-01
Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying lateral boundary conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2001-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complemented by a tool for configuring the global results as inputs to regional scale models (e.g., Community Multiscale Air Quality or Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite retrieved ozone and carbon monoxide vertical profiles. The results show performance is largely within uncertainty estimates for ozone from the Ozone Monitoring Instrument and carbon monoxide from the Measurements Of Pollution In The Troposphere (MOPITT), but there were some notable biases compared with Tropospheric Emission Spectrometer (TES) ozone. Compared with TES, our ozone predictions are high-biased in the upper troposphere, particularly in the south during January. This publication documents the global simulation database, the tool for conversion to LBC, and the evaluation of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.
Pratas, Diogo; Pinho, Armando J; Rodrigues, João M O S
2014-01-16
The emerging next-generation sequencing (NGS) technologies are bringing, besides naturally huge amounts of data, an avalanche of new specialized tools (for analysis, compression, alignment, among others) and large public and private network infrastructures. Therefore, a direct need is arising for specific simulation tools for testing and benchmarking, such as a flexible and portable FASTQ read simulator that does not need a reference sequence, yet is correctly prepared to produce approximately the same characteristics as real data. We present XS, a skilled FASTQ read simulation tool that is flexible, portable (does not need a reference sequence) and tunable in terms of sequence complexity. It has several running modes, depending on the time and memory available, and is aimed at testing computing infrastructures, namely cloud computing for large-scale projects, and at testing FASTQ compression algorithms. Moreover, XS offers the possibility of simulating the three main FASTQ components individually (headers, DNA sequences and quality scores). XS provides an efficient and convenient method for fast simulation of FASTQ files, such as those from Ion Torrent (currently uncovered by other simulators), Roche-454, Illumina and ABI-SOLiD sequencing machines. This tool is publicly available at http://bioinformatics.ua.pt/software/xs/.
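What simulating the three FASTQ components independently looks like can be sketched in a few lines of Python; this is a generic illustration with made-up distributions, not XS's algorithm or its output options.

import random

random.seed(4)

def fastq_record(i, length=100):
    header = f"@sim_read_{i}"                                     # component 1: header
    seq = "".join(random.choice("ACGT") for _ in range(length))   # component 2: DNA
    # component 3: Phred+33 quality string, scores drawn uniformly from 20-40
    qual = "".join(chr(33 + random.randint(20, 40)) for _ in range(length))
    return f"{header}\n{seq}\n+\n{qual}\n"

with open("simulated.fastq", "w") as fh:
    for i in range(1000):
        fh.write(fastq_record(i))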
Evapotranspiration Calculator Desktop Tool
The Evapotranspiration Calculator estimates evapotranspiration time series data for hydrological and water quality models for the Hydrologic Simulation Program - Fortran (HSPF) and the Stormwater Management Model (SWMM).
USDA-ARS?s Scientific Manuscript database
Models that estimate the effects of agricultural conservation practices on water quantity and quality have become increasingly important tools for short- and long-term assessments. In this study, we simulated the water quality and hydrology of a portion of the Jobos Bay watershed, Puerto Rico using...
THE STORM WATER MANAGEMENT MODEL (SWMM) AND RELATED WATERSHED TOOLS DEVELOPMENT
The Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. It is the only publicly available model capable of performing a comprehensiv...
APEX Model Simulation for Row Crop Watersheds with Agroforestry and Grass Buffers
USDA-ARS?s Scientific Manuscript database
Watershed model simulation has become an important tool in studying ways and means to reduce the transport of agricultural pollutants. Conducting field experiments to assess buffer influences on water quality is constrained by the large-scale nature of watersheds, high experimental costs, and private owner...
Observer roles that optimise learning in healthcare simulation education: a systematic review.
O'Regan, Stephanie; Molloy, Elizabeth; Watterson, Leonie; Nestel, Debra
2016-01-01
Simulation is widely used in health professional education. The convention that learners are actively involved may limit access to this educational method. The aim of this paper is to review the evidence for learning methods that employ directed observation as an alternative to hands-on participation in scenario-based simulation training. We sought studies that included either direct comparison of the learning outcomes of observers with those of active participants or identified factors important for the engagement of observers in simulation. We systematically searched health and education databases and reviewed journals and bibliographies for studies investigating or referring to observer roles in simulation using mannequins, simulated patients or role play simulations. A quality framework was used to rate the studies. Nine studies met the inclusion criteria. Five studies suggest learning outcomes in observer roles are as good as or better than those in hands-on roles in simulation. Four studies document learner satisfaction in observer roles. Five studies used a tool to guide observers. Eight studies involved observers in the debrief. Learning and satisfaction in observer roles are closely associated with observer tools, learner engagement, role clarity and contribution to the debrief. Learners who valued observer roles described them as affording an overarching view, examination of details from a distance, and meaningful feedback during the debrief. Learners who did not value observer roles described them as passive, or boring when compared to hands-on engagement in the simulation encounter. Learning outcomes and role satisfaction for observers are improved through learner engagement and the use of observer tools. The value that students attach to observer roles appears contingent on role clarity, use of observer tools, and inclusion of observers' perspectives in the debrief.
STORM WATER MANAGEMENT MODEL QUALITY ASSURANCE REPORT: DYNAMIC WAVE FLOW ROUTING
The Storm Water Management Model (SWMM) is a computer-based tool for simulating storm water runoff quantity and quality from primarily urban areas. In 2002 the U.S. Environmental Protection Agency’s Water Supply and Water Resources Division partnered with the consulting firm CDM ...
A comprehensive evaluation of assembly scaffolding tools
2014-01-01
Background Genome assembly is typically a two-stage process: contig assembly followed by the use of paired sequencing reads to join contigs into scaffolds. Scaffolds are usually the focus of reported assembly statistics; longer scaffolds greatly facilitate the use of genome sequences in downstream analyses, and it is appealing to present larger numbers as metrics of assembly performance. However, scaffolds are highly prone to errors, especially when generated using short reads, which can directly result in inflated assembly statistics. Results Here we provide the first independent evaluation of scaffolding tools for second-generation sequencing data. We find large variations in the quality of results depending on the tool and dataset used. Even extremely simple test cases of perfect input, constructed to elucidate the behaviour of each algorithm, produced some surprising results. We further dissect the performance of the scaffolders using real and simulated sequencing data derived from the genomes of Staphylococcus aureus, Rhodobacter sphaeroides, Plasmodium falciparum and Homo sapiens. The results from simulated data are of high quality, with several of the tools producing perfect output. However, at least 10% of joins remain unidentified when using real data. Conclusions The scaffolders vary in their usability, speed and number of correct and missed joins made between contigs. Results from real data highlight opportunities for further improvements of the tools. Overall, SGA, SOPRA and SSPACE generally outperform the other tools on our datasets. However, the quality of the results is highly dependent on the read mapper and genome complexity. PMID:24581555
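The join-level bookkeeping behind such an evaluation can be illustrated with a minimal sketch; the contig pairs and truth set below are hypothetical, and real evaluations derive true joins from a reference genome.

```python
# Sketch: score a scaffolder's output as correct / missed / erroneous joins.
# The data are fabricated for illustration only.
truth_joins = {("ctg1", "ctg2"), ("ctg2", "ctg3"), ("ctg4", "ctg5")}
predicted_joins = {("ctg1", "ctg2"), ("ctg4", "ctg5"), ("ctg3", "ctg6")}

correct = truth_joins & predicted_joins      # joins both made and true
missed = truth_joins - predicted_joins       # true joins the tool did not make
erroneous = predicted_joins - truth_joins    # joins the tool made in error

print(f"correct: {len(correct)}, missed: {len(missed)}, erroneous: {len(erroneous)}")
```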
Optimizing STEM Education with Advanced ICTs and Simulations
ERIC Educational Resources Information Center
Levin, Ilya, Ed.; Tsybulsky, Dina, Ed.
2017-01-01
The role of technology in educational settings has become increasingly prominent in recent years. When utilized effectively, these tools provide a higher quality of learning for students. "Optimizing STEM Education With Advanced ICTs and Simulations" is an innovative reference source for the latest scholarly research on the integration…
POLYVIEW-MM: web-based platform for animation and analysis of molecular simulations
Porollo, Aleksey; Meller, Jaroslaw
2010-01-01
Molecular simulations offer important mechanistic and functional clues in studies of proteins and other macromolecules. However, interpreting the results of such simulations increasingly requires tools that can combine information from multiple structural databases and other web resources, and provide highly integrated and versatile analysis tools. Here, we present a new web server that integrates high-quality animation of molecular motion (MM) with structural and functional analysis of macromolecules. The new tool, dubbed POLYVIEW-MM, enables animation of trajectories generated by molecular dynamics and related simulation techniques, as well as visualization of alternative conformers, e.g. obtained as a result of protein structure prediction methods or small molecule docking. To facilitate structural analysis, POLYVIEW-MM combines interactive view and analysis of conformational changes using Jmol and its tailored extensions, publication quality animation using PyMol, and customizable 2D summary plots that provide an overview of MM, e.g. in terms of changes in secondary structure states and relative solvent accessibility of individual residues in proteins. Furthermore, POLYVIEW-MM integrates visualization with various structural annotations, including automated mapping of known interaction sites from structural homologs, mapping of cavities and ligand binding sites, transmembrane regions and protein domains. URL: http://polyview.cchmc.org/conform.html. PMID:20504857
Can surgical simulation be used to train detection and classification of neural networks?
Zisimopoulos, Odysseas; Flouty, Evangello; Stacey, Mark; Muscroft, Sam; Giataganas, Petros; Nehme, Jean; Chow, Andre; Stoyanov, Danail
2017-10-01
Computer-assisted interventions (CAI) aim to increase the effectiveness, precision and repeatability of procedures to improve surgical outcomes. The presence and motion of surgical tools is a key information input for CAI surgical phase recognition algorithms. Vision-based tool detection and recognition approaches are an attractive solution and can be designed to take advantage of the powerful deep learning paradigm that is rapidly advancing image recognition and classification. The challenge for such algorithms is the availability and quality of labelled data used for training. In this Letter, surgical simulation is used to train tool detection and segmentation based on deep convolutional neural networks and generative adversarial networks. The authors experiment with two network architectures for image segmentation in tool classes commonly encountered during cataract surgery. A commercially available simulator is used to create a simulated cataract dataset for training models prior to performing transfer learning on real surgical data. To the best of the authors' knowledge, this is the first attempt to train deep learning models for surgical instrument detection on simulated data while demonstrating promising results in generalising to real data. Results indicate that simulated data does have some potential for training advanced classification methods for CAI systems.
Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S
2015-03-15
The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfil the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.
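A minimal sketch of a conceptual accumulation/wash-off model of the kind described above, assuming a linear build-up and exponential wash-off form (a common convention; the paper's exact formulation may differ), with illustrative parameters:

```python
import math

# Daily build-up / wash-off sketch; all parameters are assumed, not the study's.
k_acc = 0.5      # dry-weather pollutant build-up rate (g/ha/day), assumed
k_wash = 0.2     # wash-off coefficient per mm of runoff, assumed
mass = 0.0       # pollutant mass on the catchment surface (g/ha)

runoff_mm = [0, 0, 5, 0, 0, 0, 12, 0]   # hypothetical daily runoff depths

for day, runoff in enumerate(runoff_mm):
    mass += k_acc                                      # build-up during the day
    washed = mass * (1 - math.exp(-k_wash * runoff))   # exponential wash-off
    mass -= washed
    print(f"day {day}: washed off {washed:.3f} g/ha, remaining {mass:.3f} g/ha")
```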
Air quality (AQ) simulation models provide a basis for implementing the National Ambient Air Quality Standards (NAAQS) and are a tool for performing risk-based assessments and for developing environmental management strategies. Fine particulate matter (PM 2.5), its constituent...
Integrating Reliability Analysis with a Performance Tool
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael
1995-01-01
A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves, one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model
2016-01-01
The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effectual business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics by employing DEX and the qualitative to quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694
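The flavour of hierarchical multi-criteria ranking can be conveyed with a short numeric sketch; note that DEX itself is qualitative and rule-based, so this weighted-sum simplification, and all criteria, weights and scores, are illustrative only.

```python
# Two-level weighted scoring sketch; not the DEX/QQ method itself.
criteria = {
    "visual aspects":    {"weight": 0.3, "scores": {"toolA": 4, "toolB": 3}},
    "simulation":        {"weight": 0.4, "scores": {"toolA": 3, "toolB": 5}},
    "statistics/report": {"weight": 0.3, "scores": {"toolA": 5, "toolB": 4}},
}

tools = ["toolA", "toolB"]
totals = {t: sum(c["weight"] * c["scores"][t] for c in criteria.values())
          for t in tools}

# Rank tools by aggregated score, best first.
for tool, score in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{tool}: {score:.2f}")
```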
NASA Astrophysics Data System (ADS)
Gholami, V.; Khaleghi, M. R.; Sebghati, M.
2017-11-01
Water quality testing is a costly, time-consuming, and important stage of routine monitoring, so models have become commonplace for simulating water quality. In this study, the coactive neuro-fuzzy inference system (CANFIS) was used to simulate groundwater quality, and a geographic information system (GIS) was used as a pre- and post-processing tool to display the spatial variation of groundwater quality. All important factors were quantified and a groundwater quality index (GWQI) was developed. The proposed model was trained and validated on a case study of the Mazandaran Plain in northern Iran. The factors affecting groundwater quality were the input variables for the simulation, and the GWQI was the output. Network validation was performed by comparing estimated and actual GWQI values. In GIS, the study area was converted to raster format with a pixel size of 1 km, and the geo-referenced layers of the factors affecting groundwater quality were combined as input data layers to the CANFIS model. The numeric values of each pixel, with their geographical coordinates, were then fed to the model, giving a simulation of groundwater quality across the study area. Finally, the simulated GWQI values were returned to GIS to produce a groundwater quality map (raster layer) based on the network simulation. The results confirm the high efficiency of combining neuro-fuzzy techniques with GIS, and indicate that the overall groundwater quality in most of the studied plain is fairly low.
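A simple weighted-subindex GWQI computation of the sort such a model learns to reproduce might look as follows; the factors, bounds and weights are hypothetical, not those of the study.

```python
# Weighted groundwater quality index sketch; all values are illustrative.
factors = {   # name: (measured value, (lo, hi) acceptable range, weight)
    "EC":       (1200.0, (0.0, 1500.0), 0.4),
    "chloride": (250.0,  (0.0, 400.0),  0.3),
    "nitrate":  (30.0,   (0.0, 50.0),   0.3),
}

gwqi = 0.0
for name, (value, (lo, hi), weight) in factors.items():
    # Normalise each factor so 1.0 = best quality, 0.0 = at/over the limit.
    sub = max(0.0, min(1.0, 1.0 - (value - lo) / (hi - lo)))
    gwqi += weight * sub

print(f"GWQI = {gwqi:.2f} (0 = worst, 1 = best)")
```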
Mihaljevic, Susan E; Howard, Valerie M
2016-01-01
Improving resident safety and quality of care by maximizing interdisciplinary communication among long-term care providers is essential in meeting the goals of the United States' federal health care reform. The new Triple Aim goals focus on improved patient outcomes, increased patient satisfaction, and decreased health care costs, thus providing consumers with quality, efficient, patient-focused care. Within the United States, sepsis is the 10th leading cause of death, with a 28.6% mortality rate in the elderly, increasing to 40% to 60% in septic shock. As a result of the Affordable Care Act, the Centers for Medicare & Medicaid Services supported the Interventions to Reduce Acute Care Transfers 3.0 program to improve health care quality and prevent avoidable rehospitalization by improving assessment, documentation, and communication among health care providers. The Interventions to Reduce Acute Care Transfers 3.0 tools were incorporated into interprofessional sepsis simulations across 19 long-term care facilities to encourage early recognition of sepsis symptoms and prompt communication of those symptoms among interdisciplinary teams. As a result of this simulation training, many long-term care organizations have adopted the STOP and WATCH and SBAR tools as a venue to communicate resident condition changes.
ASSESSING THE WATER QUALITY IMPACTS OF GLOBAL CLIMATE CHANGE IN SOUTHWESTERN OHIO, U.S.A
This paper uses a watershed-scale hydrologic model (Soil and Water Assessment Tool) to simulate the water quality impacts of future climate change in the Little Miami River (LMR) watershed in southwestern Ohio. The LMR watershed, the principal source of drinking water for 1.6 mi...
Due to the computational cost of running regional-scale numerical air quality models, reduced form models (RFM) have been proposed as computationally efficient simulation tools for characterizing the pollutant response to many different types of emission reductions. The U.S. Envi...
On the modeling of separation foils in thermoforming simulations
NASA Astrophysics Data System (ADS)
Margossian, Alexane; Bel, Sylvain; Hinterhölzl, Roland
2016-10-01
Composite forming simulations model the forming process of composite components to anticipate the occurrence of potential flaws such as out-of-plane wrinkles and fibre re-orientation. Forming methods are often automated processes in which flat composite blanks are forced to comply with tool geometries. Although Finite Element forming simulations require modelling all the components involved (blankholder, tooling and composite blank), consumables such as separation films are often not considered. Used in thermoforming processes, these films are placed between tooling and composite to ease part removal after forming; they are also used to decrease tool/ply friction and thus enhance forming quality. This work presents thermoforming simulations of pre-impregnated carbon fibre thermoplastic blanks in which separation films are modelled in the same manner as composite layers, i.e. by a layer of shell elements. The mechanical properties of these films are also characterised at the temperature at which forming occurs. The proposed approach is finally compared to the current modelling practice, in which separation films are not modelled explicitly but their influence is accounted for only through the friction coefficient between tooling and blank.
Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-02-01
New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
Waterborne Disease Case Investigation: Public Health Nursing Simulation.
Alexander, Gina K; Canclini, Sharon B; Fripp, Jon; Fripp, William
2017-01-01
The lack of safe drinking water is a significant public health threat worldwide. Registered nurses assess the physical environment, including the quality of the water supply, and apply environmental health knowledge to reduce environmental exposures. The purpose of this research brief is to describe a waterborne disease simulation for students enrolled in a public health nursing (PHN) course. A total of 157 undergraduate students completed the simulation in teams, using the SBAR (Situation-Background-Assessment-Recommendation) reporting tool. Simulation evaluation consisted of content analysis of the SBAR tools and debriefing notes. Student teams completed the simulation and articulated the implications for PHN practice. Student teams discussed assessment findings and primarily recommended four nursing interventions: health teaching focused on water, sanitation, and hygiene; community organizing; collaboration; and advocacy to ensure a safe water supply. With advanced planning and collaboration with partners, waterborne disease simulation may enhance PHN education. [J Nurs Educ. 2017;56(1):39-42.]. Copyright 2017, SLACK Incorporated.
The design of real time infrared image generation software based on Creator and Vega
NASA Astrophysics Data System (ADS)
Wang, Rui-feng; Wu, Wei-dong; Huo, Jun-xiu
2013-09-01
To meet the demand for realistic, real-time dynamic infrared imagery in infrared image simulation, a method for designing a real-time infrared image simulation application on the VC++ platform is proposed, based on the visual simulation software Creator and Vega. The functions of Creator are briefly introduced, and the main features of the Vega development environment are analyzed. Methods for infrared modeling of targets and backgrounds are presented, the design flow of the real-time IR image generation software is described, and the functions of the TMM Tool, the MAT Tool and the sensor module are explained; the real-time performance of the software is addressed as well.
Development of a station based climate database for SWAT and APEX assessments in the U.S.
USDA-ARS?s Scientific Manuscript database
Water quality simulation models such as the Soil and Water Assessment Tool (SWAT) and Agricultural Policy EXtender (APEX) are widely used in the U.S. These models require large amounts of spatial and tabular data to simulate the natural world. Accurate and seamless daily climatic data are critical...
Simulation as a vehicle for enhancing collaborative practice models.
Jeffries, Pamela R; McNelis, Angela M; Wheeler, Corinne A
2008-12-01
Clinical simulation used in a collaborative practice approach is a powerful tool to prepare health care providers for shared responsibility for patient care. Clinical simulations are being used increasingly in professional curricula to prepare providers for quality practice. Little is known, however, about how these simulations can be used to foster collaborative practice across disciplines. This article provides an overview of what simulation is, what collaborative practice models are, and how to set up a model using simulations. An example of a collaborative practice model is presented, and nursing implications of using a collaborative practice model in simulations are discussed.
Friction Stir Welding of Magnesium Alloy Type AZ 31
NASA Astrophysics Data System (ADS)
Kupec, Tomáš; Behúlová, Mária; Turňa, Milan; Sahul, Miroslav
The paper deals with welding of the Mg alloy AZ 31 by Friction Stir Welding (FSW). FSW is at present predominantly used for welding light metals and alloys, such as aluminium, magnesium and their alloys. The experimental part consists of simulation and fabrication of welded joints on newly installed welding equipment available at the Welding Research Institute — Industrial Institute of SR Bratislava. Welding tools made of tool steel type H 13 were used for the welding experiments. The geometry of the welding tools was designed on the basis of published knowledge. Suitable welding parameters and conditions were determined using numerical simulation, with the main emphasis on tool revolutions, welding speed and tool bevel angle. The effect of the welding parameters on the quality of the welded joints was assessed. Assessment of the welded joints was carried out by radiography, light microscopy, hardness measurement and EDX microanalysis. A static tensile test was employed for mechanical testing.
Artificial intelligence in public health prevention of legionelosis in drinking water systems.
Sinčak, Peter; Ondo, Jaroslav; Kaposztasova, Daniela; Virčikova, Maria; Vranayova, Zuzana; Sabol, Jakub
2014-08-21
Good-quality water supplies and safe sanitation in urban areas are a major challenge for governments throughout the world, and adequate water quality is a basic requirement for our lives. Colony-forming units of the bacterium Legionella pneumophila in potable water represent a problem that cannot be overlooked for health protection reasons. We analysed several methods to program a virtual hot water tank with AI (artificial intelligence) tools, including neuro-fuzzy systems, as a precaution against legionellosis. The main goal of this paper is to present research that simulates the temperature profile in the water tank. This research presents a tool for a water management system to simulate conditions that can prevent legionellosis outbreaks in a water system. The challenge is to create a virtual water tank simulator, including the water environment, that can reproduce situations common in building water distribution systems. The key feature of the presented system is its adaptability to any hot water tank. While respecting the basic parameters of hot water, a water supplier and building maintainer are required to ensure the predefined quality and water temperature at each sampling site and avoid the growth of Legionella. The presented system is one small contribution towards overcoming situations in which legionellosis could find good conditions to spread and jeopardize human lives.
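A minimal sketch of the kind of virtual-tank reasoning described, assuming a plain thermostat/heat-loss balance rather than the paper's neuro-fuzzy model; all parameters are illustrative.

```python
# Virtual hot-water-tank temperature check; Legionella grows roughly at 25-45 C.
# The heating/cooling rates and outage scenario are assumptions for illustration.
ambient = 20.0          # room temperature (C), assumed
tank_temp = 60.0        # initial tank temperature (C), assumed
RISK = (25.0, 45.0)     # approximate Legionella growth range (C)

for hour in range(48):
    heater_on = hour < 8                        # simulate a heater outage after 8 h
    if heater_on and tank_temp < 60.0:
        tank_temp += 2.0                        # thermostat-limited heating
    tank_temp -= 0.05 * (tank_temp - ambient)   # standing heat loss
    if RISK[0] <= tank_temp <= RISK[1]:
        print(f"hour {hour:2d}: {tank_temp:.1f} C - in Legionella growth range")
```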
Fan, Chihhao; Ko, Chun-Han; Wang, Wei-Shen
2009-04-01
Water quality modeling has been shown to be a useful tool in strategic water quality management. The present study combines the Qual2K model with the HEC-RAS model to assess the water quality of a tidal river in northern Taiwan. The contaminant loadings of biochemical oxygen demand (BOD), ammonia nitrogen (NH3-N), total phosphorus (TP), and sediment oxygen demand (SOD) are utilized in the Qual2K simulation. The HEC-RAS model is used to: (i) estimate the hydraulic constants for calculating the atmospheric re-aeration constant; and (ii) calculate the water level profile variation to account for concentration changes as a result of the tidal effect. The results show that HEC-RAS-assisted Qual2K simulations taking the tidal effect into consideration produce water quality indices that, in general, agree with the monitoring data of the river. Comparisons of simulations with different combinations of contaminant loadings demonstrate that BOD is the most important contaminant. A Streeter-Phelps simulation (in combination with HEC-RAS) is also performed for comparison, and the results show excellent agreement with the observed data. This paper is the first report of the innovative use of a combination of the HEC-RAS model and the Qual2K model (or Streeter-Phelps equation) to simulate water quality in a tidal river. The combination is shown to provide an alternative for water quality simulation of a tidal river when available dynamic-monitoring data are insufficient to assess the tidal effect of the river.
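The Streeter-Phelps dissolved-oxygen sag used for the comparison is compact enough to sketch directly; the rate constants and loads below are illustrative, with the re-aeration rate standing in for the HEC-RAS-derived hydraulic constants.

```python
import math

# Streeter-Phelps DO sag sketch; all coefficients are assumed, not the study's.
k_d = 0.35     # BOD decay rate (1/day), assumed
k_a = 0.70     # re-aeration rate (1/day), assumed
L0 = 12.0      # initial ultimate BOD (mg/L), assumed
D0 = 1.0       # initial DO deficit (mg/L), assumed
do_sat = 8.5   # DO saturation (mg/L), assumed

for t in range(11):
    # Classical deficit solution D(t), then DO = DO_sat - D(t).
    deficit = ((k_d * L0 / (k_a - k_d))
               * (math.exp(-k_d * t) - math.exp(-k_a * t))
               + D0 * math.exp(-k_a * t))
    print(f"t = {t:2d} d: DO = {do_sat - deficit:.2f} mg/L")
```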
Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...
Towards a supported common NEAMS software stack
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cormac Garvey
2012-04-01
The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. These new breeds of simulation codes will include rigorous verification, validation and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier acquisition of licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics and the added computational resources needed to quantify the accuracy and quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out the production simulation runs.
Gosselin, Emilie; Bourgault, Patricia; Lavoie, Stephan; Coleman, Robin-Marie; Méziat-Burdin, Anne
2014-12-01
Pain management in the intensive care unit is often inadequate, and no tool is available to assess nursing pain management practices. The aim of this study was to develop and validate a measuring tool to assess nursing pain management in the intensive care unit during standardized clinical simulation. A literature review was performed to identify relevant components demonstrating optimal pain management in adult intensive care units and to integrate them into an observation tool. This tool was submitted to an expert panel and pretested. It was then used to assess pain management practice during 26 discrete standardized clinical simulation sessions with intensive care nurses. The Nursing Observation Tool for Pain Management (NOTPaM) contains 28 statements grouped into 8 categories, which are in turn grouped into 4 dimensions: subjective assessment, objective assessment, interventions, and reassessment. The tool's internal consistency was a Cronbach's alpha of 0.436 for the whole tool, with alphas from 0.328 to 0.518 for the individual dimensions. To evaluate inter-rater reliability, the intra-class correlation coefficient was used, calculated at 0.751 (p < .001) for the whole tool, with variations from 0.619 to 0.920 (p < .01) between dimensions. The expert panel was satisfied with the content and face validity of the tool. The psychometric qualities of the NOTPaM developed in this study are satisfactory; however, the tool could be improved with slight modifications. Nevertheless, it was useful in assessing intensive care nurses' pain management in a standardized clinical simulation. The NOTPaM is the first tool created for this purpose. Copyright © 2014 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
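The internal-consistency statistic reported here is Cronbach's alpha, which is straightforward to compute; the rating matrix below is fabricated for illustration.

```python
import numpy as np

# Cronbach's alpha sketch: rows = observed sessions, columns = tool items.
# The 0/1 ratings are fabricated, not the study's data.
ratings = np.array([
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
])

k = ratings.shape[1]
item_var = ratings.var(axis=0, ddof=1).sum()   # sum of per-item variances
total_var = ratings.sum(axis=1).var(ddof=1)    # variance of the total score
alpha = (k / (k - 1)) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```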
Chen, Ming-Jun; Cheng, Jian; Yuan, Xiao-Dong; Liao, Wei; Wang, Hai-Jun; Wang, Jing-He; Xiao, Yong; Li, Ming-Quan
2015-01-01
Repairing initial slight damage sites into stable structures by engineering techniques is the leading strategy to mitigate damage growth on large-size components used in laser-driven fusion facilities. For KH2PO4 crystals, serving as frequency converters and optoelectronic switches (Pockels cells), micro-milling has been proven the most promising method to fabricate these stable structures. However, tool marks inside the repaired pit are unavoidably introduced due to the wear of the milling cutter in the actual repairing process. Here we quantitatively investigate the effect of tool marks on the repair quality of damaged crystal components by simulating the light intensification they induce and by testing the laser-induced damage threshold. We found that, due to the formation of focusing hot spots and interference ripples, the light intensity is strongly enhanced in the presence of tool marks, especially those on rear surfaces. Moreover, the negative effect of tool marks is density dependent, and multiple tool marks aggravate the light intensification. Laser damage tests verified the role of tool marks as weak points that reduce the repair quality. This work offers a new criterion to comprehensively evaluate the quality of repaired optical surfaces, to alleviate the bottleneck issue of low laser damage threshold for optical components in laser-driven fusion facilities. PMID:26399624
Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel
NASA Technical Reports Server (NTRS)
Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian
2016-01-01
The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database upon which to evaluate the performance of icing simulation tools for large-scale swept wings representative of modern commercial transport airplanes. The purpose of this presentation is to report the results of a series of icing wind tunnel test campaigns whose aim was to provide an ice accretion database for large-scale, swept wings.
Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.
Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew
2017-09-01
Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. Articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on the number of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved a moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggests an increased interest in simulation modelling in healthcare.
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in GM's math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical in supporting fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
Side Flow Effect on Surface Generation in Nano Cutting
NASA Astrophysics Data System (ADS)
Xu, Feifei; Fang, Fengzhou; Zhang, Xiaodong
2017-05-01
The side flow of material in nano cutting is one of the most important factors that deteriorate the machined surface quality. The effects of the crystallographic orientation, feed, and the cutting tool geometry, including tool edge radius, rake angle and inclination angle, on the side flow are investigated employing molecular dynamics simulation. The results show that a stagnation region is formed in front of the tool edge, characterized by the stagnation radius R_s and the stagnation height h_s. Side flow forms because material at or under the stagnation region is extruded by the tool edge to flow to the side of the tool edge. A higher stagnation height increases the size of the side flow. The anisotropic nature of the material, which partly determines the stagnation region, also influences the side flow through the different deformation mechanisms under the action of the tool edge. At different cutting directions, the size of the side flow differs greatly, which ultimately affects the machined surface quality. The cutting directions {100}<011>, {110}<001> and {110}<1-10> are beneficial for obtaining a better surface quality with small side flow. Besides that, the side flow can be suppressed by reducing the feed and optimizing the cutting tool geometry: a cutting tool with a small edge radius, a large positive rake angle and a large inclination angle decreases the side flow and consequently improves the machined surface quality.
Advanced capability of air quality simulation models towards accurate performance at finer scales will be needed for such models to serve as tools for performing exposure and risk assessments in urban areas. It is recognized that the impact of urban features such as street and t...
NASA Astrophysics Data System (ADS)
Zwickl, Titus; Carleer, Bart; Kubli, Waldemar
2005-08-01
In the past decade, sheet metal forming simulation became a well established tool to predict the formability of parts. In the automotive industry, this has enabled significant reduction in the cost and time of vehicle design and development, and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as production quality fluctuations, continue to plague manufacturing cost and time. The focus has therefore shifted in recent times beyond mere feasibility to robustness of the product and process being engineered. Ensuring robustness is the next big challenge for virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process, in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep drawing parts may be tracked down, identified and assigned to the influential parameters. With this knowledge, defects can be eliminated or springback compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.
NASA Technical Reports Server (NTRS)
Diak, George R.; Huang, Hung-Lung; Kim, Dongsoo
1990-01-01
The paper addresses the concept of synthetic satellite imagery as a visualization and diagnostic tool for understanding future satellite sensors, and details preliminary results on the quality of soundings from current sensors. Preliminary results are presented on the quality of soundings from the combination of the High-Resolution Infrared Radiometer Sounder and the Advanced Microwave Sounding Unit. Results are also presented on the first Observing System Simulation Experiment using these data in a mesoscale numerical prediction model.
Photomask quality evaluation using lithography simulation and multi-detector MVM-SEM
NASA Astrophysics Data System (ADS)
Ito, Keisuke; Murakawa, Tsutomu; Fukuda, Naoki; Shida, Soichi; Iwai, Toshimichi; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hagiwara, Kazuyuki; Hara, Daisuke
2013-06-01
The detection and management of mask defects that are transferred onto the wafer becomes more important day by day, as photomask patterns become smaller and more complicated with the use of Inverse Lithography Technology (ILT) and Source Mask Optimization (SMO) with Optical Proximity Correction (OPC). The current method of evaluating photomask quality uses aerial imaging by optical inspection tools; at the 1Xnm node this technique reaches a resolution limit, because small defects become difficult to detect. We already reported the MEEF influence of high-end photomasks using wide-FOV SEM contour data from the "E3630 MVM-SEM®" and the lithography simulator "TrueMask® DS" of D2S Inc. in a prior paper [1]. In this paper we evaluate the correlation between our evaluation method and optical inspection tools as an ongoing assessment. In addition, to reduce the defect classification workload, we can compose three-dimensional (3D) information on defects and judge whether their repair would be required. Moreover, we confirm the possibility of wafer-plane CD measurement based on the combination of the E3630 MVM-SEM® and 3D lithography simulation.
Srinivas, Rallapalli; Singh, Ajit Pratap
2018-03-01
Assessment of the water quality status of a river with respect to its discharge has become a prerequisite to sustainable river basin management. The present paper develops an integrated model for simulating and evaluating strategies for water quality management in a river basin by controlling point source pollutant loadings and the operations of multi-purpose projects. The Water Quality Analysis and Simulation Program (WASP version 8.0) has been used to model the transport of pollutant loadings and their impact on water quality in the river. The study presents a novel approach that integrates fuzzy set theory with an "advanced eutrophication" model to simulate the transmission and distribution of several interrelated water quality variables and their bio-physiochemical processes in the Ganges river basin, India. After calibration, simulated values are compared with observed values to validate the model's robustness. The fuzzy technique for order preference by similarity to ideal solution (F-TOPSIS) has been used to incorporate the uncertainty associated with the water quality simulation results. The model also simulates five different pollution-reduction scenarios to determine the maximum pollutant loadings during monsoon and dry periods. The final results clearly indicate how the modeled reduction in the rate of wastewater discharge reduces the impact of pollutants downstream. Scenarios combining a river discharge of 1500 m3/s during the lean period with 25 and 50% reductions in the load rate are found to be the most effective options to restore the quality of the river Ganges. Thus, the model serves as an important hydrologic tool for policy makers by suggesting appropriate remediation action plans.
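The crisp core of the (F-)TOPSIS ranking step can be sketched in a few lines; F-TOPSIS additionally propagates fuzzy numbers, and the scenario scores and weights here are hypothetical.

```python
import numpy as np

# Crisp TOPSIS sketch; scenario scores and weights are fabricated.
X = np.array([            # rows = pollution-reduction scenarios,
    [0.70, 0.60, 0.50],   # columns = water quality criteria (higher = better)
    [0.80, 0.55, 0.65],
    [0.60, 0.75, 0.70],
])
w = np.array([0.5, 0.3, 0.2])   # criterion weights, assumed

V = w * X / np.linalg.norm(X, axis=0)       # weighted, vector-normalised matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # ideal and anti-ideal solutions
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)         # rank by closeness to the ideal

for i, c in enumerate(closeness):
    print(f"scenario {i + 1}: closeness = {c:.3f}")
```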
Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.
Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A
2016-04-01
The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
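A minimal discrete-event sketch of a single-scanner radiology queue, in the spirit of the models the primer describes; the arrival and service parameters are made up.

```python
import heapq
import random

# Single-server DES sketch: exponential inter-arrival and scan times.
random.seed(1)
ARRIVAL_MEAN, SCAN_MEAN, N_PATIENTS = 12.0, 10.0, 20   # minutes

events, t = [], 0.0
for pid in range(N_PATIENTS):                  # pre-generate Poisson arrivals
    t += random.expovariate(1.0 / ARRIVAL_MEAN)
    heapq.heappush(events, (t, pid))

scanner_free_at, waits = 0.0, []
while events:
    arrival, pid = heapq.heappop(events)       # next event: a patient arrives
    start = max(arrival, scanner_free_at)      # wait if the scanner is busy
    scanner_free_at = start + random.expovariate(1.0 / SCAN_MEAN)
    waits.append(start - arrival)

print(f"mean wait: {sum(waits) / len(waits):.1f} min over {N_PATIENTS} patients")
```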
Numerical simulation of controlled directional solidification under microgravity conditions
NASA Astrophysics Data System (ADS)
Holl, S.; Roos, D.; Wein, J.
The computer-assisted simulation of solidification processes influenced by gravity has gained importance in recent years in both ground-based and microgravity research. Depending on the specific needs of the investigator, the simulation model ideally covers a broad spectrum of applications. These primarily include the optimization of furnace design in interaction with selected process parameters to meet the desired crystallization conditions. Different approaches concerning the complexity of the simulation models as well as their dedicated applications are discussed in this paper. Special emphasis is put on the potential of software tools to increase the scientific quality and cost-efficiency of microgravity experimentation. The results gained so far in the context of TEXUS, FSLP, D-1 and D-2 (preparatory program) experiments are discussed, highlighting their simulation-supported preparation and evaluation. An outlook is then given on the possibilities to enhance the efficiency of pre-industrial research in the Columbus era through the incorporation of suitable simulation methods and tools.
NASA Astrophysics Data System (ADS)
Zhou, Wanmeng; Wang, Hua; Tang, Guojin; Guo, Shuai
2016-09-01
The time-consuming experimental method for handling qualities assessment cannot meet the increasingly fast design requirements of manned space flight. As a tool for aircraft handling qualities research, model-predictive-control structured inverse simulation (MPC-IS) has potential applications in the aerospace field to guide astronauts' operations and evaluate handling qualities more effectively. This paper therefore establishes MPC-IS for manually controlled rendezvous and docking (RVD) and proposes a novel artificial neural network inverse simulation system (ANN-IS) to further decrease the computational cost. The novel system was obtained by replacing the inverse model of MPC-IS with an artificial neural network. The optimal neural network was trained by the genetic Levenberg-Marquardt algorithm and finally determined by the Levenberg-Marquardt algorithm. To validate MPC-IS and ANN-IS, manually controlled RVD experiments were carried out on the simulator. Comparisons between simulation results and experimental data demonstrated the validity of the two systems and the high computational efficiency of ANN-IS.
On the evaluation of segmentation editing tools
Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.
2014-01-01
Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user's subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063
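One way to quantify "segmentation quality over time", as proposed, is the normalised area under a quality-versus-interaction curve; a sketch with fabricated per-step Dice values (the paper's actual scores also fold in the subjective ratings):

```python
import numpy as np

# Quality-over-time score sketch: area under the Dice-vs-edit-step curve,
# normalised to [0, 1]. The per-step Dice values are fabricated.
dice_per_step = np.array([0.62, 0.71, 0.80, 0.86, 0.88])  # after each edit

score = np.trapz(dice_per_step, dx=1.0) / (len(dice_per_step) - 1)
print(f"normalised quality-over-time score: {score:.3f}")
```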
Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C
2015-11-01
Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
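The incremental cost-effectiveness calculation at the core of such a model reduces to an ICER over paired virtual cohorts; the cost and QALY distributions below are fabricated for illustration.

```python
import random

# ICER sketch over paired virtual cohorts: usual care vs. the programme.
# Means and spreads are fabricated, not the model's outputs.
random.seed(7)

def simulate(mean_cost, mean_qaly):
    return (random.gauss(mean_cost, 2000.0), random.gauss(mean_qaly, 0.2))

pairs = [(simulate(42000.0, 4.1), simulate(45500.0, 4.4)) for _ in range(10000)]
d_cost = sum(b[0] - a[0] for a, b in pairs) / len(pairs)   # incremental cost
d_qaly = sum(b[1] - a[1] for a, b in pairs) / len(pairs)   # incremental QALYs

print(f"ICER = ${d_cost / d_qaly:,.0f} per QALY gained")
```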
Measuring the Process and Quality of Informed Consent for Clinical Research: Development and Testing
Cohn, Elizabeth Gross; Jia, Haomiao; Smith, Winifred Chapman; Erwin, Katherine; Larson, Elaine L.
2013-01-01
Purpose/Objectives To develop and assess the reliability and validity of an observational instrument, the Process and Quality of Informed Consent (P-QIC). Design A pilot study of the psychometrics of a tool designed to measure the quality and process of the informed consent encounter in clinical research. The study used professionally filmed, simulated consent encounters designed to vary in process and quality. Setting A major urban teaching hospital in the northeastern region of the United States. Sample 63 students enrolled in health-related programs participated in psychometric testing, 16 students participated in test-retest reliability, and 5 investigator-participant dyads were observed for the actual consent encounters. Methods For reliability and validity testing, students watched and rated videotaped simulations of four consent encounters intentionally varied in process and content and rated them with the proposed instrument. Test-retest reliability was established by raters watching the videotaped simulations twice. Inter-rater reliability was demonstrated by two simultaneous but independent raters observing an actual consent encounter. Main Research Variables The essential elements of information and communication for informed consent. Findings The initial testing of the P-QIC demonstrated reliable and valid psychometric properties in both the simulated standardized consent encounters and actual consent encounters in the hospital setting. Conclusions The P-QIC is an easy-to-use observational tool that provides a quick assessment of the areas of strength and areas that need improvement in a consent encounter. It can be used in the initial trainings of new investigators or consent administrators and in ongoing programs of improvement for informed consent. Implications for Nursing The development of a validated observational instrument will allow investigators to assess the consent process more accurately and evaluate strategies designed to improve it. PMID:21708532
Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria
2017-01-01
Objective To explore healthcare staffs’ and managers’ perceptions of how and when discrete event simulation modelling can be used as a decision support in improvement efforts. Design Two focus group discussions were performed. Setting Two settings were included: a rheumatology department and an orthopaedic section both situated in Sweden. Participants Healthcare staff and managers (n=13) from the two settings. Interventions Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Results Categories from the content analysis are presented according to the following research questions: how and when simulation modelling can assist healthcare improvement? Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Conclusions Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement render a broader application and value of simulation modelling than previously reported. PMID:28588107
THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET): METEOROLOGY MODULE
An Atmospheric Model Evaluation Tool (AMET), composed of meteorological and air quality components, is being developed to examine the error and uncertainty in the model simulations. AMET matches observations with the corresponding model-estimated values in space and time, and the...
Simulation supported POD for RT test case-concept and modeling
NASA Astrophysics Data System (ADS)
Gollwitzer, C.; Bellon, C.; Deresch, A.; Ewert, U.; Jaenisch, G.-R.; Zscherpel, U.; Mistral, Q.
2012-05-01
Within the framework of the European project PICASSO, the radiographic simulator aRTist (analytical Radiographic Testing inspection simulation tool) developed by BAM has been extended for reliability assessment of film and digital radiography. NDT of safety-relevant components in the aerospace industry requires proof of the probability of detection (POD) of the inspection. Modeling tools can reduce the expense of such extended, time-consuming NDT trials, provided the simulation results fit the experiment. Our analytic simulation tool consists of three modules for the description of the radiation source, the interaction of radiation with test pieces and flaws, and the detection process, with special focus on film and digital industrial radiography. It features high processing speed with near-interactive frame rates and a high level of realism. A concept has been developed, as well as a software extension for reliability investigations, completed by a user interface for planning automatic simulations with varying parameters and defects. Furthermore, an automatic image analysis procedure is included to evaluate defect visibility. Radiographic models derived from 3D CAD of aero engine components and quality test samples are compared as a precondition for real trials. This enables the evaluation and optimization of film replacement, allowing application of modern digital equipment for economical NDT with a defined POD.
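To make the POD assessment concrete, here is a minimal sketch of the standard hit/miss logistic-regression POD fit (in the spirit of MIL-HDBK-1823), with entirely hypothetical detection data rather than aRTist output; it derives the a90 defect size at which detection probability reaches 90%.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical hit/miss data: defect size in mm and whether the
# automatic image analysis flagged the defect (1) or missed it (0).
size_mm = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5, 2.0])
hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

# Fit p(detect) = sigmoid(w*ln(a) + b) on log defect size.
X = np.log(size_mm).reshape(-1, 1)
model = LogisticRegression().fit(X, hit)

# Invert the fitted curve for the 90% detection size a90.
w, b = model.coef_[0, 0], model.intercept_[0]
a90 = np.exp((np.log(0.9 / 0.1) - b) / w)
print(f"a90 ~ {a90:.2f} mm")
```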
A dynamic simulation based water resources education tool.
Williams, Alison; Lansey, Kevin; Washburne, James
2009-01-01
Educational tools that help the public recognize the impacts of water policy in a realistic context are not generally available. This project developed modeling-based educational decision-support simulation tools to satisfy this need. The goal of the model is to teach undergraduate students and the general public about the implications of common water management alternatives so that they can better understand or become involved in water policy and make more knowledgeable personal or community decisions. The model is based on Powersim, a dynamic simulation software package capable of producing web-accessible, intuitive, graphic, user-friendly interfaces. Modules are included to represent residential, agricultural, industrial, and turf uses, as well as non-market values, water quality, reservoir, flow, and climate conditions. Supplementary materials emphasize important concepts and lead learners through the model, culminating in an open-ended water management project. The model is used in a University of Arizona undergraduate class and within the Arizona Master Watershed Stewards Program. Evaluation results demonstrated improved understanding of concepts and system interactions, fulfilling the project's objectives.
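As an illustration of the stock-and-flow logic such a dynamic simulation model encodes, here is a minimal Python sketch of a single reservoir module with residential, agricultural, and evaporative outflows; the structure and every number are assumptions for illustration, not the Powersim model itself.

```python
# Minimal stock-and-flow sketch of one reservoir module (all values
# hypothetical; a Powersim-style model wires many such stocks together).
storage = 1_000.0           # initial storage (acre-feet)
capacity = 1_500.0          # reservoir capacity (acre-feet)

for month in range(1, 13):
    inflow = 120.0 if month in (1, 2, 3, 12) else 60.0    # wetter winter months
    residential = 45.0                                    # constant demand
    agricultural = 70.0 if 4 <= month <= 9 else 20.0      # irrigation season
    evaporation = 0.01 * storage                          # ~1% of storage/month
    storage += inflow - residential - agricultural - evaporation
    storage = min(max(storage, 0.0), capacity)            # physical limits
    print(f"month {month:2d}: storage = {storage:7.1f} acre-feet")
```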
Ethridge, Max
2009-01-01
The Ozark Highlands include diverse topographic, geologic, soil, and hydrologic conditions that support a broad range of habitat types. The landscape features rugged uplands - some peaks higher than 2,500 feet above sea level - with exposed rock and varying soil depths and includes extensive areas of karst terrain. The Highlands are characterized by extreme biological diversity and high endemism (uniqueness of species). Vegetation communities are dominated by open oak-hickory and shortleaf pine woodlands and forests. Included in this vegetation matrix is an assemblage of various types of fens, forests, wetlands, fluvial features, and carbonate and siliceous glades. An ever-growing human population in the Ozark Highlands has become very dependent on reservoirs constructed on major rivers in the region and, in some cases, groundwater for household and public water supply. Because of human population growth in the Highlands and increases in industrial and agricultural activities, not only is adequate water quantity an issue, but maintaining good water quality is also a challenge. Point and nonpoint sources of excessive nutrients are an issue. U.S. Geological Survey (USGS) partnership programs to monitor water quality and develop simulation tools to help stakeholders better understand strategies to protect the quality of water and the environment are extremely important. The USGS collects relevant data, conducts interpretive studies, and develops simulation tools to help stakeholders understand resource availability and sustainability issues. Stakeholders dependent on these resources are interested in and benefit greatly from evolving these simulation tools (models) into decision support systems that can be used for adaptive management of water and ecological resources. The interaction of unique and high-quality biological and hydrologic resources and the effects of stresses from human activities can be evaluated best by using a multidisciplinary approach that the USGS can provide. Information varying from defining baseline resource conditions to developing simulation models will help resource managers and users understand the human impact on resource sustainability. Varied expertise and experience in biological and water-resources activities across the entire Highlands make the USGS a valued collaborator in studies of Ozark ecosystems, streams, reservoirs, and groundwater. A large part of future success will depend on the involvement and active participation of key partners.
The use of simulation in teaching the basic sciences.
Eason, Martin P
2013-12-01
To assess the current use of simulation in medical education, specifically the teaching of the basic sciences, to accomplish the goal of improved integration. Simulation is increasingly being used by institutions to teach the basic sciences. Preliminary data suggest that it is an effective tool with increased retention and learner satisfaction. Medical education is undergoing tremendous change. One direction of that change is the increasing integration of the basic and clinical sciences to improve the efficiency and quality of medical education and, ultimately, patient care. Integration is thought to improve the understanding of basic science conceptual knowledge and to better prepare learners for clinical practice. Simulation, because of its unique effects on learning, is currently being used successfully by many institutions as a means to produce that integration through the teaching of the basic sciences. Preliminary data indicate that simulation is an effective tool for basic science education and garners high learner satisfaction.
Many water utilities in the US using chloramine as disinfectant treatment in their distribution systems have experienced nitrification episodes, which detrimentally impact the water quality. A chloraminated drinking water distribution system (DWDS) simulator was operated throug...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rilling, M; Centre de Recherche sur le Cancer, Hôtel-Dieu de Québec, Quebec City, QC; Département de radio-oncologie, CHU de Québec, Quebec City, QC
2015-06-15
Purpose: The purpose of this work is to simulate a multi-focus plenoptic camera used as the measuring device in a real-time three-dimensional scintillation dosimeter. Simulating and optimizing this realistic optical system will bridge the technological gap between concept validation and a clinically viable tool that can provide highly efficient, accurate and precise measurements for dynamic radiotherapy techniques. Methods: The experimental prototype, previously developed for proof of concept purposes, uses an off-the-shelf multi-focus plenoptic camera. With an array of interleaved microlenses of different focal lengths, this camera records spatial and angular information of light emitted by a plastic scintillator volume. The three distinct microlens focal lengths were determined experimentally for use as baseline parameters by measuring image-to-object magnification for different distances in object space. A simulated plenoptic system was implemented using the non-sequential ray tracing software Zemax: this tool allows complete simulation of multiple optical paths by modeling interactions at interfaces such as scatter, diffraction, reflection and refraction. The active sensor was modeled based on the camera manufacturer specifications by a 2048×2048, 5 µm-pixel pitch sensor. Planar light sources, simulating the plastic scintillator volume, were employed for ray tracing simulations. Results: The microlens focal lengths were determined to be 384, 327 and 290 µm. A realistic multi-focus plenoptic system, with independently defined and optimizable specifications, was fully simulated. A f/2.9 and 54 mm-focal length Double Gauss objective was modeled as the system's main lens. A three-focal length hexagonal microlens array of 250-µm thickness was designed, acting as an image-relay system between the main lens and sensor. Conclusion: Simulation of a fully modeled multi-focus plenoptic camera enables the decoupled optimization of the main lens and microlens specifications. This work leads the way to improving the 3D dosimeter's achievable resolution, efficiency and build for providing a quality assurance tool fully meeting clinical needs. M.R. is financially supported by a Master's Canada Graduate Scholarship from the NSERC. This research is also supported by the NSERC Industrial Research Chair in Optical Design.
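For the focal-length measurement described above, a thin-lens approximation relates the measured magnification m at object distance d to the focal length via m = f/(d − f), i.e. f = m·d/(1 + m). A minimal sketch under that simplification, with hypothetical magnification values chosen to land near the reported 384 µm lenslet:

```python
import numpy as np

# Hypothetical |magnification| measured at several object distances (mm).
# Thin-lens model: m = f / (d - f)  =>  f = m * d / (1 + m).
d_mm = np.array([2000.0, 3000.0, 4000.0, 5000.0])
m = np.array([1.92e-4, 1.28e-4, 9.60e-5, 7.68e-5])

f_mm = m * d_mm / (1.0 + m)
print(f"estimated focal length ~ {f_mm.mean() * 1000:.0f} um")
```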
NASA Astrophysics Data System (ADS)
Van Opstal, J.; Neale, C. M. U.; Lecina, S.
2014-12-01
Irrigation management is a dynamic process that adapts according to weather conditions and water availability, as well as socio-economic influences. The goal of water users is to adapt their management to achieve maximum profits. However, these decisions should take into account the environmental impact on the surroundings. Agricultural irrigation systems need to be viewed as an integral part of a watershed. Therefore, changes in the infrastructure, operation, and management of an irrigated area have an impact on the water quantity and quality available for other water users. A strategy can be developed for decision-makers using an irrigation system modelling tool. Such a tool can simulate the impact of the infrastructure, operation and management of an irrigation area on its hydrology and agricultural productivity. This combination of factors is successfully simulated with the Ador model, which is able to reproduce on-farm irrigation and water delivery by a canal system. Model simulations for this study are supported with spatial analysis tools using GIS and remote sensing. Continuous measurements of drainage water will be added to capture the water quality aspects. The Bear River Canal Company located in Northern Utah (U.S.A.) is used as a case study for this research. The irrigation area encompasses 26,000 ha and grows mainly alfalfa, grains, corn and onions. The model allows the simulation of different strategies related to water delivery, on-farm water use, crop rotations, and reservoir and network capacities under different weather and water availability conditions. Such changes in the irrigation area will have consequences for farmers in the study area regarding crop production, and for downstream users concerning both the quantity and quality of outflows. The findings from this study give insight to decision-makers and water users for changing irrigation water delivery strategies to improve the sustainability and profitability of agriculture in the future.
Development of an Efficient Approach to Perform Neutronics Simulations for Plutonium-238 Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chandler, David; Ellis, Ronald James
Conversion of 238Pu decay heat into usable electricity is imperative to power National Aeronautics and Space Administration (NASA) deep space exploration missions; however, the current stockpile of 238Pu is diminishing and the quality is less than ideal. In response, the US Department of Energy and NASA have undertaken a program to reestablish a domestic 238Pu production program, and a technology demonstration sub-project has been initiated. Neutronics simulations for 238Pu production play a vital role in this project because the results guide reactor safety-basis, target design and optimization, and post-irradiation examination activities. A new, efficient neutronics simulation tool written in Python was developed to evaluate, with the highest fidelity possible with approved tools, the time-dependent nuclide evolution and heat deposition rates in 238Pu production targets irradiated in the High Flux Isotope Reactor (HFIR). The Python Activation and Heat Deposition Script (PAHDS) was developed specifically for experiment analysis in HFIR and couples the MCNP5 and SCALE 6.1.3 software quality-assured tools to take advantage of an existing high-fidelity MCNP HFIR model, the most up-to-date ORIGEN code, and the most up-to-date nuclear data. Three cycle simulations were performed with PAHDS implementing the ENDF/B-VII.0, ENDF/B-VII.1, and Hybrid Library GPD-Rev0 cross-section libraries. The 238Pu production results were benchmarked against VESTA-obtained results and the impact of the various cross-section libraries on the calculated metrics was assessed.
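In its simplest form, the nuclide-evolution part of such a calculation reduces to a small production/decay ODE system. The sketch below is a toy model of the 237Np(n,γ)238Np → 238Pu chain; the flux and cross-section values are placeholders, and it stands in for (and is far simpler than) the MCNP/ORIGEN coupling PAHDS performs.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy 238Pu production chain: 237Np captures a neutron, 238Np beta-decays
# to 238Pu. Flux and cross section are placeholders, not HFIR values.
phi = 2.0e15                        # neutron flux (n/cm^2/s), illustrative
sigma_c = 175.0e-24                 # 237Np thermal capture cross section (cm^2), approx.
lam = np.log(2) / (2.117 * 86400)   # 238Np decay constant (half-life ~2.117 d)

def chain(t, n):
    np237, np238, pu238 = n
    capture = sigma_c * phi * np237
    decay = lam * np238
    return [-capture, capture - decay, decay]

sol = solve_ivp(chain, (0.0, 25 * 86400), [1.0, 0.0, 0.0])  # ~25-day cycle
print(f"fraction of initial 237Np converted to 238Pu: {sol.y[2, -1]:.1%}")
```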
Simulation of laser radar tooling ball measurements: focus dependence
NASA Astrophysics Data System (ADS)
Smith, Daniel G.; Slotwinski, Anthony; Hedges, Thomas
2015-10-01
The Nikon Metrology Laser Radar system focuses a beam from a fiber to a target object and receives the light scattered from the target through the same fiber. The system can, among other things, make highly accurate measurements of the position of a tooling ball by locating the angular position of peak signal quality, which is related to the fiber coupling efficiency. This article explores the relationship between fiber coupling efficiency and focus condition.
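A common way to quantify this relationship is the Gaussian mode-overlap approximation: for two Gaussian beams with waist radii w1 and w2 separated axially by Δz, the power coupling efficiency is η = 4 / [(w1/w2 + w2/w1)² + (λΔz/(πw1w2))²]. A sketch under that assumption, with illustrative values rather than parameters of the Nikon system:

```python
import numpy as np

def coupling_efficiency(w1_um, w2_um, dz_um, wavelength_um=1.55):
    """Gaussian mode-overlap coupling efficiency for waists w1, w2 with
    axial separation dz; equals 1.0 for matched, co-located waists."""
    mismatch = (w1_um / w2_um + w2_um / w1_um) ** 2
    defocus = (wavelength_um * dz_um / (np.pi * w1_um * w2_um)) ** 2
    return 4.0 / (mismatch + defocus)

# Sweep the focus error to show the peaked signal used to locate the
# tooling ball (mode radius and wavelength are illustrative).
for dz in (0.0, 25.0, 50.0, 100.0):
    eta = coupling_efficiency(5.2, 5.2, dz)
    print(f"defocus {dz:5.1f} um -> coupling efficiency {eta:.3f}")
```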
Photomask quality evaluation using lithography simulation and precision SEM image contour data
NASA Astrophysics Data System (ADS)
Murakawa, Tsutomu; Fukuda, Naoki; Shida, Soichi; Iwai, Toshimichi; Matsumoto, Jun; Nakamura, Takayuki; Hagiwara, Kazuyuki; Matsushita, Shohei; Hara, Daisuke; Adamov, Anthony
2012-11-01
To evaluate photomask quality, the current method uses spatial imaging by optical inspection tools. At the 1X-nm node, this technique reaches a resolution limit because small defects become difficult to extract. To simulate the mask error-enhancement factor (MEEF) influence for aggressive OPC at the 1X-nm node, wide-FOV contour data and tone information are derived from high-precision SEM images. For this purpose we have developed a new contour data extraction algorithm with sub-nanometer accuracy resulting in a wide field-of-view (FOV) SEM image (for example, more than 10um x 10um square). We evaluated the MEEF influence of a high-end photomask pattern using the wide-FOV contour data of the E3630 MVM-SEM™ and the lithography simulator TrueMask™ DS of D2S, Inc. As a result, we can detect "invisible defects" through their MEEF influence using the wide-FOV contour data and lithography simulator.
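MEEF itself is just the derivative of the printed wafer CD with respect to the mask CD expressed at wafer scale (mask delta divided by the reduction ratio, typically 4x). A minimal finite-difference sketch with hypothetical simulator output shows why a mask defect that is optically "invisible" can still matter when MEEF is high:

```python
import numpy as np

# Hypothetical lithography-simulator output: wafer CD for a mask CD
# perturbed +/-1 nm (wafer scale) around a 20 nm target.
cd_mask = np.array([19.0, 20.0, 21.0])    # mask CD at wafer scale (nm)
cd_wafer = np.array([16.1, 20.0, 24.3])   # simulated printed CD (nm)

meef = np.gradient(cd_wafer, cd_mask)     # d(CD_wafer)/d(CD_mask)
print(f"MEEF at target CD: {meef[1]:.1f}")  # ~4.1: mask errors amplified 4x
```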
Teamwork Assessment Tools in Obstetric Emergencies: A Systematic Review.
Onwochei, Desire N; Halpern, Stephen; Balki, Mrinalini
2017-06-01
Team-based training and simulation can improve patient safety by improving communication, decision making, and performance of team members. Currently, there is no general consensus on whether or not a specific assessment tool is better adapted to evaluate teamwork in obstetric emergencies. The purpose of this qualitative systematic review was to find the tools available to assess team effectiveness in obstetric emergencies. We searched Embase, Medline, PubMed, Web of Science, PsycINFO, CINAHL, and Google Scholar for prospective studies that evaluated nontechnical skills in multidisciplinary teams involving obstetric emergencies. The search included studies from 1944 until January 11, 2016. Data on reliability and validity measures were collected and used for interpretation. A descriptive analysis was performed on the data. Thirteen studies were included in the final qualitative synthesis. All the studies assessed teams in the context of obstetric simulation scenarios, but only six included anesthetists in the simulations. One study evaluated its teamwork tool using only validity measures, five used only reliability measures, and one used both. The most reliable tools identified were the Clinical Teamwork Scale, the Global Assessment of Obstetric Team Performance, and the Global Rating Scale of performance. However, they were still lacking in terms of quality and validity. More work needs to be conducted to establish the validity of teamwork tools for nontechnical skills, and the development of an ideal tool is warranted. Further studies are required to assess how outcomes, such as performance and patient safety, are influenced when using these tools.
Salari, Marjan; Salami Shahid, Esmaeel; Afzali, Seied Hosein; Ehteshami, Majid; Conti, Gea Oliveri; Derakhshan, Zahra; Sheibani, Solmaz Nikbakht
2018-04-22
Today, due to population growth, the growth of industry, and the variety of chemical compounds, the quality of drinking water has decreased. Five important river water quality properties, dissolved oxygen (DO), total dissolved solids (TDS), total hardness (TH), alkalinity (ALK), and turbidity (TU), were estimated from parameters that can be measured easily and at almost no cost: electric conductivity (EC), temperature (T), and pH. Water quality parameters were simulated with two modeling methods: mathematical models, based on polynomial fitting with the least-squares method, and artificial neural networks (ANN) using feed-forward network algorithms. All conditions covered by neural network modeling were tested for all parameters in this study except alkalinity. All optimum ANN models developed to simulate water quality parameters achieved R values close to 0.99; the ANN model for alkalinity achieved an R value of 0.82. Moreover, surface fitting techniques were used to refine the data sets. The presented models and equations are reliable, usable tools for studying water quality parameters at similar rivers, and a proper replacement for traditional water quality measuring equipment. Copyright © 2018 Elsevier Ltd. All rights reserved.
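As a hedged illustration of the ANN approach, the sketch below trains a small feed-forward network to map (EC, T, pH) to TDS; the data are synthetic (generated from the common TDS ≈ 0.64·EC rule of thumb plus noise), not the study's river measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for paired field samples: cheap inputs, TDS target.
n = 200
ec = rng.uniform(200, 1500, n)                        # uS/cm
temp = rng.uniform(5, 30, n)                          # deg C
ph = rng.uniform(6.5, 8.5, n)
tds = 0.64 * ec + 2.0 * temp + rng.normal(0, 15, n)   # mg/L, synthetic

X = np.column_stack([ec, temp, ph])
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X, tds)
print(f"R^2 on training data: {model.score(X, tds):.3f}")
```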
Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis
2017-11-01
In recent years, interest in petrochemical processes has been increasing, especially in the refining area. However, the high variability of the dynamic characteristics of the atmospheric distillation column poses a challenge to obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new design of a control strategy for a conventional crude oil distillation plant, defined using formal interaction analysis tools. The process dynamics and control are simulated in the Aspen HYSYS® dynamic environment under realistic operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
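One standard interaction-analysis tool for such a design is the relative gain array, computed elementwise from the steady-state gain matrix G as RGA = G ∘ (G⁻¹)ᵀ. The abstract does not say which analysis was used, so treat this as a generic sketch with illustrative distillation-column gains:

```python
import numpy as np

# Steady-state gain matrix for a 2x2 distillation pairing (illustrative
# values of the kind found in textbook LV-configuration examples).
G = np.array([[0.878, -0.864],
              [1.082, -1.096]])

rga = G * np.linalg.inv(G).T   # elementwise product: G x (G^-1)^T
print(rga)
# Diagonal RGA elements near 1 favour the diagonal pairing; very large
# or negative entries (as here, ~35) warn of severe loop interaction.
```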
Linking Six Sigma to simulation: a new roadmap to improve the quality of patient care.
Celano, Giovanni; Costa, Antonio; Fichera, Sergio; Tringali, Giuseppe
2012-01-01
Improving the quality of patient care is a challenge that calls for a multidisciplinary approach, embedding a broad spectrum of knowledge and involving healthcare professionals from diverse backgrounds. The purpose of this paper is to present an innovative approach that implements discrete-event simulation (DES) as a decision-support tool in the management of Six Sigma quality improvement projects. A roadmap is designed to assist quality practitioners and healthcare professionals in the design and successful implementation of simulation models within the define-measure-analyse-design-verify (DMADV) or define-measure-analyse-improve-control (DMAIC) Six Sigma procedures. A case regarding the reorganisation of the flow of emergency patients affected by vertigo symptoms was developed in a large town hospital as a preliminary test of the roadmap. The positive feedback from professionals carrying out the project looks promising and encourages further roadmap testing in other clinical settings. The roadmap is a structured procedure that people involved in quality improvement can implement to manage projects based on the analysis and comparison of alternative scenarios. The role of the Six Sigma philosophy in improving the quality of healthcare services is recognised both by researchers and by quality practitioners; discrete-event simulation models are commonly used to improve the key performance measures of patient care delivery. The two approaches are seldom referenced and implemented together; however, they could be successfully integrated to carry out quality improvement programs. This paper proposes an innovative approach to bridge the gap and enrich the Six Sigma toolbox of quality improvement procedures with DES.
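To illustrate the DES side of the roadmap, here is a minimal SimPy sketch of a two-stage emergency pathway (triage, then specialist) of the kind one might build in the analyse phase; the resource counts and service times are assumptions, not the hospital's data.

```python
import random
import simpy

random.seed(1)
waits = []

def patient(env, triage, specialist):
    arrival = env.now
    with triage.request() as req:          # queue for the triage nurse
        yield req
        yield env.timeout(random.expovariate(1 / 10))   # ~10 min triage
    with specialist.request() as req:      # queue for the specialist
        yield req
        yield env.timeout(random.expovariate(1 / 25))   # ~25 min consult
    waits.append(env.now - arrival)

def arrivals(env, triage, specialist):
    while True:
        yield env.timeout(random.expovariate(1 / 30))   # ~1 patient / 30 min
        env.process(patient(env, triage, specialist))

env = simpy.Environment()
triage = simpy.Resource(env, capacity=1)
specialist = simpy.Resource(env, capacity=1)
env.process(arrivals(env, triage, specialist))
env.run(until=8 * 60)                      # one 8-hour shift

print(f"{len(waits)} patients seen; mean time in system "
      f"{sum(waits) / len(waits):.0f} min")
```

Comparing such runs across scenarios (for example, adding a second specialist) is exactly the scenario analysis the roadmap embeds in the DMAIC analyse and improve steps.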
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this publication, the main focus is the demonstration of a risk assessment workflow that includes a computer simulation for generating mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned in consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, the process failure mode and effects analysis (FMEA) of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and a risk priority evaluation of failure modes. The factor used for the quantitative reassessment of criticality and risk priority is the coefficient of variation, which represents coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation into the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
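The quantitative criterion here is simple: the coefficient of variation, CV = s/x̄, of the per-tablet coating mass. A sketch with hypothetical simulation output and an illustrative criticality banding (the banding thresholds are assumptions, not ICH Q9 values):

```python
import numpy as np

# Hypothetical per-tablet coating masses (mg) from two simulated settings.
low_spray = np.array([9.1, 10.4, 8.7, 11.2, 9.8, 10.9, 8.5, 11.5])
high_spray = np.array([9.9, 10.2, 9.7, 10.4, 10.0, 10.3, 9.8, 10.1])

def cv(x):
    """Coefficient of variation: sample standard deviation / mean."""
    return x.std(ddof=1) / x.mean()

for name, masses in [("low spray rate", low_spray),
                     ("high spray rate", high_spray)]:
    c = cv(masses)
    rank = "high" if c > 0.10 else "medium" if c > 0.05 else "low"
    print(f"{name}: CV = {c:.1%} -> criticality {rank}")
```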
Development and evaluation of the Screening Trajectory Ozone Prediction System (STOPS, version 1.0)
NASA Astrophysics Data System (ADS)
Czader, B. H.; Percell, P.; Byun, D.; Choi, Y.
2014-11-01
A hybrid Lagrangian-Eulerian modeling tool has been developed using the Eulerian framework of the Community Multiscale Air Quality (CMAQ) model. It is a moving nest that utilizes saved original CMAQ simulation results to provide boundary conditions, initial conditions, as well as emissions and meteorological parameters necessary for a simulation. Given that these files are available, this tool can run independently from the CMAQ whole-domain simulation, and it is designed to simulate source-receptor relationships under changes in emissions. In this tool, the original CMAQ horizontal domain is reduced to a small sub-domain that follows a trajectory defined by the mean mixed-layer wind. It has the same vertical structure and physical and chemical interactions as CMAQ except for the advection calculation. The advantage of this tool compared to other Lagrangian models is its capability of utilizing realistic boundary conditions that change with space and time, as well as its detailed chemistry treatment. The correctness of the algorithms and the overall performance were evaluated against CMAQ simulation results. Its performance depends on the atmospheric conditions occurring during the simulation period, with the comparisons being most similar to CMAQ results under uniform wind conditions. The mean bias varies between -0.03 and -0.78 and the slope is between 0.99 and 1.01 for the different analyzed cases. For complicated meteorological conditions, such as wind circulation, the simulated mixing ratios deviate from CMAQ values as a result of the Lagrangian approach of using the mean wind for the nest movement, but are still close, with the mean bias varying between 0.07 and -4.29 and the slope between 0.95 and 1.063 for the different analyzed cases. For historical reasons this hybrid Lagrangian-Eulerian tool is named the Screening Trajectory Ozone Prediction System (STOPS), but its use is not limited to ozone prediction: as with CMAQ, it can simulate concentrations of many species, including particulate matter and some toxic compounds, such as formaldehyde and 1,3-butadiene.
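The nest movement itself is conceptually simple bookkeeping: between saved host-model outputs, the sub-domain centre is advected by the mean mixed-layer wind. A toy sketch of that step (synthetic winds rather than CMAQ fields):

```python
# Advect the nest centre with the mean mixed-layer wind (synthetic values;
# STOPS reads the winds from saved CMAQ/WRF output instead).
x_km, y_km = 0.0, 0.0
dt_h = 1.0
for hour in range(6):
    u, v = 4.0 + 0.5 * hour, -1.0      # mean mixed-layer wind (m/s)
    x_km += u * 3.6 * dt_h             # m/s over one hour -> km
    y_km += v * 3.6 * dt_h
    print(f"hour {hour + 1}: nest centre at ({x_km:5.1f}, {y_km:5.1f}) km")
```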
Analyzing the effect of tool edge radius on cutting temperature in micro-milling process
NASA Astrophysics Data System (ADS)
Liang, Y. C.; Yang, K.; Zheng, K. N.; Bai, Q. S.; Chen, W. Q.; Sun, G. Y.
2010-10-01
Cutting heat is one of the important physical phenomena in the cutting process. Cutting heat and the resulting cutting temperature directly affect tool wear and tool life, as well as workpiece precision and surface quality. In micro-milling, workpiece feature sizes are typically several microns, so even tiny changes in cutting temperature affect surface quality and accuracy. Therefore, the cutting heat and temperature generated in micro-milling have significantly different effects than those in traditional cutting. In this paper, a two-dimensional coupled thermal-mechanical finite element model is adopted to determine the thermal fields and cutting temperature during the micro-milling process, using the software Deform-2D. The effects of tool edge radius on effective stress, effective strain, velocity field, and cutting temperature distribution in micro-milling of aluminum alloy Al2024-T6 were investigated and analyzed, and the transient cutting temperature distribution was simulated dynamically. The simulation results show that the cutting temperature in micro-milling is lower than that occurring in conventional milling processes due to the small loads and low cutting velocity. With an increase of tool edge radius, the maximum-temperature region gradually shifts to the contact region between the finished surface and the flank face of the micro-cutter, instead of the rake face or the corner of the micro-cutter. This phenomenon shows an obvious size effect.
Romagnoli, Martín; Portapila, Margarita; Rigalli, Alfredo; Maydana, Gisela; Burgués, Martín; García, Carlos M
2017-10-15
Argentina has been among the world leaders in the production and export of agricultural products since the 1990s. The Carcarañá River Lower Basin (CRLB), a cropland of the Pampas region supplied by extensive rainfall, is located in an area with few streamgauging and other hydrologic/water-quality stations. Therefore, limited hydrologic data are available, resulting in limited water-resources assessment. This work explores the application of the Soil and Water Assessment Tool (SWAT) model to the CRLB in the Santa Fe province of the Pampas region. The analysis of field and remote-sensing data characterizing hydrology, water quality, soil types, land use/land cover, management practices, and crop yield ensures a comprehensive SWAT modeling approach. A combined manual and automated calibration and validation process incorporating sensitivity and uncertainty analysis is performed using information concerning interior watershed processes. Eleven N/P fertilizer rates are selected to simulate the impact of N fertilizer on crop yield and plant uptake, as well as runoff and leaching losses. Different indices (partial factor productivity, agronomic efficiency, apparent crop recovery efficiency of applied nutrient, internal utilization efficiency, and physiological efficiency) are considered to assess nitrogen-use efficiency. The overall quality of the fit is satisfactory considering the input data limitations. This work provides, for the first time in Argentina, a reliable tool to simulate yield response to soil quality and water availability, capable of meeting defined environmental targets to support decision making on public policies and private activities in the Pampas region. Copyright © 2017 Elsevier B.V. All rights reserved.
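The nitrogen-use-efficiency indices named above have standard agronomic definitions, sketched below with hypothetical plot values; these are the common textbook formulas, not code from the study.

```python
# Standard nitrogen-use-efficiency indices (common agronomy definitions).
def nue_indices(yield_f, yield_0, uptake_f, uptake_0, n_rate):
    """yield_f/uptake_f: fertilized plot; yield_0/uptake_0: unfertilized
    control (kg/ha); n_rate: N applied (kg/ha)."""
    return {
        "partial factor productivity": yield_f / n_rate,
        "agronomic efficiency": (yield_f - yield_0) / n_rate,
        "apparent recovery efficiency": (uptake_f - uptake_0) / n_rate,
        "internal utilization efficiency": yield_f / uptake_f,
        "physiological efficiency": (yield_f - yield_0) / (uptake_f - uptake_0),
    }

# Hypothetical maize plot: 120 kg N/ha applied.
for name, value in nue_indices(9500, 6200, 210, 130, 120).items():
    print(f"{name:>32s}: {value:7.1f}")
```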
Virtual reality-based simulators for spine surgery: a systematic review.
Pfandler, Michael; Lazarovici, Marc; Stefan, Philipp; Wucherer, Patrick; Weigl, Matthias
2017-09-01
Virtual reality (VR)-based simulators offer numerous benefits and are very useful in assessing and training surgical skills. Virtual reality-based simulators are standard in some surgical subspecialties, but their actual use in spinal surgery remains unclear. Currently, only technical reviews of VR-based simulators are available for spinal surgery. Thus, we performed a systematic review that examined the existing research on VR-based simulators in spinal procedures. We also assessed the quality of current studies evaluating VR-based training in spinal surgery. Moreover, we wanted to provide a guide for future studies evaluating VR-based simulators in this field. This is a systematic review of the current scientific literature regarding VR-based simulation in spinal surgery. Five data sources were systematically searched to identify relevant peer-reviewed articles regarding virtual, mixed, or augmented reality-based simulators in spinal surgery. A qualitative data synthesis was performed with particular attention to evaluation approaches and outcomes. Additionally, all included studies were appraised for their quality using the Medical Education Research Study Quality Instrument (MERSQI) tool. The initial review identified 476 abstracts and 63 full texts were then assessed by two reviewers. Finally, 19 studies that examined simulators for the following procedures were selected: pedicle screw placement, vertebroplasty, posterior cervical laminectomy and foraminotomy, lumbar puncture, facet joint injection, and spinal needle insertion and placement. These studies had a low-to-medium methodological quality with a MERSQI mean score of 11.47 out of 18 (standard deviation=1.81). This review described the current state and applications of VR-based simulator training and assessment approaches in spinal procedures. Limitations, strengths, and future advancements of VR-based simulators for training and assessment in spinal surgery were explored. Higher-quality studies with patient-related outcome measures are needed. To establish further adaptation of VR-based simulators in spinal surgery, future evaluations need to improve the study quality, apply long-term study designs, and examine non-technical skills, as well as multidisciplinary team training. Copyright © 2017 Elsevier Inc. All rights reserved.
Sevdalis, Nick; Undre, Shabnam; Henry, Janet; Sydney, Elaine; Koutantji, Mary; Darzi, Ara; Vincent, Charles A
2009-09-01
The recent emergence of the Systems Approach to the safety and quality of surgical care has triggered individual and team skills training modules for surgeons and anaesthetists and relevant observational assessment tools have been developed. To develop an observational tool that captures operating room (OR) nurses' technical skill and can be used for assessment and training. The Imperial College Assessment of Technical Skills for Nurses (ICATS-N) assesses (i) gowning and gloving, (ii) setting up instrumentation, (iii) draping, and (iv) maintaining sterility. Three to five observable behaviours have been identified for each skill and are rated on 1-6 scales. Feasibility and aspects of reliability and validity were assessed in 20 simulation-based crisis management training modules for trainee nurses and doctors, carried out in a Simulated Operating Room. The tool was feasible to use in the context of simulation-based training. Satisfactory reliability (Cronbach alpha) was obtained across trainers' and trainees' scores (analysed jointly and separately). Moreover, trainer nurse's ratings of the four skills correlated positively, thus indicating adequate content validity. Trainer's and trainees' ratings did not correlate. Assessment of OR nurses' technical skill is becoming a training priority. The present evidence suggests that the ICATS-N could be considered for use as an assessment/training tool for junior OR nurses.
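For reference, the internal-consistency statistic reported here, Cronbach's alpha, is straightforward to compute from a raters-by-items score matrix; the sketch below uses made-up ICATS-N-style ratings on the 1-6 scales, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: raters x items matrix. alpha = k/(k-1) *
    (1 - sum(item variances) / variance(total scores))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical ratings of the four ICATS-N skills (1-6) by six raters.
ratings = [[5, 4, 5, 4],
           [6, 5, 5, 5],
           [3, 3, 4, 3],
           [4, 4, 4, 5],
           [5, 5, 6, 5],
           [2, 3, 3, 2]]
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")
```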
Comparison of different phantoms used in digital diagnostic imaging
NASA Astrophysics Data System (ADS)
Bor, Dogan; Unal, Elif; Uslu, Anil
2015-09-01
The extremity, chest, skull, and lumbar regions were physically simulated using uniform PMMA slabs of different thicknesses, alone and together with aluminum plates and air gaps (ANSI phantoms). The variation of entrance surface air kerma and scatter fraction with X-ray beam quality was investigated for these phantoms, and the results were compared with those measured from anthropomorphic phantoms. A flat-panel digital radiographic system was used for all the experiments. Considerable variation in entrance surface air kerma was found between different designs simulating the same organ, with the highest doses measured for the PMMA slabs. A low-contrast test tool and a contrast-detail test object (CDRAD) were used together with each organ simulation of PMMA slabs and ANSI phantoms in order to test clinical image quality. Digital images of these phantom combinations and anthropomorphic phantoms were acquired in raw and clinically processed formats. Variation of image quality with kVp and post-processing was evaluated using the numerical metrics of these test tools and the contrast values measured from the anthropomorphic phantoms. Our results indicated that the design of some phantoms may not be efficient enough to reveal the expected performance of the post-processing algorithms.
NASA Astrophysics Data System (ADS)
Ran, L.; Cooter, E. J.; Gilliam, R. C.; Foroutan, H.; Kang, D.; Appel, W.; Wong, D. C.; Pleim, J. E.; Benson, V.; Pouliot, G.
2017-12-01
The combined meteorology and air quality modeling system composed of the Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model is an important decision support tool used in research and regulatory decisions related to emissions, meteorology, climate, and chemical transport. The Environmental Policy Integrated Climate (EPIC) model is a cropping model which has long been used in a range of applications related to soil erosion, crop productivity, climate change, and water quality around the world. We have integrated WRF/CMAQ with EPIC using the Fertilizer Emission Scenario Tool for CMAQ (FEST-C) to estimate daily soil N information with fertilization for CMAQ bi-directional ammonia flux modeling. Driven by the weather and N deposition from WRF/CMAQ, FEST-C EPIC simulations are conducted for 22 different agricultural production systems, ranging from managed grasslands (e.g. hay and alfalfa) to croplands (e.g. corn grain and soybean) with rainfed and irrigated information, across any defined conterminous United States (U.S.) CMAQ domain and grid resolution. In recent years, this integrated system has been enhanced and applied in many different air quality and ecosystem assessment projects related to land-water-atmosphere interactions. These enhancements have advanced the system into a valuable tool for integrated assessments of air, land, and water quality in light of social drivers and human and ecological outcomes. This presentation will focus on evaluating the sensitivity of precipitation and N deposition in the integrated system to MODIS vegetation input and lightning assimilation, and their impacts on agricultural production and fertilization. We will describe the integrated modeling system and evaluate simulated precipitation and N deposition along with other weather information (e.g. temperature, humidity) for 2011 over the conterminous U.S. at 12 km grids from a coupled WRF/CMAQ with MODIS and lightning assimilation. Simulated agricultural production and fertilization from FEST-C EPIC driven by the changed meteorology and N deposition from MODIS and lightning assimilations will be evaluated and analyzed.
Guidelines for the analysis of free energy calculations
Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.
2015-01-01
Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages and can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
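As a minimal example of one estimator the review covers, thermodynamic integration reduces to integrating the per-window ensemble averages ⟨dH/dλ⟩ over λ, e.g. with the trapezoid rule. The numbers below are illustrative, a real analysis should also assess equilibration and correlation as the review discusses, and this is not the alchemical-analysis.py implementation.

```python
import numpy as np

# Hypothetical <dH/dlambda> averages (kJ/mol) from 11 lambda windows.
lambdas = np.linspace(0.0, 1.0, 11)
dhdl = np.array([42.1, 35.6, 28.9, 22.4, 16.8, 11.9,
                 7.6, 4.1, 1.4, -0.5, -1.8])

# Trapezoid-rule thermodynamic integration: dG = integral of <dH/dl> dl.
dG = np.sum(0.5 * (dhdl[1:] + dhdl[:-1]) * np.diff(lambdas))
print(f"Delta G (TI, trapezoid) = {dG:.1f} kJ/mol")
```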
NASA Astrophysics Data System (ADS)
Farina, Simone; Thepsonti, Thanongsak; Ceretti, Elisabetta; Özel, Tugrul
2011-05-01
Titanium alloys offer superb strength, corrosion resistance, and biocompatibility and are commonly utilized in medical devices and implants. Micro-end milling is a direct and rapid fabrication method for manufacturing medical devices and implants in titanium alloys. Process performance and quality depend upon an understanding of the relationship between cutting parameters, forces, and resultant tool deflections, in order to avoid tool breakage. For this purpose, FE simulations of chip formation during micro-end milling of Ti-6Al-4V alloy with an ultra-fine grain solid carbide two-flute micro-end mill are investigated using DEFORM software. First, specific forces in the tangential and radial directions of cutting during micro-end milling for varying feed advance and rotational speeds are determined using designed FE simulations of the chip formation process. These forces are then applied to the micro-end mill geometry along the axial depth of cut in a 3D analysis in ABAQUS. Consequently, 3D distributions of tool deflection and von Mises stress are determined. These analyses will help establish integrated multi-physics process models for high-performance micro-end milling and enable a leap forward in process improvement.
Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A
2014-01-01
The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurologic Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.
[Quality assurance of a virtual simulation software: application to IMAgo and SIMAgo (ISOgray)].
Isambert, A; Beaudré, A; Ferreira, I; Lefkopoulos, D
2007-06-01
Virtual simulation is often used to prepare three-dimensional conformal radiation therapy treatments. As the quality of the treatment depends heavily on this step, it is mandatory to perform extensive checks on this software before clinical use. The tests presented in this work were carried out on the treatment planning system ISOgray (DOSIsoft), including the delineation module IMAgo and the virtual simulation module SIMAgo. Based on our experience, the most relevant checks from international protocols were selected. These tests mainly focused on measuring and delineation tools and virtual simulation functionalities, and were performed with three phantoms: the Quasar Multi-Purpose Body Phantom, the Quasar MLC Beam Geometry Phantom (Modus Medical Devices Inc.), and a phantom developed at Hospital Tenon. No major issues were identified while performing the tests. These checks emphasized the necessity for the user to consider the results displayed by virtual simulation software with a critical eye: the visualisation contrast, the slice thickness, and the calculation and display mode of 3D structures used by the software are all sources of uncertainty. A virtual simulation software quality assurance procedure was written and applied to a set of CT images. Similar tests have to be performed periodically, and at a minimum at each major version change.
Modeling and performance assessment in QinetiQ of EO and IR airborne reconnaissance systems
NASA Astrophysics Data System (ADS)
Williams, John W.; Potter, Gary E.
2002-11-01
QinetiQ are the technical authority responsible for specifying the performance requirements for the procurement of airborne reconnaissance systems, on behalf of the UK MoD. They are also responsible for acceptance of delivered systems, overseeing and verifying the installed system performance as predicted and then assessed by the contractor. Measures of functional capability are central to these activities. The conduct of these activities utilises the broad technical insight and wide range of analysis tools and models available within QinetiQ. This paper focuses on the tools, methods and models that are applicable to systems based on EO and IR sensors. The tools, methods and models are described, and representative output for systems that QinetiQ has been responsible for is presented. The principal capability applicable to EO and IR airborne reconnaissance systems is the STAR (Simulation Tools for Airborne Reconnaissance) suite of models. STAR generates predictions of performance measures such as GRD (Ground Resolved Distance) and GIQE (General Image Quality Equation) NIIRS (National Imagery Interpretability Rating Scale). It also generates images representing sensor output, using the scene generation software CAMEO-SIM and the imaging sensor model EMERALD. The simulated image 'quality' is fully correlated with the predicted non-imaging performance measures. STAR also generates image and table data that is compliant with STANAG 7023, which may be used to test ground station functionality.
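For orientation, the GIQE maps sensor and image parameters to a predicted NIIRS level. The sketch below implements the commonly published GIQE version 4 form; the coefficients are quoted from the open literature, the example inputs are invented, and this is a sketch rather than the STAR implementation.

```python
import math

def giqe4_niirs(gsd_in, rer, h, g, snr):
    """GIQE v4 NIIRS estimate: gsd in inches; rer/h are geometric-mean
    relative edge response and edge overshoot; g is post-processing noise
    gain; snr is signal-to-noise ratio. Coefficients as commonly published."""
    a, b = (3.32, 1.559) if rer >= 0.9 else (3.16, 2.817)
    return (10.251 - a * math.log10(gsd_in) + b * math.log10(rer)
            - 0.656 * h - 0.344 * g / snr)

# Invented example: 6-inch GSD, well-corrected system.
print(f"predicted NIIRS ~ {giqe4_niirs(6.0, 0.95, 1.2, 10.0, 50.0):.1f}")
```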
Impact of Parameter Uncertainty Assessment of Critical SWAT Output Simulations
USDA-ARS?s Scientific Manuscript database
Watershed models are increasingly being utilized to evaluate alternate management scenarios for improving water quality. The concern for using these tools in extensive programs such as the National Total Maximum Daily Load (TMDL) program is that the certainty of model results and efficacy of managem...
DOT National Transportation Integrated Search
2014-06-01
In June 2012, the Environmental Protection Agency (EPA) released the Operating Mode Distribution Generator (OMDG), a tool for developing an operating mode distribution as an input to the Motor Vehicle Emissions Simulator model (MOVES). The t...
[Development of APSIM (agricultural production systems simulator) and its application].
Shen, Yuying; Nan, Zhibiao; Bellotti, Bill; Robertson, Michael; Chen, Wen; Shao, Xinqing
2002-08-01
Soil-crop simulation models are effective tools for supporting decisions on agricultural management. APSIM (Agricultural Production Systems Simulator) was developed to simulate biophysical processes in farming systems, in particular the economic and ecological outcomes of those systems under climatic risk. The current literature shows that APSIM can be applied across a wide range of zones, including temperate continental, temperate maritime, subtropical, arid, and Mediterranean climates, and on soil types including clay, duplex soils, vertisols, sandy silt, silt loam, and silty clay loam. More than 20 crops have been simulated well. APSIM is powerful in describing crop structure and crop sequences, predicting yield, controlling quality, and estimating erosion under different planting patterns.
Hydrologic and water quality sensitivity to climate and land ...
Background: Projected changes in climate during the next century could cause or contribute to increased flooding, drought, water quality degradation, and ecosystem impairment. The effects of climate change in different watersheds will vary due to regional differences in climate change, physiographic setting, and interaction with land-use, pollutant sources, and water management in different locations. EPA is conducting watershed modeling to develop hydrologic and water quality change scenarios for 20 relatively large U.S. watersheds. Watershed modeling will be conducted using the Hydrologic Simulation Program-FORTRAN (HSPF) and Soil and Water Assessment Tool (SWAT) watershed models. Study areas range from about 10,000-15,000 square miles in size, and will cover nearly every ecoregion in the United States and a range of hydro-climatic conditions. A range of hydrologic and water quality endpoints will be determined for each watershed simulation. Endpoints will be selected to inform upon a range of stream flow, water quality, aquatic ecosystem, and EPA program management goals and targets. Model simulations will be conducted to evaluate a range of projected future (2040-2070) changes in climate and land-use. Simulations will include baseline conditions,
Design and simulation of EVA tools for first servicing mission of HST
NASA Technical Reports Server (NTRS)
Naik, Dipak; Dehoff, P. H.
1993-01-01
The Hubble Space Telescope (HST) was launched into near-earth orbit by the space shuttle Discovery on April 24, 1990. The payload of two cameras, two spectrographs, and a high-speed photometer is supplemented by three fine-guidance sensors that can be used for astronomy as well as for star tracking. A widely reported spherical aberration in the primary mirror causes HST to produce images of much lower quality than intended. A space shuttle repair mission in late 1993 will install small corrective mirrors that will restore the full intended optical capability of the HST. The first servicing mission (FSM) will involve considerable extravehicular activity (EVA), and it is proposed to design special EVA tools for it. This report includes details of the data acquisition system being developed to test the performance of the various EVA tools in ambient as well as simulated space environments.
Computational homogenisation for thermoviscoplasticity: application to thermally sprayed coatings
NASA Astrophysics Data System (ADS)
Berthelsen, Rolf; Denzer, Ralf; Oppermann, Philip; Menzel, Andreas
2017-11-01
Metal forming processes require wear-resistant tool surfaces in order to ensure a long life cycle of the expensive tools together with a consistently high quality of the produced components. Thermal spraying is a relatively widely applied technique for depositing wear-protection coatings. During these coating processes, heterogeneous coatings are deposited at high temperatures and then quenched, producing residual stresses that strongly influence the performance of the coated tools. The objective of this article is to discuss and apply a thermo-mechanically coupled simulation framework which captures the heterogeneity of the deposited coating material. Therefore, a two-scale finite element framework for the solution of nonlinear thermo-mechanically coupled problems is elaborated and applied to the simulation of thermoviscoplastic material behaviour, including nonlinear thermal softening, in a geometrically linearised setting. The finite element framework and material model are demonstrated by means of numerical examples.
Surface Roughness Model Based on Force Sensors for the Prediction of the Tool Wear
de Agustina, Beatriz; Rubio, Eva María; Sebastián, Miguel Ángel
2014-01-01
In this study, a methodology has been developed with the objective of evaluating the surface roughness obtained during turning processes by measuring the signals detected by a force sensor under the same cutting conditions. In this way, the surface quality achieved along the process is correlated to several parameters of the cutting forces (thrust forces, feed forces and cutting forces), so the effect that the tool wear causes on the surface roughness is evaluated. In a first step, the best cutting conditions (cutting parameters and radius of tool) for a certain quality surface requirement were found for pieces of UNS A97075. Next, with this selection a model of surface roughness based on the cutting forces was developed for different states of wear that simulate the behaviour of the tool throughout its life. The validation of this model reveals that it was effective for approximately 70% of the surface roughness values obtained. PMID:24714391
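To make the idea concrete, a model of this kind can be sketched as a least-squares fit of roughness against a force feature, with a tolerance band playing the role of the roughly 70% effectiveness check; the data below are hypothetical, not the study's measurements.

```python
import numpy as np

# Hypothetical calibration data: feed-force amplitude (N) vs measured
# surface roughness Ra (um) under fixed cutting conditions.
feed_force = np.array([120.0, 135.0, 150.0, 170.0, 195.0, 225.0])
ra_measured = np.array([0.82, 0.88, 0.97, 1.10, 1.31, 1.58])

# Quadratic least-squares model Ra = a*F^2 + b*F + c.
coeffs = np.polyfit(feed_force, ra_measured, deg=2)
ra_model = np.polyval(coeffs, feed_force)

# Fraction of predictions within a +/-10% band around the measurements.
within = np.abs(ra_model - ra_measured) / ra_measured <= 0.10
print(f"coefficients: {coeffs}")
print(f"fraction within 10%: {within.mean():.0%}")
```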
There is a need to develop modeling and data analysis tools to increase our understanding of human exposures to air pollutants beyond what can be explained by "limited" field data. Modeling simulations of complex distributions of pollutant concentrations within roadw...
Evaluation of satellite-based, modeled-derived daily solar radiation data for the continental U.S.
USDA-ARS?s Scientific Manuscript database
Many applications of simulation models and related decision support tools for agriculture and natural resource management require daily meteorological data as inputs. Availability and quality of such data, however, often constrain research and decision support activities that require use of these to...
Triple Value System Dynamics Modeling to Help Stakeholders Engage with Food-Energy-Water Problems
Triple Value (3V) Community scoping projects and Triple Value Simulation (3VS) models help decision makers and stakeholders apply systems-analysis methodology to complex problems related to food production, water quality, and energy use. 3VS models are decision support tools that...
DOT National Transportation Integrated Search
2013-01-01
Transportation corridors are vital for our region's and even the nation's economy and quality of life. A corridor is usually a complicated system that may span multiple jurisdictions, contain multiple modes, include both freeways and local arterials, a...
Development of cropland management dataset to support U.S. SWAT assessments
USDA-ARS?s Scientific Manuscript database
The Soil and Water Assessment Tool (SWAT) is a widely used hydrologic/water quality simulation model in the U.S. Process-based models like SWAT require a great deal of data to accurately represent the natural world, including topography, landuse, soils, weather, and management. With the exception ...
A Design Tool for Liquid Rocket Engine Injectors
NASA Technical Reports Server (NTRS)
Farmer, R.; Cheng, G.; Trinh, H.; Tucker, K.
2000-01-01
A practical design tool which emphasizes the analysis of flowfields near the injector face of liquid rocket engines has been developed and used to simulate preliminary configurations of NASA's Fastrac and vortex engines. This computational design tool is sufficiently detailed to predict the interactive effects of injector element impingement angles and points, the momenta of the individual orifice flows, and the combusting flow which results. In order to simulate a significant number of individual orifices, a homogeneous computational fluid dynamics model was developed. To describe sub- and supercritical liquid and vapor flows, the model utilized thermal and caloric equations of state which were valid over a wide range of pressures and temperatures. The model was constructed such that the local quality of the flow was determined directly. Since both the Fastrac and vortex engines utilize RP-1/LOX propellants, a simplified hydrocarbon combustion model was devised in order to accomplish three-dimensional, multiphase flow simulations. Such a model does not identify drops or their distribution, but it does allow the recirculating flow along the injector face and into the acoustic cavity, as well as the film coolant flow, to be accurately predicted.
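Because the model is homogeneous, the local quality (vapor mass fraction) follows directly from the mixture enthalpy and the saturation enthalpies at the local pressure. A sketch of that relation, with enthalpy values invented for illustration:

```python
def flow_quality(h, h_liq, h_vap):
    """Quality (vapor mass fraction) of a homogeneous two-phase mixture
    from mixture enthalpy h and saturated liquid/vapor enthalpies at the
    local pressure, clamped to [0, 1]."""
    x = (h - h_liq) / (h_vap - h_liq)
    return min(max(x, 0.0), 1.0)

# Invented enthalpies (kJ/kg) at one pressure node:
print(f"local quality = {flow_quality(h=310.0, h_liq=120.0, h_vap=620.0):.2f}")
```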
Al-Bustani, Saif; Halvorson, Eric G
2016-06-01
Various simulation models for microsurgery have been developed to overcome the limitations of Halstedian training on real patients. We wanted to assess the status of microsurgery simulation in plastic surgery residency programs in the United States. Data were analyzed from responses to a survey sent to all plastic surgery program directors in the United States, asking for type of simulation, quality of facilities, utilization by trainees, evaluation of trainee sessions, and perception of the relevance of simulation. The survey response rate was 50%. Of all programs, 69% provide microsurgical simulation; 75% of these have a laboratory with a microscope, and 52% provide live animal models. Half share facilities with other departments. The quality of facilities is rated as good or great in 89%. Trainee utilization is once every 3 to 6 months in 82% of programs; only in 11% is utilization monthly. Formal evaluation of simulation sessions is provided by 41% of programs. All program directors agree simulation is relevant to competence in microsurgery, 60% agree simulation should be mandatory, and 43% require trainees to complete a formal microsurgery course prior to live surgery. There seems to be consensus that microsurgical simulation improves competence, and the majority of program directors agree it should be mandatory. Developing and implementing standardized simulation modules and assessment tools for trainees across the nation, as part of a comprehensive competency-based training program for microsurgery, is an important patient safety initiative that should be considered. Organizing with other departments to share facilities may improve their quality and hence utilization.
Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning
NASA Astrophysics Data System (ADS)
Thomas, S. M.; Su, Y. C.; Hummel, P. R.
2016-12-01
Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization is projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools to conduct quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to examine water quality issues on a more refined temporal and spatial scale than the limited monitoring data allowed. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) the SARA Load Reduction Tool to determine load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) the SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.
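As an illustration of the load-reduction arithmetic a tool like the SARA Load Reduction Tool automates, a minimal sketch follows; the function, sub-basin names, loads, and screening levels are hypothetical, not SARA's actual code or data.

```python
# Hypothetical sketch: given a modeled annual load and a screening-level
# target for each sub-basin, compute the percent reduction needed.
# All names and numbers are illustrative.

def required_reduction(modeled_load, screening_load):
    """Fraction of the modeled load that must be removed to meet the target."""
    if modeled_load <= screening_load:
        return 0.0
    return 1.0 - screening_load / modeled_load

# Example: sub-basin loads vs. screening levels (kg/yr) for one constituent.
subbasins = {"SB-01": (1200.0, 900.0), "SB-02": (450.0, 500.0)}
for name, (load, target) in subbasins.items():
    print(name, f"{required_reduction(load, target):.0%} reduction needed")
```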
Santhi, C; Kannan, N; White, M; Di Luzio, M; Arnold, J G; Wang, X; Williams, J R
2014-01-01
The USDA initiated the Conservation Effects Assessment Project (CEAP) to quantify the environmental benefits of conservation practices at regional and national scales. For this assessment, a sampling and modeling approach is used. This paper provides a technical overview of the modeling approach used in the CEAP cropland assessment to estimate the off-site water quality benefits of conservation practices using the Ohio River Basin (ORB) as an example. The modeling approach uses a farm-scale model, the Agricultural Policy Environmental Extender (APEX), and a watershed-scale model (the Soil and Water Assessment Tool [SWAT]) and databases in the Hydrologic Unit Modeling for the United States system. Databases of land use, soils, land use management, topography, weather, point sources, and atmospheric depositions were developed to derive model inputs. APEX simulates the cultivated cropland, Conservation Reserve Program land, and the practices implemented on them, whereas SWAT simulates the noncultivated land (e.g., pasture, range, urban, and forest) and point sources. Simulation results from APEX are input into SWAT. SWAT routes all sources, including APEX's, to the basin outlet through each eight-digit watershed. Each basin is calibrated for stream flow, sediment, and nutrient loads at multiple gaging sites and then used to simulate the effects of conservation practice scenarios on water quality. Results indicate that sediment, nitrogen, and phosphorus loads delivered to the Mississippi River from the ORB could be reduced by 16, 15, and 23%, respectively, due to current conservation practices. Modeling tools are useful to provide science-based information for assessing existing conservation programs, developing future programs, and developing insights on the load reductions necessary to address hypoxia in the Gulf of Mexico.
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2016-12-01
Modeling of fate and transport of fecal bacteria in a watershed is generally a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered as the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria (E. coli) source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. Major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep, and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads were input to the SWAT model in order to simulate the transport through the land and in-stream conditions. The calibrated SWAT model was then used to estimate in-stream indicator bacteria concentrations for future years based on H-GAC's regional land use, population, and household projections (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.
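A minimal sketch of the SELECT-style potential-load calculation described above, in which the daily E. coli load is the animal or source count times a per-head production rate; the rates and counts below are illustrative placeholders, not the study's values.

```python
# Potential daily E. coli load as count * per-head production rate.
# Rates are placeholders for illustration only.

ECOLI_PER_HEAD_PER_DAY = {  # cfu/head/day, illustrative
    "cattle": 1.0e10,
    "horses": 4.2e8,
    "sheep_goats": 1.2e10,
    "feral_hogs": 4.5e9,
    "deer": 3.5e8,
    "dogs": 2.5e9,
}

def potential_load(counts):
    """Sum potential daily E. coli load (cfu/day) over all source categories."""
    return sum(counts[k] * ECOLI_PER_HEAD_PER_DAY[k] for k in counts)

total = potential_load({"cattle": 500, "deer": 200, "dogs": 150})
print(f"{total:.2e} cfu/day")
```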
NASA Astrophysics Data System (ADS)
Ranatunga, T.
2017-12-01
Modeling of fate and transport of fecal bacteria in a watershed is a process-based approach that considers releases from manure, point sources, and septic systems. Overland transport with water and sediments, infiltration into soils, transport in the vadose zone and groundwater, die-off and growth processes, and in-stream transport are considered as the other major processes in bacteria simulation. This presentation will discuss a simulation of fecal indicator bacteria source loading and in-stream conditions of a non-tidal watershed (Cedar Bayou Watershed) in South Central Texas using two models: the Spatially Explicit Load Enrichment Calculation Tool (SELECT) and the Soil and Water Assessment Tool (SWAT). Furthermore, it will discuss a probable approach to bacteria source load reduction in order to meet the water quality standards in the streams. The selected watershed is listed by the Texas Commission on Environmental Quality (TCEQ) as having levels of fecal indicator bacteria that pose a risk for contact recreation and wading. The SELECT modeling approach was used to estimate the bacteria source loading from land categories. Major bacteria sources considered were failing septic systems, discharges from wastewater treatment facilities, excreta from livestock (cattle, horses, sheep, and goats), excreta from wildlife (feral hogs and deer), pet waste (mainly from dogs), and runoff from urban surfaces. The estimated source loads from the SELECT model were input to the SWAT model to simulate bacteria transport through the land and in-stream. The calibrated SWAT model was then used to estimate in-stream indicator bacteria concentrations for future years based on regional land use, population, and household forecasts (up to 2040). Based on the in-stream reductions required to meet the water quality standards, the corresponding required source load reductions were estimated.
The expected results method for data verification
NASA Astrophysics Data System (ADS)
Monday, Paul
2016-05-01
The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process to load the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.
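A minimal sketch of the expected-results idea, under the assumption that verification reduces to comparing observed model measures against authoritative expectations within a stated tolerance; the benchmark names, values, and tolerance are invented for illustration.

```python
# Compare observed benchmark measures to authoritative expectations within a
# relative tolerance, instead of a subjective "it looks about right" check.

def verify(observed, expected, rel_tol=0.05):
    """Return (case, verdict, observed, expected) for each benchmark case."""
    results = []
    for case, exp in expected.items():
        obs = observed[case]
        ok = abs(obs - exp) <= rel_tol * abs(exp)
        results.append((case, "PASS" if ok else "FAIL", obs, exp))
    return results

expected = {"p_hit_2km": 0.62, "p_kill_given_hit": 0.40}  # invented
observed = {"p_hit_2km": 0.60, "p_kill_given_hit": 0.47}  # invented
for row in verify(observed, expected):
    print(*row)
```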
Assessment of coastal management options by means of multilayered ecosystem models
NASA Astrophysics Data System (ADS)
Nobre, Ana M.; Ferreira, João G.; Nunes, João P.; Yan, Xiaojun; Bricker, Suzanne; Corner, Richard; Groom, Steve; Gu, Haifeng; Hawkins, Anthony J. S.; Hutson, Rory; Lan, Dongzhao; Silva, João D. Lencart e.; Pascoe, Philip; Telfer, Trevor; Zhang, Xuelei; Zhu, Mingyuan
2010-03-01
This paper presents a multilayered ecosystem modelling approach that combines the simulation of the biogeochemistry of a coastal ecosystem with the simulation of the main forcing functions, such as catchment loading and aquaculture activities. This approach was developed as a tool for sustainable management of coastal ecosystems. A key feature is to simulate management scenarios that account for changes in multiple uses and enable assessment of cumulative impacts of coastal activities. The model was applied to a coastal zone in China with large aquaculture production and multiple catchment uses, and where management efforts to improve water quality are under way. Development scenarios designed in conjunction with local managers and aquaculture producers include the reduction of fish cages and treatment of wastewater. Despite the reduction in nutrient loading simulated in three different scenarios, inorganic nutrient concentrations in the bay were predicted to exceed the thresholds for poor quality defined by Chinese seawater quality legislation. For all scenarios there is still a Moderate High to High nutrient loading from the catchment, so further reductions might be enacted, together with additional decreases in fish cage culture. The model predicts that overall, shellfish production decreases by 10%-28% under any of these development scenarios, principally because shellfish growth is sustained by the very substances that must be reduced to improve water quality. The model outcomes indicate that this may be counteracted by zoning of shellfish aquaculture at the ecosystem level in order to optimize trade-offs between productivity and environmental effects. The present case study exemplifies the value of multilayered ecosystem modelling as a tool for Integrated Coastal Zone Management and for the adoption of ecosystem approaches for marine resource management. This modelling approach can be applied worldwide, and may be particularly useful for the application of coastal management regulation, for instance in the implementation of the European Marine Strategy Framework Directive.
Innovative Tools for Water Quality/Quantity Management: New York City's Operations Support Tool
NASA Astrophysics Data System (ADS)
Wang, L.; Schaake, J. C.; Day, G. N.; Porter, J.; Sheer, D. P.; Pyke, G.
2011-12-01
The New York City Department of Environmental Protection (DEP) manages New York City's water supply, which is comprised of over 20 reservoirs and supplies more than 1 billion gallons of water per day to over 9 million customers. Recently, DEP has initiated design of an Operations Support Tool (OST), a state-of-the-art decision support system to provide computational and predictive support for water supply operations and planning. This presentation describes the technical structure of OST, including the underlying water supply and water quality models, data sources and database management, reservoir inflow forecasts, and the functionalities required to meet the needs of a diverse group of end users. OST is a major upgrade of DEP's current water supply and water quality model, developed to evaluate alternatives for controlling turbidity in NYC's Catskill reservoirs. While the current model relies on historical hydrologic and meteorological data, OST can be driven by forecasted future conditions. It will receive a variety of near-real-time data from a number of sources. OST will support two major types of simulations: long-term, for evaluating policy or infrastructure changes over an extended period of time; and short-term "position analysis" (PA) simulations, consisting of multiple short simulations, all starting from the same initial conditions. Typically, the starting conditions for a PA run will represent those for the current day, and traces of forecasted hydrology will drive the model for the duration of the simulation period. The result of these simulations will be a distribution of future system states based on system operating rules and the range of input ensemble streamflow predictions. DEP managers will analyze the output distributions and make operation decisions using risk-based metrics such as probability of refill. Currently, in the developmental stages of OST, forecasts are based on antecedent hydrologic conditions and are statistical in nature. The statistical algorithm is relatively simple and versatile, but lacks the short-term skill critical for water quality and spill management. To improve short-term skill, OST will ultimately operate with meteorologically driven hydrologic forecasts provided by the National Weather Service (NWS). OST functionalities will support a wide range of DEP uses, including short-term operational projections, outage planning and emergency management, operating rule development, and water supply planning. A core use of OST will be to inform reservoir management strategies to control and mitigate turbidity events while ensuring water supply reliability. OST will also allow DEP to manage its complex reservoir system to meet multiple objectives, including ecological flows, tailwater fisheries and recreational releases, and peak flow mitigation for downstream communities.
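A toy sketch of the position-analysis idea described above: many forecast traces are started from the same initial storage, simulated forward, and summarized as a risk-based metric such as probability of refill. The trivial mass balance and every number below are invented; OST's actual models and forecasts are far more detailed.

```python
# Monte Carlo position analysis: distribution of end-of-horizon storage
# states from ensemble inflow traces, reported as P(refill).
import random

def prob_of_refill(start_storage, capacity, demand, n_traces=1000, days=120):
    refills = 0
    for _ in range(n_traces):
        s = start_storage
        for _ in range(days):
            inflow = random.lognormvariate(6.0, 0.8)  # synthetic daily inflow
            s = min(capacity, max(0.0, s + inflow - demand))
        if s >= capacity * 0.99:
            refills += 1
    return refills / n_traces

print(f"P(refill) ~ {prob_of_refill(7.0e4, 1.0e5, 300.0):.2f}")
```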
Hvitfeldt-Forsberg, Helena; Mazzocato, Pamela; Glaser, Daniel; Keller, Christina; Unbeck, Maria
2017-06-06
To explore healthcare staff's and managers' perceptions of how and when discrete event simulation modelling can be used as a decision support in improvement efforts. Two focus group discussions were performed. Two settings were included: a rheumatology department and an orthopaedic section, both situated in Sweden. Healthcare staff and managers (n=13) from the two settings. Two workshops were performed, one at each setting. Workshops were initiated by a short introduction to simulation modelling. Results from the respective simulation model were then presented and discussed in the following focus group discussion. Categories from the content analysis are presented according to the following research questions: how and when can simulation modelling assist healthcare improvement? Regarding how, the participants mentioned that simulation modelling could act as a tool for support and a way to visualise problems, potential solutions and their effects. Regarding when, simulation modelling could be used both locally and by management, as well as a pedagogical tool to develop and test innovative ideas and to involve everyone in the improvement work. Its potential as an information and communication tool and as an instrument for pedagogic work within healthcare improvement render a broader application and value of simulation modelling than previously reported.
Guidelines for the analysis of free energy calculations.
Klimovich, Pavel V; Shirts, Michael R; Mobley, David L
2015-05-01
Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
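For readers unfamiliar with the estimators mentioned, a minimal thermodynamic-integration example in NumPy follows; this is not the authors' tool, and the dU/dlambda averages are synthetic stand-ins for values extracted from a simulation package's output.

```python
# Thermodynamic integration: integrate <dU/dlambda> over lambda with the
# trapezoid rule to estimate the free energy change.
import numpy as np

lambdas = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
dudl = np.array([12.3, 8.1, 3.9, 0.7, -2.2])  # kcal/mol, synthetic averages

delta_g = np.trapz(dudl, lambdas)  # TI estimate of Delta G
print(f"Delta G (TI, trapezoid) = {delta_g:.2f} kcal/mol")
```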
Le Lous, M; De Chanaud, N; Bourret, A; Senat, M V; Colmant, C; Jaury, P; Tesnière, A; Tsatsaris, V
2017-01-01
Ultrasonography (US) is an essential tool for the diagnosis of acute gynecological conditions. General practice (GP) residents are involved in the first-line management of gynecologic emergencies, yet they are not familiar with US equipment. Initial training on simulators was therefore conducted. The aim of this study was to evaluate the impact of simulation-based training on the quality of the sonographic images achieved by GP residents 2 months after the simulation training versus clinical training alone. Young GP residents assigned to emergency gynecology departments were invited to a one-day simulation-based US training session. A prospective controlled trial aiming to assess the impact of such training on TVS (transvaginal ultrasound scan) image quality was conducted. The first group included GP residents who attended the simulation training course. The second group included GP residents who did not attend the course. Written consent to participate was obtained from all participants. Images achieved 2 months after the training were scored using standardized quality criteria and compared in both groups. The stress generated by this examination was also assessed with a simple numeric scale. A total of 137 residents attended the simulation training; 26 consented to participate in the controlled trial. Sonographic image quality was significantly better in the simulation group for the sagittal view of the uterus (3.6 vs 2.7, p = 0.01), for the longitudinal view of the right ovary (2.8 vs 1.4, p = 0.027), and for the Morrison space (1.7 vs 0.4, p = 0.034), but the difference was not significant for the left ovary (2.9 vs 1.7, p = 0.189). The stress generated by TVS after 2 months was not different between the groups (6.0 vs 4.8, p = 0.4). Simulation-based training improved the quality of pelvic US images in GP residents assessed after 2 months of experience in gynecology compared to clinical training alone.
Browning, J. R.; Jonkman, J.; Robertson, A.; ...
2014-12-16
High-quality computer simulations are required when designing floating wind turbines because of the complex dynamic responses that are inherent with a high number of degrees of freedom and variable metocean conditions. In 2007, the FAST wind turbine simulation tool, developed and maintained by the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL), was expanded to include capabilities that are suitable for modeling floating offshore wind turbines. In an effort to validate FAST and other offshore wind energy modeling tools, DOE funded the DeepCwind project that tested three prototype floating wind turbines at 1/50th scale in a wave basin, including a semisubmersible, a tension-leg platform, and a spar buoy. This paper describes the use of the results of the spar wave basin tests to calibrate and validate the FAST offshore floating simulation tool, and presents some initial results of simulated dynamic responses of the spar to several combinations of wind and sea states. Wave basin tests with the spar attached to a scale model of the NREL 5-megawatt reference wind turbine were performed at the Maritime Research Institute Netherlands under the DeepCwind project. This project included free-decay tests, tests with steady or turbulent wind and still water, wave-only tests (both periodic and irregular waves with no wind), and combined wind/wave tests. The resulting data from the 1/50th-scale model were scaled to full size using Froude scaling and used to calibrate and validate a full-size simulated model in FAST. Results of the model calibration and validation include successes, subtleties, and limitations of both wave basin testing and FAST modeling capabilities.
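The Froude scaling mentioned above follows standard similitude relations; a small sketch, assuming the same fluid density at both scales, so that lengths scale by the geometric factor L, times and velocities by sqrt(L), and forces by L cubed. This is the textbook relation, not project-specific code.

```python
# Convert 1/50th-scale basin measurements to full scale via Froude scaling.
SCALE = 50.0

def to_full_scale(length_m=None, time_s=None, force_N=None):
    """Apply Froude similitude: length ~ L, time ~ sqrt(L), force ~ L**3."""
    out = {}
    if length_m is not None:
        out["length_m"] = length_m * SCALE
    if time_s is not None:
        out["time_s"] = time_s * SCALE ** 0.5
    if force_N is not None:
        out["force_N"] = force_N * SCALE ** 3
    return out

print(to_full_scale(length_m=2.0, time_s=1.4, force_N=3.2))
```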
Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena
2018-01-01
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.
NASA Astrophysics Data System (ADS)
Parsa, M. H.; Davari, H.; Hadian, A. M.; Ahmadabadi, M. Nili
2007-05-01
Hybrid Rotary Friction Welding is a modified type of common rotary friction welding process. In this welding method, parameters such as pressure, angular velocity, and welding time control the temperature, stress, and strain and their variations. These dependent factors play an important role in defining optimum combinations of process parameters in order to improve the design and manufacturing of welding machines and the quality of welded parts. Thermo-mechanical simulation of friction welding has been carried out, and it has been shown that simulation is an important tool for predicting the heat and strain generated at the weld interface and can be used for predicting microstructure and evaluating weld quality. For the simulation of Hybrid Rotary Friction Welding, a commercial finite element program was used, and the effects of pressure and angular velocity of the rotating part on temperature and strain variations were investigated.
Design and simulation of a sensor for heliostat field closed loop control
NASA Astrophysics Data System (ADS)
Collins, Mike; Potter, Daniel; Burton, Alex
2017-06-01
Significant research has been completed in pursuit of capital cost reductions for heliostats [1],[2]. The camera array closed loop control concept has the potential to radically alter the way heliostats are controlled and installed by replacing high-quality open loop targeting systems with low-quality targeting devices that rely on measurement of image position to remove tracking errors during operation. Although the system could be used for any heliostat size, it particularly benefits small heliostats by reducing actuation costs, enabling large numbers of heliostats to be calibrated simultaneously, and enabling calibration of heliostats that produce low irradiance on Lambertian calibration targets (images similar to or dimmer than ambient light), such as small heliostats that are far from the tower. A simulation method for the camera array has been designed and verified experimentally. The simulation tool demonstrates that closed loop calibration or control is possible using this device.
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.
2013-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features, and discuss important benefits of this approach for Amanzi. In addition, we'll show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
Modeling sedimentation-filtration basins for urban watersheds using Soil and Water Assessment Tool
USDA-ARS?s Scientific Manuscript database
Sedimentation-filtration (SedFil) basins are one of the storm-water best management practices (BMPs) that are intended to mitigate water quality problems in urban creeks and rivers. A new physically based model of variably saturated flows was developed for simulating flow and sediment in SedFils wi...
Pathogen Transport and Fate Modeling in the Upper Salem River Watershed using SWAT Model
SWAT (Soil and Water Assessment Tool) is a dynamic watershed model that is applied to simulate the impact of land management practices on water quality over a continuous period. The Upper Salem River, located in Salem County New Jersey, is listed by the New Jersey Department of ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malczynski, Leonard A.
This guide addresses software quality in the construction of Powersim® Studio 8 system dynamics simulation models. It is the result of almost ten years of experience with the Powersim suite of system dynamics modeling tools (Constructor and earlier Studio versions). It is a guide that proposes a common look and feel for the construction of Powersim Studio system dynamics models.
USDA-ARS?s Scientific Manuscript database
The temptation to include model parameters and high-resolution input data, together with the availability of powerful optimization and uncertainty analysis algorithms, has significantly enhanced the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...
Evaluation of the RWEQ and SWEEP in simulating soil and PM10 loss from a portable wind tunnel
USDA-ARS?s Scientific Manuscript database
Wind erosion threatens sustainable agriculture and environmental quality in the Columbia Plateau region of the US Pacific Northwest. Wind erosion models such as Wind Erosion Prediction System (WEPS) and the Revised Wind Erosion Equation (RWEQ) have been developed as tools for identifying practices t...
Projected 2050 Model Simulations for the Chesapeake Bay ...
The Chesapeake Bay Program has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the Watershed in concurrence with land use changes. EPA's Office of Research and Development will be conducting historic and future (2050) Weather Research and Forecasting (WRF) meteorological and Community Multiscale Air Quality (CMAQ) chemical transport model simulations to provide meteorological and nutrient deposition estimates for inclusion in the Chesapeake Bay Program's assessment of how climate and land use change may impact water quality and ecosystem health. This presentation will present the timeline and research updates.
A Five-Year CMAQ Model Performance for Wildfires and ...
Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires, are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a 5-year set of CMAQ model simulations (2008-2012) in which we have simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and ozone contribution from prescribed fires and wildfires.
Performance Analysis of IIUM Wireless Campus Network
NASA Astrophysics Data System (ADS)
Abd Latif, Suhaimi; Masud, Mosharrof H.; Anwar, Farhat
2013-12-01
International Islamic University Malaysia (IIUM) is one of the leading universities in the world in terms of quality of education, providing numerous facilities, including wireless services, to every enrolled student. The quality of this wireless service is controlled and monitored by the Information Technology Division (ITD), an ISO-standardized organization within the university. This paper aims to investigate the constraints of the IIUM wireless campus network. It evaluates the performance of the network in terms of delay, throughput, and jitter. The QualNet 5.2 simulator tool was employed to measure these aspects of the network's performance. The observations from the simulation results could inform ITD's efforts to further improve wireless services.
Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E.; Kaminski, Clemens F.; Szabó, Gábor; Erdélyi, Miklós
2014-01-01
Localization-based super-resolution microscopy image quality depends on several factors, such as dye choice and labeling strategy, microscope quality, user-defined parameters such as frame rate and frame number, and the image processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive, so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye and acquisition parameters. Example results are shown and the results of the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software developments.
AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool
Halford, Keith
2009-01-01
Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.
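A sketch of the conventional flow-log interpretation that AnalyzeHOLE improves upon, assuming hydraulic conductivity is apportioned in proportion to the flow gained across each interval; the function and numbers are illustrative, and this proportionality is exactly the assumption the abstract notes can fail when well construction distorts the flow field.

```python
# Conventional flow-log interpretation: apportion transmissivity by the
# fraction of total flow gained across each depth interval, then divide by
# interval thickness to get an apparent K. Illustrative only.

def conventional_K(depths, flows, total_T):
    """Return (top, bottom, apparent K) per interval from cumulative flows."""
    total_flow = flows[-1] - flows[0]
    profile = []
    for i in range(1, len(depths)):
        gain = flows[i] - flows[i - 1]
        thickness = depths[i] - depths[i - 1]
        profile.append((depths[i - 1], depths[i],
                        (gain / total_flow) * total_T / thickness))
    return profile

# Depths (m), cumulative flow (L/s) while pumping, total T (m^2/d), invented.
for top, bot, k in conventional_K([10, 20, 30, 40], [0.0, 1.2, 1.5, 3.0], 120.0):
    print(f"{top}-{bot} m: apparent K ~ {k:.2f} m/d")
```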
Culture, communication and safety: lessons from the airline industry.
d'Agincourt-Canning, Lori G; Kissoon, Niranjan; Singal, Mona; Pitfield, Alexander F
2011-06-01
Communication is a critical component of effective teamwork, and both are essential elements in providing high-quality care to patients. Yet communication is not an innate skill but a process influenced by internal (personal/cultural values) as well as external (professional roles and hierarchies) factors. This review, based on the literature and consensus opinion drawn from extensive experience, provides illustrative cases, themes, and tools for improving communication. Professional autonomy should be de-emphasized. Tools such as SBAR and simulation are important in communication and teamwork. Tools designed to improve communication and safety in the aviation industry may have applicability to the pediatric intensive care unit.
Development and evaluation of the Screening Trajectory Ozone Prediction System (STOPS, version 1.0)
NASA Astrophysics Data System (ADS)
Czader, B. H.; Percell, P.; Byun, D.; Kim, S.; Choi, Y.
2015-05-01
A hybrid Lagrangian-Eulerian based modeling tool has been developed using the Eulerian framework of the Community Multiscale Air Quality (CMAQ) model. It is a moving nest that utilizes saved original CMAQ simulation results to provide boundary conditions, initial conditions, as well as emissions and meteorological parameters necessary for a simulation. Given that these files are available, this tool can run independently of the CMAQ whole domain simulation, and it is designed to simulate source-receptor relationships upon changes in emissions. In this tool, the original CMAQ's horizontal domain is reduced to a small sub-domain that follows a trajectory defined by the mean mixed-layer wind. It has the same vertical structure and physical and chemical interactions as CMAQ except advection calculation. The advantage of this tool compared to other Lagrangian models is its capability of utilizing realistic boundary conditions that change with space and time as well as detailed chemistry treatment. The correctness of the algorithms and the overall performance was evaluated against CMAQ simulation results. Its performance depends on the atmospheric conditions occurring during the simulation period, with the comparisons being most similar to CMAQ results under uniform wind conditions. The mean bias for surface ozone mixing ratios varies between -0.03 and -0.78 ppbV and the slope is between 0.99 and 1.01 for different analyzed cases. For complicated meteorological conditions, such as wind circulation, the simulated mixing ratios deviate from CMAQ values as a result of the Lagrangian approach of using mean wind for its movement, but are still close, with the mean bias for ozone varying between 0.07 and -4.29 ppbV and the slope varying between 0.95 and 1.06 for different analyzed cases. For historical reasons, this hybrid Lagrangian-Eulerian based tool is named the Screening Trajectory Ozone Prediction System (STOPS), but its use is not limited to ozone prediction as, similarly to CMAQ, it can simulate concentrations of many species, including particulate matter and some toxic compounds, such as formaldehyde and 1,3-butadiene.
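A small sketch of the evaluation statistics quoted above, namely mean bias and the least-squares slope between the moving-nest results and the full-domain CMAQ values; the paired ozone series below are synthetic.

```python
# Mean bias and regression slope between two paired model series.
import numpy as np

stops = np.array([41.2, 55.0, 63.1, 48.7, 52.3])  # ppbV, synthetic
cmaq = np.array([41.5, 55.8, 63.0, 49.9, 52.6])   # ppbV, synthetic

mean_bias = float(np.mean(stops - cmaq))
slope = float(np.polyfit(cmaq, stops, 1)[0])  # least-squares slope
print(f"mean bias = {mean_bias:.2f} ppbV, slope = {slope:.2f}")
```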
Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.
2015-01-01
Background Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure.
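A minimal sketch of the incremental cost-effectiveness arithmetic such simulations report, delta cost over delta quality-adjusted life-years across a pair of cohorts; the costs and QALYs below are invented, not TEAM-HF outputs.

```python
# Incremental cost-effectiveness ratio (ICER) for one pair of cohorts.

def icer(cost_program, qaly_program, cost_usual, qaly_usual):
    """Incremental cost divided by incremental QALYs."""
    return (cost_program - cost_usual) / (qaly_program - qaly_usual)

# e.g., one simulated cohort pair: program vs. usual care (invented values)
print(f"ICER = ${icer(58_000, 4.10, 52_000, 3.95):,.0f} per QALY gained")
```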
Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul
2014-01-01
As part of an ongoing effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River Basin, analyses and simulations of the hydrology of the Edisto River Basin were made using the topography-based hydrological model (TOPMODEL). A primary focus of the investigation was to assess the potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, which is a small headwater catchment to the Edisto River Basin. Scaling up was done in a step-wise manner, beginning with applying the calibration parameters, meteorological data, and topographic-wetness-index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made for subsequent simulations, culminating in the best simulation, which included meteorological and topographic-wetness-index data from the Edisto River Basin and updated values for some of the TOPMODEL calibration parameters. The scaling-up process resulted in nine simulations being made. Simulation 7 best matched the streamflows at station 02175000, Edisto River near Givhans, SC, which was the downstream limit for the TOPMODEL setup, and was obtained by adjusting the scaling factor, including streamflow routing, and using NEXRAD precipitation data for the Edisto River Basin. The Nash-Sutcliffe coefficient of model-fit efficiency and Pearson's correlation coefficient for simulation 7 were 0.78 and 0.89, respectively. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the McTier Creek and Edisto River models showed that with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the substantial difference in drainage-area size at the outlet locations for the two models (30.7 and 2,725 square miles, respectively). Along with the TOPMODEL hydrologic simulations, a visualization tool (the Edisto River Data Viewer) was developed to help assess trends and influencing variables in the stream ecosystem. Incorporated into the visualization tool were the water-quality load models TOPLOAD, TOPLOAD-H, and LOADEST. Because the focus of this investigation was on scaling up the models from McTier Creek, water-quality concentrations that were previously collected in the McTier Creek Basin were used in the water-quality load models.
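A minimal sketch of the goodness-of-fit statistics cited above, Nash-Sutcliffe efficiency and Pearson's correlation coefficient for simulated versus measured streamflow; the short daily series below is synthetic.

```python
# Nash-Sutcliffe efficiency and Pearson's r for simulated vs. observed flow.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [120.0, 98.0, 143.0, 210.0, 180.0]  # synthetic daily flows
sim = [115.0, 105.0, 150.0, 198.0, 170.0]
print(f"NSE = {nse(obs, sim):.2f}, r = {np.corrcoef(obs, sim)[0, 1]:.2f}")
```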
NASA Astrophysics Data System (ADS)
Liu, Y.; Engel, B.; Collingsworth, P.; Pijanowski, B. C.
2017-12-01
Nutrient loading from the Maumee River watershed is a significant reason for the harmful algal bloom (HAB) problem in Lake Erie. Strategies to reduce nutrient loading from agricultural areas in the Maumee River watershed need to be explored. Best management practices (BMPs) are popular approaches for improving hydrology and water quality. Various scenarios of BMP implementation were simulated in the AXL watershed (an agricultural watershed within the Maumee River watershed) using the Soil and Water Assessment Tool (SWAT) and a new BMP cost tool to explore the cost-effectiveness of the practices. BMPs of interest included vegetative filter strips, grassed waterways, blind inlets, grade stabilization structures, wetlands, no-till, nutrient management, residue management, and cover crops. The following environmental concerns were considered: streamflow, Total Phosphorus (TP), Dissolved Reactive Phosphorus (DRP), Total Kjeldahl Nitrogen (TKN), and Nitrate+Nitrite (NOx). To obtain maximum hydrological and water quality benefits with minimum cost, an optimization tool was developed to optimally select and place BMPs by connecting SWAT, the BMP cost tool, and optimization algorithms. The optimization tool was then applied in the AXL watershed to explore optimization focusing on critical areas (top 25% of areas with highest runoff volume/pollutant loads per area) vs. all areas of the watershed, optimization using weather data for spring (March to July, due to the goal of reducing spring phosphorus in the watershed management plan) vs. the full year, and optimization results of implementing BMPs to achieve the watershed management plan goal (reducing 2008 TP levels by 40%). The optimization tool and BMP optimization results can be used by watershed groups and communities to solve hydrology and water quality problems.
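A toy sketch of the cost-effectiveness trade-off such an optimization addresses, using a greedy reduction-per-dollar heuristic rather than the SWAT-coupled optimization algorithms the study actually employed; every candidate number below is invented.

```python
# Greedy BMP selection: rank candidates by cost per unit TP reduction and
# pick until a reduction goal is met. A shortcut, not the study's method.

candidates = [  # (name, TP reduction kg/yr, annualized cost $/yr), invented
    ("filter_strip", 120.0, 3000.0),
    ("grassed_waterway", 90.0, 2000.0),
    ("cover_crop", 200.0, 8000.0),
    ("wetland", 300.0, 15000.0),
]

goal, achieved, cost, chosen = 400.0, 0.0, 0.0, []
for name, reduction, c in sorted(candidates, key=lambda x: x[2] / x[1]):
    if achieved >= goal:
        break
    chosen.append(name)
    achieved += reduction
    cost += c

print(chosen, f"{achieved:.0f} kg/yr for ${cost:,.0f}")
```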
NASA Astrophysics Data System (ADS)
Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.
2014-12-01
The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org) is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
Cerutti, Guillaume; Ali, Olivier; Godin, Christophe
2017-01-01
Context: The shoot apical meristem (SAM), origin of all aerial organs of the plant, is a restricted niche of stem cells whose growth is regulated by a complex network of genetic, hormonal and mechanical interactions. Studying the development of this area at cell level using 3D microscopy time-lapse imaging is a newly emerging key to understand the processes controlling plant morphogenesis. Computational models have been proposed to simulate those mechanisms, however their validation on real-life data is an essential step that requires an adequate representation of the growing tissue to be carried out. Achievements: The tool we introduce is a two-stage computational pipeline that generates a complete 3D triangular mesh of the tissue volume based on a segmented tissue image stack. DRACO (Dual Reconstruction by Adjacency Complex Optimization) is designed to retrieve the underlying 3D topological structure of the tissue and compute its dual geometry, while STEM (SAM Tissue Enhanced Mesh) returns a faithful triangular mesh optimized along several quality criteria (intrinsic quality, tissue reconstruction, visual adequacy). Quantitative evaluation tools measuring the performance of the method along those different dimensions are also provided. The resulting meshes can be used as input and validation for biomechanical simulations. Availability: DRACO-STEM is supplied as a package of the open-source multi-platform plant modeling library OpenAlea (http://openalea.github.io/) implemented in Python, and is freely distributed on GitHub (https://github.com/VirtualPlants/draco-stem) along with guidelines for installation and use.
Modelling the effect of wildfire on forested catchment water quality using the SWAT model
NASA Astrophysics Data System (ADS)
Yu, M.; Bishop, T.; van Ogtrop, F. F.; Bell, T.
2016-12-01
Wildfire removes the surface vegetation, releases ash, and increases erosion and runoff, and therefore affects the hydrological cycle of a forested water catchment. It is important to understand this change and how the catchment recovers. These processes are spatially sensitive and affected by interactions between fire severity and hillslope, soil type, and surface vegetation conditions. Thus, a distributed hydrological modelling approach is required. In this study, the Soil and Water Assessment Tool (SWAT) is used to predict the effect of the 2001/02 Sydney wildfire on catchment water quality. Ten years of pre-fire data are used to create and calibrate the SWAT model. The calibrated model was then used to simulate the water quality for the 10-year post-fire period without fire effects. The simulated water quality data are compared with recorded water quality data provided by the Sydney Catchment Authority. The mean changes in flow, total suspended solids, total nitrate, and total phosphate are compared on a monthly, three-month, six-month, and annual basis. Two control catchments and three burnt catchments were analysed.
NASA Astrophysics Data System (ADS)
Sun, N.; Yearsley, J. R.; Nijssen, B.; Lettenmaier, D. P.
2014-12-01
Urban stream quality is particularly susceptible to extreme precipitation events and land use change. Although the projected effects of extreme events and land use change on hydrology have been reasonably well studied, the impacts on urban water quality have not been widely examined, due in part to the scale mismatch between global climate models and the spatial scales required to represent urban hydrology and water quality signals. Here we describe a grid-based modeling system that integrates the Distributed Hydrology Soil Vegetation Model (DHSVM) with an urban water quality module adapted from EPA's Storm Water Management Model (SWMM) and the Soil and Water Assessment Tool (SWAT). Using the model system, we evaluate, for four partially urbanized catchments within the Puget Sound basin, urban water quality under current climate conditions, and projected potential changes in urban water quality associated with future changes in climate and land use. We examine in particular total suspended solids, total nitrogen, total phosphorus, and coliform bacteria, with catchment representations at 150-meter spatial resolution and a sub-daily timestep. We report long-term streamflow and water quality predictions in response to extreme precipitation events of varying magnitudes in the four partially urbanized catchments. Our simulations show that urban water quality is highly sensitive to both climatic and land use change.
NASA Astrophysics Data System (ADS)
Kolosionis, Konstantinos; Papadopoulou, Maria P.
2017-04-01
Monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation due to extensive agricultural activities. In this work, a simulation-optimization framework is developed based on heuristic optimization methodologies and geostatistical modeling approaches to obtain an optimal design for a groundwater quality monitoring network. Groundwater quantity and quality data obtained from 43 existing observation locations at 3 different hydrological periods in the Mires basin in Crete, Greece will be used in the proposed framework, in terms of Regression Kriging, to develop the spatial distribution of nitrate concentration in the aquifer of interest. Based on the existing groundwater quality mapping, the proposed optimization tool will determine a cost-effective observation well network that contributes significant information to water managers and authorities. The elimination of observation wells that add little or no beneficial information to groundwater level and quality mapping of the area can be achieved using estimation uncertainty and statistical error metrics without affecting the assessment of the groundwater quality. Given the high maintenance cost of groundwater monitoring networks, the proposed tool could be used by water regulators in the decision-making process to obtain an efficient network design.
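The well-elimination idea can be sketched as a greedy backward search: repeatedly drop the well whose value is best predicted by the remaining network, i.e., the least informative one. This is a schematic stand-in for the authors' heuristic-optimization and Regression Kriging framework; predict() is any interpolator you supply, with inverse-distance weighting used here in place of a kriging fit.

```python
import numpy as np

def thin_network(xy, values, keep, predict):
    """Greedily drop the well whose leave-out value is best predicted by the
    rest (i.e., the most redundant) until `keep` wells remain.
    xy: (n, 2) coordinates; values: (n,) measurements;
    predict(xy_train, v_train, xy_test) -> array of estimates."""
    active = list(range(len(values)))
    while len(active) > keep:
        scores = []
        for i in active:
            rest = [j for j in active if j != i]
            est = predict(xy[rest], values[rest], xy[[i]])
            scores.append((abs(est[0] - values[i]), i))
        _, drop = min(scores)        # smallest leave-out error = least informative
        active.remove(drop)
    return active

def idw(xy_tr, v_tr, xy_te):
    """Inverse-distance weighting, a simple stand-in for a kriging estimator."""
    d = np.linalg.norm(xy_tr - xy_te, axis=1) + 1e-9
    w = 1.0 / d**2
    return np.array([np.dot(w, v_tr) / w.sum()])

rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(12, 2))
no3 = rng.uniform(10, 50, size=12)       # made-up nitrate values
print(thin_network(xy, no3, keep=8, predict=idw))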
Evaluation of DNA Force Fields in Implicit Solvation
Gaillard, Thomas; Case, David A.
2011-01-01
DNA structural deformations and dynamics are crucial to its interactions in the cell. Theoretical simulations are essential tools to explore the structure, dynamics, and thermodynamics of biomolecules in a systematic way. Molecular mechanics force fields for DNA have benefited from constant improvements during the last decades. Several studies have evaluated and compared available force fields when the solvent is modeled by explicit molecules. On the other hand, few systematic studies have assessed the quality of duplex DNA models when implicit solvation is employed. The appeal of implicit solvent modeling lies in the substantial gain in simulation performance and conformational sampling speed. In this study, the respective influences of the force field and the implicit solvation model choice on DNA simulation quality are evaluated. To this end, extensive implicit solvent duplex DNA simulations are performed, attempting to reach both conformational and sequence diversity convergence. Structural parameters are extracted from simulations and statistically compared to available experimental and explicit solvation simulation data. Our results quantitatively expose the respective strengths and weaknesses of the different DNA force fields and implicit solvation models studied. This work can lead to the suggestion of improvements to current DNA theoretical models. PMID:22043178
Assessing teamwork performance in obstetrics: A systematic search and review of validated tools.
Fransen, Annemarie F; de Boer, Liza; Kienhorst, Dieneke; Truijens, Sophie E; van Runnard Heimel, Pieter J; Oei, S Guid
2017-09-01
Teamwork performance is an essential component for the clinical efficiency of multi-professional teams in obstetric care. As patient safety is related to teamwork performance, it has become an important learning goal in simulation-based education. In order to improve teamwork performance, reliable assessment tools are required. These can be used to provide feedback during training courses, or to compare learning effects between different types of training courses. The aim of the current study is to (1) identify the available assessment tools to evaluate obstetric teamwork performance in a simulated environment, and (2) evaluate their psychometric properties in order to identify the most valuable tool(s) to use. We performed a systematic search in PubMed, MEDLINE, and EMBASE to identify articles describing assessment tools for the evaluation of obstetric teamwork performance in a simulated environment. In order to evaluate the quality of the identified assessment tools, the standards and grading rules were applied as recommended by the Accreditation Council for Graduate Medical Education (ACGME) Committee on Educational Outcomes. The included studies were also assessed according to the Oxford Centre for Evidence Based Medicine (OCEBM) levels of evidence. This search resulted in the inclusion of five articles describing the following six tools: Clinical Teamwork Scale, Human Factors Rating Scale, Global Rating Scale, Assessment of Obstetric Team Performance, Global Assessment of Obstetric Team Performance, and the Teamwork Measurement Tool. Based on the ACGME guidelines we assigned a Class 3, level C of evidence, to all tools. Regarding the OCEBM levels of evidence, a level 3b was assigned to two studies and a level 4 to four studies. The Clinical Teamwork Scale demonstrated the most comprehensive validation, and the Teamwork Measurement Tool showed promising results, although its reliability warrants further investigation. Copyright © 2017. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Nordal Petersen, Martin; Nuijts, Roeland; Lange Bjørn, Lars
2014-05-01
This article presents an advanced optical model for simulation of alien wavelengths in multi-domain and multi-vendor dense wavelength-division multiplexing networks. The model aids optical network planners with a better understanding of the non-linear effects present in dense wavelength-division multiplexing systems and better utilization of alien wavelengths in future applications. The limiting physical effects for alien wavelengths are investigated in relation to power levels, channel spacing, and other factors. The simulation results are verified through an experimental setup in live multi-domain dense wavelength-division multiplexing systems between two national research networks: SURFnet in the Netherlands and NORDUnet in Denmark.
PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation
NASA Astrophysics Data System (ADS)
España, S; Herraiz, J L; Vicente, E; Vaquero, J J; Desco, M; Udias, J M
2009-03-01
Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Bo; Zhao, Hongwei, E-mail: hwzhao@jlu.edu.cn, E-mail: khl69@163.com; Zhao, Dan
It has always been a critical issue to understand the material removal behavior of Vibration-Assisted Machining (VAM), especially at the atomic level. To find out the effects of vibration frequency on material removal response, a three-dimensional molecular dynamics (MD) model has been established in this research to investigate the effects of the scratched groove and crystal defects on surface quality, in comparison with the von Mises shear strain and tangential force in simulations during the nano-scratching process. Comparisons are made among the results of simulations at different vibration frequencies with the same scratching feed, depth, amplitude, and crystal orientation. The copper potential in this simulation is the Embedded-Atom Method (EAM) potential; the interaction between copper and carbon atoms is a Morse potential. Simulation results show that higher frequency makes the groove smoother. Simulation at high frequency creates more dislocations, improving the machinability of the copper specimen. Changing the frequency does not have evident effects on the von Mises shear strain. Higher frequency decreases the tangential force, reducing the consumption of cutting energy and tool wear. In conclusion, higher vibration frequency in VAM of mono-crystalline copper has positive effects on surface finish, machinability, and tool wear reduction.
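The Morse pair potential named above has the standard form V(r) = D[(1 - e^(-alpha(r - r0)))^2 - 1], with well depth D, stiffness alpha, and equilibrium separation r0. The sketch below evaluates it and its force; the parameter values are placeholders, not those used in the paper.

```python
import numpy as np

# Standard Morse pair potential, as used for the Cu-C tool/workpiece
# interaction per the abstract; D, alpha, r0 below are placeholders,
# not the parameters from the paper (units: eV, 1/Angstrom, Angstrom).
def morse(r, D=0.1, alpha=1.7, r0=2.2):
    """V(r) = D * ((1 - exp(-alpha*(r - r0)))**2 - 1)."""
    x = np.exp(-alpha * (r - r0))
    return D * ((1.0 - x) ** 2 - 1.0)

def morse_force(r, D=0.1, alpha=1.7, r0=2.2):
    """F(r) = -dV/dr = -2*D*alpha*exp(-alpha*(r-r0))*(1 - exp(-alpha*(r-r0)))."""
    x = np.exp(-alpha * (r - r0))
    return -2.0 * D * alpha * x * (1.0 - x)

r = np.linspace(1.8, 6.0, 5)
print(morse(r))         # minimum of -D near r = r0, approaching 0 at large r
print(morse_force(r))   # zero force at the equilibrium separation
```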
Creation and Validation of a Simulator for Neonatal Brain Ultrasonography: A Pilot Study.
Tsai, Andy; Barnewolt, Carol E; Prahbu, Sanjay P; Yonekura, Reimi; Hosmer, Andrew; Schulz, Noah E; Weinstock, Peter H
2017-01-01
Historically, skills training in performing brain ultrasonography has been limited to hours of scanning infants for lack of adequate synthetic models or alternatives. The aim of this study was to create a simulator and determine its utility as an educational tool in teaching the skills that can be used in performing brain ultrasonography on infants. A brain ultrasonography simulator was created using a combination of multi-modality imaging, three-dimensional printing, material and acoustic engineering, and sculpting and molding. Radiology residents participated prior to their pediatric rotation. The study included (1) an initial questionnaire and resident creation of three coronal images using the simulator; (2) brain ultrasonography lecture; (3) hands-on simulator practice; and (4) a follow-up questionnaire and re-creation of the same three coronal images on the simulator. A blinded radiologist scored the quality of the pre- and post-training images using metrics including symmetry of the images and inclusion of predetermined landmarks. Wilcoxon rank-sum test was used to compare pre- and post-training questionnaire rankings and image quality scores. Ten residents participated in the study. Analysis of pre- and post-training rankings showed improvements in technical knowledge and confidence, and reduction in anxiety in performing brain ultrasonography. Objective measures of image quality likewise improved. Mean reported value score for simulator training was high across participants who reported perceived improvements in scanning skills and enjoyment from simulator use, with interest in additional practice on the simulator and recommendations for its use. This pilot study supports the use of a simulator in teaching radiology residents the skills that can be used to perform brain ultrasonography. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
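The pre/post comparison in such a study reduces to a two-sample rank test on the image-quality scores. A minimal SciPy sketch, with made-up scores standing in for the blinded ratings:

```python
from scipy.stats import ranksums

# Wilcoxon rank-sum test on pre- vs post-training image-quality scores,
# mirroring the analysis named in the abstract; the scores are illustrative.
pre  = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
post = [6, 7, 5, 8, 6, 7, 6, 5, 7, 6]
stat, p = ranksums(pre, post)
print(f"statistic={stat:.2f}, p={p:.4f}")   # small p suggests improvement
```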
Mendoza, Patricia; d'Anjou, Marc-André; Carmel, Eric N; Fournier, Eric; Mai, Wilfried; Alexander, Kate; Winter, Matthew D; Zwingenberger, Allison L; Thrall, Donald E; Theoret, Christine
2014-01-01
Understanding radiographic anatomy and the effects of varying patient and radiographic tube positioning on image quality can be a challenge for students. The purposes of this study were to develop and validate a novel technique for creating simulated radiographs using computed tomography (CT) datasets. A DICOM viewer (ORS Visual) plug-in was developed with the ability to move and deform cuboidal volumetric CT datasets, and to produce images simulating the effects of tube-patient-detector distance and angulation. Computed tomographic datasets were acquired from two dogs, one cat, and one horse. Simulated radiographs of different body parts (n = 9) were produced using different angles to mimic conventional projections, before actual digital radiographs were obtained using the same projections. These studies (n = 18) were then submitted to 10 board-certified radiologists who were asked to score visualization of anatomical landmarks, depiction of patient positioning, realism of distortion/magnification, and image quality. No significant differences between simulated and actual radiographs were found for anatomic structure visualization and patient positioning in the majority of body parts. For the assessment of radiographic realism, no significant differences were found between simulated and digital radiographs for canine pelvis, equine tarsus, and feline abdomen body parts. Overall, image quality and contrast resolution of simulated radiographs were considered satisfactory. Findings from the current study indicated that radiographs simulated using this new technique are comparable to actual digital radiographs. Further studies are needed to apply this technique in developing interactive tools for teaching radiographic anatomy and the effects of varying patient and tube positioning. © 2013 American College of Veterinary Radiology.
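The core of simulating a radiograph from a CT dataset is integrating attenuation along ray paths through the volume. The sketch below is a minimal parallel-beam version using NumPy/SciPy: it rotates the volume to mimic tube angulation and exponentiates the summed attenuation. The actual plug-in additionally models tube-patient-detector distance (beam divergence) and dataset deformation, which are omitted here.

```python
import numpy as np
from scipy.ndimage import rotate

def simulated_radiograph(ct_hu, angle_deg=0.0, mu_water=0.02):
    """Parallel-beam radiograph from a CT volume in Hounsfield units.
    mu_water folds in the voxel size; illustrative values only."""
    mu = mu_water * (1.0 + ct_hu / 1000.0)          # HU -> linear attenuation
    mu = np.clip(rotate(mu, angle_deg, axes=(0, 2), reshape=False), 0, None)
    path = mu.sum(axis=0)                           # line integral per ray
    return np.exp(-path)                            # transmitted intensity

vol = np.zeros((64, 64, 64)) - 1000.0               # air
vol[20:44, 20:44, 20:44] = 300.0                    # a "bone-ish" cube
img = simulated_radiograph(vol, angle_deg=15.0)
print(img.shape, float(img.min()), float(img.max()))
```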
Druce, Irena; Williams, Chantal; Baggoo, Carolyn; Keely, Erin; Malcolm, Janine
2017-10-01
Patients are increasingly turning to the internet to seek reliable sources of health information and desire guidance in assessing the quality of information as healthcare becomes progressively more complex. Pituitary adenomas are a rare, diverse group of tumors associated with increased mortality and morbidity whose management requires a multidisciplinary approach. As such, patients with this disorder are often searching for additional sources of healthcare information. We undertook a study to assess the quality of information available on the internet for patients with pituitary adenoma. After exclusion, 42 websites were identified based on a search engine query with various search terms. Each website was assessed in triplicate: once by a health professional, once by a simulated patient, and once by a patient who had a pituitary adenoma and underwent medical and surgical treatment. The assessment tools included a content-specific questionnaire, the DISCERN tool, and the Ensuring Quality Information for Patients tool. The readability of the information was assessed with the Flesch-Kincaid grade level. We found that the overall quality of information on pituitary adenoma on the internet was variable and written at a high grade level. Correlation between the different assessors was poor, indicating that there may be differences in how healthcare professionals and patients view healthcare information. Our findings highlight the importance of assessment of the health information by groups of the intended user to ensure the needs of that population are met. Abbreviation: EQIP = Ensuring Quality Information for Patients.
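Readability in the study is scored with the Flesch-Kincaid grade level, FKGL = 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59. A rough self-contained implementation follows; the vowel-group syllable counter is a crude heuristic.

```python
import re

def syllables(word):
    """Crude syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid grade level of a plain-text passage."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

sample = ("The pituitary gland regulates hormones. "
          "Adenomas are usually benign tumors.")
print(round(fk_grade(sample), 1))
```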
Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J
2016-03-01
Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO 2 -based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.
PMID:27099405
Grieger, Jessica A; Johnson, Brittany J; Wycherley, Thomas P; Golley, Rebecca K
2017-05-01
Background: Dietary simulation modeling can predict dietary strategies that may improve nutritional or health outcomes. Objectives: The study aims were to undertake a systematic review of simulation studies that model dietary strategies aiming to improve nutritional intake, body weight, and related chronic disease, and to assess the methodologic and reporting quality of these models. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guided the search strategy with studies located through electronic searches [Cochrane Library, Ovid (MEDLINE and Embase), EBSCOhost (CINAHL), and Scopus]. Study findings were described and dietary modeling methodology and reporting quality were critiqued by using a set of quality criteria adapted for dietary modeling from general modeling guidelines. Results: Forty-five studies were included and categorized as modeling moderation, substitution, reformulation, or promotion dietary strategies. Moderation and reformulation strategies targeted individual nutrients or foods to theoretically improve one particular nutrient or health outcome, estimating small to modest improvements. Substituting unhealthy foods with healthier choices was estimated to be effective across a range of nutrients, including an estimated reduction in intake of saturated fatty acids, sodium, and added sugar. Promotion of fruits and vegetables predicted marginal changes in intake. Overall, the quality of the studies was moderate to high, with certain features of the quality criteria consistently reported. Conclusions: Based on the results of reviewed simulation dietary modeling studies, targeting a variety of foods rather than individual foods or nutrients theoretically appears most effective in estimating improvements in nutritional intake, particularly reducing intake of nutrients commonly consumed in excess. A combination of strategies could theoretically be used to deliver the best improvement in outcomes. Study quality was moderate to high. However, given the lack of dietary simulation reporting guidelines, future work could refine the quality tool to harmonize consistency in the reporting of subsequent dietary modeling studies. © 2017 American Society for Nutrition.
USDA-ARS?s Scientific Manuscript database
Microbial contamination of waters is a critical public health issue. Watershed-scale, process-based modeling of bacteria fate and transport (F&T) has proven to be a useful tool for predicting microbial water quality and evaluating management practices. The objective of this work is...
Richard M. DeGraaf; Anna M. Lester; Mariko Yamasaki; William B. Leak
2007-01-01
Visualization is a powerful tool for depicting projections of forest structure and landscape conditions, for communicating habitat management practices, and for providing a landscape context to private landowners and to those concerned with public land management. Recent advances in visualization technology, especially in graphics quality, ease of use, and relative...
USDA-ARS?s Scientific Manuscript database
The Soil and Water Assessment Tool (SWAT) is a versatile model presently used worldwide to evaluate water quality and hydrological concerns under varying land use and environmental conditions. In this study, SWAT was used to simulate streamflow and to estimate sediment yield and nutrients loss from ...
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey; Oelgoetz, Peter A.
1999-01-01
The "Auto-Adjustable Pin Tool for Friction Stir Welding", was developed at The Marshall Space Flight Center to address process deficiencies unique to the FSW process. The auto-adjustable pin tool, also called the retractable pin-tool (R.PT) automatically withdraws the welding probe of the pin-tool into the pin-tool's shoulder. The primary function of the auto-adjustable pin-tool is to allow for keyhole closeout, necessary for circumferential welding and localized weld repair, and, automated pin-length adjustment for the welding of tapered material thickness. An overview of the RPT hardware is presented. The paper follows with studies conducted using the RPT. The RPT was used to simulate two capabilities; welding tapered material thickness and closing out the keyhole in a circumferential weld. The retracted pin-tool regions in aluminum- lithium 2195 friction stir weldments were studied through mechanical property testing and metallurgical sectioning. Correlation's can be =de between retractable pin-tool programmed parameters, process parameters, microstructure, and resulting weld quality.
Scaling up watershed model parameters--Flow and load simulations of the Edisto River Basin
Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul
2014-01-01
The Edisto River is the longest and largest river system completely contained in South Carolina and is one of the longest free flowing blackwater rivers in the United States. The Edisto River basin also has fish-tissue mercury concentrations that are some of the highest recorded in the United States. As part of an effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River basin, analyses and simulations of the hydrology of the Edisto River basin were made with the topography-based hydrological model (TOPMODEL). The potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, which is a small headwater catchment to the Edisto River basin, was assessed. Scaling up was done in a step-wise process beginning with applying the calibration parameters, meteorological data, and topographic wetness index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made with subsequent simulations culminating in the best simulation, which included meteorological and topographic wetness index data from the Edisto River basin and updated calibration parameters for some of the TOPMODEL calibration parameters. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the two models showed that with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the significant difference in the drainage-area size at the outlet locations for the two models (30.7 and 2,725 square miles, respectively). Along with the TOPMODEL hydrologic simulations, a visualization tool (the Edisto River Data Viewer) was developed to help assess trends and influencing variables in the stream ecosystem. Incorporated into the visualization tool were the water-quality load models TOPLOAD, TOPLOAD-H, and LOADEST. Because the focus of this investigation was on scaling up the models from McTier Creek, water-quality concentrations that were previously collected in the McTier Creek basin were used in the water-quality load models.
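The topographic wetness index that drives TOPMODEL is TWI = ln(a / tan(beta)), where a is the specific upslope contributing area (contributing area per unit contour length) and beta is the local slope. A toy evaluation is below; real applications derive both terms from a DEM flow-routing step, as was done for the McTier Creek and Edisto River models.

```python
import numpy as np

# TOPMODEL's topographic wetness index, TWI = ln(a / tan(beta)).
# Inputs here are hand-picked illustrative values, not basin data.
def twi(upslope_area_m2, contour_len_m, slope_rad):
    a = upslope_area_m2 / contour_len_m    # specific contributing area
    return np.log(a / np.tan(slope_rad))

print(twi(5.0e4, 30.0, np.radians(3.0)))   # gentle, well-fed cell: high TWI (wet)
print(twi(1.0e3, 30.0, np.radians(20.0)))  # steep ridge cell: low TWI (dry)
```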
NASA Astrophysics Data System (ADS)
Gassman, P. W.; Arnold, J. G.; Srinivasan, R.
2015-12-01
The Soil and Water Assessment Tool (SWAT) is one of the most widely used watershed-scale water quality models in the world. Over 2,000 peer-reviewed SWAT-related journal articles have been published and hundreds of other studies have been published in conference proceedings and other formats. The use of SWAT was initially concentrated in North America and Europe but has also expanded dramatically in other countries and regions during the past decade including Brazil, China, India, Iran, South Korea, Southeast Asia and eastern Africa. The SWAT model has proven to be a very flexible tool for investigating a broad range of hydrologic and water quality problems at different watershed scales and environmental conditions, and has proven very adaptable for applications requiring improved hydrologic and other enhanced simulation needs. We investigate here the various technological, networking, and other factors that have supported the expanded use of SWAT, and also highlight current worldwide simulation trends and possible impediments to future increased usage of the model. Examples of technological advances include easy access to web-based documentation, user-support groups, and SWAT literature, a variety of Geographic Information System (GIS) interface tools, pre- and post-processing calibration software and other software, and an open source code which has served as a model development catalyst for multiple user groups. Extensive networking regarding the use of SWAT has further occurred via internet-based user support groups, model training workshops, regional working groups, regional and international conferences, and targeted development workshops. We further highlight several important model development trends that have emerged during the past decade including improved hydrologic, cropping system, best management practice (BMP) and pollutant transport simulation methods. In addition, several current SWAT weaknesses will be addressed and key development needs will be described including the ability to represent landscapes and practices with more spatial definition, the incorporation of a module specifically designed to simulate rice paddy systems and algorithms that can capture plant competition dynamics such as occur in complex tree/crop systems and interactions between crops and weeds.
Free web-based modelling platform for managed aquifer recharge (MAR) applications
NASA Astrophysics Data System (ADS)
Stefan, Catalin; Junghanns, Ralf; Glaß, Jana; Sallwey, Jana; Fatkhutdinov, Aybulat; Fichtner, Thomas; Barquero, Felix; Moreno, Miguel; Bonilla, José; Kwoyiga, Lydia
2017-04-01
Managed aquifer recharge represents a valuable instrument for sustainable water resources management. The concept implies purposeful infiltration of surface water into the subsurface for later recovery or environmental benefits. Over decades, MAR schemes have been successfully installed worldwide for a variety of reasons: to maximize the natural storage capacity of aquifers, physical aquifer management, water quality management, and ecological benefits. The INOWAS-DSS platform provides a collection of free web-based tools for planning, management and optimization of the main components of MAR schemes. The tools are grouped into 13 specific applications that cover the most relevant challenges encountered at MAR sites, from both quantitative and qualitative perspectives. The applications include, among others, the optimization of MAR site location, the assessment of saltwater intrusion, the restoration of groundwater levels in overexploited aquifers, the maximization of the natural storage capacity of aquifers, the improvement of water quality, the design and operational optimization of MAR schemes, clogging development and risk assessment. The platform contains a collection of about 35 web-based tools of various degrees of complexity, which are either included in application-specific workflows or used as standalone modelling instruments. Among them are simple tools derived from data mining and empirical equations, analytical groundwater-related equations, as well as complex numerical flow and transport models (MODFLOW, MT3DMS and SEAWAT). Up to now, the simulation core of the INOWAS-DSS, which is based on the finite-difference groundwater flow model MODFLOW, is implemented and runs on the web. A scenario analyser helps to easily set up and evaluate new management options as well as future developments such as land use and climate change, and to compare them to previous scenarios. Additionally, simple tools such as analytical equations to assess saltwater intrusion are already running online. Besides the simulation tools, a web-based database is under development where geospatial and time series data can be stored, managed, and processed. Furthermore, a web-based information system containing user guides for the various developed tools and applications, as well as basic information on MAR and related topics, has been published and will be regularly expanded as new tools are implemented. The INOWAS-DSS, including its simulation tools, database and information system, provides an extensive framework to manage, plan and optimize MAR facilities. As the INOWAS-DSS is open-source software accessible via the internet using standard web browsers, it offers new ways of data sharing and collaboration among various partners and decision makers.
Continuous variable quantum optical simulation for time evolution of quantum harmonic oscillators
Deng, Xiaowei; Hao, Shuhong; Guo, Hong; Xie, Changde; Su, Xiaolong
2016-01-01
Quantum simulation enables one to mimic the evolution of other quantum systems using a controllable quantum system. Quantum harmonic oscillator (QHO) is one of the most important model systems in quantum physics. To observe the transient dynamics of a QHO with high oscillation frequency directly is difficult. We experimentally simulate the transient behaviors of QHO in an open system during time evolution with an optical mode and a logical operation system of continuous variable quantum computation. The time evolution of an atomic ensemble in the collective spontaneous emission is analytically simulated by mapping the atomic ensemble onto a QHO. The measured fidelity, which is used for quantifying the quality of the simulation, is higher than its classical limit. The presented simulation scheme provides a new tool for studying the dynamic behaviors of QHO. PMID:26961962
NASA Astrophysics Data System (ADS)
Korchuganova, M.; Syrbakov, A.; Chernysheva, T.; Ivanov, G.; Gnedasch, E.
2016-08-01
Of all common chip curling methods, the most widespread is a special tool face form, produced either by grinding or by profile pressing during the manufacture of RMSP. Currently, over 15 large tool manufacturers produce tools using more than 500 brands of tool materials. To this we must add a large variety of tool face geometries whose purpose includes control over the form and dimensions of the chip. Taking into account the many processed materials, the specific tasks of the process planner, and the requirements on the quality of manufactured products, all this makes it significantly harder to choose a proper tool that can perform the machining most effectively. Over recent years, the nomenclature of RMSP for lathe tools with mechanical mounting has been considerably broadened through diversification of their faces.
Enhancing a rainfall-runoff model to assess the impacts of BMPs and LID practices on storm runoff.
Liu, Yaoze; Ahiablame, Laurent M; Bralts, Vincent F; Engel, Bernard A
2015-01-01
Best management practices (BMPs) and low impact development (LID) practices are increasingly being used as stormwater management techniques to reduce the impacts of urban development on hydrology and water quality. To assist planners and decision-makers at various stages of development projects (planning, implementation, and evaluation), user-friendly tools are needed to assess the effectiveness of BMPs and LID practices. This study describes a simple tool, the Long-Term Hydrologic Impact Assessment-LID (L-THIA-LID), which is enhanced with additional BMPs and LID practices, improved approaches to estimate hydrology and water quality, and representation of practices in series (meaning combined implementation). The tool was used to evaluate the performance of BMPs and LID practices individually and in series with 30 years of daily rainfall data in four types of idealized land use units and watersheds (low density residential, high density residential, industrial, and commercial). Simulation results were compared with the results of other published studies. The simulated results showed that reductions in runoff volume and pollutant loads after implementing BMPs and LID practices, both individually and in series, were comparable with the observed impacts of these practices. The L-THIA-LID 2.0 model is capable of assisting decision makers in evaluating environmental impacts of BMPs and LID practices, thereby improving the effectiveness of stormwater management decisions. Copyright © 2014 Elsevier Ltd. All rights reserved.
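L-THIA's long-term runoff estimates rest on the SCS curve-number relation, Q = (P - 0.2S)^2 / (P + 0.8S) with S = 1000/CN - 10 (depths in inches), and a BMP or LID practice can then be represented as a lowered effective curve number. A textbook sketch of that relation, not the L-THIA-LID 2.0 source:

```python
# SCS curve-number runoff relation underlying L-THIA-type assessments.
# Depths in inches; a practice such as a rain garden is represented
# here simply as a lowered effective CN.
def scs_runoff(p_in, cn):
    s = 1000.0 / cn - 10.0        # potential maximum retention
    ia = 0.2 * s                  # initial abstraction
    return 0.0 if p_in <= ia else (p_in - ia) ** 2 / (p_in - ia + s)

print(scs_runoff(2.0, cn=85))     # pre-practice: ~0.80 in of runoff
print(scs_runoff(2.0, cn=70))     # post-practice: ~0.24 in of runoff
```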
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high-resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, and computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields, from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for the basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox, where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications, including CAD development, procedure design and simulation of human-system interactions in a desktop-sized work volume.
Pereira, D; Gomes, P; Faria, S; Cruz-Correia, R; Coimbra, M
2016-08-01
Auscultation is currently both a powerful screening tool, providing a cheap and quick initial assessment of a patient's clinical condition, and a hard skill to master. The teaching of auscultation in universities is today reduced to an unsuitably small number of hours. Virtual patient simulators can potentially mitigate this problem by providing an interesting high-quality alternative to teaching with real patients or patient simulators. In this paper we evaluate the pedagogical impact of using a virtual patient simulation technology in a short workshop format for medical students, training them to detect cardiac pathologies. Results showed a significant improvement (+16%) in the differentiation between normal and pathological cases, although longer-duration formats seem to be needed to accurately identify specific pathologies.
Khanduja, P Kristina; Bould, M Dylan; Naik, Viren N; Hladkowicz, Emily; Boet, Sylvain
2015-01-01
We systematically reviewed the effectiveness of simulation-based education targeting independently practicing qualified physicians in acute care specialties. We also describe how simulation is used for performance assessment in this population. Data sources included MEDLINE, Embase, the Cochrane Database of Systematic Reviews, the Cochrane CENTRAL Database of Controlled Trials, and the National Health Service Economic Evaluation Database. The last date of search was January 31, 2013. All original research describing simulation-based education for independently practicing physicians in anesthesiology, critical care, and emergency medicine was reviewed. Data analysis was performed in duplicate, with further review by a third author in cases of disagreement until consensus was reached. Data extraction was focused on effectiveness according to Kirkpatrick's model. For simulation-based performance assessment, tool characteristics and sources of validity evidence were also collated. Of 39 studies identified, 30 focused on the effectiveness of simulation-based education and nine evaluated the validity of simulation-based assessment. Thirteen studies (30%) targeted the lower levels of Kirkpatrick's hierarchy with reliance on self-reporting. Simulation was unanimously described as a positive learning experience with perceived impact on clinical practice. Of the 17 remaining studies, 10 used a single group or "no intervention comparison group" design. The majority (n = 17; 44%) were able to demonstrate both immediate and sustained improvements in educational outcomes. Nine studies reported the psychometric properties of simulation-based performance assessment as their sole objective. These predominantly recruited independent practitioners as a convenience sample to establish whether the tool could discriminate between experienced and inexperienced operators, and concentrated on a single aspect of validity evidence. Simulation is perceived as a positive learning experience with limited evidence to support improved learning. Future research should focus on the optimal modality and frequency of exposure, the quality of assessment tools, and the impact of simulation-based education beyond individuals toward improved patient care.
Liu, Shiyuan; Xu, Shuang; Wu, Xiaofei; Liu, Wei
2012-06-18
This paper proposes an iterative method for in situ lens aberration measurement in lithographic tools based on a quadratic aberration model (QAM) that is a natural extension of the linear model formed by taking into account interactions among individual Zernike coefficients. By introducing a generalized operator named cross triple correlation (CTC), the quadratic model can be calculated very quickly and accurately with the help of fast Fourier transform (FFT). The Zernike coefficients up to the 37th order or even higher are determined by solving an inverse problem through an iterative procedure from several through-focus aerial images of a specially designed mask pattern. The simulation work has validated the theoretical derivation and confirms that such a method is simple to implement and yields a superior quality of wavefront estimate, particularly for the case when the aberrations are relatively large. It is fully expected that this method will provide a useful practical means for the in-line monitoring of the imaging quality of lithographic tools.
Simulation of the XV-15 tilt rotor research aircraft
NASA Technical Reports Server (NTRS)
Churchill, G. B.; Dugan, D. C.
1982-01-01
The effective use of simulation from issuance of the request for proposal through conduct of a flight test program for the XV-15 Tilt Rotor Research Aircraft is discussed. From program inception, simulation complemented all phases of XV-15 development. The initial simulation evaluations during the source evaluation board proceedings contributed significantly to performance and stability and control evaluations. Eight subsequent simulation periods provided major contributions in the areas of control concepts; cockpit configuration; handling qualities; pilot workload; failure effects and recovery procedures; and flight boundary problems and recovery procedures. The fidelity of the simulation also made it a valuable pilot training aid, as well as a suitable tool for military and civil mission evaluations. Simulation also provided valuable design data for refinement of automatic flight control systems. Throughout the program, fidelity was a prime issue and resulted in unique data and methods for fidelity evaluation which are presented and discussed.
Using a medical simulation center as an electronic health record usability laboratory
Landman, Adam B; Redden, Lisa; Neri, Pamela; Poole, Stephen; Horsky, Jan; Raja, Ali S; Pozner, Charles N; Schiff, Gordon; Poon, Eric G
2014-01-01
Usability testing is increasingly being recognized as a way to increase the usability and safety of health information technology (HIT). Medical simulation centers can serve as testing environments for HIT usability studies. We integrated the quality assurance version of our emergency department (ED) electronic health record (EHR) into our medical simulation center and piloted a clinical care scenario in which emergency medicine resident physicians evaluated a simulated ED patient and documented electronically using the ED EHR. Meticulous planning and close collaboration with expert simulation staff was important for designing test scenarios, pilot testing, and running the sessions. Similarly, working with information systems teams was important for integration of the EHR. Electronic tools are needed to facilitate entry of fictitious clinical results while the simulation scenario is unfolding. EHRs can be successfully integrated into existing simulation centers, which may provide realistic environments for usability testing, training, and evaluation of human–computer interactions. PMID:24249778
Development of a simulation model of semi-active suspension for monorail
NASA Astrophysics Data System (ADS)
Hasnan, K.; Didane, D. H.; Kamarudin, M. A.; Bakhsh, Qadir; Abdulmalik, R. E.
2016-11-01
The new Kuala Lumpur Monorail Fleet Expansion Project (KLMFEP) uses semi-active technology in its suspension system. It is recognized that the suspension system influences ride quality; thus, among the ways to further improve ride quality is fine-tuning the semi-active suspension system on the new KL Monorail. The semi-active suspension for the monorail, specifically in terms of improving ride quality, could be exploited further. Hence a simulation model is required that will act as a platform to test the design of a complete suspension system, particularly to investigate ride comfort performance. MSC Adams software was chosen as the tool to develop the simulation platform, in which all parameters and data are represented by mathematical equations, with the new KL Monorail as the reference model. In the simulation, the model was subjected to a step disturbance on the guideway for stability and ride comfort analysis. The model has shown positive results: the monorail is stable as the outcome of the stability analysis, and it scores a Rating 1 classification in ISO 2631 ride comfort performance, i.e., very comfortable, as the overall outcome of the ride comfort analysis. The model is also adjustable, flexible, and understandable by engineers within the field for the purpose of further development.
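ISO 2631 ride-comfort ratings are derived from the frequency-weighted RMS acceleration, a_w = sqrt(mean(a_w(t)^2)). The sketch below shows the scoring step only; the Wk/Wd frequency-weighting filters are omitted, and the band limits are approximate ISO 2631-1 comfort guide values.

```python
import numpy as np

def comfort(acc_weighted):
    """Map an already frequency-weighted acceleration trace (m/s^2) to an
    approximate ISO 2631-1 comfort label via its RMS value."""
    aw = float(np.sqrt(np.mean(np.square(acc_weighted))))
    bands = [(0.315, "not uncomfortable"), (0.63, "a little uncomfortable"),
             (1.0, "fairly uncomfortable"), (1.6, "uncomfortable"),
             (2.5, "very uncomfortable")]
    label = next((s for lim, s in bands if aw < lim), "extremely uncomfortable")
    return aw, label

t = np.linspace(0, 10, 5000)
aw, label = comfort(0.2 * np.sin(2 * np.pi * 4 * t))   # toy 4 Hz vibration
print(f"a_w = {aw:.3f} m/s^2 -> {label}")
```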
NASA Astrophysics Data System (ADS)
Luo, Xichun; Tong, Zhen; Liang, Yingchun
2014-12-01
In this article, the shape transferability of nanoscale multi-tip diamond tools in diamond turning for the scale-up manufacturing of nanostructures is demonstrated. Atomistic multi-tip diamond tool models were built with different tool geometries, in terms of differences in tip cross-sectional shape, tip angle, and tool tip configuration, to determine their effect on the applied forces and the machined nano-groove geometries. The quality of machined nanostructures was characterized by the thickness of the deformed layers and the dimensional accuracy achieved. Simulation results show that diamond turning using nanoscale multi-tip tools offers tremendous shape transferability in machining nanostructures. Both periodic and non-periodic nano-grooves with different cross-sectional shapes can be successfully fabricated using the multi-tip tools. A hypothesis of a minimum designed ratio of tool tip distance to tip base width (L/Wf) of the nanoscale multi-tip diamond tool for the high-precision machining of nanostructures was proposed, based on an analytical study of the quality of the nanostructures fabricated using different types of multi-tip tools. Nanometric cutting trials using nanoscale multi-tip diamond tools (differing in L/Wf) fabricated by focused ion beam (FIB) were then conducted to verify the hypothesis. The investigations done in this work imply the potential of using the nanoscale multi-tip diamond tool for the deterministic fabrication of periodic and non-periodic nanostructures, which opens up the feasibility of using the process as a versatile manufacturing technique in nanotechnology.
NASA Astrophysics Data System (ADS)
Tang, C.; Lynch, J. A.; Dennis, R. L.
2016-12-01
The biogeochemical processing of nitrogen and associated pollutants is driven by meteorological and hydrological processes in conjunction with pollutant loading. There are feedbacks between meteorology and hydrology that will be affected by land-use change and climate change, and changes in meteorology will affect pollutant deposition. It is important to account for those feedbacks and produce internally consistent simulations of meteorology, hydrology, and pollutant loading to drive the (watershed/water quality) biogeochemical models. In this study, the ecological response to emission reductions in streams in the Potomac watershed was evaluated. Firstly, we simulated deposition using the fully coupled Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model; secondly, we created the hydrological data with the offline-linked Variable Infiltration Capacity (VIC) model and the WRF model. Lastly, we investigated water quality with one comprehensive environmental modeling system, namely the linkage of CMAQ, WRF, VIC and the Model of Acidification of Groundwater In Catchments (MAGIC), from 2002 to 2010. The simulated results (such as NO3, SO4, and SBC) fit the observed values well. The linkage provides a generally accurate, well-tested tool for evaluating sensitivities to varying meteorology and environmental changes on acidification and other biogeochemical processes, with the capability to comprehensively explore strategic policy and management design.
A human-hearing-related prediction tool for soundscapes and community noise
NASA Astrophysics Data System (ADS)
Genuit, Klaus
2002-11-01
There are several calculation methods available for predicting the A-weighted sound-pressure level of environmental noise; these are, however, not suitable for a qualified prediction of residents' annoyance and physiological strain. The subjectively perceived noise quality depends not only on the A-weighted sound-pressure level, but also on other psychoacoustical parameters, such as loudness, roughness, sharpness, etc. In addition to these physical and psychoacoustical aspects of noise, the so-called psychological or cognitive aspects have to be considered, too, which means that the listeners' expectations, their mental attitude, as well as the information content of the noise finally influence the noise quality perceived by individual persons. Within the scope of the research project SVEN (Sound Quality of Vehicle Exterior Noise), which is promoted by the EC, a new tool has been developed which allows a binaural simulation and prediction of environmental noise, to evaluate the influence of different contributions by the sound events with respect to the psychoacoustical parameters, the spatial distribution, movement, and frequency. By means of this tool it is now possible to consider completely new aspects regarding the audible perception of noise when establishing a soundscape or when planning community noise.
Software Quality Control at Belle II
NASA Astrophysics Data System (ADS)
Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.;
2017-10-01
Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of such high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering-file-level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
Improvement of Meteorological Inputs for TexAQS-II Air Quality Simulations
NASA Astrophysics Data System (ADS)
Ngan, F.; Byun, D.; Kim, H.; Cheng, F.; Kim, S.; Lee, D.
2008-12-01
An air quality forecasting system (UH-AQF) for Eastern Texas, operated by the Institute for Multidimensional Air Quality Studies (IMAQS) at the University of Houston, uses the Fifth-Generation PSU/NCAR Mesoscale Model (MM5) as the meteorological driver for modeling air quality with the Community Multiscale Air Quality (CMAQ) model. While the forecasting system was successfully used for the planning and implementation of various measurement activities, evaluations of the forecasting results revealed a few systematic problems in the numerical simulations. From comparison with observations, we observe at times an over-prediction of northerly winds caused by inaccurate synoptic inputs, and at other times too-strong southerly winds caused by local sea breeze development. Discrepancies in maximum and minimum temperature are also seen on certain days. Precipitation events, as well as clouds, are occasionally simulated at incorrect locations and times. Unrealistic thunderstorms are sometimes simulated, causing unrealistically strong outflows. To understand the physical and chemical processes influencing air quality measures, a proper description of real-world meteorological conditions is essential. The objective of this study is to generate better meteorological inputs than the AQF results to support the chemistry modeling. We utilized existing objective analysis and nudging tools in the MM5 system to develop the MUltiscale Nest-down Data Assimilation System (MUNDAS), which incorporates the extensive meteorological observations available in the simulated domain for the retrospective simulation of the TexAQS-II period. With the re-simulated meteorological input, we are able to better predict ozone events during the TexAQS-II period. In addition, base datasets in MM5 such as land use/land cover, vegetation fraction, soil type and sea surface temperature are updated with satellite data to represent surface features more accurately. These are key physical input parameters affecting the transfer of heat, momentum and soil moisture in the MM5 land-surface processes. Using the updated base datasets, we see improved predictions of ground temperatures, winds and even thunderstorm activity within the boundary layer.
Computer-Controlled Cylindrical Polishing Process for Large X-Ray Mirror Mandrels
NASA Technical Reports Server (NTRS)
Khan, Gufran S.; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
We are developing high-energy grazing-incidence shell optics for hard-x-ray telescopes. The resolution of the mirror shells depends on the quality of the cylindrical mandrels from which they are replicated. Mid-spatial-frequency axial figure error is a dominant contributor to the error budget of a mandrel. This paper presents our efforts to develop a deterministic cylindrical polishing process that keeps mid-spatial-frequency axial figure errors to a minimum. Simulation software was developed to model the residual surface figure errors of a mandrel due to the polishing process parameters and the tools used, as well as to compute the optical performance of the optics. The study carried out with the developed software focused on establishing a relationship between the polishing process parameters and the generation of mid-spatial-frequency errors. The process parameters modeled are the speeds of the lap and the mandrel, the tool's influence function, the contour path (dwell) of the tools, their shape, and the distribution of the tools on the polishing lap. Using inputs from the mathematical model, a mandrel with a conical approximation to the Wolter-1 geometry was polished on a newly developed computer-controlled cylindrical polishing machine. The preliminary results of a series of polishing experiments demonstrate qualitative agreement with the developed model. We report our first experimental results and discuss plans for further improvements in the polishing process. The ability to simulate the polishing process is critical to optimizing the process, improving mandrel quality, and significantly reducing the cost of mandrel production.
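As an illustration of the bookkeeping such polishing simulations perform, the sketch below assumes Preston-type removal (depth proportional to pressure, relative speed, and dwell time) convolved with a Gaussian tool influence function; every parameter value is invented, not taken from the actual mandrel process:

```python
import numpy as np

k_p = 1e-7          # Preston coefficient (assumed units)
pressure = 0.01     # assumed uniform lap pressure, N/mm^2
velocity = 50.0     # assumed relative lap/mandrel speed, mm/s

z = np.linspace(0.0, 300.0, 601)      # axial positions along the mandrel (mm)
dwell = np.full_like(z, 10.0)         # tool dwell time per position (s)
dwell[200:400] += 5.0                 # extra dwell where figure error is high

# Tool influence function: Gaussian footprint of width sigma (mm), normalized
sigma = 5.0
kx = np.arange(-3 * sigma, 3 * sigma + 0.5, 0.5)
influence = np.exp(-0.5 * (kx / sigma) ** 2)
influence /= influence.sum()

# Removal profile = Preston removal rate convolved with the dwell map
removal = k_p * pressure * velocity * np.convolve(dwell, influence, mode="same")
print("peak removal depth:", removal.max())
```

Subtracting such a removal profile from the measured figure is what lets the dwell map be tuned to suppress the mid-spatial-frequency residual.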
Pika: A snow science simulation tool built using the open-source framework MOOSE
NASA Astrophysics Data System (ADS)
Slaughter, A.; Johnson, M.
2017-12-01
The Department of Energy (DOE) is currently investing millions of dollars annually in various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well-suited for snow science problems. Pika, an open-source MOOSE-based application, is capable of simulating both 3D, coupled, nonlinear continuum heat transfer and large-deformation mechanics applications (such as settlement) and phase-field based microstructure applications. Additionally, these types of problems may be coupled tightly in a single solve or across length and time scales using a loosely coupled Picard iteration approach. In addition to the wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, graphical user interface, and documentation system; tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness the existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the state of the art in line with other scientific research efforts.
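The loosely coupled Picard approach mentioned above can be pictured as two single-physics solves exchanging fields until the coupled state stops changing; the stand-in functions below are illustrative placeholders, not MOOSE or Pika APIs:

```python
def solve_heat(T, phi):
    """Stand-in heat-transfer solve: relax T toward a phi-dependent target."""
    return 0.5 * T + 0.5 * (260.0 + 10.0 * phi)

def solve_microstructure(T, phi):
    """Stand-in phase-field solve: relax phi toward a T-dependent target."""
    return 0.5 * phi + 0.5 * max(0.0, (270.0 - T) / 20.0)

T, phi = 250.0, 1.0      # initial temperature (K) and ice phase fraction
for it in range(100):
    T_new = solve_heat(T, phi)
    phi_new = solve_microstructure(T_new, phi)
    change = abs(T_new - T) + abs(phi_new - phi)
    T, phi = T_new, phi_new
    if change < 1e-8:    # Picard iteration has converged
        print(f"converged after {it + 1} iterations: T={T:.3f}, phi={phi:.3f}")
        break
```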
Investigation of approximate models of experimental temperature characteristics of machines
NASA Astrophysics Data System (ADS)
Parfenov, I. V.; Polyakov, A. N.
2018-05-01
This work investigates various approaches to approximating experimental data and creating simulation mathematical models of thermal processes in machine tools, with the aim of shortening field tests and reducing the thermally induced error of machining. The main research methods used in this work are: full-scale thermal testing of machine tools; approximation of the experimental temperature characteristics of machine tools by polynomial models using various approaches; and analysis and evaluation of the modelling results (model quality) for the temperature characteristics of machines and their time derivatives up to the third order. As a result of the research performed, rational methods, types, parameters, and levels of complexity for simulation mathematical models of thermal processes in machine tools are proposed.
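As a minimal sketch of the approximation step (with a synthetic heating curve standing in for the experimental record), polynomial models of increasing degree can be fitted and their residuals and higher time derivatives inspected:

```python
import numpy as np

t = np.linspace(0.0, 8.0, 50)                    # test time, hours (synthetic)
temp = 20.0 + 12.0 * (1.0 - np.exp(-t / 2.0))    # idealized heating curve, K
temp += np.random.default_rng(0).normal(0.0, 0.1, t.size)  # measurement noise

# Compare polynomial models of increasing degree by RMS residual
for degree in (2, 3, 4):
    model = np.poly1d(np.polyfit(t, temp, degree))
    rms = np.sqrt(np.mean((model(t) - temp) ** 2))
    print(f"degree {degree}: RMS residual = {rms:.3f} K")

# Derivatives up to third order, as used in the model-quality assessment
d3 = np.poly1d(np.polyfit(t, temp, 4)).deriv(3)
print("third time derivative at t = 1 h:", d3(1.0))
```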
Homogeneous Canine Chest Phantom Construction: A Tool for Image Quality Optimization.
Pavan, Ana Luiza Menegatti; Rosa, Maria Eugênia Dela; Giacomini, Guilherme; Bacchim Neto, Fernando Antonio; Yamashita, Seizo; Vulcano, Luiz Carlos; Duarte, Sergio Barbosa; Miranda, José Ricardo de Arruda; de Pina, Diana Rodrigues
2016-01-01
Digital radiographic imaging is increasing in veterinary practice. The use of radiation demands responsibility to maintain high image quality. Low doses are necessary because workers are required to restrain the animal. Optimizing digital systems is necessary to avoid the unnecessary exposure that causes the phenomenon known as dose creep. Homogeneous phantoms are widely used to optimize image quality and dose. We developed an automatic computational methodology to classify and quantify tissues (i.e., lung tissue, adipose tissue, muscle tissue, and bone) in canine chest computed tomography exams. The thickness of each tissue was converted to simulator materials (i.e., Lucite, aluminum, and air). Dogs were separated into groups of 20 animals each according to weight. Mean weights were 6.5 ± 2.0 kg, 15.0 ± 5.0 kg, 32.0 ± 5.5 kg, and 50.0 ± 12.0 kg for the small, medium, large, and giant groups, respectively. A one-way analysis of variance revealed significant differences in all simulator material thicknesses (p < 0.05) quantified between groups. As a result, four phantoms were constructed for dorsoventral and lateral views. In conclusion, the present methodology allows the development of phantoms of the canine chest and possibly other body regions and/or animals. The proposed phantom is a practical tool that may be employed in future work to optimize veterinary X-ray procedures. PMID:27101001
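The classification step can be pictured as thresholding Hounsfield units (HU) along the beam path and summing per-tissue thicknesses; the HU windows below are textbook approximations, not the authors' thresholds:

```python
import numpy as np

# Approximate HU windows per tissue class (illustrative only)
HU_WINDOWS = {
    "lung":    (-1000, -500),
    "adipose": (-150,  -30),
    "muscle":  (10,     60),
    "bone":    (200,  2000),
}

voxel_mm = 0.7   # assumed voxel length along the beam axis
# Stand-in HU profile along one ray through the chest
hu_profile = np.array([-900, -850, -40, -35, 25, 30, 35, 400, 450, 30, -60])

for tissue, (lo, hi) in HU_WINDOWS.items():
    n = np.count_nonzero((hu_profile >= lo) & (hu_profile <= hi))
    print(f"{tissue}: {n * voxel_mm:.1f} mm")
```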
Cislan-2 extension final document by University of Twente (Netherlands)
NASA Astrophysics Data System (ADS)
Niemegeers, Ignas; Baumann, Frank; Beuwer, Wim; Jordense, Marcel; Pras, Aiko; Schutte, Leon; Tracey, Ian
1992-01-01
Results of work performed under the so-called Cislan extension contract are presented. The adaptation of the Cislan 2 prototype design to an environment of interconnected Local Area Networks (LANs), instead of a single 802.5 token-ring LAN, is considered. In order to extend the network architecture, the Interconnection Function (IF) protocol layer was subdivided into two protocol layers: a new IF layer and, below it, the Medium Enhancement (ME) protocol layer. Some small enhancements to the distributed bandwidth allocation protocol were developed, which are in fact also applicable to the 'normal' Cislan 2 system. The new services and protocols are described together with some scenarios and requirements for the new internetting Cislan 2 system. How to overcome the degradation of speech quality due to packet loss on the LAN subsystem was studied, and experiments were planned to measure this degradation. Simulations were performed of two Cislan subsystems, the distributed bandwidth allocation protocol and the clock synchronization mechanism; results of both simulations, performed on SUN workstations using QNAP as a simulation tool, are given.
Liaw, Sok Ying; Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim
2015-01-01
Background Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. Objective This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses’ competencies in acute nursing care. Methods Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants’ clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. Results The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Conclusions Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses’ competencies in acute care. Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency. PMID:25583029
NASA Astrophysics Data System (ADS)
Don, Steven; Whiting, Bruce R.; Hildebolt, Charles F.; Sehnert, W. James; Ellinwood, Jacquelyn S.; Töpfer, Karin; Masoumzadeh, Parinaz; Kraus, Richard A.; Kronemer, Keith A.; Herman, Thomas; McAlister, William H.
2006-03-01
The risk of radiation exposure is greatest for pediatric patients and, thus, there is a great incentive to reduce the radiation dose used in diagnostic procedures for children to "as low as reasonably achievable" (ALARA). Testing of low-dose protocols presents a dilemma, as it is unethical to repeatedly expose patients to ionizing radiation in order to determine optimum protocols. To overcome this problem, we have developed a computed-radiography (CR) dose-reduction simulation tool that takes existing images and adds synthetic noise to create realistic images that correspond to images generated with lower doses. The objective of our study was to determine the extent to which simulated low-dose images corresponded with original (non-simulated) low-dose images. To make this determination, we created pneumothoraces of known volumes in five neonate cadavers and obtained images of the neonates at 10 mR, 1 mR, and 0.1 mR (as measured at the cassette plate). The 10-mR exposures were considered "relatively noise-free" images. We used these 10-mR images and our simulation tool to create simulated 0.1- and 1-mR images. For the simulated and original images, we identified regions of interest (ROI) of the entire chest, free-in-air region, and liver. We compared the means and standard deviations of the ROI grey-scale values of the simulated and original images with paired t tests. We also had observers rate simulated and original images for image quality and for the presence or absence of pneumothoraces. There was no statistically significant difference in grey-scale-value means or standard deviations between simulated and original entire-chest ROIs. The observer performance suggests that an exposure >=0.2 mR is required to detect the presence or absence of pneumothoraces. These preliminary results indicate that the simulation tool is promising for achieving ALARA exposures in children.
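The core idea behind such dose-reduction simulation can be sketched as rescaling the nearly noise-free image to the target exposure and re-sampling Poisson (quantum) noise; a real CR tool must also model detector gain, the system transfer curve, and electronic noise, all omitted from this illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_low_dose(image_10mR, target_mR, gain=100.0):
    """Scale a 10-mR image to target_mR and re-sample quantum noise.

    The gain (grey value per expected photon count) is an invented
    placeholder for the detector calibration.
    """
    scale = target_mR / 10.0
    quanta = image_10mR * gain * scale        # expected photon counts
    noisy = rng.poisson(quanta).astype(float) # Poisson quantum noise
    return noisy / (gain * scale)             # back to the original grey scale

chest = rng.uniform(50.0, 200.0, size=(256, 256))  # stand-in 10-mR image
low = simulate_low_dose(chest, target_mR=0.1)
print("added noise std:", (low - chest).std())
```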
NASA Astrophysics Data System (ADS)
Adams, R.; Quinn, P. F.; Bowes, M. J.
2015-04-01
A model for simulating runoff pathways and water quality fluxes has been developed using the minimum information requirement (MIR) approach. The model, the Catchment Runoff Attenuation Flux Tool (CRAFT), is applicable to mesoscale catchments and focusses primarily on the hydrological pathways that mobilise nutrients. Hence CRAFT can be used to investigate the impact of flow-pathway management intervention strategies designed to reduce the loads of nutrients entering receiving watercourses. The model can help policy makers meet water quality targets and consider methods to obtain "good" ecological status. A case study of the 414 km² Frome catchment, Dorset, UK, is described here as an application of CRAFT that highlights the above issues at the mesoscale. The model was primarily calibrated on 10-year records of weekly data to reproduce the observed flows and nutrient (nitrate nitrogen, N; phosphorus, P) concentrations. Data from 2 years with sub-daily monitoring at the same site were also analysed. These data highlighted some additional signals in the nutrient flux, particularly of soluble reactive phosphorus, which were not observable in the weekly data. This analysis prompted the choice of a daily time step as the minimum information requirement to simulate the processes observed at the mesoscale, including the impact of uncertainty. A management intervention scenario was also run to demonstrate how the model can support catchment managers in investigating the effects of reducing the concentrations of N and P in the various flow pathways. This mesoscale modelling tool can help policy makers consider a range of strategies to meet the European Union (EU) water quality targets for this type of catchment.
QMRAcatch: Microbial Quality Simulation of Water Resources including Infection Risk Assessment
Schijven, Jack; Derx, Julia; de Roda Husman, Ana Maria; Blaschke, Alfred Paul; Farnleitner, Andreas H.
2016-01-01
Given the complex hydrologic dynamics of water catchments and conflicts between nature protection and public water supply, models may help to understand catchment dynamics and evaluate contamination scenarios and may support best environmental practices and water safety management. A catchment model can be an educative tool for investigating water quality and for communication between parties with different interests in the catchment. This article introduces an interactive computational tool, QMRAcatch, that was developed to simulate concentrations in water resources of Escherichia coli, a human-associated Bacteroidetes microbial source tracking (MST) marker, enterovirus, norovirus, Campylobacter, and Cryptosporidium as target microorganisms and viruses (TMVs). The model domain encompasses a main river with wastewater discharges and a floodplain with a floodplain river. Diffuse agricultural sources of TMVs that discharge into the main river are not included in this stage of development. The floodplain river is fed by the main river and may flood the plain. Discharged TMVs in the river are subject to dilution and temperature-dependent degradation. River travel times are calculated using the Manning–Gauckler–Strickler formula. Fecal deposits from wildlife, birds, and visitors in the floodplain are resuspended in flood water, runoff to the floodplain river, or infiltrate groundwater. Fecal indicator and MST marker data facilitate calibration. Infection risks from exposure to the pathogenic TMVs by swimming or drinking water consumption are calculated, and the required pathogen removal by treatment to meet a health-based quality target can be determined. Applicability of QMRAcatch is demonstrated by calibrating the tool for a study site at the River Danube near Vienna, Austria, using field TMV data, including a sensitivity analysis and evaluation of the model outcomes. PMID:26436266
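The Manning–Gauckler–Strickler formula named above gives the mean flow velocity as

```latex
% Manning-Gauckler-Strickler formula (v: mean velocity in m/s, n: roughness
% coefficient, R_h: hydraulic radius in m, S: slope of the energy line):
v = \frac{1}{n} \, R_h^{2/3} \, S^{1/2}
```

with the travel time of a river reach then following from its length divided by v.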
USDA-ARS's Scientific Manuscript database
The phosphorus (P) Index (PI) is the risk assessment tool approved in the NRCS 590 standard used to target critical source areas and practices to reduce P losses. A revision of the 590 standard, suggested using the Agricultural Policy/Environmental eXtender (APEX) model to assess the risk of nitroge...
Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs
Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara
2017-01-01
Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based on the fundamental chemistry and physics of combustion and fire spread or statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...
NASA Astrophysics Data System (ADS)
Dong, Feifei; Liu, Yong; Wu, Zhen; Chen, Yihui; Guo, Huaicheng
2018-07-01
Targeting nonpoint source (NPS) pollution hot spots is of vital importance for the placement of best management practices (BMPs). Although physically-based watershed models have been widely used to estimate nutrient emissions, connections between nutrient abatement and compliance with water quality standards have rarely been considered in NPS hotspot ranking, which may lead to ineffective decision-making. It is critical to develop a strategy to identify priority management areas (PMAs) based on the water quality response to nutrient load mitigation. A water quality constrained PMA identification framework was thereby proposed in this study, based on the simulation-optimization approach with ideal load reduction (ILR-SO). It integrates the physically-based Soil and Water Assessment Tool (SWAT) model and an optimization model under constraints of site-specific water quality standards. To our knowledge, it is the first effort to identify PMAs with simulation-based optimization. The SWAT model was established to simulate temporal and spatial nutrient loading and evaluate the effectiveness of pollution mitigation. A metamodel was trained to establish a quantitative relationship between sources and water quality. Ranking of priority areas is based on the nutrient load reduction required in each sub-watershed to satisfy water quality standards in waterbodies, calculated with a genetic algorithm (GA). The proposed approach was used to identify PMAs based on diffuse total phosphorus (TP) in the Lake Dianchi Watershed, one of the three most eutrophic large lakes in China. The modeling results demonstrated that 85% of diffuse TP came from 30% of the watershed area. Compared with the two conventional targeting strategies based on overland nutrient loss and instream nutrient loading, the ILR-SO model identified distinct PMAs and narrowed down the coverage of management areas. This study addresses the urgent need to incorporate water quality response into PMA identification and shows that the ILR-SO approach is effective in guiding watershed management for aquatic ecosystem restoration.
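The simulation-optimization loop can be caricatured as follows: a cheap metamodel predicts the water-quality response to candidate per-subwatershed load reductions, and a GA searches for the smallest total reduction that still meets the standard. Everything below (the linear metamodel, the numbers, the toy GA) is an invented placeholder for the study's calibrated SWAT-based surrogate and GA settings:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sub = 8
loads = rng.uniform(1.0, 10.0, n_sub)       # current TP loads per sub-watershed
response = rng.uniform(0.01, 0.05, n_sub)   # metamodel: d(conc)/d(load reduced)
target_drop = 0.6                           # required concentration decrease

def feasible(x):
    # x: fraction of each sub-watershed's load removed
    return np.dot(response, loads * x) >= target_drop

def cost(x):
    return np.dot(loads, x)                 # total load reduction demanded

# Tiny GA: truncation selection plus Gaussian mutation, penalizing infeasibility
pop = rng.uniform(0.0, 1.0, (200, n_sub))
for gen in range(300):
    penalty = np.where([feasible(x) for x in pop], 0.0, 1e6)
    fitness = np.array([cost(x) for x in pop]) + penalty
    parents = pop[np.argsort(fitness)[:50]]              # keep best quarter
    children = parents[rng.integers(0, 50, 150)] + rng.normal(0, 0.05, (150, n_sub))
    pop = np.clip(np.vstack([parents, children]), 0.0, 1.0)

best = pop[np.argmin([cost(x) + (0.0 if feasible(x) else 1e6) for x in pop])]
print("reduction fraction per sub-watershed:", best.round(2))
```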
Sato, Mitsuru; Tateishi, Kensuke; Murata, Hidetoshi; Kin, Taichi; Suenaga, Jun; Takase, Hajime; Yoneyama, Tomohiro; Nishii, Toshiaki; Tateishi, Ukihide; Yamamoto, Tetsuya; Saito, Nobuhito; Inoue, Tomio; Kawahara, Nobutaka
2018-06-26
The utility of surgical simulation with three-dimensional multimodality fusion imaging (3D-MFI) has been demonstrated. However, its potential in deep-seated brain lesions remains unknown. The aim of this study was to investigate the impact of 3D-MFI on deep-seated meningioma operations. Fourteen patients with deeply located meningiomas were included in this study. We constructed 3D-MFIs by fusing high-resolution magnetic resonance (MR) and computed tomography (CT) images with a rotational digital subtraction angiogram (DSA) in all patients. The surgical procedure was simulated with 3D-MFI prior to the operation. To assess the impact on neurosurgical education, the objective value of surgical simulation by 3D-MFI/virtual reality (VR) video was evaluated. To validate the quality of the 3D-MFIs, intraoperative findings were compared. The identification rate (IR) and positive predictive value (PPV) for the tumor-feeding arteries and the involved perforating arteries and veins were also assessed for quality assessment of 3D-MFI. After surgical simulation with 3D-MFI, near-total resection was achieved in 13 of 14 (92.9%) patients without neurological complications. 3D-MFI significantly contributed to neurosurgical residents'/fellows' understanding of surgical anatomy and the optimal surgical view (p < .0001) and to learning how to preserve critical vessels (p < .0001) and resect tumors safely and extensively (p < .0001). The IR of 3D-MFI for tumor-feeding arteries and for perforating arteries and veins was 100% and 92.9%, respectively. The PPV of 3D-MFI for tumor-feeding arteries and for perforating arteries and veins was 98.8% and 76.5%, respectively. 3D-MFI contributed to learning skull-base meningioma surgery and provided high quality for identifying critical anatomical structures within or adjacent to deep-seated meningiomas. Thus, 3D-MFI is a promising educational and surgical planning tool for meningiomas in deep-seated regions.
Molecular dynamic simulation for nanometric cutting of single-crystal face-centered cubic metals.
Huang, Yanhua; Zong, Wenjun
2014-01-01
In this work, molecular dynamics simulations are performed to investigate the influence of material properties on the nanometric cutting of single crystal copper and aluminum with a diamond cutting tool. The atomic interactions in the two metallic materials are modeled by two sets of embedded atom method (EAM) potential parameters. Simulation results show that although the plastic deformation of the two materials is achieved by dislocation activities, the deformation behavior and related physical phenomena, such as the machining forces, machined surface quality, and chip morphology, are significantly different for different materials. Furthermore, the influence of material properties on the nanometric cutting has a strong dependence on the operating temperature.
Procedural training and assessment of competency utilizing simulation.
Sawyer, Taylor; Gray, Megan M
2016-11-01
This review examines the current environment of neonatal procedural learning, describes an updated model of skills training, defines the role of simulation in assessing competency, and discusses potential future directions for simulation-based competency assessment. In order to maximize impact, simulation-based procedural training programs should follow a standardized and evidence-based approach to designing and evaluating educational activities. Simulation can be used to facilitate the evaluation of competency, but must incorporate validated assessment tools to ensure quality and consistency. True competency evaluation cannot be accomplished with simulation alone: competency assessment must also include evaluations of procedural skill during actual clinical care. Future work in this area is needed to measure and track clinically meaningful patient outcomes resulting from simulation-based training, examine the use of simulation to assist physicians undergoing re-entry to practice, and to examine the use of procedural skills simulation as part of a maintenance of competency and life-long learning. Copyright © 2016 Elsevier Inc. All rights reserved.
Monte Carlo simulations in X-ray imaging
NASA Astrophysics Data System (ADS)
Giersch, Jürgen; Durst, Jürgen
2008-06-01
Monte Carlo simulations have become crucial tools in many fields of X-ray imaging. They help to understand the influence of physical effects such as absorption, scattering and fluorescence of photons in different detector materials on image quality parameters. They allow studying new imaging concepts like photon counting, energy weighting or material reconstruction. Additionally, they can be applied to the fields of nuclear medicine to define virtual setups studying new geometries or image reconstruction algorithms. Furthermore, an implementation of the propagation physics of electrons and photons allows studying the behavior of (novel) X-ray generation concepts. This versatility of Monte Carlo simulations is illustrated with some examples done by the Monte Carlo simulation ROSI. An overview of the structure of ROSI is given as an example of a modern, well-proven, object-oriented, parallel computing Monte Carlo simulation for X-ray imaging.
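At its core, photon-transport Monte Carlo samples free path lengths from the exponential attenuation law and branches on the interaction type; the toy slab example below (invented cross sections, primaries only) shows the principle that codes like ROSI elaborate with full energy, angle, and detector modeling:

```python
import numpy as np

rng = np.random.default_rng(7)
mu_abs, mu_scat = 0.15, 0.05     # absorption/scatter coefficients (1/cm), assumed
mu_tot = mu_abs + mu_scat
thickness = 5.0                  # slab thickness (cm)
n_photons = 1_000_000

# Sample distance to first interaction from the exponential distribution
free_path = rng.exponential(1.0 / mu_tot, n_photons)
transmitted = free_path > thickness
interacted = ~transmitted
# Branch on interaction type with probability mu_abs / mu_tot
absorbed = interacted & (rng.random(n_photons) < mu_abs / mu_tot)

print("transmitted fraction:", transmitted.mean())   # ~ exp(-mu_tot * d)
print("analytic expectation:", np.exp(-mu_tot * thickness))
print("absorbed fraction:   ", absorbed.mean())
```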
Monte Carlo simulations in Nuclear Medicine
NASA Astrophysics Data System (ADS)
Loudos, George K.
2007-11-01
Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose codes (MCNP, Geant4, etc.) or dedicated codes (SimSET, etc.) have been developed, aiming to provide accurate and fast results. Special emphasis will be given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, briefly provide examples of currently simulated systems, and present future challenges that include the simulation of clinical studies and dosimetry applications.
NASA Astrophysics Data System (ADS)
Haris, H.; Chow, M. F.; Usman, F.; Sidek, L. M.; Roseli, Z. A.; Norlida, M. D.
2016-03-01
Urbanization is growing rapidly in Malaysia. Rapid urbanization is known to have several negative impacts on the hydrological cycle due to the decrease of pervious area and the deterioration of water quality in stormwater runoff. One of these impacts is the congestion of the stormwater drainage system, a situation leading to flash flood problems and water quality degradation. Many urban stormwater management software packages are available in the market, such as the Storm Water Drainage System design and analysis program (DRAINS), Urban Drainage and Sewer Model (MOUSE), InfoWorks River Simulation (InfoWorks RS), Hydrological Simulation Program-Fortran (HSPF), Distributed Routing Rainfall-Runoff Model (DR3M), Storm Water Management Model (SWMM), XP Storm Water Management Model (XPSWMM), MIKE-SWMM, Quality-Quantity Simulators (QQS), Storage, Treatment, Overflow, Runoff Model (STORM), and Hydrologic Engineering Centre-Hydrologic Modelling System (HEC-HMS). In this paper we briefly discuss several of these packages and their functionality, accessibility, characteristics, and components in the quantity analysis of hydrological design software, and compare them with MSMA Design Aid and Database. Green Infrastructure (GI) is one of the main topics being discussed widely all over the world, and every development in an urban area is related to GI. GI can be defined as green area built in a developed area, such as forest, park, wetland, or floodway. The role of GI is to improve living standards through, for example, water filtration or flood control. Among the twenty models compared to MSMA SME, ten were selected for a comprehensive review in this study, as they are widely accepted by water resource researchers. These ten tools are further classified into three major categories: models that address the stormwater management ability of GI in terms of quantity and quality, models capable of conducting an economic analysis of GI, and models that address both stormwater management and economic aspects together.
Design and simulation of EVA tools for first servicing mission of HST
NASA Technical Reports Server (NTRS)
Naik, Dipak; Dehoff, P. H.
1994-01-01
The Hubble Space Telescope (HST) was launched into near-earth orbit by the Space Shuttle Discovery on April 24, 1990. The payload of two cameras, two spectrographs, and a high-speed photometer is supplemented by three fine-guidance sensors that can be used for astronomy as well as for star tracking. A widely reported spherical aberration in the primary mirror causes HST to produce images of much lower quality than intended. A Space Shuttle repair mission in January 1994 installed small corrective mirrors that restored the full intended optical capability of the HST. The First Servicing Mission (FSM) involved considerable Extra Vehicular Activity (EVA). Special EVA tools for the FSM were designed and developed for this specific purpose. In an earlier report, the details of the Data Acquisition System developed to test the performance of the various EVA tools in ambient as well as simulated space environments were presented. The general schematic of the test setup is reproduced in this report for continuity. Although the data acquisition system was used extensively to test a number of fasteners, only the results of one test each carried out on the various fasteners and the Power Ratchet Tool are included in this report.
Multi Modal Anticipation in Fuzzy Space
NASA Astrophysics Data System (ADS)
Asproth, Viveca; Holmberg, Stig C.; Hâkansson, Anita
2006-06-01
We are all stakeholders in the geographical space, which makes up our common living and activity space. This means that careful, creative, and anticipatory planning, design, and management of that space will be of paramount importance for our sustained life on earth. Here it is shown that the quality of such planning could be significantly increased with the help of a computer-based modelling and simulation tool. Further, the design and implementation of such a tool ought to be guided by the conceptual integration of some core concepts, such as anticipation and retardation, multi-modal system modelling, fuzzy space modelling, and multi-actor interaction.
Wehbe-Janek, Hania; Colbert, Colleen Y; Govednik-Horny, Cara; White, Bobbie Ann A; Thomas, Scott; Shabahang, Mohsen
2012-06-01
Simulation has altered surgical curricula throughout residency programs. The purpose of this multimethod study was to explore residents' perceptions of simulation within surgical residency, as relevant stakeholder feedback and as program evaluation of the surgery simulation curriculum. Focus groups were held with a sample of surgery residents (n = 25) at a university-affiliated program. Residents participated in focus groups based on level of training and completed questionnaires regarding simulation curricula. Groups were facilitated by nonsurgeon faculty. Residents were asked: "What is the role of simulation in surgical education?" An interdisciplinary team recorded narrative data and performed content analyses. Quantitative data from questionnaires were summarized using descriptive statistics and frequencies. Major themes from the qualitative data included: concerns regarding simulation in surgical education (28%), exposure to situations and technical skills in a low-stress learning environment (24%), pressure by external agencies (19%), an educational tool (17%), and quality assurance for patient care (12%). Laparoscopy and cadaver lab were the most prevalent simulation training during residency, in addition to trauma simulations, central lines/chest tubes/IV access, and stapling lab. In response to the statement "ACGME should require a simulation curriculum in surgery residency," 52.1% responded favorably and 47.8% responded unfavorably. Residents acknowledge the value of simulation in patient safety, quality, and exposure to procedures before clinical experience, but remain divided on the efficacy and requirement of simulation within curricula. The greater challenge to residency programs may be the strategic implementation of simulation curricula within the right training context. Copyright © 2012 Mosby, Inc. All rights reserved.
π Scope: python based scientific workbench with visualization tool for MDSplus data
NASA Astrophysics Data System (ADS)
Shiraiwa, S.
2014-10-01
πScope is a Python-based scientific data analysis and visualization tool built on wxPython and Matplotlib. Although it is designed to be a generic tool, the primary motivations for developing the new software are 1) to provide an updated tool to browse MDSplus data, with functionality beyond dwscope and jScope, and 2) to provide a universal foundation for constructing interface tools to perform computer simulation and modeling for Alcator C-Mod. It provides many features for visualizing MDSplus data during tokamak experiments, including overplotting different signals and discharges, various plot types (line, contour, image, etc.), in-panel data analysis using Python scripts, and publication-quality graphics generation. Additionally, the logic to produce multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for dwscope users. πScope uses multi-threading to reduce data transfer latency, and its object-oriented design makes it easy to modify and expand, while its open-source nature allows portability. A built-in tree data browser allows a user to approach the data structure both from a GUI and from a script, enabling relatively complex data analysis workflows to be built quickly. As an example, an IDL-based interface to perform GENRAY/CQL3D simulations was ported to πScope, thus allowing LHCD simulation to be run between shots using C-Mod experimental profiles. This workflow is being used to generate a large database to develop a LHCD actuator model for the plasma control system. Supported by USDoE Award DE-FC02-99ER54512.
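Data access of the kind an in-panel πScope script performs goes through the MDSplus Python bindings; in the sketch below the tree name, shot number, and node path are placeholders rather than real C-Mod locations:

```python
from MDSplus import Tree

shot = 1120101001                     # hypothetical shot number
tree = Tree("cmod", shot)             # open the experiment tree (assumed name)
node = tree.getNode(r"\electrons::top.density")   # hypothetical node path

data = node.data()                    # signal samples as a numpy array
time = node.dim_of().data()           # associated time base
print(len(data), "samples from", time[0], "to", time[-1])
```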
Young Kim, Eun; Johnson, Hans J
2013-01-01
A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale, heterogeneous, multi-site, longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification, inspired by previous work. The primary contributions are robustness improvements from the incorporation of four elements: (1) utilizing multi-modal and repeated scans, (2) incorporating highly deformable registration, (3) using an extended set of tissue definitions, and (4) using multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated through a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessment through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, and it offers a flexible interface. In this paper, we describe enhancements to a joint registration, bias correction, and tissue classification method that improve the generalizability and robustness for processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human-subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.
NASA Astrophysics Data System (ADS)
Schafhirt, S.; Kaufer, D.; Cheng, P. W.
2014-12-01
In recent years many advanced load simulation tools, allowing aero-servo-hydro-elastic analyses of an entire offshore wind turbine, have been developed and verified. Nowadays, even an offshore wind turbine with a complex support structure such as a jacket can be analysed. However, the computational effort rises significantly with an increasing level of detail. This is especially true for offshore wind turbines with lattice support structures, since those models naturally have a higher number of nodes and elements than simpler monopile structures. During the design process multiple load simulations are demanded to obtain an optimal solution. In view of pre-design tasks it is crucial to apply load simulations which keep the simulation quality and the computational effort in balance. The paper introduces a reference wind turbine model consisting of the REpower 5M wind turbine and a jacket support structure with a high level of detail. In total, twelve variations of this reference model are derived and presented. The main focus is on simplifying the models of the support structure and the foundation. The reference model and the simplified models are simulated with the coupled simulation tool Flex5-Poseidon and analysed regarding frequencies, fatigue loads, and ultimate loads. A model has been found which achieves an adequate increase in simulation speed while keeping the results within an acceptable range compared to the reference results.
Dynamic Evaluation of Two Decades of CMAQ Simulations ...
This presentation focuses on the dynamic evaluation of the CMAQ model over the continental United States using multi-decadal simulations for the period from 1990 to 2010 to examine how well the changes in observed ozone air quality induced by variations in meteorology and/or emissions are simulated by the model. We applied spectral decomposition of the ozone time-series using the KZ filter to assess the variations in the strengths of synoptic (weather-induced variations) and baseline (long-term variation) forcings, embedded in the simulated and observed concentrations. The results reveal that CMAQ captured the year-to-year variability (more so in the later years than the earlier years) and the synoptic forcing in accordance with what the observations are showing. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
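As a rough illustration of the spectral decomposition mentioned above, the Kolmogorov-Zurbenko (KZ) filter is simply an iterated moving average; the sketch below separates a synthetic ozone series into baseline and synoptic components (the window lengths are common choices in the ozone literature, not necessarily the study's settings):

```python
import numpy as np

def kz_filter(x, m, k):
    """KZ(m, k): apply a window-m moving average k times (simple edges)."""
    window = np.ones(m) / m
    y = np.asarray(x, dtype=float)
    for _ in range(k):
        y = np.convolve(y, window, mode="same")
    return y

# Synthetic daily ozone: seasonal cycle plus weather-scale noise (10 years)
ozone = (40 + 10 * np.sin(np.arange(3650) * 2 * np.pi / 365.0)
         + np.random.default_rng(3).normal(0, 5, 3650))

baseline = kz_filter(ozone, m=15, k=5)              # long-term + seasonal part
synoptic = kz_filter(ozone, m=3, k=3) - baseline    # weather-induced part
print("baseline variance:", baseline.var())
print("synoptic variance:", synoptic.var())
```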
Multiobjective optimization of low impact development stormwater controls
NASA Astrophysics Data System (ADS)
Eckart, Kyle; McPhee, Zach; Bolisetti, Tirupati
2018-07-01
Green infrastructure such as Low Impact Development (LID) controls is being employed to manage urban stormwater and restore predevelopment hydrological conditions, besides improving the quality of stormwater runoff. Since runoff generation and infiltration processes are nonlinear, there is a need to identify the optimal combination of LID controls. A coupled optimization-simulation model was developed by linking the U.S. EPA Stormwater Management Model (SWMM) to the Borg Multiobjective Evolutionary Algorithm (Borg MOEA). The coupled model is capable of performing multiobjective optimization, using SWMM simulations as a tool to evaluate potential solutions to the optimization problem. The optimization-simulation tool was used to evaluate LID stormwater controls. A SWMM model was developed, calibrated, and validated for a sewershed in Windsor, Ontario, and LID stormwater controls were tested for three different return periods. LID implementation strategies were optimized using the optimization-simulation model for five different implementation scenarios for each of the three storm events, with the objectives of minimizing peak flow in the storm sewers, reducing total runoff, and minimizing cost. For the sewershed in Windsor, Ontario, the peak runoff and the total runoff volume were found to be reduced by 13% and 29%, respectively.
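The simulation side of such a coupling can be sketched with the open-source pyswmm bindings (not necessarily the interface the authors used); an optimizer like the Borg MOEA would invoke a function of this kind once per candidate LID layout, with "model.inp" and the node name as placeholders:

```python
from pyswmm import Simulation, Nodes

def evaluate_candidate(inp_file="model.inp", outfall_name="OUT1"):
    """Run one SWMM simulation and return the peak inflow at the outfall."""
    peak = 0.0
    with Simulation(inp_file) as sim:
        outfall = Nodes(sim)[outfall_name]
        for _ in sim:                         # step through the simulation
            peak = max(peak, outfall.total_inflow)
    return peak

# One objective value for the current candidate LID layout
print("peak flow for this layout:", evaluate_candidate())
```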
Benchmarking of measurement and simulation of transverse rms-emittance growth
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeon, Dong-O
2008-01-01
Transverse emittance growth along the Alvarez DTL section is a major concern with respect to the preservation of beam quality of high-current beams at the GSI UNILAC. In order to define measures to reduce this growth, appropriate tools to simulate the beam dynamics are indispensable. This paper concerns the benchmarking of three beam dynamics simulation codes, i.e. DYNAMION, PARMILA, and PARTRAN, against systematic measurements of beam emittances for different machine settings. Experimental set-ups, data reduction, the preparation of the simulations, and the evaluation of the simulations are described. It was found that the measured 100%-rms emittances behind the DTL exceed the simulated values. Comparing measured 90%-rms emittances to the simulated 95%-rms emittances gives fair to good agreement instead. The sum of horizontal and vertical emittances is even described well by the codes as long as experimental 90%-rms emittances are compared to simulated 95%-rms emittances. Finally, the successful reduction of transverse emittance growth by systematic beam matching is reported.
Application Of Moldex3D For Thin-wall Injection Moulding Simulation
NASA Astrophysics Data System (ADS)
Šercer, Mladen; Godec, Damir; Bujanić, Božo
2007-05-01
The benefits associated with decreasing wall thicknesses below their current values are still measurable and desired, even if the final wall thickness is nowhere near those of the aggressive portable electronics industry. It is important to note that gains in wall section reduction do not always occur without investment, in this case in tooling and machinery upgrades. Equally important is the fact that the productivity and performance benefits of reduced material usage, fast cycle times, and lighter weight can often outweigh most of the added costs. In order to eliminate unnecessary mould trials, minimize the product development cycle, reduce overall costs and improve product quality, polymer engineers use new CAE (Computer Aided Engineering) technology. This technology is a simulation tool which combines proven theories, material properties and process conditions to generate realistic simulations and produce valuable recommendations. Based on these recommendations, an optimal combination of product design, material and process conditions can be identified. In this work, Moldex3D software was used to simulate injection moulding in order to avoid potential moulding problems. The results gained from the simulation were used for the optimization of an existing product design, for mould development, and for the optimization of processing parameters, e.g. injection pressure, mould cavity temperature, etc.
Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka
2016-01-01
Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery, but are usually based on computer-aided design (CAD) tools that lack the ability to detect the collision of virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptics) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse themselves in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgery resident trainees received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15-minute time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcomes and found that the haptic simulation results were not significantly different from the actual postoperative outcomes. In contrast, the CAD results differed significantly from both the haptic simulation and the actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.
Full 3-D OCT-based pseudophakic custom computer eye model
Sun, M.; Pérez-Merino, P.; Martinez-Enriquez, E.; Velasco-Ocana, M.; Marcos, S.
2016-01-01
We compared measured wave aberrations in pseudophakic eyes implanted with aspheric intraocular lenses (IOLs) with simulated aberrations from numerical ray tracing on customized computer eye models, built using quantitative 3-D OCT-based patient-specific ocular geometry. Experimental and simulated aberrations show high correlation (R = 0.93; p<0.0001) and similarity (RMS for high order aberrations discrepancies within 23.58%). This study shows that full OCT-based pseudophakic custom computer eye models allow understanding the relative contribution of optical geometrical and surgically-related factors to image quality, and are an excellent tool for characterizing and improving cataract surgery. PMID:27231608
Paganetti, H; Jiang, H; Lee, S Y; Kooy, H M
2004-07-01
Monte Carlo dosimetry calculations are essential methods in radiation therapy. To take full advantage of this tool, the beam delivery system has to be simulated in detail and the initial beam parameters have to be known accurately. The modeling of the beam delivery system itself opens various areas where Monte Carlo calculations prove extremely helpful, such as the design and commissioning of a therapy facility as well as quality assurance verification. The gantry treatment nozzles at the Northeast Proton Therapy Center (NPTC) at Massachusetts General Hospital (MGH) were modeled in detail using the GEANT4.5.2 Monte Carlo code. For this purpose, various novel solutions for simulating irregularly shaped objects in the beam path, such as contoured scatterers, patient apertures, or patient compensators, were found. The four-dimensional (space and time) simulation of moving parts, such as the modulator wheel, was implemented. Further, the appropriate physics models and cross sections for proton therapy applications were defined. We present comparisons between measured data and simulations. These show that by modeling the treatment nozzle with millimeter accuracy, it is possible to reproduce measured dose distributions with an accuracy in range and modulation width, in the case of a spread-out Bragg peak (SOBP), of better than 1 mm. The excellent agreement demonstrates that the simulations can even be used to generate beam data for commissioning treatment planning systems. The Monte Carlo nozzle model was used to study mechanical optimization in terms of scattered and secondary radiation in the design of the nozzles. We present simulations of the neutron background. Further, the Monte Carlo calculations supported commissioning efforts in understanding the sensitivity of beam characteristics and how these influence the dose delivered. We present the sensitivity of dose distributions in water with respect to various beam parameters and geometrical misalignments. This allows the definition of tolerances for quality assurance and the design of quality assurance procedures.
Numerical simulations of novel high-power high-brightness diode laser structures
NASA Astrophysics Data System (ADS)
Boucke, Konstantin; Rogg, Joseph; Kelemen, Marc T.; Poprawe, Reinhart; Weimann, Guenter
2001-07-01
One of the key topics in today's semiconductor laser development activities is increasing the brightness of high-power diode lasers. Although structures showing increased brightness have been developed, specific drawbacks of these structures lead to a still strong demand for the investigation of alternative concepts. Especially for the investigation of fundamentally novel structures, easy-to-use and fast simulation tools are essential to avoid unnecessary, cost- and time-consuming experiments. A diode laser simulation tool based on finite-difference representations of the Helmholtz equation in the 'wide-angle' approximation and the carrier diffusion equation has been developed. An optimized numerical algorithm leads to short execution times of a few seconds per resonator round trip on a standard PC. After each round trip, characteristics like optical output power, beam profile and beam parameters are calculated. A graphical user interface allows online monitoring of the simulation results. The simulation tool is used to investigate a novel high-power, high-brightness diode laser structure, the so-called 'Z-Structure'. In this structure an increased brightness is achieved by reducing the divergence angle of the beam by angular filtering: the round-trip path of the beam is folded twice using internal total reflection at surfaces defined by a small index step in the semiconductor material, forming a stretched 'Z'. The sharp decrease of the reflectivity for angles of incidence above the angle of total reflection leads to a narrowing of the angular spectrum of the beam. The simulations of the 'Z-Structure' indicate an increase in beam quality by a factor of five to ten compared to standard broad-area lasers.
Establishing a convention for acting in healthcare simulation: merging art and science.
Sanko, Jill S; Shekhter, Ilya; Kyle, Richard R; Di Benedetto, Stephen; Birnbach, David J
2013-08-01
Among the most powerful tools available to simulation instructors is a confederate. Although technical and logical realism is dictated by the simulation platform and setting, the quality of role playing by confederates strongly determines psychological or emotional fidelity of simulation. The highest level of realism, however, is achieved when the confederates are properly trained. Theater and acting methodology can provide simulation educators a framework from which to establish an acting convention specific to the discipline of healthcare simulation. This report attempts to examine simulation through the lens of theater arts and represents an opinion on acting in healthcare simulation for both simulation educators and confederates. It aims to refine the practice of simulation by embracing the lessons of the theater community. Although the application of these approaches in healthcare education has been described in the literature, a systematic way of organizing, publicizing, or documenting the acting within healthcare simulation has never been completed. Therefore, we attempt, for the first time, to take on this challenge and create a resource, which infuses theater arts into the practice of healthcare simulation.
American Society of Composites, 32nd Technical Conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aitharaju, Venkat; Yu, Hang; Zhao, Selina
Resin transfer molding (RTM) has become increasingly popular for the manufacturing of composite parts. To enable high-volume manufacturing and obtain good quality parts at an acceptable cost to the automotive industry, accurate process simulation tools are necessary to optimize the process conditions. Towards that goal, General Motors and the ESI Group are developing a state-of-the-art process simulation tool for composite manufacturing in a project supported by the Department of Energy. This paper describes the modeling of the various stages of resin transfer molding, such as resin injection, resin curing, and part distortion. An instrumented RTM system located at the General Motors Research and Development center was used to perform flat-plaque molding experiments. The experimental measurements of fill time, in-mold pressure versus time, cure variation with time, and part deformation were compared with the model predictions, and very good correlations were observed.
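The injection-stage filling in such simulations is conventionally based on Darcy flow of resin through the fiber preform (the paper's specific constitutive models are not given here):

```latex
% Darcy's law for the resin injection stage (u: volume-averaged resin
% velocity, K: preform permeability tensor, mu: resin viscosity, p: pressure):
\mathbf{u} = -\frac{\mathbf{K}}{\mu}\,\nabla p
```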
WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Travel simulation modeling: an emerging tool for visitor management in wilderness
David N. Cole
2004-01-01
The amount, type, timing, and location of visitor use all have profound effects on the quality of the natural resources and visitor experiences in wilderness. Therefore, it is important to monitor the flow of visitation, in space and over time, and predict how distributions are likely to change in response to both management actions and factors that are not subject to...
2011-07-01
...joined the project team in the statistical and research coordination role. Dr. Collin is an employee of the University of Pittsburgh. ... 3. Submit to Ft. Detrick. Completed milestone: statistical analysis planning. 1. Review planned data metrics and data-gathering tools ... approach to performance assessment for continuous quality improvement, analyzing data with modern statistical techniques to determine the ...
Evaluation of the Quality of Online Information for Patients with Rare Cancers: Thyroid Cancer.
Kuenzel, Ulrike; Monga Sindeu, Tabea; Schroth, Sarah; Huebner, Jutta; Herth, Natalie
2017-01-24
The Internet offers easy and quick access to a vast amount of patient information. However, several studies point to the poor quality of many websites and the resulting hazards of false information. The aim of this study was to assess the quality of online information on thyroid cancer. A patient's search for information about thyroid cancer on German websites was simulated using the search engine Google and the patient portal "Patienten-Information.de". The websites were assessed using a standardized instrument from the German Cancer Society covering formal and content criteria. Supporting the results of prior studies that analysed patient information on the Internet, the data showed that the quality of patient information on thyroid cancer is highly heterogeneous and depends on the website provider. Most websites are run by media outlets and by health providers other than health insurers, practices, and professionals, and offer patient information of relatively poor quality; low-quality content predominates. Only a few trustworthy, high-quality websites exist. Google in particular, as a general-purpose search engine, favors the dissemination of information over quality. To improve patient information from the Internet, the visibility of high-quality websites must be increased. For that, education programs to improve patients' eHealth literacy are needed. A quick and easy evaluation tool for online information suited for patients should be implemented, and patients should be taught to integrate such a tool into their research process.
Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.
2014-06-08
High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts, such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been developed that have the potential to mitigate many power quality concerns. However, local closed-loop control may lead to unintended behavior in deployed systems, as complex interactions can occur between numerous operating devices. To enable the study of the performance of advanced control schemes in a detailed distribution system environment, a test platform has been developed that integrates Power Hardware-in-the-Loop (PHIL) with concurrent time-series electric distribution system simulation. In the test platform, GridLAB-D, a distribution system simulation tool, runs a detailed simulation of a distribution feeder in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling. At the National Renewable Energy Laboratory (NREL), a hardware inverter interacts with grid and PV simulators emulating an operational distribution system. Power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and interactions of two inverter control modes (constant power factor and active Volt/VAr control) when integrated into a simulated IEEE 8500-node test feeder. We demonstrate that this platform is well suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, the results are used to validate GridLAB-D simulations of advanced inverter controls.
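The exchange pattern at the heart of such a platform can be sketched in a few lines. The following Python loop is a schematic only: the transport functions are hypothetical placeholders, not the actual PNNL/NREL or GridLAB-D interfaces.

import time

def read_pcc_voltage_from_simulation():      # hypothetical interface
    return 1.002                              # p.u.; would come from the feeder model

def send_setpoint_to_grid_simulator(v_pu):    # hypothetical interface
    pass                                      # drives the grid/PV simulators at the hardware site

def measure_inverter_output():                # hypothetical interface
    return 4.8e3, -1.1e3                      # P [W], Q [var] from the hardware inverter

def inject_pq_into_simulation(p_w, q_var):    # hypothetical interface
    pass                                      # updates the feeder model at the PCC

STEP = 0.1  # s, real-time co-simulation step
for _ in range(100):
    t0 = time.monotonic()
    v = read_pcc_voltage_from_simulation()    # 1. feeder model -> hardware
    send_setpoint_to_grid_simulator(v)
    p, q = measure_inverter_output()          # 2. hardware -> feeder model
    inject_pq_into_simulation(p, q)
    # hold the loop to real time so hardware and simulation stay synchronized
    time.sleep(max(0.0, STEP - (time.monotonic() - t0)))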
Real-Time Performance of Mechatronic PZT Module Using Active Vibration Feedback Control.
Aggogeri, Francesco; Borboni, Alberto; Merlo, Angelo; Pellegrini, Nicola; Ricatto, Raffaele
2016-09-25
This paper proposes an innovative mechatronic piezo-actuated module to control vibrations in modern machine tools. Vibrations represent one of the main issues that seriously compromise the quality of the workpiece. The active vibration control (AVC) device is composed of a host part integrated with sensors and actuators synchronized by a regulator; it is able to make a self-assessment and adjust to alterations in the environment. In particular, an innovative smart actuator has been designed and developed to satisfy machining requirements during active vibration control. This study presents the mechatronic model based on the kinematic and dynamic analysis of the AVC device. To ensure real-time performance, an H2-LQG controller has been developed and validated by simulations involving machine tool, PZT actuator, and controller models. A Hardware-in-the-Loop (HIL) architecture is adopted to control and attenuate the vibrations. A set of experimental tests has been performed to validate the AVC module on a commercial machine tool. The feasibility of real-time vibration damping is demonstrated and the simulation accuracy is evaluated.
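For readers unfamiliar with the control design, a minimal LQG (H2) synthesis for a single vibration mode can be sketched as follows; the modal plant and weighting matrices are illustrative, not the authors' machine-tool model.

import numpy as np
from scipy.linalg import solve_continuous_are

m, c, k = 1.0, 2.0, 1.0e4          # modal mass, damping, stiffness (hypothetical)
A = np.array([[0.0, 1.0], [-k / m, -c / m]])
B = np.array([[0.0], [1.0 / m]])   # PZT actuator force input
C = np.array([[1.0, 0.0]])         # displacement measurement

# LQR state-feedback gain: u = -K x
Q, R = np.diag([1e6, 1.0]), np.array([[1e-3]])
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Kalman estimator gain from the dual Riccati equation
W, V = np.diag([1e-4, 1e-2]), np.array([[1e-6]])   # process/measurement noise
S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(V)

# Closed-loop check: regulator and estimator poles should be in the left half-plane
print("regulator poles:", np.linalg.eigvals(A - B @ K))
print("estimator poles:", np.linalg.eigvals(A - L @ C))

The combination of the state-feedback gain K and the estimator gain L is the LQG compensator; in the paper's setting it would run in real time against the HIL plant.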
NASA Astrophysics Data System (ADS)
Ali-Bey, Mohamed; Moughamir, Saïd; Manamanni, Noureddine
2011-12-01
In this paper a simulator of a multi-view shooting system with parallel optical axes and a structurally variable configuration is proposed. The system under consideration is dedicated to the production of 3D content for auto-stereoscopic visualization. The global shooting/viewing geometrical process, which is the kernel of this shooting system, is detailed, and the different viewing, transformation, and capture parameters are defined. An appropriate perspective projection model is then derived to build a simulator. The simulator is first used to validate the global geometrical process in the case of a static configuration. Next, it is used to show the limitations of a static configuration of this type of shooting system for dynamic scenes, and a dynamic scheme is devised to allow correct capture of such scenes. The effect of the different geometrical capture parameters on the 3D rendering quality, and whether they need to be adapted, is then studied. Finally, some dynamic effects and their repercussions on the 3D rendering quality of dynamic scenes are analyzed using error images and image quantization tools. Simulation and experimental results are presented throughout the paper to illustrate the different points studied, and conclusions and perspectives end the paper.
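The geometry of a parallel-axis multi-view rig reduces, per camera, to a laterally offset pinhole projection. The toy sketch below computes the per-view parallax between a near and a far scene point; focal length, baseline, and view count are hypothetical, not the paper's configuration.

import numpy as np

f = 0.035            # focal length [m]
n_views = 8
baseline = 0.065     # inter-camera spacing [m] (hypothetical)
offsets = (np.arange(n_views) - (n_views - 1) / 2) * baseline

def image_x(X, Z, b):
    # Horizontal image coordinate of point (X, ., Z) seen by a camera offset by b
    return f * (X - b) / Z

Z_near, Z_far = 2.0, 10.0
# Per-view parallax between near and far points on the central optical axis:
for i, b in enumerate(offsets):
    parallax = image_x(0.0, Z_near, b) - image_x(0.0, Z_far, b)
    print(f"view {i}: parallax = {1e3 * parallax:+.2f} mm on the sensor")

In a structurally variable configuration, the offsets (and convergence-free axes) would be adapted per frame for dynamic scenes, which is exactly what the dynamic scheme in the paper addresses.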
Simulation of air quality impacts from prescribed fires on an urban area.
Hu, Yongtao; Odman, M Talat; Chang, Michael E; Jackson, William; Lee, Sangil; Edgerton, Eric S; Baumann, Karsten; Russell, Armistead G
2008-05-15
On February 28, 2007, a severe smoke event caused by prescribed forest fires occurred in Atlanta, GA. Later smoke events in southeastern metropolitan areas of the United States caused by the Georgia-Florida wild forest fires further magnified the significance of forest fire emissions and the benefits of being able to accurately predict such occurrences. Using preburning information, we applied an operational forecasting system to simulate the potential air quality impacts of the two large February 28th fires. Our "forecast" predicts that the scheduled prescribed fires would have resulted in over 1 million Atlanta residents being potentially exposed to fine particulate matter (PM2.5) levels of 35 microg m(-3) or higher from 4 p.m. to midnight. The simulated peak 1 h PM2.5 concentration is about 121 microg m(-3). Our study suggests that current air quality forecasting technology can be a useful tool for helping manage fire activities to protect public health. With postburning information, our "hindcast" predictions improved significantly on timing and location and slightly on peak values. "Hindcast" simulations also indicated that additional isoprenoid emissions from pine species temporarily triggered by the fire could induce rapid ozone and secondary organic aerosol formation during late winter. Results from this study suggest that fire-induced biogenic volatile organic compound emissions, which are missing from current fire emissions estimates, should be included in the future.
Managing simulation-based training: A framework for optimizing learning, cost, and time
NASA Astrophysics Data System (ADS)
Richmond, Noah Joseph
This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation-based training (SBT) and reality-based training (RBT) as resources. Simulation is shown to be an effective means for implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints in cost and time, and also subject to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are then provided. Finally, a simulation strategy including an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.
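The core optimization can be illustrated as a small linear program: choose simulator and live hours to maximize TER-weighted learning subject to cost and time budgets. All coefficients below are hypothetical, not the study's calibrated values.

from scipy.optimize import linprog

ter = 0.6                      # learning value of 1 SBT hour relative to 1 RBT hour
cost = [1.5e3, 18e3]           # $ per hour: [simulator, aircraft]
budget = 2.0e6                 # $ per trainee cohort
time_cap = 300.0               # total training hours available

# linprog minimizes, so negate the learning objective: max ter*s + r
res = linprog(c=[-ter, -1.0],
              A_ub=[[cost[0], cost[1]],   # cost constraint
                    [1.0, 1.0]],          # time constraint
              b_ub=[budget, time_cap],
              bounds=[(0, None), (20, None)])  # require at least 20 live hours
s_hours, r_hours = res.x
print(f"simulator: {s_hours:.0f} h, live: {r_hours:.0f} h, "
      f"learning index: {-res.fun:.1f}")

Preference constraints on the proportional or absolute use of simulation, as in the F/A-18 case study, would enter as additional rows of A_ub.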
Jørgensen, Katarina M; Haddow, Pauline C
2011-08-01
Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has little or no experience with such tools. This educational gap limits both the potential use of such tools and the potential for tighter cooperation between their designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as educational tools for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than is found in most biological science curricula, making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction, but the discussion is relevant to a broader set of biological simulation tools.
NASA Astrophysics Data System (ADS)
Akhavan Niaki, Farbod
The objective of this research is, first, to investigate the applicability and advantage of statistical state estimation methods over deterministic methods for predicting tool wear in machining nickel-based superalloys, and second, to study the effects of cutting tool wear on the quality of the part. Nickel-based superalloys are among those classes of materials known as hard-to-machine alloys. These materials maintain their strength at high temperature and exhibit high resistance to corrosion and creep, characteristics that make them ideal candidates for harsh environments like the combustion chambers of gas turbines. However, the same characteristics that make nickel-based alloys suitable for aggressive conditions introduce difficulties when machining them. High strength and low thermal conductivity accelerate cutting tool wear and increase the possibility of in-process tool breakage. A blunt tool deteriorates the surface integrity and damages the quality of the machined part by inducing high tensile residual stresses, generating micro-cracks, altering the microstructure, or leaving a poor roughness profile behind; as a consequence, the expensive superalloy part may have to be scrapped. The current dominant industrial solution is to sacrifice productivity by replacing the tool in the early stages of its life, or to choose conservative cutting conditions that lower the wear rate and preserve workpiece quality. Thus, monitoring the state of the cutting tool and estimating its effects on part quality is a critical task for increasing productivity and profitability in machining superalloys. This work aims first to introduce a probabilistic framework for estimating tool wear in milling and turning of superalloys, and second to study the detrimental effects of the functional state of the cutting tool, in terms of wear and wear rate, on part quality. In the milling operation, the mechanisms of tool failure were first identified and, based on the rapid catastrophic failure of the tool, a Bayesian inference method (Markov Chain Monte Carlo, MCMC) was used to calibrate the parameters of a power-based mechanistic tool wear model. The calibrated model was then used in the state-space probabilistic framework of a Kalman filter to estimate tool flank wear. Furthermore, an on-machine laser measuring system was utilized and fused into the Kalman filter to improve estimation accuracy. In the turning operation, the behavior of progressive wear was investigated as well. Due to the nonlinear nature of wear in turning, an extended Kalman filter was designed to track progressive wear, and the results of the probabilistic method were compared with a deterministic technique, with significant improvement (more than a 60% increase in estimation accuracy) achieved. To fulfill the second objective of this research, understanding the underlying effects of wear on part quality in cutting nickel-based superalloys, a comprehensive study of surface roughness, dimensional integrity, and residual stress was conducted. The estimates derived from the probabilistic filter were used to find the proper correlations between wear, surface roughness, and dimensional integrity, along with a finite element simulation for predicting the residual stress profile for sharp and worn cutting tool conditions.
The output of this research provides the essential information on condition monitoring of the tool and its effects on product quality. The low-cost Hall effect sensor used in this work to capture spindle power in the context of the stochastic filter can effectively estimate tool wear in both milling and turning operations, while the estimated wear can be used to generate knowledge of the state of workpiece surface integrity. Therefore the true functionality and efficiency of the tool in superalloy machining can be evaluated without additional high-cost sensing.
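A stripped-down version of the turning-case estimator conveys the idea: an extended Kalman filter tracks flank wear through a nonlinear growth model, corrected by noisy spindle-power measurements. The wear and power models below are illustrative stand-ins for the calibrated mechanistic model, not the study's parameters.

import numpy as np

dt = 1.0                  # s between power samples
c1, c2 = 2.0e-4, 3.0      # hypothetical wear-rate constants
a, b = 0.8, 1.2           # hypothetical power model: p = a*w + b [kW]
Q, R = 1e-8, 0.05**2      # process / measurement noise variances

def f(w):  return w + dt * c1 * np.exp(c2 * w)          # nonlinear wear growth
def F(w):  return 1.0 + dt * c1 * c2 * np.exp(c2 * w)   # Jacobian d f / d w

rng = np.random.default_rng(0)
w_true, w_est, P = 0.0, 0.0, 1e-4
for k in range(600):
    w_true = f(w_true)                          # simulated "true" wear
    z = a * w_true + b + rng.normal(0.0, 0.05)  # measured spindle power
    # EKF predict
    w_pred = f(w_est)
    P = F(w_est) ** 2 * P + Q
    # EKF update with the linear power measurement (H = a)
    K = P * a / (a * a * P + R)
    w_est = w_pred + K * (z - (a * w_pred + b))
    P = (1.0 - K * a) * P
print(f"true wear: {w_true:.3f} mm, estimated: {w_est:.3f} mm")

Fusing an occasional direct wear reading (e.g., from an on-machine laser) would simply add a second, more accurate measurement update to the same filter.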
Recent Research applications at the Athens Neutron Monitor Station
NASA Astrophysics Data System (ADS)
Mavromichalaki, H.; Gerontidou, M.; Paschalis, P.; Papaioannou, A.; Paouris, E.; Papailiou, M.; Souvatzoglou, G.
2015-08-01
Ground-based neutron monitor measurements play a key role in the fields of space physics, solar-terrestrial relations, and space weather applications. The Athens cosmic ray group has developed several research applications, such as an optimized automated Ground Level Enhancement alert (GLE Alert Plus) and a web interface providing data from multiple neutron monitor stations (Multi-Station tool). These services are available via the Space Weather Portal operated by the European Space Agency (http://swe.ssa.esa.int). In addition, two simulation tools based on Geant4 have been implemented: one for the simulation of cosmic ray showers in the atmosphere (DYASTIMA) and one for the simulation of the 6NM-64 neutron monitor. The contribution of these simulation tools to calculating the radiation dose received by air crews and passengers within the Earth's atmosphere and to neutron monitor studies is presented as well. Furthermore, the accurate calculation of the barometric coefficient and the primary data processing by filtering algorithms, such as the well-known Median Editor and the ANN Algorithm and Edge Editor developed by the Athens group, which contribute to the provision of high-quality neutron monitor data, are also discussed. Finally, a Space Weather Forecasting Center, which provides a three-day geomagnetic activity report on a daily basis, has been set up and has been operating for the last two years at the Athens Neutron Monitor Station.
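The filtering step can be illustrated with a simple running-median despiker in the spirit of a Median Editor: a sample is replaced when it deviates from the median of its neighbours by more than a set fraction. Window and threshold values here are illustrative, not the Athens station's settings.

import numpy as np

def median_editor(counts, window=11, max_dev=0.02):
    counts = np.asarray(counts, dtype=float)
    half = window // 2
    cleaned = counts.copy()
    for i in range(len(counts)):
        lo, hi = max(0, i - half), min(len(counts), i + half + 1)
        med = np.median(counts[lo:hi])
        if abs(counts[i] - med) > max_dev * med:
            cleaned[i] = med          # flag/replace the outlier
    return cleaned

rng = np.random.default_rng(1)
rate = 100.0 + rng.normal(0.0, 0.3, 1000)   # synthetic 1-min count rates
rate[[100, 500]] = [160.0, 20.0]            # injected spikes
print(np.flatnonzero(median_editor(rate) != rate))  # -> [100 500]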
In-flight simulation studies at the NASA Dryden Flight Research Facility
NASA Technical Reports Server (NTRS)
Shafer, Mary F.
1992-01-01
Since the late 1950's, the National Aeronautics and Space Administration's Dryden Flight Research Facility has found in-flight simulation to be an invaluable tool. In-flight simulation has been used to address a wide variety of flying qualities questions, including low-lift-to-drag-ratio approach characteristics for vehicles like the X-15, the lifting bodies, and the Space Shuttle; the effects of time delays on controllability of aircraft with digital flight-control systems; the causes and cures of pilot-induced oscillation in a variety of aircraft; and flight-control systems for such diverse aircraft as the X-15 and the X-29. In-flight simulation has also been used to anticipate and avoid problems, and to solve problems once they appear. Presented here is an account of in-flight simulation at the Dryden Flight Research Facility, along with some discussion. An extensive bibliography is included.
Man-rated flight software for the F-8 DFBW program
NASA Technical Reports Server (NTRS)
Bairnsfather, R. R.
1975-01-01
The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures, and anomaly reporting are discussed. The primary verification tools--the all-digital simulator, the hybrid simulator, and the Iron Bird simulator--are described, as are the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program, in which 42 flights, totaling 58 hours of flight time, were successfully made without any DFCS in-flight software or hardware failures.
In-flight simulation studies at the NASA Dryden Flight Research Facility
NASA Technical Reports Server (NTRS)
Shafer, Mary F.
1994-01-01
Since the late 1950's the National Aeronautics and Space Administration's Dryden Flight Research Facility has found in-flight simulation to be an invaluable tool. In-flight simulation has been used to address a wide variety of flying qualities questions, including low lift-to-drag ratio approach characteristics for vehicles like the X-15, the lifting bodies, and the space shuttle; the effects of time delays on controllability of aircraft with digital flight control systems; the causes and cures of pilot-induced oscillation in a variety of aircraft; and flight control systems for such diverse aircraft as the X-15 and the X-29. In-flight simulation has also been used to anticipate problems, avoid them, and solve problems once they appear. This paper presents an account of the in-flight simulation at the Dryden Flight Research Facility and some discussion. An extensive bibliography is included.
NASA Astrophysics Data System (ADS)
Ghamlouch, T.; Roux, S.; Bailleul, J.-L.; Lefèvre, N.; Sobotka, V.
2017-10-01
The aerospace industry's first priority today is to improve the quality of composite parts while reducing manufacturing time, in order to increase their quality/cost ratio. A fabrication method that can meet these specifications, especially for large parts, is the autoclave curing process: autoclave molding ensures thermal control of the composite parts during the whole curing cycle. However, the geometry of the tools, as well as their positioning in the autoclave, induces non-uniform and complex flows around composite parts. This heterogeneity implies non-uniform heat transfer, which can directly impact part quality. One of the main challenges is therefore to describe the flow field inside an autoclave as well as the convective heat transfer from the heated pressurized gas to the composite part and the mold. For this purpose, and given the technical issues associated with instrumentation and measurements in actual autoclaves, an autoclave model was designed and manufactured based on similarity laws. This tool allows the measurement of the flow field around representative real industrial molds using the PIV technique, and the characterization of the heat transfer thanks to thermal instrumentation. The experimental results are then compared with those derived from numerical simulations using a commercial RANS CFD code. This study aims at developing a semi-empirical approach for predicting the heat transfer coefficient around the parts, and therefore the thermal history of a part during the process, with a view to optimization.
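The semi-empirical step, estimating a local convective coefficient from the measured flow, can be sketched with a Dittus-Boelter-type correlation, h = Nu k / L with Nu = C Re^0.8 Pr^0.4. The gas properties below are rough values for pressurized nitrogen, and C and L are placeholders; the actual correlation would be fitted to the PIV and thermal data.

def htc_dittus_boelter(u, L, rho, mu, k, cp, C=0.023):
    re = rho * u * L / mu          # Reynolds number on mold length scale L
    pr = cp * mu / k               # Prandtl number
    nu = C * re**0.8 * pr**0.4     # Nusselt number (turbulent forced convection)
    return nu * k / L              # heat transfer coefficient [W/(m^2 K)]

rho, mu, k, cp = 4.6, 2.4e-5, 0.035, 1.05e3   # approx. nitrogen at ~7 bar, 180 C
for u in (1.0, 3.0, 10.0):                     # gas speeds from PIV [m/s]
    h = htc_dittus_boelter(u, 0.5, rho, mu, k, cp)
    print(f"u = {u:5.1f} m/s -> h = {h:6.1f} W/(m2 K)")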
Wieland, Birgit; Ropte, Sven
2017-01-01
The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results. PMID:28981458
Garcia, Ana Maria
2009-01-01
A study of the Currituck Sound was initiated in 2005 to evaluate the water chemistry of the Sound and assess the effectiveness of management strategies. As part of this study, the Soil and Water Assessment Tool (SWAT) model was used to simulate current sediment and nutrient loadings for two distinct watersheds in the Currituck Sound basin and to determine the consequences of different water-quality management scenarios. The watersheds studied were (1) Tull Creek watershed, which has extensive row-crop cultivation and artificial drainage, and (2) West Neck Creek watershed, which drains urban areas in and around Virginia Beach, Virginia. The model simulated monthly streamflows with Nash-Sutcliffe model efficiency coefficients of 0.83 and 0.76 for Tull Creek and West Neck Creek, respectively. The daily sediment concentration coefficient of determination was 0.19 for Tull Creek and 0.36 for West Neck Creek. The coefficient of determination for total nitrogen was 0.26 for both watersheds and for dissolved phosphorus was 0.4 for Tull Creek and 0.03 for West Neck Creek. The model was used to estimate current (2006-2007) sediment and nutrient yields for the two watersheds. Total suspended-solids yield was 56 percent lower in the urban watershed than in the agricultural watershed. Total nitrogen export was 45 percent lower, and total phosphorus was 43 percent lower in the urban watershed than in the agricultural watershed. A management scenario with filter strips bordering the main channels was simulated for Tull Creek. The Soil and Water Assessment Tool model estimated a total suspended-solids yield reduction of 54 percent and total nitrogen and total phosphorus reductions of 21 percent and 29 percent, respectively, for the Tull Creek watershed.
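The Nash-Sutcliffe coefficient reported above has a one-line definition, NSE = 1 - Σ(obs - sim)² / Σ(obs - mean(obs))²; a value of 1 is a perfect fit, and values at or below 0 mean the model predicts no better than the observed mean. A minimal implementation with made-up flows (not the Currituck Sound data):

import numpy as np

def nse(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) \
               / np.sum((observed - observed.mean()) ** 2)

obs = [1.2, 0.9, 1.8, 3.4, 2.2, 1.1]   # illustrative monthly flows [m3/s]
sim = [1.0, 1.1, 1.6, 3.0, 2.5, 1.2]
print(f"NSE = {nse(obs, sim):.2f}")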
Sloane, Elliot; Gehlot, Vijay
2005-01-01
Hospitals and manufacturers are designing and deploying IEEE 802.x wireless technologies in medical devices to promote patient mobility and flexible facility use. There is little information, however, on the reliability or ultimate safety of connecting multiple wireless life-critical medical devices from multiple vendors using commercial 802.11a, 802.11b, 802.11g, or pre-802.11n devices. It is believed that 802.11-type devices can introduce unintended life-threatening risks unless delivery of critical patient alarms to central monitoring systems and/or clinical personnel is assured by proper use of 802.11e Quality of Service (QoS) methods. Petri net tools can be used to simulate all possible states and transitions between devices and/or systems in a wireless device network, and can identify failure modes in advance. Colored Petri Net (CPN) tools are ideal, in fact, as they allow tracking and controlling each message in a network based on pre-selected criteria. This paper describes a research project using CPN to simulate and validate alarm integrity in a small multi-modality wireless patient monitoring system. A 20-monitor wireless patient monitoring network was created in two versions: one with non-prioritized 802.x CSMA protocols, and a second with simulated QoS capabilities similar to 802.11e (i.e., allowing message priority management). In the standard 802.x network, dangerous heart arrhythmia and pulse oximetry alarms could not be reliably and rapidly communicated, but the second network's QoS priority management reduced that risk significantly.
Challenges of NDE simulation tool validation, optimization, and utilization for composites
NASA Astrophysics Data System (ADS)
Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter
2016-02-01
Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.
Comparing Simulated and Observed Spectroscopic Signatures of Mix in Omega Capsules
NASA Astrophysics Data System (ADS)
Tregillis, I. L.; Shah, R. C.; Hakel, P.; Cobble, J. A.; Murphy, T. J.; Krasheninnikova, N. S.; Hsu, S. C.; Bradley, P. A.; Schmitt, M. J.; Batha, S. H.; Mancini, R. C.
2012-10-01
The Defect-Induced Mix Experiment (DIME) campaign at Los Alamos National Laboratory uses multi-monochromatic X-ray imaging (MMI) [T. Nagayama, R. C. Mancini, R. Florido, et al., J. Appl. Phys. 109, 093303 (2011)] to detect the migration of high-Z spectroscopic dopants into the hot core of an imploded capsule. We have developed an MMI post-processing tool for producing synthetic datasets from two- and three-dimensional Lagrangian numerical simulations of Omega and NIF shots. These synthetic datasets are of sufficient quality, and contain sufficient physics, that they can be analyzed in the same manner as actual MMI data. We have carried out an extensive comparison between simulated and observed MMI data for a series of polar direct-drive shots carried out at the Omega laser facility in January 2011. The capsule diameter was 870 microns; the 15 micron CH ablators contained a 2 micron Ti-doped layer along the inner edge. All capsules were driven with 17 kJ; some capsules were manufactured with an equatorial "trench" defect. This talk will focus on the construction of spectroscopic-quality synthetic MMI datasets from numerical simulations, and their correlation with MMI measurements.
NASA Astrophysics Data System (ADS)
Xu, Mengjia; Zou, Changxin; Zhao, Yanwei
2017-04-01
Environmental/ecological models are widely used for lake management as they provide a means to understand the physical, chemical, and biological processes in highly complex ecosystems. Most research has focused on developing environmental (water quality) and ecological models separately. Few studies have coupled the two, and in these limited coupled models the lake was treated as a whole (i.e., as one well-mixed box), which is appropriate for small lakes but insufficient to capture spatial variations within middle-scale or large-scale lakes. This paper seeks to establish a zoning-based coupled environmental-ecological model for a lake. Baiyangdian Lake, the largest freshwater lake in Northern China, was adopted as the study case. Coupled lake models, comprising a hydrodynamics and water quality model built in MIKE21 and a compartmental ecological model built in STELLA software, were established for the middle-sized Baiyangdian Lake to simulate spatial variations in ecological conditions. On the basis of the flow field distribution generated by the MIKE21 hydrodynamic model, four water area zones were used as an example for calibrating and validating the compartmental ecological model. The results revealed that the coupled lake models reasonably reflect the changes in the key state variables, although some state variables are not well represented due to the low quality of the field monitoring data. Monitoring sites in a compartment may not be representative of the water quality and ecological conditions in the entire compartment, even though that is the intention of compartment-based model design. For some periods there was only one ecological observation from a single monitoring site; this single-measurement issue may cause large discrepancies, particularly when the sampled site is not representative of the whole compartment. As an example application in lake restoration and management, the coupled models were used to simulate spatial trends in ecological condition under ecological water replenishment. The simulation results indicate that the models can provide a useful tool for lake restoration and management. The simulated spatial trends can provide a foundation for establishing permissible ranges for a selected set of water quality indices under management measures such as watershed pollution load control and ecological water transfer. The coupled models can also help in understanding the processes taking place in the lake ecosystem and the interactions between its components and external conditions. Taken together, the proposed models show promise as middle-scale or large-scale lake management tools for pollution load control and ecological water transfer, quantifying the implications of proposed water management decisions.
Radio Frequency Scanning and Simulation of Oriented Strand Board Material Property
NASA Astrophysics Data System (ADS)
Liu, Xiaojian; Zhang, Jilei; Steele, Philip. H.; Donohoe, J. Patrick
2008-02-01
Oriented strand board (OSB) is the wood composite product with the largest market share in U.S. residential and commercial construction. Wood specific gravity (SG) and moisture content (MC) play an important role in the OSB manufacturing process. They are two of the critical variables that manufacturers must monitor, locate, and control in order to produce a product of consistent quality. In this study, radio frequency scanning nondestructive evaluation (NDE) technologies were used to evaluate the local-area MC and SG of OSB panels following panel production by hot pressing. Finite element simulation software was used to optimize the sensor geometry and to investigate the interaction between the electromagnetic field and the wood's dielectric properties. Our results indicate that the RF scanning response is closely correlated with the MC and SG variations in OSB panels. Radio frequency NDE appears to have potential as an effective method for ensuring OSB panel quality during manufacturing.
The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diachin, L F; Garaizar, F X; Henson, V E
2009-10-12
In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.
RFI and SCRIMP Model Development and Verification
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Sayre, Jay
2000-01-01
Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool for the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify the model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool enabling the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible in all cases. Finally, the VARTM model was used as a process analysis tool, enabling the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from varying the process constraints in the modeling of several different composite panels and was chosen by considering such factors as infiltration time, the number of vacuum ports, and possible areas of void entrapment.
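The physics added to the model can be illustrated with the one-dimensional Darcy fill-time estimate, t = φ μ x² / (2 K ΔP_eff), where capillary pressure adds to the applied vacuum and a gravity head can subtract from it. The material values below are hypothetical, not the measured panel properties; note the capillary term trims the fill time by a few percent, consistent with the result above.

def fill_time(x, phi, mu, K, dp_applied, p_capillary=0.0, rho_g_h=0.0):
    # Effective driving pressure: applied vacuum + capillary - gravity head [Pa]
    dp_eff = dp_applied + p_capillary - rho_g_h
    return phi * mu * x * x / (2.0 * K * dp_eff)

phi, mu, K = 0.5, 0.2, 2e-10     # porosity, resin viscosity [Pa s], permeability [m^2]
dp = 90e3                        # vacuum-driven pressure differential [Pa]
t0 = fill_time(0.5, phi, mu, K, dp)
t1 = fill_time(0.5, phi, mu, K, dp, p_capillary=5e3)
print(f"no capillary term: {t0/60:.1f} min; with it: {t1/60:.1f} min "
      f"({100*(t0-t1)/t0:.1f}% faster)")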
Teis, Rachel; Allen, Jyai; Lee, Nigel; Kildea, Sue
2017-02-01
No study has tested a Crisis Resource Management prompt on resuscitation performance. We conducted a feasibility, unblinded, parallel-group, randomised controlled trial at one Australian paediatric hospital (June-September 2014). Eligible participants were any doctor, nurse, or nurse manager who would normally be involved in a Medical Emergency Team simulation. The unit of block randomisation was one of six scenarios (3 control : 3 intervention), run with or without a verbal prompt. The primary outcomes tested the feasibility and utility of the intervention and the data collection tools. The secondary outcomes measured resuscitation quality and team performance. Data were analysed from six resuscitation scenarios (n=49 participants): three control groups (n=25) and three intervention groups (n=24). The ability to measure all items on the data collection tools was hindered by problems with the recording devices, both in the mannequins and in the video camera. A future pilot study should provide greater training for the prompt role and pre-brief participants about assessment of their cardiopulmonary resuscitation quality. Data could be analysed in real time, with independent video analysis to validate findings. Two cameras would strengthen the reliability of the methods. Copyright © 2016 College of Emergency Nursing Australasia. Published by Elsevier Ltd. All rights reserved.
Analysis of the thermo-mechanical deformations in a hot forging tool by numerical simulation
NASA Astrophysics Data System (ADS)
L-Cancelos, R.; Varas, F.; Martín, E.; Viéitez, I.
2016-03-01
Although programs have been developed for the design of hot forging tools, their design is still largely based on the experience of the toolmaker. This obliges the toolmaker to build trial dies and correct their errors to minimize distortions in the forged piece. This phase prior to mass production consumes time and material resources, which makes the final product more expensive. Forging tools usually consist of several parts made of different grades of steel, which in turn have different mechanical properties and therefore suffer different degrees of strain. Furthermore, the tools used in hot forging are exposed to a thermal field that also induces strain and stress, depending on the degree of confinement of the piece. The mechanical behaviour of the assembly is therefore determined by the contact between the different pieces. Numerical simulation makes it possible to analyse different configurations and anticipate possible defects before toolmaking, thus reducing the costs of this preliminary phase. In order to improve the dimensional quality of the manufactured parts, the work presented here focuses on the application of a numerical model to a hot forging manufacturing process in order to predict the areas of the forging die subjected to large deformations. The thermo-mechanical model, developed and implemented with free software (Code-Aster), includes strains of thermal origin, strains during the forging impact, and contact effects. The numerical results are validated against experimental measurements on a tooling set that produces forged crankshafts for the automotive industry, and show good agreement with the experimental tests. The result is a very useful tool for the design of tooling sets for hot forging.
A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition
2015-10-05
... simulation tool; CREATE(TM)-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool; and CREATE(TM)-AV DaVinci [15-16], a conceptual through ... Scott A. Morton and David R. ... a multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows ...
Ma, Mingying; Wang, Xiangzhao; Wang, Fan
2006-11-10
The degradation of image quality caused by aberrations of the projection optics in lithographic tools is a serious problem in optical lithography. We propose what we believe to be a novel technique for measuring aberrations of projection optics based on two-beam interference theory. Utilizing partially coherent imaging theory, we derive a model that accurately characterizes the relative image displacement, induced by aberrations, of a fine grating pattern with respect to a large pattern. Both even and odd aberrations are extracted independently from the relative image displacements of the printed patterns under two-beam interference imaging of the zeroth and positive first orders. Simulation results show that this technique can measure the aberrations present in a lithographic tool with higher accuracy.
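The relation underlying the technique is compact: when only two diffraction orders form the image, the fringe pattern shifts laterally by dx = (phi1 - phi0) p / (2 pi), where phi0 and phi1 are the wavefront phases sampled at the two pupil points and p is the grating pitch. The sketch below evaluates this for an illustrative aberration level; the numbers are hypothetical, not the paper's data.

import numpy as np

def pattern_shift(phi0, phi1, pitch):
    # Lateral displacement of a two-beam interference fringe [same units as pitch]
    return (phi1 - phi0) / (2.0 * np.pi) * pitch

pitch = 250e-9                  # fine grating pitch [m]
# Hypothetical aberration: 10 milli-waves of phase at the +1st-order pupil
# point, zero phase at the 0th-order (on-axis) point.
phi0 = 0.0
phi1 = 2.0 * np.pi * 0.010
print(f"image shift = {1e9 * pattern_shift(phi0, phi1, pitch):.2f} nm")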
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodrigues, A; Wu, Q; Sawkey, D
Purpose: DEAR is a radiation therapy technique utilizing synchronized motion of gantry and couch during delivery to optimize dose distribution homogeneity and penumbra for treatment of superficial disease. Dose calculation for DEAR is not yet supported by commercial TPSs. The purpose of this study is to demonstrate the feasibility of using a web-based Monte Carlo (MC) simulation tool (VirtuaLinac) to calculate dose distributions for a DEAR delivery. Methods: MC simulations were run through VirtuaLinac, which is based on the GEANT4 platform. VirtuaLinac utilizes detailed linac head geometry and material models, validated phase space files, and a voxelized phantom. The input was expanded to include an XML file for simulation of varying mechanical axes as a function of MU. A DEAR XML plan was generated, used in the MC simulation, and delivered on a TrueBeam in Developer Mode. Radiographic film wrapped on a cylindrical phantom (12.5 cm radius) measured dose at a depth of 1.5 cm, which was compared to the simulation results. Results: A DEAR plan was simulated using an energy of 6 MeV and a 3×10 cm² cut-out in a 15×15 cm² applicator for a delivery of a 90° arc. The resulting data were found to provide qualitative and quantitative evidence that the simulation platform could be used as the basis for DEAR dose calculations. The resulting unwrapped 2D dose distributions agreed well in the cross-plane direction along the arc, with field sizes of 18.4 and 18.2 cm and penumbrae of 1.9 and 2.0 cm for measurements and simulations, respectively. Conclusion: Preliminary feasibility of a DEAR delivery using a web-based MC simulation platform has been demonstrated. This tool will benefit treatment planning for DEAR as a benchmark for developing other model-based algorithms, allowing efficient optimization of trajectories and quality assurance of plans without the need for extensive measurements.
Björklund, Karin; Bondelind, Mia; Karlsson, Anna; Karlsson, Dick; Sokolova, Ekaterina
2018-02-01
The risk from chemical substances in surface waters is often increased during wet weather, due to surface runoff, combined sewer overflows (CSOs) and erosion of contaminated land. There are strong incentives to improve the quality of surface waters affected by human activities, not only from ecotoxicity and ecosystem health perspectives, but also for drinking water and recreational purposes. The aim of this study is to investigate the influence of urban stormwater discharges and CSOs on receiving water in the context of chemical health risks and recreational water quality. Transport of copper (Cu) and benzo[a]pyrene (BaP) in the Göta River (Sweden) was simulated using a hydrodynamic model. Within the 16 km modelled section, 35 CSO and 16 urban stormwater point discharges, as well as the effluent from a major wastewater treatment plant, were included. Pollutant concentrations in the river were simulated for two rain events and investigated at 13 suggested bathing sites. The simulations indicate that water quality guideline values for Cu are exceeded at several sites, and that stormwater discharges generally give rise to higher Cu and BaP concentrations than CSOs. Due to the location of point discharges and the river current inhibiting lateral mixing, the north shore of the river is better suited for bathing. Peak concentrations have a short duration; increased concentrations of the pollutants may however be present for several days after a rain event. Monitoring of river water quality indicates that simulated Cu and BaP concentrations are in the same order of magnitude as measured concentrations. It is concluded that hydrodynamic modelling is a useful tool for identifying suitable bathing sites in urban surface waters and areas of concern where mitigation measures should be implemented to improve water quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
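A drastically simplified, one-dimensional advection-dispersion picture of a pollutant pulse can still convey the transport behaviour that the full 2D hydrodynamic model resolves. The coefficients below are illustrative, not Göta River values.

import numpy as np

L, N = 16e3, 400                 # reach length [m], grid cells
dx = L / N
u, D = 0.3, 5.0                  # velocity [m/s], dispersion [m^2/s]
dt = 0.25 * dx / u               # CFL-limited time step
c = np.zeros(N)
c[10:20] = 1.0                   # CSO/stormwater pulse (relative concentration)

def step(c):
    # upwind advection + central-difference dispersion, zero-gradient ends
    cp = np.pad(c, 1, mode="edge")
    adv = -u * (cp[1:-1] - cp[:-2]) / dx
    dif = D * (cp[2:] - 2 * cp[1:-1] + cp[:-2]) / dx**2
    return c + dt * (adv + dif)

t_total = 12 * 3600.0            # 12 h after the rain event
for _ in range(int(t_total / dt)):
    c = step(c)
print(f"peak now {c.max():.2f} of initial, "
      f"located {c.argmax() * dx / 1e3:.1f} km downstream")

The qualitative behaviour matches the abstract's observation: the peak passes quickly, but elevated concentrations persist well after the event.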
The simcyp population based simulator: architecture, implementation, and quality assurance.
Jamei, Masoud; Marciniak, Steve; Edwards, Duncan; Wragg, Kris; Feng, Kairui; Barnett, Adrian; Rostami-Hodjegan, Amin
2013-01-01
Developing a user-friendly platform that can handle a vast number of complex physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) models, for both conventional small molecules and larger biologic drugs, is a substantial challenge. Over the last decade the Simcyp Population Based Simulator has gained popularity in major pharmaceutical companies (70% of the top 40 in terms of R&D spending). Under the guidance of the Simcyp Consortium, it has evolved from a simple drug-drug interaction tool into a sophisticated and comprehensive Model Based Drug Development (MBDD) platform covering a broad range of applications spanning early drug discovery to late drug development. This article provides an update on the latest architectural and implementation developments within the Simulator. Interconnection between peripheral modules, the dynamic model building process, and compound and population data handling are all described. The Simcyp Data Management (SDM) system, which contains the system and drug databases, helps implement quality standards through seamless integration and tracking of any changes. This also supports internal approval procedures, validation, and auto-testing of newly implemented models and algorithms, an area of high interest to regulatory bodies.
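The building block beneath such a platform is the compartmental model; reduced to a single compartment with first-order absorption and elimination, it has the closed-form Bateman solution, sketched below with hypothetical parameters. A full PBPK model chains many such organ compartments together.

import numpy as np

def conc(t, dose, F, Vd, ka, ke):
    # Plasma concentration after an oral dose (one-compartment Bateman equation)
    return (F * dose * ka) / (Vd * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 24.0, 97)                              # hours
c = conc(t, dose=500.0, F=0.9, Vd=40.0, ka=1.2, ke=0.15)    # mg, L, 1/h
print(f"Cmax = {c.max():.2f} mg/L at t = {t[c.argmax()]:.1f} h")

Population simulation, the Simulator's defining feature, amounts to resampling parameters such as Vd and ke from physiologically grounded distributions across virtual individuals.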
A better sequence-read simulator program for metagenomics.
Johnson, Stephen; Trost, Brett; Long, Jeffrey R; Pittet, Vanessa; Kusalik, Anthony
2014-01-01
There are many programs available for generating simulated whole-genome shotgun sequence reads. The data generated by many of these programs follow predefined models, which limits their use to the authors' original intentions. For example, many models assume that read lengths follow a uniform or normal distribution. Other programs generate models from actual sequencing data, but are limited to reads from single-genome studies. To our knowledge, there are no programs that allow a user to generate simulated data following non-parametric read-length distributions and quality profiles based on empirically-derived information from metagenomics sequencing data. We present BEAR (Better Emulation for Artificial Reads), a program that uses a machine-learning approach to generate reads with lengths and quality values that closely match empirically-derived distributions. BEAR can emulate reads from various sequencing platforms, including Illumina, 454, and Ion Torrent. BEAR requires minimal user input, as it automatically determines appropriate parameter settings from user-supplied data. BEAR also uses a unique method for deriving run-specific error rates, and extracts useful statistics from the metagenomic data itself, such as quality-error models. Many existing simulators are specific to a particular sequencing technology; however, BEAR is not restricted in this way. Because of its flexibility, BEAR is particularly useful for emulating the behaviour of technologies like Ion Torrent, for which no dedicated sequencing simulators are currently available. BEAR is also the first metagenomic sequencing simulator program that automates the process of generating abundances, which can be an arduous task. BEAR is useful for evaluating data processing tools in genomics. It has many advantages over existing comparable software, such as generating more realistic reads and being independent of sequencing technology, and has features particularly useful for metagenomics work.
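The non-parametric idea is easy to sketch: resample read lengths and per-position quality values from empirically observed arrays, then convert Phred scores to error probabilities when introducing errors. The "training" arrays below are tiny stand-ins for statistics harvested from a real FASTQ run; this is an illustration of the approach, not BEAR's actual implementation.

import numpy as np

rng = np.random.default_rng(42)
observed_lengths = np.array([150, 148, 150, 132, 150, 97, 150, 143])  # from a real run
observed_quals = rng.integers(20, 41, size=(8, 150))                  # per-position Phred scores

def simulate_read(genome):
    n = rng.choice(observed_lengths)                  # empirical length distribution
    start = rng.integers(0, len(genome) - n)
    read = list(genome[start:start + n])
    quals = observed_quals[rng.integers(len(observed_quals)), :n]
    for i, q in enumerate(quals):                     # Phred q -> error prob 10^(-q/10)
        if rng.random() < 10.0 ** (-q / 10.0):
            read[i] = rng.choice([b for b in "ACGT" if b != read[i]])
    return "".join(read), quals

genome = "".join(rng.choice(list("ACGT"), size=5000))
read, quals = simulate_read(genome)
print(read[:50], quals[:10])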
Implementing a prototyping network for injection moulded imaging lenses in Finland
NASA Astrophysics Data System (ADS)
Keränen, K.; Mäkinen, J.-T.; Pääkkönen, E. J.; Koponen, M.; Karttunen, M.; Hiltunen, J.; Karioja, P.
2005-10-01
A network for prototyping imaging lenses using injection moulding was established in Finland. The network consists of several academic and industrial partners capable of designing, processing, and characterising imaging lenses produced by injection moulding technology. In order to validate the operation of the network, a demonstrator lens was produced. The process steps included lens specification, design and modelling, material selection, mould tooling, moulding process simulation, injection moulding, and characterisation. A magnifying imaging singlet lens to be used as an add-on in a camera phone was selected as the demonstrator. The design of the add-on lens proved somewhat challenging, but a double-aspheric singlet design nearly fulfilling the requirement specification was produced. In the material selection task, the overall characteristics of polymethyl methacrylate (PMMA) were found best suited to the pilot case: it is a low-cost material with good moulding properties, and it was therefore selected for the pilot lens. The lens mould was designed using I-DEAS and tested using MoldFlow 3D injection moulding simulation software. The simulations predicted the achievable lens quality for a two-cavity mould design. The first cavity was tooled directly into the mould plate, and the second cavity was made by tooling separate insert pieces for the mould. The mould material was steel and the inserts were made from Moldmax copper alloy. Parts were tooled with high-speed milling machines, and the insert pieces were hand polished after tooling. Prototype lenses were injection moulded using two PMMA grades, 6N and 7N, and different process parameters were explored in the injection moulding test runs. Prototypes were characterised by measuring the mechanical dimensions, surface profile, roughness, and MTF of the lenses. Characterisation showed that the lens surface RMS roughness was 30-50 nm and the profile deviation was 5 μm from the design at a distance of 0.3 mm from the lens vertex. These manufacturing defects caused the measured MTF values to be lower than designed. The overall lens quality, however, was adequate to demonstrate the concept successfully. Through the implementation of the demonstrator lens we could effectively test the different stages of the manufacturing process, obtain information about process component weight and risk factors, and validate the overall performance of the network.
NASA Astrophysics Data System (ADS)
Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.
2017-12-01
Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects that meet diverse goals: regulating stormwater runoff and its pollutants, minimizing economic costs, and maximizing environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes the water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis come from the International Stormwater BMP Database and from an ongoing systematic review of the peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth science.
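Choosing the regionalization that "explains the most variance" amounts to comparing the between-group share of total variance (eta-squared) across candidate groupings. A minimal sketch with made-up BMP removal data, not values from the International Stormwater BMP Database:

import numpy as np

def variance_explained(values, groups):
    values, groups = np.asarray(values, float), np.asarray(groups)
    grand = values.mean()
    ss_total = np.sum((values - grand) ** 2)
    ss_between = 0.0
    for g in np.unique(groups):
        v = values[groups == g]
        ss_between += len(v) * (v.mean() - grand) ** 2
    return ss_between / ss_total

removal = [62, 70, 55, 80, 83, 78, 40, 45, 52]   # % TSS removal (hypothetical)
by_climate = ["humid"] * 3 + ["arid"] * 3 + ["cold"] * 3
by_coast = ["east", "west", "east", "west", "east", "west", "east", "east", "east"]
print(f"climate grouping explains {variance_explained(removal, by_climate):.0%}")
print(f"coastal grouping explains {variance_explained(removal, by_coast):.0%}")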
BACT Simulation User Guide (Version 7.0)
NASA Technical Reports Server (NTRS)
Waszak, Martin R.
1997-01-01
This report documents the structure and operation of a simulation model of the Benchmark Active Control Technology (BACT) Wind-Tunnel Model. The BACT system was designed, built, and tested at NASA Langley Research Center as part of the Benchmark Models Program and was developed to perform wind-tunnel experiments to obtain benchmark quality data to validate computational fluid dynamics and computational aeroelasticity codes, to verify the accuracy of current aeroservoelasticity design and analysis tools, and to provide an active controls testbed for evaluating new and innovative control algorithms for flutter suppression and gust load alleviation. The BACT system has been especially valuable as a control system testbed.
Krueger, Linda; Ernstmeyer, Kim; Kirking, Ellen
2017-06-01
The purpose of this study was to examine the influence of a multipatient, interprofessional simulation session on nursing students' attitudes toward nurse-physician collaboration, using the Jefferson Scale of Attitudes Toward Physician-Nurse Collaboration. Final-semester nursing students, along with medical residents and students from other health programs, participated in a simulation exercise that included a period of prebriefing, simulation, and debriefing. Participants completed pre- and postsimulation surveys to assess the impact on collaboration. In total, 268 nursing students completed the survey. Participants had a more positive attitude toward nurse-physician collaboration after the simulation event than before it. Significant differences between male and female nursing students were found on mean postsimulation scores and on three of the four subscales of the tool. Interprofessional simulation may be an effective way to enhance collaborative relationships, which ultimately may influence patient safety and quality of care. [J Nurs Educ. 2017;56(6):321-327.]. Copyright 2017, SLACK Incorporated.
Moriasi, Daniel N; Gowda, Prasanna H; Arnold, Jeffrey G; Mulla, David J; Ale, Srinivasulu; Steiner, Jean L; Tomer, Mark D
2013-11-01
Subsurface tile drains in agricultural systems of the midwestern United States are a major contributor of nitrate-N (NO3-N) loadings to hypoxic conditions in the Gulf of Mexico. Hydrologic and water quality models, such as the Soil and Water Assessment Tool, are widely used to simulate tile drainage systems. The Hooghoudt and Kirkham tile drain equations in the Soil and Water Assessment Tool have not been rigorously tested for predicting tile flow and the corresponding NO3-N losses. In this study, long-term (1983-1996) monitoring plot data from southern Minnesota were used to evaluate the SWAT version 2009 revision 531 (hereafter referred to as SWAT) model for accurately estimating subsurface tile drain flows and associated NO3-N losses. A retention parameter adjustment factor was incorporated to account for the effects of tile drainage and slope changes on the computation of surface runoff using the curve number method (hereafter referred to as Revised SWAT). The SWAT and Revised SWAT models were calibrated and validated for tile flow and associated NO3-N losses. Results indicated that, on average, Revised SWAT predicted monthly tile flow and associated NO3-N losses better than SWAT by 48 and 28%, respectively. For the calibration period, the Revised SWAT model simulated tile flow and NO3-N losses within 4 and 1% of the observed data, respectively. For the validation period, it simulated tile flow and NO3-N losses within 8 and 2%, respectively, of the observed values. Therefore, the Revised SWAT model is expected to provide more accurate simulation of the effectiveness of tile drainage and NO3-N management practices. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
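For context, the curve number runoff computation that the retention parameter adjustment targets looks roughly like the following. The SCS relations (S = 25400/CN - 254 in mm; Q = (P - 0.2S)^2 / (P + 0.8S) for P > 0.2S) are standard; the multiplicative form and value of the adjustment factor are illustrative assumptions, since the abstract does not give the Revised SWAT formulation.

    def scs_runoff(precip_mm, cn, retention_adj=1.0):
        """Daily surface runoff (mm) from the SCS curve number method.
        retention_adj is a stand-in for the paper's retention parameter
        adjustment factor (its exact form is not given in the abstract)."""
        s = retention_adj * (25400.0 / cn - 254.0)  # retention parameter (mm)
        ia = 0.2 * s                                # initial abstraction
        if precip_mm <= ia:
            return 0.0
        return (precip_mm - ia) ** 2 / (precip_mm + 0.8 * s)

    # A tile-drained field retains more water before running off, so an
    # adjustment factor > 1 lowers the simulated surface runoff:
    print(scs_runoff(50.0, cn=78))                     # default behaviour
    print(scs_runoff(50.0, cn=78, retention_adj=1.3))  # drainage-adjusted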
Analysis of visual quality improvements provided by known tools for HDR content
NASA Astrophysics Data System (ADS)
Kim, Jaehwan; Alshina, Elena; Lee, JongSeok; Park, Youngo; Choi, Kwang Pyo
2016-09-01
In this paper, the visual quality of different solutions for high dynamic range (HDR) compression is analyzed using MPEG test contents. We also simulate a method for efficient HDR compression based on the statistical properties of the signal. The method is compliant with the HEVC specification and is also easily compatible with alternative methods that might require HEVC specification changes. It was subjectively tested on commercial TVs and compared with alternative solutions for HDR coding. Subjective visual quality tests were performed on a SUHD TV (Samsung JS9500) with maximum luminance of up to 1000 nits. The statistics-based solution shows improvements in both objective performance and visual quality compared with other HDR solutions, while remaining compatible with the HEVC specification.
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 3 2013-01-01 2013-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 3 2012-01-01 2012-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 3 2011-01-01 2011-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 3 2014-01-01 2014-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
NASA Astrophysics Data System (ADS)
Mishra, S. K.; Ding, D.; Rapolu, U.
2012-12-01
Human activity is intricately linked to the quality and quantity of water resources. Although many studies have examined water-human interaction, the complexity of such coupled systems is not well understood, largely because of gaps in our knowledge of water-cycle processes, which are heavily influenced by socio-economic drivers. In this context, this team has investigated the connections among agriculture, policy, climate, land use/land cover, and water quality in Iowa over the past couple of years. To help explore these connections, the team is developing a variety of cyberinfrastructure tools that facilitate the collection, analysis and visualization of data, and the simulation of system dynamics. In an ongoing effort, the prototype system is applied to the Clear Creek watershed, an agriculture-dominated catchment in Iowa in the US Midwest, to understand water-human processes relevant to farmers' management decisions regarding agroecosystems. The primary aim of this research is to understand the connections that exist among the agricultural and biofuel economy, land use/land cover change, and water quality. To help explore these connections, an agent-based model (ABM) of land use change has been developed that simulates the decisions made by farmers given alternative assumptions about market forces, farmer characteristics, and water quality regulations. The SWAT model was used to simulate the impact of these decisions on the movement of sediment, nitrogen, and phosphorus across the landscape. The paper also demonstrates how, through the use of this system, researchers can, for example, search for scenarios that lead to desirable socio-economic outcomes while preserving water quantity and quality.
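A toy sketch of the kind of farmer decision rule such an ABM encodes: each agent weighs expected crop returns against a water quality penalty, filtered through its own risk attitude. All prices, yields, nitrogen losses, and the decision rule itself are hypothetical placeholders, not the model described in the abstract.

    import random

    class FarmerAgent:
        """Minimal land-use agent: picks the crop with the higher expected
        net return. All numbers below are hypothetical placeholders."""
        def __init__(self, acres, risk_aversion):
            self.acres = acres
            self.risk_aversion = risk_aversion  # 0 = risk-neutral

        def choose_crop(self, corn_price, grass_price, nitrate_fee):
            # $/acre: price * yield minus a per-pound fee on nitrate-N losses
            corn_return = corn_price * 170 - nitrate_fee * 30   # 170 bu, 30 lb N
            grass_return = grass_price * 5 - nitrate_fee * 5    # 5 t, 5 lb N
            # Risk-averse agents discount the more volatile corn return
            corn_return *= (1.0 - self.risk_aversion)
            return "corn" if corn_return > grass_return else "switchgrass"

    agents = [FarmerAgent(acres=random.randint(40, 400),
                          risk_aversion=random.random() * 0.5)
              for _ in range(100)]
    choices = [a.choose_crop(corn_price=3.0, grass_price=60.0, nitrate_fee=2.0)
               for a in agents]
    print(choices.count("corn"), "of 100 agents plant corn")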
Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.
Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P
2018-03-03
Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boddu, S; Morrow, A; Krishnamurthy, N
Purpose: Our goal is to implement lean methodology to make our current process from CT simulation to treatment more efficient. Methods: We implemented lean methodology and tools, employing flowcharts in Excel for process mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant, and huddled bi-weekly to map current value streams. We performed Gemba walks and observed current processes from the scheduling of patient CT simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first-time quality, undesirable effects (UDEs), and wait times from relevant members of each task. UDEs were binned by frequency of occurrence. We huddled to map the future state and to find solutions to high-frequency UDEs. We implemented visual controls and hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We identified approximately 64 UDEs in our current workflow that could cause delays, cause re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 and 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were of low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/RapidArc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found, and 5 were addressed, solving 20% of the issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment-site-specific workflows to identify bottlenecks, potential breakdowns and personnel allocation, and will employ tools like failure mode and effects analysis to mitigate risk factors and make this process efficient.
1990-11-01
Applying the chain rule... It remains to calculate dB_i/dH_i and dB_{i+1}/dH_{i+1}. Now as calculation proceeds from node to node... simulation models can be powerful tools for studying these issues. However, to be useful, the water quality model must be properly suited for the problem... G_{N-1}(Q_N, A_N, Q_{N-1}, A_{N-1}) = 0, G_N(Q_N, A_N) = 0 (47). The solution of these nonlinear equations can proceed in two ways. First the nonlinear terms...
da Silva, Robson Rodrigues; Bissaco, Marcia Aparecida Silva; Goroso, Daniel Gustavo
2015-12-01
Understanding the basic concepts of the physiology and biophysics of cardiac cells can be improved by virtual experiments that illustrate the complex excitation-contraction coupling process in cardiac cells. The aim of this study is to propose a rat cardiac myocyte simulator with which the calcium dynamics in the excitation-contraction coupling of an isolated cell can be observed. This model has been used in the course "Mathematical Modeling and Simulation of Biological Systems". In this paper we present the didactic utility of the simulator MioLab(®). The simulator enables virtual experiments that can help in studying inhibitors and activators of the sarcoplasmic reticulum and the sodium-calcium exchanger, thus supporting a better understanding of the effects of medications used to treat arrhythmias on these compartments. The graphical interfaces were developed not only to facilitate the use of the simulator, but also to promote constructive learning on the subject, since there are animations and videos for each stage of the simulation. The effectiveness of the simulator was tested by a group of graduate students. Some examples of simulations are presented in order to describe the overall structure of the simulator. Part of these virtual experiments became an activity for Biomedical Engineering graduate students, who evaluated the simulator based on its didactic quality. As a result, students answered a questionnaire on the usability and functionality of the simulator as a teaching tool. All students performed the proposed activities and classified the simulator as an optimal or good learning tool. In their written answers, students indicated some problems with graph visualization as negative characteristics; as positive characteristics, they indicated the simulator's didactic function, especially the tutorials and videos on the topic of this study. The results show that the simulator complements the study of the physiology and biophysics of the cardiac cell. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
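As a rough illustration of the calcium dynamics such a simulator exposes, the sketch below integrates a two-pool (cytosol / sarcoplasmic reticulum) calcium model with forward Euler: a brief triggered release, SERCA reuptake, and exchanger extrusion. The model form and rate constants are illustrative, not the MioLab parameter set.

    import numpy as np

    # Two-pool calcium transient with forward Euler; all rate constants
    # are illustrative placeholders, not MioLab's.
    dt, t_end = 1e-4, 1.0                  # s
    ca_cyto, ca_sr = 0.1, 100.0            # uM, cytosolic / SR calcium
    k_rel, k_serca, k_ncx = 1.0, 8.0, 2.0  # 1/s: release, reuptake, extrusion
    trace = []
    for ti in np.arange(0.0, t_end, dt):
        release = k_rel * ca_sr if ti < 0.01 else 0.0  # brief triggered release
        reuptake = k_serca * ca_cyto                   # SERCA pump back to SR
        extrusion = k_ncx * ca_cyto                    # Na-Ca exchanger efflux
        ca_cyto += dt * (release - reuptake - extrusion)
        ca_sr += dt * (reuptake - release)
        trace.append(ca_cyto)
    print(f"peak cytosolic Ca: {max(trace):.2f} uM")  # rises, then decays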
StePS: Stereographically Projected Cosmological Simulations
NASA Astrophysics Data System (ADS)
Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László
2018-05-01
StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and a naive O(N²) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The O(N²) force calculation is easy to adapt to modern graphics cards, hence StePS can function as a high-speed prediction tool for modern large-scale surveys.
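The naive force kernel is simple enough to sketch. The version below is a NumPy illustration of the O(N²) pairwise summation (with Plummer-style softening), not StePS code, which targets GPUs.

    import numpy as np

    def direct_forces(pos, mass, soft=0.01):
        """Naive O(N^2) pairwise gravitational accelerations (G = 1 units),
        the kind of brute-force kernel that maps well onto GPUs."""
        n = len(pos)
        acc = np.zeros_like(pos)
        for i in range(n):
            d = pos - pos[i]                       # vectors to all particles
            r2 = (d ** 2).sum(axis=1) + soft ** 2  # softened squared distances
            r2[i] = np.inf                         # skip self-interaction
            acc[i] = (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
        return acc

    rng = np.random.default_rng(42)
    pos = rng.uniform(-1.0, 1.0, size=(256, 3))
    mass = np.full(256, 1.0 / 256)
    print(direct_forces(pos, mass)[0])  # acceleration of particle 0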
XCAT/DRASIM: a realistic CT/human-model simulation package
NASA Astrophysics Data System (ADS)
Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.
2011-03-01
The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer generated NURBS surface based phantom that provides a realistic model of human anatomy and respiratory and cardiac motions, and the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantages of utilizing a realistic model of human anatomy and physiological motions without voxelization and with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library which consists of functions to read-in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line integral calculation in DRASIM by computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package which has great potential as a tool in the design and optimization of CT scanners, and the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.
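The line-integral step the ray tracer performs can be illustrated independently of NURBS: once the ordered surface-crossing distances along a ray are known, the attenuation integral is the sum of attenuation coefficient times segment length between crossings. The geometry and coefficients below are toy values, not DRASIM internals.

    import numpy as np

    def ray_line_integral(t_hits, mu_segments):
        """Attenuation line integral along one ray: sum of mu * segment length.
        t_hits: sorted distances where the ray crosses successive surfaces;
        mu_segments: attenuation coefficient (1/cm) of the material between
        consecutive crossings. Purely illustrative of the ray-tracing step."""
        seg_len = np.diff(t_hits)
        return float((np.asarray(mu_segments) * seg_len).sum())

    # A ray entering soft tissue, crossing a bone layer, and exiting:
    t_hits = np.array([0.0, 8.0, 9.5, 20.0])   # cm along the ray
    mu = [0.2, 0.5, 0.2]                       # tissue, bone, tissue (toy values)
    line_int = ray_line_integral(t_hits, mu)
    transmitted = np.exp(-line_int)            # Beer-Lambert survival fraction
    print(line_int, transmitted)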
Kirkman, Matthew A; Muirhead, William; Sevdalis, Nick; Nandi, Dipankar
2015-01-01
Simulation is gaining increasing interest as a method of delivering high-quality, time-effective, and safe training to neurosurgical residents. However, most current simulators are purpose-built for simulation and are relatively expensive and inaccessible to many residents. The purpose of this study was to provide the first comprehensive validity assessment of ventriculostomy performance metrics from the Medtronic StealthStation S7 Surgical Navigation System, a neuronavigational tool widely used in the clinical setting, as a training tool for simulated ventriculostomy, while concomitantly reporting on stress measures. In this prospective study, participants performed 6 simulated ventriculostomy attempts on a model head with StealthStation-coregistered imaging. The performance measures included the distance of the ventricular catheter tip to the foramen of Monro and the presence of the catheter tip in the ventricle. Data on objective and self-reported stress and workload measures were also collected. The study took place in the operating rooms of the National Hospital for Neurology and Neurosurgery, Queen Square, London. Participants were 31 individuals with varying levels of prior ventriculostomy experience, ranging in seniority from medical student to senior resident. Performance at simulated ventriculostomy improved significantly over subsequent attempts, irrespective of previous ventriculostomy experience. Performance improved whether or not the StealthStation display monitor was used for real-time visual feedback, but it was optimal when it was. Further, performance was inversely correlated with both objective and self-reported measures of stress (traditionally referred to as concurrent validity). Stress and workload measures were well correlated with each other, and they also correlated with technical performance. These initial data support the use of the StealthStation as a training tool for simulated ventriculostomy, providing a safe environment for repeated practice with immediate feedback. Although the potential implications for neurosurgical education and training are profound, further research following this proof-of-concept study is required on a larger scale for full validation and for proof that training translates into improved long-term simulated and patient outcomes. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Abel, David; Holloway, Tracey; Harkey, Monica; Rrushaj, Arber; Brinkman, Greg; Duran, Phillip; Janssen, Mark; Denholm, Paul
2018-02-01
We evaluate how fine particulate matter (PM2.5) and precursor emissions could be reduced if 17% of electricity generation were replaced with solar photovoltaics (PV) in the Eastern United States. Electricity generation is simulated using GridView, then used to scale electricity-sector emissions of sulfur dioxide (SO2) and nitrogen oxides (NOX) from an existing gridded inventory of air emissions. This approach offers a novel method to leverage advanced electricity simulations with state-of-the-art emissions inventories, without necessitating recalculation of emissions for each facility. The baseline and perturbed emissions are input to the Community Multiscale Air Quality Model (CMAQ version 4.7.1) for a full accounting of time- and space-varying air quality changes associated with the 17% PV scenario. These results offer a high-value opportunity to evaluate the reduced-form AVoided Emissions and geneRation Tool (AVERT), while using AVERT to test the sensitivity of results to changing base years and levels of solar integration. We find that average NOX and SO2 emissions across the region decrease 20% and 15%, respectively. PM2.5 concentrations decrease on average 4.7% across the Eastern U.S., with nitrate (NO3-) PM2.5 decreasing 3.7% and sulfate (SO42-) PM2.5 decreasing 9.1%. In the five largest cities in the region, we find that the most polluted days show the most significant PM2.5 decrease under the 17% PV generation scenario, and that the greatest benefits accrue to cities in or near the Ohio River Valley. We estimate summer health benefits from reduced PM2.5 exposure of 1,424 avoided premature deaths (95% Confidence Interval (CI): 284 deaths, 2,732 deaths), or health savings of $13.1 billion (95% CI: $0.6 billion, $43.9 billion). These results highlight the potential for renewable energy as a tool for air quality managers to support current and future health-based air quality regulations.
Guevara, M; Tena, C; Soret, A; Serradell, K; Guzmán, D; Retama, A; Camacho, P; Jaimes-Palomera, M; Mediavilla, A
2017-04-15
This article describes the High-Elective Resolution Modelling Emission System for Mexico (HERMES-Mex) model, an emission processing tool developed to transform the official Mexico City Metropolitan Area (MCMA) emission inventory into hourly, gridded (up to 1 km2) and speciated emissions used to drive mesoscale air quality simulations with the Community Multi-scale Air Quality (CMAQ) model. The methods and ancillary information used for the spatial and temporal disaggregation and speciation of the emissions are presented and discussed. The resulting emission system is evaluated, and a case study on CO, NO2, O3, VOC and PM2.5 concentrations is conducted to demonstrate its applicability. Moreover, the traffic emissions produced by the Mobile Source Emission Factor Model for Mexico (MOBILE6.2-Mexico) and the MOtor Vehicle Emission Simulator for Mexico (MOVES-Mexico) are integrated in the tool to assess and compare their performance. Modelled NOx and VOC total emissions in the MCMA are reduced by 37% and 26% when MOBILE6.2-Mexico traffic emissions are replaced with MOVES-Mexico ones. In terms of air quality, the system composed of the Weather Research and Forecasting (WRF) model coupled with the HERMES-Mex and CMAQ models properly reproduces the pollutant levels and patterns measured in the MCMA. The system's performance clearly improves in urban stations with a strong influence of traffic sources when applying MOVES-Mexico emissions. Despite the reduced estimates of modelled precursor emissions, O3 peak averages increase in the MCMA core urban area (up to 30 ppb) when using MOVES-Mexico mobile emissions due to its VOC-limited regime, while concentrations in the surrounding suburban/rural areas decrease or increase depending on the meteorological conditions of the day. The results obtained suggest that the HERMES-Mex model can be used to provide model-ready emissions for air quality modelling in the MCMA. Copyright © 2017 Elsevier B.V. All rights reserved.
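The core disaggregation arithmetic an emission processor of this kind applies can be sketched as follows: an annual sectoral total is split to a given hour by normalized monthly, weekly, and hourly profiles. The profiles shown are illustrative traffic-like shapes, not the HERMES-Mex ancillary data.

    import numpy as np

    def hourly_emissions(annual_total, month, weekday, hour,
                         monthly_f, weekly_f, hourly_f):
        """Disaggregate an annual emission total (t/yr) to one hour using
        normalized temporal profiles (0-based month/weekday/hour indices).
        Profile values here are illustrative only."""
        base = annual_total / 8760.0  # flat hourly rate
        return base * monthly_f[month] * weekly_f[weekday] * hourly_f[hour]

    # Illustrative traffic-like profiles, each normalized to a mean of 1.0
    monthly_f = np.ones(12)
    weekly_f = np.array([1.1, 1.1, 1.1, 1.1, 1.1, 0.8, 0.7])
    weekly_f *= 7.0 / weekly_f.sum()
    hourly_f = np.ones(24)
    hourly_f[7:9] = 2.5   # morning rush
    hourly_f[18:20] = 2.2  # evening rush
    hourly_f *= 24.0 / hourly_f.sum()
    print(hourly_emissions(1000.0, month=3, weekday=1, hour=8,
                           monthly_f=monthly_f, weekly_f=weekly_f,
                           hourly_f=hourly_f))  # t/h at Tuesday 08:00 in April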
Comprehensive helicopter analysis: A state of the art review
NASA Technical Reports Server (NTRS)
Johnson, W.
1978-01-01
An assessment of the status of helicopter theory and analysis is presented. The technology level embodied in available design tools (computer programs) is examined, considering the problem areas of performance, loads and vibration, handling qualities and simulation, and aeroelastic stability. The effectiveness of the present analyses is discussed. The characteristics of the technology in the analyses are reviewed, including the aerodynamics technology, induced velocity and wake geometry, dynamics technology, and machine limitations.
USDA-ARS?s Scientific Manuscript database
Watershed models such as the Soil and Water Assessment Tool (SWAT) have been widely used to simulate watershed hydrologic processes and the effect of management, such as agroforestry, on soil and water resources. In order to use model outputs for tasks ranging from aiding policy decision making to r...
2014-05-01
hand and right hand on the piano, or strumming and chording on the guitar. Perceptual: this skill category involves detecting and interpreting sensory... measured as the percent correct, number correct, accumulated points, or task/test scoring of correct action/timing/performance. This also includes quality rating by... competition and scoring, as well as constraints, privileges and penalties. Simulation-Based: the primary delivery environment is an interactive synthetic...
Importance of inlet boundary conditions for numerical simulation of combustor flows
NASA Technical Reports Server (NTRS)
Sturgess, G. J.; Syed, S. A.; Mcmanus, K. R.
1983-01-01
Fluid dynamic computer codes for the mathematical simulation of problems in gas turbine engine combustion systems are required as design and diagnostic tools. To eventually achieve more than qualitative accuracy with these codes, it is desirable to use benchmark experiments for validation studies. Typical of the fluid dynamic computer codes being developed for combustor simulations is the TEACH (Teaching Elliptic Axisymmetric Characteristics Heuristically) solution procedure. It is difficult to find suitable experiments that satisfy the present definition of benchmark quality; for the majority of the available experiments there is a lack of information concerning the boundary conditions. A standard TEACH-type numerical technique is applied to a number of test-case experiments. It is found that numerical simulations of gas turbine combustor-relevant flows can be sensitive to the plane at which the calculations start and, for swirling flows, to the spatial distributions of inlet quantities.
Polyenergetic known-component reconstruction without prior shape models
NASA Astrophysics Data System (ADS)
Zhang, C.; Zbijewski, W.; Zhang, X.; Xu, S.; Stayman, J. W.
2017-03-01
Purpose: Previous work has demonstrated that structural models of surgical tools and implants can be integrated into model-based CT reconstruction to greatly reduce metal artifacts and improve image quality. This work extends a polyenergetic formulation of known-component reconstruction (Poly-KCR) by removing the requirement that a physical model (e.g. CAD drawing) be known a priori, permitting much more widespread application. Methods: We adopt a single-threshold segmentation technique with the help of morphological structuring elements to build a shape model of metal components in a patient scan based on an initial filtered-backprojection (FBP) reconstruction. This shape model is used as an input to Poly-KCR, a formulation of known-component reconstruction that does not require prior knowledge of beam quality or component material composition. An investigation of performance as a function of segmentation thresholds is performed in simulation studies, and qualitative comparisons to Poly-KCR with an a priori shape model are made using physical CBCT data of an implanted cadaver and patient data from a prototype extremities scanner. Results: We find that model-free Poly-KCR (MF-Poly-KCR) provides much better image quality compared to conventional reconstruction techniques (e.g. FBP). Moreover, the performance closely approximates that of Poly-KCR with an a priori shape model. In simulation studies, we find that imaging performance generally follows segmentation accuracy, with slight under- or over-estimation based on the shape of the implant. In both simulation and physical data studies we find that the proposed approach can remove most of the blooming and streak artifacts around the component, permitting visualization of the surrounding soft tissues. Conclusion: This work shows that it is possible to perform known-component reconstruction without prior knowledge of the known component. In conjunction with the Poly-KCR technique, which does not require knowledge of beam quality or material composition, very little needs to be known about the metal implant and system beforehand. These generalizations will allow more widespread application of KCR techniques in real patient studies where information on surgical tools and implants is limited or not available.
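A sketch of the segmentation step under stated assumptions: a single global threshold on the initial FBP volume followed by morphological closing and hole filling (here via scipy.ndimage) to obtain a solid binary component model. The threshold value and structuring element are illustrative; the paper studies performance as a function of this threshold choice.

    import numpy as np
    from scipy import ndimage

    def metal_shape_model(fbp_volume, threshold=3000.0):
        """Binary shape model of a metal component from an initial FBP
        reconstruction: global threshold plus morphological clean-up.
        The threshold value here is an illustrative assumption."""
        mask = fbp_volume > threshold                     # bright metal voxels
        struct = ndimage.generate_binary_structure(3, 1)  # 6-connected element
        mask = ndimage.binary_closing(mask, structure=struct, iterations=2)
        mask = ndimage.binary_fill_holes(mask)            # solid interior
        return mask

    # Toy volume: a bright rod ("metal") inside a soft-tissue-level background
    vol = np.full((64, 64, 64), 50.0)
    vol[20:44, 30:34, 30:34] = 8000.0
    print(metal_shape_model(vol).sum(), "voxels flagged as metal")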
CTViz: A tool for the visualization of transport in nanocomposites.
Beach, Benjamin; Brown, Joshua; Tarlton, Taylor; Derosa, Pedro A
2016-05-01
A visualization tool (CTViz) for charge transport processes in 3-D hybrid materials (nanocomposites) was developed, inspired by the need for a graphical application to assist in code debugging and data presentation of an existing in-house code. As the simulation code grew, troubleshooting problems grew increasingly difficult without an effective way to visualize 3-D samples and charge transport in those samples. CTViz is able to produce publication and presentation quality visuals of the simulation box, as well as static and animated visuals of the paths of individual carriers through the sample. CTViz was designed to provide a high degree of flexibility in the visualization of the data. A feature that characterizes this tool is the use of shade and transparency levels to highlight important details in the morphology or in the transport paths by hiding or dimming elements of little relevance to the current view. This is fundamental for the visualization of 3-D systems with complex structures. The code presented here provides these required capabilities, but has gone beyond the original design and could be used as is or easily adapted for the visualization of other particulate transport where transport occurs on discrete paths. Copyright © 2016 Elsevier Inc. All rights reserved.
Tools for a sustainable management of coastal seas: analysis of Cullera Bay
NASA Astrophysics Data System (ADS)
Mestres, M.; Sierra, J. P.; Sánchez-Arcilla, A.; Mösso, C.; González del Río, J.; Rodilla, M.
2003-04-01
The quality of the water in Cullera Bay (Eastern Spanish coast) has a relevant influence on the economy of the city of Cullera, which is focused mainly on agriculture, fisheries and tourism. However, the bay waters are highly eutrophized (e.g., González del Río, 1987) because of the large input of nutrients from the river Júcar and from an existing marine outfall that discharges untreated wastewater during the summer months. Cullera Bay has been chosen, within the framework of the European project ECOSUD, to establish a set of indicators that may be used to assess the “health state” of a coastal or estuarine region. The main goal of the ECOSUD project, which also includes the Brazilian Patos Lagoon, is to develop a methodology and tools that will help coastal managers make decisions that take into account the sustainability of coastal and estuarine resources. The execution of the project involves a combination of field campaigns and numerical modelling. The former include integrated observations of the most relevant physical and biological magnitudes, such as water currents, meteorological characteristics, and concentrations of suspended matter, nutrients and pollutants. The latter includes numerical simulations of the hydrodynamic fields induced by wind conditions and river discharge, and the simulation of pollutant and nutrient transport. The combined results make it possible to estimate which physical, chemical or biological parameters influence the water quality in the Bay, and their effects on selected economically important indicators such as the clam population and tourism. The data obtained from the three field campaigns undertaken during summer 2002, and the corresponding numerical simulations, reveal the influence of the riverine and outfall discharges on the nutrient concentration gradients inside the Bay. These are determined by the local hydrodynamics, which are mainly driven by the prevailing wind (mainly from the South and Southeast during summertime) and the river discharge. Under certain wind conditions, the barrier effect of the Cullera Cape plays an important role in determining the water quality within the Bay.
Wu, Yiping; Liu, Shu-Guang; Li, Zhengpeng
2012-01-01
Biofuels are now an important resource in the United States because of the Energy Independence and Security Act of 2007. Both increased corn growth for ethanol production and perennial dedicated energy crop growth for cellulosic feedstocks are potential sources to meet the rising demand for biofuels. However, these measures may cause adverse environmental consequences that are not yet fully understood. This study 1) evaluates the long-term impacts of an increased frequency of corn in the crop rotation system on water quantity and quality as well as soil fertility in the James River Basin, and 2) identifies potential grasslands for cultivating bioenergy crops (e.g. switchgrass) and estimates the water quality impacts. We selected the Soil and Water Assessment Tool, a physically based multidisciplinary model, to simulate a series of biofuel production scenarios involving crop rotation and land cover changes. The model simulations with different crop rotation scenarios indicate that decreases in water yield and soil nitrate nitrogen (NO3-N) concentration, along with an increase in NO3-N load to stream water, justify serious concerns regarding increased corn rotations in this basin. Simulations with land cover change scenarios helped us spatially classify the grasslands in terms of biomass productivity and nitrogen loads, and we further derived the relationships between biomass production targets, the resulting nitrogen loads, and switchgrass planting acreage. The suggested economically efficient (planting acreage) and environmentally friendly (water quality) planting locations and acreages can be a valuable guide for cultivating switchgrass in this basin. This information, along with the projected environmental costs (i.e. reduced water yield and increased nitrogen load), can contribute to decision support tools for land managers seeking the sustainability of biofuel development in this region.
A Review of Surface Water Quality Models
Li, Shibei; Jia, Peng; Qi, Changjun; Ding, Feng
2013-01-01
Surface water quality models can be useful tools to simulate and predict the levels, distributions, and risks of chemical pollutants in a given water body. The modeling results from these models under different pollution scenarios are very important components of environmental impact assessment and can provide a basis and technical support for environmental management agencies to make sound decisions. The correctness of the model results affects the soundness and scientific validity of approved construction projects and the effectiveness of pollution control measures. We reviewed the development of surface water quality models across three stages and analyzed the suitability, precision, and methods of different models. Standardization of water quality models can help environmental management agencies guarantee consistency in the application of water quality models for regulatory purposes. We summarized the status of model standardization in developed countries and proposed measures for standardizing surface water quality models, especially in developing countries. PMID:23853533
Frameworks for Assessing the Quality of Modeling and Simulation Capabilities
NASA Astrophysics Data System (ADS)
Rider, W. J.
2012-12-01
The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science, including theory, modeling, simulation and experimentation, to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA has built yet another framework in response to the tragedy of the space shuttle accidents [NASA]. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and are applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended, incorporating elements from the others as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature, in that no one likes to be graded or told they are not sufficiently quality oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as collaborative tools: they should be used to structure collaborations that help modeling and simulation efforts achieve high quality. The framework provides a comprehensive set of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nuc. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A (2012) 468, 227-244.
Strategy and gaps for modeling, simulation, and control of hybrid systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob
2015-04-01
The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation are necessary to design, evaluate, and optimize the system's technical and economic performance. Accordingly, this report first establishes the simulation requirements for analyzing candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and the output parameters needed to evaluate case-specific figures of merit. The associated computational and co-simulation resources are then established, including physical models where needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. The report first attempts to describe the figures of merit, system requirements, and constraints that are necessary and sufficient to characterize the behavior and market interactions of grid and hybrid systems. Loss of Load Probability (LOLP) and Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for the evaluation of non-traditional, hybrid energy systems, and algorithms for coupled and iterative evaluation of technical and economic performance are discussed. The report further defines the modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) required to model, co-simulate, and optimize: (a) energy system components (e.g., power generation unit, chemical process, electricity management unit), (b) system domains (e.g., thermal, electrical or chemical energy generation, conversion, and transport), and (c) system control modules. Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Whereas further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The current development status, quality assurance, availability and maintainability of simulation tools currently available for hybrid systems modeling are presented. Existing gaps in the modeling and simulation toolsets and development needs are subsequently discussed. This effort will feed into a broader roadmap activity for designing, developing, and demonstrating hybrid energy systems.
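Since the report contrasts ECE with the standard LCOE, a minimal LCOE computation is sketched below for reference. The formula (discounted lifetime cost over discounted lifetime generation) is standard; the plant numbers are illustrative.

    def lcoe(capex, annual_opex, annual_mwh, rate, years):
        """Standard levelized cost of electricity ($/MWh): discounted
        lifetime costs divided by discounted lifetime generation. The
        report argues an effective-cost metric (ECE) is needed for
        hybrids; this LCOE is the baseline it is contrasted against."""
        disc_cost = capex + sum(annual_opex / (1 + rate) ** t
                                for t in range(1, years + 1))
        disc_energy = sum(annual_mwh / (1 + rate) ** t
                          for t in range(1, years + 1))
        return disc_cost / disc_energy

    # Illustrative plant: $2.5B overnight cost, $80M/yr O&M, 7.9 TWh/yr, 60 yr
    print(f"{lcoe(2.5e9, 8.0e7, 7.9e6, 0.07, 60):.1f} $/MWh")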
Advanced Engineering Environments: Implications for Aerospace Manufacturing
NASA Technical Reports Server (NTRS)
Thomas, D.
2001-01-01
There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically-distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies quicker all face the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state-of-the-art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper will give an overview of the NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.
NASA Astrophysics Data System (ADS)
Löwe, P.; Hammitzsch, M.; Babeyko, A.; Wächter, J.
2012-04-01
The development of new Tsunami Early Warning Systems (TEWS) requires the modelling of the spatio-temporal spreading of tsunami waves, both for recorded past events and hypothetical future cases. The model results are maintained in digital repositories for use in TEWS command and control units for situation assessment once a real tsunami occurs. The simulation results must therefore be absolutely trustworthy, in the sense that the quality of these datasets is assured; this is a prerequisite, as solid decision making during a crisis event and the dissemination of dependable warning messages to communities at risk will be based on them. This requires data format validity, but even more the integrity and information value of the content, a value-added product derived from raw tsunami model output. Quality checking of simulation result products can be done in multiple ways, yet the visual verification of both the temporal and spatial spreading characteristics of each simulation remains important: the eye of the human observer is still an unmatched tool for the detection of irregularities. This requires the availability of convenient, human-accessible mappings of each simulation. The improvement of tsunami models necessitates changes in many variables, including simulation end-parameters. Whenever new, improved iterations of the general models or the underlying spatial data are evaluated, hundreds to thousands of tsunami model results must be generated for each model iteration, each one having distinct initial parameter settings. The use of a Compute Cluster Environment (CCE) of sufficient size allows the automated generation of all tsunami results within a model iteration in little time, a significant improvement over linear processing on dedicated desktop machines or servers. This allows for accelerated visual quality-checking iterations, which in turn can feed back positively into the overall model improvement. An approach to set up and utilize the CCE has been implemented by the project Collaborative, Complex, and Critical Decision Processes in Evolving Crises (TRIDEC), funded under the European Union's FP7. TRIDEC focuses on real-time intelligent information management in Earth management. The challenges addressed include the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulations and data fusion tools. Additionally, TRIDEC adopts enhancements of Service Oriented Architecture (SOA) principles in terms of Event Driven Architecture (EDA) design. As a next step, the implemented CCE's services for generating derived and customized simulation products are foreseen to be provided via an EDA service for on-demand processing for specific threat parameters and to accommodate model improvements.
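The batch pattern is straightforward to sketch: each scenario run is independent, so a pool of workers per node (with a batch scheduler fanning out across nodes) can generate the full result set. Everything below is schematic; the actual TRIDEC CCE tooling is not described at this level in the abstract.

    from concurrent.futures import ProcessPoolExecutor

    def run_tsunami_scenario(params):
        """Stand-in for one propagation-model run; a real worker would
        launch the model executable for this source and return the path
        of its result file. All names and values here are schematic."""
        lon, lat, magnitude = params
        result_path = f"results/scn_{lon}_{lat}_M{magnitude}.nc"
        # ... invoke the actual tsunami model here ...
        return result_path

    scenarios = [(lon, lat, mag)
                 for lon in range(90, 100)
                 for lat in range(-10, 0)
                 for mag in (7.5, 8.0, 8.5, 9.0)]  # 400 parameter sets

    if __name__ == "__main__":
        # One node fans runs out over its cores; a batch scheduler does
        # the same across the nodes of the cluster.
        with ProcessPoolExecutor() as pool:
            paths = list(pool.map(run_tsunami_scenario, scenarios))
        print(len(paths), "scenario results generated")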
Microdosimetry in ion-beam therapy
NASA Astrophysics Data System (ADS)
Magrin, Giulio; Mayer, Ramona
2015-06-01
Dose information alone does not sufficiently describe the biological effects of ions on tissue, since it does not express the radiation quality, i.e. the heterogeneity of the processes due to the slowing-down and fragmentation of the particles when crossing a target. Depending on the circumstances, the radiation quality can be determined using measurements, calculations, or simulations. Microdosimeters are the primary tools used to provide experimental information on the radiation quality, and their role is becoming crucial for recent clinical developments, in particular with carbon ion therapy. Microdosimetry is strongly linked to the biological effectiveness of the radiation since it provides the physical parameters that explicitly characterize the radiation's capability of damaging cells. In the framework of ion-beam therapy, microdosimetry can be used in the preparation of the treatment to complement radiobiological experiments and to analyze the modification of the radiation quality in phantoms. A more ambitious goal is to perform the measurements during the irradiation procedure, to determine the non-targeted radiation and, more importantly, to monitor the modification of the radiation quality inside the patient. These procedures provide treatment feedback that directly benefits the individual patient, but also supports the general characterization of biological effectiveness, with advantages for all future treatments. Traditional and innovative tools are currently under study, and an outlook on present experience and future development is presented here.
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, regulatory agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful for atmospheric PM10 concentrations, due to the complex nonlinear processes that affect the production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The solution of the multi-objective problem provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, which is often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis were identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
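A compact sketch of this multi-objective setup: with a surrogate source-receptor curve AQ(x) and a convex cost curve C(x) over a precursor-reduction variable x, sweeping the weight of a scalarized objective traces the non-dominated trade-off set. Both response surfaces below are illustrative stand-ins for the identified nonlinear source-receptor models.

    import numpy as np

    def aq(x):    # residual PM10 exposure indicator (lower is better); toy curve
        return 40.0 * np.exp(-2.0 * x) + 10.0

    def cost(x):  # emission reduction technology cost, steeply convex; toy curve
        return 100.0 * x ** 2

    x_max = 0.8   # maximum feasible technology reduction (constraint)
    xs = np.linspace(0.0, x_max, 801)
    pareto = []
    for w in np.linspace(0.0, 1.0, 11):       # sweep objective weights
        j = w * aq(xs) + (1.0 - w) * cost(xs)  # scalarized objective
        x_star = xs[np.argmin(j)]
        pareto.append((x_star, aq(x_star), cost(x_star)))
    for x_star, a, c in pareto:
        print(f"reduction={x_star:.2f}  AQ={a:.1f}  cost={c:.1f}")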
NASA Astrophysics Data System (ADS)
Ileana, Ioan; Risteiu, Mircea; Marc, Gheorghe
2016-12-01
This paper is part of our research on the design of high-power LED lamps. The selected boost topology is intended to meet driver manufacturers' goals for efficiency and disturbance constraints. We used modeling and simulation tools to implement scenarios of driver operation under several control functions (output voltage/current versus input voltage at fixed switching frequency, input-output electric power transfer versus switching frequency, transient inductor voltage analysis, and transient output capacitor analysis). Some electrical and thermal stress conditions are also analyzed. Based on these aspects, a highly reliable power LED driver has been designed.
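For orientation, the first-pass continuous-conduction-mode boost relations used when sizing such a driver are sketched below: ideal duty cycle D = 1 - Vin/Vout and inductor current ripple ΔI = Vin·D/(L·f). These relations are standard; the component values are illustrative, not the paper's design.

    def boost_design(v_in, v_out, f_sw, l_henry, i_led):
        """First-pass CCM boost relations for an LED driver sketch:
        ideal duty cycle, inductor current ripple, and lossless average
        input current. Values passed in are illustrative assumptions."""
        duty = 1.0 - v_in / v_out
        ripple_a = v_in * duty / (l_henry * f_sw)
        i_in_avg = v_out * i_led / v_in  # ideal (lossless) power balance
        return duty, ripple_a, i_in_avg

    d, di, i_in = boost_design(v_in=12.0, v_out=36.0, f_sw=250e3,
                               l_henry=47e-6, i_led=0.7)
    print(f"D={d:.2f}, inductor ripple={di:.2f} A, avg input current={i_in:.2f} A")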
VOLCWORKS: A suite for optimization of hazards mapping
NASA Astrophysics Data System (ADS)
Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.
2012-04-01
Making hazards maps is a process linking basic science, applied science and engineering for the benefit of society. The methodologies for constructing hazards maps have evolved enormously, together with the tools that allow forecasting of the behavior of the materials produced by different eruptive processes. In spite of this evolution, however, the utility of hazards maps has not changed: prevention and mitigation of volcanic disasters. Integrating different simulation tools for the different processes of a single volcano remains a challenge; it requires software combining processing, simulation and visualization techniques with suitable data structures to build a suite that supports the map construction process, from the integration of geological data and simulations through the simplification of the output into a hazards/scenario map. Scientific visualization is a powerful tool to explore and gain insight into complex data from instruments and simulations. The workflow from data collection, quality control and preparation for simulations to an appropriate visual presentation is usually disconnected, relying in most cases on a different application for each step, because the available tools were either not built to solve this specific problem or were developed by research groups for particular, isolated tasks. In volcanology, due to its complexity, groups typically examine only one aspect of the phenomenon: ash dispersal, laharic flows, pyroclastic flows, lava flows, or ballistic projectile ejection, among others. However, when studying the hazards associated with the activity of a volcano, it is important to analyze all the processes comprehensively, especially when communicating results to the end users: decision makers and planners. To solve this problem and connect the different parts of the workflow, we are developing the suite VOLCWORKS. Its guiding principle is a flexible implementation architecture that allows rapid development of software to the extent specified by the needs, including calculations, routines, and algorithms, both new and obtained by redesigning software available in the volcanological community, and that especially allows new knowledge, models or software to be transferred into software modules. The design is a component-oriented platform that allows particular solutions (routines, simulations, etc.) to be incorporated and concatenated for integration or for highlighting information. The platform includes a graphical interface with capabilities for working in different visual environments that can be focused on the particular work of different types of users (researchers, lecturers, students, etc.). The platform aims to integrate the simulation and visualization phases, incorporating proven but currently isolated tools. VOLCWORKS runs under different operating systems (Windows, Linux and Mac OS) and adapts to the context of use automatically at runtime, both in the tasks and their sequence and in the utilization of hardware resources (CPU, GPU, special monitors, etc.). The application can run on a laptop or even in a virtual reality room with access to supercomputers.
WE-G-BRA-04: The Development of a Virtual Reality Dosimetry Training Platform for Physics Training.
Beavis, A; Ward, J
2012-06-01
Recently there has been a great deal of interest in the application of simulation methodologies for training. We have previously developed a Virtual Environment for Radiotherapy Training, VERT, which simulates a fully interactive and functional Linac. Patient and plan data can be accessed across a DICOM interface, allowing the treatment process to be simulated. Here we present a newly developed range of physics equipment, which allows the user to undertake realistic QC processes. Five devices are available: 1) a scanning water phantom, 2) a 'solid water' QC block/ion chamber, 3) a light/radiation field coincidence phantom, 4) a laser alignment phantom and 5) a water-based calibration phantom with reference-class and 'departmental' ion chambers. The devices were created to operate realistically and function as expected; each has an associated control screen which provides control and feedback information. The dosimetric devices respond appropriately to the beam qualities available on the Linac. Geometrical characteristics of the Linac, e.g. isocentre integrity, laser calibration and jaw calibrations, can have random errors introduced to enable the user to learn and observe fault conditions. In the calibration module, appropriate factors for temperature and pressure must be set to correct for the ambient, simulated room conditions. The dosimetric devices can be used to characterise the Linac beams. Depth doses with Dmax of 15 mm/29 mm and d10 of 67%/77%, respectively, were measured for 10 cm square 6/15 MV beams. The Quality Indices (TPR20/10 ratios) can be measured as 0.668 and 0.761, respectively. At a simple level the tools can be used to demonstrate beam divergence or the effect of the inverse square law; they are also designed to be used to simulate the calibration of a new ion chamber. We have developed a novel set of tools that allows education in physics processes via simulation training in our virtual environment. Both authors are Founders and Directors of Vertual Ltd, a spin-out company that exists to commercialise the results of the research work presented in this abstract. © 2012 American Association of Physicists in Medicine.
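The temperature-pressure correction the calibration module requires has a standard form (e.g., IAEA TRS-398): kTP = (273.2 + T)/(273.2 + T0) x P0/P, scaling a vented ionization chamber reading to reference conditions. The sketch below applies it; the reading and room conditions are illustrative.

    def k_tp(temp_c, pressure_kpa, ref_temp_c=20.0, ref_pressure_kpa=101.325):
        """Air-density correction for a vented ionization chamber
        (TRS-398 form), scaling the reading to reference conditions.
        Reference values follow TRS-398 conventions."""
        return ((273.2 + temp_c) / (273.2 + ref_temp_c)) * \
               (ref_pressure_kpa / pressure_kpa)

    # A warm, low-pressure treatment room reads low on raw charge:
    reading_nc = 12.50  # illustrative electrometer reading (nC)
    corrected = reading_nc * k_tp(24.0, 98.7)
    print(f"kTP = {k_tp(24.0, 98.7):.4f}, corrected reading = {corrected:.2f} nC")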
Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M
2018-04-01
Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of variations restricted to cost and customer service or to cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose that the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
Improving SWAT for simulating water and carbon fluxes of forest ecosystems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Qichun; Zhang, Xuesong
2016-11-01
As a widely used watershed model for assessing impacts of anthropogenic and natural disturbances on water quantity and quality, the Soil and Water Assessment Tool (SWAT) has not been extensively tested in simulating water and carbon fluxes of forest ecosystems. Here, we examine SWAT simulations of evapotranspiration (ET), net primary productivity (NPP), net ecosystem exchange (NEE), and plant biomass at ten AmeriFlux forest sites across the U.S. We identify unrealistic radiation use efficiency (Bio_E), large leaf to biomass fraction (Bio_LEAF), and missing phosphorus supply from parent material weathering as the primary causes for the inadequate performance of the default SWAT model in simulating forest dynamics. By further revising the relevant parameters and processes, SWAT's performance is substantially improved. Based on the comparison between the improved SWAT simulations and flux tower observations, we discuss future research directions for further enhancing model parameterization and representation of water and carbon cycling for forests.
Methodological issues in the quantitative assessment of quality of life.
Panagiotakos, Demosthenes B; Yfantopoulos, John N
2011-10-01
The term quality of life can be traced to Aristotle's classical writings of 330 BC. In his Nicomachean Ethics he recognises the multiple relationships between happiness, well-being, "eudemonia" and quality of life. Historically the concept of quality of life has undergone various interpretations. It involves personal experience, perceptions and beliefs, and attitudes concerning philosophical, cultural, spiritual, psychological, political, and financial aspects of everyday living. Quality of life has been used extensively, both as an outcome and as an explanatory factor in relation to human health, in various clinical trials, epidemiologic studies and health interview surveys. Because of the variations in the definition of quality of life, both in theory and in practice, there is also a wide range of procedures used to assess it. In this paper several methodological issues regarding the tools used to evaluate quality of life are discussed. In summary, evidence from simulated and empirical studies supports the use of components comprising a large number of classes, the use of specific weights for each scale component, and a low-to-moderate level of inter-correlation between the components.
NASA Astrophysics Data System (ADS)
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-06-01
The application of statistical classification methods is investigated, in comparison also with spatial interpolation methods, for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water, and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict whether the chloride concentration in a water well will exceed the allowable concentration, rendering the water unfit for the intended use. A statistical classification algorithm achieved the best predictive performance, and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems in hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
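The abstract does not name the winning algorithm, so the sketch below only illustrates the general workflow it describes: label wells by whether chloride exceeds an allowable concentration and train a classifier on well attributes. The features, the 250 mg/L threshold, and the synthetic data are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical training data: easting (m), northing (m), well depth (m)
X = rng.uniform([0, 0, 10], [5000, 5000, 120], size=(200, 3))
chloride = 50 + 0.04 * X[:, 0] + rng.normal(0, 30, 200)  # synthetic mg/L
y = (chloride > 250).astype(int)  # 1 = exceeds allowable concentration

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```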
Next-generation Event Horizon Telescope developments: new stations for enhanced imaging
NASA Astrophysics Data System (ADS)
Palumbo, Daniel; Johnson, Michael; Doeleman, Sheperd; Chael, Andrew; Bouman, Katherine
2018-01-01
The Event Horizon Telescope (EHT) is a multinational Very Long Baseline Interferometry (VLBI) network of dishes joined to resolve general relativistic behavior near a supermassive black hole. The imaging quality of the EHT is largely dependent upon the sensitivity and spatial frequency coverage of the many baselines between its constituent telescopes. The EHT already contains many highly sensitive dishes, including the crucial Atacama Large Millimeter/Submillimeter Array (ALMA), making it viable to add smaller, cheaper telescopes to the array, greatly improving future capabilities of the EHT. We develop tools for optimizing the positions of new dishes in planned arrays. We also explore the feasibility of adding small orbiting dishes to the EHT, and develop orbital optimization tools for space-based VLBI imaging. Unlike the Millimetron mission planned to be at L2, we specifically treat near-earth orbiters, and find rapid filling of spatial frequency coverage across a large range of baseline lengths. Finally, we demonstrate significant improvement in image quality when adding small dishes to planned arrays in simulated observations.
Modeling water quality in the Tualatin River, Oregon, 1991-1997
Rounds, Stewart A.; Wood, Tamara M.
2001-01-01
The calibration of a model of flow, temperature, and water quality in the Tualatin River, Oregon, originally calibrated for the summers of 1991 through 1993, was extended to the summers of 1991 through 1997. The model is now calibrated for a total period of 42 months during the May through October periods of 7 hydrologically distinct years. Based on a modified version of the U.S. Army Corps of Engineers model CE-QUAL-W2, this model provides a good fit to the measured data for streamflow, water temperature, and water quality constituents such as chloride, ammonia, nitrate, total phosphorus, orthophosphate, phytoplankton, and dissolved oxygen. In particular, the model simulates ammonia concentrations and the effects of instream ammonia nitrification very well, which is critical to ongoing efforts to revise ammonia regulations for the Tualatin River. In addition, the model simulates the timing, duration, and relative size of algal blooms with sufficient accuracy to provide important insights for regulators and managers of this river. Efforts to limit the size of algal blooms through phosphorus control measures are apparent in the model simulations, which show this limitation on algal growth. Such measures are largely responsible for avoiding violations of the State of Oregon maximum pH standard of 8.5 in recent years, but they have not yet reduced algal biomass levels below the State of Oregon nuisance phytoplankton growth guideline of 15 µg/L chlorophyll-a. Most of the dynamics of the instream dissolved oxygen concentrations are captured by the model. About half of the error in the simulated dissolved oxygen concentrations is directly attributable to error in the size of the simulated phytoplankton population. To achieve greater accuracy in simulating dissolved oxygen, therefore, it will be necessary to increase accuracy in the simulation of Tualatin River phytoplankton. Future efforts may include the introduction of multiple algal groups in the model. This model of the Tualatin River continues to be used as a quantitative tool to aid in the management of this important resource.
Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.
2014-10-11
High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been proposed that have the potential to mitigate many power quality concerns. However, closed-loop control may lead to unintended behavior in deployed systems as complex interactions can occur between numerous operating devices. In order to enable the study of the performance of advanced control schemes in a detailed distribution system environment, a Hardware-in-the-Loop (HIL) platform has been developed. In the HIL system, GridLAB-D, a distribution system simulation tool, runs in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling to hardware located at the National Renewable Energy Laboratory (NREL). Hardware inverters interact with grid and PV simulators emulating an operational distribution system, and power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of controls applied to inverters that are integrated into a simulation of the IEEE 8500-node test feeder, with inverters in either constant power factor control or active volt/VAR control. We demonstrate that this HIL platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, the results from HIL are used to validate GridLAB-D simulations of advanced inverter controls.
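As a rough illustration of the active volt/VAR control mode mentioned above, the droop curve below maps per-unit voltage at the point of common coupling to a reactive power setpoint. The deadband, slope, and VAR limit are hypothetical values, not the settings used in the IEEE 8500-node study.

```python
import numpy as np

def volt_var_setpoint(v_pu, deadband=0.01, slope=20.0, q_max=0.44):
    """Illustrative volt/VAR droop: inject VARs when voltage is low,
    absorb when high, flat inside the deadband (all values hypothetical)."""
    dv = np.where(np.abs(v_pu - 1.0) < deadband, 0.0,
                  v_pu - 1.0 - np.sign(v_pu - 1.0) * deadband)
    return np.clip(-slope * dv, -q_max, q_max)  # Q in per-unit of rated S

for v in (0.95, 1.00, 1.03):
    print(v, round(float(volt_var_setpoint(v)), 3))
```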
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
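A minimal example of the verification pattern the QA suite implements: compare simulator output for a benchmark problem against a closed-form analytical solution and assert an accuracy criterion. The benchmark (1D diffusion into a semi-infinite column), the tolerance, and the stand-in "numerical" data are assumptions for illustration, not PFLOTRAN's actual tests.

```python
import numpy as np
from scipy.special import erfc

def c_analytical(x, t, diff_coef, c0=1.0):
    """Analytical solution for 1D diffusion into a semi-infinite column
    with fixed concentration c0 at x = 0 (a standard benchmark)."""
    return c0 * erfc(x / (2.0 * np.sqrt(diff_coef * t)))

x = np.linspace(0.0, 1.0, 101)
c_exact = c_analytical(x, t=3600.0, diff_coef=1e-6)
# Stand-in for simulator output; a real test would load the code's results.
c_numerical = c_exact + np.random.default_rng(1).normal(0, 1e-4, x.size)

l2_error = np.sqrt(np.mean((c_numerical - c_exact) ** 2))
assert l2_error < 1e-3, "verification tolerance exceeded"
print(f"L2 error: {l2_error:.2e}")
```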
Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality
NASA Astrophysics Data System (ADS)
Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.
2014-12-01
The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
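A minimal sketch of the accumulation/wash-off/die-off bookkeeping described above, assuming first-order die-off and exponential rain-driven wash-off; the rate constants are invented for illustration and are not calibrated IHACRES parameters.

```python
import numpy as np

def fib_washoff(rain_mm, accum_rate=1e8, washoff_coeff=0.2,
                dieoff_rate=0.5, dt_days=1.0):
    """Daily buildup/wash-off/die-off of fecal indicator bacteria (FIB).
    All rate constants here are illustrative, not calibrated values."""
    store, loads = 0.0, []
    for r in rain_mm:
        store += accum_rate * dt_days              # buildup on landscape (cfu)
        store *= np.exp(-dieoff_rate * dt_days)    # first-order die-off
        washed = store * (1.0 - np.exp(-washoff_coeff * r))  # rain wash-off
        store -= washed
        loads.append(washed)
    return np.array(loads)

print(fib_washoff([0.0, 0.0, 12.5, 3.0, 0.0]).round(-5))
```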
Power Hardware-in-the-Loop (PHIL) Testing Facility for Distributed Energy Storage (Poster)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neubauer, J.; Lundstrom, B.; Simpson, M.
2014-06-01
The growing deployment of distributed, variable generation and evolving end-user load profiles presents a unique set of challenges to grid operators responsible for providing reliable and high quality electrical service. Mass deployment of distributed energy storage systems (DESS) has the potential to solve many of the associated integration issues while offering reliability and energy security benefits other solutions cannot. However, tools to develop, optimize, and validate DESS control strategies and hardware are in short supply. To fill this gap, NREL has constructed a power hardware-in-the-loop (PHIL) test facility that connects DESS, grid simulator, and load bank hardware to a distribution feeder simulation.
Application of the GRC Stirling Convertor System Dynamic Model
NASA Technical Reports Server (NTRS)
Regan, Timothy F.; Lewandowski, Edward J.; Schreiber, Jeffrey G. (Technical Monitor)
2004-01-01
The GRC Stirling Convertor System Dynamic Model (SDM) has been developed to simulate the dynamic performance of power systems incorporating free-piston Stirling convertors. This paper discusses its use in evaluating system dynamics and other system concerns. Detailed examples are provided showing the use of the model in the evaluation of off-nominal operating conditions. The many degrees of freedom in both the mechanical and electrical domains inherent in the Stirling convertor, together with the nonlinear dynamics, make simulation an attractive analysis tool in conjunction with classical analysis. Application of the SDM to study the effect of the resonant circuit quality factor (commonly referred to as Q) in the various resonant mechanical and electrical sub-systems is discussed.
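For reference, the quality factor discussed above is, for a simple series resonant circuit, Q = (1/R)·sqrt(L/C). The sketch below computes Q and the resonant frequency for hypothetical component values; it is illustrative and not drawn from the SDM itself.

```python
import math

def series_rlc_q(r_ohm, l_henry, c_farad):
    """Quality factor of a series RLC resonant circuit: Q = (1/R)*sqrt(L/C)."""
    return math.sqrt(l_henry / c_farad) / r_ohm

def resonant_freq_hz(l_henry, c_farad):
    """Resonant frequency: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

# Hypothetical tuning values for an alternator-driven resonant load circuit
print(round(series_rlc_q(2.0, 0.05, 2e-4), 2), "Q;",
      round(resonant_freq_hz(0.05, 2e-4), 1), "Hz")
```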
Numerical methods for assessing water quality in lakes and reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahamah, D.S.
1984-01-01
Water quality models are used as tools for predicting both short-term and long-term trends in water quality. They are generally classified into two groups based on the degree of empiricism. The two groups consist of the purely empirical types known as black-box models and the theoretical types called ecosystem models. This dissertation deals with both types of water quality models. The first part deals with empirical phosphorus models. The theory behind this class of models is discussed, leading to the development of an empirical phosphorus model using data from 79 western US lakes. A new approach to trophic state classification is introduced. The data used for the model were obtained from the Environmental Protection Agency National Eutrophication Study (EPA-NES) of western US lakes. The second portion of the dissertation discusses the development of an ecosystem model for culturally eutrophic Liberty Lake situated in eastern Washington State. The model is capable of simulating chlorophyll-a, phosphorus, and nitrogen levels in the lake on a weekly basis. For computing sediment release rates of phosphorus and nitrogen, equations based on laboratory bench-top studies using sediment samples from Liberty Lake are used. The model is used to simulate certain hypothetical nutrient control techniques such as phosphorus flushing, precipitation, and diversion.
Adding Four-Dimensional Data Assimilation (aka Grid Nudging) to MPAS
Adding four-dimensional data assimilation (a.k.a. grid nudging) to MPAS. The U.S. Environmental Protection Agency is investigating the use of MPAS as the meteorological driver for its next-generation air quality model. To function as such, MPAS needs to operate in a diagnostic mode in much the same manner as the current meteorological driver, the Weather Research and Forecasting (WRF) model. The WRF operates in diagnostic mode using Four-Dimensional Data Assimilation (FDDA), also known as "grid nudging". MPAS version 4.0 has been modified with the addition of an FDDA routine to the standard physics drivers to nudge the state variables for wind, temperature and water vapor towards MPAS initialization fields defined at 6-hour intervals from GFS-derived data. The results to be shown demonstrate the ability to constrain MPAS simulations to known historical conditions and thus provide the U.S. EPA with a practical meteorological driver for global-scale air quality simulations. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use bo
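The nudging term added to the physics drivers has the standard Newtonian-relaxation form: the model state is relaxed toward the analysis at a rate set by a nudging coefficient. The sketch below shows one such step; the coefficient and time step are typical FDDA-style values, assumed here, not the ones used in the modified MPAS.

```python
import numpy as np

def nudge(field, analysis, g_nudge=3e-4, dt=60.0):
    """One Newtonian-relaxation (grid nudging) step: relax the model state
    toward the analysis with inverse-timescale coefficient g_nudge (1/s)."""
    return field + g_nudge * (analysis - field) * dt

temp_model = np.array([284.2, 285.0, 286.1])     # model temperatures, K
temp_analysis = np.array([284.8, 285.1, 285.6])  # 6-hourly GFS-derived target
print(nudge(temp_model, temp_analysis))
```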
van Rossum, Huub H; Kemperman, Hans
2017-02-01
To date, no practical tools are available to obtain optimal settings for moving average (MA) as a continuous analytical quality control instrument. Also, there is no knowledge of the true bias detection properties of applied MA. We describe the use of bias detection curves for MA optimization and MA validation charts for validation of MA. MA optimization was performed on a data set of previously obtained consecutive assay results. Bias introduction and MA bias detection were simulated for multiple MA procedures (combinations of truncation limits, calculation algorithms and control limits) and performed for various biases. Bias detection curves were generated by plotting the median number of test results needed for bias detection against the simulated introduced bias. In MA validation charts the minimum, median, and maximum numbers of assay results required for MA bias detection are shown for various biases. Their use was demonstrated for sodium, potassium, and albumin. Bias detection curves allowed optimization of MA settings by graphical comparison of the bias detection properties of multiple MA procedures. The optimal MA was selected based on the bias detection characteristics obtained. MA validation charts were generated for the selected optimal MA and provided insight into the range of results required for MA bias detection. Bias detection curves and MA validation charts are useful tools for optimization and validation of MA procedures.
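A minimal sketch of how one point on a bias detection curve could be simulated, following the procedure described above: introduce a bias, stream results through truncation limits and a moving average, and count results until a control limit is crossed. All settings (sodium-like mean, limits, window size) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def results_until_detection(bias, mean=140.0, sd=2.0, trunc=(130.0, 150.0),
                            ma_window=50, control=(139.0, 141.0), n_max=5000):
    """Count assay results needed before the moving average crosses a
    control limit after bias introduction (illustrative sodium, mmol/L)."""
    buf = []
    for n in range(1, n_max + 1):
        x = rng.normal(mean + bias, sd)
        if trunc[0] <= x <= trunc[1]:        # apply truncation limits
            buf.append(x)
        if len(buf) >= 10:                   # minimum count before judging
            ma = np.mean(buf[-ma_window:])
            if ma < control[0] or ma > control[1]:
                return n
    return n_max

# One point on a bias detection curve: median over repeated simulations
print(np.median([results_until_detection(bias=2.0) for _ in range(25)]))
```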
NASA Astrophysics Data System (ADS)
Choi, Yu-Jin; Hyde, Peter; Fernando, H. J. S.
High (episodic) particulate matter (PM) events over the sister cities of Douglas (AZ) and Agua Prieta (Sonora), located on the US-Mexico border, were simulated using the 3D Eulerian air quality model MODELS-3/CMAQ. The best available input information was used for the simulations, with the pollution inventory specified on a fine grid. In spite of the inherent uncertainties associated with the emission inventory as well as the chemistry and meteorology of the air quality simulation tool, model evaluations showed acceptable PM predictions, while demonstrating the need to include the interaction between meteorology and emissions in an interactive mode in the model, a capability currently unavailable in MODELS-3/CMAQ when dealing with PM. Sensitivity studies on boundary influence indicate an insignificant regional (advection) contribution of PM to the study area. The contribution of secondary particles to the occurrence of high PM events was negligible. High PM episodes in the study area, therefore, are purely local events that depend largely on local meteorological conditions. The major PM emission sources were identified as vehicular activities on unpaved/paved roads and wind-blown dust. The results will be of immediate utility in devising PM mitigation strategies for the study area, which is one of the US EPA-designated non-attainment areas with respect to PM.
Evaluation of the Community Multi-scale Air Quality (CMAQ) ...
The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions that include bug fixes and various other improvements to the modeling system. In the fall of 2015, CMAQ version 5.1 was released. This new version of CMAQ contains important bug fixes for several issues that were identified in CMAQv5.0.2 and additionally includes updates to other portions of the code. Several annual, and numerous episodic, CMAQv5.1 simulations were performed to assess the impact of these improvements on the model results. These results will be presented, along with a base evaluation of the performance of the CMAQv5.1 modeling system against available surface and upper-air measurements for the time period simulated. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, proces
Vehicle Technology Simulation and Analysis Tools | Transportation Research | NREL
NREL's vehicle technology simulation and analysis tools evaluate vehicle technologies with the potential to achieve significant fuel savings and emission reductions. Among them is the Automotive Deployment Options Projection Tool (ADOPT), a modeling tool that estimates vehicle technology ...
A flexible tool for hydraulic and water quality performance analysis of green infrastructure
NASA Astrophysics Data System (ADS)
Massoudieh, A.; Alikhani, J.
2017-12-01
Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. To be useful for evaluating the effect of design configurations on the long-term performance of GIs, models should be able to represent the processes within GIs with good fidelity. In this presentation, a sophisticated yet flexible tool for hydraulic and water quality assessment of GIs will be introduced. The tool can be used by design engineers and researchers to capture and explore the effect of design factors and of the properties of the media employed on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool capable of accurately simulating GI system components and specific biogeochemical processes affecting contaminants, such as evapotranspiration, plant uptake, reactions, and particle-associated transport, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. The process-based model framework developed here can be used to model a diverse range of GI practices such as stormwater ponds, green roofs, retention ponds, bioretention systems, infiltration trenches, permeable pavement and other custom-designed combinatory systems. An example application of the system to evaluate the performance of a rain-garden system will be demonstrated.
NASA Astrophysics Data System (ADS)
Lovette, J. P.; Duncan, J. M.; Band, L. E.
2016-12-01
Watershed management requires information on the hydrologic impacts of local to regional land use, land cover and infrastructure conditions. Management of runoff volumes, storm flows, and water quality can benefit from large scale, "top-down" screening tools, using readily available information, as well as more detailed, "bottom-up" process-based models that explicitly track local runoff production and routing from sources to receiving water bodies. Regional scale data, available nationwide through the NHD+, and top-down models based on aggregated catchment information provide useful tools for estimating regional patterns of peak flows, volumes and nutrient loads at the catchment level. Management impacts can be estimated with these models, but they have limited ability to resolve impacts beyond simple changes to land cover proportions. Alternatively, distributed process-based models provide more flexibility in modeling management impacts by resolving spatial patterns of nutrient source, runoff generation, and uptake. This bottom-up approach can incorporate explicit patterns of land cover, drainage connectivity, and vegetation extent, but is typically applied over smaller areas. Here, we first model peak flood flows and nitrogen loads across North Carolina's 70,000 NHD+ catchments using USGS regional streamflow regression equations and the SPARROW model. We also estimate management impact by altering aggregated sources in each of these models. To address the missing spatial implications of the top-down approach, we further explore the demand for riparian buffers as a management strategy, simulating the accumulation of nutrient sources along flow paths and the potential mitigation of these sources through forested buffers. We use the Regional Hydro-Ecological Simulation System (RHESSys) to model changes across several basins in North Carolina's Piedmont and Blue Ridge regions, ranging in size from 15 to 1,130 km2. The two approaches provide a complementary set of tools for large area screening, followed by smaller, more process-based assessment and design tools.
NASA Technical Reports Server (NTRS)
Anderson, Frederick; Biezad, Daniel J.
1994-01-01
This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project - an integration of the AirCraft SYNThesis (ACSYNT) design code with the USAF DATCOM code that estimates stability derivatives. Both of these codes are available to universities. These programs are then linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling-qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented into the design curriculum and is being used by students, who view it as a significant advance over prior methods.
Thermal modelling using discrete vasculature for thermal therapy: a review
Kok, H.P.; Gellermann, J.; van den Berg, C.A.T.; Stauffer, P.R.; Hand, J.W.; Crezee, J.
2013-01-01
Reliable temperature information during clinical hyperthermia and thermal ablation is essential for adequate treatment control, but conventional temperature measurements do not provide 3D temperature information. Treatment planning is a very useful tool to improve treatment quality and substantial progress has been made over the last decade. Thermal modelling is a very important and challenging aspect of hyperthermia treatment planning. Various thermal models have been developed for this purpose, with varying complexity. Since blood perfusion is such an important factor in thermal redistribution of energy in in vivo tissue, thermal simulations are most accurately performed by modelling discrete vasculature. This review describes the progress in thermal modelling with discrete vasculature for the purpose of hyperthermia treatment planning and thermal ablation. There has been significant progress in thermal modelling with discrete vasculature. Recent developments have made real-time simulations possible, which can provide feedback during treatment for improved therapy. Future clinical application of thermal modelling with discrete vasculature in hyperthermia treatment planning is expected to further improve treatment quality. PMID:23738700
Proceedings of the 1987 conference on tools for the simulation profession
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hawkins, R.; Klukis, K.
1987-01-01
This book covers the proceedings of the 1987 conference on tools for the simulation profession. Some of the topics are: SIMULACT: a generic tool for simulating distributed systems; ESL language simulation of spacecraft batteries; and Trends in global cadmium levels from increased use of fossil fuels.
IgSimulator: a versatile immunosequencing simulator.
Safonova, Yana; Lapidus, Alla; Lill, Jennie
2015-10-01
The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic since the gold standard datasets needed to validate them are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool, which addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can be easily adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. safonova.yana@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Risk Reduction and Training using Simulation Based Tools - 12180
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Irin P.
2012-07-01
Process Modeling and Simulation (M&S) has been used for many years in manufacturing and similar domains as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M&S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M&S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to capture every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they represent important steps toward the use of simulation based tools in the nuclear domain, the applications described here cover only a small cross section of possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domain, this may mean the creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)
NASA Astrophysics Data System (ADS)
Akbariyeh, S.; Snow, D. D.; Bartelt-Hunt, S.; Li, X.; Li, Y.
2015-12-01
Contamination of groundwater from nitrogen fertilizers and pesticides in agricultural lands is an important environmental and water quality management issue. It is well recognized that in agriculturally intensive areas, fertilizers and pesticides may leach through the vadose zone and eventually reach groundwater, impacting future uses of this limited resource. While numerical models are commonly used to simulate the fate and transport of agricultural contaminants, few models have been validated against realistic three-dimensional soil lithology, hydrological conditions, and historical changes in groundwater quality. In this work, contamination of groundwater in the Nebraska Management Systems Evaluation Area (MSEA) site was simulated based on extensive field data, including (1) lithology from 69 wells and 11 test holes; (2) surface soil type, land use, and surface elevations; (3) 5 years of groundwater level and flow velocity records; (4) daily meteorological monitoring; (5) 5 years of seasonal irrigation records; (6) 5 years of spatially intensive contaminant concentration data from 40 multilevel monitoring wells; and (7) detailed cultivation records. Using these data, a three-dimensional vadose zone lithological framework was developed using a commercial software tool (RockWorks™). Based on the interpolated lithology, a hydrological model was developed using HYDRUS-3D to simulate water flow and contaminant transport. The model was validated through comparison of simulated atrazine and nitrate concentrations with historical data from 40 wells and multilevel samplers. The validated model will be used to predict potential changes in groundwater quality due to agricultural contamination under future climate scenarios in the High Plains Aquifer system.
Fabian, Maria Patricia; Adamkiewicz, Gary; Stout, Natasha Kay; Sandel, Megan; Levy, Jonathan Ian
2013-01-01
Background Although indoor environmental conditions can affect pediatric asthmatics, few studies have characterized the impact of building interventions on asthma-related outcomes. Simulation models can evaluate such complex systems but have not been applied in this context. Objective To evaluate the impacts of building interventions on indoor environmental quality and pediatric asthma healthcare utilization, and to compare intervention costs with healthcare costs and energy savings. Methods We applied our previously developed discrete event simulation model (DEM) to simulate the effect of environmental factors, medication compliance, seasonality, and medical history on: 1) pollutant concentrations indoors, and 2) asthma outcomes in low-income multi-family housing. We estimated healthcare utilization and costs at baseline and subsequent to interventions, and then compared healthcare costs to energy savings and intervention costs. Results Interventions such as integrated pest management and repairing kitchen exhaust fans led to 7-12% reductions in serious asthma events with 1-3 year payback periods. Weatherization efforts targeted solely towards tightening a building envelope led to 20% more serious asthma events, but bundling with repairing kitchen exhaust fans and eliminating indoor sources (e.g. gas stoves or smokers) mitigated this impact. Conclusion Our pediatric asthma model provides a tool to prioritize individual and bundled building interventions based on their impact on health and cost, highlighting the tradeoffs between weatherization, indoor air quality, and health. Our work bridges the gap between clinical and environmental health sciences by increasing physicians' understanding of the impact that home environmental changes can have on their patients' asthma. PMID:23910689
Chauvin, Anthony; Truchot, Jennifer; Bafeta, Aida; Pateron, Dominique; Plaisance, Patrick; Yordanov, Youri
2018-04-01
The number of trials assessing Simulation-Based Medical Education (SBME) interventions has rapidly expanded. Many studies show that potential flaws in the design, conduct and reporting of randomized controlled trials (RCTs) can bias their results. We conducted a methodological review of RCTs assessing SBME interventions in Emergency Medicine (EM) and examined their methodological characteristics. We searched MEDLINE via PubMed for RCTs that assessed a simulation intervention in EM, published in 6 general and internal medicine journals and in the top 10 EM journals. The Cochrane Collaboration risk of bias tool was used to assess risk of bias, intervention reporting was evaluated based on the "template for intervention description and replication" checklist, and methodological quality was evaluated with the Medical Education Research Study Quality Instrument (MERSQI). Report selection and data extraction were done by 2 independent researchers. Of 1394 RCTs screened, 68 trials assessed an SBME intervention; they represent one quarter of our sample. Cardiopulmonary resuscitation (CPR) is the most frequent topic (81%). Random sequence generation and allocation concealment were performed correctly in 66 and 49% of trials, respectively. Blinding of participants and assessors was performed correctly in 19 and 68%, respectively. Risk of attrition bias was low in three-quarters of the studies (n = 51). Risk of selective reporting bias was unclear in nearly all studies. The mean MERSQI score was 13.4/18; 4% of the reports provided a description allowing replication of the intervention. Trials assessing simulation represent one quarter of RCTs in EM. Their quality remains unclear, and reproducing the interventions appears challenging due to reporting issues.
Image Quality Ranking Method for Microscopy
Koho, Sami; Fazeli, Elnaz; Eriksson, John E.; Hänninen, Pekka E.
2016-01-01
Automated analysis of microscope images is necessitated by the increased need for high-resolution follow-up of events in time. Manually finding the right images to analyze, or to eliminate from data analysis, is a common day-to-day problem in microscopy research today, and the constantly growing size of image datasets does not help the matter. We propose a simple method and a software tool for sorting images within a dataset according to their relative quality. We demonstrate the applicability of our method in finding good quality images in a STED microscope sample preparation optimization image dataset. The results are validated by comparisons to subjective opinion scores, as well as five state-of-the-art blind image quality assessment methods. We also show how our method can be applied to eliminate useless out-of-focus images in a High-Content-Screening experiment. We further evaluate the ability of our image quality ranking method to detect out-of-focus images, by extensive simulations, and by comparing its performance against previously published, well-established microscopy autofocus metrics. PMID:27364703
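The paper's ranking metric is not reproduced here; as a stand-in, the sketch below ranks images with a common sharpness proxy (variance of the Laplacian), which similarly pushes out-of-focus frames to the bottom of the list.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def focus_score(image):
    """Variance-of-Laplacian sharpness score; a simple stand-in for the
    paper's quality metric, not a replication of it."""
    return laplace(image.astype(float)).var()

rng = np.random.default_rng(3)
sharp = rng.random((64, 64))                   # high-frequency "in focus" frame
blurred = gaussian_filter(sharp, sigma=2.0)    # simulated out-of-focus frame

ranked = sorted({"sharp": sharp, "blurred": blurred}.items(),
                key=lambda kv: focus_score(kv[1]), reverse=True)
print([name for name, _ in ranked])            # 'sharp' ranks first
```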
NASA Astrophysics Data System (ADS)
Manna, Piero; Bonfante, Antonello; Basile, Angelo; Langella, Giuliano; Agrillo, Antonietta; De Mascellis, Roberto; Florindo Mileti, Antonio; Minieri, Luciana; Orefice, Nadia; Terribile, Fabio
2014-05-01
The SOILCONSWEB Project aims to create a decision support system operating at the landscape scale (Spatial-DSS) for the protection and management of soils in both agricultural and environmental issues; it is a cyber-infrastructure built on remote servers operating through the web at www.landconsultingweb.eu. It includes, among others, a series of tools specifically designed for viticulture aimed at high-quality wine production. The system is realized thanks to a collaboration between the University of Naples Federico II, CNR ISAFoM, Ariespace srl and SeSIRCA-Campania Region within a 5-year LIFE+ project funded by the European Community. The system includes tools based on modelling procedures at different levels of complexity, some of which are specifically designed for viticulture issues. One of the implemented models derives from the original desktop-based SWAP model (Kroes et al, 2008). It can be run "on the fly" through a very user-friendly web interface. This tool, thanks to a model based on the Richards equation, can produce data on vineyard water stress, simulating the soil water balances of the different soil types within an area of interest. Thanks to a specific program developed within the project activities, the Spatial-DSS acquires point weather data every day and automatically spatializes them with geostatistical approaches in order to use the data as input for the SPA (Soil-Plant-Atmosphere) model runs, in particular for defining the upper boundary condition (rainfall, and temperatures to estimate ET0 by the Hargreaves model). Soil hydraulic properties (47 soil profiles within the study area), also essential for the model simulations, were measured in the laboratory using Wind's approach or estimated through the HYPRES PTF. Water retention and hydraulic conductivity relationships were parameterized according to the van Genuchten-Mualem model. Decision makers (individuals, groups of interest and public bodies) can use the DSS for real-time (or near real-time) access to critical, accurate, complete and up-to-date spatial data and model output held and processed in multiple data stores. The system allows users interested in viticulture free, easy and immediate access to a range of environmental data and information very useful for quality wine production, and especially for viticulture planning and management in a context of environmental sustainability. It produces detailed spatial documents, reports and maps on a series of questions, including the identification and description of terroir characteristics. Once connected to the S-DSS, the user can select an area of interest (i.e. farm, municipality, district) or draw it, and obtain in real time a series of detailed information regarding that specific area, including maps and reports of landscape physical factors (i.e. soils, climate, geology, geomorphology, etc.), viticulture suitability, plant disease data and modelling, trends of viticulture years, bioclimatic indexes, etc. The user can also choose between different options, such as the time period of the simulation runs or the type of output (maps, reports or graphs) to be produced by the system. The S-DSS is being developed, tested and applied in an area of about 20,000 ha in the south of Italy (Valle Telesina, in the Campania region) mainly devoted to quality wine production (designations of origin DOC and DOCG). Key words: Decision Support System, spatial data, model simulation, soil hydrological properties, cyber infrastructure.
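The Hargreaves temperature-based ET0 estimate used for the upper boundary condition has a compact standard (FAO-56) form, sketched below; the input values are hypothetical and the function is illustrative, not the Spatial-DSS implementation.

```python
import math

def et0_hargreaves(tmin_c, tmax_c, ra_mj_m2_day):
    """Hargreaves reference evapotranspiration (mm/day). Ra is the
    extraterrestrial radiation in MJ m-2 day-1, converted with 0.408 mm/MJ."""
    tmean = (tmin_c + tmax_c) / 2.0
    return 0.0023 * 0.408 * ra_mj_m2_day * (tmean + 17.8) \
        * math.sqrt(tmax_c - tmin_c)

# Hypothetical summer day: Tmin 14 C, Tmax 29 C, Ra 40 MJ m-2 day-1
print(round(et0_hargreaves(14.0, 29.0, ra_mj_m2_day=40.0), 2), "mm/day")
```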
Current concepts in simulation-based trauma education.
Cherry, Robert A; Ali, Jameel
2008-11-01
The use of simulation-based technology in trauma education has focused on providing a safe and effective alternative to the more traditional methods that are used to teach technical skills and critical concepts in trauma resuscitation. Trauma team training using simulation-based technology is also being used to develop skills in leadership, team-information sharing, communication, and decision-making. The integration of simulators into medical student curriculum, residency training, and continuing medical education has been strongly recommended by the American College of Surgeons as an innovative means of enhancing patient safety, reducing medical errors, and performing a systematic evaluation of various competencies. Advanced human patient simulators are increasingly being used in trauma as an evaluation tool to assess clinical performance and to teach and reinforce essential knowledge, skills, and abilities. A number of specialty simulators in trauma and critical care have also been designed to meet these educational objectives. Ongoing educational research is still needed to validate long-term retention of knowledge and skills, provide reliable methods to evaluate teaching effectiveness and performance, and to demonstrate improvement in patient safety and overall quality of care.
Internet Tomography in Support of Internet and Network Simulation and Emulation Modelling
NASA Astrophysics Data System (ADS)
Moloisane, A.; Ganchev, I.; O'Droma, M.
This paper addresses Internet performance measurement data extracted through Internet tomography techniques and metrics, and how such data may be used to enhance the capacity of network simulation and emulation modelling. The advantages of network simulation and emulation as a means to aid the design and development of the component networks that make up the Internet, and that are fundamental to its ongoing evolution, are highlighted. The Internet's rapid growth has spurred development of new protocols and algorithms to meet changing operational requirements such as security, multicast delivery, mobile networking, policy management, and quality of service (QoS) support. Both the development and the evaluation of these operational tools require answering many design and operational questions. Creating the technical support required by network engineers and managers in their efforts to answer these questions is in itself a major challenge. Within the Internet, the number and range of services supported continues to grow exponentially, from legacy and client/server applications to VoIP, multimedia streaming services and interactive multimedia services. Services have their own distinctive requirements and idiosyncrasies. They respond differently to bandwidth limitations, latency and jitter problems. They generate different types of "conversations" between end-user terminals, back-end resources and middle-tier servers. To add to the complexity, each new or enhanced service introduced onto the network contends for available bandwidth with every other service. To help ensure that networking products and resources are designed and developed to handle the diverse conditions encountered in real Internet environments, network simulation and emulation modelling is a valuable tool, and is becoming a critical element, in networking product and application design and development. The better these laboratory tools reflect real-world environments and conditions, the more helpful they will be to designers.
Hong, Eun-Mi; Shelton, Daniel; Pachepsky, Yakov A; Nam, Won-Ho; Coppock, Cary; Muirhead, Richard
2017-02-01
Knowledge of the microbial quality of irrigation waters is extremely limited. For this reason, the US FDA has promulgated the Produce Rule, mandating the testing of irrigation water sources for many farms. The rule requires the collection and analysis of at least 20 water samples over two to four years to adequately evaluate the quality of water intended for produce irrigation. The objective of this work was to evaluate the effect of interannual weather variability on surface water microbial quality. We used the Soil and Water Assessment Tool model to simulate E. coli concentrations in the Little Cove Creek; this is a perennial creek located in an agricultural watershed in south-eastern Pennsylvania. The model performance was evaluated using the US FDA regulatory microbial water quality metrics of geometric mean (GM) and the statistical threshold value (STV). Using the 90-year time series of weather observations, we simulated and randomly sampled the time series of E. coli concentrations. We found that weather conditions of a specific year may strongly affect the evaluation of microbial quality and that the long-term assessment of microbial water quality may be quite different from the evaluation based on short-term observations. The variations in microbial concentrations and water quality metrics were affected by location, wetness of the hydrological years, and seasonality, with 15.7-70.1% of samples exceeding the regulatory threshold. The results of this work demonstrate the value of using modeling to design and evaluate monitoring protocols to assess the microbial quality of water used for produce irrigation. Copyright © 2016 Elsevier Ltd. All rights reserved.
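The two regulatory metrics used to evaluate the simulations can be computed from a sample set as follows: the GM is the geometric mean, and the STV is the modeled 90th percentile of a lognormal fit to the data. The sample values below are invented; the GM ≤ 126 and STV ≤ 410 CFU/100 mL limits are the Produce Rule criteria for generic E. coli.

```python
import numpy as np

def gm_stv(ecoli_cfu_100ml):
    """FDA Produce Safety Rule water-quality metrics: geometric mean (GM)
    and statistical threshold value (STV), the modeled 90th percentile
    of a lognormal fit to the sample set."""
    logs = np.log10(np.asarray(ecoli_cfu_100ml, dtype=float))
    gm = 10 ** logs.mean()
    stv = 10 ** (logs.mean() + 1.282 * logs.std(ddof=1))  # z(0.90) = 1.282
    return gm, stv

# Hypothetical 20-sample record (CFU/100 mL), as the rule requires
samples = [12, 35, 240, 88, 15, 410, 61, 19, 150, 33,
           72, 26, 300, 45, 11, 95, 180, 28, 54, 130]
gm, stv = gm_stv(samples)
print(f"GM = {gm:.0f}, STV = {stv:.0f} (limits: GM <= 126, STV <= 410)")
```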
Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu Dianliang; Zhu Hongmin; Shanghai Key Laboratory of Advance Manufacturing Environment
Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operation. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment of automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods of manual, semi-automatic and automatic tools and equipment are discussed. Finally, an application in the assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable in the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.
[High fidelity simulation : a new tool for learning and research in pediatrics].
Bragard, I; Farhat, N; Seghaye, M-C; Schumacher, K
2016-10-01
Caring for a sick child is a high-risk activity that requires technical and non-technical skills, a requirement related to several factors such as the rarity of certain events or the stress of caring for a child. Under these conditions, medical simulation provides a learning environment without risk, control of variables, reproducibility of situations, and confrontation with rare events. In this article, we describe the steps of a simulation session and outline the current knowledge on the use of simulation in paediatrics. A simulation session includes seven phases, following the model of Peter Dieckmann; the scenario and the debriefing form the heart of the learning experience. Several studies have shown the advantages of simulation for paediatric training in terms of changes in attitudes, skills and knowledge. Some studies have demonstrated a beneficial transfer to practice. In conclusion, simulation offers great potential for training and research in paediatrics. The establishment of a collaborative research program by the whole simulation community would help ensure that this type of training improves the quality of care.
Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON.
Mattioni, Michele; Cohen, Uri; Le Novère, Nicolas
2012-01-01
The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization, the lack of a standard means to save the results of simulations, and no way to store the model geometry with the results. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well designed Python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and the HDF standard format to store them. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structure, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability to save numerical results allows users to perform additional analysis on their previous simulations.
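A minimal sketch of the workflow Neuronvisio extends: build and run a single-compartment NEURON model through its Python API, then persist the traces in HDF5. The HDF5 layout here is an assumption for illustration, not Neuronvisio's actual storage schema; running it requires NEURON and h5py to be installed.

```python
from neuron import h
import h5py

h.load_file("stdrun.hoc")           # load NEURON's standard run system
soma = h.Section(name="soma")
soma.L = soma.diam = 20.0           # um
soma.insert("hh")                   # Hodgkin-Huxley channels

stim = h.IClamp(soma(0.5))
stim.delay, stim.dur, stim.amp = 5.0, 50.0, 0.1   # ms, ms, nA

v = h.Vector().record(soma(0.5)._ref_v)  # membrane potential trace
t = h.Vector().record(h._ref_t)          # time trace
h.finitialize(-65.0)
h.continuerun(100.0)                     # run 100 ms

# Persist results in HDF5 (layout is hypothetical, for illustration)
with h5py.File("run.h5", "w") as f:
    f.create_dataset("t_ms", data=t.as_numpy())
    f.create_dataset("v_mV", data=v.as_numpy())
```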
Variation simulation for compliant sheet metal assemblies with applications
NASA Astrophysics Data System (ADS)
Long, Yufeng
Sheet metals are widely used in discrete products, such as automobiles, aircraft, furniture and electronic appliances, due to their good manufacturability and low cost. A typical automotive body assembly consists of more than 300 parts welded together in more than 200 assembly fixture stations. Such an assembly system is usually quite complex and takes a long time to develop. As automotive customers demand products of increasing quality in a shorter time, engineers in the automotive industry turn to computer-aided engineering (CAE) tools for help. Computers are an invaluable resource for engineers, not only to simplify and automate the design process, but also to share design specifications with manufacturing groups so that production systems can be tooled up quickly and efficiently. It is therefore beneficial to develop computerized simulation and evaluation tools for the development of automotive body assembly systems. It is well known that assembly architectures (joints, fixtures, and assembly lines) have a profound impact on the dimensional quality of compliant sheet metal assemblies. To evaluate sheet metal assembly architectures, a special dimensional analysis tool needs to be developed for predicting dimensional variation of the assembly. Corresponding systematic tools can then be established to help engineers select assembly architectures. In this dissertation, a unified variation model is developed to predict variation in compliant sheet metal assemblies by considering fixture-induced rigid-body motion, deformation and springback. Based on the unified variation model, variation propagation models in multiple assembly stations with various configurations are established. To evaluate the dimensional capability of assembly architectures, quantitative indices are proposed based on the sensitivity matrix, which are independent of the variation level of the process. Examples are given to demonstrate their applications in selecting robust assembly architectures, and some useful guidelines for the selection of assembly architectures are summarized. In addition, to enhance fault diagnosis, a systematic methodology is proposed for the selection of measurement configurations. Specifically, principles involved in selecting measurements are generalized first; the corresponding quantitative indices are then developed to evaluate the measurement configurations, and finally, examples are presented.
Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.
2012-01-01
Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that the calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow × Water-quality criterion) at each flow interval.
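A minimal Python sketch of the duration-curve construction described above: exceedance probabilities are computed from sorted daily flows (a Weibull plotting position is assumed here), and the load-duration curve applies the loading equation Load = Flow × criterion at each interval; the flow values and criterion below are hypothetical, not WATER outputs.

```python
import numpy as np

def flow_duration(flows):
    """Exceedance probability for each modeled daily flow.
    Zero percent corresponds to the highest discharge in the record,
    matching the convention described above."""
    sorted_flows = np.sort(flows)[::-1]                    # descending
    n = len(sorted_flows)
    exceedance = 100.0 * np.arange(1, n + 1) / (n + 1)     # Weibull plotting position
    return exceedance, sorted_flows

def load_duration(flows, criterion):
    """Allowable load at each flow interval: Load = Flow * criterion."""
    exceedance, sorted_flows = flow_duration(flows)
    return exceedance, sorted_flows * criterion

# Example: 60 years of synthetic daily flows and a hypothetical criterion
rng = np.random.default_rng(0)
flows = rng.lognormal(mean=1.0, sigma=0.8, size=60 * 365)
exc, loads = load_duration(flows, criterion=0.5)
```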
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. M. Sexton,; A. M. Sadeghi,; X. Zhang,
The value of watershed-scale hydrologic and water quality models to ecosystem management is increasingly evident as more programs adopt these tools to evaluate the effectiveness of different management scenarios and their impact on the environment. The quality of precipitation data is critical for appropriate application of watershed models. In small watersheds, where no dense rain gauge network is available, modelers are faced with a dilemma in choosing between different data sets. In this study, we used the German Branch (GB) watershed (~50 km²), which is included in the USDA Conservation Effects Assessment Project (CEAP), to examine the implications of using surface rain gauge and next-generation radar (NEXRAD) precipitation data sets on the performance of the Soil and Water Assessment Tool (SWAT). The GB watershed is located in the Coastal Plain of Maryland on the eastern shore of Chesapeake Bay. Stream flow estimation results using surface rain gauge data seem to indicate the importance of using rain gauges aligned with the direction of the storm pattern with respect to the watershed. In the absence of a spatially representative network of rain gauges within the watershed, NEXRAD data produced good estimates of stream flow at the outlet of the watershed. Three NEXRAD datasets, including (1) non-corrected (NC), (2) bias-corrected (BC), and (3) inverse distance weighted (IDW) corrected NEXRAD data, were produced. Nash-Sutcliffe efficiency coefficients for daily stream flow simulation using these three NEXRAD data sets ranged from 0.46 to 0.58 during calibration and from 0.68 to 0.76 during validation. Overall, correcting NEXRAD with rain gauge data is promising for producing better hydrologic modeling results. Given the multiple precipitation datasets and corresponding simulations, we explored the combination of the multiple simulations using Bayesian model averaging.
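For reference, the Nash-Sutcliffe efficiency used above to score the daily streamflow simulations can be computed as follows; this is the standard definition, shown as a stand-alone sketch with no connection to the SWAT setup itself.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect fit; values below 0 mean the observed mean
    is a better predictor than the model."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / variance
```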
Challenges of NDE Simulation Tool
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.
2015-01-01
Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems. The current state of the art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large-scale, complex-geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper also discusses examples of the use of simulation tools at NASA to develop new damage characterization methods, and the associated challenges of validating those methods.
Cheng, Adam; Hunt, Elizabeth A; Donoghue, Aaron; Nelson, Kristen; Leflore, Judy; Anderson, JoDee; Eppich, Walter; Simon, Robert; Rudolph, Jenny; Nadkarni, Vinay
2011-02-01
Over the past decade, medical simulation has evolved into an essential component of pediatric resuscitation education and team training. Evidence to support its value as an adjunct to traditional methods of education is expanding; however, large multicenter studies are very rare. Simulation-based researchers currently face many challenges related to small sample sizes, poor generalizability, and paucity of clinically proven and relevant outcome measures. The Examining Pediatric Resuscitation Education Using Simulation and Scripting (EXPRESS) pediatric simulation research collaborative was formed in an attempt to directly address and overcome these challenges. The primary mission of the EXPRESS collaborative is to improve the delivery of medical care to critically ill children by answering important research questions pertaining to pediatric resuscitation and education and is focused on using simulation either as a key intervention of interest or as the outcome measurement tool. Going forward, the collaborative aims to expand its membership internationally and collectively identify pediatric resuscitation and simulation-based research priorities and use these to guide future projects. Ultimately, we hope that with innovative and high-quality research, the EXPRESS pediatric simulation research collaborative will help to build momentum for simulation-based research on an international level. Copyright © 2011 Society for Simulation in Healthcare
Kunz, Derek; Pariyadath, Manoj; Wittler, Mary; Askew, Kim; Manthey, David; Hartman, Nicholas
2017-06-01
Arthrocentesis is an important skill for physicians in multiple specialties. Recent studies indicate a superior safety and performance profile for this procedure using ultrasound guidance for needle placement, and improving quality of care requires a valid measurement of competency using this modality. We endeavored to create a validated tool to assess the performance of this procedure using the modified Delphi technique and experts in multiple disciplines across the United States. We derived a 22-item checklist designed to assess competency for the completion of ultrasound-guided arthrocentesis, which demonstrated a Cronbach's alpha of 0.89, indicating an excellent degree of internal consistency. Although we were able to demonstrate content validity for this tool, further validity evidence should be acquired after the tool is used and studied in clinical and simulated contexts. © 2017 by the American Institute of Ultrasound in Medicine.
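A minimal sketch of the internal-consistency statistic reported above; the rating matrix below is hypothetical, but the formula is the standard Cronbach's alpha for a raters-by-items score matrix.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_ratings x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                      # number of checklist items
    item_vars = scores.var(axis=0, ddof=1)   # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical: 30 observed performances scored on a 22-item checklist
rng = np.random.default_rng(1)
ratings = rng.integers(0, 3, size=(30, 22))
print(round(cronbach_alpha(ratings), 2))
```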
NASA Technical Reports Server (NTRS)
Redhed, D. D.
1978-01-01
Three possible goals for the Numerical Aerodynamic Simulation Facility (NASF) are: (1) a computational fluid dynamics (as opposed to aerodynamics) algorithm development tool; (2) a specialized research laboratory facility for nearly intractable aerodynamics problems that industry encounters; and (3) a facility for industry to use in its normal aerodynamics design work that requires high computing rates. The central system issue for industry use of such a computer is the quality of the user interface as implemented in some kind of a front end to the vector processor.
A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meier, Horst; Laurischkat, Roman; Zhu Junhong
One main influence on the dimensional accuracy in robot-based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematics results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate for these deviations offline, a model-based approach has been developed, consisting of a finite element approach to simulate the sheet forming and a multi-body system modeling the compliant robot structure. This paper describes the implementation and experimental verification of the multi-body system model and its included compensation method.
A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.
Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier
2018-05-01
The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect of guaranteeing the student's competence level. The aim of this study was to conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which comprised 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). Participants were 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients. This tool has three main components: the nursing process, communication skills, and safety management. Copyright © 2018 Elsevier Ltd. All rights reserved.
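A sketch of the exploratory-factor-analysis step described above, using scikit-learn's FactorAnalysis with varimax rotation on hypothetical 0/1/2 item scores; the variance-explained figure is approximated from squared loadings and is not the study's own computation.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical item scores: 499 students x 27 items, each scored 0/1/2
rng = np.random.default_rng(2)
scores = rng.integers(0, 3, size=(499, 27)).astype(float)

fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(scores)

# Proportion of total variance captured by each factor, approximated
# from the sum of squared loadings per factor (a common convention).
loadings = fa.components_                                  # shape (3, 27)
explained = (loadings ** 2).sum(axis=1) / scores.var(axis=0, ddof=1).sum()
print(explained, explained.sum())
```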
Wu, Hao; Zhang, Yan; Yu, Qi; Ma, Weichun
2018-04-01
In this study, the authors endeavored to develop an effective framework for improving local urban air quality on meso-micro scales in cities in China that are experiencing rapid urbanization. Within this framework, the integrated Weather Research and Forecasting (WRF)/CALPUFF modeling system was applied to simulate the concentration distributions of typical pollutants (particulate matter with an aerodynamic diameter <10 μm [PM10], sulfur dioxide [SO2], and nitrogen oxides [NOx]) in the urban area of Benxi. Statistical analyses were performed to verify the credibility of this simulation, including the meteorological fields and concentration fields. The sources were then categorized using two different classification methods (the district-based and type-based methods), and the contributions to the pollutant concentrations from each source category were computed to provide a basis for appropriate control measures. The statistical indexes showed that CALMET had sufficient ability to predict the meteorological conditions, such as the wind fields and temperatures, which provided meteorological data for the subsequent CALPUFF run. The simulated concentrations from CALPUFF showed considerable agreement with the observed values but were generally underestimated. The spatial-temporal concentration pattern revealed that the maximum concentrations tended to appear in the urban centers and during the winter. In terms of their contributions to pollutant concentrations, the districts of Xihu, Pingshan, and Mingshan all affected the urban air quality to different degrees. According to the type-based classification, which categorized the pollution sources as belonging to the Bengang Group, large point sources, small point sources, and area sources, the source apportionment showed that the Bengang Group, the large point sources, and the area sources had considerable impacts on urban air quality. Finally, combined with the industrial characteristics, detailed control measures were proposed with which local policy makers could improve the urban air quality in Benxi. In summary, the results of this study showed that this framework can credibly and effectively improve urban air quality, based on the source apportionment of atmospheric pollutants. The authors endeavored to build an effective framework based on the integrated WRF/CALPUFF system to improve air quality in cities in China on meso-micro scales. Via this framework, the integrated modeling tool can be used to accurately characterize the meteorological fields, concentration fields, and source apportionments of pollutants in the target area. The impacts of the classified sources on air quality, together with the industrial characteristics, can inform more effective control measures for improving air quality. Through this case study, the technical framework developed here, particularly the source apportionment, could provide important data and technical support for policy makers assessing air pollution at the scale of a city in China or elsewhere.
[Development of a Text-Data Based Learning Tool That Integrates Image Processing and Displaying].
Shinohara, Hiroyuki; Hashimoto, Takeyuki
2015-01-01
We developed a text-data-based learning tool that integrates image processing and display in Excel. The knowledge required to program this tool is limited to using absolute, relative, and composite cell references and learning approximately 20 mathematical functions available in Excel. The new tool is capable of resolution translation, geometric transformation, spatial-filter processing, Radon transform, Fourier transform, convolutions, correlations, deconvolutions, wavelet transform, mutual information, and simulation of proton density-, T1-, and T2-weighted MR images. The processed images of 128 x 128 or 256 x 256 pixels are observed directly within Excel worksheets without using any particular image display software. The results of image processing using this tool were compared with those using the C language, and the new tool was judged to have sufficient accuracy to be practically useful. The images displayed on Excel worksheets were compared with images from binary-data display software; this comparison indicated that the image quality of the Excel worksheets was nearly equal to the latter in visual impression. Since image processing is performed on text data, the process is visible and can be checked directly against the mathematical equations within the program. We conclude that the newly developed tool is adequate as a computer-assisted learning tool for use in medical image processing.
NASA Astrophysics Data System (ADS)
Curci, Gabriele; Falasca, Serena
2017-04-01
Deterministic air quality forecasting is routinely carried out at many local environmental agencies in Europe and throughout the world by means of Eulerian chemistry-transport models. The skill of these models in predicting the ground-level concentrations of relevant pollutants (ozone, nitrogen dioxide, particulate matter) a few days ahead has greatly improved in recent years, but it is not yet always compliant with the quality level required for decision making (e.g. the European Commission has set a maximum uncertainty of 50% on daily values of relevant pollutants). Post-processing of deterministic model output is thus still regarded as a useful tool for making the forecast more reliable. In this work, we test several bias correction techniques applied to a long-term dataset of air quality forecasts over Europe and Italy. We used the WRF-CHIMERE modelling system, which provides operational experimental chemical weather forecasts at CETEMPS (http://pumpkin.aquila.infn.it/forechem/), to simulate the years 2008-2012 at low resolution over Europe (0.5° x 0.5°) and moderate resolution over Italy (0.15° x 0.15°). We compared the simulated dataset with available observations from the European Environment Agency database (AirBase) and characterized model skill and compliance with EU legislation using the Delta tool from the FAIRMODE project (http://fairmode.jrc.ec.europa.eu/). The bias correction techniques adopted are, in order of complexity: (1) application of multiplicative factors calculated as the ratio of model-to-observed concentrations averaged over the previous days; (2) correction of the statistical distribution of model forecasts, in order to make it similar to that of the observations; (3) development and application of Model Output Statistics (MOS) regression equations. We illustrate the differences and advantages/disadvantages of the three approaches. All the methods are relatively easy to implement for other modelling systems.
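Minimal sketches of the first two bias-correction techniques listed above, assuming NumPy arrays of past forecasts and matching observations; method (3), the MOS regression, would typically be an ordinary least-squares fit on the same paired data. These are illustrations of the general techniques, not the paper's exact implementation.

```python
import numpy as np

def multiplicative_correction(forecast_today, past_forecasts, past_observations):
    """Method (1): scale today's forecast by the observed-to-modeled
    concentration ratio averaged over the previous days."""
    factor = np.mean(past_observations) / np.mean(past_forecasts)
    return forecast_today * factor

def quantile_mapping(forecast, past_forecasts, past_observations):
    """Method (2): map a forecast value from the model's climatological
    distribution onto the observed distribution (empirical CDF matching)."""
    fc_sorted = np.sort(past_forecasts)
    ob_sorted = np.sort(past_observations)
    # percentile of the forecast within the model distribution
    p = np.searchsorted(fc_sorted, forecast) / len(fc_sorted)
    p = np.clip(p, 0.0, 1.0)
    return np.quantile(ob_sorted, p)
```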
Adams, Russell; Quinn, Paul F; Perks, Matthew; Barber, Nicholas J; Jonczyk, Jennine; Owen, Gareth J
2016-12-01
High-resolution water quality data have recently become widely available from numerous catchment-based monitoring schemes. However, the models that can reproduce time series of concentrations or fluxes have not kept pace with the advances in monitoring data. Model performance at predicting phosphorus (P) and sediment concentrations has frequently been poor, with models not fit for purpose except for predicting annual losses. Here, the data from the Eden Demonstration Test Catchments (DTC) project have been used to calibrate the Catchment Runoff Attenuation Flux Tool (CRAFT), a new, parsimonious model developed with the aim of modelling both the generation and attenuation of nutrients and sediments in small to medium sized catchments. CRAFT runs on an hourly timestep and can calculate the mass of sediments and nutrients transported by three flow pathways representing rapid surface runoff, fast subsurface drainage and slow groundwater flow (baseflow). The attenuation feature of the model is introduced here; this enables surface runoff, and contaminants transported via this pathway, to be delayed in reaching the catchment outlet. It was used to investigate some hypotheses of nutrient and sediment transport in the Newby Beck Catchment (NBC). Model performance was assessed using a suite of metrics including visual best fit and the Nash-Sutcliffe efficiency; this multi-metric approach may be a better assessment method for water quality models than any single metric. Furthermore, it was found that when the aim of the simulations was to reproduce the time series of total P (TP) or total reactive P (TRP) with the best visual fit, attenuation was required. The model will be used in the future to explore the impacts on water quality of different mitigation options in the catchment; these will include attenuation of surface runoff. Copyright © 2016 Elsevier B.V. All rights reserved.
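The attenuation feature described above can be pictured as routing the surface-runoff pathway through a linear store that delays delivery to the outlet; the sketch below illustrates that concept only and is not CRAFT's actual formulation (the rate constant and input pulse are hypothetical).

```python
import numpy as np

def attenuate(flux, k):
    """Route a surface-runoff contaminant flux through a linear store,
    delaying its arrival at the outlet: each step releases a fraction k
    of the current storage."""
    store, out = 0.0, []
    for f in flux:
        store += f
        release = k * store
        store -= release
        out.append(release)
    return np.array(out)

# Hypothetical hourly P flux pulse attenuated with k = 0.2 per hour
pulse = np.zeros(48)
pulse[5] = 10.0
print(attenuate(pulse, 0.2)[:10].round(2))
```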
A drill-soil system modelization for future Mars exploration
NASA Astrophysics Data System (ADS)
Finzi, A. E.; Lavagna, M.; Rocchitelli, G.
2004-01-01
This paper presents a first approach to the problem of modelling a drilling process to be carried out in the space environment by a dedicated payload. Systems designed to work in space face very strict requirements in many different areas, such as thermal response, electric power demand and reliability. Models devoted to simulating operational behaviour therefore represent a fundamental help in the design phase and greatly improve the quality of the final product. As the required power is the crucial constraint for drilling devices, the tool-soil interaction modelling and simulation are aimed at computing the power demand as a function of both the drill and the soil parameters. The tool and the soil were first studied separately, and their interaction was then analyzed. The DeeDri system, designed by Tecnospazio and intended to be part of the lander components in NASA's Mars Sample Return Mission, has been taken as the reference tool. The DeeDri system is a complex rotary tool devoted to soil perforation and sample collection; it has to operate in a Martian zone made of rocks similar to terrestrial basalt, so the modelling is restricted to the interaction between the tool and materials belonging to this rock set. The geometric modelling of the tool is handled by a finite element approach with a Lagrangian formulation: for the static analysis a refined model is assumed, considering both the actual geometry of the head and the rod screws; a simplified model has been used for the dynamic analysis. The soil representation is based on the Mohr-Coulomb crack criterion, and an Eulerian approach was initially selected to model it. However, software limitations in dealing with the tool-soil interface definition required assuming a Lagrangian formulation for the soil too. The interaction between the soil and the tool has been modeled by extending Nishimatsu's two-dimensional rock-cutting theory to rotating perforation tools. A detailed analysis of the finite element choice for each part of the tool is presented together with the static analysis results. The dynamic analysis results are limited to the first impact phenomenon between the rock and the tool head. The validity of both the theoretical and numerical models is confirmed by the good agreement between simulation results and data from experiments performed at the Tecnospazio facilities.
NASA Astrophysics Data System (ADS)
Rimbault, C.; Le Meur, G.; Blampuy, F.; Bambade, P.; Schulte, D.
2009-12-01
Depolarization is a new feature in the beam-beam simulation tool GUINEA-PIG++ (GP++). The results of this simulation are studied and compared with another beam-beam simulation tool, CAIN, considering different beam parameters for the International Linear Collider (ILC) with a centre-of-mass energy of 500 GeV.
Knowledge Management tools integration within DLR's concurrent engineering facility
NASA Astrophysics Data System (ADS)
Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.
The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of the application of the KM tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. The establishment of this practice will result in a much more extensive exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, the outcome of the studies will exhibit higher quality in the design of space systems.
Water quality modelling of an impacted semi-arid catchment using flow data from the WEAP model
NASA Astrophysics Data System (ADS)
Slaughter, Andrew R.; Mantel, Sukhmani K.
2018-04-01
The continuous decline in water quality in many regions is forcing a shift from quantity-based water resources management to a greater emphasis on water quality management. Water quality models can act as invaluable tools, as they facilitate a conceptual understanding of processes affecting water quality and can be used to investigate the water quality consequences of management scenarios. In South Africa, the Water Quality Systems Assessment Model (WQSAM) was developed as a management-focussed water quality model that is kept relatively simple so that it can utilise the small amount of observed data available. Importantly, WQSAM explicitly links to the systems (yield) models routinely used in water resources management in South Africa by using their flow output to drive water quality simulations. Although WQSAM has been shown to be able to represent the variability of water quality in South African rivers, its focus on management from a South African perspective limits its use to those southern African regions for which specific systems model setups exist. Facilitating the use of WQSAM within catchments outside of southern Africa, and within catchments for which these systems model setups do not exist, would require WQSAM to be able to link to a simple-to-use and internationally applied systems model. One such systems model is the Water Evaluation and Planning (WEAP) model, which incorporates a rainfall-runoff component (natural hydrology) and reservoir storage, return flows and abstractions (systems modelling), but within which the water quality modelling facilities are rudimentary. The aims of the current study were therefore to: (1) adapt the WQSAM model to be able to use the flow outputs of the WEAP model as input; and (2) provide an initial assessment of how successful this linkage was by applying the WEAP and WQSAM models to the Buffalo River, a small, semi-arid and impacted catchment in the Eastern Cape of South Africa, for historical conditions. The simulations of the two models were compared to the available observed data, with the initial focus within WQSAM on simulating instream total dissolved solids (TDS) and nutrient concentrations. The WEAP model was able to adequately simulate flow in the Buffalo River catchment, with consideration of human inputs and outputs. WQSAM was adapted to successfully take the flow output of the WEAP model as input, and the simulations of nutrients by WQSAM provided a good representation of the variability of observed nutrient concentrations in the catchment. This study showed that the WQSAM model is able to accept flow inputs from the WEAP model, and that this approach is able to provide satisfactory estimates of both flow and water quality for a small, semi-arid and impacted catchment. It is hoped that this research will encourage the application of WQSAM to a greater number of catchments within southern Africa and beyond.
Technical Note: Detective quantum efficiency simulation of a-Se imaging detectors using ARTEMIS.
Fang, Yuan; Ito, Takaaki; Nariyuki, Fumito; Kuwabara, Takao; Badano, Aldo; Karim, Karim S
2017-08-01
This work studies the detective quantum efficiency (DQE) of a-Se-based solid-state x-ray detectors for medical imaging applications using ARTEMIS, a Monte Carlo simulation tool for modeling x-ray photon, electron and charged carrier transport in semiconductors in the presence of an applied electric field. ARTEMIS is used to model the signal formation process in a-Se. The simulation model includes x-ray photon and high-energy electron interactions, and detailed electron-hole pair transport with applied detector bias, taking into account drift, diffusion, Coulomb interactions, recombination and trapping. For experimental validation, the DQE performance of prototype a-Se detectors was measured following IEC Testing Standard 62220-1-3. Comparison of simulated and experimental DQE results shows reasonable agreement for RQA beam qualities. Experimental validation demonstrated agreement within 5% between simulated and experimental DQE results for spatial frequencies above 0.25 cycles/mm, using a uniform applied electric field, for RQA beam qualities (RQA5, RQA7 and RQA9). Results include two different prototype detectors with thicknesses of 240 μm and 1 mm. ARTEMIS can be used to model the DQE of a-Se detectors as a function of x-ray energy, detector thickness, and spatial frequency. The ARTEMIS model can be used to improve understanding of the physics of x-ray interactions in a-Se and in optimization studies for the development of novel medical imaging applications. © 2017 American Association of Physicists in Medicine.
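For orientation, a sketch of the frequency-dependent DQE relation commonly used with such measurements, assuming a measured MTF, a noise power spectrum normalized by the squared mean signal (NNPS), and an incident photon fluence q; the numerical inputs below are illustrative placeholders, not ARTEMIS outputs.

```python
import numpy as np

def dqe(mtf, nnps, q):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident photon
    fluence (photons/mm^2) and NNPS is the noise power spectrum
    normalized by the squared mean signal (units mm^2)."""
    mtf = np.asarray(mtf, dtype=float)
    nnps = np.asarray(nnps, dtype=float)
    return mtf ** 2 / (q * nnps)

# Hypothetical inputs over spatial frequencies 0..5 cycles/mm
f = np.linspace(0.0, 5.0, 51)
mtf = np.exp(-0.5 * f)          # illustrative MTF roll-off
nnps = np.full_like(f, 5e-6)    # illustrative flat NNPS
print(dqe(mtf, nnps, q=3e5)[:5])
```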
Enabling Efficient Climate Science Workflows in High Performance Computing Environments
NASA Astrophysics Data System (ADS)
Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.
2015-12-01
A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require considerable forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and, finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work is commonplace, with operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth System Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.
Spectrum simulation in DTSA-II.
Ritchie, Nicholas W M
2009-10-01
Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
Olsha-Yehiav, Maya; Einbinder, Jonathan S.; Jung, Eunice; Linder, Jeffrey A.; Greim, Julie; Li, Qi; Schnipper, Jeffrey L.; Middleton, Blackford
2006-01-01
Quality Dashboards (QD) is a condition-specific, actionable web-based application for quality reporting and population management that is integrated into the Electronic Health Record (EHR). Using server-based graphic web controls in a .Net environment to construct Quality Dashboards allows customization of the reporting tool without the need to rely on a commercial business intelligence tool. Quality Dashboards will improve patient care and quality outcomes as clinicians utilize the reporting tool for population management. PMID:17238671
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools, including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations of motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in the open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations of motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
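The over-plotting and difference-plotting comparison described above reduces to something like the following sketch, here with two hypothetical altitude traces standing in for the outputs of two 6-DOF tools on a common time base.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical altitude traces from two simulation tools, shared time base
t = np.linspace(0.0, 60.0, 601)
alt_tool_a = 1000.0 + 50.0 * np.sin(0.1 * t)
alt_tool_b = alt_tool_a + 0.05 * np.random.default_rng(3).standard_normal(t.size)

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(t, alt_tool_a, label="tool A")        # over-plot both solutions
ax1.plot(t, alt_tool_b, "--", label="tool B")
ax1.set_ylabel("altitude (m)")
ax1.legend()
ax2.plot(t, alt_tool_b - alt_tool_a)           # difference plot
ax2.set_ylabel("B - A (m)")
ax2.set_xlabel("time (s)")
plt.show()
```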
New perspectives in hydrodynamic radial polishing techniques for optical surfaces
NASA Astrophysics Data System (ADS)
Ruiz, Elfego; Sohn, Erika; Luna, Esteban; Salas, Luis; Cordero, Alberto; González, Jorge; Núñez, Manuel; Salinas, Javier; Cruz-González, Irene; Valdés, Jorge; Cabrera, Victor; Martínez, Benjamín
2004-09-01
To overcome the limitations of classic polishing techniques, a novel hydrodynamic radial polishing tool (HyDRa) is presented; it is useful for corrective lapping and fine polishing of diverse materials by means of a low-cost abrasive flux and a hydrostatic suspension system that avoids contact between the tool and the working surface. This tool enables work on flat or curved surfaces currently up to two and a half meters in diameter. It has the advantage of avoiding fallen edges during the polishing process as well as reducing tool wear and deformation. The functioning principle is based on the generation of a high-velocity, high-pressure abrasive emulsion flux with radial geometry. The polishing process is repeatable through control of the tool's operational parameters, achieving high degrees of precision and accuracy on optical and semiconductor surfaces, with removal rates of up to 9 mm³/hour and promising excellent surface polishing quality. An additional advantage of this new tool is the possibility of performing interferometric measurements during the polishing process without dismounting the working surface. The advantages of this method, numerical simulations, and experimental results are described.
Anderson, Jane A; Godwin, Kyler M; Saleem, Jason J; Russell, Scott; Robinson, Joshua J; Kimmel, Barbara
2014-12-01
This article reports redesign strategies identified to create a web-based user interface for the Self-management TO Prevent (STOP) Stroke Tool. Members of a Stroke Quality Improvement Network (N = 12) viewed a visualization video of a proposed prototype and provided feedback on implementation barriers/facilitators. Stroke-care providers (N = 10) tested the web-based prototype in think-aloud sessions of simulated clinic visits. Participants' dialogues were coded into themes. Access to comprehensive information and the automated features/systematized processes were the primary accessibility and usability facilitator themes. The need for training, time to complete the tool, and computer-centric care were identified as possible usability barriers. Patient accountability, reminders for best practice, goal-focused care, and communication/counseling themes indicate that the STOP Stroke Tool supports the paradigm of patient-centered care. The STOP Stroke Tool was found to prompt clinicians on secondary stroke-prevention clinical practice guidelines, facilitate comprehensive documentation of evidence-based care, and support clinicians in providing patient-centered care through the shared decision-making process that occurred while using the action-planning/goal-setting feature of the tool. © The Author(s) 2013.
SU-E-T-103: Development and Implementation of Web Based Quality Control Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Studinski, R; Taylor, R; Angers, C
Purpose: Historically, many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches present significant logistical challenges and are not predisposed to data review and approval. It has been our group's aim to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ~22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
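For illustration only, and not QATrack+'s actual schema, a Django model for a reviewable QC test result might look like the following sketch; the field names, review states, and tolerance check are assumptions.

```python
from django.db import models

class TestResult(models.Model):
    """Hypothetical QC test result with a formalized review state."""
    UNREVIEWED, APPROVED, REJECTED = "U", "A", "R"
    REVIEW_CHOICES = [(UNREVIEWED, "Unreviewed"),
                      (APPROVED, "Approved"),
                      (REJECTED, "Rejected")]

    unit = models.CharField(max_length=64)        # e.g. a linac or CT simulator
    test_name = models.CharField(max_length=128)
    value = models.FloatField()
    performed = models.DateTimeField()
    review_status = models.CharField(max_length=1,
                                     choices=REVIEW_CHOICES,
                                     default=UNREVIEWED)

    def within_tolerance(self, reference, tolerance):
        """Simple pass/fail check against a reference value."""
        return abs(self.value - reference) <= tolerance
```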
Simulating the detection and classification of high-redshift supernovae with HARMONI on the ELT
NASA Astrophysics Data System (ADS)
Bounissou, S.; Thatte, N.; Zieleniewski, S.; Houghton, R. C. W.; Tecza, M.; Hook, I.; Neichel, B.; Fusco, T.
2018-02-01
We present detailed simulations of integral field spectroscopic observations of a supernova in a host galaxy at z ~ 3, as observed by the HARMONI spectrograph on the Extremely Large Telescope, assisted by laser tomographic adaptive optics. The goal of the simulations, using the HSIM simulation tool, is to determine whether HARMONI can discern the supernova type from spectral features in the supernova spectrum. We find that in a 3-hour observation, covering the near-infrared H and K bands, at a spectral resolving power of ~3000, and using the 20×20 mas spaxel scale, we can robustly classify Type Ia supernovae and determine their redshifts up to 80 days past maximum light (20 days in the supernova rest frame). We show that HARMONI will provide spectra at z ~ 3 that are of comparable (or better) quality to the best spectra we can currently obtain at z ~ 1, thus allowing studies of cosmic expansion rates to be pushed to substantially higher redshifts.
Regional Climate Simulation and Data Assimilation with Variable-Resolution GCMs
NASA Technical Reports Server (NTRS)
Fox-Rabinovitz, Michael S.
2002-01-01
Variable-resolution GCMs using a global stretched grid (SG) with enhanced regional resolution over one or multiple areas of interest represent a viable new approach to regional climate/climate-change and data assimilation studies and applications. The multiple areas of interest, at least one within each global quadrant, include the major global mountains and the major global monsoonal circulations over North America, South America, India-China, and Australia. They can also include the polar domains, and the European and African regions. The SG approach provides efficient regional downscaling to mesoscales, and it is an ideal tool for representing consistent interactions of global/large scales and regional/meso scales while preserving the high quality of the global circulation. Basically, the SG-GCM simulations are no different from traditional uniform-grid GCM simulations apart from the use of a variable-resolution grid. Several existing SG-GCMs developed by major centers and groups are briefly described. The main discussion is based on the GEOS (Goddard Earth Observing System) SG-GCM regional climate simulations.
NASA Astrophysics Data System (ADS)
Matras, A.; Kowalczyk, R.
2014-11-01
The analysis of machining accuracy after free-form surface milling simulations (based on machining EN AW-7075 alloy) for different machining strategies (Level Z, Radial, Square, Circular) is presented in this work. The milling simulations were performed using CAD/CAM Esprit software. The accuracy of the obtained allowance is defined as the difference between the theoretical surface of the workpiece (the surface designed in CAD software) and the machined surface after a milling simulation. The difference between the two surfaces describes a roughness value, which results from the mapping of the tool shape onto the machined surface. The accuracy of the remaining allowance directly indicates the surface quality after finish machining. The described methodology of using CAD/CAM software can shorten the design time of a machining process for free-form surface milling on a 5-axis CNC milling machine, by omitting the need to machine the part in order to measure the machining accuracy for the selected strategies and cutting data.
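The roughness left by tool-shape mapping can be estimated analytically for a ball-end mill from the tool radius and stepover; the following sketch uses the standard scallop-height relation, which is not taken from the Esprit simulations above, and the dimensions are hypothetical.

```python
import math

def scallop_height(tool_radius, stepover):
    """Theoretical cusp height left by a ball-end mill on a flat region:
    h = R - sqrt(R^2 - (s/2)^2), approximately s^2 / (8R) for small stepovers."""
    return tool_radius - math.sqrt(tool_radius ** 2 - (stepover / 2.0) ** 2)

# Hypothetical: 10 mm diameter ball-end mill, 0.4 mm stepover
print(f"{scallop_height(5.0, 0.4) * 1000:.1f} um")   # -> 4.0 um
```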
Infrared imagery acquisition process supporting simulation and real image training
NASA Astrophysics Data System (ADS)
O'Connor, John
2012-05-01
The increasing use of infrared sensors requires development of advanced infrared training and simulation tools to meet current Warfighter needs. In order to prepare the force, a challenge exists for training and simulation images to be both realistic and consistent with each other to be effective and avoid negative training. The US Army Night Vision and Electronic Sensors Directorate has corrected this deficiency by developing and implementing infrared image collection methods that meet the needs of both real image trainers and real-time simulations. The author presents innovative methods for collection of high-fidelity digital infrared images and the associated equipment and environmental standards. The collected images are the foundation for US Army, and USMC Recognition of Combat Vehicles (ROC-V) real image combat ID training and also support simulations including the Night Vision Image Generator and Synthetic Environment Core. The characteristics, consistency, and quality of these images have contributed to the success of these and other programs. To date, this method has been employed to generate signature sets for over 350 vehicles. The needs of future physics-based simulations will also be met by this data. NVESD's ROC-V image database will support the development of training and simulation capabilities as Warfighter needs evolve.
Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.
2016-01-01
The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design') is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show sensitivity of the stability and control characteristics and an approach to highlight potential weight savings by identifying over-design.
Simulation-Based Cryosurgery Intelligent Tutoring System (ITS) Prototype
Sehrawat, Anjali; Keelan, Robert; Shimada, Kenji; Wilfong, Dona M.; McCormick, James T.; Rabin, Yoed
2015-01-01
As a part of an ongoing effort to develop computerized training tools for cryosurgery, the current study presents a proof-of-concept for a computerized tool for cryosurgery tutoring. The tutoring system lists geometrical constraints of cryoprobes placement, simulates cryoprobe insertion, displays a rendered shape of the prostate, enables distance measurements, simulates the corresponding thermal history, and evaluates the mismatch between the target region shape and a pre-selected planning isotherm. The quality of trainee planning is measured in comparison with a computer-generated planning, created for each case study by previously developed planning algorithms. Two versions of the tutoring system have been tested in the current study: (i) an unguided version, where the trainee can practice cases in unstructured sessions, and (ii) an intelligent tutoring system (ITS), which forces the trainee to follow specific steps, believed by the authors to potentially shorten the learning curve. While the tutoring level in this study aims only at geometrical constraints on cryoprobe placement and the resulting thermal histories, it creates a unique opportunity to gain insight into the process outside of the operation room. Posttest results indicate that the ITS system may be more beneficial than the non-ITS system, but the proof-of-concept is demonstrated with either system. PMID:25941163
Simulation-Based Cryosurgery Intelligent Tutoring System Prototype.
Sehrawat, Anjali; Keelan, Robert; Shimada, Kenji; Wilfong, Dona M; McCormick, James T; Rabin, Yoed
2016-04-01
As a part of an ongoing effort to develop computerized training tools for cryosurgery, the current study presents a proof of concept for a computerized tool for cryosurgery tutoring. The tutoring system lists geometrical constraints of cryoprobes placement, simulates cryoprobe insertion, displays a rendered shape of the prostate, enables distance measurements, simulates the corresponding thermal history, and evaluates the mismatch between the target region shape and a preselected planning isotherm. The quality of trainee planning is measured in comparison with a computer-generated planning, created for each case study by previously developed planning algorithms. The following two versions of the tutoring system have been tested in the current study: (1) an unguided version, where the trainee can practice cases in unstructured sessions and (2) an intelligent tutoring system, which forces the trainee to follow specific steps, believed by the authors to potentially shorten the learning curve. Although the tutoring level in this study aims only at geometrical constraints on cryoprobe placement and the resulting thermal histories, it creates a unique opportunity to gain insight into the process outside the operation room. Post-test results indicate that the intelligent tutoring system may be more beneficial than the nonintelligent tutoring system, but the proof of concept is demonstrated with either system. © The Author(s) 2015.
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond
Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
Coaching the Debriefer: Peer Coaching to Improve Debriefing Quality in Simulation Programs.
Cheng, Adam; Grant, Vincent; Huffman, James; Burgess, Gavin; Szyld, Demian; Robinson, Traci; Eppich, Walter
2017-10-01
Formal faculty development programs for simulation educators are costly and time-consuming. Peer coaching integrated into the teaching flow can enhance an educator's debriefing skills. We provide a practical guide for the who, what, when, where, why, and how of peer coaching for debriefing in simulation-based education. Peer coaching offers advantages such as psychological safety and team building, and it can benefit both the educator who is receiving feedback and the coach who is providing it. A feedback form for effective peer coaching includes the following: (1) psychological safety, (2) framework, (3) method/strategy, (4) content, (5) learner centeredness, (6) co-facilitation, (7) time management, (8) difficult situations, (9) debriefing adjuncts, and (10) individual style and experience. Institutional backing of peer coaching programs can facilitate implementation and sustainability. Program leaders should communicate the need and benefits, establish program goals, and provide assessment tools, training, structure, and evaluation to optimize chances of success.
Simulation-based training in echocardiography.
Biswas, Monodeep; Patel, Rajendrakumar; German, Charles; Kharod, Anant; Mohamed, Ahmed; Dod, Harvinder S; Kapoor, Poonam Malhotra; Nanda, Navin C
2016-10-01
The knowledge gained from echocardiography is paramount for the clinician in diagnosing, interpreting, and treating various forms of disease. While cardiologists traditionally have undergone training in this imaging modality during their fellowship, many other specialties are beginning to show interest as well, including intensive care, anesthesia, and primary care trainees, in both transesophageal and transthoracic echocardiography. Advances in technology have led to the development of simulation programs accessible to trainees to help gain proficiency in the nuances of obtaining quality images, in a low stress, pressure free environment, often with a functioning ultrasound probe and mannequin that can mimic many of the pathologies seen in living patients. Although there are various training simulation programs each with their own benefits and drawbacks, it is clear that these programs are a powerful tool in educating the trainee and likely will lead to improved patient outcomes. © 2016, Wiley Periodicals, Inc.
SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.
Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko
2013-05-01
Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
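As an illustration of the parameter-estimation step described above, the following is a minimal sketch of genetic-algorithm fitting of a simulation model to observed data, assuming a toy one-metabolite model dC/dt = k_in - k_out*C; it is not the SS-GA implementation, and all names and settings are invented.

```python
# Minimal GA parameter estimation sketch: fit (k_in, k_out) of a toy
# one-metabolite model to noisy synthetic observations (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 25)

def simulate(k_in, k_out):
    # Closed-form solution of dC/dt = k_in - k_out*C with C(0) = 0.
    return (k_in / k_out) * (1.0 - np.exp(-k_out * t))

observed = simulate(2.0, 0.5) + rng.normal(0.0, 0.05, t.size)  # synthetic "data"

def fitness(params):
    k_in, k_out = params
    return -np.sum((simulate(k_in, k_out) - observed) ** 2)  # higher is better

pop = rng.uniform(0.1, 5.0, size=(40, 2))            # initial random population
for generation in range(60):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]           # truncation selection
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.1, (30, 2))
    pop = np.vstack([parents, np.clip(children, 1e-3, None)])  # elitism + mutation

best = pop[np.argmax([fitness(p) for p in pop])]
print("estimated (k_in, k_out):", best)
```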
Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel
NASA Astrophysics Data System (ADS)
Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania
2007-05-01
Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing since these affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and the residual stresses in machining are absolutely necessary. In this work, a two-dimensional Finite Element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate the model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses on the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The obtained results permit the conclusion that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than with chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.
NASA Astrophysics Data System (ADS)
Hoang, Linh; Schneiderman, Elliot; Mukundan, Rajith; Moore, Karen; Owens, Emmet; Steenhuis, Tammo
2017-04-01
Surface runoff is the primary mechanism transporting substances such as sediments, agricultural chemicals, and pathogens to receiving waters. In order to predict runoff and pollutant fluxes, and to evaluate management practices, it is essential to accurately predict the areas generating surface runoff, which depend on the type of runoff: infiltration-excess runoff and saturation-excess runoff. The watershed of Cannonsville reservoir is part of the New York City water supply system that provides high quality drinking water to nine million people in New York City (NYC) and nearby communities. Previous research identified saturation-excess runoff as the dominant runoff mechanism in this region. The Soil and Water Assessment Tool (SWAT) is a promising tool to simulate the NYC watershed given its broad application and good performance in many watersheds of different scales worldwide, its ability to model water quality responses, and its capacity to evaluate the effect of management practices on water quality at the watershed scale. However, SWAT predicts runoff based mainly on soil and land use characteristics, and implicitly considers only infiltration-excess runoff. Therefore, we developed a modified version of SWAT, referred to as SWAT-Hillslope (SWAT-HS), which explicitly simulates saturation-excess runoff by redefining Hydrological Response Units (HRUs) based on wetness classes with varying soil water storage capacities, and by introducing a surface aquifer with the ability to route interflow from "drier" to "wetter" wetness classes. SWAT-HS was first tested at Town Brook, a 37 km2 headwater watershed draining to the Cannonsville reservoir, using a single sub-basin for the whole watershed. SWAT-HS performed well, and predicted streamflow yielded Nash-Sutcliffe Efficiencies of 0.68 and 0.87 at the daily and monthly time steps, respectively. More importantly, it predicted the spatial distribution of saturated areas accurately. Based on the good performance in the Town Brook watershed, we scale up the application of SWAT-HS to the 1160 km2 Cannonsville watershed utilizing a setup of multiple sub-basins, and evaluate the model performance on flow simulation at different gauged locations in the watershed. Results from flow predictions will be used as a basis for evaluating the ability of SWAT-HS to make sediment and nutrient loading estimates.
Flared landing approach flying qualities. Volume 2: Appendices
NASA Technical Reports Server (NTRS)
Weingarten, Norman C.; Berthe, Charles J., Jr.; Rynaski, Edmund G.; Sarrafian, Shahan K.
1986-01-01
An in-flight research study was conducted utilizing the USAF/Total In-Flight Simulator (TIFS) to investigate longitudinal flying qualities for the flared landing approach phase of flight. A consistent set of data was generated for: determining what kind of command response the pilot prefers/requires in order to flare and land an aircraft with precision, and refining a time history criterion that took into account all the necessary variables and the characteristics that would accurately predict flying qualities. Seven evaluation pilots participated representing NASA Langley, NASA Dryden, Calspan, Boeing, Lockheed, and DFVLR (Braunschweig, Germany). The results of the first part of the study provide guidelines to the flight control system designer, using MIL-F-8785-(C) as a guide, that yield the dynamic behavior pilots prefer in flared landings. The results of the second part provide the flying qualities engineer with a derived flying qualities predictive tool which appears to be highly accurate. This time-domain predictive flying qualities criterion was applied to the flight data as well as six previous flying qualities studies, and the results indicate that the criterion predicted the flying qualities level 81% of the time and the Cooper-Harper pilot rating, within + or - 1, 60% of the time.
Flared landing approach flying qualities. Volume 1: Experiment design and analysis
NASA Technical Reports Server (NTRS)
Weingarten, Norman C.; Berthe, Charles J., Jr.; Rynaski, Edmund G.; Sarrafian, Shahan K.
1986-01-01
An inflight research study was conducted utilizing the USAF Total Inflight Simulator (TIFS) to investigate longitudinal flying qualities for the flared landing approach phase of flight. The purpose of the experiment was to generate a consistent set of data for: (1) determining what kind of commanded response the pilot prefers in order to flare and land an airplane with precision, and (2) refining a time history criterion that took into account all the necessary variables and their characteristics that would accurately predict flying qualities. The result of the first part provides guidelines to the flight control system designer, using MIL-F-8785-(C) as a guide, that yield the dynamic behavior pilots prefer in flared landings. The results of the second part provide the flying qualities engineer with a newly derived flying qualities predictive tool which appears to be highly accurate. This time domain predictive flying qualities criterion was applied to the flight data as well as six previous flying qualities studies, and the results indicate that the criterion predicted the flying qualities level 81% of the time and the Cooper-Harper pilot rating, within + or - 1, 60% of the time.
Judd, Belinda K; Scanlan, Justin N; Alison, Jennifer A; Waters, Donna; Gordon, Christopher J
2016-08-05
Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected, 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparisons with APPs from 5-week clinical placements in hospital and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1 and 2 week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items recorded greater than 25% of scores as 'not-assessed' by clinical educators, which impacted on the suitability of the APP tool in this simulation format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.
Towards a genetics-based adaptive agent to support flight testing
NASA Astrophysics Data System (ADS)
Cribbs, Henry Brown, III
Although the benefits of aircraft simulation have been known since the late 1960s, simulation almost always entails interaction with a human test pilot. This "pilot-in-the-loop" simulation process provides useful evaluative information to the aircraft designer and provides a training tool to the pilot. Emulation of a pilot during the early phases of the aircraft design process might provide designers a useful evaluative tool. Machine learning might emulate a pilot in a simulated aircraft/cockpit setting. Preliminary work in the application of machine learning techniques, such as reinforcement learning, to aircraft maneuvering has shown promise. These studies used simplified interfaces between the machine learning agent and the aircraft simulation, and the simulations employed low-order equivalent system models. High-fidelity aircraft simulations exist, such as the simulations developed by NASA at its Dryden Flight Research Center. To expand the application domain of reinforcement learning to aircraft design, this study presents a series of experiments that examine a reinforcement learning agent in the role of test pilot. The NASA X-31 and F-106 high-fidelity simulations provide realistic aircraft for the agent to maneuver. The approach of the study is to examine an agent possessing a genetic-based, artificial neural network to approximate long-term, expected cost (Bellman value) in a basic maneuvering task. The experiments evaluate different learning methods based on a common feedback function and an identical task. The learning methods evaluated are: Q-learning, Q(lambda)-learning, SARSA learning, and SARSA(lambda) learning. Experimental results indicate that, while prediction errors remain quite high, similar, repeatable behaviors occur in both aircraft. This similarity of behavior demonstrates portability of the agent between aircraft with different handling qualities (dynamics). Besides the adaptive behavior aspects of the study, the genetic algorithm used in the agent is shown to play an additive role in the shaping of the artificial neural network to the prediction task.
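For readers unfamiliar with the learning methods compared above, here is a minimal tabular sketch of the Q-learning and SARSA update rules; the study itself used a genetically shaped neural network rather than a table, so this is illustrative only, and all sizes and constants are invented.

```python
# Tabular Q-learning vs. SARSA update sketch (illustrative, not the study's agent).
import numpy as np

n_states, n_actions = 5, 2
alpha, gamma = 0.1, 0.95            # learning rate and discount (assumed values)
Q = np.zeros((n_states, n_actions))

def q_learning_update(s, a, r, s_next):
    # Off-policy: bootstrap from the greedy action in the next state.
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

def sarsa_update(s, a, r, s_next, a_next):
    # On-policy: bootstrap from the action actually taken next.
    Q[s, a] += alpha * (r + gamma * Q[s_next, a_next] - Q[s, a])

# The lambda variants, Q(lambda) and SARSA(lambda), additionally decay an
# eligibility trace over recently visited state-action pairs.
q_learning_update(0, 1, r=-1.0, s_next=2)
sarsa_update(2, 0, r=-0.5, s_next=3, a_next=1)
print(Q[0, 1], Q[2, 0])
```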
Spanager, Lene; Beier-Holgersen, Randi; Dieckmann, Peter; Konge, Lars; Rosenberg, Jacob; Oestergaard, Doris
2013-11-01
Nontechnical skills are essential for safe and efficient surgery. The aim of this study was to evaluate the reliability of an assessment tool for surgeons' nontechnical skills, Non-Technical Skills for Surgeons dk (NOTSSdk), and the effect of rater training. A 1-day course was conducted for 15 general surgeons in which they rated surgeons' nontechnical skills in 9 video recordings of scenarios simulating real intraoperative situations. Data were gathered from 2 sessions separated by a 4-hour training session. Interrater reliability was high for both pretraining ratings (Cronbach's α = .97) and posttraining ratings (Cronbach's α = .98). There was no statistically significant development in assessment skills. The D study showed that 2 untrained raters or 1 trained rater was needed to obtain generalizability coefficients >.80. The high pretraining interrater reliability indicates that videos were easy to rate and Non-Technical Skills for Surgeons dk easy to use. This implies that Non-Technical Skills for Surgeons dk (NOTSSdk) could be an important tool in surgical training, potentially improving safety and quality for surgical patients. Copyright © 2013 Elsevier Inc. All rights reserved.
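The interrater reliability reported above is Cronbach's alpha. A minimal sketch of its computation follows, treating raters as the "items" and the rated video scenarios as cases; the score matrix is invented toy data, not the study's ratings.

```python
# Cronbach's alpha sketch: alpha = k/(k-1) * (1 - sum(item variances)/total variance).
import numpy as np

scores = np.array([  # rows: 9 video scenarios, columns: 3 raters (toy data)
    [3, 4, 3], [2, 2, 3], [4, 4, 5], [5, 4, 5], [3, 3, 3],
    [2, 1, 2], [4, 5, 4], [3, 4, 4], [5, 5, 4],
], dtype=float)

def cronbach_alpha(x):
    k = x.shape[1]                               # number of raters/items
    item_vars = x.var(axis=0, ddof=1).sum()      # sum of per-rater variances
    total_var = x.sum(axis=1).var(ddof=1)        # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```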
Discrete event simulation for healthcare organizations: a tool for decision making.
Hamrock, Eric; Paige, Kerrie; Parks, Jennifer; Scheulen, James; Levin, Scott
2013-01-01
Healthcare organizations face challenges in efficiently accommodating increased patient demand with limited resources and capacity. The modern reimbursement environment prioritizes the maximization of operational efficiency and the reduction of unnecessary costs (i.e., waste) while maintaining or improving quality. As healthcare organizations adapt, significant pressures are placed on leaders to make difficult operational and budgetary decisions. In lieu of hard data, decision makers often base these decisions on subjective information. Discrete event simulation (DES), a computerized method of imitating the operation of a real-world system (e.g., healthcare delivery facility) over time, can provide decision makers with an evidence-based tool to develop and objectively vet operational solutions prior to implementation. DES in healthcare commonly focuses on (1) improving patient flow, (2) managing bed capacity, (3) scheduling staff, (4) managing patient admission and scheduling procedures, and (5) using ancillary resources (e.g., labs, pharmacies). This article describes applicable scenarios, outlines DES concepts, and describes the steps required for development. An original DES model developed to examine crowding and patient flow for staffing decision making at an urban academic emergency department serves as a practical example.
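A minimal sketch of the DES idea described above follows, assuming a single treatment bay with Poisson arrivals and exponential service times; real healthcare DES models of the kind described add beds, staff schedules, and patient routing, and all rates here are invented.

```python
# Minimal discrete event simulation: one server, FIFO queue, event heap.
import heapq, random

random.seed(1)
ARRIVAL_RATE, SERVICE_RATE = 1 / 10.0, 1 / 8.0   # per minute (assumed rates)

events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
queue, server_busy, waits, now = [], False, [], 0.0
while events and now < 8 * 60:                    # simulate one 8-hour shift
    now, kind = heapq.heappop(events)
    if kind == "arrival":
        queue.append(now)                         # record arrival time
        heapq.heappush(events, (now + random.expovariate(ARRIVAL_RATE), "arrival"))
    if kind == "departure":
        server_busy = False
    if not server_busy and queue:                 # start treating next patient
        waits.append(now - queue.pop(0))
        server_busy = True
        heapq.heappush(events, (now + random.expovariate(SERVICE_RATE), "departure"))

print(f"mean wait: {sum(waits)/len(waits):.1f} min over {len(waits)} patients")
```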
NASA Astrophysics Data System (ADS)
Dooraghi, Alex A.; Tringe, Joseph W.
2018-04-01
To evaluate conventional munitions, we simulated an x-ray computed tomography (CT) system for generating radiographs from nominal x-ray energies of 6 or 9 megaelectron volts (MeV). CT simulations, informed by measured data, allow for optimization of both system design and acquisition techniques necessary to enhance image quality. MCNP6 radiographic simulation tools were used to model ideal detector responses (DR) that assume either (1) a detector response proportional to photon flux (N) or (2) a detector response proportional to energy flux (E). As scatter may become significant with MeV x-ray systems, simulations were performed with and without the inclusion of object scatter. Simulations were compared against measurements of a cylindrical munition component principally composed of HMX, tungsten and aluminum encased in carbon fiber. Simulations and measurements used a 6 MeV peak energy x-ray spectrum filtered with 3.175 mm of tantalum. A detector response proportional to energy which includes object scatter agrees to within 0.6% of the measured line integral of the linear attenuation coefficient. Exclusion of scatter increases the difference between measurement and simulation to 5%. A detector response proportional to photon flux agrees to within 20% when object scatter is included in the simulation and 27% when object scatter is excluded.
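The quantity compared above, the line integral of the linear attenuation coefficient, is -ln(I/I0). The sketch below illustrates how the two ideal detector responses weight a polychromatic beam differently and therefore yield different effective line integrals; the spectrum and attenuation values are invented stand-ins, not the MCNP6 models.

```python
# Compare DR ~ photon flux (N) vs. DR ~ energy flux (E) for a toy polychromatic beam.
import numpy as np

E = np.array([1.0, 2.0, 4.0, 6.0])        # photon energies, MeV (assumed bins)
N0 = np.array([0.4, 0.3, 0.2, 0.1])       # relative incident fluence per bin
mu = np.array([0.10, 0.07, 0.05, 0.045])  # attenuation coefficients, 1/cm (toy)
thickness = 10.0                          # path length through object, cm

N = N0 * np.exp(-mu * thickness)          # transmitted fluence per bin

for weights, label in [(np.ones_like(E), "DR ~ photon flux (N)"),
                       (E, "DR ~ energy flux (E)")]:
    line_integral = -np.log((weights * N).sum() / (weights * N0).sum())
    print(f"{label}: integral of mu dl = {line_integral:.3f}")
```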
Using Delft3D to Simulate Current Energy Conversion
NASA Astrophysics Data System (ADS)
James, S. C.; Chartrand, C.; Roberts, J.
2015-12-01
As public concern with renewable energy increases, current energy conversion (CEC) technology is being developed to optimize energy output and minimize environmental impact. CEC turbines generate energy from tidal and current systems and create wakes that interact with turbines located downstream of a device. The placement of devices can greatly influence power generation and structural reliability. CECs can also alter the ecosystem processes surrounding the turbines, such as flow regimes, sediment dynamics, and water quality. Software is needed to investigate specific CEC sites to simulate power generation and the hydrodynamic response of flow through a CEC turbine array. This work validates Delft3D against several flume experiments by simulating the power generation and hydrodynamic response of flow through a turbine or actuator disc(s). Model parameters are then calibrated against these data sets to reproduce momentum removal and wake recovery data with 3-D flow simulations. Simulated wake profiles and turbulence intensities compare favorably to the experimental data and demonstrate the utility and accuracy of a fast-running tool for future siting and analysis of CEC arrays in complex domains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abel, David; Holloway, Tracey; Harkey, Monica
We evaluate how fine particulate matter (PM2.5) and precursor emissions could be reduced if 17% of electricity generation was replaced with solar photovoltaics (PV) in the Eastern United States. Electricity generation is simulated using GridView, then used to scale electricity-sector emissions of sulfur dioxide (SO2) and nitrogen oxides (NOX) from an existing gridded inventory of air emissions. This approach offers a novel method to leverage advanced electricity simulations with state-of-the-art emissions inventories, without necessitating recalculation of emissions for each facility. The baseline and perturbed emissions are input to the Community Multiscale Air Quality Model (CMAQ version 4.7.1) for a full accounting of time- and space-varying air quality changes associated with the 17% PV scenario. These results offer a high-value opportunity to evaluate the reduced-form AVoided Emissions and geneRation Tool (AVERT), while using AVERT to test the sensitivity of results to changing base-years and levels of solar integration. We find that average NOX and SO2 emissions across the region decrease 20% and 15%, respectively. PM2.5 concentrations decreased on average 4.7% across the Eastern U.S., with nitrate (NO3-) PM2.5 decreasing 3.7% and sulfate (SO42-) PM2.5 decreasing 9.1%. In the five largest cities in the region, we find that the most polluted days show the most significant PM2.5 decrease under the 17% PV generation scenario, and that the greatest benefits accrue to cities in or near the Ohio River Valley. We find summer health benefits from reduced PM2.5 exposure estimated as 1,424 avoided premature deaths (95% Confidence Interval (CI): 284 deaths, 2,732 deaths) or a health savings of $13.1 billion (95% CI: $0.6 billion, $43.9 billion). These results highlight the potential for renewable energy as a tool for air quality managers to support current and future health-based air quality regulations.
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 3 2010-01-01 2010-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria...
Landsat-7 Simulation and Testing Environments
NASA Technical Reports Server (NTRS)
Holmes, E.; Ha, K.; Hawkins, K.; Lombardo, J.; Ram, M.; Sabelhaus, P.; Scott, S.; Phillips, R.
1999-01-01
A spacecraft Attitude Control and Determination Subsystem (ACDS) is heavily dependent upon simulation throughout its entire development, implementation and ground test cycle. Engineering simulation tools are typically developed to design and analyze control systems and validate the design, and software simulation tools are required to qualify the flight software. However, the need for simulation does not end here. Operating the ACDS of a spacecraft on the ground requires the simulation of spacecraft dynamics, disturbance modeling and celestial body motion. Sensor data must also be simulated and substituted for actual sensor data on the ground so that the spacecraft will respond by sending commands to the actuators as it will on orbit. And finally, the simulator is the primary training tool and test-bed for the Flight Operations Team. In this paper, the various ACDS simulators developed for or used by the Landsat 7 project will be described. The paper will include a description of each tool, its unique attributes, and its role in the overall development and testing of the ACDS. Finally, a section is included which discusses how the coordinated use of these simulation tools can maximize the probability of uncovering software, hardware and operations errors during the ground test process.
Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)
2005-02-01
method, Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique... [remainder of record is keyword and table-of-contents residue: mixed-signal circuit simulation; parasitic extraction; time-domain simulation; IC design flow; model order reduction; Chapter 2, Fast Time Domain Mixed-Signal Circuit Simulation; HAARSPICE algorithms; overall program milestones]
PSAMM: A Portable System for the Analysis of Metabolic Models
Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying
2016-01-01
The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591
Brydges, Ryan; Hatala, Rose; Zendejas, Benjamin; Erwin, Patricia J; Cook, David A
2015-02-01
To examine the evidence supporting the use of simulation-based assessments as surrogates for patient-related outcomes assessed in the workplace. The authors systematically searched MEDLINE, EMBASE, Scopus, and key journals through February 26, 2013. They included original studies that assessed health professionals and trainees using simulation and then linked those scores with patient-related outcomes assessed in the workplace. Two reviewers independently extracted information on participants, tasks, validity evidence, study quality, patient-related and simulation-based outcomes, and magnitude of correlation. All correlations were pooled using random-effects meta-analysis. Of 11,628 potentially relevant articles, the 33 included studies enrolled 1,203 participants, including postgraduate physicians (n = 24 studies), practicing physicians (n = 8), medical students (n = 6), dentists (n = 2), and nurses (n = 1). The pooled correlation for provider behaviors was 0.51 (95% confidence interval [CI], 0.38 to 0.62; n = 27 studies); for time behaviors, 0.44 (95% CI, 0.15 to 0.66; n = 7); and for patient outcomes, 0.24 (95% CI, -0.02 to 0.47; n = 5). Most reported validity evidence was favorable, though studies often included only correlational evidence. Validity evidence of internal structure (n = 13 studies), content (n = 12), response process (n = 2), and consequences (n = 1) were reported less often. Three tools showed large pooled correlations and favorable (albeit incomplete) validity evidence. Simulation-based assessments often correlate positively with patient-related outcomes. Although these surrogates are imperfect, tools with established validity evidence may replace workplace-based assessments for evaluating select procedural skills.
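A minimal sketch of the pooling method described above, a random-effects meta-analysis of correlations via Fisher's z-transform and the DerSimonian-Laird estimator, follows; the per-study correlations and sample sizes below are invented, not the review's data.

```python
# Random-effects pooling of correlations (Fisher z + DerSimonian-Laird tau^2).
import numpy as np

r = np.array([0.55, 0.40, 0.62, 0.35, 0.50])   # per-study correlations (toy)
n = np.array([40, 25, 60, 30, 45])             # per-study sample sizes (toy)

z, v = np.arctanh(r), 1.0 / (n - 3)            # Fisher z and its sampling variance
w = 1.0 / v
z_fixed = (w * z).sum() / w.sum()
Q = (w * (z - z_fixed) ** 2).sum()             # heterogeneity statistic
tau2 = max(0.0, (Q - (len(r) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))

w_re = 1.0 / (v + tau2)                        # random-effects weights
z_pooled = (w_re * z).sum() / w_re.sum()
se = np.sqrt(1.0 / w_re.sum())
lo, hi = np.tanh(z_pooled - 1.96 * se), np.tanh(z_pooled + 1.96 * se)
print(f"pooled r = {np.tanh(z_pooled):.2f} (95% CI {lo:.2f} to {hi:.2f})")
```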
NASA Astrophysics Data System (ADS)
Gajdošová, Lenka; Seyringer, Dana
2017-02-01
We present the design and simulation of a 20-channel, 50-GHz Si3N4-based AWG using three different commercial photonics tools, namely PHASAR from Optiwave Systems Inc., APSS from Apollo Photonics Inc. and RSoft from Synopsys Inc. For this purpose we created identical waveguide structures and identical AWG layouts in these tools and performed BPM simulations. The same calculation conditions were used for all simulations. These AWGs were designed for TM-polarized light with an AWG central wavelength of 850 nm. The outputs of all simulations, the transmission characteristics, were used to calculate the transmission parameters defining the optical properties of the simulated AWGs. These parameters were summarized and compared with each other. The results show very good correlation between the tools and are comparable to the designed parameters in the AWG-Parameters tool.
Uncertainty in BMP evaluation and optimization for watershed management
NASA Astrophysics Data System (ADS)
Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.
2012-12-01
Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (Soil and Water Assessment Tool or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent the various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.
Impacts of Farmers' Knowledge Increase on Farm Profit and Watershed Water Quality
NASA Astrophysics Data System (ADS)
Ding, D.; Bennett, D. A.
2013-12-01
This study explores the impact that an increase in real-time data might have on farmers' nitrogen management, on-farm profit, and watershed water quality in the Midwestern US. In this study, an agent-based model (ABM) is used to simulate farmers' decisions about nitrogen application rate and timing in corn fields. SWAT (Soil and Water Assessment Tool) is used to generate a database that characterizes the response of corn yields to nitrogen fertilizer application and the dynamics of nitrogen loss under different scenarios of rainfall events. The database simulates a scenario in which farmers would receive real-time feedback about the fate and impact of nitrogen applied to their fields from in-situ sensors. The ability to transform these data into optimal actions is simulated at multiple levels for farmer agents. In a baseline scenario, the farmer agent is aware only of the yield potential of the field and of single values of N rates for achieving that potential, and is not aware of N loss from farm fields. Knowledge increase is represented by greater accuracy in predicting rainfall events, and by an increase in the number of discrete points in a field-specific quadratic curve, perceived by farmer agents, that captures crop yield response to various levels of nitrogen. In addition, agents perceive N loss from farm fields at increased temporal resolutions. Correspondingly, agents adjust the rate of N application for crops and the timing of fertilizer application given the rainfall event predictions. Farmers' decisions simulated by the ABM are input into SWAT to model nitrogen concentration in impacted streams. Farm profit statistics and watershed-level nitrogen loads are compared among different scenarios of knowledge increase. The hypothesis that the increase of farmers' knowledge benefits both farm profits and watershed water quality is tested through this comparison.
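A minimal sketch of the field-specific quadratic yield response curve mentioned above follows, together with the profit-maximizing N rate it implies; all coefficients and prices are invented placeholders, not values fit by the study.

```python
# Quadratic yield response and analytic profit-maximizing nitrogen rate (toy values).
a, b, c = 4000.0, 40.0, -0.09     # yield (kg/ha) = a + b*N + c*N^2 (assumed fit)
p_corn, p_n = 0.15, 0.9           # $/kg grain, $/kg N fertilizer (assumed prices)

def profit(n_rate):
    return p_corn * (a + b * n_rate + c * n_rate ** 2) - p_n * n_rate

# Setting d(profit)/dN = p_corn*(b + 2*c*N) - p_n = 0 gives the optimum:
n_opt = (p_n / p_corn - b) / (2 * c)
print(f"profit-maximizing N rate: {n_opt:.0f} kg/ha, profit: {profit(n_opt):.0f} $/ha")
```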
Fabian, Maria Patricia; Adamkiewicz, Gary; Stout, Natasha Kay; Sandel, Megan; Levy, Jonathan Ian
2014-01-01
Although indoor environmental conditions can affect pediatric asthmatic patients, few studies have characterized the effect of building interventions on asthma-related outcomes. Simulation models can evaluate such complex systems but have not been applied in this context. We sought to evaluate the impact of building interventions on indoor environmental quality and pediatric asthma health care use, and to conduct cost comparisons between intervention and health care costs and energy savings. We applied our previously developed discrete event simulation model (DEM) to simulate the effect of environmental factors, medication compliance, seasonality, and medical history on (1) pollutant concentrations indoors and (2) asthma outcomes in low-income multifamily housing. We estimated health care use and costs at baseline and subsequent to interventions, and then compared health care costs with energy savings and intervention costs. Interventions, such as integrated pest management and repairing kitchen exhaust fans, led to 7% to 12% reductions in serious asthma events with 1- to 3-year payback periods. Weatherization efforts targeted solely toward tightening a building envelope led to 20% more serious asthma events, but bundling with repairing kitchen exhaust fans and eliminating indoor sources (eg, gas stoves or smokers) mitigated this effect. Our pediatric asthma model provides a tool to prioritize individual and bundled building interventions based on their effects on health and costs, highlighting the tradeoffs between weatherization, indoor air quality, and health. Our work bridges the gap between clinical and environmental health sciences by increasing physicians' understanding of the effect that home environmental changes can have on their patients' asthma. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
Modeling riverine nitrate export from an East-Central Illinois watershed using SWAT.
Hu, X; McIsaac, G F; David, M B; Louwers, C A L
2007-01-01
Reliable water quality models are needed to forecast the water quality consequences of different agricultural nutrient management scenarios. In this study, the Soil and Water Assessment Tool (SWAT), version 2000, was applied to simulate streamflow, riverine nitrate (NO(3)) export, crop yield, and watershed nitrogen (N) budgets in the upper Embarras River (UER) watershed in east-central Illinois, which has extensive maize-soybean cultivation, large N fertilizer input, and extensive tile drainage. During the calibration (1994-2002) and validation (1985-1993) periods, SWAT simulated monthly and annual stream flows with Nash-Sutcliffe coefficients (E) ranging from 0.67 to 0.94 and R(2) from 0.75 to 0.95. For monthly and annual NO(3) loads, E ranged from -0.16 to 0.45 and R(2) from 0.36 to 0.74. Annual maize and soybean yields were simulated with relative errors ranging from -10 to 6%. The model was then used to predict the changes in NO(3) output with N fertilizer application rates 10 to 50% lower than original application rates in UER. The calibrated SWAT predicted a 10 to 43% decrease in NO(3) export from UER and a 6 to 38% reduction in maize yield in response to the reduction in N fertilizer. The SWAT model markedly overestimated NO(3) export during major wet periods. Moreover, SWAT estimated soybean N fixation rates considerably greater than literature values, and some simulated changes in the N cycle in response to fertilizer reduction seemed to be unrealistic. Improving these aspects of SWAT could lead to more reliable predictions in the water quality outcomes of nutrient management practices in tile-drained watersheds.
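A minimal sketch of the two goodness-of-fit measures reported above for the SWAT streamflow and nitrate simulations, the Nash-Sutcliffe coefficient (E) and R2, follows; the flow series are invented toy data.

```python
# Nash-Sutcliffe efficiency: E = 1 - SSE / variance of observations about their mean.
import numpy as np

obs = np.array([12.0, 30.0, 55.0, 41.0, 18.0, 9.0])   # observed monthly flows (toy)
sim = np.array([10.0, 33.0, 50.0, 44.0, 20.0, 11.0])  # simulated monthly flows (toy)

def nash_sutcliffe(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

r_squared = np.corrcoef(obs, sim)[0, 1] ** 2
print(f"E = {nash_sutcliffe(obs, sim):.2f}, R^2 = {r_squared:.2f}")
```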
NASA Astrophysics Data System (ADS)
Derx, Julia; Schijven, Jack; Sommer, Regina; Kirschner, Alexander; Farnleitner, Andreas H.; Blaschke, Alfred Paul
2016-04-01
QMRAcatch, a tool to simulate microbial water quality including infection risk assessment, was previously developed and successfully tested at a Danube river site (Schijven et al. 2015). In the tool, concentrations of target faecal microorganisms and viruses (TMVs) are computed at a point of interest (PI) along the main river and the floodplain river at daily intervals for a one-year period. Even though faecal microbial pathogen concentrations in water resources are usually below the sample limit of detection, this does not ensure that the water quality complies with a certain required health-based target. The aim of this study was therefore to improve the predictability of relevant human pathogenic viruses, i.e. enterovirus and norovirus, in the studied river/floodplain area. This was done by following an innovative calibration strategy based on human-associated microbial source tracking (MST) marker data which were determined following the HF183 TaqMan assay (Green et al. 2011). The MST marker is strongly associated with human faeces and communal sewage, occurring there in numbers several orders of magnitude higher than human enteric pathogens (Mayer et al. 2015). The calibrated tool was then evaluated with measured enterovirus concentrations at the PI and in the floodplain river. In the simulation tool, the discharges of 5 wastewater treatment plants (WWTPs) were considered as point sources along a 200 km reach of the Danube river. The MST marker and target virus concentrations at the PI on a certain day were computed based on the concentrations of the previous day, plus the wastewater concentrations times the WWTP discharge divided by the river discharge. A ratio of the river width was also considered, over which the MST marker and virus particles have fully mixed with river water. In the tool, the excrements from recreational visitors frequenting the floodplain area every day were assumed to be homogeneously distributed in the area. A binomially distributed probability that people practice open defecation in the floodplain area was considered, including a viral prevalence. The release rate and runoff coefficient, defined here as ratios of daily rainfall amounts, were assumed to be the same for the MST marker and target viruses, and the same everywhere in the floodplain area. They may differ for different years, however, because climatic and hydrologic conditions can change. The model parameter uncertainties were considered in the tool within a Monte-Carlo framework. Random numbers were drawn from preselected statistical probability distributions, e.g. of the faecal MST marker concentrations, for each year-day, iterated 10,000 times. The calibrated tool was shown to predict enterovirus concentrations in the Danube river and the floodplain river within the right order of magnitude, when comparing the mean, 95th percentiles and the shape parameters of the Gamma distributions of measured and simulated concentrations over a year. With the calibrated tool, the required target virus reductions from the river Danube and the floodplain river water to produce safe drinking water were estimated. Low and high contamination scenarios (i.e. 5 log10 to no wastewater treatment, small to large percentage of visitors that practice open defecation, low to high viral prevalence) were investigated to guide robust treatment design criteria for water supplies.
This paper was supported by FWF (Vienna Doctoral Program on Water Resource Systems W1219-N22) and the GWRS project (Vienna Water) as part of the "(New) Danube-Lower Lobau Network Project" funded by the Government of Austria and Vienna, and the European Agricultural Fund for Rural Development (LE 07-13). References Green, H. C., Haugland, R. A., Varma, M., Millen, H. T., Borchardt, M. A., Field, K. G., Walters, W. A., Knight, R., Sivaganesan, M., Kelty, C.A., Shanks, O.C. 2014. Improved HF183 quantitative real-time PCR assay for characterization of human fecal pollution in ambient surface water samples. Applied and Environmental Microbiology 80: 3086-3094. Mayer, R.E., Bofill-Mas, S., Egle, L., Reischer, G.H., Schade, M., Fernandez-Cassi, X, Fuchs, W., Mach, R. L., Lindner, G., Kirschner, A., Gaisbauer, M., Piringer, H., Blaschke, A.P., Girones, R., Zessner, M., Sommer, R., Farnleitner, A.H. 2015. Occurrence of human-associated Bacteroidetes genetic source tracking markers in raw and treated wastewater of municipal and domestic origin and comparison to standard and alternative indicators of faecal pollution, Water Research 90: 265-276. Schijven, J., Derx, J., de Roda Husman, A. M., Blaschke, A. P., Farnleitner, A. H. 2015. QMRAcatch: Microbial quality simulation of water resources including infection risk assessment. J.Environ.Qual. 44, 1491-1502
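A minimal sketch of the daily mass-balance update described in this abstract follows, within a Monte Carlo loop over parameter uncertainty. The discharge, mixing and distribution values are invented, and the first-order die-off term is an added assumption (the abstract does not specify the loss process) so that concentrations reach a steady state.

```python
# Daily update sketch: C_today = C_yesterday * exp(-decay)
#                                + C_effluent * Q_wwtp / (Q_river * mixing_ratio)
import numpy as np

rng = np.random.default_rng(7)
n_days, n_iter = 365, 10_000
q_river, q_wwtp = 2000.0, 1.0      # discharges, m^3/s (illustrative values)
mixing_ratio = 0.3                 # fraction of river width fully mixed (assumed)
decay = 0.1                        # assumed first-order die-off per day (not given above)

# Lognormally distributed effluent marker concentrations per day and iteration (assumed).
c_effluent = rng.lognormal(mean=13.0, sigma=1.0, size=(n_iter, n_days))

c = np.zeros((n_iter, n_days))
for day in range(1, n_days):
    dilution = q_wwtp / (q_river * mixing_ratio)
    c[:, day] = c[:, day - 1] * np.exp(-decay) + c_effluent[:, day] * dilution

print("median on day 365:", np.median(c[:, -1]))
print("95th percentile on day 365:", np.percentile(c[:, -1], 95))
```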
[Quality assurance of the renal applications software].
del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M
2007-01-01
The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of the applications software. Our work reported here aims at verifying the analysis software for processing of renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of the renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder and aortic artery) were simple geometric forms. The uptake of the renal structures was described by mathematical functions. Curves corresponding to normal or pathological conditions were simulated for kidneys, bladder and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematical curves and the quantitative data produced by the renal analysis program. Our test procedure is simple to apply, reliable, reproducible and rapid in verifying the renal applications software.
QoS measurement of workflow-based web service compositions using Colored Petri net.
Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra
2014-01-01
Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the category of WB-WSCs. With the maturing of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide a composed web service whose quality meets customers' requirements. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure, since it determines the quality level of a given web service composition. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPNs) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
ERIC Educational Resources Information Center
National Comprehensive Center for Teacher Quality, 2008
2008-01-01
Teaching Quality (TQ) Source Tips & Tools: Emerging Strategies to Enhance Educator Quality is an online resource developed by the TQ Center. It is designed to help education practitioners tap into strategies and resources they can use to enhance educator quality. This publication is based on the TQ Source Tips & Tools topic area "Enhancing…
VISdish: A new tool for canting and shape-measuring solar-dish facets.
Montecchi, Marco; Cara, Giuseppe; Benedetti, Arcangelo
2017-06-01
Solar dishes allow us to obtain highly concentrated solar fluxes used to produce electricity or feed thermal processes/storage. For practical reasons, the reflecting surface is composed of a number of facets. After dish assembly, facet-canting is an important task for improving the concentration of solar radiation around the focus-point, as well as the capture ratio at the receiver placed there. Finally, the flux profile should be measured or evaluated to verify the concentration quality. All these tasks can be achieved with the new tool we developed at ENEA, named VISdish. The instrument is based on the visual inspection system (VIS) approach and can work in two modes: canting and shape-measurement. The shape data are entered into a simulation software for evaluating the flux profile and concentration quality. With respect to prior methods, VISdish offers several advantages: (i) simpler data processing, because the light point-source and its reflections are univocally related, and (ii) higher accuracy. The instrument's functionality is illustrated through the preliminary experimental results obtained on the dish recently installed at ENEA-Casaccia in the framework of the E.U. project OMSoP.
A cost-efficiency and health benefit approach to improve urban air quality.
Miranda, A I; Ferreira, J; Silveira, C; Relvas, H; Duque, L; Roebeling, P; Lopes, M; Costa, S; Monteiro, A; Gama, C; Sá, E; Borrego, C; Teixeira, J P
2016-11-01
When ambient air quality standards established in the EU Directive 2008/50/EC are exceeded, Member States are obliged to develop and implement Air Quality Plans (AQP) to improve air quality and health. Notwithstanding the achievements in emission reductions and air quality improvement, additional efforts need to be undertaken to improve air quality in a sustainable way - i.e. through a cost-efficiency approach. This work was developed in the scope of the recently concluded MAPLIA project "Moving from Air Pollution to Local Integrated Assessment", and focuses on the definition and assessment of emission abatement measures and their associated costs, air quality and health impacts and benefits by means of air quality modelling tools, health impact functions and cost-efficiency analysis. The MAPLIA system was applied to the Grande Porto urban area (Portugal), addressing PM10 and NOx as the most important pollutants in the region. Four different measures to reduce PM10 and NOx emissions were defined and characterized in terms of emissions and implementation costs, and combined into 15 emission scenarios, simulated by the TAPM air quality modelling tool. Air pollutant concentration fields were then used to estimate health benefits in terms of avoided costs (external costs), using dose-response health impact functions. Results revealed that, among the 15 scenarios analysed, the scenario including all 4 measures led to a total net benefit of 0.3M€·y(-1). The largest net benefit is obtained for the scenario considering the conversion of 50% of open fire places into heat recovery wood stoves. Although the implementation costs of this measure are high, the benefits outweigh the costs. Research outcomes confirm that the MAPLIA system is useful for policy decision support on air quality improvement strategies, and could be applied to other urban areas where AQP need to be implemented and monitored. Copyright © 2016. Published by Elsevier B.V.
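A minimal sketch of the cost-efficiency bookkeeping described above follows: the net benefit of an abatement scenario is the avoided external (health) cost minus the implementation cost. All measure names and monetary values below are invented placeholders, not the MAPLIA results.

```python
# Net benefit = avoided health (external) costs - implementation costs, per scenario.
implementation_cost = {"wood_stove_conversion": 1.2, "low_emission_buses": 0.8}  # M EUR/yr (toy)
avoided_health_cost = {"wood_stove_conversion": 1.6, "low_emission_buses": 0.7}  # M EUR/yr (toy)

for measure in implementation_cost:
    net = avoided_health_cost[measure] - implementation_cost[measure]
    verdict = "benefits outweigh costs" if net > 0 else "costs outweigh benefits"
    print(f"{measure}: net benefit {net:+.1f} M EUR/yr ({verdict})")
```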
NASA Astrophysics Data System (ADS)
Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.
2015-05-01
A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe groundwater quantity, and in particular quality, have come under severe degradation and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economy of the eastern European bloc of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability, calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water Quality Analysis Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
Rohling, Heide; Sihver, Lembit; Priegnitz, Marlen; Enghardt, Wolfgang; Fiedler, Fine
2014-01-01
Therapeutic irradiation with protons and ions is advantageous over radiotherapy with photons due to its favorable dose deposition. Additionally, ion beams provide a higher relative biological effectiveness than photons. For this reason, an improved treatment of deep-seated tumors is achieved and normal tissue is spared. However, small deviations from the treatment plan can have a large impact on the dose distribution. Therefore, monitoring is required to assure the quality of the treatment. Particle therapy positron emission tomography (PT-PET) is the only clinically proven method which provides a non-invasive monitoring of dose delivery. It makes use of the β+-activity produced by nuclear fragmentation during irradiation. In order to evaluate these PT-PET measurements, simulations of the β+-activity are necessary. Therefore, it is essential to know the yields of the β+-emitting nuclides at every position of the beam path as exactly as possible. We evaluated the three-dimensional Monte-Carlo simulation tool PHITS (version 2.30) [1] and the 1D deterministic simulation tool HIBRAC [2] with respect to the production of β+-emitting nuclides. The yields of the most important β+-emitting nuclides for carbon, lithium, helium and proton beams have been calculated. The results were then compared with experimental data obtained at GSI Helmholtzzentrum für Schwerionenforschung Darmstadt, Germany. GEANT4 simulations provide an additional benchmark [3]. For PHITS, the impact of different nuclear reaction models, total cross-section models and evaporation models on the β+-emitter production has been studied. In general, PHITS underestimates the yields of positron-emitters and cannot compete with GEANT4 so far. The β+-emitters calculated with an extended HIBRAC code were in good agreement with the experimental data for carbon and proton beams and comparable to the GEANT4 results; see [4] and Fig. 1. Considering the simulation results and its speed compared with three-dimensional Monte-Carlo tools, HIBRAC is a good candidate for implementation in clinical routine PT-PET. Fig. 1. Depth-dependent yields of the production of 11C and 15O during proton irradiation of a PMMA target with 140 MeV [4].
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravishankar, Udhay; Manic, Milos
2013-08-01
This paper presents a micro-grid simulator tool useful for implementing and testing multi-agent controllers (SGridSim). As a common engineering practice it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. This paper presents a SGridSim micro-grid simulator tool that simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of components that typically make up a micro-grid. The term EN2NCI models means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are: 1) simulation of a micro-grid is performed strictly in the complex domain; 2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
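A minimal sketch of the complex-domain solve implied by the EN2NCI idea follows: each component is a single complex impedance tied between voltage nodes, and steady-state node voltages follow from nodal analysis. The tiny three-node network and all values below are illustrative, not a SGridSim model.

```python
# Complex-domain nodal analysis: build the bus admittance matrix, fix the slack
# node voltage, and solve Y_red * V = I for the remaining node voltages.
import numpy as np

# Branches as (node_a, node_b, impedance); node 0 is the slack/reference bus.
branches = [(0, 1, 2 + 1j), (1, 2, 1 + 0.5j), (0, 2, 4 + 2j)]
v_slack = 230.0 + 0j                  # fixed source voltage at node 0 (assumed)
i_load = {1: 10 - 2j, 2: 5 - 1j}      # complex current draws at load nodes (toy)

n = 3
Y = np.zeros((n, n), dtype=complex)   # bus admittance matrix
for a, b, z in branches:
    y = 1.0 / z
    Y[a, a] += y; Y[b, b] += y
    Y[a, b] -= y; Y[b, a] -= y

# Eliminate the slack node: solve Y_LL * V_L = I_L - Y_L0 * v_slack for nodes 1..2.
idx = [1, 2]
rhs = np.array([-i_load[k] for k in idx]) - Y[np.ix_(idx, [0])].flatten() * v_slack
V = np.linalg.solve(Y[np.ix_(idx, idx)], rhs)
print("node voltages:", np.round(V, 2))
```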
Simulation of computed tomography dose based on voxel phantom
NASA Astrophysics Data System (ADS)
Liu, Chunyu; Lv, Xiangbo; Li, Zhaojun
2017-01-01
Computed Tomography (CT) is one of the preferred and most valuable imaging tools used in diagnostic radiology, providing high-quality cross-sectional images of the body. It still delivers higher radiation doses to patients compared to other radiological procedures. The Monte-Carlo method is appropriate for estimating the radiation dose during CT examinations. In this paper, a simulation of the Computed Tomography Dose Index (CTDI) phantom was developed. Under conditions similar to those used in physical measurements, dose profiles were calculated and compared against reported measured values. The results demonstrate good agreement between the calculated and measured doses. In simulations of different CT exams using the voxel phantom, the highest absorbed doses were recorded for the lung, the brain, and the bone surface. A comparison between the different scan types shows that the effective dose for a chest scan is the highest, whereas the effective dose values for abdomen and pelvis scans are very close. The lowest effective dose resulted from the head scan. Although the dose in CT depends on various parameters, such as the tube current, exposure time, beam energy, slice thickness and patient size, this study demonstrates that MC simulation is a useful tool to accurately estimate the dose delivered to specific organs of patients undergoing CT exams, and can also be a valuable technique for the design and optimization of the CT x-ray source.
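The effective doses compared above are conventionally obtained as the tissue-weighted sum of organ equivalent doses, E = Σ_T w_T H_T. A minimal sketch of that bookkeeping step, using ICRP 103 weighting factors and invented organ doses (not values from this study):

```python
# Effective dose E = sum over tissues of w_T * H_T (ICRP formalism).
# Weights below are a subset of ICRP Publication 103 factors; a complete
# calculation includes all tissues so that the weights sum to 1.
w = {"lung": 0.12, "stomach": 0.12, "colon": 0.12, "red_marrow": 0.12,
     "breast": 0.12, "thyroid": 0.04, "liver": 0.04, "bladder": 0.04,
     "bone_surface": 0.01, "brain": 0.01, "skin": 0.01}

H_mSv = {"lung": 18.0, "stomach": 12.0, "colon": 2.0, "red_marrow": 9.0,
         "breast": 16.0, "thyroid": 10.0, "liver": 11.0, "bladder": 0.5,
         "bone_surface": 14.0, "brain": 1.0, "skin": 7.0}   # invented values

E = sum(w[t] * H_mSv[t] for t in w)
print(f"Partial effective dose: {E:.2f} mSv")
```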
Report Central: Quality Reporting Tool in an Electronic Health Record
Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S.; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H.; Middleton, Blackford; Einbinder, Jonathan S.
2006-01-01
Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow. PMID:17238590
NASA Astrophysics Data System (ADS)
Smetana, Lara Kathleen; Bell, Randy L.
2012-06-01
Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.
ADS-33C related handling qualities research performed using the NRC Bell 205 airborne simulator
NASA Technical Reports Server (NTRS)
Morgan, J. Murray; Baillie, Stewart W.
1993-01-01
Over 10 years ago a project was initiated by the U.S. Army AVSCOM to update the military helicopter flying qualities specification MIL-H-8501A. While not yet complete, the project reached a major milestone in 1989 with the publication of an Airworthiness Design Standard, ADS-33C. The 8501 update project initially set out to identify critical gaps in the requisite data base and then proceeded to fill them using a variety of directed research studies. The magnitude of the task required that it become an international effort: appropriate research studies were conducted in Germany, the UK and Canada as well as in the USA. Canadian participation was supported by the Department of National Defence (DND) through the Chief of Research and Development. Both ground-based and in-flight simulation were used to study the defined areas, and the Canadian Bell 205-A1 variable stability helicopter was used extensively as one of the primary research tools available for this effort. This paper reviews the involvement of the Flight Research Laboratory of the National Research Council of Canada in the update project; it describes the various experiments conducted on the Airborne Simulator, notes significant results obtained, and describes ongoing research associated with the project.
Uchida, Masafumi
2014-04-01
A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation imaging. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging technique and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.
Combining Simulation Tools for End-to-End Trajectory Optimization
NASA Technical Reports Server (NTRS)
Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacob; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min
2015-01-01
Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable for assessing individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS ascent tool (POST2), and a new tool was developed to optimize the full problem by operating both simulations simultaneously.
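One generic way to realize such a coupling is to let a single outer optimizer drive both simulations, with the ascent tool's end state handed to the in-space tool each iteration. The sketch below is purely illustrative: ascent_sim and inspace_sim are hypothetical stand-ins, not the POST2 or Copernicus interfaces.

```python
import numpy as np
from scipy.optimize import minimize

def ascent_sim(pitch):
    """Hypothetical ascent stand-in: returns injection state and propellant used."""
    r = 6578e3 + 1e5 * pitch[0]                  # injection radius (m)
    v = 7784.0 + 50.0 * pitch[1]                 # injection speed (m/s)
    prop = 1.0e4 * (pitch[0]**2 + pitch[1]**2)   # ascent propellant (kg)
    return np.array([r, v]), prop

def inspace_sim(state, dv):
    """Hypothetical in-space stand-in: cost of finishing the mission from 'state'."""
    r, v = state
    return abs(dv) + 0.2 * abs(v - 7800.0) + 1e-4 * abs(r - 6678e3)

def total_cost(x):
    # The hand-off is explicit: ascent output feeds the in-space leg,
    # so the optimizer sees one continuous end-to-end problem.
    state, ascent_prop = ascent_sim(x[:2])
    return ascent_prop + inspace_sim(state, x[2])

result = minimize(total_cost, x0=[0.5, 0.1, 100.0], method="Nelder-Mead")
print(result.x, result.fun)
```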
Towards Large Eddy Simulation of gas turbine compressors
NASA Astrophysics Data System (ADS)
McMullan, W. A.; Page, G. J.
2012-07-01
With increasing computing power, Large Eddy Simulation could be a useful simulation tool for gas turbine axial compressor design. This paper outlines a series of simulations performed on compressor geometries, ranging from a Controlled Diffusion Cascade stator blade to the periodic sector of a stage in a 3.5 stage axial compressor. The simulation results show that LES may offer advantages over traditional RANS methods when off-design conditions are considered - flow regimes where RANS models often fail to converge. The time-dependent nature of LES permits the resolution of transient flow structures, and can elucidate new mechanisms of vorticity generation on blade surfaces. It is shown that accurate LES is heavily reliant on both the near-wall mesh fidelity and the ability of the imposed inflow condition to recreate the conditions found in the reference experiment. For components embedded in a compressor this requires the generation of turbulence fluctuations at the inlet plane. A recycling method is developed that improves the quality of the flow in a single stage calculation of an axial compressor, and indicates that future developments in both the recycling technique and computing power will bring simulations of axial compressors within reach of industry in the coming years.
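Recycling inflow methods of the kind mentioned typically extract fluctuations from a plane downstream of the inlet, rescale them, and superpose them on the target mean profile every time step. A schematic numpy sketch of that idea; the simple intensity-based rescaling is an assumption, not the paper's formulation:

```python
import numpy as np

def recycled_inflow(u_recycle, u_target_mean, ti_target=0.05):
    """Build an inflow plane by recycling downstream fluctuations.

    u_recycle:      instantaneous streamwise velocity on the recycle plane (ny, nz)
    u_target_mean:  desired mean inflow profile (ny, nz)
    ti_target:      assumed target turbulence intensity at the inlet
    """
    fluct = u_recycle - u_recycle.mean()          # extract the fluctuating part
    rms = max(np.sqrt((fluct**2).mean()), 1e-12)  # current fluctuation level
    scale = ti_target * np.abs(u_target_mean).mean() / rms
    return u_target_mean + scale * fluct          # rescaled superposition

# Toy usage: a 32x32 plane with synthetic "turbulence" around 100 m/s.
rng = np.random.default_rng(0)
u_plane = 100.0 + 5.0 * rng.standard_normal((32, 32))
u_inlet = recycled_inflow(u_plane, np.full((32, 32), 100.0))
```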
Ion Move Brownian Dynamics (IMBD)--simulations of ion transport.
Kurczynska, Monika; Kotulska, Malgorzata
2014-01-01
Comparison of computed characteristics and physiological measurements of ion transport through transmembrane proteins could be a useful method to assess the quality of protein structures. Simulations of ion transport should be detailed but also time-efficient. The most accurate method would be Molecular Dynamics (MD), but it is very time-consuming and hence not used for this purpose. Brownian Dynamics (BD) is a model which includes ion-ion interactions and reduces the simulation time by excluding water, protein and lipid molecules. In this paper a new computer program for BD simulation of ion transport is presented. We evaluate two methods for calculating pore accessibility (round and irregular shape) and two representations of ion size (van der Waals diameter and one voxel). Ion Move Brownian Dynamics (IMBD) was tested with two nanopores: alpha-hemolysin and the potassium channel KcsA. In both cases an ion passed through the pore in less than 32 ns of simulation. Although two types of ions were in solution (potassium and chloride), only ions consistent with the selectivity properties of the channels passed through the pores. IMBD is a new tool for ion transport modelling, which can be used in simulations of both wide and narrow pores.
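The BD propagator underlying tools of this kind is the overdamped Langevin update x(t+Δt) = x(t) + (D/k_BT)F Δt + √(2DΔt) ξ, with ξ a standard Gaussian vector. A minimal single-ion sketch (the force term is a placeholder; IMBD's actual pore geometry, electrostatics and ion-ion terms are not reproduced here):

```python
import numpy as np

kT = 4.11e-21     # thermal energy at ~298 K (J)
D = 1.96e-9       # bulk diffusion coefficient of K+ in water (m^2/s)
dt = 1e-12        # time step (s)
rng = np.random.default_rng(1)

def force(x):
    # Placeholder attractive force toward the origin (N); a real BD code
    # would evaluate electrostatics, ion-ion interactions and pore walls.
    return -1e-12 * x / (np.linalg.norm(x) + 1e-12)

x = np.array([0.0, 0.0, 1e-9])                 # initial ion position (m)
for _ in range(10_000):
    drift = (D / kT) * force(x) * dt           # deterministic drift
    kick = np.sqrt(2 * D * dt) * rng.standard_normal(3)   # random displacement
    x += drift + kick
print(x)
```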
Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa
2013-09-17
Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.
Bayona, Sofía; Fernández-Arroyo, José Manuel; Martín, Isaac; Bayona, Pilar
2008-09-01
The aims of this study were to test the face, content, and construct validities of a virtual-reality haptic arthroscopy simulator and to validate four assessment hypotheses. The participants in our study were 94 arthroscopists attending an international conference on arthroscopy. The interviewed surgeons had been performing arthroscopies for a mean of 8.71 years (σ = 6.94 years). We explained the operation, functionality, instructions for use, and the exercises provided by the simulator. They performed a trial exercise and then an exercise in which performance was recorded. After using the simulator, the arthroscopists answered a questionnaire. The simulator was classified as one of the best training methods (over phantoms), and obtained a mark of 7.10 out of 10 as an evaluation tool. The simulator was considered more useful for inexperienced surgeons than for surgeons with experience (mean difference 1.88 out of 10, P value < 0.001). The participants valued the simulator at 8.24 as a tool for learning skills, its fidelity at 7.41, the quality of the platform at 7.54, and the content of the exercises at 7.09. It obtained a global score of 7.82. Of the subjects, 30.8% said they would practise with the simulator more than 6 h per week. Of the surgeons, 89.4% affirmed that they would recommend the simulator to their colleagues. The data gathered support the first three hypotheses, as well as face and content validities. Results show statistically significant differences between experts and novices, thus supporting the construct validity, but studies with a larger sample must be carried out to verify this. We propose concrete solutions and an equation to calculate economy of movement. Analogously, we analyze competence measurements and propose an equation to provide a single measurement that contains them all and that, according to the surgeons' criteria, is as reliable as the judgment of experts observing the performance of an apprentice.
A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James
2011-11-01
Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
The value of SPaCE in delivering patient feedback.
Clapham, Laura; Allan, Laura; Stirling, Kevin
2016-02-01
The use of simulated patients (SPs) within undergraduate medical curricula is an established and valued learning opportunity. Within the context of simulation, it is imperative to capture feedback from all participants in the simulation activity. The Simulated Patient Candidate Evaluation (SPaCE) tool was developed to deliver SP feedback following a simulation activity. SPaCE is a closed feedback tool that allows SPs to rate a student's performance, using a five-point Likert scale, in three domains: attitude; interaction skills; and management. This research study examined the value of the SPaCE tool and how it contributes to the overall feedback that a student receives. Classical test theory was used to determine the reliability of the SPaCE tool. An evaluation of all SP responses was conducted to observe trends in scoring patterns for each question. Qualitative data were collected via a free-text questionnaire and subsequent focus group discussion. Classical test theory determined that the SPaCE tool had a reliability coefficient of 0.89. A total of 13 SPs replied to the questionnaire. A thematic analysis of all questionnaire data identified that the SPaCE tool provides a structure that allows patient feedback to be given effectively following a simulation activity. These themes were discussed further with six SPs who attended the subsequent focus group session. The SPaCE tool has been shown to be a reliable closed feedback tool that allows SPs to discriminate between students based on their performance. The next stage in the development of the SPaCE tool is to test its wider applicability. © 2015 John Wiley & Sons Ltd.
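A classical-test-theory reliability coefficient of this kind is commonly Cronbach's alpha, α = k/(k−1)·(1 − Σs_i²/s_total²). A minimal sketch on invented Likert ratings (the abstract does not specify which coefficient was used, so treat this as one plausible reading):

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: rows = respondents, columns = Likert items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var_sum = scores.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Toy data: six SPs rating one student on four five-point items.
ratings = [[4, 5, 4, 4], [3, 4, 3, 4], [5, 5, 4, 5],
           [2, 3, 2, 3], [4, 4, 4, 4], [3, 3, 3, 2]]
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```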
New Tooling System for Forming Aluminum Beverage Can End Shell
NASA Astrophysics Data System (ADS)
Yamazaki, Koetsu; Otsuka, Takayasu; Han, Jing; Hasegawa, Takashi; Shirasawa, Taketo
2011-08-01
This paper proposes a new tooling system for forming shells of aluminum beverage can ends. First, the forming process of a conventional tooling system was simulated using three-dimensional finite element models. The simulation results were confirmed to be consistent with those of axisymmetric models, so simulations for further study were performed using axisymmetric models to save computational time. A comparison shows that thinning of the shell formed by the proposed tooling system is improved by about 3.6%. The influences of the tool uppermost surface profiles and initial tool positions in the new tooling system were investigated, and a design optimization method based on the numerical simulations was then applied to search for optimum design points, in order to minimize thinning subject to constraints on the geometrical dimensions of the shell. Finally, the performance of the shell subjected to internal pressure was confirmed to meet design requirements.
Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan
2014-12-15
LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
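The Python interface makes the selection language directly scriptable. A short sketch in the style of the published LOOS examples; the exact call names (createSystem, selectAtoms, pyloos.Trajectory) should be checked against the LOOS documentation for the installed version:

```python
import loos                    # LOOS Python bindings (assumed installed)
import loos.pyloos as pyloos

model = loos.createSystem("model.psf")        # topology file (hypothetical name)
traj = pyloos.Trajectory("sim.dcd", model)    # trajectory bound to the model

# C-expression-style dynamic atom selection: alpha-carbons of chain A.
calphas = loos.selectAtoms(model, 'name == "CA" && chainid == "A"')

for frame in traj:             # iteration updates model coordinates in place
    print(calphas.centroid())  # per-frame centroid of the selection
```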
Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming
2009-01-01
Smart Homes offer potential solutions for various forms of independent living for the elderly. The assistive and protective environment afforded by smart homes offers a safe, relatively inexpensive, dependable and viable alternative for vulnerable inhabitants. Nevertheless, the success of a smart home rests upon the quality of information its decision support system receives, and this in turn places great importance on correct sensor deployment. In this article we present a software tool that has been developed to address the elusive issue of sensor distribution within smart homes. Details of the tool are presented, and it is shown how the tool can emulate any real-world environment, so that virtual sensor distributions can be rapidly implemented and assessed without the requirement for physical deployment. As such, this approach offers the potential of tailoring sensor distributions to the specific needs of a patient in a non-invasive manner. The heuristics-based tool presented here has been developed as the first part of a three-stage project.
Magnetic resonance imaging of rodent spinal cord with an improved performance coil at 7 Tesla
NASA Astrophysics Data System (ADS)
Solis-Najera, S. E.; Rodriguez, A. O.
2014-11-01
Magnetic Resonance Imaging of animal models provides a reliable means to study human diseases. Image acquisition quality is largely determined by the radio frequency coil used to detect the signal emanating from the region of interest. A scaled-down version of the slotted surface coil was built based on previous results with a magnetron-type surface coil for human applications. Our coil prototype had a 2 cm total diameter and six circular slots and was developed for murine spinal cord imaging at 7 T. Electromagnetic simulations of the slotted and circular coils were also performed to compute the spatially dependent magnetic and electric fields using a simulated saline-solution sphere. The quality factor of both coils was experimentally measured, giving the slotted coil a lower noise figure and a higher quality factor than the circular coil. Images of the spinal cord of a rat were acquired using standard pulse sequences. The slotted surface coil can be a good tool for rat spinal cord imaging using conventional pulse sequences at 7 T.
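The quality factor quoted for such coils is conventionally read off the resonance curve; in standard RF-coil practice (not a formula stated in the abstract),

\[
Q = \frac{f_0}{\Delta f_{-3\,\mathrm{dB}}}, \qquad \mathrm{SNR} \propto \sqrt{Q},
\]

where \(f_0\) is the resonance frequency, \(\Delta f_{-3\,\mathrm{dB}}\) is the half-power bandwidth, and the SNR scaling assumes coil-dominated losses.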
NASA Astrophysics Data System (ADS)
Imbrogno, Stano; Rinaldi, Sergio; Raso, Antonio; Bordin, Alberto; Bruschi, Stefania; Umbrello, Domenico
2018-05-01
Additive Manufacturing techniques are gaining more and more interest in various industrial fields due to the possibility of drastically reducing material waste during production, revolutionizing the standard schemes and strategies of manufacturing processes. However, the as-built metal parts frequently do not satisfy the tolerance and surface quality requirements. During the design phase, finite element simulation is a fundamental tool to help engineers choose the most suitable process parameters, especially in manufacturing processes, in order to produce high-quality products. The aim of this work is to develop a 3D finite element model of a semi-finishing turning operation on Ti6Al4V produced via Direct Metal Laser Sintering (DMLS). A customized user sub-routine was built to model the mechanical behavior of the material under machining and to predict the fundamental process variables such as cutting forces and temperature. Moreover, the machining-induced alterations are also studied with the developed finite element model.
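Machining sub-routines for Ti6Al4V are very often built around a Johnson-Cook-type flow stress law; the abstract does not name the constitutive model, so the following widely used form is only a plausible assumption:

\[
\sigma = \left(A + B\,\varepsilon^{\,n}\right)\left(1 + C\,\ln\frac{\dot{\varepsilon}}{\dot{\varepsilon}_0}\right)\left[1-\left(\frac{T - T_r}{T_m - T_r}\right)^{m}\right],
\]

where \(A, B, C, n, m\) are material constants, \(\varepsilon\) is the equivalent plastic strain, \(\dot{\varepsilon}/\dot{\varepsilon}_0\) is the normalized strain rate, and \(T_r, T_m\) are the reference and melting temperatures.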
Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design
NASA Technical Reports Server (NTRS)
Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.
1991-01-01
Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six-degrees-of-freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group's activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with HyperCard on the Macintosh to serve as a knowledge capture tool and data storage.
Interdisciplinary collaboration to maintain a culture of safety in a labor and delivery setting.
Burke, Carol; Grobman, William; Miller, Deborah
2013-01-01
A culture of safety is a growing movement in obstetrical healthcare quality and management. Patient-centered and safe care is a primary priority for all healthcare workers, with communication and teamwork central to achieving optimal maternal health outcomes. A mandatory educational program was developed and implemented by physicians and nurses to sustain awareness of and compliance with current protocols within a large university-based hospital. A didactic portion reviewing shoulder dystocia, operative vaginal delivery, obstetric hemorrhage, and fetal monitoring escalation was combined with a simulation session. The simulation was a fetal bradycardia prompting the decision to perform an operative vaginal delivery complicated by a shoulder dystocia. More than 370 members of the healthcare team participated, including obstetricians, midwives, the anesthesia team, and nurses. Success of the program was measured with an evaluation tool and by comparing results from a prior safety questionnaire. Ninety-seven percent rated the program as excellent, and the response to a question on the perceived overall grade on patient safety, measured by the Agency for Healthcare Research and Quality safety survey, demonstrated a significant improvement in the score (P = .003) following the program.
NASA Technical Reports Server (NTRS)
Koontz, S. L.; Kuminecz, J.; Leger, L.; Nordine, P.
1988-01-01
The use of thermal atom test methods as a materials selection and screening technique for low-Earth orbit (LEO) spacecraft is critically evaluated. The chemistry and physics of thermal atom environments are compared with the LEO environment. The relative reactivities of a number of materials determined in thermal atom environments are compared to those observed in LEO and in high-quality LEO simulations. Reaction efficiencies measured in a new type of thermal atom apparatus are one-hundredth to one-thousandth of those observed in LEO, and many materials showing nearly identical reactivities in LEO show relative reactivities differing by as much as a factor of 8 in thermal atom systems. A simple phenomenological kinetic model for the reaction of oxygen atoms with organic materials can be used to explain the differences in reactivity between environments. Certain specific thermal test environments can be used as reliable materials screening tools. Using thermal atom methods to predict material lifetime in LEO requires direct calibration of the method against LEO data or high-quality simulation data for each material.
Yu, Zeyun; Holst, Michael J.; Hayashi, Takeharu; Bajaj, Chandrajit L.; Ellisman, Mark H.; McCammon, J. Andrew; Hoshijima, Masahiko
2009-01-01
A general framework of image-based geometric processing is presented to bridge the gap between three-dimensional (3D) imaging that provides structural details of a biological system and mathematical simulation where high-quality surface or volumetric meshes are required. A 3D density map is processed in the order of image pre-processing (contrast enhancement and anisotropic filtering), feature extraction (boundary segmentation and skeletonization), and high-quality and realistic surface (triangular) and volumetric (tetrahedral) mesh generation. While the tool-chain described is applicable to general types of 3D imaging data, the performance is demonstrated specifically on membrane-bound organelles in ventricular myocytes that are imaged and reconstructed with electron microscopic (EM) tomography and two-photon microscopy (T-PM). Of particular interest in this study are two types of membrane-bound Ca2+-handling organelles, namely, transverse tubules (T-tubules) and junctional sarcoplasmic reticulum (jSR), both of which play an important role in regulating the excitation-contraction (E-C) coupling through dynamic Ca2+ mobilization in cardiomyocytes. PMID:18835449
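The surface-meshing stage of such a tool-chain can be prototyped with off-the-shelf components: for example, marching cubes extracts a triangular surface mesh from a 3D density map. The sketch below uses scikit-image on a synthetic map and is a generic illustration, not the authors' pipeline:

```python
import numpy as np
from skimage import measure   # scikit-image

# Synthetic 3D density map: a fuzzy sphere standing in for a segmented organelle.
g = np.linspace(-1.0, 1.0, 64)
x, y, z = np.meshgrid(g, g, g, indexing="ij")
density = np.exp(-(x**2 + y**2 + z**2) / 0.18)

# Marching cubes at a chosen isovalue yields vertices and triangular faces.
# A real pipeline would first denoise, segment, and skeletonize the map.
verts, faces, normals, values = measure.marching_cubes(density, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangles")
```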
NASA Technical Reports Server (NTRS)
Schott, John; Gerace, Aaron; Brown, Scott; Gartley, Michael; Montanaro, Matthew; Reuter, Dennis C.
2012-01-01
The next Landsat satellite, which is scheduled for launch in early 2013, will carry two instruments: the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). Significant design changes over previous Landsat instruments have been made to these sensors to potentially enhance the quality of Landsat image data. TIRS, which is the focus of this study, is a dual-band instrument that uses a push-broom style architecture to collect data. To help understand the impact of design trades during instrument build, an effort was initiated to model TIRS imagery. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) tool was used to produce synthetic "on-orbit" TIRS data with detailed radiometric, geometric, and digital image characteristics. This work presents several studies that used DIRSIG simulated TIRS data to test the impact of engineering performance data on image quality in an effort to determine if the image data meet specifications or, in the event that they do not, to determine if the resulting image data are still acceptable.
[Existing laparoscopic simulators and their benefit for the surgeon].
Kalvach, J; Ryska, O; Ryska, M
2016-01-01
Nowadays, laparoscopic operations are a common part of surgical practice. However, they have their own characteristics and require a specific method of preparation. Recently, simulation techniques have been increasingly used for the training of skills. The aim of this review is to provide a summary of the available literature on laparoscopic simulators, to assess their contribution to the training of surgeons, and to identify the most effective type of simulation. The PubMed database, Web of Science and the Cochrane Library were searched for relevant publications using the keywords "laparoscopy, simulator, surgery, assessment". The search was limited to prospective studies published in the last 5 years in the English language. From a total of 354 studies found, we included 26 that matched our criteria. Nine studies compared individual simulators to one another. Five studies rated "high fidelity" (virtual) and "low fidelity" (box) simulators as equally effective (EBM 2a). In three cases the "low fidelity" box simulator was found to be more efficient (EBM 2a-3b). Only one study preferred the virtual simulator (VR) (EBM 2b). Thirteen studies evaluated the benefits of simulators for practice. Twelve found training on a simulator to be an effective method of preparation (EBM 1b-3b). In contrast, one study did not find any difference between simulator training and traditional preparation (EBM 3b). Nine studies directly evaluated one of the methods for assessing laparoscopic skills. Three studies rated the VR simulator as a useful assessment tool. Other studies rated the GOALS-GH scoring system as successful. The hand motion analysis model was successful in one case. Most studies were observational (EBM 3b) and only 2 studies were of higher quality (EBM 2b). Simulators are an effective tool for practicing laparoscopic techniques (EBM 1b). It cannot be determined from the available data which of the simulators is most effective. The virtual simulator, however, remains the most self-sufficient unit suitable for teaching as well as evaluation of laparoscopic techniques (EBM 2b-3b). Further studies are needed to find an effective system and parameters for objective evaluation of skills. Key words: laparoscopy, simulator, surgery, assessment.
Temporal evolution modeling of hydraulic and water quality performance of permeable pavements
NASA Astrophysics Data System (ADS)
Huang, Jian; He, Jianxun; Valeo, Caterina; Chu, Angus
2016-02-01
A mathematical model for predicting hydraulic and water quality performance in both the short and long term is proposed, based on field measurements, for three types of permeable pavements: porous asphalt (PA), porous concrete (PC), and permeable inter-locking concrete pavers (PICP). The model was applied to three field-scale test sites in Calgary, Alberta, Canada. Model performance was assessed in terms of hydraulic parameters, including time to peak, peak flow and water balance, and a water quality variable (the removal rate of total suspended solids, TSS). A total of 20 simulated storm events were used for the model calibration and verification processes. The proposed model can simulate the outflow hydrographs with a coefficient of determination (R2) ranging from 0.762 to 0.907 and a normalized root-mean-square deviation (NRMSD) ranging from 13.78% to 17.83%. Comparison of the time to peak flow, peak flow, runoff volume and TSS removal rates between the measured and modeled values in the model verification phase showed a maximum difference of 11%. The results demonstrate that the proposed model is capable of capturing the temporal dynamics of pavement performance. The model therefore has great potential as a practical modeling tool for permeable pavement design and performance assessment.
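Both reported goodness-of-fit measures are simple to reproduce. In the sketch below, R² is taken as the squared Pearson correlation and NRMSD is normalized by the observed range; since the abstract gives neither formula, both definitions are assumptions:

```python
import numpy as np

def r_squared(obs, sim):
    """Squared Pearson correlation between observed and simulated series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)

def nrmsd(obs, sim):
    """Root-mean-square deviation normalized by the observed range (%)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    rmsd = np.sqrt(np.mean((obs - sim) ** 2))
    return float(100.0 * rmsd / (obs.max() - obs.min()))

obs = [0.0, 1.2, 3.5, 2.1, 0.8]   # toy outflow hydrograph (L/s)
sim = [0.1, 1.0, 3.1, 2.4, 0.6]
print(r_squared(obs, sim), nrmsd(obs, sim))
```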
Machine Learning and Deep Learning Models to Predict Runoff Water Quantity and Quality
NASA Astrophysics Data System (ADS)
Bradford, S. A.; Liang, J.; Li, W.; Murata, T.; Simunek, J.
2017-12-01
Contaminants can be rapidly transported at the soil surface by runoff to surface water bodies. Physically-based models, which rest on mathematical descriptions of the main hydrological processes, are key tools for predicting surface water impairment. Alongside physically-based models, data-driven models are becoming increasingly popular for describing the behavior of hydrological and water resources systems, since they can complement or even replace physically-based models. In this presentation we propose a new data-driven model as an alternative to a physically-based overland flow and transport model. First, we developed a physically-based numerical model to simulate overland flow and contaminant transport (the HYDRUS-1D overland flow module). A large number of numerical simulations were carried out to develop a database containing information about the impact of various input parameters (weather patterns, surface topography, vegetation, soil conditions, contaminants, and best management practices) on runoff water quantity and quality outputs. This database was used to train data-driven models. Three different methods (Neural Networks, Support Vector Machines, and Recurrent Neural Networks) were explored to prepare input-output functional relations. Results demonstrate the abilities and limitations of machine learning and deep learning models in predicting runoff water quantity and quality.
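Training such a surrogate amounts to regressing simulator outputs on simulator inputs. A minimal scikit-learn sketch with a synthetic stand-in for the HYDRUS-generated database (the feature set and toy response are invented for illustration):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Stand-in database: inputs = (rain intensity, slope, Manning's n, initial
# soil moisture); output = a toy runoff volume. Not HYDRUS-1D output.
X = rng.uniform([5.0, 0.01, 0.02, 0.10], [100.0, 0.20, 0.40, 0.45], (2000, 4))
y = 0.8 * X[:, 0] * X[:, 1] / X[:, 2] * (1 - X[:, 3]) + rng.normal(0, 1, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                         random_state=0)   # input scaling omitted for brevity
surrogate.fit(X_tr, y_tr)
print(f"held-out R^2: {surrogate.score(X_te, y_te):.3f}")
```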
NASA Astrophysics Data System (ADS)
Lopez-Nicolas, Antonio; Pulido-Velazquez, Manuel
2014-05-01
The main challenge of the Blueprint to Safeguard Europe's Water Resources (EC, 2012) is to guarantee that enough good-quality water is available for people's needs, the economy and the environment. In this sense, economic policy instruments such as water pricing policies and water markets can be applied to enhance efficient use of water. This paper presents a method based on hydro-economic tools to assess the effect of economic instruments on water resource systems. Hydro-economic models allow integrated analysis of water supply, demand and infrastructure operation at the river basin scale, by simultaneously combining engineering, hydrologic and economic aspects of water resources management. The method makes use of the simulation and optimization hydro-economic tools SIMGAMS and OPTIGAMS. The simulation tool SIMGAMS allocates water resources among users according to priorities and operating rules, and evaluates the economic scarcity costs of the system using economic demand functions. The model's objective function is designed so that the system aims to meet the operational targets (ranked according to priorities) in each month while following the system operating rules. The optimization tool OPTIGAMS allocates water resources based on an economic efficiency criterion: maximizing net benefits or, alternatively, minimizing the total water scarcity and operating costs of water use. SIMGAMS allows simulating incentive water pricing policies based on marginal resource opportunity costs (MROC; Pulido-Velazquez et al., 2013). Storage-dependent step pricing functions are derived from the time series of MROC values at a given reservoir in the system. These water pricing policies are defined based on water availability in the system (scarcity pricing), so that when water storage is high the MROC is low, while low storage (drought periods) is associated with high MROC and therefore high prices. We also illustrate the use of OPTIGAMS to simulate the effect of ideal water markets by economic optimization, without considering the potential effect of transaction costs. These methods and tools have been applied to the Jucar River basin (Spain). The results show the potential of economic instruments in setting incentives for more efficient management of water resources systems. Acknowledgments: The study has been partially supported by the European Community 7th Framework Project (GENESIS project, n. 226536), SAWARES (Plan Nacional I+D+i 2008-2011, CGL2009-13238-C02-01 and C02-02), SCARCE (Consolider-Ingenio 2010 CSD2009-00065) of the Spanish Ministry of Economy and Competitiveness, and EC 7th Framework Project ENHANCE (n. 308438). Reference: Pulido-Velazquez, M., Alvarez-Mendiola, E., and Andreu, J., 2013. Design of Efficient Water Pricing Policies Integrating Basinwide Resource Opportunity Costs. J. Water Resour. Plann. Manage., 139(5): 583-592.
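The storage-dependent step pricing functions described can be reduced to a lookup from reservoir state to an MROC-derived price. The thresholds and prices in this sketch are invented placeholders, not Jucar basin values:

```python
# Storage-dependent step pricing: low storage (drought) -> high MROC -> high price.
PRICE_STEPS = [      # (storage fraction at or above which the price applies, EUR/m3)
    (0.60, 0.02),    # comfortable storage: near-zero scarcity value
    (0.35, 0.08),
    (0.15, 0.20),
    (0.00, 0.45),    # deep drought: highest marginal opportunity cost
]

def scarcity_price(storage, capacity):
    """Return the step price for the current reservoir state."""
    fraction = storage / capacity
    for threshold, price in PRICE_STEPS:
        if fraction >= threshold:
            return price
    return PRICE_STEPS[-1][1]

print(scarcity_price(120.0, 1000.0))   # storage at 12% of capacity -> 0.45 EUR/m3
```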
NASA Astrophysics Data System (ADS)
Biermann, D.; Kahleyss, F.; Krebs, E.; Upmeier, T.
2011-07-01
Micro-sized applications are gaining more and more relevance for NiTi-based shape memory alloys (SMA). Different types of micro-machining offer unique possibilities for the manufacturing of NiTi components. The advantage of machining is the low thermal influence on the workpiece. This is important because the phase transformation temperatures of NiTi SMAs can otherwise be changed, and the components may then need extensive post-processing. The article offers a simulation-based approach to optimize five-axis micro-milling processes with respect to the special material properties of NiTi SMA. In particular, the influence of the various tool inclination angles is considered, and an intelligent tool-inclination optimization algorithm is introduced. Furthermore, aspects of micro deep-hole drilling of SMAs are discussed. Tools with diameters as small as 0.5 mm are used, with possible length-to-diameter ratios of up to 50. This process offers new possibilities in the manufacturing of microstents. The study concentrates on the influence of the cutting speed, the feed and the tool design on tool wear and the quality of the drilled holes.
NASA Astrophysics Data System (ADS)
McKane, R. B.; Stieglitz, M.; Pan, F.; Kwiatkowski, B. L.; Rastetter, E. B.
2006-12-01
Federal and state agencies responsible for protecting water quality rely mainly on statistically-based methods to assess and manage risks to the nation's streams, lakes and estuaries. Although statistical approaches provide valuable information on current trends in water quality, process-based simulation models are essential for understanding and forecasting how changes in human activities across complex landscapes impact the transport of nutrients and contaminants to surface waters. To address this need, we developed a broadly applicable, process-based watershed simulator that links a spatially-explicit hydrologic model and a terrestrial biogeochemistry model (MEL). See Stieglitz et al. and Pan et al., this meeting, for details on the design and verification of this simulator. Here we apply the watershed simulator to a generalized agricultural setting to demonstrate its potential for informing policy and management decisions concerning water quality. This demonstration specifically explores the effectiveness of riparian buffers for reducing the transport of nitrogenous fertilizers from agricultural fields to streams. The interaction of hydrologic and biogeochemical processes represented in our simulator allows several important questions to be addressed. (1) For a range of upland fertilization rates, to what extent do riparian buffers reduce nitrogen inputs to streams? (2) How does buffer effectiveness change over time as the plant-soil system approaches N-saturation? (3) How can buffers be managed to increase their effectiveness, e.g., through periodic harvest and replanting? The model results illustrate that, while the answers to these questions depend to some extent on site factors (climatic regime, soil properties and vegetation type), in all cases riparian buffers have a limited capacity to reduce nitrogen inputs to streams where fertilization rates approach those typically used for intensive agriculture (e.g., 200 kg N per ha per year for corn in the U.S. Midwest). We also discuss how the insights gained from our approach cannot be achieved with modeling tools that are not both spatially explicit and process-based.
NASA Astrophysics Data System (ADS)
Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.
2016-02-01
Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model, TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and by considering the interference of human activities. A parameter analysis tool, which includes sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model's performance, the Shaying River catchment, the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. Model performance was evaluated on the key water-related components, including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that the proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched the observations well; the average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations improved when dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured, with an average correlation coefficient of 0.67. Furthermore, the diffuse-source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve simulation performance as more model functionalities are added, and to provide a scientific basis for implementation in integrated river basin management.
Soil-test N recommendations augmented with PEST-optimized RZWQM simulations
Malone, R.W.; Jaynes, D.B.; Ma, Liwang; Nolan, B.T.; Meek, D.W.; Karlen, D.L.
2010-01-01
Improved understanding of year-to-year late-spring soil nitrate test (LSNT) variability could help make it more attractive to producers. We test the ability of the Root Zone Water Quality Model (RZWQM) to simulate watershed-scale variability due to the LSNT, and we use the optimized model to simulate long-term field N dynamics under related conditions. Autoregressive techniques and the automatic parameter calibration program PEST were used to show that RZWQM simulates significantly lower nitrate concentration in discharge from LSNT treatments compared with areas receiving fall N fertilizer applications within the tile-drained Walnut Creek, Iowa, watershed (>5 mg N L-1 difference for the third year of the treatment, 1999). This result is similar to field-measured data from a paired watershed experiment. A statistical model we developed using RZWQM simulations from 1970 to 2005 shows that early-season precipitation and early-season temperature account for 90% of the interannual variation in LSNT-based fertilizer N rates. Long-term simulations with similar average N application rates for corn (Zea mays L.) (151 kg N ha-1) show annual average N losses in tile flow of 20.4, 22.2, and 27.3 kg N ha-1 for LSNT, single spring, and single fall N applications, respectively. These results suggest that (i) RZWQM is a promising tool to accurately estimate the water quality effects of LSNT; (ii) the majority of the N loss difference between LSNT and fall applications arises because more N remains in the root zone for crop uptake; and (iii) year-to-year LSNT-based N rate differences are mainly due to variation in early-season precipitation and temperature. Copyright © 2010 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
Status of the AIAA Modeling and Simulation Format Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2008-01-01
The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration
ERIC Educational Resources Information Center
Han, Kyung T.
2012-01-01
Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…
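The core loop such simulation software exercises is: estimate ability, administer the most informative remaining item, simulate a response, re-estimate. A compact 3PL sketch of that loop (generic CAT mechanics, not SimulCAT's algorithms):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
a = rng.uniform(0.8, 2.0, n)      # item discriminations
b = rng.normal(0.0, 1.0, n)       # item difficulties
c = np.full(n, 0.2)               # pseudo-guessing parameters (3PL)

def p_correct(theta, i):
    return c[i] + (1 - c[i]) / (1 + np.exp(-a[i] * (theta - b[i])))

def info(theta, i):
    # Fisher information of a 3PL item at ability theta
    p = p_correct(theta, i)
    return a[i]**2 * ((p - c[i]) / (1 - c[i]))**2 * (1 - p) / p

def ml_theta(history):
    # Grid-search maximum-likelihood ability estimate from responses so far.
    grid = np.linspace(-4, 4, 161)
    ll = np.zeros_like(grid)
    for item, resp in history:
        p = np.array([p_correct(t, item) for t in grid])
        ll += np.log(p) if resp else np.log(1 - p)
    return grid[np.argmax(ll)]

true_theta, theta_hat, history = 0.7, 0.0, []
for _ in range(20):                                       # 20-item fixed-length CAT
    avail = set(range(n)) - {i for i, _ in history}
    item = max(avail, key=lambda i: info(theta_hat, i))   # max-information rule
    resp = bool(rng.random() < p_correct(true_theta, item))
    history.append((item, resp))
    theta_hat = ml_theta(history)

print(f"true theta = {true_theta}, final estimate = {theta_hat:.2f}")
```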
NASA Astrophysics Data System (ADS)
Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi
2015-06-01
Nowadays researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century. The evolution of simulation tools has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously cost-prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed" and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGA. In this article, the steps involved in taking a model from MATLAB to real-time execution are described in detail.
ERIC Educational Resources Information Center
Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.
2011-01-01
An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…
NASA Astrophysics Data System (ADS)
Martinez Baquero, G. F.; Furnans, J.; Hudson, C.; Magan, C.
2012-12-01
Management decisions on rivers and associated habitats require sound tools to identify the major drivers of spatial and temporal variations in temperature and related water quality variables. 3D hydrodynamic and water quality models are key components for abstracting flow dynamics in complex river systems, as they allow extrapolating available observations to ungaged locations and alternative scenarios. The data collection and model development are intended to support the Mid-Columbia Fisheries Enhancement Group, in conjunction with the Benton Conservation District, in efforts to understand how seasonal flow patterns in the Yakima and Columbia rivers interact with the Yakima delta geometry to cause the relatively high water temperatures previously observed west of Bateman Island. These high temperatures are suspected of limiting salmonid success in the area, possibly contributing to adjustments in migration patterns and increased predation. The Environmental Fluid Dynamics Code (EFDC) and Water Quality Analysis Simulation Program (WASP) are used to model flow patterns and enable simulations of temperature distributions and water quality parameters at the confluence. Model development is supported by a bathymetric campaign in 2011 to evaluate delta geometry and construct the EFDC domain, a sonar river survey in 2012 to measure velocity profiles and enable model calibration, and continuous collection of temperature and dissolved oxygen records from Level Scout probes at key locations during the past year to drive water quality simulations. The current model is able to reproduce the main flow features observed at the confluence and is being prepared to integrate previous and current temperature observations. The final model is expected to evaluate scenarios for the removal or alteration of the Bateman Island causeway. Alterations to the causeway that permit water passage to the south of Bateman Island are likely to dramatically alter the water flow patterns through the Yakima and Columbia River confluence, which in turn will alter water temperature distributions, sediment transport pathways, and salmonid migration routes.
[The Italian instrument evaluating the nursing students' clinical learning quality].
Palese, Alvisa; Grassetti, Luca; Mansutti, Irene; Destrebecq, Anne; Terzoni, Stefano; Altini, Pietro; Bevilacqua, Anita; Brugnolli, Anna; Benaglio, Carla; Dal Ponte, Adriana; De Biasio, Laura; Dimonte, Valerio; Gambacorti, Benedetta; Fasci, Adriana; Grosso, Silvia; Mantovan, Franco; Marognolli, Oliva; Montalti, Sandra; Nicotera, Raffaela; Randon, Giulia; Stampfl, Brigitte; Tollini, Morena; Canzan, Federica; Saiani, Luisa; Zannini, Lucia
2017-01-01
The Clinical Learning Quality Evaluation Index for nursing students. In Italian nursing programs there is a need to introduce tools evaluating the quality of clinical learning as perceived by nursing students. Several tools already exist; however, their limitations suggested the need to develop a new tool. A national project aimed to develop and validate a new instrument capable of measuring the clinical learning quality as experienced by nursing students. A validation study design was undertaken from 2015 to 2016. All national nursing programs (n=43) were invited to participate by including all nursing students regularly attending their clinical learning. The tool, developed on the basis of a) the literature, b) validated tools already established among other healthcare professionals, and c) consensus expressed by experts and nursing students, was administered to the eligible students. A total of 9606 nursing students in 27 universities (62.8%) participated. The psychometric properties of the new instrument ranged from good to excellent. According to the findings, the tool consists of 22 items and five factors: a) quality of the tutorial strategies; b) learning opportunities; c) safety and nursing care quality; d) self-directed learning; and e) quality of the learning environment. The tool is already in use. Its systematic adoption may support comparisons among settings and across different programs; moreover, the tool may also support accrediting new settings as well as measuring the effects of strategies aimed at improving the quality of clinical learning.
Ambrosia airborne pollen concentration modelling and evaluation over Europe
NASA Astrophysics Data System (ADS)
Hamaoui-Laguel, Lynda; Vautard, Robert; Viovy, Nicolas; Khvorostyanov, Dmitry; Colette, Augustin
2014-05-01
Native to North America, Ambrosia artemisiifolia L. (common ragweed) is an invasive annual weed introduced in Europe in the mid-nineteenth century. It has a very high spreading potential throughout Europe and releases highly allergenic pollen, leading to health problems for sensitive persons. Because of these health effects, modelling tools are needed to forecast airborne ambrosia pollen concentrations and to inform allergic populations of allergenic threshold exceedances. This study was realised within the framework of the ATOPICA project (https://www.atopica.eu/), which is designed to provide first steps toward tools and estimates of the fate of allergies in Europe due to changes in climate, land use and air quality. To calculate and predict airborne concentrations of ambrosia pollen, a chain of models has been built. Models have been developed or adapted for simulating the phenology (PMP phenological modelling platform), inter-annual production (ORCHIDEE vegetation model), and release and airborne processes (CHIMERE chemical transport model) of ragweed pollen. Airborne pollens follow processes similar to air quality pollutants in CHIMERE, with some adaptations. The detailed methodology, formulations and input data will be presented. A set of simulations has been performed to simulate airborne pollen concentrations over long time periods on a large European domain. Hindcast simulations (2000-2012) driven by ERA-Interim re-analyses are designed to best simulate airborne pollens over past periods. The modelled pollen concentrations are calibrated with observations and validated against additional observations. Then, 20-year historical simulations (1986-2005) are carried out using the calibrated ambrosia density distribution and climate-model-driven weather, in order to serve as a control simulation for future scenarios. Comparison with multi-annual observed daily pollen counts shows that the model captures the gross features of the pollen concentrations found in Europe. The spatial distribution is well captured, with a correlation of 0.7, but the daily variability of pollen counts remains to be improved, with correlations varying between 0.1 and 0.75. The model chain captures reasonably well the inter-annual variability of yearly mean pollen concentrations; correlations, even if not statistically significant due to the short time series, are positive for about 80% of sites. The main uncertainty in ambrosia pollen modelling is linked to the uncertainty in the plant density distribution. Preliminary results on the impact of environmental changes on future pollen concentrations will also be shown.
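The abstract notes that modelled pollen concentrations are calibrated with observations before validation. As a rough, hypothetical illustration of such a step (all values invented), the sketch below applies a single multiplicative calibration factor and then checks the spatial correlation:

```python
import numpy as np

def calibrate(modelled, observed):
    """One multiplicative calibration factor (a simplification of the
    station-based calibration described above): scale the modelled
    concentrations so their mean matches the observed mean."""
    modelled = np.asarray(modelled, float)
    k = np.mean(observed) / np.mean(modelled)
    return k, k * modelled

# Hypothetical yearly-mean ragweed pollen counts at five sites (grains/m3)
observed = np.array([120.0, 45.0, 300.0, 80.0, 10.0])
modelled = np.array([100.0, 60.0, 250.0, 95.0, 20.0])

k, calibrated = calibrate(modelled, observed)
r = np.corrcoef(observed, calibrated)[0, 1]   # spatial correlation check
print(f"calibration factor k={k:.2f}, spatial r={r:.2f}")
```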
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
SolarPILOT | Concentrating Solar Power | NREL
Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside the SolTrace ray-tracing core for more detailed simulations.
NASA Astrophysics Data System (ADS)
Arnoux, Marie; Barbecot, Florent; Gibert-Brunet, Elisabeth; Gibson, John; Noret, Aurélie
2017-11-01
Lakes are under increasing pressure due to widespread anthropogenic impacts related to rapid development and population growth. Accordingly, many lakes are currently undergoing a systematic decline in water quality. Recent studies have highlighted that global warming and the subsequent changes in water use may further exacerbate eutrophication in lakes. Lake evolution depends strongly on hydrologic balance, and therefore on groundwater connectivity. Groundwater also influences the sensitivity of lacustrine ecosystems to climate and environmental changes, and governs their resilience. Improved characterization of groundwater exchange with lakes is needed today for lake preservation, lake restoration, and sustainable management of lake water quality into the future. In this context, the aim of the present paper is to determine if the future evolution of the climate, the population, and the recharge could modify the geochemistry of lakes (mainly isotopic signature and quality via phosphorous load) and if the isotopic monitoring of lakes could be an efficient tool to highlight the variability of the water budget and quality. Small groundwater-connected lakes were chosen to simulate changes in water balance and water quality expected under future climate change scenarios, namely representative concentration pathways (RCPs) 4.5 and 8.5. Contemporary baseline conditions, including isotope mass balance and geochemical characteristics, were determined through an intensive field-based research program prior to the simulations. Results highlight that future lake geochemistry and isotopic composition trends will depend on four main parameters: location (and therefore climate conditions), lake catchment size (which impacts the intensity of the flux change), lake volume (which impacts the range of variation), and lake G index (i.e., the percentage of groundwater that makes up total lake inflows), the latter being the dominant control on water balance conditions, as revealed by the sensitivity of lake isotopic composition. Based on these model simulations, stable isotopes appear to be especially useful for detecting changes in recharge to lakes with a G index of between 50 and 80 %, but response is non-linear. Simulated monthly trends reveal that evolution of annual lake isotopic composition can be dampened by opposing monthly recharge fluctuations. It is also shown that changes in water quality in groundwater-connected lakes depend significantly on lake location and on the intensity of recharge change.
Pivel, María Alejandra Gómez; Dal Sasso Freitas, Carla Maria
2010-08-01
Numerical models that predict the fate of drilling discharges at sea constitute a valuable tool for both the oil industry and regulatory agencies. In order to provide reliable estimates, models must be validated through the comparison of predictions with field or laboratory observations. In this paper, we used the Offshore Operators Committee Model to simulate the discharges from two wells drilled at Campos Basin, offshore SE Brazil, and compared the results with field observations obtained 3 months after drilling. The comparison showed that the model provided reasonable predictions, considering that data about currents were reconstructed and theoretical data were used to characterize the classes of solids. The model proved to be a valuable tool to determine the degree of potential impact associated with drilling activities. However, since the accuracy of the model is directly dependent on the quality of input data, different possible scenarios should be considered when it is used for forecast modeling.
Creation and Delphi-method refinement of pediatric disaster triage simulations.
Cicero, Mark X; Brown, Linda; Overly, Frank; Yarzebski, Jorge; Meckler, Garth; Fuchs, Susan; Tomassoni, Anthony; Aghababian, Richard; Chung, Sarita; Garrett, Andrew; Fagbuyi, Daniel; Adelgais, Kathleen; Goldman, Ran; Parker, James; Auerbach, Marc; Riera, Antonio; Cone, David; Baum, Carl R
2014-01-01
There is a need for rigorously designed pediatric disaster triage (PDT) training simulations for paramedics. First, we sought to design three multiple patient incidents for EMS provider training simulations. Our second objective was to determine the appropriate interventions and triage level for each victim in each of the simulations and develop evaluation instruments for each simulation. The final objective was to ensure that each simulation and evaluation tool was free of bias toward any specific PDT strategy. We created mixed-methods disaster simulation scenarios with pediatric victims: a school shooting, a school bus crash, and a multiple-victim house fire. Standardized patients, high-fidelity manikins, and low-fidelity manikins were used to portray the victims. Each simulation had similar acuity of injuries and 10 victims. Examples include children with special health-care needs, gunshot wounds, and smoke inhalation. Checklist-based evaluation tools and behaviorally anchored global assessments of function were created for each simulation. Eight physicians and paramedics from areas with differing PDT strategies were recruited as Subject Matter Experts (SMEs) for a modified Delphi iterative critique of the simulations and evaluation tools. The modified Delphi was managed with an online survey tool. The SMEs provided an expected triage category for each patient. The target for modified Delphi consensus was ≥85%. Using Likert scales and free text, the SMEs assessed the validity of the simulations, including instances of bias toward a specific PDT strategy, clarity of learning objectives, and the correlation of the evaluation tools to the learning objectives and scenarios. After two rounds of the modified Delphi, consensus for expected triage level was >85% for 28 of 30 victims, with the remaining two achieving >85% consensus after three Delphi iterations. To achieve consensus, we amended 11 instances of bias toward a specific PDT strategy and corrected 10 instances of noncorrelation between evaluations and simulation. The modified Delphi process, used to derive novel PDT simulation and evaluation tools, yielded a high degree of consensus among the SMEs, and eliminated biases toward specific PDT strategies in the evaluations. The simulations and evaluation tools may now be tested for reliability and validity as part of a prehospital PDT curriculum.
Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2011-01-01
This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control), which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
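As a rough sketch of what the OPC link between a designed controller and the plant might look like, the snippet below uses the open-source Python opcua package (OPC UA, a modern stand-in for the classic OLE-for-process-control interface named in the abstract); the endpoint, node identifiers, and control rule are all hypothetical:

```python
from opcua import Client  # pip install opcua (FreeOpcUa)

# Hypothetical endpoint of the plant's OPC server
client = Client("opc.tcp://127.0.0.1:4840")
client.connect()
try:
    # Hypothetical node ids for a dissolved-oxygen reading and an
    # aeration set-point in a nutrient-removing WWTP
    do_node = client.get_node("ns=2;s=Reactor1.DO")
    sp_node = client.get_node("ns=2;s=Reactor1.AirflowSetpoint")

    do_value = do_node.get_value()           # read process variable
    new_sp = 1.5 if do_value < 2.0 else 1.0  # trivial on/off-style rule
    sp_node.set_value(new_sp)                # write controller output
finally:
    client.disconnect()
```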
River and Reservoir Operations Model, Truckee River basin, California and Nevada, 1998
Berris, Steven N.; Hess, Glen W.; Bohman, Larry R.
2001-01-01
The demand for all uses of water in the Truckee River Basin, California and Nevada, commonly is greater than can be supplied. Storage reservoirs in the system have a maximum effective total capacity equivalent to less than two years of average river flows, so longer-term droughts can result in substantial water-supply shortages for irrigation and municipal users and may stress fish and wildlife ecosystems. Title II of Public Law (P.L.) 101-618, the Truckee-Carson-Pyramid Lake Water Rights Settlement Act of 1990, provides a foundation for negotiating and developing operating criteria, known as the Truckee River Operating Agreement (TROA), to balance interstate and interbasin allocation of water rights among the many interests competing for water from the Truckee River. In addition to TROA, the Truckee River Water Quality Settlement Agreement (WQSA), signed in 1996, provides for acquisition of water rights to resolve water-quality problems during low flows along the Truckee River in Nevada. Efficient execution of many of the planning, management, or environmental assessment requirements of TROA and WQSA will require detailed water-resources data coupled with sound analytical tools. Analytical modeling tools constructed and evaluated with such data could help assess effects of alternative operational scenarios related to reservoir and river operations, water-rights transfers, and changes in irrigation practices. The Truckee-Carson Program of the U.S. Geological Survey, to support U.S. Department of the Interior implementation of P.L. 101-618, is developing a modeling system to support efficient water-resources planning, management, and allocation. The daily operations model documented herein is a part of the modeling system that includes a database management program, a graphical user interface program, and a program with modules that simulate river/reservoir operations and a variety of hydrologic processes. The operations module is capable of simulating lake/reservoir and river operations including diversion of Truckee River water to the Truckee Canal for transport to the Carson River Basin. In addition to the operations and streamflow-routing modules, the modeling system is structured to allow integration of other modules, such as water-quality and precipitation-runoff modules. The USGS Truckee River Basin operations model was designed to provide simulations that allow comparison of the effects of alternative management practices or allocations on streamflow or reservoir storages in the Truckee River Basin over long periods of time. Because the model was not intended to reproduce historical streamflow or reservoir storage values, a traditional calibration that includes statistical comparisons of observed and simulated values would be problematic with this model and database. This report describes a chronology and background of decrees, agreements, and laws that affect Truckee River operational practices; the construction of the Truckee River daily operations model; the simulation of Truckee River Basin operations, both current and proposed under the draft TROA and WQSA; and suggested model improvements and limitations. The daily operations model uses Hydrological Simulation Program-FORTRAN (HSPF) to simulate flow-routing and reservoir and river operations. The operations model simulates reservoir and river operations that govern streamflow in the Truckee River from Lake Tahoe to Pyramid Lake, including diversions through the Truckee Canal to Lahontan Reservoir in the Carson River Basin.
A general overview is provided of daily operations and their simulation. Supplemental information that documents the extremely complex operating rules simulated by the model is available.
Enhanced Electric Power Transmission by Hybrid Compensation Technique
NASA Astrophysics Data System (ADS)
Palanichamy, C.; Kiu, G. Q.
2015-04-01
In today's competitive environment, new power system engineers are expected to contribute immediately, without years of seasoning via on-the-job training, mentoring, and rotation assignments. At the same time, it is becoming obligatory to prepare power system engineering graduates for an increasingly quality-minded corporate environment. To achieve this, better-quality tools are needed for educating and training power system engineering students and in-service engineers alike. As a result of the swift advances in computer hardware and software, many Windows-based software packages have been developed for education and training. In line with those packages, a simulation package called Hybrid Series-Shunt Compensators (HSSC) has been developed and is presented in this paper for educational purposes.
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. To account for these differences, modifications to the standard design tools are required to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper presents the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of currently available tools is assessed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools are also presented.
Joint space-time geostatistical model for air quality surveillance
NASA Astrophysics Data System (ADS)
Russo, A.; Soares, A.; Pereira, M. J.
2009-04-01
Air pollution and peoples' generalized concern about air quality are nowadays considered a global problem. Although the introduction of rigid air pollution regulations has reduced pollution from industry and power stations, the growing number of cars on the road poses a new pollution problem. Considering the characteristics of atmospheric circulation and the residence times of certain pollutants in the atmosphere, a generalized and growing interest in air quality issues has led to intensified research and the publication of several articles of quite different levels of scientific depth. Like most natural phenomena, air quality can be seen as a space-time process in which the space and time relationships usually have quite different characteristics and levels of uncertainty. As a result, the simultaneous integration of space and time is not an easy task to perform, and a variety of methodologies have been proposed to address it. The use of stochastic models and neural networks to characterize the space-time dispersion of air quality is becoming common practice. The main objective of this work is to produce an air quality model that allows forecasting critical concentration episodes of a certain pollutant by means of a hybrid approach based on the combined use of neural network models and stochastic simulations. A stochastic simulation of the spatial component with a space-time trend model is proposed to characterize critical situations, taking into account data from the past and a space-time trend from the recent past. To identify near-future critical episodes, predicted values from neural networks are used at each monitoring station. In this paper, we describe the design of a hybrid forecasting tool for ambient NO2 concentrations in Lisbon, Portugal.
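A minimal sketch of the hybrid idea follows, assuming a simple AR(1) stand-in for the neural-network point forecast and an exponential covariance model for the stochastic spatial residuals; station coordinates, concentrations, and thresholds are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_forecast(series, phi=0.8):
    """Stand-in for the neural-network point forecast at one station:
    a simple AR(1) persistence model (hypothetical coefficient)."""
    return phi * series[-1]

def simulate_residual_field(coords, sill=25.0, corr_range=5.0, n=100):
    """Stochastic simulation of spatially correlated residuals using an
    exponential covariance model and Cholesky factorization."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    cov = sill * np.exp(-d / corr_range)
    L = np.linalg.cholesky(cov + 1e-9 * np.eye(len(coords)))
    return L @ rng.standard_normal((len(coords), n))

coords = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 1.0]])        # station x,y (km)
history = {0: [40.0, 42.0], 1: [55.0, 53.0], 2: [60.0, 66.0]}  # NO2 (ug/m3)

point = np.array([ar1_forecast(v) for v in history.values()])
ensemble = point[:, None] + simulate_residual_field(coords)
p_exceed = (ensemble > 50.0).mean(axis=1)  # probability of exceeding a threshold
print(dict(zip(history, np.round(p_exceed, 2))))
```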
NASA Astrophysics Data System (ADS)
Ranatunga, T.; Tong, S.; Yang, J.
2011-12-01
Hydrologic and water quality models can provide a general framework to conceptualize and investigate the relationships between climate and water resources. Under a hot and dry climate, highly urbanized watersheds are especially vulnerable to changes in climate, such as excess heat and drought. In this study, a comprehensive watershed model, Hydrological Simulation Program-FORTRAN (HSPF), is used to assess the impacts of future climate change on stream discharge and water quality in Las Vegas Wash in Nevada, the only surface water body that drains from the Las Vegas Valley (an area of rapid population growth and urbanization) to Lake Mead. In this presentation, the process of model building, calibration and validation, the generation of climate change scenarios, and the assessment of future climate change effects on stream hydrology and quality are demonstrated. The hydrologic and water quality model is developed based on data from current national databases and the existing major land use categories of the watershed. The model is calibrated for stream discharge, nutrients (nitrogen and phosphorus) and sediment yield. The climate change scenarios are derived from the outputs of Global Climate Model (GCM) and Regional Climate Model (RCM) simulations, and from the recent assessment reports of the Intergovernmental Panel on Climate Change (IPCC). The Climate Assessment Tool from US EPA's BASINS is used to assess the effects of likely future climate scenarios on water quantity and quality in Las Vegas Wash. The presentation also discusses the consequences of these hydrologic changes, including shortfalls in clean water supply during peak seasons of water demand, increased eutrophication potential, wetland deterioration, and impacts on wildlife habitats.
NASA Astrophysics Data System (ADS)
Nair, U. S.; Keiser, K.; Wu, Y.; Maskey, M.; Berendes, D.; Glass, P.; Dhakal, A.; Christopher, S. A.
2012-12-01
The Alabama Forestry Commission (AFC) is responsible for wildfire control as well as prescribed burn management in the state of Alabama. Visibility and air quality degradation resulting from smoke are two pieces of information crucial to this activity. Currently, the tools available to the AFC are the dispersion index from the National Weather Service and surface smoke concentrations. The former provides broad guidance for prescribed burning activities but no specific information on smoke transport, areas affected, or quantification of air quality and visibility degradation. While the NOAA operational air quality guidance includes surface smoke concentrations from existing fire events, it does not account for contributions from background aerosols, which are important for the southeastern region, including Alabama. Quantification of visibility is also lacking. The University of Alabama in Huntsville has developed a state-of-the-art integrated modeling system to address these concerns. This system is based on the Community Multiscale Air Quality (CMAQ) modeling system, ingests satellite-derived smoke emissions, and assimilates NASA MODIS-derived aerosol optical thickness. In addition, this operational modeling system simulates the impact of potential prescribed burn events based on location information derived from the AFC prescribed burn permit database. A Lagrangian model is used to simulate smoke plumes for the prescribed burn requests. The combined air quality and visibility degradation resulting from these smoke plumes and background aerosols is computed, and the information is made available through a web-based decision support system built on open-source GIS components. This system provides information on intersections between smoke plumes and highways and other critical facilities such as old age homes, hospitals and schools. The system also includes satellite-detected fire locations and other satellite-derived datasets relevant to fire and smoke management.
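One standard way to quantify visibility degradation from combined smoke and background aerosol extinction, shown below as a hedged sketch (not necessarily this system's actual formulation), is the Koschmieder relation together with the deciview haze index, both computed from the light-extinction coefficient:

```python
import math

def visual_range_km(b_ext_mm1):
    """Koschmieder relation: visual range from the light-extinction
    coefficient b_ext given in inverse megameters (Mm^-1)."""
    return 3912.0 / b_ext_mm1

def deciview(b_ext_mm1):
    """Deciview haze index commonly used in US visibility assessment."""
    return 10.0 * math.log(b_ext_mm1 / 10.0)

# Hypothetical extinction values: clean background vs. smoke-affected air
for b in (20.0, 150.0):
    print(f"b_ext={b:6.1f} Mm^-1 -> range={visual_range_km(b):6.1f} km, "
          f"haze={deciview(b):5.1f} dv")
```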
Testing vision with angular and radial multifocal designs using Adaptive Optics.
Vinas, Maria; Dorronsoro, Carlos; Gonzalez, Veronica; Cortes, Daniel; Radhakrishnan, Aiswaryah; Marcos, Susana
2017-03-01
Multifocal vision corrections are increasingly used solutions for presbyopia. In the current study we evaluated, optically and psychophysically, the quality provided by multizone radial and angular segmented phase designs. Optical and relative visual quality were evaluated in 8 subjects, testing 6 phase designs. Optical quality was evaluated by means of Visual Strehl-based metrics (VS). The relative visual quality across designs was obtained through a psychophysical paradigm in which images viewed through 210 pairs of phase patterns were perceptually judged. A custom Adaptive Optics (AO) system, including a Hartmann-Shack sensor and an electromagnetic deformable mirror to measure and correct the eye's aberrations, and a phase-only reflective Spatial Light Modulator to simulate the phase designs, was developed for this study. The multizone segmented phase designs had 2-4 zones of progressive power (0 to +3D) in either radial or angular distributions. The response of an "ideal observer" responding purely on optical grounds to the same psychophysical test performed on subjects was calculated from the VS curves and compared with the relative visual quality results. Optical and psychophysical pattern-comparison tests showed that while 2-zone segmented designs (angular & radial) provided better performance for far and near vision, 3- and 4-zone segmented angular designs performed better for intermediate vision. AO correction of the subjects' natural aberrations modified the response for the different subjects, but general trends remained. The differences in perceived quality across the different multifocal patterns are, to a large extent, explained by optical factors. AO is an excellent tool to simulate multifocal refractions before they are manufactured or delivered to the patient, and to assess the effects of the native optics on their performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks
ERIC Educational Resources Information Center
Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.
2013-01-01
A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…
Kirkman, Matthew A; Muirhead, William; Nandi, Dipankar; Sevdalis, Nick
2014-01-01
Neurosurgical simulation training is becoming increasingly popular. Attitudes toward simulation among residents can contribute to the effectiveness of simulation training, but such attitudes remain poorly explored in neurosurgery with no psychometrically proven measure in the literature. The aim of the present study was to evaluate prospectively a newly developed tool for this purpose: the Neurosurgical Evaluation of Attitudes towards simulation Training (NEAT). The NEAT tool was prospectively developed in 2 stages and psychometrically evaluated (validity and reliability) in 2 administrations with the same participants. The tool comprises a questionnaire with 9 Likert scale items and 2 free-text sections assessing attitudes toward simulation in neurosurgery. The evaluation was completed with 31 neurosurgery residents in London, United Kingdom, who were generally favorable toward neurosurgical simulation. The internal consistency of the questionnaire was high, as demonstrated by the overall Cronbach α values (α=0.899 and α=0.955). All but 2 questionnaire items had "substantial" or "almost perfect" test-retest reliability following repeated survey administrations (median Pearson r correlation=0.688; range, 0.248-0.841). NEAT items were well correlated with each other on both occasions, showing good validity of content within the NEAT tool. There was no significant relationship between either gender or length of neurosurgical experience and item ratings. NEAT is the first psychometrically evaluated tool for evaluating attitudes toward simulation in neurosurgery. Further implementation of NEAT is required in wider neurosurgical populations to establish whether specific population groups differ. Use of NEAT in studies of neurosurgical simulation could offer an additional outcome measure to performance metrics, permitting evaluation of the impact of neurosurgical simulation on attitudes toward simulation both between participants and within the same participants over time. Copyright © 2014 Elsevier Inc. All rights reserved.
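The internal-consistency statistic reported above, Cronbach's α, is straightforward to compute from a respondents-by-items matrix; a minimal sketch with invented Likert ratings follows:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents x items matrix of ratings:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Hypothetical ratings from 5 residents on a 9-item Likert questionnaire
# (random values, purely to exercise the function)
ratings = np.random.default_rng(1).integers(3, 6, size=(5, 9))
print(f"alpha = {cronbach_alpha(ratings):.3f}")
```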
Nurse training with simulation: an innovative approach to teach complex microsurgery patient care.
Flurry, Mitchell; Brooke, Sebastian; Micholetti, Brett; Natoli, Noel; Moyer, Kurtis; Mnich, Stephanie; Potochny, John
2012-10-01
Simulation has become an integral part of education at all levels within the medical field. The ability to allow personnel to practice and learn in a safe and controlled environment makes it a valuable tool for initial training and continued competence verification. An area of specific interest to the reconstructive microsurgeon is assurance that the nursing staff has adequate training and experience to provide optimum care for microsurgery patients. Plastic surgeons in institutions where few microsurgeries are performed face challenges teaching nurses how to care for these complex patients. Because no standard exists to educate microsurgery nurses, learning often happens through chance on-the-job encounters. Outcomes, therefore, may be affected by poor handoffs between inexperienced personnel. Our objective is to create a course that augments such random clinical experience and teaches the knowledge and skills necessary for successful microsurgery through simulated patient scenarios. Quality care reviews at our institution served as the foundation to develop an accredited nursing course providing clinical training for the care of microsurgery patients. The course combined lectures on microsurgery, pharmacology, and flap monitoring with simulated operating room, surgical intensive care unit, postanesthesia care unit, Trauma Bay, and Floor scenarios. Evaluation of participants included a precourse examination, a postcourse examination, and a 6-month follow-up. Average test scores were 72% precourse and 92% postcourse. Educational value, effectiveness of lectures and simulation, and overall course quality were rated very high or high by 86% of respondents; no respondents rated them low. The 6-month follow-up test scores averaged 88%. Learning to care for microsurgery patients should not be left to chance patient encounters on the job. Simulation provides a safe, reproducible, and controlled clinical experience. Our results show that simulation is a highly rated and effective way to teach nurses microsurgery patient care. Simulated patient care training should be considered to augment the clinical experience in hospitals where microsurgery is performed.
NASA Astrophysics Data System (ADS)
Arasa, Josep; Pizarro, Carles; Blanco, Patricia
2016-06-01
Injection molded plastic lenses have continuously improved their performance regarding optical quality and are now as common as glass lenses in image forming devices. However, during the manufacturing process unavoidable fluctuations in material density occur, resulting in local changes in the distribution of refractive index, which degrade the imaging properties of the polymer lens. Such material density fluctuations correlate to phase delays, which opens a path for their mapping. However, it is difficult to transfer the measured variations in refractive index into conventional optical simulation tools. Thus, we propose a method to convert the local variations in refractive index into local changes of one surface of the lens, which can then be described as a free-form surface that is easy to introduce in conventional simulation tools. The proposed method was tested on a commercial gradient index (GRIN) lens for a set of six different object positions, using the MTF sagittal and tangential cuts to compare the differences between the real lens and a lens with homogeneous refractive index whose last surface was converted into a free-form shape containing the internal refractive index changes. The same procedure was used to reproduce the local refractive index changes of an injected plastic lens with local index changes measured using an in-house built polariscopic arrangement, showing the capability of the method to provide successful results.
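Under a thin-element assumption, a conversion of this kind can be grounded in the equivalence of optical path differences: a local index deviation Δn through thickness t produces OPD = Δn·t, which matches the OPD of a sag change Δz on a surface between the lens (index n) and air, (n-1)·Δz. A hedged sketch follows; the index map and lens parameters are hypothetical, not the article's measured data:

```python
import numpy as np

def index_map_to_sag(delta_n, thickness_mm, n_nominal=1.53):
    """Convert a local refractive-index deviation map into an equivalent
    free-form sag perturbation on one lens surface (thin-element assumption):
    delta_n * t == (n - 1) * delta_z  =>  delta_z = delta_n * t / (n - 1)."""
    return np.asarray(delta_n, float) * thickness_mm / (n_nominal - 1.0)

# Hypothetical 3x3 index-deviation map of an injected lens, 4 mm thick
dn = np.array([[0.0,  1e-4, 0.0],
               [1e-4, 2e-4, 1e-4],
               [0.0,  1e-4, 0.0]])
sag_um = index_map_to_sag(dn, thickness_mm=4.0) * 1000.0  # mm -> micrometers
print(np.round(sag_um, 3))
```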
Saletti, Dominique
2017-01-01
Rapid progress in ultra-high-speed imaging has allowed material properties to be studied at high strain rates by applying full-field measurements and inverse identification methods. Nevertheless, the sensitivity of these techniques still requires a better understanding, since various extrinsic factors present during an actual experiment make it difficult to separate different sources of errors that can significantly affect the quality of the identified results. This study presents a methodology using simulated experiments to investigate the accuracy of the so-called spalling technique (used to study tensile properties of concrete subjected to high strain rates) by numerically simulating the entire identification process. The experimental technique uses the virtual fields method and the grid method. The methodology consists of reproducing the recording process of an ultra-high-speed camera by generating sequences of synthetically deformed images of a sample surface, which are then analysed using the standard tools. The investigation of the uncertainty of the identified parameters, such as Young's modulus along with the stress–strain constitutive response, is addressed by introducing the most significant user-dependent parameters (i.e. acquisition speed, camera dynamic range, grid sampling, blurring), proving that the used technique can be an effective tool for error investigation. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956505
The acoustic performance of double-skin facades: A design support tool for architects
NASA Astrophysics Data System (ADS)
Batungbakal, Aireen
This study assesses and validates the measurement of sound in the urban environment and the influence of glass facade components in reducing sound transmission to the indoor environment. Noise is among the most reported issues affecting workspaces, and increased awareness of the need to minimize it has led building designers to reconsider the design of building envelopes and their site environments. Outdoor sound conditions, such as traffic noise, challenge designers to estimate accurately the capability of glass facades to achieve an appropriate indoor sound quality. To characterize the density of the urban environment, field tests captured existing sound levels in areas of high commercial development, employment, and traffic activity, establishing a baseline for sound levels common in urban work areas. The direct sound transmission loss of glass facades was simulated with INSUL, a sound insulation software package; the resulting data serve as an informative tool correlating the response of glass facade components to the existing outdoor sound levels of a project site in order to achieve desired indoor sound levels. The study further addresses the disconnect in validating the acoustic performance of glass facades early in a project's design, carrying results from controlled settings such as field testing and simulation through to project completion. Results obtained from the facade simulations and facade comparison support the conclusion that acoustic comfort is not limited to a single solution, but admits multiple design options responsive to the environment.
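Where facade-level detail is unavailable, a first-order estimate of glazing transmission loss can be made with the field-incidence mass law. The sketch below is an approximation, not an INSUL computation; the pane thickness and outdoor level are hypothetical:

```python
import math

def mass_law_tl(f_hz, surface_density_kg_m2):
    """Field-incidence mass-law approximation for a single panel:
    TL ~ 20*log10(m*f) - 47 dB, with m in kg/m^2 and f in Hz."""
    return 20.0 * math.log10(surface_density_kg_m2 * f_hz) - 47.0

glass_thickness_m = 0.006          # 6 mm monolithic pane (hypothetical)
m = 2500.0 * glass_thickness_m     # glass density ~2500 kg/m^3
outdoor_db = 75.0                  # measured traffic noise level (hypothetical)

for f in (125, 500, 2000):
    tl = mass_law_tl(f, m)
    print(f"{f:>5} Hz: TL={tl:4.1f} dB, indoor~{outdoor_db - tl:4.1f} dB")
```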
Grid Integration Research | Wind | NREL
Wind Power Plant Modeling and Simulation: engineers at the National Renewable Energy Laboratory develop the computer-aided engineering tool FAST, as well as a wind power plant simulation tool.
Millar, Ross
2013-01-01
The purpose of this paper is to present a study of how quality improvement tools and techniques are framed within healthcare settings. The paper employs an interpretive approach to understand how quality improvement tools and techniques are mobilised and legitimated. It does so using a case study of the NHS Modernisation Agency Improvement Leaders' Guides in England. Improvement Leaders' Guides were framed within a service improvement approach encouraging the use of quality improvement tools and techniques within healthcare settings. Their use formed part of enacting tools and techniques across different contexts. Whilst this enactment was believed to support the mobilisation of tools and techniques, the experience also illustrated the challenges in distributing such approaches. The paper provides an important contribution in furthering our understanding of framing the "social act" of quality improvement. Given the ongoing emphasis on quality improvement in health systems and the persistent challenges involved, it also provides important information for healthcare leaders globally in seeking to develop, implement or modify similar tools and distribute leadership within health and social care settings.
An agent-based simulation model to study accountable care organizations.
Liu, Pai; Wu, Shinyi
2016-03-01
Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions.
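A toy sketch of the agent-based idea follows; the decision rule, effort cost, and saving-share parameters are invented for illustration and stand in for the paper's calibrated payer-provider-patient model:

```python
import random

random.seed(42)

class Provider:
    """Hypothetical provider agent: invests effort in a CHF intervention
    when its expected share of savings plus quality motivation beats the
    effort cost."""
    def __init__(self, quality_weight):
        self.quality_weight = quality_weight  # 0..1: quality vs. financial priority
        self.effort = 0.0

    def decide(self, shared_saving_rate, effort_cost):
        payoff = shared_saving_rate + self.quality_weight
        self.effort = 1.0 if payoff > effort_cost else 0.0

def simulate(shared_saving_rate, n_providers=100, effort_cost=0.9):
    providers = [Provider(random.random()) for _ in range(n_providers)]
    for p in providers:
        p.decide(shared_saving_rate, effort_cost)
    # Assumed 0.3 avoided hospitalizations per engaged provider
    avoided = sum(p.effort for p in providers) * 0.3
    return avoided / n_providers

for rate in (0.3, 0.5, 0.7):
    print(f"saving share {rate:.1f}: avoided-hospitalization rate {simulate(rate):.2f}")
```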
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology in problem-solving and decision-making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations is therefore crucial in any spatial data representation and in geospatial analysis applied to any field. This paper reviews the literature on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model: it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly through sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model for evaluating the impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties in SWAT are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
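For the hydrological side, the GLUE method mentioned above amounts to Monte Carlo sampling of parameters, scoring each run with a likelihood measure, and retaining the "behavioral" runs. A minimal sketch follows, with a one-parameter toy function standing in for a SWAT run and invented observations:

```python
import numpy as np

rng = np.random.default_rng(3)
observed = np.array([2.0, 3.5, 5.0, 4.0])     # hypothetical flows (m3/s)

def toy_model(k):
    """Stand-in for a SWAT run: response driven by one parameter k."""
    t = np.arange(4)
    return k * np.array([0.5, 0.9, 1.2, 1.0]) + 0.1 * t

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

samples = rng.uniform(1.0, 8.0, size=5000)    # Monte Carlo sampling of k
likelihoods = np.array([nse(toy_model(k), observed) for k in samples])
behavioral = samples[likelihoods > 0.5]       # retain 'behavioral' runs only
print(f"{behavioral.size} behavioral runs, k in "
      f"[{behavioral.min():.2f}, {behavioral.max():.2f}]")
```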
Evaluating strategies to reduce urban air pollution
NASA Astrophysics Data System (ADS)
Duque, L.; Relvas, H.; Silveira, C.; Ferreira, J.; Monteiro, A.; Gama, C.; Rafael, S.; Freitas, S.; Borrego, C.; Miranda, A. I.
2016-02-01
During the last years, specific air quality problems have been detected in the urban area of Porto (Portugal). Both PM10 and NO2 limit values have been surpassed at several air quality monitoring stations and, following the European legislation requirements, Air Quality Plans were designed and implemented to reduce those levels. In this sense, measures to decrease PM10 and NO2 emissions have been selected, mainly related to the traffic sector, but also to the industrial and residential combustion sectors. The main objective of this study is to investigate the efficiency of these reduction measures with regard to the improvement of PM10 and NO2 concentration levels over the Porto urban region using a numerical modelling tool, The Air Pollution Model (TAPM). TAPM was applied over the study region, for a simulation domain of 80 × 80 km2 with a spatial resolution of 1 × 1 km2. The entire year of 2012 was simulated and set as the base year for the analysis of the impacts of the selected measures. Taking into account the main activity sectors, four main scenarios have been defined and simulated, with focus on: (1) hybrid cars; (2) a Low Emission Zone (LEZ); (3) fireplaces and (4) industry. The modelling results indicate that measures to reduce PM10 should be focused on residential combustion (fireplaces) and industrial activity, while for NO2 the strategy should be based on the traffic sector. The implementation of all the defined scenarios would allow a total maximum reduction of 4.5% in the levels of both pollutants.
A Monte Carlo simulation of advanced HIV disease: application to prevention of CMV infection.
Paltiel, A D; Scharfstein, J A; Seage, G R; Losina, E; Goldie, S J; Weinstein, M C; Craven, D E; Freedberg, K A
1998-01-01
Disagreement exists among decision makers regarding the allocation of limited HIV patient care resources and, specifically, the comparative value of preventing opportunistic infections in late-stage disease. A Monte Carlo simulation framework was used to evaluate a state-transition model of the natural history of HIV illness in patients with CD4 counts below 300/mm3 and to project the costs and consequences of alternative strategies for preventing AIDS-related complications. The authors describe the model and demonstrate how it may be employed to assess the cost-effectiveness of oral ganciclovir for prevention of cytomegalovirus (CMV) infection. Ganciclovir prophylaxis confers an estimated additional 0.7 quality-adjusted month of life at a net cost of $10,700, implying an incremental cost-effectiveness ratio of roughly $173,000 per quality-adjusted life year gained. Sensitivity analysis reveals that this baseline result is stable over a wide range of input data estimates, including quality of life and drug efficacy, but it is sensitive to CMV incidence and drug price assumptions. The Monte Carlo simulation framework offers decision makers a powerful and flexible tool for evaluating choices in the realm of chronic disease patient care. The authors have used it to assess HIV-related treatment options and continue to refine it to reflect advances in defining the pathogenesis and treatment of AIDS. Compared with alternative interventions, CMV prophylaxis does not appear to be a cost-effective use of scarce HIV clinical care funds. However, targeted prevention in patients identified to be at higher risk for CMV-related disease may warrant consideration.
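As a schematic of the Monte Carlo state-transition approach (with invented transition probabilities, utilities, and costs, not the article's calibrated inputs), the following sketch simulates individual patients month by month and estimates an incremental cost-effectiveness ratio:

```python
import random

random.seed(7)

# Hypothetical monthly transition probabilities and rewards
P_CMV, P_DEATH = 0.012, 0.02
UTILITY, UTILITY_CMV = 0.80, 0.55
DRUG_COST, CMV_COST = 900.0, 15000.0

def simulate_patient(prophylaxis, max_months=120):
    """One random walk through the health states; returns (QALYs, cost)."""
    qalms, cost, cmv = 0.0, 0.0, False
    p_cmv = P_CMV * (0.5 if prophylaxis else 1.0)  # assumed drug efficacy
    for _ in range(max_months):
        if random.random() < P_DEATH:
            break
        if not cmv and random.random() < p_cmv:
            cmv, cost = True, cost + CMV_COST
        cost += DRUG_COST if prophylaxis else 0.0
        qalms += UTILITY_CMV if cmv else UTILITY
    return qalms / 12.0, cost

def arm(prophylaxis, n=10000):
    runs = [simulate_patient(prophylaxis) for _ in range(n)]
    return (sum(q for q, _ in runs) / n, sum(c for _, c in runs) / n)

q0, c0 = arm(False)
q1, c1 = arm(True)
print(f"ICER ~ ${(c1 - c0) / (q1 - q0):,.0f} per QALY gained (illustrative)")
```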
Photomask quality assessment solution for 90-nm technology node
NASA Astrophysics Data System (ADS)
Ohira, Katsumi; Chung, Dong Hoon P.; Nobuyuki, Yoshioka; Tateno, Motonari; Matsumura, Kenichi; Chen, Jiunn-Hung; Luk-Pat, Gerard T.; Fukui, Norio; Tanaka, Yoshio
2004-08-01
As 90 nm LSI devices are about to enter pre-production, the cost and turn-around time of photomasks for such devices will be key factors for success in device production. Such devices will be manufactured with state-of-the-art 193 nm photolithography systems. Photomasks for these devices are produced with the most advanced equipment, materials, and processing technologies, and yet quality assurance remains an issue for volume production. The issues include defect classification and disposition hampered by the insufficient resolution of defect inspection systems in conventional review and classification processes and by aggressive RETs; uncertainty about the impact defects have on the printed feature; and inconsistencies of classical defect specifications as applied in the sub-wavelength era, all of which are becoming a serious problem. Simulation-based photomask qualification using the Virtual Stepper System is widely accepted today as a reliable tool for assessing mask defects at both the 180 nm and 130 nm technology nodes. This study examines the extendibility of the Virtual Stepper System to the 90 nm technology node. The proposed method of simulation-based mask qualification uses aerial image defect simulation in combination with a next-generation DUV inspection system with shorter wavelength (266 nm) and small pixel size, combined with a DUV high-resolution microscope for some defect cases. This paper presents experimental results that demonstrate the applicability of the method to the 90 nm technology node. Both contact and line/space patterns with various programmed defects on ArF attenuated PSM are used. The paper also addresses how to make the strategy production-worthy.
SIMulation of Medication Error induced by Clinical Trial drug labeling: the SIMME-CT study.
Dollinger, Cecile; Schwiertz, Vérane; Sarfati, Laura; Gourc-Berthod, Chloé; Guédat, Marie-Gabrielle; Alloux, Céline; Vantard, Nicolas; Gauthier, Noémie; He, Sophie; Kiouris, Elena; Caffin, Anne-Gaelle; Bernard, Delphine; Ranchon, Florence; Rioufol, Catherine
2016-06-01
To assess the impact of investigational drug labels on the risk of medication error in drug dispensing, a simulation-based learning program focusing on investigational drug dispensing was conducted. The study was undertaken in the Investigational Drugs Dispensing Unit of a University Hospital in Lyon, France. Sixty-three pharmacy workers (pharmacists, residents, technicians or students) were enrolled. Ten risk factors were selected concerning label information or the risk of confusion with another clinical trial. Each risk factor was scored independently out of 5: the higher the score, the greater the risk of error. From 400 labels analyzed, two groups were selected for the dispensing simulation: 27 labels with high risk (score ≥3) and 27 with low risk (score ≤2). Each question in the learning program was displayed as a simulated clinical trial prescription. Medication error was defined as at least one erroneous answer (i.e. an error in drug dispensing). For each question, response times were collected. High-risk investigational drug labels correlated with medication error and slower response times. Error rates were significantly higher, by 5.5-fold, for the high-risk series. Error frequency was not significantly affected by occupational category or experience in clinical trials. SIMME-CT is the first simulation-based learning tool to focus on investigational drug labels as a risk factor for medication error. SIMME-CT was also used as a training tool for staff involved in clinical research, to develop awareness of medication error risk and to validate competence in continuing medical education. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harzalla, S., E-mail: harzallahozil@yahoo.fr; Chabaat, M., E-mail: mchabaat@yahoo.com; Belgacem, F. Bin Muhammad, E-mail: fbmbelgacem@gmail.com
In this paper, a nondestructive technique is used as a tool to control cracks and microcracks in materials. A simulation by a numerical approach such as the finite element method is employed to detect cracks and, eventually, to study their propagation using a crucial parameter, the stress intensity factor. This approach has been used in the aircraft industry to control cracks. Besides, it makes it possible to highlight the defects of parts while preserving the integrity of the controlled products. On the other side, it is proven that the reliability of the control of defects gives convincing results for the improvement of the quality and the safety of the material. Eddy current testing (ECT) is a standard technique in industry for the detection of surface breaking flaws in magnetic materials such as steels. In this context, simulation tools can be used to improve the understanding of experimental signals, optimize the design of sensors or evaluate the performance of ECT procedures. CEA-LIST has for many years developed semi-analytical models embedded in the simulation platform CIVA, dedicated to non-destructive testing. The developments presented herein address the case of flaws located inside a planar and magnetic medium. Simulation results are obtained through the application of the Volume Integral Method (VIM). When considering the ECT of a single flaw, a system of two differential equations is derived from Maxwell equations. The numerical resolution of the system is carried out using the classical Galerkin variant of the Method of Moments. Besides, a probe response is calculated by application of the Lorentz reciprocity theorem. Finally, the approach itself as well as comparisons between simulation results and measured data are presented.
Development of a numerical methodology for flowforming process simulation of complex geometry tubes
NASA Astrophysics Data System (ADS)
Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca
2017-10-01
Nowadays, the incremental flowforming process is widely explored because the usage of complex tubular products is increasing due to the light-weighting trend and the use of expensive materials. The enhanced mechanical properties of finished parts, combined with the process efficiency in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. As a promising technology, additional steps are required to extend the existing flowforming limits in the production of tubular products. The objective of the present research is to further expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with thicknesses of up to 60 mm. In this study, the backward flowforming of a 7075 aluminium tubular preform is analysed as a demonstration case to define the optimum process parameters, machine requirements, and tooling geometry. Numerical simulation studies on flowforming of thin-walled tubular components were considered to increase knowledge of the technology. The calculation of the rotational movement of the meshed preform, the high thickness-to-length ratio, and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code that includes new strategies. Material characterization was also performed through tensile tests to enable process design. Finally, to check the reliability of the model, flowforming tests were carried out in an industrial environment.
NASA Astrophysics Data System (ADS)
Harzalla, S.; Belgacem, F. Bin Muhammad; Chabaat, M.
2014-12-01
In this paper, a nondestructive technique is used as a tool to control cracks and microcracks in materials. A simulation by a numerical approach such as the finite element method is employed to detect cracks and, eventually, to study their propagation using a crucial parameter, the stress intensity factor. This approach has been used in the aircraft industry to control cracks. Besides, it makes it possible to highlight the defects of parts while preserving the integrity of the controlled products. On the other side, it is proven that the reliability of the control of defects gives convincing results for the improvement of the quality and the safety of the material. Eddy current testing (ECT) is a standard technique in industry for the detection of surface breaking flaws in magnetic materials such as steels. In this context, simulation tools can be used to improve the understanding of experimental signals, optimize the design of sensors or evaluate the performance of ECT procedures. CEA-LIST has for many years developed semi-analytical models embedded in the simulation platform CIVA, dedicated to non-destructive testing. The developments presented herein address the case of flaws located inside a planar and magnetic medium. Simulation results are obtained through the application of the Volume Integral Method (VIM). When considering the ECT of a single flaw, a system of two differential equations is derived from Maxwell equations. The numerical resolution of the system is carried out using the classical Galerkin variant of the Method of Moments. Besides, a probe response is calculated by application of the Lorentz reciprocity theorem. Finally, the approach itself as well as comparisons between simulation results and measured data are presented.
2018-01-01
Approximately 90% of the structures in the Protein Data Bank (PDB) were obtained by X-ray crystallography or electron microscopy. Whereas the overall quality of the structures is considered high, thanks to a wide range of tools for structure validation, uncertainties may arise from the density maps of small molecules, such as organic ligands, ions, or water, which are non-covalently bound to the biomolecules. Even with some experience and chemical intuition, the assignment of such disconnected electron densities is often far from obvious. In this study, we suggest the use of molecular dynamics (MD) simulations and free energy calculations, which are well-established computational methods, to aid in the assignment of ambiguous disconnected electron densities. Specifically, estimates of (i) relative binding affinities, for instance between an ion and water, (ii) absolute binding free energies, i.e., free energies for transferring a solute from bulk solvent to a binding site, and (iii) stability assessments during equilibrium simulations may reveal the most plausible assignments. We illustrate this strategy using the crystal structure of the fluoride-specific channel (Fluc), which contains five disconnected electron densities previously interpreted as four fluoride ions and one sodium ion. The simulations support the assignment of the sodium ion. In contrast, calculations of relative and absolute binding free energies, as well as stability assessments during free MD simulations, suggest that four of the densities represent water molecules instead of fluoride. The assignment of water is compatible with the loss of these densities in the non-conductive F82I/F85I mutant of Fluc. We critically discuss the role of the ion force fields for the calculations presented here. Overall, these findings indicate that MD simulations and free energy calculations are helpful tools for modeling water and ions into crystallographic density maps. PMID:29771936
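As a schematic of the relative-affinity comparison in point (i) (a standard thermodynamic-cycle construction, not necessarily the authors' exact protocol), the preference of a density site for fluoride over water can be written as

    \Delta\Delta G_{\mathrm{bind}} = \Delta G_{\mathrm{site}}(\mathrm{H_2O} \rightarrow \mathrm{F^-}) - \Delta G_{\mathrm{bulk}}(\mathrm{H_2O} \rightarrow \mathrm{F^-}),

where each leg is an alchemical transformation evaluated, for example, by free energy perturbation or thermodynamic integration; a negative ΔΔG_bind favors assigning fluoride to the site, while a positive value favors water.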
[Virtual reality simulation training in gynecology: review and perspectives].
Ricard-Gauthier, Dominique; Popescu, Silvia; Benmohamed, Naida; Petignat, Patrick; Dubuisson, Jean
2016-10-26
Laparoscopic simulation has rapidly become an important tool for learning and acquiring technical skills in surgery. It is based on two different, complementary pedagogic tools: the box model trainer and the virtual reality simulator. The virtual reality simulator has demonstrated its effectiveness by improving surgical skills, decreasing operating time, improving economy of movement, and increasing self-confidence. The main advantage of this tool is the opportunity to easily organize a regular, structured, and uniform training program that provides automated, individualized feedback.
Spacecraft Guidance, Navigation, and Control Visualization Tool
NASA Technical Reports Server (NTRS)
Mandic, Milan; Acikmese, Behcet; Blackmore, Lars
2011-01-01
G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling. The tool is developed in MATLAB using the Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that the multi-body simulation data are generated and placed in the proper format before applying G-View.
FDTD simulation tools for UWB antenna analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brocato, Robert Wesley
2004-12-01
This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
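For readers unfamiliar with the method, a minimal one-dimensional FDTD loop with a Gaussian (broadband, UWB-like) soft source is sketched below; this is a normalized Cartesian toy for illustration only, not the spherical-coordinate formulation derived in the paper, and the grid size, Courant number, and pulse parameters are arbitrary choices.

    import numpy as np

    # Minimal normalized 1D FDTD sketch (Yee grid, free space) with a
    # Gaussian pulse as a broadband, UWB-like soft source. Endpoints act
    # as perfect reflectors; real solvers add absorbing boundaries.
    nz, nt = 400, 1000            # grid cells, time steps
    ez = np.zeros(nz)             # electric field samples
    hy = np.zeros(nz - 1)         # magnetic field, staggered half-cell
    S = 0.5                       # Courant number (<= 1 for stability)
    t0, spread = 60.0, 15.0       # pulse delay and width, in time steps

    for n in range(nt):
        hy += S * (ez[1:] - ez[:-1])          # update H from curl of E
        ez[1:-1] += S * (hy[1:] - hy[:-1])    # update E from curl of H
        ez[20] += np.exp(-0.5 * ((n - t0) / spread) ** 2)  # soft source

A CW run would simply replace the Gaussian with a sinusoidal source; the contrast between the two excitations is what the paper's comparisons exploit.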
IFU simulator: a powerful alignment and performance tool for MUSE instrument
NASA Astrophysics Data System (ADS)
Laurent, Florence; Boudon, Didier; Daguisé, Eric; Dubois, Jean-Pierre; Jarno, Aurélien; Kosmalski, Johan; Piqueras, Laure; Remillieux, Alban; Renault, Edgard
2014-07-01
MUSE (Multi Unit Spectroscopic Explorer) is a second-generation Very Large Telescope (VLT) integral field spectrograph (1x1 arcmin² field of view) developed for the European Southern Observatory (ESO), operating in the visible wavelength range (0.465-0.93 μm). A consortium of seven institutes is currently commissioning MUSE at the Very Large Telescope for the Preliminary Acceptance in Chile, scheduled for September 2014. MUSE is composed of several subsystems, each under the responsibility of one institute. The Fore Optics derotates and anamorphoses the image at the focal plane. Splitting and Relay Optics feed the 24 identical Integral Field Units (IFU), which are mounted within a large monolithic instrument mechanical structure. Each IFU incorporates an image slicer, a fully refractive spectrograph with VPH grating, and a detector system connected to a global vacuum and cryogenic system. During 2012 and 2013, all MUSE subsystems were integrated, aligned, and tested at the P.I. institute in Lyon. After a successful PAE in September 2013, the MUSE instrument was shipped to the Very Large Telescope in Chile, where it was aligned and tested in the ESO integration hall at Paranal. Afterwards, MUSE was transferred as a monolithic unit, without dismounting, onto the VLT, where first light was achieved. This talk describes the IFU Simulator, which is the main alignment and performance tool for the MUSE instrument. The IFU Simulator mimics the optomechanical interface between the MUSE pre-optics and the 24 IFUs. The optomechanical design is presented. Then, the alignment method of this innovative tool for identifying the pupil and image planes is described. Finally, the internal test results are reported. The success of the MUSE alignment using the IFU Simulator is demonstrated by the excellent results obtained for MUSE positioning, image quality, and throughput. MUSE commissioning at the VLT is planned for September 2014.
Calleja-Fernández, Alicia; Pintor-de-la-Maza, Begoña; Vidal-Casariego, Alfonso; Cano-Rodríguez, Isidoro; Ballesteros-Pomar, María D
2016-06-01
Texture-modified diets (TMDs) should fulfil nutritional goals, guarantee a homogeneous texture, and meet food safety regulations. The food industry has created texture-modified food (TMF) that meets the TMD requirements of quality and safety for inpatients. The aim was to design and develop a tool that allows the objective selection of foodstuffs for TMDs, ensuring the nutritional requirements and swallowing safety of inpatients in order to improve their quality of life, especially regarding their food satisfaction. An evaluation tool was designed to objectively determine the adequacy of food included in the TMD menus of a hospital. The "Objective Evaluation Tool for Texture-Modified Food" (OET-TMF) consists of seven items that evaluate the food's nutritional quality (energy and protein content), presence of allergens, texture and viscosity, cooking, storage type, shelf life, and patient acceptance. The total score ranges from 0 to 64 and is divided into four categories: high quality, good quality, medium quality, and low quality. Studying four different commercial TMFs contributed to the validation of the tool. All the evaluated products scored between good and high quality. There was a tendency (p = 0.077) towards higher consumption of products with a higher overall OET-TMF quality score. The product that scored highest with the tool was the best accepted; the product with the lowest score had the highest rate of refusal. The OET-TMF allows for the objective discrimination of TMF quality. In addition, it shows a certain relationship between assessed quality and observed intake.
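A minimal sketch of how such an additive scoring instrument could be implemented is given below; the seven item names mirror the abstract, but the per-item point ranges and the category cut-offs are hypothetical, since the abstract specifies only a 0-64 total divided into four bands.

    # Hedged sketch of an additive scoring instrument in the spirit of the
    # OET-TMF. Per-item maxima and band thresholds are HYPOTHETICAL.
    ITEMS = ("energy_protein", "allergens", "texture_viscosity",
             "cooking", "storage_type", "shelf_life", "patient_acceptance")
    MAX_POINTS = dict(zip(ITEMS, (12, 8, 12, 8, 8, 8, 8)))  # sums to 64

    def total_score(scores: dict) -> int:
        """Sum the seven item scores, clamping each to its assumed range."""
        return sum(min(max(scores.get(k, 0), 0), MAX_POINTS[k]) for k in ITEMS)

    def quality_band(total: int) -> str:
        """Map the 0-64 total onto four bands (cut-offs are placeholders)."""
        if total >= 52:
            return "high quality"
        if total >= 39:
            return "good quality"
        if total >= 26:
            return "medium quality"
        return "low quality"

    example = dict(zip(ITEMS, (10, 8, 11, 6, 7, 6, 7)))
    print(total_score(example), quality_band(total_score(example)))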
Miller, David J; Nelson, Carl A; Oleynikov, Dmitry
2009-05-01
With a limited number of access ports, minimally invasive surgery (MIS) often requires the complete removal of one tool and reinsertion of another. Modular or multifunctional tools can be used to avoid this step. In this study, soft computing techniques are used to optimally arrange a modular tool's functional tips, allowing surgeons to deliver treatment of improved quality in less time, decreasing overall cost. The investigators watched University Medical Center surgeons perform MIS procedures (e.g., cholecystectomy and Nissen fundoplication) and recorded the procedures to digital video. The video was then used to analyze the types of instruments used, the duration of each use, and the function of each instrument. These data were aggregated with fuzzy logic techniques using four membership functions to quantify the overall usefulness of each tool. This allowed subsequent optimization of the arrangement of functional tips within the modular tool to decrease overall time spent changing instruments during simulated surgical procedures based on the video recordings. Based on a prototype and a virtual model of a multifunction laparoscopic tool designed by the investigators that can interchange six different instrument tips through the tool's shaft, the range of tool change times is approximately 11-13 s. Using this figure, estimated time savings for the procedures analyzed ranged from 2.5 to over 32 min, and on average, total surgery time can be reduced by almost 17% by using the multifunction tool.
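To make the fuzzy aggregation step concrete, the sketch below scores a tool tip's usefulness from observed usage statistics using triangular membership functions; the four input variables and all breakpoints are hypothetical stand-ins, since the abstract states only that four membership functions were used.

    # Hedged sketch of fuzzy-logic usefulness scoring for a tool tip.
    # Input variables and membership breakpoints are HYPOTHETICAL.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def usefulness(freq, duration, criticality, versatility):
        """Aggregate four fuzzy memberships with a simple unweighted mean."""
        memberships = (
            tri(freq,        0, 20, 40),   # uses per procedure
            tri(duration,    0, 10, 30),   # minutes of active use
            tri(criticality, 0,  1,  2),   # 0 = optional ... 2 = essential
            tri(versatility, 0,  3,  6),   # number of distinct task types
        )
        return sum(memberships) / len(memberships)

    print(f"usefulness: {usefulness(15, 8, 1, 2):.2f}")  # toy example

Ranking candidate tips by such a score and placing highly used functions in the most accessible cartridge positions is one plausible way to cut into the 11-13 s tip-change overhead the study measured.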
CAE for Injection Molding — Past, Present and the Future
NASA Astrophysics Data System (ADS)
Wang, Kuo K.
2004-06-01
It is well known that injection molding is the most effective process for mass-producing discrete plastic parts of complex shape to the highest precision at the lowest cost. However, due to the complex behavior of polymeric materials undergoing a transient, non-isothermal process, it is equally well recognized that the quality of the final products is often difficult to assure. This is particularly true when a new mold or material is encountered. As a result, injection molding has often been viewed as more an art than a science. During the past few decades, numerical simulation of the injection molding process based on analytic models has become feasible for practical use as computers have become continually faster and cheaper. A research effort was initiated at the Cornell Injection Molding Program (CIMP) in 1974 under a grant from the National Science Foundation. Over a quarter of a century, CIMP has established scientific bases ranging from materials characterization and flow analysis to the prediction of part quality. Use of such CAE tools has become commonplace in industry today. Present efforts have been aimed primarily at refining many aspects of the process. Computational efficiency and user interfaces have been the main thrusts of commercial software developers. Extension to 3-dimensional flow analysis for certain parts has drawn some attention. Research activities are continuing on the molding of fiber-filled materials and reactive polymers. Expanded molding processes such as gas-assisted injection, co-injection, micro-molding, and many others are continually being investigated. In the future, improvements in simulation accuracy and efficiency will continue, including in-depth studies on materials characterization. Intelligent on-line process control may draw more attention in order to achieve a higher degree of automation. As Internet technology continues to evolve, Web-based CAE tools for design, production, and remote process monitoring and control may come into practice. The CAE tools will eventually be integrated into Enterprise Resource Planning (ERP) systems as the trend of enterprise globalization continues.
Simulating the Camp David Negotiations: A Problem-Solving Tool in Critical Pedagogy
ERIC Educational Resources Information Center
McMahon, Sean F.; Miller, Chris
2013-01-01
This article reflects critically on simulations. Building on the authors' experience simulating the Palestinian-Israeli-American Camp David negotiations of 2000, they argue that simulations are useful pedagogical tools that encourage creative--but not critical--thinking and constructivist learning. However, they can also have the deleterious…
THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE
This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. The AMET tool is being developed to aid in model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...