Water Quality Analysis Simulation Program (WASP)
The Water Quality Analysis Simulation Program (WASP) model helps users interpret and predict water quality responses to natural phenomena and manmade pollution for various pollution management decisions.
Water Quality Analysis Simulation
The Water Quality Analysis Simulation Program is an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.
Chen, Xinyuan; Dai, Jianrong
2018-05-01
Magnetic Resonance Imaging (MRI) simulation differs from diagnostic MRI in purpose, technical requirements, and implementation. We propose a semiautomatic method for image acceptance and commissioning of the scanner, the radiofrequency (RF) coils, and pulse sequences for an MRI simulator. The ACR MRI accreditation large phantom was used for image quality analysis with seven parameters. Standard ACR sequences with a split head coil were adopted to examine the scanner's basic performance. The performance of simulation RF coils was measured and compared using the standard sequence with different clinical diagnostic coils. We used simulation sequences with simulation coils to test image quality and the advanced performance of the scanner. Codes and procedures were developed for semiautomatic image quality analysis. When using standard ACR sequences with a split head coil, image quality passed all ACR-recommended criteria. The image intensity uniformity with a simulation RF coil decreased about 34% compared with the eight-channel diagnostic head coil, while the other six image quality parameters were acceptable. Uniformity could be restored to more than 85% by built-in intensity calibration methods. In the simulation sequences test, the contrast resolution was sensitive to the FOV and matrix settings. The geometric distortion of simulation sequences such as T1-weighted and T2-weighted images was well controlled at the isocenter and 10 cm off-center within a range of ±1% (2 mm). We developed a semiautomatic image quality analysis method for quantitative evaluation of images and commissioning of an MRI simulator. The baseline performances of simulation RF coils and pulse sequences have been established for routine QA. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
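The uniformity figure quoted above is typically computed as the ACR percent integral uniformity (PIU). A minimal sketch, assuming a NumPy array holds the phantom ROI; the helper name and test image are hypothetical, and the abstract does not show the authors' code:

```python
import numpy as np

def percent_integral_uniformity(roi: np.ndarray) -> float:
    """ACR-style percent integral uniformity (PIU):
    PIU = 100 * (1 - (S_max - S_min) / (S_max + S_min)).
    Approximated here with raw pixel extremes; in practice small
    averaging ROIs are placed at the brightest and darkest spots.
    """
    s_max, s_min = float(roi.max()), float(roi.min())
    return 100.0 * (1.0 - (s_max - s_min) / (s_max + s_min))

# Hypothetical example: a uniform phantom slice with mild noise.
rng = np.random.default_rng(0)
roi = 1000 + 30 * rng.standard_normal((64, 64))
print(f"PIU = {percent_integral_uniformity(roi):.1f}%")  # ACR commonly cites >= 87.5% below 3 T
```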
Comparative Analysis of Reconstructed Image Quality in a Simulated Chromotomographic Imager
2014-03-01
…quality. This example uses five basic images: a backlit bar chart with random intensity, 100 nm separation. A total of 54 initial target…compared for a variety of scenes. Reconstructed image quality is highly dependent on the initial target hypercube, so a total of 54 initial target…COMPARATIVE ANALYSIS OF RECONSTRUCTED IMAGE QUALITY IN A SIMULATED CHROMOTOMOGRAPHIC IMAGER (THESIS)
Simulation-based training for nurses: Systematic review and meta-analysis.
Hegland, Pål A; Aarlie, Hege; Strømme, Hilde; Jamtvedt, Gro
2017-07-01
Simulation-based training is a widespread strategy to improve health-care quality. However, its effect on registered nurses has previously not been established in systematic reviews. The aim of this systematic review is to evaluate the effect of simulation-based training on nurses' skills and knowledge. We searched CDSR, DARE, HTA, CENTRAL, CINAHL, MEDLINE, Embase, ERIC, and SveMed+ for randomised controlled trials (RCT) evaluating the effect of simulation-based training among nurses. Searches were completed in December 2016. Two reviewers independently screened abstracts and full-text, extracted data, and assessed risk of bias. We compared simulation-based training to other learning strategies, high-fidelity simulation to other simulation strategies, and different organisation of simulation training. Data were analysed through meta-analysis and narrative syntheses. GRADE was used to assess the quality of evidence. Fifteen RCTs met the inclusion criteria. For the comparison of simulation-based training to other learning strategies on nurses' skills, six studies in the meta-analysis showed a significant, but small effect in favour of simulation (SMD -1.09, CI -1.72 to -0.47). There was large heterogeneity (I² = 85%). For the other comparisons, there was large between-study variation in results. The quality of evidence for all comparisons was graded as low. The effect of simulation-based training varies substantially between studies. Our meta-analysis showed a significant effect of simulation training compared to other learning strategies, but the quality of evidence was low, indicating uncertainty. Other comparisons showed inconsistency in results. Based on our findings, simulation training appears to be an effective strategy to improve nurses' skills, but further good-quality RCTs with adequate sample sizes are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
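For readers unfamiliar with the pooling behind the SMD and I² figures above, here is a minimal DerSimonian-Laird random-effects sketch; the six effect sizes and variances are made-up stand-ins, not the review's data:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird) with I^2 heterogeneity."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    q = float(np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2))
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100.0           # I^2 in percent
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical SMDs and variances for six trials.
smd = [-1.8, -0.4, -1.2, -0.9, -2.0, -0.3]
var = [0.10, 0.08, 0.12, 0.09, 0.15, 0.07]
print(dersimonian_laird(smd, var))
```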
The Water Quality Analysis Simulation Program (WASP) is a dynamic, spatially-resolved, differential mass balance fate and transport modeling framework. WASP is used to develop models to simulate concentrations of environmental contaminants in surface waters and sediments. As a mo...
Simulation-based optimization framework for reuse of agricultural drainage water in irrigation.
Allam, A; Tawfik, A; Yoshimura, C; Fleifle, A
2016-05-01
A simulation-based optimization framework for agricultural drainage water (ADW) reuse has been developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm. This framework was applied to the Gharbia drain in the Nile Delta, Egypt, in summer and winter 2012. First, the water quantity and quality of the drain was simulated using the QUAL2Kw model. Second, uncertainty analysis and sensitivity analysis based on Monte Carlo simulation were performed to assess QUAL2Kw's performance and to identify the most critical variables for determination of water quality, respectively. Finally, a genetic algorithm was applied to maximize the total reuse quantity from seven reuse locations with the condition not to violate the standards for using mixed water in irrigation. The water quality simulations showed that organic matter concentrations are critical management variables in the Gharbia drain. The uncertainty analysis showed the reliability of QUAL2Kw to simulate water quality and quantity along the drain. Furthermore, the sensitivity analysis showed that the 5-day biochemical oxygen demand, chemical oxygen demand, total dissolved solids, total nitrogen and total phosphorus are highly sensitive to point source flow and quality. Additionally, the optimization results revealed that the reuse quantities of ADW can reach 36.3% and 40.4% of the available ADW in the drain during summer and winter, respectively. These quantities meet 30.8% and 29.1% of the drainage basin requirements for fresh irrigation water in the respective seasons. Copyright © 2016 Elsevier Ltd. All rights reserved.
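A hedged sketch of the penalized genetic-algorithm idea described above; `quality_violation` is a toy stand-in for the QUAL2Kw water-quality response, and all bounds and parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)
N_SITES, POP, GENS = 7, 60, 200
Q_MAX = np.array([4.0, 3.5, 5.0, 2.5, 3.0, 4.5, 2.0])  # hypothetical upper bounds (m^3/s)

def quality_violation(q):
    # Toy linear mixing constraint standing in for the irrigation-standard
    # check that QUAL2Kw performs in the real framework.
    return max(0.0, q.sum() - 15.0)

def fitness(q):
    return q.sum() - 1e3 * quality_violation(q)   # penalty method

pop = rng.uniform(0, Q_MAX, size=(POP, N_SITES))
for _ in range(GENS):
    f = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(f)[-POP // 2:]]                 # truncation selection
    kids = (parents[rng.integers(0, len(parents), POP // 2)] +
            parents[rng.integers(0, len(parents), POP // 2)]) / 2  # blend crossover
    kids += rng.normal(0, 0.1, kids.shape)                   # Gaussian mutation
    pop = np.clip(np.vstack([parents, kids]), 0, Q_MAX)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best.round(2), best.sum().round(2))
```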
DEVELOPMENT AND ANALYSIS OF AIR QUALITY MODELING SIMULATIONS FOR HAZARDOUS AIR POLLUTANTS
The concentrations of five hazardous air pollutants were simulated using the Community Multiscale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results a...
Diagnostic Analysis of Ozone Concentrations Simulated by Two Regional-Scale Air Quality Models
Since the Community Multiscale Air Quality modeling system (CMAQ) and the Weather Research and Forecasting with Chemistry model (WRF/Chem) use different approaches to simulate the interaction of meteorology and chemistry, this study compares the CMAQ and WRF/Chem air quality simu...
Incorporating quality and safety education for nurses competencies in simulation scenario design.
Jarzemsky, Paula; McCarthy, Jane; Ellis, Nadege
2010-01-01
When planning a simulation scenario, even if adopting prepackaged simulation scenarios, faculty should first conduct a task analysis to guide development of learning objectives and cue critical events. The authors describe a strategy for systematic planning of simulation-based training that incorporates knowledge, skills, and attitudes as defined by the Quality and Safety Education for Nurses (QSEN) initiative. The strategy cues faculty to incorporate activities that target QSEN competencies (patient-centered care, teamwork and collaboration, evidence-based practice, quality improvement, informatics, and safety) before, during, and after simulation scenarios.
This poster presents analysis of near-realtime air quality simulations over New York State for two summer and one winter season. Simulations were performed as a pilot study between the NOAA, EPA, and NYSDEC, utilizing resources from the national operational NOAA/EPA air quality f...
A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS
Fine-scale Computational Fluid Dynamics (CFD) simulation of pollutant concentrations within roadway and building microenvironments is feasible using high performance computing. Unlike currently used regulatory air quality models, fine-scale CFD simulations are able to account rig...
UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E
A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...
This study presents an evaluation of summertime daily maximum ozone concentrations over North America (NA) and Europe (EU) using the database generated during Phase 1 of the Air Quality Model Evaluation International Initiative (AQMEII). The analysis focuses on identifying tempor...
Development and testing of a fast conceptual river water quality model.
Keupers, Ingrid; Willems, Patrick
2017-04-15
Modern, model-based river quality management strongly relies on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating the long time periods that are needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method to set up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as Plug Flow Reactors (PFR) and Continuously Stirred Tank Reactors (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (a factor of 10⁵) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
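A minimal sketch of the tanks-in-series idea behind such a conceptual model, assuming first-order decay and explicit Euler stepping; parameter values are illustrative, not from the paper:

```python
import numpy as np

def cstr_chain(c_in, n_tanks, tau_hours, k_per_day, dt_hours=0.1, t_end_hours=120):
    """Route a constant upstream concentration through n CSTRs in series
    with first-order decay: dC/dt = (C_in - C)/tau - k*C (explicit Euler)."""
    k = k_per_day / 24.0                       # decay rate in 1/h
    c = np.zeros(n_tanks)
    for _ in range(int(t_end_hours / dt_hours)):
        upstream = np.concatenate(([c_in], c[:-1]))
        c += dt_hours * ((upstream - c) / tau_hours - k * c)
    return c

# Hypothetical reach: BOD of 10 mg/L entering 5 reservoirs, 6 h residence each.
print(cstr_chain(c_in=10.0, n_tanks=5, tau_hours=6.0, k_per_day=0.3).round(3))
```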
US EPA Region 4 and the National Water Quality Modeling Work Group are proud to sponsor a 5-day workshop on water quality principles and modeling using the Water Quality Analysis Simulation Program (WASP).
The Analysis, Numerical Simulation, and Diagnosis of Extratropical Weather Systems
2003-09-30
The Analysis, Numerical Simulation, and Diagnosis of Extratropical Weather Systems. Dr. Melvyn A. Shapiro, NOAA/Office of Weather and Air Quality…predictability of extratropical cyclones. APPROACH: My approach toward achieving the above objectives has been to foster national and…
Incorporating Quality Scores in Meta-Analysis
ERIC Educational Resources Information Center
Ahn, Soyeon; Becker, Betsy Jane
2011-01-01
This paper examines the impact of quality-score weights in meta-analysis. A simulation examines the roles of study characteristics such as population effect size (ES) and its variance on the bias and mean square errors (MSEs) of the estimators for several patterns of relationship between quality and ES, and for specific patterns of systematic…
NASA Technical Reports Server (NTRS)
Powers, Bruce G.
1996-01-01
The ability to use flight data to determine an aircraft model with structural dynamic effects suitable for piloted simulation and handling qualities analysis has been developed. This technique was demonstrated using SR-71 flight test data. For the SR-71 aircraft, the most significant structural response is the longitudinal first-bending mode. This mode was modeled as a second-order system, and the other higher order modes were modeled as a time delay. The distribution of the modal response at various fuselage locations was developed using a uniform beam solution, which can be calibrated using flight data. This approach was compared to the mode shape obtained from the ground vibration test, and the general form of the uniform beam solution was found to be a good representation of the mode shape in the areas of interest. To calibrate the solution, pitch-rate and normal-acceleration instrumentation is required for at least two locations. With the resulting structural model incorporated into the simulation, a good representation of the flight characteristics was provided for handling qualities analysis and piloted simulation.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2008-08-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
PICASSO: an end-to-end image simulation tool for space and airborne imaging systems
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.
2010-06-01
The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
This presentation focuses on the dynamic evaluation of the CMAQ model over the continental United States using multi-decadal simulations for the period from 1990 to 2010 to examine how well the changes in observed ozone air quality induced by variations in meteorology and/or emis...
Heo, Eun Hwa; Kim, Sehyun; Park, Hye-Ja; Kil, Suk Yong
2016-11-01
This study aimed to evaluate the effects of a simulated laughter programme on mood, cortisol levels, and health-related quality of life among haemodialysis patients. Forty participants were randomly assigned to a laughter group (n = 20) or a control group (n = 20). Eleven participants completed the laughter programme after haemodialysis sessions and 18 control participants remained. The 4-week simulated laughter programme included weekly 60 min group sessions of simulated laughter, breathing, stretching exercises, and meditation, as well as daily 15 s individual laughter sessions administered via telephone. Mood, cortisol levels, and health-related quality of life were analysed using rank analysis of covariance and Wilcoxon's signed rank test. The laughter group exhibited improvements in mood, symptoms, social interaction quality, and role limitations due to physical health. The simulated laughter programme may help improve mood and health-related quality of life among haemodialysis patients. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Klimczak, Marcin; Bojarski, Jacek; Ziembicki, Piotr; Kęskiewicz, Piotr
2017-11-01
The requirements concerning the energy performance of buildings and their internal installations, particularly HVAC systems, have been growing continuously in Poland and all over the world. The existing, traditional calculation methods that follow from the static heat exchange model are frequently not sufficient for a reasonable heating design of a building. Both in Poland and elsewhere in the world, methods and software are employed which allow a detailed simulation of the heating and moisture conditions in a building, and also an analysis of the performance of HVAC systems within a building. However, these systems are usually complex and difficult to use. In addition, the development of a simulation model that is sufficiently adequate to the real building requires considerable designer involvement and is time-consuming and laborious. A simplification of the simulation model of a building makes it possible to reduce the costs of computer simulations. The paper analyses in detail the effect of introducing a number of different variants of the simulation model developed in Design Builder on the quality of the final results obtained. The objective of this analysis is to find simplifications which yield simulation results with an acceptable level of deviation from the detailed model, thus facilitating a quick energy performance analysis of a given building.
Numerical simulation of deformation and figure quality of precise mirror
NASA Astrophysics Data System (ADS)
Vit, Tomáš; Melich, Radek; Sandri, Paolo
2015-01-01
The presented paper shows results and a comparison of FEM numerical simulations and optical tests of the assembly of a precise Zerodur mirror with a mounting structure for space applications. It also shows how the curing of adhesive film can impact the optical surface, especially as regards deformations. Finally, the paper shows the results of the figure quality analysis, which are based on data from FEM simulation of optical surface deformations.
Rotorcraft flying qualities improvement using advanced control
NASA Technical Reports Server (NTRS)
Walker, D.; Postlethwaite, I.; Howitt, J.; Foster, N.
1993-01-01
We report on recent experience gained when a multivariable helicopter flight control law was tested on the Large Motion Simulator (LMS) at DRA Bedford. This was part of a study into the application of multivariable control theory to the design of full-authority flight control systems for high-performance helicopters. In this paper, we present some of the results that were obtained during the piloted simulation trial and from subsequent off-line simulation and analysis. The performance provided by the control law led to level 1 handling quality ratings for almost all of the mission task elements assessed, both during the real-time and off-line analysis.
Kuselman, Ilya; Pennecchi, Francesca; Epstein, Malka; Fajgelj, Ales; Ellison, Stephen L R
2014-12-01
Monte Carlo simulation of expert judgments on human errors in a chemical analysis was used for determination of distributions of the error quantification scores (scores of likelihood and severity, and scores of effectiveness of a laboratory quality system in prevention of the errors). The simulation was based on modeling of an expert behavior: confident, reasonably doubting and irresolute expert judgments were taken into account by means of different probability mass functions (pmfs). As a case study, 36 scenarios of human errors which may occur in elemental analysis of geological samples by ICP-MS were examined. Characteristics of the score distributions for three pmfs of an expert behavior were compared. Variability of the scores, as standard deviation of the simulated score values from the distribution mean, was used for assessment of the score robustness. A range of the score values, calculated directly from elicited data and simulated by a Monte Carlo method for different pmfs, was also discussed from the robustness point of view. It was shown that robustness of the scores, obtained in the case study, can be assessed as satisfactory for the quality risk management and improvement of a laboratory quality system against human errors. Copyright © 2014 Elsevier B.V. All rights reserved.
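A small sketch of the Monte Carlo scheme described above, with hypothetical pmfs for the three expert behaviors and the standard deviation of the simulated panel scores used as the robustness proxy:

```python
import numpy as np

rng = np.random.default_rng(1)
SCORES = np.array([1, 2, 3, 4, 5])            # ordinal likelihood/severity scale

# Hypothetical pmfs for three expert behaviors (each row sums to 1).
PMFS = {
    "confident":  [0.00, 0.05, 0.10, 0.25, 0.60],
    "doubting":   [0.05, 0.15, 0.30, 0.30, 0.20],
    "irresolute": [0.20, 0.20, 0.20, 0.20, 0.20],
}

def simulate_scores(pmf, n_experts=10, n_draws=100_000):
    draws = rng.choice(SCORES, size=(n_draws, n_experts), p=pmf)
    panel_mean = draws.mean(axis=1)           # one panel score per draw
    return panel_mean.mean(), panel_mean.std(ddof=1)

for name, pmf in PMFS.items():
    m, s = simulate_scores(pmf)
    print(f"{name:>10}: mean={m:.2f}, sd={s:.3f}  (sd as robustness proxy)")
```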
Establishing High-Quality Prostate Brachytherapy Using a Phantom Simulator Training Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thaker, Nikhil G.; Kudchadker, Rajat J.; Swanson, David A.
2014-11-01
Purpose: To design and implement a unique training program that uses a phantom-based simulator to teach the process of prostate brachytherapy (PB) quality assurance and improve the quality of education. Methods and Materials: Trainees in our simulator program were practicing radiation oncologists, radiation oncology residents, and fellows of the American Brachytherapy Society. The program emphasized 6 core areas of quality assurance: patient selection, simulation, treatment planning, implant technique, treatment evaluation, and outcome assessment. Using the Iodine-125 (¹²⁵I) preoperative treatment planning technique, trainees implanted their ultrasound phantoms with dummy seeds (ie, seeds with no activity). Pre- and postimplant dosimetric parameters were compared and correlated using regression analysis. Results: Thirty-one trainees successfully completed the simulator program during the period under study. The mean phantom prostate size, number of seeds used, and total activity were generally consistent between trainees. All trainees met the V100 >95% objective both before and after implantation. Regardless of the initial volume of the prostate phantom, trainees' ability to cover the target volume with at least 100% of the dose (V100) was not compromised (R=0.99 pre- and postimplant). However, the V150 had lower concordance (R=0.37) and may better reflect heterogeneity control of the implant process. Conclusions: Analysis of implants from this phantom-based simulator shows a high degree of consistency between trainees and uniformly high-quality implants with respect to parameters used in clinical practice. This training program provides a valuable educational opportunity that improves the quality of PB training and likely accelerates the learning curve inherent in PB. Prostate phantom implantation can be a valuable first step in the acquisition of the required skills to safely perform PB.
Verification and Evaluation of Aquatic Contaminant Simulation Module (CSM)
2016-08-01
…RECOVERY model (Boyer et al. 1994, Ruiz et al. 2000) and Water-quality Analysis Simulation Program (WASP) model (Wool et al. 2006). This technical note (TN…bacteria, and detritus). Natural waters can contain a mixture of solid particles ranging from gravel (2 mm to 20 mm) or sand (0.07 mm to 2 mm) down to…quality perspective, cohesive sediments are usually of greater importance in water quality modeling. The chemical species in the active sediment
Coon, William F.
2008-01-01
A computer model of hydrologic and water-quality processes of the Onondaga Lake basin in Onondaga County, N.Y., was developed during 2003-07 to assist water-resources managers in making basin-wide management decisions that could affect peak flows and the water quality of tributaries to Onondaga Lake. The model was developed with the Hydrological Simulation Program-Fortran (HSPF) and was designed to allow simulation of proposed or hypothetical land-use changes, best-management practices (BMPs), and instream stormwater-detention basins such that their effects on flows and loads of suspended sediment, orthophosphate, total phosphorus, ammonia, organic nitrogen, and nitrate could be analyzed. Extreme weather conditions, such as intense storms and prolonged droughts, can be simulated through manipulation of the precipitation record. Model results obtained from different scenarios can then be compared and analyzed through an interactive computer program known as Generation and Analysis of Model Simulation Scenarios for Watersheds (GenScn). Background information on HSPF and GenScn is presented to familiarize the user with these two programs. Step-by-step examples are provided on (1) the creation of land-use, BMP, and stormflow-detention scenarios for simulation by the HSPF model, and (2) the analysis of simulation results through GenScn.
The dataset represents the data depicted in the Figures and Tables of a Journal Manuscript with the following abstract: The objective of this study is to determine the adequacy of using a relatively coarse horizontal resolution (i.e. 36 km) to simulate long-term trends of pollutant concentrations and radiation variables with the coupled WRF-CMAQ model. WRF-CMAQ simulations over the continental United State are performed over the 2001 to 2010 time period at two different horizontal resolutions of 12 and 36 km. Both simulations used the same emission inventory and model configurations. Model results are compared both in space and time to assess the potential weaknesses and strengths of using coarse resolution in long-term air quality applications. The results show that the 36 km and 12 km simulations are comparable in terms of trends analysis for both pollutant concentrations and radiation variables. The advantage of using the coarser 36 km resolution is a significant reduction of computational cost, time and storage requirement which are key considerations when performing multiple years of simulations for trend analysis. However, if such simulations are to be used for local air quality analysis, finer horizontal resolution may be beneficial since it can provide information on local gradients. In particular, divergences between the two simulations are noticeable in urban, complex terrain and coastal regions.This dataset is associated with the following publication
Water Network Tool for Resilience v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-09
WNTR is a Python package designed to simulate and analyze resilience of water distribution networks. The software includes:
- Pressure-driven and demand-driven hydraulic simulation
- Water quality simulation to track concentration, trace, and water age
- Conditional controls to simulate power outages
- Models to simulate pipe breaks
- A wide range of resilience metrics
- Analysis and visualization tools
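A short usage sketch based on WNTR's documented API; option and result names may differ slightly across WNTR versions, and the INP path is a placeholder:

```python
import wntr  # pip install wntr

# Load a network model from an EPANET INP file (placeholder path).
wn = wntr.network.WaterNetworkModel('networks/Net3.inp')

# Water age as a simple quality indicator: enable AGE simulation.
wn.options.quality.parameter = 'AGE'

# Run a hydraulic plus quality simulation with the EPANET engine.
sim = wntr.sim.EpanetSimulator(wn)
results = sim.run_sim()

pressure = results.node['pressure']   # pandas DataFrame: time x node
age = results.node['quality']         # water age (s) per node over time
print(pressure.min().min(), age.max().max() / 3600.0)
```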
Bauer, Daniel P.; Steele, Timothy Doak; Anderson, Richard D.
1978-01-01
An analysis of the waste-load assimilative capacity of the Yampa River from Steamboat Springs to Hayden, Colo., a distance of 38 miles, was made during September 1975 to obtain information on the effects of projected waste loadings on this stream reach. Simulations of effects of waste loadings on streamflow quality were made using a steady-state water-quality model. The simulations were based on 7-day low-flow values with a 10-year recurrence interval and population projections for 2010. Model results for December and September streamflow conditions indicated that the recommended 1978 Colorado and 1976 U.S. Environmental Protection Agency water-quality standard of 0.02 milligram per liter for nonionized ammonia concentration would be exceeded. Model simulations also included the effect of a flow augmentation of 20 cubic feet per second from a proposed upstream reservoir. The permissible ammonia loading in the study reach could be increased approximately 25 percent with this amount of flow augmentation. Simulations of concentrations of dissolved oxygen, fecal-coliform bacteria, and nitrate nitrogen indicated that the State's water-quality goals proposed for 1978, 1983, or 1985 would not be exceeded. (Woodard-USGS)
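The roughly 25 percent gain from 20 ft³/s of augmentation follows from simple steady-state dilution arithmetic. A sketch with illustrative numbers (not the report's values), neglecting the effluent's own flow:

```python
# Steady-state dilution mass balance used in waste-load allocation sketches:
# downstream C = (Q*C_up + W) / Q, so the permissible load W at a standard
# C_std is W = Q * (C_std - C_up) and grows in proportion to augmented flow.

C_STD = 0.02      # mg/L nonionized ammonia standard
Q_UP = 80.0       # 7-day, 10-year low flow (ft^3/s), hypothetical
C_UP = 0.005      # upstream concentration (mg/L), hypothetical
CFS_TO_LPS = 28.3168

def permissible_load_mg_per_s(q_up_cfs, c_up, c_std):
    q = q_up_cfs * CFS_TO_LPS          # convert to L/s
    return q * (c_std - c_up)          # allowable load in mg/s

base = permissible_load_mg_per_s(Q_UP, C_UP, C_STD)
augmented = permissible_load_mg_per_s(Q_UP + 20.0, C_UP, C_STD)
print(f"load increase with 20 cfs augmentation: {(augmented / base - 1) * 100:.0f}%")
```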
Through the comparison of several regional-scale chemistry transport modelling systems that simulate meteorology and air quality over the European and American continents, this study aims at i) apportioning the error to the responsible processes using time-scale analysis, ii) hel...
NASA Astrophysics Data System (ADS)
Siepmann, Jens P.; Wortberg, Johannes; Heinzler, Felix A.
2016-03-01
The injection molding process is strongly influenced by the viscosity of the material, and the viscosity of the polymer changes from one material batch to another. The initial condition of the material, together with the processing parameters, defines the process and product quality. A high percentage of technical polymers processed in injection molding are refined in a follow-up production step, for example electroplating. Processing optimized for electroplating often requires avoiding high shear stresses by using low injection speed and pressure conditions. Therefore, differences in the viscosity of material charges occur especially in the quality-related low-shear-rate region. These differences and their quality-related influences can be investigated by detailed rheological analysis and process simulation based on adapted material-describing models. Differences in viscosity between batches can be detected by measurements with high-pressure capillary rheometers or, for low shear rates, oscillatory rheometers. A combination of both measurement techniques is possible through the Cox-Merz relation. The detected differences in the rheological behavior of both charges are summarized in two material-behavior model approaches and added to the simulation. In this paper, the results of processing simulations with standard filling parameters are presented for two ABS charges. Part-quality-defining quantities such as temperature, pressure, and shear stress are investigated, and the influence of charge variations is pointed out with respect to electroplating quality demands. Furthermore, the results of simulations with a new quality-related process control are presented and compared to standard processing.
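For reference, the Cox-Merz relation used above to merge capillary and oscillatory measurements into one flow curve is the empirical rule:

```latex
% Cox-Merz rule: the steady-shear viscosity at shear rate \dot\gamma matches
% the magnitude of the complex viscosity at angular frequency \omega = \dot\gamma.
\eta(\dot{\gamma}) \approx \left| \eta^{*}(\omega) \right|_{\omega = \dot{\gamma}}
  = \left. \frac{\sqrt{(G')^{2} + (G'')^{2}}}{\omega} \right|_{\omega = \dot{\gamma}}
```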
Color visual simulation applications at the Defense Mapping Agency
NASA Astrophysics Data System (ADS)
Simley, J. D.
1984-09-01
The Defense Mapping Agency (DMA) produces the Digital Landmass System data base to provide culture and terrain data in support of numerous aircraft simulators. In order to conduct data base and simulation quality control and requirements analysis, DMA has developed the Sensor Image Simulator which can rapidly generate visual and radar static scene digital simulations. The use of color in visual simulation allows the clear portrayal of both landcover and terrain data, whereas the initial black and white capabilities were restricted in this role and thus found limited use. Color visual simulation has many uses in analysis to help determine the applicability of current and prototype data structures to better meet user requirements. Color visual simulation is also significant in quality control since anomalies can be more easily detected in natural appearing forms of the data. The realism and efficiency possible with advanced processing and display technology, along with accurate data, make color visual simulation a highly effective medium in the presentation of geographic information. As a result, digital visual simulation is finding increased potential as a special purpose cartographic product. These applications are discussed and related simulation examples are presented.
Bei, Naifang; Li, Guohui; Meng, Zhiyong; Weng, Yonghui; Zavala, Miguel; Molina, L T
2014-11-15
The purpose of this study is to investigate the impact of using an ensemble Kalman filter (EnKF) on air quality simulations in the California-Mexico border region on two days (May 30 and June 4, 2010) during Cal-Mex 2010. The uncertainties in ozone (O3) and aerosol simulations in the border area due to the meteorological initial uncertainties were examined through ensemble simulations. The ensemble spread of surface O3 averaged over the coastal region was less than 10 ppb. The spreads in the nitrate and ammonium aerosols are substantial on both days, mostly caused by the large uncertainties in the surface temperature and humidity simulations. In general, the forecast initialized with the EnKF analysis (EnKF) improved the simulation of meteorological fields to some degree in the border region compared to the reference forecast initialized with NCEP analysis data (FCST) and the simulation with observation nudging (FDDA), which in turn led to more reasonable air quality simulations. The simulated surface O3 distributions by EnKF were consistently better than FCST and FDDA on both days. EnKF usually produced more reasonable simulations of nitrate and ammonium aerosols compared to the observations, but still had difficulties in improving the simulations of organic and sulfate aerosols. However, discrepancies between the EnKF simulations and the measurements were still considerably large, particularly for sulfate and organic aerosols, indicating that there is still ample room for improvement in the present data assimilation and/or the modeling systems. Copyright © 2014 Elsevier B.V. All rights reserved.
Physical habitat simulation system reference manual: version II
Milhous, Robert T.; Updike, Marlys A.; Schneider, Diane M.
1989-01-01
There are four major components of a stream system that determine the productivity of the fishery (Karr and Dudley 1978). These are: (1) flow regime, (2) physical habitat structure (channel form, substrate distribution, and riparian vegetation), (3) water quality (including temperature), and (4) energy inputs from the watershed (sediments, nutrients, and organic matter). The complex interaction of these components determines the primary production, secondary production, and fish population of the stream reach. The basic components and interactions needed to simulate fish populations as a function of management alternatives are illustrated in Figure I.1. The assessment process utilizes a hierarchical and modular approach combined with computer simulation techniques. The modular components represent the "building blocks" for the simulation. The quality of the physical habitat is a function of flow and, therefore, varies in quality and quantity over the range of the flow regime. The conceptual framework of the Incremental Methodology and guidelines for its application are described in "A Guide to Stream Habitat Analysis Using the Instream Flow Incremental Methodology" (Bovee 1982). Simulation of physical habitat is accomplished using the physical structure of the stream and streamflow. The modification of physical habitat by temperature and water quality is analyzed separately from physical habitat simulation. Temperature in a stream varies with the seasons, local meteorological conditions, stream network configuration, and the flow regime; thus, the temperature influences on habitat must be analyzed on a stream system basis. Water quality under natural conditions is strongly influenced by climate and geological materials, with the result that there is considerable natural variation in water quality. When we add the activities of man, the possible range of water quality conditions becomes rather large. Consequently, water quality must also be analyzed on a stream system basis. Such analysis is outside the scope of this manual, which concentrates on simulation of physical habitat based on depth, velocity, and a channel index. The results from PHABSIM can be used alone or with a series of habitat time series programs that have been developed to generate monthly or daily habitat time series from the Weighted Usable Area versus streamflow table resulting from the habitat simulation programs and streamflow time series data. Monthly and daily streamflow time series may be obtained from USGS gages near the study site or as the output of river system management models.
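A minimal sketch of the Weighted Usable Area calculation that PHABSIM's habitat programs tabulate against streamflow, using the common multiplicative composite suitability; all cell values are hypothetical:

```python
import numpy as np

def weighted_usable_area(areas, depth_si, vel_si, ci_si):
    """PHABSIM-style Weighted Usable Area for one flow:
    WUA = sum_i A_i * f(depth_i) * f(velocity_i) * f(channel_index_i),
    with the multiplicative composite suitability option."""
    csf = np.asarray(depth_si) * np.asarray(vel_si) * np.asarray(ci_si)
    return float(np.sum(np.asarray(areas) * csf))

# Hypothetical cells of one transect; suitabilities in [0, 1] are looked up
# from species habitat-suitability curves beforehand.
areas = [12.0, 8.5, 15.0, 9.0]        # m^2
depth = [0.9, 0.6, 0.3, 0.8]
vel   = [0.7, 1.0, 0.4, 0.2]
ci    = [1.0, 0.8, 0.9, 1.0]
print(f"WUA = {weighted_usable_area(areas, depth, vel, ci):.1f} m^2 at this flow")
```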
NASA Astrophysics Data System (ADS)
Ngan, Fong; Byun, Daewon; Kim, Hyuncheol; Lee, Daegyun; Rappenglück, Bernhard; Pour-Biazar, Arastoo
2012-07-01
To achieve more accurate meteorological inputs than were used in the daily forecast for studying TexAQS 2006 air quality, retrospective simulations were conducted using objective analysis and 3D/surface analysis nudging with surface and upper-air observations. Modeled ozone using the assimilated meteorological fields, with improved wind fields, shows better agreement with the observations than the forecast results. In post-frontal conditions, the important factors for ozone modeling in terms of wind patterns are the weak easterlies in the morning, which bring industrial emissions to the city, and the subsequent clockwise turning of the wind direction induced by the Coriolis force superimposed on the sea breeze, which keeps pollutants in the urban area. Objective analysis and nudging employed in the retrospective simulation minimize the wind bias but are not able to compensate for the general flow pattern biases inherited from large-scale inputs. By using alternative analysis data for initializing the meteorological simulation, the model can reproduce the flow pattern and place the ozone peak location closer to reality. Inaccurate simulation of precipitation and cloudiness occasionally causes over-prediction of ozone. Since there are limitations in the meteorological model's ability to simulate precipitation and cloudiness in a fine-scale domain (less than 4-km grid), satellite-based cloud data are an alternative way to provide the necessary inputs for the retrospective study of air quality.
Analysis of the ecological water diversion project in Wenzhou City
NASA Astrophysics Data System (ADS)
Xu, Haibo; Fu, Lei; Lin, Tong
2018-02-01
As a developed city in China, Wenzhou City has suffered from poor water quality for years. In order to improve the river network water quality, an ecological water diversion project was designed and executed by the regional government. In this study, an investigation and analysis of the regional ecological water diversion project is made for the purpose of examining the water quality improvements. A numerical model is also established, and different water diversion flow rates and sewer interception levels are considered during the simulation. Simulation results reveal that a higher flow rate and sewer interception level will greatly improve the river network water quality in Wenzhou City. The importance of the flow rate and interception level has been demonstrated, and future work will focus on increasing the flow rate and upgrading the sewer interception level.
Technical Highlight: NREL Improves Building Energy Simulation Programs Through Diagnostic Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polly, B.
2012-01-09
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market.
Social Indicators, Policy Analysis and Simulation.
ERIC Educational Resources Information Center
Little, Dennis
1972-01-01
A secondary-level simulation designed to demonstrate the impact of policy, values, and technological and societal developments upon the quality of life within a hypothetical state is described. See related document ED 064 865 for availability of the actual game. By simulating the evaluation of policies in terms of social indicators, STAPOL (State…
Evaluation of the Community Multiscale Air Quality model version 5.1
The Community Multiscale Air Quality model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Atmospheric Modeling and Analysis Division (AMAD) of the U.S. Environment...
Development of a simulation model of semi-active suspension for monorail
NASA Astrophysics Data System (ADS)
Hasnan, K.; Didane, D. H.; Kamarudin, M. A.; Bakhsh, Qadir; Abdulmalik, R. E.
2016-11-01
The new Kuala Lumpur Monorail Fleet Expansion Project (KLMFEP) uses semi-active technology in its suspension system. It is recognized that the suspension system influences the ride quality, so one way to further improve ride quality is to fine-tune the semi-active suspension system on the new KL Monorail. The potential of the semi-active suspension to improve ride quality could be exploited further. Hence, a simulation model is required to act as a platform for testing the design of a complete suspension system, particularly for investigating ride comfort performance. MSC Adams software was chosen as the tool to develop the simulation platform, in which all parameters and data are represented by mathematical equations, with the new KL Monorail as the reference model. In the simulation, the model was subjected to a step disturbance on the guideway for stability and ride comfort analysis. The stability analysis shows the monorail to be in a stable condition, and the ride comfort analysis yields a Rating 1 classification under ISO 2631 (very comfortable). The model is also adjustable, flexible, and understandable by engineers within the field for the purpose of further development.
Web-Based Predictive Analytics to Improve Patient Flow in the Emergency Department
NASA Technical Reports Server (NTRS)
Buckler, David L.
2012-01-01
The Emergency Department (ED) simulation project was established to demonstrate how requirements-driven analysis and process simulation can help improve the quality of patient care for the Veterans Health Administration's (VHA) Veterans Affairs Medical Centers (VAMC). This project developed a web-based simulation prototype of patient flow in EDs, validated the performance of the simulation against operational data, and documented IT requirements for the ED simulation.
Weighted analysis of paired microarray experiments.
Kristiansson, Erik; Sjögren, Anders; Rudemo, Mats; Nerman, Olle
2005-01-01
In microarray experiments quality often varies, for example between samples and between arrays. The need for quality control is therefore strong. A statistical model and a corresponding analysis method are suggested for experiments with pairing, including designs with individuals observed before and after treatment and many experiments with two-colour spotted arrays. The model is of mixed type with some parameters estimated by an empirical Bayes method. Differences in quality are modelled by individual variances and correlations between repetitions. The method is applied to three real and several simulated datasets. Two of the real datasets are of Affymetrix type with patients profiled before and after treatment, and the third dataset is of two-colour spotted cDNA type. In all cases, the patients or arrays had different estimated variances, leading to distinctly unequal weights in the analysis. We also suggest plots which illustrate the variances and correlations that affect the weights computed by our analysis method. For simulated data the improvement relative to previously published methods without weighting is shown to be substantial.
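The core weighting idea can be shown with a toy inverse-variance sketch; the paper's mixed model and empirical Bayes estimation are more elaborate, and all numbers below are hypothetical:

```python
import numpy as np

def weighted_paired_estimate(diffs, variances):
    """Inverse-variance weighting of paired (after - before) log-expression
    differences, one variance per patient/array to reflect unequal quality."""
    d, v = np.asarray(diffs, float), np.asarray(variances, float)
    w = 1.0 / v
    est = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, est / se                      # estimate and z-like statistic

# Hypothetical: five patients, one gene; patient 3 is a noisy array and
# therefore receives a small weight.
diffs = [0.8, 0.6, 2.5, 0.7, 0.9]
vars_ = [0.1, 0.1, 1.5, 0.2, 0.1]
print(weighted_paired_estimate(diffs, vars_))
```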
Cost-effectiveness analysis of online hemodiafiltration versus high-flux hemodialysis.
Ramponi, Francesco; Ronco, Claudio; Mason, Giacomo; Rettore, Enrico; Marcelli, Daniele; Martino, Francesca; Neri, Mauro; Martin-Malo, Alejandro; Canaud, Bernard; Locatelli, Francesco
2016-01-01
Clinical studies suggest that hemodiafiltration (HDF) may lead to better clinical outcomes than high-flux hemodialysis (HF-HD), but concerns have been raised about the cost-effectiveness of HDF versus HF-HD. The aim of this study was to investigate whether clinical benefits, in terms of longer survival and better health-related quality of life, are worth the possibly higher costs of HDF compared to HF-HD. The analysis comprised a simulation based on the combined results of previous published studies, with the following steps: 1) estimation of the survival function of HF-HD patients from a clinical trial and of HDF patients using the risk reduction estimated in a meta-analysis; 2) simulation of the survival of the same sample of patients as if allocated to HF-HD or HDF using three-state Markov models; and 3) application of state-specific health-related quality of life coefficients and differential costs derived from the literature. Several Monte Carlo simulations were performed, including simulations for patients with different risk profiles, for example, by age (patients aged 40, 50, and 60 years), sex, and diabetic status. Scatter plots of simulations in the cost-effectiveness plane were produced, incremental cost-effectiveness ratios were estimated, and cost-effectiveness acceptability curves were computed. An incremental cost-effectiveness ratio of €6,982/quality-adjusted life years (QALY) was estimated for the baseline cohort of 50-year-old male patients. Given the commonly accepted threshold of €40,000/QALY, HDF is cost-effective. The probabilistic sensitivity analysis showed that HDF is cost-effective with a probability of ~81% at a threshold of €40,000/QALY. It is fundamental to measure the outcome also in terms of quality of life. HDF is more cost-effective for younger patients. HDF can be considered cost-effective compared to HF-HD.
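A compact cohort sketch of the three-state Markov/ICER machinery described above; transition probabilities, utilities, and costs are illustrative placeholders, not the study's parameters, and discounting is omitted for brevity:

```python
import numpy as np

# Three-state cohort Markov model: stable on dialysis, hospitalised, dead.
P_HD  = np.array([[0.88, 0.07, 0.05],    # annual transition matrix, HF-HD
                  [0.60, 0.25, 0.15],
                  [0.00, 0.00, 1.00]])
P_HDF = np.array([[0.90, 0.06, 0.04],    # HDF: mortality risk reduced
                  [0.63, 0.24, 0.13],
                  [0.00, 0.00, 1.00]])
UTIL  = np.array([0.78, 0.55, 0.0])      # HRQoL weight per state
COST  = np.array([45_000, 60_000, 0.0])  # annual cost per state (EUR)
HDF_EXTRA = 2_000                        # annual add-on cost of HDF

def run(P, extra_cost, years=30):
    s = np.array([1.0, 0.0, 0.0])        # whole cohort starts stable
    qaly = cost = 0.0
    for _ in range(years):
        qaly += s @ UTIL
        cost += s @ COST + (s[0] + s[1]) * extra_cost
        s = s @ P                        # advance the cohort one year
    return qaly, cost

q_hd, c_hd = run(P_HD, 0.0)
q_hdf, c_hdf = run(P_HDF, HDF_EXTRA)
print(f"ICER = {(c_hdf - c_hd) / (q_hdf - q_hd):,.0f} EUR/QALY")
```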
[Review on HSPF model for simulation of hydrology and water quality processes].
Li, Zhao-fu; Liu, Hong-Yu; Li, Yan
2012-07-01
Hydrological Simulation Program-FORTRAN (HSPF), written in FORTRAN, is one of the best semi-distributed hydrology and water quality models; it was first developed based on the Stanford Watershed Model. Many studies on HSPF model application have been conducted. It can represent the contributions of sediment, nutrients, pesticides, conservative constituents, and fecal coliforms from agricultural areas; continuously simulate water quantity and quality processes; and capture the effects of climate change and land use change on water quantity and quality. HSPF consists of three basic application components: PERLND (Pervious Land Segment), IMPLND (Impervious Land Segment), and RCHRES (free-flowing reach or mixed reservoirs). In general, HSPF has extensive application in the modeling of hydrology and water quality processes and the analysis of climate change and land use change. However, it has seen limited use in China. The main problems with HSPF include: (1) some algorithms and procedures still need revision; (2) due to the high standard for input data, the accuracy of the model is limited by spatial and attribute data; (3) the model is only applicable to the simulation of well-mixed rivers, reservoirs, and one-dimensional water bodies, so it must be integrated with other models to solve more complex problems. At present, studies on HSPF model development are still ongoing, including revision of the model platform, extension of model functions, method development for model calibration, and analysis of parameter sensitivity. With the accumulation of basic data and improvement of data sharing, the HSPF model will be applied more extensively in China.
Zuo, Qiting; Chen, Hao; Dou, Ming; Zhang, Yongyong; Li, Dongfeng
2015-07-01
Impact assessment of sluice regulation on water quality is one of the crucial tasks in present river management. However, research difficulties remain because of insufficient in situ data and numerous influencing factors in aquatic environments. The Huaidian Sluice, the main control sluice of the Shaying River, China, was selected for this study. Three field experimental programs were designed and carried out to analyze spatial and temporal variations in water quality parameters under various sluice regulation conditions and to explore the impacts of regulation mechanisms on water quality. Monitoring data were used to simulate water quality under different scenarios with the water quality analysis simulation program (WASP). Results demonstrate that the influences of sluice regulation on permanganate index (CODMn) and ammonia nitrogen (NH4-N) concentrations (indicators of water quality) were complex and nonlinear and showed different trends of increase or decrease under different regulation modes. Gate openings of different widths and different flow rates affected CODMn and NH4-N concentrations differently. Monitoring results and numerical simulation results indicate that the sluice opening should be small. Flow discharge through the sluice should be greater than 10 m³/s and less than 60 m³/s to maintain low CODMn concentrations, and discharge should be low (e.g., 14 m³/s) to maintain low NH4-N concentrations. This research provides an experimental basis for further research on the construction of water quality models and for the development of reasonable regulations on water quality and quantity.
Srinivas, Rallapalli; Singh, Ajit Pratap
2018-03-01
Assessment of the water quality status of a river with respect to its discharge has become a prerequisite to sustainable river basin management. The present paper develops an integrated model for simulating and evaluating strategies for water quality management in a river basin by controlling point source pollutant loadings and operations of multi-purpose projects. The Water Quality Analysis Simulation Program (WASP version 8.0) has been used for modeling the transport of pollutant loadings and their impact on water quality in the river. The study presents a novel approach of integrating fuzzy set theory with an "advanced eutrophication" model to simulate the transmission and distribution of several interrelated water quality variables and their bio-physiochemical processes in an effective manner in the Ganges river basin, India. After calibration, simulated values are compared with the observed values to validate the model's robustness. The fuzzy technique for order preference by similarity to ideal solution (F-TOPSIS) has been used to incorporate the uncertainty associated with the water quality simulation results. The model also simulates five different scenarios for pollution reduction, to determine the maximum pollutant loadings during monsoon and dry periods. The final results clearly indicate how modeled reduction in the rate of wastewater discharge has reduced the impacts of pollutants downstream. Scenarios suggesting a river discharge rate of 1500 m³/s during the lean period, in addition to 25 and 50% reductions in the load rate, are found to be the most effective option to restore the quality of the river Ganges. Thus, the model serves as an important hydrologic tool for policy makers by suggesting appropriate remediation action plans.
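The ranking step can be illustrated with crisp TOPSIS, the non-fuzzy core of F-TOPSIS; the criteria, weights, and scenario scores below are hypothetical:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (TOPSIS).
    matrix: alternatives x criteria; benefit[j] is True if larger is better."""
    m = np.asarray(matrix, float)
    r = m / np.linalg.norm(m, axis=0)            # vector normalisation
    v = r * np.asarray(weights, float)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)               # closeness in [0, 1]

# Hypothetical scenario screening: rows = 5 remediation scenarios,
# columns = [DO (benefit), BOD (cost), reuse volume (benefit)].
scores = topsis(
    [[6.5, 8.0, 20], [7.0, 6.5, 25], [7.8, 5.0, 15], [6.9, 7.2, 30], [7.5, 5.8, 28]],
    weights=[0.4, 0.4, 0.2],
    benefit=[True, False, True],
)
print(scores.round(3), "-> best scenario:", int(scores.argmax()))
```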
NASA Technical Reports Server (NTRS)
Crane, D. F.
1984-01-01
When human operators are performing precision tracking tasks, their dynamic response can often be modeled by quasilinear describing functions. That fact permits analysis of the effects of delay in certain man-machine control systems using linear control system analysis techniques. The analysis indicates that a reduction in system stability is the immediate effect of additional control system delay, and that system characteristics moderate or exaggerate the importance of the delay. A selection of data (simulator and flight test) consistent with the analysis is reviewed. Flight simulator visual-display delay compensation, designed to restore pilot-aircraft system stability, was evaluated in several studies which are reviewed here. The studies range from single-axis, tracking-task experiments (with sufficient subjects and trials to establish the statistical significance of the results) to a brief evaluation of compensation of a computer generated imagery (CGI) visual display system in a full six-degree-of-freedom simulation. The compensation was effective: improvements in pilot performance and workload or aircraft handling qualities rating (HQR) were observed. Results from recent aircraft handling qualities research literature, which support the compensation design approach, are also reviewed.
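The stability loss has a simple quantitative core: a pure delay τ adds phase lag ωτ at every frequency without changing gain, so it eats directly into phase margin at crossover. A one-line check with illustrative numbers, not values from the studies:

```python
import numpy as np

tau = 0.100                      # added visual-display delay (s), hypothetical
omega_c = 3.0                    # pilot-vehicle crossover frequency (rad/s), hypothetical
phase_loss_deg = np.degrees(omega_c * tau)
print(f"phase margin lost: {phase_loss_deg:.1f} deg")   # about 17 deg at 3 rad/s
```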
Changes in vegetation cover associated with urban planning efforts may affect regional meteorology and air quality. Here we use a comprehensive coupled meteorology-air quality model (WRF-CMAQ) to simulate the influence of planned land use changes from green infrastructure impleme...
Estimating short-period dynamics using an extended Kalman filter
NASA Technical Reports Server (NTRS)
Bauer, Jeffrey E.; Andrisani, Dominick
1990-01-01
An extended Kalman filter (EKF) is used to estimate the parameters of a low-order model from aircraft transient response data. The low-order model is a state space model derived from the short-period approximation of the longitudinal aircraft dynamics. The model corresponds to the pitch rate to stick force transfer function currently used in flying qualities analysis. Because of the model chosen, handling qualities information is also obtained. The parameters are estimated from flight data as well as from a six-degree-of-freedom, nonlinear simulation of the aircraft. These two estimates are then compared and the discrepancies noted. The low-order model is able to satisfactorily match both flight data and simulation data from a high-order computer simulation. The parameters obtained from the EKF analysis of flight data are compared to those obtained using frequency response analysis of the flight data. Time delays and damping ratios are compared and are in agreement. This technique demonstrates the potential to determine, in near real time, the extent of differences between computer models and the actual aircraft. Precise knowledge of these differences can help to determine the flying qualities of a test aircraft and lead to more efficient envelope expansion.
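A toy sketch of joint state/parameter estimation with an EKF, using a scalar first-order system as a stand-in for fitting short-period model coefficients to transient-response data; all values are illustrative:

```python
import numpy as np

# Augment the state with the unknown parameter a of x[k+1] = a*x[k] + b*u[k].
rng = np.random.default_rng(0)
a_true, b, n = 0.92, 0.5, 300
u = np.sign(np.sin(0.05 * np.arange(n)))            # doublet-like input
x = 0.0
z = np.array([0.0, 0.5])                            # state estimate [x, a]
P = np.diag([1.0, 1.0])
Q = np.diag([1e-4, 1e-6])                           # slow random walk on a
R = np.array([[0.01]])
H = np.array([[1.0, 0.0]])                          # we measure x only

for k in range(n):
    x = a_true * x + b * u[k]
    y = x + rng.normal(0, 0.1)                      # noisy measurement
    # Predict: f(z) = [a*x + b*u, a], with Jacobian F evaluated at z.
    F = np.array([[z[1], z[0]], [0.0, 1.0]])
    z = np.array([z[1] * z[0] + b * u[k], z[1]])
    P = F @ P @ F.T + Q
    # Update with the scalar measurement.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    z = z + (K @ (np.array([[y]]) - H @ z.reshape(2, 1))).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated a = {z[1]:.3f} (true {a_true})")
```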
Assessment of the effects of horizontal grid resolution on long ...
The objective of this study is to determine the adequacy of using a relatively coarse horizontal resolution (i.e. 36 km) to simulate long-term trends of pollutant concentrations and radiation variables with the coupled WRF-CMAQ model. WRF-CMAQ simulations over the continental United State are performed over the 2001 to 2010 time period at two different horizontal resolutions of 12 and 36 km. Both simulations used the same emission inventory and model configurations. Model results are compared both in space and time to assess the potential weaknesses and strengths of using coarse resolution in long-term air quality applications. The results show that the 36 km and 12 km simulations are comparable in terms of trends analysis for both pollutant concentrations and radiation variables. The advantage of using the coarser 36 km resolution is a significant reduction of computational cost, time and storage requirement which are key considerations when performing multiple years of simulations for trend analysis. However, if such simulations are to be used for local air quality analysis, finer horizontal resolution may be beneficial since it can provide information on local gradients. In particular, divergences between the two simulations are noticeable in urban, complex terrain and coastal regions. The National Exposure Research Laboratory’s Atmospheric Modeling Division (AMAD) conducts research in support of EPA’s mission to protect human health and the environment.
Yaguchi, Shigeo; Nishihara, Hitoshi; Kambhiranond, Waraporn; Stanley, Daniel; Apple, David
2008-01-01
To investigate the surface light scatter and optical quality of AcrySof lenses (Alcon Laboratories, Inc., Fort Worth, TX) following simulated aging of 20 years. AcrySof lenses were exposed to exaggerated thermal conditions to simulate up to 20 years of aging and were tested for surface light scatter and optical quality (modulation transfer function). There were no significant differences from baseline for either the surface light scatter or optical quality of the lenses over time. The current study demonstrated that surface light scatter on AcrySof lenses did not increase under conditions simulating 20 years of aging. Because the simulated aging environment contained no protein, this work indirectly supports the finding that surface light scatter is due to the deposition of a biomaterial on the lens surface rather than changes in the material. Optical performance integrity of the test lenses was maintained under severe environmental conditions.
POLYVIEW-MM: web-based platform for animation and analysis of molecular simulations
Porollo, Aleksey; Meller, Jaroslaw
2010-01-01
Molecular simulations offer important mechanistic and functional clues in studies of proteins and other macromolecules. However, interpreting the results of such simulations increasingly requires tools that can combine information from multiple structural databases and other web resources, and provide highly integrated and versatile analysis tools. Here, we present a new web server that integrates high-quality animation of molecular motion (MM) with structural and functional analysis of macromolecules. The new tool, dubbed POLYVIEW-MM, enables animation of trajectories generated by molecular dynamics and related simulation techniques, as well as visualization of alternative conformers, e.g. obtained as a result of protein structure prediction methods or small molecule docking. To facilitate structural analysis, POLYVIEW-MM combines interactive view and analysis of conformational changes using Jmol and its tailored extensions, publication quality animation using PyMol, and customizable 2D summary plots that provide an overview of MM, e.g. in terms of changes in secondary structure states and relative solvent accessibility of individual residues in proteins. Furthermore, POLYVIEW-MM integrates visualization with various structural annotations, including automated mapping of known interaction sites from structural homologs, mapping of cavities and ligand binding sites, transmembrane regions and protein domains. URL: http://polyview.cchmc.org/conform.html. PMID:20504857
Incorporating principal component analysis into air quality model evaluation
The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Princi...
WASP TRANSPORT MODELING AND WASP ECOLOGICAL MODELING
A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...
Auerbach, Marc; Roney, Linda; Aysseh, April; Gawel, Marcie; Koziel, Jeannette; Barre, Kimberly; Caty, Michael G; Santucci, Karen
2014-12-01
This study aimed to evaluate the feasibility and measure the impact of an in situ interdisciplinary pediatric trauma quality improvement simulation program. Twenty-two monthly simulations were conducted in a tertiary care pediatric emergency department with the aim of improving the quality of pediatric trauma care (February 2010 to November 2012). Each session included 20 minutes of simulated patient care, followed by 30 minutes of debriefing that focused on teamwork, communication, and the identification of gaps in care. A single rater scored the performance of the team in real time using a validated assessment instrument for 6 subcomponents of care (teamwork, airway, intubation, breathing, circulation, and disability). Participants completed a survey and written feedback forms. A trend analysis of the 22 simulations found statistically significant positive trends for overall performance, teamwork, and intubation subcomponents; the strength of the upward trend was the strongest for teamwork (τ = 0.512), followed by overall performance (τ = 0.488) and intubation (τ = 0.433). Two hundred fifty-one of 398 participants completed the participant feedback form (response rate, 63%), reporting that debriefing was the most valuable aspect of the simulation. An in situ interdisciplinary pediatric trauma simulation quality improvement program resulted in improved validated trauma simulation assessment scores for overall performance, teamwork, and intubation. Participants reported high levels of satisfaction with the program, and debriefing was reported as the most valuable component of the program.
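The reported τ values come from a Kendall rank-correlation trend test between session order and score; a short sketch of how such a test can be run (the scores below are synthetic, not the study's data):

```python
import numpy as np
from scipy.stats import kendalltau

# Illustrative trend analysis: Kendall's tau between session order and a
# performance score across 22 monthly simulations (synthetic data).
rng = np.random.default_rng(0)
sessions = np.arange(1, 23)
scores = 0.5 * sessions + rng.normal(0, 3, size=22)   # upward trend + noise

tau, p_value = kendalltau(sessions, scores)
print(f"Kendall tau = {tau:.3f}, p = {p_value:.4f}")  # tau > 0 -> improving trend
```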
Meeting in Turkey: WASP Transport Modeling and WASP Ecological Modeling
A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...
Meeting in Korea: WASP Transport Modeling and WASP Ecological Modeling
A combination of lectures, demonstrations, and hands-on exercises will be used to introduce pollutant transport modeling with the U.S. EPA's general water quality model, WASP (Water Quality Analysis Simulation Program). WASP features include a user-friendly Windows-based interfa...
The Community Multiscale Air Quality (CMAQ) modeling system has recently been adapted to simulate the emission, transport, transformation and deposition of atmospheric mercury in three distinct forms; elemental mercury gas, reactive gaseous mercury, and particulate mercury. Emis...
Quality control analysis: part IV: field simulation of asphaltic concrete specifications.
DOT National Transportation Integrated Search
1969-02-01
The report presents some of the major findings, from a simulated study of statistical specifications, on three asphaltic concrete projects representing a total of approximately 30,000 tons of hot mix. The major emphasis of the study has been on the a...
NASA Astrophysics Data System (ADS)
Jang, Cheng-Shin; Chen, Ching-Fang; Liang, Ching-Ping; Chen, Jui-Sheng
2016-02-01
Overexploitation of groundwater is a common problem in the Pingtung Plain area of Taiwan, resulting in substantial drawdown of groundwater levels as well as the occurrence of severe seawater intrusion and land subsidence. Measures need to be taken to preserve these valuable groundwater resources. This study seeks to spatially determine the most suitable locations on this plain for using surface water, instead of extracted groundwater, for drinking, irrigation, and aquaculture purposes, based on a combination of groundwater quality analysis and a numerical flow simulation, assuming that planned manmade lakes and reservoirs increase the water supply. The multivariate indicator kriging method is first used to estimate occurrence probabilities and to rank townships as suitable or unsuitable for groundwater utilization according to water quality standards for drinking, irrigation, and aquaculture. A numerical model of groundwater flow (MODFLOW) is adopted to quantify the recovery of groundwater levels in townships, after model calibration, when groundwater for drinking and agricultural demands has been replaced by surface water. Finally, townships with poor groundwater quality and significant increases in groundwater levels in the Pingtung Plain are prioritized for groundwater conservation planning based on the combined assessment of groundwater quality and quantity. The results of this study indicate that the integration of groundwater quality analysis and the numerical flow simulation is capable of establishing sound strategies for joint groundwater and surface water use. Six southeastern townships are found to be suitable locations for replacing groundwater with surface water from manmade lakes or reservoirs to meet drinking, irrigation, and aquaculture demands.
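To make the indicator idea concrete, here is a heavily simplified sketch: observations are converted to 0/1 indicators of meeting a quality standard, and the probability of compliance is interpolated spatially. Inverse-distance weighting stands in for the paper's multivariate indicator kriging (which requires variogram modeling), and all coordinates, concentrations, and the threshold are assumed values.

```python
import numpy as np

# Simplified sketch of the indicator approach (not the paper's MIK code):
# transform observations to 0/1 indicators of meeting a standard, then
# interpolate the probability of compliance at unsampled points.
rng = np.random.default_rng(1)
wells = rng.uniform(0, 10, size=(30, 2))            # hypothetical well coords (km)
chloride = rng.lognormal(5.0, 0.6, size=30)         # hypothetical values (mg/L)
standard = 250.0                                    # assumed drinking threshold
indicator = (chloride <= standard).astype(float)    # 1 = meets the standard

def compliance_probability(pt, power=2.0):
    """IDW-weighted mean of indicators: estimated P(meets standard) at pt."""
    d = np.linalg.norm(wells - pt, axis=1) + 1e-9
    w = 1.0 / d**power
    return float(np.sum(w * indicator) / np.sum(w))

print(compliance_probability(np.array([5.0, 5.0])))
```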
ERIC Educational Resources Information Center
Little, Dennis; Feller, Richard
The Institute for the Future has been conducting research in technological and societal forecasting, social indicators, value change, and simulation gaming. This paper describes an effort to bring together parts of that research into a simulation game ("State Policy," or STAPOL) for analysis of the impact of government policy, social values, and…
NASA Astrophysics Data System (ADS)
Yu, Maolin; Du, R.
2005-08-01
Sheet metal stamping is one of the most commonly used manufacturing processes, and hence, much research has been carried out for economic gain. Searching through the literature, however, it is found that many problems remain unsolved. For example, it is well known that for the same press, the same workpiece material, and the same set of dies, the product quality may vary owing to a number of factors, such as the inhomogeneity of the workpiece material, loading error, and lubrication. Presently, few seem able to predict the quality variation, not to mention what contributes to it. As a result, trial-and-error is still needed on the shop floor, causing additional cost and time delay. This paper introduces a new approach to predict the product quality variation and identify the sensitive design/process parameters. The new approach is based on a combination of inverse Finite Element Modeling (FEM) and Monte Carlo simulation (more specifically, the Latin Hypercube Sampling (LHS) approach). With acceptable accuracy, the inverse FEM (also called one-step FEM) requires much less computation than the usual incremental FEM, and hence can be used to predict the quality variations under various conditions. LHS is a statistical sampling method through which the sensitivity analysis can be carried out. The result of the sensitivity analysis has clear physical meaning and can be used to optimize the die design and/or the process design. Two simulation examples are presented: drawing a rectangular box and drawing a two-step rectangular box.
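A sketch of the LHS half of such an approach, with a cheap surrogate standing in for the one-step FEM solver; the parameter names, ranges, and quality function are assumptions for illustration:

```python
import numpy as np
from scipy.stats import qmc

# LHS-based sensitivity sketch; the quality_metric function is a hypothetical
# stand-in for the inverse (one-step) FEM quality prediction.
def quality_metric(friction, blank_thickness, yield_stress):
    return 10 - 25 * friction + 2 * blank_thickness - 0.01 * yield_stress

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=200)
lo, hi = [0.05, 0.8, 180.0], [0.20, 1.2, 260.0]
X = qmc.scale(unit, lo, hi)                 # friction, thickness(mm), yield(MPa)
y = np.array([quality_metric(*row) for row in X])

# Rank parameters by |correlation| with the output as a simple sensitivity cue.
for name, col in zip(["friction", "thickness", "yield"], X.T):
    r = np.corrcoef(col, y)[0, 1]
    print(f"{name:10s} correlation with quality: {r:+.2f}")
```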
Improved first-order uncertainty method for water-quality modeling
Melching, C.S.; Anmangandla, S.
1992-01-01
Uncertainties are unavoidable in water-quality modeling and subsequent management decisions. Monte Carlo simulation and first-order uncertainty analysis (involving linearization at central values of the uncertain variables) have been frequently used to estimate probability distributions for water-quality model output due to their simplicity. Each method has its drawbacks: the main drawback of Monte Carlo simulation is computational time, while those of first-order analysis are questions of accuracy and representativeness, especially for nonlinear systems and extreme conditions. An improved (advanced) first-order method is presented, in which the linearization point varies to match the output level whose exceedance probability is sought. The advanced first-order method is tested on the Streeter-Phelps equation to estimate the probability distribution of critical dissolved-oxygen deficit and critical dissolved oxygen using two hypothetical examples from the literature. The advanced first-order method provides a close approximation of the exceedance probability for the Streeter-Phelps model output estimated by Monte Carlo simulation while using two orders of magnitude less computer time, regardless of the probability distributions assumed for the uncertain model parameters.
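For reference, a brief Monte Carlo sketch of the benchmark problem: propagating uncertain rate coefficients through the Streeter-Phelps critical-deficit solution. The distributions and values are illustrative, not those of the paper's examples.

```python
import numpy as np

# Monte Carlo on the Streeter-Phelps critical dissolved-oxygen deficit under
# uncertain rate coefficients (all distributions and values are assumptions).
rng = np.random.default_rng(42)
n = 20_000
kd = rng.lognormal(np.log(0.35), 0.2, n)   # deoxygenation rate (1/day), assumed
ka = rng.lognormal(np.log(0.70), 0.2, n)   # reaeration rate (1/day), assumed
L0, D0 = 12.0, 1.0                         # initial BOD and deficit (mg/L), assumed

# Critical time and the deficit there (valid where ka > kd and the log
# argument is positive; other draws are discarded below).
tc = np.log((ka / kd) * (1 - D0 * (ka - kd) / (kd * L0))) / (ka - kd)
Dc = (kd * L0 / (ka - kd)) * (np.exp(-kd * tc) - np.exp(-ka * tc)) \
     + D0 * np.exp(-ka * tc)

valid = np.isfinite(Dc) & (tc > 0)
print(f"P(critical deficit > 4 mg/L) = {np.mean(Dc[valid] > 4.0):.3f}")
```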
Feaster, Toby D.; Conrads, Paul
2000-01-01
In May 1996, the U.S. Geological Survey entered into a cooperative agreement with the Kershaw County Water and Sewer Authority to characterize and simulate the water quality in the Wateree River, South Carolina. Longitudinal profiling of dissolved-oxygen concentrations during the spring and summer of 1996 revealed dissolved-oxygen minimums occurring upstream from the point-source discharges. The mean dissolved-oxygen decrease upstream from the effluent discharges was 2.0 milligrams per liter, and the decrease downstream from the effluent discharges was 0.2 milligram per liter. Several theories were investigated to obtain an improved understanding of the dissolved-oxygen dynamics in the upper Wateree River. Data suggest that the dissolved-oxygen concentration decrease is associated with elevated levels of oxygen-consuming nutrients and metals that are flowing into the Wateree River from Lake Wateree. Analysis of long-term streamflow and water-quality data collected at two U.S. Geological Survey gaging stations suggests that no strong correlation exists between streamflow and dissolved-oxygen concentrations in the Wateree River. However, a strong negative correlation does exist between dissolved-oxygen concentrations and water temperature. Analysis of data from six South Carolina Department of Health and Environmental Control monitoring stations for 1980-95 revealed decreasing trends in ammonia nitrogen at all stations where data were available and decreasing trends in 5-day biochemical oxygen demand at three river stations. The influence of various hydrologic and point-source loading conditions on dissolved-oxygen concentrations in the Wateree River was determined by using results from water-quality simulations by the Branched Lagrangian Transport Model. The effects of five tributaries and four point-source discharges were included in the model. Data collected during two synoptic water-quality samplings on June 23-25 and August 11-13, 1997, were used to calibrate and validate the Branched Lagrangian Transport Model. The data include dye-tracer concentrations collected at six locations, stream-reaeration data collected at four locations, and water-quality and water-temperature data collected at nine locations. Hydraulic data for the Branched Lagrangian Transport Model were simulated by using the U.S. Geological Survey BRANCH one-dimensional, unsteady-flow model. Data that were used to calibrate and validate the BRANCH model included time-series of water-level and streamflow data at three locations. The domain of the hydraulic model and the transport model was a 57.3- and 43.5-mile reach of the river, respectively. A sensitivity analysis of the simulated dissolved-oxygen concentrations to model coefficients and data inputs indicated that the simulated dissolved-oxygen concentrations were most sensitive to changes in the boundary concentration inputs of water temperature and dissolved oxygen followed by sensitivity to the change in streamflow. A 35-percent increase in streamflow resulted in a negative normalized sensitivity index, indicating a decrease in dissolved-oxygen concentrations. The simulated dissolved-oxygen concentrations showed no significant sensitivity to changes in model input rate kinetics. To demonstrate the utility of the Branched Lagrangian Transport Model of the Wateree River, the model was used to simulate several hydrologic and water-quality scenarios to evaluate the effects on simulated dissolved-oxygen concentrations.
The first scenario compared the 24-hour mean dissolved-oxygen concentrations for August 13, 1997, as simulated during the model validation, with simulations using two different streamflow patterns. The mean streamflow for August 13, 1997, was 2,000 cubic feet per second. Simulations were run using mean streamflows of 1,000 and 1,400 cubic feet per second while keeping the water-quality boundary conditions the same as were used during the validation simulations. When compared t
Under the Toxic Substances Control Act (TSCA), the Environmental Protection Agency (EPA) is required to perform new chemical reviews of nanomaterials identified in premanufacture notices. However, environmental fate models developed for traditional contaminants are limited in the...
Global sensitivity analysis for UNSATCHEM simulations of crop production with degraded waters
USDA-ARS?s Scientific Manuscript database
One strategy for maintaining irrigated agricultural productivity in the face of diminishing resource availability is to make greater use of marginal quality waters and lands. A key to sustaining systems using degraded irrigation waters is salinity management. Advanced simulation models and decision ...
Space Shuttle flying qualities and flight control system assessment study, phase 2
NASA Technical Reports Server (NTRS)
Myers, T. T.; Johnston, D. E.; Mcruer, D. T.
1983-01-01
A program of flying qualities experiments as part of the Orbiter Experiments Program (OEX) is defined. Phase 1, published as CR-170391, reviewed flying qualities criteria and shuttle data. The review of applicable experimental and shuttle data to further define the OEX plan is continued. An unconventional feature of this approach is the use of pilot strategy model identification to relate flight and simulator results. Instrumentation, software, and data analysis techniques for pilot model measurements are examined. The relationship between shuttle characteristics and superaugmented aircraft is established. STS flights 1 through 4 are reviewed from the point of view of flying qualities. A preliminary plan for a coordinated program of inflight and simulator research is presented.
Holistic Nursing Simulation: A Concept Analysis.
Cohen, Bonni S; Boni, Rebecca
2018-03-01
Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns with simulation have been raised regarding the integration of the nursing process and recognizing the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise to administer that care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by defining antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation by developing a language and model to provide direction for educators in the design and development of holistic nursing simulation.
NASA Astrophysics Data System (ADS)
Yang, Feng; Zhang, Xiaofang; Huang, Yu; Hao, Weiwei; Guo, Baiwei
2012-11-01
Satellite platform vibration degrades image quality, so it is necessary to study its influence. The forms of satellite platform vibration include linear vibration, sinusoidal vibration, and random vibration. Based on Matlab and Zemax, a simulation system has been developed to simulate the impact of satellite platform vibration on image quality. Dynamic Data Exchange is used for communication between Matlab and Zemax. The sinusoidal vibration data are produced from a sinusoidal curve with specified amplitude and frequency. The random vibration data are obtained by combining sinusoidal signals with frequencies of 10 Hz, 100 Hz, and 200 Hz, amplitudes of 100, 12, and 1.9, and zero-mean white noise. The satellite platform vibration data produced by Matlab are added to the optical system, and its point spread function (PSF) is obtained by Zemax. The blurred image is obtained by convolving the PSF with the original image. The sharpness of the original and blurred images is evaluated using the average gradient of the image gray values. The impacts of sinusoidal and random vibration in six degrees of freedom on image quality are simulated separately. The simulation results reveal that decenter in the X-, Y-, and Z-directions and tilt about the Z-axis have little effect on image quality, while tilt about the X- and Y-axes degrades it seriously. Thus, it can be concluded that correcting the error of satellite platform vibration with a fast-steering mirror (FSM) is a viable and effective approach.
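A compact sketch of the blur-and-evaluate step described above: convolve an image with a point spread function and compare sharpness via the average gray-level gradient. The PSF here is a simple linear smear standing in for the Zemax-computed one, and the image is synthetic.

```python
import numpy as np
from scipy.ndimage import convolve

# Blur an image with a motion PSF and compare average-gradient sharpness.
def average_gradient(img):
    gx, gy = np.gradient(img.astype(float))
    return np.mean(np.sqrt((gx**2 + gy**2) / 2.0))

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, size=(128, 128))        # stand-in for a real scene

psf = np.zeros((9, 9))
psf[4, :] = 1.0                                     # horizontal smear (assumed motion)
psf /= psf.sum()                                    # normalize energy

blurred = convolve(image, psf, mode="reflect")
print(f"sharpness: original {average_gradient(image):.2f}, "
      f"blurred {average_gradient(blurred):.2f}")   # blurred value is lower
```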
Simulated Watershed Mercury and Nitrate Flux Responses to Multiple Land Cover Conversion Scenarios
Water quality and toxic exposure science is transitioning towards analysis of multiple stressors rather than one particular environmental concern (e.g., mercury) or a group of similarly reacting chemicals (e.g., nutrients). However, two of the most important water quality constit...
A surface analysis nudging scheme coupling atmospheric and land surface thermodynamic parameters has been implemented into WRF v3.8 (latest version) for use with retrospective weather and climate simulations, as well as for applications in air quality, hydrology, and ecosystem mo...
NASA Technical Reports Server (NTRS)
Hess, Ronald A.
1994-01-01
The NASA High-Angle-of-Attack Research Vehicle (HARV), a modified F-18 aircraft, experienced handling qualities problems in recent flight tests at NASA Dryden Flight Research Center. Foremost among these problems was the tendency of the pilot-aircraft system to exhibit a potentially dangerous phenomenon known as a pilot-induced oscillation (PIO). When they occur, PIOs can severely restrict performance, sharply diminish mission capabilities, and can even result in aircraft loss. A pilot/vehicle analysis was undertaken with the goal of reducing these PIO tendencies and improving the overall vehicle handling qualities with as few changes as possible to the existing feedback/feedforward flight control laws. Utilizing a pair of analytical pilot models developed by the author, a pilot/vehicle analysis of the existing longitudinal flight control system was undertaken. The analysis included prediction of overall handling qualities levels and PIO susceptibility. The analysis indicated that improvement in the flight control system was warranted and led to the formulation of a simple control stick command shaping filter. Analysis of the pilot/vehicle system with the shaping filter indicated significant improvements in handling qualities and PIO tendencies could be achieved. A non-real-time simulation of the modified control system was undertaken with a realistic, nonlinear model of the current HARV. Special emphasis was placed upon those details of the command filter implementation which could affect safety of flight. The modified system is currently awaiting evaluation in the real-time, pilot-in-the-loop, Dual-Maneuvering-Simulator (DMS) facility at Langley.
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy, and validity of simulation models shall be monitored in context all along the design phases to build confidence in achieving the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, originally developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Maurice G., Jr.; Crosson, William L.; Khan, Maudood N.
2006-01-01
The Atlanta Urban Heat Island and Air Quality Project had its genesis in Project ATLANTA (ATlanta Land use Analysis: Temperature and Air quality), which began in 1996. Project ATLANTA examined how high-spatial-resolution thermal remote sensing data could be used to derive better measurements of the Urban Heat Island effect over Atlanta. We have explored how these thermal remote sensing data, as well as other image datasets, can be used to better characterize the urban landscape for improved air quality modeling over the Atlanta area. For the air quality modeling project, the National Land Cover Dataset and the local-scale Landpro99 dataset, both at 30-m spatial resolution, have been used to derive land use/land cover characteristics for input into the MM5 mesoscale meteorological model, which is one of the foundations for the Community Multiscale Air Quality (CMAQ) model, to assess how these data can improve output from CMAQ. Additionally, land use changes to 2030 have been predicted using a Spatial Growth Model (SGM). SGM simulates growth around a region using population, employment, and travel demand forecasts. Air quality modeling simulations were conducted using both current and future land cover. Meteorological modeling simulations indicate a 0.5 °C increase in daily maximum air temperatures by 2030. Air quality modeling simulations show substantial differences in relative contributions of individual atmospheric pollutant constituents as a result of land cover change. Enhanced boundary layer mixing over the city tends to offset the increase in ozone concentration expected due to higher surface temperatures as a result of urbanization.
Linking Six Sigma to simulation: a new roadmap to improve the quality of patient care.
Celano, Giovanni; Costa, Antonio; Fichera, Sergio; Tringali, Giuseppe
2012-01-01
Improving the quality of patient care is a challenge that calls for a multidisciplinary approach, embedding a broad spectrum of knowledge and involving healthcare professionals from diverse backgrounds. The purpose of this paper is to present an innovative approach that implements discrete-event simulation (DES) as a decision-supporting tool in the management of Six Sigma quality improvement projects. A roadmap is designed to assist quality practitioners and health care professionals in the design and successful implementation of simulation models within the define-measure-analyse-design-verify (DMADV) or define-measure-analyse-improve-control (DMAIC) Six Sigma procedures. A case regarding the reorganisation of the flow of emergency patients affected by vertigo symptoms was developed in a large town hospital as a preliminary test of the roadmap. The positive feedback from professionals carrying out the project looks promising and encourages further roadmap testing in other clinical settings. The roadmap is a structured procedure that people involved in quality improvement can implement to manage projects based on the analysis and comparison of alternative scenarios. The role of Six Sigma philosophy in improvement of the quality of healthcare services is recognised both by researchers and by quality practitioners; discrete-event simulation models are commonly used to improve the key performance measures of patient care delivery. The two approaches are seldom referenced and implemented together; however, they could be successfully integrated to carry out quality improvement programs. This paper proposes an innovative approach to bridge the gap and enrich the Six Sigma toolbox of quality improvement procedures with DES.
Helicopter roll control effectiveness criteria program summary
NASA Technical Reports Server (NTRS)
Heffley, Robert K.; Bourne, Simon M.; Mnich, Marc A.
1988-01-01
A study of helicopter roll control effectiveness is summarized for the purpose of defining military helicopter handling qualities requirements. The study is based on an analysis of pilot-in-the-loop task performance of several basic maneuvers. This is extended by a series of piloted simulations using the NASA Ames Vertical Motion Simulator and selected flight data. The main results cover roll control power and short-term response characteristics. In general the handling qualities requirements recommended are set in conjunction with desired levels of flight task and maneuver response which can be directly observed in actual flight. An important aspect of this, however, is that vehicle handling qualities need to be set with regard to some quantitative aspect of mission performance. Specific examples of how this can be accomplished include a lateral unmask/remask maneuver in the presence of a threat and an air tracking maneuver which recognizes the kill probability enhancement connected with decreasing the range to the target. Conclusions and recommendations address not only the handling qualities recommendations, but also the general use of flight simulators and the dependence of mission performance on handling qualities.
Air pollution simulations critically depend on the quality of the underlying meteorology. In phase 2 of the Air Quality Model Evaluation International Initiative (AQMEII-2), thirteen modeling groups from Europe and four groups from North America operating eight different regional...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary con...
Generating Nonnormal Multivariate Data Using Copulas: Applications to SEM.
Mair, Patrick; Satorra, Albert; Bentler, Peter M
2012-07-01
This article develops a procedure based on copulas to simulate multivariate nonnormal data that satisfy a prespecified variance-covariance matrix. The covariance matrix used can comply with a specific moment structure form (e.g., a factor analysis or a general structural equation model). Thus, the method is particularly useful for Monte Carlo evaluation of structural equation models within the context of nonnormal data. The new procedure for nonnormal data simulation is theoretically described and also implemented in the widely used R environment. The quality of the method is assessed by Monte Carlo simulations. A 1-sample test on the observed covariance matrix based on the copula methodology is proposed. This new test for evaluating the quality of a simulation is defined through a particular structural model specification and is robust against normality violations.
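A minimal sketch of the copula construction (the article's R procedure additionally adjusts the copula correlations so the final product-moment covariance matches the target; that correction is omitted here, and the marginals are assumptions):

```python
import numpy as np
from scipy.stats import norm, expon

# Gaussian-copula simulation of nonnormal multivariate data: draw correlated
# normals, map to uniforms via the normal CDF, then apply nonnormal inverse CDFs.
rng = np.random.default_rng(7)
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])                 # target (copula) correlation
L = np.linalg.cholesky(R)
Z = rng.standard_normal((10_000, 3)) @ L.T      # correlated standard normals
U = norm.cdf(Z)                                 # Gaussian copula sample in [0,1]^3

X = np.column_stack([
    expon.ppf(U[:, 0], scale=2.0),              # skewed marginal (assumed)
    norm.ppf(U[:, 1], loc=5.0, scale=1.5),      # normal marginal (assumed)
    expon.ppf(U[:, 2], scale=0.5),              # skewed marginal (assumed)
])
print(np.corrcoef(X, rowvar=False).round(2))    # close to, but not exactly, R
```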
Coon, William F.
2003-01-01
A computer model of hydrologic and water-quality processes of the Irondequoit Creek basin in Monroe and Ontario Counties, N.Y., was developed during 2000-02 to enable water-resources managers to simulate the effects of future development and stormwater-detention basins on peak flows and water quality of Irondequoit Creek and its tributaries. The model was developed with the program Hydrological Simulation Program-Fortran (HSPF) such that proposed or hypothetical land-use changes and instream stormwater-detention basins could be simulated, and their effects on peak flows and loads of total suspended solids, total phosphorus, ammonia-plus-organic nitrogen, and nitrate-plus-nitrite nitrogen could be analyzed, through an interactive computer program known as Generation and Analysis of Model Simulation Scenarios for Watersheds (GenScn). This report is a user's manual written to guide the Irondequoit Creek Watershed Collaborative in (1) the creation of land-use and flow-detention scenarios for simulation by the HSPF model, and (2) the use of GenScn to analyze the results of these simulations. These analyses can, in turn, aid the group in making basin-wide water-resources-management decisions.
Lundgren, Robert F.; Nustad, Rochelle A.
2008-01-01
A time-of-travel and reaeration-rate study was conducted by the U.S. Geological Survey, in cooperation with the North Dakota Department of Health, the Minnesota Pollution Control Agency, and the cities of Fargo, North Dakota, and Moorhead, Minnesota, to provide information to calibrate a water-quality model for streamflows of less than 150 cubic feet per second. Data collected from September 24 through 27, 2003, were used to develop and calibrate the U.S. Environmental Protection Agency Water Quality Analysis Simulation Program model (hereinafter referred to as the Fargo WASP water-quality model) for a 19.2-mile reach of the Red River of the North. The Fargo WASP water-quality model was calibrated for the transport of dye by fitting simulated time-concentration dye curves to measured time-concentration dye curves. Simulated peak concentrations were within 10 percent of measured concentrations. Simulated traveltimes of the dye cloud centroid were within 7 percent of measured traveltimes. The variances of the simulated dye concentrations were similar to the variances of the measured dye concentrations, indicating dispersion was reproduced reasonably well. Average simulated dissolved-oxygen concentrations were within 6 percent of average measured concentrations. Average simulated ammonia concentrations were within the range of measured concentrations. Simulated dissolved-oxygen and ammonia concentrations were affected by the specification of a single nitrification rate in the Fargo WASP water-quality model. Data sets from August 1989 and August 1990 were used to test traveltime and simulation of dissolved oxygen and ammonia. For streamflows that ranged from 60 to 407 cubic feet per second, simulated traveltimes were within 7 percent of measured traveltimes. Measured dissolved-oxygen concentrations were underpredicted by less than 15 percent for both data sets. Results for ammonia were poor; measured ammonia concentrations were underpredicted by as much as 70 percent for both data sets. Overall, application of the Fargo WASP water-quality model to the 1989 and 1990 data sets resulted in poor agreement between measured and simulated concentrations. This likely is a result of changes in the waste-load composition for the Fargo and Moorhead wastewater-treatment plants as a result of improvements to the wastewater-treatment plants since 1990. The change in waste-load composition probably resulted in a change in decay rates and in dissolved oxygen no longer being substantially depressed downstream from the Moorhead and Fargo wastewater-treatment plants. The Fargo WASP water-quality model is valid for the current (2008) treatment processes at the wastewater-treatment plants.
Lobb, Eric C
2016-07-08
Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine, and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2°, with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings, due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.
Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.
Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A
2016-04-01
The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
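As a taste of what a basic radiology DES model looks like, here is a minimal single-server reading-queue sketch using the open-source simpy package; the arrival and reading rates are assumptions, not figures from the article:

```python
import simpy
import numpy as np

# Minimal DES of a radiology reading queue: studies arrive at random, one
# radiologist reads them, and we track turnaround time.
rng = np.random.default_rng(0)
turnaround = []

def study_arrivals(env, radiologist):
    while True:
        yield env.timeout(rng.exponential(10.0))   # mean 10 min between studies
        env.process(read_study(env, radiologist))

def read_study(env, radiologist):
    arrived = env.now
    with radiologist.request() as req:
        yield req                                  # wait for the radiologist
        yield env.timeout(rng.exponential(8.0))    # mean 8 min reading time
    turnaround.append(env.now - arrived)

env = simpy.Environment()
radiologist = simpy.Resource(env, capacity=1)
env.process(study_arrivals(env, radiologist))
env.run(until=8 * 60)                              # one 8-hour shift

print(f"{len(turnaround)} studies read, mean turnaround "
      f"{np.mean(turnaround):.1f} min")
```

Swapping the capacity, rates, or adding a second resource (e.g., a scanner upstream) is how such a model supports the what-if analyses the article describes.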
Hawthorne, Kamila; Denney, Mei Ling; Bewick, Mike; Wakeford, Richard
2006-01-01
WHAT IS ALREADY KNOWN IN THIS AREA • The Simulated Surgery module of the MRCGP examination has been shown to be a valid and reliable assessment of clinical consulting skills. WHAT THIS WORK ADDS • This paper describes the further development of the methodology of the Simulated Surgery, showing the type of data analysis currently used to assure its quality and reliability. The measures taken to tighten up case quality are discussed. SUGGESTIONS FOR FUTURE RESEARCH • The future development of clinical skills assessments in general practice is discussed. More work is needed on the effectiveness and reliability of lay assessors in complex integrated clinical cases. New methods to test areas that are difficult to reproduce in a simulated environment (such as acute emergencies and cases with the very young or very old) are also needed.
NASA Astrophysics Data System (ADS)
Suarez, Berta; Felez, Jesus; Lozano, José Antonio; Rodriguez, Pablo
2013-02-01
This work describes an analytical approach to determine what degree of accuracy is required in the definition of the rail vehicle models used for dynamic simulations. This way it would be possible to know in advance how the results of simulations may be altered by errors in the creation of rolling stock models, whilst also identifying their critical parameters. This would make it possible to maximise the time available to enhance dynamic analysis and focus efforts on factors that are strictly necessary. In particular, the parameters related both to track quality and to rolling contact were considered in this study. With this aim, a sensitivity analysis was performed to assess their influence on vehicle dynamic behaviour. To do this, 72 dynamic simulations were performed modifying, one at a time, the track quality, the wheel-rail friction coefficient, and the equivalent conicity of both new and worn wheels. Three values were assigned to each parameter, and two wear states were considered for each type of wheel, one for new wheels and another for reprofiled wheels. After processing the results of these simulations, it was concluded that all the parameters considered have a very high influence, with the friction coefficient showing the highest. Therefore, it is recommended to undertake any future simulation job with measured track geometry and track irregularities, measured wheel profiles, and normative values of the wheel-rail friction coefficient.
Wesolowski, Edwin A.
1996-01-01
Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for ice-free conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions. To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen. The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwater-source specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwater-source and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14. During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specific-conductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are point-source total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient.
The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
Multiphysical simulation analysis of the dislocation structure in germanium single crystals
NASA Astrophysics Data System (ADS)
Podkopaev, O. I.; Artemyev, V. V.; Smirnov, A. D.; Mamedov, V. M.; Sid'ko, A. P.; Kalaev, V. V.; Kravtsova, E. D.; Shimanskii, A. F.
2016-09-01
Growing high-quality germanium crystals is one of the most important problems of the crystal-growth industry. The dislocation density is an important parameter of single-crystal quality. The dislocation densities in germanium crystals 100 mm in diameter, which have various shapes of the side surface and are grown by the Czochralski technique, are experimentally measured. The crystal growth is numerically simulated using heat-transfer and hydrodynamics models and the Alexander-Haasen dislocation model in the CGSim software package. A comparison of the experimental and calculated dislocation densities shows that the dislocation model can be applied to study lattice defects in germanium crystals and to improve their quality.
The United States Environmental Protection Agency’s Environmental Sciences and Atmospheric Modeling Analysis Divisions are investigating the viability of simulated (i.e., ‘modeled’) leaf area index (LAI) inputs into various regional and local scale air quality models. Satellite L...
Michael A. Larson; Frank R., III Thompson; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley
2004-01-01
Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...
In this study, temporal scale analysis is applied as a technique to evaluate an annual simulation of meteorology, O3, and PM2.5 and its chemical components over the continental U.S. utilizing two modeling systems. It is illustrated that correlations were ins...
Monte Carlo simulation of PET/MR scanner and assessment of motion correction strategies
NASA Astrophysics Data System (ADS)
Işın, A.; Uzun Ozsahin, D.; Dutta, J.; Haddani, S.; El-Fakhri, G.
2017-03-01
Positron Emission Tomography (PET) is widely used in three-dimensional imaging of metabolic body function and in tumor detection. Important research efforts are made to improve this imaging modality, and powerful simulators such as GATE are used to test and develop methods for this purpose. PET requires acquisition times on the order of a few minutes. Therefore, because of natural patient movements such as respiration, image quality can be adversely affected, which drives scientists to develop motion compensation methods. The goal of this study is to evaluate various image reconstruction methods with a GATE simulation of a PET acquisition of the torso area. The obtained results show the need to compensate for natural respiratory movements in order to obtain an image with similar quality to the reference image. Improvements are still possible in the applied motion-field extraction algorithms. Finally, a statistical analysis should confirm the obtained results.
Study of ceramic products and processing techniques in space. [using computerized simulation
NASA Technical Reports Server (NTRS)
Markworth, A. J.; Oldfield, W.
1974-01-01
An analysis of the solidification kinetics of beta alumina in a zero-gravity environment was carried out, using computer-simulation techniques, in order to assess the feasibility of producing high-quality single crystals of this material in space. The two coupled transport processes included were movement of the solid-liquid interface and diffusion of sodium atoms in the melt. Results of the simulation indicate that appreciable crystal-growth rates can be attained in space. Considerations were also made of the advantages offered by high-quality single crystals of beta alumina for use as a solid electrolyte; these clearly indicate that space-grown materials are superior in many respects to analogous terrestrially-grown crystals. Likewise, economic considerations, based on the rapidly expanding technological applications for beta alumina and related fast ionic conductors, reveal that the many superior qualities of space-grown material justify the added expense and experimental detail associated with space processing.
Tang, Gula; Zhu, Yunqiang; Wu, Guozheng; Li, Jing; Li, Zhao-Liang; Sun, Jiulin
2016-01-01
In this study, the Mudan River, the most typical river in the northern cold region of China, was selected as the research object; the Environmental Fluid Dynamics Code (EFDC) was adopted to construct a new two-dimensional water quality model for the urban sections of the Mudan River, and concentrations of CODCr and NH3-N during ice-covered and open-water periods were simulated and analyzed. Results indicated that the roughness coefficient and the comprehensive pollutant decay rate were significantly different between those periods. To be specific, the roughness coefficient in the ice-covered period was larger than that of the open-water period, while the decay rate within the former period was smaller than that in the latter. In addition, according to the analysis of the simulated results, the main reasons for the decay rate reduction during the ice-covered period are the temperature drop, the decrease of upstream inflow, and the ice layer cover; among them, the ice sheet is the major contributor to the roughness increase. These aspects were discussed in more detail in this work. The model could be generalized to hydrodynamic water quality process simulation research on rivers in other cold regions as well. PMID:27070631
String Stability of a Linear Formation Flight Control System
NASA Technical Reports Server (NTRS)
Allen, Michael J.; Ryan, Jack; Hanson, Curtis E.; Parle, James F.
2002-01-01
String stability analysis of an autonomous formation flight system was performed using linear and nonlinear simulations. String stability is a measure of how position errors propagate from one vehicle to another in a cascaded system. In the formation flight system considered here, each i-th aircraft uses information from itself and the preceding (i-1)-th aircraft to track a commanded relative position. A possible solution for meeting performance requirements with such a system is to allow string instability. This paper explores two results of string instability and outlines analysis techniques for string unstable systems. The three analysis techniques presented here are: linear, nonlinear formation performance, and ride quality. The linear technique was developed from a worst-case scenario and could be applied to the design of a string unstable controller. The nonlinear formation performance and ride quality analysis techniques both use nonlinear formation simulation. Three of the four formation-controller gain-sets analyzed in this paper were limited more by ride quality than by performance. Formations of up to seven aircraft in a cascaded formation could be used in the presence of light gusts with this string unstable system.
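A small sketch of the linear string-stability check: if position errors pass from one aircraft to the next through a transfer function H(s), the string is stable when |H(jw)| stays at or below one for all frequencies. The second-order H(s) below is an assumed stand-in, not the paper's controller.

```python
import numpy as np
from scipy import signal

# Linear string-stability check: string stable iff max_w |H(jw)| <= 1,
# where H(s) maps the (i-1)-th vehicle's error to the i-th vehicle's error.
wn, zeta = 1.0, 0.35                     # assumed tracking dynamics
H = signal.TransferFunction([2 * zeta * wn, wn**2],
                            [1, 2 * zeta * wn, wn**2])

w = np.logspace(-2, 2, 2000)
_, resp = signal.freqresp(H, w)
peak = np.max(np.abs(resp))
print(f"peak |H(jw)| = {peak:.3f} -> "
      f"{'string stable' if peak <= 1.0 else 'string unstable'}")
```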
An Enhanced K-Means Algorithm for Water Quality Analysis of The Haihe River in China.
Zou, Hui; Zou, Zhihong; Wang, Xiaojing
2015-11-12
Today's reality is an increase in the volume and complexity of data caused by the uncertain environment. In order to identify water quality effectively and reliably, this paper presents a modified fast clustering algorithm for water quality analysis. The algorithm adopts a varying-weights K-means clustering scheme to analyze water monitoring data. The varying-weights scheme uses the best weighting indicator selected by a modified indicator-weight self-adjustment algorithm based on K-means, named MIWAS-K-means. The new clustering algorithm avoids cases in which the iteration margin is never calculated. With the fast clustering analysis, we can identify the quality of water samples. The algorithm is applied to water quality analysis of the Haihe River (China) data obtained by the monitoring network over a period of eight years (2006-2013) with four indicators at seven different sites (2078 samples). Both the theoretical and simulated results demonstrate that the algorithm is efficient and reliable for water quality analysis of the Haihe River. In addition, the algorithm can be applied to more complex data matrices with high dimensionality.
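A sketch of the weighted K-means core (the MIWAS weight self-adjustment rule itself is not reproduced); the weights and data are illustrative, not Haihe River records:

```python
import numpy as np

# K-means with fixed per-indicator weights in the distance metric, as a
# simplified stand-in for a varying-weights scheme.
def weighted_kmeans(X, w, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Weighted squared Euclidean distance of every point to every center.
        d = np.einsum("j,ikj->ik", w, (X[:, None, :] - centers[None]) ** 2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == c].mean(axis=0) if np.any(labels == c)
                        else centers[c] for c in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))                 # 4 water-quality indicators
w = np.array([0.4, 0.3, 0.2, 0.1])            # assumed indicator weights
labels, centers = weighted_kmeans(X, w, k=3)
print(np.bincount(labels))                    # cluster sizes
```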
Zheng, Zhong-liang; Zuo, Zhen-yu; Liu, Zhi-gang; Tsai, Keng-chang; Liu, Ai-fu; Zou, Guo-lin
2005-01-01
A three-dimensional structural model of nattokinase (NK) from Bacillus natto was constructed by homology modeling. High-resolution X-ray structures of Subtilisin BPN' (SB), Subtilisin Carlsberg (SC), Subtilisin E (SE), and Subtilisin Savinase (SS), four proteins with sequential, structural, and functional homology, were used as templates. Initial models of NK were built by MODELLER and analyzed by the PROCHECK programs. The best-quality model was chosen for further refinement by constrained molecular dynamics simulations. The overall quality of the refined model was evaluated. The refined model NKC1 was analyzed by different protein analysis programs, including PROCHECK for the evaluation of Ramachandran plot quality, PROSA for testing interaction energies, and WHATIF for the calculation of packing quality. This structure was found to be satisfactory and also stable at room temperature, as demonstrated by a 300-ps unconstrained molecular dynamics (MD) simulation. Further docking analysis suggested a new nucleophilic catalytic mechanism for NK, induced by the attack of the hydroxyl groups abundant in the catalytic environment and the positioning of S221.
Impacts of Energy Sector Emissions on PM2.5 Air Quality in Northern India
NASA Astrophysics Data System (ADS)
Karambelas, A. N.; Kiesewetter, G.; Heyes, C.; Holloway, T.
2015-12-01
India experiences high concentrations of fine particulate matter (PM2.5), and several Indian cities currently rank among the world's most polluted cities. With ongoing urbanization and a growing economy, emissions from different energy sectors remain major contributors to air pollution in India. Emission sectors impact ambient air quality differently due to spatial distribution (typical urban vs. typical rural sources) as well as source height characteristics (low-level vs. high-stack sources). This study aims to assess the impacts of emissions from three distinct energy sectors (transportation, domestic, and electricity) on ambient PM2.5 in northern India using an advanced air quality analysis framework based on the U.S. EPA Community Multi-Scale Air Quality (CMAQ) model. Present air quality conditions are simulated using 2010 emissions from the Greenhouse Gas-Air Pollution Interaction and Synergies (GAINS) model. Modeled PM2.5 concentrations are compared with satellite observations of aerosol optical depth (AOD) from the Moderate Resolution Imaging Spectroradiometer (MODIS) for 2010. Energy sector emission impacts on future (2030) PM2.5 are evaluated with three sensitivity simulations, assuming maximum feasible reduction technologies for either the transportation, domestic, or electricity sector. These simulations are compared with a business-as-usual 2030 simulation to assess relative sectoral impacts spatially and temporally. The CMAQ simulations are run at 12 km by 12 km resolution and include biogenic emissions from the Community Land Model coupled with the Model of Emissions of Gases and Aerosols in Nature (CLM-MEGAN), biomass burning emissions from the Global Fire Emissions Database (GFED), and ERA-Interim meteorology generated with the Weather Research and Forecasting (WRF) model for 2010 to quantify the impact of modified anthropogenic emissions on ambient PM2.5 concentrations. The energy sector emissions analysis supports decision-making to improve future air quality and public health in India.
Sachit, Dawood Eisa; Veenstra, John N.
2017-01-01
In this work, three different types of Reverse Osmosis (RO) membranes (Thin-Film Composite (SE), Cellulose Acetate (CE), and Polyamide (AD)) were used to perform a foulant analysis (autopsy) study of the materials deposited from three different simulated brackish surface feed waters. The brackish surface water qualities represented the water quality in the Iraqi marshes. The main foulants from the simulated feed waters were characterized using Scanning Electron Microscope (SEM) images and Energy-Dispersive X-ray Spectroscopy (EDXS) spectra. The effect of feed water temperatures (37 °C and 11 °C) on the formation of the fouled material deposited on the membrane surface was examined in this study. Also, pretreatment of the simulated feed water by a 0.1-micron microfiltration (MF) membrane in advance of the RO membrane was investigated for its effect on the material precipitated on the membrane surface. Finally, Fourier Transform Infrared Spectroscopy (FTIR) analysis was used to identify the functional groups of the organic matter deposited on the RO membrane surfaces. The SEM images and EDXS spectra suggested that the fouled material was mainly organic matter, and the major crystal deposited on the RO membrane was calcium carbonate (CaCO3). The FTIR spectra of the fouled RO membranes suggested that the constituents of the fouled material included aliphatic and aromatic compounds. PMID:28406468
Optomechanical integrated simulation of Mars medium resolution lens with large field of view
NASA Astrophysics Data System (ADS)
Yang, Wenqiang; Xu, Guangzhou; Yang, Jianfeng; Sun, Yi
2017-10-01
The lens of a Mars detector is exposed to solar radiation and space temperatures for long periods during orbit, so the ambient temperature of the optical system is in a dynamic state. The optical and mechanical changes caused by heat lead to drift of the camera's visual axis and to wavefront distortion. The surface distortion of the optical lens includes rigid-body displacement and distortion of the surface shape. This paper used a calculation method based on integrated optomechanical analysis to explore the impact of thermodynamic loads on image quality. A simulation model of the lens structure was established in the analysis software. The shape distribution and the surface characterization parameters of the lens over several temperature ranges were analyzed and compared, yielding the PV/RMS values, deformation maps of the lens surface, and an imaging quality evaluation. The simulation captured the lens surface shape and its distribution under loads that are difficult to measure under experimental conditions. The integrated optomechanical simulation method can obtain the change in optical parameters brought about by the temperature load, showing that integrated analysis plays an important role in guiding lens design.
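The PV (peak-to-valley) and RMS figures mentioned above are simple functionals of the surface deformation map. A short sketch, with a hypothetical grid and deformation values:

```python
import numpy as np

def pv_rms(surface_error):
    """Peak-to-valley and RMS of a lens surface deformation map (same units in/out)."""
    z = np.asarray(surface_error, dtype=float)
    pv = z.max() - z.min()                       # peak-to-valley
    rms = np.sqrt(np.mean((z - z.mean()) ** 2))  # RMS about the mean surface
    return pv, rms

# Hypothetical thermally deformed surface sampled on a grid (micrometres).
y, x = np.mgrid[-1:1:101j, -1:1:101j]
z = 0.05 * (x**2 + y**2) + 0.01 * x              # power term plus a little tilt
print(pv_rms(z))
```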
AQMEII3 evaluation of regional NA/EU simulations and ...
Through the comparison of several regional-scale chemistry transport modelling systems that simulate meteorology and air quality over the European and American continents, this study aims at i) apportioning the error to the responsible processes using time-scale analysis, ii) helping to detect causes of model error, and iii) identifying the processes and scales most urgently requiring dedicated investigations. The analysis is conducted within the framework of the third phase of the Air Quality Model Evaluation International Initiative (AQMEII) and tackles model performance gauging through measurement-to-model comparison, error decomposition, and time series analysis of the model biases for several fields (ozone, CO, SO2, NO, NO2, PM10, PM2.5, wind speed, and temperature). The operational metrics (magnitude of the error, sign of the bias, associativity) provide an overall sense of model strengths and deficiencies, while apportioning the error to its constituent parts (bias, variance and covariance) can help to assess the nature and quality of the error. Each of the error components is analysed independently and apportioned to specific processes based on the corresponding timescale (long scale, synoptic, diurnal, and intra-day) using the error apportionment technique devised in the former phases of AQMEII. The application of the error apportionment method to the AQMEII Phase 3 simulations provides several key insights. In addition to reaffirming the strong impac…
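The bias/variance/covariance split named above is a standard apportionment of the mean-square error; a minimal sketch of one common form of it (not necessarily the exact AQMEII formulation):

```python
import numpy as np

def mse_components(model, obs):
    """One standard apportionment of mean-square error into bias, variance,
    and covariance parts: MSE = (m_bar - o_bar)^2 + (s_m - s_o)^2
                                + 2*s_m*s_o*(1 - r)."""
    m, o = np.asarray(model, float), np.asarray(obs, float)
    bias2 = (m.mean() - o.mean()) ** 2
    variance = (m.std() - o.std()) ** 2
    covariance = 2.0 * m.std() * o.std() * (1.0 - np.corrcoef(m, o)[0, 1])
    return bias2, variance, covariance   # the three terms sum to the MSE

m = [48.0, 55.0, 60.0, 52.0, 47.0]   # invented model series
o = [45.0, 50.0, 58.0, 55.0, 44.0]   # invented observed series
print(mse_components(m, o), np.mean((np.array(m) - np.array(o)) ** 2))
```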
Analysis of the Space Shuttle main engine simulation
NASA Technical Reports Server (NTRS)
Deabreu-Garcia, J. Alex; Welch, John T.
1993-01-01
This is a final report on an analysis of the Space Shuttle Main Engine Program, a digital simulator code written in Fortran. The research was undertaken in ultimate support of future design studies of a shuttle life-extending Intelligent Control System (ICS). These studies are to be conducted by the NASA Lewis Research Center. The primary purpose of the analysis was to define the means to achieve a faster running simulation, and to determine if additional hardware would be necessary for speeding up simulations for the ICS project. In particular, the analysis was to consider the use of custom integrators based on the Matrix Stability Region Placement (MSRP) method. In addition to speed of execution, other qualities of the software were to be examined. Among these are the accuracy of computations, the usability of the simulation system, and the maintainability of the program and data files. Accuracy involves control of the truncation error of the methods and of the roundoff error induced by floating point operations. It also involves the requirement that the user be fully aware of the model that the simulator is implementing.
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data, the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
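A small sketch of the quantities named above: the coefficient of variation of a simulated coating-mass population, and an FMEA-style risk priority number. All numbers are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical per-tablet coating masses (mg) from one simulated batch.
coating_mass = np.random.default_rng(1).normal(loc=10.0, scale=0.4, size=5000)

# Coefficient of variation = relative coating mass non-uniformity.
cv = coating_mass.std(ddof=1) / coating_mass.mean()
print(f"CV = {cv:.3%}")

# Illustrative FMEA-style risk priority number; the 1-10 severity/occurrence/
# detection scores here are invented for the example.
severity, occurrence, detection = 7, 4, 3
rpn = severity * occurrence * detection
print(f"RPN = {rpn}")
```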
A 5 year (2002-2006) simulation of CMAQ covering the eastern United States is evaluated using principal component analysis in order to identify and characterize statistically significant patterns of model bias. Such analysis is useful in that it can identify areas of poor model ...
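A principal component analysis of a site-by-time bias matrix can be sketched directly with an SVD; the inputs here are random stand-ins, not CMAQ output:

```python
import numpy as np

def bias_pca(model, obs):
    """PCA of model bias: rows = monitoring sites, columns = time steps.
    Returns explained-variance fractions and spatial patterns (EOFs)."""
    bias = np.asarray(model, float) - np.asarray(obs, float)
    bias -= bias.mean(axis=1, keepdims=True)       # centre each site's series
    u, s, vt = np.linalg.svd(bias, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    return explained, u                            # u[:, k] = k-th spatial pattern

# Hypothetical data: 50 sites x 365 daily values.
rng = np.random.default_rng(2)
frac, patterns = bias_pca(rng.normal(2, 1, (50, 365)), rng.normal(0, 1, (50, 365)))
print(frac[:3])   # variance fraction captured by the leading bias patterns
```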
Dynamic simulation of the effect of soft toric contact lenses movement on retinal image quality.
Niu, Yafei; Sarver, Edwin J; Stevenson, Scott B; Marsack, Jason D; Parker, Katrina E; Applegate, Raymond A
2008-04-01
To report the development of a tool designed to dynamically simulate the effect of soft toric contact lens movement on retinal image quality, initial findings on three eyes, and the next steps to be taken to improve the utility of the tool. Three eyes of two subjects wearing soft toric contact lenses were cyclopleged with 1% cyclopentolate and 2.5% phenylephrine. Four hundred wavefront aberration measurements over a 5-mm pupil were recorded during soft contact lens wear at 30 Hz using a complete ophthalmic analysis system aberrometer. Each wavefront error measurement was input into Visual Optics Laboratory (version 7.15, Sarver and Associates, Inc.) to generate a retinal simulation of a high-contrast logMAR visual acuity chart. The individual simulations were combined into a single dynamic movie using a custom MatLab PsychToolbox program. Visual acuity was measured for each eye reading the movie with best cycloplegic spectacle correction through a 3-mm artificial pupil to minimize the influence of the eyes' uncorrected aberrations. The simulated acuity was compared to values recorded while the subject read unaberrated charts with contact lenses through a 5-mm artificial pupil. For one study eye, average acuity was the same as in the natural contact lens viewing condition. For the other two study eyes, visual acuity of the best simulation was more than one line worse than under natural viewing conditions. Dynamic simulation of retinal image quality, although not yet perfect, is a promising technique for visually illustrating the effects on image quality of the movement of alignment-sensitive corrections.
The Cost of Quality: Teacher Testing and Racial-Ethnic Representativeness in Public Education.
ERIC Educational Resources Information Center
Dometrius, Nelson C.; Sigelman, Lee
1988-01-01
Reports the results of a simulation-based analysis of Black and Hispanic public school teacher employment in Texas. The simulation reflects recently instituted teacher testing requirements in mathematics and language competence and projects how the program is likely to affect the racial-ethnic composition of public school teachers in Texas over…
ERIC Educational Resources Information Center
Bider, Ilia; Henkel, Martin; Kowalski, Stewart; Perjons, Erik
2015-01-01
Purpose: This paper aims to report on a project aimed at using simulation for improving the quality of teaching and learning modeling skills. More specifically, the project goal was to facilitate the students to acquire skills of building models of organizational structure and behavior through analysis of internal and external documents, and…
Water quality modeling for urban reach of Yamuna river, India (1999-2009), using QUAL2Kw
NASA Astrophysics Data System (ADS)
Sharma, Deepshikha; Kansal, Arun; Pelletier, Greg
2017-06-01
The study's aim was to characterize and understand the water quality of the river Yamuna in Delhi (India) prior to an efficient restoration plan. A combination of collection of monitored data, mathematical modeling, and sensitivity and uncertainty analysis was carried out using QUAL2Kw, a river quality model. The model was applied to simulate DO, BOD, total coliform, and total nitrogen at four monitoring stations, namely Palla, Old Delhi Railway Bridge, Nizamuddin, and Okhla, for 10 years (October 1999-June 2009), excluding the monsoon seasons (July-September). The study period was divided into two parts: monthly average data from October 1999-June 2004 (45 months) were used to calibrate the model, and monthly average data from October 2005-June 2009 (45 months) were used to validate it. The R2 values for CBODf and TN lie within the ranges 0.53-0.75 and 0.68-0.83, respectively, showing that the model gives satisfactory results in terms of R2 for CBODf, TN, and TC. Sensitivity analysis showed that DO, CBODf, TN, and TC predictions are highly sensitive to headwater flow and to point source flow and quality. Uncertainty analysis using Monte Carlo showed that the input data have been simulated in accordance with the prevalent river conditions.
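An R2 of this kind can be computed in either of two common senses; since the abstract does not say which was used, this sketch shows both:

```python
import numpy as np

def r2_metrics(sim, obs):
    """Two common R2 definitions for simulated vs. observed series."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r2_pearson = np.corrcoef(sim, obs)[0, 1] ** 2      # squared correlation
    ss_res = np.sum((obs - sim) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    r2_determination = 1 - ss_res / ss_tot             # Nash-Sutcliffe style
    return r2_pearson, r2_determination

# Invented simulated/observed monthly values (mg/L).
print(r2_metrics([2.1, 3.0, 4.2, 5.1], [2.0, 3.2, 4.0, 5.3]))
```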
Assessment and management of the performance risk of a pilot reclaimed water disinfection process.
Zhou, Guangyu; Zhao, Xinhua; Zhang, Lei; Wu, Qing
2013-10-01
Chlorination disinfection has been widely used in reclaimed water treatment plants to ensure water quality. In order to assess the downstream quality risk of a running reclaimed water disinfection process, a set of dynamic equations was developed to simulate reactions in the disinfection process concerning variables of bacteria, chemical oxygen demand (COD), ammonia and monochloramine. The model was calibrated by the observations obtained from a pilot disinfection process which was designed to simulate the actual process in a reclaimed water treatment plant. A Monte Carlo algorithm was applied to calculate the predictive effluent quality distributions that were used in the established hierarchical assessment system for the downstream quality risk, and the key factors affecting the downstream quality risk were defined using the Regional Sensitivity Analysis method. The results showed that the seasonal upstream quality variation caused considerable downstream quality risk; the effluent ammonia was significantly influenced by its upstream concentration; the upstream COD was a key factor determining the process effluent risk of bacterial, COD and residual disinfectant indexes; and lower COD and ammonia concentrations in the influent would mean better downstream quality.
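A minimal sketch of the Monte Carlo step: the paper's calibrated dynamic equations are not reproduced here, so a generic first-order disinfectant-demand stand-in is used purely to illustrate propagating influent-quality distributions into an effluent risk estimate:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Hypothetical influent quality distributions (all parameters invented).
cod_in = rng.lognormal(mean=np.log(40), sigma=0.3, size=n)   # COD, mg/L
nh3_in = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n)  # ammonia, mg/L
dose = 8.0                                                   # monochloramine, mg/L

# Residual disinfectant after contact, with demand driven by COD and ammonia.
residual = dose * np.exp(-(0.010 * cod_in + 0.15 * nh3_in))

# Downstream quality risk = probability the residual falls below a threshold.
print(f"P(residual < 1 mg/L) = {np.mean(residual < 1.0):.3f}")
```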
NASA Astrophysics Data System (ADS)
Zwickl, Titus; Carleer, Bart; Kubli, Waldemar
2005-08-01
In the past decade, sheet metal forming simulation became a well established tool to predict the formability of parts. In the automotive industry, this has enabled significant reduction in the cost and time of vehicle design and development, and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as production quality fluctuations, continue to plague manufacturing cost and time. The focus therefore has shifted in recent times beyond mere feasibility to robustness of the product and process being engineered. Ensuring robustness is the next big challenge for virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process — in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep drawing parts may be tracked down, identified, and assigned to the influential parameters. With this knowledge, defects can be eliminated or springback can be compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.
De Crop, An; Bacher, Klaus; Van Hoof, Tom; Smeets, Peter V; Smet, Barbara S; Vergauwen, Merel; Kiendys, Urszula; Duyck, Philippe; Verstraete, Koenraad; D'Herde, Katharina; Thierens, Hubert
2012-01-01
To determine the correlation between the clinical and physical image quality of chest images by using cadavers embalmed with the Thiel technique and a contrast-detail phantom. The use of human cadavers fulfilled the requirements of the institutional ethics committee. Clinical image quality was assessed by using three human cadavers embalmed with the Thiel technique, which results in excellent preservation of the flexibility and plasticity of organs and tissues. As a result, lungs can be inflated during image acquisition to simulate the pulmonary anatomy seen on a chest radiograph. Both contrast-detail phantom images and chest images of the Thiel-embalmed bodies were acquired with an amorphous silicon flat-panel detector. Tube voltage (70, 81, 90, 100, 113, 125 kVp), copper filtration (0.1, 0.2, 0.3 mm Cu), and exposure settings (200, 280, 400, 560, 800 speed class) were altered to simulate different quality levels. Four experienced radiologists assessed the image quality by using a visual grading analysis (VGA) technique based on European Quality Criteria for Chest Radiology. The phantom images were scored manually and automatically with use of dedicated software, both resulting in an inverse image quality figure (IQF). Spearman rank correlations between inverse IQFs and VGA scores were calculated. A statistically significant correlation (r = 0.80, P < .01) was observed between the VGA scores and the manually obtained inverse IQFs. Comparison of the VGA scores and the automated evaluated phantom images showed an even better correlation (r = 0.92, P < .001). The results support the value of contrast-detail phantom analysis for evaluating clinical image quality in chest radiography. © RSNA, 2011.
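The Spearman rank correlation used above is directly available in scipy; the paired values below are invented for illustration:

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores per acquisition setting: mean VGA score vs. the
# inverse image quality figure (IQF) from the phantom (values invented).
vga = np.array([0.42, 0.55, 0.61, 0.70, 0.78, 0.85])
inverse_iqf = np.array([0.019, 0.023, 0.026, 0.030, 0.034, 0.037])

rho, p = stats.spearmanr(vga, inverse_iqf)
print(f"Spearman r = {rho:.2f}, P = {p:.3f}")
```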
Environmental Flow for Sungai Johor Estuary
NASA Astrophysics Data System (ADS)
Adilah, A. Kadir; Zulkifli, Yusop; Zainura, Z. Noor; Bakhiah, Baharim N.
2018-03-01
Sungai Johor estuary is a vital water body in the south of Johor and greatly affects the water quality in the Johor Straits. In the development of the hydrodynamic and water quality models for Sungai Johor estuary, the Environmental Fluid Dynamics Code (EFDC) model was selected. In this application, the EFDC hydrodynamic model was configured to simulate time-varying surface elevation, velocity, salinity, and water temperature. The EFDC water quality model was configured to simulate dissolved oxygen (DO), dissolved organic carbon (DOC), chemical oxygen demand (COD), ammoniacal nitrogen (NH3-N), nitrate nitrogen (NO3-N), phosphate (PO4), and chlorophyll a. The hydrodynamic and water quality model calibration was performed utilizing a set of site-specific data acquired in January 2008. The simulated water temperature, salinity, and DO showed good to fairly good agreement with observations. The calculated correlation coefficients between computed and observed temperature and salinity were lower than for the water level. Sensitivity analysis was performed on the hydrodynamic and water quality models' input parameters to quantify their impact on modeling results such as water surface elevation, salinity, and dissolved oxygen concentration. It is anticipated and recommended that the development of this model be continued to synthesize additional field data into the modeling process.
NASA Technical Reports Server (NTRS)
Stringer, Mary T.; Cowen, Brandon; Hoffler, Keith D.; Couch, Jesse C.; Ogburn, Marilyn E.; Diebler, Corey G.
2013-01-01
The NASA Langley Research Center Cockpit Motion Facility (CMF) was used to conduct a piloted simulation assessment of the impact of flexible structures on flying qualities. The CMF was used because of its relatively high-bandwidth, six degree-of-freedom motion capability. Previous studies assessed and attempted to mitigate the effects of multiple dynamic aeroservoelastic (DASE) modes. Those results indicated problems existed, but the specific cause and effect were difficult to ascertain. The goal of this study was to identify specific DASE frequencies, damping ratios, and gains that cause degradation in handling qualities. A generic aircraft simulation was developed and designed to have Cooper-Harper Level 1 handling qualities when flown without DASE models. A test matrix of thirty-six DASE modes was implemented. The modes had frequencies ranging from 1 to 3.5 Hz and were applied to each axis independently. Each mode consisted of a single axis, frequency, damping, and gain, and was evaluated individually by six subject pilots with test pilot backgrounds. Analysis completed to date suggests that a number of the DASE models evaluated degrade the handling qualities of this class of aircraft to an uncontrollable condition.
Development of an advanced pitch active control system for a wide body jet aircraft
NASA Technical Reports Server (NTRS)
Guinn, Wiley A.; Rising, Jerry J.; Davis, Walt J.
1984-01-01
An advanced PACS control law was developed for a commercial wide-body transport (Lockheed L-1011) by using modern control theory. Validity of the control law was demonstrated by piloted flight simulation tests on the NASA Langley visual motion simulator. The PACS design objective was to develop a PACS that would provide good flying qualities to negative 10 percent static stability margins that were equivalent to those of the baseline aircraft at a 15 percent static stability margin which is normal for the L-1011. Also, the PACS was to compensate for high-Mach/high-g instabilities that degrade flying qualities during upset recoveries and maneuvers. The piloted flight simulation tests showed that the PACS met the design objectives. The simulation demonstrated good flying qualities to negative 20 percent static stability margins for hold, cruise and high-speed flight conditions. Analysis and wind tunnel tests performed on other Lockheed programs indicate that the PACS could be used on an advanced transport configuration to provide a 4 percent fuel savings which results from reduced trim drag by flying at negative static stability margins.
NASA Astrophysics Data System (ADS)
Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David
2018-05-01
As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result, the tool deformations can affect the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of tool deformations on final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped onto the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently so that tool deformations can be included in the drawing simulation in real time, without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented which demonstrates the effect of tool deformations on the final part quality.
Urban air quality estimation study, phase 1
NASA Technical Reports Server (NTRS)
Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.
1976-01-01
Possibilities are explored for applying estimation theory to the analysis, interpretation, and use of air quality measurements in conjunction with simulation models to provide a cost effective method of obtaining reliable air quality estimates for wide urban areas. The physical phenomenology of real atmospheric plumes from elevated localized sources is discussed. A fluctuating plume dispersion model is derived. Individual plume parameter formulations are developed along with associated a priori information. Individual measurement models are developed.
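The fluctuating plume model derived in the paper is not reproduced here; as context, the classical steady-state Gaussian plume from an elevated point source, which such models extend, looks like this:

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Classical steady-state Gaussian plume with ground reflection:
    concentration from a point source of strength q (g/s), wind speed u (m/s),
    effective stack height h (m); sigma_y, sigma_z (m) are the dispersion
    widths evaluated at the downwind distance of interest."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2.0 * sigma_z**2)) +
                np.exp(-(z + h)**2 / (2.0 * sigma_z**2)))
    return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration ~1 km downwind of a 50 m stack
# (all parameter values invented for illustration).
print(gaussian_plume(q=100.0, u=5.0, y=0.0, z=0.0, h=50.0,
                     sigma_y=80.0, sigma_z=40.0))
```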
A Review Analysis of Inverter Topologies for Solar PV Applications Focused on Power Quality
NASA Astrophysics Data System (ADS)
Faruqui, Saad Nazif Ahamad; Anwer, Naqui
2017-10-01
This research article gives a broad review of non-isolated inverter topologies for solar photovoltaic equipment. To compare the available solutions for the studied topological arrangements, some conditions have been imposed. The benchmark is based on harmonic distortion as well as power quality issues. Some of the selected solutions have been designed and simulated with respect to power quality, and the best one is discussed in the paper.
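Total harmonic distortion, the core of the harmonic-distortion benchmark mentioned above, can be estimated from an FFT of the inverter output waveform; a sketch with a synthetic test signal:

```python
import numpy as np

def thd(signal, fs, f1):
    """Total harmonic distortion of a periodic inverter output: RMS of the
    harmonics (2*f1, 3*f1, ...) divided by the magnitude of the fundamental f1."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    def mag(f):  # magnitude of the bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fund = mag(f1)
    harmonics = [mag(k * f1) for k in range(2, 40) if k * f1 < fs / 2]
    return np.sqrt(np.sum(np.square(harmonics))) / fund

# 50 Hz fundamental plus 3rd and 5th harmonics, sampled at 10 kHz.
t = np.arange(0, 0.2, 1e-4)
v = np.sin(2*np.pi*50*t) + 0.05*np.sin(2*np.pi*150*t) + 0.03*np.sin(2*np.pi*250*t)
print(f"THD = {thd(v, fs=10_000, f1=50):.3%}")   # ~5.8% for this test signal
```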
Performance Analysis of Visible Light Communication Using CMOS Sensors.
Do, Trong-Hop; Yoo, Myungsik
2016-02-29
This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis.
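The SINR measure formulated in the paper reduces, generically, to signal power over interference-plus-noise power; a minimal sketch (the paper's rolling-shutter-specific terms are not reproduced, and the powers below are invented):

```python
import numpy as np

def sinr_db(signal_power, interference_power, noise_power):
    """Signal to interference plus noise ratio, in dB."""
    return 10 * np.log10(signal_power / (interference_power + noise_power))

# Hypothetical per-pixel powers for a rolling-shutter VLC link (units arbitrary).
print(f"SINR = {sinr_db(1.0e-6, 2.0e-7, 5.0e-8):.1f} dB")
```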
Interfacing modules for integrating discipline specific structural mechanics codes
NASA Technical Reports Server (NTRS)
Endres, Ned M.
1989-01-01
An outline of the organization and capabilities of the Engine Structures Computational Simulator (Simulator) at NASA Lewis Research Center is given. One of the goals of the research at Lewis is to integrate various discipline specific structural mechanics codes into a software system which can be brought to bear effectively on a wide range of engineering problems. This system must possess the qualities of being effective and efficient while still remaining user friendly. The simulator was initially designed for the finite element simulation of gas jet engine components. Currently, the simulator has been restricted to only the analysis of high pressure turbine blades and the accompanying rotor assembly, although the current installation can be expanded for other applications. The simulator presently assists the user throughout its procedures by performing information management tasks, executing external support tasks, organizing analysis modules and executing these modules in the user defined order while maintaining processing continuity.
NASA Technical Reports Server (NTRS)
Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.
2015-01-01
Model and simulation (MS) credibility is defined as the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to model developers, analysts, and end users for assessing MS credibility. Of the eight characteristics, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data quality differences have varying consequences across the range of MS applications. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input pedigree credibility of the model. This requirement provides a conservative assessment of model inputs and maximizes communication of the potential level of risk of using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.
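A toy sketch of the two scoring conventions contrasted above, with invented sub-scores and sensitivities; this is an illustration of the idea, not the standard's normative procedure:

```python
# Hypothetical input pedigree sub-scores (higher = better data quality) and
# the output's sensitivity to each input (fractions summing to 1); all invented.
input_scores = {"thermal_props": 4, "crew_anthropometry": 3, "dose_rates": 1}
sensitivity = {"thermal_props": 0.7, "crew_anthropometry": 0.2, "dose_rates": 0.1}

# NASA-STD-7009 convention: the lowest-quality input governs the whole score.
standard_score = min(input_scores.values())

# Proposed alternative: weight each input's score by the output's sensitivity.
weighted_score = sum(sensitivity[k] * v for k, v in input_scores.items())

print(standard_score, round(weighted_score, 2))   # 1 vs. 3.5 for these values
```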
The YAV-8B simulation and modeling. Volume 2: Program listing
NASA Technical Reports Server (NTRS)
1983-01-01
Detailed mathematical models of varying complexity representative of the YAV-8B aircraft are defined and documented. These models are used in parameter estimation and in linear analysis computer programs while investigating YAV-8B aircraft handling qualities. Both a six degree of freedom nonlinear model and a linearized three degree of freedom longitudinal and lateral directional model were developed. The nonlinear model is based on the mathematical model used on the MCAIR YAV-8B manned flight simulator. This simulator model has undergone periodic updating based on the results of approximately 360 YAV-8B flights and 8000 hours of wind tunnel testing. Qualified YAV-8B flight test pilots have commented that the handling qualities characteristics of the simulator are quite representative of the real aircraft. These comments are validated herein by comparing data from both static and dynamic flight test maneuvers to the same obtained using the nonlinear program.
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology for problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises as these representations are incomplete. Identification of the sources of these uncertainties and the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and geospatial analysis applied to any field of application. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model, which randomly selects potential cells for urbanisation, with transition rules evaluating the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model to evaluate various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties using the SWAT model are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
1994-03-01
… optimize, and perform "what-if" analysis on a complicated simulation model of the greenhouse effect. Regression metamodels were applied to several modules of … the large integrated assessment model of the greenhouse effect. In this study, the metamodels gave "acceptable forecast errors" and were shown to …
Desktop microsimulation: a tool to improve efficiency in the medical office practice.
Montgomery, James B; Linville, Beth A; Slonim, Anthony D
2013-01-01
Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
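A minimal discrete-event sketch of the kind of patient-flow question described above; a commercial simulation package was used in the study, so this toy model only illustrates the mechanics (all parameters invented):

```python
import random

def simulate_clinic(n_patients=500, n_rooms=4, mean_arrival=5.0, mean_exam=18.0):
    """Toy patient-flow model: Poisson arrivals, exponential exam times,
    n_rooms identical exam rooms, first-come first-served."""
    random.seed(42)
    free_at = [0.0] * n_rooms          # time each exam room next becomes free
    t, waits, busy_time = 0.0, [], 0.0
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_arrival)        # next arrival (min)
        i = min(range(n_rooms), key=free_at.__getitem__)   # earliest-free room
        start = max(t, free_at[i])                         # queue if all busy
        service = random.expovariate(1.0 / mean_exam)
        free_at[i] = start + service
        waits.append(start - t)
        busy_time += service
    return sum(waits) / len(waits), busy_time / (n_rooms * max(free_at))

avg_wait, utilisation = simulate_clinic()
print(f"average wait {avg_wait:.1f} min, exam-room utilisation {utilisation:.0%}")
```

Rerunning such a model with changed room counts or check-in staffing is the "scenario" step the abstract describes.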
Fast Whole-Engine Stirling Analysis
NASA Technical Reports Server (NTRS)
Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako
2006-01-01
This presentation discusses the whole-engine simulation approach, covering physical consistency, REV regenerator modeling, grid layering for smoothness and quality, conjugate heat transfer method adjustment, a high-speed low-cost parallel cluster, and debugging.
An Enhanced K-Means Algorithm for Water Quality Analysis of The Haihe River in China
Zou, Hui; Zou, Zhihong; Wang, Xiaojing
2015-01-01
The increase and complexity of data caused by the uncertain environment is today's reality. In order to identify water quality effectively and reliably, this paper presents a modified fast clustering algorithm for water quality analysis. The algorithm adopts a varying-weights K-means clustering approach to analyze water monitoring data. The varying-weights scheme used the best weighting indicator selected by a modified indicator weight self-adjustment algorithm based on K-means, named MIWAS-K-means. The new clustering algorithm avoids the margin of the iteration not being calculated in some cases. With the fast clustering analysis, we can identify the quality of water samples. The algorithm is applied in the water quality analysis of the Haihe River (China), using data obtained by the monitoring network over a period of eight years (2006-2013) with four indicators at seven different sites (2078 samples). Both the theoretical and simulated results demonstrate that the algorithm is efficient and reliable for water quality analysis of the Haihe River. In addition, the algorithm can be applied to more complex data matrices with high dimensionality. PMID:26569283
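A sketch of the weighted K-means core: features are scaled by indicator weights before standard Lloyd iterations. The paper's MIWAS weight self-adjustment step is not reproduced here; the weights below are assumed inputs:

```python
import numpy as np

def weighted_kmeans(x, weights, k=3, iters=100, seed=0):
    """K-means on indicator-weighted features: each column of x is scaled by
    its weight before standard Lloyd iterations (a stand-in for the paper's
    MIWAS-K-means, whose weight self-adjustment is not reproduced)."""
    rng = np.random.default_rng(seed)
    xw = np.asarray(x, float) * np.asarray(weights, float)   # apply weights
    centers = xw[rng.choice(len(xw), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(xw[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([xw[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# 2078 samples x 4 indicators would slot in here; random stand-in data below.
x = np.random.default_rng(1).random((200, 4))
labels, _ = weighted_kmeans(x, weights=[0.4, 0.3, 0.2, 0.1], k=3)
```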
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM), which automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES), which manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE), which provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil and Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water Quality Analysis Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
Building test data from real outbreaks for evaluating detection algorithms.
Texier, Gaetan; Jackson, Michael L; Siwe, Leonel; Meynard, Jean-Baptiste; Deparis, Xavier; Chaudet, Herve
2017-01-01
Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape and covering a sufficient range of agents, size and duration, is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (Binomial, Inverse Transform Sampling Method (ITSM), Metropolis-Hastings Random Walk, Metropolis-Hastings Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and the simulation parameters (i.e. days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms and metrics chosen for the evaluation, simulation quality decreased with the increase in the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e. overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, the binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within the range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals.
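A sketch of one route through the method: homothetic rescaling of a historical daily-case curve followed by inverse transform sampling (ITSM) of individual case days. The input curve is invented, and the other resampling variants are not shown:

```python
import numpy as np

def simulate_outbreak(historical_curve, target_days, target_cases, seed=0):
    """Homothetic rescale of a historical daily-case curve to a target
    duration, then inverse transform sampling of individual case days."""
    rng = np.random.default_rng(seed)
    hist = np.asarray(historical_curve, float)
    # Homothetic transformation: stretch/shrink the time axis to target_days.
    src = np.linspace(0, 1, len(hist))
    dst = np.linspace(0, 1, target_days)
    shape = np.interp(dst, src, hist)
    cdf = np.cumsum(shape) / shape.sum()
    # ITSM: draw uniform variates, map through the inverse CDF to case days.
    days = np.searchsorted(cdf, rng.random(target_cases))
    return np.bincount(days, minlength=target_days)   # simulated daily counts

# Invented historical curve, rescaled to a 14-day, 60-case outbreak signal.
print(simulate_outbreak([1, 4, 9, 15, 10, 5, 2], target_days=14, target_cases=60))
```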
NASA Astrophysics Data System (ADS)
Katchasuwanmanee, Kanet; Cheng, Kai; Bateman, Richard
2016-09-01
As energy efficiency is one of the key essentials for sustainability, the development of an energy- and resource-efficient manufacturing system is among the great challenges facing industry today. Meanwhile, the availability of advanced technological innovation has created more complex manufacturing systems that involve a large variety of processes and machines serving different functions. To extend the limited knowledge on energy-efficient scheduling, the research presented in this paper attempts to model the production schedule of an operation process by considering the balance of energy consumption reduction, production work flow (productivity), and quality. An innovative systematic approach to manufacturing energy-resource efficiency is proposed with virtual simulation as a predictive modelling enabler, which provides real-time manufacturing monitoring, virtual displays and decision-making, and consequently an analytical, multidimensional correlation analysis of the interdependent relationships among energy consumption, work flow, and quality errors. The regression analysis results demonstrate positive relationships between work flow and quality errors and between work flow and energy consumption. When production scheduling is controlled through optimization of work flow, quality errors, and overall energy consumption, energy-resource efficiency can be achieved in production. Together, this proposed multidimensional modelling and analysis approach provides optimal conditions for production scheduling in the manufacturing system by taking account of production quality, energy consumption, and resource efficiency, which can lead to key competitive advantages and sustainability of system operations in industry.
Development and analysis of air quality modeling simulations for hazardous air pollutants
NASA Astrophysics Data System (ADS)
Luecken, D. J.; Hutzell, W. T.; Gipson, G. L.
The concentrations of five hazardous air pollutants were simulated using the Community Multiscale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results are shown for formaldehyde, acetaldehyde, benzene, 1,3-butadiene and acrolein. Photochemical production in the atmosphere is predicted to dominate ambient formaldehyde and acetaldehyde concentrations, and to account for a significant fraction of ambient acrolein concentrations. Spatial and temporal variations are large throughout the domain over the year. Predicted concentrations are compared with observations for formaldehyde, acetaldehyde, benzene and 1,3-butadiene. Although the modeling results indicate an overall slight tendency toward underprediction, they reproduce the episodic and seasonal behavior of pollutant concentrations at many monitors with good skill.
NASA Technical Reports Server (NTRS)
Urie, D. M.
1979-01-01
Relaxed static stability and stability augmentation with active controls were investigated for subsonic transport aircraft. Analytical and simulator evaluations were done using a contemporary wide body transport as a baseline. Criteria for augmentation system performance and unaugmented flying qualities were evaluated. Augmentation control laws were defined based on selected frequency response and time history criteria. Flying qualities evaluations were conducted by pilots using a moving base simulator with a transport cab. Static margin and air turbulence intensity were varied in test with and without augmentation. Suitability of a simple pitch control law was verified at neutral static margin in cruise and landing flight tasks. Neutral stability was found to be marginally acceptable in heavy turbulence in both cruise and landing conditions.
NASA Astrophysics Data System (ADS)
Swain, Snehaprava; Ray, Pravat Kumar
2016-12-01
In this paper, a three-phase fault analysis is performed on a DFIG-based grid-integrated wind energy system. A Novel Active Crowbar Protection (NACB_P) system is proposed to enhance the fault ride-through (FRT) capability of the DFIG under both symmetrical and unsymmetrical grid faults, and hence to improve the power quality of the system. The protection scheme proposed here is designed with a capacitor in series with the resistor, unlike the conventional crowbar (CB), which has only resistors. The major function of the capacitor in the protection circuit is to eliminate the ripples generated in the rotor current and to protect the converter as well as the DC-link capacitor. It also compensates for the reactive power required by the DFIG during a fault. Due to these advantages, the proposed scheme enhances the FRT capability of the DFIG and also improves the power quality of the whole system. Experimentally, the fault analysis is performed on a 3 hp slip-ring induction generator, and simulation results are carried out on a 1.7 MVA DFIG-based WECS under different types of grid faults in MATLAB/Simulink, verifying the functionality of the proposed scheme.
NASA Astrophysics Data System (ADS)
Yu, H.; Prospero, J. M.; Chin, M.; Randles, C. A.; da Silva, A.; Bian, H.
2015-12-01
Long-term surface measurements at several locations extending from the northeastern coast of South America to Miami, Florida have shown that African dust arrives in the Greater Caribbean Basin throughout the year. This long-range transported dust frequently elevates the level of particulate matter (PM) above the WHO guideline for PM10, which raises concern about the possible adverse impact of African dust on human health in the region. There is also concern about how future climate change might affect dust transport and its influence on regional air quality. In this presentation we provide a comprehensive characterization of the influence of African dust on air quality in the Caribbean Basin by integrating ground observations with satellite retrievals and model simulations. The ground observations are used to validate and evaluate satellite retrievals and model simulations of dust, while satellite measurements and model simulations are used to extend the spatial coverage of the ground observations. An analysis of CALIPSO lidar measurements of the three-dimensional distribution of aerosols over 2007-2014 yields altitude-resolved dust mass flux into the region. On the basis of an 8-year average, integrated over the latitude zone of 0°-30°N, a total of 76 Tg of dust is imported into the air above the Greater Caribbean Basin, of which 34 Tg (or 45%) is within the lowest 1 km layer and most relevant to air quality concerns. The seasonal and interannual variations of the dust import are well correlated with ground observations of dust in Cayenne, Barbados, Puerto Rico, and Miami. We will also show comparisons of the size-resolved dust amounts from both the NASA GEOS-5 aerosol simulation and the MERRA-2 aerosol reanalysis (i.e., column aerosol loading constrained by satellite measurements of radiance at the top of the atmosphere) with the ground observations and satellite measurements.
An objective method for a video quality evaluation in a 3DTV service
NASA Astrophysics Data System (ADS)
Wilczewski, Grzegorz
2015-09-01
The following article describes a proposed objective method for 3DTV video quality evaluation, the Compressed Average Image Intensity (CAII) method. Identification of the 3DTV service's content chain nodes enables the design of a versatile, objective video quality metric based on an advanced approach to stereoscopic videostream analysis. Insights into the designed metric's mechanisms, as well as an evaluation of the metric's performance under simulated environmental conditions, are discussed herein. As a result, the CAII metric might be effectively used in a variety of service quality assessment applications.
3D Fiber Orientation Simulation for Plastic Injection Molding
NASA Astrophysics Data System (ADS)
Lin, Baojiu; Jin, Xiaoshi; Zheng, Rong; Costa, Franco S.; Fan, Zhiliang
2004-06-01
Glass fiber reinforced polymer is widely used in products made by injection molding. The distribution of fiber orientation inside plastic parts has a direct effect on the quality of molded parts. Using computer simulation to predict fiber orientation distribution is one of the most efficient ways to assist engineers in warpage analysis and in finding a good design solution to produce high quality plastic parts. Fiber orientation simulation software based on 2-1/2D (midplane/dual domain mesh) techniques has been used in industry for a decade. However, the 2-1/2D technique is based on the planar Hele-Shaw approximation and is not suitable when the geometry has complex three-dimensional features which cannot be well approximated by 2D shells. Recently, full 3D simulation software for fiber orientation has been developed and integrated into the Moldflow Plastics Insight 3D simulation software. The theory for this new 3D fiber orientation calculation module is described in this paper. Several examples are also presented to show the benefit of using 3D fiber orientation simulation.
Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis
2017-11-01
In recent years, interest in petrochemical processes has been increasing, especially in the refining area. However, the high variability of the dynamic characteristics present in the atmospheric distillation column poses a challenge to obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new design of a control strategy for a conventional crude oil distillation plant, defined using formal interaction analysis tools. The process dynamics and its control are simulated in the Aspen HYSYS® dynamic environment under real operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gulliver, Eric A.
The objective of this thesis is to identify and develop techniques providing direct comparison between simulated and real packed particle mixture microstructures containing submicron-sized particles. This entailed devising techniques for simulating powder mixtures, producing real mixtures with known powder characteristics, sectioning real mixtures, interrogating mixture cross-sections, evaluating and quantifying the mixture interrogation process, and comparing interrogation results between mixtures. A drop-and-roll-type particle-packing model was used to generate simulations of random mixtures. The simulated mixtures were then evaluated to establish that they were not segregated and were free from gross defects. A powder processing protocol was established to provide real mixtures for direct comparison and for use in evaluating the simulation. The powder processing protocol was designed to minimize differences between measured particle size distributions and the particle size distributions in the mixture. A sectioning technique was developed that was capable of producing distortion-free cross-sections of fine-scale particulate mixtures. Tessellation analysis was used to interrogate mixture cross-sections, and statistical quality control charts were used to evaluate different types of tessellation analysis and to establish the importance of differences between simulated and real mixtures. The particle-packing program generated crescent-shaped pores below large particles but otherwise realistic-looking mixture microstructures. Focused ion beam milling was the only technique capable of sectioning particle compacts in a manner suitable for stereological analysis. Johnson-Mehl and Voronoi tessellation of the same cross-sections produced tessellation tiles with different tile-area populations. Control chart analysis showed Johnson-Mehl tessellation measurements are superior to Voronoi tessellation measurements for detecting variations in mixture microstructure, such as altered particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from real mixtures matched simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent-quality, high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.
Improvement of Meteorological Inputs for TexAQS-II Air Quality Simulations
NASA Astrophysics Data System (ADS)
Ngan, F.; Byun, D.; Kim, H.; Cheng, F.; Kim, S.; Lee, D.
2008-12-01
An air quality forecasting system (UH-AQF) for Eastern Texas, operated by the Institute for Multidimensional Air Quality Studies (IMAQS) at the University of Houston, uses the Fifth-Generation PSU/NCAR Mesoscale Model (MM5) as the meteorological driver for modeling air quality with the Community Multiscale Air Quality (CMAQ) model. While the forecasting system was successfully used for the planning and implementation of various measurement activities, evaluations of the forecasting results revealed a few systematic problems in the numerical simulations. From comparison with observations, we observe at times an over-prediction of northerly winds caused by inaccurate synoptic inputs, and at other times too-strong southerly winds caused by local sea breeze development. Discrepancies in maximum and minimum temperature are also seen on certain days. Precipitation events, as well as clouds, are occasionally simulated at the wrong locations and times. The model sometimes simulates unrealistic thunderstorms, causing unrealistically strong outflows. To understand the physical and chemical processes influencing air quality measures, a proper description of real-world meteorological conditions is essential. The objective of this study is to generate better meteorological inputs than the AQF results to support the chemistry modeling. We utilized existing objective analysis and nudging tools in the MM5 system to develop the MUltiscale Nest-down Data Assimilation System (MUNDAS), which incorporates the extensive meteorological observations available in the simulated domain, for the retrospective simulation of the TexAQS-II period. With the re-simulated meteorological input, we are able to better predict ozone events during the TexAQS-II period. In addition, base datasets in MM5 such as land use/land cover, vegetation fraction, soil type, and sea surface temperature are updated with satellite data to represent surface features more accurately. These are key physical parameter inputs affecting the transfer of heat, momentum, and soil moisture in the land-surface processes in MM5. Using the accurate base input datasets, we see improved predictions of ground temperatures, winds, and even thunderstorm activity within the boundary layer.
Green, W. Reed
2001-01-01
Lake Maumelle is the major drinking-water source for the Little Rock metropolitan area in central Arkansas. Urban and agricultural development has increased in the Lake Maumelle Basin, and information is needed related to constituent transport and water-quality response to changes in constituent loading or hydrologic regime. This report characterizes ambient conditions in Lake Maumelle and its major tributary, the Maumelle River; describes the calibration and verification of a numerical model of hydrodynamics and water quality; and provides several simulations that describe constituent transport and water-quality response to changes in constituent loading and hydrologic regime. Ambient hydrologic and water-quality conditions demonstrate the relatively undisturbed nature of Lake Maumelle and the Maumelle River. Nitrogen and phosphorus concentrations were low, one to two orders of magnitude lower than estimates of national background nutrient concentrations. Phosphorus and chlorophyll a concentrations in Lake Maumelle demonstrate its oligotrophic/mesotrophic condition. However, concentrations of chlorophyll a appeared to increase since 1990 within the upper and middle reaches of the reservoir. A two-dimensional, laterally averaged hydrodynamic and water-quality model developed and calibrated for Lake Maumelle simulates water level, currents, heat transport and temperature distribution, conservative material transport, and the transport and transformation of 11 chemical constituents. Simulations included the movement and dispersion of spills or releases in the reservoir during stratified and unstratified conditions, release of the fish nursery pond off the southern shore of Lake Maumelle, and algal responses to changes in external loading. The model was calibrated using 1991 data and verified using 1992 data. Simulated temperature and dissolved oxygen concentrations related well when compared to measured values. Simulated nutrient and algal biomass also related reasonably well when compared to measured values. A simulated spill of conservative material at the upper end of Lake Maumelle during a major storm event took less than 102 hours to disperse the entire length of the reservoir. Simulation of a nursery pond release into a tributary to Lake Maumelle demonstrated how the released water plunges within the receiving embayment and enters the main stem of the reservoir at mid depths. Simulations of algal response to increases of nitrogen and phosphorus loads demonstrate the phosphorus-limiting condition in Lake Maumelle. Results from this study will provide water-resource management with information to better understand how changes in hydrology and water quality in the basin affect water quality in the reservoir. With this information, managers will be able to more effectively manage their drinking-water source supply.
Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...
NASA Technical Reports Server (NTRS)
Prive, N. C.; Errico, R. M.; Tai, K.-S.
2013-01-01
The Global Modeling and Assimilation Office (GMAO) observing system simulation experiment (OSSE) framework is used to explore the response of analysis error and forecast skill to observation quality. In an OSSE, synthetic observations may be created that have much smaller error than real observations, and precisely quantified error may be applied to these synthetic observations. Three experiments are performed in which synthetic observations with magnitudes of applied observation error that vary from zero to twice the estimated realistic error are ingested into the Goddard Earth Observing System Model (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation for a one-month period representing July. The analysis increment and observation innovation are strongly impacted by observation error, with much larger variances for increased observation error. The analysis quality is degraded by increased observation error, but the change in root-mean-square error of the analysis state is small relative to the total analysis error. Surprisingly, in the 120-hour forecast, increased observation error yields only a slight decline in forecast skill in the extratropics, and no discernible degradation of forecast skill in the tropics.
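A minimal scalar sketch of the OSSE idea described here, assuming a toy optimal-interpolation analysis rather than the GEOS-5/GSI system: synthetic observations are generated from a known truth with an applied error magnitude scaled from zero to twice the "realistic" value, while the assimilation keeps assuming the realistic error in its gain. All error sigmas are assumptions for illustration.

```python
import numpy as np

# Toy scalar OSSE sketch (an assumption, not the GEOS-5/GSI system): vary the
# applied observation error while the analysis gain assumes the realistic one.
rng = np.random.default_rng(0)
x_true = rng.standard_normal(10_000)                   # nature-run truth
x_bkg = x_true + 0.5 * rng.standard_normal(10_000)     # background forecast
sigma_b, sigma_o = 0.5, 0.4                            # assumed error sigmas
gain = sigma_b**2 / (sigma_b**2 + sigma_o**2)          # fixed scalar gain

for alpha in (0.0, 1.0, 2.0):                          # 0x, 1x, 2x realistic error
    y = x_true + alpha * sigma_o * rng.standard_normal(10_000)  # synthetic obs
    increment = gain * (y - x_bkg)                     # analysis increment
    x_ana = x_bkg + increment
    print(f"alpha={alpha}: increment variance={increment.var():.3f}, "
          f"analysis RMSE={np.sqrt(((x_ana - x_true)**2).mean()):.3f}")
```

Even in this toy, the increment variance grows with the applied error, mirroring the paper's finding that increments and innovations respond strongly to observation error.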
Creation and Validation of a Simulator for Neonatal Brain Ultrasonography: A Pilot Study.
Tsai, Andy; Barnewolt, Carol E; Prahbu, Sanjay P; Yonekura, Reimi; Hosmer, Andrew; Schulz, Noah E; Weinstock, Peter H
2017-01-01
Historically, skills training in performing brain ultrasonography has been limited to hours of scanning infants for lack of adequate synthetic models or alternatives. The aim of this study was to create a simulator and determine its utility as an educational tool in teaching the skills that can be used in performing brain ultrasonography on infants. A brain ultrasonography simulator was created using a combination of multi-modality imaging, three-dimensional printing, material and acoustic engineering, and sculpting and molding. Radiology residents participated prior to their pediatric rotation. The study included (1) an initial questionnaire and resident creation of three coronal images using the simulator; (2) brain ultrasonography lecture; (3) hands-on simulator practice; and (4) a follow-up questionnaire and re-creation of the same three coronal images on the simulator. A blinded radiologist scored the quality of the pre- and post-training images using metrics including symmetry of the images and inclusion of predetermined landmarks. Wilcoxon rank-sum test was used to compare pre- and post-training questionnaire rankings and image quality scores. Ten residents participated in the study. Analysis of pre- and post-training rankings showed improvements in technical knowledge and confidence, and reduction in anxiety in performing brain ultrasonography. Objective measures of image quality likewise improved. Mean reported value score for simulator training was high across participants who reported perceived improvements in scanning skills and enjoyment from simulator use, with interest in additional practice on the simulator and recommendations for its use. This pilot study supports the use of a simulator in teaching radiology residents the skills that can be used to perform brain ultrasonography. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Janssen, D; Zwartelé, R E; Doets, H C; Verdonschot, N
2010-01-01
Patients suffering from rheumatoid arthritis typically have a poor subchondral bone quality, endangering implant fixation. Using finite element analysis (FEA), an investigation was made into whether a press-fit acetabular implant with a polar clearance would reduce interfacial micromotions and improve fixation compared with a standard hemispherical design. In addition, the effects of interference fit, friction, and implant material were analysed. Cups were introduced into an FEA model of a human pelvis with simulated subchondral bone plasticity. The models were loaded with a loading configuration simulating two cycles of normal walking, during which contact stresses and interfacial micromotions were monitored. Subsequently, a lever-out simulation was performed to assess the fixation strength of the various cases. A flattened cup with good bone quality produced the lowest interfacial micromotions. Poor bone decreased the fixation strength regardless of the geometry of the cup. Increasing the interference fit of the flattened cup compensated for the loss of fixation strength caused by poor bone quality. In conclusion, a flattened cup did not significantly improve implant fixation over a hemispherical cup in the case of poor bone quality. However, implant fixation can be optimized by increasing interference fit and avoiding inferior frictional properties and low-stiffness implants.
Enhanced verification test suite for physics simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.
2008-09-01
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop the Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
NASA Astrophysics Data System (ADS)
Voss, Anja; Bärlund, Ilona; Punzet, Manuel; Williams, Richard; Teichert, Ellen; Malve, Olli; Voß, Frank
2010-05-01
Although catchment-scale modelling of water and solute transport and transformations is a widely used technique to study pollution pathways and the effects of natural changes, policies, and mitigation measures, there are only a few examples of global water quality modelling. This work provides a description of the new continental-scale water quality model WorldQual and an analysis of model simulations under changed climate and anthropogenic conditions with respect to changes in diffuse and point loading as well as surface water quality. BOD is used as an indicator of the level of organic pollution and its oxygen-depleting potential, and of the overall health of aquatic ecosystems. The first application of this new water quality model is to the river systems of Europe. The model itself is being developed as part of the EU-funded SCENES Project, which has the principal goal of developing new scenarios of the future of freshwater resources in Europe. The aim of the model is to determine chemical fluxes in different pathways, combining analysis of water quantity with water quality. Simple equations, consistent with the availability of data on the continental scale, are used to simulate the response of in-stream BOD concentrations to diffuse and anthropogenic point loadings as well as flow dilution. Point sources are divided into manufacturing, domestic, and urban loadings, whereas diffuse loadings come from scattered settlements, agricultural input (for instance livestock farming), and natural background sources. The model is tested against measured longitudinal gradients and time series data at specific river locations with different loading characteristics, such as the Thames, which is dominated by domestic loading, and the Ebro, which has a relatively high share of diffuse loading. Scenario studies investigate the influence of climate and anthropogenic changes on European water resources, addressing the following questions: 1. What percentage of river systems will have degraded water quality due to different driving forces? 2. How will climate change and changes in wastewater discharges affect water quality? For the analysis, these scenario aspects are included: 1. climate, with changed runoff (affecting diffuse pollution and loading from sealed areas), river discharge (causing dilution or concentration of point-source pollution), and water temperature (affecting BOD degradation); 2. point sources, with changed population (affecting domestic pollution) and connectivity to treatment plants (influencing domestic and manufacturing pollution as well as input from sealed areas and scattered settlements).
Simulation and study of power quality issues in a fixed speed wind farm substation.
Magesh, T; Chellamuthu, C
2015-01-01
Power quality issues associated with the fixed speed wind farm substation located at Coimbatore district are investigated as the wind generators are tripping frequently. The investigations are carried out using two power quality analyzers, Fluke 435 and Dranetz PX5.8, with one of them connected at group control breaker of the 110 kV feeder and the other at the selected 0.69 kV generator busbar during the period of maximum power generation. From the analysis of the recorded data it is found that sag, swell, and transients are the major events which are responsible for the tripping of the generators. In the present study, simulation models for wind, turbine, shaft, pitch mechanism, induction generator, and grid are developed using DIgSILENT. Using the turbine characteristics, a two-dimensional lookup table is designed to generate a reference pitch angle necessary to simulate the power curve of the passive stall controlled wind turbine. Various scenarios and their effects on the performance of the wind farm are studied and validated with the recorded data and waveforms. The simulation model will be useful for the designers for planning and development of the wind farm before implementation.
NASA Astrophysics Data System (ADS)
Hogrefe, Christian; Liu, Peng; Pouliot, George; Mathur, Rohit; Roselle, Shawn; Flemming, Johannes; Lin, Meiyun; Park, Rokjin J.
2018-03-01
This study analyzes simulated regional-scale ozone burdens both near the surface and aloft, estimates process contributions to these burdens, and calculates the sensitivity of the simulated regional-scale ozone burden to several key model inputs with a particular emphasis on boundary conditions derived from hemispheric or global-scale models. The Community Multiscale Air Quality (CMAQ) model simulations supporting this analysis were performed over the continental US for the year 2010 within the context of the Air Quality Model Evaluation International Initiative (AQMEII) and Task Force on Hemispheric Transport of Air Pollution (TF-HTAP) activities. CMAQ process analysis (PA) results highlight the dominant role of horizontal and vertical advection on the ozone burden in the mid-to-upper troposphere and lower stratosphere. Vertical mixing, including mixing by convective clouds, couples fluctuations in free-tropospheric ozone to ozone in lower layers. Hypothetical bounding scenarios were performed to quantify the effects of emissions, boundary conditions, and ozone dry deposition on the simulated ozone burden. Analysis of these simulations confirms that the characterization of ozone outside the regional-scale modeling domain can have a profound impact on simulated regional-scale ozone. This was further investigated by using data from four hemispheric or global modeling systems (Chemistry - Integrated Forecasting Model (C-IFS), CMAQ extended for hemispheric applications (H-CMAQ), the Goddard Earth Observing System model coupled to chemistry (GEOS-Chem), and AM3) to derive alternate boundary conditions for the regional-scale CMAQ simulations. The regional-scale CMAQ simulations using these four different boundary conditions showed that the largest ozone abundance in the upper layers was simulated when using boundary conditions from GEOS-Chem, followed by the simulations using C-IFS, AM3, and H-CMAQ boundary conditions, consistent with the analysis of the ozone fields from the global models along the CMAQ boundaries. Using boundary conditions from AM3 yielded higher springtime ozone column burdens in the middle and lower troposphere compared to boundary conditions from the other models. For surface ozone, the differences between the AM3-driven CMAQ simulations and the CMAQ simulations driven by other large-scale models are especially pronounced during spring and winter where they can reach more than 10 ppb for seasonal mean ozone mixing ratios and as much as 15 ppb for domain-averaged daily maximum 8 h average ozone on individual days. In contrast, the differences between the C-IFS-, GEOS-Chem-, and H-CMAQ-driven regional-scale CMAQ simulations are typically smaller. Comparing simulated surface ozone mixing ratios to observations and computing seasonal and regional model performance statistics revealed that boundary conditions can have a substantial impact on model performance. Further analysis showed that boundary conditions can affect model performance across the entire range of the observed distribution, although the impacts tend to be lower during summer and for the very highest observed percentiles. The results are discussed in the context of future model development and analysis opportunities.
Exploratory Factor Analysis with Small Sample Sizes
ERIC Educational Resources Information Center
de Winter, J. C. F.; Dodou, D.; Wieringa, P. A.
2009-01-01
Exploratory factor analysis (EFA) is generally regarded as a technique for large sample sizes ("N"), with N = 50 as a reasonable absolute minimum. This study offers a comprehensive overview of the conditions in which EFA can yield good quality results for "N" below 50. Simulations were carried out to estimate the minimum required "N" for different…
Missing data is a common problem in the application of statistical techniques. In principal component analysis (PCA), a technique for dimensionality reduction, incomplete data points are either discarded or imputed using interpolation methods. Such approaches are less valid when ...
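A minimal sketch of the two conventional treatments the snippet contrasts, discarding incomplete rows versus imputing them before PCA; the synthetic data and the 10% missingness rate are assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.impute import SimpleImputer

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))  # correlated data
mask = rng.random(X.shape) < 0.1          # knock out ~10% of entries
X_miss = X.copy()
X_miss[mask] = np.nan

# Strategy 1: discard incomplete rows (listwise deletion)
complete = X_miss[~np.isnan(X_miss).any(axis=1)]
pca_del = PCA(n_components=2).fit(complete)

# Strategy 2: impute missing entries with column means, then run PCA
X_imp = SimpleImputer(strategy="mean").fit_transform(X_miss)
pca_imp = PCA(n_components=2).fit(X_imp)

print("rows kept after deletion:", complete.shape[0])
print("explained variance (deletion):", pca_del.explained_variance_ratio_)
print("explained variance (imputation):", pca_imp.explained_variance_ratio_)
```

Deletion discards roughly 40% of the rows here, which is exactly the information loss that motivates the alternative approaches the snippet alludes to.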
Mission definition study for Stanford relativity satellite. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
1971-01-01
An analysis is presented for the cost of the mission as a function of the following variables: amount of redundancy in the spacecraft, amount of care taken in building the spacecraft (functional and environmental tests, screening of components, quality control, etc), and the number of flights necessary to accomplish the mission. Thermal analysis and mathematical models for the experimental components are presented. The results of computer structural and stress analyses for support and cylinders are discussed. Reliability, quality control, and control system simulation by computer are also considered.
Flight Dynamics Aspects of a Large Civil Tiltrotor Simulation Using Translational Rate Command
NASA Technical Reports Server (NTRS)
Lawrence, Ben; Malpica, Carlos A.; Theodore, Colin R.; Decker, William A.; Lindsey, James E.
2011-01-01
An in-depth analysis of a Large Civil Tiltrotor simulation with a Translational Rate Command control law that uses automatic nacelle deflections for longitudinal velocity control and lateral cyclic for lateral velocity control is presented. Results from piloted real-time simulation experiments and offline time- and frequency-domain analyses are used to investigate the fundamental flight dynamic and control mechanisms of the control law. The baseline Translational Rate Command conferred handling qualities improvements over an attitude command attitude hold control law, but in some scenarios there was a tendency to enter pilot-induced oscillation (PIO). Nacelle actuator rate limiting strongly influenced the PIO tendency, and reducing the rate limits degraded the handling qualities further. Counterintuitively, increasing the rate limits also led to a worsening of the handling qualities ratings. This led to the identification of a nacelle-rate to rotor longitudinal-flapping coupling effect that induced undesired pitching motions proportional to the allowable amount of nacelle rate. A modification that applied a counteracting amount of longitudinal cyclic proportional to the nacelle rate significantly improved the handling qualities. The lateral axis of the Translational Rate Command conferred Level 1 handling qualities in a Lateral Reposition maneuver. An analysis of the influence of the modeling fidelity on the lateral flapping angles is presented. It is shown that the linear modeling approximation is likely to have under-predicted the side force and therefore under-predicted the lateral flapping at velocities above 15 ft/s. However, at lower velocities, where the side-force modeling has a weaker influence, the accelerations commanded by the control law also significantly influenced the peak levels of lateral flapping achieved.
A Monte Carlo analysis of breast screening randomized trials.
Zamora, Luis I; Forastero, Cristina; Guirado, Damián; Lallena, Antonio M
2016-12-01
To analyze breast screening randomized trials with a Monte Carlo simulation tool. A simulation tool previously developed to simulate breast screening programmes was adapted for that purpose. The history of women participating in the trials was simulated, including a model for survival after local treatment of invasive cancers. Distributions of the time gained due to screening detection versus symptomatic detection, and the overall screening sensitivity, were used as inputs. Several randomized controlled trials were simulated. Except for the age range of the women involved, all simulations used the same population characteristics, and this permitted an analysis of their external validity. The relative risks obtained were compared to those quoted for the trials, whose internal validity was addressed by further investigating the reasons for the disagreements observed. The Monte Carlo simulations produce results that are in good agreement with most of the randomized trials analyzed, thus indicating their methodological quality and external validity. A reduction of breast cancer mortality of around 20% appears to be a reasonable value according to the results of the trials that are methodologically correct. Discrepancies observed with the Canada I and II trials may be attributed to low mammography quality and some methodological problems. The Kopparberg trial appears to show a low methodological quality. Monte Carlo simulations are a powerful tool to investigate breast screening randomized controlled trials, helping to establish those whose results are reliable enough to be extrapolated to other populations, and to design trial strategies and, eventually, adapt them during their development. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
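A toy Monte Carlo in the spirit of the described tool; all rates (baseline mortality, screening sensitivity, mortality benefit of early detection) are illustrative assumptions, and the resulting relative risk lands near the quoted ~20% reduction only because of the values chosen.

```python
import numpy as np

# Toy two-arm screening-trial Monte Carlo; all rates are assumptions.
rng = np.random.default_rng(2)
n = 50_000                               # women per arm
base_mortality = 0.004                   # assumed control-arm death risk
sensitivity = 0.85                       # assumed overall screening sensitivity
benefit_if_detected = 0.25               # assumed mortality reduction when
                                         # a cancer is screen-detected early

deaths_ctrl = rng.binomial(1, base_mortality, n).sum()
detected = rng.random(n) < sensitivity   # screen-detected cases, screened arm
risk_scr = np.where(detected, base_mortality * (1 - benefit_if_detected),
                    base_mortality)
deaths_scr = (rng.random(n) < risk_scr).sum()

rr = deaths_scr / deaths_ctrl            # relative risk of the screened arm
print(f"simulated relative risk: {rr:.2f}")
```

Repeating such runs with trial-specific age ranges is what allows the simulated relative risks to be compared against the published ones.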
Evaluative methodology for comprehensive water quality management planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, H. L.
Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.
A pilot modeling technique for handling-qualities research
NASA Technical Reports Server (NTRS)
Hess, R. A.
1980-01-01
A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.
Considerations for the Use of Remote Gaze Tracking to Assess Behavior in Flight Simulators
NASA Technical Reports Server (NTRS)
Kalar, Donald J.; Liston, Dorion; Mulligan, Jeffrey B.; Beutter, Brent; Feary, Michael
2016-01-01
Complex user interfaces (such as those found in an aircraft cockpit) may be designed from first principles, but inevitably must be evaluated with real users. User gaze data can provide valuable information that can help to interpret other actions that change the state of the system. However, care must be taken to ensure that any conclusions drawn from gaze data are well supported. Through a combination of empirical and simulated data, we identify several considerations and potential pitfalls when measuring gaze behavior in high-fidelity simulators. We show that physical layout, behavioral differences, and noise levels can all substantially alter the quality of fit for algorithms that segment gaze measurements into individual fixations. We provide guidelines to help investigators ensure that conclusions drawn from gaze tracking data are not artifactual consequences of data quality or analysis techniques.
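One common way to segment gaze samples into the individual fixations the paper's quality-of-fit concern applies to is a dispersion-threshold (I-DT) algorithm; the sketch below is a generic version with assumed thresholds, not the authors' implementation.

```python
import numpy as np

def idt_fixations(x, y, t, disp_thresh=1.0, min_dur=0.1):
    """Dispersion-threshold (I-DT) fixation segmentation, a minimal sketch.
    x, y: gaze coordinates (e.g., degrees); t: timestamps (s). A window is a
    fixation while (x-range + y-range) stays below disp_thresh and it lasts
    at least min_dur seconds. Thresholds here are illustrative assumptions."""
    fixations, i = [], 0
    while i < len(t):
        j = i
        while (j + 1 < len(t)
               and (x[i:j+2].max() - x[i:j+2].min())
                   + (y[i:j+2].max() - y[i:j+2].min()) < disp_thresh):
            j += 1
        if t[j] - t[i] >= min_dur:
            fixations.append((t[i], t[j], x[i:j+1].mean(), y[i:j+1].mean()))
            i = j + 1
        else:
            i += 1
    return fixations

# Two synthetic ~0.5 s fixations separated by a saccade-like jump
rng = np.random.default_rng(0)
t = np.arange(100) * 0.01
x = np.concatenate([np.zeros(50), np.full(50, 5.0)]) + 0.05 * rng.standard_normal(100)
y = 0.05 * rng.standard_normal(100)
print(len(idt_fixations(x, y, t)), "fixations detected")
```

Raising the noise level in this toy quickly merges or splits fixations, illustrating the paper's point that noise and threshold choices can substantially alter segmentation quality.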
Xi, Qing; Li, Zhao-Fu; Luo, Chuan
2014-05-01
Sensitivity analysis of hydrology and water quality parameters has great significance for an integrated model's construction and application. Based on the AnnAGNPS model's mechanisms, 31 parameters in four major categories (terrain, hydrology and meteorology, field management, and soil) were selected for sensitivity analysis in the Zhongtian River watershed, a typical small watershed of the hilly region around Taihu Lake, and the perturbation method was then used to evaluate the sensitivity of the parameters with respect to the model's simulation results. The results showed that, among the 11 terrain parameters, LS was sensitive to all of the model results, while RMN, RS, and RVC were generally or less sensitive to the sediment output but insensitive to the remaining results. For the hydrometeorological parameters, CN was more sensitive to runoff and sediment and relatively sensitive for the remaining results. Among the field management, fertilizer, and vegetation parameters, CCC, CRM, and RR were less sensitive to sediment and particulate pollutants, and the six fertilizer parameters (FR, FD, FID, FOD, FIP, FOP) were particularly sensitive for nitrogen and phosphorus nutrients. For the soil parameters, K was quite sensitive to all the results except runoff, and the four soil nitrogen and phosphorus ratio parameters (SONR, SINR, SOPR, SIPR) were less sensitive to the corresponding results. The simulation and verification results for runoff in the Zhongtian watershed show good accuracy, with deviations of less than 10% during 2005-2010. These results provide a direct reference for AnnAGNPS parameter selection and calibration adjustment. The runoff simulation results for the study area also proved that the sensitivity analysis is practicable for parameter adjustment, demonstrated the model's adaptability to hydrologic simulation in the hilly region of the Taihu Lake basin, and provide a reference for the model's wider application in China.
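The one-at-a-time perturbation method used here can be sketched as a dimensionless relative sensitivity coefficient, S = (ΔO/O)/(ΔP/P); the toy model standing in for AnnAGNPS below is purely illustrative, as are the parameter values.

```python
def relative_sensitivity(model, params, name, delta=0.1):
    """One-at-a-time perturbation sensitivity:
    S = (change in output / baseline output) / (change in parameter / baseline).
    `model` maps a dict of parameters to a scalar output (e.g., runoff)."""
    base_out = model(params)
    perturbed = dict(params, **{name: params[name] * (1 + delta)})
    return ((model(perturbed) - base_out) / base_out) / delta

# Hypothetical toy model standing in for AnnAGNPS: output grows with CN
# (curve number) and with the LS factor; purely illustrative.
toy = lambda p: p["CN"] ** 2 * p["LS"] * 0.01
params = {"CN": 75.0, "LS": 1.2}
for name in params:
    print(name, round(relative_sensitivity(toy, params, name), 3))
```

Parameters with |S| near or above 1 (here CN) would be flagged as sensitive and prioritized during calibration, which is the ranking logic behind the study's categories.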
Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J
2009-04-01
Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can impact dramatically the classification error that is associated with LQAS analysis.
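A hedged sketch of the kind of simulation described: cluster-level prevalences drawn from a beta distribution (which yields an exact intracluster correlation of rho), binomial sampling within clusters, and a decision threshold; the threshold and all values are illustrative assumptions, not the study's rule.

```python
import numpy as np

# Sketch of a 67x3 LQAS cluster design (67 clusters of 3 children) with a
# beta-binomial cluster effect; the decision threshold is hypothetical.
rng = np.random.default_rng(3)

def classify(prev, n_clusters=67, m=3, rho=0.1, threshold=19, reps=5000):
    # Beta parameters giving mean `prev` and intracluster correlation `rho`
    a = prev * (1 - rho) / rho
    b = (1 - prev) * (1 - rho) / rho
    p_clust = rng.beta(a, b, size=(reps, n_clusters))
    cases = rng.binomial(m, p_clust).sum(axis=1)
    return (cases > threshold).mean()   # fraction classified "high prevalence"

# Hypothetical rule: call prevalence high if more than 19 of 201 GAM cases
print("P(classified high | true prev 5%): ", classify(0.05))
print("P(classified high | true prev 15%):", classify(0.15))
```

The two printed probabilities are the classification errors of interest: the first should stay small (few false alarms at low prevalence) and the second large (reliable detection at high prevalence), and rerunning with larger rho shows how clustering erodes both.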
Simulated annealing with probabilistic analysis for solving traveling salesman problems
NASA Astrophysics Data System (ADS)
Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan
2013-09-01
Simulated Annealing (SA) is a widely used meta-heuristic inspired by the annealing process during the recrystallization of metals; consequently, the efficiency of SA is highly affected by the annealing schedule. In this paper, we present an empirical study to identify a competitive annealing schedule for solving symmetric traveling salesman problems (TSP). A randomized complete block design is also used in this study. The results show that different parameters do affect the efficiency of SA, and we therefore propose the best-found annealing schedule based on the post hoc test. SA was tested on seven selected benchmark problems of symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically alongside benchmark solutions, with a simple analysis to validate the quality of the solutions. Computational results show that the proposed annealing schedule provides good-quality solutions.
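A minimal SA-for-TSP sketch showing the three knobs an annealing schedule comprises (initial temperature, cooling rate, moves per temperature); the values below are examples, not the schedule the paper proposes.

```python
import math, random

# Minimal simulated-annealing TSP sketch with a geometric annealing schedule;
# T0, alpha, and moves_per_T are example values, not the paper's schedule.
random.seed(4)
cities = [(random.random(), random.random()) for _ in range(30)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = list(range(len(cities)))
best, best_len = tour[:], tour_length(tour)
T, alpha, moves_per_T = 1.0, 0.95, 100          # the annealing schedule
while T > 1e-3:
    for _ in range(moves_per_T):
        i, j = sorted(random.sample(range(len(cities)), 2))
        cand = tour[:i] + tour[i:j+1][::-1] + tour[j+1:]   # 2-opt style reversal
        delta = tour_length(cand) - tour_length(tour)
        # Accept improvements always; accept worsening moves with prob e^(-d/T)
        if delta < 0 or random.random() < math.exp(-delta / T):
            tour = cand
            if tour_length(tour) < best_len:
                best, best_len = tour[:], tour_length(tour)
    T *= alpha
print(f"best tour length: {best_len:.3f}")
```

Changing T0, alpha, or moves_per_T and comparing final tour lengths across benchmark instances is essentially the experiment the paper's randomized complete block design organizes.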
Coverage-guaranteed sensor node deployment strategies for wireless sensor networks.
Fan, Gaojuan; Wang, Ruchuan; Huang, Haiping; Sun, Lijuan; Sha, Chao
2010-01-01
Deployment quality and cost are two conflicting aspects in wireless sensor networks. Random deployment, in which the monitored field is covered by randomly and uniformly deployed sensor nodes, is an appropriate approach for large-scale network applications. However, successful applications depend considerably on deployment quality, namely achieving a desired coverage with the minimum number of sensors. Currently, the number of sensors required to meet the desired coverage is based on asymptotic analysis, which cannot guarantee deployment quality due to coverage overestimation in real applications. In this paper, we first investigate this coverage overestimation and address the challenge of designing coverage-guaranteed deployment strategies. To overcome the problem, we propose two deployment strategies, namely Expected-area Coverage Deployment (ECD) and BOundary Assistant Deployment (BOAD). The deployment quality of the two strategies is analyzed mathematically. Under this analysis, a lower bound on the number of deployed sensor nodes is given to satisfy the desired deployment quality. We justify the correctness of our analysis through rigorous proof and validate the effectiveness of the two strategies through extensive simulation experiments. The simulation results show that both strategies alleviate the coverage overestimation significantly. We also evaluate the two proposed strategies in the context of a target detection application. The comparison results demonstrate that if the target appears at the boundary of the monitored region in a given random deployment, the average intrusion distance of BOAD is considerably shorter than that of ECD with the same desired deployment quality. In contrast, ECD performs better in terms of average intrusion distance when the intruder invades from the inside of the monitored region.
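The coverage overestimation at issue can be reproduced in a few lines: the standard asymptotic estimate 1 - (1 - pi r^2/A)^n ignores boundary effects, so Monte Carlo coverage of a finite field falls short of the target. The field size, sensing radius, and target coverage below are assumptions.

```python
import numpy as np

# Sketch of the coverage-overestimation issue: the asymptotic estimate ignores
# boundary effects, so simulated coverage in a finite square falls short.
rng = np.random.default_rng(5)
side, r, target = 100.0, 10.0, 0.95             # field side, sensing radius, goal

# Asymptotic estimate: n such that 1 - (1 - pi r^2 / A)^n >= target
A = side**2
n = int(np.ceil(np.log(1 - target) / np.log(1 - np.pi * r**2 / A)))

sensors = rng.uniform(0, side, size=(n, 2))     # uniform random deployment
pts = rng.uniform(0, side, size=(20_000, 2))    # probe points for coverage
d2 = ((pts[:, None, :] - sensors[None, :, :])**2).sum(-1)
covered = (d2.min(axis=1) <= r**2).mean()
print(f"n={n}, predicted coverage {target:.2f}, simulated {covered:.3f}")
```

Sensing disks near the field boundary spill outside the region, which is exactly the effect the proposed ECD and BOAD strategies compensate for.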
Analysis of air quality management with emphasis on transportation sources
NASA Technical Reports Server (NTRS)
English, T. D.; Divita, E.; Lees, L.
1980-01-01
The current environment and practices of air quality management were examined for three regions: Denver, Phoenix, and the South Coast Air Basin of California. These regions were chosen because the majority of their air pollution emissions are related to mobile sources. The impact of auto exhaust on the air quality management process is characterized and assessed. An examination of the uncertainties in air pollutant measurements, emission inventories, meteorological parameters, atmospheric chemistry, and air quality simulation models is performed. The implications of these uncertainties to current air quality management practices is discussed. A set of corrective actions are recommended to reduce these uncertainties.
3D finite element modelling of sheet metal blanking process
NASA Astrophysics Data System (ADS)
Bohdal, Lukasz; Kukielka, Leon; Chodor, Jaroslaw; Kulakowska, Agnieszka; Patyk, Radoslaw; Kaldunski, Pawel
2018-05-01
The shearing process, such as the blanking of sheet metals, has often been used to prepare workpieces for subsequent forming operations. The use of FEM simulation for investigating and optimizing the blanking process is increasing. In the current literature, owing to the limited capability and large computational cost of three-dimensional (3D) analysis, blanking FEM simulations have been largely limited to two-dimensional (2D) plane axisymmetric problems. However, significant progress in modelling, which takes into account the influence of the real material (e.g. its microstructure) and the physical and technological conditions, can be obtained by using 3D numerical analysis methods in this area. The objective of this paper is to present a 3D finite element analysis of ductile fracture, strain distribution, and stress in the blanking process under the assumption of geometrical and physical nonlinearities. The physical, mathematical, and computer models of the process are elaborated. Dynamic effects, mechanical coupling, a constitutive damage law, and contact friction are taken into account. The application is implemented in the ANSYS/LS-DYNA program. The effect of the main process parameter, the blanking clearance, on the deformation of 1018 steel and the quality of the blank's sheared edge is analyzed. The results of the computer simulations can be used to forecast the quality of the final parts and to optimize the process.
A sediment resuspension and water quality model of Lake Okeechobee
James, R.T.; Martin, J.; Wool, T.; Wang, P.-F.
1997-01-01
The influence of sediment resuspension on the water quality of shallow lakes is well documented. However, a search of the literature reveals no deterministic mass-balance eutrophication models that explicitly include resuspension. We modified the Lake Okeechobee water quality model - which uses the Water Quality Analysis Simulation Program (WASP) to simulate algal dynamics and phosphorus, nitrogen, and oxygen cycles - to include inorganic suspended solids and algorithms that: (1) define changes in depth with changes in volume; (2) compute sediment resuspension based on bottom shear stress; (3) compute partition coefficients for ammonia and ortho-phosphorus to solids; and (4) relate light attenuation to solids concentrations. The model calibration and validation were successful with the exception of dissolved inorganic nitrogen species, which did not correspond well to observed data in the validation phase. This could be attributed to an inaccurate formulation of algal nitrogen preference and/or the absence of nitrogen fixation in the model. The model correctly predicted that the lake is light-limited by resuspended solids and that algae are primarily nitrogen limited. The model simulation suggested that biological fluxes greatly exceed external loads of dissolved nutrients, and sediment-water interactions of organic nitrogen and phosphorus far exceed external loads. A sensitivity analysis demonstrated that parameters affecting resuspension, settling, sediment nutrient and solids concentrations, mineralization, algal productivity, and algal stoichiometry are factors requiring further study to improve our understanding of the Lake Okeechobee ecosystem.
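A sketch of the shear-stress-based resuspension algorithm in the spirit of item (2) above: resuspension is zero below a critical bottom shear stress and grows with the excess stress. The shear-stress stand-in and all coefficients are illustrative assumptions, not the calibrated Lake Okeechobee values.

```python
# Hedged sketch of a critical-shear-stress resuspension term; the shear
# formula and every coefficient are illustrative, not the WASP calibration.

def bottom_shear(wind_speed, depth):
    """Crude stand-in: shear stress (N/m^2) grows with wind and shrinks with
    depth; a real model would derive this from wind-wave theory."""
    return 0.002 * wind_speed**2 / max(depth, 0.1)

def resuspension_flux(tau, tau_crit=0.1, m0=5.0):
    """Resuspension flux (g/m^2/day): zero below the critical stress,
    linear in the excess stress above it (a common empirical form)."""
    return m0 * max(tau / tau_crit - 1.0, 0.0)

for wind in (3.0, 6.0, 12.0):            # m/s, over a 1.5 m deep station
    tau = bottom_shear(wind, depth=1.5)
    print(f"wind {wind:4.1f} m/s -> tau {tau:.3f} N/m^2, "
          f"flux {resuspension_flux(tau):.2f} g/m^2/day")
```

The threshold behavior is the key feature: calm conditions contribute no solids, while storms flood the water column with light-attenuating sediment, reproducing the light limitation the model predicted.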
Make or buy decision model with multi-stage manufacturing process and supplier imperfect quality
NASA Astrophysics Data System (ADS)
Pratama, Mega Aria; Rosyidi, Cucuk Nur
2017-11-01
This research develops a make-or-buy decision model that considers supplier imperfect quality. The model can be used to help companies decide whether to make or buy a component, with the best quality and the least cost, in a multistage manufacturing process. Imperfect quality is one of the cost components to be minimized in this model. A component with imperfect quality is not necessarily defective; it can still be reworked and used for assembly. This research also provides a numerical example and a sensitivity analysis to show how the model works. We use simulation, aided by Crystal Ball, to solve the numerical problem. The sensitivity analysis results show that the percentage of imperfect components generally does not affect the model significantly, and the model is not sensitive to changes in these parameters. This is because the imperfect-quality costs are small relative to the overall total cost.
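A sketch of the kind of Monte Carlo cost comparison described (the study used Crystal Ball; plain NumPy stands in here). All costs and the imperfect-fraction distribution are assumed for illustration.

```python
import numpy as np

# Illustrative make-or-buy Monte Carlo: the expected unit cost of the "buy"
# option includes rework of imperfect (but not defective) units.
rng = np.random.default_rng(6)
reps = 100_000
make_cost = 12.0                          # assumed in-house unit cost
buy_price = 9.5                           # assumed supplier unit price
rework_cost = 4.0                         # assumed rework cost per imperfect unit
p_imperfect = rng.beta(2, 38, reps)       # uncertain imperfect fraction (~5%)

buy_total = buy_price + p_imperfect * rework_cost
print(f"E[buy cost] = {buy_total.mean():.2f} vs make cost = {make_cost:.2f}")
print(f"P(buy cheaper than make) = {(buy_total < make_cost).mean():.3f}")
```

Because the rework term p_imperfect * rework_cost is small next to the price gap, varying the imperfect fraction barely moves the decision, mirroring the paper's sensitivity finding.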
Romagnoli, Martín; Portapila, Margarita; Rigalli, Alfredo; Maydana, Gisela; Burgués, Martín; García, Carlos M
2017-10-15
Argentina has been among the world leaders in the production and export of agricultural products since the 1990s. The Carcarañá River Lower Basin (CRLB), a cropland of the Pampas region supplied by extensive rainfall, is located in an area with few streamgauging and other hydrologic/water-quality stations. Therefore, limited hydrologic data are available, resulting in limited water-resources assessment. This work explores the application of the Soil and Water Assessment Tool (SWAT) model to the CRLB in the Santa Fe province of the Pampas region. The analysis of field and remote-sensing data characterizing hydrology, water quality, soil types, land use/land cover, management practices, and crop yield guarantees a comprehensive SWAT modeling approach. A combined manual and automated calibration and validation process incorporating sensitivity and uncertainty analysis is performed using information concerning interior watershed processes. Eleven N/P fertilizer rates are selected to simulate the impact of N fertilizer on crop yield, plant uptake, and runoff and leaching losses. Different indices (partial factor productivity, agronomic efficiency, apparent crop recovery efficiency of applied nutrient, internal utilization efficiency, and physiological efficiency) are considered to assess nitrogen-use efficiency. The overall quality of the fit is satisfactory considering the input data limitations. This work provides, for the first time in Argentina, a reliable tool to simulate yield response to soil quality and water availability, capable of meeting defined environmental targets, to support decision making on public policies and private activities in the Pampas region. Copyright © 2017 Elsevier B.V. All rights reserved.
A Comparison of Two Balance Calibration Model Building Methods
NASA Technical Reports Server (NTRS)
DeLoach, Richard; Ulbrich, Norbert
2007-01-01
Simulated strain-gage balance calibration data is used to compare the accuracy of two balance calibration model building methods for different noise environments and calibration experiment designs. The first building method obtains a math model for the analysis of balance calibration data after applying a candidate math model search algorithm to the calibration data set. The second building method uses stepwise regression analysis in order to construct a model for the analysis. Four balance calibration data sets were simulated in order to compare the accuracy of the two math model building methods. The simulated data sets were prepared using the traditional One Factor At a Time (OFAT) technique and the Modern Design of Experiments (MDOE) approach. Random and systematic errors were introduced in the simulated calibration data sets in order to study their influence on the math model building methods. Residuals of the fitted calibration responses and other statistical metrics were compared in order to evaluate the calibration models developed with different combinations of noise environment, experiment design, and model building method. Overall, predicted math models and residuals of both math model building methods show very good agreement. Significant differences in model quality were attributable to noise environment, experiment design, and their interaction. Generally, the addition of systematic error significantly degraded the quality of calibration models developed from OFAT data by either method, but MDOE experiment designs were more robust with respect to the introduction of a systematic component of the unexplained variance.
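The second building method, stepwise regression, can be sketched as forward selection over candidate math-model terms; the synthetic loads, candidate terms, and stopping rule below are assumptions for illustration, not the study's algorithm or data.

```python
import numpy as np

# Forward-stepwise sketch: repeatedly add the candidate term that most reduces
# the residual sum of squares until no term improves the fit appreciably.
# Candidate terms (loads, squares, cross-products) mimic balance-calibration
# math models in spirit only; the data are synthetic.
rng = np.random.default_rng(7)
F = rng.standard_normal((200, 3))                       # applied loads
resp = 2*F[:, 0] - 0.5*F[:, 1]*F[:, 2] + 0.05*rng.standard_normal(200)

terms = {"F1": F[:, 0], "F2": F[:, 1], "F3": F[:, 2],
         "F1*F2": F[:, 0]*F[:, 1], "F1*F3": F[:, 0]*F[:, 2],
         "F2*F3": F[:, 1]*F[:, 2], "F1^2": F[:, 0]**2}

chosen, X = [], np.ones((200, 1))                       # start with intercept
while len(chosen) < len(terms):
    cur_sse = np.linalg.lstsq(X, resp, rcond=None)[1][0]
    sse = {n: np.linalg.lstsq(np.column_stack([X, c]), resp, rcond=None)[1][0]
           for n, c in terms.items() if n not in chosen}
    name, best_sse = min(sse.items(), key=lambda kv: kv[1])
    if best_sse > 0.95 * cur_sse:                       # <5% improvement: stop
        break
    chosen.append(name)
    X = np.column_stack([X, terms[name]])
print("selected terms:", chosen)
```

With clean data the true terms (F1 and F2*F3) are recovered; adding a systematic error to `resp` lets spurious terms slip in, which is the degradation the study observed for OFAT designs.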
A simulation study of the flight dynamics of elastic aircraft. Volume 2: Data
NASA Technical Reports Server (NTRS)
Waszak, Martin R.; Davidson, John B.; Schmidt, David K.
1987-01-01
The simulation experiment described addresses the effects of structural flexibility on the dynamic characteristics of a generic family of aircraft. The simulation was performed using the NASA Langley VMS simulation facility. The vehicle models were obtained as part of this research project. The simulation results include complete response data and subjective pilot ratings and comments and so allow a variety of analyses. The subjective ratings and analysis of the time histories indicate that increased flexibility can lead to increased tracking errors, degraded handling qualities, and changes in the frequency content of the pilot inputs. These results, furthermore, are significantly affected by the visual cues available to the pilot.
Evaluation and error apportionment of an ensemble of ...
Through the comparison of several regional-scale chemistry transport modelling systems that simulate meteorology and air quality over the European and American continents, this study aims at i) apportioning the error to the responsible processes using time-scale analysis, ii) helping to detect causes of model error, and iii) identifying the processes and scales most urgently requiring dedicated investigations. The analysis is conducted within the framework of the third phase of the Air Quality Model Evaluation International Initiative (AQMEII) and tackles model performance gauging through measurement-to-model comparison, error decomposition, and time series analysis of the model biases for several fields (ozone, CO, SO2, NO, NO2, PM10, PM2.5, wind speed, and temperature). The operational metrics (magnitude of the error, sign of the bias, associativity) provide an overall sense of model strengths and deficiencies, while apportioning the error to its constituent parts (bias, variance, and covariance) can help to assess the nature and quality of the error. Each of the error components is analysed independently and apportioned to specific processes based on the corresponding timescale (long scale, synoptic, diurnal, and intra-day) using the error apportionment technique devised in the former phases of AQMEII. The application of the error apportionment method to the AQMEII Phase 3 simulations provides several key insights. In addition to reaffirming the strong impact
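The bias/variance/covariance apportionment referred to here corresponds to a standard exact decomposition of the mean-square error (Theil's decomposition), sketched below with synthetic "observed" and "modelled" series; the series themselves are assumptions for illustration.

```python
import numpy as np

# Theil's exact MSE decomposition, as used in error-apportionment analyses:
# MSE = (mean_m - mean_o)^2 + (sd_m - sd_o)^2 + 2*sd_m*sd_o*(1 - r)
rng = np.random.default_rng(8)
obs = 40 + 10*np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 3, 500)  # "ozone"
mod = 0.8*obs + 12 + rng.normal(0, 4, 500)                             # "model"

mse = ((mod - obs)**2).mean()
bias2 = (mod.mean() - obs.mean())**2                      # systematic offset
var_part = (mod.std() - obs.std())**2                     # amplitude mismatch
cov_part = 2*mod.std()*obs.std()*(1 - np.corrcoef(mod, obs)[0, 1])  # phase/timing
print(f"MSE {mse:.2f} = bias^2 {bias2:.2f} + variance {var_part:.2f} "
      f"+ covariance {cov_part:.2f}")
```

The three parts sum to the MSE exactly, and attributing each part to a timescale (long-term, synoptic, diurnal, intra-day) is what the AQMEII apportionment technique adds on top of this identity.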
On the application of hybrid meshes in hydraulic machinery CFD simulations
NASA Astrophysics Data System (ADS)
Schlipf, M.; Tismer, A.; Riedelbauch, S.
2016-11-01
The application of two different hybrid mesh types for the simulation of a Francis runner in automated optimization processes without user input is investigated. The mesh types are applied to simplified test cases, such as flow around NACA airfoils and rotating cascade flows as they occur in a turbomachine runner channel, to identify the particular mesh-resolution effects at reduced complexity. The analysis includes the application of the different meshes to the geometries while keeping defined quality criteria and exploring the influences on the simulation results. All results are compared with reference values gained by simulations with block-structured hexahedron meshes and the same numerical scheme. This avoids additional inaccuracies caused by further numerical and experimental measurement methods. The results show that a simulation with hybrid meshes built up from a block-structured domain with hexahedrons around the blade, in combination with a tetrahedral far field in the channel, is sufficient to get results that are almost as accurate as those gained by the reference simulation. Furthermore, this method is robust enough for automated processes without user input and enables comparable meshes in size, distribution, and quality for different similar geometries as occur in optimization processes.
NASA Astrophysics Data System (ADS)
Gong, W.; Beagley, S. R.; Zhang, J.; Cousineau, S.; Sassi, M.; Munoz-Alpizar, R.; Racine, J.; Menard, S.; Chen, J.
2015-12-01
Arctic atmospheric composition is strongly influenced by long-range transport from mid-latitudes as well as processes occurring in the Arctic locally. Using an on-line air quality prediction model GEM-MACH, simulations were carried out for the 2010 northern shipping season (April - October) over a regional Arctic domain. North American wildfire emissions and Arctic shipping emissions were represented, along with other anthropogenic and biogenic emissions. Sensitivity studies were carried out to investigate the principal sources and processes affecting air quality in the Canadian Northern and Arctic regions. In this paper, we present an analysis of sources, transport, and removal processes on the ambient concentrations and atmospheric loading of various pollutants with air quality and climate implications, such as, O3, NOx, SO2, CO, and aerosols (sulfate, black carbon, and organic carbon components). Preliminary results from a model simulation of a recent summertime Arctic field campaign will also be presented.
INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT
A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...
Laomettachit, Teeraphan; Termsaithong, Teerasit; Sae-Tang, Anuwat; Duangphakdee, Orawan
2015-01-07
In the nest-site selection process of honeybee swarms, an individual bee performs a waggle dance to communicate information about direction, quality, and distance of a discovered site to other bees at the swarm. Initially, different groups of bees dance to represent different potential sites, but eventually the swarm usually reaches an agreement for only one site. Here, we model the nest-site selection process in honeybee swarms of Apis mellifera and show how the swarms make adaptive decisions based on a trade-off between the quality and distance to candidate nest sites. We use bifurcation analysis and stochastic simulations to reveal that the swarm's site distance preference is moderate>near>far when the swarms choose between low quality sites. However, the distance preference becomes near>moderate>far when the swarms choose between high quality sites. Our simulations also indicate that swarms with large population size prefer nearer sites and, in addition, are more adaptive at making decisions based on available information compared to swarms with smaller population size. Copyright © 2014 Elsevier Ltd. All rights reserved.
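A toy stochastic simulation loosely in the spirit of the paper's model: committed scouts recruit uncommitted bees at a rate set by site quality discounted by distance, and abandon at a quality-dependent rate. All rates and values are illustrative assumptions, not the paper's equations.

```python
import numpy as np

# Toy two-site nest-choice simulation; every rate below is an assumption.
rng = np.random.default_rng(9)

def run(N=500, quality=(0.6, 0.9), dist=(1.0, 3.0), steps=5000, dt=0.01):
    c = np.array([5.0, 5.0])                      # initial dancers per site
    for _ in range(steps):
        u = max(N - c.sum(), 0.0)                 # uncommitted bees
        for s in (0, 1):
            # recruitment ~ quality/distance * current dancers * uncommitted
            recruit = rng.binomial(int(u), min(quality[s]/dist[s]*c[s]/N*dt, 1))
            # abandonment grows as quality falls
            abandon = rng.binomial(int(c[s]), min((1 - quality[s])*dt, 1))
            c[s] += recruit - abandon
    return c / max(c.sum(), 1.0)

# site 0: near but low quality; site 1: far but high quality
print("final dance shares:", run().round(2))
```

In this parameterization the far high-quality site usually wins the competition for uncommitted bees, a crude illustration of the quality-distance trade-off the bifurcation analysis characterizes.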
Modelling the effect of wildfire on forested catchment water quality using the SWAT model
NASA Astrophysics Data System (ADS)
Yu, M.; Bishop, T.; van Ogtrop, F. F.; Bell, T.
2016-12-01
Wildfire removes surface vegetation, releases ash, and increases erosion and runoff, and therefore affects the hydrological cycle of a forested water catchment. It is important to understand this change and how the catchment recovers. These processes are spatially sensitive and affected by interactions between fire severity and hillslope, soil type, and surface vegetation conditions. Thus, a distributed hydrological modelling approach is required. In this study, the Soil and Water Assessment Tool (SWAT) is used to predict the effect of the 2001/02 Sydney wildfire on catchment water quality. Ten years of pre-fire data are used to create and calibrate the SWAT model. The calibrated model was then used to simulate water quality for the 10-year post-fire period without the fire effect. The simulated water quality data are compared with recorded water quality data provided by the Sydney Catchment Authority. The mean changes in flow, total suspended solids, total nitrate, and total phosphate are compared on monthly, three-month, six-month, and annual bases. Two control catchments and three burnt catchments were analysed.
Flight simulator for hypersonic vehicle and a study of NASP handling qualities
NASA Technical Reports Server (NTRS)
Ntuen, Celestine A.; Park, Eui H.; Deeb, Joseph M.; Kim, Jung H.
1992-01-01
The research goal of the Human-Machine Systems Engineering Group was to study the existing handling quality studies in aircraft with sonic to supersonic speeds and power in order to understand information requirements needed for a hypersonic vehicle flight simulator. This goal falls within the NASA task statements: (1) develop flight simulator for hypersonic vehicle; (2) study NASP handling qualities; and (3) study effects of flexibility on handling qualities and on control system performance. Following the above statement of work, the group has developed three research strategies. These are: (1) to study existing handling quality studies and the associated aircraft and develop flight simulation data characterization; (2) to develop a profile for flight simulation data acquisition based on objective statement no. 1 above; and (3) to develop a simulator and an embedded expert system platform which can be used in handling quality experiments for hypersonic aircraft/flight simulation training.
Draft versus finished sequence data for DNA and protein diagnostic signature development
Gardner, Shea N.; Lam, Marisa W.; Smith, Jason R.; Torres, Clinton L.; Slezak, Tom R.
2005-01-01
Sequencing pathogen genomes is costly, demanding careful allocation of limited sequencing resources. We built a computational Sequencing Analysis Pipeline (SAP) to guide decisions regarding the amount of genomic sequencing necessary to develop high-quality diagnostic DNA and protein signatures. SAP uses simulations to estimate the number of target genomes and close phylogenetic relatives (near neighbors or NNs) to sequence. We use SAP to assess whether draft data are sufficient or finished sequencing is required using Marburg and variola virus sequences. Simulations indicate that intermediate- to high-quality draft with error rates of 10^-3 to 10^-5 (∼8× coverage) of target organisms is suitable for DNA signature prediction. Low-quality draft with error rates of ∼1% (3× to 6× coverage) of target isolates is inadequate for DNA signature prediction, although low-quality draft of NNs is sufficient, as long as the target genomes are of high quality. For protein signature prediction, sequencing errors in target genomes substantially reduce the detection of amino acid sequence conservation, even if the draft is of high quality. In summary, high-quality draft of target and low-quality draft of NNs appears to be a cost-effective investment for DNA signature prediction, but may lead to underestimation of predicted protein signatures. PMID:16243783
A Descriptive Guide to Trade Space Analysis
2015-09-01
…Development; QFD, Quality Function Deployment; RSM, Response Surface Method; RSE, Response Surface Equation; SE, Systems Engineering; SME, Subject Matter… The method uses response surface equations (RSEs) as surrogate models. It uses the RSEs with Monte Carlo simulation to quantitatively explore changes across the surfaces to
Power quality analysis of DC arc furnace operation using the Bowman model for electric arc
NASA Astrophysics Data System (ADS)
Gherman, P. L.
2018-01-01
This work concerns a relatively new domain. The DC electric arc is superior to the AC electric arc but is not yet used in Romania. For this reason, we analyzed the operating behaviour of these furnaces by simulation and by model checking of the simulation results. The conclusions are favourable; the work to be carried out next is to develop a real-time control system for the steel elaboration process.
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis has become a critical business segment in GM's math-based die engineering process. As simulation has become one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical in supporting fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass-production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, the technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
NASA Astrophysics Data System (ADS)
Yang, Wenxiu; Liu, Yanbo; Zhang, Ligai; Cao, Hong; Wang, Yang; Yao, Jinbo
2016-06-01
Needleless electrospinning technology is considered a better avenue for producing nanofibrous materials at large scale, and electric field intensity and its distribution play an important role in controlling nanofiber diameter and the quality of the nanofibrous web during electrospinning. In the current study, a novel needleless electrospinning method was proposed based on Von Koch curves of fractal configuration; simulation and analysis of electric field intensity and distribution in the new electrospinning process were performed with the finite element analysis software Comsol Multiphysics 4.4, based on linear and nonlinear Von Koch fractal curves (hereafter called fractal models). The simulation and analysis indicated that the second-level fractal structure is the optimal linear electrospinning spinneret in terms of field intensity and uniformity. Further simulation and analysis showed that the circular type of fractal spinneret has better field intensity and distribution than the spiral type in the nonlinear fractal electrospinning technology. An electrospinning apparatus with the optimal Von Koch fractal spinneret was set up to verify the theoretical analysis results from the Comsol simulation, achieving a more uniform electric field distribution and lower energy cost compared with current needle and needleless electrospinning technologies.
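The Von Koch geometry underlying the proposed spinneret is straightforward to generate; below is a sketch producing the points of a level-n curve (the study found level 2 optimal in the linear case). The endpoints and level are example values.

```python
import numpy as np

# Generate Von Koch fractal curve points: each segment is replaced by four
# segments with a 60-degree bump, recursively, to the requested level.
def koch(p0, p1, level):
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    if level == 0:
        return [p0, p1]
    d = (p1 - p0) / 3.0
    a, b = p0 + d, p0 + 2*d
    rot = np.array([[0.5, -np.sqrt(3)/2], [np.sqrt(3)/2, 0.5]])  # +60 degrees
    peak = a + rot @ d
    pts = []
    for q0, q1 in ((p0, a), (a, peak), (peak, b), (b, p1)):
        pts.extend(koch(q0, q1, level - 1)[:-1])   # drop shared endpoints
    pts.append(p1)
    return pts

pts = koch((0, 0), (1, 0), level=2)
print(len(pts), "points on the level-2 Koch spinneret profile")
```

A level-n curve has 4^n segments, so the level-2 profile here yields 17 points; these coordinates are the kind of geometry one would import into the field solver for the electric field analysis.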
Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 2: Data
NASA Technical Reports Server (NTRS)
Waszak, M. R.; Schmidt, D. K.
1985-01-01
Two analysis methods are applied to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first type of analysis is an open-loop modal analysis technique. This method considers the effect of modal residue magnitudes on determining vehicle handling qualities. The second method is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Both analyses indicated that dynamic aeroelastic effects caused a degradation in vehicle tracking performance, based on the evaluation of some simulation results. Volume 2 consists of the state variable models of the flexible aircraft configurations used in the analysis applications, mode shape plots for the structural modes, numerical results from the modal analysis, frequency response plots from the pilot-in-the-loop analysis, and a listing of the modal analysis computer program.
Multiobjective generalized extremal optimization algorithm for simulation of daylight illuminants
NASA Astrophysics Data System (ADS)
Kumar, Srividya Ravindra; Kurian, Ciji Pearl; Gomes-Borges, Marcos Eduardo
2017-10-01
Daylight illuminants are widely used as references for color quality testing and optical vision testing applications. Presently used daylight simulators make use of fluorescent bulbs that are not tunable and occupy more space inside the quality testing chambers. By designing a spectrally tunable LED light source with an optimal number of LEDs, cost, space, and energy can be saved. This paper describes an application of the generalized extremal optimization (GEO) algorithm for selection of the appropriate quantity and quality of LEDs that compose the light source. The multiobjective approach of this algorithm tries to get the best spectral simulation with minimum fitness error toward the target spectrum, correlated color temperature (CCT) the same as the target spectrum, high color rendering index (CRI), and luminous flux as required for testing applications. GEO is a global search algorithm based on phenomena of natural evolution and is especially designed to be used in complex optimization problems. Several simulations have been conducted to validate the performance of the algorithm. The methodology applied to model the LEDs, together with the theoretical basis for CCT and CRI calculation, is presented in this paper. A comparative result analysis of M-GEO evolutionary algorithm with the Levenberg-Marquardt conventional deterministic algorithm is also presented.
NASA Technical Reports Server (NTRS)
Kaupp, V. H.; Macdonald, H. C.; Waite, W. P.; Stiles, J. A.; Frost, F. S.; Shanmugam, K. S.; Smith, S. A.; Narayanan, V.; Holtzman, J. C. (Principal Investigator)
1982-01-01
Computer-generated radar simulations and mathematical geologic terrain models were used to establish the optimum radar sensor operating parameters for geologic research. An initial set of mathematical geologic terrain models was created for three basic landforms, and families of simulated radar images were prepared from these models for numerous interacting sensor, platform, and terrain variables. The tradeoffs between the various sensor parameters and the quantity and quality of the extractable geologic data were investigated, as was the development of automated techniques for digital SAR image analysis. Initial work on a texture analysis of SEASAT SAR imagery is reported. Computer-generated radar simulations are shown for combinations of two geologic models and three SAR angles of incidence.
Turbulence flight director analysis and preliminary simulation
NASA Technical Reports Server (NTRS)
Johnson, D. E.; Klein, R. E.
1974-01-01
A control column and throttle flight director display system is synthesized for use during flight through severe turbulence. The column system is designed to minimize airspeed excursions without overdriving attitude. The throttle system is designed to augment the airspeed regulation and provide an indication of the trim thrust required for any desired flight path angle. Together they form an energy management system intended to provide harmonious display indications of current aircraft motions and required corrective action, minimize gust upset tendencies, minimize unsafe aircraft excursions, and maintain satisfactory ride qualities. A preliminary fixed-base piloted simulation verified the analysis and provided a shakedown for a more sophisticated moving-base simulation to be accomplished next. This preliminary simulation utilized a flight scenario concept combining piloting tasks, random turbulence, and discrete gusts to create a high but realistic pilot workload conducive to pilot error and potential upset. The turbulence director (energy management) system significantly reduced pilot workload and minimized unsafe aircraft excursions.
Gravity and thermal deformation of large primary mirror in space telescope
NASA Astrophysics Data System (ADS)
Wang, Xin; Jiang, Shouwang; Wan, Jinlong; Shu, Rong
2016-10-01
The technology of integrating mechanical FEA with optical estimation is essential for simulating the gravity deformation of a large primary mirror and the thermal deformation of the optical structure under static temperature loads or temperature gradients. We present the simulation results of the FEA analysis, data processing, and image performance. Three kinds of support structure for the large primary mirror (a center holding structure, edge glue fixation, and back support) are designed and compared to obtain the smallest gravity deformation. Candidate mirror materials, Zerodur and SiC, are analyzed to obtain the smallest thermal-gradient distortion. The simulation accuracy depends on the FEA mesh quality, the load definition of the structure, and the fitting error from discrete data to a smooth surface. A primary mirror with 1 m diameter is designed as an example. The appropriate structure material to match the mirror, the central supporting structure, and the key aspects of the FEA simulation are optimized for space application.
Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment
NASA Technical Reports Server (NTRS)
Lee, Meemong; Bowman, Kevin
2014-01-01
Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission designed to provide surface reflectance at high spectral, spatial, and temporal resolutions from geostationary orbit, as needed for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to global and regional NOx emissions and to improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from the GEO-CAPE OSSE framework for seven observation scenarios and three instrument systems.
SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE
The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package, the Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...
Code of Federal Regulations, 2010 CFR
2010-01-01
Appendix E to Part 60, Aeronautics and Space, Federal Aviation Administration, Department of Transportation (continued), Airmen, Flight Simulation: Qualification Performance Standards for Quality Management Systems for Flight Simulation Training Devices.
NASA Astrophysics Data System (ADS)
van Ginneken, Meike; Oron, Gideon
2000-09-01
This study assesses health risks to consumers due to the use of agricultural products irrigated with reclaimed wastewater. The analysis is based on the definition of an exposure model which takes into account several parameters: (1) the quality of the applied wastewater, (2) the irrigation method, (3) the elapsed times between irrigation, harvest, and product consumption, and (4) the consumers' habits. The exposure model is used for numerical simulation of human consumers' risks using the Monte Carlo simulation method. The results of the numerical simulation show large deviations, probably caused by uncertainty (imprecision in the quality of input data) and by variability due to diversity among populations. There is a 10-order-of-magnitude difference in the risk of infection between the different exposure scenarios with the same water quality. This variation indicates the need for setting risk-based criteria for wastewater reclamation rather than single water quality guidelines. Additional data are required to decrease uncertainty in the risk assessment. Future research needs to include definition of acceptable risk criteria, more accurate dose-response modeling, information regarding pathogen survival in treated wastewater, additional data related to the passage of pathogens into and within plants during irrigation, and information regarding the behavior patterns of the community of human consumers.
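The Monte Carlo exposure-model logic described above can be sketched compactly. The distributions and dose-response parameters below are hypothetical placeholders (the paper's actual values are not given in the abstract); the beta-Poisson form is a standard choice in quantitative microbial risk assessment.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Hypothetical input distributions (illustrative, not the paper's values):
conc = rng.lognormal(mean=np.log(10.0), sigma=1.5, size=n)  # pathogens per litre of effluent
kdecay = rng.uniform(0.5, 2.0, n)                           # die-off rate [1/day]
days = rng.uniform(1.0, 10.0, n)                            # irrigation-to-consumption delay [days]
intake = rng.triangular(0.001, 0.005, 0.02, n)              # litres of effluent ingested per serving

dose = conc * np.exp(-kdecay * days) * intake

# Beta-Poisson dose-response model (alpha and N50 are placeholder values):
alpha, n50 = 0.25, 100.0
p_inf = 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

print(f"median risk {np.median(p_inf):.2e}, 95th percentile {np.percentile(p_inf, 95):.2e}")
```

The wide spread between the median and the upper percentiles of `p_inf` is the kind of scenario-to-scenario variation the abstract reports.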
NASA Astrophysics Data System (ADS)
Haris, H.; Chow, M. F.; Usman, F.; Sidek, L. M.; Roseli, Z. A.; Norlida, M. D.
2016-03-01
Urbanization is growing rapidly in Malaysia. Rapid urbanization is known to have several negative impacts on the hydrological cycle due to the loss of pervious area and the deterioration of stormwater runoff quality. One such impact is congestion of the stormwater drainage system, which leads to flash flooding and water quality degradation. Many urban stormwater management software packages are available, such as the Storm Water Drainage System design and analysis program (DRAINS), Urban Drainage and Sewer Model (MOUSE), InfoWorks River Simulation (InfoWorks RS), Hydrological Simulation Program-Fortran (HSPF), Distributed Routing Rainfall-Runoff Model (DR3M), Storm Water Management Model (SWMM), XP Storm Water Management Model (XPSWMM), MIKE-SWMM, Quality-Quantity Simulators (QQS), the Storage, Treatment, Overflow, Runoff Model (STORM), and the Hydrologic Engineering Centre-Hydrologic Modelling System (HEC-HMS). In this paper we briefly discuss several of these packages and their functionality, accessibility, characteristics, and components for the quantity analysis of hydrological design, and compare them with MSMA Design Aid and Database. Green Infrastructure (GI) is one of the main topics being discussed worldwide, and every development in the urban area is related to it. GI can be defined as green area built into developed areas, such as forest, park, wetland, or floodway, and its role is to improve living standards through, for example, water filtration or flood control. Among the twenty models compared to MSMA SME, ten models that are widely accepted by water resource researchers were selected for a comprehensive review in this study. These ten tools are further classified into three major categories: models that address the stormwater management ability of GI in terms of quantity and quality, models that can conduct an economic analysis of GI, and models that address both stormwater management and economic aspects together.
Improving the home health acute-care hospitalization quality measure.
Schade, Charles P; Brehm, John G
2010-06-01
(1) To demonstrate average length of service (ALOS) bias in the currently used acute-care hospitalization (ACH) home health quality measure, limiting comparability across agencies, and (2) to propose alternative ACH measures. Secondary analysis of Medicare home health service data 2004-2007; convenience sample of Medicare fee-for-service hospital discharges. Cross-sectional analysis and patient-level simulation. We aggregated outcome and ALOS data from 2,347 larger Medicare-certified home health agencies (HHAs) in the United States between 2004 and 2007, and calculated risk-adjusted monthly ACH rates. We used multiple regression to identify agency characteristics associated with ACH. We simulated ACH during and immediately after home health care using patient and agency characteristics similar to those in the actual data, comparing the existing measure with alternative fixed-interval measures. Of the agency characteristics studied, ALOS had by far the highest partial correlation with the current ACH measure (partial r^2 = 0.218, p < .0001). We replicated the correlation between ACH and ALOS in the patient-level simulation. We found no correlation between ALOS and the alternative measures. Alternative measures do not exhibit ALOS bias and would be appropriate for comparing HHA ACH rates with one another or over time.
Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben
2014-01-01
This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness, and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back into a conceptual design process are proposed for future work.
Optimization of Collision Detection in Surgical Simulations
NASA Astrophysics Data System (ADS)
Custură-Crăciun, Dan; Cochior, Daniel; Neagu, Corneliu
2014-11-01
Just as flight and spaceship simulators already represent a standard, we expect that surgical simulators will soon become a standard in medical applications. A simulation's quality is strongly related to image quality as well as to the degree of realism of the simulation. Increased quality requires increased resolution and increased rendering speed but, more importantly, a larger number of mathematical equations to be solved. Making this possible requires not only more efficient computers but, especially, more optimization of the calculation process. A simulator executes one of its most complex sets of calculations each time it detects a contact between virtual objects; optimization of collision detection is therefore critical to a simulator's speed and hence to its quality.
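To make the point concrete, here is one standard optimization of the kind alluded to: a broad-phase pass that prunes object pairs with axis-aligned bounding boxes (sweep-and-prune) before any expensive mesh-level test. This is a generic sketch of the technique, not the method of the paper.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    id: int
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

def overlap(a, b):
    return all(a.lo[k] <= b.hi[k] and b.lo[k] <= a.hi[k] for k in range(3))

def broad_phase(boxes):
    """Sweep-and-prune along x: after sorting, only boxes whose x-intervals
    still overlap the incoming box are tested on the remaining axes, which
    avoids the naive O(n^2) all-pairs test for well-separated scenes."""
    boxes = sorted(boxes, key=lambda b: b.lo[0])
    pairs, active = [], []
    for b in boxes:
        active = [a for a in active if a.hi[0] >= b.lo[0]]  # prune boxes left behind
        pairs.extend((a.id, b.id) for a in active if overlap(a, b))
        active.append(b)
    return pairs  # candidate pairs for the expensive narrow-phase test

boxes = [AABB(0, (0, 0, 0), (1, 1, 1)),
         AABB(1, (0.5, 0.5, 0), (2, 2, 1)),
         AABB(2, (3, 3, 3), (4, 4, 4))]
print(broad_phase(boxes))  # -> [(0, 1)]
```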
Nalwadda, Gorrette; Tumwesigye, Nazarius M; Faxelid, Elisabeth; Byamugisha, Josaphat; Mirembe, Florence
2011-01-01
Low and inconsistent use of contraceptives by young people contributes to unintended pregnancies. This study assessed the quality of contraceptive services for young people aged 15-24 in two rural districts in Uganda. Five female and two male simulated clients (SCs) interacted with 128 providers at public, private not-for-profit (PNFP), and private for-profit (PFP) health facilities. After consultations, SCs were interviewed using a structured questionnaire. Six aspects of quality of care (client's needs, choice of contraceptive methods, information given to users, client-provider interpersonal relations, constellation of services, and continuity mechanisms) were assessed. Descriptive statistics and factor analysis were performed. Means and categorized quality scores for all aspects of quality were low in both public and private facilities. The lowest quality scores were observed in PFP facilities, and medium scores in PNFP facilities. The choice of contraceptive methods and interpersonal relations quality scores were slightly higher in public facilities. Needs assessment scores were highest in PNFP facilities. All facilities were classified as having low scores for appropriate constellation of services. Information given to users was suboptimal, and providers promoted specific contraceptive methods. A minority of providers offered the client's preferred method of choice and showed respect for privacy. The quality of contraceptive services provided to young people was low. Concurrent quality improvements and strengthening of health systems are needed.
Is it beneficial to increase the provision of thrombolysis?-- a discrete-event simulation model.
Barton, M; McClean, S; Gillespie, J; Garg, L; Wilson, D; Fullerton, K
2012-07-01
Although thrombolysis has been licensed in the UK since 2003, it is still administered to only a small percentage of eligible patients. We consider the impact of increasing the provision of thrombolysis on acute stroke services and the effect on quality of life. The concept is illustrated using data from the Northern Ireland Stroke Service. Retrospective study. We first present results of survival analysis utilizing length of stay (LOS) for discharge destinations, based on data from the Belfast City Hospital (BCH). None of these patients actually received thrombolysis, but from those who would have been eligible we created two initial groups, the first representing a scenario where they received thrombolysis and the second a scenario where they did not. On the basis of the survival analysis, we created several subgroups based on discharge destination. We then developed a discrete event simulation (DES) model, where each group is a patient pathway within the simulation. Coxian phase-type distributions were used to model the group LOS. Various scenarios were explored, focusing on cost-effectiveness across hospital, community, and social services had thrombolysis been administered to these patients, and the possible improvement in quality of life should the proportion of patients who are administered thrombolysis be increased. Our aim in simulating various scenarios for this historical group of patients is to assess what the cost-effectiveness of thrombolysis would have been under different scenarios; from this we can infer the likely cost-effectiveness of future policies. The cost of thrombolysis is offset by reductions in hospital, community rehabilitation, and institutional care costs, with a corresponding improvement in quality of life. Our model suggests that provision of thrombolysis would produce a moderate overall improvement to the service, assuming current levels of funding.
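The pathway LOS modelling described above can be illustrated with a small sampler. A Coxian phase-type distribution is a chain of exponential phases with absorption (here, discharge) allowed after each phase; the rates and progression probabilities below are invented for illustration and are not the fitted BCH values.

```python
import numpy as np

rng = np.random.default_rng(1)

def coxian_los(rates, progress, rng):
    """Sample a length of stay from a Coxian phase-type distribution:
    a chain of exponential phases where, after phase i, the patient is
    discharged (absorbed) with probability 1 - progress[i], else moves on."""
    t, i = 0.0, 0
    while True:
        t += rng.exponential(1.0 / rates[i])
        if i == len(rates) - 1 or rng.random() > progress[i]:
            return t
        i += 1

# Hypothetical 3-phase fit (illustrative only, not the BCH-fitted values):
rates = [0.8, 0.15, 0.02]   # per-day exit rates: acute, rehab, long-stay phases
progress = [0.3, 0.2]       # probability of progressing to the next phase

los = np.array([coxian_los(rates, progress, rng) for _ in range(10_000)])
print(f"mean LOS {los.mean():.1f} days, 90th percentile {np.percentile(los, 90):.1f} days")
```

In a DES model, a draw like this would set the service time of each simulated patient on entry to their pathway.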
Frameworks for Assessing the Quality of Modeling and Simulation Capabilities
NASA Astrophysics Data System (ADS)
Rider, W. J.
2012-12-01
The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness, and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation, and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA has built yet another framework in response to the tragedy of the space shuttle accidents [NASA]. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended, incorporating elements from the others as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature in that no one likes to be graded, or told they are not sufficiently quality oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as collaborative tools, used to structure collaborations that help modeling and simulation efforts achieve high quality. The framework provides a comprehensive setting of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nucl. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A, 468, pp. 227-244, 2012.
Ecosystem Modeling Applied to Nutrient Criteria Development in Rivers
NASA Astrophysics Data System (ADS)
Carleton, James N.; Park, Richard A.; Clough, Jonathan S.
2009-09-01
Threshold concentrations for biological impairment by nutrients are difficult to quantify in lotic systems, yet States and Tribes in the United States are charged with developing water quality criteria to protect these ecosystems from excessive enrichment. The analysis described in this article explores the use of the ecosystem model AQUATOX to investigate impairment thresholds keyed to biological indexes that can be simulated. The indexes selected for this exercise include percentage cyanobacterial biomass of sestonic algae, and benthic chlorophyll a. The calibrated model was used to analyze responses of these indexes to concurrent reductions in phosphorus, nitrogen, and suspended sediment in an enriched upper Midwestern river. Results suggest that the indexes would respond strongly to changes in phosphorus and suspended sediment, and less strongly to changes in nitrogen concentration. Using simulated concurrent reductions in all three water quality constituents, a total phosphorus concentration of 0.1 mg/l was identified as a threshold concentration, and therefore a hypothetical water quality criterion, for prevention of both excessive periphyton growth and sestonic cyanobacterial blooms. This kind of analysis is suggested as a way to evaluate multiple contrasting impacts of hypothetical nutrient and sediment reductions and to define nutrient criteria or target concentrations that balance multiple management objectives concurrently.
Performance Analysis of IIUM Wireless Campus Network
NASA Astrophysics Data System (ADS)
Abd Latif, Suhaimi; Masud, Mosharrof H.; Anwar, Farhat
2013-12-01
International Islamic University Malaysia (IIUM) is one of the world's leading universities in terms of quality of education, achieved in part by providing numerous facilities, including wireless services, to every enrolled student. The quality of this wireless service is controlled and monitored by the Information Technology Division (ITD), an ISO-certified organization within the university. This paper investigates the constraints of the IIUM wireless campus network and evaluates its performance in terms of delay, throughput, and jitter. The QualNet 5.2 simulator was employed to measure the performance of the network. The observations from the simulation results could help ITD further improve its wireless services.
Lee, Ming-Feng; Lin, Ching-Lan Esther
2017-10-01
The negative attitudes of the general public toward mental illness frequently hinder the integration of mental illness patients into the community. Auditory hallucination simulation may be considered a creative teaching strategy for improving the attitudes of learners toward mental illness. However, the empirical effects of auditory hallucination simulation on changing negative attitudes toward mental illness remain uncertain. To compare and analyze, using a systematic review and meta-analysis, the effectiveness of auditory hallucination simulation in improving empathy, knowledge, social distance, and attitudes toward mental illness in undergraduates. A search using the keywords "auditory hallucination" and "simulation" and the 4 outcome indicators of empathy, knowledge, social distance, and attitudes toward mental illness was conducted to identify related articles published between 2008 and 2016 in 6 Chinese and English electronic databases, including Cochrane Library, EBSCO-CINAHL, MEDLINE, PsycINFO, PubMed, and Airiti Library. Research quality was appraised using the Modified Jadad Scale (MJS), the Oxford Centre for Evidence-Based Medicine Level of Evidence (OCEBM LoE), and the Cochrane Risk of Bias tool. Eleven studies were included, and 7 studies with sufficient data were entered into the meta-analysis. The meta-analysis showed that hallucination simulation significantly improved the empathy and knowledge of participants, with respective effect sizes of 0.63 (95% CI [0.21, 1.05]) and 0.69 (95% CI [0.43, 0.94]). However, this intervention also increased social distance, with an effect size of 0.60 (95% CI [0.01, 1.19]), and did not change attitudes toward mental illness significantly, with an effect size of 0.33 (95% CI [-0.11, 0.77]). Auditory hallucination simulation is an effective teaching strategy for improving the empathy and knowledge of undergraduates, but the evidence for its effects on social distance and attitudes toward mental illness needs further strengthening. Most of the extant research on this subject was conducted in the United States and Australia and was of moderate quality. Future studies should use sufficiently rigorous research designs to explore the safety issues and the effectiveness of the auditory hallucination simulation intervention in different countries and ethnic populations.
Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L
2016-12-01
We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction: nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, the number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence of, and concern about, bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
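For readers unfamiliar with the technique, a minimal probabilistic bias analysis for an unmeasured binary confounder looks like the following. The observed risk ratio and all prior distributions are hypothetical; the bias term is the standard external-adjustment expression, and stating these priors explicitly is exactly the reporting practice the review calls for.

```python
import numpy as np

rng = np.random.default_rng(7)
n_iter = 50_000
rr_obs = 1.50  # hypothetical observed risk ratio

# Explicit priors on the bias parameters (illustrative choices):
rr_cu = rng.triangular(1.2, 2.0, 3.0, n_iter)  # confounder-outcome risk ratio
p1 = rng.uniform(0.2, 0.6, n_iter)             # confounder prevalence, exposed
p0 = rng.uniform(0.1, 0.4, n_iter)             # confounder prevalence, unexposed

# Standard external-adjustment formula for a binary unmeasured confounder:
bias = (rr_cu * p1 + (1 - p1)) / (rr_cu * p0 + (1 - p0))
rr_adj = rr_obs / bias

lo, med, hi = np.percentile(rr_adj, [2.5, 50, 97.5])
print(f"bias-adjusted RR {med:.2f} (95% simulation interval {lo:.2f}-{hi:.2f})")
```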
Statistical Downscaling of WRF-Chem Model: An Air Quality Analysis over Bogota, Colombia
NASA Astrophysics Data System (ADS)
Kumar, Anikender; Rojas, Nestor
2015-04-01
Statistical downscaling is a technique used to extract high-resolution information from regional-scale variables produced by coarse-resolution models such as chemical transport models (CTMs). The fully coupled WRF-Chem (Weather Research and Forecasting with Chemistry) model is used to simulate air quality over Bogota, a tropical Andean megacity located on a high-altitude plateau in the middle of very complex terrain. The WRF-Chem model was adopted for simulating hourly ozone concentrations. The computational domains comprised 120x120x32, 121x121x32, and 121x121x32 grid points with horizontal resolutions of 27, 9, and 3 km, respectively. The model was initialized with real boundary conditions using NCAR-NCEP Final Analysis (FNL) data at 1°x1° (~111 km x 111 km) resolution. Boundary conditions were updated every 6 hours using reanalysis data. The emission rates were obtained from global inventories, namely the REanalysis of the TROpospheric chemical composition (RETRO) and the Emission Database for Global Atmospheric Research (EDGAR). Multiple linear regression and artificial neural network techniques are used to downscale the model output at each monitoring station. The results confirm that the statistically downscaled outputs reduce simulated errors by up to 25%. This study provides a general overview of statistical downscaling of chemical transport models and can serve as a reference for future air quality modeling exercises over Bogota and other Colombian cities.
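As an illustration of the station-level step (the multiple linear regression variant), the sketch below regresses synthetic "observed" ozone on coarse-model ozone and meteorological covariates. The data, covariates, and coefficients are all invented; the study's actual predictors are not listed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000  # hourly samples at one hypothetical station

# Synthetic stand-ins for model output and meteorological covariates:
o3_model = rng.gamma(4.0, 8.0, n)              # coarse-grid ozone [ppb]
t2 = rng.normal(288.0, 5.0, n)                 # 2-m temperature [K]
wspd = rng.gamma(2.0, 1.5, n)                  # wind speed [m/s]
o3_obs = 0.7 * o3_model + 1.2 * (t2 - 288.0) - 2.0 * wspd + 10.0 + rng.normal(0, 5, n)

# Multiple linear regression: observed = f(model ozone, covariates)
X = np.column_stack([np.ones(n), o3_model, t2, wspd])
beta, *_ = np.linalg.lstsq(X, o3_obs, rcond=None)
o3_down = X @ beta

rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
print(f"RMSE raw {rmse(o3_model, o3_obs):.1f} ppb -> downscaled {rmse(o3_down, o3_obs):.1f} ppb")
```

In practice the regression would be fit on a training period at each station and applied to the remaining period, which is what yields the reported error reductions.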
Improving the quality of pressure ulcer care with prevention: a cost-effectiveness analysis.
Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Sullivan, Patrick W
2011-04-01
In October 2008, Centers for Medicare and Medicaid Services discontinued reimbursement for hospital-acquired pressure ulcers (HAPUs), thus placing stress on hospitals to prevent incidence of this costly condition. To evaluate whether prevention methods are cost-effective compared with standard care in the management of HAPUs. A semi-Markov model simulated the admission of patients to an acute care hospital from the time of admission through 1 year using the societal perspective. The model simulated health states that could potentially lead to an HAPU through either the practice of "prevention" or "standard care." Univariate sensitivity analyses, threshold analyses, and Bayesian multivariate probabilistic sensitivity analysis using 10,000 Monte Carlo simulations were conducted. Cost per quality-adjusted life-years (QALYs) gained for the prevention of HAPUs. Prevention was cost saving and resulted in greater expected effectiveness compared with the standard care approach per hospitalization. The expected cost of prevention was $7276.35, and the expected effectiveness was 11.241 QALYs. The expected cost for standard care was $10,053.95, and the expected effectiveness was 9.342 QALYs. The multivariate probabilistic sensitivity analysis showed that prevention resulted in cost savings in 99.99% of the simulations. The threshold cost of prevention was $821.53 per day per person, whereas the cost of prevention was estimated to be $54.66 per day per person. This study suggests that it is more cost effective to pay for prevention of HAPUs compared with standard care. Continuous preventive care of HAPUs in acutely ill patients could potentially reduce incidence and prevalence, as well as lead to lower expenditures.
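The Bayesian multivariate probabilistic sensitivity analysis described above amounts to repeatedly drawing parameter sets and comparing arms. A toy version, with distributions invented around the reported point estimates (the study's actual model structure and priors are not reproduced), is:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 10_000  # PSA iterations, as in the study

# Distributions invented around the reported point estimates:
cost_prev = rng.gamma(100.0, 7276.35 / 100.0, n)   # prevention arm cost [$]
cost_std = rng.gamma(100.0, 10053.95 / 100.0, n)   # standard care arm cost [$]
qaly_prev = rng.normal(11.241, 0.5, n)
qaly_std = rng.normal(9.342, 0.5, n)

d_cost = cost_prev - cost_std
d_qaly = qaly_prev - qaly_std
print(f"prevention dominant (cheaper and more effective) in "
      f"{np.mean((d_cost < 0) & (d_qaly > 0)):.2%} of simulations")

# Net monetary benefit at an assumed $50,000/QALY willingness to pay:
nmb = 50_000.0 * d_qaly - d_cost
print(f"P(positive net benefit) = {np.mean(nmb > 0):.2%}")
```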
NASA Astrophysics Data System (ADS)
Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Baek, Seong-Min
2013-11-01
The purpose of this study is to present a new method of quality assurance (QA) to ensure effective evaluation of the accuracy of respiratory-gated radiotherapy (RGR). This would help in quantitatively analyzing the patient's respiratory cycle and respiration-induced tumor motion, and in performing a subsequent comparative analysis of dose distributions using the gamma-index method, as reproduced in our in-house developed respiration-simulating phantom. We therefore designed a respiration-simulating phantom capable of reproducing the patient's respiratory cycle and respiration-induced tumor motion and evaluated the accuracy of RGR by estimating its pass rates. We applied gamma-index passing criteria of 3% (dose difference) and 3 mm (distance to agreement) between the dose distribution calculated with the treatment planning system (TPS) and the actual dose distribution of RGR. The pass rate clearly increased as the gating width was narrowed. When respiration-induced tumor motion was 12 mm or less, pass rates of 85% and above were achieved for the 30-70% respiratory phase, and pass rates of 90% and above were achieved for the 40-60% respiratory phase. However, a respiratory cycle with a very small fluctuation range of pass rates proved unreliable for evaluating the accuracy of RGR. Accurate and reliable radiotherapy outcomes will therefore be obtainable only by establishing a novel QA system combining the respiration-simulating phantom, gamma-index analysis, and a quantitative analysis of diaphragmatic motion, enabling an indirect measurement of tumor motion.
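The gamma-index comparison takes, at each reference point, the minimum over evaluated points of a combined dose-difference and distance-to-agreement metric, and a point passes when gamma <= 1. A 1-D sketch with made-up dose profiles is shown below; the clinical analysis is of course 2-D or 3-D, but the metric is the same.

```python
import numpy as np

def gamma_pass_rate(x, d_ref, d_eval, dose_tol=0.03, dta_mm=3.0):
    """1-D global gamma analysis: for each reference point, take the minimum
    combined dose-difference / distance-to-agreement metric over all
    evaluated points; a point passes when gamma <= 1."""
    dd_norm = dose_tol * d_ref.max()                     # global 3%-of-max normalisation
    dist = (x[:, None] - x[None, :]) / dta_mm            # distance term [rows: ref]
    dose = (d_eval[None, :] - d_ref[:, None]) / dd_norm  # dose-difference term
    gamma = np.sqrt(dist ** 2 + dose ** 2).min(axis=1)
    return 100.0 * np.mean(gamma <= 1.0)

# Illustrative profiles on a 1 mm grid (hypothetical data, not the phantom's):
x = np.arange(0.0, 100.0, 1.0)                          # position [mm]
d_tps = np.exp(-0.5 * ((x - 50.0) / 15.0) ** 2)         # TPS-calculated profile
d_meas = 1.02 * np.exp(-0.5 * ((x - 51.5) / 15.0) ** 2) # shifted, rescaled measurement

print(f"gamma pass rate (3%/3 mm): {gamma_pass_rate(x, d_tps, d_meas):.1f}%")
```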
Simulation of atmospheric oxidation capacity in Houston, Texas
Air quality model simulations are performed and evaluated for Houston using the Community Multiscale Air Quality (CMAQ) model. The simulations use two different emissions estimates: the EPA 2005 National Emissions Inventory (NEI) and the Texas Commission on Environmental Quality ...
Muller-Juge, Virginie; Cullati, Stéphane; Blondon, Katherine S; Hudelson, Patricia; Maître, Fabienne; Vu, Nu V; Savoldelli, Georges L; Nendaz, Mathieu R
2014-01-01
Effective teamwork is necessary for optimal patient care. There is insufficient understanding of interactions between physicians and nurses on internal medicine wards. To describe resident physicians' and nurses' actual behaviours contributing to teamwork quality in the setting of a simulated internal medicine ward. A volunteer sample of 14 pairs of residents and nurses in internal medicine was asked to manage one non-urgent and one urgent clinical case in a simulated ward, using a high-fidelity manikin. After the simulation, participants attended a stimulated-recall session during which they viewed the videotape of the simulation and explained their actions and perceptions. All simulations were transcribed, coded, and analyzed, using a qualitative method (template analysis). Quality of teamwork was assessed, based on patient management efficiency and presence of shared management goals and of team spirit. Most resident-nurse pairs tended to interact in a traditional way, with residents taking the leadership and nurses executing medical prescriptions and assuming their own specific role. They also demonstrated different types of interactions involving shared responsibilities and decision making, constructive suggestions, active communication and listening, and manifestations of positive team building. The presence of a leader in the pair or a truly shared leadership between resident and nurse contributed to teamwork quality only if both members of the pair demonstrated sufficient autonomy. In case of a lack of autonomy of one member, the other member could compensate for it, if his/her own autonomy was sufficiently strong and if there were demonstrations of mutual listening, information sharing, and positive team building. Although they often relied on traditional types of interaction, residents and nurses also demonstrated readiness for increased sharing of responsibilities. Interprofessional education should insist on better redefinition of respective roles and reinforce behaviours shown to enhance teamwork quality.
Matschek, Janine; Bullinger, Eric; von Haeseler, Friedrich; Skalej, Martin; Findeisen, Rolf
2017-02-01
Radiofrequency ablation is a valuable tool in the treatment of many diseases, especially cancer. However, controlled heating up to apoptosis of the desired target tissue in complex situations, e.g. in the spine, is challenging and requires experienced interventionalists. For such challenging situations, a mathematical model of radiofrequency ablation makes it possible to understand, improve, and optimise the outcome of the medical therapy. The main contribution of this work is the derivation of a tailored, yet expandable, mathematical model for the simulation, analysis, planning, and control of radiofrequency ablation in complex situations. The dynamic model consists of partial differential equations that describe the potential and temperature distributions during the intervention. To account for multipolar operation, time-dependent boundary conditions are introduced. Spatially distributed parameters, like tissue conductivity and blood perfusion, allow the complex 3D environment, representing the diverse tissue types involved in the spine, to be described. To identify the key parameters affecting the prediction quality of the model, the influence of the parameters on the temperature distribution is investigated via a sensitivity analysis. Simulations underpin the quality of the derived model and the analysis approach. The proposed modelling and analysis schemes set the basis for intervention planning, state and parameter estimation, and control. Copyright © 2016. Published by Elsevier Inc.
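The temperature part of such a model is commonly the Pennes bioheat equation (with a perfusion heat-sink term, as the abstract's mention of blood perfusion suggests). A 1-D explicit finite-difference toy version with an assumed RF heat source follows; all coefficients are generic textbook-order values, not the paper's spine-specific, spatially distributed parameters.

```python
import numpy as np

# 1-D explicit finite-difference Pennes bioheat equation (illustrative):
# rho*c * dT/dt = k * d2T/dx2 - w_b*c_b * (T - T_art) + q_rf
nx, dx, dt = 101, 1e-3, 0.05                 # grid points, spacing [m], step [s]
rho_c = 3.6e6                                # volumetric heat capacity [J/m^3/K]
k = 0.5                                      # thermal conductivity [W/m/K]
wb_cb = 2e3                                  # perfusion heat-sink coeff. [W/m^3/K]
t_art = 37.0                                 # arterial blood temperature [degC]

x = np.arange(nx) * dx
q_rf = 5e5 * np.exp(-((x - 0.05) / 0.005) ** 2)  # assumed RF power density [W/m^3]

T = np.full(nx, 37.0)
for _ in range(int(60.0 / dt)):              # simulate 60 s of heating
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx ** 2
    lap[0] = lap[-1] = 0.0                   # fixed-temperature boundaries
    T += dt / rho_c * (k * lap - wb_cb * (T - t_art) + q_rf)
    T[0] = T[-1] = 37.0

print(f"peak temperature after 60 s: {T.max():.1f} degC")
```

A sensitivity analysis of the kind the paper describes would rerun this forward model while perturbing parameters such as `k` or `wb_cb` and record the change in the temperature field.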
NASA Technical Reports Server (NTRS)
Roskam, Jan; Ackers, Deane E.; Gerren, Donna S.
1995-01-01
A propulsion controlled aircraft (PCA) system has been developed at NASA Dryden Flight Research Center at Edwards Air Force Base, California, to provide safe, emergency landing capability should the primary flight control system of the aircraft fail. As a result of the successful PCA work being done at NASA Dryden, this project investigated the possibility of incorporating the PCA system as a backup flight control system in the design of a large, ultra-high-capacity megatransport in such a way that flight path control using only the engines is not only possible, but meets MIL-Spec Level 1 or Level 2 handling quality requirements. An 800-passenger megatransport aircraft was designed and programmed into the NASA Dryden simulator. Several analysis methods were used to evaluate the flying qualities of the megatransport while using engine thrust for flight path control, including: (1) Bode and root locus plot analysis to evaluate the frequency and damping ratio response of the megatransport; (2) analysis of actual simulator strip chart recordings to evaluate the time history response of the megatransport; and (3) analysis of Cooper-Harper pilot ratings by two NASA test pilots.
NASA Astrophysics Data System (ADS)
Parker, Gary D.
1986-03-01
Galileo's earliest telescopic measurements are of sufficient quality that their detailed analysis yields scientifically interesting and pedagogically useful results. An optical illusion strongly influences Galileo's observations of Jupiter's moons, as published in the Starry Messenger. A simple procedure identifies individual satellites with sufficient reliability to demonstrate that Galileo regularly underestimated satellite brightness and overestimated elongation when a satellite was very close to Jupiter. The probability of underestimation is a monotonically decreasing function of separation angle, both for Galileo and for viewers of a laboratory simulation of the Jupiter ``starfield'' viewed by Galileo. Analysis of Galileo's records and a simple simulation experiment appropriate to undergraduate courses clarify the scientific problems facing Galileo in interpreting his observations.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
Evaluation of regional climate simulations for air quality modelling purposes
NASA Astrophysics Data System (ADS)
Menut, Laurent; Tripathi, Om P.; Colette, Augustin; Vautard, Robert; Flaounas, Emmanouil; Bessagnet, Bertrand
2013-05-01
In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCM) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that is to be added to all other types of uncertainties in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis-forced regional climate simulations with GCM-forced ones. As an example we use simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation by GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.
Space Shuttle Main Engine (SSME) LOX turbopump pump-end bearing analysis
NASA Technical Reports Server (NTRS)
1986-01-01
A simulation of the shaft/bearing system of the Space Shuttle Main Engine Liquid Oxygen turbopump was developed. The simulation model allows the thermal and mechanical characteristics to interact as a realistic simulation of the bearing operating characteristics. The model accounts for single and two phase coolant conditions, and includes the heat generation from bearing friction and fluid stirring. Using the simulation model, parametric analyses were performed on the 45 mm pump-end bearings to investigate the sensitivity of bearing characteristics to contact friction, axial preload, coolant flow rate, coolant inlet temperature and quality, heat transfer coefficients, outer race clearance and misalignment, and the effects of thermally isolating the outer race from the isolator.
NASA Astrophysics Data System (ADS)
Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming
2018-03-01
To investigate the effect of abrasive flow polishing on variable-diameter pipe parts, taking high-precision dispensing needles as the research object, a numerical simulation of the process of polishing a high-precision dispensing needle was carried out. The distributions of dynamic pressure and turbulence viscosity of the abrasive flow field inside the needle were analyzed under different volume-fraction conditions. The comparative analysis demonstrates the effectiveness of abrasive-grain polishing for high-precision dispensing needles: controlling the volume fraction of silicon carbide changes the viscosity characteristics of the abrasive flow during polishing, so that the polishing quality of the abrasive grains can be controlled.
Parameterizing water quality analysis and simulation program (WASP) for carbon-based nanomaterials
Carbon nanotubes (CNT) and graphenes are among the most popular carbon-based nanomaterials due to their unique electronic, mechanic and structural properties. Exposure modeling of these nanomaterials in the aquatic environment is necessary to predict the fate of these materials. ...
Understanding ozone response to its precursor emissions is crucial for effective air quality management practices. This nonlinear response is usually simulated using chemical transport models, and the modeling results are affected by uncertainties in emissions inputs. In this stu...
Ortiz, Roderick F.; Miller, Lisa D.
2009-01-01
Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Southern Delivery System (SDS) project is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various Environmental Impact Statement (EIS) alternatives and plans by Pueblo West to discharge treated wastewater into the reservoir. Wastewater plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (year 2006 demand conditions) were compared to the No Action scenario (projected demands in 2046) to assess changes in water quality over time. All scenario modeling used an external nutrient-decay model to simulate degradation and assimilation of nutrients along the riverine reach upstream from Pueblo Reservoir. Reservoir modeling was conducted using the U.S. Army Corps of Engineers CE-QUAL-W2 two-dimensional water-quality model. Lake hydrodynamics, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, algal biomass, and total iron were simulated. Two reservoir site locations were selected for comparison. Results of simulations at site 3B were characteristic of a riverine environment in the reservoir, whereas results at site 7B (near the dam) were characteristic of the main body of the reservoir. Simulation results for the epilimnion and hypolimnion at these two sites also were evaluated and compared. The simulation results in the hypolimnion at site 7B were indicative of the water quality leaving the reservoir. Comparisons of the different scenario results were conducted to assess whether substantial differences were observed between selected scenarios. Each of the scenarios was simulated for three contiguous years representing a wet, average, and dry annual hydrologic cycle (water years 2000 through 2002). Additionally, each selected simulation scenario was evaluated for differences between its direct and cumulative effects. Direct effects are intended to isolate the future effects of the scenarios.
Cumulative effects are intended to evaluate the effects of the scenarios in conjunction with all reasonably foreseeable future activities in the study area. Comparisons between the direct- and cumulative-effects analyses indicated that there were not large differences in the results between most of the simulation scenarios, and, as such, the focus of this report was on results for the direct-effects analysis. Additionally, the differences between simulation results generally were
Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J
2009-01-01
Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67×3 (67 clusters of three observations) and a 33×6 (33 clusters of six observations) sampling scheme to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67×3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can impact dramatically the classification error that is associated with LQAS analysis. PMID:20011037
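The simulation logic is straightforward to reproduce in outline: draw cluster-level prevalences from a beta distribution whose variance encodes the intracluster correlation, sample observations within each cluster, and apply an LQAS decision rule. The threshold, ICC, and decision value below are placeholders, not the study's.

```python
import numpy as np

rng = np.random.default_rng(11)

def lqas_error(n_clusters, m, true_p, icc, decision_value, n_sims=20_000):
    """Monte Carlo classification error for a clustered LQAS design.
    Cluster prevalences are beta-distributed with variance chosen so the
    intracluster correlation equals `icc` (a beta-binomial model)."""
    a = true_p * (1.0 - icc) / icc
    b = (1.0 - true_p) * (1.0 - icc) / icc
    p_clust = rng.beta(a, b, size=(n_sims, n_clusters))
    cases = rng.binomial(m, p_clust).sum(axis=1)
    # Area judged "acceptable" when observed cases stay below the decision value:
    return np.mean(cases < decision_value)

# Illustrative run (all values hypothetical): 67 clusters of 3 observations,
# a truly high-prevalence area (15% GAM), moderate ICC, decision value 13.
err = lqas_error(n_clusters=67, m=3, true_p=0.15, icc=0.05, decision_value=13)
print(f"P(misclassifying a 15%-prevalence area as acceptable) = {err:.3f}")
```

Raising `icc` in this sketch shows the effect the authors warn about: intercluster correlation inflates the classification error relative to the simple-random-sampling assumption.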
A Computational Observer For Performing Contrast-Detail Analysis Of Ultrasound Images
NASA Astrophysics Data System (ADS)
Lopez, H.; Loew, M. H.
1988-06-01
Contrast-Detail (C/D) analysis allows the quantitative determination of an imaging system's ability to display a range of varying-size targets as a function of contrast. Using this technique, a contrast-detail plot is obtained which can, in theory, be used to compare image quality from one imaging system to another. The C/D plot, however, is usually obtained by using data from human observer readings. We have shown earlier(7) that the performance of human observers in the task of threshold detection of simulated lesions embedded in random ultrasound noise is highly inaccurate and non-reproducible for untrained observers. We present an objective, computational method for the determination of the C/D curve for ultrasound images. This method utilizes digital images of the C/D phantom developed at CDRH, and lesion-detection algorithms that simulate the Bayesian approach using the likelihood function for an ideal observer. We present the results of this method, and discuss the relationship to the human observer and to the comparability of image quality between systems.
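For a known signal in white Gaussian noise, the ideal observer's likelihood-ratio test reduces to a matched filter (template correlation), which is presumably the flavour of computation the authors describe; the sketch below estimates detectability for a synthetic Gaussian lesion and is illustrative only, since ultrasound speckle noise is neither white nor Gaussian.

```python
import numpy as np

rng = np.random.default_rng(5)

def gaussian_lesion(size=32, sigma=4.0, contrast=0.5):
    y, x = np.mgrid[:size, :size] - size // 2
    return contrast * np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))

# With a known signal in white Gaussian noise, the likelihood-ratio test
# reduces to correlating the image with the signal template; detection
# thresholds that matched-filter score.
signal = gaussian_lesion()
noise_sd = 1.0

def observer_score(img):
    return float((img * signal).sum())  # matched-filter test statistic

# Estimate detectability by scoring lesion-present vs. lesion-absent images:
n = 2000
absent = np.array([observer_score(rng.normal(0, noise_sd, signal.shape))
                   for _ in range(n)])
present = np.array([observer_score(signal + rng.normal(0, noise_sd, signal.shape))
                    for _ in range(n)])
d_prime = (present.mean() - absent.mean()) / np.sqrt(0.5 * (present.var() + absent.var()))
print(f"ideal-observer detectability d' = {d_prime:.2f}")
```

Sweeping `contrast` and `sigma` over a grid of target sizes and recording the threshold contrast at fixed d' would trace out a C/D curve of the kind described above.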
van Tulder, Raphael; Roth, Dominik; Laggner, Roberta; Krammel, Mario; Schriefl, Christoph; Kienbacher, Calvin; Novosad, Heinz; Chwojka, Constantin Christof; Sterz, Fritz; Havel, Christof; Schreiber, Wolfgang; Herkner, Harald
2017-02-01
The quality of telephone-assisted cardiopulmonary resuscitation (CPR) needs improvement. This study investigates whether a dispatcher's perception is an adequate measure of the actual quality of CPR provided by laypersons. Individual participant data from 3 randomized simulation trials, with identical methodology but different interventions, were combined for this analysis. Professional dispatchers gave telephone assistance to laypersons, who each provided 10 minutes of CPR on a manikin. Dispatchers were requested to classify the quality of providers' CPR as adequate or inadequate. Based on actual readings from the manikins, we classified providers' performance as adequate at 5-6 cm for depth and 100-120 compressions per minute (cpm) for rate. We calculated metrics of dispatcher accuracy. Six dispatchers rated the performance of 94 laypersons (38 women [42%]) with a mean (SD) age of 37 (14) years. In 905 analyzed minutes of telephone-assisted CPR, the mean compression depth and rate were 41 (13) mm and 98 (24) cpm, respectively. Analysis of dispatchers' diagnostic test accuracy for adequate compression depth yielded a sensitivity of 65% (95% CI, 36%-95%) and specificity of 42% (95% CI, 32%-53%). Analysis of their assessment of adequate compression rate yielded a sensitivity of 75% (95% CI, 64%-86%) and specificity of 42% (95% CI, 32%-52%). Although dispatchers always underestimated the actual values of CPR parameters, the female dispatchers' evaluations were less inaccurate than those of the male dispatchers; the dispatchers overall (males and females together) underestimated the adequacy of female laypersons' CPR performance to a greater degree than female dispatchers did. The ability of dispatchers to estimate the quality of telephone-assisted CPR is limited. Dispatchers' estimation of CPR adequacy needs to be studied further in order to find ways in which telephone-assisted CPR might be improved.
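The diagnostic-accuracy metrics reported above are simple to compute once each minute of CPR is cross-classified by dispatcher call and manikin reading. A toy computation with simulated stand-in data (not the study's records) follows.

```python
import numpy as np

def sens_spec(dispatcher_ok, actual_ok):
    """Treat the manikin reading as ground truth and the dispatcher's
    'adequate' call as the test result; boolean arrays, one entry per minute."""
    d, a = np.asarray(dispatcher_ok), np.asarray(actual_ok)
    sens = (d & a).sum() / a.sum()          # adequate minutes correctly called
    spec = (~d & ~a).sum() / (~a).sum()     # inadequate minutes correctly called
    return sens, spec

# Simulated stand-in data: 905 minutes, a noisy rater agreeing 60% of the time.
rng = np.random.default_rng(2)
actual = rng.random(905) < 0.35
dispatcher = np.where(rng.random(905) < 0.6, actual, ~actual)
sens, spec = sens_spec(dispatcher, actual)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```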
Cohen, Daniel; Vlaev, Ivo; McMahon, Laurie; Harvey, Sarah; Mitchell, Andy; Borovoi, Leah; Darzi, Ara
2017-05-11
The Health and Social Care Act 2012 represents the most complex National Health Service reforms in history. High-quality clinical leadership is important for successful implementation of health service reform. However, little is known about the effectiveness of current leadership training. This study describes the use of a behavioral simulation to improve the knowledge and leadership of a cohort of medical doctors expected to take leadership roles in the National Health Service. A day-long behavioral simulation (The Crucible) was developed and run based on a fictitious but realistic health economy. Participants completed pre- and postsimulation questionnaires generating qualitative and quantitative data. Leadership skills, knowledge, and behavior change processes described by the "theory of planned behavior" were self-assessed pre- and postsimulation. Sixty-nine medical doctors attended. Participants deemed the simulation immersive and relevant. Significant improvements were shown in perceived knowledge, capability, attitudes, subjective norms, intentions, and leadership competency following the program. Nearly one third of participants reported that they had implemented knowledge and skills from the simulation into practice within 4 weeks. This study systematically demonstrates the effectiveness of behavioral simulation for clinical management training and understanding of health policy reform. Potential future uses and strategies for analysis are discussed. High-quality care requires understanding of health systems and strong leadership. Policymakers should consider the use of behavioral simulation to improve understanding of health service reform and development of leadership skills in clinicians, who readily adopt skills from simulation into everyday practice.
Image Quality Ranking Method for Microscopy
Koho, Sami; Fazeli, Elnaz; Eriksson, John E.; Hänninen, Pekka E.
2016-01-01
Automated analysis of microscope images is necessitated by the increased need for high-resolution follow-up of events in time. Manually finding the right images to be analyzed, or eliminating unsuitable images from data analysis, are common day-to-day problems in microscopy research today, and the constantly growing size of image datasets does not help the matter. We propose a simple method and a software tool for sorting images within a dataset according to their relative quality. We demonstrate the applicability of our method in finding good quality images in a STED microscope sample preparation optimization image dataset. The results are validated by comparisons to subjective opinion scores, as well as five state-of-the-art blind image quality assessment methods. We also show how our method can be applied to eliminate useless out-of-focus images in a High-Content-Screening experiment. We further evaluate the ability of our image quality ranking method to detect out-of-focus images, by extensive simulations, and by comparing its performance against previously published, well-established microscopy autofocus metrics. PMID:27364703
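As a rough illustration of relative quality ranking, the sketch below scores images by the share of spectral power at high spatial frequencies, a generic proxy under which blurred or out-of-focus images rank lower; this is not the authors' published metric.

```python
import numpy as np

def highfreq_quality_score(image, cutoff=0.4):
    """Rankable quality proxy: fraction of spectral power above a
    normalized radial frequency cutoff. Out-of-focus images lose
    high-frequency content and therefore score lower."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ny, nx = image.shape
    y, x = np.mgrid[-ny // 2:ny - ny // 2, -nx // 2:nx - nx // 2]
    r = np.hypot(y / (ny / 2), x / (nx / 2))  # normalized radius
    return float(power[r > cutoff].sum() / power.sum())

def rank_images(images):
    """Indices of images sorted from best to worst by the score above."""
    return sorted(range(len(images)),
                  key=lambda i: highfreq_quality_score(images[i]),
                  reverse=True)
```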
Development of flying qualities criteria for single pilot instrument flight operations
NASA Technical Reports Server (NTRS)
Bar-Gill, A.; Nixon, W. B.; Miller, G. E.
1982-01-01
Flying qualities criteria for Single Pilot Instrument Flight Rule (SPIFR) operations were investigated. The ARA aircraft was modified and adapted for SPIFR operations. Aircraft configurations to be flight-tested were chosen and matched on the ARA in-flight simulator, implementing modern control theory algorithms. Mission planning and experimental matrix design were completed. Microprocessor software for the onboard data acquisition system was debugged and flight-tested. Flight-path reconstruction procedure and the associated FORTRAN program were developed. Algorithms associated with the statistical analysis of flight test results and the SPIFR flying qualities criteria deduction are discussed.
"Updates to Model Algorithms & Inputs for the Biogenic ...
We have developed new canopy emission algorithms and land use data for BEIS. Simulations with BEIS v3.4 and these updates in CMAQ v5.0.2 are compared to the Model of Emissions of Gases and Aerosols from Nature (MEGAN), and the simulations are evaluated against observations. This has resulted in improvements in model evaluations of modeled isoprene, NOx, and O3. The National Exposure Research Laboratory (NERL) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of the EPA's mission to protect human health and the environment. The AMAD research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollution problem, but also in developing emission control policies and regulations for air quality improvements.
Beam Characterization at the Neutron Radiography Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarah Morgan; Jeffrey King
The quality of a neutron imaging beam directly impacts the quality of radiographic images produced using that beam. Fully characterizing a neutron beam, including determination of the beam's effective length-to-diameter ratio, neutron flux profile, energy spectrum, image quality, and beam divergence, is vital for producing quality radiographic images. This project characterized the east neutron imaging beamline at the Idaho National Laboratory Neutron Radiography Reactor (NRAD). The experiments which measured the beam's effective length-to-diameter ratio and image quality are based on American Society for Testing and Materials (ASTM) standards. An analysis of the image produced by a calibrated phantom measured the beam divergence. The energy spectrum measurements consist of a series of foil irradiations using a selection of activation foils, compared to the results produced by a Monte Carlo n-Particle (MCNP) model of the beamline. Improvement of the existing NRAD MCNP beamline model includes validation of the model's energy spectrum and the development of enhanced image simulation methods. The image simulation methods predict the radiographic image of an object based on the foil reaction rate data obtained by placing a model of the object in front of the image plane in an MCNP beamline model.
AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool
Halford, Keith
2009-01-01
Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.
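The calibration loop at the heart of this approach can be sketched compactly. AnalyzeHOLE drives MODFLOW and PEST; the toy below only illustrates the underlying idea of estimating interval hydraulic conductivities by minimizing squared differences between simulated and measured flows, with a deliberately trivial stand-in for the flow model and hypothetical values throughout.

```python
import numpy as np
from scipy.optimize import least_squares

def simulated_flows(log10_k, thickness, drawdown=3.0):
    """Hypothetical stand-in for a MODFLOW run: flow into the well from
    each screened interval, proportional to interval K and thickness."""
    return (10.0 ** log10_k) * thickness * drawdown

def residuals(log10_k, thickness, measured):
    return simulated_flows(log10_k, thickness) - measured

thickness = np.array([5.0, 8.0, 3.0, 10.0])   # m, per screened interval
measured = np.array([12.0, 4.0, 30.0, 6.0])   # m3/d, from the flow log
fit = least_squares(residuals, x0=np.zeros(4), args=(thickness, measured))
print("estimated K (m/d):", 10.0 ** fit.x)
```

Estimating log10(K) rather than K keeps the parameters positive and better conditioned, mirroring the common practice in PEST-style calibrations; the regularization and transmissivity constraints described above are omitted here.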
NASA Astrophysics Data System (ADS)
Zhang, Y. Y.; Shao, Q. X.; Ye, A. Z.; Xing, H. T.; Xia, J.
2016-02-01
Integrated water system modeling is a feasible approach to understanding severe water crises in the world and promoting the implementation of integrated river basin management. In this study, a classic hydrological model (the time variant gain model: TVGM) was extended to an integrated water system model by coupling multiple water-related processes in hydrology, biogeochemistry, water quality, and ecology, and considering the interference of human activities. A parameter analysis tool, which included sensitivity analysis, autocalibration and model performance evaluation, was developed to improve modeling efficiency. To demonstrate the model performances, the Shaying River catchment, which is the largest highly regulated and heavily polluted tributary of the Huai River basin in China, was selected as the case study area. The model performances were evaluated on the key water-related components including runoff, water quality, diffuse pollution load (or nonpoint sources) and crop yield. Results showed that our proposed model simulated most components reasonably well. The simulated daily runoff at most regulated and less-regulated stations matched well with the observations. The average correlation coefficient and Nash-Sutcliffe efficiency were 0.85 and 0.70, respectively. Both the simulated low and high flows at most stations were improved when the dam regulation was considered. The daily ammonium-nitrogen (NH4-N) concentration was also well captured with the average correlation coefficient of 0.67. Furthermore, the diffuse source load of NH4-N and the corn yield were reasonably simulated at the administrative region scale. This integrated water system model is expected to improve the simulation performances with extension to more model functionalities, and to provide a scientific basis for implementation in integrated river basin management.
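The two goodness-of-fit measures quoted, the correlation coefficient and the Nash-Sutcliffe efficiency, are standard; for reference, a minimal implementation:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    NSE = 1 is a perfect fit; NSE <= 0 is no better than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def correlation(obs, sim):
    return float(np.corrcoef(obs, sim)[0, 1])

obs = [3.1, 4.0, 2.2, 5.5]
sim = [2.8, 4.3, 2.0, 5.9]
print(nash_sutcliffe(obs, sim), correlation(obs, sim))
```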
Sensitivity of air quality simulation to smoke plume rise
Yongqiang Liu; Gary Achtemeier; Scott Goodrick
2008-01-01
Plume rise is the height smoke plumes can reach. This information is needed by air quality models such as the Community Multiscale Air Quality (CMAQ) model to simulate physical and chemical processes of point-source fire emissions. This study seeks to understand the sensitivity of CMAQ air quality simulations of prescribed burning to plume rise. CMAQ...
A FAST BAYESIAN METHOD FOR UPDATING AND FORECASTING HOURLY OZONE LEVELS
A Bayesian hierarchical space-time model is proposed by combining information from real-time ambient AIRNow air monitoring data, and output from a computer simulation model known as the Community Multi-scale Air Quality (Eta-CMAQ) forecast model. A model validation analysis shows...
Ortiz, Roderick F.; Galloway, Joel M.; Miller, Lisa D.; Mau, David P.
2008-01-01
Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Bureau of Reclamation is working to meet its goal to issue a Final Environmental Impact Statement (EIS) on the Southern Delivery System project (SDS). SDS is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various EIS alternatives and plans by Pueblo West to discharge treated water into the reservoir. Plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (water years 2000 through 2002) were compared to the No Action scenario (projected demands in 2046) to assess changes in water quality over time. All scenario modeling used an external nutrient-decay model to simulate degradation and assimilation of nutrients along the riverine reach upstream from Pueblo Reservoir. Reservoir modeling was conducted using the U.S. Army Corps of Engineers CE-QUAL-W2 two-dimensional water-quality model. Lake hydrodynamics, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, algal biomass, and total iron were simulated. Two reservoir site locations were selected for comparison. Results of simulations at site 3B were characteristic of a riverine environment in the reservoir while results at site 7B (near the dam) were characteristic of the main body of the reservoir. Simulation results for the epilimnion and hypolimnion at these two sites also were evaluated and compared. The simulation results in the hypolimnion at site 7B were indicative of the water quality leaving the reservoir. Comparisons of the different scenario results were conducted to assess if substantial differences were observed between selected scenarios. Each of the scenarios was simulated for three contiguous years representing a wet, average, and dry annual hydrologic cycle (water years 2000 through 2002). Additionally, each selected simulation scenario was evaluated for differences in direct- and cumulative-effects on a particular scenario. 
Direct effects are intended to isolate the future effects of the scenarios. Cumulative effects are intended to evaluate the effects of the scenarios in conjunction with all reasonably foreseeable future activities in the study area. Comparisons between the direct- and cumulative-effects analyses indicated that there were no large differences in the results between most of the simulation scenarios and, as such, the focus of this report was on results for the direct-effects analysis.
Non-Linear Metamodeling Extensions to the Robust Parameter Design of Computer Simulations
2016-09-15
The impact of on-site wastewater from high density cluster developments on groundwater quality
NASA Astrophysics Data System (ADS)
Morrissey, P. J.; Johnston, P. M.; Gill, L. W.
2015-11-01
The net impact on groundwater quality from high density clusters of unsewered housing across a range of hydro(geo)logical settings has been assessed. Four separate cluster development sites were selected, each representative of a different aquifer vulnerability category. Groundwater samples were collected on a monthly basis over a two year period for chemical and microbiological analysis from nested multi-horizon sampling boreholes upstream and downstream of the study sites. The field results showed no statistically significant difference between upstream and downstream water quality at any of the study areas, although there were higher breakthroughs of contaminants at the High and Extreme vulnerability sites linked to high intensity rainfall events; these, however, could not be directly attributed to on-site effluent. Linked numerical models were then built for each site using HYDRUS 2D to simulate the attenuation of contaminants through the unsaturated zone, from which the resulting hydraulic and contaminant fluxes at the water table were used as inputs into MODFLOW MT3D models to simulate the groundwater flows. The results of the simulations confirmed the field observations at each site, indicating that the existing clustered on-site wastewater discharges would only cause limited and very localised impacts on groundwater quality, with contaminant loads being quickly dispersed and diluted downstream due to the relatively high groundwater flow rates. Further simulations were then carried out using the calibrated models to assess the impact of increasing cluster densities, revealing little impact at any of the study locations up to a density of 6 units/ha, with the exception of the Extreme vulnerability site.
Parallel discrete-event simulation of FCFS stochastic queueing networks
NASA Technical Reports Server (NTRS)
Nicol, David M.
1988-01-01
Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments) which has proven itself to be effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads; and performance tradeoffs between the quality of lookahead and the cost of computing it are discussed.
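The appointment idea can be reduced to a one-line bound; the following is a simplified illustration (not the paper's exact protocol) of the earliest departure a FCFS server can promise its downstream neighbor:

```python
def fcfs_lookahead(now, busy_until, queued_service_times, min_service):
    """Lower bound on the departure time of any job that has NOT yet
    arrived at a FCFS server. Jobs already queued have pre-sampled service
    times that must elapse first; a not-yet-arrived job then needs at
    least the minimum possible service time. The server can send this
    bound to its neighbor as an 'appointment', letting the neighbor
    safely simulate ahead to that time."""
    t = max(now, busy_until) + sum(queued_service_times)
    return t + min_service

# Server busy until t=5.0 with two queued jobs (2.0 and 1.5) and a minimum
# service time of 0.5: no unseen departure can reach the neighbor before 9.0.
print(fcfs_lookahead(now=4.0, busy_until=5.0,
                     queued_service_times=[2.0, 1.5], min_service=0.5))
```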
NASA Astrophysics Data System (ADS)
Gholami, V.; Khaleghi, M. R.; Sebghati, M.
2017-11-01
The process of water quality testing is costly, time-consuming, and a difficult stage of routine monitoring. Therefore, the use of models has become commonplace in simulating water quality. In this study, the coactive neuro-fuzzy inference system (CANFIS) was used to simulate groundwater quality. Further, a geographic information system (GIS) was used as the pre-processor and post-processor tool to demonstrate spatial variation of groundwater quality. All important factors were quantified and a groundwater quality index (GWQI) was developed. The proposed model was trained and validated by taking a case study of the Mazandaran Plain located in the northern part of Iran. The factors affecting groundwater quality were the input variables for the simulation, whereas the GWQI index was the output. The developed model was validated to simulate groundwater quality. Network validation was performed via comparison between the estimated and actual GWQI values. In GIS, the study area was converted to raster format with a pixel size of 1 km, and by incorporation of the input data layers of the Fuzzy Network-CANFIS model, geo-referenced layers of the factors affecting groundwater quality were obtained. The numeric values of each pixel, with geographical coordinates, were then entered into the Fuzzy Network-CANFIS model, and thus a simulation of groundwater quality was obtained across the study area. Finally, the simulated GWQI indices from the Fuzzy Network-CANFIS model were entered into GIS, and a groundwater quality map (raster layer) based on the results of the network simulation was produced. The study's results confirm the high efficiency of combining neuro-fuzzy techniques with GIS. It is also worth noting that the general quality of the groundwater in most of the studied plain is fairly low.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, Paolo; Theiler, C.; Fasoli, A.
A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
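A global metric of this kind can be sketched as a weighted mean of per-observable discrepancies; the following is a minimal sketch in the spirit of the methodology, not its exact definition:

```python
import numpy as np

def global_agreement(d, h):
    """Composite agreement metric: weighted mean of per-observable
    discrepancies d_j (each normalized by the combined experimental and
    simulation uncertainty), with weights h_j reflecting how directly
    each observable constrains the model. Small chi indicates good
    agreement; the total weight summarizes how stringent the validation
    exercise was. A sketch only, not the paper's exact formulas."""
    d, h = np.asarray(d, float), np.asarray(h, float)
    chi = float(np.sum(h * d) / np.sum(h))
    stringency = float(np.sum(h))
    return chi, stringency
```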
Enhanced Verification Test Suite for Physics Simulation Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamm, J R; Brock, J S; Brandon, S T
2008-10-10
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.
Quality assurance paradigms for artificial intelligence in modelling and simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oren, T.I.
1987-04-01
New classes of quality assurance concepts and techniques are required for the advanced knowledge-processing paradigms (such as artificial intelligence, expert systems, or knowledge-based systems) and the complex problems that only simulative systems can cope with. A systematization of quality assurance problems is given, along with examples of traditional and cognizant quality assurance techniques in traditional and cognizant modelling and simulation.
Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane
NASA Technical Reports Server (NTRS)
Gera, Joseph; Bosworth, John T.
1987-01-01
This paper describes some novel flight tests and analysis techniques in the flight dynamics and handling qualities area. These techniques were utilized during the initial flight envelope clearance of the X-29A aircraft and were largely responsible for the completion of the flight controls clearance program without any incidents or significant delays. The resulting open-loop and closed-loop frequency responses and the time history comparison using flight and linear simulation data are discussed.
Pytte, Morten; Kramer-Johansen, Jo; Eilevstjønn, Joar; Eriksen, Morten; Strømme, Taevje A; Godang, Kristin; Wik, Lars; Steen, Petter Andreas; Sunde, Kjetil
2006-12-01
Adrenaline (epinephrine) is used during cardiopulmonary resuscitation (CPR) based on animal experiments without supportive clinical data. Clinically, CPR has recently been reported to have much poorer quality than expected from international guidelines and than what is generally done in laboratory experiments. We have studied the haemodynamic effects of adrenaline during CPR with good laboratory quality and with quality simulating clinical findings, and the feasibility of monitoring these effects through VF waveform analysis. After 4 min of cardiac arrest, followed by 4 min of basic life support, 14 pigs were randomised to ClinicalCPR (intermittent manual chest compressions, compression-to-ventilation ratio 15:2, compression depth 30-38 mm) or LabCPR (continuous mechanical chest compressions, 12 ventilations/min, compression depth 45 mm). Adrenaline 0.02 mg/kg was administered 30 s thereafter. Plasma adrenaline concentration peaked earlier with LabCPR than with ClinicalCPR, median (range), 90 (30, 150) versus 150 (90, 270) s (p = 0.007), respectively. Coronary perfusion pressure (CPP) and cortical cerebral blood flow (CCBF) increased and femoral blood flow (FBF) decreased after adrenaline during LabCPR (mean differences (95% CI) CPP 17 (6, 29) mmHg (p = 0.01), FBF -5.0 (-8.8, -1.2) ml min(-1) (p = 0.02) and median difference CCBF 12% of baseline (p = 0.04)). There were no significant effects during ClinicalCPR (mean differences (95% CI) CPP 4.7 (-3.2, 13) mmHg (p = 0.2), FBF -0.2 (-4.6, 4.2) ml min(-1) (p = 0.9) and CCBF 3.6 (-1.8, 9.0)% of baseline (p = 0.15)). Slope VF waveform analysis reflected changes in CPP. Adrenaline improved haemodynamics during laboratory quality CPR in pigs, but not with quality simulating clinically reported CPR performance.
There is a need to develop modeling and data analysis tools to increase our understanding of human exposures to air pollutants beyond what can be explained by "limited" field data. Modeling simulations of complex distributions of pollutant concentrations within roadw...
The standard WASP7 stream transport model calculates water flow through a branching stream network that may include both free-flowing and ponded segments. This supplemental user manual documents the hydraulic algorithms, including the transport and hydrogeometry equations, the m...
SPATIO-TEMPORAL ANALYSIS OF TOTAL NITRATE CONCENTRATIONS USING DYNAMIC STATISTICAL MODELS
Atmospheric concentrations of total nitrate (TNO3), defined here as gas-phase nitric acid plus particle-phase nitrate, are difficult to simulate in numerical air quality models due to the presence of a variety of formation pathways and loss mechanisms, some of which ar...
Triple Value System Dynamics Modeling to Help Stakeholders Engage with Food-Energy-Water Problems
Triple Value (3V) Community scoping projects and Triple Value Simulation (3VS) models help decision makers and stakeholders apply systems-analysis methodology to complex problems related to food production, water quality, and energy use. 3VS models are decision support tools that...
Effects of roadway configurations on near-road air quality and the implications on roadway designs
This paper presents an analysis of wind tunnel experiments of twelve different roadway configurations and modeling of these configurations using a Large-Eddy Simulation (LES) model, aiming at investigating how flow structures affect the impact of roadway features on near-road and...
Insights into the deterministic skill of air quality ensembles from the analysis of AQMEII data
Simulations from chemical weather models are subject to uncertainties in the input data (e.g. emission inventory, initial and boundary conditions) as well as those intrinsic to the model (e.g. physical parameterization, chemical mechanism). Multi-model ensembles can improve the f...
Through the comparison of several regional-scale chemistry transport modelling systems that simulate meteorology and air quality over the European and American continents, this study aims at i) apportioning the error to the responsible processes using time-scale analysis, ii) hel...
[COSMOS motion design optimization in the CT table].
Shang, Hong; Huang, Jian; Ren, Chao
2013-03-01
A dynamic simulation of the CT table was carried out in COSMOS Motion to analyze the forces on the table hinge and the motor. The hinge position was then optimized, providing a basis for selecting the bearing and motor, while enhancing the design quality of the CT table and reducing the product design cost.
Today's Business Simulation Industry
ERIC Educational Resources Information Center
Summers, Gary J.
2004-01-01
New technologies are transforming the business simulation industry. The technologies come from research in computational fields of science, and they endow simulations with new capabilities and qualities. These capabilities and qualities include computerized behavioral simulations, online feedback and coaching, advanced interfaces, learning on…
Pediatric laryngeal simulator using 3D printed models: A novel technique.
Kavanagh, Katherine R; Cote, Valerie; Tsui, Yvonne; Kudernatsch, Simon; Peterson, Donald R; Valdez, Tulio A
2017-04-01
Simulation to acquire and test technical skills is an essential component of medical education and residency training in both surgical and nonsurgical specialties. High-quality simulation education relies on the availability, accessibility, and reliability of models. The objective of this work was to describe a practical pediatric laryngeal model for use in otolaryngology residency training. Ideally, this model would be low-cost, have tactile properties resembling human tissue, and be reliably reproducible. Pediatric laryngeal models were developed using two manufacturing methods: direct three-dimensional (3D) printing of anatomical models and casted anatomical models using 3D-printed molds. Polylactic acid, acrylonitrile butadiene styrene, and high-impact polystyrene (HIPS) were used for the directly printed models, whereas a silicone elastomer (SE) was used for the casted models. The models were evaluated for anatomic quality, ease of manipulation, hardness, and cost of production. A tissue likeness scale was created to validate the simulation model. Fleiss' Kappa rating was performed to evaluate interrater agreement, and analysis of variance was performed to evaluate differences among the materials. The SE provided the most anatomically accurate models, with the tactile properties allowing for surgical manipulation of the larynx. Direct 3D printing was more cost-effective than the SE casting method but did not possess the material properties and tissue likeness necessary for surgical simulation. The SE models of the pediatric larynx created from a casting method demonstrated high quality anatomy, tactile properties comparable to human tissue, and easy manipulation with standard surgical instruments. Their use in a reliable, low-cost, accessible, modular simulation system provides a valuable training resource for otolaryngology residents. N/A. Laryngoscope, 127:E132-E137, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
Simulation of breast compression in mammography using finite element analysis: A preliminary study
NASA Astrophysics Data System (ADS)
Liu, Yan-Lin; Liu, Pei-Yuan; Huang, Mei-Lan; Hsu, Jui-Ting; Han, Ruo-Ping; Wu, Jay
2017-11-01
Adequate compression during mammography lowers the absorbed dose in the breast and improves the image quality. The compressed breast thickness (CBT) is affected by various factors, such as breast volume, glandularity, and compression force. In this study, we used the finite element analysis to simulate breast compression and deformation and validated the simulated CBT with clinical mammography results. Image data from ten subjects who had undergone mammography screening and breast magnetic resonance imaging (MRI) were collected, and their breast models were created according to the MR images. The non-linear tissue deformation under 10-16 daN in the cranial-caudal direction was simulated. When the clinical compression force was used, the simulated CBT ranged from 2.34 to 5.90 cm. The absolute difference between the simulated CBT and the clinically measured CBT ranged from 0.5 to 7.1 mm. The simulated CBT had a strong positive linear relationship to breast volume and a weak negative correlation to glandularity. The average simulated CBT under 10, 12, 14, and 16 daN was 5.68, 5.12, 4.67, and 4.25 cm, respectively. Through this study, the relationships between CBT, breast volume, glandularity, and compression force are provided for use in clinical mammography.
Influence of Protein Abundance on High-Throughput Protein-Protein Interaction Detection
2009-06-05
the interaction data sets we determined, via comparisons with strict randomized simulations, the propensity for essential proteins to selectively... and analysis of high-quality PPI data sets. Materials and Methods: We analyzed protein interaction networks for yeast and E. coli determined from Y2H... we reinvestigated the centrality-lethality rule, which implies that proteins having more interactions are more likely to be essential. From analysis...
The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering
NASA Technical Reports Server (NTRS)
Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen
2006-01-01
This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered "To identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains five sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.
Laser Brazing with Beam Scanning: Experimental and Simulative Analysis
NASA Astrophysics Data System (ADS)
Heitmanek, M.; Dobler, M.; Graudenz, M.; Perret, W.; Göbel, G.; Schmidt, M.; Beyer, E.
Laser beam brazing with copper-based filler wire is a widely established technology for joining zinc-coated steel plates in the body shop. Successful applications are the divided tailgate and the zero-gap joint, which represents the joint between the side panel and the roof-top of the body-in-white. These joints are in direct view of the customer and therefore have to fulfil the highest optical quality requirements. For this reason, a stable and efficient laser brazing process is essential. In this paper, the current results on quality improvement due to one-dimensional laser beam deflection in the feed direction are presented. In addition to the experimental results, a transient three-dimensional simulation model of the laser beam brazing process is taken into account. With this model, the influence of scanning parameters on filler wire temperature and melt pool characteristics is analyzed. The theoretical predictions are in good accordance with the experimental results. They show that the beam scanning approach is a very promising method to increase process stability and seam quality.
NASA Technical Reports Server (NTRS)
Sarpkaya, Turgut
2006-01-01
Reducing the separation between the leading and following aircraft is desirable to enhance airport capacity, provided that there is a physics-based operational model applicable to all regions of the flight domain (out of ground effect, OGE; near ground effect, NGE; and in ground effect, IGE), and that the quality of the quantitative input from measurements of the prevailing atmospheric conditions, and the quality of total airport operations regarding safety and the sound interpretation of those conditions, match the quality of the analysis and numerical simulations. In the absence of an analytical solution, the physics of the flow is best expressed by a mathematical model based on numerical simulations, field and laboratory experiments, and heuristic reasoning. This report deals with the creation of a sound physics-based real-time IGE model of aircraft wake vortices subjected to crosswind, stratification and shear.
Leone, Vincenzo; Cervone, Guido; Iovino, Pasquale
2016-09-01
The Second-order Closure Integrated Puff (SCIPUFF) model was used to study the impact on urban air quality caused by emissions from two cement plants located near the city of Caserta, Italy, during the entire year of 2015. The simulated and observed PM10 concentrations were compared using three monitoring stations located in urban and sub-urban areas of the city of Caserta. Both simulated and observed concentrations are shown to be highest in winter, lower in autumn and spring, and lowest in summer. Model results generally follow the pattern of the observed concentrations but have a systematic under-prediction of the concentration values. Measures of the bias, NMSE and RMSE indicate a good correlation between observed and estimated values. The SCIPUFF model data analysis suggests that the cement plants are major sources of the measured PM10 values and are responsible for the deterioration of the urban air quality in the city of Caserta.
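The evaluation statistics named here have standard definitions; for reference, a minimal implementation of bias, NMSE, and RMSE:

```python
import numpy as np

def evaluation_stats(obs, mod):
    """Common air-quality model evaluation statistics: mean bias,
    normalized mean square error, and root mean square error."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    bias = np.mean(mod - obs)
    nmse = np.mean((mod - obs) ** 2) / (np.mean(obs) * np.mean(mod))
    rmse = np.sqrt(np.mean((mod - obs) ** 2))
    return {"bias": bias, "NMSE": nmse, "RMSE": rmse}

print(evaluation_stats(obs=[20, 35, 50], mod=[15, 30, 42]))
```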
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products during technology transfer. A general diagram is offered for executing risk analysis in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production, it is necessary to investigate and simulate the application of the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the greatest impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach is shown in the design of a multipurpose production facility for medicinal products, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
Nikkhoo, Mohammad; Hsu, Yu-Chun; Haghpanahi, Mohammad; Parnianpour, Mohamad; Wang, Jaw-Lin
2013-06-01
Finite element analysis is an effective tool to evaluate the material properties of living tissue. For an interactive optimization procedure, the finite element analysis usually needs many simulations to reach a reasonable solution. The meta-model analysis of finite element simulation can be used to reduce the computation of a structure with complex geometry or a material with composite constitutive equations. The intervertebral disc is a complex, heterogeneous, and hydrated porous structure. A poroelastic finite element model can be used to observe the fluid transferring, pressure deviation, and other properties within the disc. Defining reasonable poroelastic material properties of the anulus fibrosus and nucleus pulposus is critical for the quality of the simulation. We developed a material property updating protocol, which is basically a fitting algorithm consisting of finite element simulations and a quadratic response surface regression. This protocol was used to find the material properties, such as the hydraulic permeability, elastic modulus, and Poisson's ratio, of intact and degenerated porcine discs. The results showed that the in vitro disc experimental deformations were well fitted with limited finite element simulations and a quadratic response surface regression. The comparison of material properties of intact and degenerated discs showed that the hydraulic permeability significantly decreased but Poisson's ratio significantly increased for the degenerated discs. This study shows that the developed protocol is efficient and effective in defining material properties of a complex structure such as the intervertebral disc.
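The meta-model step, fitting a quadratic response surface to a handful of finite element runs, can be sketched with ordinary least squares; the variable names and shapes below are illustrative assumptions, not the authors' code.

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_design_matrix(X):
    """Full quadratic basis: intercept, x_i, and x_i * x_j for i <= j
    (squares and pairwise interactions)."""
    n, p = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(p)]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(p), 2)]
    return np.column_stack(cols)

def fit_response_surface(X, y):
    """Least-squares fit of a quadratic response surface y ~ f(X).
    X: (n_runs, n_params) FE input samples, e.g., permeability, modulus,
    Poisson's ratio; y: simulated disc deformations for those runs."""
    A = quadratic_design_matrix(np.asarray(X, float))
    beta, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
    return beta

def predict(beta, X):
    """Cheap surrogate evaluation in place of a new FE simulation."""
    return quadratic_design_matrix(np.asarray(X, float)) @ beta
```

Once fitted, the surrogate is evaluated in microseconds, so the interactive optimization can search the parameter space without rerunning the finite element model at every step.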
Benchmark Data Set for Wheat Growth Models: Field Experiments and AgMIP Multi-Model Simulations.
NASA Technical Reports Server (NTRS)
Asseng, S.; Ewert, F.; Martre, P.; Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P.J.; Rotter, R. P.
2015-01-01
The data set includes a current representative management treatment from detailed, quality-tested sentinel field experiments with wheat from four contrasting environments including Australia, The Netherlands, India and Argentina. Measurements include local daily climate data (solar radiation, maximum and minimum temperature, precipitation, surface wind, dew point temperature, relative humidity, and vapor pressure), soil characteristics, frequent growth, nitrogen in crop and soil, crop and soil water and yield components. Simulations include results from 27 wheat models and a sensitivity analysis with 26 models and 30 years (1981-2010) for each location, for elevated atmospheric CO2 and temperature changes, a heat stress sensitivity analysis at anthesis, and a sensitivity analysis with soil and crop management variations and a Global Climate Model end-century scenario.
Rothman, Jason S.; Silver, R. Angus
2018-01-01
Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
NASA Astrophysics Data System (ADS)
Matras, A.; Kowalczyk, R.
2014-11-01
The results of an analysis of machining accuracy after free-form surface milling simulations (based on machining EN AW-7075 alloy) for different machining strategies (Level Z, Radial, Square, Circular) are presented in this work. The individual milling simulations were performed using Esprit CAD/CAM software. The accuracy of the obtained allowance is defined as the difference between the theoretical surface of the workpiece (the surface designed in CAD software) and the machined surface after a milling simulation. The difference between the two surfaces describes the roughness that results from mapping the tool shape onto the machined surface. The accuracy of the remaining allowance directly indicates the surface quality to be expected after finish machining. The described methodology of using CAD/CAM software can reduce the design time of a machining process for free-form surface milling on a 5-axis CNC milling machine, by omitting the need to machine the part merely to measure the machining accuracy of the selected strategies and cutting data.
A modal analysis of flexible aircraft dynamics with handling qualities implications
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1983-01-01
A multivariable modal analysis technique is presented for evaluating flexible aircraft dynamics, focusing on meaningful vehicle responses to pilot inputs and atmospheric turbulence. Although modal analysis is the tool, vehicle time response is emphasized, and the analysis is performed on the linear, time-domain vehicle model. In evaluating previously obtained experimental pitch tracking data for a family of vehicle dynamic models, it is shown that flexible aeroelastic effects can significantly affect pitch attitude handling qualities. Consideration of the eigenvalues alone, of both rigid-body and aeroelastic modes, does not explain the simulation results. Modal analysis revealed, however, that although the lowest aeroelastic mode frequency was still three times greater than the short-period frequency, the rigid-body attitude response was dominated by this aeroelastic mode. This dominance was defined in terms of the relative magnitudes of the modal residues in selected vehicle responses.
Development and Assessment of CTF for Pin-resolved BWR Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Wysocki, Aaron J; Collins, Benjamin S
2017-01-01
CTF is the modernized and improved version of the subchannel code COBRA-TF. It has been adopted by the Consortium for Advanced Simulation of Light Water Reactors (CASL) for subchannel analysis applications and thermal hydraulic feedback calculations in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS). CTF is now jointly developed by Oak Ridge National Laboratory and North Carolina State University. Until now, CTF has been used for pressurized water reactor modeling and simulation in CASL, but in the future it will be extended to boiling water reactor designs. This required development activities to integrate the code into the VERA-CS workflow and to make it more efficient for full-core, pin-resolved simulations. Additionally, there is a significant emphasis on producing high quality tools that follow a regimented software quality assurance plan in CASL. Part of this plan involves performing validation and verification assessments on the code that are easily repeatable and tied to specific code versions. This work has resulted in the CTF validation and verification matrix being expanded to include several two-phase flow experiments, including the General Electric 3×3 facility and the BWR Full-Size Fine Mesh Bundle Tests (BFBT). Comparison with both experimental databases is reasonable, but the BFBT analysis reveals a tendency of CTF to overpredict void, especially in the slug flow regime. The execution of these tests is fully automated, the analysis is documented in the CTF Validation and Verification manual, and the tests have become part of the CASL continuous regression testing system. This paper will summarize these recent developments and some of the two-phase assessments that have been performed on CTF.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Regli, S.; Cromwell, J.; Mosher, J.
The U.S. EPA has undertaken an effort to model how the water supply industry may respond to possible rules and how those responses may affect human health risk. The model is referred to as the Disinfection By-Product Regulatory Analysis Model (DBPRAM). The paper is concerned primarily with presenting and discussing the methods, underlying data, assumptions, limitations, and results for the first part of the model. This part of the model covers the creation of sets of simulated water supplies that are representative of the conditions currently encountered by public water supplies with respect to certain raw water quality and water treatment characteristics.
Main directions in the simulation of physical characteristics of the World Ocean and seas
NASA Astrophysics Data System (ADS)
Sarkisyan, A. S.
2016-07-01
A brief analysis of the oceanographic papers printed in this issue is presented. For convenience of the reader, the paper by K. Bryan, a prominent scientist and expert in modeling the physical characteristics of the ocean, is discussed in detail. The remaining studies are described briefly in several sections: direct prognostic modeling, diagnosis-adaptation, four-dimensional analysis, and operational oceanography. At the end of the study, we separately discuss the problem of the reproduction of coastal intensification of temperature, salinity, density, and currents. We believe that the quality of the simulation results can be best assessed in terms of the intensity of coastal currents. In conclusion, this opinion is justified in detail.
Application of the GRC Stirling Convertor System Dynamic Model
NASA Technical Reports Server (NTRS)
Regan, Timothy F.; Lewandowski, Edward J.; Schreiber, Jeffrey G. (Technical Monitor)
2004-01-01
The GRC Stirling Convertor System Dynamic Model (SDM) has been developed to simulate dynamic performance of power systems incorporating free-piston Stirling convertors. This paper discusses its use in evaluating system dynamics and other systems concerns. Detailed examples are provided showing the use of the model in evaluation of off-nominal operating conditions. The many degrees of freedom in both the mechanical and electrical domains inherent in the Stirling convertor and the nonlinear dynamics make simulation an attractive analysis tool in conjunction with classical analysis. Application of SDM in studying the relationship of the size of the resonant circuit quality factor (commonly referred to as Q) in the various resonant mechanical and electrical sub-systems is discussed.
NASA Astrophysics Data System (ADS)
Tran, Trang; Tran, Huy; Mansfield, Marc; Lyman, Seth; Crosman, Erik
2018-03-01
Four-dimensional data assimilation (FDDA) was applied in WRF-CMAQ model sensitivity tests to study the impact of observational and analysis nudging on model performance in simulating inversion layers and O3 concentration distributions within the Uintah Basin, Utah, U.S.A. in winter 2013. Observational nudging substantially improved WRF model performance in simulating surface wind fields, correcting a 10 °C warm surface temperature bias, correcting overestimation of the planetary boundary layer height (PBLH) and correcting underestimation of inversion strengths produced by regular WRF model physics without nudging. However, the combined effects of poor performance of WRF meteorological model physical parameterization schemes in simulating low clouds, and warm and moist biases in the temperature and moisture initialization and subsequent simulation fields, likely amplified the overestimation of warm clouds during inversion days when observational nudging was applied, impacting the resulting O3 photochemical formation in the chemistry model. To reduce the impact of a moist bias in the simulations on warm cloud formation, nudging with the analysis water mixing ratio above the planetary boundary layer (PBL) was applied. However, due to poor analysis vertical temperature profiles, applying analysis nudging also increased the errors in the modeled inversion layer vertical structure compared to observational nudging. Combining both observational and analysis nudging methods resulted in unrealistically extreme stratified stability that trapped pollutants at the lowest elevations at the center of the Uintah Basin and yielded the worst WRF performance in simulating inversion layer structure among the four sensitivity tests. The results of this study illustrate the importance of carefully considering the representativeness and quality of the observational and model analysis data sets when applying nudging techniques within stable PBLs, and the need to evaluate model results on a basin-wide scale.
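For reference, both observational and analysis nudging relax the model state toward data through an extra tendency term of the generic Newtonian-relaxation (FDDA) form; the notation below is the textbook form, a sketch rather than WRF's exact implementation:

```latex
\frac{\partial q}{\partial t} = F(q, \mathbf{x}, t)
  + G_q \, W_q(\mathbf{x}, t) \, \big( \hat{q} - q \big)
```

Here \(q\) is the nudged variable (wind, temperature, or water-vapor mixing ratio), \(F\) collects the model's physical tendencies, \(G_q\) is the nudging strength, \(W_q\) is the spatial and temporal weighting function (which can, for example, restrict moisture nudging to above the PBL, as in the runs described above), and \(\hat{q}\) is the observed or analysis value toward which the model is relaxed.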
Figueroa-Lara, Alejandro; González-Block, Miguel A
2016-01-01
To estimate the cost-effectiveness ratio of public and private health care providers funded by Seguro Popular. A pilot scheme contracting primary health care in the state of Hidalgo, Mexico, was evaluated through a population survey to assess quality of care and detection of decreased vision. Costs were assessed from the payer perspective using institutional sources. The alternatives analyzed were a private provider with capitated and performance-based payment modalities, and a public provider funded through budget subsidies. Sensitivity analysis was performed using Monte Carlo simulations. The private provider is dominant in quality of care and in the cost-effective detection of decreased vision. Strategic purchasing from private providers of primary care has shown promising results as an alternative for improving the quality of health services and reducing costs.
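Monte Carlo sensitivity analysis of a cost-effectiveness comparison is often summarized through the net monetary benefit; a minimal sketch with illustrative distributions (none of the parameter values are the study's data):

```python
import numpy as np

rng = np.random.default_rng(7)

def prob_cost_effective(wtp=1000.0, n=100_000):
    """Probabilistic sensitivity analysis via net monetary benefit:
    NMB = WTP * (effect_private - effect_public) - (cost_private - cost_public).
    Returns the probability that the private provider is cost-effective
    at the given willingness-to-pay. All distributions and parameter
    values here are illustrative assumptions."""
    cost_private = rng.gamma(shape=50.0, scale=8.0, size=n)   # per enrollee
    cost_public = rng.gamma(shape=50.0, scale=10.0, size=n)
    eff_private = rng.beta(80, 20, size=n)                    # detection rate
    eff_public = rng.beta(70, 30, size=n)
    nmb = wtp * (eff_private - eff_public) - (cost_private - cost_public)
    return float(np.mean(nmb > 0))

print(prob_cost_effective())
```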
Air quality and passenger comfort in an air-conditioned bus micro-environment.
Zhu, Xiaoxuan; Lei, Li; Wang, Xingshen; Zhang, Yinghui
2018-04-12
In this study, passenger comfort and the air pollution status of the micro-environmental conditions in an air-conditioned bus were investigated through questionnaires, field measurements, and a numerical simulation. As a subjective analysis, passengers' perceptions of indoor environmental quality and comfort levels were determined from questionnaires. As an objective analysis, a numerical simulation was conducted using a discrete phase model to determine the diffusion and distribution of pollutants, including particulate matter with a diameter < 10 μm (PM10), which were verified by experimental results. The results revealed poor air quality and dissatisfactory thermal comfort conditions in Jinan's air-conditioned bus system. To solve these problems, three scenarios (schemes A, B, C) were designed to alter the ventilation parameters. According to the results of an improved simulation of these scenarios, reducing or adding air outputs would shorten the time taken to reach steady-state conditions and weaken the airflow or lower the temperature in the cabin. The airflow pathway was closely related to the layout of the air conditioning. Scheme B lowered the temperature by 0.4 K and reduced the airflow by 0.01 m/s, while scheme C reduced the volume concentration of PM10 to 150 μg/m³. Changing the air supply angle could further improve the airflow and reduce the concentration of PM10. With regard to the perception of airflow and thermal comfort, the scheme with an airflow provided by a 60° nozzle was considered better, and the concentration of PM10 was reduced to 130 μg/m³.
Beddows, Andrew V; Kitwiroon, Nutthida; Williams, Martin L; Beevers, Sean D
2017-06-06
Gaussian process emulation techniques have been used with the Community Multiscale Air Quality model, simulating the effects of input uncertainties on ozone and NO2 output, to allow robust global sensitivity analysis (SA). A screening process ranked the effect of perturbations in 223 inputs, isolating the 30 most influential from emissions, boundary conditions (BCs), and reaction rates. Community Multiscale Air Quality (CMAQ) simulations of a July 2006 ozone pollution episode in the UK were made with input values for these variables plus ozone dry deposition velocity chosen according to a 576-point Latin hypercube design. Emulators trained on the output of these runs were used in variance-based SA of the model output to input uncertainties. Performing these analyses for every hour of a 21-day period spanning the episode and several days on either side allowed the results to be presented as a time series of sensitivity coefficients, showing how the influence of different input uncertainties changed during the episode. This is one of the most complex models to which these methods have been applied, and here, they reveal detailed spatiotemporal patterns of model sensitivities, with NO and isoprene emissions, NO2 photolysis, ozone BCs, and deposition velocity being among the most influential input uncertainties.
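A minimal sketch of the emulator-based workflow described above: a Latin hypercube training design, a Gaussian process surrogate fitted to model runs, and first-order Sobol indices estimated on the cheap emulator. The toy function, input count, and sample sizes are placeholders, not the CMAQ setup.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
d = 5                                   # toy input count (the study screened 223, kept ~30)

def model(x):                           # placeholder standing in for a full CMAQ run
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

# Latin hypercube training design, then fit the emulator
X = qmc.LatinHypercube(d=d, seed=1).random(128)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=np.ones(d)),
                              normalize_y=True).fit(X, model(X))

# First-order Sobol indices estimated on the emulator (pick-freeze estimator)
n = 4096
A, B = rng.random((n, d)), rng.random((n, d))
yA = gp.predict(A)
f0, var = yA.mean(), yA.var()
for i in range(d):
    Ci = B.copy()
    Ci[:, i] = A[:, i]                  # Ci shares only input i with sample A
    Si = (np.mean(yA * gp.predict(Ci)) - f0 ** 2) / var
    print(f"S_{i} ~ {Si:.2f}")
```

Because the emulator is cheap to evaluate, the Monte Carlo estimates can be repeated for every output hour, which is what makes the hourly time series of sensitivity coefficients feasible.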
Modelling of Picosatellite Constellation-Based Network and Effects on Quality of Service
2015-03-01
The scenario was simulated using the STK-QualNet Interface module over a two-day analysis period, from 6 June 2011 to 7 June 2011.
NASA Astrophysics Data System (ADS)
Benjankar, R. M.; Sohrabi, M.; Tonina, D.; McKean, J. A.
2013-12-01
Aquatic habitat models utilize flow variables which may be predicted with one-dimensional (1D) or two-dimensional (2D) hydrodynamic models to simulate aquatic habitat quality. Studies focusing on the effects of hydrodynamic model dimensionality on predicted aquatic habitat quality are limited. Here we present the analysis of the impact of flow variables predicted with 1D and 2D hydrodynamic models on simulated spatial distribution of habitat quality and Weighted Usable Area (WUA) for fall-spawning Chinook salmon. Our study focuses on three river systems located in central Idaho (USA), which are a straight and pool-riffle reach (South Fork Boise River), small pool-riffle sinuous streams in a large meadow (Bear Valley Creek) and a steep-confined plane-bed stream with occasional deep forced pools (Deadwood River). We consider low and high flows in simple and complex morphologic reaches. Results show that 1D and 2D modeling approaches have effects on both the spatial distribution of the habitat and WUA for both discharge scenarios, but we did not find noticeable differences between complex and simple reaches. In general, the differences in WUA were small, but depended on stream type. Nevertheless, differences in spatially distributed habitat quality are considerable in all streams. The steep-confined plane-bed stream had larger differences between aquatic habitat quality defined with 1D and 2D flow models compared to results for streams with well-defined macro-topographies, such as pool-riffle bed forms. KEY WORDS: one- and two-dimensional hydrodynamic models, habitat modeling, weighted usable area (WUA), hydraulic habitat suitability, high and low discharges, simple and complex reaches
Implementation of a WRF-CMAQ Air Quality Modeling System in Bogotá, Colombia
NASA Astrophysics Data System (ADS)
Nedbor-Gross, R.; Henderson, B. H.; Pachon, J. E.; Davis, J. R.; Baublitz, C. B.; Rincón, A.
2014-12-01
Due to continuous economic growth, Bogotá, Colombia, has experienced air pollution issues in recent years. The local environmental authority has implemented several strategies to curb air pollution, which have resulted in decreasing PM10 concentrations since 2010. However, further action is necessary for the city to meet international air quality standards. The University of Florida Air Quality and Climate group is collaborating with the Universidad de La Salle to prioritize regulatory strategies for Bogotá using air pollution simulations. To simulate pollution, we developed a modeling platform that combines the Weather Research and Forecasting Model (WRF), local emissions, and the Community Multi-scale Air Quality model (CMAQ). This platform is the first of its kind to be implemented in the megacity of Bogotá, Colombia. The presentation will discuss development and evaluation of the air quality modeling system, highlight initial results characterizing photochemical conditions in Bogotá, and characterize air pollution under proposed regulatory strategies. The WRF model has been configured and applied to Bogotá, which lies in a tropical climate with complex mountainous topography. Developing the configuration included incorporation of local topography and land-use data, a physics sensitivity analysis, review, and systematic evaluation. The performance threshold, however, was set based on a synthesis of model performance under less mountainous conditions. We will evaluate the impact that differences in autocorrelation contribute to the non-ideal performance. Air pollution predictions are currently under way. CMAQ has been configured with WRF meteorology, global boundary conditions from GEOS-Chem, and a locally produced emission inventory. Preliminary results from simulations show promising performance of CMAQ in Bogotá. Anticipated results include a systematic performance evaluation of ozone and PM10, characterization of photochemical sensitivity, and air quality predictions under proposed regulatory scenarios.
Data-base development for water-quality modeling of the Patuxent River basin, Maryland
Fisher, G.T.; Summers, R.M.
1987-01-01
Procedures and rationale used to develop a data base and data management system for the Patuxent Watershed Nonpoint Source Water Quality Monitoring and Modeling Program of the Maryland Department of the Environment and the U.S. Geological Survey are described. A detailed data base and data management system has been developed to facilitate modeling of the watershed for water quality planning purposes; statistical analysis; plotting of meteorologic, hydrologic and water quality data; and geographic data analysis. The system is Maryland's prototype for development of a basinwide water quality management program. A key step in the program is to build a calibrated and verified water quality model of the basin using the Hydrological Simulation Program--FORTRAN (HSPF) hydrologic model, which has been used extensively in large-scale basin modeling. The compilation of the substantial existing data base for preliminary calibration of the basin model, including meteorologic, hydrologic, and water quality data from federal and state data bases and a geographic information system containing digital land use and soils data, is described. The data base development is significant in its application of an integrated, uniform approach to data base management and modeling. (Lantz-PTT)
USDA-ARS?s Scientific Manuscript database
A three-dimensional water quality model was developed for simulating temporal and spatial variations of phytoplankton, nutrients, and dissolved oxygen in freshwater bodies. Effects of suspended and bed sediment on the water quality processes were simulated. A formula was generated from field measure...
Frimpong, Joseph Asamoah; Amo-Addae, Maame Pokuah; Adewuyi, Peter Adebayo; Hall, Casey Daniel; Park, Meeyoung Mattie; Nagbe, Thomas Knue
2017-01-01
Public health officials depend on timely, complete, and accurate surveillance data for decision making. The quality of data generated from surveillance is highly dependent on external and internal factors which may either impede or enhance surveillance activities. One way of identifying challenges affecting the quality of data generated is to conduct a data quality audit. This case study, based on an audit conducted by residents of the Liberia Frontline Field Epidemiology Training Program, was designed to be a classroom simulation of a data quality audit in a health facility. It is suited to reinforcing theoretical lectures on surveillance data quality and auditing. The target group is public health trainees, who should be able to complete this exercise in approximately 2 hours and 30 minutes.
Frequency analysis of urban runoff quality in an urbanizing catchment of Shenzhen, China
NASA Astrophysics Data System (ADS)
Qin, Huapeng; Tan, Xiaolong; Fu, Guangtao; Zhang, Yingying; Huang, Yuefei
2013-07-01
This paper investigates the frequency distribution of urban runoff quality indicators using a long-term continuous simulation approach and evaluates the impacts of proposed runoff control schemes on runoff quality in an urbanizing catchment in Shenzhen, China. Four different indicators are considered to provide a comprehensive assessment of the potential impacts: total runoff depth, event pollutant load, Event Mean Concentration, and peak concentration during a rainfall event. The results obtained indicate that urban runoff quantity and quality in the catchment vary significantly across rainfall events and show a very high rate of non-compliance with surface water quality regulations. Three runoff control schemes with the capacity to intercept an initial runoff depth of 5 mm, 10 mm, and 15 mm, respectively, are evaluated, and diminishing marginal benefits are found with increasing interception levels in terms of water quality improvement. The effects of seasonal variation in rainfall events are investigated to provide a better understanding of the performance of the runoff control schemes. The pre-flood season has a higher risk of poor water quality than other seasons after runoff control. This study demonstrates that frequency analysis of urban runoff quantity and quality provides a probabilistic evaluation of pollution control measures, and thus helps frame risk-based decision making for urban runoff quality management in an urbanizing catchment.
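The indicators named above follow directly from event time series; a hedged sketch of the Event Mean Concentration and a non-compliance frequency is below (units, variable names, and the simple rectangular integration are assumptions).

```python
import numpy as np

def event_mean_concentration(q, c, dt):
    """EMC = event pollutant mass / event runoff volume.
    q: flow series [m^3/s]; c: concentration series [g/m^3, i.e., mg/L];
    dt: time step [s]."""
    mass = np.sum(np.asarray(q) * np.asarray(c)) * dt   # g
    volume = np.sum(q) * dt                             # m^3
    return mass / volume                                # g/m^3 == mg/L

def non_compliance_rate(emc_per_event, standard):
    """Fraction of simulated events whose EMC exceeds a water-quality standard."""
    return float(np.mean(np.asarray(emc_per_event) > standard))
```

Applied to every event in a long continuous simulation, these per-event values form the frequency distributions whose exceedance rates the paper uses for probabilistic evaluation.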
Small scale rainfall simulators: Challenges for a future use in soil erosion research
NASA Astrophysics Data System (ADS)
Ries, Johannes B.; Iserloh, Thomas; Seeger, Manuel
2013-04-01
Rainfall simulation on micro-plot scale is a method used worldwide to assess the generation of overland flow, soil erosion, infiltration and interrelated processes such as soil sealing, crusting, splash and redistribution of solids and solutes. The produced data are of great significance not only for the analysis of the simulated processes, but also as a source of input-data for soil erosion modelling. The reliability of the data is therefore of paramount importance, and quality management of the rainfall simulation procedure is a general responsibility of the rainfall simulation community. This was an accepted outcome at the "International Rainfall Simulator Workshop 2011" at Trier University. The challenges of the present and near future use of small scale rainfall simulations concern the comparability of results and scales, the quality of the data for soil erosion modelling, and further technical developments to overcome physical limitations and constraints. Given the high number of research questions, the different fields of application, and the great technical creativity of researchers, a large number of different types of rainfall simulators is available. But each of the devices produces a different rainfall, leading to different kinetic energy values influencing soil surface and erosion processes. Plot sizes are also variable, as well as the experimental simulation procedures. As a consequence, differing runoff and erosion results are produced. The presentation summarises the three important aspects of rainfall simulations, following a processual order: 1. Input-factor "rain" and its calibration 2. Surface-factor "plot" and its documentation 3. Output-factors "runoff" and "sediment concentration" Finally, general considerations about the limitations and challenges for further developments and applications of rainfall simulation data are presented.
Waterborne Disease Case Investigation: Public Health Nursing Simulation.
Alexander, Gina K; Canclini, Sharon B; Fripp, Jon; Fripp, William
2017-01-01
The lack of safe drinking water is a significant public health threat worldwide. Registered nurses assess the physical environment, including the quality of the water supply, and apply environmental health knowledge to reduce environmental exposures. The purpose of this research brief is to describe a waterborne disease simulation for students enrolled in a public health nursing (PHN) course. A total of 157 undergraduate students completed the simulation in teams, using the SBAR (Situation-Background-Assessment-Recommendation) reporting tool. Simulation evaluation consisted of content analysis of the SBAR tools and debriefing notes. Student teams completed the simulation and articulated the implications for PHN practice. Student teams discussed assessment findings and primarily recommended four nursing interventions: health teaching focused on water, sanitation, and hygiene; community organizing; collaboration; and advocacy to ensure a safe water supply. With advanced planning and collaboration with partners, waterborne disease simulation may enhance PHN education. [J Nurs Educ. 2017;56(1):39-42.]. Copyright 2017, SLACK Incorporated.
Propagation of variability in railway dynamic simulations: application to virtual homologation
NASA Astrophysics Data System (ADS)
Funfschilling, Christine; Perrin, Guillaume; Kraft, Sönke
2012-01-01
Railway dynamic simulations are increasingly used to predict and analyse the behaviour of the vehicle and of the track during their whole life cycle. Up to now, however, no simulation has been used in the certification procedure even though the expected benefits are important: cheaper and shorter procedures, more objectivity, and better knowledge of the behaviour around critical situations. Deterministic simulations are nevertheless too limited to represent the full physics of the track/vehicle system, which contains several sources of variability: variability of the mechanical parameters of a train among a class of vehicles (mass, stiffness and damping of different suspensions), variability of the contact parameters (friction coefficient, wheel and rail profiles) and variability of the track design and quality. This variability plays an important role in safety, in ride quality, and thus in the certification criteria. When using simulation for certification purposes, it therefore seems crucial to take into account the variability of the different inputs. The main goal of this article is thus to propose a method to introduce variability in railway dynamics. A four-step method is described, namely the definition of the stochastic problem, the modelling of the input variability, the propagation, and the analysis of the output. Each step is illustrated with railway examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cline, K; Narayanasamy, G; Obediat, M
Purpose: Deformable image registration (DIR) is used routinely in the clinic without a formalized quality assurance (QA) process. Using simulated deformations to digitally deform images in a known way and comparing to DIR algorithm predictions is a powerful technique for DIR QA. This technique must also simulate realistic image noise and artifacts, especially between modalities. This study developed an algorithm to create simulated daily kV cone-beam computed-tomography (CBCT) images from CT images for DIR QA between these modalities. Methods: A Catphan and physical head-and-neck phantom, with known deformations, were used. CT and kV-CBCT images of the Catphan were utilized to characterize the changes in Hounsfield units, noise, and image cupping that occur between these imaging modalities. The algorithm then imprinted these changes onto a CT image of the deformed head-and-neck phantom, thereby creating a simulated-CBCT image. CT and kV-CBCT images of the undeformed and deformed head-and-neck phantom were also acquired. The Velocity and MIM DIR algorithms were applied between the undeformed CT image and each of the deformed CT, CBCT, and simulated-CBCT images to obtain predicted deformations. The error between the known and predicted deformations was used as a metric to evaluate the quality of the simulated-CBCT image. Ideally, the simulated-CBCT image registration would produce the same accuracy as the deformed CBCT image registration. Results: For Velocity, the mean error was 1.4 mm for the CT-CT registration, 1.7 mm for the CT-CBCT registration, and 1.4 mm for the CT-simulated-CBCT registration. These same numbers were 1.5, 4.5, and 5.9 mm, respectively, for MIM. Conclusion: All cases produced similar accuracy for Velocity. MIM produced similar accuracy for CT-CT registration, but was not as accurate for CT-CBCT registrations. The MIM simulated-CBCT registration followed this same trend, but overestimated MIM DIR errors relative to the CT-CBCT registration.
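The QA metric described, error between known and predicted deformations, reduces to a vector-field comparison. A minimal sketch, assuming deformation fields stored as displacement arrays with a trailing xyz component axis (the storage layout is an assumption):

```python
import numpy as np

def mean_dvf_error(dvf_known, dvf_predicted, voxel_size=(1.0, 1.0, 1.0)):
    """Mean 3-D displacement error [mm] between the imposed (known) deformation
    vector field and the DIR-predicted one; arrays shaped (..., 3) in voxel units."""
    diff_mm = (np.asarray(dvf_known) - np.asarray(dvf_predicted)) * np.asarray(voxel_size)
    return float(np.linalg.norm(diff_mm, axis=-1).mean())
```

The per-voxel norm can also be kept as a map rather than averaged, which helps localize where a DIR algorithm fails on the phantom.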
NASA Astrophysics Data System (ADS)
Mishra, S. K.; Ding, D.; Rapolu, U.
2012-12-01
Human activity is intricately linked to the quality and quantity of water resources. Although many studies have examined water-human interaction, the complexity of such coupled systems is not well understood, largely because of gaps in our knowledge of water-cycle processes, which are heavily influenced by socio-economic drivers. In this context, this team has investigated connections among agriculture, policy, climate, land use/land cover, and water quality in Iowa over the past couple of years. To help explore these connections, the team is developing a variety of cyber infrastructure tools that facilitate the collection, analysis and visualization of data, and the simulation of system dynamics. In an ongoing effort, the prototype system is applied to the Clear Creek watershed, an agriculture-dominated catchment in Iowa in the US Midwest, to understand water-human processes relevant to management decisions by farmers regarding agro-ecosystems. The primary aim of this research is to understand the connections that exist among the agricultural and biofuel economy, land use/land cover change, and water quality. To help explore these connections, an agent-based model (ABM) of land use change has been developed that simulates the decisions made by farmers given alternative assumptions about market forces, farmer characteristics, and water quality regulations. The SWAT model was used to simulate the impact of these decisions on the movement of sediment, nitrogen, and phosphorus across the landscape. The paper also demonstrates how, through the use of this system, researchers can, for example, search for scenarios that lead to desirable socio-economic outcomes as well as preserve water quantity and quality.
Atmospheric Model Evaluation Tool for meteorological and air quality simulations
The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.
NASA Technical Reports Server (NTRS)
Bigler, W. B., II
1977-01-01
The NASA passenger ride quality apparatus (PRQA), a ground-based motion simulator, was compared to the Total In-Flight Simulator (TIFS). Tests were made on PRQA with varying stimuli: motions only; motions and noise; motions, noise, and visual; and motions and visual. Regression equations for the tests were obtained, and subsequent t-testing of the slopes indicated that ground-based simulator tests produced comfort change rates similar to actual flight data. It was recommended that PRQA be used in the ride quality program for aircraft and that it be validated for other transportation modes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taha, Haider; Hammer, Hillel; Akbari, Hashem
2002-04-30
The study described in this report is part of a project sponsored by the Toronto Atmospheric Fund, performed at the Lawrence Berkeley National Laboratory, to assess the potential role of surface property modifications on energy, meteorology, and air quality in the Greater Toronto Area (GTA), Canada. Numerical models were used to establish the possible meteorological and ozone air-quality impacts of increased urban albedo and vegetative fraction, i.e., "cool-city" strategies that can mitigate the urban heat island (UHI), significantly reduce urban energy consumption, and improve thermal comfort, particularly during periods of hot weather in summer. Mitigation is even more important during critical heat wave periods with possible increased heat-related hospitalization and mortality. The evidence suggests that on an annual basis cool-city strategies are beneficial, and the implementation of such measures is currently being investigated in the U.S. and Canada. We simulated possible scenarios for urban heat-island mitigation in the GTA and investigated consequent meteorological changes, and also performed limited air-quality analysis to assess related impacts. The study was based on a combination of mesoscale meteorological modeling, Lagrangian (trajectory) modeling, and photochemical trajectory modeling to assess the potential meteorological and ozone air-quality impacts of cool-city strategies. As available air-quality and emissions data are incompatible with models currently in use at LBNL, our air-quality analysis was based on photochemical trajectory modeling. Because of questions as to the accuracy and appropriateness of this approach, in our opinion this aspect of the study can be improved in the future, and the air-quality results discussed in this report should be viewed as relatively qualitative. The MM5 meteorological model predicts a UHI on the order of 2 to 3 degrees C in locations of maxima, and about 1 degree C as a typical value over most of the urban area. Our simulations suggest that cool-city strategies can typically reduce local urban air temperature by 0.5-1 degrees C; as more sporadic events, larger decreases (1.5 degrees C, 2.5-2.7 degrees C and 4-6 degrees C) were also simulated. With regard to ozone mixing ratios along the simulated trajectories, the effects of cool-city strategies appear to be on the order of 2 ppb, a typical decrease. The photochemical trajectory model (CIT) also simulates larger decreases (e.g., 4 to 8 ppb), but these are not taken as representative of the potential impacts in this report. A comparison with other simulations suggests, very crudely, that a decrease of this magnitude corresponds to significant "equivalent" decreases in both NOx and VOC emissions in the region. Our preliminary results suggest that significant UHI control can be achieved with cool-city strategies in the GTA and is therefore worth further study. We recommend that better input data and more accurate modeling schemes be used to carry out future studies in the same direction.
Kable, Ashley K; Levett-Jones, Tracy L; Arthur, Carol; Reid-Searl, Kerry; Humphreys, Melanie; Morris, Sara; Walsh, Pauline; Witton, Nicola J
2018-01-01
The aim of this paper is to report the results of a cross-national study that evaluated a range of simulation sessions using an observation schedule developed from evidence-based quality indicators. Observational data were collected from 17 simulation sessions conducted for undergraduate nursing students at three universities in Australia and the United Kingdom. The observation schedule contained 27 questions that rated simulation quality. Data were collected by direct observation and from video recordings of the simulation sessions. Results indicated that the highest quality scores were for provision of learning objectives prior to the simulation session (90%) and debriefing (72%). Student preparation and orientation (67%) and perceived realism and fidelity (67%) were scored lower than other components of the simulation sessions. This observational study proved to be an effective strategy to identify areas of strength and those needing further development to improve simulation sessions. Copyright © 2017 Elsevier Ltd. All rights reserved.
Performance evaluation of power control algorithms in wireless cellular networks
NASA Astrophysics Data System (ADS)
Temaneh-Nyah, C.; Iita, V.
2014-10-01
Power control in a mobile communication network aims to control the transmission power levels in such a way that the required quality of service (QoS) for the users is guaranteed with the lowest possible transmission powers. Most studies of power control algorithms in the literature are based on simplified assumptions, which compromises the validity of the results when applied in a real environment. In this paper, a CDMA network was simulated. The real environment was accounted for by defining the analysis area and specifying the network base stations and mobile stations by their geographical coordinates, and the mobility of the mobile stations was accounted for. The simulation also allowed a number of network parameters, including the network traffic and the wireless channel models, to be modified. Finally, we present the simulation results of a convergence-speed-based comparative analysis of three uplink power control algorithms.
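The abstract does not name the three algorithms compared; as a representative example of SINR-target uplink power control, here is a sketch of the classic Foschini-Miljanic iteration (variable names and the toy matrix convention are assumptions).

```python
import numpy as np

def sinr_balancing_power_control(G, noise, gamma_target, p0, iters=200):
    """Distributed SINR-balancing iteration (Foschini-Miljanic style): each
    user scales its power by (target SINR / measured SINR) every step.

    G[i, j]      : link gain from user j's transmitter to user i's receiver
                   (diagonal entries are the users' own links)
    noise[i]     : receiver noise power for user i
    gamma_target : required SINR per user
    Converges to the minimal feasible powers when the targets are feasible."""
    p = np.asarray(p0, float).copy()
    for _ in range(iters):
        signal = np.diag(G) * p
        interference = G @ p - signal + noise
        sinr = signal / interference
        p = p * gamma_target / sinr
    return p
```

Convergence speed, the comparison criterion used in the paper, can be measured by counting iterations until the power vector changes by less than a tolerance.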
Digital techniques for ULF wave polarization analysis
NASA Technical Reports Server (NTRS)
Arthur, C. W.
1979-01-01
Digital power spectral and wave polarization analysis are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences among the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
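A bare-bones sketch of the shared workflow of the three transformation-based techniques: estimate the cross-spectral matrix over a frequency band, then rotate it to principal axes. Detrending, tapering, and segment averaging, which a real analysis needs, are omitted here, and the polarization measure shown is a crude stand-in.

```python
import numpy as np

def spectral_matrix(b, fs, band):
    """Cross-spectral matrix S_ij averaged over a frequency band.
    b: (3, n) array of magnetic-field components; single-window estimate."""
    b = b - b.mean(axis=1, keepdims=True)
    f = np.fft.rfftfreq(b.shape[1], 1.0 / fs)
    B = np.fft.rfft(b, axis=1)
    sel = (f >= band[0]) & (f <= band[1])
    return np.einsum("if,jf->ij", B[:, sel], np.conj(B[:, sel])) / sel.sum()

def principal_axis_analysis(S):
    """Rotate S into the principal axis system (eigenvectors of Re{S})
    before polarization analysis, as the transformation-based methods do."""
    w, V = np.linalg.eigh(S.real)   # eigenvalues ascending
    S_rot = V.T @ S @ V             # spectral matrix in principal axes
    return S_rot, w
```

The three published techniques differ precisely in how the rotation V is chosen; the fourth technique skips this step and applies a detector function to the measured S directly.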
ACES: Space shuttle flight software analysis expert system
NASA Technical Reports Server (NTRS)
Satterwhite, R. Scott
1990-01-01
The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.
Model Performance Evaluation and Scenario Analysis ...
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude and sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components, and the reconstruction back to time series, provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool...
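A sketch of two of the named metrics plus one plausible reading of the magnitude/sequence separation; the report's exact decomposition may differ, so treat the split shown here as an assumption.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def magnitude_error(obs, sim):
    """Error between the sorted series: magnitude-only mismatch, timing ignored."""
    return np.sort(np.asarray(sim, float)) - np.sort(np.asarray(obs, float))

def sequence_error(obs, sim):
    """Residual once simulated magnitudes are rearranged into the observed
    rank order, i.e., the part of the error attributable to sequencing."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    sim_rank_aligned = np.sort(sim)[np.argsort(np.argsort(obs))]
    return sim - sim_rank_aligned
```

Comparing the two error components tells the analyst whether to adjust parameters controlling magnitudes (e.g., volumes) or those controlling timing.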
Lopes, Anne; Sacquin-Mora, Sophie; Dimitrova, Viktoriya; Laine, Elodie; Ponty, Yann; Carbone, Alessandra
2013-01-01
Large-scale analyses of protein-protein interactions based on coarse-grain molecular docking simulations and binding site predictions resulting from evolutionary sequence analysis are possible and realizable on hundreds of proteins with varied structures and interfaces. We demonstrated this on the 168 proteins of the Mintseris Benchmark 2.0. On the one hand, we evaluated the quality of the interaction signal and the contribution of docking information compared to evolutionary information, showing that the combination of the two improves partner identification. On the other hand, since protein interactions usually occur in crowded environments with several competing partners, we performed a thorough analysis of the interactions of proteins with true partners but also with non-partners to evaluate whether proteins in the environment, competing with the true partner, affect its identification. We found three populations of proteins: strongly competing, never competing, and interacting with different levels of strength. Populations and levels of strength are numerically characterized and provide a signature for the behavior of a protein in the crowded environment. We showed that partner identification, to some extent, does not depend on the competing partners present in the environment, that certain biochemical classes of proteins are intrinsically easier to analyze than others, and that small proteins are not more promiscuous than large ones. Our approach brings to light that knowledge of the binding site can be used to reduce the high computational cost of docking simulations with no consequence for the quality of the results, demonstrating the possibility of applying coarse-grain docking to datasets made of thousands of proteins. A comparison with all available large-scale analyses aimed at partner prediction is provided. We release the complete decoy set issued by coarse-grain docking simulations of both true and false interacting partners, and their evolutionary sequence analysis leading to binding site predictions. Download site: http://www.lgm.upmc.fr/CCDMintseris/ PMID:24339765
CCSDS Advanced Orbiting Systems Virtual Channel Access Service for QoS MACHETE Model
NASA Technical Reports Server (NTRS)
Jennings, Esther H.; Segui, John S.
2011-01-01
To support various communications requirements imposed by different missions, interplanetary communication protocols need to be designed, validated, and evaluated carefully. Multimission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE), described in "Simulator of Space Communication Networks" (NPO-41373), NASA Tech Briefs, Vol. 29, No. 8 (August 2005), p. 44, combines various tools for simulation and performance analysis of space networks. The MACHETE environment supports orbital analysis, link budget analysis, communications network simulations, and hardware-in-the-loop testing. By building abstract behavioral models of network protocols, one can validate performance after identifying the appropriate metrics of interest. The innovators have extended the MACHETE model library to include a generic link-layer Virtual Channel (VC) model supporting quality-of-service (QoS) controls based on IP streams. The main purpose of this generic Virtual Channel model addition was to interface fine-grain flow-based QoS (quality of service) between the network and MAC layers of the QualNet simulator, a commercial component of MACHETE. This software model adds the capability of mapping IP streams, based on header fields, to virtual channel numbers, allowing extended QoS handling at the link layer. This feature further refines the QoS existing at the network layer. QoS at the network layer (e.g., diffserv) supports few QoS classes, so data from one class will be aggregated together; differentiating between flows internal to a class/priority is not supported. By adding QoS classification capability between the network and MAC layers through VCs, one maps multiple VCs onto the same physical link. Users then specify different VC weights, and different queuing and scheduling policies, at the link layer. This VC model supports system performance analysis of various virtual channel link-layer QoS queuing schemes independent of the network-layer QoS systems.
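As an illustration of mapping IP streams to virtual channels by header fields, a toy classifier with per-VC scheduling weights is sketched below; the table entries, weights, and function are entirely hypothetical and are not MACHETE's or QualNet's API.

```python
# Hypothetical header-field -> virtual-channel mapping (all values illustrative)
VC_TABLE = {
    # (src IP, IP protocol, dst port) -> virtual channel number
    ("10.0.0.5", 17, 5000): 1,   # a UDP telemetry stream
    ("10.0.0.7", 6, 8080): 2,    # a TCP file transfer
}
VC_WEIGHTS = {0: 1, 1: 4, 2: 2}  # per-VC link-layer scheduling weights

def classify_stream(src_ip: str, proto: int, dst_port: int, default_vc: int = 0) -> int:
    """Map an IP stream to a virtual channel based on header fields, so the
    link layer can apply per-VC queuing and weighted scheduling."""
    return VC_TABLE.get((src_ip, proto, dst_port), default_vc)
```

The point of the design is that flows sharing one network-layer class can still land on different VCs, giving the link layer finer-grained differentiation than diffserv alone.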
Computational Analysis of Splash Occurring in the Deposition Process in Annular-Mist Flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Heng; Koshizuka, Seiichi; Oka, Yoshiaki
2004-07-01
The deposition process of a single droplet on the film is numerically simulated by the Moving Particle Semi-implicit (MPS) method to analyze the possibility and effect of splash occurring in the deposition process under BWR conditions. The model accounts for inertial, gravitational, viscous, and surface tension forces and is validated by comparison with experimental results. A simple one-dimensional mixture model is developed to calculate the necessary parameters for the simulation of deposition under BWR conditions. The deposition process of a single droplet under BWR conditions is simulated, and the effects of the droplet impact angle and the velocity of the liquid film are analyzed. A film buffer model is developed to fit the simulation results for the critical value for splash. A correlation of the critical Weber number for splash under BWR conditions is obtained and used to analyze the effect of splash. It is found that splash plays an important role in the deposition and re-entrainment process under high-quality conditions in BWRs. The mass fraction of re-entrainment caused by splash under different quality conditions is also calculated. (authors)
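The critical-Weber-number criterion rests on a one-line formula; a sketch with illustrative properties follows (the paper's fitted BWR correlation itself is not reproduced here, and the numbers in the comment are assumed, not taken from the study).

```python
def weber_number(rho, v, d, sigma):
    """We = rho * v**2 * d / sigma: ratio of a droplet's inertia to surface
    tension; splash on deposition is expected above a critical value We_c."""
    return rho * v ** 2 * d / sigma

# Illustrative check with assumed saturated-water-like properties:
# weber_number(958.0, 10.0, 100e-6, 0.059) -> ~162
```

A splash criterion then reduces to comparing this value against the fitted critical correlation for the flow conditions of interest.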
Comprehensive helicopter analysis: A state of the art review
NASA Technical Reports Server (NTRS)
Johnson, W.
1978-01-01
An assessment of the status of helicopter theory and analysis is presented. The technology level embodied in available design tools (computer programs) is examined, considering the problem areas of performance, loads and vibration, handling qualities and simulation, and aeroelastic stability. The effectiveness of the present analyses is discussed. The characteristics of the technology in the analyses are reviewed, including the aerodynamics technology, induced velocity and wake geometry, dynamics technology, and machine limitations.
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2006-01-01
A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
Big data analytics for the Future Circular Collider reliability and availability studies
NASA Astrophysics Data System (ADS)
Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter
2017-10-01
Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.
Indoor air quality (IAQ) evaluation of a Novel Tobacco Vapor (NTV) product.
Ichitsubo, Hirokazu; Kotaki, Misato
2018-02-01
The impact of using a Novel Tobacco Vapor (NTV) product on indoor air quality (IAQ) was simulated using an environmentally-controlled chamber. Three environmental simulations were examined: two non-smoking areas (conference room and dining room) and one ventilated smoking area (smoking lounge). IAQ was evaluated by (i) measuring constituents in the mainstream NTV product emissions, and (ii) determining classical environmental tobacco smoke (ETS) and representative air quality markers. Analysis of the mainstream emissions revealed that vapor from the NTV product is chemically simpler than cigarette smoke. ETS markers (RSP, UVPM, FPM, solanesol, nicotine, 3-ethenylpyridine), volatile organic compound (toluene), carbon monoxide, propylene glycol, glycerol, and triacetin were below the limit of detection or the limit of quantification in both the non-smoking and smoking environments after using the NTV product. The concentrations of ammonia, carbonyls (formaldehyde, acetaldehyde, and acetone), and total volatile organic compounds were at the same levels found in the chamber without NTV use. There was no significant increase in the levels of formaldehyde, acetone or ammonia in exhaled breath following NTV use. In summary, under the simulations tested, the NTV product had no measurable effect on the IAQ, in either non-smoking or smoking areas. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Video requirements for remote medical diagnosis
NASA Technical Reports Server (NTRS)
Davis, J. G.
1974-01-01
Minimal television system requirements for medical telediagnosis were studied. The experiment was conducted with the aid of a simulated telemedicine system. The first step involved making high quality videotape recordings of actual medical examinations conducted by a skilled nurse under the direction of a physician watching on closed circuit television. These recordings formed the baseline for the study. Next, these videotape recordings were electronically degraded to simulate television systems of less than broadcast quality. Finally, the baseline and degraded video recordings were shown (via a statistically randomized procedure) to a large number of physicians who attempted to reach a correct medical diagnosis and to visually recognize key physical signs for each patient. By careful scoring and analysis of the results of these viewings, the pictorial and diagnostic limitations as a function of technical video characteristics were to be defined.
USDA-ARS?s Scientific Manuscript database
The temptation to include model parameters and high resolution input data together with the availability of powerful optimization and uncertainty analysis algorithms has significantly enhanced the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...
DOT National Transportation Integrated Search
2012-03-01
The Alaska-adapted version of the Weather Research and Forecasting and Community Multiscale Air Quality (WRF-CMAQ) modeling systems was used to assess the contribution of traffic to the PM2.5 concentration in the Fairbanks nonattainment ...
The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science chemical transport model (CTM) capable of simulating the emission, transport and fate of numerous air pollutants. Similarly, the Weather Research and Forecasting (WRF) model is a state-of-the-science mete...
Jeffrey G. Borchers
2005-01-01
The risks, uncertainties, and social conflicts surrounding uncharacteristic wildfire and forest resource values have defied conventional approaches to planning and decision-making. Paradoxically, the adoption of technological innovations such as risk assessment, decision analysis, and landscape simulation models by land management organizations has been limited. The...
Martin, Angel; Whiteman, C.D.
1999-01-01
Existing data on water levels, water use, water quality, and aquifer properties were used to construct a multilayer digital model to simulate flow in the aquifer system. The report describes the geohydrologic framework of the aquifer system, and the development, calibration, and sensitivity analysis of the ground-water-flow model, but it is primarily focused on the results of the simulations that show the natural flow of ground water throughout the regional aquifer system and the changes from the natural flow caused by development of ground-water supplies.
NASA Astrophysics Data System (ADS)
Krause, Lars; Braun, Markus; Hauslage, Jens; Hemmersbach, Ruth
2018-05-01
In single-celled rhizoids of the green alga Chara, positively gravitropic growth is governed by statoliths kept in a dynamically stable position 10-25 μm above the cell tip by a complex interaction of gravity and actomyosin forces. Any deviation of the tube-like cells from the tip-downward orientation causes statoliths to sediment onto the gravisensitive subapical cell flank, which initiates a gravitropic curvature response. Microgravity experiments have shown that abolishing the net tip-directed gravity force results in an actomyosin-mediated axial displacement of statoliths away from the cell tip. The present study was performed to critically assess the quality of microgravity simulation provided by different operational modes of a Random Positioning Machine (RPM) running with one axis (2D mode) or two axes (3D mode) and different rotational speeds (2D), speed ranges and directions (3D). The effects of 2D and 3D rotation were compared with data from experiments in real microgravity conditions (MAXUS sounding rocket missions). Rotational speeds in the range of 60-85 rpm in 2D and 3D modes resulted in kinetics of statolith displacement similar to those observed in real microgravity, while slower clinorotation (2-11 rpm) caused a reduced axial displacement and a more dispersed arrangement of statoliths closer to the cell tip. Increasing the complexity of rotation by adding a second rotation axis in the case of 3D clinorotation did not increase the quality of microgravity simulation; however, it increased side effects such as the level of vibrations, resulting in a more dispersed arrangement of statoliths. In conclusion, fast 2D clinorotation provides the most appropriate microgravity simulation for investigating the graviperception mechanism in Chara rhizoids, whereas slower clinorotation speeds and rotating samples around two axes do not improve the quality of microgravity simulation.
Monitoring Air Quality over China: Evaluation of the modeling system of the PANDA project
NASA Astrophysics Data System (ADS)
Bouarar, Idir; Katinka Petersen, Anna; Brasseur, Guy; Granier, Claire; Xie, Ying; Wang, Xuemei; Fan, Qi; Wang, Lili
2015-04-01
Air pollution has become a pressing problem in Asia and specifically in China due to rapid increase in anthropogenic emissions related to growth of China's economic activity and increasing demand for energy in the past decade. Observed levels of particulate matter and ozone regularly exceed World Health Organization (WHO) air quality guidelines in many parts of the country leading to increased risk of respiratory illnesses and other health problems. The EU-funded project PANDA aims to establish a team of European and Chinese scientists to monitor air pollution over China and elaborate air quality indicators in support of European and Chinese policies. PANDA combines state-of-the-art air pollution modeling with space and surface observations of chemical species to improve methods for monitoring air quality. The modeling system of the PANDA project follows a downscaling approach: global models such as MOZART and MACC system provide initial and boundary conditions to regional WRF-Chem and EMEP simulations over East Asia. WRF-Chem simulations at higher resolution (e.g. 20km) are then performed over a smaller domain covering East China and initial and boundary conditions from this run are used to perform simulations at a finer resolution (e.g. 5km) over specific megacities like Shanghai. Here we present results of model simulations for January and July 2010 performed during the first year of the project. We show an intercomparison of the global (MACC, EMEP) and regional (WRF-Chem) simulations and a comprehensive evaluation with satellite measurements (NO2, CO) and in-situ data (O3, CO, NOx, PM10 and PM2.5) at several surface stations. Using the WRF-Chem model, we demonstrate that model performance is influenced not only by the resolution (e.g. 60km, 20km) but also the emission inventories used (MACCity, HTAPv2), their resolution and diurnal variation, and the choice of initial and boundary conditions (e.g. MOZART, MACC analysis).
NASA Astrophysics Data System (ADS)
Moore, Craig S.; Wood, Tim J.; Saunderson, John R.; Beavis, Andrew W.
2017-09-01
The use of computer-simulated digital x-radiographs for optimisation purposes has become widespread in recent years. To make these optimisation investigations effective, it is vital that simulated radiographs contain accurate anatomical and system noise. Computer algorithms that simulate radiographs based solely on the incident detector x-ray intensity (‘dose’) have been reported extensively in the literature. However, while it has been established for digital mammography that x-ray beam quality is an important factor when modelling noise in simulated images, there are no such studies for diagnostic imaging of the chest, abdomen and pelvis. This study investigates the influence of beam quality on image noise in a digital radiography (DR) imaging system, and incorporates these effects into a digitally reconstructed radiograph (DRR) computer simulator. Image noise was measured on a real DR imaging system as a function of dose (absorbed energy) over a range of clinically relevant beam qualities. Simulated ‘absorbed energy’ and ‘beam quality’ DRRs were then created for each patient and tube voltage under investigation. Simulated noise images, corrected for dose and beam quality, were subsequently produced from the absorbed energy and beam quality DRRs, using the measured noise, absorbed energy and beam quality relationships. The noise images were superimposed onto the noiseless absorbed energy DRRs to create the final images. Signal-to-noise measurements in simulated chest, abdomen and spine images were within 10% of the corresponding measurements in real images. This compares favourably with our previous algorithm, where images corrected for dose only were all within 20%.
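A minimal sketch of the final step described above: superimposing noise whose standard deviation is a fitted function of absorbed energy and beam quality onto the noiseless DRR. The fitted noise model is system-specific and is assumed here to be supplied by the caller; the function names are illustrative.

```python
import numpy as np

def add_detector_noise(drr_energy, drr_quality, noise_sd_model, seed=None):
    """Superimpose noise on a noiseless absorbed-energy DRR.

    noise_sd_model(E, Q) -> per-pixel noise standard deviation, a function
    fitted to noise measured on the real DR system as a function of absorbed
    energy E and beam quality Q (the fitted form is system-specific)."""
    sd = noise_sd_model(np.asarray(drr_energy), np.asarray(drr_quality))
    rng = np.random.default_rng(seed)
    return drr_energy + rng.normal(0.0, 1.0, np.shape(drr_energy)) * sd
```

Correcting the noise field for beam quality as well as dose is what narrowed the signal-to-noise discrepancy from 20% to 10% in the study's validation.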
NASA Astrophysics Data System (ADS)
Garland, R. M.; Naidoo, M.; Dedekind, Z.; Sibiya, B.; Piketh, S.; Engelbrecht, C. J.; Engelbrecht, F.
2017-12-01
Many parts of the southern hemisphere are linked by the strong impact of emissions from natural sources, such as large biomass-burning events and marine sources, as well as from growing anthropogenic emission sources. Most of southern Africa has an arid to semi-arid climate that is strongly impacted by biomass burning, biogenic and dust emissions. In addition, there are areas of growing industrialization and urbanization that contribute to poor air quality. This air pollution can impact not only human health, but also agriculture, ecosystems, and the climate. This presentation will highlight on-going research to simulate the southern African atmosphere and its impacts, with a focus on the interplay and relative importance of natural and anthropogenic emissions. The presentation will discuss the simulated sensitivity of the southern African climate to aerosol particles to highlight the importance of natural sources. These historical simulations (1979-2012) were performed with CCAM and contribute towards the development of the first Africa-led earth systems model. The analysis focused on the simulated sensitivity of the climate and clouds off the southwestern coast of Africa to aerosol particles. The interplay between natural and anthropogenic sources of air pollution is highlighted using the Waterberg region of South Africa as a case study. CAMx was run at 2 km resolution for 2013, using local emission inventories and meteorological output from CCAM, to simulate the air quality of the region. These simulations estimate that, on average in summer, up to 20% of ozone in and around a power plant plume is attributable to biogenic sources of VOCs, with ozone peaks of up to 120 ppb, highlighting the importance of understanding the mix of pollutants in this area. In addition to presenting results from this study, the challenges in modelling will be highlighted. These challenges include the scarcity or absence of measurements needed to understand, and then accurately simulate, atmospheric chemistry (e.g. OH, PAN, SOA).
NASA Astrophysics Data System (ADS)
Yousif, Dilon
The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) method developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of 3XX Al alloys using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
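Statistical recalculation of control limits from in-control data is commonly done with Shewhart individuals charts; a hedged sketch follows (this is a standard SPC formula, not necessarily the exact procedure used at WAP).

```python
import numpy as np

def individuals_control_limits(x, d2=1.128):
    """Shewhart individuals-chart limits from in-control composition data:
    center +/- 3 * sigma_hat, with sigma_hat estimated from the mean moving
    range (d2 = 1.128 for a moving range of span 2)."""
    x = np.asarray(x, float)
    sigma_hat = np.abs(np.diff(x)).mean() / d2
    center = x.mean()
    return center - 3.0 * sigma_hat, center, center + 3.0 * sigma_hat
```

Restricting the input to measurements taken under stable conditions, as the study did, keeps special-cause variation out of the sigma estimate and tightens the limits accordingly.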
NASA Technical Reports Server (NTRS)
Murphy, M. R.; Awe, C. A.
1986-01-01
Six professionally active, retired captains rated the coordination and decisionmaking performances of sixteen aircrews while viewing videotapes of a simulated commercial air transport operation. The scenario featured a required diversion and a probable minimum fuel situation. Seven-point Likert-type scales were used in rating variables on the basis of a model of crew coordination and decisionmaking. The variables were based on concepts of, for example, decision difficulty, efficiency, and outcome quality, and on leader-subordinate concepts such as person- and task-oriented leader behavior and competency motivation of subordinate crewmembers. Five front-end variables of the model were in turn dependent variables for a hierarchical regression procedure. The variance in safety performance was explained 46% by decision efficiency, command reversal, and decision quality. The variance of decision quality, an alternative substantive dependent variable to safety performance, was explained 60% by decision efficiency and the captain's quality of within-crew communications. The variances of decision efficiency, crew coordination, and command reversal were in turn explained 78%, 80%, and 60% by small numbers of preceding independent variables. A principal-component varimax factor analysis supported the model structure suggested by the regression analyses.
Mass imbalances in EPANET water-quality simulations
NASA Astrophysics Data System (ADS)
Davis, Michael J.; Janke, Robert; Taxon, Thomas N.
2018-04-01
EPANET is widely employed to simulate water quality in water distribution systems. However, in general, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results only for short water-quality time steps. Overly long time steps can yield errors in concentration estimates and can result in situations in which constituent mass is not conserved. The use of a time step that is sufficiently short to avoid these problems may not always be feasible. The absence of EPANET errors or warnings does not ensure conservation of mass. This paper provides examples illustrating mass imbalances and explains how such imbalances can occur because of fundamental limitations in the water-quality routing algorithm used in EPANET. In general, these limitations cannot be overcome by the use of improved water-quality modeling practices. This paper also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, toward those obtained using the preliminary event-driven approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations. The results presented in this paper should be of value to those who perform water-quality simulations using EPANET or use the results of such simulations, including utility managers and engineers.
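As the abstract notes, the absence of EPANET warnings does not ensure conservation of mass, so an external mass-balance audit is useful. The sketch below shows the bookkeeping under strong simplifying assumptions (one pipe, constant flow); the pulse concentrations are invented, and the coarsely resolved downstream series illustrates how a too-long water-quality step can lose constituent mass.

```python
import numpy as np

def mass_balance_error(c_in, c_out, q, dt, stored_start=0.0, stored_end=0.0):
    """Residual of (inflow mass + initial storage) - (outflow mass + final storage).
    A zero residual means constituent mass was conserved."""
    mass_in = np.sum(np.asarray(c_in, dtype=float)) * q * dt
    mass_out = np.sum(np.asarray(c_out, dtype=float)) * q * dt
    residual = (mass_in + stored_start) - (mass_out + stored_end)
    return residual, residual / (mass_in + stored_start)

q, dt = 0.01, 300.0                        # flow (m^3/s), 5-minute step
c_in  = [0, 0, 100, 100, 100, 100, 0, 0]   # source-node pulse (mg/L)
c_out = [0, 0, 0, 100, 100, 0, 0, 0]       # downstream series, pulse clipped
residual, frac = mass_balance_error(c_in, c_out, q, dt)
print(f"mass residual = {residual:.0f} (c*V units), {frac:.0%} of input")
```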
Effects of simulated turbulence on aircraft handling qualities
NASA Technical Reports Server (NTRS)
Jacobson, I. D.; Joshi, D. S.
1977-01-01
The influence of simulated turbulence on aircraft handling qualities is presented. Pilot opinions of the handling qualities of a light general aviation aircraft were evaluated in a motion-base simulator using a simulated turbulence environment. A realistic representation of turbulence disturbances is described in terms of rms intensity and scale length and their random variations with time. The time histories generated by the proposed turbulence models showed characteristics which are more similar to real turbulence than the frequently-used Gaussian turbulence model. The proposed turbulence models flexibly accommodate changes in atmospheric conditions and are easily implemented in flight simulator studies.
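One generic way to realize a turbulence disturbance with time-varying rms intensity is a first-order (Dryden-like) colored-noise filter whose sigma wanders slowly, in contrast to a fixed-sigma Gaussian model. The sketch below illustrates that idea only; it is not the authors' model, and every parameter value is hypothetical.

```python
import numpy as np

def gust_time_history(duration, dt, V, L, sigma_mean, seed=0):
    """AR(1) gust velocity with time constant L/V and a slowly
    wandering rms intensity (random walk, reflected at zero)."""
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    a = np.exp(-dt * V / L)
    sigma = np.abs(sigma_mean + np.cumsum(rng.normal(0, 0.01 * sigma_mean, n)))
    w = np.zeros(n)
    for k in range(1, n):
        w[k] = a * w[k - 1] + np.sqrt(1 - a * a) * sigma[k] * rng.normal()
    return w

gust = gust_time_history(duration=60.0, dt=0.02, V=50.0, L=150.0, sigma_mean=1.5)
print(f"rms = {gust.std():.2f} m/s over {gust.size} samples")
```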
Hong, Wei; Huang, Dexiu; Zhang, Xinliang; Zhu, Guangxi
2007-12-24
All-optical on-off keying (OOK) to binary phase-shift keying (BPSK) modulation format conversion based on gain-transparent semiconductor optical amplifier (GT-SOA) is simulated and analyzed, where GT-SOA is used as an all-optical phase-modulator (PM). Numerical simulation of the phase modulation effect of GT-SOA is performed using a wideband dynamic model of GT-SOA and the quality of the BPSK signal is evaluated using the differential-phase-Q factor. Performance improvement by holding light injection is analyzed and non-return-to-zero (NRZ) and return-to-zero (RZ) modulation formats of the OOK signal are considered.
Linking Air Quality and Human Health Effects Models: An Application to the Los Angeles Air Basin
Stewart, Devoun R; Saunders, Emily; Perea, Roberto A; Fitzgerald, Rosa; Campbell, David E; Stockwell, William R
2017-01-01
Proposed emission control strategies for reducing ozone and particulate matter are evaluated better when air quality and health effects models are used together. The Community Multiscale Air Quality (CMAQ) model is the US Environmental Protection Agency’s model for determining public policy and forecasting air quality. CMAQ was used to forecast air quality changes due to several emission control strategies that could be implemented between 2008 and 2030 for the South Coast Air Basin that includes Los Angeles. The Environmental Benefits Mapping and Analysis Program—Community Edition (BenMAP-CE) was used to estimate health and economic impacts of the different emission control strategies based on CMAQ simulations. BenMAP-CE is a computer program based on epidemiologic studies that link human health and air quality. This modeling approach is better for determining optimum public policy than approaches that only examine concentration changes. PMID:29162976
NASA Astrophysics Data System (ADS)
Liang, Weibin; Ouyang, Sen; Huang, Xiang; Su, Weijian
2017-05-01
The existing modeling process for the power quality of electrified railways connected to the power grid is complicated and the simulation scenarios are incomplete, so this paper puts forward a novel power quality evaluation method based on PSCAD/EMTDC. Firstly, a model of the power quality of electrified railways connected to the power grid is established, based on testing reports or measured data. The equivalent model of the electrified locomotive contains a power characteristic and a harmonic characteristic, which are represented by a load and a harmonic source, respectively. Secondly, in order to make the evaluation more complete, an analysis scheme is put forward. The scheme uses a combination of three dimensions of the electrified locomotive: type, working condition, and quantity. Finally, the Shenmao Railway is taken as an example to evaluate power quality under different scenarios, and the results show that electrified railways connected to the power grid have a significant effect on power quality.
Li, Zhaofu; Liu, Hongyu; Luo, Chuan; Li, Yan; Li, Hengpeng; Pan, Jianjun; Jiang, Xiaosan; Zhou, Quansuo; Xiong, Zhengqin
2015-05-01
The Hydrological Simulation Program-Fortran (HSPF), a hydrological and water-quality computer model developed by the United States Environmental Protection Agency, was employed to simulate runoff and nutrient export from a typical small watershed in a hilly eastern monsoon region of China. First, a parameter sensitivity analysis was performed to assess how changes in the model parameters affect runoff and nutrient export. Next, the model was calibrated and validated using measured runoff and nutrient concentration data. The Nash-Sutcliffe efficiency (ENS) values of the yearly runoff were 0.87 and 0.69 for the calibration and validation periods, respectively. For storm runoff events, the ENS values were 0.93 for the calibration period and 0.47 for the validation period. Antecedent precipitation and soil moisture conditions can affect the simulation accuracy of storm event flow. The ENS values for the total nitrogen (TN) export were 0.58 for the calibration period and 0.51 for the validation period. In addition, the correlation coefficients between the observed and simulated TN concentrations were 0.84 for the calibration period and 0.74 for the validation period. For phosphorus export, the ENS values were 0.89 for the calibration period and 0.88 for the validation period. In addition, the correlation coefficients between the observed and simulated orthophosphate concentrations were 0.96 and 0.94 for the calibration and validation periods, respectively. The nutrient simulation results are generally satisfactory even though the parameter-lumped HSPF model cannot represent the effects of the spatial pattern of land cover on nutrient export. The model parameters obtained in this study could serve as reference values for applying the model to similar regions. In addition, HSPF can properly describe the characteristics of water quantity and quality processes in this area. After adjustment, calibration, and validation of the parameters, the HSPF model is suitable for hydrological and water-quality simulations in watershed planning and management and for designing best management practices.
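The Nash-Sutcliffe efficiency used throughout this evaluation compares the model's squared error against the variance of the observations; 1 indicates a perfect fit and 0 means the model does no better than the observed mean. A minimal sketch, with illustrative runoff numbers rather than the study's data:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """ENS = 1 - SSE / SS_tot; 1 is perfect, 0 matches the mean predictor."""
    o = np.asarray(observed, dtype=float)
    s = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2)

obs = np.array([820.0, 640.0, 910.0, 700.0, 760.0])  # yearly runoff (mm), invented
sim = np.array([790.0, 675.0, 880.0, 735.0, 750.0])
print(f"ENS = {nash_sutcliffe(obs, sim):.2f}")
```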
[Quality assurance of the renal applications software].
del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M
2007-01-01
The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of the applications software. Our work reported here aims at verifying the analysis software for processing of renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of the renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder, and aortic artery) were simple geometric forms. The uptake of the renal structures was described by mathematical functions. Curves corresponding to normal or pathological conditions were simulated for the kidneys, bladder, and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematical curves and the quantitative data produced by the renal analysis program. Our test procedure is simple to apply, reliable, reproducible, and rapid for verifying the renal applications software.
Safer@home—Simulation and training: the study protocol of a qualitative action research design
Wiig, Siri; Guise, Veslemøy; Anderson, Janet; Storm, Marianne; Lunde Husebø, Anne Marie; Testad, Ingelin; Søyland, Elsa; Moltu, Kirsti L
2014-01-01
Introduction While it is predicted that telecare and other information and communication technology (ICT)-assisted services will have an increasingly important role in future healthcare services, their implementation in practice is complex. For implementation of telecare to be successful and ensure quality of care, sufficient training for staff (healthcare professionals) and service users (patients) is fundamental. Telecare training has been found to have positive effects on attitudes to, sustained use of, and outcomes associated with telecare. However, the potential contribution of training to the adoption, quality and safety of telecare services is an under-investigated research field. The overall aim of this study is to develop and evaluate simulation-based telecare training programmes to aid the use of videophone technology in elderly home care. Research-based training programmes will be designed for healthcare professionals, service users and next of kin, and the study will explore the impact of training on adoption, quality and safety of new telecare services. Methods and analysis The study has a qualitative action research design. The research will be undertaken in close collaboration with a multidisciplinary team consisting of researchers, managers and clinical representatives from healthcare services in two Norwegian municipalities, alongside experts in clinical education and simulation, as well as service user (patient) representatives. The qualitative methods used involve focus group interviews, semistructured interviews, observation and document analysis. To ensure trustworthiness in the data analysis, we will apply member checks and analyst triangulation, in addition to providing contextual and sample descriptions to allow for evaluation of the transferability of our results to other contexts and groups. Ethics and dissemination The study is approved by the Norwegian Social Science Data Services. The study is based on voluntary participation and informed written consent. Informants can withdraw at any point in time. The results will be disseminated at research conferences, in peer-reviewed journals, in one PhD thesis and through public presentations to people outside the scientific community. PMID:25079924
Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis
NASA Astrophysics Data System (ADS)
Chou, Hui-Yu; Yang, Jyh-Bin
2017-10-01
The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence. More information used in delay analysis usually produces more accurate and fair analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As Building Information Modeling (BIM) techniques have developed rapidly, using BIM and 4D simulation techniques for delay analysis has been proposed and implemented. Obvious benefits have been achieved, especially in identifying and solving construction sequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by BIM techniques. The research results could serve as a foundation for developing new approaches for resolving schedule delay disputes.
Madenci, Arin L; Solis, Carolina V; de Moya, Marc A
2014-02-01
Simulation training for invasive procedures may improve patient safety by enabling efficient training. This study is a meta-analysis with rigorous inclusion and exclusion criteria designed to assess the real-patient procedural success of simulation training for central venous access. Published randomized controlled trials and prospective 2-group cohort studies that used simulation for the training of procedures involving central venous access were identified. The quality of each study was assessed. The primary outcome was the proportion of trainees who demonstrated the ability to successfully complete the procedure. Secondary outcomes included the mean number of attempts to procedural success and periprocedural adverse events. Proportions were compared between groups using risk ratios (RRs), whereas continuous variables were compared using weighted mean differences. Random-effects analysis was used to determine pooled effect sizes. We identified 550 studies, of which 5 (3 randomized controlled trials, 2 prospective 2-group cohort studies) examining central venous catheter (CVC) insertion were included in the meta-analysis, comprising 407 medical trainees. The simulation group had a significantly larger proportion of trainees who successfully placed CVCs (RR, 1.09; 95% confidence interval [CI], 1.03-1.16, P<0.01). In addition, the simulation group had significantly fewer mean attempts to CVC insertion (weighted mean difference, -1.42; 95% CI, -2.34 to -0.49, P<0.01). There was no significant difference in the rate of adverse events between the groups (RR, 0.50; 95% CI, 0.19-1.29; P=0.15). Training programs should consider adopting simulation training for CVC insertion to improve the real-patient procedural success of trainees.
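Random-effects pooling of risk ratios is typically done on the log scale, for example with the DerSimonian-Laird estimator of between-study variance. The sketch below shows that computation; the per-study log risk ratios and standard errors are hypothetical, not the values from the five included studies.

```python
import numpy as np

def pooled_rr_dl(log_rr, se):
    """DerSimonian-Laird random-effects pooled risk ratio with 95% CI."""
    y = np.asarray(log_rr, dtype=float)
    v = np.asarray(se, dtype=float) ** 2
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(y_re), np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)

rr, ci_lo, ci_hi = pooled_rr_dl([0.10, 0.06, 0.12, 0.05, 0.09],
                                [0.05, 0.04, 0.06, 0.03, 0.05])
print(f"pooled RR = {rr:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```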
Guidelines for the analysis of free energy calculations
Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.
2015-01-01
Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, and which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
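Of the estimators the review covers, thermodynamic integration is the simplest to illustrate: the free energy difference is the integral of <dH/dlambda> over the coupling parameter. Below is a minimal sketch using the trapezoidal rule; the lambda windows and <dH/dlambda> averages are invented for illustration.

```python
import numpy as np

def ti_free_energy(lambdas, dhdl_means):
    """Thermodynamic integration: trapezoidal integral of <dH/dlambda>."""
    lam = np.asarray(lambdas, dtype=float)
    g = np.asarray(dhdl_means, dtype=float)
    return np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(lam))

lam = np.linspace(0.0, 1.0, 6)                     # 6 lambda windows
dhdl = np.array([12.4, 8.1, 4.9, 2.6, 1.0, 0.2])   # kcal/mol, hypothetical
print(f"Delta G (TI) = {ti_free_energy(lam, dhdl):.2f} kcal/mol")
```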
Development and psychometric testing of the satisfaction with Cultural Simulation Experience Scale.
Courtney-Pratt, Helen; Levett-Jones, Tracy; Lapkin, Samuel; Pitt, Victoria; Gilligan, Conor; Van der Riet, Pamela; Rossiter, Rachel; Jones, Donovan; Everson, Naleya
2015-11-01
Decreasing the numbers of adverse health events experienced by people from culturally diverse backgrounds rests, in part, on the ability of education providers to provide quality learning experiences that support nursing students in developing cultural competence, an essential professional attribute. This paper reports on the implementation and evaluation of an immersive 3D cultural empathy simulation. The Satisfaction with Cultural Simulation Experience Scale used in this study was adapted and validated in the first stage of the study. Exploratory factor analysis and confirmatory factor analysis were undertaken to investigate the psychometric properties of the scale using two randomly-split sub-samples. Cronbach's alpha was used to examine internal consistency reliability. Descriptive statistics were used for analysis of mean satisfaction scores, and qualitative comments to open-ended questions were analysed and coded. A purposive sample (n = 497) of second-year nursing students participated in the study. The overall Cronbach's alpha for the scale was 0.95 and each subscale demonstrated high internal consistency: 0.92, 0.92, and 0.72, respectively. The mean satisfaction score was 4.64 (SD 0.51) out of a maximum of 5, indicating a high level of participant satisfaction with the simulation. Three themes emerged from qualitative analysis: "Becoming culturally competent", "Learning from the debrief" and "Reflecting on practice". The cultural simulation was highly regarded by students. Psychometric testing of the Satisfaction with Cultural Simulation Experience Scale demonstrated that it is a reliable instrument. However, there is room for improvement and further testing in other contexts is therefore recommended. Copyright © 2015 Elsevier Ltd. All rights reserved.
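Cronbach's alpha, the internal-consistency statistic reported above, relates the sum of item variances to the variance of the total score. A minimal sketch follows, with simulated Likert responses built from a shared factor so that alpha comes out high; the data are not from the study.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    return (k / (k - 1)) * (1.0 - X.var(axis=0, ddof=1).sum()
                            / X.sum(axis=1).var(ddof=1))

rng = np.random.default_rng(42)
factor = rng.normal(size=(497, 1))                 # shared satisfaction factor
scores = np.clip(np.round(4 + factor + rng.normal(scale=0.6, size=(497, 18))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```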
NASA Astrophysics Data System (ADS)
Setiawan, R.
2018-05-01
In this paper, the Economic Order Quantity (EOQ) of a vendor-buyer supply-chain model under probabilistic conditions with imperfect quality items is analysed. The analysis is delivered using two concepts from game theory, Stackelberg equilibrium and Pareto optimality, under non-cooperative and cooperative games, respectively. A further result is a comparison of the optimal results of the integrated scheme and the game theory approach, based on analytical and numerical results using appropriate simulation data.
2010-01-01
[Garbled front-matter excerpt: cites the PhD thesis "Soporte de Modelos Sistémicos: Aplicación al Sector de Desarrollo de Software de Argentina" (Universidad Tecnológica Nacional) and lists section and figure titles on simulation approaches, combat fire support effectiveness, the functional structure of a multiple regression model, and TSP quality plans.]
Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.
2016-01-01
The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design') is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show sensitivity of the stability and control characteristics and an approach to highlight potential weight savings by identifying over-design.
Establishing the common patterns of future tropospheric ozone under diverse climate change scenarios
NASA Astrophysics Data System (ADS)
Jimenez-Guerrero, Pedro; Gómez-Navarro, Juan J.; Jerez, Sonia; Lorente-Plazas, Raquel; Baro, Rocio; Montávez, Juan P.
2013-04-01
The impacts of climate change on air quality may affect long-term air quality planning. However, the policies aimed at improving air quality in the EU directives have not accounted for variations in the climate. Climate change alone influences future air quality through modifications of gas-phase chemistry, transport, removal, and natural emissions. As such, the aim of this work is to check whether the projected changes in gas-phase air pollution over Europe depend on the scenario driving the regional simulation. For this purpose, two full-transient regional climate change-air quality projections for the first half of the XXI century (1991-2050) have been carried out with the MM5+CHIMERE system, including the A2 and B2 SRES scenarios. Experiments span the periods 1971-2000, as a reference, and 2071-2100, as future enhanced greenhouse gas and aerosol scenarios (SRES A2 and B2). The atmospheric simulations have a horizontal resolution of 25 km and 23 vertical layers up to 100 mb, and were driven by ECHO-G global climate model outputs. The analysis focuses on the connection between meteorological and air quality variables. Our simulations suggest that the modes of variability for tropospheric ozone and its main precursors hardly change under different SRES scenarios. The effect of changing scenarios has to be sought in the intensity of the changing signal, rather than in the spatial structure of the variation patterns, since the correlation between the spatial patterns of variability in the A2 and B2 simulations is r > 0.75 for all gas-phase pollutants included in this study. In both cases, the full-transient simulations indicate enhanced chemical activity under future scenarios. The causes of tropospheric ozone variations have to be sought in a multiplicity of climate factors, such as increased temperature, a different distribution of precipitation patterns across Europe, increased photolysis of primary and secondary pollutants due to lower cloudiness, etc. Nonetheless, according to the results of this work, future ozone is conditioned by the dependence of biogenic emissions on the climatological patterns of variability. In this sense, ozone over Europe is mainly driven by the warming-induced increase in biogenic emitting activity (vegetation is kept invariable in the simulations, but estimations of these emissions strongly depend on shortwave radiation and temperature, which are substantially modified in climatic simulations). Moreover, one of the most important drivers of ozone increase is the decrease of cloudiness (related to stronger solar radiation), mostly over southern Europe in the first half of the XXI century. However, given the large uncertainty in isoprene sensitivity to climate change and the large uncertainties associated with the cloudiness projections, these results should be considered carefully.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, D
2015-06-15
Purpose: AAPM radiation therapy committee task group No. 66 (TG-66) published a report which described a general approach to CT simulator QA. The report outlines the testing procedures and specifications for the evaluation of patient dose, radiation safety, electromechanical components, and image quality for a CT simulator. The purpose of this study is to thoroughly evaluate the performance of a second generation Toshiba Aquilion Large Bore CT simulator with 90 cm bore size (Toshiba, Nasu, JP) based on the TG-66 criteria. The testing procedures and results from this study provide baselines for a routine QA program. Methods: Different measurements and analyses were performed, including CTDIvol measurements, alignment and orientation of gantry lasers, orientation of the tabletop with respect to the imaging plane, table movement and indexing accuracy, scanogram location accuracy, high contrast spatial resolution, low contrast resolution, field uniformity, CT number accuracy, and mA linearity and mA reproducibility, using a number of different phantoms and measuring devices, such as a CTDI phantom, the ACR image quality phantom, a TG-66 laser QA phantom, a pencil ion chamber (Fluke Victoreen), and an electrometer (RTI Solidose 400). Results: The CTDI measurements were within 20% of the console displayed values. The alignment and orientation of both the gantry lasers and the tabletop, as well as the table movement and indexing and scanogram location accuracy, were within 2 mm as specified in TG-66. The spatial resolution, low contrast resolution, field uniformity and CT number accuracy were all within ACR's recommended limits. The mA linearity and reproducibility were both well below the TG-66 threshold. Conclusion: The 90 cm bore size second generation Toshiba Aquilion Large Bore CT simulator, which comes with a 70 cm true FOV, can consistently meet various clinical needs. The results demonstrated that this simulator complies with the TG-66 protocol in all aspects, including the electromechanical, radiation safety, and image quality components. Disclosure: Employee of Toshiba America Medical Systems.
Liu, Yanchi; Wang, Xue; Liu, Youda; Cui, Sujin
2016-06-27
Power quality analysis issues, especially the measurement of harmonics and interharmonics in cyber-physical energy systems, are addressed in this paper. As new situations are introduced to the power system, the impact of electric vehicles, distributed generation, and renewable energy has introduced extra demands for distributed sensors, waveform-level information, and power quality data analytics. Harmonics and interharmonics, as the most significant disturbances, require carefully designed detection methods for an accurate measurement of electric loads, whose information is crucial to subsequent analysis and control. This paper gives a detailed description of the power quality analysis framework in a networked environment and presents a fast and resolution-enhanced method for harmonic and interharmonic measurement. The proposed method first extracts harmonic and interharmonic components efficiently using the single-channel version of Robust Independent Component Analysis (RobustICA), then estimates the high-resolution frequency from three discrete Fourier transform (DFT) samples with little additional computation, and finally computes the amplitudes and phases with the adaptive linear neuron network. The experiments show that the proposed method is time-efficient and leads to better accuracy on simulated and experimental signals in the presence of noise and fundamental frequency deviation, thus providing a deeper insight into the (inter)harmonic sources or even the whole system.
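The step of refining a frequency estimate from three DFT samples around a spectral peak can be illustrated with Jacobsen's fractional-bin estimator, one of several such three-sample interpolators; the paper's exact formula is not specified here, and the test signal below (an off-nominal fundamental plus a harmonic and an interharmonic) is hypothetical.

```python
import numpy as np

def jacobsen_freq(x, fs):
    """Refine the dominant tone's frequency from the three DFT samples
    around the peak bin (Jacobsen's estimator, rectangular window)."""
    X = np.fft.rfft(x)
    k = np.argmax(np.abs(X[1:-1])) + 1             # peak bin, skipping DC/Nyquist
    delta = ((X[k - 1] - X[k + 1]) /
             (2 * X[k] - X[k - 1] - X[k + 1])).real
    return (k + delta) * fs / len(x)

fs, n = 6400.0, 1024
t = np.arange(n) / fs
x = (np.sin(2 * np.pi * 49.7 * t)                  # fundamental, off-nominal
     + 0.2 * np.sin(2 * np.pi * 250.0 * t)         # 5th harmonic
     + 0.1 * np.sin(2 * np.pi * 182.0 * t))        # interharmonic
print(f"estimated fundamental = {jacobsen_freq(x, fs):.2f} Hz")
```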
Computational study of noise in a large signal transduction network.
Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena
2011-06-21
Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
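The volume dependence described above can be reproduced with a much smaller system: in an exact Gillespie simulation of a birth-death process, the mean molecule count scales with volume while the relative size of fluctuations shrinks. A minimal sketch with hypothetical rate constants, not the authors' network:

```python
import numpy as np

def gillespie_birth_death(k_prod, k_deg, volume, t_end, seed=0):
    """Exact SSA for production (zeroth order, scales with volume)
    and degradation (first order) of a single species."""
    rng = np.random.default_rng(seed)
    t, n = 0.0, 0
    counts = [0]
    while t < t_end:
        a1, a2 = k_prod * volume, k_deg * n
        t += rng.exponential(1.0 / (a1 + a2))      # time to next reaction
        n += 1 if rng.random() < a1 / (a1 + a2) else -1
        counts.append(n)
    return np.array(counts)

for vol in (1.0, 10.0, 100.0):
    n = gillespie_birth_death(k_prod=10.0, k_deg=1.0, volume=vol, t_end=50.0, seed=1)
    tail = n[len(n) // 2:]                         # discard the transient
    print(f"V={vol:>5}: mean={tail.mean():7.1f}  CV={tail.std() / tail.mean():.3f}")
```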
Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason P.; Modenese, Luca; Reinbolt, Jeffrey A.; Thelen, Darryl G.; Umberger, Brian R.
2016-01-01
Objective The overall goal of this document is to demonstrate that dissemination of models and analyses for assessing the reproducibility of simulation results can be incorporated in the scientific review process in biomechanics. Methods As part of a special issue on model sharing and reproducibility in IEEE Transactions on Biomedical Engineering, two manuscripts on computational biomechanics were submitted: A. Rajagopal et al., IEEE Trans. Biomed. Eng., 2016 and A. Schmitz and D. Piovesan, IEEE Trans. Biomed. Eng., 2016. Models used in these studies were shared with the scientific reviewers and the public. In addition to the standard review of the manuscripts, the reviewers downloaded the models and performed simulations that reproduced results reported in the studies. Results There was general agreement between simulation results of the authors and those of the reviewers. Discrepancies were resolved during the necessary revisions. The manuscripts and instructions for download and simulation were updated in response to the reviewers’ feedback; changes that may otherwise have been missed if explicit model sharing and simulation reproducibility analysis were not conducted in the review process. Increased burden on the authors and the reviewers, to facilitate model sharing and to repeat simulations, were noted. Conclusion When the authors of computational biomechanics studies provide access to models and data, the scientific reviewers can download and thoroughly explore the model, perform simulations, and evaluate simulation reproducibility beyond the traditional manuscript-only review process. Significance Model sharing and reproducibility analysis in scholarly publishing will result in a more rigorous review process, which will enhance the quality of modeling and simulation studies and inform future users of computational models. PMID:28072567
Moyer, Douglas; Hyer, Kenneth
2003-01-01
Impairment of surface waters by fecal coliform bacteria is a water-quality issue of national scope and importance. Section 303(d) of the Clean Water Act requires that each State identify surface waters that do not meet applicable water-quality standards. In Virginia, more than 175 stream segments are on the 1998 Section 303(d) list of impaired waters because of violations of the water-quality standard for fecal coliform bacteria. A total maximum daily load (TMDL) will need to be developed by 2006 for each of these impaired streams and rivers by the Virginia Departments of Environmental Quality and Conservation and Recreation. A TMDL is a quantitative representation of the maximum load of a given water-quality constituent, from all point and nonpoint sources, that a stream can assimilate without violating the designated water-quality standard. Blacks Run, in Rockingham County, Virginia, is one of the stream segments listed by the State of Virginia as impaired by fecal coliform bacteria. Watershed modeling and bacterial source tracking were used to develop the technical components of the fecal coliform bacteria TMDL for Blacks Run. The Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate streamflow, fecal coliform concentrations, and source-specific fecal coliform loading in Blacks Run. Ribotyping, a bacterial source tracking technique, was used to identify the dominant sources of fecal coliform bacteria in the Blacks Run watershed. Ribotyping also was used to determine the relative contributions of specific sources to the observed fecal coliform load in Blacks Run. Data from the ribotyping analysis were incorporated into the calibration of the fecal coliform model. Study results provide information regarding the calibration of the streamflow and fecal coliform bacteria models and also identify the reductions in fecal coliform loads required to meet the TMDL for Blacks Run. The calibrated streamflow model simulated observed streamflow characteristics with respect to total annual runoff, seasonal runoff, average daily streamflow, and hourly stormflow. The calibrated fecal coliform model simulated the patterns and range of observed fecal coliform bacteria concentrations. Observed fecal coliform bacteria concentrations during low-flow periods ranged from 40 to 7,000 colonies per 100 milliliters, and peak concentrations during storm-flow periods ranged from 33,000 to 260,000 colonies per 100 milliliters. Simulated source-specific contributions of fecal coliform bacteria to instream load were matched to the observed contributions from the dominant sources, which were cats, cattle, deer, dogs, ducks, geese, horses, humans, muskrats, poultry, raccoons, and sheep. According to model results, a 95-percent reduction in the current fecal coliform load delivered from the watershed to Blacks Run would result in compliance with the designated water-quality goals and associated TMDL.
Moyer, Douglas; Hyer, Kenneth
2003-01-01
Impairment of surface waters by fecal coliform bacteria is a water-quality issue of national scope and importance. Section 303(d) of the Clean Water Act requires that each State identify surface waters that do not meet applicable water-quality standards. In Virginia, more than 175 stream segments are on the 1998 Section 303(d) list of impaired waters because of violations of the water-quality standard for fecal coliform bacteria. A total maximum daily load (TMDL) will need to be developed by 2006 for each of these impaired streams and rivers by the Virginia Departments of Environmental Quality and Conservation and Recreation. A TMDL is a quantitative representation of the maximum load of a given water-quality constituent, from all point and nonpoint sources, that a stream can assimilate without violating the designated water-quality standard. Accotink Creek, in Fairfax County, Virginia, is one of the stream segments listed by the State of Virginia as impaired by fecal coliform bacteria. Watershed modeling and bacterial source tracking were used to develop the technical components of the fecal coliform bacteria TMDL for Accotink Creek. The Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate streamflow, fecal coliform concentrations, and source-specific fecal coliform loading in Accotink Creek. Ribotyping, a bacterial source tracking technique, was used to identify the dominant sources of fecal coliform bacteria in the Accotink Creek watershed. Ribotyping also was used to determine the relative contributions of specific sources to the observed fecal coliform load in Accotink Creek. Data from the ribotyping analysis were incorporated into the calibration of the fecal coliform model. Study results provide information regarding the calibration of the streamflow and fecal coliform bacteria models and also identify the reductions in fecal coliform loads required to meet the TMDL for Accotink Creek. The calibrated streamflow model simulated observed streamflow characteristics with respect to total annual runoff, seasonal runoff, average daily streamflow, and hourly stormflow. The calibrated fecal coliform model simulated the patterns and range of observed fecal coliform bacteria concentrations. Observed fecal coliform bacteria concentrations during low-flow periods ranged from 25 to 800 colonies per 100 milliliters, and peak concentrations during storm-flow periods ranged from 19,000 to 340,000 colonies per 100 milliliters. Simulated source-specific contributions of fecal coliform bacteria to instream load were matched to the observed contributions from the dominant sources, which were cats, deer, dogs, ducks, geese, humans, muskrats, and raccoons. According to model results, an 89-percent reduction in the current fecal coliform load delivered from the watershed to Accotink Creek would result in compliance with the designated water-quality goals and associated TMDL.
High quality transmission Kikuchi diffraction analysis of deformed alloys - Case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tokarski, Tomasz, E-mail: tokarski@agh.edu.pl
Modern scanning electron microscopes (SEM) equipped with thermally assisted field emission guns (Schottky FEG) are capable of imaging with a resolution in the range of several nanometers or better. Simultaneously, a high electron beam current can be used, which enables fast chemical and crystallographic analysis with a higher resolution than is normally offered by SEM with a tungsten cathode. The current resolution limit of EDS and EBSD analysis is related to materials physics, particularly to the electron-specimen interaction volume. The application of thin, electron-transparent specimens, instead of bulk samples, improves the resolution and allows for the detailed analysis of very fine microstructural features. Besides the typical imaging mode, it is possible to use a standard EBSD camera in such a configuration that only transmitted and scattered electrons are detected. This modern approach was successfully applied to various materials, giving rise to significant resolution improvement, especially for light-element magnesium-based alloys. This paper presents an insight into the application of the transmission Kikuchi diffraction (TKD) technique applied to the most troublesome, heavily-deformed materials. In particular, the values of the highest possible acquisition rates for high resolution and high quality mapping were estimated within typical imaging conditions for stainless steel and a magnesium-yttrium alloy. - Highlights: •Monte Carlo simulations were used to simulate EBSD camera intensity for various measuring conditions. •Transmission Kikuchi diffraction parameters were evaluated for highly deformed, light- and heavy-element based alloys. •High quality maps with 20 nm spatial resolution were acquired for Mg and Fe based alloys. •High speed TKD measurements were performed at acquisition rates comparable to reflection EBSD.
PSAMM: A Portable System for the Analysis of Metabolic Models
Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying
2016-01-01
The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591
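The constraint-based analysis PSAMM performs is, at its core, a linear program: maximize an objective flux subject to steady-state stoichiometry and flux bounds. The sketch below solves a three-reaction toy network with scipy; it illustrates flux balance analysis generally and is not PSAMM's API.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> biomass (columns of S)
S = np.array([[ 1, -1,  0],    # metabolite A balance
              [ 0,  1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10
c = [0, 0, -1]                             # maximize biomass = minimize -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print(f"optimal biomass flux = {-res.fun:.1f}, fluxes = {res.x}")
```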
Simulation of FIB-SEM images for analysis of porous microstructures.
Prill, Torben; Schladitz, Katja
2013-01-01
Focused ion beam-scanning electron microscopy (FIB-SEM) nanotomography yields high-quality three-dimensional images of materials microstructures at the nanometer scale, combining serial sectioning using a focused ion beam with SEM imaging. However, FIB-SEM tomography of highly porous media leads to shine-through artifacts preventing automatic segmentation of the solid component. We simulate the SEM process in order to generate synthetic FIB-SEM image data for developing and validating segmentation methods. Monte Carlo techniques yield accurate results, but are too slow for the simulation of FIB-SEM tomography, which requires hundreds of SEM images for one dataset alone. Nevertheless, a quasi-analytic description of the specimen and various acceleration techniques, including a track compression algorithm and an acceleration for the simulation of secondary electrons, cut down the computing time by orders of magnitude, allowing FIB-SEM tomography to be simulated for the first time. © Wiley Periodicals, Inc.
Man-rated flight software for the F-8 DFBW program
NASA Technical Reports Server (NTRS)
Bairnsfather, R. R.
1975-01-01
The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program Assembly Control, simulator configuration control, erasable-memory load generation, change procedures, and anomaly reporting are discussed. The primary verification tools--the all-digital simulator, the hybrid simulator, and the Iron Bird simulator--are described, as well as the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program, in which 42 flights, totaling 58 hours of flight time, were successfully made without any DFCS in-flight software or hardware failures.
Clark, Teresa J; Yoder-Wise, Patricia S
2015-07-01
A well-established charge nurse orientation program was enhanced with the addition of a simulation, addressing three primary populations (the trifocus) with whom charge nurses interact: patients, patients' parents, and other staff members. In this pilot quality improvement project, 20 staff nurses enrolled in the orientation program and were assigned a mentor. Only one participant used the mentorship opportunity; therefore, it is not discussed here. Twelve nurses completed all charge nurse classes and a simulation scenario of caring for a deteriorating infant. The nurses were given an opportunity to reflect on leadership practices after the simulation. Thematic analysis from qualitative, reflective data supported the enhanced understanding of managing complex patients, a code situation, and teams; guiding a team's novice nurse; leading as a charge nurse; and using clinical and critical thinking skills. All nurses reported that the simulation as experiential learning helped them to meet their leadership goals. Copyright 2015, SLACK Incorporated.
Monitoring and analysis of air quality in Riga
NASA Astrophysics Data System (ADS)
Ubelis, Arnolds; Leitass, Andris; Vitols, Maris
1995-09-01
Riga, the capital of Latvia, is a city with nearly 900,000 inhabitants and various highly concentrated industries. Air pollution in Riga is a serious problem affecting health and damaging valuable buildings of historical importance, as acid rain and smog take their toll. Therefore the Air Quality Management System, with significant assistance from the Swedish Government and persistent efforts from Riga City Council, was established in Riga. It contains the INDIC AIRVIRO system, which simulates and evaluates air pollution levels at various locations. It then processes the data in order to predict air quality based on a number of criteria and parameters measured by OPSIS differential absorption instruments, as well as data from the Meteorological Service and results of episodic measurements. The analysis of the results provided by the Riga Air Quality Management System for the first time allows us to begin comprehensive supervision of tropospheric physical, chemical, and photochemical processes in the air of Riga, as well as to assess the influence of local pollution and transboundary transfer. The report contains the actual results of this work and first attempts at analysis, as well as an overview of activities towards research and teaching in the fields of spectroscopy and photochemistry of polluted atmospheres.
1998-04-01
[Garbled excerpt from a military healthcare report: beneficiary population figures (active duty family members; 3,182 retirees; 5,064 family members of retired military; 846 survivors; TDA, 1997); a note that real-world casualty collection, evacuation, and treatment is conducted seamlessly while simulated medical processes are conducted intelligently; and the BJACH mission of providing quality, cost-effective healthcare to the nation's soldiers, their families, and retirees.]
Ghaffari, Mahsa; Tangen, Kevin; Alaraj, Ali; Du, Xinjian; Charbel, Fady T; Linninger, Andreas A
2017-12-01
In this paper, we present a novel technique for automatic parametric mesh generation of subject-specific cerebral arterial trees. This technique generates high-quality and anatomically accurate computational meshes for fast blood flow simulations, extending the scope of 3D vascular modeling to a large portion of cerebral arterial trees. For this purpose, a parametric meshing procedure was developed to automatically decompose the vascular skeleton, extract geometric features, and generate hexahedral meshes using a body-fitted coordinate system that optimally follows the vascular network topology. To validate the anatomical accuracy of the reconstructed vasculature, we performed statistical analysis to quantify the alignment between parametric meshes and raw vascular images using the receiver operating characteristic curve. Geometric accuracy evaluation showed agreement, with an area under the curve value of 0.87, between the constructed mesh and raw MRA data sets. Parametric meshing yielded, on average, 36.6% and 21.7% orthogonal and equiangular skew quality improvement over the unstructured tetrahedral meshes. The parametric meshing and processing pipeline constitutes an automated technique to reconstruct and simulate blood flow throughout a large portion of the cerebral arterial tree, down to the level of pial vessels. This study is the first step towards fast large-scale subject-specific hemodynamic analysis for clinical applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
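The ROC analysis described above treats each voxel of the raw angiogram as ground truth and the mesh's occupancy as a score, then summarizes agreement with the area under the curve. A minimal sketch using the rank-sum identity for AUC; the voxel labels and occupancy scores below are simulated, not the study's data.

```python
import numpy as np

def auc_score(labels, scores):
    """Area under the ROC curve via the Mann-Whitney rank-sum identity."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels).astype(bool)
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = labels.sum(), (~labels).sum()
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(3)
truth = rng.random(10000) < 0.1                    # 10% of voxels are vessel
occupancy = np.where(truth, rng.beta(4, 2, 10000), rng.beta(2, 4, 10000))
print(f"AUC = {auc_score(truth, occupancy):.2f}")
```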
Multi-Sensory Aerosol Data and the NRL NAAPS model for Regulatory Exceptional Event Analysis
NASA Astrophysics Data System (ADS)
Husar, R. B.; Hoijarvi, K.; Westphal, D. L.; Haynes, J.; Omar, A. H.; Frank, N. H.
2013-12-01
Beyond scientific exploration and analysis, multi-sensory observations along with models are finding increasing application in operational air quality management. EPA's Exceptional Event (EE) Rule allows the exclusion of data strongly influenced by impacts from "exceptional events," such as smoke from wildfires or dust from abnormally high winds. The EE Rule encourages the use of satellite observations and other non-standard data along with models as evidence for formal documentation of EE samples for exclusion. Thus, the implementation of the EE Rule is uniquely suited for the direct application of integrated multi-sensory observations and, indirectly, their assimilation into an aerosol simulation model. Here we report the results of a project: NASA and NAAPS Products for Air Quality Decision Making. The project uses observations from multiple satellite sensors, surface-based aerosol measurements, and the NRL Aerosol Analysis and Prediction System (NAAPS) model that assimilates key satellite observations. The satellite sensor data for detecting and documenting smoke and dust events include MODIS AOD and images; the OMI Aerosol Index and tropospheric NO2; and AIRS CO. The surface observations include the EPA regulatory PM2.5 network; the IMPROVE/STN aerosol chemical network; the AIRNOW PM2.5 mass network; and surface meteorological data. Within this application, a crucial role is assigned to the NAAPS model for estimating the surface concentration of windblown dust and biomass smoke. The operational model assimilates quality-assured daily MODIS data with 2DVAR to adjust the model concentrations, and a CALIOP-based climatology to adjust the vertical profiles, at 6-hour intervals. The assimilation of data from multiple satellites significantly contributes to the usefulness of NAAPS for EE analysis. The NAAPS smoke and dust simulations were evaluated using the IMPROVE/STN chemical data. The multi-sensory observations along with the model simulations are integrated into a web-based Exceptional Event Decision System (EE DSS) application program, designed to support air quality analysts at the Federal and Regional EPA offices and the EE-affected States. The EE DSS screening tool automatically identifies the EPA PM2.5 mass samples that are candidates for EE flagging, based mainly on the NAAPS-simulated surface concentrations of dust and smoke. The AQ analysts at the States and the EPA can also use the EE DSS to gather further evidence from the examination of spatio-temporal patterns, the Absorbing Aerosol Index, CO and NO2 concentrations, backward and forward airmass trajectories, and other signatures. Since early 2013, the DSS has been used for the identification and analysis of dozens of events. Hence, the integration of multi-sensory observations and modeling with data assimilation is maturing to support real-world operational AQ management applications. The remaining challenges can be resolved by seeking 'closure' of the system components, i.e., the systematic adjustments needed to reconcile the satellite and surface observations, the emissions, and their integration through a suitable AQ model.
Schmidt, K; Witte, H
1999-11-01
Recently the assumption of the independence of individual frequency components in a signal has been rejected, for example, for the EEG during defined physiological states such as sleep or sedation [9, 10]. Thus, the use of higher-order spectral analysis, capable of detecting interrelations between individual signal components, has proved useful. The aim of the present study was to investigate the quality of various non-parametric and parametric estimation algorithms using simulated as well as true physiological data. We employed standard algorithms available for MATLAB. The results clearly show that parametric bispectral estimation is superior to non-parametric estimation in terms of the quality of peak localisation and the discrimination from other peaks.
NASA Technical Reports Server (NTRS)
Anderson, D.; Hollingsworth, A.; Uppala, S.; Woiceshyn, P.
1987-01-01
The use of scatterometer and altimeter data in wind and wave assimilation, and the benefits this offers for quality assurance and validation of ERS-1 data, were examined. Real-time use of ERS-1 data was simulated through assimilation of Seasat scatterometer data. The potential for quality assurance and validation is demonstrated by documenting a series of substantial problems with the scatterometer data, some of which are known but took years to establish, and some of which are new. A data impact study and an analysis of the performance of ambiguity removal algorithms on real and simulated data were conducted. The impact of the data on analyses and forecasts is large in the Southern Hemisphere, generally small in the Northern Hemisphere, and occasionally large in the Tropics. Tests with simulated data give more optimistic results than tests with real data. Errors in ambiguity removal results occur in clusters. The probabilities which can be calculated for the ambiguous wind directions on ERS-1 contain more information than is given by a simple ranking of the directions.
Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-02-01
New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
Voigt, Jeffrey; Carpenter, Linda; Leuchter, Andrew
2017-01-01
Repetitive Transcranial Magnetic Stimulation (rTMS) is commonly used for the treatment of Major Depressive Disorder (MDD) after patients have failed to benefit from trials of multiple antidepressant medications. No analysis to date has examined the cost-effectiveness of rTMS used earlier in the course of treatment and over a patient's lifetime. We used lifetime Markov simulation modeling to compare the direct costs and quality-adjusted life years (QALYs) of rTMS and medication therapy in patients with newly diagnosed MDD (ages 20-59) who had failed to benefit from one pharmacotherapy trial. Patients' life expectancies, rates of response and remission, and quality of life outcomes were derived from the literature, and treatment costs were based upon published Medicare reimbursement data. Baseline costs, aggregate per-year quality of life assessments (QALYs), Monte Carlo simulation, tornado analysis, assessment of dominance, and one-way sensitivity analysis were also performed. The discount rate applied was 3%. Lifetime direct treatment costs and QALYs identified rTMS as the dominant therapy compared to antidepressant medications (i.e., lower costs with better outcomes) in all age ranges, with costs/improved QALYs ranging from $2,952/0.32 (older patients) to $11,140/0.43 (younger patients). One-way sensitivity analysis demonstrated that the model was most sensitive to the input variables of cost per rTMS session, monthly prescription drug cost, and the number of rTMS sessions per year. rTMS was identified as the dominant therapy compared to antidepressant medication trials over the lifetime of adults with MDD, given current costs of treatment. These models support the use of rTMS after a single failed antidepressant medication trial versus further attempts at medication treatment in adults with MDD.
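As a sketch of the modeling approach, the following minimal two-state Markov cohort model accumulates discounted costs and QALYs; all transition probabilities, per-cycle costs, and utilities are hypothetical placeholders, not the study's calibrated inputs.

```python
import numpy as np

# Minimal two-state Markov cohort sketch (states: depressed, remission).
# All transition probabilities, costs, and utilities are hypothetical.
def lifetime_cost_qaly(p_remit, p_relapse, cost_cycle, utility, years=40, disc=0.03):
    P = np.array([[1 - p_remit, p_remit],        # from depressed
                  [p_relapse, 1 - p_relapse]])   # from remission
    state = np.array([1.0, 0.0])                 # cohort starts depressed
    cost = qaly = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + disc) ** t              # annual discount factor
        cost += d * state @ cost_cycle
        qaly += d * state @ utility
        state = state @ P                        # advance one annual cycle
    return cost, qaly

# Hypothetical comparison: rTMS vs. continued medication trials
c_rtms, q_rtms = lifetime_cost_qaly(0.40, 0.15, np.array([9000.0, 1500.0]),
                                    np.array([0.60, 0.85]))
c_med, q_med = lifetime_cost_qaly(0.25, 0.20, np.array([6000.0, 1200.0]),
                                  np.array([0.60, 0.85]))
print(f"incremental cost {c_rtms - c_med:+.0f}, incremental QALY {q_rtms - q_med:+.2f}")
```

Dominance in this framework simply means one strategy has both lower discounted cost and higher discounted QALYs.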
Quality Assurance Practices in Obstetric Care: A Survey of Hospitals in California.
Lundsberg, Lisbet S; Lee, Henry C; Dueñas, Grace Villarin; Gregory, Kimberly D; Grossetta Nardini, Holly K; Pettker, Christian M; Illuzzi, Jessica L; Xu, Xiao
2018-02-01
To assess hospital practices in obstetric quality management activities and identify institutional characteristics associated with utilization of evidence-supported practices. Data for this study came from a statewide survey of obstetric hospitals in California regarding their organization and delivery of perinatal care. We analyzed responses from 185 hospitals that completed quality assurance sections of the survey to assess their practices in a broad spectrum of quality enhancement activities. The association between institutional characteristics and adoption of evidence-supported practices (ie, those supported by prior literature or recommended by professional organizations as beneficial for improving birth outcome or patient safety) was examined using bivariate analysis and appropriate statistical tests. Most hospitals regularly audited adherence to written protocols regarding critical areas of care; however, 77.7% and 16.8% reported not having written guidelines on diagnosis of labor arrest and management of abnormal fetal heart rate, respectively. Private nonprofit hospitals were more likely to have a written protocol for management of abnormal fetal heart rate (P=.002). One in 10 hospitals (9.7%) did not regularly review cases with significant morbidity or mortality, and only 69.0% regularly tracked indications for cesarean delivery. Moreover, 26.3%, 14.3%, and 8.7% of the hospitals reported never performing interprofessional simulations for eclampsia, shoulder dystocia, or postpartum hemorrhage, respectively. Teaching status was associated with more frequent simulations in these three areas (P≤.04 for all), while larger volume was associated with more frequent simulations for eclampsia (P=.04). Hospitals in California engage in a wide range of practices to assure or improve quality of obstetric care, but substantial variation in practice exists among hospitals. There is opportunity for improvement in adoption of evidence-supported practices.
NASA Technical Reports Server (NTRS)
Trail, M.; Tsimpidi, A. P.; Liu, P.; Tsigaridis, K.; Hu, Y.; Nenes, A.; Russell, A. G.
2013-01-01
Climate change can exacerbate future regional air pollution events by making conditions more favorable for forming high levels of ozone. In this study, we use spectral nudging with WRF to downscale NASA earth system GISS modelE2 results during the years 2006 to 2010 and 2048 to 2052 over the continental United States in order to compare the resulting meteorological fields from the air quality perspective during the four seasons of five-year historic and future climatological periods. GISS results are used as initial and boundary conditions by the WRF RCM to produce hourly meteorological fields. The downscaling technique and choice of physics parameterizations used are evaluated by comparison with in situ observations. This study investigates changes in regional climate conditions down to a 12 km by 12 km resolution, as well as the effect of evolving climate conditions on air quality in major U.S. cities. The high-resolution simulations produce somewhat different results than the coarse-resolution simulations in some regions. Also, through the analysis of the meteorological variables that most strongly influence air quality, we find consistent changes in regional climate that would enhance ozone levels in four regions of the U.S. during fall (Western U.S., Texas, Northeastern U.S., and Southeastern U.S.), one region during summer (Texas), and one region where changes potentially would lead to better air quality during spring (Northeast). We also find that daily peak temperatures tend to increase in most major U.S. cities, which would increase the risk of health problems associated with heat stress. Future work will address a more comprehensive assessment of the emissions and chemistry involved in the formation and removal of air pollutants.
An Improved Power Quality Based Sheppard-Taylor Converter Fed BLDC Motor Drive
NASA Astrophysics Data System (ADS)
Singh, Bhim; Bist, Vashist
2015-12-01
This paper deals with the design and analysis of a power factor correction based Sheppard-Taylor converter fed brushless dc motor (BLDCM) drive. The speed of the BLDCM is controlled by adjusting the dc link voltage of the voltage source inverter (VSI) feeding BLDCM. Moreover, a low frequency switching of the VSI is used for electronically commutating the BLDCM for reduced switching losses. The Sheppard-Taylor converter is designed to operate in continuous conduction mode to achieve an improved power quality at the ac mains for a wide range of speed control and supply voltage variation. The BLDCM drive is designed and its performance is simulated in a MATLAB/Simulink environment to achieve the power quality indices within the limits of the international power quality standard IEC-61000-3-2.
NASA Astrophysics Data System (ADS)
Wang, Y.; Hu, X.; Yang, X.; Xie, G.
2018-04-01
The image quality of a surveying camera affects the stereoscopic positioning accuracy of a remote sensing satellite. The key factors closely related to image quality are the Modulation Transfer Function (MTF), the Signal-to-Noise Ratio (SNR), and the Quantization Bits (QB). Using "Mapping Satellite-1" imagery as the test case, we investigate the effect of image quality on positioning precision without ground control and quantify the relationship between the two. Finally, we verify the validity of the experimental results by simulating on-orbit degradation of the three factors and counting the number of matching points, the mismatch rate, and the matching residuals of the degraded data. The reasons for the variation in positioning precision are analyzed.
Dynamic phantom for radionuclide cardiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nickles, R.J.
1979-06-01
A flow-based phantom has been developed to verify analysis routines most frequently employed in clinical radionuclide cardiology. Ejection-fraction studies by first-pass or equilibrium techniques are simulated, as well as assessment of shunts and cardiac output. This hydraulic phantom, with its valve-selectable dysfunctions, offers a greater role in training than in quality control, as originally intended.
System reliability analysis through corona testing
NASA Technical Reports Server (NTRS)
Lalli, V. R.; Mueller, L. A.; Koutnik, E. A.
1975-01-01
A corona vacuum test facility for nondestructive testing of power system components was built in the Reliability and Quality Engineering Test Laboratories at the NASA Lewis Research Center. The facility was developed to simulate operating temperature and vacuum while monitoring corona discharges with residual gases. The facility is being used to test various high-voltage power system components.
This paper presents an analysis of the CMAQ v4.5 model performance for particulate matter and its chemical components for the simulated year 2001. This is part two of a two-part series of papers that examines the model performance of CMAQ v4.5.
Engineering High Assurance Distributed Cyber Physical Systems
2015-01-15
decisions: number of interacting agents and co-dependent decisions made in real-time without causing interference. To engineer a high assurance DART...environment specification, architecture definition, domain-specific languages, design patterns, code-generation, analysis, test-generation, and simulation...include synchronization between the models and source code, debugging at the model level, expression of the design intent, and quality of service
Development of Game-Like Simulations for Procedural Knowledge in Healthcare Education
ERIC Educational Resources Information Center
Torrente, Javier; Borro-Escribano, Blanca; Freire, Manuel; del Blanco, Ángel; Marchiori, Eugenio J.; Martinez-Ortiz, Iván; Moreno-Ger, Pablo; Fernández-Manjón, Baltasar
2014-01-01
We present EGDA, an educational game development approach focused on the teaching of procedural knowledge using a cost-effective approach. EGDA proposes four tasks: analysis, design, implementation, and quality assurance that are subdivided in a total of 12 subtasks. One of the benefits of EGDA is that anyone can apply it to develop a game since…
Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification
NASA Technical Reports Server (NTRS)
Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle
2011-01-01
NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.
Cost: the missing outcome in simulation-based medical education research: a systematic review.
Zendejas, Benjamin; Wang, Amy T; Brydges, Ryan; Hamstra, Stanley J; Cook, David A
2013-02-01
The costs involved with technology-enhanced simulation remain unknown. Appraising the value of simulation-based medical education (SBME) requires complete accounting and reporting of cost. We sought to summarize the quantity and quality of studies that contain an economic analysis of SBME for the training of health professions learners. We performed a systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsychINFO, Scopus, key journals, and previous review bibliographies through May 2011. Articles reporting original research in any language evaluating the cost of simulation, in comparison with nonsimulation instruction or another simulation intervention, for training practicing and student physicians, nurses, and other health professionals were selected. Reviewers working in duplicate evaluated study quality and abstracted information on learners, instructional design, cost elements, and outcomes. From a pool of 10,903 articles we identified 967 comparative studies. Of these, 59 studies (6.1%) reported any cost elements and 15 (1.6%) provided information on cost compared with another instructional approach. We identified 11 cost components reported, most often the cost of the simulator (n = 42 studies; 71%) and training materials (n = 21; 36%). Ten potential cost components were never reported. The median number of cost components reported per study was 2 (range, 1-9). Only 12 studies (20%) reported cost in the Results section; most reported it in the Discussion (n = 34; 58%). Cost reporting in SBME research is infrequent and incomplete. We propose a comprehensive model for accounting and reporting costs in SBME. Copyright © 2013 Mosby, Inc. All rights reserved.
Karczyńska, Agnieszka S; Czaplewski, Cezary; Krupa, Paweł; Mozolewska, Magdalena A; Joo, Keehyoung; Lee, Jooyoung; Liwo, Adam
2017-12-05
Molecular simulations restrained to single or multiple templates are commonly used in protein-structure modeling. However, the restraints introduce additional barriers, thus impairing the ergodicity of simulations, which can affect the quality of the resulting models. In this work, the effect of restraint types and simulation schemes on ergodicity and model quality was investigated by performing template-restrained canonical molecular dynamics (MD), multiplexed replica-exchange molecular dynamics, and Hamiltonian replica exchange molecular dynamics (HREMD) simulations with the coarse-grained UNRES force field on nine selected proteins, with pseudo-harmonic log-Gaussian (unbounded) or Lorentzian (bounded) restraint functions. The best ergodicity was exhibited by HREMD. It has been found that non-ergodicity does not affect model quality if good templates are used to generate restraints. However, when poor-quality restraints not covering the entire protein are used, the improved ergodicity of HREMD can lead to significantly improved protein models. © 2017 Wiley Periodicals, Inc.
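The HREMD swap step that drives this improved ergodicity can be sketched compactly; the snippet below assumes replicas share a temperature and differ only in a scaling factor lambda on the template-restraint energy, with all names illustrative.

```python
import numpy as np

# Minimal sketch of the Hamiltonian replica-exchange (HREMD) swap test.
# Replicas i and j share the temperature (beta) but differ in the
# restraint weight lambda scaling a template-restraint penalty U_restr.
def accept_swap(beta, lam_i, lam_j, u_restr_xi, u_restr_xj, rng):
    # Exchange energy: the unrestrained force-field terms cancel, leaving
    # only the cross-evaluation of the restraint under swapped weights.
    delta = beta * (lam_i - lam_j) * (u_restr_xj - u_restr_xi)
    return rng.random() < min(1.0, np.exp(-delta))

rng = np.random.default_rng(0)
print(accept_swap(beta=1.0, lam_i=1.0, lam_j=0.5,
                  u_restr_xi=120.0, u_restr_xj=95.0, rng=rng))
```

Because only the restraint term differs between replicas, configurations trapped by a poor restraint can escape through replicas with weaker lambda, which is the mechanism behind the improved models reported above.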
NASA Astrophysics Data System (ADS)
Pedamallu, Chandra Sekhar; Ozdamar, Linet; Weber, Gerhard-Wilhelm; Kropat, Erik
2010-06-01
The system dynamics approach is a holistic way of solving problems in real-time scenarios. It is a powerful methodology and computer simulation modeling technique for framing, analyzing, and discussing complex issues and problems. System dynamics modeling and simulation often underlies a systemic-thinking approach and has become a management and organizational development paradigm. This paper proposes a system dynamics approach for studying the importance of infrastructure facilities to the quality of the primary education system in developing nations. The model is proposed to be built using the Cross Impact Analysis (CIA) method of relating entities and attributes relevant to the primary education system in any given community. We offer a survey to build the cross-impact correlation matrix and, hence, to better understand the primary education system and the importance of infrastructural facilities to the quality of primary education. The resulting model enables us to predict the effects of infrastructural facilities on the community's access to primary education. This may support policy makers in taking more effective actions in campaigns.
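A minimal sketch of how a cross-impact matrix might drive such a simulation is given below; the attribute list, coefficients, step size, and update rule are all hypothetical placeholders, since the paper derives its matrix from survey data.

```python
import numpy as np

# Illustrative cross-impact propagation, assuming a survey-derived matrix A
# where A[i, j] is the signed impact of attribute j on attribute i.
# Attribute names and all coefficients are hypothetical placeholders.
attrs = ["infrastructure", "teacher retention", "enrollment", "education quality"]
A = np.array([[0.0, 0.0, 0.0, 0.0],
              [0.3, 0.0, 0.0, 0.1],
              [0.2, 0.1, 0.0, 0.2],
              [0.4, 0.3, 0.1, 0.0]])
x = np.array([0.5, 0.4, 0.6, 0.5])      # normalized initial attribute levels
for step in range(20):                   # simple forward iteration in time
    x = np.clip(x + 0.05 * (A @ x), 0.0, 1.0)
print(dict(zip(attrs, np.round(x, 2))))
```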
Lindgren, Richard L.; Houston, Natalie A.; Musgrove, MaryLynn; Fahlquist, Lynne S.; Kauffman, Leon J.
2011-01-01
Short-circuit pathways in the flow system, for example karst conduits, could greatly alter the arrival times of contaminants moving with young water to the selected public-supply well, compared to what might be expected from advection in a system without short-circuiting. In a forecasting exercise, the simulated concentrations showed a rapid initial response at the beginning and end of chemical input, followed by a more gradual response as older water moved through the system. The nature of karst groundwater flow, where flow predominantly occurs via conduit flow paths, could lead to relatively rapid water quality responses to land-use changes. Results from the forecasting exercise indicate that timescales for change in the quality of water from the selected public-supply well could be on the order of a few years to decades for land-use changes that occur over days to decades, which has implications for source-water protection strategies that rely on land-use change to achieve water-quality objectives.
NASA Technical Reports Server (NTRS)
Talbot, P. D.; Dugan, D. D.; Chen, R. T. N.; Gerdes, R. M.
1980-01-01
A coordinated analysis and ground simulator experiment was performed to investigate the effects on single-rotor helicopter handling qualities of systematic variations in the main rotor hinge restraint, hub hinge offset, pitch-flap coupling, and blade Lock number. Teetering-rotor, articulated-rotor, and hingeless-rotor helicopters were evaluated by research pilots in special low-level flying tasks involving obstacle avoidance at 60 to 100 knots airspeed. The results of the experiment are in the form of pilot ratings, pilot commentary, and some objective performance measures. Criteria for damping and sensitivity are reexamined when combined with the additional factors of cross coupling due to pitch and roll rates, pitch coupling with collective pitch, and longitudinal static stability. Ratings obtained with and without motion are compared. Acceptable flying qualities were obtained within each rotor type by suitable adjustment of the hub parameters; however, pure teetering rotors were found to lack control power for the tasks. A limit of 0.35 for the coupling parameter L_q/L_p is suggested.
Ikuma, Laura H; Babski-Reeves, Kari; Nussbaum, Maury A
2009-05-01
The objectives of this study were to determine the efficacy of experimental manipulations of psychosocial exposures and to evaluate the sensitivity of a psychosocial questionnaire by determining the factors perceived. A 50-item questionnaire was developed from the job content questionnaire (JCQ) and the quality of worklife survey (QWL). The experiment involved simulated work at different physical and psychosocial levels. Forty-eight participants were exposed to two levels of one psychosocial manipulation (job control, job demands, social support, or time pressure). Significantly different questionnaire responses supported the effectiveness of psychosocial manipulations. Exploratory factor analysis revealed five factors: skill discretion and decision authority, stress level and supervisor support, physical demands, quality of coworker support, and decision-making support. These results suggest that psychosocial factors can be manipulated experimentally, and that questionnaires can distinguish perceptions of these factors. These questionnaires may be used to assess perceptions of psychosocial factors in experimental settings.
Fan, Chihhao; Ko, Chun-Han; Wang, Wei-Shen
2009-04-01
Water quality modeling has been shown to be a useful tool in strategic water quality management. The present study combines the Qual2K model with the HEC-RAS model to assess the water quality of a tidal river in northern Taiwan. The contaminant loadings of biochemical oxygen demand (BOD), ammonia nitrogen (NH(3)-N), total phosphorus (TP), and sediment oxygen demand (SOD) are utilized in the Qual2K simulation. The HEC-RAS model is used to: (i) estimate the hydraulic constants for calculating the atmospheric re-aeration constant; and (ii) calculate the water level profile variation to account for concentration changes as a result of the tidal effect. The results show that HEC-RAS-assisted Qual2K simulations taking the tidal effect into consideration produce water quality indices that, in general, agree with the monitoring data of the river. Comparisons of simulations with different combinations of contaminant loadings demonstrate that BOD is the most important contaminant. Streeter-Phelps simulation (in combination with HEC-RAS) is also performed for comparison, and the results show excellent agreement with the observed data. This paper is the first report of the innovative use of a combination of the HEC-RAS model and the Qual2K model (or the Streeter-Phelps equation) to simulate water quality in a tidal river. The combination is shown to provide an alternative for water quality simulation of a tidal river when available dynamic-monitoring data are insufficient to assess the tidal effect of the river.
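For reference, the Streeter-Phelps comparison rests on the classical dissolved-oxygen sag solution; the sketch below uses illustrative coefficients rather than the study's calibrated values.

```python
import numpy as np

# Streeter-Phelps dissolved-oxygen sag, the classical model run alongside
# Qual2K in the study; all coefficient values below are illustrative.
def do_deficit(t, L0, D0, kd, ka):
    """DO deficit (mg/L) at travel time t (days) for initial BOD L0,
    initial deficit D0, deoxygenation rate kd, re-aeration rate ka."""
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)

t = np.linspace(0.0, 10.0, 101)
deficit = do_deficit(t, L0=20.0, D0=2.0, kd=0.35, ka=0.60)
t_crit = t[np.argmax(deficit)]            # travel time of the sag minimum DO
print(f"critical travel time ~ {t_crit:.1f} d, max deficit {deficit.max():.2f} mg/L")
```

The re-aeration rate ka is where the HEC-RAS hydraulic constants enter, which is why the coupled approach matters in a tidal reach.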
This manuscript provides an overview of the formulation, process considerations, and performance for simulating tropospheric ozone and particulate matter distributions of the Multiscale Air Quality Simulation Platform (MAQSIP). MAQSIP is a comprehensive atmospheric chemistry/tran...
Fast Learning for Immersive Engagement in Energy Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M
The fast computation which is critical for immersive engagement with and learning from energy simulations would be furthered by developing a general method for creating rapidly computed simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost with response times - typically less than one minute of wall-clock time - suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
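A minimal sketch of the reduced-form idea: sample an expensive simulation, fit a fast statistical surrogate, and query the surrogate interactively. The toy response function and the choice of a random-forest regressor are assumptions for illustration; the fact sheet does not specify NREL's model types.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# `slow_simulation` stands in for any expensive energy model and is a
# hypothetical placeholder with a cheap analytic response.
def slow_simulation(x):
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(500, 2))          # sampled scenario inputs
y = slow_simulation(X)                              # offline training runs

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
X_new = rng.uniform(-2.0, 2.0, size=(5, 2))
print(surrogate.predict(X_new))                     # near-instant estimates
```

Holding out some simulation runs for validation gives the uncertainty-quantification layer mentioned above: the surrogate's error on held-out points bounds its trustworthy domain.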
Persistence of initial conditions in continental scale air quality simulations
This study investigates the effect of initial conditions (IC) for pollutant concentrations in the atmosphere and soil on simulated air quality for two continental-scale Community Multiscale Air Quality (CMAQ) model applications. One of these applications was performed for springt...
A Study on the Saving Method of Plate Jigs in Hull Block Butt Welding
NASA Astrophysics Data System (ADS)
Ko, Dae-Eun
2017-11-01
A large number of plate jigs is used for aligning welding lines and controlling welding deformations in the hull block assembly stage. Besides the material cost, the large number of man-hours required to work with plate jigs is one of the obstacles to productivity growth in shipyards. In this study, an analysis method was proposed to simulate the welding deformations of a block butt joint with plate jigs in place. Using the proposed analysis method, an example simulation was performed for an actual panel block joint to investigate how plate jig usage could be reduced. Results show that it is possible to achieve both dimensional accuracy of the hull block and savings in plate jig usage at the same time by deploying the plate jigs at the right places. The proposed analysis method can be used in establishing guidelines for the proper use of plate jigs in the block assembly stage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sintonen, Sakari, E-mail: sakari.sintonen@aalto.fi; Suihkonen, Sami; Jussila, Henri
2014-08-28
The crystal quality of bulk GaN crystals is continuously improving due to advances in GaN growth techniques. Defect characterization of the GaN substrates by conventional methods is impeded by the very low dislocation density, and a large-scale defect analysis method is needed. White beam synchrotron radiation x-ray topography (SR-XRT) is a rapid and non-destructive technique for dislocation analysis on a large scale. In this study, the defect structure of an ammonothermal c-plane GaN substrate was recorded using SR-XRT, and the image contrast caused by the dislocation-induced microstrain was simulated. The simulations and experimental observations agree excellently, and the SR-XRT image contrasts of mixed and screw dislocations were determined. Apart from a few exceptions, defect selective etching measurements were shown to correspond one to one with the SR-XRT results.
Hong, Eun-Mi; Shelton, Daniel; Pachepsky, Yakov A; Nam, Won-Ho; Coppock, Cary; Muirhead, Richard
2017-02-01
Knowledge of the microbial quality of irrigation waters is extremely limited. For this reason, the US FDA has promulgated the Produce Rule, mandating the testing of irrigation water sources for many farms. The rule requires the collection and analysis of at least 20 water samples over two to four years to adequately evaluate the quality of water intended for produce irrigation. The objective of this work was to evaluate the effect of interannual weather variability on surface water microbial quality. We used the Soil and Water Assessment Tool model to simulate E. coli concentrations in the Little Cove Creek; this is a perennial creek located in an agricultural watershed in south-eastern Pennsylvania. The model performance was evaluated using the US FDA regulatory microbial water quality metrics of geometric mean (GM) and the statistical threshold value (STV). Using the 90-year time series of weather observations, we simulated and randomly sampled the time series of E. coli concentrations. We found that weather conditions of a specific year may strongly affect the evaluation of microbial quality and that the long-term assessment of microbial water quality may be quite different from the evaluation based on short-term observations. The variations in microbial concentrations and water quality metrics were affected by location, wetness of the hydrological years, and seasonality, with 15.7-70.1% of samples exceeding the regulatory threshold. The results of this work demonstrate the value of using modeling to design and evaluate monitoring protocols to assess the microbial quality of water used for produce irrigation. Copyright © 2016 Elsevier Ltd. All rights reserved.
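For context, the FDA metrics can be computed directly from a set of E. coli samples; a sketch follows in which the sample values are illustrative, with the GM and STV criteria (126 and 410 CFU/100 mL) taken from the Produce Safety Rule.

```python
import numpy as np

# FDA Produce Safety Rule water-quality metrics from n >= 20 E. coli
# samples (CFU/100 mL): geometric mean (GM) and statistical threshold
# value (STV), the latter estimated as the ~90th percentile under a
# lognormal assumption. Sample values below are illustrative.
samples = np.array([35, 80, 120, 15, 240, 60, 95, 310, 45, 150,
                    70, 25, 180, 55, 400, 90, 130, 20, 75, 260], float)
logs = np.log10(samples)
gm = 10 ** logs.mean()
stv = 10 ** (logs.mean() + 1.282 * logs.std(ddof=1))   # z(0.90) ~ 1.282
print(f"GM = {gm:.0f} (limit 126), STV = {stv:.0f} (limit 410) CFU/100 mL")
```

Because both metrics depend on when the 20 samples happen to be drawn, the interannual weather variability examined in the study translates directly into variability of the regulatory verdict.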
Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver
NASA Technical Reports Server (NTRS)
Hess, R. A.; Malsbury, T.; Atencio, A., Jr.
1992-01-01
A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.
Water quality modeling in the dead end sections of drinking water distribution networks.
Abokifa, Ahmed A; Yang, Y Jeffrey; Lo, Cynthia S; Biswas, Pratim
2016-02-01
Dead-end sections of drinking water distribution networks are known to be problematic zones in terms of water quality degradation. Extended residence time due to water stagnation leads to rapid reduction of disinfectant residuals allowing the regrowth of microbial pathogens. Water quality models developed so far apply spatial aggregation and temporal averaging techniques for hydraulic parameters by assigning hourly averaged water demands to the main nodes of the network. Although this practice has generally resulted in minimal loss of accuracy for the predicted disinfectant concentrations in main water transmission lines, this is not the case for the peripheries of the distribution network. This study proposes a new approach for simulating disinfectant residuals in dead end pipes while accounting for both spatial and temporal variability in hydraulic and transport parameters. A stochastic demand generator was developed to represent residential water pulses based on a non-homogenous Poisson process. Dispersive solute transport was considered using highly dynamic dispersion rates. A genetic algorithm was used to calibrate the axial hydraulic profile of the dead-end pipe based on the different demand shares of the withdrawal nodes. A parametric sensitivity analysis was done to assess the model performance under variation of different simulation parameters. A group of Monte-Carlo ensembles was carried out to investigate the influence of spatial and temporal variations in flow demands on the simulation accuracy. A set of three correction factors were analytically derived to adjust residence time, dispersion rate and wall demand to overcome simulation error caused by spatial aggregation approximation. The current model results show better agreement with field-measured concentrations of conservative fluoride tracer and free chlorine disinfectant than the simulations of recent advection dispersion reaction models published in the literature. Accuracy of the simulated concentration profiles showed significant dependence on the spatial distribution of the flow demands compared to temporal variation. Copyright © 2015 Elsevier Ltd. All rights reserved.
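The stochastic demand generator can be sketched as Lewis-Shedler thinning of a non-homogeneous Poisson process; the diurnal rate function and pulse volumes below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

# Pulse start times drawn from a non-homogeneous Poisson process via
# Lewis-Shedler thinning; rate function and pulse sizes are illustrative.
def rate(t_hours):
    # diurnal demand with morning and evening peaks (pulses per hour)
    return 2.0 + 1.5 * np.exp(-((t_hours % 24 - 7) / 2.0) ** 2) \
               + 1.2 * np.exp(-((t_hours % 24 - 19) / 2.5) ** 2)

def demand_pulses(horizon_h, rng):
    lam_max = 5.0                             # upper bound on rate(t)
    t, pulses = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)   # candidate event time
        if t > horizon_h:
            break
        if rng.random() < rate(t) / lam_max:  # accept with prob rate/lam_max
            pulses.append((t, rng.uniform(5.0, 20.0)))  # (time h, litres)
    return pulses

pulses = demand_pulses(24.0, np.random.default_rng(1))
print(f"{len(pulses)} pulses generated over 24 h")
```

Feeding such pulse trains into the hydraulic model is what preserves the short-timescale stagnation behavior that hourly-averaged demands smooth away.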
Wicklein, Shaun M.; Schiffer, Donna M.
2002-01-01
Hydrologic and water-quality data have been collected within the 177-square-mile Reedy Creek, Florida, watershed, beginning as early as 1939, but the data have not been used to evaluate relations among land use, hydrology, and water quality. A model of the Reedy Creek watershed was developed and applied to the period January 1990 to December 1995 to provide a computational foundation for evaluating the effects of future land-use changes on hydrology and water quality in the watershed. The Hydrological Simulation Program-Fortran (HSPF) model was used to simulate hydrology and water quality of runoff for pervious land areas, impervious land areas, and stream reaches. Six land-use types were used to characterize the hydrology and water quality of pervious and impervious land areas in the Reedy Creek watershed: agriculture, rangeland, forest, wetlands, rapid infiltration basins, and urban areas. Hydrologic routing and water-quality reactions were simulated to characterize hydrologic and water-quality processes and the movement of runoff and its constituents through the main stream channels and their tributaries. Because of the complexity of the stream system within the Reedy Creek Improvement District (RCID) (hydraulic structures, retention ponds) and the anticipated difficulty of modeling the system, an approach of calibrating the model parameters for a subset of the gaged watersheds and confirming the usefulness of the parameters by simulating the remainder of the gaged sites was selected for this study. Two sub-watersheds (Whittenhorse Creek and Davenport Creek) were selected for calibration because both have similar land use to watersheds within the RCID (with the exception of urban areas). Given the lack of available rainfall data, the hydrologic calibration of the Whittenhorse Creek and Davenport Creek sub-watersheds was considered acceptable (for monthly data, correlation coefficients, 0.86 and 0.88, and coefficients of model-fit efficiency, 0.72 and 0.74, respectively). The hydrologic model was tested by applying the parameter sets developed for Whittenhorse Creek and Davenport Creek to other land areas within the Reedy Creek watershed, and by comparing the simulated results to observed data sets for Reedy Creek near Vineland, Bonnet Creek near Vineland, and Reedy Creek near Loughman. The hydrologic model confirmation for Reedy Creek near Vineland (correlation coefficient, 0.91, and coefficient of model-fit efficiency, 0.78, for monthly flows) was acceptable. Flows for Bonnet Creek near Vineland were substantially under-simulated. Consideration of the ground-water contribution to Bonnet Creek could improve the water balance simulation for Bonnet Creek near Vineland. On longer time scales (monthly or over the 72-month simulation period), simulated discharges for Reedy Creek near Loughman agreed well with observed data (correlation coefficient, 0.88). For monthly flows the coefficient of model-fit efficiency was 0.77. On a shorter time scale (less than a month), however, storm volumes were greatly over-simulated and low flows (less than 8 cubic feet per second) were greatly under-simulated. A primary reason for the poor results at low flows is the diversion of an unknown amount of water from the RCID at the Bonnet Creek near Kissimmee site. Selection of water-quality constituents for simulation was based primarily on the availability of water-quality data. Dissolved oxygen, nitrogen, and phosphorus species were simulated.
Representation of nutrient cycling in HSPF also required simulation of biochemical oxygen demand and phytoplankton populations. The correlation coefficient for simulated and observed daily mean dissolved oxygen concentration values at Reedy Creek near Vineland was 0.633. Simulated time series of total phosphorus, phosphate, ammonia nitrogen, and nitrate nitrogen generally agreed well with periodically observed values for the Whittenhorse Creek and Davenport Creek sites. Simulated water-quality c
Noise sensitivity of portfolio selection in constant conditional correlation GARCH models
NASA Astrophysics Data System (ADS)
Varga-Haszonits, I.; Kondor, I.
2007-11-01
This paper investigates the efficiency of minimum variance portfolio optimization for stock price movements following the Constant Conditional Correlation GARCH process proposed by Bollerslev. Simulations show that the quality of portfolio selection can be improved substantially by computing optimal portfolio weights from conditional covariances instead of unconditional ones. Measurement noise can be further reduced by applying a filtering method to the conditional correlation matrix (such as filtering based on Random Matrix Theory). As empirical support for the simulation results, the analysis is also carried out for a time series of S&P 500 stock prices.
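The underlying optimization is the textbook global minimum-variance problem; a sketch follows, where in the paper's setting the covariance input would be the conditional covariance implied by the CCC-GARCH model at each time step (the matrix below is illustrative).

```python
import numpy as np

# Global minimum-variance weights w = C^{-1} 1 / (1' C^{-1} 1), computed
# from a covariance estimate C; here C is an illustrative 3-asset matrix,
# whereas the paper would supply the conditional covariance at time t.
def min_variance_weights(C):
    ones = np.ones(C.shape[0])
    w = np.linalg.solve(C, ones)          # solve rather than invert explicitly
    return w / w.sum()                    # normalize weights to sum to one

C = np.array([[0.040, 0.006, 0.004],
              [0.006, 0.090, 0.010],
              [0.004, 0.010, 0.060]])
print(min_variance_weights(C))
```

Noise sensitivity enters through C: estimation error in the covariance propagates directly into the weights, which is why filtering the correlation matrix helps.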
Multisite Evaluation of APEX for Water Quality: I. Best Professional Judgment Parameterization.
Baffaut, Claire; Nelson, Nathan O; Lory, John A; Senaviratne, G M M M Anomaa; Bhandari, Ammar B; Udawatta, Ranjith P; Sweeney, Daniel W; Helmers, Matt J; Van Liew, Mike W; Mallarino, Antonio P; Wortmann, Charles S
2017-11-01
The Agricultural Policy Environmental eXtender (APEX) model is capable of estimating edge-of-field water, nutrient, and sediment transport and is used to assess the environmental impacts of management practices. The current practice is to fully calibrate the model for each site simulation, a task that requires resources and data not always available. The objective of this study was to compare model performance for flow, sediment, and phosphorus transport under two parameterization schemes: a best professional judgment (BPJ) parameterization based on readily available data and a fully calibrated parameterization based on site-specific soil, weather, event flow, and water quality data. The analysis was conducted using 12 datasets at four locations representing poorly drained soils and row-crop production under different tillage systems. Model performance was based on the Nash-Sutcliffe efficiency (NSE), the coefficient of determination (R2), and the regression slope between simulated and measured annualized loads across all site years. Although the BPJ model performance for flow was acceptable (NSE = 0.7) at the annual time step, calibration improved it (NSE = 0.9). Acceptable simulation of sediment and total phosphorus transport (NSE = 0.5 and 0.9, respectively) was obtained only after full calibration at each site. Given the unacceptable performance of the BPJ approach, uncalibrated use of APEX for planning or management purposes may be misleading. Model calibration with water quality data prior to using APEX for simulating sediment and total phosphorus loss is essential. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
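For readers unfamiliar with the headline metric, a minimal sketch of the Nash-Sutcliffe efficiency follows; the load values are illustrative placeholders.

```python
import numpy as np

# Nash-Sutcliffe efficiency used to judge the BPJ and calibrated APEX runs:
# NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Illustrative annualized loads; NSE = 1 is a perfect match, while
# NSE <= 0 means the model is no better than predicting the observed mean.
print(nse([12.0, 30.0, 18.0, 44.0], [14.0, 27.0, 20.0, 40.0]))
```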
Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry
2015-10-01
To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.
Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations
NASA Astrophysics Data System (ADS)
Niemeier, Wolfgang; Tengen, Dieter
2017-06-01
In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (Guidelines to the Expression of Uncertainty in Measurements). This approach is well established in metrology but rarely adopted within geodesy. The second step consists of Monte-Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common approach of variance propagation and quality assessment of adjustment parameters via their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the "Metsähovi Fundamental Station", Finland is used, where classical geodetic observations are combined with GNSS data.
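A minimal sketch of the two-step idea, with a one-coordinate least-squares "network" standing in for the full adjustment chain; the noise levels and the rectangular systematic effect are illustrative GUM-style assumptions.

```python
import numpy as np

# Step 1: model raw-data uncertainty with GUM-style distributions.
# Step 2: propagate each realization through the adjustment and keep
# the resulting point cloud. Values below are illustrative.
rng = np.random.default_rng(42)
n_runs = 10_000
A = np.array([[1.0], [1.0], [1.0]])          # design matrix: 3 obs, 1 coord
truth = 100.000                              # true coordinate (m)

estimates = np.empty(n_runs)
for k in range(n_runs):
    noise = rng.normal(0.0, 0.002, 3)        # random instrument noise (m)
    bias = rng.uniform(-0.001, 0.001)        # rectangular systematic effect
    l = truth + noise + bias                 # simulated raw observations
    estimates[k], *_ = np.linalg.lstsq(A, l, rcond=None)[0]

# The spread of the cloud is the coordinate uncertainty; a 95% interval
# replaces the classical covariance-based confidence region.
print(np.percentile(estimates, [2.5, 97.5]))
```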
NASA Astrophysics Data System (ADS)
Syahroni, N.; Hartono, A. B. W.; Murtedjo, M.
2018-03-01
In the ship fabrication industry, welding is the most critical stage. If the quality of welding in ship fabrication is not good, it will affect the strength and overall appearance of the structure. Among the factors that affect welding quality are residual stress and distortion. In this research, welding simulations were performed on the inner bottom construction of the survey ship Geomarin IV using shell elements, with variations in the welding sequence. The welding simulations produced the highest peak temperature, 2490 K, for variation 4, while the lowest peak temperature, 2339 K, was produced by variation 2. After the welding simulation, the residual stresses and distortion were simulated. The smallest maximum tensile residual stress found in the inner bottom construction is 375.23 MPa, with a peak compressive residual stress of -20.18 MPa; these residual stresses are obtained from variation 3. The distortion occurring in the inner bottom construction is 4.2 mm at X = 720 mm and 4.92 mm at X = -720 mm, also obtained from variation 3. Near the welding area, the distortion reaches its minimum, because the stiffeners, in the form of frames, serve as anchors.
Adams, D. Briane; Bauer, Daniel P.; Dale, Robert H.; Steele, Timothy Doak
1983-01-01
Development of coal resources and associated economy is accelerating in the Yampa River basin in northwestern Colorado and south-central Wyoming. Increased use of the water resources of the area will have a direct impact on their quantity and quality. As part of 18 surface-water projects, 35 reservoirs have been proposed with a combined total storage of 2.18 million acre-feet, 41% greater than the mean annual outflow from the basin. Three computer models were used to demonstrate methods of evaluating future impacts of reservoir development in the Yampa River basin. Four different reservoir configurations were used to simulate the effects of different degrees of proposed reservoir development. A multireservoir-flow model included both within-basin and transmountain diversions. Simulations indicated that in many cases diversion amounts would not be available for either type of diversion. A corresponding frequency analysis of reservoir storage levels indicated that most reservoirs would be operating with small percentages of total capacities and generally with less than 20% of conservation-pool volumes. Simulations using a dissolved-solids model indicated that extensive reservoir development could increase average annual concentrations at most locations. Simulations using a single-reservoir model indicated no significant occurrence of water-temperature stratification in most reservoirs due to limited reservoir storage. (USGS)
Ortiz, Roderick F.
2013-01-01
The purpose of the Arkansas Valley Conduit (AVC) is to deliver water for municipal and industrial use within the boundaries of the Southeastern Colorado Water Conservancy District. Water supplied through the AVC would serve two needs: (1) to supplement or replace existing poor-quality water to communities downstream from Pueblo Reservoir; and (2) to meet a portion of the AVC participants’ projected water demands through 2070. The Bureau of Reclamation (Reclamation) initiated an Environmental Impact Statement (EIS) to address the potential environmental consequences associated with constructing and operating the proposed AVC, entering into a conveyance contract for the Pueblo Dam north-south outlet works interconnect (Interconnect), and entering into a long-term excess capacity master contract (Master Contract). Operational changes, as a result of implementation of proposed EIS alternatives, could change the hydrodynamics and water-quality conditions in Pueblo Reservoir. An interagency agreement was initiated between Reclamation and the U.S. Geological Survey to accurately simulate hydrodynamics and water quality in Pueblo Reservoir for projected demands associated with four of the seven proposed EIS alternatives. The four alternatives submitted to the USGS for scenario simulation included various combinations (action or no action) of the proposed Arkansas Valley Conduit, Master Contract, and Interconnect options. The four alternatives were the No Action, Comanche South, Joint Use Pipeline North, and Master Contract Only. Additionally, scenario simulations were done that represented existing conditions (Existing Conditions scenario) in Pueblo Reservoir. Water-surface elevations, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, total iron, and algal biomass (measured as chlorophyll-a) were simulated. Each of the scenarios was simulated for three contiguous water years representing a wet, average, and dry annual hydrologic cycle. Each selected simulation scenario also was evaluated for both direct/indirect effects and cumulative effects. Analysis of the results for the direct/indirect- and cumulative-effects analyses indicated that, in general, the results were similar for most of the scenarios, and comparisons in this report focused on results from the direct/indirect-effects analyses. Scenario simulations that represented existing conditions in Pueblo Reservoir were compared to the No Action scenario to assess changes in water quality from current demands (2006) to projected demands in 2070. Overall, comparisons of the results between the Existing Conditions and the No Action scenarios for water-surface elevations, water temperature, and dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, and total iron concentrations indicated that the annual median values generally were similar for all three simulated years. Additionally, algal groups and chlorophyll-a concentrations (algal biomass) were similar for the Existing Conditions and the No Action scenarios at site 7B in the epilimnion for the simulated period (Water Year 2000 through 2002). The No Action scenario also was compared individually to the Comanche South, Joint Use Pipeline North, and Master Contract Only scenarios.
These comparisons were made to describe changes in the annual median, 85th percentile, or 15th percentile concentration between the No Action scenario and each of the other three simulation scenarios. Simulated water-surface elevations, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, total iron, algal groups, and chlorophyll-a concentrations in Pueblo Reservoir generally were similar between the No Action scenario and each of the other three simulation scenarios.
Wettability of graphitic-carbon and silicon surfaces: MD modeling and theoretical analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramos-Alvarado, Bladimir; Kumar, Satish; Peterson, G. P.
2015-07-28
The wettability of graphitic carbon and silicon surfaces was numerically and theoretically investigated. A multi-response method has been developed for the analysis of conventional molecular dynamics (MD) simulations of droplets wettability. The contact angle and indicators of the quality of the computations are tracked as a function of the data sets analyzed over time. This method of analysis allows accurate calculations of the contact angle obtained from the MD simulations. Analytical models were also developed for the calculation of the work of adhesion using the mean-field theory, accounting for the interfacial entropy changes. A calibration method is proposed to provide better predictions of the respective contact angles under different solid-liquid interaction potentials. Estimations of the binding energy between a water monomer and graphite match those previously reported. In addition, a breakdown in the relationship between the binding energy and the contact angle was observed. The macroscopic contact angles obtained from the MD simulations were found to match those predicted by the mean-field model for graphite under different wettability conditions, as well as the contact angles of Si(100) and Si(111) surfaces. Finally, an assessment of the effect of the Lennard-Jones cutoff radius was conducted to provide guidelines for future comparisons between numerical simulations and analytical models of wettability.
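The link between work of adhesion and contact angle used in such mean-field analyses is, in its simplest form, the Young-Dupré relation; the sketch below uses that textbook form with illustrative adhesion values, whereas the paper's model additionally accounts for interfacial entropy changes.

```python
import numpy as np

# Young-Dupre relation W_adh = gamma_lv * (1 + cos(theta)), inverted to
# give the contact angle. gamma_lv ~ 0.0728 N/m for water at room
# temperature; the adhesion values below are illustrative.
def contact_angle_deg(w_adh, gamma_lv=0.0728):
    return np.degrees(np.arccos(w_adh / gamma_lv - 1.0))

for w in (0.06, 0.10, 0.13):                 # work of adhesion (J/m^2)
    print(f"W = {w:.2f} J/m^2 -> theta ~ {contact_angle_deg(w):.0f} deg")
```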
Zhou, Y; Murata, T; Defanti, T A
2000-01-01
Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time, and networking features in these systems. Net-VEs place high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (Narrative Immersive Constructionist/Collaborative Environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called CAVEs (CAVE Automatic Virtual Environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: the Transmission Control Protocol (TCP). We show the possibility analysis based on the EFTN model of the CAVE. Then, using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects, and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
Optimization of porthole die geometrical variables by Taguchi method
NASA Astrophysics Data System (ADS)
Gagliardi, F.; Ciancio, C.; Ambrogio, G.; Filice, L.
2017-10-01
Porthole die extrusion is commonly used to manufacture hollow profiles made of lightweight alloys for numerous industrial applications. The reliability of extruded parts is strongly affected by the quality of the longitudinal and transversal seam welds. Accordingly, the die geometry must be designed correctly and the process parameters must be selected properly to achieve the desired product quality. In this study, numerical 3D simulations have been created and run to investigate the role of various geometrical variables on punch load and maximum pressure inside the welding chamber. These are important outputs to take into account, affecting, respectively, the necessary capacity of the extrusion press and the quality of the welding lines. The Taguchi technique has been used to reduce the number of numerical simulations required to consider the influence of twelve different geometric variables. Moreover, analysis of variance (ANOVA) has been implemented to analyze the effect of each input parameter on the two responses individually. The methodology has then been utilized to determine the optimal process configuration, optimizing each of the two investigated process outputs individually. Finally, the responses of the optimized parameters have been verified through finite element simulations, which approximated the predicted values closely. This study shows the feasibility of the Taguchi technique for predicting performance and optimization, and therefore for improving the design of a porthole extrusion process.
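A typical Taguchi analysis step is the conversion of replicate responses into a signal-to-noise ratio; the sketch below uses the smaller-is-better form with hypothetical punch-load values, since the paper does not list its raw responses.

```python
import numpy as np

# Smaller-is-better signal-to-noise ratio used in Taguchi analysis,
# S/N = -10 log10(mean(y^2)); y could be the punch load at a given
# combination of geometric factor levels. Values are hypothetical.
def sn_smaller_is_better(y):
    y = np.asarray(y, float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Two hypothetical levels of one geometric factor, three replicates each;
# the level with the higher S/N ratio is preferred.
print(sn_smaller_is_better([410.0, 395.0, 420.0]))   # level 1 punch loads
print(sn_smaller_is_better([360.0, 372.0, 355.0]))   # level 2 punch loads
```

Running such comparisons over an orthogonal array is what lets twelve factors be screened with far fewer simulations than a full factorial design.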
Huiliang, Wang; Zening, Wu; Caihong, Hu; Xinzhong, Du
2015-09-01
Nonpoint source (NPS) pollution is considered the main reason for water quality deterioration; thus, quantifying NPS loads reliably is the key to implementing watershed management practices. In this study, water quality and NPS loads from a watershed with limited data availability were studied in a mountainous area in China. Instantaneous water discharge was measured through the velocity-area method, and samples were taken for water quality analysis on both flood and nonflood days in 2010. The streamflow simulated by the Hydrological Simulation Program-Fortran (HSPF) from 1995 to 2013 and a regression model were used to estimate total annual loads of various water quality parameters. The concentrations of total phosphorus (TP) and total nitrogen (TN) were much higher during the flood seasons, but the concentrations of ammonia nitrogen (NH3-N) and nitrate nitrogen (NO3-N) were lower during the flood seasons. Nevertheless, only the TP concentration was positively correlated with the flow rate. The fluctuation of the annual load from this watershed was significant. Statistical results indicated the significant contribution of pollutant fluxes during flood seasons to annual fluxes. The loads of TP, TN, NH3-N, and NO3-N in the flood seasons accounted for 58-85%, 60-82%, 63-88%, and 64-81% of the total annual loads, respectively. This study presents a new method for estimating water and NPS loads in watersheds with limited data availability, which simplifies the data collection needed for watershed modeling and overcomes the scale limitations of field-experiment methods.
Probability model for atmospheric sulfur dioxide concentrations in the area of Venice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buttazzoni, C.; Lavagnini, I.; Marani, A.
1986-09-01
This paper deals with a comparative screening of existing air quality models based on their ability to simulate the distribution of sulfur dioxide data in the Venetian area. Investigations have been carried out on sulfur dioxide dispersion in the atmosphere of the Venetian area. The studies have been mainly focused on transport models (Gaussian, plume and K-models) aiming at meaningful correlations of sources and receptors. Among the results, a noteworthy disagreement of simulated and experimental data, due to the lack of thorough knowledge of source field conditions and of local meteorology of the sea-land transition area, has been shown. Investigations with receptor oriented models (based, e.g., on time series analysis, Fourier analysis, or statistical distributions) have also been performed.
Design and finite element analysis of micro punch CNC machine modeling for medical devices
NASA Astrophysics Data System (ADS)
Pranoto, Sigiet Haryo; Mahardika, Muslim
2018-03-01
Research on micromanufacturing has been conducted. As miniaturization and weight reduction of various industrial products continue to be developed, machines with high accuracy and good machining quality are increasingly needed. This research includes the design and simulation of a micro punch CNC machine with a pneumatic system, using Abaqus. The article concerns the modeling and simulation of punching a 500 µm thick titanium miniplate at a pressure of 0.6 MPa. The study analyzes the von Mises stress, safety factor, and displacement while the machine carries the punching load. The results give a punching reaction of 0.5 MPa at the punch tip and a maximum displacement of 3.237 × 10^-1 mm. The safety factor is over 12, which is considered safe for the manufacturing process.
Introduction and analysis of several FY3C-MWHTS cloud/rain screening methods
NASA Astrophysics Data System (ADS)
Li, Xiaoqing
2017-04-01
Data assimilation of satellite microwave sounder observations is very important for numerical weather prediction. Fengyun-3C (FY-3C), launched in September 2013, carries two such sounders: MWTS (Microwave Temperature Sounder) and MWHTS (Microwave Humidity and Temperature Sounder). These data must be quality-controlled before assimilation, and cloud/rain detection is one of the crucial steps. This paper introduces different cloud/rain detection methods based on MWHTS, VIRR (Visible and Infrared Radiometer), and MWRI (Microwave Radiation Imager) observations. We designed six cloud/rain detection combinations and analyzed the application effect of these schemes. The difference between observations and model simulations for the FY-3C MWHTS channels was calculated as a parameter for the analysis. Both the RTTOV and CRTM fast radiative transfer models were used to simulate the radiances of the MWHTS channels.
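A minimal sketch of the observation-minus-simulation parameter used for cloud/rain screening (the 3 K threshold and brightness temperatures are illustrative assumptions, not values from the paper):

    import numpy as np

    # Illustrative brightness temperatures (K) for one MWHTS channel
    tb_observed  = np.array([248.3, 251.0, 246.7, 259.8, 250.1])  # satellite observations
    tb_simulated = np.array([248.9, 250.4, 247.1, 251.2, 249.8])  # clear-sky RTTOV/CRTM-type simulation

    # O-B departure: large departures suggest cloud/rain contamination,
    # because the simulation assumes clear-sky conditions
    departure = tb_observed - tb_simulated
    cloud_flag = np.abs(departure) > 3.0   # threshold is purely illustrative

    print(departure)     # the 8.6 K departure in sample 4 gets flagged
    print(cloud_flag)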
NASA Astrophysics Data System (ADS)
Wu, Chen-Yu; Fan, Chihhao
2017-04-01
To assure river water quality, the Taiwan government has established many pollution control strategies and invested heavily in them. Despite these efforts, many rivers still suffer from severe pollution because of massive discharges of domestic and industrial wastewater without proper treatment. A comprehensive evaluation tool is therefore required to assess the suitability of water pollution control strategies. The purpose of this study is to quantify the potential strategic benefits by applying water quality modelling integrated with cost-benefit analysis to simulate scenarios based on regional development planning. Erhjen Creek is selected as the study example because it is a major river in southern Taiwan and its riverine environment strongly affects the neighboring population. For strategy assessment, we established a QUAL2K model of Erhjen Creek and conducted cost-benefit analyses of the proposed strategies. In the water quality simulation, HEC-RAS was employed to calculate the hydraulic parameters and the dilution impact of the tidal effect in the downstream section. Daily pollution loadings were obtained from the Water Pollution Control Information System maintained by the Taiwan EPA, and the wastewater delivery ratios were calculated by comparing the occurrence of pollution loadings with the monitoring data. In the cost-benefit analysis, we adopted the market valuation method, setting an analysis period of 65 years and a discount rate of 2.59%. Capital investments were the costs of design, construction, operation, and maintenance for each project in the Erhjen Creek catchment. In model calibration and verification, the mean absolute percentage errors (MAPEs) were 21.4% and 25.5%, respectively, which met the prescribed acceptance criterion of 50%. The model was applied to simulate water quality under various pollution control policies and engineering projects in Erhjen Creek. The overall improvements in BOD, SS, and NH3-N were estimated as 36.2%, 27.7%, and 29.2%, respectively. The net present value (i.e., the economically based environmental impact) becomes positive in the sixtieth year under the original government planning. We designed two scenarios for further comparison: (i) improving the treatment efficiency of pollution control facilities, and (ii) biogas-based power generation using livestock manure. If the government budget is not a limiting factor, improving the efficiency of sewage treatment plants brings the break-even point between payments and revenues (i.e., the net present value in this study) three years earlier. In the biogas-based power generation scenario, if all pig farms with more than 2,000 animals install on-site power generation equipment, BOD will further improve by 9% and the payback period will be shortened by one year. If all the manure waste from pig farms is collected for electricity generation, the BOD river pollution index is estimated to improve to the lightly polluted category for more than half the length of Erhjen Creek. In short, water quality modelling can not only assess the contributions of related projects but also help establish a practical pollution reduction strategy through cost-benefit analysis, allowing decision-makers to find a suitable pollution reduction plan that yields the greatest benefit to river water quality.
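The MAPE criterion used for calibration and verification above is straightforward to compute; a minimal sketch with illustrative concentrations (not the study's data):

    import numpy as np

    def mape(observed, simulated):
        """Mean absolute percentage error between observed and simulated series."""
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 100.0 * np.mean(np.abs((observed - simulated) / observed))

    # Illustrative BOD concentrations (mg/L) at monitoring stations
    obs = [5.2, 7.8, 6.1, 9.4]
    sim = [4.6, 8.9, 5.5, 11.0]
    print(f"MAPE = {mape(obs, sim):.1f}%")  # acceptable if below the 50% criterion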
NASA Astrophysics Data System (ADS)
Koliopoulos, T. C.; Koliopoulou, G.
2007-10-01
We present an input-output solution for simulating the behavior and optimized physical needs of an environmental system. The simulations and numerical analysis determined the boundary loads and the areas required to interact for the proper physical operation of a complicated environmental system. A case study was conducted to simulate the optimum balance of an environmental system based on an artificial-intelligence-based, multi-interacting input-output numerical scheme. The numerical results focused on possible further environmental management techniques, with the objective of minimizing risks and the associated environmental impact so as to protect public health and the environment. Our conclusions allow the associated risks to be minimized, focusing on probable emergency cases to protect the surrounding anthropogenic and natural environment. The lining magnitude could therefore be determined for any associated technical works needed to support the environmental system under examination, taking into account its particular boundary requirements and constraints.
Image quality specification and maintenance for airborne SAR
NASA Astrophysics Data System (ADS)
Clinard, Mark S.
2004-08-01
Specification, verification, and maintenance of image quality over the lifecycle of an operational airborne SAR begin with the specification for the system itself. Verification of image-quality-oriented specification compliance can be enhanced by including a specification requirement that a vendor provide appropriate imagery at the various phases of the system life cycle. The nature and content of the imagery appropriate for each stage of the process depend on the nature of the test, the economics of collection, and the availability of techniques to extract the desired information from the data. At the earliest lifecycle stages, Concept and Technology Development (CTD) and System Development and Demonstration (SDD), the test set could include simulated imagery to demonstrate the mathematical and engineering concepts being implemented, thus allowing demonstration of compliance, in part, through simulation. For Initial Operational Test and Evaluation (IOT&E), imagery collected from precisely instrumented test ranges and from targets of opportunity consisting of a priori or a posteriori ground-truthed cultural and natural features is of value to the analysis of product quality compliance. Regular monitoring of image quality is possible using operational imagery and automated metrics; more precise measurements can be performed with imagery of instrumented scenes, when available. A survey of image quality measurement techniques is presented, along with a discussion of the challenges of managing an airborne SAR program with the scarce resources of time, money, and ground-truthed data. Recommendations are provided that should allow an improvement in the product quality specification and maintenance process with a minimal increase in resource demands on the customer, the vendor, the operational personnel, and the asset itself.
The business case for quality improvement: oral anticoagulation for atrial fibrillation.
Rose, Adam J; Berlowitz, Dan R; Ash, Arlene S; Ozonoff, Al; Hylek, Elaine M; Goldhaber-Fiebert, Jeremy D
2011-07-01
The potential to save money within a short time frame provides a more compelling "business case" for quality improvement than merely demonstrating cost-effectiveness. Our objective was to demonstrate the potential for cost savings from improved control in patients anticoagulated for atrial fibrillation. Our population consisted of 67,077 Veterans Health Administration patients anticoagulated for atrial fibrillation between October 1, 2006, and September 30, 2008. We simulated the number of adverse events and their associated costs and utilities, both before and after various degrees of improvement in percent time in therapeutic range (TTR). The simulation had a 2-year time horizon, and costs were calculated from the perspective of the payer. In the base-case analysis, improving TTR by 5% prevented 1114 adverse events, including 662 deaths; it gained 863 quality-adjusted life-years and saved $15.9 million compared with the status quo, not accounting for the cost of the quality improvement program. Improving TTR by 10% prevented 2087 events, gained 1606 quality-adjusted life-years, and saved $29.7 million. In sensitivity analyses, costs were most sensitive to the estimated risk of stroke and the expected stroke reduction from improved TTR. Utilities were most sensitive to the estimated risk of death and the expected mortality benefit from improved TTR. A quality improvement program to improve anticoagulation control probably would be cost-saving for the payer, even if it were only modestly effective in improving control and even without considering the value of improved health. This study demonstrates how to make a business case for a quality improvement initiative.
Guidelines for the analysis of free energy calculations.
Klimovich, Pavel V; Shirts, Michael R; Mobley, David L
2015-05-01
Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, and which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
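As a minimal illustration of the thermodynamic integration estimator mentioned above (synthetic ⟨dU/dλ⟩ values; this is not output of alchemical-analysis.py):

    import numpy as np

    # Alchemical coupling parameter and synthetic ensemble averages <dU/dlambda> (kcal/mol)
    lam      = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    dudl     = np.array([12.4, 6.1, 1.8, -2.3, -5.7])
    dudl_err = np.array([0.3, 0.2, 0.2, 0.2, 0.3])   # standard error per lambda window

    # Thermodynamic integration: dG = integral of <dU/dlambda> over lambda (trapezoid rule)
    dG = np.trapz(dudl, lam)

    # Propagate per-window uncertainties through the trapezoid weights
    w = np.empty_like(lam)
    w[0], w[-1] = (lam[1] - lam[0]) / 2, (lam[-1] - lam[-2]) / 2
    w[1:-1] = (lam[2:] - lam[:-2]) / 2
    dG_err = np.sqrt(np.sum((w * dudl_err) ** 2))

    print(f"dG = {dG:.2f} +/- {dG_err:.2f} kcal/mol")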
NASA Astrophysics Data System (ADS)
Vanrolleghem, Peter A.; Mannina, Giorgio; Cosenza, Alida; Neumann, Marc B.
2015-03-01
Sensitivity analysis represents an important step in improving the understanding and use of environmental models. Indeed, by means of global sensitivity analysis (GSA), modellers may identify both important (factor prioritisation) and non-influential (factor fixing) model factors. No general rule has yet been defined for verifying the convergence of GSA methods. In order to fill this gap, this paper presents a convergence analysis of three widely used GSA methods (SRC, Extended FAST, and Morris screening) for an urban drainage stormwater quality-quantity model. After convergence was achieved, the results of the methods were compared. In particular, a discussion of the peculiarities, applicability, and reliability of the three methods is presented. Moreover, a graphical, Venn-diagram-based classification scheme and a precise terminology for better identifying important, interacting, and non-influential factors for each method are proposed. In terms of convergence, it was shown that sensitivity indices related to factors of the quantity model achieve convergence faster. Results for the Morris screening method deviated considerably from the other methods. Factors related to the quality model require a much higher number of simulations than suggested in the literature for achieving convergence with this method. In fact, the results show that the term "screening" is used improperly, as the method may exclude important factors from further analysis. Moreover, for the presented application, the convergence analysis shows more stable sensitivity coefficients for the Extended FAST method than for SRC and Morris screening. Substantial agreement in terms of factor fixing was found between the Morris screening and Extended FAST methods. In general, the water quality related factors exhibited more important interactions than factors related to water quantity. Furthermore, in contrast to water quantity model outputs, water quality model outputs were found to be characterised by high non-linearity.
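For the SRC method named above, the coefficients and a simple convergence check can be sketched as follows (toy linear model, not the urban drainage model of the study):

    import numpy as np

    def src(x, y):
        """Standardized regression coefficients: beta_i * std(x_i) / std(y)."""
        design = np.column_stack([np.ones(len(y)), x])
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        return beta[1:] * x.std(axis=0) / y.std()

    rng = np.random.default_rng(1)
    toy_model = lambda x: 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.1 * x[:, 2]

    # Convergence check: the SRC estimates should stabilise as the sample grows
    for n in (50, 500, 5000):
        x = rng.uniform(0.0, 1.0, size=(n, 3))
        print(n, np.round(src(x, toy_model(x)), 3))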
SIMULATION OF AEROSOL DYNAMICS: A COMPARATIVE REVIEW OF ALGORITHMS USED IN AIR QUALITY MODELS
A comparative review of algorithms currently used in air quality models to simulate aerosol dynamics is presented. This review addresses coagulation, condensational growth, nucleation, and gas/particle mass transfer. Two major approaches are used in air quality models to repres...
Dynamic Evaluation of Long-Term Air Quality Model Simulations Over the Northeastern U.S.
Dynamic model evaluation assesses a modeling system's ability to reproduce changes in air quality induced by changes in meteorology and/or emissions. In this paper, we illustrate various approaches to dynamic model evaluation utilizing 18 years of air quality simulations perform...
Integrating Reliability Analysis with a Performance Tool
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael
1995-01-01
A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error, the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves, one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
NASA Technical Reports Server (NTRS)
Korsch, D.
1979-01-01
A grazing incidence telescope with six nested subsystems is investigated through the effects of misalignment and surface deformations on its image quality. The axial rms spot size serves as the measure of image quality. The surface deformations are simulated by ellipsoidal and sinusoidal deviation elements. Each type of defect is first analyzed in a single two-element system. The full nested system is then analyzed in the presence of all possible defects on all twelve elements, with the magnitude of the defects randomized within a given upper limit.
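The rms spot size metric itself is simple to compute from ray-trace intersection points; a minimal sketch with synthetic ray data:

    import numpy as np

    # Synthetic ray intersection points (mm) in the focal plane
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 0.02, 500)
    y = rng.normal(0.0, 0.02, 500)

    # RMS spot radius about the centroid of the ray bundle
    r2 = (x - x.mean()) ** 2 + (y - y.mean()) ** 2
    rms_spot = np.sqrt(r2.mean())
    print(f"RMS spot radius: {rms_spot * 1e3:.1f} micrometers")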
Radi, Marjan; Dezfouli, Behnam; Abu Bakar, Kamalrulnizam; Abd Razak, Shukor
2014-01-01
Network connectivity and link quality information are fundamental requirements for wireless sensor network protocols to perform their desired functions. Most existing discovery protocols have focused only on the neighbor discovery problem, while a few provide integrated neighbor search and link estimation. As these protocols require careful parameter adjustment before network deployment, they cannot provide scalable and accurate network initialization in large-scale dense wireless sensor networks with random topology. Furthermore, the performance of these protocols has not yet been fully evaluated. In this paper, we perform a comprehensive simulation study of the efficiency of employing adaptive protocols, compared to existing nonadaptive protocols, for initializing sensor networks with random topology. To this end, we propose adaptive network initialization protocols that integrate initial neighbor discovery with the link quality estimation process to initialize large-scale dense wireless sensor networks without requiring any parameter adjustment before network deployment. To the best of our knowledge, this work is the first attempt to provide a detailed simulation study of the performance of integrated neighbor discovery and link quality estimation protocols for initializing sensor networks. This study can help system designers determine the most appropriate approach for different applications.
The production of graphene-family nanoparticles (GFNs) has increased appreciably in recent years. Among GFNs, graphene oxide (GO) is one of the most highly studied members due to its inexpensive synthesis compared to graphene, its stability in aqueous media, and its broad applica...
System reliability analysis through corona testing
NASA Technical Reports Server (NTRS)
Lalli, V. R.; Mueller, L. A.; Koutnik, E. A.
1975-01-01
In the Reliability and Quality Engineering Test Laboratory at the NASA Lewis Research Center, a nondestructive corona-vacuum test facility for power system components was developed using commercially available hardware. The facility simulates operating temperature and vacuum while monitoring corona discharges with residual gases, and is being used to test various high-voltage power system components.
An investigation of a mathematical model for atmospheric absorption spectra
NASA Technical Reports Server (NTRS)
Niple, E. R.
1979-01-01
A computer program that calculates absorption spectra for slant paths through the atmosphere is described. The program uses an efficient convolution technique (Romberg integration) to simulate instrument resolution effects. A brief information analysis is performed on a set of calculated spectra to illustrate how such techniques may be used to explore the quality of the information in a spectrum.
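Simulating instrument resolution amounts to convolving the computed monochromatic spectrum with an instrument line shape; a minimal sketch using a Gaussian slit function and a discrete convolution (the program's Romberg-based convolution scheme is not reproduced here):

    import numpy as np

    # Synthetic monochromatic transmittance spectrum with one absorption line
    wn = np.linspace(2000.0, 2010.0, 2001)                     # wavenumber grid (cm^-1)
    spectrum = 1.0 - 0.8 * np.exp(-((wn - 2005.0) / 0.05) ** 2)

    # Gaussian instrument line shape with 0.5 cm^-1 FWHM
    fwhm = 0.5
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    dx = wn[1] - wn[0]
    k = np.arange(-5 * sigma, 5 * sigma + dx, dx)
    ils = np.exp(-0.5 * (k / sigma) ** 2)
    ils /= ils.sum()                                           # normalize to unit area

    # Degraded-resolution spectrum as seen by the instrument
    smeared = np.convolve(spectrum, ils, mode="same")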
Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs
Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara
2017-01-01
Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based on the fundamental chemistry and physics of combustion and fire spread or statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...
Performance analysis of mini-propellers based on FlightGear
NASA Astrophysics Data System (ADS)
Vogeltanz, Tomáš
2016-06-01
This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before FlightGear can be used, for a complex and more practical performance analysis it is advantageous to use a propeller model in combination with a particular aircraft model. This approach can determine whether the propeller has sufficient quality with respect to the aircraft requirements. The first section describes the software used for the analysis. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section presents and discusses the results of the performance analysis of the mini-propellers.
Design and analysis of a radio frequency extractor in an S-band relativistic klystron amplifier.
Zhang, Zehai; Zhang, Jun; Shu, Ting; Qi, Zumin
2012-09-01
A radio frequency (RF) extractor converts the energy of a strongly modulated intense relativistic electron beam (IREB) into high-power microwave energy in a relativistic klystron amplifier (RKA). With the aim of efficiently extracting the energy of the modulated IREB, an RF extractor with an all-round coupling structure is proposed. Owing to the all-round structure, the operating transverse magnetic mode can be established easily, its resonant properties can be investigated using a group-delay-time approach, and the external quality factor can be made low enough. The design and analysis of the extractor applied in an S-band RKA are carried out, and the performance of the extractor is validated with three-dimensional (3D) particle-in-cell simulations. The extraction efficiency reaches 27% in a simulation with a fully 3D model of the whole RKA. Preliminary experiments were also carried out; the results show that an RF extractor with an external quality factor of 7.9 extracted 22% of the beam power and converted it into high-power microwaves. Better results are expected once the parasitic mode between the input and middle cavities is suppressed.
Wear Improvement of Tools in the Cold Forging Process for Long Hex Flange Nuts.
Hsia, Shao-Yi; Shih, Po-Yueh
2015-09-25
Cold forging plays a critical role in fastener production and is widely used in the automotive, manufacturing, aviation, and 3C (Computer, Communication, and Consumer electronics) industries. Despite its extensive use in fastener forming, die design still relies on operator experience and trial and error, which makes it subjective and unreliable and the development schedule difficult to control. This study used finite element analysis to establish and simulate wear in automotive repair fastener manufacturing dies based on actual process conditions. The locations on a die that wear most quickly were forecast, and the stress levels obtained were substituted into the Archard equation to calculate die wear. Applying the Taguchi quality method to the new design yielded a 19.87% improvement in wear. Additionally, a comparison of actual manufacturing data with simulations revealed a nut forging size error within 2%, demonstrating the accuracy of the theoretical analysis. Finally, SEM micrographs of the worn surfaces on the upper punch indicate that the primary wear mechanism on the cold forging die for long hex flange nuts was adhesive wear. These results can simplify the development schedule, reduce the number of trials, and further enhance production quality and die life.
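The Archard relation referenced above converts contact conditions into wear volume, V = K·F·s/H; a minimal worked example with hypothetical values (the paper's wear coefficient, loads, and hardness are not given in the abstract):

    # Archard wear equation: V = K * F * s / H
    K = 1.0e-7     # dimensionless wear coefficient (hypothetical, adhesive wear regime)
    F = 2.0e4      # normal load on the punch, N (hypothetical)
    s = 5.0e3      # cumulative sliding distance over a production run, m (hypothetical)
    H = 7.5e9      # die material hardness, Pa (hypothetical)

    V = K * F * s / H                       # worn volume, m^3 (N*m / Pa = m^3)
    print(f"Predicted wear volume: {V * 1e9:.2f} mm^3")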
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. The study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. The study is descriptive rather than scientific.
Yang, Kun; Yu, Zhenyu; Luo, Yi; Yang, Yang; Zhao, Lei; Zhou, Xiaolu
2018-05-15
Global warming and rapid urbanization in China have caused a series of ecological problems, one of which is the degradation of lake water environments. Lake surface water temperatures (LSWTs) significantly shape aquatic ecological environments and are highly correlated with watershed ecosystem features and biodiversity levels. Analysing and predicting spatiotemporal changes in LSWT and exploring the corresponding impacts on water quality are essential for controlling and improving the ecological water environment of watersheds. In this study, Dianchi Lake was examined through an analysis of 54 water quality indicators from 10 water quality monitoring sites from 2005 to 2016. Support vector regression (SVR), principal component analysis (PCA), and back-propagation artificial neural network (BPANN) methods were combined into a hybrid forecasting model. A geospatial analysis was conducted to observe historical LSWT and water quality changes for Dianchi Lake from 2005 to 2016. Based on the constructed model, LSWTs and changes in water quality were simulated for 2017 to 2020. The relationship between LSWT and water quality thresholds was studied. The results show small errors and a high level of generalization in predictive performance. In addition, a spatial visualization analysis shows that from 2005 to 2020, chlorophyll-a (Chla), chemical oxygen demand (COD), and total nitrogen (TN) diffused from north to south, and that ammonia nitrogen (NH3-N) and total phosphorus (TP) levels are increasing in the northern part of Dianchi Lake, where LSWT levels exceed 17°C. The LSWT threshold is 17.6-18.53°C, which falls within the threshold for nutritional water quality, but COD and TN levels fall below Class V water quality standards. Transparency (Trans), COD, biochemical oxygen demand (BOD), and Chla levels present a close relationship with LSWT, and LSWTs are found to fundamentally affect lake cyanobacterial blooms. Copyright © 2017 Elsevier B.V. All rights reserved.
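A hybrid forecasting model of the PCA-plus-SVR flavor can be sketched with scikit-learn (synthetic indicators and target shown here; this is not the authors' exact SVR/PCA/BPANN hybrid):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(42)
    X = rng.normal(size=(144, 10))   # e.g. 12 years of monthly water quality indicators
    y = 2.0 * X[:, 0] + X[:, 1] + rng.normal(scale=0.1, size=144)  # e.g. LSWT anomaly

    # Reduce correlated indicators with PCA, then regress with an RBF-kernel SVR
    model = make_pipeline(StandardScaler(), PCA(n_components=3), SVR(kernel="rbf", C=10.0))
    model.fit(X[:120], y[:120])              # train on the first 10 years
    print(model.score(X[120:], y[120:]))     # R^2 on the held-out final 2 years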
Virtual reality-based simulators for spine surgery: a systematic review.
Pfandler, Michael; Lazarovici, Marc; Stefan, Philipp; Wucherer, Patrick; Weigl, Matthias
2017-09-01
Virtual reality (VR)-based simulators offer numerous benefits and are very useful in assessing and training surgical skills. Virtual reality-based simulators are standard in some surgical subspecialties, but their actual use in spinal surgery remains unclear. Currently, only technical reviews of VR-based simulators are available for spinal surgery. Thus, we performed a systematic review that examined the existing research on VR-based simulators in spinal procedures. We also assessed the quality of current studies evaluating VR-based training in spinal surgery. Moreover, we wanted to provide a guide for future studies evaluating VR-based simulators in this field. This is a systematic review of the current scientific literature regarding VR-based simulation in spinal surgery. Five data sources were systematically searched to identify relevant peer-reviewed articles regarding virtual, mixed, or augmented reality-based simulators in spinal surgery. A qualitative data synthesis was performed with particular attention to evaluation approaches and outcomes. Additionally, all included studies were appraised for their quality using the Medical Education Research Study Quality Instrument (MERSQI) tool. The initial review identified 476 abstracts and 63 full texts were then assessed by two reviewers. Finally, 19 studies that examined simulators for the following procedures were selected: pedicle screw placement, vertebroplasty, posterior cervical laminectomy and foraminotomy, lumbar puncture, facet joint injection, and spinal needle insertion and placement. These studies had a low-to-medium methodological quality with a MERSQI mean score of 11.47 out of 18 (standard deviation=1.81). This review described the current state and applications of VR-based simulator training and assessment approaches in spinal procedures. Limitations, strengths, and future advancements of VR-based simulators for training and assessment in spinal surgery were explored. Higher-quality studies with patient-related outcome measures are needed. To establish further adaptation of VR-based simulators in spinal surgery, future evaluations need to improve the study quality, apply long-term study designs, and examine non-technical skills, as well as multidisciplinary team training. Copyright © 2017 Elsevier Inc. All rights reserved.
Analysis of visual quality improvements provided by known tools for HDR content
NASA Astrophysics Data System (ADS)
Kim, Jaehwan; Alshina, Elena; Lee, JongSeok; Park, Youngo; Choi, Kwang Pyo
2016-09-01
In this paper, the visual quality of different solutions for high dynamic range (HDR) compression is analyzed using MPEG test content. We also simulate a method for efficient HDR compression based on the statistical properties of the signal. The method is compliant with the HEVC specification and easily compatible with alternative methods that might require HEVC specification changes. It was subjectively tested on commercial TVs and compared with alternative solutions for HDR coding. Subjective visual quality tests were performed on a SUHD TV (Samsung JS9500) with peak luminance up to 1000 nits. The statistically based solution shows improvements in both objective performance and visual quality compared with other HDR solutions, while remaining compatible with the HEVC specification.
Jiang, Minghuan; You, Joyce Hs
2016-05-01
This study aimed to compare the clinical and economic outcomes of pharmacogenetic-guided (PG-guided) and platelet reactivity testing-guided antiplatelet therapy for patients with acute coronary syndrome undergoing percutaneous coronary intervention. A decision-analytic model was constructed to simulate four antiplatelet strategies: universal clopidogrel 75 mg daily, universal alternative P2Y12 inhibitor (prasugrel or ticagrelor), PG-guided therapy, and platelet reactivity testing-guided therapy. PG-guided therapy was the preferred option, with the lowest cost (US$75,208) and the highest gain in quality-adjusted life years (7.6249 QALYs). The base-case results were robust in sensitivity analysis. PG-guided antiplatelet therapy showed the highest probability of being the preferred antiplatelet strategy for acute coronary syndrome patients undergoing percutaneous coronary intervention.
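The strategy comparison in such a decision-analytic model reduces to expected costs and QALYs per strategy; a minimal two-branch sketch with hypothetical probabilities and payoffs (not the published model inputs):

    # Hypothetical two-branch decision tree per strategy: adverse event vs no event
    strategies = {
        #                        P(event)  cost_event  cost_no  qaly_event  qaly_no
        "universal clopidogrel": (0.12,    95_000,     72_000,  6.9,        7.7),
        "PG-guided therapy":     (0.09,    96_000,     73_000,  7.0,        7.8),
    }

    for name, (p, c_ev, c_no, q_ev, q_no) in strategies.items():
        exp_cost = p * c_ev + (1 - p) * c_no     # probability-weighted cost
        exp_qaly = p * q_ev + (1 - p) * q_no     # probability-weighted QALYs
        print(f"{name}: expected cost ${exp_cost:,.0f}, expected QALYs {exp_qaly:.3f}")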
Human errors and measurement uncertainty
NASA Astrophysics Data System (ADS)
Kuselman, Ilya; Pennecchi, Francesca
2015-04-01
Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
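The Monte Carlo evaluation of a residual human-error contribution to an uncertainty budget can be sketched as follows (the error probability and distributions are illustrative, not the published expert-judgment values):

    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Measurement result with its 'ordinary' uncertainty (e.g., pH units)
    base = rng.normal(loc=7.00, scale=0.02, size=n)

    # Residual human error: occurs rarely, biases the result when it does
    p_error = 0.01                                     # residual probability after QC
    shift = rng.normal(loc=0.10, scale=0.03, size=n)   # bias introduced by the error
    occurs = rng.random(n) < p_error
    result = base + occurs * shift

    u_base, u_total = base.std(), result.std()
    u_human = np.sqrt(u_total**2 - u_base**2)          # contribution in quadrature
    print(f"u_base={u_base:.4f}, u_total={u_total:.4f}, human contribution={u_human:.4f}")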
Zhang, Jun
To explore the subjective learning experiences of baccalaureate nursing students participating in simulation sessions in a Chinese nursing school, we conducted a qualitative descriptive study. We used semi-structured interviews to explore students' perceptions of simulation-assisted learning. Each interview was audio-taped and transcribed verbatim. Thematic analysis was used to identify the major themes or categories from the transcripts and field notes. Only 10 students were needed to achieve theoretical saturation, owing to high group homogeneity. Three main themes emerged: (1) students' positive views of the new educational experience of simulation; (2) factors currently making simulation less attractive to students; and (3) the teacher's role in ensuring a positive learning experience. Simulation-assisted teaching was a positive experience for the majority of nursing students. Further efforts are needed in developing quality simulation-based course curricula and in planning and structuring the teaching process. This pedagogical approach requires close collaboration between faculty and students. Copyright © 2016 Elsevier Inc. All rights reserved.
Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas
2017-03-18
Manual assessment and evaluation of fluorescence micrographs from cell experiments is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis with consistently high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated against manually annotated micrographs. However, manual annotations are prone to errors and display inter- and intra-observer variability, which influence the validation results of automated cell segmentation pipelines. We present a new approach to simulating fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated in two ways: (1) an expert observer study showed that the proposed approach generates realistic simulated fluorescent cell micrographs, and (2) an automated segmentation pipeline applied to the simulated micrographs reproduced the segmentation performance of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data are suited to evaluating image segmentation pipelines more efficiently and reproducibly than is possible with manually annotated real micrographs.
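The validation principle, scoring a segmentation pipeline against simulated ground truth, can be sketched with a toy example (pure numpy disks standing in for simulated cells; this is not the authors' simulation framework):

    import numpy as np

    rng = np.random.default_rng(3)
    truth = np.zeros((128, 128), dtype=bool)
    yy, xx = np.mgrid[:128, :128]
    for cx, cy, r in [(40, 40, 12), (90, 70, 15), (60, 100, 10)]:  # simulated 'cells'
        truth |= (xx - cx) ** 2 + (yy - cy) ** 2 < r ** 2

    # Render a noisy 'fluorescence' image from the known ground truth
    image = truth * 1.0 + rng.normal(scale=0.3, size=truth.shape)

    # Trivial segmentation pipeline: global threshold
    segmented = image > 0.5

    # Objective validation: Dice coefficient against the simulated ground truth
    dice = 2 * (segmented & truth).sum() / (segmented.sum() + truth.sum())
    print(f"Dice = {dice:.3f}")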
Process simulations for manufacturing of thick composites
NASA Astrophysics Data System (ADS)
Kempner, Evan A.
The availability of manufacturing simulations for composites can significantly reduce the costs associated with process development. Simulations provide a tool for evaluating the effect of processing conditions on the quality of the parts produced without requiring numerous experiments. This is especially significant for parts with troublesome features such as large thickness. The development of simulations for thick-walled composites was approached by examining the mechanics of resin flow and fiber deformation during processing, applying these formulations to develop simulations, and evaluating the simulations against experimental results. A unified analysis is developed to describe the three-dimensional resin flow and fiber preform deformation during processing, regardless of the manufacturing process used. It is shown how the generic governing equations of the unified analysis can be applied to autoclave molding, compression molding, pultrusion, filament winding, and resin transfer molding, and a comparison is provided with earlier models derived individually for these processes. The equations described for autoclave curing were used to produce a one-dimensional cure simulation for autoclave curing of thick composites. The simulation consists of an analysis of heat transfer and resin flow in the composite, as well as in the bleeder plies used to absorb resin removed from the part. Experiments were performed in a hot press to approximate curing in an autoclave. Graphite/epoxy laminates of 3 cm and 5 cm thickness were cured while monitoring thickness and temperatures at several points inside the laminate. The simulation predicted temperatures fairly closely, but difficulties were encountered in correlating the thickness results. The simulation was also used to study the effects of prepreg aging on processing of thick composites. An investigation was also performed on filament winding with prepreg tow. Cylinders approximately 12 mm thick were hoop wound with tensions ranging from 13-34 N, with pressure gages at the mandrel-composite interface. An analytical model was developed to calculate the change in stress due to relaxation during winding. Although compressive circumferential stresses occurred throughout each of the cylinders, their magnitude was fairly low.
HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.
Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua
2014-03-01
Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), has been used worldwide; however, for a long time there has been no internationally approved standard for its quality control. The objective was to establish an effective combined method and pattern recognition technique for the quality control of ginger. A simple, accurate, and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed for establishing the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility, and stability; the relative standard deviations were all less than 1.57%. On the basis of this method, the fingerprints of the 10 batches of ginger samples were obtained, showing 16 common peaks. Using similarity evaluation software, the similarities between each sample fingerprint and the simulative mean chromatogram were in the range of 0.998-1.000. Then, chemometric techniques, including similarity analysis, hierarchical clustering analysis, and principal component analysis, were applied to classify the ginger samples. Consistent results showed that the ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method is simple, sensitive, and reliable for fingerprint analysis and, moreover, for pattern recognition and quality control of ginger.
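The similarity evaluation step amounts to comparing each fingerprint with the simulative mean chromatogram, for example via cosine similarity (synthetic peak-area vectors shown here, not the measured fingerprints):

    import numpy as np

    # Synthetic peak-area vectors for 16 common peaks across 3 ginger batches
    batches = np.abs(np.random.default_rng(5).normal(1.0, 0.05, size=(3, 16)))
    mean_chrom = batches.mean(axis=0)     # simulative mean chromatogram

    def cosine_similarity(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    for i, batch in enumerate(batches, start=1):
        print(f"batch {i}: similarity = {cosine_similarity(batch, mean_chrom):.4f}")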
Quality of radiotherapy reporting in randomized controlled trials of prostate cancer.
Soon, Yu Yang; Chen, Desiree; Tan, Teng Hwee; Tey, Jeremy
2018-06-07
Good radiotherapy reporting in clinical trials of prostate radiotherapy is important because it allows accurate reproduction of the radiotherapy treatment and minimizes treatment variations that can affect patient outcomes. The aim of our study is to assess the quality of prostate radiotherapy (RT) treatment reporting in randomized controlled trials of prostate cancer. We searched MEDLINE for randomized trials of prostate cancer that were published from 1996 to 2016 and included prostate RT as one of the intervention arms. We assessed whether the investigators reported ten criteria adequately in the trial reports: RT dose prescription method; RT dose-planning procedures; organ-at-risk (OAR) dose constraints; target volume definition; simulation procedures; treatment verification procedures; total RT dose; fractionation schedule; conduct of quality assurance (QA); and presence or absence of deviations in RT treatment planning and delivery. We performed multivariable logistic regression to determine the factors that may influence the quality of reporting. We found 59 eligible trials. There was significant variability in the quality of reporting. Target volume definition, total RT dose, and fractionation schedule were reported adequately in 97% of the included trials. OAR constraints, simulation procedures, and the presence or absence of deviations in RT treatment planning and delivery were reported adequately in 30% of the included trials. Twenty-four trials (40%) reported seven or more criteria adequately. Multivariable logistic analysis showed that trials that published their quality assurance results and cooperative group trials were more likely to report at least seven criteria adequately. There is significant variability in the quality of reporting on prostate radiotherapy treatment in randomized trials of prostate cancer. Consensus guidelines are needed to standardize the reporting of radiotherapy treatment in randomized trials.
Application and partial validation of a habitat model for moose in the Lake Superior region
Allen, A.W.; Terrell, J.W.; Mangus, W.L.; Lindquist, E.L.
1991-01-01
A modified version of the dormant-season portion of a Habitat Suitability Index (HSI) model developed for assessing moose (Alces alces) habitat in the Lake Superior Region was incorporated in a Geographic Information System (GIS) for 490 km² of Minnesota's Superior National Forest. Moose locations (n=235) were plotted during aerial surveys conducted in December 1988 and January 1990-1991. Dormant-season forage and cover quality for 1,000-m, 500-m, and 200-m radii plots around random points and moose locations were compared using U.S. Forest Service stand examination data. Cover quality indices were lower than forage quality indices within all plots. The median value for the average cover quality index was greater (P=0.003) within 200-m plots around cow moose locations than for plots around random points for the most severe winter of the study. The proportion of highest-quality winter cover, such as mixed stands dominated by mid-age class white spruce (Picea glauca) and balsam fir (Abies balsamea), was greater within 500-m and 200-m plots around cow moose than within similar plots around random points during the two most severe winters. These results indicate that suboptimum ratings of winter habitat quality used in the GIS for dormant-season forage >100 m from cover, as suggested in the original HSI model, are reasonable. Integrating the habitat model with forest stand data using a GIS permitted analysis of moose habitat within a relatively large geographic area. Simulation of habitat quality indicated a potential shortage of late-winter cover in the study area. The effects of forest management actions on moose habitat quality can be simulated without collecting additional data.
Large Gain in Air Quality Compared to an Alternative Anthropogenic Emissions Scenario
NASA Technical Reports Server (NTRS)
Daskalakis, Nikos; Tsigaridis, Kostas; Myriokefalitakis, Stelios; Fanourgakis, George S.; Kanakidou, Maria
2016-01-01
During the last 30 years, significant effort has been made to improve air quality through legislation for emissions reduction. Global three-dimensional chemistry-transport simulations of atmospheric composition over the past 3 decades have been performed to estimate what the air quality levels would have been under a scenario of stagnation of anthropogenic emissions per capita as in 1980, accounting for the population increase (BA1980) or using the standard practice of neglecting it (AE1980), and how they compare to the historical changes in air quality levels. The simulations are based on assimilated meteorology to account for the year-to-year observed climate variability and on different scenarios of anthropogenic emissions of pollutants. The ACCMIP historical emissions dataset is used as the starting point. Our sensitivity simulations provide clear indications that air quality legislation and technology developments have limited the rapid increase of air pollutants. The achieved reductions in concentrations of nitrogen oxides, carbon monoxide, black carbon, and sulfate aerosols are found to be significant when comparing to both BA1980 and AE1980 simulations that neglect any measures applied for the protection of the environment. We also show the potentially large tropospheric air quality benefit from the development of cleaner technology used by the growing global population. These 30-year hindcast sensitivity simulations demonstrate that the actual benefit in air quality due to air pollution legislation and technological advances is higher than the gain calculated by a simple comparison against a constant anthropogenic emissions simulation, as is usually done. Our results also indicate that over China and India the beneficial technological advances for the air quality may have been masked by the explosive increase in local population and the disproportional increase in energy demand partially due to the globalization of the economy.
Moyer, Douglas; Hyer, Kenneth
2003-01-01
Impairment of surface waters by fecal coliform bacteria is a water-quality issue of national scope and importance. Section 303(d) of the Clean Water Act requires that each State identify surface waters that do not meet applicable water-quality standards. In Virginia, more than 175 stream segments are on the 1998 Section 303(d) list of impaired waters because of violations of the water-quality standard for fecal coliform bacteria. A total maximum daily load (TMDL) will need to be developed by 2006 for each of these impaired streams and rivers by the Virginia Departments of Environmental Quality and Conservation and Recreation. A TMDL is a quantitative representation of the maximum load of a given water-quality constituent, from all point and nonpoint sources, that a stream can assimilate without violating the designated water-quality standard. Christians Creek, in Augusta County, Virginia, is one of the stream segments listed by the State of Virginia as impaired by fecal coliform bacteria. Watershed modeling and bacterial source tracking were used to develop the technical components of the fecal coliform bacteria TMDL for Christians Creek. The Hydrological Simulation Program-FORTRAN (HSPF) was used to simulate streamflow, fecal coliform concentrations, and source-specific fecal coliform loading in Christians Creek. Ribotyping, a bacterial source tracking technique, was used to identify the dominant sources of fecal coliform bacteria in the Christians Creek watershed. Ribotyping also was used to determine the relative contributions of specific sources to the observed fecal coliform load in Christians Creek. Data from the ribotyping analysis were incorporated into the calibration of the fecal coliform model. Study results provide information regarding the calibration of the streamflow and fecal coliform bacteria models and also identify the reductions in fecal coliform loads required to meet the TMDL for Christians Creek. The calibrated streamflow model simulated observed streamflow characteristics with respect to total annual runoff, seasonal runoff, average daily streamflow, and hourly stormflow. The calibrated fecal coliform model simulated the patterns and range of observed fecal coliform bacteria concentrations. Observed fecal coliform bacteria concentrations during low-flow periods ranged from 40 to 2,000 colonies per 100 milliliters, and peak concentrations during stormflow periods ranged from 23,000 to 730,000 colonies per 100 milliliters. Additionally, fecal coliform bacteria concentrations were generally higher upstream and lower downstream. Simulated source-specific contributions of fecal coliform bacteria to instream load were matched to the observed contributions from the dominant sources, which were beaver, cats, cattle, deer, dogs, ducks, geese, horses, humans, muskrats, poultry, raccoons, and sheep. According to model results, a 96-percent reduction in the current fecal coliform load delivered from the watershed to Christians Creek would result in compliance with the designated water-quality goals and associated TMDL.
Evaluation of GEOS-5 Sulfur Dioxide Simulations During the Frostburg, MD 2010 Field Campaign.
NASA Technical Reports Server (NTRS)
Buchard, V.; Da Silva, A. M.; Colarco, P.; Krotkov, N.; Dickerson, R. R.; Stehr, J. W.; Mount, G.; Spenei, E.; Arkinson, H. L.; He, H.
2013-01-01
Sulfur dioxide (SO2) is a major atmospheric pollutant with a strong anthropogenic component mostly produced by the combustion of fossil fuel and other industrial activities. As a precursor of sulfate aerosols that affect climate, air quality, and human health, this gas needs to be monitored on a global scale. Global climate and chemistry models including aerosol processes along with their radiative effects are important tools for climate and air quality research. Validation of these models against in-situ and satellite measurements is essential to ascertain the credibility of these models and to guide model improvements. In this study the Goddard Chemistry, Aerosol, Radiation, and Transport (GOCART) module running on-line inside the Goddard Earth Observing System version 5 (GEOS-5) model is used to simulate aerosol and SO2 concentrations. Data taken in November 2010 over Frostburg, Maryland during an SO2 field campaign involving ground instrumentation and aircraft are used to evaluate GEOS-5 simulated SO2 concentrations. Preliminary data analysis indicated the model overestimated surface SO2 concentration, which motivated the examination of mixing processes in the model and the specification of SO2 anthropogenic emission rates. As a result of this analysis, a revision of anthropogenic emission inventories in GEOS-5 was implemented, and the vertical placement of SO2 sources was updated. Results show that these revisions improve the model agreement with observations locally and in regions outside the area of this field campaign. In particular, we use the ground-based measurements collected by the United States Environmental Protection Agency (US EPA) for the year 2010 to evaluate the revised model simulations over North America.
A technique for the optical analysis of deformed telescope mirrors
NASA Technical Reports Server (NTRS)
Bolton, John F.
1986-01-01
The NASTRAN-ACCOS V programs' interface merges structural and optical analysis capabilities in order to characterize the performance of the NASA Goddard Space Flight Center's Solar Optical Telescope primary mirror, which has a large diameter/thickness ratio. The first step in the optical analysis is to use NASTRAN's FEM to model the primary mirror, simulating any distortions due to gravitation, thermal gradients, and coefficient of thermal expansion nonuniformities. NASTRAN outputs are then converted into an ACCOS V-acceptable form; ACCOS V generates the deformed optical surface on the basis of these inputs, and imaging qualities can be determined.
FE-Analysis of Stretch-Blow Moulded Bottles Using an Integrative Process Simulation
NASA Astrophysics Data System (ADS)
Hopmann, C.; Michaeli, W.; Rasche, S.
2011-05-01
The two-stage stretch-blow moulding process is well established for the large-scale production of high-quality PET containers with excellent mechanical and optical properties. Material costs account for a significant share of the total production costs of a bottle. Because of this dominant share, the PET industry is interested in reducing total production costs through optimised material efficiency. However, reduced material inventory means decreasing wall thicknesses and thereby a reduction of the bottle properties (e.g., mechanical properties, barrier properties). There is therefore often a trade-off between a minimal bottle weight and adequate bottle properties. Computer Aided Engineering (CAE) techniques can assist the designer of new stretch-blow moulded containers in achieving these objectives. Hence, tools such as process simulation and structural analysis have become important in the blow moulding sector. The Institute of Plastics Processing (IKV) at RWTH Aachen University, Germany, has developed an integrative three-dimensional process simulation which models the complete path of a preform through a stretch-blow moulding machine. First, the reheating of the preform is calculated by a thermal simulation. Afterwards, the inflation of the preform into a bottle is calculated by finite element analysis (FEA). The results of this step include the local wall thickness distribution and the local biaxial stretch ratios. Not only the material distribution but also the material properties that result from the deformation history of the polymer have a significant influence on the bottle properties. Therefore, a correlation between the material properties and the stretch ratios is considered in an integrative simulation approach developed at IKV. The results of the process simulation (wall thickness, stretch ratios) are transferred to a further simulation program and mapped onto the bottle's FE mesh. This approach allows a local determination of the material properties and thus a more accurate prediction of the bottle properties. The approach was applied both to a mechanical structural analysis and to a barrier analysis. First results indicate that the approach can improve the FE analysis and may be a helpful tool for designing new stretch-blow moulded bottles.
NASA Technical Reports Server (NTRS)
Prive, Nikki C.; Errico, Ronald M.
2013-01-01
A series of experiments that explore the roles of model and initial condition error in numerical weather prediction are performed using an observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO). The use of an OSSE allows the analysis and forecast errors to be explicitly calculated, and different hypothetical observing networks can be tested with ease. In these experiments, both a full global OSSE framework and an 'identical twin' OSSE setup are utilized to compare the behavior of the data assimilation system and evolution of forecast skill with and without model error. The initial condition error is manipulated by varying the distribution and quality of the observing network and the magnitude of observation errors. The results show that model error has a strong impact on both the quality of the analysis field and the evolution of forecast skill, including both systematic and unsystematic model error components. With a realistic observing network, the analysis state retains a significant quantity of error due to systematic model error. If errors of the analysis state are minimized, model error acts to rapidly degrade forecast skill during the first 24-48 hours of forward integration. In the presence of model error, the impact of observation errors on forecast skill is small, but in the absence of model error, observation errors cause a substantial degradation of the skill of medium range forecasts.
Uncertainty Quantification of Water Quality in Tamsui River in Taiwan
NASA Astrophysics Data System (ADS)
Kao, D.; Tsai, C.
2017-12-01
In Taiwan, modeling of non-point source pollution is unavoidably associated with uncertainty. The main purpose of this research is to better understand water contamination in the metropolitan Taipei area and to provide a new analysis method that government agencies or companies can use to establish related control and design measures. Three methods are applied step by step to carry out the uncertainty analysis with Mike 21, which is widely used for hydrodynamics and water quality modeling; the study area is the Tamsui River watershed. First, a sensitivity analysis is conducted to rank the influence of parameters and variables such as dissolved oxygen, nitrate, ammonia and phosphorus. Then first-order error analysis (FOEA) is used to determine the number of parameters that significantly affect the variability of the simulation results. Finally, a state-of-the-art method for uncertainty analysis, the perturbance moment method (PMM), is applied; it is more efficient than Monte Carlo simulation (MCS), whose calculations become cumbersome when multiple uncertain parameters and variables are involved. In PMM, three representative points are used for each random variable, and the statistical moments of the output (e.g., mean value, standard deviation) are expressed through the representative points and perturbance moments based on the parallel axis theorem. Under the assumption of independent parameters and variables, calculation time is significantly reduced for PMM compared to MCS at comparable modeling accuracy.
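The three-point idea behind such methods can be illustrated with a generic Gauss-Hermite three-point propagation for independent Gaussian inputs; this is a sketch in the spirit of the approach, not the paper's exact PMM formulation or its coupling to Mike 21.

```python
# Three-point uncertainty propagation: each independent Gaussian input is
# represented by three abscissae (0, +/- sqrt(3), weights 2/3, 1/6, 1/6),
# and output moments follow from weighted tensor-product evaluations.
# Generic illustration only; not the published PMM equations.
import itertools
import math

def three_point_moments(f, means, sigmas):
    """Mean and std of f(x1..xn) for independent Gaussian inputs."""
    pts = [(-math.sqrt(3.0), 1 / 6), (0.0, 2 / 3), (math.sqrt(3.0), 1 / 6)]
    mean = second = 0.0
    for combo in itertools.product(pts, repeat=len(means)):
        x = [m + z * s for (z, _), m, s in zip(combo, means, sigmas)]
        w = math.prod(weight for _, weight in combo)
        y = f(*x)
        mean += w * y
        second += w * y * y
    return mean, math.sqrt(max(second - mean ** 2, 0.0))

# Toy surrogate for a water-quality response (e.g. DO vs two uncertain rate
# parameters): 3^2 = 9 model runs instead of thousands of MCS samples.
mu, sd = three_point_moments(lambda k1, k2: 8.0 - 2.0 * k1 + 0.5 * k1 * k2,
                             means=[1.0, 0.5], sigmas=[0.2, 0.1])
print(f"mean={mu:.3f}, std={sd:.3f}")
```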
Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C
2015-11-01
Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios show how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
StratBAM: A Discrete-Event Simulation Model to Support Strategic Hospital Bed Capacity Decisions.
Devapriya, Priyantha; Strömblad, Christopher T B; Bailey, Matthew D; Frazier, Seth; Bulger, John; Kemberling, Sharon T; Wood, Kenneth E
2015-10-01
The ability to accurately measure and assess current and potential health care system capacities is an issue of local and national significance. Recent joint statements by the Institute of Medicine and the Agency for Healthcare Research and Quality have emphasized the need to apply industrial and systems engineering principles to improving health care quality and patient safety outcomes. To address this need, a decision support tool was developed for planning and budgeting of current and future bed capacity, and evaluating potential process improvement efforts. The Strategic Bed Analysis Model (StratBAM) is a discrete-event simulation model created after a thorough analysis of patient flow and data from Geisinger Health System's (GHS) electronic health records. Key inputs include: timing, quantity and category of patient arrivals and discharges; unit-level length of care; patient paths; and projected patient volume and length of stay. Key outputs include: admission wait time by arrival source and receiving unit, and occupancy rates. Electronic health records were used to estimate parameters for probability distributions and to build empirical distributions for unit-level length of care and for patient paths. Validation of the simulation model against GHS operational data confirmed its ability to model real-world data consistently and accurately. StratBAM was successfully used to evaluate the system impact of forecasted patient volumes and length of stay in terms of patient wait times, occupancy rates, and cost. The model is generalizable and can be appropriately scaled for larger and smaller health care settings.
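A minimal discrete-event sketch of the bed-capacity logic such a model addresses is shown below: Poisson arrivals compete for a fixed pool of beds, and admission wait times and occupancy emerge from the event sequence. All parameter values are illustrative; StratBAM itself is driven by distributions fitted to Geisinger's electronic health records.

```python
# Toy discrete-event simulation of a single inpatient unit: first-come,
# first-served assignment of arriving patients to the earliest-free bed.
import heapq
import random

def simulate_unit(n_beds=20, arrival_rate=4.0, mean_los_days=4.5,
                  horizon_days=365, seed=1):
    rng = random.Random(seed)
    free_at = [0.0] * n_beds          # earliest time each bed becomes free
    heapq.heapify(free_at)
    t, waits, busy_time = 0.0, [], 0.0
    while t < horizon_days:
        t += rng.expovariate(arrival_rate)           # next arrival
        los = rng.expovariate(1.0 / mean_los_days)   # length of stay
        bed_free = heapq.heappop(free_at)
        start = max(t, bed_free)                     # wait if all beds busy
        waits.append(start - t)
        busy_time += los
        heapq.heappush(free_at, start + los)
    occupancy = busy_time / (n_beds * horizon_days)
    return sum(waits) / len(waits), occupancy

wait, occ = simulate_unit()
print(f"mean admission wait {wait * 24:.1f} h, occupancy {occ:.0%}")
```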
Milani, Massimo; Montorsi, Luca; Stefani, Matteo; Saponelli, Roberto; Lizzano, Maurizio
2017-12-01
The paper focuses on the analysis of an industrial ceramic kiln in order to improve energy efficiency and thus reduce fuel consumption and the corresponding carbon dioxide emissions. A lumped and distributed parameter model of the entire system is constructed to simulate the performance of the kiln under actual operating conditions. The model accurately predicts the temperature distribution along the different modules of the kiln and the operation of the many natural gas burners employed to provide the required thermal power. Furthermore, the temperature of the tiles is also simulated so that the quality of the final product can be addressed by the modelling. Numerical results are validated against experimental measurements carried out on a real ceramic kiln during regular production operations. The developed numerical model proves to be an efficient tool for investigating different design solutions for the kiln's components. In addition, a number of control strategies for the system's working conditions can be simulated and compared in order to define the best trade-off between fuel consumption and product quality. In particular, the paper analyzes the effect of a new burner type characterized by an internal heat recovery capability aimed at improving the energy efficiency of the ceramic kiln. The fuel saving and the related reduction in carbon dioxide emissions were on the order of 10% compared to the standard burner. Copyright © 2017 Elsevier Ltd. All rights reserved.
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, regulatory agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, which is often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
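The scalarised form of such a two-objective problem can be sketched as follows, assuming toy surrogates for the nonlinear source-receptor response and the technology-cost curve; sweeping the weight traces an approximation of the non-dominated frontier. The actual surrogates in the paper are identified from GAMES simulations, not these invented functions.

```python
# Weighted-sum scalarisation of a two-objective emission-control problem.
# pm10() and cost() are invented placeholders for identified surrogates.
import numpy as np
from scipy.optimize import minimize

def pm10(x):                     # toy nonlinear source-receptor surrogate
    return 40.0 - 12.0 * np.sqrt(x[0]) - 8.0 * np.sqrt(x[1])

def cost(x):                     # toy convex emission-reduction cost
    return 5.0 * x[0] ** 2 + 9.0 * x[1] ** 2

bounds = [(0.0, 1.0), (0.0, 1.0)]      # max feasible technology reductions
pareto = []
for w in np.linspace(0.05, 0.95, 10):  # weight on the air quality objective
    res = minimize(lambda x: w * pm10(x) + (1 - w) * cost(x),
                   x0=[0.5, 0.5], bounds=bounds)
    pareto.append((pm10(res.x), cost(res.x)))

for aq, c in pareto:                   # one point per trade-off weight
    print(f"PM10 {aq:5.1f} ug/m3   cost {c:5.2f}")
```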
Ground-to-Flight Handling Qualities Comparisons for a High Performance Airplane
NASA Technical Reports Server (NTRS)
Brandon, Jay M.; Glaab, Louis J.; Brown, Philip W.; Phillips, Michael R.
1995-01-01
A flight test program was conducted in conjunction with a ground-based piloted simulation study to enable a comparison of handling qualities ratings for a variety of maneuvers between flight and simulation of a modern high performance airplane. Specific objectives included an evaluation of pilot-induced oscillation (PIO) tendencies and a determination of maneuver types which result in either good or poor ground-to-flight pilot handling qualities ratings. A General Dynamics F-16XL aircraft was used for the flight evaluations, and the NASA Langley Differential Maneuvering Simulator was employed for the ground-based evaluations. Two NASA research pilots evaluated both the airplane and simulator characteristics using tasks developed in the simulator. Simulator and flight tests were all conducted within approximately a one-month time frame. Maneuvers included numerous fine tracking evaluations at various angles of attack, load factors and speed ranges; gross acquisitions involving longitudinal and lateral maneuvering; roll angle captures; and an ILS task with a sidestep to landing. Overall results showed generally good correlation between ground and flight for PIO tendencies and general handling qualities comments. Differences in pilot technique used in simulator evaluations and the effects of airplane accelerations and motions are illustrated.
Impact simulation of shrimp farm effluent on BOD-DO in Setiu River
NASA Astrophysics Data System (ADS)
Chong, Michael Sueng Lock; Teh, Su Yean; Koh, Hock Lye
2017-08-01
Release of effluent from intensive aquaculture farms into a river can pollute the receiving waters and exert negative impacts on the aquatic ecosystem. In this paper, we simulate the effects of effluent released from a marine shrimp aquaculture farm into Sg Setiu, focusing on two critical water quality parameters: DO (dissolved oxygen) and BOD (biochemical oxygen demand). DO is an important constituent in sustaining river water quality, with levels below 5 mg/L deemed undesirable. DO levels can be depressed by the presence of BOD and other organics that consume DO. Water quality simulations, in conjunction with management of effluent treatment, can suggest mitigation measures for reducing the adverse environmental impact. For this purpose, an in-house two-dimensional water quality simulation model codenamed TUNA-WQ is used. TUNA-WQ has been undergoing regular updates and improvements to broaden its applicability and improve its robustness. Here, the model is calibrated and verified for simulation of DO and BOD dynamics in Setiu River (Sg Setiu), and the simulated DO and BOD responses to the discharge from the marine shrimp aquaculture farm are presented.
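The BOD-DO coupling at the heart of such simulations is classically captured by the Streeter-Phelps equations, in which BOD decays while consuming DO and reaeration pulls DO back toward saturation. The sketch below uses illustrative rate constants; TUNA-WQ itself is an in-house two-dimensional model and is certainly more elaborate than this one-dimensional analytical solution.

```python
# Classic Streeter-Phelps DO sag downstream of a BOD discharge.
# Units: concentrations in mg/L, rate constants in 1/day.
import numpy as np

def streeter_phelps(L0=12.0, D0=1.0, kd=0.35, ka=0.60, DOsat=7.5, t_days=None):
    """Return (remaining BOD, dissolved oxygen) along travel time t_days."""
    t = np.linspace(0, 10, 101) if t_days is None else t_days
    L = L0 * np.exp(-kd * t)                                  # remaining BOD
    D = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)                                # DO deficit
    return L, DOsat - D

t = np.linspace(0, 10, 101)
bod, do = streeter_phelps(t_days=t)
print(f"minimum DO {do.min():.2f} mg/L at {t[np.argmin(do)]:.1f} days travel")
print("drops below the 5 mg/L threshold:", bool((do < 5.0).any()))
```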
Failure Analysis of a Sheet Metal Blanking Process Based on Damage Coupling Model
NASA Astrophysics Data System (ADS)
Wen, Y.; Chen, Z. H.; Zang, Y.
2013-11-01
In this paper, a blanking process of sheet metal is studied by numerical simulation and experimental observation. The effects of varying technological parameters related to product quality are investigated. An elastoplastic constitutive equation accounting for isotropic ductile damage is implemented into the finite element code ABAQUS with a user-defined material subroutine (UMAT). Simulations of damage evolution and ductile fracture in a sheet metal blanking process were carried out by the FEM. In order to guarantee computational accuracy and avoid numerical divergence during large plastic deformation, a specified remeshing technique is applied successively whenever severe element distortion occurs. In the simulation, the evolution of damage at different stages of the blanking process was evaluated, and the damage distributions obtained from simulation are in good agreement with the experimental results.
NASA Astrophysics Data System (ADS)
Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.
2016-04-01
Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint were measured, and different acquisition windows were analysed. The subjective perception of simulation quality was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. The basic implemented processes allowed the acquisition of three signal channels for commanding the simulation. Finally, the communication between the different applications proved efficient, although it depends on the amount of data to be sent.
Fiber pushout test: A three-dimensional finite element computational simulation
NASA Technical Reports Server (NTRS)
Mital, Subodh K.; Chamis, Christos C.
1990-01-01
A fiber pushthrough process was computationally simulated using a three-dimensional finite element method. The interface material is replaced by an anisotropic material with a greatly reduced shear modulus in order to simulate the fiber pushthrough process using a linear analysis. Such a procedure is easily implemented and computationally very effective. It can be used to predict the fiber pushthrough load for a composite system at any temperature. The average interface shear strength obtained from the pushthrough load can easily be separated into its two components: one arising from frictional stresses, and the other from chemical adhesion between fiber and matrix together with mechanical interlocking that develops due to shrinkage of the composite during the phase change in processing. Step-by-step procedures are described to perform the computational simulation, to establish bounds on interfacial bond strength, and to interpret interfacial bond quality.
Tran-Duy, An; Boonen, Annelies; van de Laar, Mart A F J; Franke, Angelinus C; Severens, Johan L
2011-12-01
To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). The discrete event simulation paradigm was selected for model development. Drug efficacy was modelled as changes in disease activity (Bath Ankylosing Spondylitis Disease Activity Index (BASDAI)) and functional status (Bath Ankylosing Spondylitis Functional Index (BASFI)), which were linked to costs and health utility using statistical models fitted to an observational AS cohort. Published clinical data were used to estimate drug efficacy and time to events. Two strategies were compared: (1) five available non-steroidal anti-inflammatory drugs (strategy 1) and (2) the same as strategy 1 plus two tumour necrosis factor α inhibitors (strategy 2). 13,000 patients were followed up individually until death. For probabilistic sensitivity analysis, Monte Carlo simulations were performed with 1000 sets of parameters sampled from the appropriate probability distributions. The models successfully generated valid data on treatments, BASDAI, BASFI, utility, quality-adjusted life years (QALYs) and costs at time points with intervals of 1-3 months over the simulation length of 70 years. The incremental cost per QALY gained in strategy 2 compared with strategy 1 was €35,186. At a willingness-to-pay threshold of €80,000, it was 99.9% certain that strategy 2 was cost-effective. The modelling framework provides great flexibility to implement complex algorithms representing treatment selection, disease progression and changes in costs and utilities over time for patients with AS. Results obtained from the simulation are plausible.
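Statements such as "99.9% certain that strategy 2 was cost-effective at €80,000" are typically derived from the probabilistic sensitivity analysis output as an acceptability computation: for each sampled parameter set, the incremental net monetary benefit is lambda * dQALY - dCost, and acceptability is the fraction of samples with positive benefit. The numbers in this sketch are synthetic, chosen only to roughly mimic the reported ICER.

```python
# Cost-effectiveness acceptability from probabilistic sensitivity analysis
# samples. All inputs are synthetic placeholders, not trial data.
import numpy as np

rng = np.random.default_rng(42)
n = 1000                                  # sampled parameter sets
d_qaly = rng.normal(1.10, 0.25, n)        # incremental QALYs, strategy 2 vs 1
d_cost = rng.normal(38_700, 9_000, n)     # incremental costs (EUR)

icer = d_cost.mean() / d_qaly.mean()
for wtp in (20_000, 50_000, 80_000):      # willingness-to-pay thresholds
    nmb = wtp * d_qaly - d_cost           # incremental net monetary benefit
    print(f"WTP EUR{wtp:,}: P(cost-effective) = {(nmb > 0).mean():.1%}")
print(f"ICER ~ EUR{icer:,.0f} per QALY")
```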
Analysis of nonreciprocal noise based on mode splitting in a high-Q optical microresonator
NASA Astrophysics Data System (ADS)
Yang, Zhaohua; Xiao, Yarong; Huo, Jiayan; Shao, Hui
2018-01-01
The whispering gallery mode optical microresonator offers a high quality factor, which enables it to act as the core component of a high-sensitivity resonator optic gyro; however, nonreciprocal noise limits its precision. Considering the Sagnac effect, i.e. mode splitting in high-quality optical microresonators, we derive the explicit expression for the angular velocity as a function of the splitting amount, and verify the sensing mechanism by simulation using the finite element method. Remarkably, the accuracy of the angular velocity measurement in a whispering gallery mode optical microresonator with a quality factor of 10⁸ is 10⁶ °/s. We obtain the optimal coupling position of the novel angular velocity sensing system by examining the output transmittance spectra for different vertical coupling distances and axial coupling positions. In addition, the origin of the nonreciprocal phenomenon is determined by theoretical analysis of the evanescent field distribution of a tapered fiber. These results provide an effective method and a theoretical basis for the suppression of nonreciprocal noise.
A Reduced Form Model (RFM) is a mathematical relationship between the inputs and outputs of an air quality model, permitting estimation of additional modeling scenarios without costly new regional-scale simulations. A 21-year Community Multiscale Air Quality (CMAQ) simulation for the con...
Effect of Training in Rational Decision Making on the Quality of Simulated Career Decisions.
ERIC Educational Resources Information Center
Krumboltz, John D.; And Others
1982-01-01
Determined if training in rational decision making improves the quality of simulated career decisions. Training in rational decision making resulted in superior performance for females on one subscore of the knowledge measure. It also resulted in superior simulated career choices by females and younger males. (Author)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Setiani, Tia Dwi; Suprijadi (Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jalan Ganesha 10, Bandung 40132)
Monte Carlo (MC) is a powerful technique for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 and 2304 cores. In the GPU simulations, each core tracks one photon, so a large number of photons are calculated simultaneously. Results show that simulations on the GPU were significantly faster than on the CPU: the simulations on the 2304-core GPU ran about 64-114 times faster than on the CPU, while those on the 384-core GPU ran about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained with histories starting from 10⁸ and energies from 60 keV to 90 keV. Statistical analysis shows that the quality of the GPU and CPU images is essentially the same.
Ray, Joshua A; Boye, Kristina S; Yurgin, Nicole; Valentine, William J; Roze, Stéphane; McKendrick, Jan; Tucker, Daniel M D; Foos, Volker; Palmer, Andrew J
2007-03-01
The aim of this study was to evaluate the long-term clinical and economic outcomes associated with exenatide or insulin glargine added to oral therapy in individuals with type 2 diabetes inadequately controlled with combination oral agents in the UK setting. A published and validated computer simulation model of diabetes was used to project long-term complications, life expectancy, quality-adjusted life expectancy and direct medical costs. Probabilities of diabetes-related complications were derived from published sources. Treatment effects and patient characteristics were extracted from a recent randomised controlled trial comparing exenatide with insulin glargine. Simulations incorporated published quality-of-life utilities and UK-specific costs from 2004. Pharmacy costs for exenatide were based on 20, 40, 60, 80 and 100% of the US value (as no UK price was available at the time of analysis). Future costs and clinical benefits were discounted at 3.5% annually. Sensitivity analyses were performed. In the base-case analysis, exenatide was associated with improvements in life expectancy of 0.057 years and in quality-adjusted life expectancy of 0.442 quality-adjusted life years (QALYs) versus insulin glargine. Long-term projections demonstrated that exenatide was associated with a lower cumulative incidence of most cardiovascular disease (CVD) complications and CVD-related death than insulin glargine. Across the range of cost values, the results showed that exenatide is likely to fall between dominant (cost and life saving) at 20% of the US price and cost-effective (with an ICER of £22,420 per QALY gained) at 100% of the US price, versus insulin glargine. Based on the findings of a recent clinical trial, long-term projections indicated that exenatide is likely to be associated with improvements in life expectancy and quality-adjusted life expectancy compared to insulin glargine. The results from this modelling analysis suggest that exenatide is likely to represent good value for money by generally accepted standards in the UK setting in individuals with type 2 diabetes inadequately controlled on oral therapy.
Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra
2015-11-01
A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed by applying analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into sweet spot plots. The design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performance to be assured within a defined domain. The working conditions (with the intervals defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer, pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept were a greatly increased knowledge of the analytical system, obtained through multivariate techniques, and the achievement of analytical assurance of quality, derived from the probability-based definition of the DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
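The probability-based design-space step can be sketched as a Monte Carlo evaluation over a fitted response surface: draw operating-point wobble around the working conditions and estimate the probability that a critical resolution meets its limit. The quadratic coefficients below are invented for illustration; only the workflow mirrors the paper.

```python
# Monte Carlo check of a candidate working point against a hypothetical
# fitted response surface for critical resolution.
import numpy as np

def resolution(conc_mM, pH):          # invented quadratic response surface
    c = (conc_mM - 130.0) / 20.0      # coded variables
    p = (pH - 2.7) / 0.3
    return 2.1 + 0.30*c + 0.25*p - 0.15*c*c - 0.20*p*p - 0.10*c*p

rng = np.random.default_rng(7)
n = 20_000
conc = rng.normal(138.0, 4.0, n)      # buffer concentration wobble (mM)
ph = rng.normal(2.74, 0.05, n)        # pH wobble
rs = resolution(conc, ph)
# A point belongs to the DS if this probability exceeds the chosen target.
print(f"P(Rs >= 1.5) = {(rs >= 1.5).mean():.3f}")
```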
NASA Technical Reports Server (NTRS)
Barre, Jerome; Edwards, David; Worden, Helen; Da Silva, Arlindo; Lahoz, William
2015-01-01
By the end of the current decade, there are plans to deploy several geostationary Earth orbit (GEO) satellite missions for atmospheric composition over North America, East Asia and Europe, with additional missions proposed. Together, these present the possibility of a constellation of geostationary platforms to achieve continuous time-resolved high-density observations over continental domains for mapping pollutant sources and variability at diurnal and local scales. In this paper, we use a novel approach to sample a very high resolution global model (GEOS-5 at 7 km horizontal resolution) to produce a dataset of synthetic carbon monoxide pollution observations representative of those potentially obtainable from a GEO satellite constellation, with predicted measurement sensitivities based on current remote sensing capabilities. Part 1 of this study focuses on the production of simulated synthetic measurements for air quality OSSEs (Observing System Simulation Experiments). We simulate carbon monoxide nadir retrievals using a technique that provides realistic measurements with very low computational cost. We discuss the sampling methodology: the projection of footprints and areas of regard for geostationary geometries over each of the North America, East Asia and Europe regions; the regression method to simulate measurement sensitivity; and the measurement error simulation. A detailed analysis of the simulated observation sensitivity is performed, and limitations of the method are discussed. We also describe impacts from clouds, showing that the efficiency of an instrument making atmospheric composition measurements on a geostationary platform is dependent on the dominant weather regime over a given region and the pixel size resolution. These results demonstrate the viability of the "instrument simulator" step for an OSSE to assess the performance of a constellation of geostationary satellites for air quality measurements.
NASA Astrophysics Data System (ADS)
Hu, Haoyue; Eberhard, Peter
2017-10-01
Process simulations of conduction mode laser welding are performed using the meshless Lagrangian smoothed particle hydrodynamics (SPH) method. The solid phase is modeled based on the governing equations in thermoelasticity. For the liquid phase, surface tension effects are taken into account to simulate the melt flow in the weld pool, including the Marangoni force caused by a temperature-dependent surface tension gradient. A non-isothermal solid-liquid phase transition with the release or absorption of additional energy known as the latent heat of fusion is considered. The major heat transfer through conduction is modeled, whereas heat convection and radiation are neglected. The energy input from the laser beam is modeled as a Gaussian heat source acting on the initial material surface. The developed model is implemented in Pasimodo. Numerical results obtained with the model are presented for laser spot welding and seam welding of aluminum and iron. The change of process parameters like welding speed and laser power, and their effects on weld dimensions are investigated. Furthermore, simulations may be useful to obtain the threshold for deep penetration welding and to assess the overall welding quality. A scalability and performance analysis of the implemented SPH algorithm in Pasimodo is run in a shared memory environment. The analysis reveals the potential of large welding simulations on multi-core machines.
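The laser input described above is commonly modelled as a Gaussian surface flux, q(r) = (2AP/(pi*r0^2)) * exp(-2 r^2/r0^2), where P is the laser power, A the absorptivity and r0 the beam radius; integrated over the surface this recovers the absorbed power AP. A small sketch with illustrative parameter values, of the kind each surface SPH particle could sample:

```python
# Gaussian surface heat source for laser welding. Parameter values are
# illustrative only, not those used in the Pasimodo model.
import math

def gaussian_flux(r, power=2000.0, absorptivity=0.35, beam_radius=1.0e-3):
    """Absorbed heat flux (W/m^2) at radial distance r (m) from beam centre."""
    q0 = 2.0 * absorptivity * power / (math.pi * beam_radius ** 2)
    return q0 * math.exp(-2.0 * (r / beam_radius) ** 2)

# Peak flux and decay; about 86.5% of the absorbed power falls within r <= r0.
print(f"peak flux     {gaussian_flux(0.0):.3e} W/m^2")
print(f"flux at r=r0  {gaussian_flux(1.0e-3):.3e} W/m^2")
```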
Database-driven web interface automating gyrokinetic simulations for validation
NASA Astrophysics Data System (ADS)
Ernst, D. R.
2010-11-01
We are developing a web interface to connect plasma microturbulence simulation codes with experimental data. The website automates the preparation of gyrokinetic simulations, utilizing plasma profile and magnetic equilibrium data from TRANSP analysis of experiments, read from MDSPLUS over the internet. This database-driven tool saves user sessions, allowing searches of previous simulations, which can be restored to repeat the same analysis for a new discharge. The website includes a multi-tab, multi-frame, publication-quality Java plotter, Webgraph, developed as part of this project. Input files can be uploaded as templates and edited with context-sensitive help. The website creates inputs for GS2 and GYRO using a well-tested and verified back-end, in use for several years for the GS2 code [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)]. A centralized web site has the advantage that users receive bug fixes instantaneously, while avoiding the duplicated effort of local compilations. Possible extensions to the database to manage run outputs, toward prototyping for the Fusion Simulation Project, are envisioned. Much of the web development was supported by the DoE National Undergraduate Fellowship program [e.g., A. Suarez and D. R. Ernst, http://meetings.aps.org/link/BAPS.2005.DPP.GP1.57].
COP-compost: a software to study the degradation of organic pollutants in composts.
Zhang, Y; Lashermes, G; Houot, S; Zhu, Y-G; Barriuso, E; Garnier, P
2014-02-01
Composting has been demonstrated to be effective in degrading organic pollutants (OP), whose behaviour depends on the composting conditions, the microbial populations activated and interactions with organic matter. The fate of OP during composting involves complex mechanisms, and models can be helpful tools for educational and scientific purposes, as well as for industrialists who want to optimise the composting process for OP elimination. The COP-Compost model, which couples an organic carbon (OC) module and an organic pollutant (OP) module and simulates the changes in organic matter, organic pollutants and microbial activities during the composting process, was proposed and calibrated for a first set of OP in a previous study. The objectives of the present work were (1) to introduce the COP-Compost model, through its convenient interface, to a panel of potential users, (2) to show the variety of OP that can be simulated, including the possibility of choosing between degradation through co-metabolism or specific metabolism, and (3) to show the effect of the initial characteristics of organic matter quality and its microbial biomass on the simulated OP dynamics. In the model, we assume that the pollutants can be adsorbed on organic matter according to the biochemical quality of the OC and that the microorganisms can degrade the pollutants at the same time as they degrade OC (co-metabolism). A composting experiment involving two different ¹⁴C-labelled organic pollutants, simazine and pyrene, was chosen from the literature because the four OP fractions simulated in the model were measured during the study (the mineralised, soluble, sorbed and non-extractable fractions). Except for the mineralised fraction of simazine, good agreement was achieved between the simulated and experimental results describing the evolution of the different organic fractions. For simazine, a specific biomass had to be added. To assess the relative importance of organic matter dynamics for the organic pollutants' behaviour, a sensitivity analysis was conducted. It demonstrated that the parameters associated with organic matter dynamics and its initial microbial biomass greatly influenced the evolution of all the OP fractions, although the initial biochemical quality of the OC did not have a significant impact on the OP evolution.
Wesolowski, Edwin A.
1999-01-01
A streamflow and water-quality model was developed for reaches of Sand and Caddo Creeks in south-central Oklahoma to simulate the effects of wastewater discharge from a refinery and a municipal treatment plant. The purpose of the model was to simulate conditions during low streamflow when the conditions controlling dissolved-oxygen concentrations are most severe. Data collected to calibrate and verify the streamflow and water-quality model include continuously monitored streamflow and water-quality data at two gaging stations and three temporary monitoring stations; wastewater discharge from two wastewater plants; two sets each of five water-quality samples at nine sites during a 24-hour period; dye and propane samples; periphyton samples; and sediment oxygen demand measurements. The water-quality sampling, at a 6-hour frequency, was based on a Lagrangian reference frame in which the same volume of water was sampled at each site. To represent the unsteady streamflows and the dynamic water-quality conditions, a transport modeling system was used that included both a model to route streamflow and a model to transport dissolved conservative constituents, with linkage to reaction kinetics similar to the U.S. Environmental Protection Agency QUAL2E model to simulate nonconservative constituents. These model codes are the Diffusion Analogy Streamflow Routing Model (DAFLOW) and the branched Lagrangian transport model (BLTM) and BLTM/QUAL2E that, collectively, as calibrated models, are referred to as the Ardmore Water-Quality Model. The Ardmore DAFLOW model was calibrated with three sets of streamflows that collectively ranged from 16 to 3,456 cubic feet per second. The model uses only one set of calibrated coefficients and exponents to simulate streamflow over this range. The Ardmore BLTM was calibrated for transport by simulating dye concentrations collected during a tracer study when streamflows ranged from 16 to 23 cubic feet per second. Therefore, the model is expected to be most useful for low-streamflow simulations. The Ardmore BLTM/QUAL2E model was calibrated and verified with water-quality data from nine sites where two sets of five samples were collected. The streamflow during the water-quality sampling in Caddo Creek at site 7 ranged from 8.4 to 20 cubic feet per second, of which about 5.0 to 9.7 cubic feet per second was contributed by Sand Creek. The model simulates the fate and transport of 10 water-quality constituents. The model was verified by running it using data that were not used in calibration; only phytoplankton were not verified. Measured and simulated concentrations of dissolved oxygen exhibited a marked daily pattern that was attributable to waste loading and algal activity. Dissolved-oxygen measurements during this study, and simulated dissolved-oxygen concentrations using the Ardmore Water-Quality Model for the conditions of this study, illustrate that the dissolved-oxygen sag curve caused by the upstream wastewater discharges is confined to Sand Creek.
Gatidis, Sergios; Würslin, Christian; Seith, Ferdinand; Schäfer, Jürgen F; la Fougère, Christian; Nikolaou, Konstantin; Schwenzer, Nina F; Schmidt, Holger
2016-01-01
Optimization of tracer dose regimes in positron emission tomography (PET) imaging is a trade-off between diagnostic image quality and radiation exposure. The challenge lies in defining minimal tracer doses that still result in sufficient diagnostic image quality. In order to find such minimal doses, it would be useful to simulate tracer dose reduction, as this would enable studying the effects of dose reduction on image quality in single patients without repeated injections of different amounts of tracer. The aim of our study was to introduce and validate a method for simulating low-dose PET images, enabling direct comparison of different tracer doses in single patients under constant influencing factors. ¹⁸F-fluoride PET data were acquired on a combined PET/magnetic resonance imaging (MRI) scanner. PET data were stored together with the temporal information of the occurrence of single events (list-mode format). A predefined proportion of PET events was then randomly deleted, resulting in undersampled PET data. These data sets were subsequently reconstructed, resulting in simulated low-dose PET images (retrospective undersampling of list-mode data). This approach was validated in phantom experiments by visual inspection and by comparison of the PET quality metrics contrast recovery coefficient (CRC), background variability (BV) and signal-to-noise ratio (SNR) between measured and simulated PET images for different activity concentrations. In addition, reduced-dose PET images of a clinical ¹⁸F-FDG PET dataset were simulated using the proposed approach. ¹⁸F-PET image quality degraded with decreasing activity concentrations, with comparable visual image characteristics in measured and corresponding simulated PET images. This result was confirmed by quantification of the image quality metrics: CRC, SNR and BV showed concordant behavior with decreasing activity concentrations for measured and corresponding simulated PET images. Simulation of dose-reduced datasets based on clinical ¹⁸F-FDG PET data demonstrated the clinical applicability of the proposed method. Simulation of PET tracer dose reduction is thus possible with retrospective undersampling of list-mode data. The resulting simulated low-dose images have characteristics equivalent to PET images actually measured at lower doses and can be used to derive optimal tracer dose regimes.
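The core of the described dose-reduction simulation is simply the random deletion of a fixed fraction of list-mode events before reconstruction. A sketch of that undersampling step on a generic event array follows; real list-mode formats are vendor-specific and carry per-event timing, which is preserved here by keeping row order.

```python
# Retrospective undersampling of list-mode data: randomly retain a fixed
# fraction of coincidence events. The event layout below is a placeholder.
import numpy as np

def undersample_listmode(events, keep_fraction, seed=0):
    """Randomly retain `keep_fraction` of PET list-mode events.

    events: (N, k) array, one row per event (timestamp, crystal indices,
    TOF bin, ...). Acquisition order is preserved by keeping row order.
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(len(events)) < keep_fraction
    return events[mask]

full = np.arange(1_000_000 * 3, dtype=np.int64).reshape(-1, 3)  # dummy events
half_dose = undersample_listmode(full, 0.5)
print(f"kept {len(half_dose) / len(full):.1%} of events")  # ~50% = half dose
```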
Hart, Rheannon M.; Green, W. Reed; Westerman, Drew A.; Petersen, James C.; DeLanois, Jeanne L.
2012-01-01
Lake Maumelle, located in central Arkansas northwest of the cities of Little Rock and North Little Rock, is one of two principal drinking-water supplies for the Little Rock and North Little Rock, Arkansas, metropolitan areas. Lake Maumelle and the Maumelle River (its primary tributary) are more pristine than most other reservoirs and streams in the region, with 80 percent of the land area in the entire watershed being forested. However, as the Lake Maumelle watershed becomes increasingly urbanized and timber harvesting becomes more extensive, concerns about the sustainability of the quality of the water supply also have increased. Two hydrodynamic and water-quality models were developed to examine the hydrology and water quality in the Lake Maumelle watershed and changes that might occur as the watershed becomes more urbanized and timber harvesting becomes more extensive. A Hydrologic Simulation Program–FORTRAN watershed model was developed using continuous streamflow and discrete suspended-sediment and water-quality data collected from January 2004 through 2010. A CE–QUAL–W2 model was developed to simulate reservoir hydrodynamics and selected water-quality characteristics using the simulated output from the Hydrologic Simulation Program–FORTRAN model from January 2004 through 2010. The calibrated Hydrologic Simulation Program–FORTRAN model and the calibrated CE–QUAL–W2 model were developed to simulate three land-use scenarios and to examine the potential effects of these land-use changes, as defined in the model, on the water quality of Lake Maumelle during the 2004 through 2010 simulation period. These scenarios included a scenario that simulated conversion of most land in the watershed to forest (scenario 1), a scenario that simulated conversion of potentially developable land to low-intensity urban land use in part of the watershed (scenario 2), and a scenario that simulated timber harvest in part of the watershed (scenario 3). Simulated land-use changes for scenarios 1 and 3 resulted in little (generally less than 10 percent) overall effect on the simulated water quality in the Hydrologic Simulation Program–FORTRAN model. The land-use change of scenario 2 affected subwatersheds that include Bringle, Reece, and Yount Creek tributaries and most other subwatersheds that drain into the northern side of Lake Maumelle; large percent increases in loading rates (generally between 10 and 25 percent) included dissolved nitrite plus nitrate nitrogen, dissolved orthophosphate, total phosphorus, suspended sediment, dissolved ammonia nitrogen, total organic carbon, and fecal coliform bacteria. For scenario 1, the simulated changes in nutrient, suspended sediment, and total organic carbon loads from the Hydrologic Simulation Program–FORTRAN model resulted in very slight (generally less than 10 percent) changes in simulated water quality for Lake Maumelle, relative to the baseline condition. Following lake mixing in the falls of 2006 and 2007, phosphorus and nitrogen concentrations were higher than the baseline condition and chlorophyll a responded accordingly. The increased nutrient and chlorophyll a concentrations in late October and into 2007 were enough to increase concentrations, on average, for the entire simulation period (2004–10).
For scenario 2, the simulated changes in nutrient, suspended sediment, total organic carbon, and fecal coliform bacteria loads from the Lake Maumelle watershed resulted in slight changes in simulated water quality for Lake Maumelle, relative to the baseline condition (total nitrogen decreased by 0.01 milligram per liter; dissolved orthophosphate increased by 0.001 milligram per liter; chlorophyll a decreased by 0.1 microgram per liter). The differences in these concentrations are approximately an order of magnitude less than the error between measured and simulated concentrations in the baseline model. During the driest summer in the simulation period (2006), phosphorus and nitrogen concentrations were lower than the baseline condition and chlorophyll a concentrations decreased during the same summer season. The decrease in nitrogen and chlorophyll a concentrations during the dry summer in 2006 was enough to decrease concentrations of these constituents very slightly, on average, for the entire simulation period (2004–10). For scenario 3, the changes in simulated nutrient, suspended sediment, total organic carbon, and fecal coliform bacteria loads from Lake Maumelle watershed resulted in very slight changes in simulated water quality within Lake Maumelle, relative to the baseline condition, for most of the reservoir. Among the implications of the results of the modeling described in this report are those related to scale in both space and time. Spatial scales include limited size and location of land-use changes, their effects on loading rates, and resultant effects on water quality of Lake Maumelle. Temporally, the magnitude of the water-quality changes simulated by the land-use change scenarios over the 7-year period (2004–10) are not necessarily indicative of the changes that could be expected to occur with similar land-use changes persisting over a 20-, 30-, or 40- year period, for example. These implications should be tempered by realization of the described model limitations. The Hydrologic Simulation Program–FORTRAN watershed model was calibrated to streamflow and water-quality data from five streamflow-gaging stations, and in general, these stations characterize a range of subwatershed areas with varying land-use types. The CE–QUAL–W2 reservoir model was calibrated to water-quality data collected during January 2004 through December 2010 at three reservoir stations, representing the upper, middle, and lower sections of the reservoir. In general, the baseline simulation for the Hydrologic Simulation Program–FORTRAN and the CE–QUAL–W2 models matched reasonably well to the measured data. Simulated and measured suspended-sediment concentrations during periods of base flow (streamflows not substantially influenced by runoff) agree reasonably well for Maumelle River at Williams Junction, the station representing the upper end of the watershed (with differences—simulated minus measured value—generally ranging from -15 to 41 milligrams per liter, and percent difference—relative to the measured value—ranging from -99 to 182 percent) and Maumelle River near Wye, the station just above the reservoir at the lower end (differences generally ranging from -20 to 22 milligrams per liter, and percent difference ranging from -100 to 194 percent). 
In general, water temperature and dissolved-oxygen concentration simulations followed measured seasonal trends for all stations with the largest differences occurring during periods of lowest temperatures or during the periods of lowest measured dissolved-oxygen concentrations. For the CE–QUAL–W2 model, simulated vertical distributions of water temperatures and dissolved-oxygen concentrations agreed with measured vertical distributions over time, even for the most complex water-temperature profiles. Considering the oligotrophic-mesotrophic (low to intermediate primary productivity and associated low nutrient concentrations) condition of Lake Maumelle, simulated algae, phosphorus, and nitrogen concentrations compared well with generally low measured concentrations.
Thoen, Hendrik; Keereman, Vincent; Mollet, Pieter; Van Holen, Roel; Vandenberghe, Stefaan
2013-09-21
The optimization of a whole-body PET system remains a challenging task, as the imaging performance is influenced by a complex interaction of different design parameters. However, it is not always clear which parameters have the largest impact on image quality and are most eligible for optimization. To determine this, we need to be able to assess their influence on image quality. We performed Monte Carlo simulations of a whole-body PET scanner to predict the influence on image quality of three detector parameters: the TOF resolution, the transverse pixel size and depth-of-interaction (DOI) correction. The inner diameter of the PET scanner was 65 cm, small enough to allow physical integration into a simultaneous PET-MR system. Point sources were used to evaluate the influence of transverse pixel size and DOI correction on spatial resolution as a function of radial distance. To evaluate the influence on contrast recovery and pixel noise, a cylindrical phantom of 35 cm diameter was used, representing a large patient. The phantom contained multiple hot lesions of 5 mm diameter, placed at radial distances of 50, 100 and 150 mm from the center of the field of view to study the effects at different radial positions. The non-prewhitening (NPW) observer was used for objective analysis of the detectability of the hot lesions in the cylindrical phantom. Based on this analysis, the NPW-SNR was used to quantify the relative improvements in image quality due to changes of the variable detector parameters. The image quality of a whole-body PET scanner can be improved significantly by reducing the transverse pixel size from 4 to 2.6 mm and improving the TOF resolution from 600 to 400 ps and further from 400 to 200 ps. Compared to pixel size, the TOF resolution has the larger potential to increase image quality for the simulated phantom. The introduction of two-layer DOI correction only leads to a modest improvement for the spheres at a radial distance of 150 mm from the center of the transaxial FOV.
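For reference, the NPW observer applies the expected signal itself as a matched template, and detectability is the separation of template outputs between lesion-present and lesion-absent images. The empirical sketch below uses synthetic white noise and the usual SNR definition; it is not the paper's reconstruction pipeline.

```python
# Empirical NPW-SNR on synthetic images: apply the expected signal as a
# matched template to lesion-present and lesion-absent noise realizations.
import numpy as np

rng = np.random.default_rng(3)
size, n_img = 32, 200
y, x = np.mgrid[:size, :size]
signal = np.exp(-((x - 16) ** 2 + (y - 16) ** 2) / (2 * 2.5 ** 2))  # blob

absent = rng.normal(0.0, 1.0, (n_img, size, size))   # noise-only images
present = absent + 0.8 * signal                      # lesion added

template = signal.ravel()
t0 = absent.reshape(n_img, -1) @ template            # NPW template outputs
t1 = present.reshape(n_img, -1) @ template
snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))
print(f"NPW-SNR = {snr:.2f}")
```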
NASA Astrophysics Data System (ADS)
Cheng, Tian-Le; Ma, Fengde D.; Zhou, Jie E.; Jennings, Guy; Ren, Yang; Jin, Yongmei M.; Wang, Yu U.
2012-01-01
Diffuse scattering contains rich information on various structural disorders, thus providing a useful means to study nanoscale structural deviations from the average crystal structures determined by Bragg peak analysis. Extraction of maximal information from diffuse scattering requires concerted efforts in high-quality three-dimensional (3D) data measurement, quantitative data analysis and visualization, theoretical interpretation, and computer simulations. Such an endeavor is undertaken here to study the correlated dynamic atomic position fluctuations caused by thermal vibrations (phonons) in the precursor state of shape-memory alloys. High-quality 3D diffuse scattering intensity data around representative Bragg peaks are collected by using in situ high-energy synchrotron x-ray diffraction and a two-dimensional digital x-ray detector (image plate). Computational algorithms and codes are developed to construct the 3D reciprocal-space map of diffuse scattering intensity distribution from the measured data, which are further visualized and quantitatively analyzed to reveal in situ physical behaviors. The diffuse scattering intensity distribution is explicitly formulated in terms of atomic position fluctuations to interpret the experimental observations and identify the most relevant physical mechanisms, which help set up reduced structural models with minimal parameters to be efficiently determined by computer simulations. These combined procedures are demonstrated by a study of the phonon softening phenomenon in the precursor state and premartensitic transformation of a Ni-Mn-Ga shape-memory alloy.
[Seedling index of Salvia miltiorrhiza and its simulation model].
Huang, Shu-Hua; Xu, Fu-Li; Wang, Wei-Ling; Du, Jun-Bo; Ru, Mei; Wang, Jing; Cao, Xian-Yan
2012-10-01
Through correlation analysis of the quantitative traits and their ratios of Salvia miltiorrhiza seedlings and seedling quality, a series of representative indices reflecting the seedling quality of this plant species were determined, and a seedling index suitable for S. miltiorrhiza seedlings was ascertained by correlation degree analysis. Based on the relationships between the seedling index and air temperature, solar radiation and air humidity, a simulation model for the seedling index of S. miltiorrhiza was established. Experimental data from different test plots and planting dates were used to validate the model. The results showed that root diameter, stem diameter, crown dry mass, root dry mass, and plant dry mass had significant positive relationships with the other traits and could be used as indicators of seedling health. The seedling index of S. miltiorrhiza can be calculated as (stem diameter/root diameter + root dry mass/crown dry mass) × plant dry mass. Stem diameter, root dry mass, crown dry mass and plant dry mass had higher correlations with the seedling index, and thus the seedling index determined by these indicators better reflects seedling quality. The coefficient of determination (R²) between predicted and measured values based on the 1:1 line was 0.95, and the root mean squared error (RMSE) was 0.15, indicating that the model established in this study can precisely reflect the quantitative relationships between the seedling index of S. miltiorrhiza and the environmental factors.
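The seedling index formula quoted above, written out as a worked function; the sample measurements are hypothetical and serve only to show the arithmetic.

```python
# Seedling index as defined in the abstract. Sample values are made up.
def seedling_index(stem_diam, root_diam, root_dry_mass, crown_dry_mass,
                   plant_dry_mass):
    """(stem/root diameter ratio + root/crown dry-mass ratio) x plant dry mass."""
    return (stem_diam / root_diam
            + root_dry_mass / crown_dry_mass) * plant_dry_mass

# Hypothetical measurements (diameters in mm, masses in g):
print(seedling_index(stem_diam=2.1, root_diam=3.0,
                     root_dry_mass=0.42, crown_dry_mass=0.61,
                     plant_dry_mass=1.05))
```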
Assessment of PM10 enhancement by yellow sand on the air quality of Taipei, Taiwan in 2001.
Chang, Shuenn-Chin; Lee, Chung-Te
2007-09-01
Long-range transport of yellow sand from the Asian continent to the Taipei Metropolitan Area (Taipei) not only deteriorates air quality but also poses health risks, especially to children and the elderly. As such, it is important to assess the enhancement of PM10 during yellow sand periods. In order to estimate PM10 enhancement, we adopted factor analysis to distinguish yellow-sand (YS) periods from non-yellow-sand (NYS) periods based on air quality monitoring records. Eight YS events between January 2001 and May 2001 were identified using factor analysis coupled with an independent validation procedure of checking background site values, examining meteorological conditions, and modeling air mass trajectories. The duration of each event varied from 11 to 132 h, identified from the times when the PM10 level was high and the CO and NOx levels were low. Subsequently, we used an artificial neural network (ANN) to simulate local PM10 levels from related parameters, including local gaseous pollutants and meteorological factors, during the NYS periods. The PM10 enhancement during the YS periods is then calculated by subtracting the simulated PM10 from the observed PM10 levels. Based on our calculations, the PM10 enhancement in the maximum hour of each event ranged from 51 to 82%. Moreover, in the eight events identified in 2001, an estimated total of 7,210 tons of PM10 was transported by yellow sand to Taipei. Thus, in this study, we demonstrate that an integration of factor analysis with an ANN model provides a very useful method for identifying YS periods and determining PM10 enhancement caused by yellow sand.
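The enhancement estimate reduces to training a regressor on NYS hours to predict local PM10 from co-pollutants and meteorology, then subtracting its prediction from observations during YS events. The sketch below uses synthetic data and a placeholder MLP configuration, not the paper's feature set or network design.

```python
# Baseline-subtraction estimate of dust enhancement: regressor trained on
# non-yellow-sand hours, applied to yellow-sand hours. All data synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
# feature columns (stand-ins): CO, NOx, wind speed, temperature
X_nys = rng.random((2000, 4))
pm10_nys = (30 + 40 * X_nys[:, 0] + 25 * X_nys[:, 1] - 10 * X_nys[:, 2]
            + rng.normal(0, 3, 2000))            # local PM10 on NYS hours

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_nys, pm10_nys)

X_ys = rng.random((24, 4))                       # one YS event, hourly
pm10_observed = model.predict(X_ys) + rng.uniform(40, 120, 24)  # dust added
enhancement = pm10_observed - model.predict(X_ys)  # transported fraction
print(f"mean YS enhancement ~ {enhancement.mean():.0f} ug/m3")
```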
Regional impacts of oil and gas development on ozone formation in the western United States.
Rodriguez, Marco A; Barna, Michael G; Moore, Tom
2009-09-01
The Intermountain West is currently experiencing increased growth in oil and gas production, which has the potential to affect the visibility and air quality of various Class I areas in the region. The following work presents an analysis of these impacts using the Comprehensive Air Quality Model with extensions (CAMx). CAMx is a state-of-the-science, "one-atmosphere" Eulerian photochemical dispersion model that has been widely used in the assessment of gaseous and particulate air pollution (ozone, fine [PM2.5], and coarse [PM10] particulate matter). Meteorology and emissions inventories developed by the Western Regional Air Partnership Regional Modeling Center for regional haze analysis and planning are used to establish an ozone baseline simulation for the year 2002. The predicted range of values for ozone in the national parks and other Class I areas in the western United States is then evaluated with available observations from the Clean Air Status and Trends Network (CASTNET). This evaluation demonstrates the model's suitability for subsequent planning, sensitivity, and emissions control strategy modeling. Once the ozone baseline simulation has been established, an analysis of the model results is performed to investigate the regional impacts of oil and gas development on the ozone concentrations that affect the air quality of Class I areas. Results indicate that the maximum 8-hr ozone enhancement from oil and gas (9.6 parts per billion [ppb]) could affect southwestern Colorado and northwestern New Mexico. Class I areas in this region that are likely to be impacted by increased ozone include Mesa Verde National Park and Weminuche Wilderness Area in Colorado and San Pedro Parks Wilderness Area, Bandelier Wilderness Area, Pecos Wilderness Area, and Wheeler Peak Wilderness Area in New Mexico.
Trends in air quality across the Northern Hemisphere over a 21-year period (1990–2010) were simulated using the Community Multiscale Air Quality (CMAQ) multiscale chemical transport model driven by meteorology from Weather Research and Forecasting (WRF) simulations and internally ...
NASA Astrophysics Data System (ADS)
Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Rode, Michael
2014-05-01
Hydrological water quality modeling is increasingly used for investigating runoff and nutrient transport processes as well as for watershed management, but it is largely unclear how data availability determines model identification. In this study, the HYPE (HYdrological Predictions for the Environment) model, a process-based, semi-distributed hydrological water quality model, was applied in two different mesoscale catchments (Selke, 463 km², and Weida, 99 km²) located in central Germany to simulate discharge and inorganic nitrogen (IN) transport. PEST and DREAM(ZS) were combined with the HYPE model to conduct parameter calibration and uncertainty analysis. A split-sample test was used for model calibration (1994-1999) and validation (1999-2004). IN concentration and daily IN load were found to be highly correlated with discharge, indicating that IN leaching is mainly controlled by runoff. Both dynamics and balances of water and IN load were well captured, with NSE greater than 0.83 during the validation period. Multi-objective calibration (calibrating hydrological and water quality parameters simultaneously) was found to outperform step-wise calibration in terms of model robustness. Multi-site calibration was able to improve model performance at internal sites and to decrease parameter posterior uncertainty and prediction uncertainty. Nitrogen-process parameters calibrated against continuous daily averages of nitrate-N concentration observations produced better and more robust simulations of IN concentration and load, with lower posterior parameter uncertainty and IN concentration prediction uncertainty, than calibration against discontinuous biweekly nitrate-N concentration measurements. Both PEST and DREAM(ZS) are efficient in parameter calibration. However, DREAM(ZS) is more sound in terms of parameter identification and uncertainty analysis than PEST because of its capability to evolve parameter posterior distributions and estimate prediction uncertainty based on global search and Bayesian inference schemes.
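For reference, the Nash-Sutcliffe efficiency quoted above (NSE greater than 0.83) is one minus the ratio of the residual sum of squares to the total sum of squares of the observations about their mean; a quick implementation with made-up values:

```python
# Nash-Sutcliffe efficiency: 1.0 is a perfect fit, 0.0 means the model is
# no better than the observed mean. Sample values are invented.
import numpy as np

def nse(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) \
               / np.sum((observed - observed.mean()) ** 2)

obs = np.array([3.1, 4.0, 6.5, 9.2, 5.4, 3.8])   # e.g. daily IN load
sim = np.array([3.4, 3.7, 6.1, 8.8, 5.9, 3.5])
print(f"NSE = {nse(obs, sim):.2f}")
```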
Water quality simulation of sewage impacts on the west coast of Mumbai, India.
Vijay, R; Khobragade, P J; Sohony, R A
2010-01-01
Most coastal cities use the ocean as a site of waste disposal, where pollutant loading degrades the quality of coastal waters. Presently, the west coast of Mumbai receives partially treated effluent from wastewater treatment facilities through ocean outfalls and discharges into creeks, as well as wastewater/sewage from various open drains and nallahs, which affect the water quality of creek and coastal waters. The objective of this paper is therefore to simulate and assess the hydrodynamic behaviour and water quality impacts of sewage and wastewater discharges along the west coast of Mumbai. Hydrodynamics and water quality were simulated for present conditions and validated using measured tide and current data and observed DO, BOD, and FC. Observed and simulated results indicated non-compliance with standards in the Malad and Mahim creeks and in the impact zones of the ocean outfalls. The developed model could be used to generate various hydrodynamic and water quality conditions reflecting improvements in wastewater collection systems, treatment levels, and disposal, for better planning and management of creeks and the coastal environment.
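The abstract does not specify its water quality formulation. As a generic point of reference for BOD-DO screening in rivers and tidal channels (not the authors' model), the classic Streeter-Phelps oxygen-sag solution can be evaluated as below; every parameter value here is hypothetical.

```python
import numpy as np

# Streeter-Phelps dissolved-oxygen sag: deficit grows from BOD decay (k_d)
# and recovers through reaeration (k_a). Requires k_a != k_d.
k_d = 0.35      # BOD decay rate (1/day), hypothetical
k_a = 0.70      # reaeration rate (1/day), hypothetical
L0 = 12.0       # initial BOD (mg/L)
D0 = 1.5        # initial DO deficit (mg/L)
DO_sat = 8.0    # saturation DO (mg/L)

t = np.linspace(0.0, 10.0, 101)  # travel time (days)
deficit = (k_d * L0 / (k_a - k_d)) * (np.exp(-k_d * t) - np.exp(-k_a * t)) \
          + D0 * np.exp(-k_a * t)
DO = DO_sat - deficit
print(f"Minimum DO = {DO.min():.2f} mg/L at t = {t[DO.argmin()]:.1f} days")
```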
NASA Astrophysics Data System (ADS)
Ghoveisi, H.; Al Dughaishi, U.; Kiker, G.
2017-12-01
Maintaining water quality in agricultural watersheds is a worldwide challenge, especially where furrow irrigation is practiced. The Yakima River Basin watershed in south central Washington State (USA) is an example of these impacted areas, with elevated loads of sediments and other agricultural products due to runoff from furrow-irrigated fields. Within the Yakima basin, the Granger Drain watershed (area of 75 km2) is particularly challenged in this regard, with more than 400 flood-irrigated individual parcels (area of 21 km2) growing a variety of crops from maize to grapes. Alternatives for improving water quality from furrow-irrigated parcels include vegetated filter strip (VFS) implementation, furrow water application efficiency, polyacrylamide (PAM) application, and irrigation scheduling. These alternatives were simulated separately and in combination to explore potential Best Management Practices (BMPs) for runoff-related pollution reduction in a spatially explicit, agent-based modeling system (QnD:GrangerDrain). Two regulatory scenarios for BMP adoption within individual parcels were tested. A blanket-style regulatory scenario simulated a total of 60 BMP combinations implemented in all 409 furrow-irrigated parcels. A second regulatory scenario simulated the BMPs in 119 furrow-irrigated parcels designated as "hotspots" based on a 12 Mg ha-1 seasonal sediment load standard. The simulated cumulative runoff and sediment loading from all BMP alternatives were ranked using Multiple Criteria Decision Analysis (MCDA), specifically the Stochastic Multi-Attribute Acceptability Analysis (SMAA) method. Several BMP combinations proved successful in reducing loads below a 25 NTU (91 mg L-1) regulatory sediment concentration. The QnD:GrangerDrain simulations and subsequent MCDA ranking revealed that the BMP combinations of 5 m VFS and high furrow water efficiency were highly ranked alternatives for both the blanket and hotspot scenarios.
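The core idea of SMAA-style ranking is to sample criterion weights at random and tally how often each alternative ranks first. The sketch below illustrates that idea only; the BMP names, criterion scores, and weight distribution are all hypothetical, not the QnD:GrangerDrain values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized criterion scores (rows: BMP alternatives,
# columns: criteria such as runoff and sediment reduction; higher = better).
scores = np.array([[0.8, 0.6],    # 5 m VFS + high furrow efficiency
                   [0.5, 0.9],    # PAM application
                   [0.4, 0.4]])   # irrigation scheduling only

n_draws, n_alts = 10_000, scores.shape[0]
first_rank = np.zeros(n_alts)
for _ in range(n_draws):
    w = rng.dirichlet(np.ones(scores.shape[1]))  # random weights summing to 1
    first_rank[np.argmax(scores @ w)] += 1

# Rank-1 acceptability: share of the weight space where each BMP ranks first.
print(first_rank / n_draws)
```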
Janssens, Sarah; Beckmann, Michael; Bonney, Donna
2015-08-01
Simulation training in laparoscopic surgery has been shown to improve surgical performance. The aim was to describe the implementation of a laparoscopic simulation training and credentialing program for gynaecology registrars. A pilot program consisting of protected, supervised laparoscopic simulation time, a tailored curriculum, and a credentialing process was developed and implemented. Quantitative measures assessing simulated surgical performance were taken over the simulation training period. Laparoscopic procedures requiring credentialing were assessed for both the frequency of a registrar being the primary operator and the duration of surgery, and compared to a presimulation cohort. Qualitative measures regarding quality of surgical training were assessed pre- and postsimulation. Improvements were seen in simulated surgical performance in efficiency domains. Operative time for procedures requiring credentialing was reduced by 12%. Primary operator status in the operating theatre for registrars was unchanged. Registrar assessment of training quality improved. The introduction of a laparoscopic simulation training and credentialing program resulted in improvements in simulated performance, reduced operative time, and improved registrar assessment of the quality of training. © 2015 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.
NASA Astrophysics Data System (ADS)
Hitt, O.; Hutchins, M.
2016-12-01
UK river waters face considerable future pressures, primarily from population growth and climate change. In understanding controls on river water quality, experimental studies have successfully identified responses to single or paired stressors under controlled conditions. Generalised Linear Model (GLM) approaches are commonly used to quantify stressor-response relationships. To explore a wider variety of stressors, physics-based models are used. Our objective is to evaluate how five different types of stressor influence the severity of river eutrophication and its impact on Dissolved Oxygen (DO), an integrated measure of river ecological health. This is done by applying a physics-based river quality model for 4 years at a daily time step to a 92 km stretch in the 3445 km2 Thames (UK) catchment. To understand the impact of model structural uncertainty, we present results from two alternative formulations of the biological response. Sensitivity analysis carried out using the QUESTOR model (QUality Evaluation and Simulation TOol for River systems) considered gradients of various stressors: river flow, water temperature, urbanisation (abstractions and sewage/industrial effluents), phosphate concentrations in effluents and tributaries, and riparian tree shading (modifying the light input). Scalar modifiers applied to the 2009-12 time-series inputs define the gradients. The model has been run for each combination of the values of these 5 variables. Results are analysed using graphical methods in order to identify variation in the type of relationship between different pairs of stressors on the system response. The method allows all outputs from each combination of stressors to be displayed in one graphic, showing the results of hundreds of model runs simultaneously. This approach can be carried out for all stressor pairs, and many locations/determinands. Supporting statistical analysis (GLM) reinforces the findings from the graphical analysis. Analysis suggests that climate-driven variables (flow and river temperature) explain much of the variation in DO content. An indicator of the low DO values typically seen in summer is chosen (10th percentile). Increasing temperature clearly has adverse effects, lowering DO, as illustrated in three example graphics.
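As a hedged sketch of the GLM-based stressor-response quantification mentioned above (not the authors' QUESTOR analysis), the fragment below fits a Gaussian GLM of a low-DO indicator on flow and temperature modifiers; all variable names and values are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical stressor-response table: each row stands for one model run,
# the response being the 10th-percentile summer DO (mg/L).
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "flow_scale": rng.uniform(0.5, 1.5, 200),   # scalar flow modifier
    "temp_offset": rng.uniform(0.0, 4.0, 200),  # temperature increment (K)
})
df["do_p10"] = (7.0 + 1.2 * df["flow_scale"] - 0.4 * df["temp_offset"]
                + rng.normal(0.0, 0.3, 200))

glm = smf.glm("do_p10 ~ flow_scale + temp_offset", data=df,
              family=sm.families.Gaussian()).fit()
print(glm.summary())
```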
2016-09-01
TECHNICAL REPORT 3046, September 2016. GENERATION OF QUALITY PULSES FOR CONTROL OF QUBIT/QUANTUM MEMORY SPIN STATES: EXPERIMENTAL AND SIMULATION ... nuclear spin states of qubits/quantum memory applicable to semiconductor, superconductor, ionic, and superconductor-ionic hybrid technologies. As the ... pulse quality and the need for development of single pulses with very high quality will directly impact the coherence time of the qubit/memory, we present
Evaluation of near surface ozone and particulate matter in air ...
In this study, techniques typically used for future air quality projections are applied to a historical 11-year period to assess the performance of the modeling system when the driving meteorological conditions are obtained using dynamical downscaling of coarse-scale fields without correcting toward higher-resolution observations. The Weather Research and Forecasting model and the Community Multiscale Air Quality model are used to simulate regional climate and air quality over the contiguous United States for 2000–2010. The air quality simulations for that historical period are then compared to observations from four national networks. Comparisons are drawn between defined performance metrics and other published modeling results for predicted ozone, fine particulate matter, and speciated fine particulate matter. The results indicate that the historical air quality simulations driven by dynamically downscaled meteorology are typically within defined modeling performance benchmarks and are consistent with results from other published modeling studies using finer-resolution meteorology. This indicates that the regional climate and air quality modeling framework utilized here does not introduce substantial bias, which provides confidence in the method’s use for future air quality projections. This paper shows that if emissions inputs and coarse-scale meteorological inputs are reasonably accurate, then air quality can be simulated with acceptable accuracy even wi
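The performance benchmarks referred to above are commonly expressed as normalized mean bias (NMB) and normalized mean error (NME). The sketch below computes both; the metric definitions are standard in air quality model evaluation, but the ozone values are hypothetical.

```python
import numpy as np

def normalized_mean_bias(obs, mod):
    """NMB (%): sum of (model - observed) over sum of observed."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    return 100.0 * np.sum(mod - obs) / np.sum(obs)

def normalized_mean_error(obs, mod):
    """NME (%): sum of |model - observed| over sum of observed."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    return 100.0 * np.sum(np.abs(mod - obs)) / np.sum(obs)

# Hypothetical daily maximum 8-hr ozone (ppb): observed vs. modeled.
obs = np.array([52.0, 61.0, 70.0, 48.0, 66.0])
mod = np.array([55.0, 58.0, 74.0, 50.0, 63.0])
print(f"NMB = {normalized_mean_bias(obs, mod):.1f}%, "
      f"NME = {normalized_mean_error(obs, mod):.1f}%")
```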
NASA Astrophysics Data System (ADS)
Regnery, Julia; Lee, Jonghyun; Drumheller, Zachary W.; Drewes, Jörg E.; Illangasekare, Tissa H.; Kitanidis, Peter K.; McCray, John E.; Smits, Kathleen M.
2017-05-01
Meaningful model-based predictions of water quality and quantity are imperative for the designed footprint of managed aquifer recharge (MAR) installations. A two-dimensional (2D) synthetic MAR system equipped with automated sensors (temperature, water pressure, conductivity, soil moisture, oxidation-reduction potential) and embedded water sampling ports was used to test and model fundamental subsurface processes during surface-spreading MAR operations under controlled flow and redox conditions at the meso-scale. The fate and transport of contaminants in the variably saturated synthetic aquifer were simulated using the finite element analysis model FEFLOW. In general, the model concurred with travel times derived from contaminant breakthrough curves at individual sensor locations throughout the 2D tank. However, discrepancies between measured and simulated trace organic chemical concentrations (i.e., carbamazepine, sulfamethoxazole, tris (2-chloroethyl) phosphate, trimethoprim) were observed. While the FEFLOW simulation captured the overall shapes of the trace organic chemical breakthrough curves well, the model struggled to match individual data points, even though compound-specific attenuation parameters were used. Interestingly, despite steady-state operation, oxidation-reduction potential measurements indicated temporal disturbances in hydraulic properties in the saturated zone of the 2D tank that affected water quality.
Dynamic Simulation Research on Chain Drive Mechanism of Corn Seeder Based on ADAMS
NASA Astrophysics Data System (ADS)
Wang, Y. B.; Jia, H. P.
2017-12-01
To reduce damage to the chain and improve seeding quality, the chain drive mechanism of a corn seeder, whose performance determines seeding quality and several technical indexes during operation, was analyzed dynamically using a virtual prototype in ADAMS. In the simulation, the corn planter travels at 5 km/h and results are examined over the interval 0.1~0.9 s; the chain velocity of 0.12 m/s equals the chain speed when the seeder runs normally. The simulated motion of the chain drive is essentially consistent with the actual behaviour, but apparent speed fluctuations of the drive wheel introduce accelerations and additional dynamic loads that seriously damage the chain drive, with a maximum load value of 47.28 N. To reduce chain damage, the seeder should be kept at a reasonably uniform speed during operation and large accelerations should be avoided; these results provide a reference for the design of corn seeder chain drives.
NASA Technical Reports Server (NTRS)
Colarco, Peter; daSilva, Arlindo; Ginoux, Paul; Chin, Mian; Lin, S.-J.
2003-01-01
Mineral dust aerosols have radiative impacts on Earth's atmosphere, have been implicated in local and regional air quality issues, and have been identified as vectors for transporting disease pathogens and bringing mineral nutrients to terrestrial and oceanic ecosystems. We present for the first time dust simulations using online transport and meteorological analysis in the NASA Finite-Volume General Circulation Model (FVGCM). Our dust formulation follows the formulation in the offline Georgia Institute of Technology-Goddard Global Ozone Chemistry Aerosol Radiation and Transport Model (GOCART) using a topographical source for dust emissions. We compare results of the FVGCM simulations with GOCART, as well as with in situ and remotely sensed observations. Additionally, we estimate budgets of dust emission and transport into various regions.
NASA Astrophysics Data System (ADS)
Nagarajan, S. G.; Srinivasan, M.; Aravinth, K.; Ramasamy, P.
2018-04-01
A transient simulation was carried out to analyze the heat transfer properties of a Directional Solidification (DS) furnace. The simulation results revealed that an additional heat exchanger block under the bottom insulation of the DS furnace enhanced control of the solidification of the silicon melt. A controlled heat extraction rate during solidification of the silicon melt is requisite for growing good-quality ingots, and this was achieved by the additional heat exchanger block: a water-circulating plate placed under the bottom insulation. The heat flux analysis of the DS system and the temperature distribution studies of the grown ingot confirm that the added heat exchanger block provides an additional benefit to the mc-Si ingot.
NASA Astrophysics Data System (ADS)
Li, Kai; Deng, Haixiao
2018-07-01
The Shanghai Coherent Light Facility (SCLF) is a quasi-continuous wave hard X-ray free electron laser facility, which is currently under construction. Due to the high repetition rate and high-quality electron beams, it is straightforward to consider X-ray free electron laser oscillator (XFELO) operation for the SCLF. In this paper, the main processes for XFELO design, and parameter optimization of the undulator, X-ray cavity, and electron beam are described. A three-dimensional X-ray crystal Bragg diffraction code, named BRIGHT, was introduced for the first time, which can be combined with the GENESIS and OPC codes for the numerical simulations of the XFELO. The performance of the XFELO of the SCLF is investigated and optimized by theoretical analysis and numerical simulation.
Fu, Guang; Zhang, David Z; He, Allen N; Mao, Zhongfa; Zhang, Kaifei
2018-05-10
A deep understanding of the laser-material interaction mechanism, characterized by laser absorption, is very important in simulating the laser metal powder bed fusion (PBF) process. This is because the laser absorption of material affects the temperature distribution, which influences the thermal stress development and the final quality of parts. In this paper, a three-dimensional finite element analysis model of heat transfer taking into account the effect of material state and phase changes on laser absorption is presented to gain insight into the absorption mechanism, and the evolution of instantaneous absorptance in the laser metal PBF process. The results showed that the instantaneous absorptance was significantly affected by the time of laser radiation, as well as process parameters, such as hatch space, scanning velocity, and laser power, which were consistent with the experiment-based findings. The applicability of this model to temperature simulation was demonstrated by a comparative study, wherein the peak temperature in fusion process was simulated in two scenarios, with and without considering the effect of material state and phase changes on laser absorption, and the simulated results in the two scenarios were then compared with experimental data respectively.
NASA Astrophysics Data System (ADS)
Duan, Pengfei; Lei, Wenping
2017-11-01
A number of disciplines (mechanics, structures, thermal, and optics) are needed to design and build a space camera. Separate design models are normally constructed with each discipline's CAD/CAE tools. Design and analysis are conducted largely in parallel, subject to requirements levied on each discipline, and technical interaction between the different disciplines is limited and infrequent. As a result, a unified view of the space camera design across discipline boundaries is not directly possible in this approach, and generating one would require a large, manual, error-prone process. A collaborative environment built on an abstract model and performance templates allows engineering data and CAD/CAE results to be shared across discipline boundaries within a common interface, helping to attain speedy multivariate design and to directly evaluate optical performance under environmental loadings. A small interdisciplinary engineering team from Beijing Institute of Space Mechanics and Electricity has recently conducted a Structural/Thermal/Optical (STOP) analysis of a space camera within this collaborative environment. STOP analysis evaluates the changes in image quality that arise from structural deformations when the thermal environment of the camera changes throughout its orbit. STOP analyses were conducted for four different test conditions applied during final thermal vacuum (TVAC) testing of the payload on the ground. The STOP simulation process begins with importing an integrated CAD model of the camera geometry into the collaborative environment, within which: 1. independent thermal and structural meshes are generated; 2. the thermal mesh and relevant engineering data for material properties and thermal boundary conditions are used to compute temperature distributions at nodal points in both the thermal and structural meshes through Thermal Desktop, a COTS thermal design and analysis code; 3. thermally induced structural deformations of the camera are then evaluated in Nastran, an industry-standard code for structural design and analysis; 4. thermal and structural results are next imported into SigFit, another COTS tool that computes deformation and best-fit rigid-body displacements for the optical surfaces; 5. SigFit creates a modified optical prescription that is imported into CODE V for evaluation of optical performance impacts. The integrated STOP analysis was validated using TVAC test data. For the four TVAC tests, the relative errors between simulation and test data for measuring-point temperatures were around 5%, and in some test conditions as low as 1%. For image quality (MTF), the relative error between simulation and test was 8.3% in the worst condition and below 5% in all others. The validation shows that the collaborative design and simulation environment can achieve an integrated STOP analysis of a space camera efficiently. Furthermore, the collaborative environment allows an interdisciplinary analysis that formerly might take several months to be completed in two or three weeks, which is well suited to scheme demonstration in the early stages of projects.
CUSUM analysis of learning curves for the head-mounted microscope in phonomicrosurgery.
Chen, Ting; Vamos, Andrew C; Dailey, Seth H; Jiang, Jack J
2016-10-01
To observe the learning curve of the head-mounted microscope in a phonomicrosurgery simulator using cumulative summation (CUSUM) analysis, incorporating a magnetic phonomicrosurgery instrument tracking system (MPTS). Retrospective case series. Eight subjects (6 medical students and 2 surgeons inexperienced in phonomicrosurgery) performed phonomicrosurgical simulation cutting tasks while using the head-mounted microscope for 400 minutes total. Two 20-minute sessions occurred each day for 10 days, with operation quality (Qs) and completion time (T) recorded after each session. CUSUM analysis of Qs and T was performed using subjects' performance data from trials completed with a traditional standing microscope as the success criteria. The motion parameters from the head-mounted microscope were significantly better than those from the standing microscope (P < 0.01), but T was longer than with the standing microscope (P < 0.01). No subject successfully adapted to the head-mounted microscope, as assessed by CUSUM analysis. CUSUM analysis can objectively monitor the learning process associated with a phonomicrosurgical simulator system, ultimately providing a tool to assess learning status. Motion parameters determined by our MPTS showed that, although the head-mounted microscope provides better motion control, worse Qs and longer T resulted; this decrease in Qs is likely a result of the relatively unstable visual environment it provides. Overall, the inexperienced surgeons participating in this study failed to adapt to the head-mounted microscope in our simulated phonomicrosurgery environment. Level of evidence: 4. Laryngoscope, 126:2295-2300, 2016. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.
Cotton, Cary C; Erim, Daniel; Eluri, Swathi; Palmer, Sarah H; Green, Daniel J; Wolf, W Asher; Runge, Thomas M; Wheeler, Stephanie; Shaheen, Nicholas J; Dellon, Evan S
2017-06-01
Topical corticosteroids or dietary elimination are recommended as first-line therapies for eosinophilic esophagitis, but data to directly compare these therapies are scant. We performed a cost-utility comparison of topical corticosteroids and the 6-food elimination diet (SFED) in the treatment of eosinophilic esophagitis, from the payer perspective. We used a modified Markov model based on current clinical guidelines, in which transition between states depended on histologic response simulated at the individual cohort-member level. Simulation parameters were defined by systematic review and meta-analysis to determine the base-case estimates and bounds of uncertainty for sensitivity analysis. Meta-regression models included adjustment for differences in study and cohort characteristics. In the base-case scenario, topical fluticasone was about as effective as SFED but more expensive at a 5-year time horizon ($9261.58 vs $5719.72 per person). SFED was more effective and less expensive than topical fluticasone and topical budesonide in the base-case scenario. Probabilistic sensitivity analysis revealed little uncertainty in relative treatment effectiveness. There was somewhat greater uncertainty in the relative cost of treatments; most simulations found SFED to be less expensive. In a cost-utility analysis comparing topical corticosteroids and SFED for first-line treatment of eosinophilic esophagitis, the therapies were similar in effectiveness. SFED was on average less expensive, and more cost-effective in most simulations, than topical budesonide and topical fluticasone, from a payer perspective and not accounting for patient-level costs or quality of life. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
2011-07-01
... joined the project team in the statistical and research coordination role. Dr. Collin is an employee of the University of Pittsburgh. A successful ... 3. Submit to Ft. Detrick (completed). Milestone: statistical analysis planning: 1. Review planned data metrics and data-gathering tools ... approach to performance assessment for continuous quality improvement, analyzing data with modern statistical techniques to determine the ...
NASA Technical Reports Server (NTRS)
Rosenstein, H.; Mcveigh, M. A.; Mollenkof, P. A.
1973-01-01
The results of a real-time piloted simulation to investigate the handling qualities and performance of a tilting rotor aircraft design are presented. The aerodynamic configuration of the aircraft is described. The procedures for conducting the simulator evaluation are reported. Pilot comments on the aircraft handling qualities under various simulated flight conditions are included. Time histories of selected pilot maneuvers are shown.
The expected results method for data verification
NASA Astrophysics Data System (ADS)
Monday, Paul
2016-05-01
The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: it requires significant technical expertise to load, and it is difficult to verify that it works correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly. Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.
Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.
2016-01-01
Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
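A hedged sketch of the bootstrap SNR-CI idea described above: resample trials with replacement, average each resample into an ERP, compute an SNR, and take the lower percentile bound as the exclusion criterion. The SNR definition here (RMS in a post-stimulus window over RMS in a pre-stimulus baseline) and all data are assumptions for illustration; the paper's exact definition may differ.

```python
import numpy as np

rng = np.random.default_rng(42)

def erp_snr(trials, baseline, signal):
    """SNR of the trial-averaged waveform: RMS in the signal window
    over RMS in the pre-stimulus baseline window."""
    erp = trials.mean(axis=0)
    return (np.sqrt(np.mean(erp[signal] ** 2))
            / np.sqrt(np.mean(erp[baseline] ** 2)))

# Hypothetical single-subject data: 80 trials x 300 samples, with an
# evoked deflection between samples 120 and 180.
n_trials, n_samp = 80, 300
trials = rng.normal(0.0, 1.0, (n_trials, n_samp))
trials[:, 120:180] += 1.0

baseline, signal = slice(0, 100), slice(120, 180)
boot = np.array([
    erp_snr(trials[rng.integers(0, n_trials, n_trials)], baseline, signal)
    for _ in range(2000)
])
snr_lb = np.percentile(boot, 2.5)  # lower bound of a 95% SNR-CI
print(f"SNR lower bound = {snr_lb:.2f}")  # exclude subject if below criterion
```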
BACT Simulation User Guide (Version 7.0)
NASA Technical Reports Server (NTRS)
Waszak, Martin R.
1997-01-01
This report documents the structure and operation of a simulation model of the Benchmark Active Control Technology (BACT) Wind-Tunnel Model. The BACT system was designed, built, and tested at NASA Langley Research Center as part of the Benchmark Models Program and was developed to perform wind-tunnel experiments to obtain benchmark quality data to validate computational fluid dynamics and computational aeroelasticity codes, to verify the accuracy of current aeroservoelasticity design and analysis tools, and to provide an active controls testbed for evaluating new and innovative control algorithms for flutter suppression and gust load alleviation. The BACT system has been especially valuable as a control system testbed.
Impact of particulate air pollution on quality-adjusted life expectancy in Canada.
Coyle, Douglas; Stieb, Dave; Burnett, Richard T; DeCivita, Paul; Krewski, Daniel; Chen, Yue; Thun, Michael J
Air pollution and premature death are important public health concerns. Analyses have repeatedly demonstrated that airborne particles are associated with increased mortality, and estimates have been used to forecast the impact on life expectancy. In this analysis, we draw upon data from the American Cancer Society (ACS) cohort and literature on utility-based measures of quality of life in relation to health status to more fully quantify the effects of air pollution on mortality in terms of quality-adjusted life expectancy. The analysis was conducted within a decision analytic model using Monte Carlo simulation techniques. Outcomes were estimated based on projections of the Canadian population. A one-unit reduction in sulfate air pollution would yield a mean annual increase in Quality-Adjusted Life Years (QALYs) of 20,960, with gains being greater for individuals with lower educational status and for males compared to females. This suggests that the impact of reductions in sulfate air pollution on quality-adjusted life expectancy is substantial. Interpretation of the results is unclear. However, the potential gains in QALYs from reduced air pollutants can be contrasted to the costs of policies to bring about such reductions. Based on a tentative threshold for the value of health benefits, analysis suggests that an investment in Canada of over 1 billion dollars per annum would be an efficient use of resources if it could be demonstrated that this would reduce sulfate concentrations in ambient air by 1 µg/m³.
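A minimal sketch of the kind of Monte Carlo propagation such a decision-analytic model performs (not the authors' model; every distribution and parameter below is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims = 10_000

# Hypothetical uncertainty distributions: annual life-years gained from a
# one-unit sulfate reduction, and a utility weight applied to those years.
life_years = rng.normal(25_000.0, 5_000.0, n_sims)  # illustrative only
utility = rng.beta(8.0, 2.0, n_sims)                # quality weight in [0, 1]
qalys = life_years * utility

print(f"Mean annual QALY gain: {qalys.mean():,.0f}")
print(f"95% interval: {np.percentile(qalys, [2.5, 97.5])}")
```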
Massey, Meredith; Roter, Debra L
2016-01-01
Certified nursing assistants (CNAs) provide 80% of the hands-on care in US nursing homes; a significant portion of this work is performed by immigrants with limited English fluency. This study is designed to assess immigrant CNA's communication behavior in response to a series of virtual simulated care challenges. A convenience sample of 31 immigrant CNAs verbally responded to 9 care challenges embedded in an interactive computer platform. The responses were coded with the Roter Interaction Analysis System (RIAS), CNA instructors rated response quality and spoken English was rated. CNA communication behaviors varied across care challenges and a broad repertoire of communication was used; 69% of response content was characterized as psychosocial. Communication elements (both instrumental and psychosocial) were significant predictors of response quality for 5 of 9 scenarios. Overall these variables explained between 13% and 36% of the adjusted variance in quality ratings. Immigrant CNAs responded to common care challenges using a variety of communication strategies despite fluency deficits. Virtual simulation-based observation is a feasible, acceptable and low cost method of communication assessment with implications for supervision, training and evaluation of a para-professional workforce. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Sensory and quality analysis of different melon cultivars after prolonged storage.
Hoberg, Edelgard; Ulrich, Detlef; Schulz, Hartwig; Tuvia-Alkali, Sharon; Fallik, Elazar
2003-10-01
The purpose of this work was to evaluate the sensory and general quality of four different melon cultivars (Cucumis melo L.) immediately after harvest and at the end of storage and marketing simulation. After 16 days of storage at 5 degrees C and an additional 3 days at 20 degrees C, only cultivar 'C-8' had a poor general appearance, due to significantly lower firmness and relatively high decay incidence compared to the cultivars '5080', 'Ideal', and '7302'. The cultivar '7302' was found to have the highest overall quality. The human-sensory and organoleptic analyses revealed that the cultivars can be differentiated on the basis of retronasal odour. The texture of the melons appears to depend on the genotype. All the complex perceptions analysed in this work contribute to acceptability, which was best for the fresh fruits of '7302' and worst for 'Ideal'. After storage and marketing simulation, 'Ideal' and 'C-8' were no longer favoured, but '5080' and '7302', despite their different characters, were found to be similarly accepted. It can be concluded that, with the aid of the human-sensory method developed to characterize the melon varieties, it is possible to distinguish the different genotypes.
Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON.
Mattioni, Michele; Cohen, Uri; Le Novère, Nicolas
2012-01-01
The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now-discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization, and no standard means of saving the results of simulations or of storing the model geometry within the results. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well-designed Python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and uses the HDF standard format to store simulation results. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structure, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability to save numerical results allows users to perform additional analysis on their previous simulations.
ERIC Educational Resources Information Center
Zacharia, Zacharias C.
2005-01-01
This study investigated how individuals' construction of explanations--a way of ascertaining how well an individual understands a concept--develops from an interactive simulation. Specifically, the purpose was to investigate the effect of interactive computer simulations or science textbook assignments on the nature and quality of postgraduate…
Influence of Left-Right Asymmetries on Voice Quality in Simulated Paramedian Vocal Fold Paralysis
ERIC Educational Resources Information Center
Samlan, Robin A.; Story, Brad H.
2017-01-01
Purpose: The purpose of this study was to determine the vocal fold structural and vibratory symmetries that are important to vocal function and voice quality in a simulated paramedian vocal fold paralysis. Method: A computational kinematic speech production model was used to simulate an exemplar "voice" on the basis of asymmetric…
Comeau, Robyn; Craig, Catherine
2014-03-01
Documentation of deliveries complicated by shoulder dystocia is a valuable communication skill necessary for residents to attain during residency training. Our objective was to determine whether the teaching of documentation of shoulder dystocia in a simulation environment would translate to improved documentation of the event in an actual clinical situation. We conducted a cohort study involving obstetrics and gynaecology residents in years 2 to 5 between November 2010 and December 2012. Each resident participated in a shoulder dystocia simulation teaching session and was asked to write a delivery note immediately afterwards. They were given feedback regarding their performance of the delivery and their documentation of the events. Following this, dictated records of shoulder dystocia deliveries immediately before and after the simulation session were identified through the Meditech system. An itemized checklist was used to assess the quality of residents' dictated documentation before and after the simulation session. All 18 eligible residents enrolled in the study, and 17 met the inclusion criteria. For 10 residents (59%), documentation of a delivery with shoulder dystocia was present both before and after the simulation session; for five residents (29%) it was present only before the session, and for two residents (12%) it was present only after the session. When residents were assessed as a group, there were no differences in the proportion of residents recording items on the checklist before and after the simulation session (P > 0.05 for all). Similarly, analysis of the performance of the 10 residents who had dictated documentation both before and after the session showed no differences in the number of elements recorded on dictations done before and after the simulation session (P > 0.05 for all). The teaching of shoulder dystocia documentation through simulation did not result in a measurable improvement in the quality of documentation of shoulder dystocia in actual clinical situations.
NASA Astrophysics Data System (ADS)
Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.
2015-05-01
A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe, groundwater quantity, and in particular quality, has come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economies of the eastern European bloc of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically based, data-driven simulation. In this article we discuss issues with data availability and calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.
NASA Technical Reports Server (NTRS)
Pavel, M.
1993-01-01
This presentation outlines in viewgraph format a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of its significant properties. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. Thus, for example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics. The opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation. The range of evaluation approaches includes visual inspection, rapid evaluation, part-task simulation, and full mission simulation. The work described is focused on two complementary approaches to rapid evaluation. The first approach is based on a model of the human visual system, which is used to predict performance on the selected tasks. The model-based evaluation approach permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks; these are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.
Fidelity assessment of a UH-60A simulation on the NASA Ames vertical motion simulator
NASA Technical Reports Server (NTRS)
Atencio, Adolph, Jr.
1993-01-01
Helicopter handling qualities research requires that a ground-based simulation be a high-fidelity representation of the actual helicopter, especially over the frequency range of the investigation. This experiment was performed to assess the current capability to simulate the UH-60A Black Hawk helicopter on the Vertical Motion Simulator (VMS) at NASA Ames, to develop a methodology for assessing the fidelity of a simulation, and to find the causes for lack of fidelity. The approach used was to compare the simulation to the flight vehicle for a series of tasks performed in flight and in the simulator. The results show that subjective handling qualities ratings from flight to simulator overlap, and the mathematical model matches the UH-60A helicopter very well over the range of frequencies critical to handling qualities evaluation. Pilot comments, however, indicate a need for improvement in the perceptual fidelity of the simulation in the areas of motion and visual cuing. The methodology used to make the fidelity assessment proved useful in showing differences in pilot work load and strategy, but additional work is needed to refine objective methods for determining causes of lack of fidelity.
NASA Astrophysics Data System (ADS)
Woldegiorgis, Befekadu Taddesse; van Griensven, Ann; Pereira, Fernando; Bauwens, Willy
2017-06-01
Most common numerical solutions used in CSTR-based in-stream water quality simulators are susceptible to instabilities and/or solution inconsistencies. Usually, they cope with instability problems by adopting computationally expensive small time steps. However, some simulators use fixed computation time steps and hence do not have the flexibility to do so. This paper presents a novel quasi-analytical solution for CSTR-based water quality simulators of an unsteady system. The robustness of the new method is compared with the commonly used fourth-order Runge-Kutta methods, the Euler method and three versions of the SWAT model (SWAT2012, SWAT-TCEQ, and ESWAT). The performance of each method is tested for different hypothetical experiments. Besides the hypothetical data, a real case study is used for comparison. The growth factors we derived as stability measures for the different methods and the R-factor—considered as a consistency measure—turned out to be very useful for determining the most robust method. The new method outperformed all the numerical methods used in the hypothetical comparisons. The application for the Zenne River (Belgium) shows that the new method provides stable and consistent BOD simulations whereas the SWAT2012 model is shown to be unstable for the standard daily computation time step. The new method unconditionally simulates robust solutions. Therefore, it is a reliable scheme for CSTR-based water quality simulators that use first-order reaction formulations.
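The closed-form relaxation that motivates a quasi-analytical CSTR update can be stated for a single reactor with first-order decay: with dC/dt = (Q/V)(C_in - C) - kC and inputs held constant over a step, the exact update is C(t+Δt) = C_ss + (C(t) - C_ss)·exp(-(Q/V + k)Δt), where C_ss = (Q/V)·C_in/(Q/V + k). The sketch below implements this one-step update as a generic illustration under those assumptions, not the authors' full scheme; all parameter values are hypothetical.

```python
import numpy as np

def cstr_step(c, c_in, q_over_v, k, dt):
    """Exact update for dC/dt = (Q/V)*(C_in - C) - k*C over one step,
    assuming inflow concentration and rates are constant within the step.
    Unconditionally stable for any dt."""
    lam = q_over_v + k
    c_ss = q_over_v * c_in / lam          # steady-state concentration
    return c_ss + (c - c_ss) * np.exp(-lam * dt)

# Hypothetical BOD routing through one reach treated as a CSTR.
c, c_in = 10.0, 4.0              # mg/L
q_over_v, k, dt = 0.8, 0.3, 1.0  # 1/day, 1/day, day
for day in range(5):
    c = cstr_step(c, c_in, q_over_v, k, dt)
    print(f"day {day + 1}: C = {c:.3f} mg/L")
```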
Primary display latency criteria based on flying qualities and performance data
NASA Technical Reports Server (NTRS)
Funk, John D., Jr.; Beck, Corin P.; Johns, John B.
1993-01-01
With pilots' increasing use of visual cue augmentation, much of it requiring extensive pre-processing, there is a need to establish criteria for new avionics/display design. The timeliness and synchronization of the augmented cues are vital to ensure the performance quality required for precision mission task elements (MTEs) where augmented cues are the primary source of information to the pilot. Processing delays incurred while transforming sensor-supplied flight information into visual cues are unavoidable. Relationships between maximum control system delays and associated flying qualities levels are documented in MIL-F-83300 and MIL-F-8785. While cues representing aircraft status may be just as vital to the pilot as prompt control response for operations in instrument meteorological conditions, presently there are no specification requirements on avionics system latency. To produce data relating avionics system latency to degradations in flying qualities, the Navy conducted two simulation investigations. During the investigations, flying qualities and performance data were recorded as simulated avionics system latency was varied. Correlated results of the investigations indicate a detrimental impact of latency on flying qualities. Analysis of these results and consideration of key factors influencing their application indicate that: (1) Task performance degrades and pilot workload increases as latency is increased, and inconsistency in task performance increases as latency increases. (2) Latency reduces the probability of achieving Level 1 handling qualities with avionics system latency as low as 70 ms. (3) The data suggest that the achievement of desired performance will be ensured only at display latency values below 120 ms. (4) These data also suggest that avoidance of inadequate performance will be ensured only at display latency values below 150 ms.
High-throughput full-length single-cell mRNA-seq of rare cells.
Ooi, Chin Chun; Mantalas, Gary L; Koh, Winston; Neff, Norma F; Fuchigami, Teruaki; Wong, Dawson J; Wilson, Robert J; Park, Seung-Min; Gambhir, Sanjiv S; Quake, Stephen R; Wang, Shan X
2017-01-01
Single-cell characterization techniques, such as mRNA-seq, have been applied to a diverse range of applications in cancer biology, yielding great insight into mechanisms leading to therapy resistance and tumor clonality. While single-cell techniques can yield a wealth of information, a common bottleneck is the lack of throughput, with many current processing methods being limited to the analysis of small volumes of single cell suspensions with cell densities on the order of 10^7 per mL. In this work, we present a high-throughput full-length mRNA-seq protocol incorporating a magnetic sifter and magnetic nanoparticle-antibody conjugates for rare cell enrichment, and Smart-seq2 chemistry for sequencing. We evaluate the efficiency and quality of this protocol with a simulated circulating tumor cell system, whereby non-small-cell lung cancer cell lines (NCI-H1650 and NCI-H1975) are spiked into whole blood, before being enriched for single-cell mRNA-seq by EpCAM-functionalized magnetic nanoparticles and the magnetic sifter. We obtain high efficiency (> 90%) capture and release of these simulated rare cells via the magnetic sifter, with reproducible transcriptome data. In addition, while mRNA-seq data is typically only used for gene expression analysis of transcriptomic data, we demonstrate the use of full-length mRNA-seq chemistries like Smart-seq2 to facilitate variant analysis of expressed genes. This enables the use of mRNA-seq data for differentiating cells in a heterogeneous population by both their phenotypic and variant profile. In a simulated heterogeneous mixture of circulating tumor cells in whole blood, we utilize this high-throughput protocol to differentiate these heterogeneous cells by both their phenotype (lung cancer versus white blood cells), and mutational profile (H1650 versus H1975 cells), in a single sequencing run. This high-throughput method can help facilitate single-cell analysis of rare cell populations, such as circulating tumor or endothelial cells, with demonstrably high-quality transcriptomic data.
NASA Astrophysics Data System (ADS)
Hei Tong, Cheuk
2017-04-01
Small particulates can cause long-term impairment to human health, as they can penetrate deep into the respiratory system and deposit on its walls. Under projected climate change, as reported in the literature, atmospheric stability, which has strong effects on the vertical mixing of air pollutants and thus on air quality in Hong Kong, is also expected to vary from the near to the far future. In addition to domestic emissions, Hong Kong also receives significant concentrations of cross-boundary particulates whose nature and movement are correlated with atmospheric conditions. This study aims to examine the relation between atmospheric conditions and air quality over Hong Kong. Past meteorological data are based on Modern Era Retrospective Analysis for Research and Applications (MERRA) reanalysis data. Radiosonde data provided by HKO are also adopted in testing and validating the data. Future meteorological data are simulated by the Weather Research and Forecasting Model (WRF), which dynamically downscales the past and future climate under the A1B scenario simulated by ECHAM5/MPIOM. Air quality data are collected, on the one hand, from ground station data provided by the Environment Protection Department, with selected stations representing local emission and trans-boundary emission respectively. On the other hand, an atmospheric Light Detection and Ranging (LiDAR) instrument, which operates on the radar principle to detect Rayleigh and Mie scattering from atmospheric gas and aerosols, has been adopted to measure vertical aerosol profiles, which have been observed to be tightly related to high-level meteorology. Scattered-signal data are collected and averaged, or selected episodes are used, for comparison with atmospheric stability indices and other meteorological factors. The relation between atmospheric conditions and air quality is examined by statistical analysis, and statistical models are built on the stability indices to project changes in sulphur dioxide, ozone, and particulate matter due to changes in stability in future years.
NASA Astrophysics Data System (ADS)
Bitew, M. M.; Jackson, C. R.; Vache, K. B.; Griffiths, N.; Starr, G.; McDonnell, J.; Rau, B.; Younger, S. E.; Fouts, K.
2016-12-01
Intensively managed loblolly pine is a candidate species for biofuel feedstock production in the southeastern Coastal Plain of the United States. However, the water quantity and quality effects of high-intensity, short-rotation silviculture are largely unknown. Here we evaluate the potential hydrologic and water quality impacts of biofuel-induced land use changes based on model scenarios developed using existing forest BMPs and industry-wide experience. We quantified the effect of bio-energy production scenarios on each of the water balance components by applying an integrated, physically based, distributed watershed modeling system and multi-objective assessment functions that accurately describe the flow regimes, water quality, and isotopic observations from three experimental headwater watersheds of Fourmile Creek at Savannah River Site, SC. The model incorporates optimized travel times of groundwater flowpaths and flow control processes in the riparian region, allowing water quality analysis of groundwater-dominated watershed systems. We compared five different short-rotation pine management scenarios ranging from 35-year (low intensity) to 10-year (high intensity) rotations and a mixture of forestry and agriculture/pasture production practices. Simulation results, based on long-term climate records, revealed that complete conversion to short-rotation woody crops would have a negligible effect on water budget components: <2% decrease in streamflow, <1.5% increase in actual evapotranspiration, an average 0.5 m fall in the groundwater table, and no change in subsurface flow due to biofuel production. Simulation results for a mix of 50% agriculture/pasture and 50% short-rotation woody crops showed the largest deviation in water budget components compared to the reference condition. Analysis of extreme stream flows showed that the largest effect was observed in the low-intensity mixed land use scenario. The smallest effect was in the low-intensity biomass production scenario, with a 0.5% increase in a 100-year return event.
Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.
Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana
2016-01-01
The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effectual business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with respect to currently available results.
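DEX models aggregate qualitative leaf criteria up a hierarchy; the QQ step maps those qualitative values to numbers. As a loose, hypothetical illustration of that roll-up idea only (not the DEX/QQ method itself, which uses rule-based utility tables), the sketch below maps qualitative scores to numbers and aggregates them with weights; the tools, criteria, weights, and scores are all invented.

```python
# Qualitative-to-quantitative mapping for leaf criteria (hypothetical).
QUAL = {"poor": 1.0, "fair": 2.0, "good": 3.0}

# One-level hierarchy: overall quality from three weighted criteria.
WEIGHTS = {"visual": 0.3, "simulation": 0.5, "reporting": 0.2}

# Hypothetical qualitative assessments of two BPSS tools.
TOOLS = {
    "BPSS-A": {"visual": "good", "simulation": "fair", "reporting": "good"},
    "BPSS-B": {"visual": "fair", "simulation": "good", "reporting": "fair"},
}

def score(name: str) -> float:
    """Weighted sum of the numerically mapped qualitative scores."""
    return sum(w * QUAL[TOOLS[name][crit]] for crit, w in WEIGHTS.items())

for name in sorted(TOOLS, key=score, reverse=True):
    print(f"{name}: {score(name):.2f}")
```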
Han, Longtao; Irle, Stephan; Nakai, Hiromi
2018-01-01
We performed nanosecond-timescale computer simulations of clusterization and agglomeration processes of boron nitride (BN) nanostructures in hot, high-pressure gas, starting from eleven different atomic and molecular precursor systems containing boron, nitrogen, and hydrogen at various temperatures from 1500 to 6000 K. The synthesized BN nanostructures self-assemble in the form of cages, flakes, and tubes, as well as amorphous structures. The simulations facilitate the analysis of chemical dynamics, and we are able to predict the optimal conditions concerning temperature and chemical precursor composition for controlling the synthesis process in a high-temperature, high-pressure gas volume. We identify the optimal precursor/temperature choices that lead to the nanostructures of highest quality with the highest rate of synthesis, using a novel parameter of the quality of the synthesis (PQS). Two distinct mechanisms of BN nanotube growth were found, neither of them based on the root-growth process. The simulations were performed using quantum-classical molecular dynamics (QCMD) based on density-functional tight-binding (DFTB) quantum mechanics in conjunction with a divide-and-conquer (DC) linear scaling algorithm, as implemented in the DC-DFTB-K code, enabling the study of systems as large as 1300 atoms in canonical NVT ensembles for 1 ns. PMID:29780513
Persistence of initial conditions in continental scale air quality ...
This study investigates the effect of initial conditions (IC) for pollutant concentrations in the atmosphere and soil on simulated air quality for two continental-scale Community Multiscale Air Quality (CMAQ) model applications. One of these applications was performed for springtime and the second for summertime. Results show that a spin-up period of ten days commonly used in regional-scale applications may not be sufficient to reduce the effects of initial conditions to less than 1% of seasonally-averaged surface ozone concentrations everywhere while 20 days were found to be sufficient for the entire domain for the spring case and almost the entire domain for the summer case. For the summer case, differences were found to persist longer aloft due to circulation of air masses and even a spin-up period of 30 days was not sufficient to reduce the effects of ICs to less than 1% of seasonally-averaged layer 34 ozone concentrations over the southwestern portion of the modeling domain. Analysis of the effect of soil initial conditions for the CMAQ bidirectional NH3 exchange model shows that during springtime they can have an important effect on simulated inorganic aerosols concentrations for time periods of one month or longer. The effects are less pronounced during other seasons. The results, while specific to the modeling domain and time periods simulated here, suggest that modeling protocols need to be scrutinized for a given application and that it cannot be assum