Validation of Robotic Surgery Simulator (RoSS).
Kesavadas, Thenkurussi; Stegemann, Andrew; Sathyaseelan, Gughan; Chowriappa, Ashirwad; Srimathveeravalli, Govindarajan; Seixas-Mikelus, Stéfanie; Chandrasekhar, Rameella; Wilding, Gregory; Guru, Khurshid
2011-01-01
The recent growth of the daVinci Robotic Surgical System as a minimally invasive surgery tool has led to a call for better training of future surgeons. In this paper, a new virtual reality simulator called RoSS is presented. Initial results from two studies, on face and content validity, are very encouraging. Ninety percent of the cohort of expert robotic surgeons felt that the simulator was excellent or somewhat close to the touch and feel of the daVinci console. Content validity of the simulator received 90% approval in some cases. These studies demonstrate that RoSS has the potential to become an important training tool for the daVinci surgical robot.
Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
Rapid E-Learning Simulation Training and User Response
ERIC Educational Resources Information Center
Rackler, Angeline
2011-01-01
A new trend in e-learning development is to have subject matter experts use rapid development tools to create training simulations. This type of training is called rapid e-learning simulation training. Though companies are using rapid development tools to create training quickly and cost effectively, there is little empirical research to indicate…
Generic Modeling of a Life Support System for Process Technology Comparison
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.
Kelly, Michelle M; Blunt, Elizabeth; Nestor, Kelly
2017-12-01
Few nurse practitioner (NP) programs include an after-hours/on-call component in their clinical preparation of NP students. This role is expected in many primary and specialty care practices, and is one that students feel unprepared to navigate competently. Utilizing simulated callers as patients or parents, NP students participated in a simulated after-hours/on-call experience that included receiving the call, managing the patient, and submitting documentation of the encounter. Students completed pre- and postparticipation evaluations, and were evaluated by the simulated patient callers and faculty using standardized evaluation tools. NP students rated the experience as educationally valuable despite feeling anxious and nervous about it. Several essential skills were identified, including critical thinking, clear communication, self-confidence, and access to resources. After participation, NP students were more receptive to an NP position with an on-call component. Inclusion of a simulated on-call experience is a feasible component of NP education and should be added to the NP curriculum. ©2017 American Association of Nurse Practitioners.
SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output
Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.
2011-01-01
We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†
Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.
2013-01-01
We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136
A numerical tool for reproducing driver behaviour: experiments and predictive simulations.
Casucci, M; Marchitto, M; Cacciabue, P C
2010-03-01
This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a full-scale virtual reality simulator for validating the simulation. The predictive potential of the tool is then shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.
Automated simulation as part of a design workstation
NASA Technical Reports Server (NTRS)
Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.
1990-01-01
A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The paper discusses the Qualitative Simulation Tool (QST), an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.
ERIC Educational Resources Information Center
Haji, Faizal A.; Hoppe, Daniel J.; Morin, Marie-Paule; Giannoulakis, Konstantine; Koh, Jansen; Rojas, David; Cheung, Jeffrey J. H.
2014-01-01
Rapid technological advances and concern for patient safety have increased the focus on simulation as a pedagogical tool for educating health care providers. To date, simulation research scholarship has focused on two areas; evaluating instructional designs of simulation programs, and the integration of simulation into a broader educational…
The Mission Planning Lab: A Visualization and Analysis Tool
NASA Technical Reports Server (NTRS)
Daugherty, Sarah C.; Cervantes, Benjamin W.
2009-01-01
Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).
A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James
2011-11-01
Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
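As an illustration of the kind of measured-versus-simulated comparison that SEE IT automates, the following minimal sketch plots hourly building energy data as a time series and as a scatter plot against the perfect-agreement diagonal. This is not SEE IT code; the file name and column names (building_energy.csv, timestamp, measured_kWh, simulated_kWh) are hypothetical.

```python
# Minimal sketch (not SEE IT) of comparing measured and simulated hourly
# building energy data with a time-series plot and a scatter plot.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical input file with columns: timestamp, measured_kWh, simulated_kWh
df = pd.read_csv("building_energy.csv", parse_dates=["timestamp"])

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Time series: measured vs. simulated on the same axis
ax1.plot(df["timestamp"], df["measured_kWh"], label="measured")
ax1.plot(df["timestamp"], df["simulated_kWh"], label="simulated")
ax1.set_ylabel("Hourly energy use (kWh)")
ax1.legend()

# Scatter: each point is one hour; the dashed diagonal marks perfect agreement
ax2.scatter(df["measured_kWh"], df["simulated_kWh"], s=5)
lim = max(df["measured_kWh"].max(), df["simulated_kWh"].max())
ax2.plot([0, lim], [0, lim], "k--")
ax2.set_xlabel("Measured (kWh)")
ax2.set_ylabel("Simulated (kWh)")

plt.tight_layout()
plt.show()
```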
On the Edge: Intelligent CALL in the 1990s.
ERIC Educational Resources Information Center
Underwood, John
1989-01-01
Examines the possibilities of developing computer-assisted language learning (CALL) based on the best of modern technology, arguing that artificial intelligence (AI) strategies will radically improve the kinds of exercises that can be performed. Recommends combining AI technology with other tools for delivering instruction, such as simulation and…
HRLSim: a high performance spiking neural network simulator for GPGPU clusters.
Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan
2014-02-01
Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.
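The per-neuron state update that simulators such as HRLSim parallelize across GPGPUs can be illustrated with a minimal leaky integrate-and-fire loop. This is a generic sketch, not HRLSim code; the parameter values and the random synaptic drive are illustrative assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) update loop: the kind of per-neuron
# state update that large-scale spiking simulators parallelize on GPGPUs.
# All parameters and the synaptic input model are illustrative only.
import numpy as np

n_neurons = 1000
dt = 1e-4                  # time step (s)
tau = 20e-3                # membrane time constant (s)
v_rest, v_thresh, v_reset = -65.0, -50.0, -65.0   # membrane potentials (mV)

rng = np.random.default_rng(0)
v = np.full(n_neurons, v_rest)
spike_count = 0

for step in range(10_000):                       # 1 s of simulated time
    i_syn = rng.normal(18.0, 4.0, n_neurons)     # stand-in synaptic drive (mV)
    v += (dt / tau) * (v_rest - v + i_syn)       # leaky integration
    spiked = v >= v_thresh                       # threshold crossings
    spike_count += int(spiked.sum())
    v[spiked] = v_reset                          # reset neurons that fired

print(f"mean firing rate: {spike_count / n_neurons:.1f} Hz")
```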
42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems
NASA Technical Reports Server (NTRS)
Stoneking, Eric
2018-01-01
Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 has been publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.
Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data
NASA Astrophysics Data System (ADS)
Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin
2017-02-01
Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, on a different platform and by expert-based review. In addition, we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve the reproducibility of the results in the context of multithreading.
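The headline metrics in this comparison, sensitivity and precision of each caller against the set of validated mutations, can be computed in a few lines once variants are reduced to comparable records. The sketch below is illustrative only; the (chromosome, position, ref, alt) tuples are hypothetical placeholders for parsed VCF entries, not data from the study.

```python
# Minimal sketch of the per-tool evaluation metrics described above:
# sensitivity (recall) and precision of called variants against a set of
# validated mutations, each variant represented as a (chrom, pos, ref, alt)
# tuple. The example variants below are hypothetical.
def evaluate_caller(called, validated):
    called, validated = set(called), set(validated)
    tp = len(called & validated)          # true positives
    fp = len(called - validated)          # false positives
    fn = len(validated - called)          # false negatives (missed mutations)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return sensitivity, precision

validated = {("chr3", 128204951, "C", "T"), ("chr4", 106196829, "A", "G")}
called = {("chr3", 128204951, "C", "T"), ("chr7", 148508833, "G", "A")}
print(evaluate_caller(called, validated))   # (0.5, 0.5)
```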
The SIMs Meet ESL Incorporating Authentic Computer Simulation Games into the Language Classroom
ERIC Educational Resources Information Center
Miller, Megan; Hegelheimer, Volker
2006-01-01
Despite their motivational appeal to learners, innovative and technologically advanced computer simulation games targeting native English speakers frequently remain beyond the competence of ESL learners as independent didactic tools. Guided by Chapelle's (2001) criteria for determining CALL task appropriateness, this paper illustrates how the…
Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2011-01-01
This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for process control) which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
The Creation of a CPU Timer for High Fidelity Programs
NASA Technical Reports Server (NTRS)
Dick, Aidan A.
2011-01-01
Using the C and C++ programming languages, a tool was developed that measures the efficiency of a program by recording the amount of CPU time that various functions consume. By inserting the tool between lines of code in the program, one can receive a detailed report of the absolute and relative time consumption associated with each section. After adapting the generic tool for a high-fidelity launch vehicle simulation program called MAVERIC, the components of a frequently used function called "derivatives ( )" were measured. Out of the 34 sub-functions in "derivatives ( )", it was found that the top 8 sub-functions made up 83.1% of the total time spent. In order to decrease the overall run time of MAVERIC, a change was implemented in the sub-function "Event_Controller ( )". Reformatting "Event_Controller ( )" led to a 36.9% decrease in the total CPU time spent by that sub-function, and a 3.2% decrease in the total CPU time spent by the overarching function "derivatives ( )".
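The instrumentation idea described above, recording CPU time per function and reporting absolute and relative shares, can be sketched compactly. The original tool was written in C/C++ for MAVERIC; the Python analogue below only illustrates the approach, and the function names are hypothetical.

```python
# A Python analogue (not the MAVERIC C/C++ tool) of the timing idea: accumulate
# CPU time per function and report absolute and relative time consumption.
import time
from collections import defaultdict

cpu_totals = defaultdict(float)

def timed(fn):
    """Decorator that accumulates CPU time under the function's name."""
    def wrapper(*args, **kwargs):
        start = time.process_time()
        result = fn(*args, **kwargs)
        cpu_totals[fn.__name__] += time.process_time() - start
        return result
    return wrapper

@timed
def event_controller():          # hypothetical sub-function
    sum(i * i for i in range(200_000))

@timed
def aero_forces():               # hypothetical sub-function
    sum(i ** 0.5 for i in range(100_000))

for _ in range(50):
    event_controller()
    aero_forces()

total = sum(cpu_totals.values())
for name, t in sorted(cpu_totals.items(), key=lambda kv: -kv[1]):
    print(f"{name:20s} {t:8.4f} s  {100 * t / total:5.1f}%")
```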
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan
Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
2001-01-01
The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.
Cognitive simulation as a tool for cognitive task analysis.
Roth, E M; Woods, D D; Pople, H E
1992-10-01
Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.
Evaluating the ecological benefits of wildfire by integrating fire and ecosystem simulation models
Robert E. Keane; Eva Karau
2010-01-01
Fire managers are now realizing that wildfires can be beneficial because they can reduce hazardous fuels and restore fire-dominated ecosystems. A software tool that assesses potential beneficial and detrimental ecological effects from wildfire would be helpful to fire management. This paper presents a simulation platform called FLEAT (Fire and Landscape Ecology...
ERIC Educational Resources Information Center
Klopfer, Eric; Yoon, Susan; Perry, Judy
2005-01-01
This paper reports on teachers' perceptions of the educational affordances of a handheld application called Participatory Simulations. It presents evidence from five cases representing each of the populations who work with these computational tools. Evidence across multiple data sources yield similar results to previous research evaluations of…
Weller, J M; Torrie, J; Boyd, M; Frengley, R; Garden, A; Ng, W L; Frampton, C
2014-06-01
Sharing information with the team is critical in developing a shared mental model in an emergency, and fundamental to effective teamwork. We developed a structured call-out tool, encapsulated in the acronym 'SNAPPI': Stop; Notify; Assessment; Plan; Priorities; Invite ideas. We explored whether a video-based intervention could improve structured call-outs during simulated crises and if this would improve information sharing and medical management. In a simulation-based randomized, blinded study, we evaluated the effect of the video-intervention teaching SNAPPI on scores for SNAPPI, information sharing, and medical management using baseline and follow-up crisis simulations. We assessed information sharing using a probe technique where nurses and technicians received unique, clinically relevant information probes before the simulation. Shared knowledge of probes was measured in a written, post-simulation test. We also scored sharing of diagnostic options with the team and medical management. Anaesthetists' scores for SNAPPI were significantly improved, as was the number of diagnostic options they shared. We found a non-significant trend to improve information-probe sharing and medical management in the intervention group, and across all simulations, a significant correlation between SNAPPI and information-probe sharing. Of note, only 27% of the clinically relevant information about the patient provided to the nurse and technician in the pre-simulation information probes was subsequently learnt by the anaesthetist. We developed a structured communication tool, SNAPPI, to improve information sharing between anaesthetists and their team, taught it using a video-based intervention, and provide initial evidence to support its value for improving communication in a crisis. © The Author [2014]. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Medical simulation: a tool for recognition of and response to risk.
Ruddy, Richard M; Patterson, Mary Deffner
2008-11-01
The use of simulation and team training has become an excellent tool to reduce errors in high-risk industries such as commercial airlines and the nuclear energy field. The health care industry has begun to use similar tools to improve outcomes in high-risk areas where events are relatively rare but where practice with a tactical team can significantly reduce the chance of a bad outcome. There are two parts to this review: first, we review the rationale for why simulation is a key element in improving our error rate, and second, we describe specific tools that have great use at the clinical bedside for improving the care of patients. These cross different (i.e. medical and surgical) specialties and practices within specialties in the health care setting. Tools described will include the pinch, brief/debriefing, read-backs, call-outs, dynamic skepticism, assertive statements, two-challenge rules, checklists and step back (hold points). Examples will assist the clinician in practical daily use to improve their bedside care of children.
Mission Assignment Model and Simulation Tool for Different Types of Unmanned Aerial Vehicles
2008-09-01
[Fragments recovered from the thesis front matter and text: abbreviations include AAA (Anti-Aircraft Artillery), ATO (Air Tasking Order), BDA (Battle Damage Assessment), and DES (Discrete Event Simulation); the clock is advanced in small, fixed time steps, and since the value of simulated time is important in DES, an internal variable called the simulation clock is maintained. Authors: Yücel Alver, Captain, Turkish Air Force (B.S., Turkish Air Force Academy, 2000), and Murat Özdoğan, 1st Lieutenant, Turkish Air Force.]
Logistics Process Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
Similarities between principal components of protein dynamics and random diffusion
NASA Astrophysics Data System (ADS)
Hess, Berk
2000-12-01
Principal component analysis, also called essential dynamics, is a powerful tool for finding global, correlated motions in atomic simulations of macromolecules. It has become an established technique for analyzing molecular dynamics simulations of proteins. The first few principal components of simulations of large proteins often resemble cosines. We derive the principal components for high-dimensional random diffusion, which are almost perfect cosines. This resemblance between protein simulations and noise implies that for many proteins the time scales of current simulations are too short to obtain convergence of collective motions.
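The observation that the leading principal components of high-dimensional random diffusion are nearly cosines can be checked numerically with a short script. This is an illustrative sketch written for this summary, not code from the cited work; the trajectory length, dimensionality, and the half-period cosine used for comparison are assumptions consistent with the stated result.

```python
# Numerical check: the first principal component of a high-dimensional random
# walk (free diffusion) closely resembles a cosine with one half period over
# the trajectory length. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_dims = 5000, 300
traj = np.cumsum(rng.normal(size=(n_steps, n_dims)), axis=0)   # random diffusion

centered = traj - traj.mean(axis=0)
cov = centered.T @ centered / n_steps
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                  # sort eigenvectors by variance
pc1 = centered @ eigvecs[:, order[0]]              # projection on the first PC

# Compare with a half-period cosine over the trajectory length
t = np.arange(n_steps)
cosine = np.cos(np.pi * (t + 0.5) / n_steps)
corr = np.corrcoef(pc1, cosine)[0, 1]
print(f"|correlation of PC1 with cos(pi*t/T)| = {abs(corr):.3f}")   # typically close to 1
```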
PRANAS: A New Platform for Retinal Analysis and Simulation.
Cessac, Bruno; Kornprobst, Pierre; Kraria, Selim; Nasser, Hassan; Pamplona, Daniela; Portelli, Geoffrey; Viéville, Thierry
2017-01-01
The retina encodes visual scenes by trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new free-access, end-user software that helps to better understand this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator allowing large-scale simulations while keeping strong biological plausibility, and a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes into account both spatial and temporal correlations as constraints, allowing analysis of the effects of memory on statistics. PRANAS also integrates a tool for computing and representing receptive fields in 3D (time-space). All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.
Terminal - Tactical Separation Assured Flight Environment (T-TSafe)
NASA Technical Reports Server (NTRS)
Verma, Savita Arora; Tang, Huabin; Ballinger, Debbi
2011-01-01
The Tactical Separation Assured Flight Environment (TSAFE) has previously been tested as a conflict detection and resolution tool in the en-route phase of flight. Fast-time simulations of a terminal version of this tool, called Terminal TSAFE (T-TSAFE), have shown promise over current conflict detection tools. It has been shown to have fewer false alerts (as low as 2 per hour) and better prediction of time to conflict than Conflict Alert. The tool will be tested in the simulated terminal area of Los Angeles International Airport, in a human-in-the-loop experiment, to identify controller procedures and information requirements. The simulation will include comparisons of T-TSAFE with NASA's version of Conflict Alert. Also, some other variables, such as altitude entry by the controller, which improve T-TSAFE's predictions for conflict detection, will be tested. T-TSAFE integrates features of current conflict detection tools such as the Automated Terminal Proximity Alert used to alleviate compression errors in the final approach phase. Based on fast-time simulation analysis, the anticipated benefits of T-TSAFE over Conflict Alert include reduced false/missed alerts and increased time to predicted loss of separation. Other metrics that will be used to evaluate the tool's impact on the controller include controller intervention, workload, and situation awareness.
Biopathways representation and simulation on hybrid functional petri net.
Matsuno, Hiroshi; Tanaka, Yukiko; Aoshima, Hitoshi; Doi, Atsushi; Matsui, Mika; Miyano, Satoru
2011-01-01
The following two matters should be resolved in order for biosimulation tools to be accepted by users in biology/medicine: (1) removing issues which are irrelevant to biological importance, and (2) allowing users to represent biopathways intuitively and to easily understand/manage the details of the representation and simulation mechanism. From these criteria, we first define a novel notion of Petri net called Hybrid Functional Petri Net (HFPN). Then, we introduce a software tool, Genomic Object Net, for representing and simulating biopathways, which we have developed by employing the architecture of HFPN. In order to show the usefulness of Genomic Object Net for representing and simulating biopathways, we show two HFPN representations of gene regulation mechanisms of Drosophila melanogaster (fruit fly) circadian rhythm and apoptosis induced by Fas ligand. The simulation results of these biopathways are also correlated with biological observations. The software is available to academic users from http://www.GenomicObject.Net/.
The impact of simulation education on self-efficacy towards teaching for nurse educators.
Garner, S L; Killingsworth, E; Bradshaw, M; Raj, L; Johnson, S R; Abijah, S P; Parimala, S; Victor, S
2018-03-23
The objective of this study was to assess the impact of a simulation workshop on self-efficacy towards teaching for nurse educators in India. Additionally, we sought to revise and validate a tool to measure self-efficacy in teaching for use with a global audience. Simulation is an evidence-based teaching and learning method and is increasingly used in nursing education globally. As new technology and teaching methods such as simulation continue to evolve, it is important for new as well as experienced nurse educators globally to have confidence in their teaching skills and abilities. The study included (1) instrument revision, and measures of reliability and validation, (2) an 8-h faculty development workshop intervention on simulation, (3) pre- and post-survey of self-efficacy among nurse educators, and (4) investigation of relationship between faculty socio-demographics and degree of self-efficacy. The modified tool showed internal consistency (r = 0.98) and was validated by international faculty experts. There were significant improvements in total self-efficacy (P < 0.001) and subscale scores among nurse educators after the simulation workshop intervention when compared to pre-survey results. No significant relationships were found between socio-demographic variables and degree of self-efficacy. Strong self-efficacy in teaching among nurse educators is crucial for effective learning to occur. Results indicated the simulation workshop was effective in significantly improving self-efficacy towards teaching for nurse educators using an internationally validated tool. The Minister of Health in India recently called for improvements in nursing education. Introducing nursing education on simulation as a teaching method in India and globally to improve self-efficacy among teachers is an example of a strategy towards meeting this call. © 2018 The Authors International Nursing Review published by John Wiley & Sons Ltd on behalf of International Council of Nurses.
Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows
NASA Astrophysics Data System (ADS)
Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.
2014-12-01
The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
Automated simulation as part of a design workstation
NASA Technical Reports Server (NTRS)
Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.
1990-01-01
A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.
NASA Astrophysics Data System (ADS)
Chaturvedi, Pratik; Arora, Akshit; Dutt, Varun
2018-06-01
Feedback via simulation tools is likely to help people improve their decision-making against natural disasters. However, little is known on how differing strengths of experiential feedback and feedback's availability in simulation tools influence people's decisions against landslides. We tested the influence of differing strengths of experiential feedback and feedback's availability on people's decisions against landslides in Mandi, Himachal Pradesh, India. Experiential feedback (high or low) and feedback's availability (present or absent) were varied across four between-subject conditions in a tool called the Interactive Landslide Simulation (ILS): high damage with feedback present, high damage with feedback absent, low damage with feedback present, and low damage with feedback absent. In high-damage conditions, the probabilities of damages to life and property due to landslides were 10 times higher than those in the low-damage conditions. In feedback-present conditions, experiential feedback was provided in numeric, text, and graphical formats in ILS. In feedback-absent conditions, the probabilities of damages were described; however, there was no experiential feedback present. Investments were greater in conditions where experiential feedback was present and damages were high compared to conditions where experiential feedback was absent and damages were low. Furthermore, only high-damage feedback produced learning in ILS. Simulation tools like ILS seem appropriate for landslide risk communication and for performing what-if analyses.
Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models
2008-08-01
[Figure residue: campaign-level model and outputs, aggregation, metamodeling, complexity (spatial, temporal, etc.).] …such techniques for variance reduction are called variance reduction techniques (VRT) [Law, 2006]. The implementation of some type of VRT can prove to be a very valuable tool.
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics and Space Administration (NASA). To perform system design trade-offs and analysis and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), and Noise Equivalent Temperature Difference (NETD). This paper describes the uses of the package and the physics that was used to derive the performance parameters.
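One of the system-level figures of merit mentioned above, NETD, can be related to NER through the temperature derivative of the in-band blackbody radiance. The sketch below illustrates only that relation; it is not ATTIRE code, and the NER value, band limits, and scene temperature are hypothetical.

```python
# Illustrative relation (not ATTIRE code): NETD approximated as the
# noise-equivalent radiance divided by the derivative of in-band Planck
# radiance with respect to scene temperature. All input values are hypothetical.
import numpy as np

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def band_radiance(temp_k, lam_lo, lam_hi, n=2000):
    """Planck spectral radiance integrated over [lam_lo, lam_hi] (W m^-2 sr^-1)."""
    lam = np.linspace(lam_lo, lam_hi, n)
    spectral = (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * temp_k))
    return np.trapz(spectral, lam)

ner = 5e-3                     # hypothetical noise-equivalent radiance (W m^-2 sr^-1)
t_scene, dt = 300.0, 0.1       # scene temperature (K) and finite-difference step
lam_lo, lam_hi = 8e-6, 12e-6   # 8-12 micron thermal infrared band

dL_dT = (band_radiance(t_scene + dt, lam_lo, lam_hi)
         - band_radiance(t_scene - dt, lam_lo, lam_hi)) / (2 * dt)
print(f"NETD ~ {ner / dL_dT * 1000:.1f} mK")
```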
NY TBO Research: Integrated Demand Management (IDM): IDM Concept, Tools, and Training Package
NASA Technical Reports Server (NTRS)
Smith, Nancy
2016-01-01
A series of human-in-the-loop simulation sessions was conducted in the Airspace Operations Laboratory (AOL) to evaluate a new traffic management concept called Integrated Demand Management (IDM). The simulation explored how to address chronic equity, throughput and delay issues associated with New York's high-volume airports by operationally integrating three current and NextGen capabilities, the Collaborative Trajectory Options Program (CTOP), Time-Based Flow Management (TBFM) and Required Time of Arrival (RTA), in order to better manage traffic demand within the National Air Traffic System. A package of presentation slides was developed to describe the concept, tools, and training materials used in the simulation sessions. The package will be used to brief our stakeholders, both through oral presentation and by disseminating the materials via email.
Wen, Bin; Peng, Junhui; Zuo, Xiaobing; Gong, Qingguo; Zhang, Zhiyong
2014-01-01
Large-scale flexibility within a multidomain protein often plays an important role in its biological function. Despite its inherent low resolution, small-angle x-ray scattering (SAXS) is well suited to investigate protein flexibility and determine, with the help of computational modeling, what kinds of protein conformations would coexist in solution. In this article, we develop a tool that combines SAXS data with a previously developed sampling technique called amplified collective motions (ACM) to elucidate structures of highly dynamic multidomain proteins in solution. We demonstrate the use of this tool in two proteins, bacteriophage T4 lysozyme and tandem WW domains of the formin-binding protein 21. The ACM simulations can sample the conformational space of proteins much more extensively than standard molecular dynamics (MD) simulations. Therefore, conformations generated by ACM are significantly better at reproducing the SAXS data than are those from MD simulations. PMID:25140431
Simulation of Needle-Type Corona Electrodes by the Finite Element Method
NASA Astrophysics Data System (ADS)
Yang, Shiyou; José Márcio, Machado; Nancy Mieko, Abe; Angelo, Passaro
2007-12-01
This paper describes a software tool, called LEVSOFT, suitable for the electric field simulations of corona electrodes by the Finite Element Method (FEM). Special attention was paid to the user friendly construction of geometries with corners and sharp points, and to the fast generation of highly refined triangular meshes and field maps. The execution of self-adaptive meshes was also implemented. These customized features make the code attractive for the simulation of needle-type corona electrodes. Some case examples involving needle type electrodes are presented.
Maestro: an orchestration framework for large-scale WSN simulations.
Riliskis, Laurynas; Osipov, Evgeny
2014-03-18
Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.
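The benchmarking goal described above, identifying VM instance types that balance performance and cost, reduces in its simplest form to ranking instance types by the cost of completing a reference simulation. The sketch below is a generic illustration, not Maestro code; the instance names, prices, and runtimes are hypothetical.

```python
# Minimal sketch of the instance-selection idea: rank cloud VM types by the
# cost of one reference WSN simulation run. All values are hypothetical.
benchmarks = {
    # instance type: (hourly price in USD, measured runtime of the reference
    #                 simulation in hours)
    "small":  (0.05, 6.0),
    "medium": (0.10, 2.8),
    "large":  (0.20, 1.5),
}

def cost_per_run(price_per_hour, runtime_hours):
    return price_per_hour * runtime_hours

ranked = sorted(benchmarks.items(), key=lambda kv: cost_per_run(*kv[1]))
for name, (price, runtime) in ranked:
    print(f"{name:7s} runtime={runtime:.1f} h  cost/run=${cost_per_run(price, runtime):.2f}")
```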
Maestro: An Orchestration Framework for Large-Scale WSN Simulations
Riliskis, Laurynas; Osipov, Evgeny
2014-01-01
Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123
Space Communications and Navigation (SCaN) Network Simulation Tool Development and Its Use Cases
NASA Technical Reports Server (NTRS)
Jennings, Esther; Borgen, Richard; Nguyen, Sam; Segui, John; Stoenescu, Tudor; Wang, Shin-Ywan; Woo, Simon; Barritt, Brian; Chevalier, Christine; Eddy, Wesley
2009-01-01
In this work, we focus on the development of a simulation tool to assist in analysis of current and future (proposed) network architectures for NASA. Specifically, the Space Communications and Navigation (SCaN) Network is being architected as an integrated set of new assets and a federation of upgraded legacy systems. The SCaN architecture for the initial missions for returning humans to the moon and beyond will include the Space Network (SN) and the Near-Earth Network (NEN). In addition to SCaN, the initial mission scenario involves a Crew Exploration Vehicle (CEV), the International Space Station (ISS) and NASA Integrated Services Network (NISN). We call the tool being developed the SCaN Network Integration and Engineering (SCaN NI&E) Simulator. The intended uses of such a simulator are: (1) to characterize performance of particular protocols and configurations in mission planning phases; (2) to optimize system configurations by testing a larger parameter space than may be feasible in either production networks or an emulated environment; (3) to test solutions in order to find issues/risks before committing more significant resources needed to produce real hardware or flight software systems. We describe two use cases of the tool: (1) standalone simulation of CEV to ISS baseline scenario to determine network performance, (2) participation in Distributed Simulation Integration Laboratory (DSIL) tests to perform function testing and verify interface and interoperability of geographically dispersed simulations/emulations.
VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox
NASA Astrophysics Data System (ADS)
Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.
2016-12-01
VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it simultaneously generates three philosophically different families of global sensitivity metrics: (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales, the VARS approach), (2) variance-based total-order effects (the Sobol approach), and (3) derivative-based elementary effects (the Morris approach). VARS-TOOL also includes two novel features. The first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results with sample sizes (numbers of model runs) 1-2 orders of magnitude smaller than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development, and new capabilities and features are forthcoming.
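The variogram idea at the core of the IVARS metrics, gamma(h) = 0.5 E[(y(x+h) - y(x))^2] evaluated along one parameter direction and integrated up to a scale H, can be sketched for a toy model. This is not VARS-TOOL code; the two-parameter test function, sample size, and scale range are arbitrary illustrative choices.

```python
# Minimal sketch (not VARS-TOOL) of a directional variogram of a model response
# and its integration over scales, in the spirit of the IVARS metrics.
import numpy as np

def model(x1, x2):
    return np.sin(2 * np.pi * x1) + 0.3 * x2        # toy response surface

rng = np.random.default_rng(0)

def directional_variogram(h_values, n_base=2000):
    """gamma(h) for perturbations of x1 only, with x2 held at random base values."""
    x1 = rng.uniform(0, 1, n_base)
    x2 = rng.uniform(0, 1, n_base)
    y0 = model(x1, x2)
    gammas = []
    for h in h_values:
        yh = model((x1 + h) % 1.0, x2)               # wrap to stay inside [0, 1]
        gammas.append(0.5 * np.mean((yh - y0) ** 2))
    return np.array(gammas)

h_values = np.linspace(0.01, 0.3, 30)
gamma = directional_variogram(h_values)
ivars = np.trapz(gamma, h_values)                    # integrated variogram up to H = 0.3
print(f"IVARS-like metric for x1 up to scale 0.3: {ivars:.4f}")
```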
Optical 3-Way Handshake (O3WHS) Protocol Simulation in OMNeT++
2017-06-01
[Report documentation page fields and a partial reference list removed.] Recovered fragment: …a popular program called OMNeT++ for that purpose. It is an open-source discrete event simulator tool written in the C++ language. It has been chiefly…
SmaggIce 2D Version 1.8: Software Toolkit Developed for Aerodynamic Simulation Over Iced Airfoils
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Vickerman, Mary B.
2005-01-01
SmaggIce 2D version 1.8 is a software toolkit developed at the NASA Glenn Research Center that consists of tools for modeling the geometry of, and generating the grids for, clean and iced airfoils. Plans call for the completed SmaggIce 2D version 2.0 to streamline the entire aerodynamic simulation process--the characterization and modeling of ice shapes, grid generation, and flow simulation--and to be closely coupled with the public-domain application flow solver, WIND. Grids generated using version 1.8, however, can be used by other flow solvers. SmaggIce 2D will help researchers and engineers study the effects of ice accretion on airfoil performance, which is difficult to do with existing software tools because of complex ice shapes. Using SmaggIce 2D, when fully developed, to simulate flow over an iced airfoil will help to reduce the cost of performing flight and wind-tunnel tests for certifying aircraft in natural and simulated icing conditions.
Haji, Faizal A; Hoppe, Daniel J; Morin, Marie-Paule; Giannoulakis, Konstantine; Koh, Jansen; Rojas, David; Cheung, Jeffrey J H
2014-05-01
Rapid technological advances and concern for patient safety have increased the focus on simulation as a pedagogical tool for educating health care providers. To date, simulation research scholarship has focused on two areas: evaluating the instructional designs of simulation programs, and the integration of simulation into a broader educational context. However, these two categories of research currently exist under a single label, Simulation-Based Medical Education. In this paper we argue that introducing a more refined nomenclature within which to frame simulation research is necessary for researchers, to appropriately design research studies and describe their findings, and for end-point users (such as program directors and educators), to more appropriately understand and utilize this evidence.
Kasahara, Kota; Kinoshita, Kengo
2016-01-01
Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
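The central object of the analysis, the ion-binding state graph, can be illustrated by reducing each trajectory frame to a binding-site occupancy pattern, counting transitions between consecutive frames, and writing the result as a weighted directed graph in a GML-like form. The sketch below is not IBiSA_tools code; the occupancy sequence is synthetic and the site labels are hypothetical.

```python
# Minimal sketch (not IBiSA_tools) of building an ion-binding state graph from
# a sequence of binding-site occupancy patterns and emitting a GML-like output.
from collections import Counter

# One entry per frame: occupancy of hypothetical binding sites S0..S3 (1 = ion bound)
frames = [
    (1, 0, 1, 0), (1, 0, 1, 0), (0, 1, 1, 0),
    (0, 1, 0, 1), (0, 1, 0, 1), (1, 0, 1, 0),
]

# Count transitions between consecutive frames
transitions = Counter(zip(frames[:-1], frames[1:]))

states = sorted({s for pair in transitions for s in pair})
index = {s: i for i, s in enumerate(states)}

# Emit a small GML-like description of the ion-binding state graph
print("graph [ directed 1")
for s, i in index.items():
    label = "".join(map(str, s))
    print(f'  node [ id {i} label "{label}" ]')
for (src, dst), count in transitions.items():
    if src != dst:                                   # drop self-loops for readability
        print(f"  edge [ source {index[src]} target {index[dst]} weight {count} ]")
print("]")
```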
Representing climate, disturbance, and vegetation interactions in landscape models
Robert E. Keane; Donald McKenzie; Donald A. Falk; Erica A.H. Smithwick; Carol Miller; Lara-Karena B. Kellogg
2015-01-01
The prospect of rapidly changing climates over the next century calls for methods to predict their effects on myriad, interactive ecosystem processes. Spatially explicit models that simulate ecosystem dynamics at fine (plant, stand) to coarse (regional, global) scales are indispensable tools for meeting this challenge under a variety of possible futures. A special...
2015-09-01
[Figure-caption and table-of-contents fragments recovered from the report: the values shown have units of pF/m; the capacitance matrix for the 115-kV 3-phase circuit seen in Fig. 24; Fig. 29, the window that appears when one clicks "Calculate Lambdas", giving the linear charge densities for the 115-kV 3-phase circuit seen in Fig. 24; the diagonal entries of the calculated capacitance matrix (Fig. 28) are called the coefficients of capacitance, and the non-diagonal entries are called…]
Zimmerman, M I; Bowman, G R
2016-01-01
Molecular dynamics (MD) simulations are a powerful tool for understanding enzymes' structures and functions with full atomistic detail. These physics-based simulations model the dynamics of a protein in solution and store snapshots of its atomic coordinates at discrete time intervals. Analysis of the snapshots from these trajectories provides thermodynamic and kinetic properties such as conformational free energies, binding free energies, and transition times. Unfortunately, simulating biologically relevant timescales with brute force MD simulations requires enormous computing resources. In this chapter we detail a goal-oriented sampling algorithm, called fluctuation amplification of specific traits, that quickly generates pertinent thermodynamic and kinetic information by using an iterative series of short MD simulations to explore the vast depths of conformational space. © 2016 Elsevier Inc. All rights reserved.
The effects of fatigue on robotic surgical skill training in Urology residents.
Mark, James R; Kelly, Douglas C; Trabulsi, Edouard J; Shenot, Patrick J; Lallas, Costas D
2014-09-01
This study reports on the effect of fatigue on Urology residents using the daVinci surgical skills simulator (dVSS). Seven Urology residents performed a series of selected exercises on the dVSS while pre-call and post-call. Prior to dVSS performance a survey of subjective fatigue was taken and residents were tested with the Epworth Sleepiness Scale (ESS). Using the metrics available in the dVSS software, the performance of each resident was evaluated. The Urology residents slept an average of 4.07 h (range 2.5-6 h) while on call compared to an average of 5.43 h while not on call (range 3-7 h, p = 0.08). Post-call residents were significantly more likely to be identified as fatigued by the Epworth Sleepiness Score than pre-call residents (p = 0.01). Significant differences were observed in fatigued residents performing the exercises, Tubes and Match Board 2 (p = 0.05, 0.02). Additionally, there were significant differences in the total number of critical errors during the training session (9.29 vs. 3.14, p = 0.04). Fatigue in post-call Urology residents leads to poorer performance on the dVSS simulator. The dVSS may become a useful instrument in the education of fatigued residents and a tool to identify fatigue in trainees.
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
NASA Technical Reports Server (NTRS)
Dubos, Gregory F.; Cornford, Steven
2012-01-01
While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedules as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
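A minimal sketch of the discrete-event idea described above is given below: events with uncertain timing are pushed onto a priority queue and processed in time order, and a Monte Carlo sweep over many runs yields a distribution of outcomes that a trade study could compare across designs. All task names, durations, and probabilities are invented; this is not the F6 tool itself.

```python
# Hedged sketch of a discrete-event schedule simulation with uncertainty.
import heapq, random

def simulate_program(seed):
    """One run: queue events with uncertain times, process them in time order."""
    rng = random.Random(seed)
    events, t = [], 0.0
    for task, nominal in [("design review", 12.0), ("integration", 20.0), ("launch", 6.0)]:
        t += nominal * rng.lognormvariate(0.0, 0.3)      # uncertain task duration (months)
        heapq.heappush(events, (t, task))
    if rng.random() < 0.15:                              # possible disruption event
        heapq.heappush(events, (t + rng.expovariate(1 / 24.0), "component failure"))
    last = 0.0
    while events:
        last, _name = heapq.heappop(events)              # advance simulated time
    return last

# Monte Carlo over many runs gives a distribution a trade study could compare.
finish = [simulate_program(s) for s in range(2000)]
print("mean time of final event:", round(sum(finish) / len(finish), 1), "months")
```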
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
GridLAB-D: An Agent-Based Simulation Framework for Smart Grids
Chassin, David P.; Fuller, Jason C.; Djilali, Ned
2014-01-01
Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.
McLeod, M.A.; Andersen, D.E.
1998-01-01
Forest-nesting raptors are often difficult to detect and monitor because they can be secretive, and their nests can be difficult to locate. Some species, however, respond to broadcasts of taped calls, and these responses may be useful both in monitoring population trends and in locating nests. We conducted broadcast surveys on roads and at active red-shouldered hawk (Buteo lineatus) nests in northcentral Minnesota to determine effects of type of call (conspecific or great horned owl [Bubo virginianus]), time of day, and phase of the breeding cycle on red-shouldered hawk response behavior and to evaluate usefulness of broadcasts as a population monitoring tool using area occupied-probability-of-detection techniques. During the breeding seasons of 1994 and 1995, we surveyed 4 10-station road transects 59 times and conducted 76 surveys at 24 active nests. Results of these surveys indicated conspecific calls broadcast prior to hatch and early in the day were the most effective method of detecting red-shouldered hawks. Probability of detection via conspecific calls averaged 0.25, and area occupied was 100%. Computer simulations using these field data indicated broadcast surveys have the potential to be used as a population monitoring tool.
An approach for coupled-code multiphysics core simulations from a common input
Schmidt, Rodney; Belcourt, Kenneth; Hooper, Russell; ...
2014-12-10
This study describes an approach for coupled-code multiphysics reactor core simulations that is being developed by the Virtual Environment for Reactor Applications (VERA) project in the Consortium for Advanced Simulation of Light-Water Reactors (CASL). In this approach a user creates a single problem description, called the “VERAIn” common input file, to define and set up the desired coupled-code reactor core simulation. A preprocessing step accepts the VERAIn file and generates a set of fully consistent input files for the different physics codes being coupled. The problem is then solved using a single-executable coupled-code simulation tool applicable to the problem, which is built using VERA infrastructure software tools and the set of physics codes required for the problem of interest. The approach is demonstrated by performing an eigenvalue and power distribution calculation of a typical three-dimensional 17 × 17 assembly with thermal–hydraulic and fuel temperature feedback. All neutronics aspects of the problem (cross-section calculation, neutron transport, power release) are solved using the Insilico code suite and are fully coupled to a thermal–hydraulic analysis calculated by the Cobra-TF (CTF) code. The single-executable coupled-code (Insilico-CTF) simulation tool is created using several VERA tools, including LIME (Lightweight Integrating Multiphysics Environment for coupling codes), DTK (Data Transfer Kit), Trilinos, and TriBITS. Parallel calculations are performed on the Titan supercomputer at Oak Ridge National Laboratory using 1156 cores, and a synopsis of the solution results and code performance is presented. Finally, ongoing development of this approach is also briefly described.
Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.
2017-01-01
Background: There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective: To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods: The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results: The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions: This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135
Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows
NASA Technical Reports Server (NTRS)
West, Jeff; Wright, Jeffrey; Thakur, Siddharth; Luke, Ed; Grinstead, Nathan
2012-01-01
The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2] which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. The key components of the overall methodology presented in this paper are the following: (a) high fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with Geometric Conservation Law (GCL) in order to maintain conservative property on moving meshes for second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approx 250 million cells) in full turbomachinery geometries.
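The cloud-of-points interpolation mentioned in item (c) can be illustrated with a small sketch: for each receiver point at the sliding interface, a kd-tree query finds the nearest donor points and an inverse-distance blend produces the interpolated value. This shows the general idea only, on synthetic point clouds; Loci-STREAM's parallel implementation is considerably more involved.

```python
# Hedged sketch of cloud-of-points interpolation at a sliding interface:
# find the k nearest donor points with a kd-tree and blend their values
# with inverse-distance weights.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
donor_xyz = rng.uniform(-1, 1, size=(5000, 3))          # donor-side cell centres
donor_val = np.sin(donor_xyz[:, 0]) * donor_xyz[:, 1]   # some field on the donor side
recv_xyz = rng.uniform(-1, 1, size=(200, 3))            # receiver-side points

tree = cKDTree(donor_xyz)
dist, idx = tree.query(recv_xyz, k=8)                   # 8 nearest donors per receiver
w = 1.0 / np.maximum(dist, 1e-12)                       # inverse-distance weights
w /= w.sum(axis=1, keepdims=True)
recv_val = (w * donor_val[idx]).sum(axis=1)             # interpolated field
print(recv_val[:5])
```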
NASA Astrophysics Data System (ADS)
Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.
2010-04-01
Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are meant to be developed, in order to design activities that target these capabilities. The challenges of using simulation environments effectively are especially daunting in dispersed social systems. This article describes how these challenges were addressed in the context of the Cisco Networking Academies with a simulation tool for computer networks called Packet Tracer. The focus is on a conceptual support framework for instructors in over 9,000 institutions around the world for using Packet Tracer in instruction and assessment, by learning to create problem-solving scenarios that are at once tuned to the local needs of their students and consistent with the epistemic frame of "thinking like a network engineer." We describe a layered framework of tools and interfaces above the network simulator that supports the use of Packet Tracer in the distributed community of instructors and students.
Etienne, E; Le Breton, N; Martinho, M; Mileo, E; Belle, V
2017-08-01
Site-directed spin labeling (SDSL) combined with continuous wave electron paramagnetic resonance (cw EPR) spectroscopy is a powerful technique to reveal, at the residue level, structural transitions in proteins. SDSL-EPR is based on the selective grafting of a paramagnetic label on the protein under study, followed by cw EPR analysis. To extract valuable quantitative information from SDSL-EPR spectra and thus give reliable interpretation of biological system dynamics, numerical simulations of the spectra are required. Such spectral simulations can be carried out by coding in MATLAB using functions from the EasySpin toolbox. For non-expert users of MATLAB, this could be a complex task or even impede the use of such a simulation tool. We developed a graphical user interface called SimLabel dedicated to running cw EPR spectra simulations, particularly those coming from SDSL-EPR experiments. SimLabel provides an intuitive way to visualize, simulate, and fit such cw EPR spectra. An example of SDSL-EPR spectra simulation concerning the study of an intrinsically disordered region undergoing a local induced folding is described and discussed. We believe that this new tool will help users to rapidly obtain reliable simulated spectra and hence facilitate the interpretation of their results. Copyright © 2017 John Wiley & Sons, Ltd.
A virtual therapeutic environment with user projective agents.
Ookita, S Y; Tokuda, H
2001-02-01
Today, we see the Internet as more than just an information infrastructure, but as a socializing place and a safe outlet for inner feelings. Many personalities develop aside from real world life due to its anonymous environment. Virtual world interactions are bringing about new psychological illnesses ranging from net addiction to technostress, as well as online personality disorders and conflicts in the multiple identities that exist in the virtual world. Presently, there are no standard therapy models for the virtual environment. There are very few therapeutic environments or tools made especially for virtual therapeutic environments. The goal of our research is to provide the therapy model and middleware tools for psychologists to use in virtual therapeutic environments. We propose the Cyber Therapy Model, and Projective Agents, a tool used in the therapeutic environment. To evaluate the effectiveness of the tool, we created a prototype system, called the Virtual Group Counseling System, which is a therapeutic environment that allows the user to participate in group counseling through the eyes of their Projective Agent. Projective Agents inherit the user's personality traits. During the virtual group counseling, the user's Projective Agent interacts and collaborates to recover and increase their psychological growth. The prototype system provides a simulation environment where psychologists can adjust the parameters and customize their own simulation environment. The model and tool are a first attempt toward simulating online personalities that may exist only online, and providing data for observation.
Management simulations for Lean healthcare: exploiting the potentials of role-playing.
Barnabè, Federico; Giorgino, Maria Cleofe; Guercini, Jacopo; Bianciardi, Caterina; Mezzatesta, Vincenzo
2018-04-09
Purpose: The purpose of this paper is to investigate the potential of role-playing (RP) both in training healthcare (HC) professionals to implement tools and improvement actions based on Lean principles, and in supporting group discussion and the sharing of different competencies for the development of Lean HC. Design/methodology/approach: The paper presents the case study of an RP simulation called LEAN HEALTHCARE LAB, which is used to train HC professionals at Siena University Hospital. The paper reports and discusses the results of a specific two-day simulation session and of a questionnaire that was distributed to gather feedback from the participants. Findings: The paper verifies the potential of RP to be a powerful educational and training tool that is able to stimulate the HC participants to apply Lean thinking principles and share their competencies in collaborative decision-making processes. Research limitations/implications: The study provides data in reference to one single simulation session, although the game has already been applied several times in different HC organizations with very similar outcomes. Moreover, a more in-depth analysis of players' perceptions and decisions could be performed using different tools in addition to the adopted questionnaire. Practical implications: RP games (RPGs) are effective training and educational tools for HC professionals. They offer benefits and learning conditions which are definitely different if compared with more conventional education programs for HC professionals. Originality/value: While previous studies have extensively discussed the potentialities of RPGs and simulations in training programs, only a few articles have discussed the adoption of RP for Lean thinking, and even fewer its use to educate HC professionals on Lean principles and tools.
Orbit Design Based on the Global Maps of Telecom Metrics
NASA Technical Reports Server (NTRS)
Lee, Charles H.; Cheung, Kar-Ming; Edwards, Chad; Noreen, Gary K.; Vaisnys, Arvydas
2004-01-01
In this paper we describe an orbit design aid tool called the Telecom Orbit Analysis and Simulation Tool (TOAST). Although it can be used for studying and selecting orbits for any planet, we concentrate solely on its use for Mars. By specifying the six orbital elements for an orbit, a time frame of interest, a horizon mask angle, and telecom parameters such as the transmitting power, frequency, antenna gains, antenna losses, link margin, received threshold powers for the data rates, etc., this tool enables the user to view an animation of the orbit in two and three dimensions and to view different telecom metrics at any point on Mars in the form of global planetary maps.
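To make the telecom side of such a tool concrete, the sketch below evaluates a simple free-space link budget of the kind TOAST automates: transmit power plus antenna gains minus path and other losses gives the received power, and the margin is the difference to a rate-dependent threshold. All numerical values are placeholders, not mission parameters.

```python
# Hedged back-of-the-envelope link budget; the numbers below (power, gains,
# range, threshold) are placeholders, not values from TOAST or any mission.
import math

def received_power_dbm(p_tx_dbm, g_tx_db, g_rx_db, losses_db, freq_hz, range_m):
    """Friis link equation in dB: Prx = Ptx + Gt + Gr - Lfs - Lother."""
    wavelength = 299_792_458.0 / freq_hz
    free_space_loss_db = 20 * math.log10(4 * math.pi * range_m / wavelength)
    return p_tx_dbm + g_tx_db + g_rx_db - free_space_loss_db - losses_db

p_rx = received_power_dbm(p_tx_dbm=43.0,      # 20 W transmitter
                          g_tx_db=25.0, g_rx_db=3.0, losses_db=2.0,
                          freq_hz=8.4e9,      # X band
                          range_m=4.0e6)      # 4000 km slant range
threshold = -120.0                            # dBm needed to close the link at a given rate
print(f"received power {p_rx:.1f} dBm, margin {p_rx - threshold:.1f} dB")
```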
NETIMIS: Dynamic Simulation of Health Economics Outcomes Using Big Data.
Johnson, Owen A; Hall, Peter S; Hulme, Claire
2016-02-01
Many healthcare organizations are now making good use of electronic health record (EHR) systems to record clinical information about their patients and the details of their healthcare. Electronic data in EHRs are generated by people engaged in complex processes within complex environments, and their human input, albeit shaped by computer systems, is compromised by many human factors. These data are potentially valuable to health economists and outcomes researchers but are sufficiently large and complex to be considered part of the new frontier of 'big data'. This paper describes emerging methods that draw together data mining, process modelling, activity-based costing and dynamic simulation models. Our research infrastructure includes safe links to Leeds hospital's EHRs with 3 million secondary and tertiary care patients. We created a multidisciplinary team of health economists, clinical specialists, and data and computer scientists, and developed a dynamic simulation tool called NETIMIS (Network Tools for Intervention Modelling with Intelligent Simulation; http://www.netimis.com ) suitable for visualization of both human-designed and data-mined processes, which can then be used for 'what-if' analysis by stakeholders interested in costing, designing and evaluating healthcare interventions. We present two examples of model development to illustrate how dynamic simulation can be informed by big data from an EHR. We found the tool provided a focal point for multidisciplinary team work, helping the team to iteratively and collaboratively 'deep dive' into big data.
Numerical Propulsion System Simulation
NASA Technical Reports Server (NTRS)
Naiman, Cynthia
2006-01-01
The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.
Mastoidectomy performance assessment of virtual simulation training using final-product analysis.
Andersen, Steven A W; Cayé-Thomasen, Per; Sørensen, Mads S
2015-02-01
The future development of integrated automatic assessment in temporal bone virtual surgical simulators calls for validation against currently established assessment tools. This study aimed to explore the relationship between mastoidectomy final-product performance assessment in virtual simulation and traditional dissection training. Prospective trial with blinding. A total of 34 novice residents performed a mastoidectomy on the Visible Ear Simulator and on a cadaveric temporal bone. Two blinded senior otologists assessed the final-product performance using a modified Welling scale. The simulator gathered basic metrics on time, steps, and volumes in relation to the on-screen tutorial and collisions with vital structures. Substantial inter-rater reliability (kappa = 0.77) for virtual simulation and moderate inter-rater reliability (kappa = 0.59) for dissection final-product assessment was found. The simulation and dissection performance scores had significant correlation (P = .014). None of the basic simulator metrics correlated significantly with the final-product score except for number of steps completed in the simulator. A modified version of a validated final-product performance assessment tool can be used to assess mastoidectomy on virtual temporal bones. Performance assessment of virtual mastoidectomy could potentially save the use of cadaveric temporal bones for more advanced training when a basic level of competency in simulation has been achieved. NA. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey; Oelgoetz, Peter A.
1999-01-01
The "Auto-Adjustable Pin Tool for Friction Stir Welding", was developed at The Marshall Space Flight Center to address process deficiencies unique to the FSW process. The auto-adjustable pin tool, also called the retractable pin-tool (R.PT) automatically withdraws the welding probe of the pin-tool into the pin-tool's shoulder. The primary function of the auto-adjustable pin-tool is to allow for keyhole closeout, necessary for circumferential welding and localized weld repair, and, automated pin-length adjustment for the welding of tapered material thickness. An overview of the RPT hardware is presented. The paper follows with studies conducted using the RPT. The RPT was used to simulate two capabilities; welding tapered material thickness and closing out the keyhole in a circumferential weld. The retracted pin-tool regions in aluminum- lithium 2195 friction stir weldments were studied through mechanical property testing and metallurgical sectioning. Correlation's can be =de between retractable pin-tool programmed parameters, process parameters, microstructure, and resulting weld quality.
ERIC Educational Resources Information Center
Hallinger, Philip; Shaobing, Tang; Jiafang, Lu
2017-01-01
School leader training has become a critical strategy in educational reform. However, in China, there still exists a big gap in terms of how to transfer leadership knowledge into practice. Thus, tools that can integrate formal knowledge into practice are called for urgently in school leader training. This paper presents the results of a research…
The MeqTrees software system and its use for third-generation calibration of radio interferometers
NASA Astrophysics Data System (ADS)
Noordam, J. E.; Smirnov, O. M.
2010-12-01
Context. The formulation of the radio interferometer measurement equation (RIME) for a generic radio telescope by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. Aims: The MeqTrees software system is designed to implement numerical models, and to solve for arbitrary subsets of their parameters. It may be applied to many problems, but was originally geared towards implementing Measurement Equations in radio astronomy for the purposes of simulation and calibration. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool that facilitates rapid experimentation, and exchange of ideas (and scripts). Methods: MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a Python-based Tree Definition Language (TDL), then rapidly executed on the back-end. The use of TDL facilitates an extremely short turn-around time (hours rather than weeks or months) for experimentation with new ideas. This is also helped by unprecedented visualization capabilities for all final and intermediate results. A flexible data model and a number of important optimizations in the back-end ensures that the numerical performance is comparable to that of hand-written code. Results: MeqTrees is already widely used as the simulation tool for new instruments (LOFAR, SKA) and technologies (focal plane arrays). It has demonstrated that it can achieve a noise-limited dynamic range in excess of a million, on WSRT data. It is the only package that is specifically designed to handle what we propose to call third-generation calibration (3GC), which is needed for the new generation of giant radio telescopes, but can also improve the calibration of existing instruments.
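For readers unfamiliar with the RIME, the short numpy sketch below evaluates it for a single baseline, V_pq = J_p B J_q^H, using diagonal complex-gain Jones matrices. It illustrates the formalism that MeqTrees implements, not MeqTrees' TDL interface; the gain values and the source brightness are arbitrary.

```python
# Hedged numpy sketch of the radio interferometer measurement equation (RIME)
# for one baseline: V_pq = J_p B J_q^H, where B is the source brightness
# (coherency) matrix and J_p, J_q are per-antenna Jones matrices.
import numpy as np

B = np.eye(2, dtype=complex)       # brightness matrix of an unpolarised source (arbitrary units)

def gain_jones(g_x, g_y):
    """Diagonal complex gain term, one of several possible Jones factors."""
    return np.diag([g_x, g_y]).astype(complex)

J_p = gain_jones(1.02 * np.exp(1j * 0.10), 0.98 * np.exp(-1j * 0.05))
J_q = gain_jones(0.97 * np.exp(1j * 0.02), 1.01 * np.exp(1j * 0.07))

V_pq = J_p @ B @ J_q.conj().T      # predicted visibility for baseline (p, q)
print(np.round(V_pq, 4))
```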
NASA Technical Reports Server (NTRS)
Mainger, Steve
2004-01-01
As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). As one examines the ACES environment from a communication, navigation or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows for forecasting of communications load with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are a very important input to the decisions being made on the acceptability of communication techniques used to fulfill aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is presently available in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.
2013-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system called Sphinx with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we will show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
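A minimal sketch of the kind of Sphinx configuration described above is shown below: a conf.py that enables equation rendering and matplotlib's plot directive so that figures are regenerated at build time. This is a generic example with an assumed project name, not Amanzi's actual configuration.

```python
# Minimal Sphinx conf.py sketch: reStructuredText sources, rendered equations,
# and plots generated by matplotlib at build time. Generic example only.
project = "Example verification suite"
extensions = [
    "sphinx.ext.mathjax",                  # high-quality equation rendering in HTML
    "matplotlib.sphinxext.plot_directive", # run plotting scripts and embed the figures
]
# Building HTML and PDF (via LaTeX) from the same sources is then just:
#   sphinx-build -b html  source/ build/html
#   sphinx-build -b latex source/ build/latex
```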
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
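The quantization idea is simple enough to sketch directly: a state trajectory is reported only when it crosses a quantum boundary, so a receiver tracking the sender needs far fewer messages than a fixed-rate update stream. The toy example below is illustrative only and is not the DEVS/HLA implementation.

```python
# Hedged sketch of quantization: report the state only at quantum level
# crossings, so a slowly varying trajectory generates few updates.
import math

def quantized_updates(samples, quantum=0.5):
    """Yield (index, quantized_value) only when the quantized level changes."""
    last_level = None
    for i, x in enumerate(samples):
        level = round(x / quantum) * quantum
        if last_level is None or level != last_level:
            last_level = level
            yield i, level

trajectory = [math.sin(0.01 * i) * 3.0 for i in range(1000)]
updates = list(quantized_updates(trajectory, quantum=0.5))
print(f"{len(updates)} updates instead of {len(trajectory)} samples")
```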
Sethi, Suresh; Linden, Daniel; Wenburg, John; Lewis, Cara; Lemons, Patrick R.; Fuller, Angela K.; Hare, Matthew P.
2016-01-01
Error-tolerant likelihood-based match calling presents a promising technique to accurately identify recapture events in genetic mark–recapture studies by combining probabilities of latent genotypes and probabilities of observed genotypes, which may contain genotyping errors. Combined with clustering algorithms to group samples into sets of recaptures based upon pairwise match calls, these tools can be used to reconstruct accurate capture histories for mark–recapture modelling. Here, we assess the performance of a recently introduced error-tolerant likelihood-based match-calling model and sample clustering algorithm for genetic mark–recapture studies. We assessed both biallelic (i.e. single nucleotide polymorphisms; SNP) and multiallelic (i.e. microsatellite; MSAT) markers using a combination of simulation analyses and case study data on Pacific walrus (Odobenus rosmarus divergens) and fishers (Pekania pennanti). A novel two-stage clustering approach is demonstrated for genetic mark–recapture applications. First, repeat captures within a sampling occasion are identified. Subsequently, recaptures across sampling occasions are identified. The likelihood-based matching protocol performed well in simulation trials, demonstrating utility for use in a wide range of genetic mark–recapture studies. Moderately sized SNP (64+) and MSAT (10–15) panels produced accurate match calls for recaptures and accurate non-match calls for samples from closely related individuals in the face of low to moderate genotyping error. Furthermore, matching performance remained stable or increased as the number of genetic markers increased, genotyping error notwithstanding.
Component Technology for High-Performance Scientific Simulation Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epperly, T; Kohn, S; Kumfert, G
2000-11-09
We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.
Analysis of In-Route Wireless Charging for the Shuttle System at Zion National Park
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meintz, Andrew; Prohaska, Robert; Konan, Arnaud
System right-sizing is critical to implementation of wireless power transfer (WPT) for electric vehicles (EVs). This study will analyze potential WPT scenarios for the electrification of shuttle buses at Zion National Park utilizing a modelling tool developed by NREL called WPTSim. This tool uses second-by-second speed, location, and road grade data from the conventional shuttles in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. The outcome of this work is an analysis of the design tradeoffs for the electrification of the shuttle fleet with wireless charging versus conventional overnight charging.
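A toy version of the kind of calculation such a tool performs is sketched below: a simple road-load power model is integrated over a second-by-second speed and grade trace, battery state of charge is tracked, and charge is added whenever the vehicle dwells in a (hypothetical) wireless charging zone. All vehicle parameters, the drive trace, and the charger locations are invented; this is not WPTSim.

```python
# Hedged toy drive-cycle simulation with in-route wireless charging.
mass, g, crr, cda, rho = 12000.0, 9.81, 0.008, 5.5, 1.2    # bus-like road-load parameters
battery_kwh, soc = 60.0, 0.90
charger_kw = 50.0
charger_windows = [(60, 120), (420, 480)]                  # dwell windows (s) over a pad

# one-second trace: alternating 60 s of driving at 8 m/s and 60 s stopped, 2% grade
trace = [(8.0 if (t // 60) % 2 == 0 else 0.0, 0.02) for t in range(1200)]

for t, (v, grade) in enumerate(trace):
    tractive_w = v * (mass * g * (crr + grade) + 0.5 * rho * cda * v * v)
    soc -= max(tractive_w, 0.0) / 3.6e6 / battery_kwh      # 1 s of discharge, J -> kWh
    if v == 0.0 and any(a <= t < b for a, b in charger_windows):
        soc = min(1.0, soc + charger_kw / 3600.0 / battery_kwh)   # 1 s of wireless charging
print(f"final state of charge: {soc:.1%}")
```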
Simulation Facilities and Test Beds for Galileo
NASA Astrophysics Data System (ADS)
Schlarmann, Bernhard Kl.; Leonard, Arian
2002-01-01
Galileo is the European satellite navigation system, financed by the European Space Agency (ESA) and the European Commission (EC). The Galileo System, currently in its definition phase, will offer seamless global coverage, providing state-of-the-art positioning and timing services. Galileo services will include a standard service targeted at mass market users, an augmented integrity service providing integrity warnings when faults occur, and Public Regulated Services (ensuring continuity of service for public users). Other services are under consideration (SAR and integrated communications). Galileo will be interoperable with GPS, and will be complemented by local elements that will enhance the services for specific local users. In the frame of the Galileo definition phase, several system design and simulation facilities and test beds have been defined and developed for the coming phases of the project, or are currently under development. These are mainly the following tools: the Galileo Mission Analysis Simulator, to design the Space Segment, especially to support constellation design, deployment and replacement; the Galileo Service Volume Simulator, to analyse the global performance requirements based on a coverage analysis for different service levels and degraded modes; the Galileo System Simulation Facility, a sophisticated end-to-end simulation tool to assess the navigation performance for a complete variety of users under different operating conditions and different modes; the Galileo Signal Validation Facility, to evaluate signal and message structures for Galileo; and the Galileo System Test Bed (Version 1), to assess and refine the Orbit Determination & Time Synchronisation and Integrity algorithms, through experiments relying on GPS space infrastructure. This paper presents an overview of the so-called "G-Facilities" and describes the use of the different system design tools during the project life cycle in order to design the system with respect to availability, continuity and integrity requirements. It gives more details on two of these system design tools: the Galileo Signal Validation Facility (GSVF) and the Galileo System Simulation Facility (GSSF). It describes the operational use of these facilities within the complete set of design tools, and especially the combined use of GSVF and GSSF. Finally, this paper also presents examples and results obtained with these tools.
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
1999-01-01
The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines-the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.
High-performance computational fluid dynamics: a custom-code approach
NASA Astrophysics Data System (ADS)
Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.
2016-07-01
We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFDs) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFDs, while also providing insight for those interested in more general aspects of high-performance computing.
Evolution of surface characteristics in material removal simulation with subaperture tools
NASA Astrophysics Data System (ADS)
Kim, Sug-Whan; Jee, Myung-Kook
2002-02-01
Over the last decade, we have witnessed that the fabrication of optics on the 200-2000 mm scale has received relatively little attention in fabrication technology development, compared to optics smaller than 200 mm and larger than 2000 mm in diameter. As a result, the optical surfaces of these scales are still predominantly completed by small optics shops where opticians apply traditional polishing techniques. The lack of tools to aid opticians in planning, executing and analyzing their polishing work is a root cause of long and sometimes unpredictable delivery times and high manufacturing costs for such optical surfaces. We present the on-going development of a software simulation environment called the Surface Analysis and Fabrication Environment (SAFE). It is primarily intended to increase the throughput of polishing and testing cycles by allowing opticians to simulate the resulting surface form and roughness from input polishing variables. A brief review of current polishing techniques and their target optics clarifies the need for such a simulation tool. This is followed by the development targets and a preliminary simulation plan using the developmental version of SAFE. Among the many polishing variables, two removal assumptions and three different types of removal functions were used for the polishing simulations presented. The simulations show that the Gaussian removal function with the proportional removal assumption resulted in the fastest, though marginal, convergence to a super-polished surface of 0.56 micron Peak-to-Valley in form accuracy and of 0.02 nanometer in surface roughness Ra. Other meaningful results and their implications are also presented.
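The proportional-removal idea with a Gaussian removal function can be sketched in one dimension: the predicted removal along a scan line is the convolution of the tool's removal function with the dwell time at each position, and dwelling longer where the error is larger drives the residual form error down. The parameters below are illustrative only and not taken from SAFE.

```python
# Hedged 1-D sketch of convolution-based removal simulation.
import numpy as np

x = np.linspace(-50.0, 50.0, 1001)                     # mm along the scan line
surface_error = 0.5 * np.exp(-(x / 20.0) ** 2)         # initial form error, microns

tool = np.exp(-(np.linspace(-5.0, 5.0, 101) / 2.0) ** 2)   # Gaussian removal function
tool /= tool.sum()                                         # normalize the influence function

dwell = surface_error / surface_error.max()            # dwell longer where error is larger
removal = np.convolve(dwell, tool, mode="same")        # predicted removal profile
removal *= surface_error.max() / removal.max()         # scale peak removal to peak error
residual = surface_error - removal

print("initial PV: %.3f um   residual PV: %.3f um"
      % (np.ptp(surface_error), np.ptp(residual)))
```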
Monte Carlo simulation of particle-induced bit upsets
NASA Astrophysics Data System (ADS)
Wrobel, Frédéric; Touboul, Antoine; Vaillé, Jean-Roch; Boch, Jérôme; Saigné, Frédéric
2017-09-01
We investigate the issue of radiation-induced failures in electronic devices by developing a Monte Carlo tool called MC-Oracle. It is able to transport particles in a device, to calculate the energy deposited in the sensitive region of the device, and to calculate the transient current induced by the primary particle and the secondary particles produced during nuclear reactions. We compare our simulation results with experiments on SRAMs irradiated with neutrons, protons and ions. The agreement is very good and shows that it is possible to predict the soft error rate (SER) for a given device in a given environment.
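As a toy illustration of the Monte Carlo idea (not MC-Oracle itself), the sketch below samples a deposited-charge distribution for a large number of particle strikes and counts how many exceed an assumed critical charge, giving an upset probability per strike. The distribution and threshold are invented; a real tool transports primaries and nuclear-reaction secondaries through the device geometry.

```python
# Hedged toy Monte Carlo estimate of an upset probability per particle strike.
import random

def upset_probability(n_strikes=1_000_000, q_crit_fc=2.0, seed=1):
    rng = random.Random(seed)
    # deposited charge per strike drawn from an invented exponential model (fC)
    upsets = sum(1 for _ in range(n_strikes) if rng.expovariate(1.0) > q_crit_fc)
    return upsets / n_strikes

print("upset probability per strike:", upset_probability())
```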
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
Building complex simulations rapidly using MATRIX(x): The Space Station redesign
NASA Technical Reports Server (NTRS)
Carrington, C. K.
1994-01-01
MSFC's quick response to the Space Station redesign effort last year required the development of a computer simulation to model the attitude and station-keeping dynamics of a complex body with rotating solar arrays in orbit around the Earth. The simulation was written using a rapid-prototyping graphical simulation and design tool called MATRIX(x) and provided the capability to quickly remodel complex configuration changes by icon manipulation using a mouse. The simulation determines time-dependent inertia properties, and models forces and torques from gravity-gradient, solar radiation, and aerodynamic disturbances. Surface models are easily built from a selection of beams, plates, tetrahedrons, and cylinders. An optimization scheme was written to determine the torque equilibrium attitudes that balance gravity-gradient and aerodynamic torques over an orbit, and propellant-usage estimates were determined. The simulation has been adapted to model the attitude dynamics for small spacecraft.
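One of the disturbance torques mentioned above, the gravity-gradient torque, has a closed form that is easy to sketch: T = 3 n^2 u x (I u), with n the orbital rate, u the nadir unit vector in body axes, and I the inertia tensor. The numbers below are placeholders, not Space Station properties.

```python
# Hedged sketch of the classical gravity-gradient torque model.
import numpy as np

mu_earth = 3.986004418e14           # m^3/s^2
r_orbit = 6_778_000.0               # ~400 km altitude circular orbit
n = np.sqrt(mu_earth / r_orbit**3)  # orbital rate, rad/s

I = np.diag([8.0e6, 1.2e7, 1.5e7])  # kg m^2, placeholder inertia tensor
u = np.array([0.2, 0.1, 0.97])      # nadir direction in the body frame
u = u / np.linalg.norm(u)

torque = 3.0 * n**2 * np.cross(u, I @ u)   # N m
print("gravity-gradient torque [N m]:", np.round(torque, 4))
```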
A Fast-Time Simulation Environment for Airborne Merging and Spacing Research
NASA Technical Reports Server (NTRS)
Bussink, Frank J. L.; Doble, Nathan A.; Barmore, Bryan E.; Singer, Sharon
2005-01-01
As part of NASA's Distributed Air/Ground Traffic Management (DAG-TM) effort, NASA Langley Research Center is developing concepts and algorithms for merging multiple aircraft arrival streams and precisely spacing aircraft over the runway threshold. An airborne tool has been created for this purpose, called Airborne Merging and Spacing for Terminal Arrivals (AMSTAR). To evaluate the performance of AMSTAR and complement human-in-the-loop experiments, a simulation environment has been developed that enables fast-time studies of AMSTAR operations. The environment is based on TMX, a multiple aircraft desktop simulation program created by the Netherlands National Aerospace Laboratory (NLR). This paper reviews the AMSTAR concept, discusses the integration of the AMSTAR algorithm into TMX and the enhancements added to TMX to support fast-time AMSTAR studies, and presents initial simulation results.
Joint Experiment on Scalable Parallel Processors (JESPP) Parallel Data Management
2006-05-01
This report describes a data management and analysis tool called the Simulation Data Grid (SDG). The design principles driving the design of SDG include minimizing network communication. An initial prototype implementation of the system is described. The project follows on earlier research in which a distributed logging system was developed; that system had some limitations, which are described in this report along with how the SDG addresses them.
Improved Load Alleviation Capability for the KC-135
1997-09-01
This thesis made use of simulation software available in the simulation laboratory, including Matlab, Mathematica, Simulink, and the Robotica Front End for Mathematica. Following the approach outlined in Spong's text, the derivation of the kinematic and dynamic equations was accomplished using a computer tool called Robotica Front End (RFE) [15], developed by Doctor Spong, which automates the process of calculating the kinematic and dynamic equations.
ALMA from the Users' Perspective
NASA Astrophysics Data System (ADS)
Johnson, Kelsey
2010-05-01
After decades of dreaming and preparation, the call for early science with ALMA is just around the corner. The goal of this talk is to illustrate the process of preparing and carrying out a research program with ALMA. This presentation will step through the user interface for proposal preparation, proposal review, project tracking, data acquisition, and post-processing. Examples of the software tools, including the simulator and spectral line catalog, will be included.
NASA Astrophysics Data System (ADS)
Prettyman, T. H.; Gardner, R. P.; Verghese, K.
1993-08-01
A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
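The splitting and Russian roulette moves mentioned above can be illustrated with a toy weight window: histories whose weight falls below the window play roulette (killed with some probability, with survivors reweighted so the expected weight is preserved), and histories above the window are split into several lower-weight copies. The window bounds and weight distribution below are arbitrary and unrelated to McENL.

```python
# Hedged toy weight window: Russian roulette below the window, splitting above it.
import math, random

def apply_weight_window(weights, w_low=0.5, w_high=2.0, rng=random):
    out = []
    for w in weights:
        if w < w_low:                          # Russian roulette
            if rng.random() < w / w_low:       # survive with probability w / w_low
                out.append(w_low)              # survivors carry the window weight
        elif w > w_high:                       # splitting
            n = math.ceil(w / w_high)
            out.extend([w / n] * n)            # n copies sharing the weight equally
        else:
            out.append(w)
    return out

random.seed(3)
weights = [random.lognormvariate(0.0, 1.5) for _ in range(1000)]
tracked = apply_weight_window(weights)
print(f"{len(tracked)} histories after the window; "
      f"total weight {sum(tracked):.0f} vs {sum(weights):.0f} before")
```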
NASA Astrophysics Data System (ADS)
Farroni, Flavio; Lamberti, Raffaele; Mancinelli, Nicolò; Timpone, Francesco
2018-03-01
Tyres play a key role in ground vehicle dynamics because they are responsible for traction, braking and cornering. A proper tyre-road interaction model is essential for a useful and reliable vehicle dynamics model. In the last two decades Pacejka's Magic Formula (MF) has become a standard in the simulation field. This paper presents a tool called TRIP-ID (Tyre Road Interaction Parameters IDentification), developed to characterize and identify, with a high degree of accuracy and reliability, MF micro-parameters from experimental data derived from telemetry or from a test rig. The tool interactively guides the user through the identification process on the basis of strong diagnostic considerations about the experimental data, which are made evident by the tool itself. A motorsport application of the tool is shown as a case study.
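As a rough sketch of what such an identification step involves, the example below fits the four macro-parameters of Pacejka's formula, y = D sin(C arctan(Bx - E(Bx - arctan(Bx)))), to synthetic slip/force data by least squares. Real telemetry or test-rig channels, and the micro-parameter structure handled by TRIP-ID, are of course richer than this.

```python
# Hedged sketch of Magic Formula parameter identification on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def magic_formula(slip, B, C, D, E):
    bx = B * slip
    return D * np.sin(C * np.arctan(bx - E * (bx - np.arctan(bx))))

# synthetic "measured" lateral force vs slip angle (rad), with noise
true = (8.0, 1.6, 4500.0, -0.5)
slip = np.linspace(-0.3, 0.3, 200)
force = magic_formula(slip, *true) + np.random.default_rng(0).normal(0, 50, slip.size)

popt, _ = curve_fit(magic_formula, slip, force, p0=(10.0, 1.5, 4000.0, 0.0))
print("identified B, C, D, E:", np.round(popt, 2))
```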
NASA Astrophysics Data System (ADS)
Chooramun, N.; Lawrence, P. J.; Galea, E. R.
2017-08-01
In all evacuation simulation tools, the space through which agents navigate and interact is represented by one of the following methods: Coarse regions, Fine nodes or Continuous regions. Each of these spatial representation methods has its benefits and limitations. For instance, the Coarse approach allows simulations to be processed very rapidly, but is unable to represent the interactions of the agents from an individual perspective; the Continuous approach provides a detailed representation of agent movement and interaction but suffers from relatively poor computational performance. The Fine nodal approach presents a compromise between the Continuous and Coarse approaches in that it allows agent interaction to be modelled while providing good computational performance. Our approach for representing space in an evacuation simulation tool differs in that it allows evacuation simulations to be run using a combination of Coarse regions, Fine nodes and Continuous regions. This approach, which we call Hybrid Spatial Discretisation (HSD), is implemented within the buildingEXODUS evacuation simulation software. The HSD incorporates the benefits of each of the spatial representation methods whilst providing an optimal environment for representing agent movement and interaction. In this work, we demonstrate the effectiveness of the HSD through its application to a moderately large case comprising an underground rail tunnel station with a population of 2,000 agents.
Methodological challenges collecting parent phone-call healthcare utilization data.
Moreau, Paula; Crawford, Sybil; Sullivan-Bolyai, Susan
2016-02-01
Recommendations by the National Institute of Nursing Research and other groups have strongly encouraged nurses to pay greater attention to cost-effectiveness analysis when conducting research. Given the increasing prominence of translational science and comparative effectiveness research, cost-effectiveness analysis has become a basic tool for determining intervention value in research. Tracking phone-call communication (number of calls and context) with cross-checks between parents and healthcare providers is an example of this type of healthcare utilization data collection. This article identifies some methodological challenges that have emerged in the process of collecting this type of data in a randomized controlled trial: Parent Education Through Simulation-Diabetes (PETS-D). We also describe ways in which those challenges have been addressed, with comparison data results, and make recommendations for future research. Copyright © 2015 Elsevier Inc. All rights reserved.
An Interactive, Design and Educational Tool for Supersonic External-Compression Inlets
NASA Technical Reports Server (NTRS)
Benson, Thomas J.
1994-01-01
A workstation-based interactive design tool called VU-INLET was developed for the inviscid flow in rectangular, supersonic, external-compression inlets. VU-INLET solves for the flow conditions from free stream, through the supersonic compression ramps, across the terminal normal shock region and the subsonic diffuser to the engine face. It calculates the shock locations, the capture streamtube, and the additive drag of the inlet. The inlet geometry can be modified using a graphical user interface and the new flow conditions recalculated interactively. Free stream conditions and engine airflow can also be interactively varied and off-design performance evaluated. Flow results from VU-INLET can be saved to a file for a permanent record, and a series of help screens make the simulator easy to learn and use. This paper will detail the underlying assumptions of the models and the numerical methods used in the simulator.
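The ramp calculations such a tool performs rest on the standard oblique-shock relations; the sketch below (not VU-INLET code) solves the theta-beta-Mach relation numerically for a single compression ramp and reports the downstream Mach number and static pressure ratio. The free-stream Mach number and ramp angle are illustrative assumptions.

```python
# Generic oblique-shock relations for a single supersonic compression ramp
# (illustrative; not the VU-INLET implementation). gamma, M1 and theta are assumed.
import numpy as np
from scipy.optimize import brentq

gamma = 1.4
M1 = 2.5                      # free-stream Mach number (assumed)
theta = np.radians(10.0)      # ramp angle (assumed)

def theta_from_beta(beta):
    # theta-beta-Mach relation for a given shock angle beta
    num = M1**2 * np.sin(beta)**2 - 1.0
    den = M1**2 * (gamma + np.cos(2 * beta)) + 2.0
    return np.arctan(2.0 / np.tan(beta) * num / den)

# Weak-shock solution: bracket beta between the Mach angle and ~65 degrees.
mach_angle = np.arcsin(1.0 / M1)
beta = brentq(lambda b: theta_from_beta(b) - theta, mach_angle + 1e-6, np.radians(65.0))

Mn1 = M1 * np.sin(beta)
Mn2 = np.sqrt((1 + 0.5 * (gamma - 1) * Mn1**2) / (gamma * Mn1**2 - 0.5 * (gamma - 1)))
M2 = Mn2 / np.sin(beta - theta)
p_ratio = 1 + 2 * gamma / (gamma + 1) * (Mn1**2 - 1)
print(f"shock angle = {np.degrees(beta):.2f} deg, M2 = {M2:.3f}, p2/p1 = {p_ratio:.3f}")
```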
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of the existing system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; Evans, A.
1999-01-01
Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.
Analysis of In-Route Wireless Charging for the Shuttle System at Zion National Park
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meintz, Andrew; Prohaska, Robert; Konan, Arnaud
System right-sizing is critical to implementation of wireless power transfer (WPT) for electric vehicles. This study will analyze potential WPT scenarios for the electrification of shuttle buses at Zion National Park utilizing a modelling tool developed by the National Renewable Energy Laboratory called WPTSim. This tool uses second-by-second speed, location, and road grade data from the conventional shuttles in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. The outcome of this work is an analysis of the design tradeoffs for the electrification of the shuttle fleet with wireless charging versus conventional overnight charging.
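A toy version of the second-by-second bookkeeping such a tool performs (not WPTSim itself): battery state of charge is integrated over a drive cycle, with wireless charging applied whenever the vehicle lies within an assumed charger zone; all loads, speeds and zone locations are invented for the example.

```python
# Toy second-by-second state-of-charge simulation with in-route wireless
# charging at fixed locations (illustrative only; not WPTSim).
import numpy as np

dt = 1.0                                              # s
t = np.arange(0, 1800, dt)                            # 30-minute cycle
traction_power = 20e3 + 15e3 * np.sin(2 * np.pi * t / 300)  # W, assumed drive-cycle load
position = np.cumsum(np.full(t.size, 8.0) * dt)       # m, assumed 8 m/s average speed

charger_zones = [(2000, 2100), (6000, 6100)]          # m, assumed pad locations
charge_power = 50e3                                   # W, assumed WPT rating
battery_kwh = 60.0

soc = np.empty(t.size)
soc[0] = 0.9
for k in range(1, t.size):
    in_zone = any(a <= position[k] <= b for a, b in charger_zones)
    p_in = charge_power if in_zone else 0.0
    net_wh = (p_in - traction_power[k]) * dt / 3600.0
    soc[k] = np.clip(soc[k - 1] + net_wh / (battery_kwh * 1000.0), 0.0, 1.0)

print(f"final SOC = {soc[-1]:.3f}, minimum SOC = {soc.min():.3f}")
```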
PBxplore: a tool to analyze local protein structure and deformability with Protein Blocks
Craveur, Pierrick; Joseph, Agnel Praveen; Jallu, Vincent
2017-01-01
This paper describes the development and application of a suite of tools, called PBxplore, to analyze the dynamics and deformability of protein structures using Protein Blocks (PBs). Proteins are highly dynamic macromolecules, and a classical way to analyze their inherent flexibility is to perform molecular dynamics simulations. The advantage of using small structural prototypes such as PBs is that they give a good approximation of the local structure of the protein backbone. More importantly, by reducing the conformational complexity of protein structures, PBs allow analysis of local protein deformability, which cannot be done with other methods and has been used efficiently in different applications. PBxplore is able to process large amounts of data such as those produced by molecular dynamics simulations. It produces frequencies, entropy and information logo outputs as text and graphics. PBxplore is available at https://github.com/pierrepo/PBxplore and is released under the open-source MIT license. PMID:29177113
Determining the spatial altitude of the hydraulic fractures.
NASA Astrophysics Data System (ADS)
Khamiev, Marsel; Kosarev, Victor; Goncharova, Galina
2016-04-01
Mathematical modeling and numerical simulation are the most widely used approaches for solving geological problems. They rely on software tools based on the Monte Carlo method. The results of this project show the possibility of using a PNL tool to determine the location of fracturing. The modeled medium is a homogeneous rock (limestone) cut by a vertical borehole (d = 216 mm) with a metal casing 9 mm thick. The cement sheath is 35 mm thick. The borehole is filled with fresh water. The rock mass is cut by a crack filled with a mixture of doped (gadolinium oxide Gd2O3) proppant (75%) and water (25%). A pulse neutron logging (PNL) tool is used for quality control in hydraulic fracturing operations. It includes a fast neutron source (a so-called "neutron generator") and a set of thermal (or epithermal) neutron-sensing devices forming the so-called near (ND) and far (FD) detectors. To evaluate the neutron properties of various segments (sectors) of the rock mass, the detector must register only neutrons that come from that particular formation. This is possible if the detecting block includes several (six, for example) thermal neutron detectors arranged circumferentially inside the tool. As a result, we obtain several independent well logs, each corresponding to a defined rock sector. After processing the synthetic logs, we can determine the spatial position of the hydraulic fracture.
A new approach to road accident rescue.
Morales, Alejandro; González-Aguilera, Diego; López, Alfonso I; Gutiérrez, Miguel A
2016-01-01
This article develops and validates a new methodology and tool for rescue assistance in traffic accidents, with the aim of improving the efficiency and safety of evacuating people and reducing the number of victims in road accidents. Different tests supported by professionals and experts have been designed under different circumstances and with different categories of damaged vehicles coming from real accidents and simulated trapped victims, in order to calibrate and refine the proposed methodology and tool. To validate this new approach, a tool called App_Rescue has been developed. This tool is based on the use of a computer system that allows efficient access to the technical information of the vehicle and the medical information of the usual passengers. The time spent during rescue using the standard protocol and the proposed method was compared. This rescue assistance system makes vital information accessible in posttrauma care services, improving the effectiveness of interventions by the emergency services, reducing the rescue time and therefore minimizing the consequences involved and the number of victims. This could often mean saving lives. In the different simulated rescue operations, the rescue time was reduced by an average of 14%.
Assessment of tools for protection of quality of water: Uncontrollable discharges of pollutants.
Dehghani Darmian, Mohsen; Hashemi Monfared, Seyed Arman; Azizyan, Gholamreza; Snyder, Shane A; Giesy, John P
2018-06-06
Selecting an appropriate crisis management plan during uncontrollable loading of pollution into water systems is crucial. In this research, suitable tools are used to protect the quality of water resources against uncontrollable pollution. The case study chosen in this investigation was a river-reservoir system. Analytical and numerical solutions of the pollutant transport equation were used as the simulation strategy to evaluate efficient tools for protecting water quality. These practical instruments are dilution flow and a new tool called detention time, which is proposed and simulated for the first time in this study. For an uncontrollable pollution discharge that was approximately 130% of the river's assimilation capacity, as long as the duration of contact (T_c) was considered as a constraint, releasing 30% of the base flow of the river from the upstream dilution reservoir allowed the unallowable pollution to be treated. Moreover, when the affected distance (X_c) was selected as a constraint, the required detention time for which the rubber dam should detain the water to be treated was equal to 187% of the initial duration of contact. Copyright © 2018 Elsevier Inc. All rights reserved.
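For reference, the classical analytical solution of the one-dimensional advection-dispersion equation for an instantaneous mass release, the kind of expression underlying such a simulation strategy; the parameter values are illustrative and not those of the case study.

```python
# Analytical 1-D advection-dispersion solution for an instantaneous release
# (textbook expression; parameters below are illustrative, not the study's).
import numpy as np

M = 500.0      # released mass, kg (assumed)
A = 40.0       # river cross-sectional area, m^2 (assumed)
u = 0.4        # mean flow velocity, m/s (assumed)
D = 5.0        # longitudinal dispersion coefficient, m^2/s (assumed)

def concentration(x, t):
    """C(x, t) = M / (A*sqrt(4*pi*D*t)) * exp(-(x - u*t)^2 / (4*D*t))."""
    return M / (A * np.sqrt(4 * np.pi * D * t)) * np.exp(-(x - u * t)**2 / (4 * D * t))

x = np.linspace(0, 5000, 6)      # m downstream of the release point
print([round(concentration(xi, t=3600.0), 4) for xi in x])
```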
Distributed dynamic simulations of networked control and building performance applications.
Yahiaoui, Azzedine
2018-02-01
The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible; this approach is generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.
Distributed dynamic simulations of networked control and building performance applications
Yahiaoui, Azzedine
2017-01-01
The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible; this approach is generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper. PMID:29568135
Integrating interactive computational modeling in biology curricula.
Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A
2015-03-01
While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
Long Range Plan for Embedded Computer Systems Support. Volume II
1981-10-01
interface (pilot displays and controls plus visual system), and data collection (CMAC data, bus data and simulation data). Non-real time functions include...unless adequate upfront planning is implemented, the command will be controlled by the dynamics rather than controlling them. The upfront planning should...or should they be called manually? What amount and type of data should the various tools pass between each other? Under what conditions and controls
Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques
NASA Technical Reports Server (NTRS)
Lee, Hanbong
2016-01-01
Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently, NASA and American Airlines are jointly developing a decision-support tool called Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improving the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance measurement metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy can be affected by the operational complexity at this airport and how we can improve the fast-time simulation model before implementing it with an airport scheduling algorithm in a real-time environment.
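A minimal sketch of the data-driven side of such a comparison: fit a regression model to historical taxi-time records and score it with a standard error metric. The features and synthetic data below are assumptions, not the SARDA/CLT dataset.

```python
# Illustrative machine-learning taxi-time predictor on synthetic data
# (feature names and data are invented; not the SARDA/CLT dataset).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.integers(1, 20, n),        # aircraft in the departure queue (assumed feature)
    rng.uniform(500, 4000, n),     # taxi distance, m (assumed feature)
    rng.integers(0, 4, n),         # runway configuration index (assumed feature)
])
taxi_time = 120 + 25 * X[:, 0] + 0.05 * X[:, 1] + 30 * X[:, 2] + rng.normal(0, 40, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, taxi_time, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"MAE = {mean_absolute_error(y_te, model.predict(X_te)):.1f} s")
```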
SNV-PPILP: refined SNV calling for tumor data using perfect phylogenies and ILP.
van Rens, Karen E; Mäkinen, Veli; Tomescu, Alexandru I
2015-04-01
Recent studies sequenced tumor samples from the same progenitor at different development stages and showed that by taking into account the phylogeny of this development, single-nucleotide variant (SNV) calling can be improved. Accurate SNV calls can better reveal early-stage tumors, identify mechanisms of cancer progression or help in drug targeting. We present SNV-PPILP, a fast and easy to use tool for refining GATK's Unified Genotyper SNV calls, for multiple samples assumed to form a phylogeny. We tested SNV-PPILP on simulated data, with a varying number of samples, SNVs, read coverage and violations of the perfect phylogeny assumption. We always match or improve the accuracy of GATK, with a significant improvement on low read coverage. SNV-PPILP, available at cs.helsinki.fi/gsa/snv-ppilp/, is written in Python and requires the free ILP solver lp_solve. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Mass and energy flows between the Solar chromosphere, transition region, and corona
NASA Astrophysics Data System (ADS)
Hansteen, V. H.
2017-12-01
A number of increasingly sophisticated numerical simulations spanning the convection zone to corona have shed considerable insight into the role of the magnetic field in the structure and energetics of the Sun's outer atmosphere. This development is strengthened by the wealth of observational data now coming on-line from both ground based and space borne observatories. We discuss what numerical models can tell us about the mass and energy flows in the region of the upper chromosphere and lower corona, using a variety of tools, including the direct comparison with data and the use of passive tracer particles (so-called 'corks') inserted into the simulated flows.
FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna
2016-01-01
Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and show an example problem (taken from the open literature) of a laminated CMC under off-axis loading. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.
Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu
2015-01-01
Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and show an example problem (taken from the open literature) of a laminated CMC under off-axis loading. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.
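A highly simplified sketch of the underlying idea, stochastic (Weibull-distributed) element strengths with stiffness reduction of failed elements and load redistribution, shown here as a parallel-bar toy model rather than the FEAMAC/CARES UMAT; all parameters are assumptions.

```python
# Toy illustration of stochastic-strength-based damage with stiffness
# reduction: parallel bars with Weibull-distributed strengths share a
# growing load; failed bars lose their stiffness. Not FEAMAC/CARES.
import numpy as np

rng = np.random.default_rng(2)
n_el = 1000
m, sigma0 = 10.0, 300.0                       # Weibull modulus / scale in MPa (assumed)
strengths = sigma0 * rng.weibull(m, n_el)     # random element strengths
stiff = np.ones(n_el)                         # relative stiffness (1 = intact, 0 = failed)

for applied in np.linspace(0, 400, 81):       # nominal applied stress, MPa
    live = stiff > 0
    if not live.any():
        break
    stress = np.zeros(n_el)
    stress[live] = applied * n_el / live.sum()   # load redistributed onto intact elements
    stiff[stress > strengths] = 0.0              # stiffness of failed elements set to zero

print(f"surviving fraction at {applied:.0f} MPa nominal stress: {stiff.mean():.3f}")
```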
A Novel Mobile Testing Equipment for Rock Cuttability Assessment: Vertical Rock Cutting Rig (VRCR)
NASA Astrophysics Data System (ADS)
Yasar, Serdar; Yilmaz, Ali Osman
2017-04-01
In this study, a new mobile rock cutting testing apparatus, called the vertical rock cutting rig (VRCR), was designed and produced for rock cuttability assessment; it was designed specifically to fit into the hydraulic press testing equipment available in almost every rock mechanics laboratory. Rock cutting trials were initiated just after the production of the VRCR, along with calibration of the measuring load cell against an external load cell to validate the recorded force data. Then, controlled rock cutting tests in both relieved and unrelieved cutting modes were implemented on five different volcanic rock samples with a standard simple-shaped wedge tool. Additionally, the core cutting test, which is an important approach for roadheader performance prediction, was simulated with the VRCR. Mini disc cutters and point attack tools were used for the execution of experimental trials. Results clearly showed that the rock cutting tests were successfully realized and that the measuring system is sensitive to rock strength, cutting depth and other variables. The core cutting test was successfully simulated, and it was also shown that rock cutting tests with mini disc cutters and point attack tools are also successful with the VRCR.
Drewes, Rich; Zou, Quan; Goodman, Philip H
2009-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.
Drewes, Rich; Zou, Quan; Goodman, Philip H.
2008-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading “glue” tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS. PMID:19506707
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)
2001-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten
2002-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3d space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
Rossi, Michael R.; Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed
2009-01-01
The current study focuses on experimentally validating a planning scheme based on the so-called bubble-packing method. This study is a part of an ongoing effort to develop computerized planning tools for cryosurgery, where bubble packing has been previously developed as a means to find an initial, uniform distribution of cryoprobes within a given domain; the so-called force-field analogy was then used to move cryoprobes to their optimum layout. However, due to the high quality of the cryoprobes’ distribution, suggested by bubble packing and its low computational cost, it has been argued that a planning scheme based solely on bubble packing may be more clinically relevant. To test this argument, an experimental validation is performed on a simulated cross-section of the prostate, using gelatin solution as a phantom material, proprietary liquid-nitrogen based cryoprobes, and a cryoheater to simulate urethral warming. Experimental results are compared with numerically simulated temperature histories resulting from planning. Results indicate an average disagreement of 0.8 mm in identifying the freezing front location, which is an acceptable level of uncertainty in the context of prostate cryosurgery imaging. PMID:19885373
NASA Astrophysics Data System (ADS)
Lupoglazoff, N.; Vuillot, F.
Some comparisons between firing tests and numerical simulations of vortex shedding via a simple test case called 'C1experimental' are presented. These experiments are performed to validate further numerical simulations, as well as to serve as a tool for facilitating interpretation. At ignition time, the pressure spectra are more complex; this is the effect of vortex pairings. At 6.5 mm burnt, the second longitudinal mode dominates. At 8 mm burnt, the first longitudinal mode dominates. At 11.5 mm burnt, only the first longitudinal mode remains, with a slight shift in the frequency value. Tables are presented which give the pressure oscillation amplitudes of 'C1experimental' with operating pressures, and these amplitudes relative to the corresponding operating pressure.
Student Evaluation of CALL Tools during the Design Process
ERIC Educational Resources Information Center
Nesbitt, Dallas
2013-01-01
This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…
Design and evaluation of a simulation for pediatric dentistry in virtual worlds.
Papadopoulos, Lazaros; Pentzou, Afroditi-Evaggelia; Louloudiadis, Konstantinos; Tsiatsos, Thrasyvoulos-Konstantinos
2013-10-29
Three-dimensional virtual worlds are becoming very popular among educators in the medical field. Virtual clinics and patients are already used for case study and role play in both undergraduate and continuing education levels. Dental education can also take advantage of the virtual world's pedagogical features in order to give students the opportunity to interact with virtual patients (VPs) and practice in treatment planning. The objective of this study was to design and evaluate a virtual patient as a supplemental teaching tool for pediatric dentistry. A child VP, called Erietta, was created by utilizing the programming and building tools that online virtual worlds offer. The case is about an eight-year old girl visiting the dentist with her mother for the first time. Communication techniques such as Tell-Show-Do and parents' interference management were the basic elements of the educational scenario on which the VP was based. An evaluation of the simulation was made by 103 dental students in their fourth year of study. Two groups were formed: an experimental group which was exposed to the simulation (n=52) and a control group which did not receive the simulation (n=51). At the end, both groups were asked to complete a knowledge questionnaire and the results were compared. A statistically significant difference between the two groups was found by applying a t test for independent samples (P<.001), showing a positive learning effect from the VP. The majority of the participants evaluated the aspects of the simulation very positively while 69% (36/52) of the simulation group expressed their preference for using this module as an additional teaching tool. This study demonstrated that a pediatric dentistry VP built in a virtual world offers significant learning potential when used as a supplement to the traditional teaching techniques.
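The group comparison reported above is a standard independent-samples t test; a minimal sketch with made-up score vectors (not the study data) follows.

```python
# Independent-samples t test as used for the knowledge-questionnaire comparison
# (scores below are invented placeholders, not the study's data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
simulation_group = rng.normal(8.2, 1.1, 52)   # hypothetical questionnaire scores
control_group = rng.normal(7.1, 1.3, 51)

t_stat, p_value = stats.ttest_ind(simulation_group, control_group)
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
```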
Numerical simulations of novel high-power high-brightness diode laser structures
NASA Astrophysics Data System (ADS)
Boucke, Konstantin; Rogg, Joseph; Kelemen, Marc T.; Poprawe, Reinhart; Weimann, Guenter
2001-07-01
One of the key topics in today's semiconductor laser development activities is increasing the brightness of high-power diode lasers. Although structures showing increased brightness have been developed, specific drawbacks of these structures lead to a continued strong demand for the investigation of alternative concepts. Especially for the investigation of fundamentally novel structures, easy-to-use and fast simulation tools are essential to avoid unnecessary, costly and time-consuming experiments. A diode laser simulation tool based on finite difference representations of the Helmholtz equation in the 'wide-angle' approximation and the carrier diffusion equation has been developed. An optimized numerical algorithm leads to short execution times of a few seconds per resonator round-trip on a standard PC. After each round-trip, characteristics such as optical output power, beam profile and beam parameters are calculated. A graphical user interface allows online monitoring of the simulation results. The simulation tool is used to investigate a novel high-power, high-brightness diode laser structure, the so-called 'Z-Structure'. In this structure an increased brightness is achieved by reducing the divergence angle of the beam through angular filtering: the round-trip path of the beam is folded twice using internal total reflection at surfaces defined by a small index step in the semiconductor material, forming a stretched 'Z'. The sharp decrease of the reflectivity for angles of incidence above the angle of total reflection leads to a narrowing of the angular spectrum of the beam. The simulations of the 'Z-Structure' indicate an increase of the beam quality by a factor of five to ten compared to standard broad-area lasers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawson, M.; Yu, Y. H.; Nelessen, A.
2014-05-01
Wave energy converters (WECs) are commonly designed and analyzed using numerical models that combine multi-body dynamics with hydrodynamic models based on the Cummins Equation and linearized hydrodynamic coefficients. These modeling methods are attractive design tools because they are computationally inexpensive and do not require the use of high performance computing resources necessitated by high-fidelity methods, such as Navier Stokes computational fluid dynamics. Modeling hydrodynamics using linear coefficients assumes that the device undergoes small motions and that the wetted surface area of the devices is approximately constant. WEC devices, however, are typically designed to undergo large motions in order to maximize power extraction, calling into question the validity of assuming that linear hydrodynamic models accurately capture the relevant fluid-structure interactions. In this paper, we study how calculating buoyancy and Froude-Krylov forces from the instantaneous position of a WEC device (referred to as instantaneous buoyancy and Froude-Krylov forces from herein) changes WEC simulation results compared to simulations that use linear hydrodynamic coefficients. First, we describe the WEC-Sim tool used to perform simulations and how the ability to model instantaneous forces was incorporated into WEC-Sim. We then use a simplified one-body WEC device to validate the model and to demonstrate how accounting for these instantaneously calculated forces affects the accuracy of simulation results, such as device motions, hydrodynamic forces, and power generation.
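A minimal sketch of the distinction studied here for a heaving vertical cylinder: the linear model uses a constant hydrostatic stiffness about the equilibrium draft, while the instantaneous model recomputes buoyancy from the actual submerged volume at each displacement, so the two differ only for large motions. Geometry and densities are assumptions; this is not WEC-Sim code.

```python
# Linearized vs. instantaneous buoyancy force for a heaving vertical cylinder
# (illustrative geometry; not the WEC-Sim implementation).
import numpy as np

rho, g = 1025.0, 9.81         # sea water density (kg/m^3) and gravity (m/s^2)
radius, length = 2.0, 6.0     # cylinder radius and total length, m (assumed)
draft_eq = 3.0                # equilibrium draft, m (assumed)
area = np.pi * radius**2      # waterplane area

def buoyancy_linear(z):
    # linear restoring force about equilibrium: -rho*g*Awp*z
    return -rho * g * area * z

def buoyancy_instantaneous(z):
    # recompute submerged depth from the instantaneous position,
    # clipping at fully emerged / fully submerged
    draft = np.clip(draft_eq - z, 0.0, length)
    return rho * g * area * (draft - draft_eq)

for z in (-2.0, -0.5, 0.0, 0.5, 2.0, 4.0):   # heave displacement, m (positive up)
    print(z, round(buoyancy_linear(z), 1), round(buoyancy_instantaneous(z), 1))
```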
PyRhO: A Multiscale Optogenetics Simulation Platform
Evans, Benjamin D.; Jarvis, Sarah; Schultz, Simon R.; Nikolic, Konstantin
2016-01-01
Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light driven opsins have been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feed-back loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences. PMID:27148037
PyRhO: A Multiscale Optogenetics Simulation Platform.
Evans, Benjamin D; Jarvis, Sarah; Schultz, Simon R; Nikolic, Konstantin
2016-01-01
Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light driven opsins have been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feed-back loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences.
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Xu; Tuo, Rui; Jeff Wu, C. F.
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs with a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. From simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
He, Xu; Tuo, Rui; Jeff Wu, C. F.
2017-01-31
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs with a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. From simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.
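For context, the classical expected improvement (EI) acquisition function that the EQI/EQIE criteria extend and are compared against, in its standard closed form for minimization (posterior mean mu, standard deviation sigma, current best f_min). This is background material, not the authors' EQIE implementation.

```python
# Classical expected improvement (EI) for minimization, the single-accuracy
# baseline the EQI/EQIE criteria are compared against (background sketch only).
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z), with z = (f_min - mu) / sigma."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    safe_sigma = np.where(sigma > 0, sigma, 1.0)
    z = np.where(sigma > 0, (f_min - mu) / safe_sigma, 0.0)
    ei = (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    # at sigma = 0 the improvement is deterministic
    return np.where(sigma > 0, ei, np.maximum(f_min - mu, 0.0))

# Example: Gaussian-process predictions at three candidate inputs, current best = 1.2
print(expected_improvement(mu=[1.0, 1.5, 2.0], sigma=[0.3, 0.5, 0.0], f_min=1.2))
```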
A simulation approach to material removal in microwave drilling of soda lime glass at 2.45 GHz
NASA Astrophysics Data System (ADS)
Lautre, Nitin Kumar; Sharma, Apurbba Kumar; Pradeep, Kumar; Das, Shantanu
2015-09-01
Material removal during microwave drilling is basically due to thermal ablation of the material in the vicinity of the drilling tool. The microtip of the tool, also termed a concentrator, absorbs microwaves and ionizes the dielectric in its proximity, creating a zone of plasma. The plasma takes the shape of a sphere owing to the surrounding atmosphere, and this plasma sphere acts as the source of thermal energy used for processing the material. This mechanism of heating, also called localized microwave heating, was used in the present study to drill holes in 1.2-mm-thick soda lime glass. The mechanism of material removal was analyzed through simulation of the hot spot region, and an attempt was made to explain the results through experimental observations. It was realized that, the glass being a poor conductor of heat, a low power (90 W in this case) yields better drilling results owing to more localized heat corresponding to a low-volume plasma sphere. The short application time prevents further heat transfer, so a localized concentration of heat becomes possible, which primarily causes the material ablation. The plasma sphere appears to be sustained while the tool moves through the bulk of the glass thickness, although its volume shrinks further. The process needs careful selection of the parameters. The simulation results show a relatively low temperature in the top half (opposite to the tool tip) of the plasma sphere, which eventually causes the semimolten viscous glass to collapse into the drill cavity as the tool advances into the bulk, stopping the movement of the tool. The sustained plasma sphere raises the tip temperature, which causes the tip to melt and become blunt. Plasma formation then ceases owing to the larger diameter of the tool, and the tool gets stuck, which was verified through experimental results.
Amanzi: An Open-Source Multi-process Simulator for Environmental Applications
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.
2014-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The platform toolset (Akuna) generates these conceptual models, and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capabilities from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library was developed, called Alquimia. To ensure that Amanzi is truly an open-source community code, we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including testing and documentation development tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
NASA Astrophysics Data System (ADS)
Koutiva, Ifigeneia; Makropoulos, Christos
2015-04-01
The sustainable evolution of the urban water system requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environments. One of the main challenges, in this regard, is the design and development of tools that are able to simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that subsequently lead to the formation of people's attitudes. These attitudes eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The resulting tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are also influenced by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) across the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate the evolution of domestic water demand and (2) to simulate the response of the water system to that evolution. The main advantage of the UWAB-UWOT model integration is that it allows the investigation of the effects of different water demand management strategies on an urban population's water demand behaviour and, ultimately, the effects of these policies on the volume of domestic water demand and the water resources system. The proposed modelling platform is optimised to simulate the effects of water policies during the Athens drought period of 1988-1994. The calibrated modelling platform is then applied to evaluate scenarios of water supply, water demand and water demand management strategies.
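A bare-bones sketch of the kind of agent-based mechanism UWAB encodes (households on a social network adopting conservation levels under peer influence and policy pressure), written in Python rather than NetLogo and with invented parameters; it is not the UWAB code.

```python
# Toy agent-based model of household water-conservation adoption on a random
# social network (illustrative Python sketch; UWAB itself is a NetLogo model).
import random

random.seed(4)
N, STEPS = 200, 50
LEVELS = ("no", "low", "high")
policy_pressure = 0.05            # assumed per-step probability of moving up a level

# random social network: each household gets four neighbours
neighbours = {i: random.sample([j for j in range(N) if j != i], 4) for i in range(N)}
state = {i: "no" for i in range(N)}

def step_up(level):
    return LEVELS[min(LEVELS.index(level) + 1, 2)]

for _ in range(STEPS):
    new_state = dict(state)
    for i in range(N):
        peers = [state[j] for j in neighbours[i]]
        peer_share = sum(p != "no" for p in peers) / len(peers)
        # peer influence and policy pressure both push toward more conservation
        if random.random() < 0.3 * peer_share or random.random() < policy_pressure:
            new_state[i] = step_up(state[i])
    state = new_state

counts = {lvl: sum(s == lvl for s in state.values()) for lvl in LEVELS}
print(counts)   # distribution of conservation levels after the simulated period
```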
Radom, Marcin; Rybarczyk, Agnieszka; Szawulak, Bartlomiej; Andrzejewski, Hubert; Chabelski, Piotr; Kozak, Adam; Formanowicz, Piotr
2017-12-01
Model development and analysis is a fundamental step in systems biology, and the theory of Petri nets offers a tool for this task. With the rapid development of computer science, a variety of tools for Petri nets have emerged, offering various analytical algorithms. From this follows the problem of using different programs to analyse a single model: many file formats and different representations of results make the analysis much harder. Especially for larger nets, the ability to visualize the results in a proper form is a great help in understanding their significance. We present a new tool for Petri net development and analysis called Holmes. Our program contains algorithms for model analysis based on different types of Petri nets, e.g. an invariant generator, Maximum Common Transitions (MCT) sets and cluster modules, simulation algorithms and knockout analysis tools. A very important feature is the ability to visualize the results of almost all analytical modules. The integration of such modules into one graphical environment allows a researcher to fully devote his or her time to model building and analysis. Available at http://www.cs.put.poznan.pl/mradom/Holmes/holmes.html. piotr@cs.put.poznan.pl. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Dynamics of flexible bodies in tree topology - A computer oriented approach
NASA Technical Reports Server (NTRS)
Singh, R. P.; Vandervoort, R. J.; Likins, P. W.
1984-01-01
An approach suited for automatic generation of the equations of motion for large mechanical systems (i.e., large space structures, mechanisms, robots, etc.) is presented. The system topology is restricted to a tree configuration. The tree is defined as an arbitrary set of rigid and flexible bodies connected by hinges characterizing relative translations and rotations of two adjoining bodies. The equations of motion are derived via Kane's method. The resulting equation set is of minimum dimension. The dynamical equations are embedded in a computer program called TREETOPS. Extensive control simulation capability is built into the TREETOPS program. The simulation is driven by an interactive set-up program, resulting in an easy-to-use analysis tool.
Simulation of Etching in Chlorine Discharges Using an Integrated Feature Evolution-Plasma Model
NASA Technical Reports Server (NTRS)
Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.; Biegel, Bryan (Technical Monitor)
2002-01-01
To better utilize its vast collection of heterogeneous resources that are geographically distributed across the United States, NASA is constructing a computational grid called the Information Power Grid (IPG). This paper describes various tools and techniques that we are developing to measure and improve the performance of a broad class of NASA applications when run on the IPG. In particular, we are investigating the areas of grid benchmarking, grid monitoring, user-level application scheduling, and decentralized system-level scheduling.
An advanced environment for hybrid modeling of biological systems based on modelica.
Pross, Sabrina; Bachmann, Bernhard
2011-01-20
Biological systems are often very complex, so an appropriate formalism is needed for modeling their behavior. Hybrid Petri Nets, consisting of time-discrete Petri Net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri Net library was implemented based on the object-oriented modeling language Modelica, which allows the modeling of discrete, stochastic and continuous Petri Net elements by differential, algebraic and discrete equations. An appropriate Modelica tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica models can be connected to Simulink models for parameter optimization, sensitivity analysis and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri Net component models, their usage within the modeling process and the coupling between the Modelica tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese Hamster Ovary Cells.
NASA Astrophysics Data System (ADS)
Courageot, Estelle; Sayah, Rima; Huet, Christelle
2010-05-01
Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.
Courageot, Estelle; Sayah, Rima; Huet, Christelle
2010-05-07
Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, which limits its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
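A minimal illustration of the quantitative core of FTA: propagating assumed basic-event probabilities through AND/OR gates to the top event, with a Monte Carlo cross-check. The tree and probabilities are invented and this is not the AS-II algorithm.

```python
# Tiny fault-tree example: TOP = (A AND B) OR C, with assumed independent
# basic-event probabilities. Analytical propagation plus a Monte Carlo check.
# Illustrative only; not the AS-II algorithm described in the paper.
import random

p_A, p_B, p_C = 0.02, 0.05, 0.001   # assumed basic-event probabilities

# Analytical: P(AND) = pA*pB ; P(OR) = 1 - (1 - p1)(1 - p2) for independent events
p_and = p_A * p_B
p_top = 1.0 - (1.0 - p_and) * (1.0 - p_C)
print(f"analytical P(top) = {p_top:.6f}")

# Monte Carlo cross-check
random.seed(5)
trials = 1_000_000
hits = sum(
    ((random.random() < p_A) and (random.random() < p_B)) or (random.random() < p_C)
    for _ in range(trials)
)
print(f"Monte Carlo  P(top) ~ {hits / trials:.6f}")
```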
Anticipation of the landing shock phenomenon in flight simulation
NASA Technical Reports Server (NTRS)
Mcfarland, Richard E.
1987-01-01
An aircraft landing may be described as a controlled crash because a runway surface is intercepted. In a simulation model, the transition from aerodynamic flight to weight on wheels involves a single computational cycle during which stiff differential equations are activated; with a significant probability, these initial conditions are unrealistic. This occurs because of the finite cycle time, during which large restorative forces will accompany unrealistic initial oleo compressions. This problem was recognized a few years ago at Ames Research Center during simulation studies of a supersonic transport. The mathematical model of this vehicle severely taxed computational resources and required a large cycle time. The ground strike problem was solved by a technique, described here, called anticipation equations. This extensively used technique has not been previously reported. The technique of anticipating a significant event is a useful tool in the general field of discrete flight simulation. For the differential equations representing a landing gear model, stiffness, rate of interception and cycle time may combine to produce an unrealistic simulation of the continuum.
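A schematic of the anticipation idea in its simplest form: before committing to a full integration step, predict whether the gear will pass through the runway during the coming cycle and, if so, advance only to the predicted contact instant so the stiff oleo equations start from a realistic compression. The one-degree-of-freedom model and numbers are assumptions, not the original simulation.

```python
# Schematic "anticipation" of runway intercept in a fixed-step simulation:
# if the predicted altitude at the end of the cycle is below the runway,
# integrate only up to the predicted touchdown time so the stiff gear model
# starts from (near) zero compression. Illustrative 1-DOF sketch only.
h, v = 2.0, -3.0        # gear height above runway (m) and vertical speed (m/s), assumed
dt = 0.05               # simulation cycle time, s (assumed, deliberately coarse)
g = 9.81

t = 0.0
while h > 0.0:
    h_pred = h + v * dt                     # anticipate position one cycle ahead
    if h_pred <= 0.0:
        dt_contact = h / (-v)               # fraction of the cycle until intercept
        t += dt_contact
        h, v = 0.0, v - g * dt_contact
        print(f"touchdown anticipated at t = {t:.3f} s with sink rate {v:.2f} m/s")
        break
    h, v = h_pred, v - g * dt
    t += dt
```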
Middle-ear microsurgery simulation to improve new robotic procedures.
Kazmitcheff, Guillaume; Nguyen, Yann; Miroir, Mathieu; Péan, Fabien; Ferrary, Evelyne; Cotin, Stéphane; Sterkers, Olivier; Duriez, Christian
2014-01-01
Otological microsurgery is delicate and requires high dexterity in bad ergonomic conditions. To assist surgeons in these indications, a teleoperated system, called RobOtol, is developed. This robot enhances gesture accuracy and handiness and allows exploration of new procedures for middle ear surgery. To plan new procedures that exploit the capacities given by the robot, a surgical simulator is developed. The simulation reproduces with high fidelity the behavior of the anatomical structures and can also be used as a training tool for an easier control of the robot for surgeons. In the paper, we introduce the middle ear surgical simulation and then we perform virtually two challenging procedures with the robot. We show how interactive simulation can assist in analyzing the benefits of robotics in the case of complex manipulations or ergonomics studies and allow the development of innovative surgical procedures. New robot-based microsurgical procedures are investigated. The improvement offered by RobOtol is also evaluated and discussed.
An Object-Oriented Finite Element Framework for Multiphysics Phase Field Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael R Tonks; Derek R Gaston; Paul C Millett
2012-01-01
The phase field approach is a powerful and popular method for modeling microstructure evolution. In this work, advanced numerical tools are used to create a phase field framework that facilitates rapid model development. This framework, called MARMOT, is based on Idaho National Laboratory's finite element Multiphysics Object-Oriented Simulation Environment. In MARMOT, the system of phase field partial differential equations (PDEs) is solved simultaneously with PDEs describing additional physics, such as solid mechanics and heat conduction, using the Jacobian-Free Newton Krylov method. An object-oriented architecture is created by taking advantage of commonalities in phase field models to facilitate development of new models with very little written code. In addition, MARMOT provides access to mesh and time step adaptivity, reducing the cost of performing simulations with large disparities in both spatial and temporal scales. In this work, phase separation simulations are used to show the numerical performance of MARMOT. Deformation-induced grain growth and void growth simulations are included to demonstrate the multiphysics capability.
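A minimal sketch of the solver strategy described above (solving coupled equations simultaneously with a Jacobian-free Newton-Krylov method), using SciPy's newton_krylov on a toy pair of coupled 1-D steady equations rather than MARMOT's phase field system.

```python
import numpy as np
from scipy.optimize import newton_krylov

# Toy coupled diffusion-reaction pair solved as one nonlinear system, to
# illustrate the "all equations in one Newton-Krylov solve" idea.

n = 50
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

def residual(vec):
    u, c = vec[:n], vec[n:]
    r_u = np.zeros(n)
    r_c = np.zeros(n)
    # interior second differences, with a nonlinear coupling term
    r_u[1:-1] = (u[2:] - 2*u[1:-1] + u[:-2]) / h**2 - c[1:-1] * u[1:-1]
    r_c[1:-1] = (c[2:] - 2*c[1:-1] + c[:-2]) / h**2 + u[1:-1]
    # Dirichlet boundary conditions
    r_u[0], r_u[-1] = u[0] - 1.0, u[-1]
    r_c[0], r_c[-1] = c[0], c[-1] - 1.0
    return np.concatenate([r_u, r_c])

guess = np.concatenate([np.linspace(1.0, 0.0, n), np.linspace(0.0, 1.0, n)])
sol = newton_krylov(residual, guess)
print("max |residual|:", np.abs(residual(sol)).max())
```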
Numerical Propulsion System Simulation Architecture
NASA Technical Reports Server (NTRS)
Naiman, Cynthia G.
2004-01-01
The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.
Computer simulation of the human respiratory system for educational purposes.
Botsis, Taxiarhis; Halkiotis, Stelios-Chris; Kourlaba, Georgia
2004-01-01
The main objective of this study was the development of a computer simulation system of the human respiratory system for the education of nursing students. This approach was based on existing mathematical models and on specific functions that we constructed. The appropriate software packages were used for the development of this educational tool, according to the special demands of the process. The system is called ReSim (Respiratory Simulation) and consists of two parts: the first part deals with pulmonary volumes and the second represents the mechanical behavior of the lungs. The target group evaluated ReSim. The outcomes of the evaluation process were positive and helped us identify the system characteristics that needed improvement. Our basic conclusion is that the extended use of such systems supports the educational process and offers new potential for learning.
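A minimal single-compartment lung mechanics sketch in the spirit of the "mechanical behavior of lungs" part; this is an assumed textbook resistance-compliance model, not the published ReSim equations, and the parameter values are illustrative.

```python
import numpy as np

# Single-compartment model: airway flow driven by a ventilator pressure
# against resistance R and compliance C (made-up parameter values).

R = 5.0      # airway resistance [cmH2O*s/L]
C = 0.05     # compliance [L/cmH2O]
dt = 0.01
t = np.arange(0.0, 6.0, dt)
V = np.zeros_like(t)          # volume above functional residual capacity [L]

for i in range(1, len(t)):
    p_aw = 15.0 if (t[i] % 3.0) < 1.0 else 0.0     # square-wave inspiratory pressure
    p_alv = V[i-1] / C                             # elastic recoil pressure
    flow = (p_aw - p_alv) / R                      # [L/s]
    V[i] = V[i-1] + flow * dt

print(f"tidal volume ~ {V.max():.2f} L")
```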
NASA Astrophysics Data System (ADS)
Klopfer, Eric; Yoon, Susan; Perry, Judy
2005-09-01
This paper reports on teachers' perceptions of the educational affordances of a handheld application called Participatory Simulations. It presents evidence from five cases representing each of the populations who work with these computational tools. Evidence across multiple data sources yields results similar to previous research evaluations of handheld activities with respect to enhancing motivation, engagement and self-directed learning. Three additional themes are discussed that provide insight into the curricular applicability of Participatory Simulations and suggest a new take on ubiquitous and accessible mobile computing. These themes generally point to the multiple layers of social and cognitive flexibility intrinsic to their design: ease of adaptation to subject-matter content knowledge and curricular integration; facility in attending to teacher-individualized goals; and encouraging the adoption of learner-centered strategies.
NASA Astrophysics Data System (ADS)
Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em
2017-09-01
Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique - multi-level Gaussian process regression - on the fly; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
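A much-simplified illustration of the fill-in idea: a single-level Gaussian process trained on the grid points that "survived" a simulated failure, used to reconstruct the missing values. The paper's multi-level, multi-fidelity scheme is considerably richer than this sketch.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stand-in "fine solution" on a 1-D grid; only a few points survive a failure.
x = np.linspace(0.0, 1.0, 101)
truth = np.sin(2 * np.pi * x)
surviving = np.random.default_rng(0).choice(len(x), size=20, replace=False)

# Fit a GP on the surviving points and fill in the missing spatial data.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), alpha=1e-6)
gp.fit(x[surviving, None], truth[surviving])
filled, std = gp.predict(x[:, None], return_std=True)

print("max reconstruction error:", np.abs(filled - truth).max())
```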
NASA Technical Reports Server (NTRS)
Chawner, David M.; Gomez, Ray J.
2010-01-01
In the Applied Aerosciences and CFD branch at Johnson Space Center, computational simulations are run that face many challenges, two of which are the ability to customize software for specialized needs and the need to run simulations as fast as possible. Many different tools are used for running these simulations, and each one has its own pros and cons. Once these simulations are run, there needs to be software capable of visualizing the results in an appealing manner. Some of this software is open source, meaning that anyone can edit the source code to make modifications and distribute it to all other users in a future release. This is very useful, especially in this branch where many different tools are being used. File readers can be written to load any file format into a program, to ease the bridging from one tool to another. Programming such a reader requires knowledge of the file format being read as well as the equations necessary to obtain the derived values after loading. When these CFD simulations are run, extremely large files are loaded and derived values are calculated. These simulations usually take a few hours to complete, even on the fastest machines. Graphics processing units (GPUs) are traditionally used to render graphics on computers; however, in recent years, GPUs have been used for more generic applications because of the speed of these processors. Applications run on GPUs have been known to run up to forty times faster than they would on conventional central processing units (CPUs). If these CFD programs are extended to run on GPUs, the amount of time they require to complete would be much less. This would allow more simulations to be run in the same amount of time and possibly permit more complex computations.
NASA Technical Reports Server (NTRS)
Tick, Evan
1987-01-01
This note describes an efficient software emulator for the Warren Abstract Machine (WAM) Prolog architecture. The version of the WAM implemented is called Lcode. The Lcode emulator, written in C, executes the 'naive reverse' benchmark at 3900 LIPS. The emulator is one of a set of tools used to measure the memory-referencing characteristics and performance of Prolog programs. These tools include a compiler, assembler, and memory simulators. An overview of the Lcode architecture is given here, followed by a description and listing of the emulator code implementing each Lcode instruction. This note will be of special interest to those studying the WAM and its performance characteristics. In general, this note will be of interest to those creating efficient software emulators for abstract machine architectures.
Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.
Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat
2015-01-01
A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are among the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using the Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs.
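A toy sketch of the dendritic cell algorithm's signal-fusion step (generic DCA, not the MDCA extension); the weight matrix and signal values are made up purely for illustration.

```python
# Each "cell" accumulates weighted PAMP/danger/safe signals; an antigen is
# flagged anomalous when the mature output dominates the semi-mature output.

WEIGHTS = {                      # hypothetical weights: (pamp, danger, safe)
    "csm":         (2.0, 1.0, 1.0),
    "semi_mature": (0.0, 0.0, 1.0),
    "mature":      (2.0, 1.0, -1.0),
}

def fuse(pamp, danger, safe):
    return {k: w[0]*pamp + w[1]*danger + w[2]*safe for k, w in WEIGHTS.items()}

def classify(observations):
    semi = mature = 0.0
    for pamp, danger, safe in observations:
        out = fuse(pamp, danger, safe)
        semi += out["semi_mature"]
        mature += out["mature"]
    return "anomalous" if mature > semi else "normal"

# e.g. high packet rate (danger) with few legitimate replies (low safe signal)
flooding_like = [(0.8, 0.9, 0.1) for _ in range(10)]
normal_like   = [(0.0, 0.1, 0.9) for _ in range(10)]
print(classify(flooding_like), classify(normal_like))
```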
A Modeling Tool for Household Biogas Burner Flame Port Design
NASA Astrophysics Data System (ADS)
Decker, Thomas J.
Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
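A sketch of the kind of port-geometry quantities such a design tool targets, using standard definitions of hydraulic diameter and mean port velocity; the functions and numbers below are illustrative and not taken from the author's tool.

```python
import math

def hydraulic_diameter_circle(d):
    return d                              # D_h = 4A/P = d for a circle

def hydraulic_diameter_rect(a, b):
    return 2.0 * a * b / (a + b)          # D_h = 4ab / (2(a + b))

def port_velocity(volume_flow, area_per_port, n_ports):
    return volume_flow / (n_ports * area_per_port)

# Hypothetical numbers: 0.4 m^3/h of biogas-air mixture through 30 ports.
q = 0.4 / 3600.0                          # [m^3/s]
d = 2.5e-3                                # 2.5 mm circular ports
area = math.pi * d**2 / 4.0
print(f"D_h = {hydraulic_diameter_circle(d)*1e3:.2f} mm, "
      f"v = {port_velocity(q, area, 30):.2f} m/s")
```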
VirSSPA- a virtual reality tool for surgical planning workflow.
Suárez, C; Acha, B; Serrano, C; Parra, C; Gómez, T
2009-03-01
A virtual reality tool, called VirSSPA, was developed to optimize the planning of surgical processes. Segmentation algorithms were implemented for computed tomography (CT) images: a region-growing procedure was used for soft tissues and a thresholding algorithm was implemented to segment bones. The algorithms operate semiautomatically, since they only need seed selection with the mouse on each tissue to be segmented by the user. The novelty of the paper is the adaptation of an enhancement method based on histogram thresholding applied to CT images for surgical planning, which simplifies subsequent segmentation. A substantial improvement of the virtual reality tool VirSSPA was obtained with these algorithms. VirSSPA was used to optimize surgical planning, to decrease the time spent on surgical planning and to improve operative results. The success rate increases because surgeons are able to see the exact extent of the patient's ailment. This tool can decrease operating room time, thus resulting in reduced costs. Virtual simulation was effective for optimizing surgical planning, which could, consequently, result in improved outcomes with reduced costs.
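A minimal sketch of the two segmentation ideas mentioned in the abstract, thresholding and seeded region growing, applied to a synthetic slice of Hounsfield-unit values; the real VirSSPA pipeline operates on full CT volumes and is more involved.

```python
import numpy as np

rng = np.random.default_rng(1)
slice_hu = rng.normal(40, 15, size=(64, 64))      # soft-tissue background [HU]
slice_hu[20:30, 20:30] = 800                      # a "bone" patch

# 1) thresholding for bone
bone_mask = slice_hu > 300

# 2) seeded region growing for soft tissue: grow from a seed while the
#    neighbour stays within a tolerance of the seed intensity
def region_grow(img, seed, tol=30):
    mask = np.zeros(img.shape, dtype=bool)
    stack, seed_val = [seed], img[seed]
    while stack:
        r, c = stack.pop()
        if mask[r, c] or abs(img[r, c] - seed_val) > tol:
            continue
        mask[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]:
                stack.append((rr, cc))
    return mask

soft_mask = region_grow(slice_hu, seed=(5, 5))
print("bone px:", bone_mask.sum(), "soft-tissue px:", soft_mask.sum())
```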
SBML-PET: a Systems Biology Markup Language-based parameter estimation tool.
Zi, Zhike; Klipp, Edda
2006-11-01
The estimation of model parameters from experimental data remains a bottleneck for a major breakthrough in systems biology. We present a Systems Biology Markup Language (SBML) based Parameter Estimation Tool (SBML-PET). The tool is designed to enable parameter estimation for biological models including signaling pathways, gene regulation networks and metabolic pathways. SBML-PET supports import and export of the models in the SBML format. It can estimate the parameters by fitting a variety of experimental data from different experimental conditions. SBML-PET has the unique feature of supporting event definitions in the SBML model. SBML models can also be simulated in SBML-PET. Stochastic Ranking Evolution Strategy (SRES) is incorporated in SBML-PET for parameter estimation jobs. A classic ODE solver called ODEPACK is used to solve the ordinary differential equation (ODE) system. http://sysbio.molgen.mpg.de/SBML-PET/. The website also contains detailed documentation for SBML-PET.
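A sketch of the underlying workflow, estimating ODE model parameters by fitting simulated trajectories to data, using SciPy least squares on a toy one-variable model instead of SBML-PET's SRES/ODEPACK machinery.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy one-gene model: synthesis at rate k_syn, first-order decay at k_deg.
def model(t, y, k_syn, k_deg):
    return [k_syn - k_deg * y[0]]

t_obs = np.linspace(0, 10, 15)
true = solve_ivp(model, (0, 10), [0.0], args=(2.0, 0.5), t_eval=t_obs).y[0]
data = true + np.random.default_rng(0).normal(0, 0.05, true.shape)

def residuals(params):
    sim = solve_ivp(model, (0, 10), [0.0], args=tuple(params), t_eval=t_obs).y[0]
    return sim - data

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=(0, 10))
print("estimated (k_syn, k_deg):", fit.x)
```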
NASA Technical Reports Server (NTRS)
Sang, Janche
2003-01-01
Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling and, through simulation, to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop a large-scale, detailed simulation for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling - the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity - require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's use of modeling and simulation to characterize fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.
A New Improved and Extended Version of the Multicell Bacterial Simulator gro.
Gutiérrez, Martín; Gregorio-Godoy, Paula; Pérez Del Pulgar, Guillermo; Muñoz, Luis E; Sáez, Sandra; Rodríguez-Patón, Alfonso
2017-08-18
gro is a cell programming language developed in Klavins Lab for simulating colony growth and cell-cell communication. It is used as a synthetic biology prototyping tool for simulating multicellular biocircuits and microbial consortia. In this work, we present several extensions made to gro that improve the performance of the simulator, make it easier to use, and provide new functionalities. The new version of gro is between 1 and 2 orders of magnitude faster than the original version. It is able to grow microbial colonies with up to 10^5 cells in less than 10 min. A new library, CellEngine, accelerates the resolution of spatial physical interactions between growing and dividing cells by implementing a new shoving algorithm. A genetic library, CellPro, based on Probabilistic Timed Automata, simulates gene expression dynamics using simplified and easy-to-compute digital proteins. We also propose a more convenient language specification layer, ProSpec, based on the idea that proteins drive cell behavior. CellNutrient, another library, implements Monod-based growth and nutrient uptake functionalities. The intercellular signaling management was improved and extended in a library called CellSignals. Finally, bacterial conjugation, another local cell-cell communication process, was added to the simulator. To show the versatility and potential outreach of this version of gro, we provide studies and novel examples ranging from synthetic biology to evolutionary microbiology. We believe that the upgrades implemented for gro have made it into a powerful and fast prototyping tool capable of simulating a large variety of systems and synthetic biology designs.
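A minimal Monod growth and nutrient-uptake sketch, i.e. the generic textbook model that the CellNutrient library is described as implementing; the constants are made up.

```python
# Monod kinetics: specific growth rate mu = mu_max * S / (K_s + S),
# biomass X grows at mu*X, substrate S is consumed per unit biomass / yield Y.

mu_max, K_s, Y = 1.2, 0.5, 0.4     # 1/h, g/L, biomass yield
X, S, dt = 0.01, 5.0, 0.01         # biomass g/L, substrate g/L, time step h

for step in range(int(10 / dt)):
    mu = mu_max * S / (K_s + S)    # Monod specific growth rate
    dX = mu * X * dt
    X += dX
    S = max(S - dX / Y, 0.0)       # substrate consumed per unit biomass formed

print(f"final biomass {X:.2f} g/L, residual substrate {S:.2f} g/L")
```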
NASA Technical Reports Server (NTRS)
Thakur, Siddarth; Wright, Jeffrey
2006-01-01
The traditional design and analysis practice for advanced propulsion systems, particularly chemical rocket engines, relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment by non-CFD specialists. A computational tool, called Loci-STREAM is being developed for this purpose. It is a pressure-based, Reynolds-averaged Navier-Stokes (RANS) solver for generalized unstructured grids, which is designed to handle all-speed flows (incompressible to hypersonic) and is particularly suitable for solving multi-species flow in fixed-frame combustion devices. Loci-STREAM integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective of the ongoing work is to develop a robust simulation capability for combustion problems in rocket engines. As an initial step towards validating this capability, a model problem is investigated in the present study which involves a gaseous oxygen/gaseous hydrogen (GO2/GH2) shear coaxial single element injector, for which experimental data are available. The sensitivity of the computed solutions to grid density, grid distribution, different turbulence models, and different near-wall treatments is investigated. A refined grid, which is clustered in the vicinity of the solid walls as well as the flame, is used to obtain a steady state solution which may be considered as the best solution attainable with the steady-state RANS methodology. From a design point of view, quick turnaround times are desirable; with this in mind, coarser grids are also employed and the resulting solutions are evaluated with respect to the fine grid solution.
On the design of computer-based models for integrated environmental science.
McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick
2005-06-01
The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.
Reshaping of large aeronautical structural parts: A simplified simulation approach
NASA Astrophysics Data System (ADS)
Mena, Ramiro; Aguado, José V.; Guinard, Stéphane; Huerta, Antonio
2018-05-01
Large aeronautical structural parts exhibit significant distortions after machining. This problem is caused by the presence of residual stresses, which develop during previous manufacturing steps (quenching). Before the part is put into service, the nominal geometry is restored by means of mechanical methods. This operation is called reshaping and depends exclusively on the skills of a well-trained and experienced operator. Moreover, the procedure is time-consuming and currently relies on a trial-and-error approach. Therefore, there is a need at the industrial level to address this problem with the support of numerical simulation tools. Using a simplifying hypothesis, it was found that the springback phenomenon behaves linearly, which allows a strategy to be developed for implementing reshaping at an industrial level.
NASA Astrophysics Data System (ADS)
Berselli, Luigi C.; Spirito, Stefano
2018-06-01
Obtaining reliable numerical simulations of turbulent fluids is a challenging problem in computational fluid mechanics. The large eddy simulation (LES) models are efficient tools to approximate turbulent fluids, and an important step in the validation of these models is the ability to reproduce relevant properties of the flow. In this paper, we consider a fully discrete approximation of the Navier-Stokes-Voigt model by an implicit Euler algorithm (with respect to the time variable) and a Fourier-Galerkin method (in the space variables). We prove the convergence to weak solutions of the incompressible Navier-Stokes equations satisfying the natural local entropy condition, hence selecting the so-called physically relevant solutions.
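For orientation, the Navier-Stokes-Voigt model is commonly written in the form below (supplied here for context, not quoted from the paper); the Navier-Stokes equations are formally recovered as the regularization length α tends to zero.

```latex
% Navier-Stokes-Voigt model (common form; \alpha is the Voigt regularization
% length scale and \nu the kinematic viscosity):
\[
  \partial_t \left( u - \alpha^{2} \Delta u \right)
  + (u \cdot \nabla) u
  - \nu \Delta u
  + \nabla p = f,
  \qquad
  \nabla \cdot u = 0 .
\]
```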
Developing a Policy Flight Simulator to Facilitate the Adoption of an Evidence-Based Intervention
Yu, Zhongyuan; Hirschman, Karen B.; Pepe, Kara; Pauly, Mark V.; Naylor, Mary D.; Rouse, William B.
2018-01-01
While the use of evidence-based interventions (EBIs) has been advocated by the medical research community for quite some time, uptake of these interventions by healthcare providers has been slow. One possible explanation is that it is challenging for providers to estimate impacts of a specific EBI on their particular organization. To address that concern, we developed and evaluated a type of simulation called a policy flight simulator to determine if it could improve the adoption decision about a specific EBI, the transitional care model (TCM). The TCM uses an advanced practice nurse-led model of care to transition older adults with multiple chronic conditions from a hospitalization to home. An evaluation by a National Advisory Committee, made up of senior representatives from various stakeholders in the U.S. healthcare system, found the policy flight simulator to be a useful tool that has the potential to better inform adoption decisions. This paper describes the simulation development effort and documents lessons learned that may be useful to the healthcare modeling community and those interested in using simulation to support decisions based on EBIs. PMID:29805921
Agent-based modeling as a tool for program design and evaluation.
Lawlor, Jennifer A; McGirr, Sara
2017-12-01
Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
77 FR 72830 - Request for Comments on Request for Continued Examination (RCE) Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... the submission of written comments using a Web-based collaboration tool called IdeaScale®; and... collaboration tool called IdeaScale®. The tool allows users to post comments on a topic, and view and...
A Process for the Creation of T-MATS Propulsion System Models from NPSS data
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
A modular thermodynamic simulation package called the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) has been developed for the creation of dynamic simulations. The T-MATS software is designed as a plug-in for Simulink (MathWorks, Inc.) and allows a developer to create system simulations of thermodynamic plants (such as gas turbines) and controllers in a single tool. Creation of such simulations can be accomplished by matching data from actual systems, or by matching data from steady state models and inserting appropriate dynamics, such as the rotor and actuator dynamics for an aircraft engine. This paper summarizes the process for creating T-MATS turbo-machinery simulations using data and input files obtained from a steady state model created in the Numerical Propulsion System Simulation (NPSS). The NPSS is a thermodynamic simulation environment that is commonly used for steady state gas turbine performance analysis. Completion of all the steps involved in the process results in a good match between T-MATS and NPSS at several steady state operating points. Additionally, the T-MATS model extended to run dynamically provides the possibility of simulating and evaluating closed loop responses.
Platform for Real-Time Simulation of Dynamic Systems and Hardware-in-the-Loop for Control Algorithms
de Souza, Isaac D. T.; Silva, Sergio N.; Teles, Rafael M.; Fernandes, Marcelo A. C.
2014-01-01
The development of new embedded algorithms for automation and control of industrial equipment usually requires the use of real-time testing. However, the equipment required is often expensive, which means that such tests are often not viable. The objective of this work was therefore to develop an embedded platform for the distributed real-time simulation of dynamic systems. This platform, called the Real-Time Simulator for Dynamic Systems (RTSDS), could be applied in both industrial and academic environments. In industrial applications, the RTSDS could be used to optimize embedded control algorithms. In the academic sphere, it could be used to support research into new embedded solutions for automation and control and could also be used as a tool to assist in undergraduate and postgraduate teaching related to the development of projects concerning on-board control systems. PMID:25320906
Evaluating System Parameters on a Dragonfly using Simulation and Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatele, Abhinav; Jain, Nikhil; Livnat, Yarden
The dragonfly topology is becoming a popular choice for building high-radix, low-diameter networks with high-bandwidth links. Even with a powerful network, preliminary experiments on Edison at NERSC have shown that for communication-heavy applications, job interference and thus presumably job placement remains an important factor. In this paper, we explore the effects of job placement, job sizes, parallel workloads and network configurations on network throughput to better understand inter-job interference. We use a simulation tool called Damselfly to model the network behavior of Edison and study the impact of various system parameters on network throughput. Parallel workloads based on five representative communication patterns are used and the simulation studies on up to 131,072 cores are aided by a new visualization of the dragonfly network.
a Discrete Mathematical Model to Simulate Malware Spreading
NASA Astrophysics Data System (ADS)
Del Rey, A. Martin; Sánchez, G. Rodriguez
2012-10-01
With the advent and worldwide development of the Internet, the study and control of malware spreading has become very important. In this sense, some mathematical models to simulate malware propagation have been proposed in the scientific literature, and usually they are based on differential equations exploiting the similarities with mathematical epidemiology. The great majority of these models study the behavior of a particular type of malware called computer worms; indeed, to the best of our knowledge, no model has been proposed to simulate the spreading of a computer virus (the traditional type of malware, which differs from computer worms in several aspects). In this sense, the purpose of this work is to introduce a new mathematical model, based not on continuous mathematics but on discrete tools, to analyze and study the epidemic behavior of computer viruses. Specifically, cellular automata are used to design such a model.
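A toy cellular-automaton epidemic sketch in the spirit of the abstract, with discrete susceptible/infected/recovered states on a grid and a local probabilistic update rule; the published model's rule set and parameters differ.

```python
import numpy as np

rng = np.random.default_rng(0)
S, I, R = 0, 1, 2
grid = np.full((50, 50), S)
grid[25, 25] = I                           # initial infected machine

def step(grid, p_infect=0.3, p_recover=0.1):
    new = grid.copy()
    infected = (grid == I)
    # count infected von Neumann neighbours of every cell
    n = (np.roll(infected, 1, 0) + np.roll(infected, -1, 0)
         + np.roll(infected, 1, 1) + np.roll(infected, -1, 1))
    expose = (grid == S) & (rng.random(grid.shape) < 1 - (1 - p_infect) ** n)
    recover = infected & (rng.random(grid.shape) < p_recover)
    new[expose] = I
    new[recover] = R
    return new

for _ in range(100):
    grid = step(grid)
print("infected:", (grid == I).sum(), "recovered:", (grid == R).sum())
```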
Estimation of contraction scour in riverbed using SERF
Jiang, J.; Ganju, N.K.; Mehta, A.J.
2004-01-01
Contraction scour in a firm-clay estuarine riverbed is estimated at an oil-unloading terminal at the Port of Haldia in India, where a scour hole attained a maximum depth greater than 5 m relative to the original bottom. A linear equation for the erosion flux as a function of the excess bed shear stress was semicalibrated in a rotating-cylinder device called SERF (Simulator of Erosion Rate Function) and coupled to a hydrodynamic code to simulate the hole as a clear-water scour process. SERF, whose essential design is based on previous such devices, additionally included a load cell for in situ and rapid measurement of the eroded sediment mass. Based on SERF's performance and the degree of comparison between measured and simulated hole geometry, it appears that this device holds promise as a simple tool for prediction of scour in firm-clay beds. © ASCE.
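The "linear equation for the erosion flux" is most commonly written as an excess-shear-stress law of the form below; the SERF-calibrated coefficients themselves are not reproduced here.

```latex
% Linear excess-shear-stress erosion law (common form; M is the erodibility
% coefficient, \tau_b the bed shear stress, \tau_c the critical shear stress):
\[
  E =
  \begin{cases}
    M \,(\tau_b - \tau_c), & \tau_b > \tau_c, \\[4pt]
    0, & \tau_b \le \tau_c .
  \end{cases}
\]
```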
A Review of Enhanced Sampling Approaches for Accelerated Molecular Dynamics
NASA Astrophysics Data System (ADS)
Tiwary, Pratyush; van de Walle, Axel
Molecular dynamics (MD) simulations have become a tool of immense use and popularity for simulating a variety of systems. With the advent of massively parallel computer resources, one now routinely sees applications of MD to systems as large as hundreds of thousands to even several million atoms, which is almost the size of most nanomaterials. However, it is not yet possible to reach laboratory timescales of milliseconds and beyond with MD simulations. Due to the essentially sequential nature of time, parallel computers have been of limited use in solving this so-called timescale problem. Instead, over the years a large range of statistical mechanics based enhanced sampling approaches have been proposed for accelerating molecular dynamics, and accessing timescales that are well beyond the reach of the fastest computers. In this review we provide an overview of these approaches, including the underlying theory, typical applications, and publicly available software resources to implement them.
Souris, Kevin; Lee, John Aldo; Sterpin, Edmond
2016-04-01
Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/Geant4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with GATE/Geant4 for various geometries show deviations within 2%/1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
Control Systems with Normalized and Covariance Adaptation by Optimal Control Modification
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T. (Inventor); Burken, John J. (Inventor); Hanson, Curtis E. (Inventor)
2016-01-01
Disclosed is a novel adaptive control method and system called optimal control modification with normalization and covariance adjustment. The invention specifically addresses current challenges with adaptive control in these areas: 1) persistent excitation, 2) complex nonlinear input-output mapping, 3) large inputs and persistent learning, and 4) the lack of stability analysis tools for certification. The invention has been subjected to many simulations and flight tests. The results substantiate the effectiveness of the invention and demonstrate the technical feasibility for use in modern aircraft flight control systems.
Nonequilibrium hypersonic flows simulations with asymptotic-preserving Monte Carlo methods
NASA Astrophysics Data System (ADS)
Ren, Wei; Liu, Hong; Jin, Shi
2014-12-01
In rarefied gas dynamics, the DSMC method is one of the most popular numerical tools. It performs satisfactorily in simulating hypersonic flows surrounding re-entry vehicles and micro-/nano-flows. However, the computational cost is expensive, especially as Kn → 0. Even for flows in the near-continuum regime, pure DSMC simulations require considerable computational effort in most cases. Although several DSMC/NS hybrid methods have been proposed to deal with this, those methods still suffer from the boundary treatment, which may cause nonphysical solutions. Filbet and Jin [1] proposed a framework of new numerical methods for the Boltzmann equation, called asymptotic-preserving schemes, whose computational costs are affordable as Kn → 0. Recently, Ren et al. [2] realized the AP schemes with Monte Carlo methods (AP-DSMC), which have better performance than counterpart methods. In this paper, AP-DSMC is applied to simulating nonequilibrium hypersonic flows. Several numerical results are computed and analyzed to study the efficiency and capability of capturing complicated flow characteristics.
NASA Technical Reports Server (NTRS)
Groleau, Nicolas; Frainier, Richard; Colombano, Silvano; Hazelton, Lyman; Szolovits, Peter
1993-01-01
This paper describes portions of a novel system called MARIKA (Model Analysis and Revision of Implicit Key Assumptions) to automatically revise a model of the normal human orientation system. The revision is based on analysis of discrepancies between experimental results and computer simulations. The discrepancies are calculated from qualitative analysis of quantitative simulations. The experimental and simulated time series are first discretized in time segments. Each segment is then approximated by linear combinations of simple shapes. The domain theory and knowledge are represented as a constraint network. Incompatibilities detected during constraint propagation within the network yield both parameter and structural model alterations. Interestingly, MARIKA diagnosed a data set from the Massachusetts Eye and Ear Infirmary Vestibular Laboratory as abnormal though the data was tagged as normal. Published results from other laboratories confirmed the finding. These encouraging results could lead to a useful clinical vestibular tool and to a scientific discovery system for space vestibular adaptation.
Sultan, Mohammad M; Kiss, Gert; Shukla, Diwakar; Pande, Vijay S
2014-12-09
Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states.
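A rough sketch of the posterior-analysis idea (not the published CB-FS pipeline): given frames already assigned to conformational states, train a supervised classifier on structural features and rank which features best separate the states; the data here are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_frames, n_features = 2000, 20
X = rng.normal(size=(n_frames, n_features))        # e.g. distances, dihedrals
states = (X[:, 3] + 0.5 * X[:, 7] > 0).astype(int) # toy ground truth: 2 states

# Fit a classifier on the state labels and rank feature importances to find
# the degrees of freedom that separate the conformational states.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, states)
ranking = np.argsort(clf.feature_importances_)[::-1]
print("most discriminative features:", ranking[:3])
```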
Accurate estimation of short read mapping quality for next-generation genome sequencing
Ruffalo, Matthew; Koyutürk, Mehmet; Ray, Soumya; LaFramboise, Thomas
2012-01-01
Motivation: Several software tools specialize in the alignment of short next-generation sequencing reads to a reference sequence. Some of these tools report a mapping quality score for each alignment—in principle, this quality score tells researchers the likelihood that the alignment is correct. However, the reported mapping quality often correlates weakly with actual accuracy and the qualities of many mappings are underestimated, encouraging the researchers to discard correct mappings. Further, these low-quality mappings tend to correlate with variations in the genome (both single nucleotide and structural), and such mappings are important in accurately identifying genomic variants. Approach: We develop a machine learning tool, LoQuM (LOgistic regression tool for calibrating the Quality of short read Mappings), to assign reliable mapping quality scores to mappings of Illumina reads returned by any alignment tool. LoQuM uses statistics on the read (base quality scores reported by the sequencer) and the alignment (number of matches, mismatches and deletions, mapping quality score returned by the alignment tool, if available, and number of mappings) as features for classification and uses simulated reads to learn a logistic regression model that relates these features to actual mapping quality. Results: We test the predictions of LoQuM on an independent dataset generated by the ART short read simulation software and observe that LoQuM can 'resurrect' many mappings that are assigned zero quality scores by the alignment tools and are therefore likely to be discarded by researchers. We also observe that the recalibration of mapping quality scores greatly enhances the precision of called single nucleotide polymorphisms. Availability: LoQuM is available as open source at http://compbio.case.edu/loqum/. Contact: matthew.ruffalo@case.edu. PMID:22962451
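A sketch of the recalibration idea: learn the probability that a mapping is correct from simple alignment features on simulated reads, then report a Phred-scaled quality. The features, coefficients and data below are synthetic, not the published LoQuM model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
mismatches = rng.poisson(1.5, n)
mean_base_q = rng.normal(35, 3, n)
n_hits = rng.integers(1, 5, n)
# synthetic ground truth: fewer mismatches / unique hits -> likely correct
p_true = 1 / (1 + np.exp(-(3 - 1.2*mismatches + 0.05*mean_base_q - 1.0*(n_hits - 1))))
correct = rng.random(n) < p_true

X = np.column_stack([mismatches, mean_base_q, n_hits])
clf = LogisticRegression(max_iter=1000).fit(X, correct)

p = clf.predict_proba(X[:5])[:, 1]
mapq = -10 * np.log10(np.clip(1 - p, 1e-6, 1))   # Phred-scaled recalibrated quality
print(np.round(mapq, 1))
```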
ViSimpl: Multi-View Visual Analysis of Brain Simulation Data
Galindo, Sergio E.; Toharia, Pablo; Robles, Oscar D.; Pastor, Luis
2016-01-01
After decades of independent morphological and functional brain research, a key point in neuroscience nowadays is to understand the combined relationships between the structure of the brain and its components and their dynamics on multiple scales, ranging from circuits of neurons at micro or mesoscale to brain regions at macroscale. With such a goal in mind, there is a vast amount of research focusing on modeling and simulating activity within neuronal structures, and these simulations generate large and complex datasets which have to be analyzed in order to gain the desired insight. In such context, this paper presents ViSimpl, which integrates a set of visualization and interaction tools that provide a semantic view of brain data with the aim of improving its analysis procedures. ViSimpl provides 3D particle-based rendering that allows visualizing simulation data with their associated spatial and temporal information, enhancing the knowledge extraction process. It also provides abstract representations of the time-varying magnitudes supporting different data aggregation and disaggregation operations and giving also focus and context clues. In addition, ViSimpl tools provide synchronized playback control of the simulation being analyzed. Finally, ViSimpl allows performing selection and filtering operations relying on an application called NeuroScheme. All these views are loosely coupled and can be used independently, but they can also work together as linked views, both in centralized and distributed computing environments, enhancing the data exploration and analysis procedures. PMID:27774062
Performance Support Tools: Delivering Value when and where It Is Needed
ERIC Educational Resources Information Center
McManus, Paul; Rossett, Allison
2006-01-01
Some call them Electronic Performance Support Systems (EPSSs). Others prefer Performance Support Tools (PSTs) or decision support tools. One might call EPSSs or PSTs job aids on steroids, technological tools that provide critical information or advice needed to move forward at a particular moment in time. Characteristic advantages of an EPSS or a…
NASA Astrophysics Data System (ADS)
Marseille, Gert-Jan; Stoffelen, Ad; Barkmeijer, Jan
2008-03-01
Lacking an established methodology to test the potential impact of prospective extensions to the global observing system (GOS) in real atmospheric cases, we developed such a method, called the Sensitivity Observing System Experiment (SOSE). For example, since the GOS is non-uniform, it is of interest to investigate the benefit of complementary observing systems filling its gaps. In a SOSE, adjoint sensitivity structures are used to define a pseudo-true atmospheric state for the simulation of the prospective observing system. Next, the synthetic observations are used together with real observations from the existing GOS in a state-of-the-art Numerical Weather Prediction (NWP) model to assess the potential added value of the new observing system. Unlike full observing system simulation experiments (OSSEs), SOSE can be applied to real extreme events that were badly forecast operationally and only requires the simulation of the new instrument. As such, SOSE is an effective tool, for example, to define observation requirements for extensions to the GOS. These observation requirements may serve as input for the design of an operational network of prospective observing systems. In a companion paper we use SOSE to simulate potential future spaceborne Doppler Wind Lidar (DWL) scenarios and assess their capability to sample meteorologically sensitive areas not well captured by the current GOS, in particular over the Northern Hemisphere oceans.
A web GIS based integrated flood assessment modeling tool for coastal urban watersheds
NASA Astrophysics Data System (ADS)
Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.
2014-03-01
Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area considered and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and JQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the MySQL open source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment to facilitate data access and visualization of GIS datasets and simulation results.
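A much-simplified raster-spreading sketch of the continuity idea behind the quasi-2-D inundation component: excess water depth at a cell spills toward lower-lying neighbours each step. The published IFAM couples this with 1-D overland and channel flow models; everything below is a toy illustration with made-up elevations.

```python
import numpy as np

dem = np.array([[3., 3., 3., 3.],
                [3., 1., 2., 3.],
                [3., 1., 1., 3.],
                [3., 3., 0., 3.]])          # ground elevation [m]
depth = np.zeros_like(dem)
depth[1, 1] = 2.0                           # point source of flood water [m]

def spread(dem, depth, frac=0.25):
    """Move a fraction of the water-surface head difference to each lower neighbour."""
    stage = dem + depth
    new = depth.copy()
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = np.roll(stage, (dr, dc), axis=(0, 1))      # neighbour stage
        dh = np.clip((stage - nb) * frac, 0, depth)     # outflow toward lower neighbour
        new -= dh                                       # water leaving this cell
        new += np.roll(dh, (-dr, -dc), axis=(0, 1))     # water arriving at the neighbour
    return new

for _ in range(50):
    depth = spread(dem, depth)
print(np.round(depth, 2))
```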
[Improving communication skills of physicians caring for adolescents by simulation].
Reister, Gad; Stoffman, Nava
2011-04-01
Although the unique characteristics and abilities of youth were noted in ancient times, the process of adolescence was studied and understood only much later. Adolescents are considered a healthy population when compared with younger children and adults. However, unlike other age groups, the morbidity and mortality of adolescents have not decreased in recent decades, probably due to risk-taking behaviors. Since the 1950s, the need for a special medical and health approach in treating adolescents has been recognized. Yet only a few countries incorporate such approaches when educating and training students, residents and fellows in physician training programs. Youth are treated by physicians of many disciplines, despite the fact that only a minority were trained in adolescent medicine. Simulation of medical situations with standardized patients has become a significant tool for improving the communication skills of healthcare providers. The article in this edition of Harefuah describes the use of a simulated-patient-based education system to improve the communication skills of physicians from different fields. The authors present the positive feedback of the participants in the program and demonstrate that the program had a positive influence on their practice when dealing with adolescents. We call for the teaching of adolescent medicine to be incorporated at all levels, starting in medical school. The simulation tool is very helpful in improving the communication skills of medical personnel.
The TeraShake Computational Platform for Large-Scale Earthquake Simulations
NASA Astrophysics Data System (ADS)
Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas
Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and more complex scenarios. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry
2008-05-01
AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment.
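As a rough illustration of the reference-axis idea described above (and not AmapSim's actual implementation), the sketch below advances a bud's physiological age through a few states with stochastic dwell times and lets that age index the growth parameters used at each growth cycle; all state names, dwell-time distributions and parameter values are hypothetical.

```python
# Toy sketch: a bud's physiological age moves along a small "reference axis";
# how long it dwells in each state is random (a semi-Markov flavour), and the
# current state selects the growth parameters. Values are illustrative only.
import random

REFERENCE_AXIS = {                       # physiological age -> growth parameters
    0: {"internode_len": 5.0, "branch_prob": 0.6},
    1: {"internode_len": 3.0, "branch_prob": 0.3},
    2: {"internode_len": 1.0, "branch_prob": 0.05},
}

def grow_axis(n_cycles=20, seed=0):
    rng = random.Random(seed)
    age, dwell_left, axis = 0, rng.randint(3, 8), []
    for _ in range(n_cycles):
        params = REFERENCE_AXIS[age]
        axis.append((age, params["internode_len"],
                     rng.random() < params["branch_prob"]))
        dwell_left -= 1
        if dwell_left == 0 and age < 2:  # advance along the reference axis
            age += 1
            dwell_left = rng.randint(3, 8)
    return axis                          # list of (phys. age, length, has branch)

print(grow_axis())
```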
Barczi, Jean-François; Rey, Hervé; Caraglio, Yves; de Reffye, Philippe; Barthélémy, Daniel; Dong, Qiao Xue; Fourcaud, Thierry
2008-01-01
Background and Aims AmapSim is a tool that implements a structural plant growth model based on a botanical theory and simulates plant morphogenesis to produce accurate, complex and detailed plant architectures. This software is the result of more than a decade of research and development devoted to plant architecture. New advances in the software development have yielded plug-in external functions that open up the simulator to functional processes. Methods The simulation of plant topology is based on the growth of a set of virtual buds whose activity is modelled using stochastic processes. The geometry of the resulting axes is modelled by simple descriptive functions. The potential growth of each bud is represented by means of a numerical value called physiological age, which controls the value for each parameter in the model. The set of possible values for physiological ages is called the reference axis. In order to mimic morphological and architectural metamorphosis, the value allocated for the physiological age of buds evolves along this reference axis according to an oriented finite state automaton whose occupation and transition law follows a semi-Markovian function. Key Results Simulations were performed on tomato plants to demonstrate how the AmapSim simulator can interface external modules, e.g. a GREENLAB growth model and a radiosity model. Conclusions The algorithmic ability provided by AmapSim, e.g. the reference axis, enables unified control to be exercised over plant development parameter values, depending on the biological process target: how to affect the local pertinent process, i.e. the pertinent parameter(s), while keeping the rest unchanged. This opening up to external functions also offers a broadened field of applications and thus allows feedback between plant growth and the physical environment. PMID:17766310
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Profile Interface Generator (PIG) is a tool for loosely coupling applications and performance tools. It enables applications to write code that looks like standard C and Fortran function calls, without requiring that applications link to specific implementations of those function calls. Performance tools can register with PIG in order to listen to only the calls that give information they care about. This interface reduces the build and configuration burden on application developers and allows semantic instrumentation to live in production codes without interfering with production runs.
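The listener-registration pattern the abstract describes can be sketched in a few lines. The snippet below is a hypothetical Python analogue (PIG itself targets C and Fortran, and none of these names come from the tool): the application emits named events through a generic interface, and a performance tool subscribes only to the events it cares about.

```python
# Loose-coupling sketch: the application calls emit() like an ordinary
# function; if no tool registered for that event, it is effectively a no-op.
class ProfileInterface:
    def __init__(self):
        self._listeners = {}                    # event name -> list of callbacks

    def register(self, event, callback):
        """A performance tool subscribes to one kind of event."""
        self._listeners.setdefault(event, []).append(callback)

    def emit(self, event, **payload):
        """The application reports an event; only interested tools hear it."""
        for callback in self._listeners.get(event, []):
            callback(**payload)

# Usage: the application instruments a solver phase; a tool listens for it.
pig = ProfileInterface()
pig.register("phase_end", lambda name, seconds: print(f"{name}: {seconds:.3f}s"))
pig.emit("phase_begin", name="solve")           # ignored: no listener registered
pig.emit("phase_end", name="solve", seconds=1.234)
```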
NASA Astrophysics Data System (ADS)
Ines, A. V. M.; Han, E.; Baethgen, W.
2017-12-01
Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk management associated with inter-annual climate variability. In spite of the popular use of crop simulation models in addressing climate risk problems, the models cannot readily take seasonal climate predictions issued in the format of tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the climate information can be translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF allows "what-if" scenarios with different crop choices or management practices to be simulated, better informing decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs with the DSSAT-CSM-Rice model to guide decision makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. CAMDT has functionality to disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or a non-parametric disaggregation method) and to run DSSAT-CSM-Rice with the disaggregated weather realizations. The convenient graphical user interface allows non-technical users to implement several "what-if" scenarios easily and to visualize the results of the scenario runs. In addition, CAMDT also translates crop model outputs into economic terms once the user provides the expected crop price and cost. CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, and has great flexibility for being adapted to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
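As a hedged illustration of the non-parametric disaggregation step mentioned above (not the CAMDT code itself), the sketch below groups historical seasons into below-/near-/above-normal terciles by seasonal total and resamples whole seasons with the forecast probabilities to produce daily realizations; the data and probabilities are synthetic.

```python
# Resampling-based disaggregation of a tercile seasonal forecast into daily
# rainfall realizations. Illustrative sketch with synthetic historical seasons.
import numpy as np

def disaggregate(seasons, probs=(0.2, 0.3, 0.5), n=100, seed=0):
    """seasons: list of 1-D arrays of daily rainfall, one per historical year."""
    rng = np.random.default_rng(seed)
    totals = np.array([s.sum() for s in seasons])
    edges = np.quantile(totals, [1 / 3, 2 / 3])
    cats = np.digitize(totals, edges)            # 0 below, 1 near, 2 above normal
    pools = [[s for s, c in zip(seasons, cats) if c == k] for k in range(3)]
    out = []
    for _ in range(n):
        k = rng.choice(3, p=probs)               # draw a tercile category
        out.append(pools[k][rng.integers(len(pools[k]))])  # resample one season
    return out

# Example with 30 synthetic historical seasons of 120 days each
hist = [np.random.default_rng(y).gamma(0.5, 8.0, size=120) for y in range(30)]
realizations = disaggregate(hist, probs=(0.15, 0.35, 0.50), n=10)
print(len(realizations), realizations[0].sum())
```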
Clinical simulation practise framework.
Khalili, Hossein
2015-02-01
Historically, simulation has mainly been used to teach students hands-on skills in a relatively safe environment. With changes in the patient population, professional regulations and clinical environments, clinical simulation practise (CSP) must assist students to integrate and apply their theoretical knowledge and skills with their critical thinking, clinical judgement, prioritisation, problem solving, decision making, and teamwork skills to provide holistic care and treatment to their patients. CSP holds great potential to drive a positive transformation in students' transition into the workplace, by associating and consolidating learning from classrooms to clinical settings, and creating bridges between theory and practice. For CSP to be successful in filling the gap, the design and management of the simulation is crucial. In this article a new framework called 'Clinical simulation practise framework: A knowledge to action strategy in health professional education' is introduced that aims to assist educators and curriculum developers in designing and managing their simulations. This CSP framework theorises that simulation, as an experiential educational tool, could improve students' competence, confidence and collaboration in performing professional practice in real settings if the CSP provides the following three dimensions: (1) a safe, positive, reflective and fun simulated learning environment; (2) challenging, but realistic, and integrated simulated scenarios; and (3) interactive, inclusive, interprofessional patient-centred simulated practise. © 2015 John Wiley & Sons Ltd.
Observability of ionospheric space-time structure with ISR: A simulation study
NASA Astrophysics Data System (ADS)
Swoboda, John; Semeter, Joshua; Zettergren, Matthew; Erickson, Philip J.
2017-02-01
The sources of error from electronically steerable array (ESA) incoherent scatter radar (ISR) systems are investigated both theoretically and with the use of an open-source ISR simulator, developed by the authors, called Simulator for ISR (SimISR). The main sources of error incorporated in the simulator include statistical uncertainty, which arises due to the nature of the measurement mechanism and the inherent space-time ambiguity from the sensor. SimISR can take a field of plasma parameters, parameterized by time and space, and create simulated ISR data at the scattered electric field (i.e., complex receiver voltage) level, subsequently processing these data to show possible reconstructions of the original parameter field. To demonstrate general utility, we show a number of simulation examples, with two cases using data from a self-consistent multifluid transport model. Results highlight the significant influence of the forward model of the ISR process and the resulting statistical uncertainty on plasma parameter measurements and the core experiment design trade-offs that must be made when planning observations. These conclusions further underscore the utility of this class of measurement simulator as a design tool for more optimal experiment design efforts using flexible ESA class ISR systems.
Faint source detection in ISOCAM images
NASA Astrophysics Data System (ADS)
Starck, J. L.; Aussel, H.; Elbaz, D.; Fadda, D.; Cesarsky, C.
1999-08-01
We present a tool adapted to the detection of faint mid-infrared sources within ISOCAM mosaics. This tool is based on a wavelet analysis which allows us to discriminate sources from cosmic ray impacts at the very limit of the instrument, four orders of magnitude below IRAS. It is called PRETI, for Pattern REcognition Technique for ISOCAM data, because glitches with transient behaviors are isolated in wavelet (i.e. frequency) space, where they present peculiar signatures in the form of patterns that are automatically identified and then reconstructed. We have tested PRETI with Monte-Carlo simulations of fake ISOCAM data. These simulations allowed us to define the fraction of remaining false sources due to cosmic rays, the sensitivity and completeness limits, as well as the photometric accuracy as a function of the observation parameters. Although the main scientific applications of this technique have appeared or will appear in separate papers, we present here an application to the ISOCAM-Hubble Deep Field image. This work completes and confirms the results already published (Aussel et al. 1999).
Robison, Weston; Patel, Sonya K; Mehta, Akshat; Senkowski, Tristan; Allen, John; Shaw, Eric; Senkowski, Christopher K
2018-03-01
To study the effects of fatigue on general surgery residents' performance on the da Vinci Skills Simulator (dVSS). 15 General Surgery residents from various postgraduate training years (PGY2, PGY3, PGY4, and PGY5) performed 5 simulation tasks on the dVSS as recommended by the Robotic Training Network (RTN). The General Surgery residents had no prior experience with the dVSS. Participants were assigned to either the Pre-call group or Post-call group based on call schedule. As a measure of subjective fatigue, residents were given the Epworth Sleepiness Scale (ESS) prior to their dVSS testing. The dVSS MScore™ software recorded various metrics (Objective Structured Assessment of Technical Skills, OSATS) that were used to evaluate the performance of each resident to compare the robotic simulation proficiency between the Pre-call and Post-call groups. Six general surgery residents were stratified into the Pre-call group and nine into the Post-call group. These residents were also stratified into Fatigued (10) or Nonfatigued (5) groups, as determined by their reported ESS scores. A statistically significant difference was found between the Pre-call and Post-call reported sleep hours (p = 0.036). There was no statistically significant difference between the Pre-call and Post-call groups or between the Fatigued and Nonfatigued groups in time to complete exercise, number of attempts, and high MScore™ score. Despite variation in fatigue levels, there was no effect on the acquisition of robotic simulator skills.
Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics
NASA Astrophysics Data System (ADS)
Junghans, Christoph; Mniszewski, Susan; Voter, Arthur; Perez, Danny; Eidenbenz, Stephan
2014-03-01
We present an example of a new class of tools that we call application simulators, parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation (PDES). We demonstrate our approach with a TADSim application simulator that models the Temperature Accelerated Dynamics (TAD) method, which is an algorithmically complex member of the Accelerated Molecular Dynamics (AMD) family. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We further extend TADSim to model algorithm extensions to standard TAD, such as speculative spawning of the compute-bound stages of the algorithm, and predict performance improvements without having to implement such a method. Focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights into the TAD algorithm behavior and suggested extensions to the TAD method.
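A minimal flavor of such an "application simulator" is sketched below: instead of running the real computation, a discrete-event loop advances a virtual clock through timed events that stand in for the algorithm's stages, which is what makes broad parameter scans cheap. Stage names and durations are illustrative placeholders, not the TAD values.

```python
# Tiny discrete-event proxy of a staged algorithm: events carry a virtual
# timestamp, and the clock jumps from event to event instead of doing real work.
import heapq

def simulate(n_transitions=5, t_md=2.0, t_nudge=0.5, t_accept=0.1):
    clock, queue, log = 0.0, [], []
    heapq.heappush(queue, (t_md, "md_block", 0))
    while queue:
        clock, stage, k = heapq.heappop(queue)
        log.append((round(clock, 2), stage))
        if stage == "md_block":
            heapq.heappush(queue, (clock + t_nudge, "saddle_search", k))
        elif stage == "saddle_search":
            heapq.heappush(queue, (clock + t_accept, "accept_event", k))
        elif stage == "accept_event" and k + 1 < n_transitions:
            heapq.heappush(queue, (clock + t_md, "md_block", k + 1))
    return clock, log

total_time, trace = simulate()
print(total_time)        # virtual runtime of the modeled algorithm
```

Scanning a stage duration or the number of transitions then only costs microseconds per run, which is the point the abstract makes about focused parameter scans.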
Publishing and sharing of hydrologic models through WaterHUB
NASA Astrophysics Data System (ADS)
Merwade, V.; Ruddell, B. L.; Song, C.; Zhao, L.; Kim, J.; Assi, A.
2011-12-01
Most hydrologists use hydrologic models to simulate hydrologic processes in order to understand hydrologic pathways and fluxes for research, decision making and engineering design. Once these tasks are complete, including publication of results, the models generally are not published or made available to the public for further use and improvement. Although publication or sharing of models is not required for journal publications, sharing of models may open doors for new collaborations and avoid duplication of effort if other researchers are interested in simulating a particular watershed for which a model already exists. For researchers who are interested in sharing models, there are limited avenues for publishing their models to the wider community. Towards filling this gap, a prototype cyberinfrastructure (CI), called WaterHUB, is developed for sharing hydrologic data and modeling tools in an interactive environment. To test the utility of WaterHUB for sharing hydrologic models, a system to publish and share SWAT (Soil and Water Assessment Tool) models is developed. Users can utilize WaterHUB to search and download existing SWAT models, and also upload new SWAT models. Metadata such as the name of the watershed, the name of the person or agency who developed the model, the simulation period, time step, and a list of calibrated parameters are also published with each individual model.
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond
Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
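For a sense of the kind of predictive model such a knowledge base would chain into a process simulation, the sketch below applies a log-linear (Bigelow-type) thermal inactivation model across several process steps; the D-value, z-value and step conditions are illustrative assumptions, not reviewed ricin parameters.

```python
# Log-linear inactivation chained over process steps. All parameter values
# below are placeholders for illustration only.
def log10_reduction(time_min, temp_c, d_ref=10.0, temp_ref=70.0, z=10.0):
    """Decimal reductions after time_min minutes at temp_c (log-linear model)."""
    d_value = d_ref * 10 ** ((temp_ref - temp_c) / z)   # D-value at temp_c, minutes
    return time_min / d_value

def residual_amount(initial_ug, steps):
    """Chain several process steps, each given as (time_min, temp_c)."""
    total_log10 = sum(log10_reduction(t, temp) for t, temp in steps)
    return initial_ug * 10 ** (-total_log10)

# Example: two heating steps in a fictitious salami process
print(round(residual_amount(100.0, [(30, 60), (15, 75)]), 3))
```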
A comparison of renewable energy technologies using two simulation softwares: HOMER and RETScreen
NASA Astrophysics Data System (ADS)
Ramli, Mohd Sufian; Wahid, Siti Sufiah Abd; Hassan, Khairul Kamarudin
2017-08-01
This paper concerns the modelling of renewable energy technologies, including a PV standalone system (PVSS), a wind standalone system (WSS) and a PV-wind hybrid system (PVWHS). To evaluate the performance of all power system configurations in terms of economic analysis and optimization, simulation tools called HOMER and RETScreen are used in this paper. The HOMER energy modeling software is a powerful tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, wind turbines, solar photovoltaics, hydropower, batteries, and other inputs. RETScreen uses a Microsoft Excel-based spreadsheet model that consists of a set of workbooks which calculate the annual average energy flows with adjustment factors to account for temporal effects such as solar-load coincidence. Equipment sizes are calculated and inserted as inputs to HOMER and RETScreen. The results obtained are analyzed and discussed. The cost per kWh to generate electricity using the PVSS to supply the average demand of 8.4 kWh/day ranges between RM 1.953/kWh and RM 3.872/kWh. It has been found that the PVSS gives the lowest cost of energy compared with the other two proposed technologies that have been simulated using HOMER and RETScreen.
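The cost-of-energy figures such tools report boil down to annualizing capital cost and dividing total annual cost by the energy served. A minimal sketch, with purely illustrative numbers:

```python
# Levelized cost of energy with a capital recovery factor. All inputs are
# hypothetical placeholders, not values from the paper.
def levelized_cost(capital, annual_om, annual_kwh, rate=0.06, years=20):
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    return (capital * crf + annual_om) / annual_kwh      # currency per kWh

# Standalone PV example sized for ~8.4 kWh/day (~3066 kWh/year)
print(round(levelized_cost(capital=25000.0, annual_om=400.0, annual_kwh=3066.0), 3))
```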
NASA Astrophysics Data System (ADS)
Lerendegui-Marco, J.; Cortés-Giraldo, M. A.; Guerrero, C.; Quesada, J. M.; Meo, S. Lo; Massimi, C.; Barbagallo, M.; Colonna, N.; Mancussi, D.; Mingrone, F.; Sabaté-Gilarte, M.; Vannini, G.; Vlachoudis, V.; Aberle, O.; Andrzejewski, J.; Audouin, L.; Bacak, M.; Balibrea, J.; Bečvář, F.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Caamaño, M.; Calviño, F.; Calviani, M.; Cano-Ott, D.; Cardella, R.; Casanovas, A.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Cortés, G.; Cosentino, L.; Damone, L. A.; Diakaki, M.; Domingo-Pardo, C.; Dressler, R.; Dupont, E.; Durán, I.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Göbel, K.; Gómez-Hornillos, M. B.; García, A. R.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González, E.; Griesmayer, E.; Gunsing, F.; Harada, H.; Heinitz, S.; Heyse, J.; Jenkins, D. G.; Jericha, E.; Käppeler, F.; Kadi, Y.; Kalamara, A.; Kavrigin, P.; Kimura, A.; Kivel, N.; Kokkoris, M.; Krtička, M.; Kurtulgil, D.; Leal-Cidoncha, E.; Lederer, C.; Leeb, H.; Lonsdale, S. J.; Macina, D.; Marganiec, J.; Martínez, T.; Masi, A.; Mastinu, P.; Mastromarco, M.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Milazzo, P. M.; Musumarra, A.; Negret, A.; Nolte, R.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Porras, I.; Praena, J.; Radeck, D.; Rauscher, T.; Reifarth, R.; Rout, P. C.; Rubbia, C.; Ryan, J. A.; Saxena, A.; Schillebeeckx, P.; Schumann, D.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Tarifeño-Saldivia, A.; Tassan-Got, L.; Valenta, S.; Variale, V.; Vaz, P.; Ventura, A.; Vlastou, R.; Wallner, A.; Warren, S.; Woods, P. J.; Wright, T.; Žugec, P.
2017-09-01
Monte Carlo (MC) simulations are an essential tool to determine fundamental features of a neutron beam, such as the neutron flux or the γ-ray background, that sometimes can not be measured or at least not in every position or energy range. Until recently, the most widely used MC codes in this field had been MCNPX and FLUKA. However, the Geant4 toolkit has also become a competitive code for the transport of neutrons after the development of the native Geant4 format for neutron data libraries, G4NDL. In this context, we present the Geant4 simulations of the neutron spallation target of the n_TOF facility at CERN, done with version 10.1.1 of the toolkit. The first goal was the validation of the intra-nuclear cascade models implemented in the code using, as benchmark, the characteristics of the neutron beam measured at the first experimental area (EAR1), especially the neutron flux and energy distribution, and the time distribution of neutrons of equal kinetic energy, the so-called Resolution Function. The second goal was the development of a Monte Carlo tool aimed to provide useful calculations for both the analysis and planning of the upcoming measurements at the new experimental area (EAR2) of the facility.
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond.
Falenski, Alexander; Weiser, Armin A; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations.
EDIN0613P weight estimating program. [for launch vehicles]
NASA Technical Reports Server (NTRS)
Hirsch, G. N.
1976-01-01
The weight estimating relationships and program developed for space power system simulation are described. The program was developed to size a two-stage launch vehicle for the space power system. The program is actually part of an overall simulation technique called EDIN (Engineering Design and Integration) system. The program sizes the overall vehicle, generates major component weights and derives a large amount of overall vehicle geometry. The program is written in FORTRAN V and is designed for use on the Univac Exec 8 (1110). By utilizing the flexibility of this program while remaining cognizant of the limits imposed upon output depth and accuracy by utilization of generalized input, this program concept can be a useful tool for estimating purposes at the conceptual design stage of a launch vehicle.
2013-01-01
Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455
Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian
2013-11-09
The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.
The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.
2005-12-01
The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships, and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This provides a robust and secure system for maintaining the association between the data sets and their metadata. To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs to calculations, and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools have been implemented that utilize ontological descriptions of SHA software and data to validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data including GMT-based map making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern program. The UseIT Intern Program provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as a part of this undergraduate research program.
Simulation Platform: a cloud-based online simulation environment.
Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro
2011-09-01
For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have solely been designed to archive model files, but the databases should provide a chance for users to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
Reprint of: Simulation Platform: a cloud-based online simulation environment.
Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro
2011-11-01
For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have solely been designed to archive model files, but the databases should provide a chance for users to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
Medicanes in an ocean-atmosphere coupled regional climate model
NASA Astrophysics Data System (ADS)
Akhtar, Naveed; Brauch, Jennifer; Ahrens, Bodo
2014-05-01
So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, warm-core Mediterranean cyclones which exhibit some similarities with tropical cyclones. The strong cyclonic winds associated with them are a potential threat for highly populated coastal areas around the Mediterranean basin. In this study we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (NEMO-1d) to simulate medicanes. The goal of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose 11 historical medicane events are simulated by the atmosphere-only and the coupled models using different set-ups (horizontal grid spacings: 0.44°, 0.22°, 0.088°; with/without spectral nudging). The results show that at high resolution the coupled model is not only able to simulate all medicane events but also improves the simulated track length, warm core, and wind speed of simulated medicanes compared with atmosphere-only simulations. In most of the cases the medicane trajectories and structures are better represented in coupled simulations compared with atmosphere-only simulations. We conclude that the coupled model is a suitable tool for systematic and detailed study of historical medicane events and also for future projections.
Proximity matching for ArF and KrF scanners
NASA Astrophysics Data System (ADS)
Kim, Young Ki; Pohling, Lua; Hwee, Ng Teng; Kim, Jeong Soo; Benyon, Peter; Depre, Jerome; Hong, Jongkyun; Serebriakov, Alexander
2009-03-01
There are many IC manufacturers around the world that use various exposure systems and work with very demanding requirements in order to establish and maintain stable lithographic processes at 65 nm, 45 nm and below. Once a process is established, the manufacturer wants to be able to run it on the different tools that are available. This is why proximity matching plays a key role in maximizing tool utilization, in terms of productivity, across different types of exposure tools. In this paper, we investigate the sources of errors that cause optical proximity mismatch and evaluate several approaches for proximity matching of different types of 193 nm and 248 nm scanner systems, such as set-get sigma calibration, contrast adjustment, and, finally, tuning imaging parameters by optimization with Manual Scanner Matcher. First, to monitor the proximity mismatch, we collect CD measurement data for the reference tool and for the tool-to-be-matched. Normally, the measurement is performed for a set of line or space through-pitch structures. Secondly, by simulation or experiment, we determine the sensitivity of the critical structures with respect to small adjustments of exposure settings such as NA, sigma inner, sigma outer, dose, focus scan range, etc., which are called 'proximity tuning knobs'. Then, with the help of special optimization software, we compute the proximity knob adjustments that have to be applied to the tool-to-be-matched so that it matches the reference tool. Finally, we verify successful matching by exposing on the tool-to-be-matched with the tuned exposure settings. This procedure is applicable to inter- and intra-scanner-type matching, and possibly also to process transfers to the design targets. In order to illustrate the approach we show experimental data as well as results of imaging simulations. These demonstrate successful matching of critical structures for ArF scanners of different tool generations.
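The tuning step described above can be viewed as a small least-squares problem: given the CD mismatch through pitch and the sensitivity of each structure to each 'proximity tuning knob', solve for the knob adjustments that best cancel the mismatch. The sketch below uses made-up sensitivities and mismatches, not measured scanner data.

```python
# Least-squares knob tuning: S maps knob changes to CD changes per structure.
# All numbers are illustrative placeholders.
import numpy as np

# rows: measured structures (pitches); columns: knobs (dNA, dsigma_out, ddose)
S = np.array([[ 1.2, -0.8, 0.5],
              [ 0.6, -1.1, 0.5],
              [-0.3,  0.9, 0.5],
              [-0.9,  1.4, 0.5]])                   # nm of CD change per unit knob change
delta_cd = np.array([1.5, 0.8, -0.4, -1.2])         # tool-to-be-matched minus reference [nm]

knob_adjust, *_ = np.linalg.lstsq(S, -delta_cd, rcond=None)
residual = delta_cd + S @ knob_adjust
print("knob adjustments:", knob_adjust.round(3))
print("residual mismatch [nm]:", residual.round(3))
```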
OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. BOETTCHER; A. PERCUS
2000-08-01
We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by "self-organized criticality," a concept introduced to describe emergent complexity in many physical systems. In contrast to genetic algorithms, which operate on an entire "gene pool" of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called "avalanches," ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
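A toy illustration of extremal optimization, here applied to MAX-CUT on a random graph: each vertex gets a fitness (the fraction of its incident edges already cut), a low-ranked vertex is selected through a power-law over ranks (the single adjustable parameter tau), and it is flipped unconditionally, producing the avalanches the abstract mentions. This is a sketch under those assumptions, not the authors' code.

```python
# tau-EO style update: rank vertices by fitness, pick one with a power-law
# bias toward the worst, and replace (flip) it without any acceptance test.
import random

def extremal_opt_maxcut(n=60, p=0.1, tau=1.4, steps=5000, seed=1):
    rng = random.Random(seed)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]
    adj = [[] for _ in range(n)]
    for i, j in edges:
        adj[i].append(j)
        adj[j].append(i)
    side = [rng.randint(0, 1) for _ in range(n)]
    cut = sum(side[i] != side[j] for i, j in edges)
    best = cut
    for _ in range(steps):
        # fitness of a vertex = fraction of its incident edges already cut
        fitness = [(sum(side[v] != side[u] for u in adj[v]) / len(adj[v])
                    if adj[v] else 1.0, v) for v in range(n)]
        ranked = [v for _, v in sorted(fitness)]                  # worst vertex first
        rank = min(int((1.0 - rng.random()) ** (-1.0 / (tau - 1.0))), n)
        v = ranked[rank - 1]
        delta = sum(1 if side[v] == side[u] else -1 for u in adj[v])
        side[v] ^= 1                                              # flip unconditionally
        cut += delta
        best = max(best, cut)
    return best

print(extremal_opt_maxcut())
```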
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
Astroinformatics in the Age of LSST: Analyzing the Summer 2012 Data Release
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; De Lee, N. M.; Stassun, K.; Paegert, M.; Cargile, P.; Burger, D.; Bloom, J. S.; Richards, J.
2013-01-01
The Large Synoptic Survey Telescope (LSST) will image the visible southern sky every three nights. This multi-band, multi-epoch survey will produce a torrent of data, which traditional methods of object-by-object data analysis will not be able to accommodate. Thus the need for new astroinformatics tools to visualize, simulate, mine, and analyze this quantity of data. The Berkeley Center for Time-Domain Informatics (CTDI) is building the informatics infrastructure for generic light curve classification, including the innovation of new algorithms for feature generation and machine learning. The CTDI portal (http://dotastro.org) contains one of the largest collections of public light curves, with visualization and exploration tools. The group has also published the first calibrated probabilistic classification catalog of 50k variable stars along with a data exploration portal called http://bigmacc.info. Twice a year, the LSST collaboration releases simulated LSST data, in order to aid software development. This poster also showcases a suite of new tools from the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA), designed to take advantage of these large data sets. VIDA's Filtergraph interactive web tool allows one to instantly create an interactive data portal for fast, real-time visualization of large data sets. Filtergraph enables quick selection of interesting objects by easily filtering on many different columns, 2-D and 3-D representations, and on-the-fly arithmetic calculations on the data. It also makes sharing the data and the tool with collaborators very easy. The EB/RRL Factory is a neural-network-based variable star classifier, which is designed to quickly identify variable stars in a variety of classes from LSST light curve data (currently tuned to Eclipsing Binaries and RR Lyrae stars), and to provide likelihood-based orbital elements or stellar parameters as appropriate. Finally, the LCsimulator software allows one to create simulated light curves of multiple types of variable stars based on an LSST cadence.
Java-based Graphical User Interface for MAVERIC-II
NASA Technical Reports Server (NTRS)
Seo, Suk Jai
2005-01-01
A computer program entitled "Marshall Aerospace Vehicle Representation in C II" (MAVERIC-II) is a vehicle flight simulation program written primarily in the C programming language. It is written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models, such as propulsion, aerodynamics, and guidance, navigation, and control; 2) the environment models, such as atmosphere and gravity; and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store it in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario of the use of MAVERIC consists of three steps: editing existing input data files, running MAVERIC, and plotting output results.
Communicating Value in Simulation: Cost-Benefit Analysis and Return on Investment.
Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu
2018-02-01
Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and the economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost-effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes," our breakout session critically evaluated the cost-benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost-benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program. © 2017 by the Society for Academic Emergency Medicine.
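The basic quantities a financial officer will ask for are straightforward to compute once costs and monetized benefits are estimated; the snippet below shows return on investment and a benefit-cost ratio with hypothetical figures.

```python
# Return on investment and benefit-cost ratio. All figures are hypothetical
# placeholders, not values from the consensus conference paper.
def roi(benefit, cost):
    return (benefit - cost) / cost

def benefit_cost_ratio(benefit, cost):
    return benefit / cost

program_cost = 120_000.0       # e.g., simulator time, faculty effort, consumables
program_benefit = 150_000.0    # e.g., avoided adverse events, reduced OR time
print(f"ROI = {roi(program_benefit, program_cost):.0%}, "
      f"BCR = {benefit_cost_ratio(program_benefit, program_cost):.2f}")
```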
A new tool called DISSECT for analysing large genomic data sets using a Big Data approach
Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert
2015-01-01
Large-scale genetic and genomic data are increasingly available and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is a new and freely available software that is able to exploit the distributed-memory parallel computational architectures of compute clusters, to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
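As a loose, small-scale analogue of the genomic prediction task described above (not DISSECT's distributed mixed-model machinery), the sketch below fits a ridge-penalized linear model to synthetic SNP data and reports prediction accuracy on held-out individuals.

```python
# Ridge-penalized genomic prediction on synthetic data, a few hundred
# individuals rather than 470,000; penalty and effect sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 1000
X = rng.binomial(2, 0.3, size=(n, p)).astype(float)     # SNP genotypes 0/1/2
beta = rng.normal(0, 0.05, size=p)
y = X @ beta + rng.normal(0, 1.0, size=n)                # simulated trait

lam = 50.0                                               # ridge penalty
train, test = slice(0, 300), slice(300, None)
A = X[train].T @ X[train] + lam * np.eye(p)
beta_hat = np.linalg.solve(A, X[train].T @ y[train])
corr = np.corrcoef(X[test] @ beta_hat, y[test])[0, 1]
print("prediction accuracy (r):", round(corr, 2))
```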
Yule, Steven; Parker, Sarah Henrickson; Wilkinson, Jill; McKinley, Aileen; MacDonald, Jamie; Neill, Adrian; McAdam, Tim
2015-01-01
To investigate the effect of coaching on non-technical skills and performance during laparoscopic cholecystectomy in a simulated operating room (OR). Non-technical skills (situation awareness, decision making, teamwork, and leadership) underpin technical ability and are critical to the success of operations and the safety of patients in the OR. The rate of developing assessment tools in this area has outpaced development of workable interventions to improve non-technical skills in surgical training and beyond. A randomized trial was conducted with senior surgical residents (n = 16). Participants were randomized to receive either non-technical skills coaching (intervention) or to self-reflect (control) after each of 5 simulated operations. Coaching was based on the Non-Technical Skills For Surgeons (NOTSS) behavior observation system. Surgeon-coaches trained in this method coached participants in the intervention group for 10 minutes after each simulation. Primary outcome measure was non-technical skills, assessed from video by a surgeon using the NOTSS system. Secondary outcomes were time to call for help during bleeding, operative time, and path length of laparoscopic instruments. Non-technical skills improved in the intervention group from scenario 1 to scenario 5 compared with those in the control group (p = 0.04). The intervention group was faster to call for help when faced with unstoppable bleeding in the final scenario (no. 5; p = 0.03). Coaching improved residents' non-technical skills in the simulated OR compared with those in the control group. Important next steps are to implement non-technical skills coaching in the real OR and assess effect on clinically important process measures and patient outcomes. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
A Cellular Automata Model of Infection Control on Medical Implants
Prieto-Langarica, Alicia; Kojouharov, Hristo; Chen-Charpentier, Benito; Tang, Liping
2011-01-01
S. epidermidis infections on medically implanted devices are a common problem in modern medicine due to the abundance of the bacteria. Once inside the body, S. epidermidis gather in communities called biofilms and can become extremely hard to eradicate, causing the patient serious complications. We simulate the complex S. epidermidis–neutrophil interactions in order to determine the optimum conditions for the immune system to be able to contain the infection and avoid implant rejection. Our cellular automata model can also be used as a tool for determining the optimal amount of antibiotics for combating biofilm formation on medical implants. PMID:23543851
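A minimal sketch of a cellular-automata update in this spirit: a grid holds bacteria, neutrophils and empty sites; each step, neutrophils clear adjacent bacteria and bacteria divide into free neighbouring sites with some probability. The rules and rates are illustrative assumptions, not the published model's.

```python
# Toy bacteria/neutrophil cellular automaton on a square grid.
import random

EMPTY, BACT, NEUT = 0, 1, 2

def step(grid, rng, p_divide=0.3):
    """One synchronous update of the toy automaton."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            nbrs = [(i + di, j + dj)
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < n and 0 <= j + dj < n]
            if grid[i][j] == NEUT:
                for x, y in nbrs:                    # neutrophil kills adjacent bacteria
                    if grid[x][y] == BACT:
                        new[x][y] = EMPTY
            elif grid[i][j] == BACT and rng.random() < p_divide:
                free = [(x, y) for x, y in nbrs if grid[x][y] == EMPTY]
                if free:
                    x, y = rng.choice(free)          # bacterium divides into a free site
                    new[x][y] = BACT
    return new

rng = random.Random(7)
grid = [[rng.choices([EMPTY, BACT, NEUT], weights=[0.8, 0.15, 0.05])[0]
         for _ in range(30)] for _ in range(30)]
for _ in range(50):
    grid = step(grid, rng)
print("bacteria remaining:", sum(row.count(BACT) for row in grid))
```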
Finding the sweet spot: how to get the right staffing for variable workloads.
Bryce, David J; Christensen, Taylor J
2011-03-01
All too often, hospital department managers set their staff schedules too much in anticipation of high levels of demand for services, leading to higher-than-necessary staffing costs when demand is lower than expected. The opposite approach of scheduling too few staff to meet demand, then relying on on-call or callback staff to address the shortage, also results in higher-than-necessary costs due to the premium wages that such staff must be paid. A staffing and workload simulation tool allows hospital departments to find the right balance between these extremes.
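The 'sweet spot' the article describes can be explored with a very small simulation: for each candidate number of scheduled staff, simulate variable demand and weigh the cost of scheduled hours against premium-pay call-backs. All wage rates and the demand distribution below are hypothetical.

```python
# Staffing sweet-spot search by Monte Carlo over a hypothetical demand profile.
import random

def expected_cost(scheduled, base_rate=40.0, callback_rate=70.0,
                  mean_demand=10.0, n_days=10000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_days):
        demand = rng.gauss(mean_demand, 3.0)          # staff-hours needed that day
        shortfall = max(demand - scheduled, 0.0)      # covered by call-back staff
        total += scheduled * base_rate + shortfall * callback_rate
    return total / n_days

best = min(range(5, 20), key=expected_cost)
print("lowest-cost staffing level:", best)
```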
An evaluation of copy number variation detection tools for cancer using whole exome sequencing data.
Zare, Fatima; Dow, Michelle; Monteleone, Nicholas; Hosny, Abdelrahman; Nabavi, Sheida
2017-05-31
Recently, copy number variation (CNV) has gained considerable interest as a type of genomic/genetic variation that plays an important role in disease susceptibility. Advances in sequencing technology have created an opportunity for detecting CNVs more accurately. Recently, whole exome sequencing (WES) has become the primary strategy for sequencing patient samples and studying their genomic aberrations. However, compared to whole genome sequencing, WES introduces more biases and noise that make CNV detection very challenging. Additionally, tumors' complexity makes the detection of cancer-specific CNVs even more difficult. Although many CNV detection tools have been developed since the introduction of NGS data, there are few tools for somatic CNV detection from WES data in cancer. In this study, we evaluated the performance of the most recent and commonly used CNV detection tools for WES data in cancer to address their limitations and provide guidelines for developing new ones. We focused on the tools that have been designed for, or have the ability to detect, cancer somatic aberrations. We compared the performance of the tools in terms of sensitivity and false discovery rate (FDR) using real data and simulated data. Comparative analysis of the results of the tools showed that there is low consensus among the tools in calling CNVs. Using real data, the tools show moderate sensitivity (~50% - ~80%), fair specificity (~70% - ~94%) and poor FDRs (~27% - ~60%). Also, using simulated data we observed that increasing the coverage beyond 10× in exonic regions does not improve the detection power of the tools significantly. The limited performance of the current CNV detection tools for WES data in cancer indicates the need for developing more efficient and precise CNV detection methods. Due to the complexity of tumors and the high level of noise and biases in WES data, employing advanced novel segmentation, normalization and de-noising techniques that are designed specifically for cancer data is necessary. Also, CNV detection development suffers from the lack of a gold standard for performance evaluation. Finally, developing tools with user-friendly interfaces and visualization features can enhance CNV studies for a broader range of users.
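Benchmarks like the one described above ultimately reduce to comparing called segments against a truth set and tallying sensitivity and FDR. A minimal sketch of that bookkeeping, using a reciprocal-overlap criterion and made-up segments:

```python
# Sensitivity and FDR from called vs. truth CNV segments. The overlap
# threshold and example segments are illustrative only.
def overlaps(a, b, frac=0.5):
    """Reciprocal-overlap test for two segments (chrom, start, end)."""
    if a[0] != b[0]:
        return False
    inter = min(a[2], b[2]) - max(a[1], b[1])
    return inter >= frac * (a[2] - a[1]) and inter >= frac * (b[2] - b[1])

def evaluate(calls, truth, frac=0.5):
    tp = sum(any(overlaps(c, t, frac) for t in truth) for c in calls)
    found = sum(any(overlaps(t, c, frac) for c in calls) for t in truth)
    sensitivity = found / len(truth) if truth else 0.0
    fdr = 1.0 - tp / len(calls) if calls else 0.0
    return sensitivity, fdr

truth = [("chr1", 1000, 5000), ("chr2", 200, 900)]
calls = [("chr1", 1200, 5200), ("chr3", 10, 500)]
print(evaluate(calls, truth))       # -> (0.5, 0.5)
```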
Mobility analysis, simulation, and scale model testing for the design of wheeled planetary rovers
NASA Technical Reports Server (NTRS)
Lindemann, Randel A.; Eisen, Howard J.
1993-01-01
The use of computer-based techniques to model and simulate wheeled rovers on rough natural terrains is considered. Physical models of a prototype vehicle can be used to test the correlation of the simulations in scaled testing. The computer approaches include a quasi-static planar (two-dimensional) analysis and design tool based on the traction necessary for the vehicle to have imminent mobility. The computer program modeled a six-by-six wheel drive vehicle of original kinematic configuration, called the Rocker Bogie. The Rocker Bogie was optimized using the quasi-static software with respect to its articulation parameters prior to fabrication of a prototype. In another approach, the dynamics of the Rocker Bogie vehicle in 3-D space were modeled on an engineering workstation using commercial software. The model included the complex and nonlinear interaction of the tire and terrain. The investigation yielded numerical and graphical results of the rover traversing rough terrain on the Earth, Moon, and Mars, as well as animations of the rover excursions. A prototype vehicle was then used in a series of testbed and field experiments, and correspondence was established between the computer models and the physical model. The results indicated the utility of the quasi-static tool for configurational design, as well as the predictive ability of the 3-D simulation to model the dynamic behavior of the vehicle over short traverses.
NASA Technical Reports Server (NTRS)
Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael
2013-01-01
The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Überlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, the automated on-board Traffic Alert and Collision Avoidance System. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of the Überlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Überlingen accident can be modeled as functioning according to requirements or in an anomalous condition, as occurred during the accident. Brahms-GUeM thus implicitly defines a class of scenarios, which includes as an instance what occurred at Überlingen. Brahms-GUeM is a modeling framework enabling "what if" analysis of alternative work system configurations and thus facilitating design of alternative operations concepts. It enables subsequent adaptation (reusing simulation components) for modeling and simulating NextGen scenarios. This project demonstrates that Brahms provides the capacity to model the complexity of air transportation systems, going beyond idealized and simple flights to include, for example, the interaction of pilots and ATCOs. The research shows clearly that verification and validation must include the entire work system: on the one hand, to check that mechanisms exist to handle failures of communication and alerting subsystems and/or failures of people to notice, comprehend, or communicate problematic (unsafe) situations; and on the other, to understand how people must use their own judgment in relating fallible systems like TCAS to other sources of information, and thus to evaluate how the unreliability of automation affects system safety. The simulation shows in particular that distributed agents (people and automated systems) acting without knowledge of each other's actions can create a complex, dynamic system whose interactive behavior is unexpected and is changing too quickly to comprehend and control.
Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software
Mejías, Andrés; Herrera, Reyes S.; Márquez, Marco A.; Calderón, Antonio José; González, Isaías; Andújar, José Manuel
2017-01-01
There are several specific solutions for accessing sensors and actuators present in any process or system through a TCP/IP network, either local or a wide-area type like the Internet. The usage of sensors and actuators of different nature and diverse interfaces (SPI, I2C, analogue, etc.) makes access to them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS), solve the interaction issues between the subsystem that integrates sensors and actuators into the network, called the convergence subsystem in this paper, and the Human Machine Interface (HMI), designed using the intuitive graphical system of EJS and located on the user's computer. The proposed hardware architectures and software tools are described and experimental implementations with the proposed tools are presented. PMID:28067801
Flow field prediction in full-scale Carrousel oxidation ditch by using computational fluid dynamics.
Yang, Yin; Wu, Yingying; Yang, Xiao; Zhang, Kai; Yang, Jiakuan
2010-01-01
In order to optimize the flow field in a full-scale Carrousel oxidation ditch with many sets of disc aerators operating simultaneously, an experimentally validated numerical tool, based on computational fluid dynamics (CFD), was proposed. A full-scale, closed-loop bioreactor (Carrousel oxidation ditch) in Ping Dingshan Sewage Treatment Plant in Ping Dingshan City, a medium-sized city in Henan Province of China, was evaluated using CFD. A moving wall model was created to simulate the many sets of disc aerators that create fluid motion in the ditch. The simulated results agreed acceptably with the experimental data, and the following conclusions were obtained: (1) the new moving wall method could simulate the flow field in a Carrousel oxidation ditch with many sets of disc aerators operating simultaneously, while significantly reducing the total number of grid cells and hence the computational cost; and (2) CFD modeling generally characterized the flow pattern in the full-scale tank, so 3D simulation could be a good supplement for improving the hydrodynamic performance of oxidation ditch designs.
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
Huang, Rong; Bilderback, Donald H.; Finkelstein, Kenneth
2014-01-01
Cornell energy-recovery linac (ERL) beamlines will have higher power density and higher fractional coherence than those available at third-generation sources; therefore the capability of a monochromator for ERL beamlines has to be studied. A cryogenic Si monochromator is considered in this paper because the perfect atomic structure of Si crystal is needed to deliver highly coherent radiation. Since neither the total heat load nor the power density alone can determine the severity of crystal deformation, a metric called modified linear power density is used to gauge the thermal deformation. For all ERL undulator beamlines, crystal thermal deformation profiles are simulated using the finite-element analysis tool ANSYS, and wavefront propagations are simulated using Synchrotron Radiation Workshop. It is concluded that cryogenic Si monochromators will be suitable for ERL beamlines in general. PMID:24562557
Computed myography: three-dimensional reconstruction of motor functions from surface EMG data
NASA Astrophysics Data System (ADS)
van den Doel, Kees; Ascher, Uri M.; Pai, Dinesh K.
2008-12-01
We describe a methodology called computed myography to qualitatively and quantitatively determine the activation level of individual muscles by voltage measurements from an array of voltage sensors on the skin surface. A finite element model for electrostatics simulation is constructed from morphometric data. For the inverse problem, we utilize a generalized Tikhonov regularization. This imposes smoothness on the reconstructed sources inside the muscles and suppresses sources outside the muscles using a penalty term. Results from experiments with simulated and human data are presented for activation reconstructions of three muscles in the upper arm (biceps brachii, brachialis and triceps). This approach potentially offers a new clinical tool to sensitively assess muscle function in patients suffering from neurological disorders (e.g., spinal cord injury), and could more accurately guide advances in the evaluation of specific rehabilitation training regimens.
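A generic Tikhonov-regularized least-squares solve of the kind used for such linear inverse problems is sketched below. The forward matrix, noise level, muscle mask, and the diagonal penalty that mimics suppression of sources outside the muscles are all illustrative assumptions, not the authors' finite element model or penalty.

```python
import numpy as np

def tikhonov_solve(A, b, L, alpha):
    """Solve min_x ||A x - b||^2 + alpha * ||L x||^2 via the normal equations.
    A: forward (lead-field) matrix, b: surface voltages, L: penalty operator."""
    lhs = A.T @ A + alpha * (L.T @ L)
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)

rng = np.random.default_rng(0)
n_sensors, n_sources = 40, 120
A = rng.normal(size=(n_sensors, n_sources))          # stand-in for the FEM lead field
x_true = np.zeros(n_sources); x_true[30:40] = 1.0    # one "active muscle"
b = A @ x_true + 0.05 * rng.normal(size=n_sensors)   # noisy skin-surface voltages

# Heavier penalty outside an assumed muscle mask suppresses extramuscular sources.
idx = np.arange(n_sources)
weights = np.where((idx >= 30) & (idx < 40), 1.0, 10.0)
L = np.diag(weights)
x_hat = tikhonov_solve(A, b, L, alpha=1e-2)
print("mean recovered activation inside the muscle:", x_hat[30:40].mean().round(3))
```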
NASA Astrophysics Data System (ADS)
Koepferl, Christine M.; Robitaille, Thomas P.
2017-11-01
When modeling astronomical objects throughout the universe, it is important to correctly treat the limitations of the data, for instance finite resolution and sensitivity. In order to simulate these effects, and to make radiative transfer models directly comparable to real observations, we have developed an open-source Python package called the FluxCompensator that enables the post-processing of the output of 3D Monte Carlo radiative transfer codes, such as Hyperion. With the FluxCompensator, realistic synthetic observations can be generated by modeling the effects of convolution with arbitrary point-spread functions, transmission curves, finite pixel resolution, noise, and reddening. Pipelines can be applied to compute synthetic observations that simulate observatories, such as the Spitzer Space Telescope or the Herschel Space Observatory. Additionally, this tool can read in existing observations (e.g., FITS format) and use the same settings for the synthetic observations. In this paper, we describe the package as well as present examples of such synthetic observations.
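The following sketch illustrates, with NumPy and SciPy only, the kind of post-processing pipeline the abstract describes (PSF convolution, rebinning to a detector pixel scale, and added noise). It is a conceptual illustration under assumed parameters, not the FluxCompensator API.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)

def synthetic_observation(model_image, psf_sigma_pix, rebin, noise_rms):
    """Convolve an ideal radiative-transfer image with a Gaussian PSF,
    rebin to a coarser pixel scale, and add Gaussian noise."""
    blurred = gaussian_filter(model_image, sigma=psf_sigma_pix)
    ny, nx = blurred.shape
    cropped = blurred[:ny - ny % rebin, :nx - nx % rebin]
    ny2, nx2 = cropped.shape
    binned = cropped.reshape(ny2 // rebin, rebin, nx2 // rebin, rebin).sum(axis=(1, 3))
    return binned + rng.normal(scale=noise_rms, size=binned.shape)

ideal = np.zeros((256, 256))
ideal[128, 128] = 1e3                     # toy point source from a model image
obs = synthetic_observation(ideal, psf_sigma_pix=3.0, rebin=4, noise_rms=0.5)
print(obs.shape, round(float(obs.max()), 2))
```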
A field-emission based vacuum device for the generation of THz waves
NASA Astrophysics Data System (ADS)
Lin, Ming-Chieh
2005-03-01
Terahertz waves have been used to characterize the electronic, vibrational and compositional properties of solid, liquid and gas phase materials during the past decade. A growing number of applications in imaging science and technology call for well-developed THz wave sources. Amplification and generation of high-frequency electromagnetic waves are of common interest for field-emission-based devices. In the present work, we propose a vacuum electronic device based on the field emission mechanism for the generation of THz waves. To verify the design, cold tests and hot tests were studied with the simulation tools SUPERFISH and MAGIC. In the hot tests, two types of electron emission mechanisms are considered: field emission and explosive emission. The preliminary design of the device was carried out and tested by numerical simulation. The simulation results show that an electronic efficiency of up to 4% can be achieved without employing any magnetic circuits.
NASA Astrophysics Data System (ADS)
Himr, D.
2013-04-01
This article describes the simulation of unsteady flow during water hammer with two programs that use different numerical approaches to solve the one-dimensional differential equations describing the dynamics of hydraulic elements and pipes. The first is Matlab-Simulink-SimHydraulics, commercial software developed to solve the dynamics of general hydraulic systems, which it defines with block elements. The other, called HYDRA, is based on the Lax-Wendroff numerical method, which serves as a tool to solve the momentum and continuity equations; this program was developed in Matlab by Brno University of Technology. Experimental measurements were performed on a simple test rig, which consists of an elastic pipe with strong damping connecting two reservoirs. Water hammer is induced by rapidly closing the valve. Physical properties of the liquid and pipe elasticity parameters were considered in both simulations, which are in very good agreement with each other, and the differences from the experimental data are minimal.
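For illustration, a two-step (Richtmyer) Lax-Wendroff update for the classical frictionless water-hammer equations is sketched below. The pipe data, boundary treatment (approximate characteristic conditions at the reservoir and valve), and neglect of friction are simplifying assumptions; this is not the HYDRA code.

```python
import numpy as np

# Frictionless water-hammer equations for head H and flow Q in a single pipe:
#   H_t + (a^2 / (g A)) Q_x = 0,   Q_t + g A H_x = 0
g, a = 9.81, 1200.0            # gravity [m/s^2], pressure wave speed [m/s]
L_pipe, D = 100.0, 0.05        # pipe length [m], diameter [m]
A = np.pi * D**2 / 4
N = 200
dx = L_pipe / N
dt = 0.9 * dx / a              # CFL-limited time step
H = np.full(N + 1, 30.0)       # initial head [m]
Q = np.full(N + 1, 0.002)      # initial flow [m^3/s]

def flux(H, Q):
    return (a**2 / (g * A)) * Q, g * A * H

H_valve_max = H[-1]
for _ in range(2000):
    fH, fQ = flux(H, Q)
    # half step at cell interfaces
    Hh = 0.5 * (H[:-1] + H[1:]) - dt / (2 * dx) * (fH[1:] - fH[:-1])
    Qh = 0.5 * (Q[:-1] + Q[1:]) - dt / (2 * dx) * (fQ[1:] - fQ[:-1])
    fHh, fQh = flux(Hh, Qh)
    # full step for interior nodes
    H[1:-1] -= dt / dx * (fHh[1:] - fHh[:-1])
    Q[1:-1] -= dt / dx * (fQh[1:] - fQh[:-1])
    # approximate characteristic boundary conditions
    Q[-1] = 0.0                                   # instantaneously closed valve
    H[-1] = H[-2] + a / (g * A) * Q[-2]           # C+ characteristic at the valve
    H[0] = 30.0                                   # upstream reservoir head
    Q[0] = Q[1] + g * A / a * (30.0 - H[1])       # C- characteristic at the reservoir
    H_valve_max = max(H_valve_max, H[-1])

print("peak surge above steady head at the valve [m]:", round(H_valve_max - 30.0, 1))
```

With these illustrative values the peak surge is close to the Joukowsky estimate a*V0/g, which is the expected behavior for an instantaneous valve closure.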
The Osseus platform: a prototype for advanced web-based distributed simulation
NASA Astrophysics Data System (ADS)
Franceschini, Derrick; Riecken, Mark
2016-05-01
Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.
VERSE - Virtual Equivalent Real-time Simulation
NASA Technical Reports Server (NTRS)
Zheng, Yang; Martin, Bryan J.; Villaume, Nathaniel
2005-01-01
Distributed real-time simulations provide important timing validation and hardware-in-the-loop results for the spacecraft flight software development cycle. Occasionally, the need for higher fidelity modeling and more comprehensive debugging capabilities - combined with a limited amount of computational resources - calls for a non real-time simulation environment that mimics the real-time environment. By creating a non real-time environment that accommodates simulations and flight software designed for a multi-CPU real-time system, we can save development time, cut mission costs, and reduce the likelihood of errors. This paper presents such a solution: Virtual Equivalent Real-time Simulation Environment (VERSE). VERSE turns the real-time operating system RTAI (Real-time Application Interface) into an event-driven simulator that runs in virtual real time. Designed to keep the original RTAI architecture as intact as possible, and therefore inheriting RTAI's many capabilities, VERSE was implemented with remarkably little change to the RTAI source code. This small footprint together with use of the same API allows users to easily run the same application in both real-time and virtual time environments. VERSE has been used to build a workstation testbed for NASA's Space Interferometry Mission (SIM PlanetQuest) instrument flight software. With its flexible simulation controls and inexpensive setup and replication costs, VERSE will become an invaluable tool in future mission development.
High resolution modelling of extreme precipitation events in urban areas
NASA Astrophysics Data System (ADS)
Siemerink, Martijn; Volp, Nicolette; Schuurmans, Wytze; Deckers, Dave
2015-04-01
Present-day society needs to adjust to the effects of climate change. More extreme weather conditions are expected, which can lead to longer periods of drought, but also to more extreme precipitation events. Urban water systems are not designed for such extreme events. Most sewer systems are not able to drain the excessive storm water, causing urban flooding and high economic damage. In order to take appropriate measures against extreme urban storms, detailed knowledge about the behaviour of the urban water system above and below the streets is required. To investigate the behaviour of urban water systems during extreme precipitation events, new assessment tools are necessary. These tools should provide a detailed and integral description of the flow in the full domain of overland runoff, sewer flow, surface water flow and groundwater flow. We developed a new assessment tool, called 3Di, which provides detailed insight into the urban water system. This tool is based on a new numerical methodology that can accurately deal with the interaction between overland runoff, sewer flow and surface water flow. A one-dimensional model for the sewer system and open channel flow is fully coupled to a two-dimensional depth-averaged model that simulates the overland flow. The tool uses a subgrid-based approach in order to take high resolution information of the sewer system and of the terrain into account [1, 2]. The combination of the high resolution information and the subgrid-based approach results in an accurate and efficient modelling tool. It is now possible to simulate entire urban water systems using extremely high resolution (0.5 m x 0.5 m) terrain data in combination with a detailed sewer and surface water network representation. The new tool has been tested in several Dutch cities, such as Rotterdam, Amsterdam and The Hague. We will present the results of an extreme precipitation event in the city of Schiedam (The Netherlands). This city deals with significant soil consolidation and the low-lying areas are prone to urban flooding. The simulation results are compared with measurements in the sewer network. References: [1] Stelling, G.S., 2012. Quadtree flood simulations with subgrid digital elevation models. Water Management 165 (WM1): 1329-1354. [2] Casulli, V. and Stelling, G.S., 2013. A semi-implicit numerical model for urban drainage systems. International Journal for Numerical Methods in Fluids 73: 600-614. DOI: 10.1002/fld.3817
Cislan-2 extension final document by University of Twente (Netherlands)
NASA Astrophysics Data System (ADS)
Niemegeers, Ignas; Baumann, Frank; Beuwer, Wim; Jordense, Marcel; Pras, Aiko; Schutte, Leon; Tracey, Ian
1992-01-01
Results of work performed under the so-called Cislan extension contract are presented. The adaptation of the Cislan 2 prototype design to an environment of interconnected Local Area Networks (LANs), instead of a single 802.5 token ring LAN, is considered. In order to extend the network architecture, the Interconnection Function (IF) protocol layer was subdivided into two protocol layers: a new IF layer and, below it, the Medium Enhancement (ME) protocol layer. Some small enhancements to the distributed bandwidth allocation protocol were developed, which in fact are also applicable to the 'normal' Cislan 2 system. The new services and protocols are described together with some scenarios and requirements for the new internetting Cislan 2 system. How to overcome the degradation of speech quality due to packet loss on the LAN subsystem was studied, and experiments were planned in order to measure this degradation. Simulations of two Cislan subsystems, the bandwidth allocation protocol and the clock synchronization mechanism, were performed on SUN workstations using QNAP as a simulation tool, and results of both simulations are given.
NASA Astrophysics Data System (ADS)
Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen
2014-08-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science-driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site-specific conditions, as well as additional scheduled and unscheduled downtime. It has a detailed model to simulate the external conditions with real weather history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera and dome, and serves as a prototype for an automatic scheduler for the real-time survey operations with LSST. The Simulator has been a critical tool since very early in the project, helping to validate the design parameters of the observatory against the science requirements and the goals of specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications using such a growing set of metrics is under development. A recent restructuring of the code allows us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator is being used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities and assist with performance margin investigations of the LSST system.
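The sort of post-run digest described here amounts to queries over the recorded observation table. The sketch below uses an in-memory SQLite database with a hypothetical table layout and toy rows to illustrate the idea; it is not the simulator's actual MySQL schema or the SSTAR code.

```python
import sqlite3  # in-memory stand-in for the MySQL store; schema is hypothetical

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE observations (
    mjd REAL, ra REAL, dec REAL, filter TEXT, seeing REAL, sky_brightness REAL);
INSERT INTO observations VALUES
    (59853.1, 10.2, -30.1, 'r', 0.85, 21.1),
    (59853.2, 10.2, -30.1, 'r', 1.10, 20.9),
    (59854.1, 55.0,  -5.0, 'g', 0.72, 21.8);
""")

# Example metric: visit count and mean seeing per filter, the kind of digest a
# reporting tool would compute from the observing history.
for flt, n, mean_seeing in conn.execute(
        "SELECT filter, COUNT(*), AVG(seeing) FROM observations GROUP BY filter"):
    print(f"filter {flt}: {n} visits, mean seeing {mean_seeing:.2f} arcsec")
conn.close()
```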
DOE Office of Scientific and Technical Information (OSTI.GOV)
Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo; Sterpin, Edmond
2016-04-15
Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus be used also for in vivo range verification.
Flight Testing an Iced Business Jet for Flight Simulation Model Validation
NASA Technical Reports Server (NTRS)
Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon
2007-01-01
A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain an adequate match, signifying that the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate flight characteristics of the simulation models. By and large, the pilots confirmed good similarities in the flight characteristics when compared to the real airplane. However, pilots noted pitch-up tendencies at stall with the flaps extended that were not representative of the airplane and identified some differences in pilot forces. The elevator hinge moment model and the implementation of the control forces on the ICEFTD were identified as a driver of the pitch-up and control force issues, and will be an area for future work.
NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware
NASA Astrophysics Data System (ADS)
Johnson, V. L.; Teuben, P. J.; Penprase, B. E.
An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
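A minimal direct-sum N-body integrator of the kind such a teaching testbed exposes is sketched below, using a kick-drift-kick leapfrog step with Plummer softening. Particle counts, masses, softening, and time step are illustrative assumptions, not NBodyLab's defaults or the MD-GRAPE2 code path.

```python
import numpy as np

def accelerations(pos, mass, eps=1e-2):
    """Direct-sum gravitational accelerations with Plummer softening (G = 1)."""
    d = pos[None, :, :] - pos[:, None, :]            # pairwise displacement vectors
    r2 = (d ** 2).sum(-1) + eps ** 2
    np.fill_diagonal(r2, np.inf)                     # no self-force
    return (d * (mass[None, :, None] / r2[..., None] ** 1.5)).sum(axis=1)

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog integration with a fixed time step."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

rng = np.random.default_rng(3)
n = 100
pos = rng.normal(size=(n, 3))
vel = rng.normal(scale=0.1, size=(n, 3))
vel -= vel.mean(axis=0)                              # zero net momentum
mass = np.full(n, 1.0 / n)
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=500)
print("center-of-mass drift:", float(np.abs((mass[:, None] * pos).sum(0)).max()))
```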
Saletti, Dominique
2017-01-01
Rapid progress in ultra-high-speed imaging has allowed material properties to be studied at high strain rates by applying full-field measurements and inverse identification methods. Nevertheless, the sensitivity of these techniques still requires a better understanding, since various extrinsic factors present during an actual experiment make it difficult to separate different sources of errors that can significantly affect the quality of the identified results. This study presents a methodology using simulated experiments to investigate the accuracy of the so-called spalling technique (used to study tensile properties of concrete subjected to high strain rates) by numerically simulating the entire identification process. The experimental technique uses the virtual fields method and the grid method. The methodology consists of reproducing the recording process of an ultra-high-speed camera by generating sequences of synthetically deformed images of a sample surface, which are then analysed using the standard tools. The investigation of the uncertainty of the identified parameters, such as Young's modulus along with the stress–strain constitutive response, is addressed by introducing the most significant user-dependent parameters (i.e. acquisition speed, camera dynamic range, grid sampling, blurring), proving that the used technique can be an effective tool for error investigation. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956505
The Progress of Research Project for Magnetized Target Fusion in China
NASA Astrophysics Data System (ADS)
Yang, Xian-Jun
2015-11-01
The fusion of magnetized plasma, called Magnetized Target Fusion (MTF), has recently become a hot research area; it may significantly reduce the cost and size of fusion facilities. Great progress has been achieved in past decades around the world. Five years ago, China initiated an MTF project and has made the following progress: 1. verifying the feasibility of MTF ignition by means of first-principles and MHD simulation; 2. generating magnetic fields over 1400 Tesla, which can suppress heat conduction from charged particles, deposit the energy of alpha particles to promote the ignition process, and produce stable magnetized plasma as the ignition target; 3. the FP-1 imploding facility can deliver several megajoules of energy to a solid liner of about ten grams within a microsecond rise time, and a simulation tool has been developed for design and analysis of the process; 4. an FRC target can be generated by the "YG 1" facility, and several simulation tools have been developed. In the next five years, the above theoretical work and MTF experiments may be integrated into a national project, in which our team is expected to play a leading role and to achieve further progress in China. Supported by the National Natural Science Foundation of China under Grant No 11175028.
Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade
2014-01-01
This article presents a way to obtain dose estimates in patients submitted to radiotherapy, based on the analysis of regions of interest in nuclear medicine images. A software tool called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs and to generate graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled to the EGSnrc Monte Carlo code. The software was developed with the Microsoft Visual Studio 2010 Service Pack and the Windows Presentation Foundation project template for the C# programming language. With these tools, the authors obtained the file for optimization of the Monte Carlo simulations using EGSnrc; the organization and compaction of dosimetry results for all radioactive sources; the selection of regions of interest; the evaluation of grayscale intensity in the regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity.
Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.
2014-01-01
Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used Spearman's rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We used previously collected survey data from community health centers in the Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
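The statistical side of the dual-modeling approach reduces to a rank correlation between the proximal and distal measures, as in the short sketch below; the facility-level values are illustrative only, not the HDCC survey data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical facility-level data: CDS utilization score (proximal measure) vs.
# self-reported screening improvement (distal measure); values are illustrative.
cds_utilization = np.array([0.2, 0.4, 0.5, 0.6, 0.7, 0.8, 0.85, 0.9])
screening_improvement = np.array([1, 2, 2, 3, 3, 4, 4, 5])

rho, p_value = spearmanr(cds_utilization, screening_improvement)
print(f"Spearman's rho = {rho:.2f} (p = {p_value:.3f})")
```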
Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES
NASA Astrophysics Data System (ADS)
Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu
2016-11-01
Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper ocean pycnocline, continental slope, and large-scale eddies. Capturing the wide range of length and time scales involved during the life cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation at idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of the remote mixing phenomena has been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time stepping process.
Jørgensen, Katarina M; Haddow, Pauline C
2011-08-01
Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has little or no experience with such tools. This educational gap limits both the potential use of such tools and the potential for tighter cooperation between their designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.
The Precision Formation Flying Integrated Analysis Tool (PFFIAT)
NASA Technical Reports Server (NTRS)
Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor
2004-01-01
Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.
Managing biological networks by using text mining and computer-aided curation
NASA Astrophysics Data System (ADS)
Yu, Seok Jong; Cho, Yongseong; Lee, Min-Ho; Lim, Jongtae; Yoo, Jaesoo
2015-11-01
In order to understand a biological mechanism in a cell, a researcher must collect a huge number of protein interactions, with supporting experimental data, from experiments and the literature. Text mining systems that extract biological interactions from papers have been used to construct biological networks for a few decades. Even though text mining of the literature is necessary to construct a biological network, few systems with a text mining tool are available for biologists who want to construct their own biological networks. We have developed a biological network construction system called BioKnowledge Viewer that can generate a biological interaction network by using a text mining tool and biological taggers. It also includes Boolean simulation software, providing a biological modeling system to simulate the model built with the text mining tool. A user can download PubMed articles and construct a biological network by using the Multi-level Knowledge Emergence Model (KMEM), MetaMap, and A Biomedical Named Entity Recognizer (ABNER) as text mining tools. To evaluate the system, we constructed an aging-related biological network that consists of 9,415 nodes (genes) using manual curation. With network analysis, we found that several genes, including JNK, AP-1, and BCL-2, were highly related in the aging biological network. We provide a semi-automatic curation environment so that users can obtain a graph database for managing text mining results that are generated in the server system and can navigate the network with BioKnowledge Viewer, which is freely available at http://bioknowledgeviewer.kisti.re.kr.
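A synchronous Boolean-network update of the kind a Boolean simulation module performs is sketched below; the genes and rules are illustrative placeholders, not the curated aging network or BioKnowledge Viewer's internal representation.

```python
# Minimal synchronous Boolean network: each gene's next state is a Boolean
# function of the current state.  Rules below are hypothetical.
rules = {
    "JNK":    lambda s: s["stress"],
    "AP-1":   lambda s: s["JNK"],
    "BCL-2":  lambda s: not s["AP-1"],
    "stress": lambda s: s["stress"],          # treated as a fixed external input
}

def step(state):
    """Apply every rule simultaneously to produce the next network state."""
    return {gene: bool(rule(state)) for gene, rule in rules.items()}

state = {"stress": True, "JNK": False, "AP-1": False, "BCL-2": True}
for t in range(4):
    print(t, state)
    state = step(state)
```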
Cannon, Robert C; Gleeson, Padraig; Crook, Sharon; Ganapathy, Gautham; Marin, Boris; Piasini, Eugenio; Silver, R Angus
2014-01-01
Computational models are increasingly important for studying complex neurophysiological systems. As scientific tools, it is essential that such models can be reproduced and critically evaluated by a range of scientists. However, published models are currently implemented using a diverse set of modeling approaches, simulation tools, and computer languages making them inaccessible and difficult to reproduce. Models also typically contain concepts that are tightly linked to domain-specific simulators, or depend on knowledge that is described exclusively in text-based documentation. To address these issues we have developed a compact, hierarchical, XML-based language called LEMS (Low Entropy Model Specification), that can define the structure and dynamics of a wide range of biological models in a fully machine readable format. We describe how LEMS underpins the latest version of NeuroML and show that this framework can define models of ion channels, synapses, neurons and networks. Unit handling, often a source of error when reusing models, is built into the core of the language by specifying physical quantities in models in terms of the base dimensions. We show how LEMS, together with the open source Java and Python based libraries we have developed, facilitates the generation of scripts for multiple neuronal simulators and provides a route for simulator free code generation. We establish that LEMS can be used to define models from systems biology and map them to neuroscience-domain specific simulators, enabling models to be shared between these traditionally separate disciplines. LEMS and NeuroML 2 provide a new, comprehensive framework for defining computational models of neuronal and other biological systems in a machine readable format, making them more reproducible and increasing the transparency and accessibility of their underlying structure and properties.
Toward a quantitative account of pitch distribution in spontaneous narrative: Method and validation
Matteson, Samuel E.; Streit Olness, Gloria; Caplow, Nancy J.
2013-01-01
Pitch is well-known both to animate human discourse and to convey meaning in communication. The study of the statistical population distributions of pitch in discourse will undoubtedly benefit from methodological improvements. The current investigation examines a method that parameterizes pitch in discourse as musical pitch interval H measured in units of cents and that disaggregates the sequence of peak word-pitches using tools employed in time-series analysis and digital signal processing. The investigators test the proposed methodology by its application to distributions in pitch interval of the peak word-pitch (collectively called the discourse gamut) that occur in simulated and actual spontaneous emotive narratives obtained from 17 middle-aged African-American adults. The analysis, in rigorous tests, not only faithfully reproduced simulated distributions embedded in realistic time series that drift and include pitch breaks, but also revealed that the empirical distributions exhibit a common hidden structure when normalized to a slowly varying mode (called the gamut root) of their respective probability density functions. Quantitative differences between narratives reveal the speakers' relative propensity for the use of pitch levels corresponding to elevated degrees of a discourse gamut (the “e-la”) superimposed upon a continuum that conforms systematically to an asymmetric Laplace distribution. PMID:23654400
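The pitch-interval parameterization in cents uses the standard conversion H = 1200·log2(f/f_ref). The sketch below applies it to a few hypothetical peak word-pitches against an assumed fixed reference; in the method described, the reference (the gamut root) is instead estimated as a slowly varying mode of the distribution.

```python
import numpy as np

def cents(f_hz, f_ref_hz):
    """Musical pitch interval in cents: H = 1200 * log2(f / f_ref)."""
    return 1200.0 * np.log2(np.asarray(f_hz, dtype=float) / f_ref_hz)

# Hypothetical peak word-pitches (Hz) from one narrative, referenced to an
# assumed root of 110 Hz for illustration.
peaks = [112.0, 124.0, 131.0, 147.0, 118.0, 165.0]
print(np.round(cents(peaks, 110.0), 1))
```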
Geant4 Computing Performance Benchmarking and Monitoring
Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...
2015-12-23
Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, includes FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded applications is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.
Simulation of diurnal thermal energy storage systems: Preliminary results
NASA Astrophysics Data System (ADS)
Katipamula, S.; Somasundaram, S.; Williams, H. R.
1994-12-01
This report describes the results of a simulation of thermal energy storage (TES) integrated with a simple-cycle gas turbine cogeneration system. Integrating TES with cogeneration can serve the electrical and thermal loads independently while firing all fuel in the gas turbine. The detailed engineering and economic feasibility of diurnal TES systems integrated with cogeneration systems has been described in two previous PNL reports. The objective of this study was to lay the groundwork for optimization of the TES system designs using a simulation tool called TRNSYS (TRaNsient SYstem Simulation). TRNSYS is a transient simulation program with a sequential-modular structure developed at the Solar Energy Laboratory, University of Wisconsin-Madison. The two TES systems selected for the base-case simulations were: (1) a one-tank storage model to represent the oil/rock TES system; and (2) a two-tank storage model to represent the molten nitrate salt TES system. Results of the study clearly indicate that an engineering optimization of the TES system using TRNSYS is possible. The one-tank stratified oil/rock storage model described here is a good starting point for parametric studies of a TES system. Further developments to the TRNSYS library of available models (economizer, evaporator, gas turbine, etc.) are recommended so that phase-change processes are accurately treated.
Serçinoglu, Onur; Ozbek, Pemra
2018-05-25
Atomistic molecular dynamics (MD) simulations generate a wealth of information related to the dynamics of proteins. If properly analyzed, this information can lead to new insights regarding protein function and assist wet-lab experiments. Aiming to identify interactions between individual amino acid residues and the role played by each in the context of MD simulations, we present a stand-alone software tool called gRINN (get Residue Interaction eNergies and Networks). gRINN features graphical user interfaces (GUIs) and a command-line interface for generating and analyzing pairwise residue interaction energies and energy correlations from protein MD simulation trajectories. gRINN utilizes the features of NAMD or GROMACS MD simulation packages and automatizes the steps necessary to extract residue-residue interaction energies from user-supplied simulation trajectories, greatly simplifying the analysis for the end-user. A GUI, including an embedded molecular viewer, is provided for visualization of interaction energy time-series, distributions, an interaction energy matrix, interaction energy correlations and a residue correlation matrix. gRINN additionally offers construction and analysis of Protein Energy Networks, providing residue-based metrics such as degrees, betweenness centralities and closeness centralities, as well as shortest path analysis. gRINN is free and open to all users without login requirement at http://grinn.readthedocs.io.
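The network metrics named above (degree, betweenness, closeness, shortest paths) can be computed from a pairwise interaction-energy matrix with NetworkX, as in the generic sketch below. The random matrix and the energy cutoff are illustrative assumptions; this is not gRINN's file format or its default threshold.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
n_res = 20
energies = rng.normal(0.0, 2.0, size=(n_res, n_res))   # stand-in for mean pairwise energies
energies = (energies + energies.T) / 2                  # symmetrize
np.fill_diagonal(energies, 0.0)

# Keep residue pairs whose mean interaction energy is favorable beyond a cutoff.
G = nx.Graph()
G.add_nodes_from(range(n_res))
cutoff = -2.0
for i in range(n_res):
    for j in range(i + 1, n_res):
        if energies[i, j] < cutoff:
            G.add_edge(i, j, weight=abs(energies[i, j]))

degrees = dict(G.degree())
betweenness = nx.betweenness_centrality(G, weight="weight")
closeness = nx.closeness_centrality(G)
print("highest-betweenness residue:", max(betweenness, key=betweenness.get))
if nx.has_path(G, 0, n_res - 1):
    print("shortest path 0 -> last:", nx.shortest_path(G, 0, n_res - 1, weight="weight"))
```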
Sapak, Z; Salam, M U; Minchinton, E J; MacManus, G P V; Joyce, D C; Galea, V J
2017-09-01
A weather-based simulation model, called Powdery Mildew of Cucurbits Simulation (POMICS), was constructed to predict fungicide application scheduling to manage powdery mildew of cucurbits. The model was developed on the principle that conditions favorable for Podosphaera xanthii, a causal pathogen of this crop disease, generate a number of infection cycles in a single growing season. The model consists of two components that (i) simulate the disease progression of P. xanthii in secondary infection cycles under natural conditions and (ii) predict the disease severity with application of fungicides at any recurrent disease cycle. The underlying environmental factors associated with P. xanthii infection were quantified from laboratory and field studies, and also gathered from the literature. The performance of the POMICS model when validated with two datasets of uncontrolled natural infection was good (the mean differences between simulated and observed disease severity on a scale of 0 to 5 were 0.02 and 0.05). In simulations, POMICS was able to predict high- and low-risk disease alerts. Furthermore, the predicted disease severity was responsive to the number of fungicide applications. Such responsiveness indicates that the model has the potential to be used as a tool to guide the scheduling of judicious fungicide applications.
Mechanical discrete simulator of the electro-mechanical lift with n:1 roping
NASA Astrophysics Data System (ADS)
Alonso, F. J.; Herrera, I.
2016-05-01
The design process of new products in lift engineering is a difficult task due mainly to the complexity and slenderness of the lift system, demanding a predictive tool for the lift mechanics. A mechanical ad-hoc discrete simulator, as an alternative to ‘general purpose’ mechanical simulators, is proposed. Firstly, the synthesis and experimentation process that led to a suitable model capable of accurately simulating the response of the electromechanical lift is discussed. Then, the equations of motion are derived. The model comprises a discrete system of 5 vertically displaceable masses (car, counterweight, car frame, passengers/load and lift drive), an inertial mass of the tension pulley-rotor shaft assembly which can rotate about the machine axis, and 6 mechanical connectors with a 1:1 suspension layout. The model is extended to any n:1 roping lift by setting 6 equivalent mechanical components (suspension systems for car and counterweight, lift drive silent blocks, tension pulley-lift drive stator and passengers/load equivalent spring-damper) by inductive inference from the 1:1 and generalized 2:1 roping systems. The application to simulate real elevator systems is proposed by numeric time integration of the governing equations using the Kutta-Meden algorithm, implemented in a computer program for ad-hoc elevator simulation called ElevaCAD.
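To make the "discrete masses plus spring-damper connectors, integrated in time" idea concrete, here is a minimal Python sketch of a two-mass car/counterweight analog integrated with classical RK4. All parameter values are invented and the model is far simpler than the 5-mass ElevaCAD formulation described above; it only illustrates the numerical approach.

```python
# Toy two-mass lift analog (car + counterweight coupled by a spring-damper),
# integrated with classical RK4. Illustrative values only; not ElevaCAD.
import numpy as np

m_car, m_cwt = 600.0, 450.0   # masses [kg] (assumed)
k, c = 5.0e4, 2.0e3           # rope stiffness [N/m] and damping [N s/m] (assumed)
g = 9.81

def rhs(t, y):
    x1, v1, x2, v2 = y                      # car and counterweight states
    f = k * (x2 - x1) + c * (v2 - v1)       # coupling force through the suspension
    return np.array([v1, (f - m_car * g) / m_car,
                     v2, (-f - m_cwt * g) / m_cwt])

def rk4_step(t, y, h):
    k1 = rhs(t, y)
    k2 = rhs(t + h / 2, y + h / 2 * k1)
    k3 = rhs(t + h / 2, y + h / 2 * k2)
    k4 = rhs(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

y, h = np.zeros(4), 1e-3
for step in range(5000):                    # 5 s of simulated time
    y = rk4_step(step * h, y, h)
print("car position after 5 s:", y[0])
```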
Communicating Value in Simulation: Cost Benefit Analysis and Return on Investment.
Asche, Carl V; Kim, Minchul; Brown, Alisha; Golden, Antoinette; Laack, Torrey A; Rosario, Javier; Strother, Christopher; Totten, Vicken Y; Okuda, Yasuharu
2017-10-26
Value-based health care requires a balancing of medical outcomes with economic value. Administrators need to understand both the clinical and economic effects of potentially expensive simulation programs to rationalize the costs. Given the often-disparate priorities of clinical educators relative to health care administrators, justifying the value of simulation requires the use of economic analyses few physicians have been trained to conduct. Clinical educators need to be able to present thorough economic analyses demonstrating returns on investment and cost effectiveness to effectively communicate with administrators. At the 2017 Academic Emergency Medicine Consensus Conference "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes", our breakout session critically evaluated the cost benefit and return on investment of simulation. In this paper we provide an overview of some of the economic tools that a clinician may use to present the value of simulation training to financial officers and other administrators in the economic terms they understand. We also define three themes as a call to action for research related to cost benefit analysis in simulation as well as four specific research questions that will help guide educators and hospital leadership to make decisions on the value of simulation for their system or program.
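As a simple illustration of the kind of figures such an analysis produces, the sketch below computes a return on investment and a benefit-cost ratio for a simulation program using entirely hypothetical costs and avoided-event counts; it is not drawn from the consensus conference paper.

```python
# Hypothetical ROI / benefit-cost calculation for a simulation training program.
program_cost = 250_000.0      # annual cost of the simulation program (assumed)
avoided_adverse_events = 12   # adverse events prevented per year (assumed)
cost_per_event = 30_000.0     # average cost of one adverse event (assumed)

benefit = avoided_adverse_events * cost_per_event
roi = (benefit - program_cost) / program_cost
bcr = benefit / program_cost
print(f"benefit ${benefit:,.0f}, ROI {roi:.1%}, benefit-cost ratio {bcr:.2f}")
```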
NASA Astrophysics Data System (ADS)
Hadder, Eric Michael
There are many computer aided engineering tools and software packages used by aerospace engineers to design and predict specific parameters of an airplane. These tools help a design engineer predict and calculate parameters such as lift, drag, pitching moment, takeoff range, maximum takeoff weight, and maximum flight range. However, there are very limited ways to predict and calculate the minimum control speeds of an airplane in engine-inoperative flight. There are simple solutions, as well as complicated solutions, yet there is neither a standard technique nor consistency throughout the aerospace industry. To further complicate this subject, airplane designers have the option of using an Automatic Thrust Control System (ATCS), which directly alters the minimum control speeds of an airplane. This work addresses this issue with a tool used to predict and calculate the Minimum Control Speed on the Ground (VMCG) as well as the Minimum Control Airspeed (VMCA) of any existing or design-stage airplane. With simple line art of an airplane, a program called VORLAX is used to generate an aerodynamic database used to calculate the stability derivatives of the airplane. Using another program called Numerical Propulsion System Simulation (NPSS), a propulsion database is generated to use with the aerodynamic database to calculate both VMCG and VMCA. The tool was tested using two airplanes, the Airbus A320 and the Lockheed Martin C130J-30 Super Hercules. The A320 does not use an Automatic Thrust Control System (ATCS), whereas the C130J-30 does. The tool was able to properly calculate and match known values of VMCG and VMCA for both airplanes. Because the tool reproduced the known values for both airplanes, it can also be used to predict the VMCG and VMCA of an airplane in the preliminary stages of design. This would allow design engineers to include an Automatic Thrust Control System (ATCS) as part of the design of an airplane and still be able to predict the VMCG and VMCA of the airplane.
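For orientation only, the sketch below shows a textbook-style static yawing-moment balance that is sometimes used as a first-cut VMCA estimate: maximum rudder yawing moment balancing the yawing moment of the asymmetric thrust. Every number is an assumption, and the tool described above works from full VORLAX/NPSS databases rather than this single-equation estimate.

```python
# First-cut VMCA estimate from a static yawing-moment balance (illustrative only).
from math import sqrt

rho = 1.225           # air density at sea level [kg/m^3]
S, b = 122.0, 34.0    # wing area [m^2] and span [m] (assumed)
Cn_dr = -0.0012       # yaw moment coefficient per degree of rudder (assumed)
dr_max = -25.0        # maximum rudder deflection [deg] (assumed)
T_oei = 100_000.0     # thrust of the remaining engine [N] (assumed)
y_eng = 5.8           # lateral moment arm of that engine [m] (assumed)

# Balance: 0.5*rho*V^2*S*b*|Cn_dr*dr_max| = T_oei*y_eng  ->  solve for V
V_mca = sqrt(T_oei * y_eng / (0.5 * rho * S * b * abs(Cn_dr * dr_max)))
print(f"first-cut VMCA estimate: {V_mca:.1f} m/s")
```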
Challenges of NDE simulation tool validation, optimization, and utilization for composites
NASA Astrophysics Data System (ADS)
Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter
2016-02-01
Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.
Earth as a Tool for Astrobiology—A European Perspective
NASA Astrophysics Data System (ADS)
Martins, Zita; Cottin, Hervé; Kotler, Julia Michelle; Carrasco, Nathalie; Cockell, Charles S.; de la Torre Noetzel, Rosa; Demets, René; de Vera, Jean-Pierre; d'Hendecourt, Louis; Ehrenfreund, Pascale; Elsaesser, Andreas; Foing, Bernard; Onofri, Silvano; Quinn, Richard; Rabbow, Elke; Rettberg, Petra; Ricco, Antonio J.; Slenzka, Klaus; Stalport, Fabien; ten Kate, Inge L.; van Loon, Jack J. W. A.; Westall, Frances
2017-07-01
Scientists use the Earth as a tool for astrobiology by analyzing planetary field analogues (i.e. terrestrial samples and field sites that resemble planetary bodies in our Solar System). In addition, they expose the selected planetary field analogues in simulation chambers to conditions that mimic the ones of planets, moons and Low Earth Orbit (LEO) space conditions, as well as the chemistry occurring in interstellar and cometary ices. This paper reviews the ways the Earth is used by astrobiologists: (i) by conducting planetary field analogue studies to investigate extant life from extreme environments, its metabolisms, adaptation strategies and modern biosignatures; (ii) by conducting planetary field analogue studies to investigate extinct life from the oldest rocks on our planet and its biosignatures; (iii) by exposing terrestrial samples to simulated space or planetary environments and producing a sample analogue to investigate changes in minerals, biosignatures and microorganisms. The European Space Agency (ESA) created a topical team in 2011 to investigate recent activities using the Earth as a tool for astrobiology and to formulate recommendations and scientific needs to improve ground-based astrobiological research. Space is an important tool for astrobiology (see Horneck et al. in Astrobiology, 16:201-243, 2016; Cottin et al., 2017), but access to space is limited. Complementing research on Earth provides fast access, more replications and higher sample throughput. The major conclusions of the topical team and suggestions for the future include more scientifically qualified calls for field campaigns with planetary analogy, and a centralized point of contact at ESA or the EU for the organization of a survey of such expeditions. An improvement of the coordinated logistics, infrastructures and funding system supporting the combination of field work with planetary simulation investigations, as well as an optimization of the scientific return and data processing, data storage and data distribution is also needed. Finally, a coordinated EU or ESA education and outreach program would improve the participation of the public in the astrobiological activities.
Create and Publish a Hierarchical Progressive Survey (HiPS)
NASA Astrophysics Data System (ADS)
Fernique, P.; Boch, T.; Pineau, F.; Oberto, A.
2014-05-01
Since 2009, the CDS has promoted a visualization method based on the HEALPix sky tessellation. This method, called “Hierarchical Progressive Survey" or HiPS, allows one to display a survey progressively. It is particularly suited for all-sky surveys or deep fields. This visualization method is now integrated into several applications, notably Aladin, the SiTools/MIZAR CNES framework, and the recent HTML5 “Aladin Lite". Also, more than one hundred surveys are already available in this view mode. In this article, we present the progress concerning this method and its recent adaptation to astronomical catalogs such as the GAIA simulation.
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
Medical Operations Console Procedure Evaluation: BME Response to Crew Call Down for an Emergency
NASA Technical Reports Server (NTRS)
Johnson-Troop; Pettys, Marianne; Hurst, Victor, IV; Smaka, Todd; Paul, Bonnie; Rosenquist, Kevin; Gast, Karin; Gillis, David; McCulley, Phyllis
2006-01-01
International Space Station (ISS) Mission Operations are managed by multiple flight control disciplines located at the lead Mission Control Center (MCC) at NASA-Johnson Space Center (JSC). ISS Medical Operations are supported by the complementary roles of Flight Surgeons (Surgeon) and Biomedical Engineer (BME) flight controllers. The Surgeon, a board certified physician, oversees all medical concerns of the crew, and the BME provides operational and engineering support for the Medical Operations Crew Health Care System. ISS Medical Operations is currently addressing the coordinated response to a crew call down for an emergent medical event, in particular when the BME is the only Medical Operations representative in MCC. In this case, the console procedure BME Response to Crew Call Down for an Emergency will be used. The procedure instructs the BME to contact a Surgeon as soon as possible, coordinate with other flight disciplines to establish a Private Medical Conference (PMC) for the crew and Surgeon, gather information from the crew if time permits, and provide the Surgeon with pertinent console resources. It is paramount that this procedure is clearly written and easily navigated to assist the BME to respond consistently and efficiently. A total of five BME flight controllers participated in the study. Each BME participant sat in a simulated MCC environment at a console configured with resources specific to the BME MCC console and was presented with two scripted emergency call downs from an ISS crew member. Each participant used the procedure while interacting with analog MCC disciplines to respond to the crew call down. Audio and video recordings of the simulations were analyzed and each BME participant's actions were compared to the procedure. Structured debriefs were conducted at the conclusion of both simulations. The procedure was evaluated for its ability to elicit consistent responses from each BME participant. Trials were examined for deviations in procedure task completion and/or navigation, in particular the execution of the Surgeon call sequence. Debrief comments were used to analyze unclear procedural steps and to discern any discrepancies between the procedure and generally accepted BME actions. The sequence followed by BME participants differed considerably from the sequence intended by the procedure. Common deviations included the call sequence used to contact the Surgeon, the content of BME and crew interaction, and the gathering of pertinent console resources. Differing perceptions of task priority and imprecise language seem to have caused multiple deviations from the procedure's intended sequence. The study generated 40 recommendations for the procedure, of which 34 are being implemented. These recommendations address improving the clarity of the instructions, identifying training considerations, expediting Surgeon contact, improving cues for anticipated flight control team communication and identifying missing console tools.
gPKPDSim: a SimBiology®-based GUI application for PKPD modeling in drug development.
Hosseini, Iraj; Gajjala, Anita; Bumbaca Yadav, Daniela; Sukumaran, Siddharth; Ramanujan, Saroja; Paxson, Ricardo; Gadkar, Kapil
2018-04-01
Modeling and simulation (M&S) is increasingly used in drug development to characterize pharmacokinetic-pharmacodynamic (PKPD) relationships and support various efforts such as target feasibility assessment, molecule selection, human PK projection, and preclinical and clinical dose and schedule determination. While model development typically requires mathematical modeling expertise, model exploration and simulations could in many cases be performed by scientists in various disciplines to support the design, analysis and interpretation of experimental studies. To this end, we have developed a versatile graphical user interface (GUI) application to enable easy use of any model constructed in SimBiology® to execute various common PKPD analyses. The MATLAB®-based GUI application, called gPKPDSim, has a single-screen interface and provides functionalities including simulation, data fitting (parameter estimation), population simulation (exploring the impact of parameter variability on the outputs of interest), and non-compartmental PK analysis. Further, gPKPDSim is a user-friendly tool with capabilities including interactive visualization, exporting of results and generation of presentation-ready figures. gPKPDSim was designed primarily for use in preclinical and translational drug development, although broader applications exist. gPKPDSim is a MATLAB®-based open-source application and is publicly available to download from MATLAB® Central™. We illustrate the use and features of gPKPDSim using multiple PKPD models to demonstrate the wide applications of this tool in pharmaceutical sciences. Overall, gPKPDSim provides an integrated, multi-purpose, user-friendly GUI application to enable efficient use of PKPD models by scientists from various disciplines, regardless of their modeling expertise.
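To give a flavor of the underlying analyses such a GUI wraps, the sketch below simulates a standard one-compartment oral-dosing PK model and a tiny "population simulation" with variability on the elimination rate. Parameters are illustrative; the code does not use SimBiology or gPKPDSim.

```python
# Toy PK simulation and population simulation (illustrative only).
import numpy as np

def one_compartment(t, dose=100.0, ka=1.0, ke=0.1, V=10.0):
    """Concentration-time profile for an oral dose (closed-form 1-compartment model)."""
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0, 48, 97)                      # hours
base = one_compartment(t)
print("Cmax (typical subject): %.2f" % base.max())

# population simulation: log-normal variability on the elimination rate (assumed CV)
rng = np.random.default_rng(1)
ke_pop = 0.1 * np.exp(rng.normal(0.0, 0.3, size=200))
cmax = [one_compartment(t, ke=ke).max() for ke in ke_pop]
print("population Cmax: median %.2f, 5th-95th pct %.2f-%.2f"
      % (np.median(cmax), np.percentile(cmax, 5), np.percentile(cmax, 95)))
```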
Methods for improving simulations of biological systems: systemic computation and fractal proteins
Bentley, Peter J.
2009-01-01
Modelling and simulation are becoming essential for new fields such as synthetic biology. Perhaps the most important aspect of modelling is to follow a clear design methodology that will help to highlight unwanted deficiencies. The use of tools designed to aid the modelling process can be of benefit in many situations. In this paper, the modelling approach called systemic computation (SC) is introduced. SC is an interaction-based language, which enables individual-based expression and modelling of biological systems, and the interactions between them. SC permits a precise description of a hypothetical mechanism to be written using an intuitive graph-based or a calculus-based notation. The same description can then be directly run as a simulation, merging the hypothetical mechanism and the simulation into the same entity. However, even when using well-designed modelling tools to produce good models, the best model is not always the most accurate one. Frequently, computational constraints or lack of data make it infeasible to model an aspect of biology. Simplification may provide one way forward, but with inevitable consequences of decreased accuracy. Instead of attempting to replace an element with a simpler approximation, it is sometimes possible to substitute the element with a different but functionally similar component. In the second part of this paper, this modelling approach is described and its advantages are summarized using an exemplar: the fractal protein model. Finally, the paper ends with a discussion of good biological modelling practice by presenting lessons learned from the use of SC and the fractal protein model. PMID:19324681
TADSim: Discrete Event-based Performance Prediction for Temperature Accelerated Dynamics
Mniszewski, Susan M.; Junghans, Christoph; Voter, Arthur F.; ...
2015-04-16
Next-generation high-performance computing will require more scalable and flexible performance prediction tools to evaluate software--hardware co-design choices relevant to scientific applications and hardware architectures. Here, we present a new class of tools called application simulators: parameterized fast-running proxies of large-scale scientific applications using parallel discrete event simulation. Parameterized choices for the algorithmic method and hardware options provide a rich space for design exploration and allow us to quickly find well-performing software--hardware combinations. We demonstrate our approach with a TADSim simulator that models the temperature-accelerated dynamics (TAD) method, an algorithmically complex and parameter-rich member of the accelerated molecular dynamics (AMD) family of molecular dynamics methods. The essence of the TAD application is captured without the computational expense and resource usage of the full code. We accomplish this by identifying the time-intensive elements, quantifying algorithm steps in terms of those elements, abstracting them out, and replacing them by the passage of time. We use TADSim to quickly characterize the runtime performance and algorithmic behavior for the otherwise long-running simulation code. We extend TADSim to model algorithm extensions, such as speculative spawning of the compute-bound stages, and predict performance improvements without having to implement such a method. Validation against the actual TAD code shows close agreement for the evolution of an example physical system, a silver surface. Finally, focused parameter scans have allowed us to study algorithm parameter choices over far more scenarios than would be possible with the actual simulation. This has led to interesting performance-related insights and suggested extensions.
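The core idea of "replacing compute-bound stages by the passage of time" can be sketched in a few lines of Python. The stage names, durations and speedup factor below are invented; this is a conceptual illustration, not TADSim itself.

```python
# Toy "application simulator": advance a virtual clock through the stages of a
# TAD-like workflow instead of executing them. All stage costs are made up.

# (stage name, duration estimate in seconds); in a real application simulator
# these would come from measured timings of the time-intensive code elements.
STAGES = [("high-temperature MD block", 120.0),
          ("transition detection", 15.0),
          ("saddle-point search", 40.0),
          ("extrapolate event time to low temperature", 1.0)]

def simulate(n_cycles, speculative_speedup=1.0):
    """Advance a virtual clock through n cycles; optionally model an algorithm
    extension that overlaps the compute-bound stage (assumed speedup factor)."""
    clock = 0.0
    for _ in range(n_cycles):
        for name, dur in STAGES:
            if "MD block" in name:
                dur /= speculative_speedup
            clock += dur
    return clock

print("baseline 10 cycles: %.0f s" % simulate(10))
print("with speculative spawning (assumed 1.5x overlap): %.0f s" % simulate(10, 1.5))
```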
NASA Technical Reports Server (NTRS)
Vigil, Jeannine S.
1997-01-01
The small satellite community has been interested in accessing fixed ground stations for space-to-ground transmissions, although a problem arises from the limited global coverage. There is growing interest in using the Space Network (SN) or Tracking and Data Relay Satellites (TDRS) as the primary support for communications because of the coverage they provide. This thesis addresses the potential for satellite access of the Space Network with a non-gimbaled antenna configuration and low-power, coded transmission. The non-gimbaled antenna and the TDRS satellites, TDRS-East, TDRS-West, and TDRS-Zone of Exclusion, were configured in an orbital analysis software package called Satellite Tool Kit to emulate the three-dimensional position of the satellites. The access potential, defined as the average number of contacts per day and the average time per contact, was obtained through simulations run over a 30-day period to capture all possible orientations. The orbital altitude was varied from 600 km through 1200 km, with the results presented as a function of orbital inclination angles varying from 20 deg through 100 deg and pointing half-angles of 10 deg through 40 deg. To check the validity of the simulations, the Jet Propulsion Laboratory granted the use of the TOPEX satellite. The TOPEX satellite was configured to emulate a spin-stabilized antenna with its communications antenna stowed in the zenith-pointing direction. This mimicked the antenna pointing of the spin-stabilized satellite in the simulations. To make valid comparisons, the TOPEX orbital parameters were entered into Satellite Tool Kit and simulated over five test times provided by the Jet Propulsion Laboratory.
NASA Astrophysics Data System (ADS)
Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal
2017-08-01
This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on the Monte Carlo (MC) simulation code, which was developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up-tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT examinations in a 320 detector-row cone-beam scanner.
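As a schematic of the dose-reporting step described above, the sketch below reads organ doses from a small look-up table, applies a helical correction factor, and combines them into a weighted sum. Every value (doses, weights, correction factor) is a placeholder; none come from the SimDoseCT database or ICRP tables.

```python
# Schematic organ-dose lookup and weighted combination (placeholder values only).
organ_dose_lut = {          # mGy per 100 mAs for one acquisition (assumed)
    "lung": 12.0, "stomach": 9.5, "liver": 8.1, "thyroid": 0.9,
}
tissue_weights = {          # placeholder tissue weighting factors (assumed)
    "lung": 0.12, "stomach": 0.12, "liver": 0.04, "thyroid": 0.04,
}
mAs = 150.0
helical_correction = 1.05   # assumed correction factor for helical acquisitions

organ_doses = {o: d * mAs / 100.0 * helical_correction
               for o, d in organ_dose_lut.items()}
partial_effective = sum(tissue_weights[o] * organ_doses[o] for o in organ_doses)
print(organ_doses)
print("partial effective dose contribution [mSv]: %.2f" % partial_effective)
```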
Information Presentation and Control in a Modern Air Traffic Control Tower Simulator
NASA Technical Reports Server (NTRS)
Haines, Richard F.; Doubek, Sharon; Rabin, Boris; Harke, Stanton
1996-01-01
The proper presentation and management of information in America's largest and busiest (Level V) air traffic control towers calls for an in-depth understanding of many different human-computer considerations: user interface design for graphical, radar, and text; manual and automated data input hardware; information/display output technology; reconfigurable workstations; workload assessment; and many other related subjects. This paper discusses these subjects in the context of the Surface Development and Test Facility (SDTF) currently under construction at NASA's Ames Research Center, a full scale, multi-manned, air traffic control simulator which will provide the "look and feel" of an actual airport tower cab. Special emphasis will be given to the human-computer interfaces required for the different kinds of information displayed at the various controller and supervisory positions and to the computer-aided design (CAD) and other analytic, computer-based tools used to develop the facility.
A Petri Net Approach Based Elementary Siphons Supervisor for Flexible Manufacturing Systems
NASA Astrophysics Data System (ADS)
Abdul-Hussin, Mowafak Hassan
2015-05-01
This paper presents an approach to constructing a class of S3PR nets for the modeling, simulation and control of processes occurring in flexible manufacturing systems (FMS), based on the elementary siphons of a Petri net. Siphons are central to the analysis and control of deadlocks in FMS. Petri net models enable efficient structural analysis and utilization of FMSs, and different control policies can be implemented to achieve deadlock prevention. We present an effective deadlock-free policy for a special class of Petri nets called S3PR. Structural analysis and reachability graph analysis, supported by simulation, are used for the analysis and control of the Petri nets. Petri nets have proven to be among the most powerful tools for modelling FMS; using structural analysis, we show that liveness of such systems can be attributed to the absence of under-marked siphons.
HEAVY-DUTY GREENHOUSE GAS EMISSIONS MODEL ...
Class 2b-8 vocational truck manufacturers and Class 7/8 tractor manufacturers would be subject to vehicle-based fuel economy and emission standards that would use a truck simulation model to evaluate the impact of the truck tires and/or tractor cab design on vehicle compliance with any new standards. The EPA has created a model called “GHG Emissions Model (GEM)”, which is specifically tailored to predict truck GHG emissions. As the model is designed for the express purpose of vehicle compliance demonstration, it is less configurable than similar commercial products and its only outputs are GHG emissions and fuel consumption. This approach gives a simple and compact tool for vehicle compliance without the overhead and costs of a more sophisticated model. GEM thus supports evaluation of both fuel consumption and CO2 emissions from heavy-duty highway vehicles through a whole-vehicle operation simulation model.
The Overgrid Interface for Computational Simulations on Overset Grids
NASA Technical Reports Server (NTRS)
Chan, William M.; Kwak, Dochan (Technical Monitor)
2002-01-01
Computational simulations using overset grids typically involve multiple steps and a variety of software modules. A graphical interface called OVERGRID has been specially designed for such purposes. Data required and created by the different steps include geometry, grids, domain connectivity information and flow solver input parameters. The interface provides a unified environment for the visualization, processing, generation and diagnosis of such data. General modules are available for the manipulation of structured grids and unstructured surface triangulations. Modules more specific to the overset approach include surface curve generators, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, Cartesian box grid generators, and domain connectivity pre-processing tools. An interface provides automatic selection and viewing of flow solver boundary conditions and various other flow solver inputs. For problems involving multiple components in relative motion, a module is available to build the component/grid relationships and to prescribe and animate the dynamics of the different components.
NASA Astrophysics Data System (ADS)
Skaugen, Thomas; Weltzien, Ingunn H.
2016-09-01
Snow is an important and complicated element in hydrological modelling. The traditional catchment hydrological model, with its many free calibration parameters, also in snow sub-models, is not a well-suited tool for predicting conditions for which it has not been calibrated. Such conditions include prediction in ungauged basins and assessing the hydrological effects of climate change. In this study, a new model for the spatial distribution of snow water equivalent (SWE), parameterized solely from the observed spatial variability of precipitation, is compared with the current snow distribution model used in the operational flood forecasting models in Norway. The former model uses a dynamic gamma distribution and is called Snow Distribution_Gamma (SD_G), whereas the latter model has a fixed, calibrated coefficient of variation, which parameterizes a log-normal model for snow distribution and is called Snow Distribution_Log-Normal (SD_LN). The two models are implemented in the parameter-parsimonious rainfall-runoff model Distance Distribution Dynamics (DDD), and their capability for predicting runoff, SWE and snow-covered area (SCA) is tested and compared for 71 Norwegian catchments. The calibration period is 1985-2000 and the validation period is 2000-2014. Results show that SD_G better simulates SCA when compared with MODIS satellite-derived snow cover. In addition, SWE is simulated more realistically in that seasonal snow melts out, preventing the build-up of "snow towers" and the spurious positive trends in SWE that are typical of SD_LN. The precision of runoff simulations using SD_G is slightly inferior, with a reduction in the Nash-Sutcliffe and Kling-Gupta efficiency criteria of 0.01, but it is shown that the high precision in runoff prediction using SD_LN is accompanied by erroneous simulations of SWE.
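For reference, the Nash-Sutcliffe efficiency used above to compare runoff simulations is straightforward to compute; the observed and simulated series in this sketch are made-up examples.

```python
# Nash-Sutcliffe efficiency (NSE) for a runoff simulation (example data).
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = [1.2, 3.4, 8.9, 6.1, 2.2, 1.0]   # observed discharge (illustrative)
sim = [1.0, 3.8, 8.1, 6.5, 2.6, 1.2]   # simulated discharge (illustrative)
print("NSE = %.3f" % nash_sutcliffe(obs, sim))
```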
ERIC Educational Resources Information Center
Heys, Chris
2008-01-01
Excel, Microsoft's spreadsheet program, offers several tools which have proven useful in solving some optimization problems that arise in operations research. We will look at two such tools, the Excel modules called Solver and Goal Seek--this after deriving an equation, called the "cash accumulation equation", to be used in conjunction with them.
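The exact form of the article's "cash accumulation equation" is not reproduced here, so the sketch below stands in with the standard future-value-with-regular-deposits formula and mimics Goal Seek with a bisection search for the required deposit; both the formula choice and the figures are assumptions for illustration.

```python
# Stand-in "cash accumulation" calculation plus a Goal Seek-style bisection.
def accumulated(deposit, rate_per_period, n_periods, initial=0.0):
    r = rate_per_period
    return initial * (1 + r) ** n_periods + deposit * ((1 + r) ** n_periods - 1) / r

# Goal Seek-style question: what monthly deposit reaches $100,000 in 10 years at 6%/yr?
target, r, n = 100_000.0, 0.06 / 12, 120
lo, hi = 0.0, 10_000.0
for _ in range(60):                       # bisection on the deposit amount
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if accumulated(mid, r, n) < target else (lo, mid)
print("required monthly deposit: $%.2f" % lo)
```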
NASA Astrophysics Data System (ADS)
Block, J.; Crawl, D.; Artes, T.; Cowart, C.; de Callafon, R.; DeFanti, T.; Graham, J.; Smarr, L.; Srivas, T.; Altintas, I.
2016-12-01
The NSF-funded WIFIRE project has designed a web-based wildfire modeling simulation and visualization tool called FireMap. The tool executes FARSITE to model fire propagation using dynamic weather and fire data, configuration settings provided by the user, and static topography and fuel datasets already built-in. Using GIS capabilities combined with scalable big data integration and processing, FireMap enables simple execution of the model with options for running ensembles by taking the information uncertainty into account. The results are easily viewable, sharable, repeatable, and can be animated as a time series. From these capabilities, users can model real-time fire behavior, analyze what-if scenarios, and keep a history of model runs over time for sharing with collaborators. Firemap runs FARSITE with national and local sensor networks for real-time weather data ingestion and High-Resolution Rapid Refresh (HRRR) weather for forecasted weather. The HRRR is a NOAA/NCEP operational weather prediction system comprised of a numerical forecast model and an analysis/assimilation system to initialize the model. It is run with a horizontal resolution of 3 km, has 50 vertical levels, and has a temporal resolution of 15 minutes. The HRRR requires an Environmental Data Exchange (EDEX) server to receive the feed and generate secondary products out of it for the modeling. UCSD's EDEX server, funded by NSF, makes high-resolution weather data available to researchers worldwide and enables visualization of weather systems and weather events lasting months or even years. The high-speed server aggregates weather data from the University Consortium for Atmospheric Research by way of a subscription service from the Consortium called the Internet Data Distribution system. These features are part of WIFIRE's long term goals to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. Although Firemap is a research product of WIFIRE, developed in collaboration with a number of fire departments, the tool is operational in pilot form for providing big data-driven predictive fire spread modeling. Most recently, FireMap was used for situational awareness in the July 2016 Sand Fire by LA City and LA County Fire Departments.
Multiplatform Mission Planning and Operations Simulation Environment for Adaptive Remote Sensors
NASA Astrophysics Data System (ADS)
Smith, G.; Ball, C.; O'Brien, A.; Johnson, J. T.
2017-12-01
We report on the design and development of mission simulator libraries to support the emerging field of adaptive remote sensors. We will outline the current state of the art in adaptive sensing, provide analysis of how the current approach to performing observing system simulation experiments (OSSEs) must be changed to enable adaptive sensors for remote sensing, and present an architecture to enable their inclusion in future OSSEs. The growing potential of sensors capable of real-time adaptation of their operational parameters calls for a new class of mission planning and simulation tools. Existing simulation tools used in OSSEs assume a fixed set of sensor parameters in terms of observation geometry, frequencies used, resolution, or observation time, which allows simplifications to be made in the simulation and allows sensor observation errors to be characterized a priori. Adaptive sensors may vary these parameters depending on the details of the scene observed, so that sensor performance is not simple to model without conducting OSSE simulations that include sensor adaptation in response to a varying observational environment. Adaptive sensors are of significance to resource-constrained, small satellite platforms because they enable the management of power and data volumes while providing methods for multiple sensors to collaborate. The new class of OSSEs required to utilize adaptive sensors located on multiple platforms must answer the question: if the physical act of sensing has a cost, how does the system determine whether the science value of a measurement is worth the cost, and how should that cost be shared among the collaborating sensors? Here we propose to answer this question using an architecture structured around three modules: ADAPT, MANAGE and COLLABORATE. The ADAPT module is a set of routines to facilitate modeling of adaptive sensors, the MANAGE module will implement a set of routines to facilitate simulations of sensor resource management when power and data volume are constrained, and the COLLABORATE module will support simulations of coordination among multiple platforms with adaptive sensors. When used together, these modules will form a simulation framework for OSSEs that can enable both the design of adaptive algorithms to support remote sensing and the prediction of sensor performance.
A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition
2015-10-05
A Fixed-Wing Aircraft Simulation Tool for Improving the Efficiency of DoD Acquisition. Scott A. Morton and David R... Report period: Oct 2008-Sep 2015. The report discusses CREATE(TM)-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE(TM)-AV DaVinci [15-16], a conceptual through... and Kestrel, a multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows...
Front panel engineering with CAD simulation tool
NASA Astrophysics Data System (ADS)
Delacour, Jacques; Ungar, Serge; Mathieu, Gilles; Hasna, Guenther; Martinez, Pascal; Roche, Jean-Christophe
1999-04-01
The recent progress in display technology covers many fields of application. Specifications for radiance, colorimetry and lighting efficiency create new challenges for designers. Photometric design is limited by the ability to correctly predict the result of a lighting system, which is needed to save the cost and time of building multiple prototypes or breadboard benches. The second step of the research carried out by the company OPTIS is to propose an optimization method for the lighting system, developed in the software SPEOS. The main features required of the tool include a CAD interface, to enable fast and efficient transfer between mechanical and lighting design software, source modeling, a light transfer model and an optimization tool. The CAD interface is mainly a matter of data transfer and is not the subject here. Photometric simulation is achieved efficiently by using measured source encodings and a simulation by the Monte Carlo method. Today, the advantages and limitations of the Monte Carlo method are well known: noise reduction requires a long calculation time, which increases with the complexity of the display panel. Successful optimization is therefore difficult to achieve, because each optimization pass includes a Monte Carlo simulation with a long calculation time. The problem was initially defined as an engineering study, and experience shows that a good understanding and mastery of light transfer is limited by the complexity of non-sequential propagation; the engineer must call on a simulation and optimization tool. The key requirement for efficient optimization is a fast method for simulating light transfer. Much work has been done in this area and some interesting results can be observed. The Monte Carlo method wastes time computing results and information that are not required by the simulation, and low-efficiency transfer systems cost a great deal of lost time. More generally, light transfer simulation can be treated efficiently when the integrated result is composed of elementary sub-results with fast, analytically calculated intersections. Two axes of research therefore appear: fast integration and fast calculation of geometric intersections. The first axis yields general solutions that are also valid for multi-reflection systems. The second axis requires careful treatment of the intersection calculation; an interesting approach is the subdivision of space into voxels, a method of 3D spatial division adapted to the objects and their locations. Experimental software has been developed to validate the method. The gain is particularly high in complex systems, and an important reduction in calculation time has been achieved.
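The waste incurred by low-efficiency transfers can be illustrated with a brute-force Monte Carlo estimate; the geometry and ray count below are invented and the code has nothing to do with SPEOS, it only shows why most samples miss the target in such systems.

```python
# Brute-force Monte Carlo estimate of the fraction of light from a small
# Lambertian source reaching a target aperture (illustrative geometry only).
import numpy as np

rng = np.random.default_rng(0)
n_rays = 200_000
u = rng.random(n_rays)
theta = np.arcsin(np.sqrt(u))            # Lambertian polar angle sampling
phi = rng.random(n_rays) * 2 * np.pi

# target: a 2 cm radius disk 10 cm above the source, on axis (assumed geometry)
z = 0.10
r_hit = z * np.tan(theta)
x, y = r_hit * np.cos(phi), r_hit * np.sin(phi)
hits = (x ** 2 + y ** 2) <= 0.02 ** 2

eff = hits.mean()
rel_err = np.sqrt((1 - eff) / (eff * n_rays))   # Monte Carlo relative error
print(f"transfer efficiency ~ {eff:.4f}, relative error ~ {rel_err:.1%}")
```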
A fuzzy call admission control scheme in wireless networks
NASA Astrophysics Data System (ADS)
Ma, Yufeng; Gong, Shenguang; Hu, Xiulin; Zhang, Yunyu
2007-11-01
Scarcity of the spectrum resource and the mobility of users make quality of service (QoS) provision a critical issue in wireless networks. This paper presents a fuzzy call admission control scheme to meet the QoS requirements. A performance measure is formed as a weighted linear function of the new call and handoff call blocking probabilities. Simulations compare the proposed fuzzy scheme with an adaptive channel reservation scheme. The results show that the fuzzy scheme achieves more robust performance in terms of the average blocking criterion.
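The weighted-linear performance measure is simple to state in code; the weights and blocking probabilities below are assumed values chosen only to show the comparison, not figures from the paper.

```python
# Weighted linear blocking-cost measure for call admission control schemes.
def cost(p_new_block, p_handoff_block, w_new=1.0, w_handoff=10.0):
    # handoff blocking is usually weighted more heavily (weights are assumed)
    return w_new * p_new_block + w_handoff * p_handoff_block

schemes = {
    "fuzzy CAC":                (0.045, 0.006),   # assumed blocking probabilities
    "adaptive channel reserve": (0.038, 0.012),
}
for name, (pn, ph) in schemes.items():
    print(f"{name:26s} weighted blocking cost = {cost(pn, ph):.3f}")
```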
DIVWAG Model Documentation. Volume II. Programmer/Analyst Manual. Part 3. Chapter 9 Through 12.
1976-07-01
entered through a routine, NAM2, that calls the segment-controlling routine NBARAS. (4) Segment 3, controlled by the routine NFIRE, simulates round... nuclear fire, NAM calls in sequence the routines NFIRE (segment 3), ASUNIT (segment 2), SASSMT (segment 4), and NFIRE (segment 3). These calls simulate... this is a call to NFIRE (ISEG equals one or two), control goes to block L2. (2) Block 2. If this is to assess a unit passing through a nuclear barrier...
Tool use and affordance: Manipulation-based versus reasoning-based approaches.
Osiurak, François; Badets, Arnaud
2016-10-01
Tool use is a defining feature of human species. Therefore, a fundamental issue is to understand the cognitive bases of human tool use. Given that people cannot use tools without manipulating them, proponents of the manipulation-based approach have argued that tool use might be supported by the simulation of past sensorimotor experiences, also sometimes called affordances. However, in the meanwhile, evidence has been accumulated demonstrating the critical role of mechanical knowledge in tool use (i.e., the reasoning-based approach). The major goal of the present article is to examine the validity of the assumptions derived from the manipulation-based versus the reasoning-based approach. To do so, we identified 3 key issues on which the 2 approaches differ, namely, (a) the reference frame issue, (b) the intention issue, and (c) the action domain issue. These different issues will be addressed in light of studies in experimental psychology and neuropsychology that have provided valuable contributions to the topic (i.e., tool-use interaction, orientation effect, object-size effect, utilization behavior and anarchic hand, tool use and perception, apraxia of tool use, transport vs. use actions). To anticipate our conclusions, the reasoning-based approach seems to be promising for understanding the current literature, even if it is not fully satisfactory because of a certain number of findings easier to interpret with regard to the manipulation-based approach. A new avenue for future research might be to develop a framework accommodating both approaches, thereby shedding a new light on the cognitive bases of human tool use and affordances. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Technical Reports Server (NTRS)
Thomas, N. L.; Chisel, D. M.
1976-01-01
The success of a rocket-borne experiment depends not only on the pointing of the attitude control system, but on the alignment of the attitude control system to the payload. To ensure proper alignment, special optical tools and alignment techniques are required. Those that were used in the SPARCS program are described and discussed herein. These tools include theodolites, autocollimators, a 38-cm diameter solar simulator, a high-performance 1-m heliostat to provide a stable solar source during the integration of the rocket payload, a portable 75-cm sun tracker for use at the launch site, and an innovation called the Solar Alignment Prism. Using the real sun as the primary reference under field conditions, the Solar Alignment Prism facilitates the coalignment of the attitude sun sensor with the payload. The alignment techniques were developed to ensure the precise alignment of the solar payloads to the SPARCS attitude sensors during payload integration and to verify the required alignment under field conditions just prior to launch.
NASA Astrophysics Data System (ADS)
Morizet, N.; Godin, N.; Tang, J.; Maillet, E.; Fregonese, M.; Normand, B.
2016-03-01
This paper aims to propose a novel approach to classify acoustic emission (AE) signals deriving from corrosion experiments, even if embedded into a noisy environment. To validate this new methodology, synthetic data are first used throughout an in-depth analysis, comparing Random Forests (RF) to the k-Nearest Neighbor (k-NN) algorithm. Moreover, a new evaluation tool called the alter-class matrix (ACM) is introduced to simulate different degrees of uncertainty on labeled data for supervised classification. Then, tests on real cases involving noise and crevice corrosion are conducted, by preprocessing the waveforms including wavelet denoising and extracting a rich set of features as input of the RF algorithm. To this end, a software called RF-CAM has been developed. Results show that this approach is very efficient on ground truth data and is also very promising on real data, especially for its reliability, performance and speed, which are serious criteria for the chemical industry.
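The Random Forests versus k-NN comparison can be reproduced schematically with scikit-learn; the synthetic feature vectors below stand in for the wavelet-denoised AE features of the study and carry no physical meaning.

```python
# Schematic RF vs k-NN comparison on synthetic "AE feature" vectors.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=600, n_features=12, n_informative=6,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("Random Forest", RandomForestClassifier(n_estimators=200, random_state=0)),
                  ("k-NN (k=5)", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy: %.3f" % accuracy_score(y_te, clf.predict(X_te)))
```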
HELIOGate, a Portal for the Heliophysics Community
NASA Astrophysics Data System (ADS)
Pierantoni; Gabriele; Carley, Eoin
2014-10-01
Heliophysics is the branch of physics that investigates the interactions between the Sun and the other bodies of the solar system. Heliophysicists rely on data collected from numerous sources scattered across the Solar System. The data collected from these sources is processed to extract metadata and the metadata extracted in this fashion is then used to build indexes of features and events called catalogues. Heliophysicists also develop conceptual and mathematical models of the phenomena and the environment of the Solar System. More specifically, they investigate the physical characteristics of the phenomena and they simulate how they propagate throughout the Solar System with mathematical and physical abstractions called propagation models. HELIOGate aims at addressing the need to combine and orchestrate existing web services in a flexible and easily configurable fashion to tackle different scientific questions. HELIOGate also offers a tool capable of connecting to sizeable computation and storage infrastructures to execute data processing codes that are needed to calibrate raw data and to extract metadata.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Signe K.; Purohit, Sumit; Boyd, Lauren W.
The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that providesmore » a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through a web browser based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS Benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.« less
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 3 2013-01-01 2013-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 3 2012-01-01 2012-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 3 2011-01-01 2011-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 3 2014-01-01 2014-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...
2013-01-01
Background Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm, through developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures. Methods and results This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator. PMID:23415045
2007-11-01
Engineering Research Laboratory is currently developing a set of facility 'architectural' programming tools, called Facility Composer(TM) (FC). FC... requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC... developing a set of facility "architectural" programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria...
High-speed GPU-based finite element simulations for NDT
NASA Astrophysics Data System (ADS)
Huthwaite, P.; Shi, F.; Van Pamel, A.; Lowe, M. J. S.
2015-03-01
The finite element method solved with explicit time increments is a general approach which can be applied to many ultrasound problems. It is widely used as a powerful tool within NDE for developing and testing inspection techniques, and can also be used in inversion processes. However, the solution technique is computationally intensive, requiring many calculations to be performed for each simulation, so traditionally speed has been an issue. For maximum speed, an implementation of the method, called Pogo [Huthwaite, J. Comp. Phys. 2014, doi: 10.1016/j.jcp.2013.10.017], has been developed to run on graphics cards, exploiting the highly parallelisable nature of the algorithm. Pogo typically demonstrates speed improvements of 60-90x over commercial CPU alternatives. Pogo is applied to three NDE examples, where the speed improvements are important: guided wave tomography, where a full 3D simulation must be run for each source transducer and every different defect size; scattering from rough cracks, where many simulations need to be run to build up a statistical model of the behaviour; and ultrasound propagation within coarse-grained materials where the mesh must be highly refined and many different cases run.
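As a rough illustration of the explicit-time-increment approach, and of why it maps so naturally onto highly parallel hardware, the sketch below advances a 1D lumped-mass elastic wave model with a central-difference update in which every node is updated independently at each step. The material values, mesh, and tone-burst source are assumptions for the example and are unrelated to Pogo's implementation.

```python
import numpy as np

# Minimal sketch of an explicit-time-increment finite element update for 1D
# elastic wave propagation (lumped mass). Every node is updated independently
# at each step, which is what makes the scheme parallelise so well.
# Material values are placeholders, not tied to any Pogo example.

E, rho = 210e9, 7850.0          # Young's modulus (Pa), density (kg/m^3) -- assumed
L, n_nodes = 1.0, 2001          # bar length (m) and node count
dx = L / (n_nodes - 1)
c = np.sqrt(E / rho)            # bar wave speed
dt = 0.8 * dx / c               # time step below the CFL stability limit

u_prev = np.zeros(n_nodes)      # displacement at step k-1
u = np.zeros(n_nodes)           # displacement at step k
n_steps = 1500
f0 = 1.0e5                      # tone-burst centre frequency (Hz) -- assumed

for k in range(n_steps):
    # interior update: u_next = 2u - u_prev + (c*dt/dx)^2 * (u[i+1] - 2u[i] + u[i-1])
    lap = np.zeros(n_nodes)
    lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
    u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
    # drive the left end with a 5-cycle Hann-windowed tone burst (transducer source)
    t = k * dt
    if t < 5.0 / f0:
        u_next[0] = np.sin(2 * np.pi * f0 * t) * np.sin(np.pi * f0 * t / 5.0) ** 2
    u_prev, u = u, u_next

print("peak displacement after %d steps: %.3e" % (n_steps, np.abs(u).max()))
```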
Power System Simulation for Policymaking and Making Policymakers
NASA Astrophysics Data System (ADS)
Cohen, Michael Ari
Power system simulation is a vital tool for anticipating, planning for and ultimately addressing future conditions on the power grid, especially in light of contemporary shifts in power generation, transmission and use that are being driven by a desire to utilize more environmentally responsible energy sources. This dissertation leverages power system simulation and engineering-economic analysis to provide initial answers to one open question about future power systems: how will high penetrations of distributed (rooftop) solar power affect the physical and economic operation of distribution feeders? We find that the overall impacts of distributed solar power (both positive and negative) on the feeders we modeled are minor compared to the overall cost of energy, but that there is on average a small net benefit provided by distributed generation. We then describe an effort to make similar analyses more accessible to a non-engineering (high school) audience by developing an educational video game called "Griddle" that is based on the same power system simulation techniques used in the first study. We describe the design and evaluation of Griddle and find that it demonstrates potential to provide students with insights about key power system learning objectives.
Abdelshehid, Corollos S; Quach, Stephen; Nelson, Corey; Graversen, Joseph; Lusch, Achim; Zarraga, Jerome; Alipanah, Reza; Landman, Jaime; McDougall, Elspeth M
2013-01-01
The use of low-risk simulation training for resident education is rapidly expanding as teaching centers integrate simulation-based team training (SBTT) sessions into their education curriculum. SBTT is a valuable tool in technical and communication skills training and assessment for residents. We created a unique SBTT scenario for urology residents involving a laparoscopic partial nephrectomy procedure. Urology residents were randomly paired with a certified registered nurse anesthetist or an anesthesia resident. The scenario incorporated a laparoscopic right partial nephrectomy utilizing a unique polyvinyl alcohol kidney model with an embedded 3 cm lower pole exophytic tumor and the high-fidelity SimMan3G mannequin. The urology residents were instructed to pay particular attention to the patient's identifying information provided at the beginning of the case. Two scripted events occurred: the patient had an anaphylactic reaction to a drug and, after the tumor specimen was sent for a frozen section, the confederate pathologist called into the operating room (OR) twice, first with the wrong patient name and subsequently with the wrong specimen. After the scenario was complete, technical and nontechnical performance were evaluated. A debriefing session followed the scenario to discuss and assess technical performance and interdisciplinary nontechnical communication within the team. All urology residents (n = 9) rated the SBTT scenario as a useful tool in developing communication skills among the OR team and 88% rated the model as useful for technical skills training. Despite cuing to note patient identification, only 3 of 9 (33%) participants identified that the wrong patient information was presented when the confederate "pathologist" called in to report pathology results. All urology residents rated SBTT sessions as useful for the development of communication skills between different team members and for making residents aware of unlikely but potentially critical errors in the OR. We will continue to use SBTT as a useful method to develop resident technical and nontechnical skills outside of the high-risk operating environment. Copyright © 2013 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Zone calculation as a tool for assessing performance outcome in laparoscopic suturing.
Buckley, Christina E; Kavanagh, Dara O; Nugent, Emmeline; Ryan, Donncha; Traynor, Oscar J; Neary, Paul C
2015-06-01
Simulator performance is measured by metrics, which are valued as an objective way of assessing trainees. Certain procedures such as laparoscopic suturing, however, may not be suitable for assessment under traditionally formulated metrics. Our aim was to assess if our new metric is a valid method of assessing laparoscopic suturing. A software program was developed in order to create a new metric, which would calculate the percentage of time spent operating within pre-defined areas called "zones." Twenty-five candidates (medical students N = 10, surgical residents N = 10, and laparoscopic experts N = 5) performed the laparoscopic suturing task on the ProMIS III(®) simulator. New metrics of "in-zone" and "out-zone" scores as well as traditional metrics of time, path length, and smoothness were generated. Performance was also assessed by two blinded observers using the OSATS and FLS rating scales. This novel metric was evaluated by comparing it to both traditional metrics and subjective scores. There was a significant difference in the average in-zone and out-zone scores between all three experience groups (p < 0.05). The new zone metric scores correlated significantly with the subjective blinded-observer scores of OSATS and FLS (p = 0.0001). The new zone metric scores also correlated significantly with the traditional metrics of path length, time, and smoothness (p < 0.05). The new metric is a valid tool for assessing laparoscopic suturing objectively. This could be incorporated into a competency-based curriculum to monitor resident progression in the simulated setting.
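For concreteness, a small sketch of how such an in-zone/out-zone percentage could be computed from sampled instrument-tip positions is shown below; the zone geometry, sampling, and random-walk track are invented for illustration and are not the authors' software.

```python
import numpy as np

# Hypothetical sketch of a "zone" metric: the fraction of task time an
# instrument tip spends inside pre-defined rectangular zones. Zone
# coordinates and the tracking format are assumptions, not the authors' data.

def in_zone_score(tip_xy, zones):
    """tip_xy: (N, 2) tracked tip positions sampled at a fixed rate.
    zones: list of (xmin, xmax, ymin, ymax) rectangles."""
    inside = np.zeros(len(tip_xy), dtype=bool)
    for xmin, xmax, ymin, ymax in zones:
        inside |= ((tip_xy[:, 0] >= xmin) & (tip_xy[:, 0] <= xmax) &
                   (tip_xy[:, 1] >= ymin) & (tip_xy[:, 1] <= ymax))
    in_pct = 100.0 * inside.mean()
    return in_pct, 100.0 - in_pct     # in-zone %, out-zone %

# example: random walk of the tip, one zone around the suturing target
rng = np.random.default_rng(0)
track = np.cumsum(rng.normal(scale=0.5, size=(600, 2)), axis=0)
print(in_zone_score(track, zones=[(-5, 5, -5, 5)]))
```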
RuleMonkey: software for stochastic simulation of rule-based models
2010-01-01
Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321
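The event loop underlying Gillespie-style stochastic simulation, which network-free methods build on without enumerating the full reaction network, can be sketched for a single reversible binding reaction as follows; the rate constants and copy numbers are illustrative only, and this is not RuleMonkey's rule-based machinery.

```python
import numpy as np

# Minimal Gillespie-style stochastic simulation of A + B <-> AB, as a sketch
# of the kinetic Monte Carlo event loop that network-free tools build on.
# Rate constants and copy numbers are illustrative only.

rng = np.random.default_rng(1)
kf, kr = 1e-3, 0.1                 # association / dissociation rate constants
state = {"A": 500, "B": 300, "AB": 0}
t, t_end = 0.0, 50.0

while t < t_end:
    a1 = kf * state["A"] * state["B"]   # propensity of A + B -> AB
    a2 = kr * state["AB"]               # propensity of AB -> A + B
    a0 = a1 + a2
    if a0 == 0.0:
        break
    t += rng.exponential(1.0 / a0)      # waiting time to the next reaction event
    if rng.random() * a0 < a1:          # pick a reaction proportionally to its propensity
        state["A"] -= 1; state["B"] -= 1; state["AB"] += 1
    else:
        state["A"] += 1; state["B"] += 1; state["AB"] -= 1

print(t, state)
```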
Exploring the Dynamics of Cell Processes through Simulations of Fluorescence Microscopy Experiments
Angiolini, Juan; Plachta, Nicolas; Mocskos, Esteban; Levi, Valeria
2015-01-01
Fluorescence correlation spectroscopy (FCS) methods are powerful tools for unveiling the dynamical organization of cells. For simple cases, such as molecules passively moving in a homogeneous medium, FCS analysis yields analytical functions that can be fitted to the experimental data to recover the phenomenological rate parameters. Unfortunately, many dynamical processes in cells do not follow these simple models, and in many instances it is not possible to obtain an analytical function through a theoretical analysis of a more complex model. In such cases, experimental analysis can be combined with Monte Carlo simulations to aid in interpretation of the data. In response to this need, we developed a method called FERNET (Fluorescence Emission Recipes and Numerical routines Toolkit) based on Monte Carlo simulations and the MCell-Blender platform, which was designed to treat the reaction-diffusion problem under realistic scenarios. This method enables us to set complex geometries of the simulation space, distribute molecules among different compartments, and define interspecies reactions with selected kinetic constants, diffusion coefficients, and species brightness. We apply this method to simulate single- and multiple-point FCS, photon-counting histogram analysis, raster image correlation spectroscopy, and two-color fluorescence cross-correlation spectroscopy. We believe that this new program could be very useful for predicting and understanding the output of fluorescence microscopy experiments. PMID:26039162
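As an example of the simple analytical case mentioned above, the following sketch fits the standard 3D free-diffusion FCS autocorrelation model to synthetic noisy data to recover the particle number and diffusion time; the parameter values, noise level, and axial-to-lateral ratio are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of the simple analytical case: fitting the standard 3D free-diffusion
# FCS autocorrelation to noisy data to recover N and tau_D. The "data" here
# are synthetic; w (axial/lateral beam ratio) is an assumed value.

def g_diff3d(tau, N, tau_d, w=5.0):
    return (1.0 / N) / ((1.0 + tau / tau_d) * np.sqrt(1.0 + tau / (w**2 * tau_d)))

tau = np.logspace(-6, 0, 120)                     # lag times in seconds
truth = g_diff3d(tau, N=8.0, tau_d=2e-4)
rng = np.random.default_rng(2)
data = truth + rng.normal(scale=0.002, size=tau.size)

popt, _ = curve_fit(g_diff3d, tau, data, p0=(5.0, 1e-4))
print("fitted N = %.2f, tau_D = %.2e s" % (popt[0], popt[1]))
```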
A Prototyping Effort for the Integrated Spacecraft Analysis System
NASA Technical Reports Server (NTRS)
Wong, Raymond; Tung, Yu-Wen; Maldague, Pierre
2011-01-01
Computer modeling and simulation has recently become an essential technique for predicting and validating spacecraft performance. However, most computer models only examine spacecraft subsystems, and the independent nature of the models creates integration problems, which lowers the possibilities of simulating a spacecraft as an integrated unit despite a desire for this type of analysis. A new project called Integrated Spacecraft Analysis was proposed to serve as a framework for an integrated simulation environment. The project is still in its infancy, but a software prototype would help future developers assess design issues. The prototype explores a service oriented design paradigm that theoretically allows programs written in different languages to communicate with one another. It includes creating a uniform interface to the SPICE libraries such that different in-house tools like APGEN or SEQGEN can exchange information with it without much change. Service orientation may result in a slower system as compared to a single application, and more research needs to be done on the different available technologies, but a service oriented approach could increase long term maintainability and extensibility.
Antle, John M.; Stoorvogel, Jetse J.; Valdivia, Roberto O.
2014-01-01
This article presents conceptual and empirical foundations for new parsimonious simulation models that are being used to assess future food and environmental security of farm populations. The conceptual framework integrates key features of the biophysical and economic processes on which the farming systems are based. The approach represents a methodological advance by coupling important behavioural processes, for example, self-selection in adaptive responses to technological and environmental change, with aggregate processes, such as changes in market supply and demand conditions or environmental conditions such as climate. Suitable biophysical and economic data are a critical limiting factor in modelling these complex systems, particularly for the characterization of out-of-sample counterfactuals in ex ante analyses. Parsimonious, population-based simulation methods are described that exploit available observational, experimental, modelled and expert data. The analysis makes use of a new scenario design concept called representative agricultural pathways. A case study illustrates how these methods can be used to assess food and environmental security. The concluding section addresses generalizations of parametric forms and linkages of regional models to global models. PMID:24535388
Antle, John M; Stoorvogel, Jetse J; Valdivia, Roberto O
2014-04-05
This article presents conceptual and empirical foundations for new parsimonious simulation models that are being used to assess future food and environmental security of farm populations. The conceptual framework integrates key features of the biophysical and economic processes on which the farming systems are based. The approach represents a methodological advance by coupling important behavioural processes, for example, self-selection in adaptive responses to technological and environmental change, with aggregate processes, such as changes in market supply and demand conditions or environmental conditions such as climate. Suitable biophysical and economic data are a critical limiting factor in modelling these complex systems, particularly for the characterization of out-of-sample counterfactuals in ex ante analyses. Parsimonious, population-based simulation methods are described that exploit available observational, experimental, modelled and expert data. The analysis makes use of a new scenario design concept called representative agricultural pathways. A case study illustrates how these methods can be used to assess food and environmental security. The concluding section addresses generalizations of parametric forms and linkages of regional models to global models.
In-Situ Visualization Experiments with ParaView Cinema in RAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kares, Robert John
2015-10-15
A previous paper described some numerical experiments performed using the ParaView/Catalyst in-situ visualization infrastructure deployed in the Los Alamos RAGE radiation-hydrodynamics code to produce images from a running large scale 3D ICF simulation. One challenge of the in-situ approach apparent in these experiments was the difficulty of choosing parameters like isosurface values for the visualizations to be produced from the running simulation without the benefit of prior knowledge of the simulation results and the resultant cost of recomputing in-situ generated images when parameters are chosen suboptimally. A proposed method of addressing this difficulty is to simply render multiple images at runtime with a range of possible parameter values to produce a large database of images and to provide the user with a tool for managing the resulting database of imagery. Recently, ParaView/Catalyst has been extended to include such a capability via the so-called Cinema framework. Here I describe some initial experiments with the first delivery of Cinema and make some recommendations for future extensions of Cinema's capabilities.
NASA Astrophysics Data System (ADS)
Paxton, L. J.; Schaefer, R. K.; Nix, M.; Fountain, G. H.; Weiss, M.; Swartz, W. H.; Parker, C. L.; MacDonald, L.; Ihde, A. G.; Simpkins, S.; GAIA Team
2011-12-01
In this paper we describe the application of a proven methodology for modeling the complex social and economic interactions embodied in real-world decision making to water scarcity and water resources. We have developed a generalizable, extensible facility we call "GAIA" - Global Assimilation of Information for Action - and applied it to different problem sets. We describe the use of the "Green Country Model" and other gaming/simulation tools to address the impacts of climate and climate disruption issues at the intersection of science, economics, policy, and society. There is a long history in the Defense community of using what are known as strategic simulations or "wargames" to model the complex interactions between the environment, people, resources, infrastructure and the economy in a competitive environment. We describe in this paper, work that we have done on understanding how this heritage can be repurposed to help us explore how the complex interplay between climate disruption and our socio/political and economic structures will affect our future. Our focus here is on a fundamental and growing issue - water and water availability. We consider water and the role of "virtual water" in the system. Various "actors" are included in the simulations. While these simulations cannot definitively predict what will happen, they do illuminate non-linear feedbacks between, for example, treaty agreement, the environment, the economy, and the government. These simulations can be focused on the global, regional, or local environment. We note that these simulations are not "zero sum" games - there need not be a winner and a loser. They are, however, competitive influence games: they represent the tools that a nation, state, faction or group has at its disposal to influence policy (diplomacy), finances, industry (economy), infrastructure, information, etc to achieve their particular goals. As in the real world the problem is competitive - not everyone shares the same definition of a successful or favorable outcome.
Häggström, Ida; Beattie, Bradley J; Schmidtlein, C Ross
2016-06-01
To develop and evaluate a fast and simple tool called dpetstep (Dynamic PET Simulator of Tracers via Emission Projection), for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and evaluation of the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. The tool was developed in matlab using both new and previously reported modules of petstep (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user specified method, settings, and corrections. Reconstructed images were compared to MC data, and simple Gaussian noised time activity curves (GAUSS). dpetstep was 8000 times faster than MC. Dynamic images from dpetstep had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dpetstep and MC images was the same, while GAUSS differed by 3% points. Noise profiles in dpetstep images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region of interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dpetstep images and noise properties agreed better with MC. The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dpetstep to be very useful for generating fast, simple, and realistic results, however since it uses simple scatter and random models it may not be suitable for studies investigating these phenomena. dpetstep can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
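As a rough, hypothetical illustration of the per-voxel time-activity-curve step described above (not dPETSTEP code), the sketch below builds a one-tissue-compartment curve, applies radioactive decay, and adds Poisson counting noise per frame; the kinetic constants, input function, frame structure, and sensitivity factor are all assumed.

```python
import numpy as np

# Illustrative sketch (not dPETSTEP code): build a one-tissue-compartment
# time-activity curve for a single voxel, then apply decay and Poisson
# counting noise per frame. All parameter values are assumptions.

rng = np.random.default_rng(3)
frame_edges = np.arange(0, 3601, 60.0)            # 60 x 60 s frames
t_mid = 0.5 * (frame_edges[:-1] + frame_edges[1:])
dt_frame = np.diff(frame_edges)

K1, k2 = 0.1, 0.01                                # 1-tissue model constants (arbitrary units)
cp = 50.0 * np.exp(-t_mid / 600.0)                # toy plasma input function
# Ct(t) = K1 * integral of Cp(s) * exp(-k2 (t - s)) ds, done as a discrete sum
ct = np.array([np.sum(K1 * cp[:i + 1] * np.exp(-k2 * (t_mid[i] - t_mid[:i + 1])) * 60.0)
               for i in range(len(t_mid))])

half_life = 109.8 * 60.0                          # F-18 half-life in seconds
decay = np.exp(-np.log(2) * t_mid / half_life)
expected_counts = ct * decay * dt_frame * 1e-1    # arbitrary system sensitivity
noisy_counts = rng.poisson(expected_counts)       # Poisson counting noise per frame
print(noisy_counts[:10])
```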
Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Seah, Chin
2009-01-01
During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.
Development of a Groundwater Transport Simulation Tool for Remedial Process Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivarson, Kristine A.; Hanson, James P.; Tonkin, M.
2015-01-14
The groundwater remedy for hexavalent chromium at the Hanford Site includes operation of five large pump-and-treat systems along the Columbia River. The systems at the 100-HR-3 and 100-KR-4 groundwater operable units treat a total of about 9,840 liters per minute (2,600 gallons per minute) of groundwater to remove hexavalent chromium, and cover an area of nearly 26 square kilometers (10 square miles). The pump-and-treat systems result in large scale manipulation of groundwater flow direction, velocities, and most importantly, the contaminant plumes. Tracking of the plumes and predicting needed system modifications is part of the remedial process optimization, and is a continual process with the goal of reducing costs and shortening the timeframe to achieve the cleanup goals. While most of the initial system evaluations are conducted by assessing performance (e.g., reduction in contaminant concentration in groundwater and changes in inferred plume size), changes to the well field are often recommended. To determine the placement for new wells, well realignments, and modifications to pumping rates, it is important to be able to predict resultant plume changes. In smaller systems, it may be effective to make small scale changes periodically and adjust modifications based on groundwater monitoring results. Due to the expansive nature of the remediation systems at Hanford, however, additional tools were needed to predict the plume reactions to system changes. A computer simulation tool was developed to support pumping rate recommendations for optimization of large pump-and-treat groundwater remedy systems. This tool, called the Pumping Optimization Model, or POM, is based on a 1-layer derivation of a multi-layer contaminant transport model using MODFLOW and MT3D.
A Practical Guide to the Technology and Adoption of Software Process Automation
1994-03-01
IDE's integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual...tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4
Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade
2014-01-01
Objective This article presents a way to obtain estimates of dose in patients submitted to radiotherapy with basis on the analysis of regions of interest on nuclear medicine images. Materials and Methods A software called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the Monte Carlo EGSnrc code. The software was developed with the Microsoft Visual Studio 2010 Service Pack and the project template Windows Presentation Foundation for C# programming language. Results With the mentioned tools, the authors obtained the file for optimization of Monte Carlo simulations using the EGSnrc; organization and compaction of dosimetry results with all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. Conclusion The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity. PMID:25741101
OpenMP parallelization of a gridded SWAT (SWATG)
NASA Astrophysics Data System (ADS)
Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin
2017-12-01
Large-scale, long-term and high spatial resolution simulation is a common issue in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG), which integrates a grid modeling scheme with different spatial representations, also presents such problems. The resulting long run times limit applications of very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is integrated with SWATG (the result is called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling over a roughly 2000 km2 watershed on one CPU with a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computations of environmental models are beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management in addition to offering data fusion and model coupling ability.
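A back-of-the-envelope way to interpret the reported nine-fold speedup on 15 threads is Amdahl's law; the calculation below (an assumption about the scaling model, not an analysis from the paper) implies a parallelisable fraction of roughly 95%.

```python
# Amdahl's law check of the reported ~9x speedup with 15 threads (an assumed
# scaling model, not the paper's analysis): speedup(n) = 1 / ((1 - p) + p / n),
# where p is the parallelisable fraction of the run time.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# solving 9 = 1 / ((1 - p) + p / 15) gives p = (1 - 1/9) / (1 - 1/15) ≈ 0.952
p = (1 - 1.0 / 9) / (1 - 1.0 / 15)
print(p, amdahl_speedup(p, 15))   # ~0.952 and ~9.0
```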
NASA Astrophysics Data System (ADS)
Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores
2011-12-01
With the LHC and ALICE entering full operation and production modes, the number of simulation, RAW data processing, and end-user analysis computational tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amount of processed data, and methods of analyzing the end result, required the development and deployment of new tools in addition to the existing Grid infrastructure. To facilitate the management of the large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). The LPM automatically submits jobs to the Grid based on triggers and conditions, for example after the completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing the job status, LPM also provides a fully authenticated interface to the AliEn Grid catalogue for browsing and downloading files, and in the near future it will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.
Minerva: User-Centered Science Operations Software Capability for Future Human Exploration
NASA Technical Reports Server (NTRS)
Deans, Matthew; Marquez, Jessica J.; Cohen, Tamar; Miller, Matthew J.; Deliz, Ivonne; Hillenius, Steven; Hoffman, Jeffrey; Lee, Yeon Jin; Lees, David; Norheim, Johannes;
2017-01-01
In June of 2016, the Biologic Analog Science Associated with Lava Terrains (BASALT) research project conducted its first field deployment, which we call BASALT-1. BASALT-1 consisted of a science-driven field campaign in a volcanic field in Idaho as a simulated human mission to Mars. Scientists and mission operators were provided a suite of ground software tools that we refer to collectively as Minerva to carry out their work. Minerva provides capabilities for traverse planning and route optimization, timeline generation and display, procedure management, execution monitoring, data archiving, visualization, and search. This paper describes the Minerva architecture, constituent components, use cases, and some preliminary findings from the BASALT-1 campaign.
Simulation of amide I' band profiles of trans polyproline based on an excitonic coupling model
NASA Astrophysics Data System (ADS)
Measey, Thomas; Schweitzer-Stenner, Reinhard
2005-06-01
We measured the amide I' band profile of the IR, isotropic Raman, anisotropic Raman, and Vibrational Circular Dichroism spectrum of poly-L-proline in D2O. The band shapes were modeled by using an algorithm that exploits the delocalized character of the excited vibrational states [R. Schweitzer-Stenner, J. Phys. Chem. B. 108 (2004) 16965]. The band shapes could be quantitatively reproduced by invoking the polyproline II or 31-helix conformation for all peptide residues. This corroborates the notion that the combined use of the above spectroscopies is an ideal tool to discriminate different conformations associated with the so-called random coil state of peptides.
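A schematic numerical version of such an excitonic coupling calculation is sketched below: local amide I' oscillators on an idealised three-residue-per-turn helix are coupled to nearest neighbours, the Hamiltonian is diagonalised, and an IR band profile is built from the delocalised states. All parameter values and the dipole geometry are invented for illustration and are not those used in the paper.

```python
import numpy as np

# Rough sketch of an excitonic-coupling amide I' calculation for an idealised
# 31-helix: identical local oscillators coupled to nearest neighbours,
# diagonalised to give delocalised states whose IR intensities come from the
# summed local transition dipoles. Wavenumber, coupling, dipole geometry, and
# linewidth are illustrative placeholders.

n_res = 12
nu0, J = 1620.0, 5.0                      # local wavenumber and NN coupling (cm^-1)
H = np.diag(np.full(n_res, nu0)) + J * (np.diag(np.ones(n_res - 1), 1)
                                        + np.diag(np.ones(n_res - 1), -1))
energies, vecs = np.linalg.eigh(H)        # exciton energies and mixing coefficients

# local transition dipoles on a 3-residue-per-turn helix (unit magnitude)
phi = 2 * np.pi * np.arange(n_res) / 3.0
mu = np.stack([np.cos(phi), np.sin(phi), np.full(n_res, 0.3)], axis=1)
dipoles = vecs.T @ mu                     # transition dipole of each exciton state
ir_strength = np.sum(dipoles**2, axis=1)

# IR band profile as a sum of Lorentzians centred on the exciton energies
nu = np.linspace(1580, 1660, 400)
gamma = 6.0
band = sum(s * gamma / ((nu - e)**2 + gamma**2) for e, s in zip(energies, ir_strength))
print("band maximum near %.1f cm^-1" % nu[np.argmax(band)])
```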
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammerstrom, Donald J.; Widergren, Steven E.; Irwin, Chris
About 11 years ago, the U.S. Department of Energy (DOE) funded the Pacific Northwest National Laboratory (PNNL) to conduct one of the first-ever field demonstrations of what later became called a transactive system. Transactive systems have since become important tools in the DOE's research efforts to modernize the U.S. electric power grid and conserve energy in U.S. buildings. The DOE currently funds fundamental and applied research to advance transactive system technologies, including their simulation, standardization, theoretical principles, valuation, demonstration, and automation. This article will discuss both the historical and recent DOE research and development activities in this topic area, including especially a recent PNNL report concerning the valuation of transactive systems.
Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution
NASA Technical Reports Server (NTRS)
Zoladz, T. F.; Jones, J. H.; Jong, J.
1992-01-01
A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications which more often than not involve multicomponent signals.
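For reference, a minimal discrete implementation of the classical Wigner distribution applied to a test chirp is sketched below; it does not reproduce the MWD's cross-term suppression, and the signal and sampling parameters are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

# Sketch of the classical discrete Wigner-Ville distribution for a test chirp.
# It shows the time-frequency representation the MWD starts from; the
# cross-term-suppressing modification itself is not reproduced here.

def wigner_ville(x, fs):
    z = hilbert(x)                        # analytic signal
    n = len(z)
    wvd = np.zeros((n, n))
    for t in range(n):
        m = min(t, n - 1 - t)             # largest symmetric half-lag at this instant
        lags = np.arange(-m, m + 1)
        r = np.zeros(n, dtype=complex)
        r[lags] = z[t + lags] * np.conj(z[t - lags])   # negative lags wrap for the FFT
        wvd[:, t] = np.fft.fft(r).real
    freqs = np.arange(n) * fs / (2.0 * n)  # bin k -> f = k*fs/(2n); kernel index is a half-lag
    return wvd, freqs

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.cos(2 * np.pi * (50 * t + 100 * t**2))   # linear chirp, 50 -> 250 Hz over 1 s
wvd, freqs = wigner_ville(x, fs)
print(freqs[np.argmax(wvd[:, len(t) // 2])])    # ~150 Hz, the instantaneous frequency at 0.5 s
```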
Advancing the LSST Operations Simulator
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Ridgway, S. T.; Cook, K. H.; Delgado, F.; Chandrasekharan, S.; Petry, C. E.; Operations Simulator Group
2013-01-01
The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://lsst.org) allows the planning of LSST observations that obey explicit science driven observing specifications, patterns, schema, and priorities, while optimizing against the constraints placed by design-specific opto-mechanical system performance of the telescope facility, site specific conditions (including weather and seeing), as well as additional scheduled and unscheduled downtime. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history database are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. This poster reports recent work which has focussed on an architectural restructuring of the code that will allow us to a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints; and b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will create substantive scientific opportunity. The enhanced simulator will be used to assess the feasibility of desired observing cadences, study the impact of changing science program priorities, and assist with performance margin investigations of the LSST system.
Medicanes in an ocean-atmosphere coupled regional climate model
NASA Astrophysics Data System (ADS)
Akhtar, N.; Brauch, J.; Dobler, A.; Béranger, K.; Ahrens, B.
2014-03-01
So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, and warm-core Mediterranean cyclones that exhibit some similarities to tropical cyclones. The strong cyclonic winds associated with medicanes threaten the highly populated coastal areas around the Mediterranean basin. To reduce the risk of casualties and overall negative impacts, it is important to improve the understanding of medicanes with the use of numerical models. In this study, we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (1-D NEMO-MED12) to simulate medicanes. The aim of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated using the atmosphere-only model, COSMO-CLM, and coupled model, with different setups (horizontal atmospheric grid-spacings of 0.44°, 0.22°, and 0.08°; with/without spectral nudging, and an ocean grid-spacing of 1/12°). The results show that at high-resolution, the coupled model is able to not only simulate most of medicane events but also improve the track length, core temperature, and wind speed of simulated medicanes compared to the atmosphere-only simulations. The results suggest that the coupled model is more proficient for systemic and detailed studies of historical medicane events, and that this model can be an effective tool for future projections.
Medicanes in an ocean-atmosphere coupled regional climate model
NASA Astrophysics Data System (ADS)
Akhtar, N.; Brauch, J.; Dobler, A.; Béranger, K.; Ahrens, B.
2014-08-01
So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, and warm-core Mediterranean cyclones that exhibit some similarities to tropical cyclones. The strong cyclonic winds associated with medicanes threaten the highly populated coastal areas around the Mediterranean basin. To reduce the risk of casualties and overall negative impacts, it is important to improve the understanding of medicanes with the use of numerical models. In this study, we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (1-D NEMO-MED12) to simulate medicanes. The aim of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated using the atmosphere-only model, COSMO-CLM, and coupled model, with different setups (horizontal atmospheric grid spacings of 0.44, 0.22, and 0.08°; with/without spectral nudging, and an ocean grid spacing of 1/12°). The results show that at high resolution, the coupled model is able to not only simulate most of medicane events but also improve the track length, core temperature, and wind speed of simulated medicanes compared to the atmosphere-only simulations. The results suggest that the coupled model is more proficient for systemic and detailed studies of historical medicane events, and that this model can be an effective tool for future projections.
Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M
2018-04-01
Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of this SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. The full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of the reduced variations based on cost and customer service or on cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful in guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the process of simulator purchase using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
Pre-layout AC decoupling analysis with Mentor Graphics HyperLynx
NASA Astrophysics Data System (ADS)
Hnatiuc, Mihaela; Iov, Cǎtǎlin J.
2015-02-01
Considerable resources have been spent ever since humans became interested in understanding the world around them. Every discovery and scientific advance has taken a tremendous amount of time and money, and sometimes lives; together these define the cost of the discovery and development process. In electronics, the last 20-30 years have seen a boom in technologies and opportunities, with thousands of pieces of equipment developed and placed on the market. The big difference between competitors is now made by what we call time to market. A mobile phone, for instance, has a time to market of around 6 months, and the tendency is to make it even shorter: between the concept and the first sale of the model, no more than 6 months should pass. That is why new approaches are needed, and the one used extensively now is simulation, which we call virtual prototyping. Virtual prototyping takes into account more than the components alone; it also considers other project parameters that would affect the final product, and certified tools can handle such analysis. In this paper we present the case of HyperLynx, a tool developed by Mentor Graphics that assists the hardware designer throughout the design process, from a thermal point of view. A test case board was analyzed at the pre-layout stage and the results are presented.
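A common pre-layout decoupling estimate of the kind such analyses start from (a generic rule of thumb, not the HyperLynx algorithm) is to derive a target power-delivery impedance from the allowed ripple and transient current, then size the capacitance needed to meet it down to a chosen frequency; the numbers below are assumptions.

```python
import math

# Generic pre-layout decoupling estimate (rule of thumb, not HyperLynx's
# method): pick a target PDN impedance from the allowed supply ripple and
# worst-case transient current, then size the bulk capacitance needed to hold
# that impedance down to a given frequency. All values are assumed.

vdd = 1.2            # supply voltage (V)
ripple = 0.05        # allowed ripple fraction
i_transient = 2.0    # worst-case step current (A)
f_min = 1e5          # lowest frequency the capacitors must cover (Hz)

z_target = vdd * ripple / i_transient            # target impedance (ohms)
c_required = 1.0 / (2 * math.pi * f_min * z_target)

print("Z_target = %.1f mOhm, C >= %.1f uF" % (z_target * 1e3, c_required * 1e6))
```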
3D FEM Simulation of Flank Wear in Turning
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio
2011-05-01
This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces, and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion, and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, etc. In some cases these wear mechanisms are described by analytical models as a function of process variables (temperature, pressure and sliding velocity along the cutting surface). These analytical models are suitable to be implemented in FEM codes and they can be utilized to simulate the tool wear. In the present paper a commercial 3D FEM software has been customized to simulate the tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was improved by means of a suitable subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation goes on. Since, for the considered tool-workpiece material couple, the main wear-generating phenomena are the abrasive and the diffusive ones, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate the tool wear development.
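The Usui-type wear-rate law referred to above can be sketched as follows; the constants and the flank-contact values (stress, sliding velocity, temperature) are placeholders, whereas in a customized FEM code they would come from calibration and from the fields computed on the tool flank at each step.

```python
import numpy as np

# Sketch of a Usui-type wear-rate law: dW/dt = A * sigma_n * V_s * exp(-B / T),
# accumulated over cutting time. Constants and field values are placeholders;
# in the FEM customisation they come from calibration and from the simulated
# pressure, sliding velocity, and temperature on the flank.

A, B = 7.8e-9, 5302.0           # illustrative Usui constants
sigma_n = 400e6                  # normal stress on the flank (Pa)   -- assumed
v_s = 2.5                        # sliding velocity (m/s)            -- assumed
T = 1100.0                       # contact temperature (K)           -- assumed

dt, t_end = 0.01, 60.0           # accumulate wear over one minute of cutting
wear = 0.0
for _ in np.arange(0.0, t_end, dt):
    # here the fields are held constant; an FEM loop would update them each step
    wear += A * sigma_n * v_s * np.exp(-B / T) * dt

print("accumulated flank wear measure after %.0f s: %.3e" % (t_end, wear))
```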
NASA Astrophysics Data System (ADS)
Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.
2017-12-01
Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects to meet diverse goals such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. This presentation focuses on the development of a default database for the i-DST that parameterizes water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis comes from the International Stormwater BMP Database and data gleaned from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.
CALL in the Zone of Proximal Development: Novelty Effects and Teacher Guidance
ERIC Educational Resources Information Center
Karlström, Petter; Lundin, Eva
2013-01-01
Digital tools are not always used in the manner their designers had in mind. Therefore, it is not enough to assume that learning through CALL tools occurs in intended ways, if at all. We have studied the use of an enhanced word processor for writing essays in Swedish as a second language. The word processor contained natural language processing…
Performance Analysis of an Actor-Based Distributed Simulation
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1998-01-01
Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
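A minimal sketch of an actor in this sense, an active object with its own thread of control that is driven purely by messages placed in its mailbox rather than by direct method calls, could look like the following; the class and message names are invented for illustration.

```python
import queue
import threading

# Tiny sketch of an "actor": an active object with its own thread of control
# that communicates only through messages, in contrast to a passive object
# invoked via method calls. Message names are made up for illustration.

class Actor(threading.Thread):
    def __init__(self, name):
        super().__init__(daemon=True)
        self.name = name
        self.mailbox = queue.Queue()

    def send(self, msg):
        self.mailbox.put(msg)            # asynchronous: the sender does not block

    def run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                break
            print(f"{self.name} processed: {msg}")

sim_component = Actor("turbine-model")
sim_component.start()
sim_component.send({"advance_to": 10.0})  # messages replace direct method calls
sim_component.send("stop")
sim_component.join()
```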
VIPER: a web application for rapid expert review of variant calls.
Wöste, Marius; Dugas, Martin
2018-06-01
With the rapid development in next-generation sequencing, cost and time requirements for genomic sequencing are decreasing, enabling applications in many areas such as cancer research. Many tools have been developed to analyze genomic variation ranging from single nucleotide variants to whole chromosomal aberrations. As sequencing throughput increases, the number of variants called by such tools also grows. The manual inspection of such calls that is often employed is thus becoming a time-consuming procedure. We developed the Variant InsPector and Expert Rating tool (VIPER) to speed up this process by integrating the Integrative Genomics Viewer into a web application. Analysts can then quickly iterate through variants, apply filters and make decisions based on the generated images and variant metadata. VIPER was successfully employed in analyses with manual inspection of more than 10 000 calls. VIPER is implemented in Java and Javascript and is freely available at https://github.com/MarWoes/viper. marius.woeste@uni-muenster.de. Supplementary data are available at Bioinformatics online.
Intelligent call admission control for multi-class services in mobile cellular networks
NASA Astrophysics Data System (ADS)
Ma, Yufeng; Hu, Xiulin; Zhang, Yunyu
2005-11-01
Scarcity of the spectrum resource and mobility of users make quality of service (QoS) provision a critical issue in mobile cellular networks. This paper presents a fuzzy call admission control scheme to meet the requirement of the QoS. A performance measure is formed as a weighted linear function of new call and handoff call blocking probabilities of each service class. Simulation compares the proposed fuzzy scheme with complete sharing and guard channel policies. Simulation results show that fuzzy scheme has a better robust performance in terms of average blocking criterion.
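The weighted performance measure described above, together with a simple guard-channel admission rule of the kind used as a comparison policy, can be sketched as follows; the weights, class count, and channel numbers are assumptions for illustration rather than values from the paper.

```python
# Sketch of a weighted QoS measure over per-class blocking probabilities and a
# simple guard-channel admission rule for comparison. Weights, class labels,
# and channel counts are illustrative assumptions.

def qos_cost(block_new, block_handoff, w_new=1.0, w_handoff=10.0):
    """Weighted linear combination of per-class new-call and handoff blocking."""
    return sum(w_new * bn + w_handoff * bh
               for bn, bh in zip(block_new, block_handoff))

def admit(call_type, busy_channels, total=30, guard=3):
    """Guard-channel policy: reserve `guard` channels for handoff calls."""
    if call_type == "handoff":
        return busy_channels < total
    return busy_channels < total - guard

print(qos_cost(block_new=[0.05, 0.08], block_handoff=[0.01, 0.02]))
print(admit("new", busy_channels=28), admit("handoff", busy_channels=28))
```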
Vehicle Technology Simulation and Analysis Tools | Transportation Research | NREL
NREL vehicle technology simulation and analysis tools address vehicle technologies with the potential to achieve significant fuel savings and emission reductions; they include the Automotive Deployment Options Projection Tool (ADOPT), a modeling tool that estimates vehicle technology...
Cook, Tessa S; Hernandez, Jessica; Scanlon, Mary; Langlotz, Curtis; Li, Chun-Der L
2016-07-01
Despite its increasing use in training other medical specialties, high-fidelity simulation to prepare diagnostic radiology residents for call remains an underused educational resource. To attempt to characterize the barriers toward adoption of this technology, we conducted a survey of academic radiologists and radiology trainees. An Institutional Review Board-approved survey was distributed to the Association of University Radiologists members via e-mail. Survey results were collected electronically, tabulated, and analyzed. A total of 68 survey responses representing 51 programs were received from program directors, department chairs, chief residents, and program administrators. The most common form of educational activity for resident call preparation was lectures. Faculty supervised "baby call" was also widely reported. Actual simulated call environments were quite rare with only three programs reporting this type of educational activity. Barriers to the use of simulation include lack of faculty time, lack of faculty expertise, and lack of perceived need. High-fidelity simulation can be used to mimic the high-stress, high-stakes independent call environment that the typical radiology resident encounters during the second year of training, and can provide objective data for program directors to assess the Accreditation Council of Graduate Medical Education milestones. We predict that this technology will begin to supplement traditional diagnostic radiology teaching methods and to improve patient care and safety in the next decade. Published by Elsevier Inc.
Beaked Whales Respond to Simulated and Actual Navy Sonar
2011-03-14
...acoustic exposure and behavioral reactions of beaked whales to one controlled exposure each of simulated military sonar, killer whale calls, and band-limited noise. The beaked whales reacted to these three sound playbacks at sound pressure...
Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Paik, Young-Ki
2017-12-01
Mass spectrometry (MS) is a widely used proteome analysis tool for biomedical science. In an MS-based bottom-up proteomic approach to protein identification, sequence database (DB) searching has been routinely used because of its simplicity and convenience. However, searching a sequence DB with multiple variable modification options can increase processing time and false-positive errors in large and complicated MS data sets. Spectral library searching is an alternative solution, avoiding the limitations of sequence DB searching and allowing the detection of more peptides with high sensitivity. Unfortunately, this technique has less proteome coverage, resulting in limitations in the detection of novel and whole peptide sequences in biological samples. To solve these problems, we previously developed the "Combo-Spec Search" method, which manually uses multiple reference and simulated spectral library searches to analyze whole proteomes in a biological sample. In this study, we have developed a new analytical interface tool called "Epsilon-Q" to enhance the functions of both the Combo-Spec Search method and label-free protein quantification. Epsilon-Q automatically performs multiple spectral library searches, class-specific false-discovery rate control, and result integration. It has a user-friendly graphical interface and demonstrates good performance in identifying and quantifying proteins by supporting standard MS data formats and spectrum-to-spectrum matching powered by SpectraST. Furthermore, when the Epsilon-Q interface is combined with the Combo-Spec Search method, a combination called the Epsilon-Q system, it shows a synergistic effect, outperforming other sequence DB search engines for identifying and quantifying low-abundance proteins in biological samples. The Epsilon-Q system can be a versatile tool for comparative proteome analysis based on multiple spectral libraries and label-free quantification.
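Spectrum-to-spectrum matching of the kind used by spectral library search engines can be illustrated with a simple binned, square-root-intensity dot product; this is a generic sketch, not the SpectraST or Epsilon-Q scoring code, and the peak lists are made up.

```python
import numpy as np

# Generic sketch of spectrum-to-spectrum matching: bin both spectra, take
# square-root intensities, and score with a normalized dot product. Not the
# Epsilon-Q or SpectraST scoring code; the peak lists are invented.

def binned(peaks, bin_width=1.0, max_mz=2000.0):
    vec = np.zeros(int(max_mz / bin_width))
    for mz, inten in peaks:
        vec[int(mz / bin_width)] += inten
    return np.sqrt(vec)                   # sqrt damps dominant peaks

def dot_score(query, library):
    q, l = binned(query), binned(library)
    denom = np.linalg.norm(q) * np.linalg.norm(l)
    return float(q @ l / denom) if denom else 0.0

query_spectrum = [(175.1, 300.0), (488.3, 900.0), (617.3, 450.0)]
library_spectrum = [(175.1, 280.0), (488.2, 950.0), (617.4, 400.0)]
print(round(dot_score(query_spectrum, library_spectrum), 3))
```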
Geomega: MEGAlib's Uniform Geometry and Detector Description Tool for Geant3, MGGPOD, and Geant4
NASA Astrophysics Data System (ADS)
Zoglauer, Andreas C.; Andritschke, R.; Schopper, F.; Wunderer, C. B.
2006-09-01
The Medium Energy Gamma-ray Astronomy library MEGAlib is a set of software tools for the analysis of low to medium energy gamma-ray telescopes, especially Compton telescopes. It comprises all necessary data analysis steps from simulation/measurements via event reconstruction to image reconstruction and enables detailed performance assessments. In the energy range of Compton telescopes (with energy deposits from a few keV up to hundreds of MeV), the Geant Monte-Carlo software packages (Geant3 with its MGGPOD extension as well as Geant4) are widely used. Since each tool has its unique advantages, MEGAlib contains a geometry and detector description library, called Geomega, which allows to use those tools in a uniform way. It incorporates the versatile 3D display facilities available within the ROOT libraries. The same geometry, material, trigger, and detector description can be used for all simulation tools as well as for the later event analysis in the MEGAlib framework. This is done by converting the MEGAlib geometry into the Geant3 or MGGPOD format or directly linking the Geomega library into Geant4. The geometry description can handle most (and can be extended to handle all) volumes common to Geant3, Geant4 and ROOT. In Geomega a list of features is implemented which are especially useful for optimizing detector geometries: It allows to define constants, can handle mathematical operations, enables volume scaling, checks for overlaps of detector volumes, does mass calculations, etc. Used in combination with MEGAlib, Geomega enables discretization, application of detector noise, thresholds, various trigger conditions, defective pixels, etc. The highly modular and completely object-oriented library is written in C++ and based on ROOT. It has been originally developed for the tracking Compton scattering and Pair creation telescope MEGA and has been successfully applied to a wide variety of telescopes, such as ACT, NuSTAR, or GRI.
NASA Technical Reports Server (NTRS)
Bledsoe, Kristin
2013-01-01
The Crew Exploration Vehicle Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted or released from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish an aircraft release point that will ensure that the article and all items released from it will land in safe locations. A new footprint predictor tool, called Sasquatch, was created in MATLAB. This tool takes in a simulated trajectory for the test article, information about all released objects, and atmospheric wind data (simulated or actual) to calculate the trajectories of the released objects. Dispersions are applied to the landing locations of those objects, taking into account the variability of winds, aircraft release point, and object descent rate. Sasquatch establishes a payload release point (e.g., where the payload will be extracted from the carrier aircraft) that will ensure that the payload and all objects released from it will land in a specified cleared area. The landing locations (the final points in the trajectories) are plotted on a map of the test range. Sasquatch was originally designed for CPAS drop tests and includes extensive information about both the CPAS hardware and the primary test range used for CPAS testing. However, it can easily be adapted for more complex CPAS drop tests, other NASA projects, and commercial partners. CPAS has developed the Sasquatch footprint tool to ensure range safety during parachute drop tests. Sasquatch is well correlated to test data and continues to ensure the safety of test personnel as well as the safe recovery of all equipment. The tool will continue to be modified based on new test data, improving predictions and providing added capability to meet the requirements of more complex testing.
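The core drift calculation a footprint predictor performs can be illustrated with a toy example (a sketch, not Sasquatch itself): an object released at altitude descends at a constant rate through stacked wind layers, and its landing point is the release point plus the drift accumulated in each layer; the layer winds and descent rate below are assumed.

```python
# Toy wind-drift calculation of the sort a footprint predictor performs
# (a sketch, not Sasquatch). An object released at altitude descends at a
# constant rate through stacked wind layers; its landing point is the release
# point plus the drift accumulated in each layer. Values are assumed.

def landing_point(release_xy, release_alt, descent_rate, wind_layers):
    """wind_layers: list of (layer_top_m, layer_bottom_m, wind_east_mps, wind_north_mps)."""
    x, y = release_xy
    for top, bottom, we, wn in wind_layers:
        top = min(top, release_alt)
        if top <= bottom:
            continue
        dt = (top - bottom) / descent_rate      # time spent falling through the layer
        x += we * dt
        y += wn * dt
    return x, y

layers = [(9000, 6000, 12.0, -3.0), (6000, 3000, 8.0, 2.0), (3000, 0, 4.0, 5.0)]
print(landing_point(release_xy=(0.0, 0.0), release_alt=7500.0,
                    descent_rate=7.5, wind_layers=layers))
```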
NASA Astrophysics Data System (ADS)
Rouillard, A. P.; Lavraud, B.; Génot, V.; Bouchemit, M.; Dufourg, N.; Plotnikov, I.; Pinto, R. F.; Sanchez-Diaz, E.; Lavarra, M.; Penou, M.; Jacquey, C.; André, N.; Caussarieu, S.; Toniutti, J.-P.; Popescu, D.; Buchlin, E.; Caminade, S.; Alingery, P.; Davies, J. A.; Odstrcil, D.; Mays, L.
2017-11-01
The remoteness of the Sun and the harsh conditions prevailing in the solar corona have so far limited the observational data used in the study of solar physics to remote-sensing observations taken either from the ground or from space. In contrast, the 'solar wind laboratory' is directly measured in situ by a fleet of spacecraft measuring the properties of the plasma and magnetic fields at specific points in space. Since 2007, the solar-terrestrial relations observatory (STEREO) has been providing images of the solar wind that flows between the solar corona and spacecraft making in-situ measurements. This has allowed scientists to directly connect processes imaged near the Sun with the subsequent effects measured in the solar wind. This new capability prompted the development of a series of tools and techniques to track heliospheric structures through space. This article presents one of these tools, a web-based interface called the 'Propagation Tool' that offers an integrated research environment to study the evolution of coronal and solar wind structures, such as Coronal Mass Ejections (CMEs), Corotating Interaction Regions (CIRs) and Solar Energetic Particles (SEPs). These structures can be propagated outward from the Sun to planets and spacecraft situated in the inner and outer heliosphere or, alternatively, traced back inward from those locations towards the Sun. In this paper, we present the global architecture of the tool, discuss some of the assumptions made to simulate the evolution of the structures and show how the tool connects to different databases.
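As a rough illustration of the simplest class of assumption such a tool can make (the Propagation Tool itself supports more elaborate schemes), the sketch below propagates a structure radially at constant speed and reports its arrival time at a target heliocentric distance. The starting radius, speed, and planetary distances are assumed example values.

```python
# Minimal ballistic sketch (simplified relative to the actual Propagation Tool):
# constant radial speed from a starting radius near the Sun to a target distance.
AU_KM = 1.495978707e8
R_SUN_KM = 6.957e5

def arrival_time_hours(speed_km_s, target_au, start_rs=21.5):
    """Hours for a structure to travel from start_rs solar radii to target_au."""
    distance_km = target_au * AU_KM - start_rs * R_SUN_KM
    return distance_km / speed_km_s / 3600.0

# Example: a 600 km/s front from 21.5 Rs (roughly a coronagraph outer edge)
# reaching Earth (1 AU) and Mars (assumed here at 1.52 AU).
for name, r in [("Earth", 1.0), ("Mars", 1.52)]:
    print(f"{name}: {arrival_time_hours(600.0, r):.1f} h")
```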
NASA Technical Reports Server (NTRS)
Armstrong, Herbert B.; Roske-Hofstrand, Renate J.
1989-01-01
This paper discusses the use of computer-assisted instruction and flight simulations to enhance procedural and perceptual motor task training. Attention is called to the fact that incorporating the accident and incident data contained in reports filed with the Aviation Safety Reporting System (ASRS) would provide a valuable training tool whose lessons the learner could apply to other situations. The need to segment the events is emphasized; this would make it possible to modify events in order to suit the needs of the training environment. Methods were developed for designing meaningful runway-incursion scenarios on the basis of analysis of ASRS reports. It is noted that, while the development of interactive training tools using the ASRS and other databases holds much promise, the design and production of interactive video programs and laser disks are very expensive. It is suggested that this problem may be overcome by sharing the costs of production to develop a library of materials available to a broad range of users.
Generic Airspace Concepts and Research
NASA Technical Reports Server (NTRS)
Mogford, Richard H.
2010-01-01
The purpose of this study was to evaluate methods for reducing the training and memorization required to manage air traffic in mid-term, Next Generation Air Transportation System (NextGen) airspace. We contrasted the performance of controllers using a sector information display and NextGen automation tools while working with familiar and unfamiliar sectors. The airspace included five sectors from Oakland and Salt Lake City Centers configured as a "generic center" called "West High Center." The Controller Information Tool was used to present essential information for managing these sectors. The Multi Aircraft Control System air traffic control simulator provided data link and conflict detection and resolution. There were five experienced air traffic controller participants. Each was familiar with one or two of the five sectors, but not the others. The participants rotated through all five sectors during the ten data collection runs. The results addressing workload, traffic management, and safety, as well as controller and observer comments, supported the generic sector concept. The unfamiliar sectors were comparable to the familiar sectors on all relevant measures.
NASA Astrophysics Data System (ADS)
Azami, Hamed; Escudero, Javier
2017-01-01
Multiscale entropy (MSE) is an appealing tool to characterize the complexity of time series over multiple temporal scales. Recent developments in the field have tried to extend the MSE technique in different ways. Building on these trends, we propose the so-called refined composite multivariate multiscale fuzzy entropy (RCmvMFE) whose coarse-graining step uses variance (RCmvMFEσ²) or mean (RCmvMFEμ). We investigate the behavior of these multivariate methods on multichannel white Gaussian and 1/f noise signals, and two publicly available biomedical recordings. Our simulations demonstrate that RCmvMFEσ² and RCmvMFEμ lead to more stable results and are less sensitive to the signals' length in comparison with the other existing multivariate multiscale entropy-based methods. The classification results also show that using both the variance and mean in the coarse-graining step offers complexity profiles with complementary information for biomedical signal analysis. We also made freely available all the Matlab codes used in this paper.
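The distinguishing ingredient described above is the coarse-graining statistic. A minimal Python sketch of that single step is given below (the authors released Matlab code; this is only an illustration): non-overlapping windows of length tau are collapsed to either their mean or their variance before the entropy is computed. The refined composite variant additionally repeats this for every possible window offset and averages the results.

```python
# Sketch of the coarse-graining step at scale tau, using either the mean
# (conventional MSE) or the variance (the sigma^2 variant described above).
import numpy as np

def coarse_grain(x, tau, statistic="mean"):
    """Collapse non-overlapping windows of length tau into one value each."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // tau) * tau
    windows = x[:n].reshape(-1, tau)
    return windows.mean(axis=1) if statistic == "mean" else windows.var(axis=1)

rng = np.random.default_rng(0)
white = rng.standard_normal(3000)
for tau in (1, 5, 10):
    print(tau,
          coarse_grain(white, tau, "mean").std(),   # shrinks with tau for white noise
          coarse_grain(white, tau, "var").mean())   # stays near the signal variance
```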
Communication library for run-time visualization of distributed, asynchronous data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowlan, J.; Wightman, B.T.
1994-04-01
In this paper we present a method for collecting and visualizing data generated by a parallel computational simulation during run time. Data distributed across multiple processes is sent across parallel communication lines to a remote workstation, which sorts and queues the data for visualization. We have implemented our method in a set of tools called PORTAL (for Parallel aRchitecture data-TrAnsfer Library). The tools comprise generic routines for sending data from a parallel program (callable from either C or FORTRAN), a semi-parallel communication scheme currently built upon Unix Sockets, and a real-time connection to the scientific visualization program AVS. Our method is most valuable when used to examine large datasets that can be efficiently generated and do not need to be stored on disk. The PORTAL source libraries, detailed documentation, and a working example can be obtained by anonymous ftp from info.mcs.anl.gov from the file portal.tar.Z from the directory pub/portal.
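A loopback Python sketch of the same pattern is shown below: a simulation loop sends length-prefixed, tagged array fragments over a socket, and the receiving side sorts them by time step and rank before display. This is only an illustration of the idea; PORTAL's actual C/Fortran routines, message framing, and AVS connection are not reproduced, and the field names are made up.

```python
# Illustrative sketch only (PORTAL itself is a C/Fortran library streaming to
# AVS over Unix sockets): send tagged array fragments, sort them for display.
import json, socket, struct, threading

def send_field(sock, step, rank, name, values):
    """Frame one distributed-array fragment as length-prefixed JSON."""
    payload = json.dumps({"step": step, "rank": rank, "name": name,
                          "data": values}).encode()
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def receiver(server_sock, n_messages, out):
    conn, _ = server_sock.accept()
    with conn:
        for _ in range(n_messages):                # messages are small here;
            (length,) = struct.unpack("!I", conn.recv(4))   # a robust reader
            out.append(json.loads(conn.recv(length)))        # would loop on recv
    out.sort(key=lambda m: (m["step"], m["rank"]))  # queue/sort for visualization

server = socket.socket()
server.bind(("127.0.0.1", 0)); server.listen(1)
received = []
t = threading.Thread(target=receiver, args=(server, 6, received)); t.start()

with socket.create_connection(server.getsockname()) as sock:
    for step in range(3):                           # pretend simulation time loop
        for rank in range(2):                       # two "processes"
            send_field(sock, step, rank, "temperature", [step + rank * 0.1])
t.join(); server.close()
print([(m["step"], m["rank"]) for m in received])
```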
A Data-Driven Diagnostic Framework for Wind Turbine Structures: A Holistic Approach
Bogoevska, Simona; Spiridonakos, Minas; Chatzi, Eleni; Dumova-Jovanoska, Elena; Höffer, Rudiger
2017-01-01
The complex dynamics of operational wind turbine (WT) structures challenges the applicability of existing structural health monitoring (SHM) strategies for condition assessment. At the center of Europe’s renewable energy strategic planning, WT systems call for implementation of strategies that may describe the WT behavior in its complete operational spectrum. The framework proposed in this paper relies on the symbiotic treatment of acting environmental/operational variables and the monitored vibration response of the structure. The approach aims at accurate simulation of the temporal variability characterizing the WT dynamics, and subsequently at the tracking of the evolution of this variability in a longer-term horizon. The bi-component analysis tool is applied to long-term data, collected as part of continuous monitoring campaigns on two actual operating WT structures located in different sites in Germany. The obtained data-driven structural models verify the potential of the proposed strategy for development of an automated SHM diagnostic tool. PMID:28358346
Optimum Design of LLC Resonant Converter using Inductance Ratio (Lm/Lr)
NASA Astrophysics Data System (ADS)
Palle, Kowstubha; Krishnaveni, K.; Ramesh Reddy, Kolli
2017-06-01
The main benefits of LLC resonant dc/dc converters over conventional series and parallel resonant converters are light-load regulation, lower circulating currents, a larger bandwidth for zero-voltage switching, and less tuning of the switching frequency for controlled output. A unique analytical tool, called fundamental harmonic approximation with peak gain adjustment, is used for designing the converter. In this paper, an optimum design of the converter is proposed by considering three different design criteria with different values of inductance ratio (Lm/Lr) to achieve good efficiency at high input voltage. The optimum design includes the analysis of operating range, switching frequency range, primary-side losses of a switch, and stability. The analysis is carried out with simulation using software tools such as MATLAB and PSIM. The performance of the optimized design is demonstrated for a design specification of 12 V, 5 A output operating with an input voltage range of 300-400 V using the FSFR2100 IC of Texas Instruments.
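The fundamental harmonic approximation reduces the tank to a gain expression in the normalized frequency fn = fs/fr, the quality factor Q, and the inductance ratio Ln = Lm/Lr, which is what makes sweeping Lm/Lr convenient. The sketch below evaluates that standard FHA gain magnitude for a few candidate ratios; the specific Ln and Q values are illustrative, not the paper's design numbers.

```python
# Sketch of the fundamental-harmonic-approximation (FHA) voltage gain used to
# compare inductance ratios Ln = Lm/Lr; peak-gain adjustment and the paper's
# exact design values are not reproduced here.
import numpy as np

def llc_gain(fn, Ln, Q):
    """|M| = 1 / sqrt((1 + (1/Ln)*(1 - 1/fn^2))^2 + Q^2*(fn - 1/fn)^2)."""
    real = 1.0 + (1.0 / Ln) * (1.0 - 1.0 / fn**2)
    imag = Q * (fn - 1.0 / fn)
    return 1.0 / np.sqrt(real**2 + imag**2)

fn = np.linspace(0.4, 1.5, 200)            # switching freq / resonant freq
for Ln in (3.0, 5.0, 8.0):                 # candidate Lm/Lr ratios (assumed)
    print(f"Ln={Ln}: peak gain ~ {llc_gain(fn, Ln, Q=0.4).max():.2f}")
```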
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, R.D.
The results of a research effort to develop a multiphase, naturally fractured, lenticular reservoir simulator are presented. The simulator possesses the capability of investigating the effects of non-Darcy flow, the Klinkenberg effect, and transient multiphase wellbore storage for wells with finite and infinite conductivity fractures. The simulator has been utilized to simulate actual pressure transient data for gas wells associated with the United States Department of Energy, Western Gas Sands Project, MWX Experiments. The results of these simulations are contained in the report, as are simulation results for hypothetical wells producing under multiphase flow conditions. In addition to the reservoir simulator development and the theoretical and field case studies, the results of an experimental program to investigate multiphase non-Darcy flow coefficients (inertial resistance coefficients or beta factors, as they are sometimes called) are also presented. The experimental data were obtained for non-Darcy flow in porous and fractured media. The results clearly indicate the dependence of the non-Darcy flow coefficient upon liquid saturation. Where appropriate, comparisons are made against data available in the open literature. In addition, a theoretical development of a correlation to predict non-Darcy flow coefficients as a function of effective gas permeability, liquid saturation, and porosity is presented. The results presented in this report will provide scientists and engineers tools to investigate well performance data and production trends for wells completed in lenticular, naturally fractured formations producing under non-Darcy, multiphase conditions. 65 refs., 57 figs., 15 tabs.
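The non-Darcy behavior discussed above is conventionally written as a Forchheimer-type pressure gradient with an inertial (beta) term. The sketch below shows that relation together with a purely hypothetical saturation-dependent beta correlation standing in for the report's actual correlation; all coefficients and property values are illustrative.

```python
# Sketch of the Forchheimer (non-Darcy) pressure-gradient relation:
# dp/dx = (mu / k_gas) * v + beta * rho * v^2.
# The beta correlation below is a hypothetical placeholder, not the report's.
def forchheimer_gradient(v, mu, k_gas, beta, rho):
    """Pressure gradient [Pa/m] for superficial gas velocity v [m/s]."""
    return (mu / k_gas) * v + beta * rho * v**2

def beta_correlation(k_gas, porosity, liquid_sat, a=1.0e-3, b=1.5, c=2.0):
    """Hypothetical form: beta grows as effective gas permeability and porosity
    shrink and as liquid saturation rises (a, b, c are illustrative constants)."""
    return a / (k_gas**b * (porosity * (1.0 - liquid_sat))**c)

k_gas = 1e-15            # m^2 (~1 microdarcy, a tight lenticular sand, assumed)
beta = beta_correlation(k_gas, porosity=0.08, liquid_sat=0.4)
print("dp/dx [Pa/m]:", forchheimer_gradient(v=1e-3, mu=2e-5, k_gas=k_gas,
                                            beta=beta, rho=80.0))
```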
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...
Proceedings of the 1987 conference on tools for the simulation profession
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hawkins, R.; Klukis, K.
1987-01-01
This book covers the proceedings of the 1987 conference on tools for the simulation profession. Some of the topics are: SIMULACT: a generic tool for simulating distributed systems; ESL language simulation of spacecraft batteries; and Trends in global cadmium levels from increased use of fossil fuels.
DOT National Transportation Integrated Search
2003-04-01
Introduction Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...
IgSimulator: a versatile immunosequencing simulator.
Safonova, Yana; Lapidus, Alla; Lill, Jennie
2015-10-01
The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic since the gold standard datasets that are needed to validate these tools are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool that addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can be easily adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. safonova.yana@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Risk Reduction and Training using Simulation Based Tools - 12180
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Irin P.
2012-07-01
Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although important steps in the direction of using simulation based tools for the nuclear domain, the applications described here represent only a small cross section of possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domains, this may mean creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shahidehpour, Mohammad
Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. Specifically, WINS provides the following superiorities: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connection, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision results are often text-based demonstrations. WINS includes a powerful visualization tool and user interface capability for transmission analyses, planning, and assessment, which will be of great interest to power market participants, power system planners and operators, and state and federal regulatory entities; and (3) WINS can handle extended transmission models for wind integration studies. WINS models include limitations on transmission flow as well as bus voltage for analyzing power system states. The existing decision tools often consider transmission flow constraints (dc power flow) alone, which could result in the over-utilization of existing resources when analyzing wind integration. WINS can be used to assist power market participants including transmission companies, independent system operators, power system operators in vertically integrated utilities, wind energy developers, and regulatory agencies to analyze economics, security, and reliability of various options for wind integration including transmission upgrades and the planning of new transmission facilities. WINS can also be used by industry for the offline training of reliability and operation personnel when analyzing wind integration uncertainties, identifying critical spots in power system operation, analyzing power system vulnerabilities, and providing credible decisions for examining operation and planning options for wind integration.
Research in this project on wind integration included (1) Development of WINS; (2) Transmission Congestion Analysis in the Eastern Interconnection; (3) Analysis of 2030 Large-Scale Wind Energy Integration in the Eastern Interconnection; (4) Large-scale Analysis of 2018 Wind Energy Integration in the Eastern U.S. Interconnection. The research resulted in 33 papers, 9 presentations, 9 PhD degrees, 4 MS degrees, and 7 awards. The education activities in this project on wind energy included (1) Wind Energy Training Facility Development; (2) Wind Energy Course Development.
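For readers unfamiliar with the "dc power flow" screening mentioned above, the sketch below shows the linearized calculation on a 3-bus example: build the susceptance matrix from line reactances, solve for bus angles with a slack bus fixed, and read off line flows. The network data are invented; WINS itself layers voltage limits and much larger models on top of this kind of calculation.

```python
# Minimal DC power-flow sketch of the transmission-flow screening the abstract
# contrasts with WINS's fuller AC-constrained models. Bus 0 is the slack.
import numpy as np

lines = [(0, 1, 0.1), (1, 2, 0.2), (0, 2, 0.25)]   # (from, to, reactance in pu)
injections = np.array([0.0, 1.5, -1.5])            # pu; wind at bus 1, load at bus 2

n = 3
B = np.zeros((n, n))
for i, j, x in lines:                               # build the susceptance matrix
    b = 1.0 / x
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])   # slack angle fixed at 0

for i, j, x in lines:
    print(f"flow {i}->{j}: {(theta[i] - theta[j]) / x:+.3f} pu")
```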
Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment
NASA Technical Reports Server (NTRS)
Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.
2007-01-01
Space science models are an essential component of an integrated data environment. They are indispensable tools for making effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them to the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata and a user-friendly library of model-output analysis routines that can be called from any language that supports calling C. The CCMC is developing data interpolation tools that make it possible to present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.
Development and Testing of Control Laws for the Active Aeroelastic Wing Program
NASA Technical Reports Server (NTRS)
Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John
2005-01-01
The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using a software design tool called CONDUIT, which employs a multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this new software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization of this specific software design tool. Predicted performance is also compared to results from flight.
Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander
2017-01-01
Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.
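The PRx used as the comparison standard is, in essence, a moving correlation between slow waves of intracranial and arterial pressure. The sketch below computes such an index on synthetic data; the block-averaging length, window size, and the synthetic "impaired" signals are assumptions chosen only to illustrate the calculation, not the authors' implementation.

```python
# Sketch of a PRx-style index (not the authors' sca code): a moving Pearson
# correlation between block-averaged ICP and arterial pressure. Window and
# averaging choices below are common conventions, assumed here.
import numpy as np

def prx(icp, abp, avg_len=10, window=30):
    """Average signals in avg_len-sample blocks, then correlate over a sliding
    window of `window` blocks; values near +1 suggest impaired autoregulation."""
    def block_mean(x):
        n = (len(x) // avg_len) * avg_len
        return np.asarray(x[:n], float).reshape(-1, avg_len).mean(axis=1)
    i, a = block_mean(icp), block_mean(abp)
    return np.array([np.corrcoef(i[k:k + window], a[k:k + window])[0, 1]
                     for k in range(len(i) - window + 1)])

rng = np.random.default_rng(1)
abp = 80 + rng.standard_normal(6000).cumsum() * 0.05        # slow pressure waves
icp = 12 + 0.3 * (abp - 80) + rng.standard_normal(6000)     # pressure-passive ICP
print("median PRx:", np.median(prx(icp, abp)))
```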
A human-hearing-related prediction tool for soundscapes and community noise
NASA Astrophysics Data System (ADS)
Genuit, Klaus
2002-11-01
There are several methods of calculation available for the prediction of the A-weighted sound-pressure level of environmental noise, which are, however, not suitable for a qualified prediction of the residents' annoyance and physiological strain. The subjectively felt noise quality does not only depend on the A-weighted sound-pressure level, but also on other psychoacoustical parameters, such as loudness, roughness, sharpness, etc. In addition to these physical and psychoacoustical aspects of noise, the so-called psychological or cognitive aspects have to be considered, too, which means that the listeners' expectations, their mental attitude, as well as the information content of the noise finally influence the noise quality perceived by the individual persons. Within the scope of a research project SVEN (Sound Quality of Vehicle Exterior Noise), which is promoted by the EC, a new tool has been developed which allows a binaural simulation and prediction of the environmental noise to evaluate the influence of different contributions by the sound events with respect to the psychoacoustical parameters, the spatial distribution, movement, and frequency. By means of this tool it is now possible to consider completely new aspects regarding the audible perception of noise when establishing a soundscape or when planning community noise.
Learning in Authentic Earth and Planetary Contexts
NASA Astrophysics Data System (ADS)
Fergusson, J. A.; Oliver, C. A.
2006-12-01
A Virtual Field Trip project has been developed in collaboration with NASA Learning Technologies to allow students, internationally, to accompany scientists on a field trip to the Pilbara region of Western Australia to debate the relevance of ancient structures called stromatolites to the origins of life on Earth and the search for life on Mars. The project was planned with the aim of exposing high school students to `science in the making', including exposure to the ongoing debate and uncertainties involved in scientific research. The development of the project stemmed from both research-based and anecdotal evidence that current science education programs are not providing secondary students with a good understanding of the processes of science. This study seeks to examine the effectiveness of student use of the tools to increase awareness of the processes of science and to evaluate the effectiveness of the tools in terms of student learning. The literature reports that there is a need for learning activities to be conducted within meaningful contexts. The virtual field trip tools create an environment that simulates key elements in the scientific process. Such an approach allows students to learn by doing, to work like scientists and apply their learning in an authentic context.
Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu Dianliang; Zhu Hongmin; Shanghai Key Laboratory of Advance Manufacturing Environment
Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operation. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment of automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods of manual, semi-automatic, and automatic tools and equipment are discussed. Finally, the application in assembly simulation of the rear suspension and front suspension of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable in the interactive simulation of various tools and equipment, and can also be used to support assembly process planning in a virtual environment.
Challenges of NDE Simulation Tool
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.
2015-01-01
Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper will also discuss examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.
Artificial organisms as tools for the development of psychological theory: Tolman's lesson.
Miglino, Orazio; Gigliotta, Onofrio; Cardaci, Maurizio; Ponticorvo, Michela
2007-12-01
In the 1930s and 1940s, Edward Tolman developed a psychological theory of spatial orientation in rats and humans. He expressed his theory as an automaton (the "schematic sowbug") or what today we would call an "artificial organism." With the technology of the day, he could not implement his model. Nonetheless, he used it to develop empirical predictions which he tested with animals in the laboratory. This way of proceeding was in line with scientific practice dating back to Galileo. The way psychologists use artificial organisms in their work today breaks with this tradition. Modern "artificial organisms" are constructed a posteriori, working from experimental or ethological observations. As a result, researchers can use them to confirm a theoretical model or to simulate its operation. But they make no contribution to the actual building of models. In this paper, we try to return to Tolman's original strategy: implementing his theory of "vicarious trial and error" in a simulated robot, forecasting the robot's behavior and conducting experiments that verify or falsify these predictions.
Overview of the relevant CFD work at Thiokol Corporation
NASA Technical Reports Server (NTRS)
Chwalowski, Pawel; Loh, Hai-Tien
1992-01-01
An in-house developed proprietary advanced computational fluid dynamics code called SHARP (Trademark) is a primary tool for many flow simulations and design analyses. The SHARP code is a time dependent, two dimensional (2-D) axisymmetric numerical solution technique for the compressible Navier-Stokes equations. The solution technique in SHARP uses a vectorizable implicit, second order accurate in time and space, finite volume scheme based on an upwind flux-difference splitting of a Roe-type approximated Riemann solver, Van Leer's flux vector splitting, and a fourth order artificial dissipation scheme with a preconditioning to accelerate the flow solution. Turbulence is simulated by an algebraic model, and ultimately the kappa-epsilon model. Some other capabilities of the code are 2-D two-phase Lagrangian particle tracking and cell blockages. Extensive development and testing has been conducted on the 3-D version of the code with flow, combustion, and turbulence interactions. The emphasis here is on the specific applications of SHARP in Solid Rocket Motor design. Information is given in viewgraph form.
Fasel, J H; Gingins, P; Kalra, P; Magnenat-Thalmann, N; Baur, C; Cuttat, J F; Muster, M; Gailloud, P
1997-01-01
Endoscopic surgery, also called minimally invasive surgery, is presumed drastically to reduce postoperative morbidity and thus to offer both human and economic benefits. For the surgeon, however, this approach leads to a number of gestural challenges that require extensive training to be mastered. In order to replace experimentation on animals and patients, we developed a simulator for endoscopic surgery. To achieve this goal, a first step was to develop a working prototype, a "standard patient," on which the informatic and microengineering tools could be validated. We used the visible man dataset for this purpose. The external shape of the visible man's liver, his biliary passages, and his extrahepatic portal system turned out to be fully within the standard pattern of normal anatomy. Anatomic variations were observed in the intrahepatic right portal vein, the hepatic veins, and the arterial blood supply to the liver. Thus, the visible man dataset reveals itself to be well suited for the simulation of minimally invasive surgical operation such as endoscopic cholecystectomy.
Modeling, simulation, and estimation of optical turbulence
NASA Astrophysics Data System (ADS)
Formwalt, Byron Paul
This dissertation documents three new contributions to simulation and modeling of optical turbulence. The first contribution is the formalization, optimization, and validation of a modeling technique called successively conditioned rendering (SCR). The SCR technique is empirically validated by comparing the statistical error of random phase screens generated with the technique. The second contribution is the derivation of the covariance delineation theorem, which provides theoretical bounds on the error associated with SCR. It is shown empirically that the theoretical bound may be used to predict relative algorithm performance. Therefore, the covariance delineation theorem is a powerful tool for optimizing SCR algorithms. For the third contribution, we introduce a new method for passively estimating optical turbulence parameters, and demonstrate the method using experimental data. The technique was demonstrated experimentally, using a 100 m horizontal path at 1.25 m above sun-heated tarmac on a clear afternoon. For this experiment, we estimated C_n^2 ≈ 6.01 · 10^-9 m^(-2/3), l_0 ≈ 17.9 mm, and L_0 ≈ 15.5 m.
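For context on what a path-averaged structure constant implies, the sketch below converts a C_n^2 value over a uniform horizontal path into the Fried parameter using the standard plane-wave relation. The wavelength and the C_n^2 value in the example are generic assumptions rather than the dissertation's numbers.

```python
# Sketch relating a path-averaged Cn^2 to the Fried parameter r0 for a uniform
# horizontal path, using the plane-wave form r0 = (0.423 k^2 Cn2 L)^(-3/5).
import math

def fried_parameter_m(cn2, path_m, wavelength_m=1.55e-6):
    k = 2.0 * math.pi / wavelength_m        # optical wavenumber
    return (0.423 * k**2 * cn2 * path_m) ** (-3.0 / 5.0)

# Strong daytime near-ground turbulence over a 100 m path (assumed Cn^2 value).
print("r0 ~ %.1f cm" % (100.0 * fried_parameter_m(1e-13, 100.0)))
```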
An organizational metamodel for hospital emergency departments.
Kaptan, Kubilay
2014-10-01
I introduce an organizational model describing the response of the hospital emergency department. The hybrid simulation/analytical model (called a "metamodel") can estimate a hospital's capacity and dynamic response in real time and incorporate the influence of damage to structural and nonstructural components on the organizational ones. The waiting time is the main parameter of response and is used to evaluate the disaster resilience of health care facilities. Waiting time behavior is described by using a double exponential function and its parameters are calibrated based on simulated data. The metamodel covers a large range of hospital configurations and takes into account hospital resources in terms of staff and infrastructures, operational efficiency, and the possible existence of an emergency plan; maximum capacity; and behavior both in saturated and overcapacitated conditions. The sensitivity of the model to different arrival rates, hospital configurations, and capacities and the technical and organizational policies applied during and before a disaster were investigated. This model becomes an important tool in the decision process either for the engineering profession or for policy makers.
Schroeder, Indra
2015-01-01
Abstract A main ingredient for the understanding of structure/function correlates of ion channels is the quantitative description of single-channel gating and conductance. However, a wealth of information provided by fast current fluctuations beyond the temporal resolution of the recording system is often ignored, even though it is close to the time window accessible to molecular dynamics simulations. This kind of current fluctuation poses a special technical challenge, because individual opening/closing or blocking/unblocking events cannot be resolved, and the resulting averaging over undetected events decreases the single-channel current. Here, I briefly summarize the history of fast-current fluctuation analysis and focus on the so-called “beta distributions.” This tool exploits characteristics of current fluctuation-induced excess noise on the current amplitude histograms to reconstruct the true single-channel current and kinetic parameters. A guideline for the analysis and recent applications demonstrate that a construction of theoretical beta distributions by Markov model simulations offers maximum flexibility as compared to analytical solutions. PMID:26368656
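The effect that beta-distribution analysis corrects for can be reproduced with a toy simulation: a two-state channel gating faster than the recording bandwidth, where filtering averages over unresolved events and pulls the apparent open level inward. The sketch below is such a toy, with invented rates, amplitude, and filter length; it is not the analysis software described above.

```python
# Toy illustration of undetected-event averaging: fast two-state gating plus a
# crude "recording filter" that reduces the apparent single-channel amplitude.
# All rates, levels, and the filter length are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
dt, n = 5e-6, 200_000                 # 5 us samples
k_open, k_close = 2e4, 3e4            # 1/s; dwell times shorter than the filter
p_open, p_close = k_open * dt, k_close * dt

state = np.empty(n, dtype=int)        # 1 = open, 0 = closed
state[0] = 0
for i in range(1, n):
    r = rng.random()
    if state[i - 1] == 0:
        state[i] = 1 if r < p_open else 0
    else:
        state[i] = 0 if r < p_close else 1

true_current = 2.0 * state + 0.05 * rng.standard_normal(n)   # pA, open level 2.0
kernel = np.ones(20) / 20             # 20-sample moving average as the filter
filtered = np.convolve(true_current, kernel, mode="same")
print("true open level ~2.0 pA, apparent open level:",
      round(filtered[state == 1].mean(), 2), "pA")
```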
An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.
Nguyen, Ngan; Watson, William D; Dominguez, Edward
2016-01-01
Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and to discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., competencies being trained and learning objectives) and performance assessment. The EBAT process involves linking scripted scenario events to the targeted competencies and to performance measurement, as outlined below. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) it provides formative and constructive feedback to bridge the gap between the learners' KSAs and the targeted KSAs. The EBAT methodology guides the design of simulation that incorporates these 4 features and, thus, enhances training effectiveness with simulation. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
ART-Ada design project, phase 2
NASA Technical Reports Server (NTRS)
Lee, S. Daniel; Allen, Bradley P.
1990-01-01
Interest in deploying expert systems in Ada has increased. An Ada-based expert system tool called ART-Ada, which was built to support research into the language and methodological issues of expert systems in Ada, is described. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code which is compiled and linked with an Ada-based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.
NASA Technical Reports Server (NTRS)
Bosilovich, Michael G.; Sud, Yogesh; Schubert, Siegfried D.; Walker, Gregory K.
2003-01-01
There are several important research questions that the Global Energy and Water Cycle Experiment (GEWEX) is actively pursuing, namely: What is the intensity of the water cycle and how does it change? And what is the sustainability of water resources? Much of the research to address these questions is directed at understanding the atmospheric water cycle. In this paper, we have used a new diagnostic tool, called Water Vapor Tracers (WVTs), to quantify how much precipitation originated as continental or oceanic evaporation. This shows how long water can remain in the atmosphere and how far it can travel. The model-simulated data are analyzed over regions of interest to the GEWEX community, specifically, their Continental Scale Experiments (CSEs) that are in place in the United States, Europe, Asia, Brazil, Africa and Canada. The paper presents quantitative data on how much each continent and ocean on Earth supplies water for each CSE. Furthermore, the analysis also shows the seasonal variation of the water sources. For example, in the United States, summertime precipitation is dominated by continental (land surface) sources of water, while wintertime precipitation is dominated by Pacific Ocean sources of water. We also analyze the residence time of water in the atmosphere. The new diagnostic shows a longer residence time for water (9.2 days) than more traditional estimates (7.5 days). We emphasize that the results are based on model simulations and they depend on the model's veracity. However, there are many potential uses for the new diagnostic tool in understanding weather processes at large and small scales.
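The tracer bookkeeping can be illustrated with a toy single-box budget: water entering the atmosphere is tagged by its evaporation source, and precipitation removes each tag in proportion to its share of the stored water. The sketch below uses invented fluxes purely to show the accounting; the paper's implementation is a full GCM diagnostic.

```python
# Toy single-box sketch of the water-vapor-tracer bookkeeping idea (not the
# GCM implementation used in the paper). All fluxes below are illustrative,
# in mm/day over a notional region.
def step(tags, evap_by_source, precip_mm):
    for src, e in evap_by_source.items():          # add tagged evaporation
        tags[src] = tags.get(src, 0.0) + e
    total = sum(tags.values())
    removed = {src: precip_mm * amount / total for src, amount in tags.items()}
    for src, r in removed.items():                 # rain out proportionally
        tags[src] -= r
    return removed

tags = {"continental": 10.0, "pacific": 20.0}      # mm of tagged precipitable water
for day in range(30):
    rained = step(tags, {"continental": 3.0, "pacific": 2.0}, precip_mm=5.0)
share = {k: round(v / sum(rained.values()), 2) for k, v in rained.items()}
print("day-30 precipitation source shares:", share)
print("residence time ~ storage/precip =", round(sum(tags.values()) / 5.0, 1), "days")
```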
NASA Astrophysics Data System (ADS)
Baluev, Roman V.
2013-08-01
We present PlanetPack, a new software tool that we developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the goals of exoplanet detection, characterization, and basic dynamical N-body simulations. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) use of more efficient maximum-likelihood periodograms that involve the full multi-planet fitting (sometimes called “residual” or “recursive” periodograms); (iv) easily calculable parametric 2D likelihood function level contours, reflecting the asymptotic confidence regions; (v) user-friendly fitting under some useful functional constraints; (vi) basic tasks of short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting the data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for the tasks of determining the statistical significance. It is planned that further functionality may be added to PlanetPack in the future. During the development of this software, a lot of effort was made to improve the calculational speed, especially for CPU-demanding tasks. PlanetPack was written in pure C++ (standard of 1998/2003), and is expected to be compilable and usable on a wide range of platforms.
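The multi-Keplerian fits listed in item (ii) are built from the standard single-planet radial-velocity model. The sketch below implements that textbook model (solve Kepler's equation, convert to true anomaly, evaluate the RV curve); it is independent of PlanetPack's C++ code, and the orbital parameters are made-up examples.

```python
# Sketch of the single-planet Keplerian RV model that multi-Keplerian fits
# build on (not PlanetPack's implementation); parameter values are invented.
import numpy as np

def kepler_E(M, e, iters=50):
    """Solve Kepler's equation E - e*sin(E) = M by Newton iteration."""
    E = M.copy()
    for _ in range(iters):
        E -= (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
    return E

def radial_velocity(t, P, K, e, omega, t_peri, gamma=0.0):
    """RV [m/s] at times t [days] for period P, semi-amplitude K, eccentricity e."""
    M = np.mod(2.0 * np.pi * (t - t_peri) / P, 2.0 * np.pi)   # mean anomaly
    E = kepler_E(M, e)                                        # eccentric anomaly
    nu = 2.0 * np.arctan2(np.sqrt(1 + e) * np.sin(E / 2),
                          np.sqrt(1 - e) * np.cos(E / 2))     # true anomaly
    return gamma + K * (np.cos(nu + omega) + e * np.cos(omega))

t = np.linspace(0.0, 100.0, 200)
print(radial_velocity(t, P=12.3, K=35.0, e=0.2, omega=1.1, t_peri=3.0)[:5])
```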
PandExo: A Community Tool for Transiting Exoplanet Science with JWST & HST
NASA Astrophysics Data System (ADS)
Batalha, Natasha E.; Mandell, Avi; Pontoppidan, Klaus; Stevenson, Kevin B.; Lewis, Nikole K.; Kalirai, Jason; Earl, Nick; Greene, Thomas; Albert, Loïc; Nielsen, Louise D.
2017-06-01
As we approach the James Webb Space Telescope (JWST) era, several studies have emerged that aim to (1) characterize how the instruments will perform and (2) determine what atmospheric spectral features could theoretically be detected using transmission and emission spectroscopy. To some degree, all these studies have relied on modeling of JWST’s theoretical instrument noise. With under two years left until launch, it is imperative that the exoplanet community begins to digest and integrate these studies into their observing plans, as well as think about how to leverage the Hubble Space Telescope (HST) to optimize JWST observations. To encourage this and to allow all members of the community access to JWST & HST noise simulations, we present here an open-source Python package and online interface for creating observation simulations of all observatory-supported timeseries spectroscopy modes. This noise simulator, called PandExo, relies on some aspects of Space Telescope Science Institute’s Exposure Time Calculator, Pandeia. We describe PandExo and the formalism for computing noise sources for JWST. Then we benchmark PandExo's performance against each instrument team’s independently written noise simulator for JWST, and previous observations for HST. We find that PandExo is within 10% agreement for HST/WFC3 and for all JWST instruments.
Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan
2013-12-01
This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with 6 full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans, and to estimate the practical capacity of the physical therapy room. The changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), with usage at 58.9 %. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results. The average waiting time per patient could be reduced by 45 % when we pooled 2 therapists. We found that treating up to 12 new patients per session had no significantly negative impact on returning patients. Moreover, we found that the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.
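The capacity questions above are the kind a very small discrete-event sketch can make concrete: a shared queue served by a pool of therapists, with waits recorded per patient. The Python below is such a toy, not the study's model, and its arrival rate, treatment time, and pool sizes are illustrative assumptions.

```python
# Toy discrete-event sketch of a pooled-therapist queue (not the study's model).
import heapq, random

def simulate(n_therapists, arrivals_per_hour=8.0, mean_treat_min=30.0, horizon_min=480):
    """Single shared queue, first-come-first-served, exponential treatment times."""
    random.seed(42)
    arrivals, t = [], 0.0
    while t < horizon_min:                       # Poisson arrival stream
        t += random.expovariate(arrivals_per_hour / 60.0)
        arrivals.append(t)
    free_at = [0.0] * n_therapists               # when each therapist is next free
    heapq.heapify(free_at)
    waits = []
    for arrival in arrivals:
        next_free = heapq.heappop(free_at)       # earliest-available therapist
        start = max(arrival, next_free)
        waits.append(start - arrival)
        heapq.heappush(free_at, start + random.expovariate(1.0 / mean_treat_min))
    return sum(waits) / len(waits)

for pool in (4, 5, 6):
    print(f"{pool} pooled therapists: mean wait {simulate(pool):.1f} min")
```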
NASA Astrophysics Data System (ADS)
Kacprzak, T.; Herbel, J.; Amara, A.; Réfrégier, A.
2018-02-01
Approximate Bayesian Computation (ABC) is a method to obtain a posterior distribution without a likelihood function, using simulations and a set of distance metrics. For that reason, it has recently been gaining popularity as an analysis tool in cosmology and astrophysics. Its drawback, however, is a slow convergence rate. We propose a novel method, which we call qABC, to accelerate ABC with Quantile Regression. In this method, we create a model of quantiles of the distance measure as a function of input parameters. This model is trained on a small number of simulations and estimates which regions of the prior space are likely to be accepted into the posterior. Other regions are then immediately rejected. This procedure is then repeated as more simulations become available. We apply it to the practical problem of estimation of the redshift distribution of cosmological samples, using forward modelling developed in previous work. The qABC method converges to nearly the same posterior as the basic ABC. It uses, however, only 20% of the number of simulations compared to basic ABC, achieving a fivefold gain in execution time for our problem. For other problems the acceleration rate may vary; it depends on how close the prior is to the final posterior. We discuss possible improvements and extensions to this method.
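A compact sketch of the core idea is given below: fit a low-quantile regression of the ABC distance on the parameters using a pilot batch of simulations, then pre-reject proposals whose predicted quantile already exceeds the current acceptance threshold. The simulator, prior, and quantile level are stand-ins chosen for illustration, not the paper's redshift-distribution setup.

```python
# Sketch of the qABC idea (assumptions simplified): a quantile-regression model
# of the distance pre-rejects proposals so that no simulation is run for them.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)

def simulate_distance(theta):                  # stand-in for a costly simulator
    return np.abs(theta - 0.7) + 0.05 * rng.standard_normal(theta.shape)

theta_pilot = rng.uniform(0, 2, 400)           # pilot simulations over the prior
d_pilot = simulate_distance(theta_pilot)

q10 = GradientBoostingRegressor(loss="quantile", alpha=0.1)
q10.fit(theta_pilot.reshape(-1, 1), d_pilot)   # model of the 10% distance quantile

threshold = np.quantile(d_pilot, 0.1)          # current ABC acceptance threshold
proposals = rng.uniform(0, 2, 10_000)
keep = q10.predict(proposals.reshape(-1, 1)) <= threshold
print(f"simulate only {keep.mean():.0%} of proposals; the rest are pre-rejected")
```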
Zhou, Y; Murata, T; Defanti, T A
2000-01-01
Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs demand high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE, NICE (narrative immersive constructionist/collaborative environment), predict the net-VE performance based on simulation, and improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called the CAVE (CAVE Automatic Virtual Environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: the transmission control protocol (TCP). We show the possibility analysis based on the EFTN model for the CAVE. Then, by using these models and Design/CPN as the simulation tool, we conducted various simulations to study the real-time behavior, network effects and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.
Terminal Area Conflict Detection and Resolution Tool
NASA Technical Reports Server (NTRS)
Verma, Savita Arora
2011-01-01
This poster will describe analysis of a conflict detection and resolution tool for the terminal area called T-TSAFE. With altitude clearance information, the tool can reduce false alerts to as low as 2 per hour.
Advanced Simulation and Computing: A Summary Report to the Director's Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, M G; Peck, T
2003-06-01
It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, has been reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress for all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called "Advanced Simulation and Computing." Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and has identified expected documentation to be included in the "Assessment File".
Samadian, Soroush; Bruce, Jeff P; Pugh, Trevor J
2018-03-01
Somatic copy number variations (CNVs) play a crucial role in development of many human cancers. The broad availability of next-generation sequencing data has enabled the development of algorithms to computationally infer CNV profiles from a variety of data types including exome and targeted sequence data, currently the most prevalent types of cancer genomics data. However, systematic evaluation and comparison of these tools remains challenging due to a lack of ground truth reference sets. To address this need, we have developed Bamgineer, a tool written in Python to introduce user-defined haplotype-phased allele-specific copy number events into an existing Binary Alignment Mapping (BAM) file, with a focus on targeted and exome sequencing experiments. As input, this tool requires a read alignment file (BAM format), lists of non-overlapping genome coordinates for introduction of gains and losses (bed file), and an optional file defining known haplotypes (vcf format). To improve runtime performance, Bamgineer introduces the desired CNVs in parallel using queuing and parallel processing on a local machine or on a high-performance computing cluster. As proof-of-principle, we applied Bamgineer to a single high-coverage (mean: 220X) exome sequence file from a blood sample to simulate copy number profiles of 3 exemplar tumors from each of 10 tumor types at 5 tumor cellularity levels (20-100%, 150 BAM files in total). To demonstrate feasibility beyond exome data, we introduced read alignments to a targeted 5-gene cell-free DNA sequencing library to simulate EGFR amplifications at frequencies consistent with circulating tumor DNA (10, 1, 0.1 and 0.01%) while retaining the multimodal insert size distribution of the original data. We expect Bamgineer to be of use for development and systematic benchmarking of CNV calling algorithms by users using locally-generated data for a variety of applications. The source code is freely available at http://github.com/pughlab/bamgineer.
... is usually done using a tool called a stethoscope. Health care providers routinely listen to a person's ... unborn infants. This can be done with a stethoscope or with sound waves (called Doppler ultrasound). Auscultation ...
ERIC Educational Resources Information Center
Parmaxi, Antigoni; Zaphiris, Panayiotis
2017-01-01
This study explores the research development pertaining to the use of Web 2.0 technologies in the field of Computer-Assisted Language Learning (CALL). Published research manuscripts related to the use of Web 2.0 tools in CALL have been explored, and the following research foci have been determined: (1) Web 2.0 tools that dominate second/foreign…
Trinkaus, Hans L; Gaisser, Andrea E
2010-09-01
Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
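The abstract states that keyword spotting on the client's side of the call is used to fill the abstract "score". The following minimal sketch shows what such keyword spotting could look like on transcribed text; the categories and vocabulary are invented placeholders, not SACA's actual ontology.

```python
import re
from collections import defaultdict

# Hypothetical keyword lists; the real SACA categories are not given in the abstract.
CATEGORIES = {
    "diagnosis": ["breast cancer", "prostate cancer", "leukemia"],
    "topic": ["chemotherapy", "radiation", "side effects", "second opinion"],
    "caller_role": ["my husband", "my mother", "myself"],
}

def spot_keywords(client_text):
    """Return, per category, the keywords found in the client's word stream."""
    hits = defaultdict(list)
    lowered = client_text.lower()
    for category, keywords in CATEGORIES.items():
        for kw in keywords:
            if re.search(r"\b" + re.escape(kw) + r"\b", lowered):
                hits[category].append(kw)
    return dict(hits)

print(spot_keywords("I am calling about my mother, she just started chemotherapy."))
```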
ToTem: a tool for variant calling pipeline optimization.
Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka
2018-06-26
High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process and with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables allowing an optimal pipeline to be selected, based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
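Since ToTem ranks pipeline configurations by precision, recall and F-measure, a small sketch of how those benchmark metrics are computed for a set of called variants against a truth set may be helpful; representing variants as (chrom, pos, ref, alt) tuples is an assumption for illustration.

```python
def benchmark_calls(called, truth):
    """Precision, recall and F-measure for called variants against a ground-truth
    set; variants are hashable tuples such as (chrom, pos, ref, alt)."""
    called, truth = set(called), set(truth)
    tp = len(called & truth)
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

truth = {("chr1", 1000, "A", "T"), ("chr2", 500, "G", "C"), ("chr3", 42, "T", "G")}
calls = {("chr1", 1000, "A", "T"), ("chr2", 500, "G", "A")}
print(benchmark_calls(calls, truth))  # (0.5, 0.333..., 0.4)
```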
NASA Astrophysics Data System (ADS)
Rimbault, C.; Le Meur, G.; Blampuy, F.; Bambade, P.; Schulte, D.
2009-12-01
Depolarization is a new feature in the beam-beam simulation tool GUINEA-PIG++ (GP++). The results of this simulation are studied and compared with another beam-beam simulation tool, CAIN, considering different beam parameters for the International Linear Collider (ILC) with a centre-of-mass energy of 500 GeV.
Three-Dimensional General Relativistic Monte Carlo Neutrino Transport in Neutron Star Mergers
NASA Astrophysics Data System (ADS)
Richers, Sherwood; Radice, David
2018-06-01
How neutrinos interact with the debris ejected from merging neutron stars determines how much matter escapes, how hot the matter is, and the relative amounts of neutrons and protons. This makes understanding neutrino irradiation of ejected matter a necessary part of interpreting recent and future observations of so-called "kilonovae" to determine whether neutron star mergers can be the origin of heavy elements in the universe. I will discuss a new Monte Carlo method for simulating neutrino transport in these highly relativistic, multi-dimensional environments. I will use this tool to estimate how well approximate transport methods capture the neutrino irradiation and propose improvements to approximate methods that will aid in accurate modeling and interpretation of kilonovae.
A modified homogeneous relaxation model for CO2 two-phase flow in vapour ejector
NASA Astrophysics Data System (ADS)
Haida, M.; Palacz, M.; Smolka, J.; Nowak, A. J.; Hafner, A.; Banasiak, K.
2016-09-01
In this study, the homogeneous relaxation model (HRM) for CO2 flow in a two-phase ejector was modified in order to increase the accuracy of the numerical simulations. The two-phase flow model was implemented in the effective computational tool called ejectorPL for fully automated and systematic computations of various ejector shapes and operating conditions. The modification of the HRM was performed by changing the relaxation time and the constants included in the relaxation time equation, based on experimental results under operating conditions typical for supermarket refrigeration systems. The modified HRM results were compared to the HEM results on the basis of the motive nozzle and suction nozzle mass flow rates.
Determination of optimal tool parameters for hot mandrel bending of pipe elbows
NASA Astrophysics Data System (ADS)
Tabakajew, Dmitri; Homberg, Werner
2018-05-01
Seamless pipe elbows are important components in mechanical, plant and apparatus engineering. Typically, they are produced by the so-called `Hamburg process'. In this hot forming process, the initial pipes are subsequently pushed over an ox-horn-shaped bending mandrel. The geometric shape of the mandrel influences the diameter, bending radius and wall thickness distribution of the pipe elbow. This paper presents the numerical simulation model of the hot mandrel bending process created to ensure that the optimum mandrel geometry can be determined at an early stage. A fundamental analysis was conducted to determine the influence of significant parameters on the pipe elbow quality. The chosen methods and approach as well as the corresponding results are described in this paper.
Weinger, Matthew B; Banerjee, Arna; Burden, Amanda R; McIvor, William R; Boulet, John; Cooper, Jeffrey B; Steadman, Randolph; Shotwell, Matthew S; Slagle, Jason M; DeMaria, Samuel; Torsher, Laurence; Sinz, Elizabeth; Levine, Adam I; Rask, John; Davis, Fred; Park, Christine; Gaba, David M
2017-09-01
We sought to determine whether mannequin-based simulation can reliably characterize how board-certified anesthesiologists manage simulated medical emergencies. Our primary focus was to identify gaps in performance and to establish psychometric properties of the assessment methods. A total of 263 consenting board-certified anesthesiologists participating in existing simulation-based maintenance of certification courses at one of eight simulation centers were video recorded performing simulated emergency scenarios. Each participated in two 20-min, standardized, high-fidelity simulated medical crisis scenarios, once each as primary anesthesiologist and first responder. Via a Delphi technique, an independent panel of expert anesthesiologists identified critical performance elements for each scenario. Trained, blinded anesthesiologists rated video recordings using standardized rating tools. Measures included the percentage of critical performance elements observed and holistic (one to nine ordinal scale) ratings of participant's technical and nontechnical performance. Raters also judged whether the performance was at a level expected of a board-certified anesthesiologist. Rater reliability for most measures was good. In 284 simulated emergencies, participants were rated as successfully completing 81% (interquartile range, 75 to 90%) of the critical performance elements. The median rating of both technical and nontechnical holistic performance was five, distributed across the nine-point scale. Approximately one-quarter of participants received low holistic ratings (i.e., three or less). Higher-rated performances were associated with younger age but not with previous simulation experience or other individual characteristics. Calling for help was associated with better individual and team performance. Standardized simulation-based assessment identified performance gaps informing opportunities for improvement. If a substantial proportion of experienced anesthesiologists struggle with managing medical emergencies, continuing medical education activities should be reevaluated.
Spectrum simulation in DTSA-II.
Ritchie, Nicholas W M
2009-10-01
Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
Designing Crop Simulation Web Service with Service Oriented Architecture Principle
NASA Astrophysics Data System (ADS)
Chinnachodteeranun, R.; Hung, N. D.; Honda, K.
2015-12-01
Crop simulation models are efficient tools for simulating crop growth processes and yield. Running crop models requires data from various sources as well as time-consuming data processing, such as data quality checking and data formatting, before those data can be inputted to the model. This makes the use of crop modeling limited only to crop modelers. We aim to make running crop models convenient for various users so that the utilization of crop models will be expanded, which will directly improve agricultural applications. As the first step, we had developed a prototype that runs DSSAT on the Web, called Tomorrow's Rice (v. 1). It predicts rice yields based on a planting date, rice variety and soil characteristics using the DSSAT crop model. A user only needs to select a planting location on the Web GUI; the system then queries historical weather data from available sources and returns the expected yield. Currently, we are working on weather data connection via the Sensor Observation Service (SOS) interface defined by the Open Geospatial Consortium (OGC). Weather data can be automatically connected to a weather generator for generating weather scenarios for running the crop model. In order to expand these services further, we are designing a web service framework consisting of layers of web services to support compositions and executions for running crop simulations. This framework allows a third-party application to call and cascade each service as it needs for data preparation and running the DSSAT model using a dynamic web service mechanism. The framework has a module to manage data format conversion, which means users do not need to spend their time curating the data inputs. Dynamic linking of data sources and services is implemented using the Service Component Architecture (SCA). This agriculture web service platform demonstrates interoperability of weather data using the SOS interface, convenient connections between weather data sources and the weather generator, and connecting various services for running crop models for decision support.
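The service composition described above (weather data via SOS, a weather generator, then the crop model) can be sketched as a chain of HTTP calls. The endpoints, payload fields and response formats below are hypothetical placeholders, since the abstract does not specify the actual interfaces.

```python
import requests

# All endpoint URLs and payload fields are hypothetical placeholders.
SOS_URL = "https://example.org/sos"
WEATHER_GEN_URL = "https://example.org/weather-generator"
DSSAT_URL = "https://example.org/dssat/run"

def predicted_yield(lat, lon, planting_date, variety):
    # 1. Query historical weather observations for the selected location.
    obs = requests.get(SOS_URL, params={"lat": lat, "lon": lon}).json()
    # 2. Generate weather scenarios from the observations.
    scenarios = requests.post(WEATHER_GEN_URL, json={"observations": obs}).json()
    # 3. Run the crop model for each scenario and return the yields.
    run = requests.post(DSSAT_URL, json={
        "weather": scenarios,
        "planting_date": planting_date,
        "variety": variety,
    }).json()
    return run["yields"]
```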
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
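A typical way to compare two tools against the same check-case is the over-plotting and difference-plotting mentioned above. The sketch below, with synthetic stand-in data and a single state variable, shows one simple version of that comparison; in practice the trajectories would be loaded from each simulation's output files.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-ins for two tools' altitude histories on the same check-case.
t_a = np.linspace(0.0, 100.0, 501)
alt_a = 1000.0 + 50.0 * t_a                       # placeholder trajectory from tool A
t_b = np.linspace(0.0, 100.0, 301)
alt_b = 1000.0 + 50.0 * t_b + 0.02 * t_b ** 1.5   # slightly different trajectory from tool B

# Resample B onto A's time grid, then over-plot and difference-plot.
alt_b_on_a = np.interp(t_a, t_b, alt_b)
fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(t_a, alt_a, label="tool A")
ax1.plot(t_a, alt_b_on_a, label="tool B")
ax1.set_ylabel("altitude [m]")
ax1.legend()
ax2.plot(t_a, alt_a - alt_b_on_a)
ax2.set_ylabel("A - B [m]")
ax2.set_xlabel("time [s]")
plt.show()
```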
SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
Warthog: A MOOSE-Based Application for the Direct Code Coupling of BISON and PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.; Slattery, Stuart; Billings, Jay Jay
The Nuclear Energy Advanced Modeling and Simulation (NEAMS) program from the Department of Energy's Office of Nuclear Energy provides a robust toolkit for the modeling and simulation of current and future advanced nuclear reactor designs. This toolkit provides these technologies organized across product lines: two divisions targeted at fuels and end-to-end reactor modeling, and a third for integration, coupling, and high-level workflow management. The Fuels Product Line and the Reactor Product line provide advanced computational technologies that serve each respective field well, however, their current lack of integration presents a major impediment to future improvements of simulation solution fidelity. There is a desire for the capability to mix and match tools across Product Lines in an effort to utilize the best from both to improve NEAMS modeling and simulation technologies. This report details a new effort to provide this Product Line interoperability through the development of a new application called Warthog. This application couples the BISON Fuel Performance application from the Fuels Product Line and the PROTEUS Core Neutronics application from the Reactors Product Line in an effort to utilize the best from all parts of the NEAMS toolkit and improve overall solution fidelity of nuclear fuel simulations. To achieve this, Warthog leverages as much prior work from the NEAMS program as possible, and in doing so, enables interoperability between the disparate MOOSE and SHARP frameworks, and the libMesh and MOAB mesh data formats. This report describes this work in full. We begin with a detailed look at the individual NEAMS framework technologies used and developed in the various Product Lines, and the current status of their interoperability. We then introduce the Warthog application: its overall architecture and the ways it leverages the best existing tools from across the NEAMS toolkit to enable BISON-PROTEUS integration. Furthermore, we show how Warthog leverages a tool known as DataTransferKit to seamlessly enable the transfer for solution data between disparate frameworks and mesh formats. To end, we demonstrate tests for the direct software coupling of BISON and PROTEUS using Warthog, and discuss current impediments and solutions to the construction of physically realistic input models for this coupled BISON-PROTEUS system.
Examining A Health Care Price Transparency Tool: Who Uses It, And How They Shop For Care.
Sinaiko, Anna D; Rosenthal, Meredith B
2016-04-01
Calls for transparency in health care prices are increasing, in an effort to encourage and enable patients to make value-based decisions. Yet there is very little evidence of whether and how patients use health care price transparency tools. We evaluated the experiences, in the period 2011-12, of an insured population of nonelderly adults with Aetna's Member Payment Estimator, a web-based tool that provides real-time, personalized, episode-level price estimates. Overall, use of the tool increased during the study period but remained low. Nonetheless, for some procedures the number of people searching for prices of services (called searchers) was high relative to the number of people who received the service (called patients). Among Aetna patients who had an imaging service, childbirth, or one of several outpatient procedures, searchers for price information were significantly more likely to be younger and healthier and to have incurred higher annual deductible spending than patients who did not search for price information. A campaign to deliver price information to consumers may be important to increase patients' engagement with price transparency tools. Project HOPE—The People-to-People Health Foundation, Inc.
Taylor, Charles J.; Williamson, Tanja N.; Newson, Jeremy K.; Ulery, Randy L.; Nelson, Hugh L.; Cinotto, Peter J.
2012-01-01
This report describes Phase II modifications made to the Water Availability Tool for Environmental Resources (WATER), which applies the process-based TOPMODEL approach to simulate or predict stream discharge in surface basins in the Commonwealth of Kentucky. The previous (Phase I) version of WATER did not provide a means of identifying sinkhole catchments or accounting for the effects of karst (internal) drainage in a TOPMODEL-simulated basin. In the Phase II version of WATER, sinkhole catchments are automatically identified and delineated as internally drained subbasins, and a modified TOPMODEL approach (called the sinkhole drainage process, or SDP-TOPMODEL) is applied that calculates mean daily discharges for the basin based on summed area-weighted contributions from sinkhole drainage (SD) areas and non-karstic topographically drained (TD) areas. Results obtained using the SDP-TOPMODEL approach were evaluated for 12 karst test basins located in each of the major karst terrains in Kentucky. Visual comparison of simulated hydrographs and flow-duration curves, along with statistical measures applied to the simulated discharge data (bias, correlation, root mean square error, and Nash-Sutcliffe efficiency coefficients), indicates that the SDP-TOPMODEL approach provides acceptably accurate estimates of discharge for most flow conditions and typically provides more accurate simulation of stream discharge in karstic basins compared to the standard TOPMODEL approach. Additional programming modifications made to the Phase II version of WATER included implementation of a point-and-click graphical user interface (GUI), which fully automates the delineation of simulation-basin boundaries and improves the speed of input-data processing. The Phase II version of WATER enables the user to select a pour point anywhere on a stream reach of interest, and the program will automatically delineate all upstream areas that contribute drainage to that point. This capability enables automatic delineation of a simulation basin of any size (area) and having any level of stream-network complexity. WATER then automatically identifies the presence of sinkhole catchments within the simulation basin boundaries; extracts and compiles the necessary climatic, topographic, and basin characteristics datasets; and runs the SDP-TOPMODEL approach to estimate daily mean discharges (streamflow).
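The summed area-weighted combination of sinkhole-drained (SD) and topographically drained (TD) contributions can be written as a one-line calculation. The sketch below is only a schematic reading of that idea; the actual SDP-TOPMODEL computation of each component's discharge is far more involved.

```python
def basin_discharge(area_sd, q_sd, area_td, q_td):
    """Area-weighted mean daily discharge (per unit area, e.g. mm/day) for a basin
    composed of sinkhole-drained (SD) and topographically drained (TD) areas."""
    total_area = area_sd + area_td
    return (area_sd * q_sd + area_td * q_td) / total_area

# Example: 30 km^2 of sinkhole drainage at 1.2 mm/day, 70 km^2 at 0.8 mm/day.
print(basin_discharge(30.0, 1.2, 70.0, 0.8))  # 0.92 mm/day
```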
Wang, Minghuai; Larson, Vincent E.; Ghan, Steven; ...
2015-04-18
In this study, a higher-order turbulence closure scheme, called Cloud Layers Unified by Binormals (CLUBB), is implemented into a Multi-scale Modeling Framework (MMF) model to improve low cloud simulations. The performance of CLUBB in MMF simulations with two different microphysics configurations (one-moment cloud microphysics without aerosol treatment and two-moment cloud microphysics coupled with aerosol treatment) is evaluated against observations and further compared with results from the Community Atmosphere Model, Version 5 (CAM5) with conventional cloud parameterizations. CLUBB is found to improve low cloud simulations in the MMF, and the improvement is particularly evident in the stratocumulus-to-cumulus transition regions. Compared to the single-moment cloud microphysics, CLUBB with two-moment microphysics produces clouds that are closer to the coast, and agrees better with observations. In the stratocumulus-to-cumulus transition regions, CLUBB with two-moment cloud microphysics produces shortwave cloud forcing in better agreement with observations, while CLUBB with single-moment cloud microphysics overestimates shortwave cloud forcing. CLUBB is further found to produce quantitatively similar improvements in the MMF and CAM5, with slightly better performance in the MMF simulations (e.g., MMF with CLUBB generally produces low clouds that are closer to the coast than CAM5 with CLUBB). As a result, improved low cloud simulations in MMF make it an even more attractive tool for studying aerosol-cloud-precipitation interactions.
Li, Junli; Li, Chunyan; Qiu, Rui; Yan, Congchong; Xie, Wenzhang; Wu, Zhen; Zeng, Zhi; Tung, Chuanjong
2015-09-01
The method of Monte Carlo simulation is a powerful tool to investigate the details of radiation biological damage at the molecular level. In this paper, a Monte Carlo code called NASIC (Nanodosimetry Monte Carlo Simulation Code) was developed. It includes a physical module, a pre-chemical module, a chemical module, a geometric module and a DNA damage module. The physical module can simulate physical tracks of low-energy electrons in liquid water event-by-event. More than one set of inelastic cross sections were calculated by applying the dielectric function method of Emfietzoglou's optical-data treatments, with different optical data sets and dispersion models. In the pre-chemical module, the ionised and excited water molecules undergo dissociation processes. In the chemical module, the produced radiolytic chemical species diffuse and react. In the geometric module, an atomic model of 46 chromatin fibres in a spherical nucleus of a human lymphocyte was established. In the DNA damage module, the direct damages induced by the energy depositions of the electrons and the indirect damages induced by the radiolytic chemical species were calculated. The parameters were adjusted so that the simulation results agreed with the experimental results. In this paper, the influence of the inelastic cross sections and the vibrational excitation reaction on these parameters and on the DNA strand break yields was studied. Further work on NASIC is underway. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
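Event-by-event track-structure codes of this kind repeatedly sample which interaction channel occurs in proportion to the channel cross sections. The sketch below shows that generic sampling step; the channel names and cross-section values are arbitrary placeholders, not NASIC's data.

```python
import random

def sample_interaction(cross_sections):
    """Pick an interaction channel with probability proportional to its cross
    section, as done event-by-event in track-structure Monte Carlo codes."""
    total = sum(cross_sections.values())
    r = random.uniform(0.0, total)
    running = 0.0
    for channel, sigma in cross_sections.items():
        running += sigma
        if r <= running:
            return channel
    return channel  # numerical safety net

channels = {"ionisation": 3.0, "excitation": 1.5, "elastic": 5.5}  # arbitrary units
print(sample_interaction(channels))
```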
NASA Technical Reports Server (NTRS)
Follen, Gregory; auBuchon, M.
2000-01-01
Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer along with the concept of numerical zooming between zero-dimensional to one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS Environment, the subject of this paper is a discussion on numerical zooming between a NPSS engine simulation and higher fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation and a discussion of the results illustrated using an advanced data visualization tool. This type of high fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual test conducted prior to committing the design to hardware.
NASA Technical Reports Server (NTRS)
Iversen, J. D.
1991-01-01
The aeolian wind tunnel is a special case of a larger subset of the wind tunnel family which is designed to simulate the atmospheric surface layer winds to small scale (a member of this larger subset is usually called an atmospheric boundary layer wind tunnel or environmental wind tunnel). The atmospheric boundary layer wind tunnel is designed to simulate, as closely as possible, the mean velocity and turbulence that occur naturally in the atmospheric boundary layer (defined as the lowest portion of the atmosphere, of the order of 500 m, in which the winds are most greatly affected by surface roughness and topography). The aeolian wind tunnel is used for two purposes: to simulate the physics of the saltation process and to model at small scale the erosional and depositional processes associated with topographic surface features. For purposes of studying aeolian effects on the surface of Mars and Venus as well as on Earth, the aeolian wind tunnel continues to prove to be a useful tool for estimating wind speeds necessary to move small particles on the three planets as well as to determine the effects of topography on the evolution of aeolian features such as wind streaks and dune patterns.
Space Station Human Factors: Designing a Human-Robot Interface
NASA Technical Reports Server (NTRS)
Rochlis, Jennifer L.; Clarke, John Paul; Goza, S. Michael
2001-01-01
The experiments described in this paper are part of a larger joint MIT/NASA research effort and focus on the development of a methodology for designing and evaluating integrated interfaces for highly dexterous and multifunctional telerobots. Specifically, a telerobotic workstation is being designed for an Extravehicular Activity (EVA) anthropomorphic space station telerobot called Robonaut. Previous researchers have designed telerobotic workstations based upon performance of discrete subsets of tasks (for example, peg-in-hole, tracking, etc.) without regard for transitions that operators go through between tasks performed sequentially in the context of larger integrated tasks. The experiments presented here took an integrated approach to describing teleoperator performance and assessed how subjects operating a full-immersion telerobot perform during fine position and gross position tasks. In addition, a Robonaut simulation was also developed as part of this research effort, and experimentally tested against Robonaut itself to determine its utility. Results show that subject performance of teleoperated tasks using both Robonaut and the simulation is virtually identical, with no significant difference between the two. These results indicate that the simulation can be utilized as both a Robonaut training tool, and as a powerful design platform for telepresence displays and aids.
Rautenberg, Philipp L.; Kumaraswamy, Ajayrama; Tejero-Cantero, Alvaro; Doblander, Christoph; Norouzian, Mohammad R.; Kai, Kazuki; Jacobsen, Hans-Arno; Ai, Hiroyuki; Wachtler, Thomas; Ikeno, Hidetoshi
2014-01-01
Neuroscience today deals with a “data deluge” derived from the availability of high-throughput sensors of brain structure and brain activity, and increased computational resources for detailed simulations with complex output. We report here (1) a novel approach to data sharing between collaborating scientists that brings together file system tools and cloud technologies, (2) a service implementing this approach, called NeuronDepot, and (3) an example application of the service to a complex use case in the neurosciences. The main drivers for our approach are to facilitate collaborations with a transparent, automated data flow that shields scientists from having to learn new tools or data structuring paradigms. Using NeuronDepot is simple: one-time data assignment from the originator and cloud based syncing—thus making experimental and modeling data available across the collaboration with minimum overhead. Since data sharing is cloud based, our approach opens up the possibility of using new software developments and hardware scalability which are associated with elastic cloud computing. We provide an implementation that relies on existing synchronization services and is usable from all devices via a reactive web interface. We are motivating our solution by solving the practical problems of the GinJang project, a collaboration of three universities across eight time zones with a complex workflow encompassing data from electrophysiological recordings, imaging, morphological reconstructions, and simulations. PMID:24971059
Computational systems chemical biology.
Oprea, Tudor I; May, Elebeoba E; Leitão, Andrei; Tropsha, Alexander
2011-01-01
There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole body physiologically based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology systems chemical biology (SCB) (Nat Chem Biol 3: 447-450, 2007).The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules, and, where applicable, mutants and variants of those proteins. There is yet an unmet need to develop an integrated in silico pharmacology/systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology, and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology.
SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.
Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko
2013-05-01
Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
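The abstract states that parameters are estimated with a genetic algorithm that fits simulated metabolite dynamics to experimental data. A minimal, generic version of such a fit is sketched below; it is not the SS-GA implementation, and the population size, mutation scale and cost function are illustrative choices.

```python
import random

def fit_ga(simulate, observed, bounds, pop_size=40, generations=100):
    """Minimal genetic-algorithm fit: keep the better half of the population and
    refill it with mutated copies of random parents. `simulate(params)` must
    return values comparable to `observed`."""
    def cost(params):
        return sum((s - o) ** 2 for s, o in zip(simulate(params), observed))

    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            parent = random.choice(parents)
            child = [min(hi, max(lo, p + random.gauss(0, 0.1 * (hi - lo))))
                     for p, (lo, hi) in zip(parent, bounds)]
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

# Toy usage: recover the slope of y = 2.0 * t from noise-free "data".
best = fit_ga(lambda p: [p[0] * t for t in range(5)], [0, 2, 4, 6, 8], [(0.0, 10.0)])
print(best)  # roughly [2.0]
```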
Judd, Belinda K; Scanlan, Justin N; Alison, Jennifer A; Waters, Donna; Gordon, Christopher J
2016-08-05
Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected, 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparison with APPs from 5 week clinical placements in hospital and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1 and 2 week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items recorded greater than 25 % of scores as 'not-assessed' by clinical educators which impacted on the suitability of the APP tool in this simulation format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.
Ketso: A New Tool for Extension Professionals
ERIC Educational Resources Information Center
Bates, James S.
2016-01-01
Extension professionals employ many techniques and tools to obtain feedback, input, information, and data from stakeholders, research participants, and program learners. An information-gathering tool called Ketso is described in this article. This tool and its associated techniques can be used in all phases of program development, implementation,…
A call for benchmarking transposable element annotation methods.
Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu
2015-01-01
DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks-that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.
Using telephony data to facilitate discovery of clinical workflows.
Rucker, Donald W
2017-04-19
Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units using an enterprise unified communications server log file and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds and, of these, the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
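The core data step described above, cross-correlating call detail records and building edge and node files for a graph tool, can be sketched as a simple aggregation. The CSV column names below are assumptions; real unified-communications logs use different field layouts.

```python
import csv
from collections import defaultdict

def cdr_to_edges(cdr_csv, edge_csv):
    """Aggregate call detail records into caller->callee edges (call count and
    total duration), suitable for an open-source graph visualisation tool."""
    edges = defaultdict(lambda: [0, 0.0])
    with open(cdr_csv, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["calling_number"], row["called_number"])
            edges[key][0] += 1
            edges[key][1] += float(row["duration_seconds"])
    with open(edge_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["source", "target", "call_count", "total_duration_s"])
        for (src, dst), (count, duration) in edges.items():
            writer.writerow([src, dst, count, duration])

# Hypothetical usage: cdr_to_edges("call_detail_records.csv", "edges.csv")
```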
10 CFR 434.606 - Simulation tool.
Code of Federal Regulations, 2010 CFR
2010-01-01
Section 434.606, Title 10 (Energy), DEPARTMENT OF ENERGY, ENERGY CONSERVATION, ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS, Building Energy Compliance Alternative. § 434.606 Simulation tool. 606.1 The criteria...
Landsat-7 Simulation and Testing Environments
NASA Technical Reports Server (NTRS)
Holmes, E.; Ha, K.; Hawkins, K.; Lombardo, J.; Ram, M.; Sabelhaus, P.; Scott, S.; Phillips, R.
1999-01-01
A spacecraft Attitude Control and Determination Subsystem (ACDS) is heavily dependent upon simulation throughout its entire development, implementation and ground test cycle. Engineering simulation tools are typically developed to design and analyze control systems to validate the design, and software simulation tools are required to qualify the flight software. However, the need for simulation does not end here. Operating the ACDS of a spacecraft on the ground requires the simulation of spacecraft dynamics, disturbance modeling and celestial body motion. Sensor data must also be simulated and substituted for actual sensor data on the ground so that the spacecraft will respond by sending commands to the actuators as it will on orbit. And finally, the simulator is the primary training tool and test-bed for the Flight Operations Team. In this paper, the various ACDS simulations developed for or used by the Landsat 7 project will be described. The paper will include a description of each tool, its unique attributes, and its role in the overall development and testing of the ACDS. Finally, a section is included which discusses how the coordinated use of these simulation tools can maximize the probability of uncovering software, hardware and operations errors during the ground test process.
Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)
2005-02-01
method, Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique... Mission Area: Command and Control. Subject terms: mixed-signal circuit simulation; parasitic extraction; time-domain simulation; IC design flow; model order reduction. Table-of-contents fragments: 1.2 Overall Program Milestones; Chapter 2, Fast Time Domain Mixed-Signal Circuit Simulation; 2.1 HAARSPICE Algorithms; 2.1.1 Mathematical Background.
Azarnoush, Hamed; Siar, Samaneh; Sawaya, Robin; Zhrani, Gmaan Al; Winkler-Schwartz, Alexander; Alotaibi, Fahad Eid; Bugdadi, Abdulgadir; Bajunaid, Khalid; Marwa, Ibrahim; Sabbagh, Abdulrahman Jafar; Del Maestro, Rolando F
2017-07-01
OBJECTIVE Virtual reality simulators allow development of novel methods to analyze neurosurgical performance. The concept of a force pyramid is introduced as a Tier 3 metric with the ability to provide visual and spatial analysis of 3D force application by any instrument used during simulated tumor resection. This study was designed to answer 3 questions: 1) Do study groups have distinct force pyramids? 2) Do handedness and ergonomics influence force pyramid structure? 3) Are force pyramids dependent on the visual and haptic characteristics of simulated tumors? METHODS Using a virtual reality simulator, NeuroVR (formerly NeuroTouch), ultrasonic aspirator force application was continually assessed during resection of simulated brain tumors by neurosurgeons, residents, and medical students. The participants performed simulated resections of 18 simulated brain tumors with different visual and haptic characteristics. The raw data, namely, coordinates of the instrument tip as well as contact force values, were collected by the simulator. To provide a visual and qualitative spatial analysis of forces, the authors created a graph, called a force pyramid, representing force sum along the z-coordinate for different xy coordinates of the tool tip. RESULTS Sixteen neurosurgeons, 15 residents, and 84 medical students participated in the study. Neurosurgeon, resident and medical student groups displayed easily distinguishable 3D "force pyramid fingerprints." Neurosurgeons had the lowest force pyramids, indicating application of the lowest forces, followed by resident and medical student groups. Handedness, ergonomics, and visual and haptic tumor characteristics resulted in distinct well-defined 3D force pyramid patterns. CONCLUSIONS Force pyramid fingerprints provide 3D spatial assessment displays of instrument force application during simulated tumor resection. Neurosurgeon force utilization and ergonomic data form a basis for understanding and modulating resident force application and improving patient safety during tumor resection.
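The force pyramid itself, force summed along z for each xy position of the instrument tip, amounts to a weighted 2-D histogram. The sketch below shows that construction on synthetic data; the bin count and coordinate conventions are assumptions, not NeuroVR's.

```python
import numpy as np

def force_pyramid(x, y, fz, n_bins=50):
    """Sum the applied force over a grid of xy tool-tip positions, giving the
    'force pyramid' surface described above. x, y, fz are 1-D arrays of samples."""
    x_edges = np.linspace(x.min(), x.max(), n_bins + 1)
    y_edges = np.linspace(y.min(), y.max(), n_bins + 1)
    pyramid, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges], weights=fz)
    return pyramid, x_edges, y_edges

# Example with synthetic data: forces concentrated near the centre of the cavity.
rng = np.random.default_rng(0)
x, y = rng.normal(0, 5, 10_000), rng.normal(0, 5, 10_000)
fz = rng.uniform(0.1, 1.0, 10_000)
pyramid, _, _ = force_pyramid(x, y, fz)
print(pyramid.shape, pyramid.max())
```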
Reconstruction of Ancestral Genomes in Presence of Gene Gain and Loss.
Avdeyev, Pavel; Jiang, Shuai; Aganezov, Sergey; Hu, Fei; Alekseyev, Max A
2016-03-01
Since most dramatic genomic changes are caused by genome rearrangements as well as gene duplications and gain/loss events, it becomes crucial to understand their mechanisms and reconstruct ancestral genomes of the given genomes. This problem was shown to be NP-complete even in the "simplest" case of three genomes, thus calling for heuristic rather than exact algorithmic solutions. At the same time, a larger number of input genomes may actually simplify the problem in practice as it was earlier illustrated with MGRA, a state-of-the-art software tool for reconstruction of ancestral genomes of multiple genomes. One of the key obstacles for MGRA and other similar tools is presence of breakpoint reuses when the same breakpoint region is broken by several different genome rearrangements in the course of evolution. Furthermore, such tools are often limited to genomes composed of the same genes with each gene present in a single copy in every genome. This limitation makes these tools inapplicable for many biological datasets and degrades the resolution of ancestral reconstructions in diverse datasets. We address these deficiencies by extending the MGRA algorithm to genomes with unequal gene contents. The developed next-generation tool MGRA2 can handle gene gain/loss events and shares the ability of MGRA to reconstruct ancestral genomes uniquely in the case of limited breakpoint reuse. Furthermore, MGRA2 employs a number of novel heuristics to cope with higher breakpoint reuse and process datasets inaccessible for MGRA. In practical experiments, MGRA2 shows superior performance for simulated and real genomes as compared to other ancestral genome reconstruction tools.
Soldering Tool for Integrated Circuits
NASA Technical Reports Server (NTRS)
Takahashi, Ted H.
1987-01-01
Many connections soldered simultaneously in confined spaces. Improved soldering tool bonds integrated circuits onto printed-circuit boards. Intended especially for use with so-called "leadless-carrier" integrated circuits.
NASA Astrophysics Data System (ADS)
Gajdošová, Lenka; Seyringer, Dana
2017-02-01
We present the design and simulation of a 20-channel, 50-GHz Si3N4-based AWG using three different commercial photonics tools, namely PHASAR from Optiwave Systems Inc., APSS from Apollo Photonics Inc. and RSoft from Synopsys Inc. For this purpose, we created identical waveguide structures and identical AWG layouts in these tools and performed BPM simulations. For the simulations, the same calculation conditions were used. These AWGs were designed for TM-polarized light with an AWG central wavelength of 850 nm. The outputs of all simulations, the transmission characteristics, were used to calculate the transmission parameters defining the optical properties of the simulated AWGs. These parameters were summarized and compared with each other. The results feature very good correlation between the tools and are comparable to the designed parameters in the AWG-Parameters tool.
Compactified cosmological simulations of the infinite universe
NASA Astrophysics Data System (ADS)
Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László
2018-06-01
We present a novel N-body simulation method that compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to follow the evolution of the large-scale structure. Our approach eliminates the need for periodic boundary conditions, a mere numerical convenience which is not supported by observation and which modifies the law of force on large scales in an unrealistic fashion. We demonstrate that our method outclasses standard simulations executed on workstation-scale hardware in dynamic range; it is balanced in following a comparable number of high and low k modes, and its fundamental geometry and topology match observations. Our approach is also capable of simulating an expanding, infinite universe in static coordinates with Newtonian dynamics. The price of these achievements is that most of the simulated volume has smoothly varying mass and spatial resolution, an approximation that carries different systematics than periodic simulations. Our initial implementation of the method is called StePS, which stands for Stereographically projected cosmological simulations. It uses stereographic projection for space compactification and a naive O(N^2) force calculation, which nevertheless arrives at a correlation function of the same quality faster than any standard (tree or P3M) algorithm with similar spatial and mass resolution. The O(N^2) force calculation is easy to adapt to modern graphics cards, hence our code can function as a high-speed prediction tool for modern large-scale surveys. To learn about the limits of the respective methods, we compare StePS with GADGET-2 running matching initial conditions.
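The naive pairwise force calculation referred to above is the standard direct-summation scheme. The sketch below is a generic textbook version with a softening length, not the StePS implementation, which additionally compactifies space and targets GPUs.

```python
import numpy as np

def direct_forces(pos, mass, G=1.0, softening=1e-3):
    """Direct-summation Newtonian accelerations for N particles, O(N^2) pairs."""
    diff = pos[None, :, :] - pos[:, None, :]            # r_j - r_i, shape (N, N, 3)
    dist2 = (diff ** 2).sum(-1) + softening ** 2        # softened squared distances
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                       # no self-interaction
    return G * (diff * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

pos = np.random.default_rng(1).uniform(-1, 1, (200, 3))
mass = np.ones(200)
print(direct_forces(pos, mass).shape)  # (200, 3)
```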
Simulation and Measurement of Stray Light in the CLASP
NASA Technical Reports Server (NTRS)
Narukage, Noriyuki; Kano, Ryohei; Bando, Takamasa; Ishikawa, Ryoko; Kubo, Masahito; Tsuzuki, Toshihiro; Katsukawa, Yukio; Ishikawa, Shin-nosuke; Giono, Gabriel; Suematsu, Yoshinori;
2015-01-01
We are planning an international rocket experiment, the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), scheduled for 2015, to perform spectro-polarimetric observations of the solar Lyman-α line. The purpose of this experiment is to measure the magnetic field of the chromosphere and transition region directly by detecting the linear polarization of the Lyα line with an accuracy of 0.1% using the Hanle effect. Because the total visible-light flux of the Sun is overwhelmingly larger, about 200,000 times that in the Lyα wavelength region, even a small amount of visible stray light can prevent the 0.1% polarimetric accuracy from being achieved. We therefore first carried out a stray-light simulation of CLASP using the illumination design and analysis software Light Tools. A feature of this simulation is that it uses the optical design file (ZEMAX format) and the structural design file (STEP format) to reproduce CLASP as realistically as possible for the stray-light study. At the stage when a provisional assembly of the actual CLASP instrument had been completed, we fed sunlight into CLASP using the coelostat of the National Astronomical Observatory of Japan and measured the stray light (sun test). A pattern that was not seen in the simulation appeared in the stray-light measurements and required countermeasures. However, thanks to an additional simulation performed alongside the stray-light measurement, this pattern was found to be caused by light diffracted at the slit, and the simulation results are currently being used as a reference for those countermeasures. In this presentation, we report the stray-light simulation and stray-light measurement results that we have obtained.
Three-dimensional touch interface for medical education.
Panchaphongsaphak, Bundit; Burgkart, Rainer; Riener, Robert
2007-05-01
We present the technical principle and evaluation of a multimodal virtual reality (VR) system for medical education, called a touch simulator. This touch simulator comes with an innovative three-dimensional (3-D) touch sensitive input device. The device comprises a six-axis force-torque sensor connected to a tangible object representing the shape of an anatomical structure. Information related to the point of contact is recorded by the sensor, processed, and audiovisually displayed. The touch simulator provides a high level of user-friendliness and fidelity compared to other purely graphically oriented simulation environments. In this paper, the touch simulator has been realized as an interactive neuroanatomical training simulator. The user can visualize and manipulate graphical information of the brain surface or different cross-sectional slices by a finger-touch on a brain-like shaped tangible object. We evaluated the system by theoretical derivations, experiments, and subjective questionnaires. In the theoretical analysis, we could show that the contact point estimation error mainly depends on the accuracy and the noise of the sensor, the amount and direction of the applied force, and the geometry of the tangible object. The theoretical results could be validated by experiments: applying a normal force of 10 N on a 120 mm x 120 mm x 120 mm cube causes a maximum error of 2.5 +/- 0.7 mm. This error becomes smaller when increasing the contact force. Based on the survey results, the touch simulator may be a useful tool for assisting medical schools in the visualization of brain image data and the study of neuroanatomy.
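The contact-point estimation underlying the touch simulator follows from the rigid-body relation τ = r × F measured by the force-torque sensor: (F × τ)/|F|² gives the component of r perpendicular to F, and the remaining ambiguity along the force's line of action is resolved by intersecting that line with the known surface of the tangible object. The sketch below computes only the line of possible contact points; sensor frames, offsets and the surface intersection are left out as simplifying assumptions.

```python
import numpy as np

def contact_line(force, torque):
    """Given force and torque measured at the sensor origin (tau = r x F),
    return a point on the line of possible contact points and the line
    direction (along the force). Intersecting this line with the object's
    known shape yields the actual contact point."""
    f2 = np.dot(force, force)
    point = np.cross(force, torque) / f2   # component of r perpendicular to F
    direction = force / np.sqrt(f2)
    return point, direction

# Example: a 10 N force along -z applied at r = (0.03, 0.02, 0.05) m.
r = np.array([0.03, 0.02, 0.05])
F = np.array([0.0, 0.0, -10.0])
tau = np.cross(r, F)
print(contact_line(F, tau)[0])  # recovers (0.03, 0.02, 0.0): r with its component along F removed
```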
Differing types of cellular phone conversations and dangerous driving.
Dula, Chris S; Martin, Benjamin A; Fox, Russell T; Leonard, Robin L
2011-01-01
This study sought to investigate the relationship between cell phone conversation type and dangerous driving behaviors. It was hypothesized that more emotional phone conversations engaged in while driving would produce greater frequencies of dangerous driving behaviors in a simulated environment than more mundane conversation or no phone conversation at all. Participants were semi-randomly assigned to one of three conditions: (1) no call, (2) mundane call, and, (3) emotional call. While driving in a simulated environment, participants in the experimental groups received a phone call from a research confederate who either engaged them in innocuous conversation (mundane call) or arguing the opposite position of a deeply held belief of the participant (emotional call). Participants in the no call and mundane call groups differed significantly only on percent time spent speeding and center line crossings, though the mundane call group consistently engaged in more of all dangerous driving behaviors than did the no call participants. Participants in the emotional call group engaged in significantly more dangerous driving behaviors than participants in both the no call and mundane call groups, with the exception of traffic light infractions, where there were no significant group differences. Though there is need for replication, the authors concluded that whereas talking on a cell phone while driving is risky to begin with, having emotionally intense conversations is considerably more dangerous. Copyright © 2010 Elsevier Ltd. All rights reserved.
Durr, W
1998-01-01
Call centers are strategically and tactically important to many industries, including the healthcare industry. Call centers play a key role in acquiring and retaining customers. The ability to deliver high-quality and timely customer service without much expense is the basis for the proliferation and expansion of call centers. Call centers are unique blends of people and technology, where performance depends on combining appropriate technology tools with sound management practices built on key operational data. While the technology is fascinating, the people working in call centers and the skill of the management team ultimately make a difference to their companies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Häggström, Ida, E-mail: haeggsti@mskcc.org; Beattie, Bradley J.; Schmidtlein, C. Ross
2016-06-15
Purpose: To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection), for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and evaluation of the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. Methods: The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user specified method, settings, and corrections. Reconstructed images were compared to MC data, and simple Gaussian noised time activity curves (GAUSS). Results: dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3% points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region of interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. Conclusions: The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results, however since it uses simple scatter and random models it may not be suitable for studies investigating these phenomena. dPETSTEP can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
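For orientation, the following heavily simplified Python sketch shows the kind of voxel-wise time-activity-curve generation with counting noise that a projection-based dynamic PET simulator performs; the one-tissue compartment model, rate constants, frame duration, and count scaling are illustrative assumptions, not dPETSTEP code.

```python
import numpy as np

def one_tissue_tac(t, k1, k2, cp):
    """Tissue time-activity curve: C_t(t) = K1 * exp(-k2*t) convolved with
    the plasma input Cp(t), evaluated on a uniform time grid."""
    dt = t[1] - t[0]
    irf = k1 * np.exp(-k2 * t)                 # impulse response of the model
    return np.convolve(cp, irf)[: len(t)] * dt

def noisy_frames(t, tac, frame_dur, scale=1e3):
    """Frame-average the curve and add Poisson counting noise, a crude
    stand-in for the projection/reconstruction chain of a full simulator."""
    starts = np.arange(0.0, t[-1], frame_dur)
    means = np.array([tac[(t >= s) & (t < s + frame_dur)].mean() for s in starts])
    counts = np.random.poisson(means * frame_dur * scale)
    return starts, counts / (frame_dur * scale)

t = np.linspace(0.0, 60.0, 3600)               # minutes
cp = 10.0 * t * np.exp(-t / 2.0)               # toy plasma input function
frame_t, frames = noisy_frames(t, one_tissue_tac(t, 0.1, 0.05, cp), frame_dur=2.0)
print(frames[:5])
```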
Ocean Drilling Program: TAMRF Administrative Services: Meeting, Travel, and Port-Call Information
All ODP meeting and port-call activities are complete.
Marketing, Management and Performance: Multilingualism as Commodity in a Tourism Call Centre
ERIC Educational Resources Information Center
Duchene, Alexandre
2009-01-01
This paper focuses on the ways an institution of the new economy--a tourism call centre in Switzerland--markets, manages and performs multilingual services. In particular, it explores the ways multilingualism operates as a strategic and managerial tool within tourism call centres and how the institutional regulation of language practices…
OCSEGen: Open Components and Systems Environment Generator
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana
2014-01-01
To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.
Semantic Importance Sampling for Statistical Model Checking
2015-01-16
SIS reduces SMT calls while maintaining correctness. SIS is implemented in a tool called osmosis and used to verify a number of stochastic systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udhay Ravishankar; Milos Manic
2013-08-01
This paper presents a micro-grid simulator tool useful for implementing and testing multi-agent controllers (SGridSim). As a common engineering practice it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. This paper presents the SGridSim micro-grid simulator tool that simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of components that typically make up a micro-grid. The term EN2NCI models means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are 1) simulation of a micro-grid is performed strictly in the complex domain; 2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
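A minimal sketch of the underlying idea, assuming a hypothetical three-node network: every branch is a single node-to-node complex impedance, a nodal admittance matrix is assembled, and the complex node voltages are solved directly, from which branch power flows follow. This is not SGridSim code and all values are illustrative.

```python
import numpy as np

# Hypothetical three-node micro-grid: node 0 is the slack (source) bus held at
# 1.0 pu, nodes 1 and 2 draw fixed complex current loads.  Every branch is a
# single node-to-node complex impedance, in the spirit of the EN2NCI idea.
Z = {(0, 1): 0.02 + 0.06j, (1, 2): 0.03 + 0.08j, (0, 2): 0.05 + 0.10j}
Y = np.zeros((3, 3), dtype=complex)
for (i, j), z in Z.items():                    # assemble the nodal admittance matrix
    y = 1.0 / z
    Y[i, i] += y
    Y[j, j] += y
    Y[i, j] -= y
    Y[j, i] -= y

v_slack = 1.0 + 0.0j
i_load = np.array([-0.5 - 0.2j, -0.3 - 0.1j])  # current injections at nodes 1 and 2

# Solve the reduced nodal equations Y_ll * v_l = i_l - Y_ls * v_s strictly in
# the complex domain (steady state, no transient simulation).
v_load = np.linalg.solve(Y[1:, 1:], i_load - Y[1:, 0] * v_slack)
v = np.concatenate(([v_slack], v_load))

for (i, j), z in Z.items():                    # branch power flow S_ij = V_i * conj(I_ij)
    print("branch", (i, j), "S =", v[i] * np.conj((v[i] - v[j]) / z))
```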
Boxwala, A A; Chaney, E L; Fritsch, D S; Friedman, C P; Rosenman, J G
1998-09-01
The purpose of this investigation was to design and implement a prototype physician workstation, called PortFolio, as a platform for developing and evaluating, by means of controlled observer studies, user interfaces and interactive tools for analyzing and managing digital portal images. The first observer study was designed to measure physician acceptance of workstation technology, as an alternative to a view box, for inspection and analysis of portal images for detection of treatment setup errors. The observer study was conducted in a controlled experimental setting to evaluate physician acceptance of the prototype workstation technology exemplified by PortFolio. PortFolio incorporates a windows user interface, a compact kit of carefully selected image analysis tools, and an object-oriented data base infrastructure. The kit evaluated in the observer study included tools for contrast enhancement, registration, and multimodal image visualization. Acceptance was measured in the context of performing portal image analysis in a structured protocol designed to simulate clinical practice. The acceptability and usage patterns were measured from semistructured questionnaires and logs of user interactions. Radiation oncologists, the subjects for this study, perceived the tools in PortFolio to be acceptable clinical aids. Concerns were expressed regarding user efficiency, particularly with respect to the image registration tools. The results of our observer study indicate that workstation technology is acceptable to radiation oncologists as an alternative to a view box for clinical detection of setup errors from digital portal images. Improvements in implementation, including more tools and a greater degree of automation in the image analysis tasks, are needed to make PortFolio more clinically practical.
Xu, Lingyang; Hou, Yali; Bickhart, Derek M; Song, Jiuzhou; Liu, George E
2013-06-25
Copy number variations (CNVs) are gains and losses of genomic sequence between two individuals of a species when compared to a reference genome. The data from single nucleotide polymorphism (SNP) microarrays are now routinely used for genotyping, but they also can be utilized for copy number detection. Substantial progress has been made in array design and CNV calling algorithms and at least 10 comparison studies in humans have been published to assess them. In this review, we first survey the literature on existing microarray platforms and CNV calling algorithms. We then examine a number of CNV calling tools to evaluate their impacts using bovine high-density SNP data. Large incongruities in the results from different CNV calling tools highlight the need for standardizing array data collection, quality assessment and experimental validation. Only after careful experimental design and rigorous data filtering can the impacts of CNVs on both normal phenotypic variability and disease susceptibility be fully revealed.
Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework
NASA Astrophysics Data System (ADS)
Cañadas, M.; Arce, P.; Rato Mendes, P.
2011-01-01
Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals or the storage of single events for an off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL-1 and the simulated peak was 247.1 kcps at 0.87 MBq mL-1). Agreement better than 3% was obtained in the scatter fraction comparison study. We also measured and simulated a mini-Derenzo phantom obtaining images with similar quality using iterative reconstruction methods. We concluded that the overall performance of the simulation showed good agreement with the measured results and validates the GAMOS package for PET applications. Furthermore, its ease of use and flexibility recommends it as an excellent tool to optimize design features or image reconstruction techniques.
FFI: A software tool for ecological monitoring
Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...
A mathematical and numerical model is developed to simulate the transport and fate of NAPLs (Non-Aqueous Phase Liquids) in near-surface granular soils. The resulting three-dimensional, three phase simulator is called NAPL. The simulator accommodates three mobile phases: water, NA...
SMARTE: IMPROVING REVITALIZATION DECISIONS (BERLIN, GERMANY)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN)
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
NASA Astrophysics Data System (ADS)
Kiliclar, Yalin; Laurischkat, Roman; Vladimirov, Ivaylo N.; Reese, Stefanie
2011-08-01
The presented project deals with a robot-based incremental sheet metal forming process, which is called roboforming and has been developed at the Chair of Production Systems. It is characterized by flexible shaping using a freely programmable path-synchronous movement of two industrial robots. The final shape is produced by the incremental infeed of the forming tool in depth direction and its movement along the part contour in lateral direction. However, the resulting geometries formed in roboforming deviate several millimeters from the reference geometry. This results from the compliance of the involved machine structures and the springback effects of the workpiece. The project aims to predict these deviations and to carry out a compensative path planning based on this prediction. Therefore a planning tool is implemented which compensates for the robots' compliance and the springback effects of the sheet metal. The forming process is simulated by means of a finite element analysis using a material model developed at the Institute of Applied Mechanics (IFAM). It is based on the multiplicative split of the deformation gradient in the context of hyperelasticity and combines nonlinear kinematic and isotropic hardening. Low-order finite elements used to simulate thin sheet structures, such as used for the experiments, have the major problem of locking, a nonphysical stiffening effect. For an efficient finite element analysis a special solid-shell finite element formulation based on reduced integration with hourglass stabilization has been developed. To circumvent different locking effects, the enhanced assumed strain (EAS) and the assumed natural strain (ANS) concepts are included in this formulation. Having such powerful tools available we obtain more accurate geometries.
A Computational Tool to Detect and Avoid Redundancy in Selected Reaction Monitoring
Röst, Hannes; Malmström, Lars; Aebersold, Ruedi
2012-01-01
Selected reaction monitoring (SRM), also called multiple reaction monitoring, has become an invaluable tool for targeted quantitative proteomic analyses, but its application can be compromised by nonoptimal selection of transitions. In particular, complex backgrounds may cause ambiguities in SRM measurement results because peptides with interfering transitions similar to those of the target peptide may be present in the sample. Here, we developed a computer program, the SRMCollider, that calculates nonredundant theoretical SRM assays, also known as unique ion signatures (UIS), for a given proteomic background. We show theoretically that UIS of three transitions suffice to conclusively identify 90% of all yeast peptides and 85% of all human peptides. Using predicted retention times, the SRMCollider also simulates time-scheduled SRM acquisition, which reduces the number of interferences to consider and leads to fewer transitions necessary to construct an assay. By integrating experimental fragment ion intensities from large scale proteome synthesis efforts (SRMAtlas) with the information content-based UIS, we combine two orthogonal approaches to create high quality SRM assays ready to be deployed. We provide a user friendly, open source implementation of an algorithm to calculate UIS of any order that can be accessed online at http://www.srmcollider.org to find interfering transitions. Finally, our tool can also simulate the specificity of novel data-independent MS acquisition methods in Q1–Q3 space. This allows us to predict parameters for these methods that deliver a specificity comparable with that of SRM. Using SRM interference information in addition to other sources of information can increase the confidence in an SRM measurement. We expect that the consideration of information content will become a standard step in SRM assay design and analysis, facilitated by the SRMCollider. PMID:22535207
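A toy sketch of the unique-ion-signature idea rather than the SRMCollider implementation: a transition combination is considered unique if no single background peptide can interfere with every transition in it within the Q1/Q3 tolerances; the tolerances and m/z values below are invented for illustration.

```python
from itertools import combinations

def interferes(t1, t2, q1_tol=0.7, q3_tol=0.7):
    """Two transitions interfere when both precursor (Q1) and fragment (Q3)
    m/z fall within the instrument isolation tolerances."""
    return abs(t1[0] - t2[0]) <= q1_tol and abs(t1[1] - t2[1]) <= q3_tol

def smallest_uis(target, background, max_order=3):
    """Smallest combination of target transitions that no single background
    peptide can fully mimic (a unique ion signature of that order)."""
    for order in range(1, max_order + 1):
        for combo in combinations(target, order):
            unique = all(
                not all(any(interferes(t, b) for b in bg) for t in combo)
                for bg in background
            )
            if unique:
                return combo
    return None

# Each transition is a (precursor m/z, fragment m/z) pair; values are invented.
target = [(523.7, 625.3), (523.7, 738.4), (523.7, 851.5)]
background = [[(523.9, 625.5), (523.9, 900.1)], [(524.1, 738.6)]]
print(smallest_uis(target, background))        # -> ((523.7, 851.5),)
```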
Trabelsi, Meriam; Mandart, Elisabeth; Le Grusse, Philippe; Bord, Jean-Paul
2016-01-01
The use of plant protection products enables farmers to maximize economic performance and yields, but in return, the environment and human health can be greatly affected because of their toxicity. There are currently strong calls for farmers to reduce the use of these toxic products to preserve the environment and human health, and it has become urgent to invest in more sustainable models that help reduce these risks. One possible solution is the transition toward agroecological production systems. These new systems must be beneficial economically, socially, and environmentally in terms of human health. There are many tools available, based on a range of indicators, for assessing the sustainability of agricultural systems on conventional farm holdings. These methods are poorly suited to agroecological farms and do not measure the performance of farms in agroecological transition. In this article, we therefore develop a model for the strategic definition, guidance, and assistance of a transition to agroecological practices, capable of assessing the performance of this transition and simulating the consequences of possible changes. This model was built by coupling (i) a decision-support tool and a technico-economic simulator with (ii) a conceptual model built from the dynamics of agroecological practices. This tool is currently being tested in the framework of a Compte d'Affectation Spéciale pour le Développement Agricole et Rural (CASDAR) project (CASDAR: project launched in 2013 by the French Ministry of Agriculture, Food and Forestry, on the theme "collective mobilisation for agroecology," http://agriculture.gouv.fr/Appel-a-projets-CASDAR ) using data from farms, most of which have been engaged in agroenvironmental processes and in reducing plant protection treatments since 2008.
Combining Simulation Tools for End-to-End Trajectory Optimization
NASA Technical Reports Server (NTRS)
Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min
2015-01-01
Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2) and a new tool to optimize the full problem by operating both simulations simultaneously was born.
The value of SPaCE in delivering patient feedback.
Clapham, Laura; Allan, Laura; Stirling, Kevin
2016-02-01
The use of simulated patients (SPs) within undergraduate medical curricula is an established and valued learning opportunity. Within the context of simulation, it is imperative to capture feedback from all participants within the simulation activity. The Simulated Patient Candidate Evaluation (SPaCE) tool was developed to deliver SP feedback following a simulation activity. SPaCE is a closed feedback tool that allows SPs to rate a student's performance, using a five-point Likert scale, in three domains: attitude; interaction skills; and management. This research study examined the value of the SPaCE tool and how it contributes to the overall feedback that a student receives. Classical test theory was used to determine the reliability of the SPaCE tool. An evaluation of all SP responses was conducted to observe trends in scoring patterns for each question. Qualitative data were collected via a free-text questionnaire and subsequent focus group discussion. Classical test theory determined that the SPaCE tool had a reliability coefficient of 0.89. A total of 13 SPs replied to the questionnaire. A thematic analysis of all questionnaire data identified that the SPaCE tool provides a structure that allows patient feedback to be given effectively following a simulation activity. These themes were discussed further with six SPs who attended the subsequent focus group session. The SPaCE tool has been shown to be a reliable closed feedback tool that allows SPs to discriminate between students, based on their performance. The next stage in the development of the SPaCE tool is to test the wider applicability of this feedback tool. © 2015 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Thomas, Stan J.
1993-01-01
KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs which were designed and added to the collection of tools which comprise the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues having to do with these two tools are discussed. It will be useful to anyone maintaining or extending either the schematic viewer or the model editor.
NASA Astrophysics Data System (ADS)
Chakraborty, Abhishek
Detection of particulate matter thinly dispersed in a fluid medium with the aid of the difference in electrical conductivity between the pure fluid and the particles has been practiced for at least the last 50 to 60 years. The first such instruments were employed to measure cell counts in samples of biological fluid. Following a detailed study of the physics and principles operating within the device, called the Electric Sensing Zone (ESZ) principle, a new device called the Liquid Metal Cleanliness Analyzer (LiMCA) was invented which could measure and count particles of inclusions in molten metal. It provided a fast and fairly accurate tool to make online measurements of the quality of steel during refining and casting operations. Along similar lines of development to the LiMCA, a water analogue of the device, called the Aqueous Particle Sensor (APS), was developed for physical modeling experiments of metal refining operations involving water models. The APS can detect and measure simulated particles of inclusions added to the working fluid (water). The present study involves the designing, building and final application of a new and improved APS in water modeling experiments to study inclusion behavior in a tundish operation. The custom-built instrument shows superior performance and applicability in experiments involving physical modeling of metal refining operations, compared to its commercial counterparts. In addition to higher accuracy and range of operating parameters, its capability to take real-time experimental data for extended periods of time helps to reduce the total number of experiments required to reach a result, and makes it suitable for analyzing temporal changes occurring in unsteady systems. With the modern impetus on the quality of the final product of metallurgical operations, the new APS can prove to be an indispensable research tool to study and put forward innovative design and parametric changes in industrially practised metallurgical operations.
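As a sketch of the Electric Sensing Zone principle, using the classical small-particle (Maxwell/DeBlois-Bean) approximation ΔR ≈ 4ρd³/(πD⁴) for an insulating sphere in an electrolyte-filled orifice; the particle size, orifice diameter, fluid resistivity, and sensing current below are illustrative assumptions, not instrument parameters.

```python
import numpy as np

def resistance_pulse(d, D, rho):
    """Maxwell/DeBlois-Bean small-particle approximation for the resistance
    change when an insulating sphere of diameter d passes through an
    electrolyte-filled orifice of diameter D (fluid resistivity rho)."""
    return 4.0 * rho * d ** 3 / (np.pi * D ** 4)

# Illustrative numbers only: a 20 um particle in a 300 um orifice, an aqueous
# model fluid of ~0.7 ohm*m resistivity, and a 10 mA constant sensing current.
d, D, rho, current = 20e-6, 300e-6, 0.7, 10e-3
dV = current * resistance_pulse(d, D, rho)
print(f"pulse height ~ {dV * 1e3:.2f} mV for a {d * 1e6:.0f} um particle")
# Because the pulse scales with d**3, its height maps directly to particle
# volume, which is how ESZ instruments size and count individual inclusions.
```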
New Tooling System for Forming Aluminum Beverage Can End Shell
NASA Astrophysics Data System (ADS)
Yamazaki, Koetsu; Otsuka, Takayasu; Han, Jing; Hasegawa, Takashi; Shirasawa, Taketo
2011-08-01
This paper proposes a new tooling system for forming shells of aluminum beverage can ends. At first, the forming process of a conventional tooling system has been simulated using three-dimensional finite element models. Simulation results have been confirmed to be consistent with those of axisymmetric models, so simulations for further study have been performed using axisymmetric models to save computational time. A comparison shows that thinning of the shell formed by the proposed tooling system has been improved by about 3.6%. Influences of the tool upmost surface profiles and tool initial positions in the new tooling system have been investigated, and the design optimization method based on the numerical simulations has then been applied to search for optimum design points, in order to minimize thinning subject to the constraints on the geometrical dimensions of the shell. At last, the performance of the shell subjected to internal pressure has been confirmed to meet design requirements.
Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan
2014-01-01
LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784
Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan
2014-12-15
LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David
2018-05-01
As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result the tool deformations can have an impact on the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate the sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of the tool deformations on the final part quality is generally neglected and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped on the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently in such a way that the tool deformations can be included in the drawing simulation in real time without a large increase in simulation time compared to simulations with rigid tools. In this paper a study will be presented which demonstrates the effect of tool deformations on the final part quality.
Upgrading Custom Simulink Library Components for Use in Newer Versions of Matlab
NASA Technical Reports Server (NTRS)
Stewart, Camiren L.
2014-01-01
The Spaceport Command and Control System (SCCS) at Kennedy Space Center (KSC) is a control system for monitoring and launching manned launch vehicles. Simulations of ground support equipment (GSE) and the launch vehicle systems are required throughout the life cycle of SCCS to test software, hardware, and procedures to train the launch team. The simulations of the GSE at the launch site in conjunction with off-line processing locations are developed using Simulink, a piece of Commercial Off-The-Shelf (COTS) software. The simulations that are built are then converted into code and run in a simulation engine called Trick, a Government Off-The-Shelf (GOTS) piece of software developed by NASA. In the world of hardware and software, it is not uncommon to see the products that are utilized be upgraded and patched or eventually fade away into an obsolete status. In the case of SCCS simulation software, MathWorks has released a number of stable versions of Simulink since the deployment of the software on the Development Work Stations in the Linux environment (DWLs). The upgraded versions of Simulink have introduced a number of new tools and resources that, if utilized fully and correctly, will save time and resources during the overall development of the GSE simulation and its corresponding documentation. Unfortunately, simply importing the already built simulations into the new Matlab environment will not suffice, as it may produce results that differ from those obtained in the version that is currently being utilized. Thus, an upgrade execution plan was developed and executed to fully upgrade the simulation environment to one of the latest versions of Matlab.
NASA Astrophysics Data System (ADS)
Colarco, P. R.; Gasso, S.; Jethva, H. T.; Buchard, V.; Ahn, C.; Torres, O.; daSilva, A.
2016-12-01
Output from the NASA Goddard Earth Observing System, version 5 (GEOS-5) Earth system model is used to simulate the top-of-atmosphere 354 and 388 nm radiances observed by the Ozone Monitoring Instrument (OMI) onboard the Aura spacecraft. The principal purpose of developing this simulator tool is to compute from the modeled fields the so-called OMI Aerosol Index (AI), which is a more fundamental retrieval product than higher level products such as the aerosol optical depth (AOD) or absorbing aerosol optical depth (AAOD). This lays the groundwork for eventually developing a capability to assimilate either the OMI AI or its radiances, which would provide further constraint on aerosol loading and absorption properties for global models. We extend the use of the simulator capability to understand the nature of the OMI aerosol retrieval algorithms themselves in an Observing System Simulation Experiment (OSSE). The simulated radiances are used to calculate the AI from the modeled fields. These radiances are also provided to the OMI aerosol algorithms, which return their own retrievals of the AI, AOD, and AAOD. Our assessment reveals that the OMI-retrieved AI can be mostly harmonized with the model-derived AI given the same radiances, provided a common surface pressure field is assumed. This is important because the operational OMI algorithms presently assume a fixed pressure field, while the contribution of molecular scattering to the actual OMI signal in fact responds to the actual atmospheric pressure profile, which is accounted for in our OSSE by using GEOS-5 produced atmospheric reanalyses. Other differences between the model and OMI AI are discussed, and we present a preliminary assessment of the OMI AOD and AAOD products with respect to the known inputs from the GEOS-5 simulation.
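For reference, the commonly quoted residue form of the UV Aerosol Index can be sketched as follows; the -100·Δlog10 convention is the standard textbook form, the radiance values are invented, and the exact formulation used by the operational OMI algorithm may differ.

```python
import numpy as np

def aerosol_index(i354_obs, i388_obs, i354_rayleigh, i388_rayleigh):
    """Residue-style UV Aerosol Index: difference between the observed and the
    Rayleigh-only (aerosol-free) 354/388 nm radiance ratios, scaled by -100.
    Positive values flag UV-absorbing aerosol."""
    obs = np.log10(i354_obs / i388_obs)
    calc = np.log10(i354_rayleigh / i388_rayleigh)
    return -100.0 * (obs - calc)

# Invented normalized radiances: absorbing aerosol darkens 354 nm more than
# 388 nm relative to the Rayleigh calculation, giving a positive AI.
print(aerosol_index(0.92, 1.10, 0.95, 1.10))
```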
A flexible, interactive software tool for fitting the parameters of neuronal models.
Friedrich, Péter; Vella, Michael; Gulyás, Attila I; Freund, Tamás F; Káli, Szabolcs
2014-01-01
The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool.
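A minimal sketch of the general fitting task such a framework addresses, not Optimizer's actual API: a toy passive membrane response is fitted to a noisy target trace with a sum-of-squares cost and a derivative-free optimizer; the model, parameter values, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 0.2, 400)                     # seconds

def membrane_response(params, t):
    """Toy passive response to a 0.1 nA current step: V = R*I*(1 - exp(-t/tau))."""
    r_in, tau = params
    return r_in * 0.1e-9 * (1.0 - np.exp(-t / tau))

# "Experimental" target trace generated from known parameters plus noise.
target = membrane_response((200e6, 0.02), t) + np.random.normal(0.0, 2e-4, t.size)

def cost(params):
    """Sum-of-squares distance between model output and target trace."""
    return np.sum((membrane_response(params, t) - target) ** 2)

fit = minimize(cost, x0=(100e6, 0.01), method="Nelder-Mead")
print("fitted input resistance and time constant:", fit.x)
```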
Simulations for the Development of Thermoelectric Measurements
NASA Astrophysics Data System (ADS)
Zabrocki, Knud; Ziolkowski, Pawel; Dasgupta, Titas; de Boor, Johannes; Müller, Eckhard
2013-07-01
In thermoelectricity, continuum theoretical equations are usually used for the calculation of the characteristics and performance of thermoelectric elements, modules or devices as a function of external parameters (material, geometry, temperatures, current, flow, load, etc.). An increasing number of commercial software packages aimed at applications, such as COMSOL and ANSYS, contain kernels using direct thermoelectric coupling. Application of these numerical tools also allows analysis of physical measurement conditions and can lead to specifically adapted methods for developing special test equipment required for the determination of TE material and module properties. System-theoretical and simulation-based considerations of favorable geometries are taken into account to create draft sketches in the development of such measurement systems. Particular consideration is given to the development of transient measurement methods, which have great advantages compared with the conventional static methods in terms of the measurement duration required. In this paper the benefits of using numerical tools in designing measurement facilities are shown using two examples. The first is the determination of geometric correction factors in four-point probe measurement of electrical conductivity, whereas the second example is focused on the so-called combined thermoelectric measurement (CTEM) system, where all thermoelectric material properties (Seebeck coefficient, electrical and thermal conductivity, and Harman measurement of zT) are measured in a combined way. Here, we want to highlight especially the measurement of thermal conductivity in a transient mode. Factors influencing the measurement results such as coupling to the environment due to radiation, heat losses via the mounting of the probe head, as well as contact resistance between the sample and sample holder are illustrated, analyzed, and discussed. By employing the results of the simulations, we have developed an improved sample head that allows for measurements over a larger temperature interval with enhanced accuracy.
VCFR: A package to manipulate and visualize variant call format data in R
USDA-ARS?s Scientific Manuscript database
Software to call single nucleotide polymorphisms or related genetic variants has converged on the variant call format (vcf) as their output format of choice. This has created a need for tools to work with vcf files. While an increasing number of software exists to read vcf data, many of them only ex...
ERIC Educational Resources Information Center
Hamel, Marie-Josee; Caws, Catherine
2010-01-01
This article discusses CALL development from both educational and ergonomic perspectives. It focuses on the learner-task-tool interaction, in particular on the aspects contributing to its overall quality, herein called "usability." Two pilot studies are described that were carried out with intermediate to advanced learners of French in two…
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN) - slides
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
Data and Tools | Hydrogen and Fuel Cells | NREL
A collection of data and analysis tools for researchers, developers, investors, and others, including the Alternative Fuels Data Center tools (calculators to help decision-makers reduce petroleum use) and FASTSim, the Future Automotive Systems Technology Simulator.
Windowed multitaper correlation analysis of multimodal brain monitoring parameters.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates the cerebral perfusion and oxygen supply in case of a severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method, the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally, it could be shown that the occurrence of a distinct correlation parameter, called scp, represents a high-quality predictor of the patient's outcome.
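A simplified sketch of the windowed correlation idea, using Welch-based coherence and Hilbert phase differences as stand-ins for the multitaper estimators described in the paper; the sampling rate, window length, frequency band, and synthetic signals are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence, hilbert

fs = 1.0                                        # assumed 1 Hz monitoring data
t = np.arange(4096) / fs
abp = np.sin(2 * np.pi * 0.01 * t) + 0.3 * np.random.randn(t.size)
icp = np.sin(2 * np.pi * 0.01 * t + 0.8) + 0.3 * np.random.randn(t.size)

win = 1024                                      # analysis window in samples
for start in range(0, t.size - win + 1, win // 2):   # 50% overlapping windows
    a, b = abp[start:start + win], icp[start:start + win]
    f, coh = coherence(a, b, fs=fs, nperseg=256)
    band = (f > 0.005) & (f < 0.05)             # slow-wave band of interest
    dphi = np.angle(hilbert(a)) - np.angle(hilbert(b))   # Hilbert phase difference
    print(f"t={start:5d}s  peak coherence={coh[band].max():.2f}  "
          f"mean phase lag={np.mean(np.unwrap(dphi)):.2f} rad")
```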
INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT
A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...
Hu, Haixiang; Zhang, Xin; Ford, Virginia; Luo, Xiao; Qi, Erhui; Zeng, Xuefeng; Zhang, Xuejun
2016-11-14
Edge effect is regarded as one of the most difficult technical issues in a computer controlled optical surfacing (CCOS) process. Traditional opticians have to even up the consequences of the two following cases. Operating CCOS in a large overhang condition affects the accuracy of material removal, while in a small overhang condition, it achieves a more accurate performance, but leaves a narrow rolled-up edge, which takes time and effort to remove. In order to control the edge residuals in the latter case, we present a new concept of the 'heterocercal' tool influence function (TIF). Generated from compound motion equipment, this type of TIF can 'transfer' the material removal from the inner place to the edge, meanwhile maintaining the high accuracy and efficiency of CCOS. We call it the 'heterocercal' TIF, because of the inspiration from the heterocercal tails of sharks, whose upper lobe provides most of the explosive power. The heterocercal TIF was theoretically analyzed, and physically realized in CCOS facilities. Experimental and simulation results showed good agreement. It enables significant control of the edge effect and convergence of entire surface errors in large tool-to-mirror size-ratio conditions. This improvement will largely help manufacturing efficiency in some extremely large optical system projects, like the tertiary mirror of the Thirty Meter Telescope.
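A generic dwell-time convolution sketch, not the authors' heterocercal TIF, illustrating why an edge effect arises in CCOS: with a Gaussian stand-in for the tool influence function, removal near the part edge drops because part of the TIF overhangs the workpiece; all shapes and values are assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

# Gaussian stand-in for a tool influence function (removal rate per second).
x = np.linspace(-1.0, 1.0, 41)
X, Y = np.meshgrid(x, x)
tif = 1e-3 * np.exp(-(X ** 2 + Y ** 2) / 0.1)

dwell = np.ones((200, 200))                     # uniform dwell-time map (seconds)
removal = fftconvolve(dwell, tif, mode="same")  # material removed at each point

# Near the part edge part of the TIF overhangs the workpiece, so less material
# is removed there and a raised rim remains: the classic CCOS edge effect.
print("removal at centre:", removal[100, 100])
print("removal at edge:  ", removal[0, 100])
```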
AquaCrop-OS: A tool for resilient management of land and water resources in agriculture
NASA Astrophysics Data System (ADS)
Foster, Timothy; Brozovic, Nicholas; Butler, Adrian P.; Neale, Christopher M. U.; Raes, Dirk; Steduto, Pasquale; Fereres, Elias; Hsiao, Theodore C.
2017-04-01
Water managers, researchers, and other decision makers worldwide are faced with the challenge of increasing food production under population growth, drought, and rising water scarcity. Crop simulation models are valuable tools in this effort, and, importantly, provide a means of quantifying rapidly crop yield response to water, climate, and field management practices. Here, we introduce a new open-source crop modelling tool called AquaCrop-OS (Foster et al., 2017), which extends the functionality of the globally used FAO AquaCrop model. Through case studies focused on groundwater-fed irrigation in the High Plains and Central Valley of California in the United States, we demonstrate how AquaCrop-OS can be used to understand the local biophysical, behavioural, and institutional drivers of water risks in agricultural production. Furthermore, we also illustrate how AquaCrop-OS can be combined effectively with hydrologic and economic models to support drought risk mitigation and decision-making around water resource management at a range of spatial and temporal scales, and highlight future plans for model development and training. T. Foster, et al. (2017) AquaCrop-OS: An open source version of FAO's crop water productivity model. Agricultural Water Management. 181: 18-22. http://dx.doi.org/10.1016/j.agwat.2016.11.015.
Scheuch, Matthias; Höper, Dirk; Beer, Martin
2015-03-03
Fuelled by the advent and subsequent development of next generation sequencing technologies, metagenomics became a powerful tool for the analysis of microbial communities both scientifically and diagnostically. The biggest challenge is the extraction of relevant information from the huge sequence datasets generated for metagenomics studies. Although a plethora of tools are available, data analysis is still a bottleneck. To overcome the bottleneck of data analysis, we developed an automated computational workflow called RIEMS - Reliable Information Extraction from Metagenomic Sequence datasets. RIEMS assigns every individual read sequence within a dataset taxonomically by cascading different sequence analyses with decreasing stringency of the assignments using various software applications. After completion of the analyses, the results are summarised in a clearly structured result protocol organised taxonomically. The high accuracy and performance of RIEMS analyses were proven in comparison with other tools for metagenomics data analysis using simulated sequencing read datasets. RIEMS has the potential to fill the gap that still exists with regard to data analysis for metagenomics studies. The usefulness and power of RIEMS for the analysis of genuine sequencing datasets was demonstrated with an early version of RIEMS in 2011 when it was used to detect the orthobunyavirus sequences leading to the discovery of Schmallenberg virus.
Patient-specific finite element modeling of bones.
Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A
2013-04-01
Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state-of-the-art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and judgment is passed on the feasibility of application of patient-specific finite element modeling as a part of clinical orthopedic routine. It is concluded that further developments in certain aspects of patient-specific finite element modeling are needed before finite element modeling can be used as a routine clinical tool.
NASA Astrophysics Data System (ADS)
Weber, T.; Bartl, P.; Durst, J.; Haas, W.; Michel, T.; Ritter, A.; Anton, G.
2011-08-01
In the last decades, phase-contrast imaging using a Talbot-Lau grating interferometer has become possible even with a low-brilliance X-ray source. With the potential of increasing the soft-tissue contrast, this method is on its way into medical imaging. For this purpose, knowledge of the underlying physics of this technique is necessary. With this paper, we would like to contribute to the understanding of grating-based phase-contrast imaging by presenting results on measurements and simulations regarding the noise behaviour of the differential phases. These measurements were done using a microfocus X-ray tube with a hybrid, photon-counting, semiconductor Medipix2 detector. The additional simulations were performed by our in-house developed phase-contrast simulation tool “SPHINX”, combining both wave and particle contributions of the simulated photons. The results obtained by both of these methods show the same behaviour. Increasing the number of photons leads to a linear decrease of the standard deviation of the phase. The number of used phase steps has no influence on the standard deviation, if the total number of photons is held constant. Furthermore, the probability density function (pdf) of the reconstructed differential phases was analysed. It turned out that the so-called von Mises distribution is the physically correct pdf, which was also confirmed by measurements. This information advances the understanding of grating-based phase-contrast imaging and can be used to improve image quality.
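An illustrative Monte Carlo sketch of the reported noise behaviour, assuming a sinusoidal phase-stepping curve with Poisson counting noise and phase retrieval from the first Fourier coefficient of the stepping curve; the step count, visibility, and photon numbers are invented, and the snippet only reproduces the expected photon-statistics trend, not the paper's measurements.

```python
import numpy as np

def retrieved_phases(n_photons, visibility=0.3, phi=0.7, steps=8, trials=2000):
    """Simulate many phase-stepping scans with Poisson counting noise and
    return the differential phase retrieved from the first Fourier
    coefficient of each stepping curve."""
    k = np.arange(steps)
    mean = n_photons / steps * (1 + visibility * np.cos(2 * np.pi * k / steps + phi))
    counts = np.random.poisson(mean, size=(trials, steps))
    coeff = (counts * np.exp(-2j * np.pi * k / steps)).sum(axis=1)
    return np.angle(coeff)

for n in (1e3, 1e4, 1e5):
    print(f"N={n:8.0f}  sigma_phi={np.std(retrieved_phases(n)):.4f} rad")
```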
Image simulation for HardWare In the Loop simulation in EO domain
NASA Astrophysics Data System (ADS)
Cathala, Thierry; Latger, Jean
2015-10-01
An infrared camera used as a weapon subsystem for automatic guidance is a key component of military carriers such as missiles. The associated image processing, which controls the navigation, needs to be intensively assessed. Experimentation in the real world is very expensive. This is the main reason why hybrid simulation, also called HardWare In the Loop (HWIL), is increasingly required nowadays. In that field, IR projectors are able to cast IR fluxes of photons directly onto the IR camera of a given weapon system, typically a missile seeker head. In the laboratory, the missile is thus stimulated exactly as in the real world, provided a realistic simulation tool can generate the synthetic images displayed by the IR projectors. The key technical challenge is to render the synthetic images at the required frequency. This paper focuses on OKTAL-SE's experience in this domain through its product SE-FAST-HWIL. It presents the methodology and return of experience from OKTAL-SE. Examples are given in the frame of the SE-Workbench. The presentation focuses on trials on real, operationally complex 3D cases. In particular, three important topics that are very sensitive with regard to IG performance are detailed: first the 3D sea surface representation, then particle system rendering, especially to simulate flares, and finally sensor effects modelling. Beyond "projection mode", some information will be given on the new SE-FAST-HWIL capabilities dedicated to "injection mode".
NASA Astrophysics Data System (ADS)
Faucher-Giguere, Claude-Andre
2016-10-01
HST has invested thousands of orbits to complete multi-wavelength surveys of high-redshift galaxies including the Deep Fields, COSMOS, 3D-HST and CANDELS. Over the next few years, JWST will undertake complementary, spatially-resolved infrared observations. Cosmological simulations are the most powerful tool to make detailed predictions for the properties of galaxy populations and to interpret these surveys. We will leverage recent major advances in the predictive power of cosmological hydrodynamic simulations to produce the first statistical sample of hundreds of galaxies simulated with 10 pc resolution and with explicit interstellar medium and stellar feedback physics proven to simultaneously reproduce the galaxy stellar mass function, the chemical enrichment of galaxies, and the neutral hydrogen content of galaxy halos. We will process our new set of full-volume cosmological simulations, called FIREBOX, with a mock imaging and spectral synthesis pipeline to produce realistic mock HST and JWST observations, including spatially-resolved photometry and spectroscopy. By comparing FIREBOX with recent high-redshift HST surveys, we will study the stellar build-up of galaxies, the evolution of massive star-forming clumps, their contribution to bulge growth, the connection of bulges to star formation quenching, and the triggering mechanisms of AGN activity. Our mock data products will also enable us to plan future JWST observing programs. We will publicly release all our mock data products to enable HST and JWST science beyond our own analysis, including with the Frontier Fields.
Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.
Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades
2015-01-01
DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the imageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix, intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically allowing users to efficiently conduct large scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
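As a toy illustration of the band-calling step only (GELect itself is written in Java and works on full gel images), the following Python snippet detects bands as prominent peaks in a synthetic lane intensity profile and matches them against a hypothetical reference ladder; all positions, amplitudes, and tolerances are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic lane profile: pixel intensity summed across the lane width, with
# three bands at different migration distances plus baseline and noise.
pos = np.arange(500)
profile = sum(a * np.exp(-(pos - c) ** 2 / (2 * w ** 2))
              for a, c, w in [(120, 80, 4), (90, 210, 5), (70, 360, 6)])
profile = profile + 10 + np.random.normal(0.0, 3.0, pos.size)

# Band calling: local maxima that stand out from the background by a
# prominence threshold; peak positions are the band migration distances.
peaks, _ = find_peaks(profile, prominence=25, distance=10)
print("detected band positions:", peaks)

# A simple genotype call then compares each detected position against a
# hypothetical reference ladder of allowed band positions within a tolerance.
reference = {80: "allele A", 210: "allele B", 360: "allele C"}
calls = [name for p in peaks for r, name in reference.items() if abs(p - r) <= 5]
print("called alleles:", calls)
```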
Multi-beamlet investigation of the deflection compensation methods of SPIDER beamlets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baltador, C., E-mail: carlo.baltador@igi.cnr.it; Veltri, P.; Agostinetti, P.
2016-02-15
SPIDER (Source for Production of Ions of Deuterium Extracted from a Rf plasma) is an ion source test bed designed to extract and accelerate a negative ion current up to 40 A and 100 kV whose first beam is expected by the end of 2016. Two main effects perturb beamlet optics during the acceleration stage: space charge repulsion and the deflection induced by the permanent magnets (called co-extracted electron suppression magnets) embedded in the EG. The purpose of this work is to evaluate and compare benefits, collateral effects, and limitations of electrical and magnetic compensation methods for beamlet deflection. The study of these methods has been carried out by means of numerical modeling tools: multi-beamlet simulations have been performed for the first time.
FALSTAFF: A new tool for fission studies
NASA Astrophysics Data System (ADS)
Dore, D.; Farget, F.; Lecolley, F.-R.; Lehaut, G.; Materna, T.; Pancin, J.; Panebianco, S.; Papaevangelou, Th.
2013-12-01
The future NFS installation will produce high intensity neutron beams from hundreds of keV up to 40 MeV. Taking advantage of this facility, data of particular interest to the nuclear community in view of the development of fast reactor technology will be measured. The development of an experimental setup called FALSTAFF for a full characterization of actinide fission fragments has been undertaken. Fission fragment isotopic yields and associated neutron multiplicities will be measured as a function of the neutron energy. Based on the time-of-flight and residual-energy technique, the setup will allow the simultaneous measurement of the complementary fragments' velocities and energies. The performance of the FALSTAFF TOF detectors will be presented, and the expected resolutions for fragment masses and neutron multiplicities, based on realistic simulations, will be shown.
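The core of the time-of-flight and residual-energy (v-E) method is simple non-relativistic kinematics: with the velocity obtained from the flight path length and the TOF, the fragment mass follows from m = 2E/v^2. The sketch below is only a worked illustration with made-up numbers, not FALSTAFF analysis code or design values.

```python
# Minimal sketch: recover a fragment mass from time-of-flight and residual
# energy, assuming non-relativistic kinematics (m = 2 E / v^2).
AMU_MEV_C2 = 931.494      # 1 u in MeV/c^2
C_CM_PER_NS = 29.9792458  # speed of light in cm/ns

def fragment_mass_u(flight_path_cm, tof_ns, energy_mev):
    v = flight_path_cm / tof_ns            # fragment velocity in cm/ns
    beta = v / C_CM_PER_NS
    return 2.0 * energy_mev / (beta**2 * AMU_MEV_C2)

# Illustrative numbers only (not FALSTAFF design values):
print(round(fragment_mass_u(flight_path_cm=50.0, tof_ns=50.0, energy_mev=70.0), 1))
```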
Multi-beamlet investigation of the deflection compensation methods of SPIDER beamlets
NASA Astrophysics Data System (ADS)
Baltador, C.; Veltri, P.; Agostinetti, P.; Chitarin, G.; Serianni, G.
2016-02-01
SPIDER (Source for Production of Ions of Deuterium Extracted from a Rf plasma) is an ion source test bed designed to extract and accelerate a negative ion current up to 40 A and 100 kV whose first beam is expected by the end of 2016. Two main effects perturb beamlet optics during the acceleration stage: space charge repulsion and the deflection induced by the permanent magnets (called co-extracted electron suppression magnets) embedded in the EG. The purpose of this work is to evaluate and compare benefits, collateral effects, and limitations of electrical and magnetic compensation methods for beamlet deflection. The study of these methods has been carried out by means of numerical modeling tools: multi-beamlet simulations have been performed for the first time.
Treangen, Todd J; Ondov, Brian D; Koren, Sergey; Phillippy, Adam M
2014-01-01
Whole-genome sequences are now available for many microbial species and clades; however, existing whole-genome alignment methods are limited in their ability to perform sequence comparisons of multiple sequences simultaneously. Here we present the Harvest suite of core-genome alignment and visualization tools for the rapid and simultaneous analysis of thousands of intraspecific microbial strains. Harvest includes Parsnp, a fast core-genome multi-aligner, and Gingr, a dynamic visual platform. Together they provide interactive core-genome alignments, variant calls, recombination detection, and phylogenetic trees. Using simulated and real data, we demonstrate that our approach exhibits unrivaled speed while maintaining the accuracy of existing methods. The Harvest suite is open-source and freely available from: http://github.com/marbl/harvest.
Effects of sound source directivity on auralizations
NASA Astrophysics Data System (ADS)
Sheets, Nathan W.; Wang, Lily M.
2002-05-01
Auralization, the process of rendering audible the sound field in a simulated space, is a useful tool in the design of acoustically sensitive spaces. The auralization depends on the calculation of an impulse response between a source and a receiver, each of which has certain directional behavior. Many auralizations created to date have used omnidirectional sources; the effects of source directivity on auralizations are a relatively unexplored area. To examine if and how the directivity of a sound source affects the acoustical results obtained from a room, we used directivity data for three sources in a room acoustic modeling program called Odeon. The three sources are violin, piano, and human voice. The results from using directional data are compared to those obtained using omnidirectional source behavior, both through objective measure calculations and subjective listening tests.
Enhanced Electric Power Transmission by Hybrid Compensation Technique
NASA Astrophysics Data System (ADS)
Palanichamy, C.; Kiu, G. Q.
2015-04-01
In today's competitive environment, new power system engineers are expected to contribute immediately to the task, without years of seasoning via on-the-job training, mentoring, and rotation assignments. At the same time, it is becoming obligatory to train power system engineering graduates for an increasingly quality-minded corporate environment. To achieve this, there is a need to make available better-quality tools for educating and training power system engineering students and in-service engineers alike. As a result of the swift advances in computer hardware and software, many Windows-based software packages have been developed for education and training. In line with those packages, a simulation package called Hybrid Series-Shunt Compensators (HSSC) has been developed and is presented in this paper for educational purposes.
Status of the AIAA Modeling and Simulation Format Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2008-01-01
The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
NASA Astrophysics Data System (ADS)
Abdel-Aal, H. A.; Mansori, M. El
2012-12-01
Cutting tools are subject to extreme thermal and mechanical loads during operation. The state of loading is intensified in a dry cutting environment, especially when cutting so-called hard-to-cut materials. Although the effect of mechanical loads on tool failure has been extensively studied, detailed studies on the effect of thermal dissipation on the deterioration of the cutting tool are rather scarce. In this paper we study the failure of coated carbide tools due to thermal loading. The study emphasizes the role assumed by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature. Rather, it is a function of the so-called thermodynamic forces: the stress, strain, strain rate, rate of temperature rise, and temperature gradient. Although the description of thermal conduction is non-linear within such a framework, it is beneficial to employ this form because it facilitates a full mechanistic understanding of the thermal activation of tool wear.
SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration
ERIC Educational Resources Information Center
Han, Kyung T.
2012-01-01
Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…
NASA Astrophysics Data System (ADS)
Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi
2015-06-01
Nowadays, researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century. The evolution of simulation tools has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously too cost-prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed" and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGAs. In this article, the steps involved in implementing a model from MATLAB in real time are provided in detail.
Abductive networks applied to electronic combat
NASA Astrophysics Data System (ADS)
Montgomery, Gerard J.; Hess, Paul; Hwang, Jong S.
1990-08-01
A practical approach to dealing with combinatorial decision problems and uncertainties associated with electronic combat through the use of networks of high-level functional elements called abductive networks is presented. It describes the application of the Abductory Induction Mechanism (AIM), a supervised inductive learning tool for synthesizing polynomial abductive networks, to the electronic combat problem domain. From databases of historical, expert-generated, or simulated combat engagements, AIM can often induce compact and robust network models for making effective real-time electronic combat decisions despite significant uncertainties or a combinatorial explosion of possible situations. The feasibility of applying abductive networks to realize advanced combat decision aiding capabilities was demonstrated by applying AIM to a set of electronic combat simulations. The networks synthesized by AIM generated accurate assessments of the intent, lethality, and overall risk associated with a variety of simulated threats and produced reasonable estimates of the expected effectiveness of a group of electronic countermeasures for a large number of simulated combat scenarios. This paper presents the application of abductive networks to electronic combat, summarizes the results of experiments performed using AIM, discusses the benefits and limitations of applying abductive networks to electronic combat, and indicates why abductive networks can often result in capabilities not attainable using alternative approaches.
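The building block of a polynomial abductive network can be illustrated with a short sketch. The code below is not AIM; it is a hedged, generic example of fitting a single quadratic two-input node by least squares, with all names and numbers invented for illustration.

```python
# Generic sketch of a single polynomial (quadratic, two-input) network node
# of the kind abductive networks compose -- not the AIM tool itself.
import numpy as np

def fit_poly_node(x1, x2, y):
    """Least-squares fit of y ~ w0 + w1*x1 + w2*x2 + w3*x1*x2 + w4*x1^2 + w5*x2^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

def eval_poly_node(w, x1, x2):
    return w[0] + w[1]*x1 + w[2]*x2 + w[3]*x1*x2 + w[4]*x1**2 + w[5]*x2**2

# Toy "threat risk" data: risk rises with closing speed and falls with range.
gen = np.random.default_rng(0)
closing, rng_norm = gen.uniform(0, 1, 200), gen.uniform(0, 1, 200)
risk = 0.7 * closing + 0.3 * (1 - rng_norm) + 0.1 * closing * (1 - rng_norm)
w = fit_poly_node(closing, rng_norm, risk)
print(round(eval_poly_node(w, 0.9, 0.1), 3))   # fast-closing, short-range threat
```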
ERIC Educational Resources Information Center
Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.
2011-01-01
An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…
Measuring Pilot Workload in a Moving-base Simulator. Part 2: Building Levels of Workload
NASA Technical Reports Server (NTRS)
Kantowitz, B. H.; Hart, S. G.; Bortolussi, M. R.; Shively, R. J.; Kantowitz, S. C.
1984-01-01
Studies of pilot behavior in flight simulators often use a secondary task as an index of workload. It is routine to regard flying as the primary task and some less complex task as the secondary task. While this assumption is quite reasonable for most secondary tasks used to study mental workload in aircraft, the treatment of flying a simulator through some carefully crafted flight scenario as a unitary task is less justified. The present research acknowledges that total mental workload depends upon the specific nature of the sub-tasks that a pilot must complete. As a first approximation, flight tasks were divided into three levels of complexity. The simplest level (called the Base Level) requires elementary maneuvers that do not utilize all the degrees of freedom of which an aircraft, or a moving-base simulator, is capable. The second level (called the Paired Level) requires the pilot to simultaneously execute two Base Level tasks. The third level (called the Complex Level) imposes three simultaneous constraints upon the pilot.
Engineering Fracking Fluids with Computer Simulation
NASA Astrophysics Data System (ADS)
Shaqfeh, Eric
2015-11-01
There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the ``drilling mud'' is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the ``cuttings'' can remain suspended. In a related application known as hydraulic fracturing, suspensions of solids called ``proppant'' are used to prop open the fracture by pumping them into the well. It is well known that particle flow and settling in a viscoelastic fluid can be quite different from that observed in Newtonian fluids. First, it is now well known that the ``fluid-particle split'' at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e., at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between the sedimentation and the shear flow. Recent experimental data have shown that both the shear thinning and the elasticity of the suspending polymeric solutions significantly affect the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
Impact of tool wear on cross wedge rolling process stability and on product quality
NASA Astrophysics Data System (ADS)
Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric
2017-10-01
Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetrical shape with an accurate distribution of material. This preform is forged into shape in a forging die. In order to improve CWR tool lifecycle and product quality, it is essential to understand tool wear evolution and the physical phenomena of the CWR process that change as the tool geometry evolves under wear. In order to understand CWR tool wear behavior, numerical simulations are necessary. Nevertheless, if the simulations are performed with the CAD geometry of the tool, the results are limited. To address this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower rolls) at two different states: (1) at the start of the lifecycle and (2) at the end of the lifecycle. The tools were measured in 3D with an ATOS triple scan by GOM® using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of each tool. Each 3D point cloud was digitized and converted into STL format. The geometry of the tools in STL format was the input for the 3D simulations. Both simulations were compared. Product defects obtained in simulation were compared to the main defects of products found industrially. The two main defects are: (a) surface defects on the preform that are not corrected in the die forging operation; and (b) a bent (no longer straight) preform, with two possible consequences: either the robot cannot grab it to take it to the forging stage, or a section remains unfilled in the forging operation.
SolarPILOT | Concentrating Solar Power | NREL
Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core (the SolTrace simulation engine) for more detailed simulations.
Introducing Computer Simulation into the High School: An Applied Mathematics Curriculum.
ERIC Educational Resources Information Center
Roberts, Nancy
1981-01-01
A programming language called DYNAMO, developed especially for writing simulation models, is promoted. Details of six self-teaching curriculum packages recently developed for simulation-oriented instruction are provided. (MP)
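DYNAMO expresses system-dynamics models as levels (stocks) updated by rates (flows). The sketch below is a hedged Python rendition of that modeling style for a toy population model; it is illustrative only and does not use DYNAMO syntax.

```python
# Illustrative Python rendition of the kind of level/rate (stock-and-flow)
# model DYNAMO was designed for -- a population with constant birth and
# death fractions -- not DYNAMO syntax itself.
def simulate_population(pop0=1000.0, birth_frac=0.03, death_frac=0.01,
                        dt=1.0, years=50):
    pop = pop0
    for _ in range(int(years / dt)):
        births = birth_frac * pop * dt   # rate (flow) equations
        deaths = death_frac * pop * dt
        pop += births - deaths           # level (stock) update
    return pop

print(round(simulate_population(), 1))   # roughly pop0 * (1.02)**50
```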
Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J
2016-08-01
Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
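The co-simulation pattern described here, with each tool advancing one communication step on the other's latest output, can be sketched generically. The loop below uses hypothetical toy models, not the CONTAM, EnergyPlus, or FMI APIs, and simply illustrates the ping-pong exchange of a zone temperature and an infiltration airflow.

```python
# Illustrative ping-pong co-simulation loop in the spirit of the CONTAM /
# EnergyPlus coupling (hypothetical toy models, not the CONTAM, EnergyPlus,
# or FMI APIs): the thermal model consumes an airflow and returns a zone
# temperature; the airflow model consumes temperatures and returns an airflow.
def thermal_step(t_zone, t_out, airflow_kg_s, dt_s, ua=50.0, cap=5.0e6):
    """Advance one zone temperature with envelope losses plus infiltration."""
    cp = 1006.0  # J/(kg K), air
    q = ua * (t_out - t_zone) + airflow_kg_s * cp * (t_out - t_zone)
    return t_zone + q * dt_s / cap

def airflow_step(t_zone, t_out, coeff=0.05):
    """Toy buoyancy-driven infiltration: flow grows with the temperature difference."""
    return coeff * abs(t_zone - t_out) ** 0.5

t_zone, t_out, flow = 20.0, 0.0, 0.0
dt = 600.0
for step in range(144):                              # one day in 10-minute communication steps
    t_zone = thermal_step(t_zone, t_out, flow, dt)   # uses the last exchanged airflow
    flow = airflow_step(t_zone, t_out)               # uses the new temperature
print(round(t_zone, 2), round(flow, 3))
```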
Creation and Delphi-method refinement of pediatric disaster triage simulations.
Cicero, Mark X; Brown, Linda; Overly, Frank; Yarzebski, Jorge; Meckler, Garth; Fuchs, Susan; Tomassoni, Anthony; Aghababian, Richard; Chung, Sarita; Garrett, Andrew; Fagbuyi, Daniel; Adelgais, Kathleen; Goldman, Ran; Parker, James; Auerbach, Marc; Riera, Antonio; Cone, David; Baum, Carl R
2014-01-01
There is a need for rigorously designed pediatric disaster triage (PDT) training simulations for paramedics. First, we sought to design three multiple patient incidents for EMS provider training simulations. Our second objective was to determine the appropriate interventions and triage level for each victim in each of the simulations and develop evaluation instruments for each simulation. The final objective was to ensure that each simulation and evaluation tool was free of bias toward any specific PDT strategy. We created mixed-methods disaster simulation scenarios with pediatric victims: a school shooting, a school bus crash, and a multiple-victim house fire. Standardized patients, high-fidelity manikins, and low-fidelity manikins were used to portray the victims. Each simulation had similar acuity of injuries and 10 victims. Examples include children with special health-care needs, gunshot wounds, and smoke inhalation. Checklist-based evaluation tools and behaviorally anchored global assessments of function were created for each simulation. Eight physicians and paramedics from areas with differing PDT strategies were recruited as Subject Matter Experts (SMEs) for a modified Delphi iterative critique of the simulations and evaluation tools. The modified Delphi was managed with an online survey tool. The SMEs provided an expected triage category for each patient. The target for modified Delphi consensus was ≥85%. Using Likert scales and free text, the SMEs assessed the validity of the simulations, including instances of bias toward a specific PDT strategy, clarity of learning objectives, and the correlation of the evaluation tools to the learning objectives and scenarios. After two rounds of the modified Delphi, consensus for expected triage level was >85% for 28 of 30 victims, with the remaining two achieving >85% consensus after three Delphi iterations. To achieve consensus, we amended 11 instances of bias toward a specific PDT strategy and corrected 10 instances of noncorrelation between evaluations and simulation. The modified Delphi process, used to derive novel PDT simulation and evaluation tools, yielded a high degree of consensus among the SMEs, and eliminated biases toward specific PDT strategies in the evaluations. The simulations and evaluation tools may now be tested for reliability and validity as part of a prehospital PDT curriculum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
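The code-generation idea can be illustrated with a small sketch. The function below is not CREATE-SCHEMA (which also emits FORTRAN declarations and precompiled SQL calls); it is a hedged Python example that turns a simple table description into the SQL that creates it.

```python
# Sketch of the code-generation idea behind a CREATE-SCHEMA-like tool:
# turn a simple table description into the SQL CREATE statement.
# (Illustrative only; names and types are invented.)
def create_table_sql(table, columns):
    """columns: list of (name, sql_type, nullable) tuples."""
    defs = ",\n  ".join(
        f"{name} {sql_type}{'' if nullable else ' NOT NULL'}"
        for name, sql_type, nullable in columns
    )
    return f"CREATE TABLE {table} (\n  {defs}\n);"

print(create_table_sql("samples", [("sample_id", "INTEGER", False),
                                   ("collected_on", "DATE", True),
                                   ("analyst", "VARCHAR(40)", True)]))
```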
Modeling and Simulation Tools for Heavy Lift Airships
NASA Technical Reports Server (NTRS)
Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John
2016-01-01
For conventional fixed wing and rotary wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on the proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.
Scalable isosurface visualization of massive datasets on commodity off-the-shelf clusters
Bajaj, Chandrajit
2009-01-01
Tomographic imaging and computer simulations are increasingly yielding massive datasets. Interactive and exploratory visualizations have rapidly become indispensable tools to study large volumetric imaging and simulation data. Our scalable isosurface visualization framework on commodity off-the-shelf clusters is an end-to-end parallel and progressive platform, from initial data access to the final display. Interactive browsing of extracted isosurfaces is made possible by using parallel isosurface extraction and rendering, in conjunction with a new specialized piece of image compositing hardware called Metabuffer. In this paper, we focus on back-end scalability by introducing a fully parallel and out-of-core isosurface extraction algorithm. It achieves scalability by using both parallel and out-of-core processing and parallel disks. It statically partitions the volume data to parallel disks with a balanced workload spectrum, and builds I/O-optimal external interval trees to minimize the number of I/O operations needed to load large data from disk. We also describe an isosurface compression scheme that is efficient for progressive extraction, transmission, and storage of isosurfaces. PMID:19756231
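The selection step that the interval trees accelerate can be stated compactly: only cells whose value range spans the isovalue can contribute geometry. The sketch below is a hedged, in-memory illustration of that test on a flat list of precomputed cell ranges; the actual algorithm performs this query out-of-core with I/O-optimal external interval trees.

```python
# Sketch of the core selection step in out-of-core isosurface extraction
# (illustrative, not the Metabuffer pipeline): only cells whose value range
# [vmin, vmax] spans the isovalue can contribute triangles, so a range query
# over precomputed cell extrema avoids touching the rest of the volume.
def active_cells(cell_ranges, isovalue):
    """cell_ranges: iterable of (cell_id, vmin, vmax)."""
    return [cid for cid, vmin, vmax in cell_ranges if vmin <= isovalue <= vmax]

ranges = [(0, 0.1, 0.4), (1, 0.35, 0.8), (2, 0.9, 1.2)]
print(active_cells(ranges, isovalue=0.5))   # -> [1]
```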
NASA Astrophysics Data System (ADS)
Griesse-Nascimento, Sarah; Bridger, Joshua; Brown, Keith; Westervelt, Robert
2011-03-01
Interactive computer simulations increase students' understanding of difficult concepts and their ability to explain complex ideas. We created a module of eight interactive programs and accompanying lesson plans for teaching the fundamental concepts of Nuclear Magnetic Resonance (NMR) and Magnetic Resonance Imaging (MRI) that we call interactive NMR (iNMR). We begin with an analogy between nuclear spins and metronomes to start to build intuition about the dynamics of spins in a magnetic field. We continue to explain T1, T2, and pulse sequences with the metronome analogy. The final three programs are used to introduce and explain the Magnetic Resonance Switch, a recent diagnostic technique based on NMR. A modern relevant application is useful to generate interest in the topic and confidence in the students' ability to apply their knowledge. The iNMR module was incorporated into a high school AP physics class. In a preliminary evaluation of implementation, students expressed enthusiasm and demonstrated enhanced understanding of the material relative to the previous year. Funded by NSF PHY-0646094 grant.
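The relaxation behaviour that the metronome analogy builds toward can be shown in a few lines of code. The sketch below is not one of the iNMR programs; it is a hedged example, with assumed T1 and T2 values, of transverse decay and longitudinal recovery after a 90-degree pulse.

```python
# Minimal sketch of the T1/T2 relaxation behaviour the iNMR metronome analogy
# builds toward (not the iNMR programs themselves): after a 90-degree pulse,
# transverse magnetization decays with T2 while longitudinal magnetization
# recovers with T1. T1 and T2 values here are illustrative.
import math

def magnetization(t_ms, t1_ms=900.0, t2_ms=100.0, m0=1.0):
    mxy = m0 * math.exp(-t_ms / t2_ms)            # transverse decay
    mz = m0 * (1.0 - math.exp(-t_ms / t1_ms))     # longitudinal recovery
    return mxy, mz

for t in (0, 50, 100, 500, 2000):
    mxy, mz = magnetization(t)
    print(f"t={t:5d} ms  Mxy={mxy:.3f}  Mz={mz:.3f}")
```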
Loggers and Forest Fragmentation: Behavioral Models of Road Building in the Amazon Basin
NASA Technical Reports Server (NTRS)
Arima, Eugenio Y.; Walker, Robert T.; Perz, Stephen G.; Caldas, Marcellus
2005-01-01
Although a large literature now exists on the drivers of tropical deforestation, less is known about its spatial manifestation. This is a critical shortcoming in our knowledge base since the spatial pattern of land-cover change and forest fragmentation, in particular, strongly affects biodiversity. The purpose of this article is to consider emergent patterns of road networks, the initial proximate cause of fragmentation in tropical forest frontiers. Specifically, we address the road-building processes of loggers, who are very active in the Amazon landscape. To this end, we develop an explanation of road expansions, using a positive approach combining a theoretical model of economic behavior with geographic information systems (GIS) software in order to mimic the spatial decisions of road builders. We simulate two types of road extensions commonly found in the Amazon basin in a region showing the fishbone pattern of fragmentation. Although our simulation results are only partially successful, they call attention to the role of multiple agents in the landscape, the importance of legal and institutional constraints on economic behavior, and the power of GIS as a research tool.
treeman: an R package for efficient and intuitive manipulation of phylogenetic trees.
Bennett, Dominic J; Sutton, Mark D; Turvey, Samuel T
2017-01-07
Phylogenetic trees are hierarchical structures used for representing the inter-relationships between biological entities. They are the most common tool for representing evolution and are essential to a range of fields across the life sciences. The manipulation of phylogenetic trees, in terms of adding or removing tips, is often performed by researchers not just for reasons of management but also for performing simulations in order to understand the processes of evolution. Despite this, the most common programming language among biologists, R, has few class structures well suited to these tasks. We present an R package that contains a new class, called TreeMan, for representing the phylogenetic tree. This class has a list structure allowing phylogenetic trees to be manipulated more efficiently. Computational running times are reduced because of the ready ability to vectorise and parallelise methods. Development is also improved due to fewer lines of code being required for performing manipulation processes. We present three use cases (pinning missing taxa to a supertree, simulating evolution with a tree-growth model, and detecting significant phylogenetic turnover) that demonstrate the new package's speed and simplicity.
Mastin, M.C.; Le, Thanh
2001-01-01
The U.S. Geological Survey, in cooperation with the Pierce County Department of Public Works, Washington, has developed an operational tool called the Puyallup Flood-Alert System to alert users of impending floods in the Puyallup River Basin. The system acquires and incorporates meteorological and hydrological data into the Streamflow Synthesis and Reservoir Regulation (SSARR) hydrologic flow-routing model to simulate floods in the Puyallup River Basin. SSARRMENU is the user-interactive graphical interface between the user, the input and output data, and the SSARR model. In a companion cooperative project with Pierce County, the SSARR model for the Puyallup River Basin was calibrated and validated. The calibrated model is accessed through SSARRMENU, which has been specifically programmed for the Puyallup River and the needs of Pierce County. SSARRMENU automates the retrieval of data from ADAPS (Automated DAta Processing System, the U.S. Geological Survey's real-time hydrologic database), formats the data for use with SSARR, initiates SSARR model runs, displays alerts for impending floods, and provides utilities to display the simulated and observed data. An on-screen map of the basin and a series of menu items provide the user wi
Water Network Tool for Resilience v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-09
WNTR is a Python package designed to simulate and analyze the resilience of water distribution networks. The software includes: pressure-driven and demand-driven hydraulic simulation; water quality simulation to track concentration, trace, and water age; conditional controls to simulate power outages; models to simulate pipe breaks; a wide range of resilience metrics; and analysis and visualization tools.
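Because WNTR is distributed as a Python package, a typical session looks roughly like the sketch below. It is a hedged usage example: the INP file path is a placeholder, and class and option names may differ slightly between WNTR versions.

```python
# Hedged usage sketch for the WNTR Python package (details vary by version;
# the INP file path below is a placeholder, not a shipped example).
import wntr

wn = wntr.network.WaterNetworkModel('Net3.inp')   # build a model from an EPANET INP file
sim = wntr.sim.WNTRSimulator(wn)                  # WNTR's own hydraulic solver
results = sim.run_sim()
pressure = results.node['pressure']               # time-by-node table of pressures
print(pressure.head())
```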
NAPL: SIMULATOR DOCUMENTATION (EPA/600/SR-97/102)
A mathematical and numerical model is developed to simulate the transport and fate of NAPLs (Non-Aqueous Phase Liquids) in near-surface granular soils. The resulting three-dimensional, three phase simulator is called NAPL. The simulator accommodates three mobile phases: water, NA...
Computer simulator for a mobile telephone system
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1981-01-01
A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used by developing the algorithm in an ALGOL-like pseudo-language and then encoding the algorithm in FORTRAN 4. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop system; double hop, single gateway system; double hop, double gateway system; mobile to wireline system; and wireline to mobile system. The transmitter, fading channel, and interference source simulation are also discussed.
Computer-Aided Air-Traffic Control In The Terminal Area
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1995-01-01
A developmental computer-aided system for automated management and control of arrival traffic at a large airport includes three integrated subsystems. One subsystem is called the Traffic Management Advisor, another is called the Descent Advisor, and the third is called the Final Approach Spacing Tool. A database that includes current wind measurements and mathematical models of the performance of different aircraft types contributes to the effective operation of the system.
ESIF Call for High-Impact Integrated Projects | Energy Systems Integration
As a U.S. Department of Energy user facility, the Energy Systems Integration Facility develops the concepts, tools, and technologies needed to measure, analyze, predict, protect, and control the grid of the future.
Using WebQuests as Idea Banks for Fostering Autonomy in Online Language Courses
ERIC Educational Resources Information Center
Sadaghian, Shirin; Marandi, S. Susan
2016-01-01
The concept of language learner autonomy has influenced Computer-Assisted Language Learning (CALL) to the extent that Schwienhorst (2012) informs us of a paradigm change in CALL design in the light of learner autonomy. CALL is not considered a tool anymore, but a learner environment available to language learners anywhere in the world. Based on a…
WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks
ERIC Educational Resources Information Center
Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.
2013-01-01
A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…
Kirkman, Matthew A; Muirhead, William; Nandi, Dipankar; Sevdalis, Nick
2014-01-01
Neurosurgical simulation training is becoming increasingly popular. Attitudes toward simulation among residents can contribute to the effectiveness of simulation training, but such attitudes remain poorly explored in neurosurgery with no psychometrically proven measure in the literature. The aim of the present study was to evaluate prospectively a newly developed tool for this purpose: the Neurosurgical Evaluation of Attitudes towards simulation Training (NEAT). The NEAT tool was prospectively developed in 2 stages and psychometrically evaluated (validity and reliability) in 2 administrations with the same participants. The tool comprises a questionnaire with 9 Likert scale items and 2 free-text sections assessing attitudes toward simulation in neurosurgery. The evaluation was completed with 31 neurosurgery residents in London, United Kingdom, who were generally favorable toward neurosurgical simulation. The internal consistency of the questionnaire was high, as demonstrated by the overall Cronbach α values (α=0.899 and α=0.955). All but 2 questionnaire items had "substantial" or "almost perfect" test-retest reliability following repeated survey administrations (median Pearson r correlation=0.688; range, 0.248-0.841). NEAT items were well correlated with each other on both occasions, showing good validity of content within the NEAT tool. There was no significant relationship between either gender or length of neurosurgical experience and item ratings. NEAT is the first psychometrically evaluated tool for evaluating attitudes toward simulation in neurosurgery. Further implementation of NEAT is required in wider neurosurgical populations to establish whether specific population groups differ. Use of NEAT in studies of neurosurgical simulation could offer an additional outcome measure to performance metrics, permitting evaluation of the impact of neurosurgical simulation on attitudes toward simulation both between participants and within the same participants over time. Copyright © 2014 Elsevier Inc. All rights reserved.
Grid Integration Research | Wind | NREL
Wind Power Plant Modeling and Simulation: engineers at the National Renewable Energy Laboratory develop the computer-aided engineering tool FAST, as well as a wind power plant simulation tool.
BS-virus-finder: virus integration calling using bisulfite sequencing data.
Gao, Shengjie; Hu, Xuesong; Xu, Fengping; Gao, Changduo; Xiong, Kai; Zhao, Xiao; Chen, Haixiao; Zhao, Shancen; Wang, Mengyao; Fu, Dongke; Zhao, Xiaohui; Bai, Jie; Mao, Likai; Li, Bo; Wu, Song; Wang, Jian; Li, Shengbin; Yang, Huangming; Bolund, Lars; Pedersen, Christian N S
2018-01-01
DNA methylation plays a key role in the regulation of gene expression and in carcinogenesis. Bisulfite sequencing studies mainly focus on calling single nucleotide polymorphisms, identifying differentially methylated regions, and finding allele-specific DNA methylation. Until now, only a few software tools have focused on virus integration using bisulfite sequencing data. We have developed a new and easy-to-use software tool, named BS-virus-finder (BSVF, RRID:SCR_015727), to detect viral integration breakpoints in whole human genomes. The tool is hosted at https://github.com/BGI-SZ/BSVF. BS-virus-finder demonstrates high sensitivity and specificity. It is useful in epigenetic studies and in revealing the relationship between viral integration and DNA methylation. BS-virus-finder is the first software tool to detect virus integration loci by using bisulfite sequencing data. © The Authors 2017. Published by Oxford University Press.
VoiceThread: A Useful Program Evaluation Tool
ERIC Educational Resources Information Center
Mott, Rebecca
2018-01-01
With today's technology, Extension professionals have a variety of tools available for program evaluation. This article describes an innovative platform called VoiceThread that has been used in many classrooms but also is useful for conducting virtual focus group research. I explain how this tool can be used to collect qualitative participant…
Rapid Response to Decision Making for Complex Issues - How Technologies of Cooperation Can Help
2005-11-01
creating bottom-up taxonomies, called folksonomies, using metadata tools like del.icio.us (in which users create their own tags for bookmarking Web... tools such as RSS, tagging (and the consequent development of folksonomies), wikis, and group visualization tools all help multiply the individual
[Virtual reality simulation training in gynecology: review and perspectives].
Ricard-Gauthier, Dominique; Popescu, Silvia; Benmohamed, Naida; Petignat, Patrick; Dubuisson, Jean
2016-10-26
Laparoscopic simulation has rapidly become an important tool for learning and acquiring technical skills in surgery. It is based on two complementary pedagogic tools: the box model trainer and the virtual reality simulator. The virtual reality simulator has proven effective by improving surgical skills, decreasing operating time, improving economy of movement, and improving self-confidence. The main advantage of this tool is the opportunity to easily organize a regular, structured, and uniform training program with automated, individualized feedback.
Spacecraft Guidance, Navigation, and Control Visualization Tool
NASA Technical Reports Server (NTRS)
Mandic, Milan; Acikmese, Behcet; Blackmore, Lars
2011-01-01
G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling (see figure). The tool is developed in MATLAB using Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that multi-body simulation data is generated and placed in the proper format before applying G-View.
FDTD simulation tools for UWB antenna analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brocato, Robert Wesley
2004-12-01
This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
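For readers unfamiliar with FDTD, the leapfrog structure of the method is easy to sketch. The example below is a minimal 1D Cartesian FDTD loop with a Gaussian pulse source; it only illustrates the update pattern, not the spherical-coordinate formulation derived in the paper.

```python
# Minimal 1D Cartesian FDTD sketch (illustrative only; the paper derives
# spherical-coordinate updates): leapfrog E/H updates in normalized units
# with a soft Gaussian pulse source standing in for a UWB excitation.
import numpy as np

nz, nsteps = 400, 600
ez = np.zeros(nz)        # electric field samples
hy = np.zeros(nz - 1)    # magnetic field samples, staggered half a cell

for n in range(nsteps):
    hy += 0.5 * (ez[1:] - ez[:-1])                   # H update (Courant factor 0.5)
    ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])             # E update
    ez[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)   # soft pulsed source

print("peak |Ez| after propagation:", np.abs(ez).max())
```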
Chen, J.; Wu, Y.
2012-01-01
This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
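Two TOPMODEL ingredients that this kind of integration draws on can be written down compactly: the topographic wetness index ln(a / tan(beta)) and the exponential baseflow recession Q_b = Q_0 exp(-S/m), with S the mean saturation deficit. The sketch below is illustrative only, with made-up parameter values; it is not the SWAT-TOP code.

```python
# Sketch of two TOPMODEL ingredients the SWAT-TOP integration draws on
# (illustrative, not the SWAT-TOP code): the topographic wetness index and
# the exponential baseflow recession driven by the mean saturation deficit.
import math

def topographic_index(upslope_area_per_contour, local_slope_tan):
    """ln(a / tan(beta)); higher values mark areas more prone to saturation."""
    return math.log(upslope_area_per_contour / local_slope_tan)

def baseflow(q0, mean_deficit, m):
    """Q_b = Q_0 * exp(-S / m), with S the mean saturation deficit (same units as m)."""
    return q0 * math.exp(-mean_deficit / m)

print(round(topographic_index(500.0, 0.05), 2))            # wet, gently sloping cell
print(round(baseflow(q0=2.0, mean_deficit=0.03, m=0.02), 3))
```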
Design and Evaluation of the Terminal Area Precision Scheduling and Spacing System
NASA Technical Reports Server (NTRS)
Swenson, Harry N.; Thipphavong, Jane; Sadovsky, Alex; Chen, Liang; Sullivan, Chris; Martin, Lynne
2011-01-01
This paper describes the design, development, and results from a high-fidelity human-in-the-loop simulation of an integrated set of trajectory-based automation tools providing precision scheduling, sequencing, and controller merging and spacing functions. These integrated functions are combined into a system called the Terminal Area Precision Scheduling and Spacing (TAPSS) system. It is a strategic and tactical planning tool that provides Traffic Management Coordinators and En Route and Terminal Radar Approach Control air traffic controllers the ability to efficiently optimize the arrival capacity of a demand-impacted airport while simultaneously enabling fuel-efficient descent procedures. The TAPSS system consists of four-dimensional trajectory prediction, arrival runway balancing, aircraft separation constraint-based scheduling, traffic flow visualization, and trajectory-based advisories to assist controllers in efficient metering, sequencing, and spacing. The TAPSS system was evaluated and compared to today's ATC operations through an extensive series of human-in-the-loop simulations of arrival flows into Los Angeles International Airport. The test conditions included the variation of aircraft demand from a baseline of today's capacity-constrained periods through 5%, 10%, and 20% increases. Performance data were collected for engineering and human factors analysis and compared with similar operations both with and without the TAPSS system. The engineering data indicate that operations with TAPSS show up to a 10% increase in airport throughput during capacity-constrained periods while maintaining fuel-efficient aircraft descent profiles from cruise to landing.
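The separation constraint-based scheduling idea can be illustrated in miniature. The sketch below is a toy example, not the TAPSS scheduler: it delays each estimated time of arrival just enough to keep an assumed minimum spacing behind the preceding arrival on the same runway.

```python
# Toy sketch of separation-constrained arrival scheduling of the kind TAPSS
# performs (not the TAPSS algorithm): each scheduled time of arrival is the
# later of the aircraft's ETA and the previous arrival plus a minimum spacing.
def schedule(etas_s, min_spacing_s=90.0):
    """etas_s: estimated times of arrival in seconds (any order)."""
    scheduled = []
    last = None
    for eta in sorted(etas_s):
        sta = eta if last is None else max(eta, last + min_spacing_s)
        scheduled.append(sta)
        last = sta
    return scheduled

print(schedule([0, 30, 200, 210, 215]))   # -> [0, 90, 200, 290, 380]
```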
Simulating the Camp David Negotiations: A Problem-Solving Tool in Critical Pedagogy
ERIC Educational Resources Information Center
McMahon, Sean F.; Miller, Chris
2013-01-01
This article reflects critically on simulations. Building on the authors' experience simulating the Palestinian-Israeli-American Camp David negotiations of 2000, they argue that simulations are useful pedagogical tools that encourage creative--but not critical--thinking and constructivist learning. However, they can also have the deleterious…
SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE
The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designated the Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...
NASA Sea Level Change Portal - It's not just another portal site
NASA Astrophysics Data System (ADS)
Huang, T.; Quach, N.; Abercrombie, S. P.; Boening, C.; Brennan, H. P.; Gill, K. M.; Greguska, F. R., III; Jackson, R.; Larour, E. Y.; Shaftel, H.; Tenenbaum, L. F.; Zlotnicki, V.; Moore, B.; Moore, J.; Boeck, A.
2017-12-01
The NASA Sea Level Change Portal (https://sealevel.nasa.gov) is designed as a "one-stop" source for current sea level change information, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. With increasing global temperatures warming the ocean and melting ice sheets and glaciers, there is an immediate need both for accelerating sea level change research and for making this research accessible to scientists in disparate disciplines, to the general public, to policy makers, and to businesses. The immersive and innovative NASA portal, which debuted at the 2015 AGU meeting, attracts thousands of daily visitors and over 30K followers on Facebook®. Behind its intuitive interface is an extensible architecture that integrates site content, data from various sources, visualization, horizontally scalable geospatial data analytics technology (called NEXUS), and an interactive 3D simulation platform (called the Virtual Earth System Laboratory). We will present an overview of the NASA portal and some of our architectural decisions, along with a discussion of our open-source, cloud-based data analytics technology that enables on-the-fly analysis of heterogeneous data.