Sample records for simulator ads tool

  1. Managing emergency department overcrowding via ambulance diversion: a discrete event simulation model.

    PubMed

    Lin, Chih-Hao; Kao, Chung-Yao; Huang, Chong-Ye

    2015-01-01

    Ambulance diversion (AD) is considered one of the possible solutions to relieve emergency department (ED) overcrowding. Study of the effectiveness of various AD strategies is a prerequisite for policy-making. Our aim is to develop a tool that quantitatively evaluates the effectiveness of various AD strategies. A simulation model and a computer simulation program were developed. Three sets of simulations were executed to evaluate AD initiating criteria, patient-blocking rules, and AD intervals, respectively. The crowdedness index, the patient waiting time for service, and the percentage of adverse patients were assessed to determine the effect of various AD policies. Simulation results suggest that, in a certain setting, the best timing for implementing AD is when the crowdedness index reaches the critical value, 1.0 - an indicator that the ED is operating at its maximal capacity. The strategy of diverting all patients transported by ambulance is more effective than diverting either high-acuity patients only or low-acuity patients only. Given a total allowable AD duration, implementing AD multiple times with short intervals generally has a better effect than implementing a single AD with the maximal allowable duration. An input-throughput-output simulation model is proposed for simulating ED operation. The effectiveness of several AD strategies in relieving ED overcrowding was assessed via computer simulations based on this model. With appropriate parameter settings, the model can represent medical resource providers of different scales. It is also feasible to expand the simulations to evaluate the effect of AD strategies on a community basis. The results may offer insights for making effective AD policies. Copyright © 2012. Published by Elsevier B.V.
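
    The input-throughput-output logic above maps naturally onto a discrete event simulation. Below is a minimal sketch of the diversion rule (block ambulance arrivals once the crowdedness index reaches 1.0) using the SimPy library; the capacity, arrival and treatment rates, and ambulance fraction are assumed values, not parameters from the study.

    ```python
    import random
    import simpy

    CAPACITY = 20          # ED treatment spaces (assumed)
    AD_THRESHOLD = 1.0     # divert once the crowdedness index reaches 1.0

    def crowdedness(ed):
        """Crowdedness index: patients present or waiting / maximal capacity."""
        return (ed.count + len(ed.queue)) / CAPACITY

    def patient(env, ed, by_ambulance, stats):
        # Ambulance arrivals are diverted while the index is at or above threshold.
        if by_ambulance and crowdedness(ed) >= AD_THRESHOLD:
            stats["diverted"] += 1
            return
        arrived = env.now
        with ed.request() as req:
            yield req
            stats["waits"].append(env.now - arrived)
            yield env.timeout(random.expovariate(1 / 60))  # treatment, mean 60 min

    def arrivals(env, ed, stats):
        while True:
            yield env.timeout(random.expovariate(1 / 5))   # mean 5 min between arrivals
            env.process(patient(env, ed, random.random() < 0.3, stats))

    random.seed(1)
    env = simpy.Environment()
    ed = simpy.Resource(env, capacity=CAPACITY)
    stats = {"diverted": 0, "waits": []}
    env.process(arrivals(env, ed, stats))
    env.run(until=24 * 60)  # one simulated day, in minutes
    print(stats["diverted"], "diverted;",
          sum(stats["waits"]) / len(stats["waits"]), "min mean wait")
    ```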

  2. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of A Quantum Interconnected Network Array Simulator (AQUINAS) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis, and report generation.
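
    As a rough illustration of the front-end role described above (Monte Carlo generation of simulation input files under manufacturing variations), the sketch below perturbs nominal cell parameters and writes one input file per run. The parameter names, spreads, and file format are invented for illustration; they are not those of TOMAS or AQUINAS.

    ```python
    import random
    from pathlib import Path

    # Nominal QCA cell parameters and manufacturing spreads (illustrative only).
    NOMINAL = {"cell_spacing_nm": 20.0, "dot_diameter_nm": 5.0}
    SIGMA = {"cell_spacing_nm": 0.5, "dot_diameter_nm": 0.2}

    def write_run(index, outdir):
        """Draw one perturbed parameter set and emit one simulator input file."""
        params = {k: random.gauss(v, SIGMA[k]) for k, v in NOMINAL.items()}
        text = "\n".join(f"{k} = {v:.4f}" for k, v in params.items())
        (outdir / f"run_{index:04d}.in").write_text(text + "\n")

    outdir = Path("mc_runs")
    outdir.mkdir(exist_ok=True)
    random.seed(0)
    for i in range(1000):   # size of the Monte Carlo ensemble
        write_run(i, outdir)
    ```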

  3. Medical Simulations for Exploration Medicine

    NASA Technical Reports Server (NTRS)

    Reyes, David; Suresh, Rahul; Pavela, James; Urbina, Michelle; Mindock, Jennifer; Antonsen, Erik

    2018-01-01

    Medical simulation is a useful tool that can be used to train personnel, develop medical processes, and assist cross-disciplinary communication. Medical simulations have been used in the past at NASA for these purposes; however, they are usually created ad hoc. A stepwise approach to scenario development has not previously been used. The NASA Exploration Medical Capability (ExMC) created a medical scenario development tool to test medical procedures, technologies, and concepts of operations, and for use in systems engineering (SE) processes.

  4. Adding Badging to a Marketing Simulation to Increase Motivation to Learn

    ERIC Educational Resources Information Center

    Saxton, M. Kim

    2015-01-01

    Badging has become a popular tool for obtaining social recognition for personal accomplishments. This innovation describes a way to add badging to a marketing simulation to increase student motivation to achieve the simulation's goals. Assessments indicate that badging both motivates students to perform better and helps explain students' perceived…

  5. Publicly Releasing a Large Simulation Dataset with NDS Labs

    NASA Astrophysics Data System (ADS)

    Goldbaum, Nathan

    2016-03-01

    Optimally, all publicly funded research should be accompanied by the tools, code, and data necessary to fully reproduce the analysis performed in journal articles describing the research. This ideal can be difficult to attain, particularly when dealing with large (>10 TB) simulation datasets. In this lightning talk, we describe the process of publicly releasing a large simulation dataset to accompany the submission of a journal article. The simulation was performed using Enzo, an open source, community-developed N-body/hydrodynamics code and was analyzed using a wide range of community-developed tools in the scientific Python ecosystem. Although the simulation was performed and analyzed using an ecosystem of sustainably developed tools, we enable sustainable science using our data by making it publicly available. Combining the data release with the NDS Labs infrastructure allows a substantial amount of added value, including web-based access to analysis and visualization using the yt analysis package through an IPython notebook interface. In addition, we are able to accompany the paper submission to the arXiv preprint server with links to the raw simulation data as well as interactive real-time data visualizations that readers can explore on their own or share with colleagues during journal club discussions. It is our hope that the value added by these services will substantially increase the impact and readership of the paper.
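
    As an example of the kind of analysis access described above, loading an Enzo output with the community-developed yt package and producing a visualization takes a few lines. The dataset path below is a placeholder, since the layout of the released data is not described here.

    ```python
    import yt  # pip install yt

    # Placeholder path into the released Enzo output.
    ds = yt.load("enzo_output/DD0100/DD0100")

    # A density slice through the domain, the kind of view exposed through
    # the notebook interface mentioned above.
    slc = yt.SlicePlot(ds, "z", ("gas", "density"))
    slc.save("density_slice.png")
    ```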

  6. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.

  7. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
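
    The core idea (sift a large Monte Carlo data set and point the analyst at the inputs that drive a failure) can be sketched with ordinary Python data tools. The example below builds a synthetic dispersion table and ranks input variables by how strongly they separate failed from passing runs; it illustrates the concept only and is not TRAM's actual algorithm.

    ```python
    import numpy as np
    import pandas as pd

    # Hypothetical Monte Carlo results: one row per dispersed run, columns are
    # input dispersions plus a pass/fail flag for one performance requirement.
    rng = np.random.default_rng(0)
    n = 5000
    df = pd.DataFrame({
        "wind_bias": rng.normal(0, 1, n),
        "cg_offset": rng.normal(0, 1, n),
        "imu_noise": rng.normal(0, 1, n),
    })
    df["failed"] = df.wind_bias + 0.2 * df.cg_offset > 2.0  # synthetic failure rule

    # Rank inputs by the gap between their means in failed vs. passing runs,
    # a simple proxy for "which variables drive this failure type".
    inputs = ["wind_bias", "cg_offset", "imu_noise"]
    gap = {c: abs(df.loc[df.failed, c].mean() - df.loc[~df.failed, c].mean())
           for c in inputs}
    print(sorted(gap.items(), key=lambda kv: -kv[1]))
    ```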

  8. Simulation Tools Prevent Signal Interference on Spacecraft

    NASA Technical Reports Server (NTRS)

    2014-01-01

    NASA engineers use simulation software to detect and prevent interference between different radio frequency (RF) systems on a rocket and satellite before launch. To speed up the process, Kennedy Space Center awarded SBIR funding to Champaign, Illinois-based Delcross Technologies LLC, which added a drag-and-drop feature to its commercial simulation software, resulting in less time spent preparing for the analysis.

  9. New tools for sculpting cranial implants in a shared haptic augmented reality environment.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2006-01-01

    New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.

  10. Circuit design tool. User's manual, revision 2

    NASA Technical Reports Server (NTRS)

    Miyake, Keith M.; Smith, Donald E.

    1992-01-01

    The CAM chip design was produced in a UNIX software environment using a design tool that supports definition of digital electronic modules, composition of these modules into higher level circuits, and event-driven simulation of these circuits. Our design tool provides an interface whose goals include straightforward but flexible primitive module definition and circuit composition, efficient simulation, and a debugging environment that facilitates design verification and alteration. The tool provides a set of primitive modules which can be composed into higher level circuits. Each module is a C-language subroutine that uses a set of interface protocols understood by the design tool. Primitives can be altered simply by recoding their C-code image; in addition new primitives can be added allowing higher level circuits to be described in C-code rather than as a composition of primitive modules--this feature can greatly enhance the speed of simulation.
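
    A minimal sketch of the event-driven style described above, reduced to one NAND primitive and a global event queue. The original primitives are C subroutines speaking the tool's interface protocols; this self-contained Python illustration only mirrors the scheduling idea.

    ```python
    import heapq

    events = []            # priority queue of (time, net, new_value)
    nets = {"a": 0, "b": 0, "out": 1}
    DELAY = 2              # gate propagation delay, arbitrary time units

    def schedule(t, net, value):
        heapq.heappush(events, (t, net, value))

    def nand(t):
        """Primitive module: recompute the output, schedule the change."""
        schedule(t + DELAY, "out", 1 - (nets["a"] & nets["b"]))

    schedule(0, "a", 1)    # input stimulus
    schedule(5, "b", 1)
    while events:
        t, net, value = heapq.heappop(events)
        if nets[net] != value:
            nets[net] = value
            print(f"t={t}: {net} -> {value}")
            if net in ("a", "b"):
                nand(t)    # re-evaluate the gate driven by this net
    ```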

  11. Cubesat Constellation Design for Air Traffic Monitoring

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; Rios, Joseph Lucio; Gerhardt, David; Pham, Camvu

    2015-01-01

    Suitably equipped global and local air traffic can be tracked by receiving the Automatic Dependent Surveillance-Broadcast (ADS-B) signal, and the tracking information may then be used for control from ground-based stations. The ADS-B signal, emitted from the aircraft's Mode-S transponder, is currently tracked by terrestrial based receivers but not over remote oceans or sparsely populated regions such as Alaska or the Pacific Ocean. Lack of real-time aircraft time/location information in remote areas significantly hinders optimal planning and control because bigger "safety bubbles" (lateral and vertical separation) are required around the aircraft until they reach radar-controlled airspace. Moreover, it presents a search-and-rescue bottleneck. Aircraft in distress, e.g. Air France AF447 that crashed in 2009, take days to be located or cannot be located at all, e.g. Malaysia Airlines MH370 in 2014. In this paper, we describe a tool for designing a constellation of small satellites which demonstrates, through high-fidelity modeling based on simulated air traffic data, the value of space-based ADS-B monitoring and provides recommendations for cost-efficient deployment of a constellation of small satellites to increase safety and situational awareness in the currently poorly-served surveillance area of Alaska. Air traffic data were obtained from the Future ATM Concepts Evaluation Tool (FACET), developed at NASA Ames Research Center, simulated over the Alaskan airspace over a period of one day. The simulation is driven by MATLAB with satellites propagated and coverage calculated using AGI's Satellite Tool Kit (STK 10).

  12. Accomplishments and challenges of surgical simulation.

    PubMed

    Satava, R M

    2001-03-01

    For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.

  13. The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning

    ERIC Educational Resources Information Center

    Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…

  14. Mainstreaming Modeling and Simulation to Accelerate Public Health Innovation

    PubMed Central

    Sepulveda, Martin-J.; Mabry, Patricia L.

    2014-01-01

    Dynamic modeling and simulation are systems science tools that examine behaviors and outcomes resulting from interactions among multiple system components over time. Although there are excellent examples of their application, they have not been adopted as mainstream tools in population health planning and policymaking. Impediments to their use include the legacy and ease of use of statistical approaches that produce estimates with confidence intervals, the difficulty of multidisciplinary collaboration for modeling and simulation, systems scientists’ inability to communicate effectively the added value of the tools, and low funding for population health systems science. Proposed remedies include aggregation of diverse data sets, systems science training for public health and other health professionals, changing research incentives toward collaboration, and increased funding for population health systems science projects. PMID:24832426

  15. Comparison of Numerically Simulated and Experimentally Measured Performance of a Rotating Detonation Engine

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.; Fotia, Matthew L.; Hoke, John; Schauer, Fred

    2015-01-01

    A quasi-two-dimensional, computational fluid dynamic (CFD) simulation of a rotating detonation engine (RDE) is described. The simulation operates in the detonation frame of reference and utilizes a relatively coarse grid such that only the essential primary flow field structure is captured. This construction and other simplifications yield rapidly converging, steady solutions. Viscous and heat transfer effects are modeled using source terms. The effects of potential inlet flow reversals are modeled using boundary conditions. Results from the simulation are compared to measured data from an experimental RDE rig with a converging-diverging nozzle added. The comparison is favorable for the two operating points examined. The utility of the code as a performance optimization tool and a diagnostic tool is discussed.

  16. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and are now ported to microcomputers. Graphic and animation capabilities were added to many of these languages to help users build models and evaluate the simulation results. With all these languages and added features, the user is still plagued with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system will be modeled using the simulation assistant, and the advantages and disadvantages discussed.

  17. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post processing. Building on the work from the Center For Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added and through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate end goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  18. Ground Contact Model for Mars Science Laboratory Mission Simulations

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Way, David

    2012-01-01

    The Program to Optimize Simulated Trajectories II (POST 2) has been successful in simulating the flight of launch vehicles and entry bodies on Earth and other planets. POST 2 has been the primary simulation tool for the Entry, Descent, and Landing (EDL) phase of numerous Mars lander missions such as Mars Pathfinder in 1997, the twin Mars Exploration Rovers (MER-A and MER-B) in 2004, and Mars Phoenix lander in 2007, and it is now the main trajectory simulation tool for Mars Science Laboratory (MSL) in 2012. In all previous missions, the POST 2 simulation ended before ground impact, and a tool other than POST 2 simulated landing dynamics. It would be ideal for one tool to simulate the entire EDL sequence, thus avoiding errors that could be introduced by handing off position, velocity, or other flight parameters from one simulation to the other. The desire to have one continuous end-to-end simulation was the motivation for developing the ground interaction model in POST 2. Rover landing, including the detection of the post-landing state, is a very critical part of the MSL mission, as the EDL landing sequence continues for a few seconds after landing. The method explained in this paper illustrates how a simple ground force interaction model has been added to POST 2, which allows simulation of the entire EDL from atmospheric entry through touchdown.
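
    The essence of such a model (engage a penalty spring-damper once a contact point penetrates the surface) fits in a few lines. The sketch below is generic; the gains are placeholders rather than values used in POST 2.

    ```python
    def ground_force(altitude, vertical_velocity, k=1.0e5, c=4.0e3):
        """Penalty-type ground reaction force on one contact point.

        altitude < 0 means the point has penetrated the ground; k and c are
        illustrative spring and damper gains (N/m, N s/m).
        """
        if altitude >= 0.0:
            return 0.0                       # airborne: no contact force
        force = -k * altitude - c * vertical_velocity
        return max(force, 0.0)               # ground pushes, never pulls
    ```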

  19. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  20. Adding biological realism to assessments of landscape connectivity

    EPA Science Inventory

    Researchers have long appreciated the practical value of connectivity and source-sink analyses. The importance of these assessments for conservation, planning, and reserve design has motivated many empirical and simulation studies. But there are few modeling tools available that ...

  1. Informing Hospital Change Processes through Visualization and Simulation: A Case Study at a Children's Emergency Clinic.

    PubMed

    Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Johansson, Gerd

    2014-01-01

    To demonstrate the use of visualization and simulation tools to involve stakeholders and inform hospital change processes, illustrated by an empirical study from a children's emergency clinic. Reorganization and redevelopment of a hospital is a complex activity that involves many stakeholders and demands. Visualization and simulation tools have proven useful for involving practitioners and eliciting relevant knowledge. More knowledge is desired about how these tools can be implemented in practice for hospital planning processes. A participatory planning process including practitioners and researchers was executed over a 3-year period to evaluate a combination of visualization and simulation tools to involve stakeholders in the planning process and to elicit knowledge about needs and requirements. The initial clinic proposal from the architect was discarded as a result of the empirical study. Much general knowledge about the needs of the organization was extracted by means of the adopted tools. Some of the tools proved to be more accessible than others for the practitioners participating in the study. The combination of tools added value to the process by presenting information in alternative ways and eliciting questions from different angles. Visualization and simulation tools inform a planning process (or other types of change processes) by providing the means to see beyond present demands and current work structures. Long-term involvement in combination with accessible tools is central for creating a participatory setting where the practitioners' knowledge guides the process. © 2014 Vendome Group, LLC.

  2. Big data to smart data in Alzheimer's disease: Real-world examples of advanced modeling and simulation.

    PubMed

    Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo

    2016-09-01

    Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on a computer-based modeling and simulation approach as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support of drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  3. McStas 1.7 - a new version of the flexible Monte Carlo neutron scattering package

    NASA Astrophysics Data System (ADS)

    Willendrup, Peter; Farhi, Emmanuel; Lefmann, Kim

    2004-07-01

    Current neutron instrumentation is both complex and expensive, and accurate simulation has become essential both for building new instruments and for using them effectively. The McStas neutron ray-trace simulation package is a versatile tool for producing such simulations, developed in collaboration between Risø and ILL. The new version (1.7) has many improvements, among them added support for the popular Microsoft Windows platform. This presentation will demonstrate a selection of the new features through a simulation of the ILL IN6 beamline.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Animesh, E-mail: animesh@zedat.fu-berlin.de; Delle Site, Luigi, E-mail: dellesite@fu-berlin.de

    Quantum effects due to the spatial delocalization of light atoms are treated in molecular simulation via the path integral technique. Among several methods, Path Integral (PI) Molecular Dynamics (MD) is nowadays a powerful tool to investigate properties induced by spatial delocalization of atoms; however, computationally this technique is very demanding. The above mentioned limitation implies the restriction of PIMD applications to relatively small systems and short time scales. One of the possible solutions to overcome size and time limitation is to introduce PIMD algorithms into the Adaptive Resolution Simulation Scheme (AdResS). AdResS requires a relatively small region treated at path integral level and embeds it into a large molecular reservoir consisting of generic spherical coarse grained molecules. It was previously shown that the realization of the idea above, at a simple level, produced reasonable results for toy systems or simple/test systems like liquid parahydrogen. Encouraged by previous results, in this paper, we show the simulation of liquid water at room conditions where AdResS, in its latest and more accurate Grand-Canonical-like version (GC-AdResS), is merged with two of the most relevant PIMD techniques available in the literature. The comparison of our results with those reported in the literature and/or with those obtained from full PIMD simulations shows a highly satisfactory agreement.

  5. Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia

    2011-01-01

    An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.

  6. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratories. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.

  7. Virtual Grower 3: A powerful decision support tool for greenhouse systems

    USDA-ARS?s Scientific Manuscript database

    Several years ago, Virtual Grower software was released to the public. Initially designed to help greenhouse growers determine heating costs and do simple simulations to figure out where heat savings could be achieved, it has slowly added features. Now, Virtual Grower can help not only identify he...

  8. The Value of Experiential Learning in Long-Term Care Education

    ERIC Educational Resources Information Center

    Wasmuth, Norma

    1975-01-01

    Experiential learning has proved a useful tool in adding meaning to an undergraduate course in the problems of aging and delivery of long-term care. Sensory deprivation and institutionalization commonly experienced by the elderly can be simulated. The response to this educational process increased the students' understanding of sensory…

  9. Adding an Expert to the Team: The Expert Flight Plan Critic

    ERIC Educational Resources Information Center

    Gibbons, Andrew; Waki, Randy; Fairweather, Peter

    2008-01-01

    This paper reports the development of a practical tool that provides expert feedback to students following an extended simulation exercise in cross-country flight planning. In contrast to development for laboratory settings, the development of an expert instructional product for everyday use posed some interesting challenges, including dealing…

  10. Multiple-body simulation with emphasis on integrated Space Shuttle vehicle

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau

    1993-01-01

    The program to obtain intergrid communications - Pegasus - was enhanced to make better use of computing resources. Periodic block tridiagonal and pentadiagonal routines in OVERFLOW were modified to use a better algorithm to speed up the calculation for grids with periodic boundary conditions. Several programs were added to the collar grid tools, and a user friendly shell script was developed to help users generate collar grids. The user interface for HYPGEN was modified to cope with the changes in HYPGEN. ET/SRB attach hardware grids were added to the computational model for the Space Shuttle and are currently incorporated into the refined shuttle model jointly developed at Johnson Space Center and Ames Research Center. Flow simulation for the integrated Space Shuttle vehicle at flight Reynolds number was carried out and compared with flight data as well as the earlier simulation at wind tunnel Reynolds number.

  11. The analysis of thermal comfort requirements through the simulation of an occupied building.

    PubMed

    Thellier, F; Cordier, A; Monchoux, F

    1994-05-01

    Building simulation usually focuses on the study of physical indoor parameters, but we must not forget the main aim of a house: to provide comfort to the occupants. This study was undertaken in order to build a complete tool to model thermal behaviour that will enable the prediction of thermal sensations of humans in a real environment. A human thermoregulation model was added to TRNSYS, a building simulation program. For our purposes, improvements had to be made to the original physiological model, by refining the calculation of all heat exchanges with the environment and adding a representation of clothes. This paper briefly describes the program and its modifications, and compares its results with experimental ones. An example of potential use is given, which points out the usefulness of such models in seeking the best solutions to reach optimal environmental conditions for the global, and especially local, comfort of building occupants.
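
    To make the coupling concrete, the sketch below integrates a two-node (core/skin) heat balance, a common simplification of human thermoregulation models of this kind; all coefficients are illustrative placeholders and do not reproduce the model added to TRNSYS.

    ```python
    from scipy.integrate import solve_ivp

    C_CORE, C_SKIN = 2.2e5, 2.4e4   # heat capacities, J/K (assumed)
    K_CS = 9.5                      # core-skin conductance x body area, W/K
    H_ENV = 14.4                    # skin-environment coefficient x area, W/K
    METABOLIC = 100.0               # metabolic heat production, W

    def rhs(t, T, T_air):
        Tc, Ts = T
        q_cs = K_CS * (Tc - Ts)         # heat flow core -> skin
        q_env = H_ENV * (Ts - T_air)    # heat flow skin -> environment
        return [(METABOLIC - q_cs) / C_CORE, (q_cs - q_env) / C_SKIN]

    sol = solve_ivp(rhs, [0.0, 3600.0], [36.8, 33.7], args=(20.0,), max_step=10.0)
    print(f"skin temperature after 1 h in 20 C air: {sol.y[1, -1]:.1f} C")
    ```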

  12. CubeSat constellation design for air traffic monitoring

    NASA Astrophysics Data System (ADS)

    Nag, Sreeja; Rios, Joseph L.; Gerhardt, David; Pham, Camvu

    2016-11-01

    Suitably equipped global and local air traffic can be tracked by receiving the Automatic Dependent Surveillance-Broadcast (ADS-B) signal, and the tracking information may then be used for control from ground-based stations. In this paper, we describe a tool for designing a constellation of small satellites which demonstrates, through high-fidelity modeling based on simulated air traffic data, the value of space-based ADS-B monitoring. It thereby provides recommendations for cost-efficient deployment of a constellation of small satellites to increase safety and situational awareness in the currently poorly-served surveillance area of Alaska. Air traffic data were obtained from NASA's Future ATM Concepts Evaluation Tool, for the Alaskan airspace over one day. The results presented were driven by MATLAB and the satellites propagated and coverage calculated using AGI's Satellite Tool Kit. While Ad-hoc and precession spread constellations have been quantitatively evaluated, Walker constellations show the best performance in simulation. Sixteen satellites in two perpendicular orbital planes are shown to provide more than 99% coverage over representative Alaskan airspace, and the maximum time gap where any airplane in Alaska is not covered is six minutes, therefore meeting the standard set by the International Civil Aviation Organization to monitor every airplane at least once every fifteen minutes. In spite of the risk of signal collision when multiple packets arrive at the satellite receiver, the proposed constellation shows 99% cumulative probability of reception within four minutes when the airplanes are transmitting every minute, and 100% reception probability if transmitting every second. Data downlink can be performed using any of the three ground stations of the NASA Earth Network in Alaska.
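
    For intuition on the reception figures quoted above: if each transmission is received with independent probability p, the chance of at least one reception within n transmissions is 1 - (1 - p)**n. Under that simplifying independence assumption (the paper's collision analysis is more detailed), 99% cumulative reception within four once-a-minute transmissions corresponds to a per-message probability of roughly 0.68:

    ```python
    # Solve 1 - (1 - p)**4 = 0.99 for the per-transmission reception probability.
    p = 1 - (1 - 0.99) ** (1 / 4)
    print(f"implied per-message reception probability: {p:.3f}")  # ~0.684
    ```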

  13. Adding Automatic Evaluation to Interactive Virtual Labs

    ERIC Educational Resources Information Center

    Farias, Gonzalo; Muñoz de la Peña, David; Gómez-Estern, Fabio; De la Torre, Luis; Sánchez, Carlos; Dormido, Sebastián

    2016-01-01

    Automatic evaluation is a challenging field that has been addressed by the academic community in order to reduce the assessment workload. In this work we present a new element for the authoring tool Easy Java Simulations (EJS). This element, which is named automatic evaluation element (AEE), provides automatic evaluation to virtual and remote…

  14. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    NASA Technical Reports Server (NTRS)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    Incheon International Airport (ICN) is one of the hub airports in East Asia. Airport operations at ICN have been growing more than 5% per year in the past five years. According to the current airport expansion plan, a new passenger terminal will be added and the current cargo ramp will be expanded in 2018. This expansion project will bring 77 new stands without adding a new runway to the airport. Due to such continuous growth in airport operations and future expansion of the ramps, it is highly likely that airport surface traffic will experience more congestion and, therefore, suffer from efficiency degradation. There is a growing awareness in the aviation research community of the need for strategic and tactical surface scheduling capabilities for efficient airport surface operations. Specific to ICN airport operations, a need for A-CDM (Airport - Collaborative Decision Making) or S-CDM (Surface - Collaborative Decision Making) and controller decision support tools for efficient air traffic management arose several years ago. In the United States, there have been independent research efforts by academia, industry, and government research organizations to enhance the efficiency and predictability of surface operations at busy airports. Among these research activities, the Spot and Runway Departure Advisor (SARDA), developed and tested by the National Aeronautics and Space Administration (NASA), is a decision support tool that provides tactical advisories to the controllers for efficient surface operations. The effectiveness of the SARDA concept was successfully verified through human-in-the-loop (HITL) simulations, for both spot release and runway operations advisories for ATC Tower controllers of Dallas/Fort Worth International Airport (DFW) in 2010 and 2012, and for gate pushback advisories for the ramp controller of Charlotte/Douglas International Airport (CLT) in 2014. The SARDA concept for tactical surface scheduling has been further enhanced and is being integrated into NASA's Airspace Technology Demonstration - 2 (ATD-2) project for technology demonstration of Integrated Arrival/Departure/Surface (ADS) operations at CLT. This study is part of the international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement)/KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of the SARDA concept as a controller decision support tool for departure and surface management of ICN. This paper presents the preliminary results of the collaboration effort. It includes investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, and construction and verification of an airport simulation model using the Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  15. Simulations of electron avalanches in an ultra-low-background proportional counter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, John W.; Aalseth, Craig; Dion, Michael P.

    2016-02-01

    New classes have been added to the simulation package Garfield++ to import the potential and electric field solutions generated by ANSYS® Maxwell™ v.16. Using these tools we report results on the simulation of electron avalanches and induced signal waveforms in comparison to experimental data of the ultra-low-background gas proportional counters being developed at Pacific Northwest National Laboratory. Furthermore, an improved mesh search algorithm based on Delaunay triangulation was implemented and provided at least a three-order-of-magnitude time savings when compared to the built-in point-location search class of Garfield++.
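
    scipy exposes the same point-location idea (find the simplex of a Delaunay triangulation containing each query point), which makes the speedup easy to appreciate on a toy problem. The 2D sketch below is illustrative only; Garfield++ and the meshes in the paper are a separate C++ code base.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)
    nodes = rng.random((10_000, 2))       # mesh node coordinates (toy 2D mesh)
    tri = Delaunay(nodes)

    queries = rng.random((100_000, 2))    # e.g. avalanche electron positions
    cells = tri.find_simplex(queries)     # containing triangle index, -1 if outside
    print(f"{(cells >= 0).mean():.2%} of queries located inside the mesh")
    ```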

  16. Graphical user interface for wireless sensor networks simulator

    NASA Astrophysics Data System (ADS)

    Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy

    2008-01-01

    Wireless Sensor Networks (WSN) are currently a very popular area of development. They are suited to many applications, from military through environment monitoring, healthcare, home automation, and others. Those networks, when working in a dynamic, ad-hoc model, need effective protocols, which must differ from common computer network algorithms. Research on those protocols would be difficult without a simulation tool, because real applications often use many nodes, and tests on such big networks take much effort and cost. The paper presents a graphical user interface (GUI) for a simulator dedicated to WSN studies, especially the evaluation of routing and data link protocols.

  17. Guided wave energy trapping to detect hidden multilayer delamination damage

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.

    2015-03-01

    Nondestructive Evaluation (NDE) and Structural Health Monitoring (SHM) simulation tools capable of modeling three-dimensional (3D) realistic energy-damage interactions are needed for aerospace composites. Current practice in NDE/SHM simulation for composites commonly involves over-simplification of the material parameters and/or a simplified two-dimensional (2D) approach. The unique damage types that occur in composite materials (delamination, microcracking, etc) develop as complex 3D geometry features. This paper discusses the application of 3D custom ultrasonic simulation tools to study wave interaction with multilayer delamination damage in carbon-fiber reinforced polymer (CFRP) composites. In particular, simulation based studies of ultrasonic guided wave energy trapping due to multilayer delamination damage were performed. The simulation results show changes in energy trapping at the composite surface as additional delaminations are added through the composite thickness. The results demonstrate a potential approach for identifying the presence of hidden multilayer delamination damage in applications where only single-sided access to a component is available. The paper also describes recent advancements in optimizing the custom ultrasonic simulation code for increases in computation speed.

  18. The virtual child: evaluation of an internet-based pediatric behavior management simulation.

    PubMed

    Boynton, James R; Green, Thomas G; Johnson, Lynn A; Nainar, S M Hashim; Straffon, Lloyd H

    2007-09-01

    This article describes an Internet-based instructional tool designed to give predoctoral dental students a virtual simulation of clinical pediatric dentistry to develop their pediatric behavior management knowledge. Effectiveness of this tool was evaluated using two consecutive classes of junior dental students. The control group was exposed to the traditional behavior management curriculum (two lectures) in a spring term course. The next class of dental students was exposed to the two lectures and, in addition, completed the behavior management simulation during the following spring term. Both groups completed a two-part examination (objective section=18 questions; open-ended section=responses to a clinical situation) designed to test their behavior management knowledge. The simulation group performed significantly better in both parts of the examination (objective section: p=.028; open-ended section: p=.012). The simulation was evaluated by students and perceived by most to be an effective addition to the curriculum. It was concluded that the experimental behavior management simulation, when added to the traditional lecture curriculum, improved pediatric behavior management knowledge in predoctoral dental students.

  19. Multi Sector Planning Tools for Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Mainini, Matthew; Brasil, Connie

    2010-01-01

    This paper discusses a suite of multi sector planning tools for trajectory-based operations that were developed and evaluated in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. The toolset included tools for traffic load and complexity assessment as well as trajectory planning and coordination. The situation assessment tools included an integrated suite of interactive traffic displays, load tables, load graphs, and dynamic aircraft filters. The planning toolset allowed for single and multi aircraft trajectory planning and data communication-based coordination of trajectories between operators. Also newly introduced was a real-time computation of sector complexity into the toolset that operators could use in lieu of aircraft count to better estimate and manage sector workload, especially in situations with convective weather. The tools were used during a joint NASA/FAA multi sector planner simulation in the AOL in 2009 that had multiple objectives with the assessment of the effectiveness of the tools being one of them. Current air traffic control operators who were experienced as area supervisors and traffic management coordinators used the tools throughout the simulation and provided their usefulness and usability ratings in post simulation questionnaires. This paper presents these subjective assessments as well as the actual usage data that was collected during the simulation. The toolset was rated very useful and usable overall. Many elements received high scores by the operators and were used frequently and successfully. Other functions were not used at all, but various requests for new functions and capabilities were received that could be added to the toolset.

  20. Surgical scissors extension adds the 7th axis of force feedback to the Freedom 6S.

    PubMed

    Powers, Marilyn J; Sinclair, Ian P W; Brouwer, Iman; Laroche, Denis

    2007-01-01

    A virtual reality surgical simulator ideally allows seamless transition between the real and virtual world. In that respect, all of a surgeon's motions and tools must be simulated. Until now researchers have been limited to using a pen-like tool in six degrees-of-freedom. This paper presents the addition of haptically enabled scissors to the end effector of a 6-DOF haptic device, the Freedom 6S. The scissors are capable of applying a maximum pinching torque of 460 mN·m with low inertia and low back-drive friction. The device is a balanced design so that the user feels like they are holding no more than actual scissors, although with some added inertia on the load end. The system is interchangeable between the 6-DOF and 7-DOF configurations to allow switching tools quickly.

  1. AD-LIBS: inferring ancestry across hybrid genomes using low-coverage sequence data.

    PubMed

    Schaefer, Nathan K; Shapiro, Beth; Green, Richard E

    2017-04-04

    Inferring the ancestry of each region of admixed individuals' genomes is useful in studies ranging from disease gene mapping to speciation genetics. Current methods require high-coverage genotype data and phased reference panels, and are therefore inappropriate for many data sets. We present a software application, AD-LIBS, that uses a hidden Markov model to infer ancestry across hybrid genomes without requiring variant calling or phasing. This approach is useful for non-model organisms and in cases of low-coverage data, such as ancient DNA. We demonstrate the utility of AD-LIBS with synthetic data. We then use AD-LIBS to infer ancestry in two published data sets: European human genomes with Neanderthal ancestry and brown bear genomes with polar bear ancestry. AD-LIBS correctly infers 87-91% of ancestry in simulations and produces ancestry maps that agree with published results and global ancestry estimates in humans. In brown bears, we find more polar bear ancestry than has been published previously, using both AD-LIBS and an existing software application for local ancestry inference, HAPMIX. We validate AD-LIBS polar bear ancestry maps by recovering a geographic signal within bears that mirrors what is seen in SNP data. Finally, we demonstrate that AD-LIBS is more effective than HAPMIX at inferring ancestry when preexisting phased reference data are unavailable and genomes are sequenced to low coverage. AD-LIBS is an effective tool for ancestry inference that can be used even when few individuals are available for comparison or when genomes are sequenced to low coverage. AD-LIBS is therefore likely to be useful in studies of non-model or ancient organisms that lack large amounts of genomic DNA. AD-LIBS can therefore expand the range of studies in which admixture mapping is a viable tool.
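
    As a toy illustration of window-based hidden Markov model ancestry inference of this general kind, the sketch below Viterbi-decodes two ancestry states over a handful of windows. The transition and emission values are invented, and the actual AD-LIBS model differs.

    ```python
    import numpy as np

    states = ["species_A", "species_B"]
    log_trans = np.log([[0.98, 0.02],      # ancestry tracts tend to persist
                        [0.02, 0.98]])
    # Per-window emission likelihoods P(window statistics | ancestry state),
    # synthetic stand-ins for the low-coverage sequence data.
    log_emit = np.log([[0.8, 0.2], [0.7, 0.3], [0.3, 0.7], [0.2, 0.8], [0.6, 0.4]])

    n, k = log_emit.shape
    v = np.zeros((n, k))                   # Viterbi log-scores
    ptr = np.zeros((n, k), dtype=int)      # backpointers
    v[0] = np.log(0.5) + log_emit[0]
    for t in range(1, n):
        scores = v[t - 1][:, None] + log_trans   # scores[i, j]: from state i to j
        ptr[t] = scores.argmax(axis=0)
        v[t] = scores.max(axis=0) + log_emit[t]

    path = [int(v[-1].argmax())]           # backtrack the best state sequence
    for t in range(n - 1, 0, -1):
        path.append(int(ptr[t][path[-1]]))
    print([states[s] for s in reversed(path)])
    ```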

  2. Modeling of the UAE Wind Turbine for Refinement of FAST_AD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonkman, J. M.

    The Unsteady Aerodynamics Experiment (UAE) research wind turbine was modeled both aerodynamically and structurally in the FAST_AD wind turbine design code, and its response to wind inflows was simulated for a sample of test cases. A study was conducted to determine why wind turbine load magnitude discrepancies (inconsistencies in aerodynamic force coefficients, rotor shaft torque, and out-of-plane bending moments at the blade root across a range of operating conditions) exist between load predictions made by FAST_AD and other modeling tools and measured loads taken from the actual UAE wind turbine during the NASA-Ames wind tunnel tests. The acquired experimental test data represent the finest, most accurate set of wind turbine aerodynamic and induced flow field data available today. A sample of the FAST_AD model input parameters most critical to the aerodynamics computations was also systematically perturbed to determine their effect on load and performance predictions. Attention was focused on the simpler upwind rotor configuration, zero yaw error test cases. Inconsistencies in input file parameters, such as aerodynamic performance characteristics, explain a noteworthy fraction of the load prediction discrepancies of the various modeling tools.

  3. Federation Development and Execution Process (FEDEP) Tools in Support of NATO Modelling & Simulation (M&S) Programmes (Des outils d'aide au processus de développement et d'exécution de fédérations (FEDEP))

    DTIC Science & Technology

    2004-05-01

    currently contains 79 tools and others should be added as they become known. Finally, the Task Group has recommended that the tool list be made available...approach and analysis. Conclusions and recommendations are contained in Chapter 5. ...generation, Version 1.5 [A.3-1], was created in December 1999 and contained only minor editorial changes.

  4. Death During Simulation: A Literature Review.

    PubMed

    Heller, Benjamin J; DeMaria, Samuel; Katz, Daniel; Heller, Joshua A; Goldberg, Andrew T

    2016-01-01

    One of the goals of simulation is to teach subjects critical skills and knowledge applicable to live encounters, without the risk of harming actual patients. Although simulation education has surged in medical training over the last two decades, several ethically challenging educational methods have arisen. Simulated death has arisen as one of these challenging issues and currently there is no consensus regarding how to best manage this controversial topic in the simulated environment. The goal of this review is to analyze how simulated mortality has been used and discover whether or not this tool is beneficial to learners. In May 2016, the authors performed a literature search on both PubMed and the Cochrane database using multiple variations of keywords; they then searched bibliographies and related articles. There were 901 articles acquired in the initial search. The authors eliminated articles that were not relevant to the subject matter. After adding articles from bibliographies and related articles, the authors included the 43 articles cited in this article. As a result, the authors of this article believe that death, when used appropriately in simulation, can be an effective teaching tool and can be used in a responsible manner.

  5. Coupling West WRF to GSSHA with GSSHApy

    NASA Astrophysics Data System (ADS)

    Snow, A. D.

    2017-12-01

    The West WRF output data are in a gridded NetCDF format containing the forcing data required to run a GSSHA simulation. These data include precipitation, pressure, temperature, relative humidity, cloud cover, wind speed, and solar radiation. Tools to reproject, resample, and reformat the data for GSSHA have recently been added to the open source Python library GSSHApy (https://github.com/ci-water/gsshapy). These tools have created a connection that makes it possible to run forecasts using the West WRF forcing data with GSSHA to produce both streamflow and lake level predictions.
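
    A minimal flavor of that reformatting step, written with xarray rather than the GSSHApy helpers (whose exact API is not shown here); the file name, variable name, and hourly target are assumptions, and the time coordinate is assumed to be decoded to datetimes.

    ```python
    import xarray as xr

    # Hypothetical West WRF forecast file and precipitation variable name.
    ds = xr.open_dataset("west_wrf_forecast.nc")

    # Resample one forcing field to hourly means before handing it to the
    # hydrologic model.
    precip_hourly = ds["RAINNC"].resample(Time="1H").mean()
    precip_hourly.to_netcdf("gssha_precip_hourly.nc")
    ```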

  6. Effects of an Approach Spacing Flight Deck Tool on Pilot Eyescan

    NASA Technical Reports Server (NTRS)

    Oseguera-Lohr, Rosa M.; Nadler, Eric D.

    2004-01-01

    An airborne tool has been developed based on the concept of an aircraft maintaining a time-based spacing interval from the preceding aircraft. The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcast (ADS-B) aircraft state data to compute a speed command for the ATAAS-equipped aircraft to obtain a required time interval behind another aircraft. The tool and candidate operational procedures were tested in a high-fidelity, full mission simulator with active airline subject pilots flying an arrival scenario using three different modes for speed control. Eyetracker data showed only slight changes in instrument scan patterns, and no significant change in the amount of time spent looking out the window with ATAAS, versus standard ILS procedures.

  7. A Re-Engineered Software Interface and Workflow for the Open-Source SimVascular Cardiovascular Modeling Package.

    PubMed

    Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L

    2018-02-01

    Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. SimVascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, a plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.

  8. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    PubMed

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics by employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.
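
    To make the idea of hierarchical aggregation concrete, the sketch below scores tools against a small criteria tree. The tree, weights, and ratings are hypothetical, and the weighted sum is a simplification: DEX itself works with qualitative utility functions rather than numeric weights.

      # Hypothetical hierarchical scoring of BPSS tools (simplified DEX-like).
      def score(node, ratings):
          if "children" not in node:                 # leaf criterion
              return ratings[node["name"]]
          return sum(c["weight"] * score(c, ratings) for c in node["children"])

      model = {"name": "BPSS quality", "children": [
          {"name": "visual aspects", "weight": 0.3},
          {"name": "simulation capabilities", "weight": 0.5},
          {"name": "statistical facilities", "weight": 0.2},
      ]}

      tools = {
          "Tool A": {"visual aspects": 4, "simulation capabilities": 5,
                     "statistical facilities": 3},
          "Tool B": {"visual aspects": 5, "simulation capabilities": 3,
                     "statistical facilities": 4},
      }
      for name, ratings in sorted(tools.items(),
                                  key=lambda kv: -score(model, kv[1])):
          print(name, round(score(model, ratings), 2))   # Tool A 4.3, Tool B 3.8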

  9. Mechanical discrete simulator of the electro-mechanical lift with n:1 roping

    NASA Astrophysics Data System (ADS)

    Alonso, F. J.; Herrera, I.

    2016-05-01

    The design process of new products in lift engineering is a difficult task due mainly to the complexity and slenderness of the lift system, demanding a predictive tool for the lift mechanics. A mechanical ad-hoc discrete simulator, as an alternative to ‘general purpose’ mechanical simulators, is proposed. Firstly, the synthesis and experimentation process that led to a model capable of accurately simulating the response of the electromechanical lift is discussed. Then, the equations of motion are derived. The model comprises a discrete system of 5 vertically displaceable masses (car, counterweight, car frame, passengers/loads and lift drive), an inertial mass of the tension pulley-rotor shaft assembly which can rotate about the machine axis, and 6 mechanical connectors for the 1:1 suspension layout. The model is extended to any n:1 roping lift by setting 6 equivalent mechanical components (suspension systems for car and counterweight, lift drive silent blocks, tension pulley-lift drive stator and passengers/load equivalent spring-damper) by inductive inference from the 1:1 and generalized 2:1 roping systems. Application to real elevator systems is achieved by numeric time integration of the governing equations using the Kutta-Merson algorithm, implemented in a computer program for ad-hoc elevator simulation called ElevaCAD.
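
    The discrete-mass approach can be illustrated with a two-mass sketch: a car and counterweight coupled through rope stiffness and damping, integrated with a classical Runge-Kutta step (the paper uses the Kutta-Merson variant, and its full model has 5 masses plus the sheave inertia). All parameter values below are arbitrary.

      # Two-mass spring-damper sketch of the lift model, integrated with RK4.
      import numpy as np

      m_car, m_cwt = 1000.0, 1100.0     # kg
      k, c, g = 5.0e5, 2.0e3, 9.81      # rope stiffness N/m, damping, gravity

      def rhs(t, y):
          x1, v1, x2, v2 = y            # car and counterweight states
          f = k * (x2 - x1) + c * (v2 - v1)   # rope coupling force
          return np.array([v1, (f - m_car * g) / m_car,
                           v2, (-f - m_cwt * g) / m_cwt])

      def rk4_step(t, y, h):
          k1 = rhs(t, y)
          k2 = rhs(t + h / 2, y + h / 2 * k1)
          k3 = rhs(t + h / 2, y + h / 2 * k2)
          k4 = rhs(t + h, y + h * k3)
          return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      y, h = np.zeros(4), 1e-3
      for i in range(1000):             # 1 s of simulated time
          y = rk4_step(i * h, y, h)
      print(y)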

  10. Processing biobased polymers using plasticizers: Numerical simulations versus experiments

    NASA Astrophysics Data System (ADS)

    Desplentere, Frederik; Cardon, Ludwig; Six, Wim; Erkoç, Mustafa

    2016-03-01

    In polymer processing, the use of biobased products offers many possibilities. For biobased materials, biodegradability is in most cases the most important issue. In addition, biobased materials aimed at durable applications are gaining interest. Within this research, the influence of plasticizers on the processing of a biobased material is investigated for an extrusion grade of PLA, NatureWorks PLA 2003D. Extrusion through a slit die equipped with pressure sensors is used to compare the experimental pressure values to numerical simulation results. Additional experimental data (temperature and pressure along the extrusion screw and die) were recorded on a Dr. Collin lab extruder producing a 25 mm diameter tube. All these experimental data were used to verify the appropriate functioning of the numerical simulation tool Virtual Extrusion Laboratory 6.7 for the simulation of both the industrially available extrusion-grade PLA and the compound to which 15% plasticizer was added. Adding the plasticizer resulted in a 40% lower pressure drop over the extrusion die. The combination of different experiments allowed the numerical simulation results to be fitted closely to the experimental values. Based on this experience, it is shown that numerical simulations can also be used for modified biobased materials if appropriate material and process data are taken into account.

  11. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    NASA Technical Reports Server (NTRS)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools that aids software development, integration testing, mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Given the difficulty of providing a hardware test-bed to each developer and tester, hardware models are built from characteristic data or manufacturer data sheets for each individual component. The fidelity of each hardware model is such that FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (the 42 spacecraft simulator) is used to provide the necessary data. The underlying infrastructure used to transfer messages between FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any combination of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For testing, plug-ins are implemented in COSMOS to control the NOS3 simulations, while the command and telemetry tools available in COSMOS are used to communicate with FSW. NOS3 is actively being used for FSW development and component testing of the Simulation-to-Flight 1 (STF-1) CubeSat. As NOS3 matures, hardware models have been added for common CubeSat components such as NovAtel GPS receivers, Clyde Space electrical power systems and batteries, and ISISpace antenna systems. In the future, NASA IV&V plans to distribute NOS3 to other CubeSat developers and release the suite to the open-source community.
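
    The hardware-model idea can be sketched in a few lines: a software stand-in answers flight-software queries over a socket so the FSW cannot tell the physical part is absent. The one-line protocol below is invented for illustration; NOS3 uses its own message-passing infrastructure.

      # Toy software-based hardware model: a fake GPS receiver on TCP.
      import socketserver

      class FakeGpsHandler(socketserver.StreamRequestHandler):
          def handle(self):
              for line in self.rfile:
                  if line.strip() == b"GET_POS":
                      # Canned response; a real model would pull state from
                      # an environment simulation such as 42.
                      self.wfile.write(b"POS 6878.0 0.0 0.0\n")
                  else:
                      self.wfile.write(b"ERR unknown command\n")

      if __name__ == "__main__":
          with socketserver.TCPServer(("localhost", 5555), FakeGpsHandler) as srv:
              srv.serve_forever()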

  12. Voltage Support Study of Smart PV Inverters on a High-Photovoltaic Penetration Utility Distribution Feeder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Pratt, Annabelle; Bialek, Tom

    2016-11-21

    This paper reports on tools and methodologies developed to study the impact of adding rooftop photovoltaic (PV) systems, with and without the ability to provide voltage support, on the voltage profile of distribution feeders. Simulation results are provided from a study of a specific utility feeder. The simulation model of the utility distribution feeder was built in OpenDSS and verified by comparing the simulated voltages to field measurements. First, we set all PV systems to operate at unity power factor and analyzed the impact on feeder voltages. Then we conducted multiple simulations with voltage support activated for all the smart PV inverters. These included different constant power factor settings and volt/VAR controls.
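
    The volt/VAR control mentioned above is typically a piecewise-linear droop curve: the inverter injects reactive power at low voltage, absorbs it at high voltage, and does nothing in a deadband around nominal. The sketch below follows that general shape; the breakpoints are illustrative, not the study's settings.

      # Illustrative piecewise-linear volt/VAR droop curve.
      import numpy as np

      def volt_var(v_pu, q_max=0.44):
          """Reactive power command (pu of rated VA) vs. voltage (pu).
          Positive Q injects reactive power; negative absorbs."""
          v_pts = [0.92, 0.98, 1.02, 1.08]
          q_pts = [q_max, 0.0, 0.0, -q_max]
          return np.interp(v_pu, v_pts, q_pts)

      for v in (0.90, 0.95, 1.00, 1.05, 1.10):
          print(f"V={v:.2f} pu -> Q={volt_var(v):+.3f} pu")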

  13. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy.

    PubMed

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize the model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy, including stronger relative consistency and robustness, are first verified through simulated EEG series generated by the alpha rhythm model. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies could better distinguish EEG signals of AD from those of normal subjects than approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, such as at electrodes T3 and T4. In addition, fuzzy sample entropy could achieve higher group differences in different brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. The obtained results prove that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.
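
    For reference, fuzzy sample entropy replaces sample entropy's hard similarity threshold with a smooth exponential membership function. The sketch below is a minimal implementation with common default parameters (m=2, r=0.2*std, n=2), which are assumptions rather than the paper's exact settings.

      # Minimal fuzzy sample entropy (FuzzySampEn) for a 1-D series.
      import numpy as np

      def fuzzy_sampen(x, m=2, r=None, n=2):
          x = np.asarray(x, dtype=float)
          if r is None:
              r = 0.2 * x.std()

          def phi(m):
              N = len(x) - m + 1
              # Embedded vectors with their own mean (baseline) removed.
              X = np.array([x[i:i + m] for i in range(N)])
              X -= X.mean(axis=1, keepdims=True)
              # Chebyshev distances between all pairs of vectors.
              d = np.max(np.abs(X[:, None, :] - X[None, :, :]), axis=2)
              mu = np.exp(-(d ** n) / r)       # fuzzy membership degree
              np.fill_diagonal(mu, 0.0)        # exclude self-matches
              return mu.sum() / (N * (N - 1))

          return -np.log(phi(m + 1) / phi(m))

      rng = np.random.default_rng(0)
      print(fuzzy_sampen(rng.standard_normal(500)))                 # irregular: high
      print(fuzzy_sampen(np.sin(np.linspace(0, 20 * np.pi, 500))))  # regular: low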

  14. Characterization of complexity in the electroencephalograph activity of Alzheimer's disease based on fuzzy entropy

    NASA Astrophysics Data System (ADS)

    Cao, Yuzhen; Cai, Lihui; Wang, Jiang; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing

    2015-08-01

    In this paper, experimental neurophysiologic recording and statistical analysis are combined to investigate the nonlinear characteristics and the cognitive function of the brain. Fuzzy approximate entropy and fuzzy sample entropy are applied to characterize the model-based simulated series and electroencephalograph (EEG) series of Alzheimer's disease (AD). The effectiveness and advantages of these two kinds of fuzzy entropy, including stronger relative consistency and robustness, are first verified through simulated EEG series generated by the alpha rhythm model. Furthermore, in order to detect the abnormality of irregularity and chaotic behavior in the AD brain, complexity features based on these two fuzzy entropies are extracted in the delta, theta, alpha, and beta bands. It is demonstrated that, due to the introduction of fuzzy set theory, the fuzzy entropies could better distinguish EEG signals of AD from those of normal subjects than approximate entropy and sample entropy. Moreover, the entropy values of AD are significantly decreased in the alpha band, particularly in the temporal brain region, such as at electrodes T3 and T4. In addition, fuzzy sample entropy could achieve higher group differences in different brain regions and a higher average classification accuracy of 88.1% with a support vector machine classifier. The obtained results prove that fuzzy sample entropy may be a powerful tool to characterize the complexity abnormalities of AD, which could be helpful in further understanding of the disease.

  15. Feasibility and fidelity of practising surgical fixation on a virtual ulna bone

    PubMed Central

    LeBlanc, Justin; Hutchison, Carol; Hu, Yaoping; Donnon, Tyrone

    2013-01-01

    Background Surgical simulators provide a safe environment to learn and practise psychomotor skills. A goal for these simulators is to achieve high levels of fidelity. The purpose of this study was to develop a reliable surgical simulator fidelity questionnaire and to assess whether a newly developed virtual haptic simulator for fixation of an ulna has comparable levels of fidelity as Sawbones. Methods Simulator fidelity questionnaires were developed. We performed a stratified randomized study with surgical trainees. They performed fixation of the ulna using a virtual simulator and Sawbones. They completed the fidelity questionnaires after each procedure. Results Twenty-two trainees participated in the study. The reliability of the fidelity questionnaire for each separate domain (environment, equipment, psychological) was a Cronbach α greater than 0.70, except for the virtual environment. The Sawbones had significantly higher levels of fidelity than the virtual simulator (p < 0.001), with a large effect size difference (Cohen d > 1.3). Conclusion The newly developed fidelity questionnaire is a reliable tool that can potentially be used to determine the fidelity of other surgical simulators. Increasing the fidelity of this virtual simulator is required before its use as a training tool for surgical fixation. The virtual simulator brings with it the added benefits of repeated, independent safe use with immediate, objective feedback and the potential to alter the complexity of the skill. PMID:23883510

  16. Study of a Simulation Tool to Determine Achievable Control Dynamics and Control Power Requirements with Perfect Tracking

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.

    1998-01-01

    This paper contains a study of two methods for use in a generic nonlinear simulation tool that could be used to determine achievable control dynamics and control power requirements while performing perfect tracking maneuvers over the entire flight envelope. The two methods are NDI (nonlinear dynamic inversion) and the SOFFT (Stochastic Optimal Feedforward and Feedback Technology) feedforward control structure. Equivalent discrete and continuous SOFFT feedforward controllers have been developed. These equivalent forms clearly show that the closed-loop plant model loop is a plant inversion and is the same as the NDI formulation. The main difference is that the NDI formulation has a closed-loop controller structure whereas SOFFT uses an open-loop command model. Continuous, discrete, and hybrid controller structures have been developed and integrated into the formulation. Linear simulation results show that seven different configurations all give essentially the same response, with the NDI hybrid being slightly different. The SOFFT controller gave better tracking performance than the NDI controller when a nonlinear saturation element was added. Future plans include evaluation using a nonlinear simulation.
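
    The plant-inversion idea common to both methods is easy to show in one state: for x' = f(x) + g(x)u, choosing u = (v - f(x))/g(x) cancels the plant dynamics so that x follows the desired dynamics v. The plant and gain below are arbitrary illustrations, not the paper's aircraft models.

      # One-state sketch of nonlinear dynamic inversion (NDI).
      import numpy as np

      f = lambda x: -x**3 + np.sin(x)      # nonlinear plant dynamics
      g = lambda x: 2.0 + np.cos(x)        # control effectiveness (nonzero)

      def ndi_step(x, x_cmd, dt=0.01, k=5.0):
          v = k * (x_cmd - x)              # desired first-order response
          u = (v - f(x)) / g(x)            # plant inversion
          return x + dt * (f(x) + g(x) * u)

      x = 0.0
      for _ in range(300):                 # track a unit step command
          x = ndi_step(x, 1.0)
      print(round(x, 4))                   # approaches 1.0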

  17. NEURON and Python.

    PubMed

    Hines, Michael L; Davison, Andrew P; Muller, Eilif

    2009-01-01

    The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications.
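
    A minimal example of the Python interface, using the standard documented API: a Hodgkin-Huxley soma driven by a brief current pulse, with the membrane potential recorded into a vector that is directly usable by NumPy-based analysis tools.

      # Minimal NEURON-from-Python model: HH soma plus current clamp.
      from neuron import h
      h.load_file("stdrun.hoc")            # standard run system

      soma = h.Section(name="soma")
      soma.L = soma.diam = 20              # um
      soma.insert("hh")                    # Hodgkin-Huxley channels

      stim = h.IClamp(soma(0.5))
      stim.delay, stim.dur, stim.amp = 5, 1, 0.5   # ms, ms, nA

      t = h.Vector().record(h._ref_t)
      v = h.Vector().record(soma(0.5)._ref_v)

      h.finitialize(-65)                   # mV
      h.continuerun(40)                    # ms
      print(max(v))                        # spike peak, mV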

  18. NEURON and Python

    PubMed Central

    Hines, Michael L.; Davison, Andrew P.; Muller, Eilif

    2008-01-01

    The NEURON simulation program now allows Python to be used, alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost because all existing models written in Hoc, including graphical user interface tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications. PMID:19198661

  19. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we'll show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
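
    A minimal Sphinx configuration in this spirit wires together equation rendering and build-time matplotlib plotting; the extension names below are the real Sphinx/matplotlib ones, while the project values are placeholders.

      # conf.py sketch: reStructuredText sources with equations and plots.
      project = "Amanzi-style test suite"
      extensions = [
          "sphinx.ext.mathjax",                   # render .. math:: blocks
          "matplotlib.sphinxext.plot_directive",  # run .. plot:: at build time
      ]
      # Both HTML and PDF (via LaTeX) build from the same source:
      #   sphinx-build -b html docs build/html
      #   sphinx-build -b latex docs build/latex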

  20. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model

    PubMed Central

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies’ business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and “what-if” scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics by employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694

  1. Analytical tools for identification of non-intentionally added substances (NIAS) coming from polyurethane adhesives in multilayer packaging materials and their migration into food simulants.

    PubMed

    Félix, Juliana S; Isella, Francesca; Bosetti, Osvaldo; Nerín, Cristina

    2012-07-01

    Adhesives used in food packaging to glue different materials can provide several substances as potential migrants, and the identification of potential migrants and migration tests are required to assess the safety of adhesive use. Solid-phase microextraction in headspace mode with gas chromatography coupled to mass spectrometry (HS-SPME-GC-MS), together with the ChemSpider and SciFinder databases, were used as powerful tools to identify the potential migrants in the polyurethane (PU) adhesives and also in the individual plastic films (polyethylene terephthalate, polyamide, polypropylene, polyethylene, and polyethylene/ethylene vinyl alcohol). Migration tests were carried out using Tenax® and isooctane as food simulants, and the migrants were analyzed by gas chromatography coupled to mass spectrometry. More than 63 volatile and semivolatile compounds considered potential migrants were detected either in the adhesives or in the films. Migration tests showed two non-intentionally added substances (NIAS) coming from PU adhesives that migrated through the laminates into Tenax® and into isooctane. Identification of these NIAS was achieved through their mass spectra, and 1,6-dioxacyclododecane-7,12-dione and 1,4,7-trioxacyclotridecane-8,13-dione were confirmed. Caprolactam migrated into isooctane, and its origin was the external plastic film in the multilayer, demonstrating real diffusion through the multilayer structure. Comparison of the migration values between the simulants and conditions will be shown and discussed.

  2. Hybrid Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trujillo, David J.; Sridharan, Srikesh; Weinstock, Irvin

    HybSim (short for Hybrid Simulator) is a flexible, easy-to-use screening tool that allows the user to quantify the technical and economic benefits of installing a village hybrid generating system. It simulates systems with any combination of diesel generator sets, photovoltaic arrays, wind turbines, and battery energy storage systems. Most village systems (or small population sites such as villages, remote military bases, small communities, and independent or isolated buildings or centers) depend on diesel generation systems as their source of energy. HybSim allows the user to evaluate other "sources" of energy that can greatly reduce the dollar-per-kilowatt-hour ratio. Supported by the DOE Energy Storage Program, HybSim was initially developed to help analyze the benefits of energy storage systems in Alaskan villages. Soon after its development, other sources of energy were added, providing the user with a greater range of analysis opportunities and the village with potentially added savings. In addition to village systems, HybSim has generated interest from military institutions for energy provisioning and from USAID for international village analysis.

  3. Towards a supported common NEAMS software stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormac Garvey

    2012-04-01

    The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. This new breed of simulation codes will include rigorous verification, validation and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier process for acquiring licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics, and the added computational resources needed to quantify the accuracy/quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out production simulation runs.

  4. A Simulation Tool for the Duties of Computer Specialist Non-Commissioned Officers on a Turkish Air Force Base

    DTIC Science & Technology

    2009-09-01

    Interface IFR Instrument Flight Rules LANTIRN Low-Altitude Navigation and Targeting Infrared for Night MANTIRN Medium Altitude Navigation and...MANTIRN categories, and IFR weather categories. Aside from the category of personnel (computer specialist NCOs rather than pilots), the main...of the node, (2) Adding a description, (3) Implementing event arguments , local variables, and state transitions, (4) Implementing a code that is

  5. Defense in Depth Added to Malicious Activities Simulation Tools (MAST)

    DTIC Science & Technology

    2015-09-01

    cipher suites. The TLS Handshake is a combination of three components: handshake, change cipher spec, and alert. (1) The Handshake (Hello): The TLS Handshake, specifically the "Hello" portion, is designed to negotiate session parameters (cipher suite). The client informs the server of the protocols and standards that it supports and then the server selects the highest common protocols and standards. Specifically, the Client Hello message
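
    The negotiation described in these fragments can be observed directly with Python's standard ssl module: the client offers its cipher suites in the Client Hello, and the library reports what the server selected.

      # Inspect the negotiated TLS protocol version and cipher suite.
      import socket
      import ssl

      ctx = ssl.create_default_context()
      with socket.create_connection(("example.com", 443)) as sock:
          with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
              print(tls.version())   # negotiated protocol, e.g. TLSv1.3
              print(tls.cipher())    # (cipher suite, protocol, secret bits)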

  6. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETO-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED... STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  7. A Fast-Time Simulation Environment for Airborne Merging and Spacing Research

    NASA Technical Reports Server (NTRS)

    Bussink, Frank J. L.; Doble, Nathan A.; Barmore, Bryan E.; Singer, Sharon

    2005-01-01

    As part of NASA's Distributed Air/Ground Traffic Management (DAG-TM) effort, NASA Langley Research Center is developing concepts and algorithms for merging multiple aircraft arrival streams and precisely spacing aircraft over the runway threshold. An airborne tool has been created for this purpose, called Airborne Merging and Spacing for Terminal Arrivals (AMSTAR). To evaluate the performance of AMSTAR and complement human-in-the-loop experiments, a simulation environment has been developed that enables fast-time studies of AMSTAR operations. The environment is based on TMX, a multiple aircraft desktop simulation program created by the Netherlands National Aerospace Laboratory (NLR). This paper reviews the AMSTAR concept, discusses the integration of the AMSTAR algorithm into TMX and the enhancements added to TMX to support fast-time AMSTAR studies, and presents initial simulation results.

  8. Construct validation of a novel hybrid surgical simulator.

    PubMed

    Broe, D; Ridgway, P F; Johnson, S; Tierney, S; Conlon, K C

    2006-06-01

    Simulated minimal access surgery has improved recently as both a learning and an assessment tool. The construct validation of a novel simulator, ProMIS, is described for use by residents in training. ProMIS is a surgical simulator that can present tasks in both virtual and actual reality. A pilot group of surgical residents ranging from novice to expert completed three standardized tasks: orientation, dissection, and basic suturing. The tasks were tested for construct validity. Two experienced surgeons examined the recorded tasks in a blinded fashion using an objective structured assessment of technical skills format (OSATS: task-specific checklist and global rating score) as well as metrics delivered by the simulator. The findings showed excellent interrater reliability (Cronbach's alpha of 0.88 for the checklist and 0.93 for the global rating). The median scores of the experience groups were statistically different in both the global rating and the task-specific checklists (p < 0.05). The scores for the orientation task alone did not reach significance (p = 0.1), suggesting that modification is required before ProMIS could be used in isolation as an assessment tool. The three simulated tasks in combination are construct valid for differentiating experience levels among surgeons in training. This hybrid simulator has the potential added benefits of marrying the virtual with the actual, and of combining simple box-trainer tasks with advanced virtual reality simulation.

  9. Mars Exploration Rover Terminal Descent Mission Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.

    2004-01-01

    Because of NASA's added reliance on simulation for successful interplanetary missions, the MER mission has developed a detailed EDL trajectory modeling and simulation capability. This paper summarizes how the MER EDL sequence of events is modeled, the verification of the methods used, and the inputs. The simulation is built upon a multibody parachute trajectory simulation tool, developed in POST II, that accurately simulates the trajectory of multiple vehicles in flight with interacting forces. In this model the parachute and the suspended bodies are treated as 6 degree-of-freedom (6 DOF) bodies. The terminal descent phase of the mission consists of several entry, descent, and landing (EDL) events, such as parachute deployment, heatshield separation, deployment of the lander from the backshell, deployment of the airbags, RAD firings, and TIRS firings. For an accurate, reliable simulation these events need to be modeled seamlessly and robustly so that the simulations remain numerically stable during Monte Carlo runs. This paper also summarizes how the events have been modeled, the numerical issues, and the modeling challenges.

  10. The Application of SNiPER to the JUNO Simulation

    NASA Astrophysics Data System (ADS)

    Lin, Tao; Zou, Jiaheng; Li, Weidong; Deng, Ziyan; Fang, Xiao; Cao, Guofu; Huang, Xingtao; You, Zhengyun; JUNO Collaboration

    2017-10-01

    The JUNO (Jiangmen Underground Neutrino Observatory) is a multipurpose neutrino experiment designed to determine the neutrino mass hierarchy and precisely measure oscillation parameters. As one of the important systems, the JUNO offline software is being developed using the SNiPER framework. In this paper, we focus on the requirements of JUNO simulation and present the working solution based on SNiPER. The JUNO simulation framework is in charge of managing event data, detector geometries and materials, physics processes, simulation truth information, etc. It glues the physics generator, detector simulation and electronics simulation modules together to achieve a full simulation chain. In the implementation of the framework, many attractive characteristics of SNiPER have been used, such as dynamic loading, flexible flow control, multiple event management and Python binding. Furthermore, additional efforts have been made to make both detector and electronics simulation flexible enough to accommodate and optimize different detector designs. For the Geant4-based detector simulation, each sub-detector component is implemented as a SNiPER tool, which is a dynamically loadable and configurable plugin, so it is possible to select the detector configuration at runtime. The framework provides the event loop to drive the detector simulation and interacts with Geant4, which is implemented as a passive service. All levels of user actions are wrapped into different customizable tools, so that user functions can be easily extended by just adding new tools. The electronics simulation has been implemented following an event-driven scheme. The SNiPER task component is used to simulate data processing steps in the electronics modules. The electronics and trigger are synchronized by triggered events containing possible physics signals. The JUNO simulation software has been released and is being used by the JUNO collaboration for detector design optimization, event reconstruction algorithm development and physics sensitivity studies.
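
    The dynamically loadable, configurable "tool" pattern described above can be sketched with a simple plugin registry; this mimics the idea only and is not the SNiPER API.

      # Toy plugin registry: components selectable and configurable at runtime.
      REGISTRY = {}

      def register(name):
          """Class decorator that makes a component loadable by name."""
          def deco(cls):
              REGISTRY[name] = cls
              return cls
          return deco

      @register("CentralDetector")
      class CentralDetector:
          def __init__(self, **cfg):
              self.cfg = cfg
          def construct(self):
              print("building central detector with", self.cfg)

      def load_tool(name, **cfg):
          # Components in other modules could be pulled in first with
          # importlib.import_module(...) so their @register calls run.
          return REGISTRY[name](**cfg)

      load_tool("CentralDetector", pmt_count=17612).construct()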

  11. TU-EF-204-07: Add Tube Current Modulation to a Low Dose Simulation Tool for CT Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Y.; Department of Physics, University of Arizona, Tucson, AZ; Wen, G.

    2015-06-15

    Purpose: We extended the capabilities of a low dose simulation tool to model Tube-Current Modulation (TCM). TCM is widely used in clinical practice to reduce radiation dose in CT scans. We expect the tool to be valuable for various clinical applications (e.g., optimizing protocols, comparing reconstruction techniques and evaluating TCM methods). Methods: The tube current is input as a function of z location, instead of a fixed value. Starting from the line integrals of a scan, a new Poisson noise realization at a lower dose is generated for each view. To validate the new functionality, we compared simulated scans with real scans in image space. Results: First we assessed noise in the difference between the low-dose simulations and the original high-dose scan. When the simulated tube current is a step function of z location, the noise at each segment matches the noise of 3 separate constant-tube-current simulations. Secondly, with a phantom that forces TCM, we compared a low-dose simulation with an equivalent real low-dose scan. The mean CT numbers of the simulated scan and the real low-dose scan were 137.7±0.6 and 137.8±0.5 respectively. Furthermore, with 240 ROIs, the noise of the simulated scan and the real low-dose scan was 24.03±0.45 and 23.99±0.43 respectively, and they were not statistically different (2-sample t-test, p-value=0.28). The facts that the noise reflected the trend of the TCM curve, and that the absolute noise measurements were not statistically different, validated the TCM function. Conclusion: We successfully added tube-current modulation functionality to an existing low dose simulation tool. We demonstrated that the noise reflected an input tube-current modulation curve. In addition, we verified that the noise and mean CT number of our simulation agreed with a real low dose scan. The authors are all employees of Philips. Yijun Ding is also supported by NIBIB P41EB002035 and NIBIB R01EB000803.
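
    The per-view noise insertion described in Methods can be sketched as follows: each view's incident photon count is scaled by the simulated tube current before drawing a new Poisson realization. The counts and TCM curve below are illustrative numbers only.

      # Sketch of low-dose simulation with tube-current modulation (TCM).
      import numpy as np

      def simulate_low_dose(sinogram, i0_high, ma_ratio_per_view, rng=None):
          """sinogram: (views, detectors) line integrals; ma_ratio_per_view:
          simulated mA / original mA for each view (tracks z location)."""
          rng = rng or np.random.default_rng()
          out = np.empty_like(sinogram, dtype=float)
          for view in range(sinogram.shape[0]):
              i0 = i0_high * ma_ratio_per_view[view]      # modulated flux
              counts = rng.poisson(i0 * np.exp(-sinogram[view]))
              counts = np.maximum(counts, 1)              # avoid log(0)
              out[view] = -np.log(counts / i0)            # back to line integrals
          return out

      sino = np.full((360, 512), 2.0)                     # toy object
      tcm = 0.2 + 0.1 * np.sin(np.linspace(0, 4 * np.pi, 360))
      low = simulate_low_dose(sino, i0_high=1e5, ma_ratio_per_view=tcm)
      print(low.mean(), low.std())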

  12. Aortic dissection simulation models for clinical support: fluid-structure interaction vs. rigid wall models.

    PubMed

    Alimohammadi, Mona; Sherwood, Joseph M; Karimpour, Morad; Agu, Obiekezie; Balabani, Stavroula; Díaz-Zuccarini, Vanessa

    2015-04-15

    The management and prognosis of aortic dissection (AD) is often challenging and the use of personalised computational models is being explored as a tool to improve clinical outcome. Including vessel wall motion in such simulations can provide more realistic and potentially accurate results, but requires significant additional computational resources, as well as expertise. With clinical translation as the final aim, trade-offs between complexity, speed and accuracy are inevitable. The present study explores whether modelling wall motion is worth the additional expense in the case of AD, by carrying out fluid-structure interaction (FSI) simulations based on a sample patient case. Patient-specific anatomical details were extracted from computed tomography images to provide the fluid domain, from which the vessel wall was extrapolated. Two-way fluid-structure interaction simulations were performed, with coupled Windkessel boundary conditions and hyperelastic wall properties. The blood was modelled using the Carreau-Yasuda viscosity model and turbulence was accounted for via a shear stress transport model. A simulation without wall motion (rigid wall) was carried out for comparison purposes. The displacement of the vessel wall was comparable to reports from imaging studies in terms of intimal flap motion and contraction of the true lumen. Analysis of the haemodynamics around the proximal and distal false lumen in the FSI model showed complex flow structures caused by the expansion and contraction of the vessel wall. These flow patterns led to significantly different predictions of wall shear stress, particularly its oscillatory component, which were not captured by the rigid wall model. Through comparison with imaging data, the results of the present study indicate that the fluid-structure interaction methodology employed herein is appropriate for simulations of aortic dissection. Regions of high wall shear stress were not significantly altered by the wall motion, however, certain collocated regions of low and oscillatory wall shear stress which may be critical for disease progression were only identified in the FSI simulation. We conclude that, if patient-tailored simulations of aortic dissection are to be used as an interventional planning tool, then the additional complexity, expertise and computational expense required to model wall motion is indeed justified.
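
    For reference, the Carreau-Yasuda model mentioned above gives viscosity as mu = mu_inf + (mu0 - mu_inf)[1 + (lambda*gamma)^a]^((n-1)/a). The sketch below uses parameter values commonly quoted for blood, which are assumptions and not taken from the paper.

      # Carreau-Yasuda viscosity, with commonly quoted blood parameters.
      import numpy as np

      def carreau_yasuda(shear_rate, mu0=0.056, mu_inf=0.00345,
                         lam=3.313, a=2.0, n=0.3568):
          """Viscosity (Pa*s) as a function of shear rate (1/s)."""
          g = np.asarray(shear_rate, dtype=float)
          return mu_inf + (mu0 - mu_inf) * (1 + (lam * g) ** a) ** ((n - 1) / a)

      for g in (0.1, 1.0, 10.0, 100.0):
          print(f"shear {g:6.1f} 1/s -> {carreau_yasuda(g):.5f} Pa*s")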

  13. DspaceOgre 3D Graphics Visualization Tool

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Myin, Steven; Pomerantz, Marc I.

    2011-01-01

    This general-purpose 3D graphics visualization C++ tool is designed for visualization of simulation and analysis data for articulated mechanisms. Examples of such systems are vehicles, robotic arms, biomechanics models, and biomolecular structures. DspaceOgre builds upon the open-source Ogre3D graphics visualization library. It provides additional classes to support the management of complex scenes involving multiple viewpoints and different scene groups, and can be used as a remote graphics server. The software supports adding programs at the graphics processing unit (GPU) level for improved performance, and it improves upon the messaging interface it exposes for use as a visualization server.

  14. InterSpread Plus: a spatial and stochastic simulation model of disease in animal populations.

    PubMed

    Stevenson, M A; Sanson, R L; Stern, M W; O'Leary, B D; Sujau, M; Moles-Benfell, N; Morris, R S

    2013-04-01

    We describe the spatially explicit, stochastic simulation model of disease spread, InterSpread Plus, in terms of its epidemiological framework, operation, and mode of use. The input data required by the model, the method for simulating contact and infection spread, and methods for simulating disease control measures are described. Data and parameters that are essential for disease simulation modelling using InterSpread Plus are distinguished from those that are non-essential, and it is suggested that a rational approach to simulating disease epidemics using this tool is to start with core data and parameters, adding further layers of complexity if and when the specific requirements of the simulation exercise require it. We recommend that simulation models of disease are best developed as part of epidemic contingency planning, so that decision makers are familiar with model outputs and assumptions and are well positioned to evaluate their strengths and weaknesses to make informed decisions in times of crisis. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. An Observing System Simulation Experiment Approach to Meteorological Network Assessment

    NASA Astrophysics Data System (ADS)

    Abbasnezhadi, K.; Rasmussen, P. F.; Stadnyk, T.; Boluwade, A.

    2016-12-01

    Proper knowledge of the spatiotemporal distribution of rainfall is important for a mindful investigation of water movement and storage throughout a catchment. Currently, the most accurate precipitation information available for the remote boreal ecozones of northern Manitoba comes from the Canadian Precipitation Analysis (CaPA) data assimilation system. Throughout the Churchill River Basin (CRB), CaPA still lacks adequate skill due to the limited number of weather stations. A new approach to experimental network design was investigated based on the concept of an Observing System Simulation Experiment (OSSE). The OSSE-based network assessment procedure, which simulates the CaPA system, provides a scientific and hydrologically meaningful tool to assess the sensitivity of the CaPA precipitation analysis to observation network density throughout the CRB. To simulate the CaPA system, synthetic background and station data were generated by adding spatially uncorrelated and correlated Gaussian noise, respectively, to an assumed-true daily weather field synthesized by a gridded precipitation generator that emulates CaPA data. Given the true reference field on one hand, and a set of pseudo-CaPA analyses associated with different network realizations on the other, a WATFLOOD hydrological model was employed to compare the modeled runoff. The simulations showed that as network density increases, the accuracy of CaPA precipitation products improves up to a certain limit, beyond which adding more stations to the network does not further improve accuracy.
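
    The two synthetic error fields described above can be generated as in the sketch below: spatially uncorrelated noise for station errors, and spatially correlated noise (here an exponential covariance factored by Cholesky) for the background field. The grid size and correlation length are arbitrary choices.

      # Uncorrelated vs. spatially correlated Gaussian noise fields.
      import numpy as np

      rng = np.random.default_rng(42)
      n, L = 20, 5.0                       # n x n grid, correlation length L
      xy = np.array([(i, j) for i in range(n) for j in range(n)], float)

      dist = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
      cov = np.exp(-dist / L)              # exponential covariance
      A = np.linalg.cholesky(cov + 1e-10 * np.eye(n * n))

      truth = np.zeros(n * n)              # stand-in for the "true" field
      background = truth + A @ rng.standard_normal(n * n)   # correlated
      stations = truth + rng.standard_normal(n * n)         # uncorrelated
      print(background.reshape(n, n).std(), stations.std())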

  16. Next-generation Event Horizon Telescope developments: new stations for enhanced imaging

    NASA Astrophysics Data System (ADS)

    Palumbo, Daniel; Johnson, Michael; Doeleman, Sheperd; Chael, Andrew; Bouman, Katherine

    2018-01-01

    The Event Horizon Telescope (EHT) is a multinational Very Long Baseline Interferometry (VLBI) network of dishes joined to resolve general relativistic behavior near a supermassive black hole. The imaging quality of the EHT is largely dependent upon the sensitivity and spatial frequency coverage of the many baselines between its constituent telescopes. The EHT already contains many highly sensitive dishes, including the crucial Atacama Large Millimeter/Submillimeter Array (ALMA), making it viable to add smaller, cheaper telescopes to the array, greatly improving future capabilities of the EHT. We develop tools for optimizing the positions of new dishes in planned arrays. We also explore the feasibility of adding small orbiting dishes to the EHT, and develop orbital optimization tools for space-based VLBI imaging. Unlike the Millimetron mission planned to be at L2, we specifically treat near-earth orbiters, and find rapid filling of spatial frequency coverage across a large range of baseline lengths. Finally, we demonstrate significant improvement in image quality when adding small dishes to planned arrays in simulated observations.

  17. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base which was used primarily for safety loads prediction, design of the closed loop compensator and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.

  18. Matlab Geochemistry: An open source geochemistry solver based on MRST

    NASA Astrophysics Data System (ADS)

    McNeece, C. J.; Raynaud, X.; Nilsen, H.; Hesse, M. A.

    2017-12-01

    The study of geological systems often requires the solution of complex geochemical relations. To address this need we present an open source geochemical solver based on the Matlab Reservoir Simulation Toolbox (MRST) developed by SINTEF. The implementation supports non-isothermal multicomponent aqueous complexation, surface complexation, ion exchange, and dissolution/precipitation reactions. The suite of tools available in MRST allows for rapid model development, in particular the incorporation of geochemical calculations into transport simulations with multiple phases, complex domain geometry and geomechanics. Different numerical schemes and additional physics can be easily incorporated into the existing tools through the object-oriented framework employed by MRST. The solver leverages the automatic differentiation tools available in MRST to solve arbitrarily complex geochemical systems with any choice of species or element concentration as input. Four mathematical approaches make the solver robust: 1) the choice of chemical elements as the basis components makes all entries in the composition matrix positive, thus preserving convexity; 2) a log variable transformation is used, which transfers the nonlinearity to the convex composition matrix; 3) a priori bounds on variables are calculated from the structure of the problem, constraining Newton's path; and 4) an initial guess is calculated implicitly by sequentially adding model complexity. As a benchmark we compare the model to experimental and semi-analytic solutions of the coupled salinity-acidity transport system. Together with the reservoir simulation capabilities of MRST, the solver offers a promising tool for geochemical simulations in reservoir domains for applications in fields ranging from enhanced oil recovery to radionuclide storage.

  19. A method for obtaining a statistically stationary turbulent free shear flow

    NASA Technical Reports Server (NTRS)

    Timson, Stephen F.; Lele, S. K.; Moser, R. D.

    1994-01-01

    The long-term goal of the current research is the study of Large-Eddy Simulation (LES) as a tool for aeroacoustics. New algorithms and developments in computer hardware are making possible a new generation of tools for aeroacoustic predictions, which rely on the physics of the flow rather than empirical knowledge. LES, in conjunction with an acoustic analogy, holds the promise of predicting the statistics of noise radiated to the far-field of a turbulent flow. LES's predictive ability will be tested through extensive comparison of acoustic predictions based on a Direct Numerical Simulation (DNS) and an LES of the same flow, as well as a priori testing of DNS results. The method presented here is aimed at allowing simulation of a turbulent flow field that is both simple and amenable to acoustic predictions. A free shear flow that is homogeneous in both the streamwise and spanwise directions and statistically stationary will be simulated using equations based on the Navier-Stokes equations with a small number of added terms. Studying a free shear flow eliminates the need to consider flow-surface interactions as an acoustic source. The homogeneous directions and the flow's statistically stationary nature greatly simplify the application of an acoustic analogy.

  20. Soapy: an adaptive optics simulation written purely in Python for rapid concept development

    NASA Astrophysics Data System (ADS)

    Reeves, Andrew

    2016-07-01

    Soapy is a newly developed Adaptive Optics (AO) simulation which aims to be a flexible and fast-to-use tool-kit for many applications in the field of AO. It is written purely in the Python language, adding to and taking advantage of the already rich ecosystem of scientific libraries and programs. The simulation has been designed to be extremely modular, such that each component can be used stand-alone for projects which do not require a full end-to-end simulation. Ease of use, modularity and code clarity have been prioritised at the expense of computational performance. Though this means the code is not yet suitable for large studies of Extremely Large Telescope AO systems, it is well suited to education, exploration of new AO concepts and investigations of current generation telescopes.
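
    A minimal end-to-end run looks like the sketch below, with method names taken from the project's quick-start documentation (treat them as assumptions and check the current docs):

      # Run one of the example configurations shipped with Soapy.
      import soapy

      sim = soapy.Sim("sh_8x8.yaml")   # load a configuration file
      sim.aoinit()                     # initialise all simulation modules
      sim.makeIMat()                   # measure the interaction matrix
      sim.aoloop()                     # run the closed AO loop
      # Performance metrics (e.g. Strehl ratio) are stored on the sim
      # object and written to the configured output directory.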

  1. The development of a stochastic mathematical model of Alzheimer’s disease to help improve the design of clinical trials of potential treatments

    PubMed Central

    Ower, Alison K.; de Wolf, Frank; Anderson, Roy M.

    2018-01-01

    Alzheimer’s disease (AD) is a neurodegenerative disorder characterised by a slow progressive deterioration of cognitive capacity. Drugs are urgently needed for the treatment of AD and unfortunately almost all clinical trials of AD drug candidates have failed or been discontinued to date. Mathematical, computational and statistical tools can be employed in the construction of clinical trial simulators to assist in the improvement of trial design and enhance the chances of success of potential new therapies. Based on the analysis of a set of clinical data provided by the Alzheimer's Disease Neuroimaging Initiative (ADNI) we developed a simple stochastic mathematical model to simulate the development and progression of Alzheimer’s in a longitudinal cohort study. We show how this modelling framework could be used to assess the effect and the chances of success of hypothetical treatments that are administered at different stages and delay disease development. We demonstrate that the detection of the true efficacy of an AD treatment can be very challenging, even if the treatment is highly effective. An important reason behind the inability to detect signals of efficacy in a clinical trial in this therapy area could be the high between- and within-individual variability in the measurement of diagnostic markers and endpoints, which consequently results in the misdiagnosis of an individual’s disease state. PMID:29377891

  2. The development of a stochastic mathematical model of Alzheimer's disease to help improve the design of clinical trials of potential treatments.

    PubMed

    Hadjichrysanthou, Christoforos; Ower, Alison K; de Wolf, Frank; Anderson, Roy M

    2018-01-01

    Alzheimer's disease (AD) is a neurodegenerative disorder characterised by a slow progressive deterioration of cognitive capacity. Drugs are urgently needed for the treatment of AD and unfortunately almost all clinical trials of AD drug candidates have failed or been discontinued to date. Mathematical, computational and statistical tools can be employed in the construction of clinical trial simulators to assist in the improvement of trial design and enhance the chances of success of potential new therapies. Based on the analysis of a set of clinical data provided by the Alzheimer's Disease Neuroimaging Initiative (ADNI) we developed a simple stochastic mathematical model to simulate the development and progression of Alzheimer's in a longitudinal cohort study. We show how this modelling framework could be used to assess the effect and the chances of success of hypothetical treatments that are administered at different stages and delay disease development. We demonstrate that the detection of the true efficacy of an AD treatment can be very challenging, even if the treatment is highly effective. An important reason behind the inability to detect signals of efficacy in a clinical trial in this therapy area could be the high between- and within-individual variability in the measurement of diagnostic markers and endpoints, which consequently results in the misdiagnosis of an individual's disease state.
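
    The misdiagnosis mechanism can be illustrated with a minimal sketch: individuals progress through disease stages via a yearly Markov step, a treatment scales the progression probability, and a noisy endpoint blurs the separation between arms. All probabilities and the noise level below are invented for illustration.

      # Toy stochastic progression model with a noisy trial endpoint.
      import numpy as np

      rng = np.random.default_rng(1)
      P_PROGRESS = 0.25                    # yearly chance to worsen a stage

      def simulate_arm(n, years, effect=1.0, noise=0.8):
          state = np.zeros(n)              # 0 = healthy, higher = worse
          for _ in range(years):
              state += rng.random(n) < P_PROGRESS * effect
          return state + rng.normal(0, noise, n)   # noisy measurement

      placebo = simulate_arm(500, years=3)
      treated = simulate_arm(500, years=3, effect=0.6)   # 40% slowing
      # Even a strong true effect is partly masked by measurement noise:
      print(placebo.mean() - treated.mean(), placebo.std())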

  3. Fast computation of derivative based sensitivities of PSHA models via algorithmic differentiation

    NASA Astrophysics Data System (ADS)

    Leövey, Hernan; Molkenthin, Christian; Scherbaum, Frank; Griewank, Andreas; Kuehn, Nicolas; Stafford, Peter

    2015-04-01

    Probabilistic seismic hazard analysis (PSHA) is the preferred tool for estimating the potential ground-shaking hazard due to future earthquakes at a site of interest. A modern PSHA represents a complex framework which combines different models with possibly many inputs. Sensitivity analysis is a valuable tool for quantifying changes of a model output as inputs are perturbed, identifying critical input parameters and obtaining insight into the model behavior. Differential sensitivity analysis relies on calculating first-order partial derivatives of the model output with respect to its inputs. Moreover, derivative-based global sensitivity measures (Sobol' & Kucherenko '09) can be used in practice to detect non-essential inputs of the models, thus restricting the focus of attention to a possibly much smaller set of inputs. Nevertheless, obtaining first-order partial derivatives of complex models with traditional approaches can be very challenging, and the computational cost usually increases linearly with the number of inputs appearing in the models. In this study we show how Algorithmic Differentiation (AD) tools can be used in a complex framework such as PSHA to successfully estimate derivative-based sensitivities, as is done in various other domains such as meteorology or aerodynamics, with no significant increase in the computational complexity relative to the original computations. First we demonstrate the feasibility of the AD methodology by comparing AD-derived sensitivities to analytically derived sensitivities for a basic PSHA case using a simple ground-motion prediction equation. In a second step, we derive sensitivities via AD for a more complex PSHA study using a ground motion attenuation relation based on a stochastic method to simulate strong motion. The presented approach is general enough to accommodate more advanced PSHA studies of higher complexity.
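
    As a toy illustration of the approach, the sketch below differentiates a simple ground-motion relation with JAX; the functional form and coefficients are invented, and the study applied AD tools to its own PSHA codes rather than this example.

      # AD-based sensitivities of a toy ground-motion prediction equation.
      import jax
      import jax.numpy as jnp

      def ln_gm(params):
          m, r = params                            # magnitude, distance (km)
          return 1.5 + 0.9 * m - 1.2 * jnp.log(r + 10.0)

      grad_ln_gm = jax.grad(ln_gm)
      print(grad_ln_gm(jnp.array([6.0, 30.0])))
      # -> d(lnGM)/dM and d(lnGM)/dR at the evaluation point, at roughly
      #    the cost of one extra function evaluation.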

  4. In vivo assessment of the effect of taxifolin glycoside on atopic dermatitis-like skin lesions using biomedical tools in NC/Nga mice.

    PubMed

    Kim, J Y; Lee, O S; Ha, S; Kim, J H; Park, G; Kim, J K; Oh, C H

    2015-07-01

    Noninvasive methods of assessment are widely used in clinical trials. However, such methods have not been established in atopic dermatitis (AD), which is a chronic inflammatory skin disease. To demonstrate, using biomedical tools, the benefits of a new substance, taxifolin glycoside (TAX), in an AD model, the NC/Nga mouse. We evaluated the efficacy of topical TAX for AD by measuring clinical skin severity score, cytokine expression and serum IgE level, and by using biomedical measures (vapometry and corneometry). Topical TAX was applied to AD-induced NC/Nga mice for 3 weeks. The anti-inflammatory effects of this compound were demonstrated noninvasively using biomedical tools and immunological assays. Our method of AD assessment using biomedical tools is more objective and accurate than visual inspection. The results obtained using the biomedical tools were identical to those obtained using immunological assays. In vivo biomedical tools are useful for diagnosing and monitoring treatment effects in AD. © 2014 British Association of Dermatologists.

  5. OSI Network-layer Abstraction: Analysis of Simulation Dynamics and Performance Indicators

    NASA Astrophysics Data System (ADS)

    Lawniczak, Anna T.; Gerisch, Alf; Di Stefano, Bruno

    2005-06-01

    The Open Systems Interconnection (OSI) reference model provides a conceptual framework for communication among computers in a data communication network. The Network Layer of this model is responsible for the routing and forwarding of packets of data. We investigate the OSI Network Layer and develop an abstraction suitable for the study of various network performance indicators, e.g. throughput, average packet delay, average packet speed, average packet path-length, etc. We investigate how the network dynamics and the network performance indicators are affected by various routing algorithms and by the addition of randomly generated links into a regular network connection topology of fixed size. We observe that the network dynamics is not simply the sum of effects resulting from adding individual links to the connection topology but rather is governed nonlinearly by the complex interactions caused by the existence of all randomly added and already existing links in the network. Data for our study was gathered using Netzwerk-1, a C++ simulation tool that we developed for our abstraction.
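
    As a rough illustration of how such indicators are accumulated in a packet-level simulation (this is not Netzwerk-1, whose internals are not given in the record), a single-router toy model in Python:

        import random
        from collections import deque

        def simulate(steps=10_000, p_arrival=0.3, seed=0):
            """One router with a FIFO buffer forwarding one packet per time step;
            tracks two of the indicators named above: throughput and average delay."""
            rng, queue = random.Random(seed), deque()
            delivered, total_delay = 0, 0
            for t in range(steps):
                if rng.random() < p_arrival:   # a packet is created and timestamped
                    queue.append(t)
                if queue:                      # forward the oldest packet
                    total_delay += t - queue.popleft()
                    delivered += 1
            return delivered / steps, total_delay / max(delivered, 1)

        print(simulate())  # (throughput, average packet delay)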

  6. Automatic differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
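
    The reverse mode mentioned here can be sketched with a small "tape": record each operation's local derivatives, then sweep backwards once to obtain all input sensitivities. This is illustrative Python, not either of the tools applied in the paper:

        class Var:
            """Reverse-mode AD node: a value plus (parent, local-derivative) edges."""
            def __init__(self, val, parents=()):
                self.val, self.parents, self.grad = val, parents, 0.0
            def __add__(self, other):
                return Var(self.val + other.val, [(self, 1.0), (other, 1.0)])
            def __mul__(self, other):
                return Var(self.val * other.val, [(self, other.val), (other, self.val)])

        def backward(out):
            """One reverse sweep: chain rule applied in reverse topological order."""
            topo, seen = [], set()
            def visit(v):
                if id(v) not in seen:
                    seen.add(id(v))
                    for p, _ in v.parents:
                        visit(p)
                    topo.append(v)
            visit(out)
            out.grad = 1.0
            for v in reversed(topo):
                for p, local in v.parents:
                    p.grad += local * v.grad

        x, y = Var(3.0), Var(4.0)
        f = x * y + x * x       # f = x*y + x^2
        backward(f)
        print(x.grad, y.grad)   # df/dx = y + 2x = 10.0, df/dy = x = 3.0

    Reverse mode obtains every input derivative for roughly the cost of one extra function evaluation, at the price of storing the tape; this is the storage caveat the abstract raises.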

  7. Application Of Moldex3D For Thin-wall Injection Moulding Simulation

    NASA Astrophysics Data System (ADS)

    Šercer, Mladen; Godec, Damir; Bujanić, Božo

    2007-05-01

    The benefits associated with decreasing wall thicknesses below their current values are still measurable and desired even if the final wall thickness is nowhere near that of the aggressive portable electronics industry. It is important to note that gains in wall section reduction do not always occur without investment, in this case in tooling and machinery upgrades. Equally important is the fact that productivity and performance benefits of reduced material usage, fast cycle times, and lighter weight can often outweigh most of the added costs. In order to eliminate unnecessary mould trials, minimize the product development cycle, reduce overall costs and improve product quality, polymer engineers use new CAE (Computer Aided Engineering) technology. This technology is a simulation tool which combines proven theories, material properties and process conditions to generate realistic simulations and produce valuable recommendations. Based on these recommendations, an optimal combination of product design, material and process conditions can be identified. In this work, Moldex3D software was used for simulation of injection moulding in order to avoid potential moulding problems. The results gained from the simulation were used for the optimization of an existing product design, for mould development and for optimization of processing parameters, e.g. injection pressure, mould cavity temperature, etc.

  8. Direct digital conversion detector technology

    NASA Astrophysics Data System (ADS)

    Mandl, William J.; Fedors, Richard

    1995-06-01

    Future imaging sensors for the aerospace and commercial video markets will depend on low cost, high speed analog-to-digital (A/D) conversion to efficiently process optical detector signals. Current A/D methods place a heavy burden on system resources, increase noise, and limit the throughput. This paper describes a unique method for incorporating A/D conversion right on the focal plane array. This concept is based on Sigma-Delta sampling, and makes optimum use of the active detector real estate. Combined with modern digital signal processors, such devices will significantly increase data rates off the focal plane. Early conversion to digital format will also decrease the signal susceptibility to noise, lowering the communications bit error rate. Computer modeling of this concept is described, along with results from several simulation runs. A potential application for direct digital conversion is also reviewed. Future uses for this technology could range from scientific instruments to remote sensors, telecommunications gear, medical diagnostic tools, and consumer products.
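
    The sampling principle named here, a first-order sigma-delta modulator, can be sketched in a few lines of Python; the amplitudes and oversampling ratio are illustrative, and the real focal-plane circuit is analog rather than this floating-point idealization:

        import math

        def sigma_delta(samples, full_scale=1.0):
            """First-order sigma-delta: integrate the error between the input and
            the fed-back 1-bit output; the comparator emits a +/-full_scale stream."""
            integrator, feedback, bits = 0.0, 0.0, []
            for x in samples:
                integrator += x - feedback
                out = full_scale if integrator >= 0.0 else -full_scale
                bits.append(out)
                feedback = out
            return bits

        N, OSR = 4096, 64
        signal = [0.5 * math.sin(2 * math.pi * 4 * n / N) for n in range(N)]
        bits = sigma_delta(signal)
        # Crude decimation filter: average each block of OSR bits back to a sample
        recovered = [sum(bits[i:i + OSR]) / OSR for i in range(0, N, OSR)]
        print([round(v, 2) for v in recovered[:8]])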

  9. Operational Characteristics Identification and Simulation Model Verification for Incheon International Airport

    NASA Technical Reports Server (NTRS)

    Eun, Yeonju; Jeon, Daekeun; Lee, Hanbong; Zhu, Zhifan; Jung, Yoon C.; Jeong, Myeongsook; Kim, Hyounkyong; Oh, Eunmi; Hong, Sungkwon; Lee, Junwon

    2016-01-01

    Incheon International Airport (ICN) is one of the hub airports in East Asia. Airport operations at ICN have been growing more than 5 percent per year over the past five years. According to the current airport expansion plan, a new passenger terminal will be added and the current cargo ramp will be expanded in 2018. This expansion project will bring 77 new stands without adding a new runway to the airport. Due to such continuous growth in airport operations and future expansion of the ramps, it is highly likely that airport surface traffic will experience more congestion and, therefore, suffer from efficiency degradation. There is a growing awareness in the aviation research community of the need for strategic and tactical surface scheduling capabilities for efficient airport surface operations. Specific to ICN airport operations, the need for A-CDM (Airport Collaborative Decision Making) or S-CDM (Surface Collaborative Decision Making) and for controller decision support tools for efficient air traffic management emerged several years ago. In the United States, independent research efforts have been made by academia, industry, and government research organizations to enhance the efficiency and predictability of surface operations at busy airports. Among these research activities, the Spot and Runway Departure Advisor (SARDA) developed and tested by the National Aeronautics and Space Administration (NASA) is a decision support tool that provides tactical advisories to controllers for efficient surface operations. The effectiveness of the SARDA concept was successfully verified through human-in-the-loop (HITL) simulations, both of spot release and runway operations advisories for ATC Tower controllers of Dallas-Fort Worth International Airport (DFW) in 2010 and 2012, and of gate pushback advisories for the ramp controller of Charlotte-Douglas International Airport (CLT) in 2014. The SARDA concept for tactical surface scheduling has been further enhanced and is being integrated into NASA's Airspace Technology Demonstration-2 (ATD-2) project for technology demonstration of Integrated Arrival-Departure-Surface (IADS) operations at CLT. This study is part of an international research collaboration between KAIA (Korea Agency for Infrastructure Technology Advancement), KARI (Korea Aerospace Research Institute) and NASA, which is being conducted to validate the effectiveness of the SARDA concept as a controller decision support tool for departure and surface management at ICN. This paper presents the preliminary results of the collaboration effort. It includes investigation of the operational environment of ICN, data analysis for identification of the operational characteristics of the airport, and construction and verification of an airport simulation model using the Surface Operations Simulator and Scheduler (SOSS), NASA's fast-time simulation tool.

  10. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magoulas, Virginia; Cercy, Michael; Hall, Irin

    2013-07-01

    Savannah River National Laboratory (SRNL) has developed a strategic planning tool for the evaluation of the utilization of its unique resources for processing and research and development of nuclear materials. The Planning Tool is a strategic level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor intensive and offered either a limited or cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next generation Planning Tool. The goal of this collaboration was to create a simulation based tool that allows for quick evaluation of strategies with respect to new or changing missions, and clearly communicates results to the decision makers. This tool has been built upon a mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risks and opportunities with respect to capacity. The Planning Tool that has been developed is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also allows the area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)

  11. Sensor Based Framework for Secure Multimedia Communication in VANET

    PubMed Central

    Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon

    2010-01-01

    Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and danger situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462

  12. Adding Four- Dimensional Data Assimilation (aka grid ...

    EPA Pesticide Factsheets

    Adding four-dimensional data assimilation (a.k.a. grid nudging) to MPAS. The U.S. Environmental Protection Agency is investigating the use of MPAS as the meteorological driver for its next-generation air quality model. To function as such, MPAS needs to operate in a diagnostic mode in much the same manner as the current meteorological driver, the Weather Research and Forecasting (WRF) model. The WRF operates in diagnostic mode using Four-Dimensional Data Assimilation, also known as "grid nudging". MPAS version 4.0 has been modified with the addition of an FDDA routine to the standard physics drivers to nudge the state variables for wind, temperature and water vapor towards MPAS initialization fields defined at 6-hour intervals from GFS-derived data. The results to be shown demonstrate the ability to constrain MPAS simulations to known historical conditions and thus provide the U.S. EPA with a practical meteorological driver for global-scale air quality simulations. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use bo
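
    Grid nudging amounts to adding a Newtonian relaxation term to the tendency of each nudged state variable, dX/dt = F(X) + (X_analysis - X)/tau. The toy scalar integration below uses invented dynamics and an invented state; it is not MPAS code, though the 6-hour relaxation time scale echoes the analysis interval in the record:

        def physics(x):
            return -1.0e-5 * x  # stand-in for the model's own tendency

        def step_nudged(x, x_analysis, dt, tau=21_600.0):
            """One explicit step with an FDDA-style relaxation toward the analysis."""
            return x + dt * (physics(x) + (x_analysis - x) / tau)

        x, dt, analysis_value = 300.0, 600.0, 280.0  # temperature-like, illustrative
        for n in range(6 * 36):                      # 36 h of 10-minute steps
            x = step_nudged(x, analysis_value, dt)
        print(round(x, 2))                           # drawn toward the analysis value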

  13. Analysis of eddy currents induced by transverse and longitudinal gradient coils in different tungsten collimators geometries for SPECT/MRI integration.

    PubMed

    Samoudi, Amine M; Van Audenhaege, Karen; Vermeeren, Günter; Poole, Michael; Tanghe, Emmeric; Martens, Luc; Van Holen, Roel; Joseph, Wout

    2015-12-01

    We investigated the temporal variation of the induced magnetic field due to the transverse and the longitudinal gradient coils in tungsten collimators arranged in hexagonal and pentagonal geometries with and without gaps between the collimators. We modeled x-, y-, and z-gradient coils and different arrangements of single-photon emission computed tomography (SPECT) collimators using FEKO, a three-dimensional electromagnetic simulation tool. A time analysis approach was used to generate the pulsed magnetic field gradient. The approach was validated with measurements using a 7T MRI scanner. Simulations showed an induced magnetic field representing 4.66% and 0.87% of the applied gradient field (gradient strength = 500 mT/m) for longitudinal and transverse gradient coils, respectively. These values can be reduced by 75% by adding gaps between the collimators for the pentagonal arrangement, bringing the maximum induced magnetic field to less than 2% of the applied gradient for all of the gradient coils. Characterization of the maximum induced magnetic field shows that by adding gaps between the collimators in an integrated SPECT/MRI system, eddy currents can be corrected by the MRI system to avoid artifacts. The numerical model was validated and is proposed as a tool for studying the effect of a SPECT collimator within the MRI gradient coils. © 2014 Wiley Periodicals, Inc.

  14. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, source transformation tools appear to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continued use of AD tools for solving geophysical problems on modern computer architectures.

  15. Simulation loop between cad systems, GEANT-4 and GeoModel: Implementation and results

    NASA Astrophysics Data System (ADS)

    Sharmazanashvili, A.; Tsutskiridze, Niko

    2016-09-01

    Comparative analysis of simulated and as-built geometry descriptions of a detector is an important field of study for understanding data-vs-Monte-Carlo discrepancies. Consistency and detail of shapes are less critical, whereas the adequacy of the volumes and weights of detector components is essential for tracking. There are two main reasons for faults in simulation geometry descriptions: (1) differences between the simulated and as-built geometry descriptions; (2) internal inaccuracies in geometry transformations added by the simulation software infrastructure itself. The Georgian engineering team developed a hub on the basis of the CATIA platform, together with several tools enabling the different descriptions used by simulation packages to be read into CATIA: XML->CATIA; VP1->CATIA; GeoModel->CATIA; Geant4->CATIA. As a result, it becomes possible to compare the different descriptions with each other using the full power of CATIA and to investigate both classes of geometry-description faults. The paper presents results of case studies of the ATLAS Coils and End-Cap toroid structures.

  16. Integrated Display and Simulation for Automatic Dependent Surveillance-Broadcast and Traffic Collision Avoidance System Data Fusion.

    PubMed

    Wang, Yanran; Xiao, Gang; Dai, Zhouyun

    2017-11-13

    Automatic Dependent Surveillance-Broadcast (ADS-B) is the direction of airspace surveillance development. Research analyzing the benefits of Traffic Collision Avoidance System (TCAS) and ADS-B data fusion is scarce. This paper proposes an ADS-B minimum system comprising ADS-B In and ADS-B Out. For ADS-B In, a fusion model with a variable sampling Variational Bayesian-Interacting Multiple Model (VSVB-IMM) algorithm is proposed for integrated display, and an airspace traffic situation display is developed using ADS-B information. ADS-B Out includes ADS-B Out transmission based on a simulator platform and an Unmanned Aerial Vehicle (UAV) platform. This paper describes the overall implementation of the ADS-B minimum system, including theoretical model design, experimental simulation verification, engineering implementation, and results analysis. Simulation and implementation results show that the fused system has better performance than each independent subsystem and can work well in engineering applications.

  17. Inherit Space

    NASA Technical Reports Server (NTRS)

    Giarratano, Joseph C.; Jenks, K. C.

    1997-01-01

    The objective of the proposed research was to begin development of a unique educational tool targeted at educating and inspiring young people 12-16 years old about NASA and the Space Program. Since these young people are the future engineers, scientists and space pioneers, the nurturing of their enthusiasm and interest is of critical importance to the Nation. This summer the basic infrastructure of the tool was developed in the context of an educational game paradigm. The game paradigm has achieved remarkable success in maintaining the interest of young people in a self-paced, student-directed learning environment. This type of environment encourages student exploration and curiosity which are exactly the traits that future space pioneers need to develop to prepare for the unexpected. The Inherit Space Educational Tool is an open-ended learning environment consisting of a finite-state machine classic adventure game paradigm. As the young person explores this world, different obstacles must be overcome. Rewards will be offered such as using the flight simulator to fly around and explore Titan. This simulator was modeled on conventional Earth flight simulators but has been considerably enhanced to add texture mapping of Titan's atmosphere utilizing the latest information from the NASA Galileo Space Probe. Additional scenery was added to provide color VGA graphics of a futuristic research station on Titan as well as an interesting story to keep the youngster's attention. This summer the game infrastructure has been developed as well as the Titan Flight Simulator. A number of other enhancements are planned.

  18. Using driving simulators to assess driving safety.

    PubMed

    Boyle, Linda Ng; Lee, John D

    2010-05-01

    Changes in drivers, vehicles, and roadways pose substantial challenges to the transportation safety community. Crash records and naturalistic driving data are useful for examining the influence of past or existing technology on drivers, and the associations between risk factors and crashes. However, they are limited because causation cannot be established and technology not yet installed in production vehicles cannot be assessed. Driving simulators have become an increasingly widespread tool to understand evolving and novel technologies. The ability to manipulate independent variables in a randomized, controlled setting also provides the added benefit of identifying causal links. This paper introduces a special issue on simulator-based safety studies. The special issue comprises 25 papers that demonstrate the use of driving simulators to address pressing transportation safety problems and includes topics as diverse as neurological dysfunction, work zone design, and driver distraction. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  19. C3 System Performance Simulation and User Manual. Getting Started: Guidelines for Users

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This document is a User's Manual describing the C3 Simulation capabilities. The subject work was designed to simulate the communications involved in the flight of a Remotely Operated Aircraft (ROA) using the Opnet software. Opnet provides a comprehensive development environment supporting the modeling of communication networks and distributed systems. It has tools for model design, simulation, data collection, and data analysis. Opnet models are hierarchical -- consisting of a project which contains node models which in turn contain process models. Nodes can be fixed, mobile, or satellite. Links between nodes can be physical or wireless. Communications are packet based. The model is very generic in its current form. Attributes such as frequency and bandwidth can easily be modified to better reflect a specific platform. The model is not fully developed at this stage -- there are still more enhancements to be added. Current issues are documented throughout this guide.

  20. An Opportunistic Wireless Charging System Design for an On-Demand Shuttle Service: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doubleday, Kate; Meintz, Andrew; Markel, Tony

    System right-sizing is critical to implementation of in-motion wireless power transfer (WPT) for electric vehicles. This study introduces a modeling tool, WPTSim, which uses one-second speed, location, and road grade data from an on-demand employee shuttle in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. Adding just one WPT location can more than halve the battery capacity needed. Many configurations are capable of being self-sustaining with WPT, while others benefit from supplemental stationary charging.

  1. Energy consumption program: A computer model simulating energy loads in buildings

    NASA Technical Reports Server (NTRS)

    Stoller, F. W.; Lansing, F. L.; Chai, V. W.; Higgins, S.

    1978-01-01

    The JPL energy consumption computer program, developed as a useful tool in the ongoing building modification studies of the DSN energy conservation project, is described. The program simulates building heating and cooling loads and computes thermal and electric energy consumption and cost. The accuracy of the computations is not sacrificed, however, since the results lie within a ±10 percent margin of those read from energy meters. The program is carefully structured to reduce both the user's time and running cost by requesting minimal information from the user and reducing many internal time-consuming computational loops. Many unique features were added to handle two-level electronics control rooms not found in any other program.

  2. A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community

    NASA Astrophysics Data System (ADS)

    Merchant, B. J.; Chael, E. P.; Young, C. J.

    2013-12-01

    Network simulations have long been used to assess the performance of monitoring networks in detecting events, for such purposes as planning station deployments and evaluating network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: it is coded in a modern, multi-platform language; it exploits modern computing capabilities (e.g. multi-core processors); it incorporates monitoring technologies other than seismic; and it includes a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first. Seismic location and infrasound and hydroacoustic detection plugins will follow. As an open-release package, NetMOD can hopefully provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.

  3. VLSI Design Tools, Reference Manual, Release 2.0.

    DTIC Science & Technology

    1984-08-01

    [Garbled OCR fragment: sections 2.3-2.5 of the tool index (ITACV, QUILT, TIL) are unrecoverable.] A patterns package was added so that complex and repetitive digital waveforms could be generated far more easily. The recently written program MTP (Multiple...) uses a circuit model to estimate timing delays through digital circuits. It also has a mode that allows it to be used as a switch (gate) level simulator.

  4. GPS test range mission planning

    NASA Astrophysics Data System (ADS)

    Roberts, Iris P.; Hancock, Thomas P.

    The principal features of the Test Range User Mission Planner (TRUMP), a PC-resident tool designed to aid in deploying and utilizing GPS-based test range assets, are reviewed. TRUMP features time history plots of time-space-position information (TSPI); performance based on a dynamic GPS/inertial system simulation; time history plots of TSPI data link connectivity; digital terrain elevation data maps with user-defined cultural features; and two-dimensional coverage plots of ground-based test range assets. Some functions to be added during the next development phase are discussed.

  5. Automatic Differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. In this paper, it is assessed as a tool for engineering design. The paper discusses the forward and reverse modes of AD, their computing requirements, and approaches to implementing AD. It continues with the application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation. The paper concludes with the observation that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available.

  6. H++ 3.0: automating pK prediction and the preparation of biomolecular structures for atomistic molecular modeling and simulations.

    PubMed

    Anandakrishnan, Ramu; Aguilar, Boris; Onufriev, Alexey V

    2012-07-01

    The accuracy of atomistic biomolecular modeling and simulation studies depends on the accuracy of the input structures. Preparing these structures for an atomistic modeling task, such as molecular dynamics (MD) simulation, can involve the use of a variety of different tools for: correcting errors, adding missing atoms, filling valences with hydrogens, predicting pK values for titratable amino acids, assigning predefined partial charges and radii to all atoms, and generating force field parameter/topology files for MD. Identifying, installing and effectively using the appropriate tools for each of these tasks can be difficult for novice users and time-consuming for experienced ones. H++ (http://biophysics.cs.vt.edu/) is a free open-source web server that automates the above key steps in the preparation of biomolecular structures for molecular modeling and simulations. H++ also performs extensive error and consistency checking, providing error/warning messages together with suggested corrections. In addition to numerous minor improvements, the latest version of H++ includes several new capabilities and options: fixing erroneous (flipped) side chain conformations for HIS, GLN and ASN, including a ligand in the input structure, processing nucleic acid structures, and generating a solvent box with a specified number of common ions for explicit solvent MD.
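
    Once a pK has been predicted, assigning the dominant protonation state at a given pH is the textbook Henderson-Hasselbalch step sketched below; H++'s actual contribution, the continuum-electrostatics pK calculation itself, is not attempted here:

        def protonated_fraction(pk, ph):
            """Henderson-Hasselbalch: fraction of the protonated (acid) form."""
            return 1.0 / (1.0 + 10.0 ** (ph - pk))

        # A histidine-like site (model pK ~ 6.0) at physiological pH 7.4:
        print(round(protonated_fraction(6.0, 7.4), 3))  # ~0.038: assign the neutral form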

  7. Integrated Display and Simulation for Automatic Dependent Surveillance–Broadcast and Traffic Collision Avoidance System Data Fusion

    PubMed Central

    Wang, Yanran; Xiao, Gang; Dai, Zhouyun

    2017-01-01

    Automatic Dependent Surveillance–Broadcast (ADS-B) is the direction of airspace surveillance development. Research analyzing the benefits of Traffic Collision Avoidance System (TCAS) and ADS-B data fusion is scarce. This paper proposes an ADS-B minimum system comprising ADS-B In and ADS-B Out. For ADS-B In, a fusion model with a variable sampling Variational Bayesian-Interacting Multiple Model (VSVB-IMM) algorithm is proposed for integrated display, and an airspace traffic situation display is developed using ADS-B information. ADS-B Out includes ADS-B Out transmission based on a simulator platform and an Unmanned Aerial Vehicle (UAV) platform. This paper describes the overall implementation of the ADS-B minimum system, including theoretical model design, experimental simulation verification, engineering implementation, and results analysis. Simulation and implementation results show that the fused system has better performance than each independent subsystem and can work well in engineering applications. PMID:29137194

  8. Mechanical problem-solving strategies in Alzheimer's disease and semantic dementia.

    PubMed

    Lesourd, Mathieu; Baumard, Josselin; Jarry, Christophe; Etcharry-Bouyx, Frédérique; Belliard, Serge; Moreaud, Olivier; Croisile, Bernard; Chauviré, Valérie; Granjon, Marine; Le Gall, Didier; Osiurak, François

    2016-07-01

    The goal of this study was to explore whether the tool-use disorders observed in Alzheimer's disease (AD) and semantic dementia (SD) are of the same nature as those observed in left brain-damaged (LBD) patients. Recent evidence indicates that LBD patients with apraxia of tool use encounter difficulties in solving mechanical problems, characterized by the absence of specific strategies. This pattern may indicate impaired mechanical knowledge, which is critical for both familiar and novel tool use. We therefore explored the strategies followed by AD and SD patients in mechanical problem-solving tasks in order to determine whether mechanical knowledge is also impaired in these patients. We used a mechanical problem-solving task in both choice (i.e., several tools were proposed) and no-choice (i.e., only 1 tool was proposed) conditions, and analyzed quantitative data and strategy profiles. AD patients, but not SD patients, had difficulties solving the mechanical problems. However, the key finding is that AD patients, despite their difficulties, showed strategy profiles similar to those of SD patients or controls. Moreover, AD patients exhibited a strategy profile distinct from the one previously observed in LBD patients. These observations lead us to consider that the difficulties AD patients have in solving mechanical problems, or even in using familiar tools, may not be caused by mechanical knowledge impairment per se. In broad terms, what we call apraxia of tool use in AD is certainly not the same as the apraxia of tool use observed in LBD patients. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, software modifications can be reduced or even eliminated when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.

  10. Barcoded pyrosequencing analysis of the microbial community in a simulator of the human gastrointestinal tract showed a colon region-specific microbiota modulation for two plant-derived polysaccharide blends.

    PubMed

    Marzorati, Massimo; Maignien, Lois; Verhelst, An; Luta, Gabriela; Sinnott, Robert; Kerckhof, Frederiek Maarten; Boon, Nico; Van de Wiele, Tom; Possemiers, Sam

    2013-02-01

    The combination of a Simulator of the Human Intestinal Microbial Ecosystem with ad hoc molecular techniques (i.e. pyrosequencing, denaturing gradient gel electrophoresis and quantitative PCR) allowed an evaluation of the extent to which two plant polysaccharide supplements could modify a complex gut microbial community. The presence of Aloe vera gel powder and algae extract in product B, as compared to the standard blend (product A), improved its fermentation along the entire simulated colon. The potentially extended effect of product B in the simulated distal colon, as compared to product A, was confirmed by: (i) the separate clustering of the samples before and after the treatment in the phylogenetic-based dendrogram and OTU-based PCoA plot only for product B; (ii) a higher richness estimator (+33 vs. -36 % for product A); and (iii) a higher dynamic parameter (21 vs. 13 %). These data show that the combination of well designed in vitro simulators with barcoded pyrosequencing is a powerful tool for characterizing changes occurring in the gut microbiota following a treatment. However, for the quantification of low-abundance species that are of interest because of their relationship to potential positive health effects (e.g. bifidobacteria or lactobacilli), conventional molecular ecological approaches, such as PCR-DGGE and qPCR, still remain very useful complementary tools.
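
    For concreteness, one widely used richness estimator, bias-corrected Chao1, computed from per-OTU read counts (the abstract does not say which estimator the study used):

        def chao1(otu_counts):
            """Chao1: observed richness plus an estimate of unseen OTUs from the
            numbers of singletons (f1) and doubletons (f2)."""
            counts = [c for c in otu_counts if c > 0]
            f1 = sum(1 for c in counts if c == 1)
            f2 = sum(1 for c in counts if c == 2)
            return len(counts) + f1 * (f1 - 1) / (2.0 * (f2 + 1))

        print(chao1([5, 1, 1, 2, 8, 1, 2, 40]))  # 8 OTUs observed -> estimate 9.0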

  11. Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.

    PubMed

    Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs.

  12. Securing Mobile Ad Hoc Networks Using Danger Theory-Based Artificial Immune Algorithm

    PubMed Central

    2015-01-01

    A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are one of the most dangerous attacks that aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack called the resource consumption attack (RCA) using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs. PMID:25946001
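
    A heavily simplified reduction of the dendritic cell algorithm underlying the MDCA is sketched below; the weights and migration threshold are invented for illustration, and the mobility extensions of the MDCA are not shown:

        def dc_signals(pamp, danger, safe):
            """Fuse the three DCA input signals into costimulation (the decision
            clock), semi-mature (benign) and mature (anomalous) evidence."""
            csm = 2.0 * pamp + 1.0 * danger + 2.0 * safe
            semi = 3.0 * safe
            mature = 2.0 * pamp + 1.0 * danger - 3.0 * safe
            return csm, semi, mature

        def classify(stream, migration_threshold=15.0):
            """Accumulate evidence per observed event until the cell 'migrates'."""
            total_csm = total_semi = total_mature = 0.0
            for pamp, danger, safe in stream:
                c, s, m = dc_signals(pamp, danger, safe)
                total_csm += c
                total_semi += s
                total_mature += m
                if total_csm >= migration_threshold:
                    break
            return "attack" if total_mature > total_semi else "benign"

        # A flooding neighbour: strong danger signal (resource consumption), little safe traffic
        print(classify([(1.0, 3.0, 0.2)] * 10))  # -> attack
        print(classify([(0.0, 0.5, 2.0)] * 10))  # -> benign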

  13. ADS-33C related handling qualities research performed using the NRC Bell 205 airborne simulator

    NASA Technical Reports Server (NTRS)

    Morgan, J. Murray; Baillie, Stewart W.

    1993-01-01

    Over 10 years ago a project was initiated by the U.S. Army AVSCOM to update the military helicopter flying qualities specification MIL-8501-A. While not yet complete, the project reached a major milestone in 1989 with the publication of an Airworthiness Design Standard, ADS-33C. The 8501 update project initially set out to identify critical gaps in the requisite data base and then proceeded to fill them using a variety of directed research studies. The magnitude of the task required that it become an international effort: appropriate research studies were conducted in Germany, the UK and Canada as well as in the USA. Canadian participation was supported by the Department of National Defence (DND) through the Chief of Research and Development. Both ground based and in-flight simulation were used to study the defined areas and the Canadian Bell 205-A1 variable stability helicopter was used extensively as one of the primary research tools available for this effort. This paper reviews the involvement of the Flight Research Laboratory of the National Research Council of Canada in the update project, it describes the various experiments conducted on the Airborne Simulator, it notes significant results obtained and describes ongoing research associated with the project.

  14. MDcons: Intermolecular contact maps as a tool to analyze the interface of protein complexes from molecular dynamics trajectories

    PubMed Central

    2014-01-01

    Background Molecular Dynamics (MD) simulations of protein complexes suffer from the lack of specific tools in the analysis step. Analyses of MD trajectories of protein complexes indeed generally rely on classical measures, such as the RMSD, RMSF and gyration radius, conceived and developed for single macromolecules. In practice, however, researchers engaged in simulating the dynamics of a protein complex are mainly interested in characterizing the conservation/variation of its biological interface. Results On these bases, herein we propose a novel approach to the analysis of MD trajectories or other conformational ensembles of protein complexes, MDcons, which uses the conservation of inter-residue contacts at the interface as a measure of the similarity between different snapshots. A "consensus contact map" is also provided, where the conservation of the different contacts is drawn in a grey scale. Finally, the interface area of the complex is monitored during the simulations. To show its utility, we used this novel approach to study two protein-protein complexes with interfaces of comparable size, both dominated by hydrophilic interactions but having binding affinities at the extremes of the experimental range. MDcons is demonstrated to be extremely useful for analysing the MD trajectories of the investigated complexes, adding important insight into the dynamic behavior of their biological interface. Conclusions MDcons specifically allows the user to highlight and characterize the dynamics of the interface in protein complexes and can thus be used as a complementary tool for the analysis of MD simulations of both experimental and predicted structures of protein complexes. PMID:25077693

  15. MDcons: Intermolecular contact maps as a tool to analyze the interface of protein complexes from molecular dynamics trajectories.

    PubMed

    Abdel-Azeim, Safwat; Chermak, Edrisse; Vangone, Anna; Oliva, Romina; Cavallo, Luigi

    2014-01-01

    Molecular Dynamics (MD) simulations of protein complexes suffer from the lack of specific tools in the analysis step. Analyses of MD trajectories of protein complexes indeed generally rely on classical measures, such as the RMSD, RMSF and gyration radius, conceived and developed for single macromolecules. In practice, however, researchers engaged in simulating the dynamics of a protein complex are mainly interested in characterizing the conservation/variation of its biological interface. On these bases, herein we propose a novel approach to the analysis of MD trajectories or other conformational ensembles of protein complexes, MDcons, which uses the conservation of inter-residue contacts at the interface as a measure of the similarity between different snapshots. A "consensus contact map" is also provided, where the conservation of the different contacts is drawn in a grey scale. Finally, the interface area of the complex is monitored during the simulations. To show its utility, we used this novel approach to study two protein-protein complexes with interfaces of comparable size, both dominated by hydrophilic interactions but having binding affinities at the extremes of the experimental range. MDcons is demonstrated to be extremely useful for analysing the MD trajectories of the investigated complexes, adding important insight into the dynamic behavior of their biological interface. MDcons specifically allows the user to highlight and characterize the dynamics of the interface in protein complexes and can thus be used as a complementary tool for the analysis of MD simulations of both experimental and predicted structures of protein complexes.
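
    The contact-conservation idea can be sketched directly: compute a boolean inter-chain contact map per frame, then average over frames to obtain the grey-scale consensus map. Here each residue is reduced to a single point and a 5 A cutoff is used as a common convention; MDcons' exact contact definition may differ:

        import numpy as np

        def contact_map(chain_a, chain_b, cutoff=5.0):
            """Boolean inter-residue contact map for one frame (point residues)."""
            d = np.linalg.norm(chain_a[:, None, :] - chain_b[None, :, :], axis=-1)
            return d < cutoff

        def consensus_map(traj_a, traj_b, cutoff=5.0):
            """Per-contact conservation across frames: 0 = never, 1 = always."""
            return np.mean([contact_map(a, b, cutoff)
                            for a, b in zip(traj_a, traj_b)], axis=0)

        rng = np.random.default_rng(0)
        traj_a = rng.uniform(0.0, 10.0, size=(20, 5, 3))  # 20 frames, 5 residues, xyz
        traj_b = rng.uniform(0.0, 10.0, size=(20, 5, 3))
        print(consensus_map(traj_a, traj_b))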

  16. Leveraging e-Science infrastructure for electrochemical research.

    PubMed

    Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F

    2011-08-28

    As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.

  17. Initial Investigations of Controller Tools and Procedures for Schedule-Based Arrival Operations with Mixed Flight-Deck Interval Management Equipage

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Cabrall, Christopher; Kupfer, Michael; Omar, Faisal G.; Prevot, Thomas

    2012-01-01

    NASA's Air Traffic Management Demonstration-1 (ATD-1) is a multi-year effort to demonstrate high-throughput, fuel-efficient arrivals at a major U.S. airport using NASA-developed scheduling automation, controller decision-support tools, and ADS-B-enabled Flight-Deck Interval Management (FIM) avionics. First-year accomplishments include the development of a concept of operations for managing scheduled arrivals flying Optimized Profile Descents with equipped aircraft conducting FIM operations, and the integration of laboratory prototypes of the core ATD-1 technologies. Following each integration phase, a human-in-the-loop simulation was conducted to evaluate and refine controller tools, procedures, and clearance phraseology. From a ground-side perspective, the results indicate the concept is viable and the operations are safe and acceptable. Additional training is required for smooth operations that yield notable benefits, particularly in the areas of FIM operations and clearance phraseology.

  18. A New Improved and Extended Version of the Multicell Bacterial Simulator gro.

    PubMed

    Gutiérrez, Martín; Gregorio-Godoy, Paula; Pérez Del Pulgar, Guillermo; Muñoz, Luis E; Sáez, Sandra; Rodríguez-Patón, Alfonso

    2017-08-18

    gro is a cell programming language developed in Klavins Lab for simulating colony growth and cell-cell communication. It is used as a synthetic biology prototyping tool for simulating multicellular biocircuits and microbial consortia. In this work, we present several extensions made to gro that improve the performance of the simulator, make it easier to use, and provide new functionalities. The new version of gro is between 1 and 2 orders of magnitude faster than the original version. It is able to grow microbial colonies with up to 10^5 cells in less than 10 min. A new library, CellEngine, accelerates the resolution of spatial physical interactions between growing and dividing cells by implementing a new shoving algorithm. A genetic library, CellPro, based on Probabilistic Timed Automata, simulates gene expression dynamics using simplified and easy to compute digital proteins. We also propose a more convenient language specification layer, ProSpec, based on the idea that proteins drive cell behavior. CellNutrient, another library, implements Monod-based growth and nutrient uptake functionalities. The intercellular signaling management was improved and extended in a library called CellSignals. Finally, bacterial conjugation, another local cell-cell communication process, was added to the simulator. To show the versatility and potential outreach of this version of gro, we provide studies and novel examples ranging from synthetic biology to evolutionary microbiology. We believe that the upgrades implemented for gro have made it into a powerful and fast prototyping tool capable of simulating a large variety of systems and synthetic biology designs.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preece, D.S.; Knudsen, S.D.

    The spherical element computer code DMC (Distinct Motion Code) used to model rock motion resulting from blasting has been enhanced to allow routine computer simulations of bench blasting. The enhancements required for bench blast simulation include: (1) modifying the gas flow portion of DMC, (2) adding a new explosive gas equation of state capability, (3) modifying the porosity calculation, and (4) accounting for blastwell spacing parallel to the face. A parametric study performed with DMC shows logical variation of the face velocity as burden, spacing, blastwell diameter and explosive type are varied. These additions represent a significant advance in the capability of DMC which will not only aid in understanding the physics involved in blasting but will also become a blast design tool. 8 refs., 7 figs., 1 tab.

  20. An Opportunistic Wireless Charging System Design for an On-Demand Shuttle Service: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doubleday, Kate; Meintz, Andrew; Markel, Tony

    System right-sizing is critical to implementation of in-motion wireless power transfer (WPT) for electric vehicles. This study introduces a modeling tool, WPTSim, which uses one-second speed, location, and road grade data from an on-demand employee shuttle in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. Adding just one WPT location can more than halve the battery capacity needed. Many configurations are capable of being self-sustaining with WPT, while others benefit from supplemental stationary charging.

  1. Tool For Installation Of Seal In Tube Fitting

    NASA Technical Reports Server (NTRS)

    Trevathan, Joseph R.

    1993-01-01

    Pliers-like tool helps secure repair seal in fitting. Tool crimps repair seal into tube fitting, ensuring a tight fit every time. Modified pair of snap-ring pliers to which knife-edge jaws have been added. Spring added between handles. Also includes separate, accompanying support ring.

  2. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  3. An AD100 implementation of a real-time STOVL aircraft propulsion system

    NASA Technical Reports Server (NTRS)

    Ouzts, Peter J.; Drummond, Colin K.

    1990-01-01

    A real-time dynamic model of the propulsion system for a Short Take-Off and Vertical Landing (STOVL) aircraft was developed for the AD100 simulation environment. The dynamic model was adapted from a FORTRAN based simulation using the dynamic programming capabilities of the AD100 ADSIM simulation language. The dynamic model includes an aerothermal representation of a turbofan jet engine, actuator and sensor models, and a multivariable control system. The AD100 model was tested for agreement with the FORTRAN model and real-time execution performance. The propulsion system model was also linked to an airframe dynamic model to provide an overall STOVL aircraft simulation for the purposes of integrated flight and propulsion control studies. An evaluation of the AD100 system for use as an aircraft simulation environment is included.

  4. Internal force corrections with machine learning for quantum mechanics/molecular mechanics simulations.

    PubMed

    Wu, Jingheng; Shen, Lin; Yang, Weitao

    2017-10-28

    Ab initio quantum mechanics/molecular mechanics (QM/MM) molecular dynamics simulation is a useful tool for calculating thermodynamic properties such as the potential of mean force for chemical reactions, but it is intensely time consuming. In this paper, we developed a new method using an internal force correction for low-level semiempirical QM/MM molecular dynamics samplings with a predefined reaction coordinate. As a correction term, the internal force was predicted with a machine learning scheme, which provides a sophisticated force field, and added to the atomic forces on the reaction-coordinate-related atoms at each integration step. We applied this method to two reactions in aqueous solution and reproduced potentials of mean force at the ab initio QM/MM level. The saving in computational cost is about 2 orders of magnitude. The present work reveals great potential for machine learning in QM/MM simulations to study complex chemical processes.
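
    The central loop can be sketched as follows, on the stated assumption that a learned internal-force term is added to the reaction-coordinate atoms at every integration step; the force functions below are stand-ins, not the paper's semiempirical QM/MM forces or its trained model:

        import numpy as np

        def corrected_forces(x, low_level_forces, correction_model, rc_atoms):
            """Cheap low-level forces plus the ML-predicted correction applied to
            the reaction-coordinate atoms (indices in rc_atoms)."""
            f = low_level_forces(x).copy()
            for i in rc_atoms:
                f[i] += correction_model(x, i)
            return f

        def velocity_verlet_step(x, v, m, dt, low_level, ml, rc_atoms):
            """One velocity-Verlet step driven by the corrected force field."""
            f0 = corrected_forces(x, low_level, ml, rc_atoms)
            x1 = x + v * dt + 0.5 * (f0 / m[:, None]) * dt ** 2
            f1 = corrected_forces(x1, low_level, ml, rc_atoms)
            return x1, v + 0.5 * ((f0 + f1) / m[:, None]) * dt

        low = lambda x: -x            # stand-in harmonic low-level force field
        ml = lambda x, i: 0.1 * x[i]  # stand-in for the trained correction model
        x = np.random.default_rng(0).normal(size=(4, 3))
        v, m = np.zeros((4, 3)), np.ones(4)
        x, v = velocity_verlet_step(x, v, m, 0.01, low, ml, rc_atoms=[0, 1])
        print(x[0])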

  5. Sensitivity Observing System Experiment (SOSE)-a new effective NWP-based tool in designing the global observing system

    NASA Astrophysics Data System (ADS)

    Marseille, Gert-Jan; Stoffelen, Ad; Barkmeijer, Jan

    2008-03-01

    Lacking an established methodology to test the potential impact of prospective extensions to the global observing system (GOS) in real atmospheric cases, we developed such a method, called the Sensitivity Observing System Experiment (SOSE). For example, since the GOS is non-uniform, it is of interest to investigate the benefit of complementary observing systems filling its gaps. In a SOSE, adjoint sensitivity structures are used to define a pseudo-true atmospheric state for the simulation of the prospective observing system. Next, the synthetic observations are used together with real observations from the existing GOS in a state-of-the-art Numerical Weather Prediction (NWP) model to assess the potential added value of the new observing system. Unlike full observing system simulation experiments (OSSE), SOSE can be applied to real extreme events that were badly forecast operationally and only requires the simulation of the new instrument. As such, SOSE is an effective tool, for example, to define observation requirements for extensions to the GOS. These observation requirements may serve as input for the design of an operational network of prospective observing systems. In a companion paper we use SOSE to simulate potential future space-borne Doppler Wind Lidar (DWL) scenarios and assess their capability to sample meteorologically sensitive areas not well captured by the current GOS, in particular over the Northern Hemisphere oceans.

  6. Interactive simulations as teaching tools for engineering mechanics courses

    NASA Astrophysics Data System (ADS)

    Carbonell, Victoria; Romero, Carlos; Martínez, Elvira; Flórez, Mercedes

    2013-07-01

    This study aimed to gauge the effect of interactive simulations in class as an active teaching strategy for a mechanics course. Engineering analysis and design often use the properties of planar sections in calculations. In the stress analysis of a beam under bending and torsional loads, cross-sectional properties are used to determine stress and displacement distributions in the beam cross section. The centroid, moments and products of inertia of an area made up of several common shapes (rectangles usually) may thus be obtained by adding the moments of inertia of the component areas (U-shape, L-shape, C-shape, etc). This procedure is used to calculate the second moments of structural shapes in engineering practice because the determination of their moments of inertia is necessary for the design of structural components. This paper presents examples of interactive simulations developed for teaching the ‘Mechanics and mechanisms’ course at the Universidad Politecnica de Madrid, Spain. The simulations focus on fundamental topics such as centroids, the properties of the moment of inertia, second moments of inertia with respect to two axes, principal moments of inertia and Mohr's Circle for plane stress, and were composed using Geogebra software. These learning tools feature animations, graphics and interactivity and were designed to encourage student participation and engagement in active learning activities, to effectively explain and illustrate course topics, and to build student problem-solving skills.
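
    The composite-area procedure the abstract refers to reduces to the parallel-axis theorem, I = Σ(I_own + A d²), with the centroid taken as the area-weighted average of the component centroids. A minimal Python sketch with assumed dimensions for a U-shaped section:

    ```python
    # Composite second moment of area about the horizontal centroidal axis.
    # Each component rectangle contributes its own moment b*h^3/12 plus the
    # parallel-axis term A*d^2. Dimensions below are illustrative assumptions.

    def composite_inertia(rects):
        """rects: list of (width b, height h, centroid height yc). Returns (ybar, Ix)."""
        A_tot = sum(b * h for b, h, yc in rects)
        ybar = sum(b * h * yc for b, h, yc in rects) / A_tot
        Ix = sum(b * h ** 3 / 12.0 + b * h * (yc - ybar) ** 2 for b, h, yc in rects)
        return ybar, Ix

    # U-shaped section built from three rectangles (mm, assumed dimensions):
    bottom_flange = (100.0, 20.0, 10.0)
    left_leg = (20.0, 80.0, 60.0)
    right_leg = (20.0, 80.0, 60.0)
    ybar, Ix = composite_inertia([bottom_flange, left_leg, right_leg])
    print(f"centroid y = {ybar:.1f} mm, Ix = {Ix:.0f} mm^4")
    ```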

  7. Development of TIF based figuring algorithm for deterministic pitch tool polishing

    NASA Astrophysics Data System (ADS)

    Yi, Hyun-Su; Kim, Sug-Whan; Yang, Ho-Soon; Lee, Yun-Woo

    2007-12-01

    Pitch is perhaps the oldest material used for optical polishing, leaving superior surface texture, and it is still widely used on the optics shop floor. However, owing to its unpredictable removal characteristics, pitch tool polishing has rarely been analysed quantitatively, and many optics shops rely heavily on the optician's "feel" even today. In order to bring a degree of process controllability to pitch tool polishing, we added motorized tool motions to a conventional Draper-type polishing machine and modelled the tool path in absolute machine coordinates. We then produced a number of Tool Influence Functions (TIFs), both from an analytical model and from a series of experimental polishing runs using the pitch tool. The theoretical TIFs agreed with the experimental TIFs to a profile accuracy of 79% in terms of shape. A surface figuring algorithm was then developed in-house utilizing both theoretical and experimental TIFs. We are currently undertaking a series of trial figuring experiments to prove the performance of the polishing algorithm, and the early results indicate that highly deterministic material removal control with the pitch tool can be achieved to a certain level of form error. The machine renovation, the TIF theory and its experimental confirmation, and the figuring simulation results are reported, together with implications for deterministic polishing.
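
    The figuring step described here is commonly posed as a dwell-time problem: predicted removal is the convolution of the TIF with the dwell-time map. The Python sketch below illustrates that general idea with an assumed Gaussian TIF and a naive fixed-point dwell-time update; it is an illustration of the technique, not the authors' in-house algorithm.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    # Removal = TIF (removal per unit dwell time) convolved with dwell map.
    # A simple damped fixed-point iteration drives the residual error down
    # while keeping dwell times non-negative. All parameters are assumed.
    n = 128
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    tif = np.exp(-(X ** 2 + Y ** 2) / 0.01)      # assumed Gaussian footprint
    error = 0.5 * (1 + np.cos(np.pi * X))        # initial surface figure error

    dwell = np.zeros_like(error)
    for _ in range(50):
        removal = fftconvolve(dwell, tif, mode="same")
        residual = error - removal
        dwell = np.maximum(dwell + 0.5 * residual / tif.sum(), 0.0)

    final = error - fftconvolve(dwell, tif, mode="same")
    print("peak-to-valley residual:", np.ptp(final))
    ```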

  8. Modeling screening, prevention, and delaying of Alzheimer's disease: an early-stage decision analytic model

    PubMed Central

    2010-01-01

    Background Alzheimer's Disease (AD) affects a growing proportion of the population each year. Novel therapies on the horizon may slow the progress of AD symptoms and avoid cases altogether. Initiating treatment for the underlying pathology of AD would ideally be based on biomarker screening tools identifying pre-symptomatic individuals. Early-stage modeling provides estimates of potential outcomes and informs policy development. Methods A time-to-event (TTE) simulation provided estimates of screening asymptomatic patients in the general population age ≥55 and treatment impact on the number of patients reaching AD. Patients were followed from AD screen until all-cause death. Baseline sensitivity and specificity were 0.87 and 0.78, with treatment on positive screen. Treatment slowed progression by 50%. Events were scheduled using literature-based age-dependent incidences of AD and death. Results The base case results indicated increased AD free years (AD-FYs) through delays in onset and a reduction of 20 AD cases per 1000 screened individuals. Patients completely avoiding AD accounted for 61% of the incremental AD-FYs gained. Total years of treatment per 1000 screened patients was 2,611. The number-needed-to-screen was 51 and the number-needed-to-treat was 12 to avoid one case of AD. One-way sensitivity analysis indicated that duration of screening sensitivity and rescreen interval impact AD-FYs the most. A two-way sensitivity analysis found that for a test with an extended duration of sensitivity (15 years) the number of AD cases avoided was 6,000-7,000 cases for a test with higher sensitivity and specificity (0.90,0.90). Conclusions This study yielded valuable parameter range estimates at an early stage in the study of screening for AD. Analysis identified duration of screening sensitivity as a key variable that may be unavailable from clinical trials. PMID:20433705

  9. Modeling screening, prevention, and delaying of Alzheimer's disease: an early-stage decision analytic model.

    PubMed

    Furiak, Nicolas M; Klein, Robert W; Kahle-Wrobleski, Kristin; Siemers, Eric R; Sarpong, Eric; Klein, Timothy M

    2010-04-30

    Alzheimer's Disease (AD) affects a growing proportion of the population each year. Novel therapies on the horizon may slow the progress of AD symptoms and avoid cases altogether. Initiating treatment for the underlying pathology of AD would ideally be based on biomarker screening tools identifying pre-symptomatic individuals. Early-stage modeling provides estimates of potential outcomes and informs policy development. A time-to-event (TTE) simulation provided estimates of screening asymptomatic patients in the general population age ≥55 and treatment impact on the number of patients reaching AD. Patients were followed from AD screen until all-cause death. Baseline sensitivity and specificity were 0.87 and 0.78, with treatment on positive screen. Treatment slowed progression by 50%. Events were scheduled using literature-based age-dependent incidences of AD and death. The base case results indicated increased AD free years (AD-FYs) through delays in onset and a reduction of 20 AD cases per 1000 screened individuals. Patients completely avoiding AD accounted for 61% of the incremental AD-FYs gained. Total years of treatment per 1000 screened patients was 2,611. The number-needed-to-screen was 51 and the number-needed-to-treat was 12 to avoid one case of AD. One-way sensitivity analysis indicated that duration of screening sensitivity and rescreen interval impact AD-FYs the most. A two-way sensitivity analysis found that for a test with an extended duration of sensitivity (15 years) the number of AD cases avoided was 6,000-7,000 cases for a test with higher sensitivity and specificity (0.90,0.90). This study yielded valuable parameter range estimates at an early stage in the study of screening for AD. Analysis identified duration of screening sensitivity as a key variable that may be unavailable from clinical trials.
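
    The time-to-event logic described in these two records can be sketched compactly. In the Python illustration below, onset and death times are drawn from assumed exponential hazards (stand-ins for the literature-based, age-dependent incidences), a screen with sensitivity 0.87 and specificity 0.78 triggers treatment, and treatment halves the progression rate; the printed numbers are illustrative, not the study's results.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000
    sens, spec, slow = 0.87, 0.78, 0.5           # screen accuracy, progression slowing

    t_ad = rng.exponential(1 / 0.02, N)          # assumed AD onset hazard, per year
    t_death = rng.exponential(1 / 0.04, N)       # assumed all-cause death hazard
    base_case = t_ad < t_death                   # would develop AD if untreated

    screen_pos = np.where(base_case,
                          rng.random(N) < sens,        # true positives
                          rng.random(N) < 1 - spec)    # false positives
    t_ad_rx = np.where(screen_pos, t_ad / slow, t_ad)  # 50% slowing doubles onset time

    treated_case = t_ad_rx < t_death
    avoided_per_1000 = 1000 * (base_case.sum() - treated_case.sum()) / N
    print(f"AD cases avoided per 1000 screened: {avoided_per_1000:.1f}")
    print(f"number-needed-to-screen: {1000 / avoided_per_1000:.0f}")
    ```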

  10. Development and validation of quasi-steady-state heat pump water heater model having stratified water tank and wrapped-tank condenser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Bo; Nawaz, Kashif; Baxter, Van D.

    Heat pump water heater systems (HPWH) introduce new challenges for design and modeling tools, because they require a vapor compression system balanced against a water storage tank. In addition, a wrapped-tank condenser coil is strongly coupled with the stratified water tank, which makes HPWH simulation a transient process. To tackle these challenges and deliver an effective, hardware-based HPWH equipment design tool, a quasi-steady-state HPWH model was developed based on the DOE/ORNL Heat Pump Design Model (HPDM). Two new component models were added in this study. One is a one-dimensional stratified water tank model, an improvement to the open-source EnergyPlus water tank model that introduces a calibration factor to account for the bulk mixing effect due to water draws, circulation, etc. The other is a wrapped-tank condenser coil model using a segment-to-segment modeling approach. The HPWH system model was validated against available experimental data and then used for parametric simulations to determine the effects of various design factors.

  11. Development and validation of quasi-steady-state heat pump water heater model having stratified water tank and wrapped-tank condenser

    DOE PAGES

    Shen, Bo; Nawaz, Kashif; Baxter, Van D.; ...

    2017-10-31

    Heat pump water heater systems (HPWH) introduce new challenges for design and modeling tools, because they require a vapor compression system balanced against a water storage tank. In addition, a wrapped-tank condenser coil is strongly coupled with the stratified water tank, which makes HPWH simulation a transient process. To tackle these challenges and deliver an effective, hardware-based HPWH equipment design tool, a quasi-steady-state HPWH model was developed based on the DOE/ORNL Heat Pump Design Model (HPDM). Two new component models were added in this study. One is a one-dimensional stratified water tank model, an improvement to the open-source EnergyPlus water tank model that introduces a calibration factor to account for the bulk mixing effect due to water draws, circulation, etc. The other is a wrapped-tank condenser coil model using a segment-to-segment modeling approach. The HPWH system model was validated against available experimental data and then used for parametric simulations to determine the effects of various design factors.
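
    A minimal sketch of the kind of one-dimensional stratified tank model these records describe is given below: the tank is split into vertical nodes, condenser heat enters the wrapped nodes, a calibration factor scales inter-node mixing to mimic bulk mixing from draws and circulation, and temperature inversions are removed to represent buoyancy. All parameter values are assumptions.

    ```python
    import numpy as np

    n = 10
    T = np.full(n, 30.0)              # node temperatures, deg C, index 0 = bottom
    m_node, cp = 20.0, 4186.0         # kg of water per node, J/(kg K)
    q_cond = np.zeros(n)
    q_cond[2:8] = 300.0               # W into the nodes the condenser coil wraps
    k_mix, dt = 0.15, 10.0            # mixing calibration factor, time step (s)

    for _ in range(360):              # one hour of simulated time
        T += q_cond * dt / (m_node * cp)          # condenser heat input
        dT = np.diff(T)                           # exchange between neighbours,
        T[:-1] += k_mix * dt / 60.0 * dT          # scaled by the calibration
        T[1:] -= k_mix * dt / 60.0 * dT           # factor (energy conserving)
        T.sort()                                  # warm water rises (buoyancy)

    print(f"bottom {T[0]:.1f} C, top {T[-1]:.1f} C")
    ```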

  12. Modeling evolution of spatially distributed bacterial communities: a simulation with the haploid evolutionary constructor

    PubMed Central

    2015-01-01

    Background Multiscale approaches for integrating submodels of various levels of biological organization into a single model have become a major tool of systems biology. In this paper, we constructed and simulated a set of multiscale models of spatially distributed microbial communities and studied the influence of unevenly distributed environmental factors on the genetic diversity and evolution of the community members. Results The Haploid Evolutionary Constructor software http://evol-constructor.bionet.nsc.ru/ was expanded by adding a tool for the spatial modeling of a microbial community (1D, 2D and 3D versions). A set of models of spatially distributed communities was built to demonstrate that the spatial distribution of cells affects both the intensity of selection and the rate of evolution. Conclusion In spatially heterogeneous communities, a change in the direction of the environmental flow might be reflected in local irregular population dynamics, while the genetic structure of populations (the frequencies of the alleles) remains stable. Furthermore, in spatially heterogeneous communities, chemotaxis might dramatically affect the evolution of community members. PMID:25708911

  13. After-hours/on-call experience during primary care nurse practitioner education utilizing standard scenarios and simulated patients.

    PubMed

    Kelly, Michelle M; Blunt, Elizabeth; Nestor, Kelly

    2017-12-01

    Few nurse practitioner (NP) programs include an after-hours/on-call component in their clinical preparation of NP students. This role is expected in many primary and specialty care practices, and is one that students feel unprepared to competently navigate. Utilizing simulated callers as patients or parents, NP students participated in a simulated after-hours/on-call experience that included receiving the call, managing the patient, and submitting documentation of the encounter. Students completed pre- and postparticipation evaluations, and were evaluated by the simulated patient callers and faculty using standardized evaluation tools. NP students rated the experience as an educationally valuable experience despite feeling anxious and nervous about the experience. Several essential skills were identified including critical thinking, clear communication, self-confidence, and access to resources. After participation NP students were more receptive to an NP position with an on-call component. Inclusion of a simulated on-call experience is a feasible component of NP education and should be added to the NP curriculum. ©2017 American Association of Nurse Practitioners.

  14. [Hi-Fi simulation: Teaching crisis resource management to surgery residents].

    PubMed

    Georgescu, Mihai; Tanoubi, Issam; Drolet, Pierre; Robitaille, Arnaud; Perron, Roger; Patenaude, Jean Victor

    2015-02-01

    High-fidelity (HiFi) simulation has shown its effectiveness for teaching crisis resource management (CRM) principles, and our institutional experience in this area is mainly with anesthesiology residents. We recently added to our postgraduate curriculum a new CRM course designed to cater to the specific needs of surgical residents. This short communication describes the experience of the University of Montreal Simulation Centre (Centre d'Apprentissage des Attitudes et Habiletés Cliniques, CAAHC) with HiFi simulation-based CRM and communication skills teaching for surgical residents. Thirty residents voluntarily agreed to participate in a simulation course with pre-established scenarios and educational CRM objectives. When surveyed immediately after the activity, all residents agreed that the educational objectives were well defined (80% "strongly agree" and 20% "agree"). The survey also showed that the course was well accepted by all participants (96% "strongly agree" and 4% "agree"). Further trials with randomized groups and more reliable assessment tools are needed to validate our results. Still, integrating HiFi simulation-based CRM learning into the surgical residency curriculum seems like an interesting step.

  15. Analysis of speckle and material properties in laider tracer

    NASA Astrophysics Data System (ADS)

    Ross, Jacob W.; Rigling, Brian D.; Watson, Edward A.

    2017-04-01

    The SAL simulation tool Laider Tracer models speckle: the random variation in intensity of coherent light reflected from a rough surface. Within Laider Tracer, the speckle field is modeled as a 2-D array of jointly Gaussian random variables projected via ray tracing onto the scene of interest. Originally, all materials in Laider Tracer were treated as ideal diffuse scatterers, for which the far-field return is computed using the Lambertian Bidirectional Reflectance Distribution Function (BRDF). As presented here, we add material properties to Laider Tracer via the Non-conventional Exploitation Factors Data System, a database of properties for thousands of different materials sampled at various wavelengths and incident angles. We verify the intensity behavior as a function of incident angle after material properties are added to the simulation.
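
    The speckle model described here has a compact form: fully developed speckle arises from a field of circular complex Gaussian random variables whose squared magnitude follows a negative-exponential distribution with unit contrast. A minimal Python sketch, with the grid size and a Lambertian-like angular scaling as assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 256
    # 2-D array of jointly (circular complex) Gaussian random variables:
    field = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    intensity = np.abs(field) ** 2          # unit mean, exponential distribution

    # Ideal diffuse (Lambertian) scaling with incident angle theta (assumed):
    theta = np.deg2rad(40.0)
    speckled_return = np.cos(theta) * intensity
    contrast = speckled_return.std() / speckled_return.mean()
    print("mean return:", speckled_return.mean(), "speckle contrast:", contrast)
    ```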

  16. RAY-UI: A powerful and extensible user interface for RAY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgärtel, P., E-mail: peter.baumgaertel@helmholtz-berlin.de; Erko, A.; Schäfers, F.

    2016-07-27

    The RAY-UI project started as a proof-of-concept for an interactive and graphical user interface (UI) for the well-known ray tracing software RAY [1]. In the meantime, it has evolved into a powerful enhanced version of RAY that will serve as the platform for future development and improvement of associated tools. The software as of today supports nearly all sophisticated simulation features of RAY. Furthermore, it delivers very significant usability and work efficiency improvements. Beamline elements can be quickly added or removed in the interactive sequence view. Parameters of any selected element can be accessed directly and in arbitrary order. With a single click, parameter changes can be tested and new simulation results can be obtained. All analysis results can be explored interactively right after ray tracing by means of powerful integrated image viewing and graphing tools. Unlimited image planes can be positioned anywhere in the beamline, and bundles of image planes can be created for moving the plane along the beam to identify the focus position with live updates of the simulated results. In addition to showing the features and workflow of RAY-UI, we will give an overview of the underlying software architecture as well as examples for use and an outlook for future developments.

  17. Eugene--a domain specific language for specifying and constraining synthetic biological parts, devices, and systems.

    PubMed

    Bilitchenko, Lesia; Liu, Adam; Cheung, Sherine; Weeding, Emma; Xia, Bing; Leguia, Mariana; Anderson, J Christopher; Densmore, Douglas

    2011-04-29

    Synthetic biological systems are currently created by an ad-hoc, iterative process of specification, design, and assembly. These systems would greatly benefit from a more formalized and rigorous specification of the desired system components as well as constraints on their composition. Therefore, the creation of robust and efficient design flows and tools is imperative. We present a human readable language (Eugene) that allows for the specification of synthetic biological designs based on biological parts, as well as provides a very expressive constraint system to drive the automatic creation of composite Parts (Devices) from a collection of individual Parts. We illustrate Eugene's capabilities in three different areas: Device specification, design space exploration, and assembly and simulation integration. These results highlight Eugene's ability to create combinatorial design spaces and prune these spaces for simulation or physical assembly. Eugene creates functional designs quickly and cost-effectively. Eugene is intended for forward engineering of DNA-based devices, and through its data types and execution semantics, reflects the desired abstraction hierarchy in synthetic biology. Eugene provides a powerful constraint system which can be used to drive the creation of new devices at runtime. It accomplishes all of this while being part of a larger tool chain which includes support for design, simulation, and physical device assembly.

  18. Population and Activity of On-road Vehicles in MOVES2014 ...

    EPA Pesticide Factsheets

    This report describes the sources and derivation for on-road vehicle population and activity information and associated adjustments as stored in the MOVES2014 default databases. Motor Vehicle Emission Simulator, the MOVES2014 model, is a set of modeling tools for estimating emissions produced by on-road (cars, trucks, motorcycles, etc.) and nonroad (backhoes, lawnmowers, etc.) mobile sources. The national default activity information in MOVES2014 provides a reasonable basis for estimating national emissions. However, the uncertainties and variability in the default data contribute to the uncertainty in the resulting emission estimates. Properly characterizing emissions from the on-road vehicle subset requires a detailed understanding of the cars and trucks that make up the vehicle fleet and their patterns of operation. The MOVES model calculates emission inventories by multiplying emission rates by the appropriate emission-related activity, applying correction (adjustment) factors as needed to simulate specific situations, and then adding up the emissions from all sources (populations) and regions.
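
    The inventory calculation summarized above (rates × activity × adjustments, summed over source types and regions) can be sketched in a few lines of Python; the rates, activity totals, and adjustment factors below are made-up illustrations, not MOVES2014 defaults.

    ```python
    # Toy emission inventory: emissions = rate x activity x adjustment,
    # summed over vehicle types and regions. All values are assumed.
    rates = {"car": 0.25, "truck": 1.10, "motorcycle": 0.40}   # g/mile (assumed)
    activity = {                                               # vehicle-miles by region
        "county_A": {"car": 2.0e6, "truck": 3.0e5, "motorcycle": 5.0e4},
        "county_B": {"car": 1.2e6, "truck": 1.5e5, "motorcycle": 2.0e4},
    }
    adjust = {"car": 1.05, "truck": 0.95, "motorcycle": 1.00}  # e.g. temperature correction

    total_g = sum(rates[v] * miles * adjust[v]
                  for region in activity.values()
                  for v, miles in region.items())
    print(f"inventory: {total_g / 1e6:.2f} tonnes")
    ```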

  19. High-Performance First-Principles Molecular Dynamics for Predictive Theory and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gygi, Francois; Galli, Giulia; Schwegler, Eric

    This project focused on developing high-performance software tools for First-Principles Molecular Dynamics (FPMD) simulations, and applying them in investigations of materials relevant to energy conversion processes. FPMD is an atomistic simulation method that combines a quantum-mechanical description of electronic structure with the statistical description provided by molecular dynamics (MD) simulations. This reliance on fundamental principles allows FPMD simulations to provide a consistent description of structural, dynamical and electronic properties of a material. This is particularly useful in systems for which reliable empirical models are lacking. FPMD simulations are increasingly used as a predictive tool for applications such as batteries, solar energy conversion, light-emitting devices, electro-chemical energy conversion devices and other materials. During the course of the project, several new features were developed and added to the open-source Qbox FPMD code. The code was further optimized for scalable operation of large-scale, Leadership-Class DOE computers. When combined with Many-Body Perturbation Theory (MBPT) calculations, this infrastructure was used to investigate structural and electronic properties of liquid water, ice, aqueous solutions, nanoparticles and solid-liquid interfaces. Computing both ionic trajectories and electronic structure in a consistent manner enabled the simulation of several spectroscopic properties, such as Raman spectra, infrared spectra, and sum-frequency generation spectra. The accuracy of the approximations used allowed for direct comparisons of results with experimental data such as optical spectra, X-ray and neutron diffraction spectra. The software infrastructure developed in this project, as applied to various investigations of solids, liquids and interfaces, demonstrates that FPMD simulations can provide a detailed, atomic-scale picture of structural, vibrational and electronic properties of complex systems relevant to energy conversion devices.

  20. Monte carlo simulations of Yttrium reaction rates in Quinta uranium target

    NASA Astrophysics Data System (ADS)

    Suchopár, M.; Wagner, V.; Svoboda, O.; Vrzalová, J.; Chudoba, P.; Tichý, P.; Kugler, A.; Adam, J.; Závorka, L.; Baldin, A.; Furman, W.; Kadykov, M.; Khushvaktov, J.; Solnyshkin, A.; Tsoupko-Sitnikov, V.; Tyutyunnikov, S.; Bielewicz, M.; Kilim, S.; Strugalska-Gola, E.; Szuta, M.

    2017-03-01

    The international collaboration Energy and Transmutation of Radioactive Waste (E&T RAW) performed intensive studies of several simple accelerator-driven system (ADS) setups consisting of lead, uranium and graphite which were irradiated by relativistic proton and deuteron beams in the past years at the Joint Institute for Nuclear Research (JINR) in Dubna, Russia. The most recent setup called Quinta, consisting of natural uranium target-blanket and lead shielding, was irradiated by deuteron beams in the energy range between 1 and 8 GeV in three accelerator runs at JINR Nuclotron in 2011 and 2012 with yttrium samples among others inserted inside the setup to measure the neutron flux in various places. Suitable activation detectors serve as one of possible tools for monitoring of proton and deuteron beams and for measurements of neutron field distribution in ADS studies. Yttrium is one of such suitable materials for monitoring of high energy neutrons. Various threshold reactions can be observed in yttrium samples. The yields of isotopes produced in the samples were determined using the activation method. Monte Carlo simulations of the reaction rates leading to production of different isotopes were performed in the MCNPX transport code and compared with the experimental results obtained from the yttrium samples.

  1. Modeling the Cost Effectiveness of Malaria Control Interventions in the Highlands of Western Kenya

    PubMed Central

    Stuckey, Erin M.; Stevenson, Jennifer; Galactionova, Katya; Baidjoe, Amrish Y.; Bousema, Teun; Odongo, Wycliffe; Kariuki, Simon; Drakeley, Chris; Smith, Thomas A.; Cox, Jonathan; Chitnis, Nakul

    2014-01-01

    Introduction Tools that allow for in silico optimization of available malaria control strategies can assist the decision-making process for prioritizing interventions. The OpenMalaria stochastic simulation modeling platform can be applied to simulate the impact of interventions singly and in combination as implemented in Rachuonyo South District, western Kenya, to support this goal. Methods Combinations of malaria interventions were simulated using a previously-published, validated model of malaria epidemiology and control in the study area. An economic model of the costs of case management and malaria control interventions in Kenya was applied to simulation results and cost-effectiveness of each intervention combination compared to the corresponding simulated outputs of a scenario without interventions. Uncertainty was evaluated by varying health system and intervention delivery parameters. Results The intervention strategy with the greatest simulated health impact employed long lasting insecticide treated net (LLIN) use by 80% of the population, 90% of households covered by indoor residual spraying (IRS) with deployment starting in April, and intermittent screen and treat (IST) of school children using Artemether lumefantrine (AL) with 80% coverage twice per term. However, the current malaria control strategy in the study area including LLIN use of 56% and IRS coverage of 70% was the most cost effective at reducing disability-adjusted life years (DALYs) over a five year period. Conclusions All the simulated intervention combinations can be considered cost effective in the context of available resources for health in Kenya. Increasing coverage of vector control interventions has a larger simulated impact compared to adding IST to the current implementation strategy, suggesting that transmission in the study area is not at a level to warrant replacing vector control to a school-based screen and treat program. These results have the potential to assist malaria control program managers in the study area in adding new or changing implementation of current interventions. PMID:25290939

  2. Modeling the cost effectiveness of malaria control interventions in the highlands of western Kenya.

    PubMed

    Stuckey, Erin M; Stevenson, Jennifer; Galactionova, Katya; Baidjoe, Amrish Y; Bousema, Teun; Odongo, Wycliffe; Kariuki, Simon; Drakeley, Chris; Smith, Thomas A; Cox, Jonathan; Chitnis, Nakul

    2014-01-01

    Tools that allow for in silico optimization of available malaria control strategies can assist the decision-making process for prioritizing interventions. The OpenMalaria stochastic simulation modeling platform can be applied to simulate the impact of interventions singly and in combination as implemented in Rachuonyo South District, western Kenya, to support this goal. Combinations of malaria interventions were simulated using a previously-published, validated model of malaria epidemiology and control in the study area. An economic model of the costs of case management and malaria control interventions in Kenya was applied to simulation results and cost-effectiveness of each intervention combination compared to the corresponding simulated outputs of a scenario without interventions. Uncertainty was evaluated by varying health system and intervention delivery parameters. The intervention strategy with the greatest simulated health impact employed long lasting insecticide treated net (LLIN) use by 80% of the population, 90% of households covered by indoor residual spraying (IRS) with deployment starting in April, and intermittent screen and treat (IST) of school children using Artemether lumefantrine (AL) with 80% coverage twice per term. However, the current malaria control strategy in the study area including LLIN use of 56% and IRS coverage of 70% was the most cost effective at reducing disability-adjusted life years (DALYs) over a five year period. All the simulated intervention combinations can be considered cost effective in the context of available resources for health in Kenya. Increasing coverage of vector control interventions has a larger simulated impact compared to adding IST to the current implementation strategy, suggesting that transmission in the study area is not at a level to warrant replacing vector control to a school-based screen and treat program. These results have the potential to assist malaria control program managers in the study area in adding new or changing implementation of current interventions.
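
    The cost-effectiveness comparison described in these two records reduces to an incremental cost-effectiveness ratio against the no-intervention scenario. A minimal Python sketch with placeholder costs and DALY totals (not the study's numbers):

    ```python
    # Each simulated intervention package is costed and compared with the
    # no-intervention scenario; cost-effectiveness is expressed as the
    # incremental cost per DALY averted. All figures below are placeholders.
    scenarios = {
        "none": {"cost": 0.0, "dalys": 10_000},
        "current (LLIN 56% + IRS 70%)": {"cost": 450_000, "dalys": 6_500},
        "max impact (LLIN 80% + IRS 90% + IST)": {"cost": 900_000, "dalys": 5_800},
    }

    base = scenarios["none"]
    for name, s in scenarios.items():
        if name == "none":
            continue
        averted = base["dalys"] - s["dalys"]
        icer = (s["cost"] - base["cost"]) / averted
        print(f"{name}: {averted} DALYs averted, ICER = ${icer:.0f}/DALY averted")
    ```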

  3. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  4. CFD Based Added Mass Prediction in Cruise Condition of Underwater Vehicle Dynamic

    NASA Astrophysics Data System (ADS)

    Agoes Moelyadi, Mochammad; Bambang Riswandi, Bagus

    2018-04-01

    One unsteady-flow effect on the hydrodynamic characteristics of an underwater vehicle is the presence of added mass. In cruising conditions, the vehicle may accelerate or experience disturbances in the form of unsteady flow, causing hydrodynamic interaction between the vehicle surface and the surrounding fluid. This raises the local flow velocity and produces large changes in the hydrodynamic forces, which strongly influence the stability of the vehicle. One result is an additional force effect known as added mass, a very useful parameter for controlling underwater vehicle dynamics. This paper reports research on the added mass coefficient of underwater vehicles obtained through Computational Fluid Dynamics (CFD) simulation using CFX software. The added mass coefficient is calculated by performing an unsteady (transient) simulation. The computational simulations are based on the solution of the Reynolds-Averaged Navier-Stokes (RANS) equations. The simulated vehicle moves forward and backward according to a sine function, with a frequency of 0.25 Hz, an amplitude of 2 m, a cruising depth of 10 m below sea level, and a cruise speed of 1.54 m/s (Re = 9,000,000). The simulation results include velocity contours, the variation of force and acceleration with frequency, and the added mass coefficient.
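
    One common way to extract an added-mass coefficient from a transient run of this kind is to prescribe the sinusoidal motion, record the hydrodynamic force history, and least-squares fit F = -m_a·a - c·v. The Python sketch below uses a synthetic force signal with a known added mass so the recovery can be checked; the frequency and amplitude follow the abstract, while the mass and damping values are assumptions.

    ```python
    import numpy as np

    f, A = 0.25, 2.0                      # Hz, m (from the abstract)
    t = np.linspace(0, 8.0, 2000)
    w = 2 * np.pi * f
    v = A * w * np.cos(w * t)             # prescribed surge velocity
    a = -A * w ** 2 * np.sin(w * t)       # prescribed surge acceleration

    m_a_true, c_true = 850.0, 120.0       # assumed kg, N s/m ("CFD truth")
    F = -m_a_true * a - c_true * v + np.random.default_rng(2).normal(0, 5, t.size)

    # Least-squares fit of F = [a v] @ [-m_a, -c]:
    coef, *_ = np.linalg.lstsq(np.column_stack([a, v]), F, rcond=None)
    print(f"recovered added mass: {-coef[0]:.1f} kg, damping: {-coef[1]:.1f} N s/m")
    ```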

  5. Source attribution using FLEXPART and carbon monoxide emission inventories: SOFT-IO version 1.0

    NASA Astrophysics Data System (ADS)

    Sauvage, Bastien; Fontaine, Alain; Eckhardt, Sabine; Auby, Antoine; Boulanger, Damien; Petetin, Hervé; Paugam, Ronan; Athier, Gilles; Cousin, Jean-Marc; Darras, Sabine; Nédélec, Philippe; Stohl, Andreas; Turquety, Solène; Cammas, Jean-Pierre; Thouret, Valérie

    2017-12-01

    Since 1994, the In-service Aircraft for a Global Observing System (IAGOS) program has produced in situ measurements of the atmospheric composition during more than 51 000 commercial flights. In order to help analyze these observations and understand the processes driving the observed concentration distribution and variability, we developed the SOFT-IO tool to quantify source-receptor links for all measured data. Based on the FLEXPART particle dispersion model (Stohl et al., 2005), SOFT-IO simulates the contributions of anthropogenic and biomass burning emissions from the ECCAD emission inventory database for all locations and times corresponding to the measured carbon monoxide mixing ratios along each IAGOS flight. Contributions are simulated from emissions occurring during the last 20 days before an observation, separating individual contributions from the different source regions. The main goal is to supply added-value products to the IAGOS database by evincing the geographical origin and emission sources driving the CO enhancements observed in the troposphere and lower stratosphere. This requires a good match between observed and modeled CO enhancements. Indeed, SOFT-IO detects more than 95 % of the observed CO anomalies over most of the regions sampled by IAGOS in the troposphere. In the majority of cases, SOFT-IO simulates CO pollution plumes with biases lower than 10-15 ppbv. Differences between the model and observations are larger for very low or very high observed CO values. The added-value products will help in the understanding of the trace-gas distribution and seasonal variability. They are available in the IAGOS database via http://www.iagos.org. The SOFT-IO tool could also be applied to similar data sets of CO observations (e.g., ground-based measurements, satellite observations). SOFT-IO could also be used for statistical validation as well as for intercomparisons of emission inventories using large amounts of data.

  6. Innovative research of AD HOC network mobility model

    NASA Astrophysics Data System (ADS)

    Chen, Xin

    2017-08-01

    It is difficult for AD HOC network researchers to conduct actual deployments during the experimental stage, as the network topology is changeable and node locations are not fixed. Simulation therefore remains the main research method for such networks. The mobility model is an important component of AD HOC network simulation: it describes the movement pattern of nodes (including location, velocity, etc.), determines the trails the nodes follow, and serves as an abstraction of the nodes' modes of movement. A mobility model that simulates node movement is thus an important foundation for simulation research, and it should reflect the true movement behavior of nodes as faithfully as possible. In this paper, a node generally refers to the wireless equipment people carry. The main research topics are how nodes avoid obstacles while moving and the impact of obstacles on the relations among nodes; on this basis, a Node Self-Avoiding Obstacle (NASO) model is established for AD HOC networks.

  7. Numerical Simulation of Evacuation Process in Malaysia By Using Distinct-Element-Method Based Multi-Agent Model

    NASA Astrophysics Data System (ADS)

    Abustan, M. S.; Rahman, N. A.; Gotoh, H.; Harada, E.; Talib, S. H. A.

    2016-07-01

    In Malaysia, little research on crowd evacuation simulation has been reported. The development of numerical crowd evacuation models that take into account people's behavioral patterns and psychological characteristics is therefore crucial in Malaysia. Meanwhile, tsunami disasters, which demand a quick evacuation process, began to gain the attention of Malaysian citizens after the 2004 Indian Ocean Tsunami. In this context, we have conducted simulations of the tsunami evacuation process at Miami Beach on Penang Island using a Distinct Element Method (DEM)-based crowd behavior simulator. The main objectives are to investigate and reproduce the current conditions of the evacuation process at this location under different hypothetical scenarios in order to study evacuation efficiency. Sim-1 represents the initial evacuation plan, while sim-2 improves that plan by adding a new evacuation area. According to the simulation results, sim-2 has a shorter evacuation time than sim-1: the evacuation time was reduced by 53 seconds. The effect of the additional evacuation place is confirmed by the decrease in the evacuation completion time. These results also suggest that numerical simulation can serve as an effective tool for studying crowd evacuation processes.

  8. BioJazz: in silico evolution of cellular networks with unbounded complexity using rule-based modeling.

    PubMed

    Feng, Song; Ollivier, Julien F; Swain, Peter S; Soyer, Orkun S

    2015-10-30

    Systems biologists aim to decipher the structure and dynamics of signaling and regulatory networks underpinning cellular responses; synthetic biologists can use this insight to alter existing networks or engineer de novo ones. Both tasks will benefit from an understanding of which structural and dynamic features of networks can emerge from evolutionary processes, through which intermediary steps these arise, and whether they embody general design principles. As natural evolution at the level of network dynamics is difficult to study, in silico evolution of network models can provide important insights. However, current tools used for in silico evolution of network dynamics are limited to ad hoc computer simulations and models. Here we introduce BioJazz, an extendable, user-friendly tool for simulating the evolution of dynamic biochemical networks. Unlike previous tools for in silico evolution, BioJazz allows for the evolution of cellular networks with unbounded complexity by combining rule-based modeling with an encoding of networks that is akin to a genome. We show that BioJazz can be used to implement biologically realistic selective pressures and allows exploration of the space of network architectures and dynamics that implement prescribed physiological functions. BioJazz is provided as an open-source tool to facilitate its further development and use. Source code and user manuals are available at: http://oss-lab.github.io/biojazz and http://osslab.lifesci.warwick.ac.uk/BioJazz.aspx. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. A framework for incorporating DTI Atlas Builder registration into Tract-Based Spatial Statistics and a simulated comparison to standard TBSS.

    PubMed

    Leming, Matthew; Steiner, Rachel; Styner, Martin

    2016-02-27

    Tract-based spatial statistics (TBSS) [6] is a software pipeline widely employed in comparative analysis of white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different DTI measurements (fractional anisotropy, FA; axial diffusivity, AD; radial diffusivity, RD; and mean diffusivity, MD). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) that uses the advanced registration offered in DTI Atlas Builder [7]. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences in diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in these data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using functional analysis of diffusion tensor tract statistics (FADTTS [10]). Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in the other DTI measurements. Within the simulated changes investigated here, this study suggests that using DTI Atlas Builder's registration enhances TBSS group-based studies.
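
    The eigenvalue manipulation used in these simulations is easy to make concrete: AD, RD, MD and FA are simple functions of the three tensor eigenvalues, so scaling the principal eigenvalue alters AD directly. A minimal Python sketch with assumed baseline white-matter eigenvalues:

    ```python
    import numpy as np

    def dti_measures(l1, l2, l3):
        """Scalar DTI measures from the three tensor eigenvalues."""
        ad = l1                          # axial diffusivity
        rd = (l2 + l3) / 2.0             # radial diffusivity
        md = (l1 + l2 + l3) / 3.0        # mean diffusivity
        num = np.sqrt((l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2)
        den = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
        fa = np.sqrt(1.5) * num / den    # fractional anisotropy
        return ad, rd, md, fa

    l1, l2, l3 = 1.7e-3, 0.4e-3, 0.3e-3  # assumed typical white-matter values, mm^2/s
    print("control:               ", dti_measures(l1, l2, l3))
    print("simulated group (+10% l1):", dti_measures(1.1 * l1, l2, l3))
    ```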

  10. A User Guide for Smoothing Air Traffic Radar Data

    NASA Technical Reports Server (NTRS)

    Bach, Ralph E.; Paielli, Russell A.

    2014-01-01

    Matlab software was written to provide smoothing of radar tracking data to simulate ADS-B (Automatic Dependent Surveillance-Broadcast) data in order to test a tactical conflict probe. The probe, called TSAFE (Tactical Separation-Assured Flight Environment), is designed to handle air-traffic conflicts left undetected or unresolved when loss-of-separation is predicted to occur within approximately two minutes. The data stream that is down-linked from an aircraft equipped with an ADS-B system would include accurate GPS-derived position and velocity information at sample rates of 1 Hz. Nation-wide ADS-B equipage (mandated by 2020) should improve surveillance accuracy and TSAFE performance. Currently, position data are provided by Center radar (nominal 12-sec samples) and Terminal radar (nominal 4.8-sec samples). Aircraft ground speed and ground track are estimated using real-time filtering, causing lags up to 60 sec, compromising performance of a tactical resolution tool. Offline smoothing of radar data reduces wild-point errors, provides a sample rate as high as 1 Hz, and yields more accurate and lag-free estimates of ground speed, ground track, and climb rate. Until full ADS-B implementation is available, smoothed radar data should provide reasonable track estimates for testing TSAFE in an ADS-B-like environment. An example illustrates the smoothing of radar data and shows a comparison of smoothed-radar and ADS-B tracking. This document is intended to serve as a guide for using the smoothing software.
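
    As an illustration of the offline-smoothing idea (not the authors' Matlab code), the Python sketch below fits a Savitzky-Golay filter to noisy 12-second radar positions and differentiates the fit to obtain lag-free ground-speed estimates; the track, noise level, and filter settings are assumptions.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    rng = np.random.default_rng(3)
    t = np.arange(0, 600, 12.0)                     # 12 s Center-radar samples
    x_true = 230.0 * t                              # m, constant 230 m/s ground speed
    x_meas = x_true + rng.normal(0, 150.0, t.size)  # assumed radar position noise

    # Savitzky-Golay: local polynomial fit; deriv=1 returns the derivative,
    # i.e. a smoothed ground-speed estimate without filter lag.
    speed = savgol_filter(x_meas, window_length=11, polyorder=2,
                          deriv=1, delta=12.0)      # m/s
    print("raw finite-difference speed std:", np.diff(x_meas).std() / 12.0)
    print("smoothed speed mean/std:", speed.mean(), speed.std())
    ```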

  11. Simulation of Black Hole Collisions in Asymptotically anti-de Sitter Spacetimes

    NASA Astrophysics Data System (ADS)

    Bantilan, Hans; Romatschke, Paul

    2015-04-01

    The main purpose of this talk is to describe, in detail, the necessary ingredients for achieving stable Cauchy evolution of black hole collisions in asymptotically anti-de Sitter (AdS) spacetimes. I will begin by motivating this program in terms of the heavy-ion physics it is intended to clarify. I will then give an overview of asymptotically AdS spacetimes, the mapping to the dual conformal field theory on the AdS boundary, and the method we use to numerically solve the fully non-linear Einstein field equations with AdS boundary conditions. As a concrete example of these ideas, I will describe the first proof of principle simulation of stable AdS black hole mergers in 5 dimensions.

  12. ImaSim, a software tool for basic education of medical x-ray imaging in radiotherapy and radiology

    NASA Astrophysics Data System (ADS)

    Landry, Guillaume; deBlois, François; Verhaegen, Frank

    2013-11-01

    Introduction: X-ray imaging is an important part of medicine and plays a crucial role in radiotherapy. Education in this field is mostly limited to textbook teaching due to equipment restrictions. A novel simulation tool, ImaSim, for teaching the fundamentals of the x-ray imaging process based on ray-tracing is presented in this work. ImaSim is used interactively via a graphical user interface (GUI). Materials and methods: The software package covers the main x-ray based medical modalities: planar kilovoltage (kV), planar (portal) megavoltage (MV), fan beam computed tomography (CT) and cone beam CT (CBCT) imaging. The user can modify the photon source, object to be imaged and imaging setup with three-dimensional editors. Objects are currently obtained by combining blocks with variable shapes. The imaging of three-dimensional voxelized geometries is currently not implemented, but can be added in a later release. The program follows a ray-tracing approach, ignoring photon scatter in its current implementation. Simulations of a phantom CT scan were generated in ImaSim and were compared to measured data in terms of CT number accuracy. Spatial variations in the photon fluence and mean energy from an x-ray tube caused by the heel effect were estimated from ImaSim and Monte Carlo simulations and compared. Results: In this paper we describe ImaSim and provide two examples of its capabilities. CT numbers were found to agree within 36 Hounsfield Units (HU) for bone, which corresponds to a 2% attenuation coefficient difference. ImaSim reproduced the heel effect reasonably well when compared to Monte Carlo simulations. Discussion: An x-ray imaging simulation tool is made available for teaching and research purposes. ImaSim provides a means to facilitate the teaching of medical x-ray imaging.
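
    Because ImaSim ignores scatter, each detector ray reduces to Beer-Lambert attenuation accumulated over the materials it traverses. A minimal Python sketch of that idea, with approximate attenuation coefficients (assumed values, roughly appropriate near 60 keV):

    ```python
    import numpy as np

    mu_water, mu_bone = 0.02, 0.05        # 1/mm, approximate values near 60 keV

    def transmitted_fraction(segments):
        """segments: list of (mu in 1/mm, path length in mm) along one ray.
        Returns exp(-sum(mu * L)), the Beer-Lambert transmitted fraction."""
        return np.exp(-sum(mu * L for mu, L in segments))

    ray_through_bone = [(mu_water, 80.0), (mu_bone, 20.0), (mu_water, 100.0)]
    ray_water_only = [(mu_water, 200.0)]
    print("through bone:", transmitted_fraction(ray_through_bone))
    print("water only:  ", transmitted_fraction(ray_water_only))
    ```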

  13. Introduction and a Quick Look at MUESR, the Magnetic Structure and mUon Embedding Site Refinement Suite

    NASA Astrophysics Data System (ADS)

    Bonfà, Pietro; Onuorah, Ifeanyi John; De Renzi, Roberto

    Estimating the magnetic field generated at a given point by magnetic dipoles is an undergraduate exercise. However, under certain approximations, this is all that is needed to evaluate the local field at the muon site once the interstitial position of the muon in the unit cell is known. Developing an application specifically to solve this problem may therefore seem an excessive effort. At the same time, the lack of a general solution leads to the development of small ad hoc codes that are generally rewritten or re-adapted for different experiments and are poorly optimized. This and other motivations led to the development of MuESR, a Python+C tool for performing dipolar field simulations. In this manuscript we describe the tool, its features and its development strategies.
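
    The computation MuESR automates is the point-dipole sum B(r) = (μ0/4π) Σ_i [3 r_i (m_i · r_i)/|r_i|⁵ − m_i/|r_i|³] over the magnetic moments in a supercell. A minimal Python sketch with a made-up lattice constant and moment arrangement:

    ```python
    import numpy as np

    MU0_OVER_4PI = 1e-7                  # T m / A

    def dipolar_field(r_mu, positions, moments):
        """Local field (T) at r_mu from point dipoles. Positions in m,
        moments in A m^2 (1 Bohr magneton = 9.274e-24 A m^2)."""
        B = np.zeros(3)
        for r_i, m_i in zip(positions, moments):
            d = r_mu - r_i
            dn = np.linalg.norm(d)
            B += MU0_OVER_4PI * (3 * d * np.dot(m_i, d) / dn ** 5 - m_i / dn ** 3)
        return B

    mu_B = 9.274e-24
    a = 4e-10                            # assumed lattice constant, m
    positions = [np.array([i, j, k]) * a
                 for i in range(-2, 3) for j in range(-2, 3) for k in range(-2, 3)]
    moments = [np.array([0.0, 0.0, 1.0]) * mu_B for _ in positions]  # assumed FM order
    B = dipolar_field(np.array([0.5, 0.5, 0.5]) * a, positions, moments)
    print("local field at muon site (T):", B)
    ```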

  14. A new method to assess the added value of high-resolution regional climate simulations: application to the EURO-CORDEX dataset

    NASA Astrophysics Data System (ADS)

    Soares, P. M. M.; Cardoso, R. M.

    2017-12-01

    Regional climate models (RCMs) are run at increasingly fine resolutions in pursuit of an improved representation of regional- to local-scale atmospheric phenomena. The EURO-CORDEX simulations at 0.11° and simulations exploiting finer grid spacings approaching the convection-permitting regime are representative examples. The climate runs are computationally very demanding and do not always show improvements; these depend on the region, the variable and the object of study. The gains or losses associated with the use of higher resolution in relation to the forcing model (global climate model or reanalysis), or to different-resolution RCM simulations, are known as added value. Its characterization is a long-standing issue, and many different added-value measures have been proposed. In the current paper, a new method is proposed to assess the added value of finer-resolution simulations in comparison to their forcing data or coarser-resolution counterparts. This approach builds on a probability density function (PDF) matching score, giving a normalised measure of the difference between PDFs at different resolutions, mediated by the observational ones. The distribution added value (DAV) is an objective added-value measure that can be applied to any variable, region or temporal scale, from hindcast or historical (non-synchronous) simulations. The DAV metric and an application to the EURO-CORDEX simulations, for daily temperature and precipitation, are presented here. The EURO-CORDEX simulations at both resolutions (0.44°, 0.11°) display a clear added value in relation to ERA-Interim, with values around 30% in summer and 20% in the intermediate seasons for precipitation. When the two RCM resolutions are compared directly, the added value is limited. The regions with the largest precipitation DAVs are areas where convection is relevant, e.g. the Alps and Iberia. When looking at the extreme precipitation PDF tail, the improvement from the higher resolution is generally greater than that from the lower resolution across seasons and regions. For temperature, the added value is smaller. Acknowledgments: The authors wish to acknowledge the SOLAR (PTDC/GEOMET/7078/2014) and FCT UID/GEO/50019/2013 (Instituto Dom Luiz) projects.
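
    The PDF-matching idea behind the DAV can be sketched as follows: an overlap skill score S = Σ min(p_model, p_obs) over shared bins equals 1 for identical distributions, and DAV = (S_hr − S_lr)/S_lr. The Python illustration below uses synthetic gamma-distributed "precipitation" in place of model output.

    ```python
    import numpy as np

    def pdf_score(model, obs, bins):
        """PDF overlap skill score: 1 means identical binned distributions."""
        pm, _ = np.histogram(model, bins=bins, density=True)
        po, _ = np.histogram(obs, bins=bins, density=True)
        return np.sum(np.minimum(pm, po) * np.diff(bins))

    rng = np.random.default_rng(4)
    obs = rng.gamma(2.0, 4.0, 20_000)    # "observed" daily precipitation (synthetic)
    lr = rng.gamma(1.5, 5.5, 20_000)     # coarse model: PDF further from obs
    hr = rng.gamma(1.9, 4.2, 20_000)     # fine model: PDF closer to obs

    bins = np.linspace(0, 60, 61)
    s_lr, s_hr = pdf_score(lr, obs, bins), pdf_score(hr, obs, bins)
    dav = (s_hr - s_lr) / s_lr
    print(f"S_lr={s_lr:.3f}  S_hr={s_hr:.3f}  DAV={100 * dav:.1f}%")
    ```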

  15. Integrating research tools to support the management of social-ecological systems under climate change

    USGS Publications Warehouse

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  16. Performance in nondairy drinks of probiotic L. casei strains usually employed in dairy products.

    PubMed

    Céspedes, Mario; Cárdenas, Pamela; Staffolani, Martín; Ciappini, María C; Vinderola, Gabriel

    2013-05-01

    The increase in vegetarianism as a dietary habit and the increased allergy episodes against dairy proteins fuel the demand for probiotics in nondairy products. Lactose intolerance and the cholesterol content of dairy products can also be considered two additional reasons why some consumers are looking for probiotics in other foods. We aimed at determining cell viability in nondairy drinks and resistance to simulated gastric digestion of commercial probiotic lactobacilli commonly used in dairy products. Lactobacillus casei LC-01 and L. casei BGP 93 were added to different commercial nondairy drinks, and viability and resistance to simulated gastric digestion (pH 2.5, 90 min, 37 °C) were monitored during storage (5 and 20 °C). For both strains, at least one nondairy drink was found to offer cell counts around 7 log orders until the end of the storage period. Changes in resistance to simulated gastric digestion were observed as well. Commercial probiotic cultures of L. casei can be added to commercial fruit juices after a careful selection of the product that warrants cell viability. The resistance to simulated gastric digestion is an easy-to-apply in vitro tool that may contribute to product characterization and may help in the choice of the food matrix when no changes in cell viability are observed during storage. Sensorial evaluation is mandatory before marketing since the product type and storage conditions might influence the sensorial properties of the product due to the possibility of growth and lactic acid production by probiotic bacteria. © 2013 Institute of Food Technologists®

  17. Modeling and Simulation of an UAS Collision Avoidance Systems

    NASA Technical Reports Server (NTRS)

    Oliveros, Edgardo V.; Murray, A. Jennifer

    2010-01-01

    This paper describes the modeling and simulation of an Unmanned Aircraft Systems (UAS) Collision Avoidance System, capable of representing different types of scenarios for UAS collision avoidance. Commercial and military piloted aircraft currently utilize various systems for collision avoidance such as the Traffic Alert and Collision Avoidance System (TCAS), Automatic Dependent Surveillance-Broadcast (ADS-B), Radar, and Electro-Optical and Infrared Sensors (EO-IR). The integration of information from these systems is done by the pilot in the aircraft to determine the best course of action. In order to operate optimally in the National Airspace System (NAS), UAS have to work in a similar or equivalent manner to a piloted aircraft by applying the principle of "detect-see and avoid" (DSA) to other air traffic. Hence, we have taken these existing sensor technologies into consideration in order to meet the challenge of researching the modeling and simulation of an approximated DSA system. A schematic model for a UAS Collision Avoidance System (CAS) has been developed in a closed-loop block diagram for that purpose. We have found that the most suitable software to carry out this task is the Satellite Tool Kit (STK) from Analytical Graphics Inc. (AGI). We have used the Aircraft Mission Modeler (AMM) for modeling and simulation of a scenario where a UAS is placed on a possible collision path with an initial intruder and then with a second intruder, but is able to avoid them by executing a right turn maneuver and then climbing. Radars have also been modeled with specific characteristics for the UAS and both intruders. The software provides analytical, graphical user interfaces and data-controlling tools which allow the operator to simulate different conditions. Extensive simulations have been carried out which returned excellent results.

  18. Tool use in left brain damage and Alzheimer's disease: What about function and manipulation knowledge?

    PubMed

    Jarry, Christophe; Osiurak, François; Besnard, Jérémy; Baumard, Josselin; Lesourd, Mathieu; Croisile, Bernard; Etcharry-Bouyx, Frédérique; Chauviré, Valérie; Le Gall, Didier

    2016-03-01

    Tool use disorders are usually associated with difficulties in retrieving function and manipulation knowledge. Here, we investigate tool use (Real Tool Use, RTU), function (Functional Association, FA) and manipulation knowledge (Gesture Recognition, GR) in 17 left-brain-damaged (LBD) patients and 14 AD patients (Alzheimer disease). LBD group exhibited predicted deficit on RTU but not on FA and GR while AD patients showed deficits on GR and FA with preserved tool use skills. These findings question the role played by function and manipulation knowledge in actual tool use. © 2016 The British Psychological Society.

  19. Downscaled and debiased climate simulations for North America from 21,000 years ago to 2100AD

    PubMed Central

    Lorenz, David J.; Nieto-Lugilde, Diego; Blois, Jessica L.; Fitzpatrick, Matthew C.; Williams, John W.

    2016-01-01

    Increasingly, ecological modellers are integrating paleodata with future projections to understand climate-driven biodiversity dynamics from the past through the current century. Climate simulations from earth system models are necessary to this effort, but must be debiased and downscaled before they can be used by ecological models. Downscaling methods and observational baselines vary among researchers, which produces confounding biases among downscaled climate simulations. We present unified datasets of debiased and downscaled climate simulations for North America from 21 ka BP to 2100 AD, at 0.5° spatial resolution. Temporal resolution is decadal averages of monthly data until 1950 AD, average climates for 1950–2005 AD, and monthly data from 2010 to 2100 AD, with decadal averages also provided. This downscaling includes two transient paleoclimatic simulations and 12 climate models for the IPCC AR5 (CMIP5) historical (1850–2005), RCP4.5, and RCP8.5 21st-century scenarios. Climate variables include primary variables and derived bioclimatic variables. These datasets provide a common set of climate simulations suitable for seamlessly modelling the effects of past and future climate change on species distributions and diversity. PMID:27377537
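    The record does not spell out the dataset's own downscaling pipeline; as a point of reference, a minimal sketch of the common delta (change-factor) approach to debiasing is shown below, assuming monthly climatologies already interpolated to the target 0.5° grid (all array names and values are hypothetical).

```python
import numpy as np

def debias_delta(sim_hist, sim_target, obs_baseline):
    """Delta-method bias correction: impose the simulated anomaly
    (target minus historical climatology) on an observed baseline.

    sim_hist     -- simulated monthly climatology for the calibration era
    sim_target   -- simulated monthly values for the period of interest
    obs_baseline -- observed climatology on the (finer) target grid
    All arrays are shaped (12, nlat, nlon); sim_* are assumed to be
    already interpolated to the observational grid.
    """
    anomaly = sim_target - sim_hist          # model's own climate-change signal
    return obs_baseline + anomaly            # anomaly added to observed baseline

# Hypothetical example: 0.5-degree grids for one variable
obs = np.random.rand(12, 60, 120) * 10.0     # stand-in for observed climatology
hist = obs + np.random.randn(12, 60, 120)    # biased model climatology
future = hist + 2.0                          # model warms by 2 units everywhere
corrected = debias_delta(hist, future, obs)  # bias removed, signal preserved
```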

  20. Downscaled and debiased climate simulations for North America from 21,000 years ago to 2100 AD.

    PubMed

    Lorenz, David J; Nieto-Lugilde, Diego; Blois, Jessica L; Fitzpatrick, Matthew C; Williams, John W

    2016-07-05

    Increasingly, ecological modellers are integrating paleodata with future projections to understand climate-driven biodiversity dynamics from the past through the current century. Climate simulations from earth system models are necessary to this effort, but must be debiased and downscaled before they can be used by ecological models. Downscaling methods and observational baselines vary among researchers, which produces confounding biases among downscaled climate simulations. We present unified datasets of debiased and downscaled climate simulations for North America from 21 ka BP to 2100 AD, at 0.5° spatial resolution. Temporal resolution is decadal averages of monthly data until 1950 AD, average climates for 1950-2005 AD, and monthly data from 2010 to 2100 AD, with decadal averages also provided. This downscaling includes two transient paleoclimatic simulations and 12 climate models for the IPCC AR5 (CMIP5) historical (1850-2005), RCP4.5, and RCP8.5 21st-century scenarios. Climate variables include primary variables and derived bioclimatic variables. These datasets provide a common set of climate simulations suitable for seamlessly modelling the effects of past and future climate change on species distributions and diversity.

  1. Characteristics of Hands-On Simulations with Added Value for Innovative Secondary and Higher Vocational Education

    ERIC Educational Resources Information Center

    Khaled, Anne; Gulikers, Judith; Biemans, Harm; van der Wel, Marjan; Mulder, Martin

    2014-01-01

    The intentions with which hands-on simulations are used in vocational education are not always clear. Also, pedagogical-didactic approaches in hands-on simulations are not well conceptualised from a learning theory perspective. This makes it difficult to pinpoint the added value that hands-on simulations can have in an innovative vocational…

  2. Time Domain Stability Margin Assessment Method

    NASA Technical Reports Server (NTRS)

    Clements, Keith

    2017-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.
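    The record describes the search procedure only in words; a minimal sketch of the idea follows, with a toy unstable plant, actuator saturation, and a fixed loop delay standing in for the real launch vehicle dynamics (all constants are illustrative, not SLS values).

```python
import numpy as np

def diverges(gain_scale, delay=0.05, t_end=20.0, dt=0.001, limit=10.0):
    """Integrate a toy closed loop (unstable plant, PD control, actuator
    saturation, transport delay) and report whether the response diverges."""
    n_delay = int(delay / dt)
    u_buf = [0.0] * n_delay                  # transport delay in the loop
    x = np.array([0.01, 0.0])                # [attitude error, rate]
    for _ in range(int(t_end / dt)):
        u_cmd = -gain_scale * (2.0 * x[0] + 0.8 * x[1])
        u_buf.append(float(np.clip(u_cmd, -1.0, 1.0)))   # saturated actuator
        u = u_buf.pop(0)                     # delayed command reaches the plant
        x = x + dt * np.array([x[1], 0.5 * x[0] + u])    # forward Euler step
        if abs(x[0]) > limit:
            return True
    return False

def time_domain_gain_margin(lo=1.0, hi=256.0):
    """Bisect on a loop-gain multiplier until the response just diverges;
    assumes the loop is stable at `lo` and unstable at `hi`."""
    while hi / lo > 1.01:
        mid = (lo * hi) ** 0.5
        if diverges(mid):
            hi = mid
        else:
            lo = mid
    return 20.0 * np.log10(lo)               # margin in dB relative to nominal

print(f"time-domain gain margin ~ {time_domain_gain_margin():.1f} dB")
```

    The same bisection can be run on the injected time delay instead of the gain, which is the other knob the abstract mentions.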

  3. Time-Domain Stability Margin Assessment

    NASA Technical Reports Server (NTRS)

    Clements, Keith

    2016-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.

  4. Gas network model allows full reservoir coupling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Methnani, M.M.

    The gas-network flow model (Gasnet), developed for and added to an existing Qatar General Petroleum Corp. (QGPC) in-house reservoir simulator, allows improved modeling of the interaction among the reservoir, wells, and pipeline networks. Gasnet is a three-phase model that is modified to handle gas-condensate systems. The numerical solution is based on a control-volume scheme that uses the concept of cells and junctions, whereby pressure and phase densities are defined in cells, while phase flows are defined at junction links. The model features common numerical equations for the reservoir, well, and pipeline components and an efficient state-variable solution method in which all primary variables, including phase flows, are solved directly. Both steady-state and transient flow events can be simulated with the same tool. Three test cases show how the model runs. One case simulates flow redistribution in a simple two-branch gas network. The second simulates a horizontal gas well in a waterflooded gas reservoir. The third involves an export gas pipeline coupled to a producing reservoir.

  5. Surface vacancy concentration of CeO2(1 1 1) using kinetic Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Mattiello, S.; Kolling, S.; Heiliger, C.

    2016-01-01

    Kinetic Monte Carlo (kMC) simulations are useful tools for investigating the dynamics of surface properties. Within this method we investigate the oxygen vacancy concentration of CeO2(1 1 1) under ultra-high-vacuum (UHV) conditions. To obtain first-principles input for the simulations, i.e., the energy barriers for the microscopic processes, we use density functional theory (DFT) results from the literature. We account for the possibility of adsorption and desorption of oxygen on ceria as well as the diffusion of oxygen vacancies to and from the subsurface. In particular, we focus on the vacancy surface concentration as well as on the ratio of the number of subsurface vacancies to the number of vacancies at the surface. Comparing our dynamically obtained results to the experimental findings raises several issues. In conclusion, we find a substantial incompatibility between the experimental results and the dynamical calculation using DFT inputs.
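    A minimal rejection-free kMC loop of this general kind, tracking only surface and subsurface vacancy counts, is sketched below; the barriers, prefactor, and lattice size are illustrative stand-ins, not the DFT values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
kB = 8.617e-5                        # Boltzmann constant, eV/K

def rate(barrier_eV, T=300.0, prefactor=1e13):
    """Arrhenius rate; the barriers below are illustrative, not DFT values."""
    return prefactor * np.exp(-barrier_eV / (kB * T))

# State: [n_surface_vacancies, n_subsurface_vacancies] on n_sites surface sites.
n_sites = 10_000
state = np.array([50, 50])
events = [
    (np.array([-1, +1]), lambda s: s[0] * rate(0.6)),             # surf -> sub hop
    (np.array([+1, -1]), lambda s: s[1] * rate(0.8)),             # sub -> surf hop
    (np.array([-1, 0]),  lambda s: s[0] * rate(1.0)),             # O adsorption fills a vacancy
    (np.array([+1, 0]),  lambda s: (n_sites - s[0]) * rate(2.0)), # O desorption creates one
]

t, t_end = 0.0, 0.01                 # seconds of simulated time
while t < t_end:
    rates = np.array([r(state) for _, r in events])
    total = rates.sum()
    t += rng.exponential(1.0 / total)                 # kMC time advance
    choice = rng.choice(len(events), p=rates / total) # pick event by propensity
    state = state + events[choice][0]

print("surface vacancies:", state[0],
      " subsurface/surface ratio:", state[1] / max(state[0], 1))
```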

  6. A neural-network-based model for the dynamic simulation of the tire/suspension system while traversing road irregularities.

    PubMed

    Guarneri, Paolo; Rocca, Gianpiero; Gobbi, Massimiliano

    2008-09-01

    This paper deals with the simulation of the tire/suspension dynamics by using recurrent neural networks (RNNs). RNNs are derived from multilayer feedforward neural networks by adding feedback connections between the output and input layers. The optimal network architecture derives from a parametric analysis based on the optimal tradeoff between network accuracy and size. The neural network can be trained with experimental data obtained in the laboratory from simulated road profiles (cleats). The results obtained from the neural network demonstrate good agreement with the experimental results over a wide range of operating conditions. The NN model can be effectively applied as part of a vehicle system model to accurately predict elastic bushing and tire dynamic behavior. Although the neural network model, as a black-box model, does not provide good insight into the physical behavior of the tire/suspension system, it is a useful tool for assessing vehicle ride and noise, vibration, and harshness (NVH) performance due to its good computational efficiency and accuracy.
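    The structure described (a feedforward net with output-to-input feedback, i.e. a NARX-style recurrence) can be pictured with the minimal untrained sketch below; the layer sizes, random weights, and cleat-like input are invented for illustration.

```python
import numpy as np

class NarxRNN:
    """Feedforward net with output-to-input feedback (NARX form): the
    suspension force at step k is predicted from recent road inputs and
    the network's own recent outputs. Sizes and weights are illustrative."""
    def __init__(self, n_in=3, n_fb=2, n_hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.3, (n_hidden, n_in + n_fb))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.3, n_hidden)
        self.n_in, self.n_fb = n_in, n_fb

    def simulate(self, road):
        """Closed-loop simulation over a road profile (e.g. a cleat)."""
        u_hist = np.zeros(self.n_in)       # delayed road inputs
        y_hist = np.zeros(self.n_fb)       # fed-back past outputs
        out = []
        for u in road:
            u_hist = np.roll(u_hist, 1); u_hist[0] = u
            x = np.concatenate([u_hist, y_hist])
            y = self.W2 @ np.tanh(self.W1 @ x + self.b1)   # one hidden layer
            y_hist = np.roll(y_hist, 1); y_hist[0] = y
            out.append(y)
        return np.array(out)

cleat = np.exp(-0.5 * ((np.arange(200) - 50) / 5.0) ** 2)  # bump-like input
response = NarxRNN().simulate(cleat)                        # untrained demo run
```

    In practice the weights would be fitted to the measured cleat responses; the sketch only shows how the feedback connections enter the input vector.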

  7. Acceptability of Flight Deck-Based Interval Management Crew Procedures

    NASA Technical Reports Server (NTRS)

    Murdock, Jennifer L.; Wilson, Sara R.; Hubbs, Clay E.; Smail, James W.

    2013-01-01

    The Interval Management for Near-term Operations Validation of Acceptability (IM-NOVA) experiment was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) in support of the NASA Next Generation Air Transportation System (NextGen) Airspace Systems Program's Air Traffic Management Technology Demonstration - 1 (ATD-1). ATD-1 is intended to showcase an integrated set of technologies that provide an efficient arrival solution for managing aircraft using NextGen surveillance, navigation, procedures, and automation for both airborne and ground-based systems. The goal of the IM-NOVA experiment was to assess whether procedures outlined by the ATD-1 Concept of Operations, when used with a minimum set of Flight deck-based Interval Management (FIM) equipment and a prototype crew interface, were acceptable to and feasible for use by flight crews in a voice communications environment. To investigate an integrated arrival solution using ground-based air traffic control tools and aircraft automatic dependent surveillance broadcast (ADS-B) tools, the LaRC FIM system and the Traffic Management Advisor with Terminal Metering and Controller Managed Spacing tools developed at the NASA Ames Research Center (ARC) were integrated in LaRC's Air Traffic Operations Laboratory. Data were collected from 10 crews of current, qualified 757/767 pilots asked to fly a high-fidelity, fixed-base simulator during scenarios conducted within an airspace environment modeled on the Dallas-Fort Worth (DFW) Terminal Radar Approach Control area. The aircraft simulator was equipped with the Airborne Spacing for Terminal Area Routes algorithm and a FIM crew interface consisting of electronic flight bags and ADS-B guidance displays. Researchers used "pseudo-pilot" stations to control 24 simulated aircraft that provided multiple air traffic flows into DFW, and recently retired DFW air traffic controllers served as confederate Center, Feeder, Final, and Tower controllers. Pilot participant feedback indicated that the procedures used by flight crews to receive and execute interval management (IM) clearances in a voice communications environment were logical, easy to follow, did not contain any missing or extraneous steps, and required an acceptable level of workload. The majority of the pilot participants found the IM concept, in addition to the proposed FIM crew procedures, to be acceptable and indicated that the ATD-1 procedures can be successfully executed in a near-term NextGen environment.

  8. Advances in Alzheimer's Diagnosis and Therapy: The Implications of Nanotechnology.

    PubMed

    Hajipour, Mohammad Javad; Santoso, Michelle R; Rezaee, Farhad; Aghaverdi, Haniyeh; Mahmoudi, Morteza; Perry, George

    2017-10-01

    Alzheimer's disease (AD) is a type of dementia that causes major issues for patients' memory, thinking, and behavior. Despite efforts to advance AD diagnostic and therapeutic tools, AD remains incurable due to its complex and multifactorial nature and lack of effective diagnostics/therapeutics. Nanoparticles (NPs) have demonstrated the potential to overcome the challenges and limitations associated with traditional diagnostics/therapeutics. Nanotechnology is now offering new tools and insights to advance our understanding of AD and eventually may offer new hope to AD patients. Here, we review the key roles of nanotechnologies in the recent literature, in both diagnostic and therapeutic aspects of AD, and discuss how these achievements may improve patient prognosis and quality of life. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Evaluating driving performance of outpatients with Alzheimer disease.

    PubMed

    Cox, D J; Quillian, W C; Thorndike, F P; Kovatchev, B P; Hanna, G

    1998-01-01

    Alzheimer disease (AD) is a progressive disease, with multiple physiologic, psychologic, and social implications. A critical issue in its management is when to recommend restrictions on autonomous functioning, such as driving an automobile. This study evaluates driving performance of patients with AD and its relation to patient scores on the Mini-Mental State Exam (MMSE). This study compared 29 outpatients with probable AD with 21 age-matched control participants on an interactive driving simulator to determine how the two groups differed and how such differences related to mental status. Patients with AD (1) were less likely to comprehend and operate the simulator cognitively, (2) drove off the road more often, (3) spent more time driving considerably slower than the posted speed limit, (4) spent less time driving faster than the speed limit, (5) applied less brake pressure in stop zones, (6) spent more time negotiating left turns, and (7) drove more poorly overall. There were no observed differences between AD patients and the control group in terms of crossing the midline and driving speed variability. Among the AD patients, those who could not drive the simulator because of confusion and disorientation (n = 10) had lower MMSE scores and drove fewer miles annually. Those AD patients who had stopped driving also scored lower on their MMSE but did not perform more poorly on the driving simulator. Factor analysis revealed five driving factors associated with AD, explaining 93 percent of the variance. These five factors correctly classified 27 (85 percent) of 32 AD patients compared with the control group. Of the 15 percent who were improperly classified, there were three false positives (control participants misclassified as AD patients) and two false negatives (AD patients misclassified as control participants). The computed total driving score correlated significantly with MMSE scores (r = -.403, P = 0.011). Driving simulators can provide an objective means of assessing driving safety.

  10. Eugene – A Domain Specific Language for Specifying and Constraining Synthetic Biological Parts, Devices, and Systems

    PubMed Central

    Bilitchenko, Lesia; Liu, Adam; Cheung, Sherine; Weeding, Emma; Xia, Bing; Leguia, Mariana; Anderson, J. Christopher; Densmore, Douglas

    2011-01-01

    Background: Synthetic biological systems are currently created by an ad-hoc, iterative process of specification, design, and assembly. These systems would greatly benefit from a more formalized and rigorous specification of the desired system components as well as constraints on their composition. Therefore, the creation of robust and efficient design flows and tools is imperative. We present a human readable language (Eugene) that allows for the specification of synthetic biological designs based on biological parts, as well as provides a very expressive constraint system to drive the automatic creation of composite Parts (Devices) from a collection of individual Parts. Results: We illustrate Eugene's capabilities in three different areas: Device specification, design space exploration, and assembly and simulation integration. These results highlight Eugene's ability to create combinatorial design spaces and prune these spaces for simulation or physical assembly. Eugene creates functional designs quickly and cost-effectively. Conclusions: Eugene is intended for forward engineering of DNA-based devices, and through its data types and execution semantics, reflects the desired abstraction hierarchy in synthetic biology. Eugene provides a powerful constraint system which can be used to drive the creation of new devices at runtime. It accomplishes all of this while being part of a larger tool chain which includes support for design, simulation, and physical device assembly. PMID:21559524

  11. Free Chlorine and Cyanuric Acid Simulator Application ...

    EPA Pesticide Factsheets

    A web-based application designed to simulate free chlorine in systems adding free chlorine and cyanuric acid, including the application of Dichlor and Trichlor.

  12. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous, operationally robust, layered, and cost-effective Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field-of-regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC remedies, improves, or augments these deep-space knowledge deficiencies relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated analysis. Options will be explored using at least two competing simulation capabilities, but emphasis will be placed on reasoned analyses as supported by the simulations.

  13. Using Computational Modeling to Assess the Impact of Clinical Decision Support on Cancer Screening within Community Health Centers

    PubMed Central

    Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.

    2014-01-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique, using both statistical and computational modeling, to evaluate impact. Our statistical model used Spearman’s Rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We used previously collected survey data from the community health centers' Health Disparities Cancer Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241

  14. The TRIDEC Virtual Tsunami Atlas - customized value-added simulation data products for Tsunami Early Warning generated on compute clusters

    NASA Astrophysics Data System (ADS)

    Löwe, P.; Hammitzsch, M.; Babeyko, A.; Wächter, J.

    2012-04-01

    The development of new Tsunami Early Warning Systems (TEWS) requires the modelling of the spatio-temporal spreading of tsunami waves, both recorded from past events and hypothetical future cases. The model results are maintained in digital repositories for use in TEWS command and control units for situation assessment once a real tsunami occurs. Thus the simulation results must be absolutely trustworthy, in the sense that the quality of these datasets is assured. This is a prerequisite, as solid decision making during a crisis event and the dissemination of dependable warning messages to communities at risk will be based on them. This requires data format validity, but even more the integrity and information value of the content, which is a value-added product derived from raw tsunami model output. Quality checking of simulation result products can be done in multiple ways, yet the visual verification of both the temporal and spatial spreading characteristics of each simulation remains important. The eye of the human observer still remains an unmatched tool for the detection of irregularities. This requires the availability of convenient, human-accessible mappings of each simulation. The improvement of tsunami models necessitates changes in many variables, including simulation end-parameters. Whenever new improved iterations of the general models or underlying spatial data are evaluated, hundreds to thousands of tsunami model results must be generated for each model iteration, each one having distinct initial parameter settings. The use of a Compute Cluster Environment (CCE) of sufficient size allows the automated generation of all tsunami results within model iterations in little time. This is a significant improvement over linear processing on dedicated desktop machines or servers. It allows for accelerated and improved visual quality-checking iterations, which in turn feed back positively into the overall model improvement. An approach to set up and utilize the CCE has been implemented by the project Collaborative, Complex, and Critical Decision Processes in Evolving Crises (TRIDEC), funded under the European Union's FP7. TRIDEC focuses on real-time intelligent information management in Earth management. The addressed challenges include the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources with accelerated generation of large volumes of data. These include sensor systems, geo-information repositories, simulations and data fusion tools. Additionally, TRIDEC adopts enhancements of Service Oriented Architecture (SOA) principles in terms of Event Driven Architecture (EDA) design. As a next step, the implemented CCE's services to generate derived and customized simulation products are foreseen to be provided via an EDA service for on-demand processing for specific threat parameters and to accommodate model improvements.

  15. Evaluation and development of the routing protocol for a fully functional simulation environment for VANETs

    NASA Astrophysics Data System (ADS)

    Ali, Azhar Tareq; Warip, Mohd Nazri Mohd; Yaakob, Naimah; Abduljabbar, Waleed Khalid; Atta, Abdu Mohammed Ali

    2017-11-01

    Vehicular Ad hoc Networks (VANETs) are an area of wireless technology that is attracting a great deal of interest. Several areas of VANETs, such as security, routing protocols, and medium access control, still lack large amounts of research. There is also a lack of freely available simulators that can quickly and accurately simulate VANETs. The main goal of this paper is to develop a freely available VANET simulator and to evaluate popular mobile ad hoc network routing protocols in several VANET scenarios. The VANET simulator consists of a network simulator and a traffic (mobility) simulator, with a client-server application keeping the two simulators in sync. The simulator also models buildings to create a more realistic wireless network environment. Ad hoc On-demand Distance Vector routing (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) routing were initially simulated in city, country, and highway environments to provide an overall evaluation.

  16. Evaluation of assigned-value uncertainty for complex calibrator value assignment processes: a prealbumin example.

    PubMed

    Middleton, John; Vaks, Jeffrey E

    2007-04-01

    Errors in calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization for Standardization guidelines provide simple equations for estimating calibrator uncertainty with simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study the uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed the process to be optimized while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows calibrator uncertainty to be estimated for the optimization of various value-assignment processes, with a reduced number of measurements and lower reagent costs, while satisfying the requirements on uncertainty. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
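    As an illustration of the general technique (not the paper's actual transfer chain), a Monte Carlo sketch of a two-step value transfer might look as follows; the 3.7% reference CV is quoted from the abstract, while the per-step transfer CVs are invented so that the added component comes out below the reported 0.8%.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Hypothetical value-transfer chain: reference material -> master calibrator
# -> product calibrator, each step adding multiplicative measurement noise.
ref_value = 100.0
ref = rng.normal(ref_value, 0.037 * ref_value, N)   # CRM470-like reference, 3.7% CV
master = ref * rng.normal(1.0, 0.004, N)            # transfer step 1 (0.4% CV, assumed)
product = master * rng.normal(1.0, 0.006, N)        # transfer step 2 (0.6% CV, assumed)

added_cv = np.std(product / ref) * 100              # uncertainty added by transfer only
total_cv = np.std(product) / np.mean(product) * 100
print(f"added by value transfer: {added_cv:.2f}% CV (vs 3.7% in the reference)")
print(f"total assigned-value uncertainty: {total_cv:.2f}% CV")
```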

  17. Aggregation, impaired degradation and immunization targeting of amyloid-beta dimers in Alzheimer’s disease: a stochastic modelling approach

    PubMed Central

    2012-01-01

    Background: Alzheimer’s disease (AD) is the most frequently diagnosed neurodegenerative disorder affecting humans, with advanced age being the most prominent risk factor for developing AD. Despite intense research efforts aimed at elucidating the precise molecular underpinnings of AD, a definitive answer is still lacking. In recent years, consensus has grown that dimerisation of the polypeptide amyloid-beta (Aβ), particularly Aβ42, plays a crucial role in the neuropathology that characterises AD-affected post-mortem brains, including the large-scale accumulation of fibrils, also referred to as senile plaques. This has led to the realistic hope that targeting Aβ42 immunotherapeutically could drastically reduce plaque burden in the ageing brain, thus delaying AD onset or symptom progression. Stochastic modelling is a useful tool for increasing understanding of the processes underlying disorders that affect complex systems, such as AD, providing a rapid and inexpensive strategy for testing putative new therapies. In light of the tool’s utility, we developed computer simulation models to examine Aβ42 turnover and its aggregation in detail and to test the effect of immunisation against Aβ dimers. Results: Our model demonstrates for the first time that even a slight decrease in the clearance rate of Aβ42 monomers is sufficient to increase the chance of dimers forming, which could act as instigators of protofibril and fibril formation, resulting in increased plaque levels. As the process is slow and levels of Aβ are normally low, stochastic effects are important. Our model predicts that reducing the rate of dimerisation leads to a significant reduction in plaque levels and delays the onset of plaque formation. The model was used to test the effect of an antibody-mediated immunological response. Our results showed that plaque levels were reduced compared to conditions where antibodies are not present. Conclusion: Our model supports the current thinking that levels of dimers are important in initiating the aggregation process. Although substantial knowledge exists regarding the process, no therapeutic intervention is on offer that reliably decreases disease burden in AD patients. Computer modelling could serve as one of a number of tools to examine both the validity of reliable biomarkers and aid the discovery of successful intervention strategies. PMID:22748062
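    A minimal Gillespie-style sketch of this model class (monomer production and clearance, dimerisation, optional antibody-mediated dimer removal) is shown below; all rate constants are invented for illustration and are not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(2)

def ssa(k_prod=50.0, k_clear=1.0, k_dim=1e-4, k_ab=0.0, t_end=200.0):
    """Gillespie simulation of Abeta42 monomers (M) and dimers (D):
    production, first-order clearance, dimerisation 2M -> D, and
    antibody-mediated dimer removal at rate k_ab (0 = no immunisation)."""
    M, D, t = 0, 0, 0.0
    while t < t_end:
        a = np.array([k_prod,                    # 0: produce a monomer
                      k_clear * M,               # 1: clear a monomer
                      k_dim * M * (M - 1) / 2,   # 2: two monomers form a dimer
                      k_ab * D])                 # 3: antibody removes a dimer
        a0 = a.sum()
        t += rng.exponential(1.0 / a0)           # time to next reaction
        r = rng.choice(4, p=a / a0)              # pick reaction by propensity
        if r == 0: M += 1
        elif r == 1: M -= 1
        elif r == 2: M -= 2; D += 1
        else: D -= 1
    return M, D

print("no immunisation (M, D):", ssa())
print("anti-dimer antibody (M, D):", ssa(k_ab=0.5))
```

    Because dimerisation is second order in M, a small drop in the clearance rate k_clear raises the steady-state monomer level and disproportionately increases dimer formation, which is the qualitative effect the abstract reports.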

  18. BilKristal 2.0: A tool for pattern information extraction from crystal structures

    NASA Astrophysics Data System (ADS)

    Okuyan, Erhan; Güdükbay, Uğur

    2014-01-01

    We present a revised version of the BilKristal tool of Okuyan et al. (2007). We converted the development environment into Microsoft Visual Studio 2005 in order to resolve compatibility issues. We added multi-core CPU support and improvements are made to graphics functions in order to improve performance. Discovered bugs are fixed and exporting functionality to a material visualization tool is added.

  19. Influence of the Size of Cohorts in Adaptive Design for Nonlinear Mixed Effects Models: An Evaluation by Simulation for a Pharmacokinetic and Pharmacodynamic Model for a Biomarker in Oncology

    PubMed Central

    Lestini, Giulia; Dumont, Cyrielle; Mentré, France

    2015-01-01

    Purpose: In this study we aimed to evaluate adaptive designs (ADs) by clinical trial simulation for a pharmacokinetic-pharmacodynamic model in oncology and to compare them with one-stage designs, i.e. when no adaptation is performed, using wrong prior parameters. Methods: We evaluated two one-stage designs, ξ0 and ξ*, optimised for prior and true population parameters, Ψ0 and Ψ*, and several ADs (two-, three- and five-stage). All designs had 50 patients. For ADs, the first cohort design was ξ0. The next cohort design was optimised using prior information updated from the previous cohort. Optimal design was based on the determinant of the Fisher information matrix using PFIM. Design evaluation was performed by clinical trial simulations using data simulated from Ψ*. Results: Estimation results of two-stage ADs and ξ* were close and much better than those obtained with ξ0. The balanced two-stage AD performed better than two-stage ADs with different cohort sizes. Three- and five-stage ADs were better than two-stage designs with a small first cohort, but not better than the balanced two-stage design. Conclusions: Two-stage ADs are useful when prior parameters are unreliable. In the case of a small first cohort, more adaptations are needed, but these designs are complex to implement. PMID:26123680

  20. Influence of the Size of Cohorts in Adaptive Design for Nonlinear Mixed Effects Models: An Evaluation by Simulation for a Pharmacokinetic and Pharmacodynamic Model for a Biomarker in Oncology.

    PubMed

    Lestini, Giulia; Dumont, Cyrielle; Mentré, France

    2015-10-01

    In this study we aimed to evaluate adaptive designs (ADs) by clinical trial simulation for a pharmacokinetic-pharmacodynamic model in oncology and to compare them with one-stage designs, i.e., when no adaptation is performed, using wrong prior parameters. We evaluated two one-stage designs, ξ0 and ξ*, optimised for prior and true population parameters, Ψ0 and Ψ*, and several ADs (two-, three- and five-stage). All designs had 50 patients. For ADs, the first cohort design was ξ0. The next cohort design was optimised using prior information updated from the previous cohort. Optimal design was based on the determinant of the Fisher information matrix using PFIM. Design evaluation was performed by clinical trial simulations using data simulated from Ψ*. Estimation results of two-stage ADs and ξ* were close and much better than those obtained with ξ0. The balanced two-stage AD performed better than two-stage ADs with different cohort sizes. Three- and five-stage ADs were better than two-stage designs with a small first cohort, but not better than the balanced two-stage design. Two-stage ADs are useful when prior parameters are unreliable. In the case of a small first cohort, more adaptations are needed, but these designs are complex to implement.
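    The adaptation step can be pictured with a deliberately simplified sketch: a fixed-effects Emax model stands in for the full mixed-effects PK/PD model, and det(FIM) is maximised over a dose grid (PFIM handles the real mixed-effects FIM; every number below is invented).

```python
import numpy as np
from itertools import combinations

def fim(doses, emax, ed50, sigma=1.0):
    """Fisher information for y = Emax*d/(ED50+d) + noise, fixed effects only
    (a stand-in for the mixed-effects FIM that PFIM computes)."""
    F = np.zeros((2, 2))
    for d in doses:
        g = np.array([d / (ed50 + d),              # dy/dEmax
                      -emax * d / (ed50 + d) ** 2])  # dy/dED50
        F += np.outer(g, g) / sigma ** 2
    return F

def d_optimal(emax, ed50, grid, k=2):
    """Pick the k-dose design on the grid maximising det(FIM)."""
    return max(combinations(grid, k),
               key=lambda ds: np.linalg.det(fim(ds, emax, ed50)))

# Two-stage AD skeleton: stage 1 designed on wrong priors, stage 2 on the update.
grid = np.linspace(5, 500, 25)
prior, truth = (80.0, 150.0), (100.0, 60.0)    # (Emax, ED50); illustrative
stage1 = d_optimal(*prior, grid)
# ... fit the stage-1 data here; we pretend the interim fit landed near truth
interim = (95.0, 70.0)
stage2 = d_optimal(*interim, grid)
print("stage-1 doses (prior-based):", stage1)
print("stage-2 doses (updated):   ", stage2)
```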

  1. WebbPSF: Updated PSF Models Based on JWST Ground Testing Results

    NASA Astrophysics Data System (ADS)

    Osborne, Shannon; Perrin, Marshall D.; Melendez Hernandez, Marcio

    2018-06-01

    WebbPSF is a widely used package that allows astronomers to create simulated point spread functions (PSFs) for the James Webb Space Telescope (JWST). WebbPSF provides the user with the flexibility to produce PSFs for direct imaging and coronagraphic modes, for a range of filters and masks, and across all the JWST instruments. These PSFs can then be analyzed with built-in evaluation tools or can be output for use with users’ own tools. In the most recent round of updates, the accuracy of the PSFs has been improved with updated analyses of the instrument test data from NASA Goddard and with new data from the testing of the combined Optical Telescope Element and Integrated Science Instrument Module (OTIS) at NASA Johnson. A post-processing function applying detector effects and pupil distortions to input PSFs has also been added to the WebbPSF package.
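    For orientation, typical use of the package looks like the sketch below (the filter choice and output name are arbitrary; details follow the released webbpsf documentation, so treat them as indicative rather than authoritative).

```python
import webbpsf

# Simulate a NIRCam PSF in one filter and write it to a FITS file.
nc = webbpsf.NIRCam()
nc.filter = "F200W"                               # arbitrary example filter
psf = nc.calc_psf(fov_arcsec=4.0, oversample=4)   # returns a FITS HDUList
psf.writeto("nircam_f200w_psf.fits", overwrite=True)
```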

  2. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material by the surrounding high-enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical, since thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  3. Six-degree-of-freedom missile simulation using the ADI AD 100 digital computer and ADSIM simulation language

    NASA Technical Reports Server (NTRS)

    Zwaanenburg, Koos

    1989-01-01

    The use of an AD 100 computer and the ADSIM language in the six-degree-of-freedom digital simulation of an air-to-ground missile is illustrated. The missile is launched from a moving platform, typically a helicopter, and is capable of striking a mobile target up to 10 kilometers away. The missile could be any tactical missile. The performance numbers of the AD 100 show that it is possible to implement a high performance missile model in a real-time simulation without the problems associated with an implementation on a general purpose computer using FORTRAN.

  4. Super-Resolution Microscopy of Cerebrospinal Fluid Biomarkers as a Tool for Alzheimer's Disease Diagnostics.

    PubMed

    Zhang, William I; Antonios, Gregory; Rabano, Alberto; Bayer, Thomas A; Schneider, Anja; Rizzoli, Silvio O

    2015-01-01

    Alzheimer's disease (AD) is neuropathologically characterized by aggregates of amyloid-β peptides (Aβ) and tau proteins. The consensus in the AD field is that Aβ and tau should serve as diagnostic biomarkers for AD. However, their aggregates have been difficult to investigate by conventional fluorescence microscopy, since their size is below the diffraction limit (∼200 nm). To solve this, we turned to a super-resolution imaging technique, stimulated emission depletion (STED) microscopy, which has a high enough precision to allow the discrimination of low- and high-molecular-weight aggregates prepared in vitro. We used STED to analyze the structural organization of Aβ and tau in cerebrospinal fluid (CSF) from 36 AD patients, 11 patients with mild cognitive impairment (MCI), and 21 controls. We measured the numbers of aggregates in the CSF samples, and the aggregate sizes and intensities. These parameters enabled us to distinguish AD patients from controls with a specificity of ∼87% and a sensitivity of ∼79%. In addition, the aggregate parameters determined with STED microscopy correlated with the severity of cognitive impairment in AD patients. Finally, these parameters may be useful as predictive tools for MCI cases. The STED parameters of two MCI patients who developed AD during the course of the study, as well as of MCI patients whose Aβ ELISA values fall within the accepted range for AD, placed them close to the AD averages. We suggest that super-resolution imaging is a promising tool for AD diagnostics.

  5. Direct Simulation Monte Carlo Simulations of Low Pressure Semiconductor Plasma Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gochberg, L. A.; Ozawa, T.; Deng, H.

    2008-12-31

    The two widely used plasma deposition tools for semiconductor processing are Ionized Metal Physical Vapor Deposition (IMPVD) of metals using either planar or hollow cathode magnetrons (HCM), and inductively coupled plasma (ICP) deposition of dielectrics in High Density Plasma Chemical Vapor Deposition (HDP-CVD) reactors. In these systems, the injected neutral gas flows are generally in the transonic to supersonic flow regime. The Hybrid Plasma Equipment Model (HPEM) has been developed and is strategically and beneficially applied to the design of these tools and their processes. For the most part, the model uses continuum-based techniques, and thus, as pressures decrease below 10 mTorr, the continuum approaches in the model become questionable. Modifications have previously been made to the HPEM to significantly improve its accuracy in this pressure regime. In particular, the Ion Monte Carlo Simulation (IMCS) was added, wherein a Monte Carlo simulation is used to obtain ion and neutral velocity distributions in much the same way as in direct simulation Monte Carlo (DSMC). As a further refinement, this work presents the first steps towards the adaptation of full DSMC calculations to replace part of the flow module within the HPEM. Six species (Ar, Cu, Ar*, Cu*, Ar+, and Cu+) are modeled in the DSMC. To couple SMILE as a module to the HPEM, source functions for species, momentum, and energy from plasma sources will be provided by the HPEM. The DSMC module will then compute a quasi-converged flow field that will provide neutral and ion species densities, momenta, and temperatures. In this work, the HPEM results for a hollow cathode magnetron (HCM) IMPVD process using the Boltzmann distribution are compared with DSMC results using portions of those HPEM computations as an initial condition.

  6. Parallel Harmony Search Based Distributed Energy Resource Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceylan, Oguzhan; Liu, Guodong; Tomsovic, Kevin

    2015-01-01

    This paper presents a harmony search based parallel optimization algorithm to minimize voltage deviations in three-phase unbalanced electrical distribution systems and to maximize the active power outputs of distributed energy resources (DR). The main contribution is to reduce the adverse impacts on the voltage profile as photovoltaic (PV) output or electric vehicle (EV) charging changes throughout the day. The IEEE 123-bus distribution test system is modified by adding DRs and EVs under different load profiles. The simulation results show that, by using parallel computing techniques, heuristic methods may be used as an alternative optimization tool in electrical power distribution system operation.
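    The paper's parallel variant is not reproduced here; the serial core of harmony search, applied to a stand-in voltage-deviation objective, can be sketched as follows (all parameters and the toy objective are illustrative).

```python
import numpy as np

rng = np.random.default_rng(4)

def harmony_search(f, lb, ub, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    """Minimal harmony search: memory consideration, pitch adjustment, and
    random improvisation. f is the objective to minimise."""
    dim = len(lb)
    hm = rng.uniform(lb, ub, (hms, dim))          # harmony memory
    cost = np.array([f(h) for h in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:               # take a note from memory...
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:            # ...and maybe adjust its pitch
                    new[j] += bw * (ub[j] - lb[j]) * rng.uniform(-1, 1)
            else:                                 # ...or improvise randomly
                new[j] = rng.uniform(lb[j], ub[j])
        new = np.clip(new, lb, ub)
        c_new, worst = f(new), np.argmax(cost)
        if c_new < cost[worst]:                   # replace the worst harmony
            hm[worst], cost[worst] = new, c_new
    best = np.argmin(cost)
    return hm[best], cost[best]

# Toy stand-in objective: squared deviation of six node voltages from 1.0 p.u.
deviation = lambda v: float(np.sum((v - 1.0) ** 2))
x, c = harmony_search(deviation, np.full(6, 0.9), np.full(6, 1.1))
print(x.round(4), f"deviation={c:.2e}")
```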

  7. A comprehensive surface-groundwater flow model

    NASA Astrophysics Data System (ADS)

    Arnold, Jeffrey G.; Allen, Peter M.; Bernhardt, Gilbert

    1993-02-01

    In this study, a simple groundwater flow and height model was added to an existing basin-scale surface water model. The linked model is: (1) watershed scale, allowing the basin to be subdivided; (2) designed to accept readily available inputs to allow general use over large regions; (3) continuous in time to allow simulation of land management, including such factors as climate and vegetation changes, pond and reservoir management, groundwater withdrawals, and stream and reservoir withdrawals. The model is described and validated on a 471 km² watershed near Waco, Texas. This linked model should provide a comprehensive tool for water resource managers in development and planning.
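    The abstract does not give the governing equations; one common way to link a simple groundwater store to a daily surface-water model is an exponentially delayed recharge feeding a linear-reservoir baseflow, as in this hypothetical sketch (all parameters are invented, not the paper's calibration).

```python
import numpy as np

def simulate_baseflow(percolation, alpha=0.05, delay=5.0):
    """Linear-reservoir groundwater routine of the kind linked to basin-scale
    surface models: percolation reaches the shallow aquifer with an
    exponential travel-time delay and drains to the stream by linear
    recession. alpha (1/day) and delay (days) are illustrative."""
    n = len(percolation)
    recharge = np.zeros(n)
    baseflow = np.zeros(n)
    w = 1.0 - np.exp(-1.0 / delay)       # delay weighting per day
    k = np.exp(-alpha)                   # daily recession factor
    for t in range(1, n):
        recharge[t] = w * percolation[t] + (1 - w) * recharge[t - 1]
        baseflow[t] = baseflow[t - 1] * k + recharge[t] * (1 - k)
    return baseflow

pulse = np.zeros(120); pulse[10:13] = 20.0   # mm/day percolation event
q = simulate_baseflow(pulse)
print(f"peak baseflow {q.max():.2f} mm/day on day {q.argmax()}")
```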

  8. The Computer as a Tool for Learning

    PubMed Central

    Starkweather, John A.

    1986-01-01

    Experimenters from the beginning recognized the advantages computers might offer in medical education. Several medical schools have gained experience in such programs in automated instruction. Television images and graphic display combined with computer control and user interaction are effective for teaching problem solving. The National Board of Medical Examiners has developed patient-case simulation for examining clinical skills, and the National Library of Medicine has experimented with combining media. Advances from the field of artificial intelligence and the availability of increasingly powerful microcomputers at lower cost will aid further development. Computers will likely affect existing educational methods, adding new capabilities to laboratory exercises, to self-assessment and to continuing education. PMID:3544511

  9. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  10. Endonasal Skull Base Tumor Removal Using Concentric Tube Continuum Robots: A Phantom Study.

    PubMed

    Swaney, Philip J; Gilbert, Hunter B; Webster, Robert J; Russell, Paul T; Weaver, Kyle D

    2015-03-01

    Objectives: The purpose of this study is to experimentally evaluate the use of concentric tube continuum robots in endonasal skull base tumor removal. This new type of surgical robot offers many advantages over existing straight and rigid surgical tools, including added dexterity, the ability to scale movements, and the ability to rotate the end effector while leaving the robot fixed in space. In this study, a concentric tube continuum robot was used to remove simulated pituitary tumors from a skull phantom. Design: The robot was teleoperated by experienced skull base surgeons to remove a phantom pituitary tumor within a skull. Percentage resection was measured by weight. Resection duration was timed. Setting: Academic research laboratory. Main Outcome Measures: Percentage removal of tumor material and procedure duration. Results: Average removal percentage of 79.8 ± 5.9% and average time to complete the procedure of 12.5 ± 4.1 minutes (n = 20). Conclusions: The robotic system presented here for use in endonasal skull base surgery shows promise in improving the dexterity, tool motion, and end effector capabilities currently available with straight and rigid tools while remaining an effective tool for resecting the tumor.

  11. Development and Verification of the Soil-Pile Interaction Extension for SubDyn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick R; Wendt, Fabian F

    SubDyn is the substructure structural-dynamics module for the aero-hydro-servo-elastic tool FAST v8. SubDyn uses a finite-element model (FEM) to simulate complex multimember lattice structures connected to conventional turbines and towers, and it can make use of the Craig-Bampton model reduction. Here we describe the newly added capability to handle soil-pile stiffness and compare results for monopile and jacket-based offshore wind turbines as obtained with FAST v8, SACS, and EDP (the latter two are modeling software packages commonly used in the offshore oil and gas industry). The level of agreement in terms of modal properties and loads for the entire offshore wind turbine is excellent, thus allowing SubDyn and FAST v8 to accurately simulate offshore wind turbines on fixed-bottom structures while accounting for the effect of soil dynamics, thereby reducing risk to the project.
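    The record does not detail the implementation; one way to picture a soil-pile stiffness entering a substructure FEM is as a spring stiffness matrix assembled into the interface node's degrees of freedom, as in this minimal sketch (the matrix values and model size are invented, and real soil-derived springs would generally be full, load-dependent matrices).

```python
import numpy as np

# Illustrative 6x6 soil spring stiffness at the mudline node (diagonal here):
# translations in N/m, rotations in N*m/rad; values are placeholders.
K_soil = np.diag([2.5e8, 2.5e8, 4.0e9, 6.0e10, 6.0e10, 1.0e10])

def add_soil_stiffness(K_global, node_dofs, K_soil):
    """Assemble the soil spring stiffness into the global FEM matrix
    at the listed degrees of freedom."""
    ix = np.ix_(node_dofs, node_dofs)
    K_global[ix] += K_soil
    return K_global

K = np.zeros((18, 18))                              # toy 3-node model, 6 DOF/node
K = add_soil_stiffness(K, list(range(6)), K_soil)   # springs at the mudline node
```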

  12. Laser shocking of materials: Toward the national ignition facility

    DOE PAGES

    Meyers, M. A.; Remington, B. A.; Maddox, B.; ...

    2010-01-16

    In recent years a powerful experimental tool has been added to the arsenal at the disposal of the materials scientist investigating materials response at extreme regimes of strain rate, temperature, and pressure: laser compression. This technique has been applied successfully to mono-, poly-, and nanocrystalline metals, and the results have been compared with predictions from analytical models and molecular dynamics simulations. Special flash x-ray radiography and flash x-ray diffraction, combined with laser shock propagation, are yielding the strength of metals at strain rates on the order of 10⁷–10⁸ s⁻¹ and resolving details of the kinetics of phase transitions. A puzzling result is that experiments, analysis, and simulations predict dislocation densities that are off by orders of magnitude. Finally, other surprises undoubtedly await us as we explore even higher pressure/strain rate/temperature regimes enabled by the National Ignition Facility.

  13. Stepwise and stagewise approaches for spatial cluster detection

    PubMed Central

    Xu, Jiale

    2016-01-01

    Spatial cluster detection is an important tool in many areas such as sociology, botany and public health. Previous work has mostly taken either a hypothesis testing framework or a Bayesian framework. In this paper, we propose a few approaches under a frequentist variable selection framework for spatial cluster detection. The forward stepwise methods search for multiple clusters by iteratively adding the currently most likely cluster while adjusting for the effects of previously identified clusters. The stagewise methods also consist of a series of steps, but with a tiny step size in each iteration. We study the features and performances of our proposed methods using simulations on idealized grids or real geographic areas. From the simulations, we compare the performance of the proposed methods in terms of estimation accuracy and power of detection. These methods are applied to the well-known New York leukemia data as well as Indiana poverty data. PMID:27246273

  14. Stepwise and stagewise approaches for spatial cluster detection.

    PubMed

    Xu, Jiale; Gangnon, Ronald E

    2016-05-01

    Spatial cluster detection is an important tool in many areas such as sociology, botany and public health. Previous work has mostly taken either a hypothesis testing framework or a Bayesian framework. In this paper, we propose a few approaches under a frequentist variable selection framework for spatial cluster detection. The forward stepwise methods search for multiple clusters by iteratively adding the currently most likely cluster while adjusting for the effects of previously identified clusters. The stagewise methods also consist of a series of steps, but with a tiny step size in each iteration. We study the features and performances of our proposed methods using simulations on idealized grids or real geographic areas. From the simulations, we compare the performance of the proposed methods in terms of estimation accuracy and power. These methods are applied to the well-known New York leukemia data as well as Indiana poverty data. Copyright © 2016 Elsevier Ltd. All rights reserved.
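    One way such a forward stepwise search can be organised is sketched below on an idealized 1-D grid: each step adds the candidate zone that most improves a Poisson log-likelihood, adjusting for zones already selected. The AIC-style stopping penalty, zone shapes, and data are invented for illustration and are not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(5)

def poisson_loglik(y, mu):
    return np.sum(y * np.log(mu) - mu)       # log-likelihood up to log(y!)

def forward_stepwise(y, expected, zones, k_max=3):
    """Forward stepwise cluster search: each step adds the candidate zone
    whose relative-risk term most improves the Poisson log-likelihood,
    adjusting for previously selected zones; stop when no zone clears
    an AIC-style bar. `zones` are boolean masks over cells."""
    selected, mu = [], expected.astype(float).copy()
    for _ in range(k_max):
        best, best_ll = None, poisson_loglik(y, mu) + 2.0   # penalty: 1 extra df
        for z in zones:
            if y[z].sum() == 0 or any(z is s for s in selected):
                continue
            rr = y[z].sum() / mu[z].sum()     # MLE of the zone's relative risk
            trial = np.where(z, mu * rr, mu)
            ll = poisson_loglik(y, trial)
            if ll > best_ll:
                best, best_ll, best_mu = z, ll, trial
        if best is None:                      # nothing improves the fit enough
            break
        selected.append(best); mu = best_mu
    return selected

# Idealized strip of 30 cells with an elevated-risk cluster in cells 10-14
expected = np.full(30, 5.0)
y = rng.poisson(expected); y[10:15] = rng.poisson(15.0, 5)
cells = np.arange(30)
zones = [cells == i for i in range(30)]                        # single cells
zones += [(cells >= i) & (cells < i + 5) for i in range(26)]   # width-5 windows
found = forward_stepwise(y, expected, zones)
print("first detected cluster covers cells:", np.where(found[0])[0])
```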

  15. Characterizing acid diffusion lengths in chemically amplified resists from measurements of deprotection kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patil, Abhijit A.; Pandey, Yogendra Narayan; Doxastakis, Manolis

    2014-10-01

    The acid-catalyzed deprotection of glassy poly(4-hydroxystyrene-co-tert-butyl acrylate) films was studied with infrared absorbance spectroscopy and stochastic simulations. Experimental data were interpreted with a simple description of subdiffusive acid transport coupled to second-order acid loss. This model predicts key attributes of the observed deprotection rates, such as fast reaction at short times, slow reaction at long times, and a nonlinear dependence on acid loading. Fickian diffusion is approached by increasing the post-exposure bake temperature or adding plasticizing agents to the polymer resin. These findings demonstrate that acid mobility and overall deprotection kinetics are coupled to glassy matrix dynamics. To complement the analysis of bulk kinetics, acid diffusion lengths were calculated from the anomalous transport model and compared with nanopattern line widths. The consistent scaling between experiments and simulations suggests that the anomalous diffusion model could be further developed into a predictive lithography tool.
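    A minimal continuous-time random-walk sketch of this model class follows: power-law waiting times (exponent below 1 giving subdiffusion), hops that deprotect the sites they land on, and second-order pairwise acid loss. All parameters are illustrative, not fitted to the resist experiments.

```python
import heapq
import numpy as np

rng = np.random.default_rng(6)
STEPS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def simulate(n_acid=200, alpha=0.7, k_loss=2e-4, t_end=500.0):
    """CTRW for photogenerated acids on a 2-D lattice: Pareto waiting times
    (alpha < 1 -> subdiffusion), each hop deprotects the target site, and
    acids are removed pairwise at a second-order rate k_loss."""
    pos = {i: (0, i) for i in range(n_acid)}        # acid id -> lattice site
    heap = [(rng.pareto(alpha), i) for i in range(n_acid)]
    heapq.heapify(heap)
    deprotected = set()
    t_loss = rng.exponential(1.0 / (k_loss * n_acid ** 2))
    while heap:
        t, i = heapq.heappop(heap)
        if t > t_end:
            break
        while t > t_loss and len(pos) >= 2:         # second-order acid loss
            for j in rng.choice(list(pos), 2, replace=False):
                pos.pop(int(j))
            t_loss += rng.exponential(1.0 / (k_loss * max(len(pos), 1) ** 2))
        if i not in pos:                            # this acid was lost
            continue
        x, y = pos[i]
        dx, dy = STEPS[rng.integers(4)]
        pos[i] = (x + dx, y + dy)
        deprotected.add(pos[i])                     # hop deprotects the site
        heapq.heappush(heap, (t + rng.pareto(alpha), i))
    return len(deprotected), len(pos)

sites, acids_left = simulate()
print(f"deprotected sites: {sites}, surviving acids: {acids_left}")
```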

  16. On energy harvesting for augmented tags

    NASA Astrophysics Data System (ADS)

    Allane, Dahmane; Duroc, Yvan; Andia Vera, Gianfranco; Touhami, Rachida; Tedjini, Smail

    2017-02-01

    In this paper, the harmonic signals generated by UHF RFID chips, usually considered spurious effects and left unused, are exploited. The harmonic signals are harvested to feed supplementary circuitry associated with a passive RFID tag. Two approaches are presented and compared. In the first, the third-harmonic signal is combined with an external 2.45-GHz Wi-Fi signal; the integration is done in such a way that the composite signal boosts the conversion efficiency of the energy harvester. In the second approach, the third-harmonic signal is used as the only source of a harvester that energizes a commercial temperature sensor associated with the tag. The design procedures of the two "augmented-tag" approaches are presented. The performance of each system is simulated with ADS software using the Harmonic Balance (HB) tool, and the simulated and measured results are compared.

  17. Time Domain Stability Margin Assessment of the NASA Space Launch System GN&C Design for Exploration Mission One

    NASA Technical Reports Server (NTRS)

    Clements, Keith; Wall, John

    2017-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.

  18. Time Domain Stability Margin Assessment of the NASA Space Launch System GN&C Design for Exploration Mission One

    NASA Technical Reports Server (NTRS)

    Clements, Keith; Wall, John

    2017-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.

  19. Irrigation Dynamics and Tactics - Developing a Sustainable and Profitable Irrigation Strategy for Agricultural Areas

    NASA Astrophysics Data System (ADS)

    Van Opstal, J.; Neale, C. M. U.; Lecina, S.

    2014-12-01

    Irrigation management is a dynamic process that adapts to weather conditions and water availability, as well as socio-economic influences. The goal of water users is to adapt their management to achieve maximum profits. However, these decisions should take into account the environmental impact on the surroundings. Agricultural irrigation systems need to be viewed as an integral part of a watershed. Changes in the infrastructure, operation, and management of an irrigated area therefore have an impact on the water quantity and quality available to other water users. A strategy can be developed for decision-makers using an irrigation system modelling tool. Such a tool can simulate the impact of the infrastructure, operation, and management of an irrigation area on its hydrology and agricultural productivity. This combination of factors is successfully simulated with the Ador model, which is able to reproduce on-farm irrigation and water delivery by a canal system. Model simulations for this study are supported with spatial analysis tools using GIS and remote sensing. Continuous measurements of drainage water will be added to capture water quality aspects. The Bear River Canal Company, located in northern Utah (U.S.A.), is used as a case study for this research. The irrigation area encompasses 26,000 ha and grows mainly alfalfa, grains, corn, and onions. The model allows the simulation of different strategies related to water delivery, on-farm water use, crop rotations, and reservoir and network capacities under different weather and water availability conditions. Such changes in the irrigation area will have consequences for farmers in the study area regarding crop production, and for downstream users concerning both the quantity and quality of outflows. The findings from this study give insight to decision-makers and water users for changing irrigation water delivery strategies to improve the sustainability and profitability of agriculture in the future.

  20. Mobilizing local innovation capacity through a simulation game in a participatory research project on agricultural innovation in El Brahmi irrigation scheme (Tunisia).

    NASA Astrophysics Data System (ADS)

    Dolinska, Aleksandra; d'Aquino, Patrick; Imache, Amar; Dionnet, Mathieu; Rougier, Jean-Emmanuel

    2015-04-01

    In the framework of the European Union and African Union cooperative research to increase food production in irrigated farming systems in Africa (EAU4Food project), we conducted participatory research on possible innovative practices to increase the production of dairy farms in the El Brahmi irrigation scheme in Tunisia, in the face of changing economic, political and environmental conditions. Our aim was to find an effective research method to stimulate farmers' participation in the innovation process. The capacities of farmers to produce knowledge and to innovate are recognized, and a shift is postulated from the linear model of technology transfer towards more participatory approaches, in which the role of researchers changes from providing solutions to supporting farmers in finding their own solutions. In practice, however, the position of farmers in shaping innovation practice and process remains weak. After a series of participatory workshops and in-depth interviews with the actors of the local innovation system, we developed and tested a simple open simulation game, Laitconomie, for farmers. The game proved effective in increasing our understanding of the system, as the farmers added new elements and rules while playing, and in mobilizing farmers' knowledge (including tacit knowledge) in the simulated innovation process. The participants reported learning how to improve farm management, soil fertility management and cow nutrition practices. Some of the participants used the game as a decision support tool. While our game and its scope were modest and mobilized only two types of players (farmers and an extension agent), open simulation proved to be a useful tool for analyzing a local innovation system. Similar tools that mobilize more diverse players, and hence have a larger scope, can readily be imagined.

  1. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

    As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows forecasting of communications load, with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to decisions on the acceptability of communication techniques used to fulfill aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is currently present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  2. On the application of accelerated molecular dynamics to liquid water simulations.

    PubMed

    de Oliveira, César Augusto F; Hamelberg, Donald; McCammon, J Andrew

    2006-11-16

    Our group recently proposed a robust bias potential function that can be used in an efficient all-atom accelerated molecular dynamics (MD) approach to simulate the crossing of high energy barriers without any advance knowledge of the potential-energy landscape. The main idea is to modify the potential-energy surface by adding a bias, or boost, potential in regions close to the local minima, such that all transition rates are increased. By applying the accelerated MD simulation method to liquid water, we observed that this new simulation technique accelerates the molecular motion without losing its microscopic structure and equilibrium properties. Our results showed that the application of a small boost energy on the potential-energy surface significantly reduces the statistical inefficiency of the simulation while keeping all the other calculated properties unchanged. On the other hand, although aggressive acceleration of the dynamics greatly increases the self-diffusion coefficient of water molecules and dramatically reduces the correlation time of the simulation, configurations representative of the true structure of liquid water are poorly sampled. Our results also showed the strength and robustness of this simulation technique, confirming this approach as a very useful and promising tool to extend the time scale of all-atom simulations of biological systems with explicit solvent models. However, we should keep in mind that there is a compromise between the strength of the boost applied in the simulation and the reproduction of ensemble average properties.
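    The boost potential this record refers to has the published form ΔV(r) = (E − V(r))² / (α + E − V(r)), applied wherever V(r) < E. A minimal numpy sketch on a 1-D double well (with threshold E and smoothing parameter α chosen arbitrarily for illustration) shows how the bias shrinks the effective barrier while leaving regions above the threshold untouched:

```python
import numpy as np

def amd_boost(V, E, alpha):
    """aMD bias: raises basins where V < E, leaves V >= E untouched.

    Canonical averages are recovered by reweighting each frame with
    exp(beta * dV).
    """
    dV = np.zeros_like(V)
    mask = V < E
    dV[mask] = (E - V[mask]) ** 2 / (alpha + E - V[mask])
    return dV

# 1-D double well with a barrier of 5 (arbitrary energy units) at x = 0
x = np.linspace(-2.0, 2.0, 401)
V = 5.0 * (x ** 2 - 1.0) ** 2
Vstar = V + amd_boost(V, E=4.0, alpha=1.0)
print(f"bare barrier: {V[200] - V.min():.2f}, "
      f"boosted barrier: {Vstar[200] - Vstar.min():.2f}")   # 5.00 -> 1.80
```

    The trade-off the abstract describes is visible here: a larger E or smaller α flattens the landscape further (faster diffusion between basins) but makes the reweighting factors more extreme, degrading the sampled ensemble.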

  3. Driving Simulator Performance in Patients with Possible and Probable Alzheimer’s Disease

    PubMed Central

    Stein, Anthony C.; Dubinsky, Richard M.

    2011-01-01

    Drivers at more advanced stages of Alzheimer’s disease (AD) have previously been associated with an increased rate of motor vehicle accidents. Drivers suffering from early AD are also involved in, and may even cause, motor vehicle accidents with greater frequency than “normal” drivers. Consequently, there is considerable public concern regarding traffic safety issues for those with AD, and subsequently for society, but there has been little research into whether deterioration in driving ability is progressive or has a sudden onset once the disease has reached a certain severity. The purpose of this study was to identify possible degradation in simulated driving performance that may occur at the earliest stages of AD, and to compare these decrements to a control group of normal drivers. Using a single blind design, seventeen AD subjects, eight at a Clinical Dementia Rating (CDR) of 0.5 (possible AD) and nine at a CDR of 1 (probable AD), were compared to 63 cognitively normal, elderly controls. All subjects were trained to drive a computerized interactive driving simulator and then tested on a 19.3 km (12 mile) test course. The AD subjects demonstrated impaired driving performance when compared to the controls. The simulated driving performance of the CDR 1 AD subjects was so degraded that it would be regarded as unsafe by standard assessment criteria. The CDR 0.5 subjects made similar errors, suggesting that driving impairment may occur at the earliest stages of the disease. Further work will be necessary to determine the significance of these findings. PMID:22105407

  4. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  5. Tool use in neurodegenerative diseases: Planning or technical reasoning?

    PubMed

    Baumard, Josselin; Lesourd, Mathieu; Remigereau, Chrystelle; Jarry, Christophe; Etcharry-Bouyx, Frédérique; Chauviré, Valérie; Osiurak, François; Le Gall, Didier

    2017-04-29

    Recent works showed that tool use can be impaired in stroke patients because of either planning or technical reasoning deficits, but these two hypotheses have not yet been compared in the field of neurodegenerative diseases. The aim of this study was to address the relationships between real tool use, mechanical problem-solving, and planning skills in patients with Alzheimer's disease (AD, n = 32), semantic dementia (SD, n = 16), and corticobasal syndrome (CBS, n = 9). Patients were asked to select and use ten common tools, to solve three mechanical problems, and to complete the Tower of London test. Motor function and episodic memory were controlled using the Purdue Pegboard Test and the BEC96 questionnaire, respectively. A data-transformation method was applied to avoid ceiling effects, and single-case analysis was performed based on raw scores and completion time. All groups demonstrated either impaired or slowed tool use. Planning deficits were found only in the AD group. Mechanical problem-solving deficits were observed only in the AD and CBS groups. Performance in the Tower of London test was the best predictor of tool use skills in the AD group, suggesting these patients had general rather than mechanical problem-solving deficits. Episodic memory seemed to play little role in performance. Motor dysfunction tended to be associated with tool use skills in CBS patients, while tool use disorders are interpreted as a consequence of the semantic loss in SD in line with previous works. These findings may encourage caregivers to set up disease-centred interventions. © 2017 The British Psychological Society.

  6. Tight-binding model for materials at mesoscale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tai, Yuan-Yen; Choi, Hongchul; Zhu, Wei

    2016-12-21

    TBM3 is an open source package for computational simulations of quantum materials at multiple scales in length and time. The project originated in an investigation of multiferroic behavior in transition-metal oxide heterostructures. The framework has also been designed to study emergent phenomena in other quantum materials, such as two-dimensional transition-metal dichalcogenides, graphene, topological insulators, and skyrmion materials. In the long term, we will extend the package to transport and time-resolved phenomena. TBM3 is currently a C++ based numerical tool package and framework for the design and construction of lattice structures of any kind with multi-orbital and spin degrees of freedom. A Fortran based portion of the package will be added in the near future. TBM3 is designed as a highly flexible and reusable framework, and the tight-binding parameters can be modeled or informed by DFT calculations. It is currently GPU enabled; CPU-enabled MPI support will be added in the future.

  7. Hamiltonian adaptive resolution molecular dynamics simulation of infrared dielectric functions of liquids

    NASA Astrophysics Data System (ADS)

    Wang, C. C.; Tan, J. Y.; Liu, L. H.

    2018-05-01

    The Hamiltonian adaptive resolution scheme (H-AdResS), which allows materials to be simulated by treating different domains of the system at different levels of resolution, is a recently proposed atomistic/coarse-grained multiscale model. In this work, a scheme to calculate the dielectric functions of liquids within H-AdResS is presented. In the proposed H-AdResS dielectric-function calculation scheme (DielectFunctCalS), corrected molecular dipole moments are calculated by multiplying the molecular dipole moment by the weighting fraction of the molecular mapping point. As the widths of the all-atom and hybrid regions show different degrees of influence on the dielectric functions, a prefactor is introduced to eliminate the effects of the all-atom and hybrid region widths. Since one goal of the H-AdResS method is to reduce computational cost, the widths of the all-atom and hybrid regions can be reduced, considering that coarse-grained simulation is much more time-saving than atomistic simulation. Liquid water and ethanol are taken as test cases to validate the DielectFunctCalS. The H-AdResS DielectFunctCalS results are in good agreement with all-atom molecular dynamics simulations. The accuracy of the H-AdResS results, as with all-atom molecular dynamics results, depends heavily on the choice of force field and force field parameters. The H-AdResS DielectFunctCalS allows us to calculate the dielectric functions of macromolecular systems with high efficiency and makes dielectric function calculations of large biomolecular systems possible.
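    The zero-frequency limit of such a calculation can be sketched with the standard dipole-fluctuation formula, ε = 1 + (⟨M²⟩ − ⟨M⟩²) / (3 ε₀ V k_B T), with each molecular dipole scaled by its weighting fraction as the abstract describes. The frequency-dependent infrared functions of the paper additionally require dipole time-correlation functions, which are omitted here; all inputs below are illustrative:

```python
import numpy as np

KB = 1.380649e-23          # Boltzmann constant, J/K
EPS0 = 8.8541878128e-12    # vacuum permittivity, F/m

def weighted_static_dielectric(dipoles, weights, volume, temperature):
    """Static dielectric constant from total-dipole fluctuations.

    dipoles: (n_frames, n_mol, 3) molecular dipoles in C*m
    weights: (n_frames, n_mol) weighting fraction w in [0, 1] of each
             molecule's mapping point (hypothetical input here)
    """
    M = (weights[..., None] * dipoles).sum(axis=1)     # total dipole per frame
    fluct = (M ** 2).sum(axis=1).mean() - (M.mean(axis=0) ** 2).sum()
    return 1.0 + fluct / (3.0 * EPS0 * volume * KB * temperature)

# synthetic demo: 500 frames, 100 molecules, dipoles of a few debye in C*m
rng = np.random.default_rng(0)
dip = rng.normal(0.0, 4.5e-30, size=(500, 100, 3))
w = rng.uniform(0.0, 1.0, size=(500, 100))
print(weighted_static_dielectric(dip, w, volume=3e-27, temperature=300.0))
```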

  8. Adaptive resolution simulation of a biomolecule and its hydration shell: Structural and dynamical properties

    NASA Astrophysics Data System (ADS)

    Fogarty, Aoife C.; Potestio, Raffaello; Kremer, Kurt

    2015-05-01

    A fully atomistic modelling of many biophysical and biochemical processes at biologically relevant length- and time scales is beyond our reach with current computational resources, and one approach to overcome this difficulty is the use of multiscale simulation techniques. In such simulations, when system properties necessitate a boundary between resolutions that falls within the solvent region, one can use an approach such as the Adaptive Resolution Scheme (AdResS), in which solvent particles change their resolution on the fly during the simulation. Here, we apply the existing AdResS methodology to biomolecular systems, simulating a fully atomistic protein with an atomistic hydration shell, solvated in a coarse-grained particle reservoir and heat bath. Using as a test case an aqueous solution of the regulatory protein ubiquitin, we first confirm the validity of the AdResS approach for such systems, via an examination of protein and solvent structural and dynamical properties. We then demonstrate how, in addition to providing a computational speedup, such a multiscale AdResS approach can yield otherwise inaccessible physical insights into biomolecular function. We use our methodology to show that protein structure and dynamics can still be correctly modelled using only a few shells of atomistic water molecules. We also discuss aspects of the AdResS methodology peculiar to biomolecular simulations.

  9. Vibration reduction of pneumatic percussive rivet tools: mechanical and ergonomic re-design approaches.

    PubMed

    Cherng, John G; Eksioglu, Mahmut; Kizilaslan, Kemal

    2009-03-01

    This paper presents a systematic design approach, the result of years of research effort, to the ergonomic re-design of rivet tools, i.e. rivet hammers and bucking bars. The investigation was carried out using both an ergonomic approach and mechanical analysis of the rivet tools' dynamic behavior. The optimal mechanical design parameters of the re-designed rivet tools were determined by the Taguchi method. Two ergonomically re-designed rivet tools with vibration damping/isolation mechanisms were tested against two conventional rivet tools in both laboratory and field tests. Vibration characteristics of both types of tools were measured in laboratory tests using a custom-made test fixture. Subjective field evaluations of the tools were performed by six experienced riveters at an aircraft repair shop. Results indicate that the isolation spring and polymer damper are very effective in reducing the overall level of vibration under both unweighted and weighted acceleration conditions. The mass of the dolly head and the housing played a significant role in the vibration absorption of the bucking bars. Another important result was that ductile iron has better vibration reducing capability than steel and aluminum for bucking bars. Mathematical simulation results were also consistent with the experimental results. The overall conclusion of the study was that by applying the design principles of ergonomics and by adding vibration damping/isolation mechanisms to the rivet tools, the vibration level can be significantly reduced and the tools become safer and more user-friendly. The details of the experience learned, design modifications, test methods, mathematical models and the results are included in the paper.

  10. Land-surface parameter optimisation using data assimilation techniques: the adJULES system V1.0

    NASA Astrophysics Data System (ADS)

    Raoult, Nina M.; Jupp, Tim E.; Cox, Peter M.; Luke, Catherine M.

    2016-08-01

    Land-surface models (LSMs) are crucial components of the Earth system models (ESMs) that are used to make coupled climate-carbon cycle projections for the 21st century. The Joint UK Land Environment Simulator (JULES) is the land-surface model used in the climate and weather forecast models of the UK Met Office. JULES is also extensively used offline as a land-surface impacts tool, forced with climatologies into the future. In this study, JULES is automatically differentiated with respect to JULES parameters using commercial software from FastOpt, resulting in an analytical gradient, or adjoint, of the model. Using this adjoint, the adJULES parameter estimation system has been developed to search for locally optimum parameters by calibrating against observations. This paper describes adJULES in a data assimilation framework and demonstrates its ability to improve the model-data fit using eddy-covariance measurements of gross primary production (GPP) and latent heat (LE) fluxes. adJULES also has the ability to calibrate over multiple sites simultaneously. This feature is used to define new optimised parameter values for the five plant functional types (PFTs) in JULES. The optimised PFT-specific parameters improve the performance of JULES at over 85 % of the sites used in the study, at both the calibration and evaluation stages. The new improved parameters for JULES are presented along with the associated uncertainties for each parameter.
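    adJULES obtains the gradient of the model-data misfit by algorithmic differentiation of JULES itself; the mechanics of gradient-based calibration against flux observations can be sketched on a toy light-response curve whose gradient is derived by hand. All names, functional forms and values below are illustrative, not JULES physics:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
sw = rng.uniform(50.0, 800.0, size=200)            # shortwave forcing, W m^-2
obs = 12.0 * sw / (sw + 250.0) + rng.normal(0.0, 0.3, sw.size)  # synthetic "GPP"

def cost_and_grad(p):
    """0.5 * sum of squared model-data misfits, plus its analytic gradient."""
    a, b = p
    m = a * sw / (sw + b)                          # toy light-response model
    r = m - obs
    return 0.5 * np.sum(r ** 2), np.array([
        np.sum(r * sw / (sw + b)),                 # dJ/da
        np.sum(r * -a * sw / (sw + b) ** 2),       # dJ/db
    ])

# gradient-based search for a local optimum, as adJULES does at scale
res = minimize(cost_and_grad, x0=[5.0, 100.0], jac=True, method="L-BFGS-B")
print(f"recovered parameters: a = {res.x[0]:.2f}, b = {res.x[1]:.1f}")
```

    Multi-site calibration as described in the abstract amounts to summing such per-site costs and gradients over all sites before each optimizer step.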

  11. Supporting Dynamic Ad hoc Collaboration Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Deborah A.; Berket, Karlo

    2003-07-14

    Modern HENP experiments such as CMS and ATLAS involve as many as 2000 collaborators around the world. Collaborations this large will be unable to meet often enough to support working closely together. Many of the tools currently available for collaboration focus on heavy-weight applications such as videoconferencing tools. While these are important, there is a more basic need for tools that support connecting physicists to work together on an ad hoc or continuous basis. Tools that support the day-to-day connectivity and underlying needs of a group of collaborators are important for providing light-weight, non-intrusive, and flexible ways to work collaboratively. Some example tools include messaging, file-sharing, and shared plot viewers. An important component of the environment is a scalable underlying communication framework. In this paper we will describe our current progress on building a dynamic and ad hoc collaboration environment and our vision for its evolution into a HENP collaboration environment.

  12. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    DTIC Science & Technology

    2015-10-05

    ...simulation tool, CREATE(TM)-AV Helios [12-14], a high-fidelity rotary-wing vehicle simulation tool, and CREATE(TM)-AV DaVinci [15-16], a conceptual through... multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows...

  13. Direct simulation Monte Carlo method for gas flows in micro-channels with bends with added curvature

    NASA Astrophysics Data System (ADS)

    Tisovský, Tomáš; Vít, Tomáš

    Gas flows in micro-channels are simulated using dsmcFOAM, an open source Direct Simulation Monte Carlo (DSMC) code for general application to rarefied gas flows, written within the framework of the open source C++ toolbox OpenFOAM. The aim of this paper is to investigate the flow in a micro-channel with a bend with added curvature. Results are compared with flows in a channel without added curvature and in an equivalent straight channel. The effects of micro-channel bends were already thoroughly investigated by White et al.; the geometry proposed by White is also used here for reference.

  14. Digital Quantum Simulation of Minimal AdS/CFT

    NASA Astrophysics Data System (ADS)

    García-Álvarez, L.; Egusquiza, I. L.; Lamata, L.; del Campo, A.; Sonner, J.; Solano, E.

    2017-07-01

    We propose the digital quantum simulation of a minimal AdS/CFT model in controllable quantum platforms. We consider the Sachdev-Ye-Kitaev model describing interacting Majorana fermions with randomly distributed all-to-all couplings, encoding nonlocal fermionic operators onto qubits to efficiently implement their dynamics via digital techniques. We also give a method for probing nonequilibrium dynamics and the scrambling of information. Finally, our approach serves as a protocol for reproducing a simplified low-dimensional model of quantum gravity in advanced quantum platforms such as trapped ions and superconducting circuits.
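    A dense-matrix (classical) sketch of the model being simulated, not the paper's digital gate decomposition, can be built by mapping Majorana operators onto qubits via the Jordan-Wigner transformation. The χ² = 1 convention and the coupling variance 3!·J²/N³ used below are one common normalization:

```python
import numpy as np
from functools import reduce

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def majorana(a, n_qubits):
    """Jordan-Wigner Majorana chi_a: Z...Z (X or Y) I...I, with chi_a^2 = I."""
    k, last = divmod(a, 2)
    ops = [Z] * k + [X if last == 0 else Y] + [I2] * (n_qubits - k - 1)
    return reduce(np.kron, ops)

def syk_hamiltonian(n_maj, J=1.0, seed=0):
    """SYK_4: H = sum_{i<j<k<l} J_ijkl chi_i chi_j chi_k chi_l, Gaussian couplings."""
    rng = np.random.default_rng(seed)
    nq = n_maj // 2
    chi = [majorana(a, nq) for a in range(n_maj)]
    sigma = np.sqrt(6.0 * J ** 2 / n_maj ** 3)     # variance 3! J^2 / N^3
    H = np.zeros((2 ** nq, 2 ** nq), dtype=complex)
    for i in range(n_maj):
        for j in range(i + 1, n_maj):
            for k in range(j + 1, n_maj):
                for l in range(k + 1, n_maj):
                    H += rng.normal(0.0, sigma) * chi[i] @ chi[j] @ chi[k] @ chi[l]
    return H

H = syk_hamiltonian(8)              # 8 Majoranas -> 4 qubits, 16 x 16 matrix
print(np.allclose(H, H.conj().T))   # Hermiticity check: True
```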

  16. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    PubMed Central

    2013-01-01

    Background: Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures. Methods and results: This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion: We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator. PMID:23415045
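    The paper's figures (3.9, 4.6 and 6.2 per cent) come from its GIS model and register data. Purely as an illustration of the mechanics, one can combine a simulated driving-time distribution with an assumed exponential survival-versus-response-time curve; every parameter below is invented, tuned only to land near the reported baseline:

```python
import numpy as np

rng = np.random.default_rng(1)

def survival_prob(response_min, s0=0.12, decay=0.12):
    """Assumed survival-vs-response-time curve (illustrative, not the paper's fit)."""
    return s0 * np.exp(-decay * response_min)

# hypothetical GIS-derived driving times (minutes) for ambulance dispatch
drive = rng.gamma(shape=4.0, scale=2.0, size=100_000)   # mean ~8 min
call_handling = 1.5                                     # fixed alarm-process delay

base = survival_prob(call_handling + drive).mean()
faster = survival_prob(call_handling + np.maximum(drive - 1.0, 0.0)).mean()
print(f"baseline survival {base:.1%}, with 1-min faster response {faster:.1%}")
```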

  17. Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities

    NASA Technical Reports Server (NTRS)

    Richter, Hanz

    2004-01-01

    A rocket test stand and associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate whether a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion and involve trial-and-error procedures that may require activating the system with the sole purpose of finding the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) as a design tool, a model is the only way to determine the effects of design choices without building a prototype, which is, in the context of rocket test stands, impracticable; 2) as a prediction and tuning tool, a good model allows system parameters to be set off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.

  18. Circuit-based versus full-wave modelling of active microwave circuits

    NASA Astrophysics Data System (ADS)

    Bukvić, Branko; Ilić, Andjelija Ž.; Ilić, Milan M.

    2018-03-01

    Modern full-wave computational tools enable rigorous simulations of the linear parts of complex microwave circuits within minutes, taking into account all physical electromagnetic (EM) phenomena. Non-linear components and other discrete elements of the hybrid microwave circuit are then easily added within the circuit simulator. This combined full-wave and circuit-based analysis is a must in the final stages of circuit design, although initial designs and optimisations are still faster and more comfortably done entirely in the circuit-based environment, which offers real-time solutions at the expense of accuracy. However, due to insufficient information and a general lack of specific case studies, practitioners still struggle when choosing an appropriate analysis method or a component model, because different choices lead to different solutions, often with uncertain accuracy and unexplained discrepancies arising between simulations and measurements. Here we design a reconfigurable power amplifier, as a case study, using both a circuit-based solver and a full-wave EM solver. We compare numerical simulations with measurements on the manufactured prototypes, discussing the observed differences, pointing out the importance of de-embedding measured parameters and of appropriate modelling of discrete components, and giving specific recipes for good modelling practice.

  19. Accuracy of Estimating Solar Radiation Pressure for GEO Debris with Tumbling Effect

    NASA Astrophysics Data System (ADS)

    Chao, Chia-Chun George

    2009-03-01

    The accuracy of estimating solar radiation pressure for GEO debris is examined and demonstrated, via numerical simulations, by fitting a batch (months) of simulated position vectors. These simulated position vectors are generated from a "truth orbit" with added white noise using high-precision numerical integration tools. After the long-arc fit of the simulated observations (position vectors), one can accurately and reliably determine how close the estimated value of solar radiation pressure is to the truth. Results of this study show that the inherent accuracy in estimating the solar radiation pressure coefficient can be as good as 1% if a long-arc fit span of up to 180 days is used and the satellite is not tumbling. The corresponding position prediction accuracy can be as good as, in maximum error, 1 km along in-track, 0.3 km along radial and 0.1 km along cross-track up to 30 days. Similar accuracies can be expected when the object is tumbling, as long as the rate of attitude change is different from the orbit rate. Results of this study reveal an important phenomenon: solar radiation pressure significantly affects the orbit motion when the spin rate is equal to the orbit rate.

  20. A Review of Simulators with Haptic Devices for Medical Training.

    PubMed

    Escobar-Castillejos, David; Noguez, Julieta; Neri, Luis; Magana, Alejandra; Benes, Bedrich

    2016-04-01

    Medical procedures often involve the use of the tactile sense to manipulate organs or tissues by using special tools. Doctors require extensive preparation in order to perform them successfully; for example, research shows that a minimum of 750 operations are needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative and they have been considered to improve medical training because they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (corpses, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state of the art review of recent medical simulators that use haptic devices. In particular we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of used technology, the number of degrees of freedom, degrees of force feedback, perceived realism, immersion, and feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.

  1. Mitral valve repair using ePTFE sutures for ruptured mitral chordae tendineae: a computational simulation study.

    PubMed

    Rim, Yonghoon; Laing, Susan T; McPherson, David D; Kim, Hyunggun

    2014-01-01

    Mitral valve (MV) repair using expanded polytetrafluoroethylene sutures is an established and preferred interventional method to resolve the complex pathophysiologic problems associated with chordal rupture. We developed a novel computational evaluation protocol to determine the effect of the artificial sutures on restoring MV function following valve repair. A virtual MV was created using three-dimensional echocardiographic data from a patient with ruptured mitral chordae tendineae (RMCT). Virtual repairs were designed by adding artificial sutures between the papillary muscles and the posterior leaflet where the native chordae were ruptured. Dynamic finite element simulations were performed to evaluate pre- and post-repair MV function. Abnormal posterior leaflet prolapse and mitral regurgitation were clearly demonstrated in the MV with ruptured chordae. Following virtual repair to reconstruct the ruptured chordae, the severity of the posterior leaflet prolapse decreased and stress concentration was markedly reduced both in the leaflet tissue and in the intact native chordae. Complete leaflet coaptation was restored when four or six sutures were utilized. Computational simulations provided quantitative information on functional improvement following MV repair. This novel simulation strategy may provide a powerful tool for evaluation and prediction of interventional treatment for RMCT.

  2. Meta-Analysis of a Continuous Outcome Combining Individual Patient Data and Aggregate Data: A Method Based on Simulated Individual Patient Data

    ERIC Educational Resources Information Center

    Yamaguchi, Yusuke; Sakamoto, Wataru; Goto, Masashi; Staessen, Jan A.; Wang, Jiguang; Gueyffier, Francois; Riley, Richard D.

    2014-01-01

    When some trials provide individual patient data (IPD) and the others provide only aggregate data (AD), meta-analysis methods for combining IPD and AD are required. We propose a method that reconstructs the missing IPD for AD trials by a Bayesian sampling procedure and then applies an IPD meta-analysis model to the mixture of simulated IPD and…

  3. Using discrete event computer simulation to improve patient flow in a Ghanaian acute care hospital.

    PubMed

    Best, Allyson M; Dixon, Cinnamon A; Kelton, W David; Lindsell, Christopher J; Ward, Michael J

    2014-08-01

    Crowding and limited resources have increased the strain on acute care facilities and emergency departments worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation is a computer-based tool that can be used to estimate how changes to complex health care delivery systems such as emergency departments will affect operational performance. Using this modality, our objective was to identify operational interventions that could potentially improve patient throughput of one acute care setting in a developing country. We developed a simulation model of acute care at a district level hospital in Ghana to test the effects of resource-neutral (eg, modified staff start times and roles) and resource-additional (eg, increased staff) operational interventions on patient throughput. Previously captured deidentified time-and-motion data from 487 acute care patients were used to develop and test the model. The primary outcome was the modeled effect of interventions on patient length of stay (LOS). The base-case (no change) scenario had a mean LOS of 292 minutes (95% confidence interval [CI], 291-293). In isolation, adding staffing, changing staff roles, and varying shift times did not affect overall patient LOS. Specifically, adding 2 registration workers, history takers, and physicians resulted in a 23.8-minute (95% CI, 22.3-25.3) LOS decrease. However, when shift start times were coordinated with patient arrival patterns, potential mean LOS was decreased by 96 minutes (95% CI, 94-98), and with the simultaneous combination of staff roles (registration and history taking), there was an overall mean LOS reduction of 152 minutes (95% CI, 150-154). Resource-neutral interventions identified through discrete event simulation modeling have the potential to improve acute care throughput in this Ghanaian municipal hospital. Discrete event simulation offers another approach to identifying potentially effective interventions to improve patient flow in emergency and acute care in resource-limited settings. Copyright © 2014 Elsevier Inc. All rights reserved.
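    A discrete event model of this kind can be sketched with a general-purpose DES library. The three-station flow, capacities, and all service/arrival times below are invented placeholders, not the time-and-motion data behind the paper:

```python
import random
import simpy

random.seed(42)
SERVICE_MIN = {"registration": 5.0, "history": 6.0, "physician": 14.0}  # assumed means

def patient(env, stations, los_log):
    """One patient passing through registration, history taking, and physician."""
    arrival = env.now
    for name in ("registration", "history", "physician"):
        with stations[name].request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / SERVICE_MIN[name]))
    los_log.append(env.now - arrival)

def arrivals(env, stations, los_log, mean_interarrival=8.0):
    while True:
        yield env.timeout(random.expovariate(1.0 / mean_interarrival))
        env.process(patient(env, stations, los_log))

def run(capacity):
    env = simpy.Environment()
    stations = {n: simpy.Resource(env, capacity=c) for n, c in capacity.items()}
    los = []
    env.process(arrivals(env, stations, los))
    env.run(until=12 * 60)                      # one 12-hour day, in minutes
    return sum(los) / len(los)

base = run({"registration": 1, "history": 1, "physician": 2})
plus = run({"registration": 2, "history": 2, "physician": 3})
print(f"mean LOS base {base:.0f} min, with added staff {plus:.0f} min")
```

    Resource-neutral interventions such as shifted start times are tested in the same way, by reshaping the arrival or capacity schedules rather than raising the capacities.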

  4. Hydrogen generation in CSP plants and maintenance of DPO/BP heat transfer fluids - A simulation approach

    NASA Astrophysics Data System (ADS)

    Kuckelkorn, Thomas; Jung, Christian; Gnädig, Tim; Lang, Christoph; Schall, Christina

    2016-05-01

    The ageing of diphenyl oxide/biphenyl (DPO/BP) heat transfer fluids (HTFs) poses challenging tasks for operators of parabolic trough power plants seeking the economic optimum between plant performance and O&M costs. Focusing on the generation of hydrogen, which results from the HTF ageing process, the balance of hydrogen pressure in the HTF is simulated for different operation scenarios. An accelerated build-up of hydrogen pressure in the HTF causes increased permeation into the annular vacuum space of the installed receivers and must be avoided in order to maintain the performance of these components. Therefore, the effective hydrogen partial pressure in the HTF has to be controlled and limited according to the specified values so that the vacuum lifetime of the receivers and the overall plant performance can be ensured. To simulate and visualize the hydrogen balance of a typical parabolic trough plant, a simple model is used to calculate the balance of hydrogen in the system, and this model is described. As input data for the simulation, extrapolated hydrogen generation rates have been used, calculated from the results of lab tests performed by DLR in Cologne, Germany. Hourly weather data, surface temperatures of the tubing system calculated using the simulation tool from NREL, and hydrogen permeation rates for stainless steel and carbon steel grades taken from the literature have been added to the model. In a first step, the effects of HTF ageing, the build-up of hydrogen pressure in the HTF and hydrogen loss rates through piping and receiver components have been modeled. In a second step, a selective hydrogen removal process has been added to the model. The simulation results confirm the need for active monitoring and control of the effective hydrogen partial pressure in parabolic trough solar thermal power plants with DPO/BP HTF. According to the results of the simulation, the expected plant performance can only be achieved over the plant lifetime if the hydrogen partial pressure is actively controlled and limited.

  5. eSBMTools 1.0: enhanced native structure-based modeling tools.

    PubMed

    Lutz, Benjamin; Sinner, Claude; Heuermann, Geertje; Verma, Abhinav; Schug, Alexander

    2013-11-01

    Molecular dynamics simulations provide detailed insights into the structure and function of biomolecular systems. Thus, they complement experimental measurements by giving access to experimentally inaccessible regimes. Among the different molecular dynamics techniques, native structure-based models (SBMs) are based on energy landscape theory and the principle of minimal frustration. Typically used in protein and RNA folding simulations, they coarse-grain the biomolecular system and/or simplify the Hamiltonian, resulting in modest computational requirements while achieving high agreement with experimental data. eSBMTools streamlines running and evaluating SBMs in a comprehensive package and offers high flexibility in adding experimentally derived or bioinformatics-derived restraints. We present a software package that allows setting up, modifying and evaluating SBMs for both RNA and proteins. The implemented workflows include predicting protein complexes based on bioinformatics-derived inter-protein contact information, a standardized setup of protein folding simulations based on the common PDB format, calculating reaction coordinates and evaluating the simulation by free-energy calculations with the weighted histogram analysis method or by phi-values. The modules interface with the molecular dynamics simulation program GROMACS. The package is open source and written in architecture-independent Python2. http://sourceforge.net/projects/esbmtools/. alexander.schug@kit.edu. Supplementary data are available at Bioinformatics online.

  6. Simulation of Ge Dopant Emission in Indirect-Drive ICF Implosion Experiments

    NASA Astrophysics Data System (ADS)

    Macfarlane, J. J.; Golovkin, I.; Kulkarni, S.; Regan, S.; Epstein, R.; Mancini, R.; Peterson, K.; Suter, L. J.

    2013-10-01

    We present results from simulations performed to study the radiative properties of dopants used in inertial confinement fusion indirect-drive capsule implosion experiments on NIF. In Rev5 NIF ignition capsules, a Ge dopant is added to an inner region of the CH ablator to absorb hohlraum x-ray preheat. Spectrally resolved emission from ablator dopants can be used to study the degree of mixing of ablator material into the ignition hot spot. Here, we study the atomic processes that affect the radiative characteristics of these elements using a set of simulation tools to first estimate the evolution of plasma conditions in the compressed target, and then to compute the atomic kinetics of the dopant and the resultant radiative emission. Using estimates of temperature and density profiles predicted by radiation-hydrodynamics simulations, we set up simple 2-D plasma grids where we allow dopant material to be embedded in the fuel, and perform multi-dimensional collisional-radiative simulations using SPECT3D to compute non-LTE atomic level populations and spectral signatures from the dopant. Recently improved Stark-broadened line shape modeling for Ge K-shell lines has been included. The goal is to study the radiative and atomic processes that affect the emergent spectra, including the effects of inner-shell photoabsorption and Kα re-emission from the dopant.

  7. Modeling Vortex Generators in a Navier-Stokes Code

    NASA Technical Reports Server (NTRS)

    Dudek, Julianne C.

    2011-01-01

    A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
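    The three user inputs named above (grid-point range, planform area, incidence angle) are enough to sketch what such a vane source term can look like. The following is a simplified lift-force expression with an assumed thin-airfoil lift slope and a fixed spanwise orientation, not the actual Wind-US model:

```python
import numpy as np

RHO = 1.2   # freestream density, kg/m^3 (assumed constant here)

def vg_source_term(u_cell, vane_area, incidence_deg, n_cells):
    """Simplified vane-VG lift force, distributed evenly over the tagged cells.

    u_cell: (n_cells, 3) local velocity vectors in the selected grid range
    Returns (n_cells, 3) momentum sources; thin-airfoil slope 2*pi assumed.
    """
    alpha = np.radians(incidence_deg)
    cl = 2.0 * np.pi * np.sin(alpha)                  # thin-airfoil lift coefficient
    speed = np.linalg.norm(u_cell, axis=1, keepdims=True)
    # lift acts normal to the local flow; here a fixed spanwise-normal direction
    lift_dir = np.cross(u_cell, np.array([0.0, 0.0, 1.0]))
    lift_dir /= np.linalg.norm(lift_dir, axis=1, keepdims=True)
    force = 0.5 * RHO * speed ** 2 * (vane_area / n_cells) * cl
    return force * lift_dir

src = vg_source_term(np.tile([50.0, 0.0, 0.0], (8, 1)), vane_area=4e-4,
                     incidence_deg=16.0, n_cells=8)
print(src.sum(axis=0))   # net force (N) added over the vane's grid cells
```

    Modeling a microramp as in the abstract then amounts to calling this twice with opposite incidence angles and summing the two source fields.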

  8. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  9. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  10. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  11. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Simulation tool. 434.606 Section 434.606 Energy DEPARTMENT... RESIDENTIAL BUILDINGS Building Energy Compliance Alternative § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using the...

  12. 77 FR 6685 - Airworthiness Directives; The Boeing Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... proposed AD reduces compliance times for Model 767-400ER series airplanes. In addition, this proposed AD...). This proposed AD would reduce the compliance times for Model 767-400ER series airplanes. In addition... airplanes, the existing AD also requires a one- time inspection to determine if a tool runout option has...

  13. Understanding casing flow in Pelton turbines by numerical simulation

    NASA Astrophysics Data System (ADS)

    Rentschler, M.; Neuhauser, M.; Marongiu, J. C.; Parkinson, E.

    2016-11-01

    For rehabilitation projects of Pelton turbines, the flow in the casing may have an important influence on the overall performance of the machine. Water sheets returning onto the jets or the runner significantly reduce efficiency, and runaway speed depends on the flow in the casing. CFD simulations can provide a detailed insight into this type of flow, but these simulations are computationally intensive. As the volume of water in a Pelton turbine is generally small compared to the complete volume of the turbine housing, a single phase simulation greatly reduces the complexity of the simulation. In the present work a numerical tool based on the SPH-ALE meshless method is used to simulate the casing flow in a Pelton turbine. Using higher order schemes reduces the numerical viscosity. This is necessary to resolve the flow in the jet and on the casing wall, where the velocities differ by two orders of magnitude. The results are compared to flow visualizations and measurements in a hydraulic laboratory. Several rehabilitation projects have proved the added value of understanding the flow in the Pelton casing. The flow simulation helps in designing casing inserts, not only to see their influence on the flow, but also to calculate the stress in the inserts. In some projects, the casing simulation led to an understanding of unexpected flow behavior. One such example is presented, in which the backsplash from a deflector hit the runner, creating a reversed rotation of the runner.

  14. Incorporation of the equilibrium temperature approach in a Soil and Water Assessment Tool hydroclimatological stream temperature model

    NASA Astrophysics Data System (ADS)

    Du, Xinzhong; Shrestha, Narayan Kumar; Ficklin, Darren L.; Wang, Junye

    2018-04-01

    Stream temperature is an important indicator for biodiversity and sustainability in aquatic ecosystems. The stream temperature model currently in the Soil and Water Assessment Tool (SWAT) only considers the impact of air temperature on stream temperature, while the hydroclimatological stream temperature model developed within the SWAT model considers hydrology and the impact of air temperature in simulating the water-air heat transfer process. In this study, we modified the hydroclimatological model by including the equilibrium temperature approach to model heat transfer processes at the water-air interface, which reflects the influences of air temperature, solar radiation, wind speed and streamflow conditions on the heat transfer process. The thermal capacity of the streamflow is modeled by the variation of the stream water depth. An advantage of this equilibrium temperature model is the simple parameterization, with only two parameters added to model the heat transfer processes. The equilibrium temperature model proposed in this study is applied and tested in the Athabasca River basin (ARB) in Alberta, Canada. The model is calibrated and validated at five stations throughout different parts of the ARB, where close to monthly samplings of stream temperatures are available. The results indicate that the equilibrium temperature model proposed in this study provided better and more consistent performances for the different regions of the ARB with the values of the Nash-Sutcliffe Efficiency coefficient (NSE) greater than those of the original SWAT model and the hydroclimatological model. To test the model performance for different hydrological and environmental conditions, the equilibrium temperature model was also applied to the North Fork Tolt River Watershed in Washington, United States. The results indicate a reasonable simulation of stream temperature using the model proposed in this study, with minimum relative error values compared to the other two models. However, the NSE values were lower than those of the hydroclimatological model, indicating that more model verification needs to be done. The equilibrium temperature model uses existing SWAT meteorological data as input, can be calibrated using fewer parameters and less effort and has an overall better performance in stream temperature simulation. Thus, it can be used as an effective tool for predicting the changes in stream temperature regimes under varying hydrological and meteorological conditions. In addition, the impact of the stream temperature simulations on chemical reaction rates and concentrations was tested. The results indicate that the improved performance of the stream temperature simulation could significantly affect chemical reaction rates and the simulated concentrations, and the equilibrium temperature model could be a potential tool to model stream temperature in water quality simulations.
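    The key ingredients described in the abstract, an equilibrium temperature toward which the water relaxes, a bulk surface exchange coefficient, and stream depth acting as the thermal capacity, can be sketched generically as below. The functional forms chosen for k_e and t_e and the two coefficients a and b are placeholders standing in for the model's two added parameters, not the calibrated SWAT formulation:

```python
import numpy as np

RHO_W, CP_W = 1000.0, 4184.0   # water density (kg/m^3) and heat capacity (J/kg/K)

def step_stream_temp(t_w, t_air, solar, wind, depth, dt=3600.0, a=1.5, b=0.05):
    """One equilibrium-temperature update of stream temperature (deg C).

    t_e : equilibrium temperature the water relaxes toward
    k_e : bulk surface exchange coefficient (W/m^2/K), wind dependent
    a, b: stand-ins for the model's two calibration parameters (values assumed)
    """
    k_e = 20.0 + a * wind ** 2          # assumed wind-speed dependence
    t_e = t_air + b * solar             # radiation shifts t_e above air temperature
    # depth sets the thermal capacity of the water column per unit area
    return t_w + k_e * (t_e - t_w) * dt / (RHO_W * CP_W * depth)

t_w = 5.0
for hour in range(72):                  # three synthetic days, hourly steps
    solar = max(0.0, 600.0 * np.sin(2 * np.pi * (hour % 24 - 6) / 24))
    t_w = step_stream_temp(t_w, t_air=12.0, solar=solar, wind=2.0, depth=0.8)
print(f"stream temperature after 3 days: {t_w:.2f} C")
```

    The update makes the abstract's point about thermal capacity concrete: halving the depth doubles the relaxation rate toward t_e, so shallow reaches track meteorological forcing much more tightly than deep ones.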

  15. Modeling of skin cancer dermatoscopy images

    NASA Astrophysics Data System (ADS)

    Iralieva, Malica B.; Myakinin, Oleg O.; Bratchenko, Ivan A.; Zakharov, Valery P.

    2018-04-01

    A cancer identified early is more likely to respond effectively to treatment, and its treatment is less expensive as well. Dermatoscopy is one of the general diagnostic techniques for early detection of skin cancer, allowing in vivo evaluation of colors and microstructures of skin lesions. Digital phantoms with known properties are required when developing a new instrument, in order to compare a sample's features with data from the instrument. An algorithm for modeling skin cancer images is proposed in this paper. The steps of the algorithm are: setting the shape, generating the texture, adding the texture, and setting the normal skin background. A Gaussian represents the shape; the texture generation, based on a fractal noise algorithm, is responsible for spatial chromophore distributions, while the colormap applied to the values corresponds to spectral properties. Finally, a normal skin image simulated by a mixed Monte Carlo method using a special online tool is added as background. Varying the Asymmetry, Borders, Colors and Diameter settings is shown to fully match the ABCD clinical recognition algorithm. The asymmetry is specified by setting different standard deviation values of the Gaussian in different parts of the image. The noise amplitude is increased to set the irregular borders score. The standard deviation is changed to determine the size of the lesion. Colors are set by changing the colormap. An algorithm for simulating different structural elements would be required to match other recognition algorithms.
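    Following the algorithm's first steps, a minimal sketch of the shape and texture stages might look like the following. The 1/f-filtered FFT noise stands in for the paper's fractal noise algorithm, asymmetry is produced by unequal Gaussian widths, and the colormap and Monte Carlo skin background stages are left as comments:

```python
import numpy as np

def fractal_noise(n, beta=2.0, seed=0):
    """Isotropic 1/f^beta 'fractal' texture on an n x n grid via FFT filtering."""
    rng = np.random.default_rng(seed)
    white = rng.normal(size=(n, n))
    fx = np.fft.fftfreq(n)[:, None]
    fy = np.fft.fftfreq(n)[None, :]
    f = np.sqrt(fx ** 2 + fy ** 2)
    f[0, 0] = 1.0                                  # avoid division by zero at DC
    tex = np.fft.ifft2(np.fft.fft2(white) / f ** (beta / 2)).real
    return (tex - tex.min()) / (np.ptp(tex) + 1e-12)

def lesion_phantom(n=256, sx=30.0, sy=18.0, noise_amp=0.35):
    """Gaussian lesion shape (asymmetry via sx != sy) plus fractal texture."""
    y, x = np.mgrid[:n, :n] - n / 2.0
    shape = np.exp(-(x ** 2 / (2 * sx ** 2) + y ** 2 / (2 * sy ** 2)))
    # noise perturbs the lesion edge: larger noise_amp -> higher Borders score
    shape += noise_amp * (fractal_noise(n) - 0.5) * shape
    return np.clip(shape, 0.0, 1.0)   # next: map through a colormap, add skin bg

img = lesion_phantom()
print(img.shape, round(float(img.max()), 2))
```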

  16. Greenland Regional and Ice Sheet-wide Geometry Sensitivity to Boundary and Initial conditions

    NASA Astrophysics Data System (ADS)

    Logan, L. C.; Narayanan, S. H. K.; Greve, R.; Heimbach, P.

    2017-12-01

    Ice sheet and glacier model outputs depend on inputs from uncertainly known initial and boundary conditions, and on other parameters. Conservation and constitutive equations formalize the relationship between model inputs and outputs, and the sensitivity of model-derived quantities of interest (e.g., ice sheet volume above flotation) to model variables can be obtained via the adjoint model of an ice sheet. We show how one particular ice sheet model, SICOPOLIS (SImulation COde for POLythermal Ice Sheets), depends on these inputs through comprehensive adjoint-based sensitivity analyses. SICOPOLIS discretizes the shallow-ice and shallow-shelf approximations for ice flow, and is well-suited for paleo-studies of Greenland and Antarctica, among other computational domains. The adjoint model of SICOPOLIS was developed via algorithmic differentiation, facilitated by the source transformation tool OpenAD (developed at Argonne National Laboratory). While model sensitivity to various inputs can be computed by costly methods involving input perturbation simulations, the time-dependent adjoint model of SICOPOLIS delivers model sensitivities to initial and boundary conditions throughout time at lower cost. Here, we explore the sensitivities of the Greenland Ice Sheet's entire and regional volumes to initial ice thickness, precipitation, basal sliding, and geothermal flux over the Holocene epoch. Sensitivity studies such as those described here are now accessible to the modeling community, based on the latest version of SICOPOLIS, which has been adapted for OpenAD to generate correct and efficient adjoint code.

  17. MacArthur Competence Assessment Tool for Treatment in Alzheimer disease: cross-cultural adaptation.

    PubMed

    Santos, Raquel Luiza; Sousa, Maria Fernanda Barroso de; Simões, José Pedro; Bertrand, Elodie; Mograbi, Daniel C; Landeira-Fernandez, Jesus; Laks, Jerson; Dourado, Marcia Cristina Nascimento

    2017-01-01

    We adapted the MacArthur Competence Assessment Tool for Treatment (MacCAT-T) to Brazilian Portuguese, pilot testing it on mild and moderate patients with Alzheimer's disease (AD). The cross-cultural process required six steps. Sixty-six patients with AD were assessed for competence to consent to treatment, global cognition, working memory, awareness of disease, functionality, depressive symptoms and dementia severity. The items had semantic, idiomatic, conceptual and experiential equivalence. We found no difference between mild and moderate patients with AD on the MacCAT-T domains. The linear regressions showed that reasoning (p = 0.000) and functional status (p = 0.003) were related to understanding. Understanding (p = 0.000) was related to appreciation and reasoning. Awareness of disease (p = 0.001) was related to expressing a choice. The MacCAT-T adaptation was well-understood and the constructs of the original version were maintained. The results of the pilot study demonstrated an available Brazilian tool focused on decision-making capacity in AD.

  18. The impact of brief team communication, leadership and team behavior training on ad hoc team performance in trauma care settings.

    PubMed

    Roberts, Nicole K; Williams, Reed G; Schwind, Cathy J; Sutyak, John A; McDowell, Christopher; Griffen, David; Wall, Jarrod; Sanfey, Hilary; Chestnut, Audra; Meier, Andreas H; Wohltmann, Christopher; Clark, Ted R; Wetter, Nathan

    2014-02-01

    Communication breakdowns and care coordination problems often cause preventable adverse patient care events, which can be especially acute in the trauma setting, where ad hoc teams have little time for advance planning. Existing teamwork curricula do not address the particular issues associated with ad hoc emergency teams providing trauma care. Ad hoc trauma teams completed a preinstruction simulated trauma encounter and were provided with instruction on appropriate team behaviors and team communication. Teams completed a postinstruction simulated trauma encounter immediately afterward and again 3 weeks later, then completed a questionnaire. Blinded raters rated videotapes of the simulations. Participants expressed high levels of satisfaction and intent to change practice after the intervention. Participants changed teamwork and communication behavior on the posttest, and the changes were sustained after the 3-week interval, though with some loss of retention. Brief training exercises can change teamwork and communication behaviors on ad hoc trauma teams. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Two NextGen Air Safety Tools: An ADS-B Equipped UAV and a Wake Turbulence Estimator

    NASA Astrophysics Data System (ADS)

    Handley, Ward A.

    Two air safety tools are developed in the context of the FAA's NextGen program. The first tool addresses the alarming increase in the frequency of near-collisions between manned and unmanned aircraft by equipping a common hobby-class UAV with an ADS-B transponder that broadcasts its position, speed, heading, and unique identification number to all local air traffic. The second tool estimates and outputs the location of dangerous wake vortex corridors in real time, based on ADS-B data collected and processed using a custom software package developed for this project. The TRansponder based Position Information System (TRAPIS) consists of data packet decoders, an aircraft database, a Graphical User Interface (GUI), and the wake vortex extension application. Output from TRAPIS can be visualized in Google Earth and addresses the problem of pilots otherwise having to infer the location of invisible wake vortex corridors solely from intuition or verbal warnings from ATC. The result of these two tools is increased situational awareness, and hence safety, for human pilots in the National Airspace System (NAS).
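
    The wake-corridor idea reduces to projecting aged vortex positions behind each ADS-B target. The sketch below is a geometric illustration only; the sink rate, sampling interval, and flat-earth small-angle approximations are assumptions, not the TRAPIS algorithm.

```python
import math

def wake_corridor(lat, lon, heading_deg, speed_mps, minutes=3, sink_mps=1.5):
    """Project a trailing corridor of wake-vortex points behind an aircraft
    from one ADS-B state vector. Returns (lat, lon, descent_m) samples;
    the vortex pair sinks as it ages."""
    points = []
    R = 6371000.0                                     # Earth radius, m
    back = math.radians((heading_deg + 180) % 360)    # direction the wake trails
    for t in range(0, minutes * 60, 10):              # sample every 10 s of age
        d = speed_mps * t                             # distance behind aircraft
        dlat = d * math.cos(back) / R
        dlon = d * math.sin(back) / (R * math.cos(math.radians(lat)))
        points.append((lat + math.degrees(dlat),
                       lon + math.degrees(dlon),
                       sink_mps * t))                 # vertical drop of the pair
    return points
```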

  20. Towards Better Simulation of US Maize Yield Responses to Climate in the Community Earth System Model

    NASA Astrophysics Data System (ADS)

    Peng, B.; Guan, K.; Chen, M.; Lawrence, D. M.; Jin, Z.; Bernacchi, C.; Ainsworth, E. A.; DeLucia, E. H.; Lombardozzi, D. L.; Lu, Y.

    2017-12-01

    Global food security is under continuing pressure from population growth and climate change, despite potential advances in breeding and management technologies. Earth system models (ESMs) are essential tools for studying the impacts of historical and future climate on regional and global food production, as well as for assessing the effectiveness of possible adaptations and their potential feedbacks to climate. Here we developed an improved maize representation within the Community Earth System Model (CESM) by combining the strengths of the Community Land Model version 4.5 (CLM4.5) and the Agricultural Production Systems sIMulator (APSIM). Specifically, we modified the maize planting scheme, incorporated the phenology scheme adopted from APSIM, added a new carbon allocation scheme to CLM4.5, and improved the estimation of canopy structure parameters, including leaf area index (LAI) and canopy height. Unique features of the new model (CLM-APSIM) include more detailed phenology stages, an explicit implementation of the impacts of various abiotic environmental stresses (nitrogen, water, temperature, and heat) on maize phenology and carbon allocation, and an explicit simulation of grain number and grain size. We conducted a regional simulation with the new model over the US Corn Belt from 1990 to 2010. The simulated maize yield and its responses to climate (growing season mean temperature and precipitation) are benchmarked against USDA NASS statistics. Our results show that CLM-APSIM outperforms CLM4.5 in simulating county-level maize yield and reproduces more realistic yield responses to climate variations. However, some critical processes (such as crop failure due to frost and inundation, and suboptimal growth due to biotic stresses) are still missing in both CLM-APSIM and CLM4.5, so the simulated yield responses to climate still deviate slightly from reality. Our results demonstrate that, with improved parameterization of crop growth, ESMs can be powerful tools for realistically simulating agricultural production, which is gaining increasing interest and is critical to the study of global food security and the food-energy-water nexus.
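
    The phenology scheme adopted from APSIM is driven by accumulated thermal time. The minimal sketch below shows that mechanism only; the stage names and every threshold are illustrative placeholders, not the calibrated CLM-APSIM values.

```python
def gdd(tmin, tmax, t_base=8.0, t_opt=34.0):
    """Daily growing degree days with simple base/optimum caps (illustrative)."""
    tmean = (min(tmax, t_opt) + max(tmin, t_base)) / 2
    return max(0.0, tmean - t_base)

# Illustrative cumulative thermal-time thresholds for maize stages.
STAGES = [("emergence", 80), ("floral_init", 350), ("flowering", 800),
          ("grain_fill", 1100), ("maturity", 1700)]

def phenology(daily_tmin, daily_tmax):
    """Yield (stage, cumulative_thermal_time) for each day after sowing."""
    tt, stage = 0.0, "sowing"
    for tmin, tmax in zip(daily_tmin, daily_tmax):
        tt += gdd(tmin, tmax)
        for name, threshold in STAGES:
            if tt >= threshold:
                stage = name
        yield stage, tt

season = list(phenology([12.0] * 150, [26.0] * 150))   # a warm synthetic season
```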

  1. A Game Theory Based Solution for Security Challenges in CRNs

    NASA Astrophysics Data System (ADS)

    Poonam; Nagpal, Chander Kumar

    2018-03-01

    Cognitive radio networks (CRNs) are envisioned to drive the next generation of ad hoc wireless networks through their ability to provide communications resilience in continuously changing environments via dynamic spectrum access. Conventionally, CRNs depend on information gathered by other secondary users to ensure the accuracy of spectrum sensing, which makes them vulnerable to security attacks and creates the need for security mechanisms such as cryptography and trust. However, a typical cryptography-based solution is not viable for CRNs owing to their limited resources, and the effectiveness of trust-based approaches has always been in question due to the credibility of secondary trust sources. Game theory, with its ability to optimize in an environment of conflicting interests, is a suitable tool for managing an ad hoc network in the presence of autonomous selfish, malevolent, malicious, and attacker nodes. The literature contains several theoretical proposals for applying game theory in ad hoc networks without explicit, detailed implementations. This paper implements a game theory based solution in MATLAB-2015 to secure the CRN environment and compares the obtained results with traditional trust and cryptography approaches. The simulation results indicate that, as time progresses, the game-theoretic approach performs much better, with higher throughput, lower jitter, and better identification of selfish/malicious nodes.
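
    The abstract does not spell out its payoff matrix, so the fragment below (in Python rather than the paper's MATLAB) only illustrates the general repeated-game mechanism such a scheme can use: nodes whose sensing reports repeatedly disagree with the fused decision see their weight decay, which identifies selfish or malicious participants over successive rounds.

```python
def update_reputation(reports, reputations, eta=0.1):
    """One round of a repeated spectrum-sensing game: fuse weighted reports,
    then reward or penalize each node by agreement with the fused decision.
    reports: node -> reported probability that the channel is occupied."""
    total = sum(reputations[n] for n in reports)
    fused = sum(reputations[n] * r for n, r in reports.items()) / total >= 0.5
    for n, r in reports.items():
        payoff = 1.0 if (r >= 0.5) == fused else -1.0   # stage-game payoff
        reputations[n] = min(1.0, max(0.01, reputations[n] + eta * payoff))
    return fused, reputations

# A node that persistently misreports loses weight round after round.
reps = {"a": 0.5, "b": 0.5, "mal": 0.5}
for _ in range(20):
    decision, reps = update_reputation({"a": 0.9, "b": 0.8, "mal": 0.1}, reps)
```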

  2. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and assessed the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity in predicting a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institutions or stakeholder roles. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas variations limited to cost plus customer service or cost plus technical stability showed reduced sensitivity (≤54%). The majority (73%) of participants agreed that the SVI was helpful in guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process with a standardized framework. Sensitivity of the tool improved when factors extended beyond those traditionally targeted. We propose that the tool will facilitate discussion among simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
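
    The abstract does not publish the SVI's actual scoring rules, so the fragment below is only a schematic weighted-factor index showing the shape of the idea: ranked factors such as technical stability, customer service, and cost combined into a single comparable score. All weights and ratings are invented.

```python
def simulator_value_index(ratings, weights):
    """Illustrative weighted-sum index over purchase factors (the published
    SVI's scoring may differ). ratings: factor -> 1-5 score for a candidate."""
    total_w = sum(weights.values())
    return sum(weights[f] * ratings[f] for f in weights) / total_w

weights = {"technical_stability": 0.30, "customer_service": 0.25, "cost": 0.20}
candidate_a = {"technical_stability": 5, "customer_service": 4, "cost": 2}
candidate_b = {"technical_stability": 3, "customer_service": 3, "cost": 5}

best = max([("A", candidate_a), ("B", candidate_b)],
           key=lambda kv: simulator_value_index(kv[1], weights))
print(best[0])  # candidate with the higher index under these weights
```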

  3. Development of an Unstructured, Three-Dimensional Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph; Stern, Eric; Palmer, Grant; Muppidi, Suman; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material resulting from the radiative and convective surface heating imposed by the surrounding high-enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be added quickly. Extensibility is critical because thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries as well as multi-dimensional physics, which have been shown to be important in some scenarios and are not captured by one-dimensional models. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  4. UMI-tools: modeling sequencing errors in Unique Molecular Identifiers to improve quantification accuracy

    PubMed Central

    2017-01-01

    Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and on real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
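
    A minimal sketch of the directional network idea described for UMI-tools follows: UMIs one mismatch apart are linked when the more abundant UMI has at least 2n-1 counts, and linked UMIs collapse into the abundant node. This greedy, one-step version omits the full graph traversal of the published implementation, so treat it as an illustration rather than the package's algorithm.

```python
from collections import Counter

def hamming1(a, b):
    """True when two equal-length UMIs differ at exactly one base."""
    return sum(x != y for x, y in zip(a, b)) == 1

def directional_dedup(umi_counts: Counter):
    """Collapse UMIs likely produced by sequencing errors: absorb b into a
    when they are one mismatch apart and count(a) >= 2*count(b) - 1."""
    umis = sorted(umi_counts, key=umi_counts.get, reverse=True)
    absorbed = set()
    for i, a in enumerate(umis):
        if a in absorbed:
            continue
        for b in umis[i + 1:]:
            if b not in absorbed and hamming1(a, b) \
               and umi_counts[a] >= 2 * umi_counts[b] - 1:
                absorbed.add(b)            # b treated as an error copy of a
    return [u for u in umis if u not in absorbed]

counts = Counter({"ATTG": 456, "ATTA": 90, "TTTG": 2, "CGGC": 72})
print(directional_dedup(counts))           # -> ['ATTG', 'CGGC']
```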

  5. Composite mathematical modeling of calcium signaling behind neuronal cell death in Alzheimer's disease.

    PubMed

    Ranjan, Bobby; Chong, Ket Hing; Zheng, Jie

    2018-04-11

    Alzheimer's disease (AD) is a progressive neurological disorder, recognized as the most common cause of dementia in people aged 65 and above. AD is characterized by an increase in amyloid metabolism and by the misfolding and deposition of β-amyloid oligomers in and around neurons in the brain. These processes remodel the calcium signaling mechanism in neurons, leading to cell death via apoptosis. Despite accumulating knowledge about the biological processes underlying AD, mathematical models to date are restricted to depicting only a small portion of the pathology. Here, we integrated multiple mathematical models to analyze and understand the relationship among amyloid deposition, calcium signaling, and mitochondrial permeability transition pore (PTP) related cell apoptosis in AD. The model was used to simulate calcium dynamics in the absence and presence of AD. In the absence of AD, i.e., without β-amyloid deposition, mitochondrial and cytosolic calcium levels remain at low resting concentrations. However, our in silico simulation of AD with β-amyloid deposition shows an increase in the entry of calcium ions into the cell and dysregulation of Ca2+ channel receptors on the endoplasmic reticulum. This composite model enabled us to simulate behavior that cannot be measured experimentally. Our mathematical model depicting the mechanisms affecting calcium signaling in neurons can help understand AD at the systems level and has potential for diagnostic and therapeutic applications.
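
    A toy two-pool ODE conveys the qualitative behavior the abstract describes; every rate constant below is invented for illustration, and the actual composite model is far richer. With the amyloid term switched on, extra membrane influx drives cytosolic and mitochondrial calcium well above the resting level.

```python
from scipy.integrate import solve_ivp

def calcium_rhs(t, y, amyloid):
    """Toy cytosolic (c) and mitochondrial (m) Ca2+ pools, arbitrary units.
    The amyloid term adds plasma-membrane influx; all rates are invented."""
    c, m = y
    influx = 0.05 + 0.60 * amyloid     # extra entry through amyloid pores
    uptake = 0.50 * c                  # mitochondrial uptake
    release = 0.10 * m                 # mitochondrial efflux back to cytosol
    efflux = 0.30 * c                  # pumps extruding Ca2+ from the cell
    return [influx - uptake - efflux + release, uptake - release]

healthy = solve_ivp(calcium_rhs, (0, 200), [0.1, 0.1], args=(0.0,))
ad_case = solve_ivp(calcium_rhs, (0, 200), [0.1, 0.1], args=(1.0,))
# Steady cytosolic and mitochondrial levels rise severalfold with the amyloid
# term on, mimicking the dysregulation the composite model reports.
```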

  6. Initial Evaluation of a Conflict Detection Tool in the Terminal Area

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora; Tang, Huabin; Ballinger, Deborah S.; Kozon, Thomas E.; Farrahi, Amir Hossein

    2012-01-01

    Despite the recent economic recession and its adverse impact on air travel, the Federal Aviation Administration (FAA) continues to forecast an increase in air traffic demand that may see traffic double or triple by the year 2025. Increases in air traffic will burden the air traffic management system, and higher levels of safety and efficiency will be required. The air traffic controller's primary task is to ensure separation between aircraft in their airspace and keep the skies safe. As air traffic is forecast to increase in volume and complexity [1], there is an increased likelihood of conflicts between aircraft, which adds risk and inefficiency to air traffic management and increases controller workload. To attenuate these factors, recent ATM research has shown that air- and ground-based automation tools could reduce controller workload, especially if the automation is focused on conflict detection and resolution. Conflict Alert is a short-time-horizon conflict detection tool deployed in the Terminal Radar Approach Control (TRACON), which has limited utility due to the high number of false alerts generated and its use of dead reckoning to predict loss of separation between aircraft. Terminal Tactical Separation Assurance Flight Environment (T-TSAFE) is a short-time-horizon conflict detection tool that uses both flight intent and dead reckoning to detect conflicts. Results of a fast-time simulation experiment indicated that T-TSAFE provided a more effective alert lead time and generated fewer false alerts than Conflict Alert [2]. T-TSAFE was previously tested in a Human-In-The-Loop (HITL) simulation study that focused on the en route phase of flight [3]. The current study tested the T-TSAFE tool in an HITL simulation focusing on the terminal environment with current-day operations. The study identified procedures, roles, responsibilities, information requirements, and usability, with the help of TRACON controllers who participated in the experiment. Metrics such as alert lead time, alert response time, workload, situation awareness, and other measures were statistically analyzed. These metrics were examined from an overall perspective, and comparisons between conditions (altitude resolutions via keyboard entry vs. ADS-B entry) and controller positions (two final approach sectors and two feeder sectors) were also examined. The results of these analyses and controller feedback provided evidence of T-TSAFE's potential as a useful air traffic controller tool. Heuristic analysis also provided information on ways in which the T-TSAFE tool can be improved. Details of the analysis results will be presented in the full paper.
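
    Dead reckoning, the prediction method Conflict Alert relies on and one of the two methods T-TSAFE combines with flight intent, amounts to projecting constant velocities forward and testing separation at each probe time. A self-contained sketch follows; the separation thresholds and 5-second probe step are illustrative values, not the operational standards.

```python
import math

def dead_reckoning_conflict(ac1, ac2, horizon_s=120, sep_nm=3.0, sep_ft=1000.0):
    """Probe for predicted loss of separation by projecting constant velocity.
    Each aircraft state: (x_nm, y_nm, alt_ft, vx_kt, vy_kt, vs_fpm).
    Returns seconds until the first predicted conflict, or None."""
    for t in range(0, horizon_s + 1, 5):
        h = t / 3600.0                                     # elapsed hours
        dx = (ac1[0] + ac1[3] * h) - (ac2[0] + ac2[3] * h)
        dy = (ac1[1] + ac1[4] * h) - (ac2[1] + ac2[4] * h)
        dz = (ac1[2] + ac1[5] * t / 60.0) - (ac2[2] + ac2[5] * t / 60.0)
        if math.hypot(dx, dy) < sep_nm and abs(dz) < sep_ft:
            return t
    return None
```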

  7. Sasquatch Footprint Tool

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin

    2013-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted or released from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish an aircraft release point that will ensure that the article and all items released from it land in safe locations. A new footprint predictor tool, called Sasquatch, was created in MATLAB. This tool takes in a simulated trajectory for the test article, information about all released objects, and atmospheric wind data (simulated or actual) to calculate the trajectories of the released objects. Dispersions are applied to the landing locations of those objects, taking into account the variability of winds, the aircraft release point, and object descent rates. Sasquatch establishes a payload release point (e.g., where the payload will be extracted from the carrier aircraft) that ensures that the payload and all objects released from it will land in a specified cleared area. The landing locations (the final points in the trajectories) are plotted on a map of the test range. Sasquatch was originally designed for CPAS drop tests and includes extensive information about both the CPAS hardware and the primary test range used for CPAS testing. However, it can easily be adapted for more complex CPAS drop tests, other NASA projects, and commercial partners. CPAS has developed the Sasquatch footprint tool to ensure range safety during parachute drop tests. Sasquatch is well correlated to test data and continues to ensure the safety of test personnel as well as the safe recovery of all equipment. The tool will continue to be modified based on new test data, improving predictions and providing added capability to meet the requirements of more complex testing.
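
    The core of such a footprint predictor can be sketched as a Monte Carlo drift integration (shown in Python here rather than the tool's MATLAB; the layer structure and dispersion magnitudes are invented for illustration, not Sasquatch's calibrated values):

```python
import numpy as np

def landing_footprint(release_xy, layers, descent_fps, n=1000, rng=None):
    """Monte Carlo landing footprint for one released object.
    layers: (thickness_ft, wind_east_fps, wind_north_fps) tuples ordered from
    release altitude down to the ground."""
    rng = rng or np.random.default_rng(1)
    pts = []
    for _ in range(n):
        v = descent_fps * rng.normal(1.0, 0.10)          # descent-rate scatter
        x = release_xy[0] + rng.normal(0.0, 500.0)       # release-point scatter, ft
        y = release_xy[1] + rng.normal(0.0, 500.0)
        for thick, we, wn in layers:                     # drift = wind * fall time
            t = thick / v
            x, y = x + we * t, y + wn * t
        pts.append((x, y))
    return np.array(pts)                                 # plot over the range map

winds = [(5000.0, 20.0, 5.0), (5000.0, 35.0, -10.0)]    # two synthetic layers
footprint = landing_footprint((0.0, 0.0), winds, descent_fps=25.0)
```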

  8. Land-surface parameter optimisation using data assimilation techniques: the adJULES system V1.0

    DOE PAGES

    Raoult, Nina M.; Jupp, Tim E.; Cox, Peter M.; ...

    2016-08-25

    Land-surface models (LSMs) are crucial components of the Earth system models (ESMs) that are used to make coupled climate–carbon cycle projections for the 21st century. The Joint UK Land Environment Simulator (JULES) is the land-surface model used in the climate and weather forecast models of the UK Met Office. JULES is also extensively used offline as a land-surface impacts tool, forced with climatologies into the future. In this study, JULES is automatically differentiated with respect to JULES parameters using commercial software from FastOpt, resulting in an analytical gradient, or adjoint, of the model. Using this adjoint, the adJULES parameter estimation system has been developed to search for locally optimum parameters by calibrating against observations. This paper describes adJULES in a data assimilation framework and demonstrates its ability to improve the model–data fit using eddy-covariance measurements of gross primary production (GPP) and latent heat (LE) fluxes. adJULES also has the ability to calibrate over multiple sites simultaneously. This feature is used to define new optimised parameter values for the five plant functional types (PFTs) in JULES. The optimised PFT-specific parameters improve the performance of JULES at over 85 % of the sites used in the study, at both the calibration and evaluation stages. Furthermore, the new improved parameters for JULES are presented along with the associated uncertainties for each parameter.

  9. [Simulation-based learning and internal medicine: Opportunities and current perspectives for a national harmonized program].

    PubMed

    Galland, J; Abbara, S; Terrier, B; Samson, M; Tesnières, A; Fournier, J P; Braun, M

    2018-06-01

    Simulation-based learning (SBL) is developing rapidly in France, and the question of its use in the teaching of internal medicine (IM) is essential. While the HAS encourages its integration into medical education, the French young internists' association (AJI) set up a working group to reflect on the added value of this tool in our specialty. Different sorts of SBL exist: human, synthetic, and electronic. SBL enables students to acquire, and be evaluated on, technical skills (invasive procedures, etc.) and non-technical skills (relational skills, reasoning, etc.). The debriefing that follows the simulation session is an essential time in pedagogical terms: it fosters the acquisition of knowledge by encouraging students to reflect on and reshape their reasoning patterns through self-correction. IM interns are supportive of its use. Simulation would allow young internists to acquire skills specific to our specialty, such as certain procedures, the management of complex consultations, and the synthesis of difficult clinical cases. SBL still faces human and financial cost issues. The budgets allocated to the development and maintenance of simulation centres are uneven, making the supply of training unequal across the country. Simulation sessions are time-consuming and require teacher training. Are faculties ready to train and invest their time in simulation, even though existing studies do not allow us to draw conclusions about its pedagogical validity? Copyright © 2018 Société Nationale Française de Médecine Interne (SNFMI). Published by Elsevier SAS. All rights reserved.

  10. 3D FEM Simulation of Flank Wear in Turning

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio

    2011-05-01

    This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces, and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion, and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, and so on. In some cases these wear mechanisms are described by analytical models as functions of process variables (temperature, pressure, and sliding velocity along the cutting surface). Such analytical models are suitable for implementation in FEM codes, where they can be used to simulate tool wear. In the present paper a commercial 3D FEM software package has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was extended with a suitable subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation proceeds. Since, for the considered tool-workpiece material pair, the main wear-generating phenomena are abrasion and diffusion, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and Takeyama and Murata's model. A comparison between experimental and simulated flank wear curves is reported, demonstrating that it is possible to simulate tool wear development.
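
    The combined wear-rate form the abstract points to can be sketched as Usui's adhesive term plus the diffusive term of Takeyama and Murata (their abrasive contribution is omitted here, and every constant below is a placeholder; calibrated values depend on the tool-workpiece pair and the FEM setup):

```python
import math

def wear_rate(p_contact, v_slide, temp_k,
              A=1.0e-8, B=2.5e3,        # Usui constants (placeholders)
              D=4.0e-3, E=7.5e4):       # diffusive-term constants (placeholders)
    """Flank wear rate dW/dt = A * p * Vs * exp(-B/T)   (Usui, adhesive)
                             + D * exp(-E / (R * T))    (Takeyama-Murata, diffusive)
    with p the contact pressure, Vs the sliding velocity, T the temperature."""
    R = 8.314                           # gas constant, J/(mol K)
    return (A * p_contact * v_slide * math.exp(-B / temp_k)
            + D * math.exp(-E / (R * temp_k)))

# Inside the FEM loop this rate would be integrated over each time increment
# and used to recede the worn tool-surface nodes before the next cutting step.
rate = wear_rate(p_contact=1.2e9, v_slide=2.5, temp_k=1150.0)
```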

  11. Advanced Doubling Adding Method for Radiative Transfer in Planetary Atmospheres

    NASA Astrophysics Data System (ADS)

    Liu, Quanhua; Weng, Fuzhong

    2006-12-01

    The doubling adding method (DA) is one of the most accurate tools for detailed multiple-scattering calculations. The principle of the method goes back to the nineteenth century, to a problem dealing with reflection and transmission by glass plates. Since then, the doubling adding method has been widely used as a reference for other radiative transfer models. The method has never been used in operational applications owing to its tremendous demand on computational resources. This study derives an analytical expression replacing the most complicated thermal source terms in the doubling adding method. The new development is called the advanced doubling adding (ADA) method. Thanks also to the efficiency of matrix and vector manipulations in FORTRAN 90/95, the advanced doubling adding method is about 60 times faster than the doubling adding method. The radiance (i.e., forward) computation code of ADA is easily translated into tangent-linear and adjoint codes for radiance gradient calculations. This simplicity in the forward and Jacobian computation codes is very useful for operational applications and for consistency between the forward and adjoint calculations in satellite data assimilation.
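
    The doubling step at the heart of DA combines two identical layers through their multiple interreflections. A minimal matrix sketch follows (in Python rather than the Fortran the abstract mentions; quantities are discretized over angle, and the thermal source terms, the part ADA replaces with an analytical expression, are omitted):

```python
import numpy as np

def double_layer(R, T):
    """Combine two identical layers with reflection R and transmission T
    (square matrices over discretized angles). The matrix inverse sums the
    infinite series of bounces between the two half-layers."""
    inter = np.linalg.inv(np.eye(R.shape[0]) - R @ R)
    R2 = R + T @ inter @ R @ T
    T2 = T @ inter @ T
    return R2, T2

# Starting from an optically thin initial layer, k doublings yield a layer
# 2**k times thicker at a cost logarithmic in optical depth.
R, T = 0.05 * np.eye(4), 0.90 * np.eye(4)
for _ in range(10):
    R, T = double_layer(R, T)
```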

  12. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    NREL develops vehicle technology simulation and analysis tools to evaluate vehicle technologies with the potential to achieve significant fuel savings and emission reductions. These include the Automotive Deployment Options Projection Tool (ADOPT), a modeling tool that estimates vehicle technology adoption.

  13. Re-engineering the mission life cycle with ABC and IDEF

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Rackley, Michael; Karlin, Jay

    1994-01-01

    The theory behind re-engineering a business process is to remove the non-value-added activities, thereby lowering the process cost. To achieve this, one must be able to identify where the non-value-added elements are located, which is not a trivial task, because non-value-added elements are often hidden in the form of overhead and/or pooled resources. To isolate these non-value-added processes from the other processes, one must first decompose the overall top-level process into lower layers of sub-processes. In addition, costing data must be assigned to each sub-process, along with the value the sub-process adds toward the final product. IDEF0 is a Federal Information Processing Standard (FIPS) process-modeling tool that allows this functional decomposition through structured analysis; it also illustrates the relationship between a process and the value added to the product or service. The value-added portion is further defined in IDEF1X, an entity-relationship diagramming tool. The entity-relationship model is the blueprint of the product as it moves along the 'assembly line': it relates all of the parts to each other and to the final product, and it relates the parts to the tools that produce the product and to all of the paperwork used in their acquisition. The use of IDEF therefore facilitates the use of Activity Based Costing (ABC). ABC is an essential method, in a high-variety, product-customizing environment, for facilitating rapid response to externally caused change. This paper describes the work being done in the Mission Operations Division to re-engineer the development and operation life cycle of Mission Operations Centers using these tools.

  14. Episodic simulation of future events is impaired in mild Alzheimer's disease

    PubMed Central

    Addis, Donna Rose; Sacchetti, Daniel C.; Ally, Brandon A.; Budson, Andrew E.; Schacter, Daniel L.

    2009-01-01

    Recent neuroimaging studies have demonstrated that both remembering the past and simulating the future activate a core neural network including the medial temporal lobes. Regions of this network, in particular the medial temporal lobes, are prime sites for amyloid deposition and are structurally and functionally compromised in Alzheimer's disease (AD). While we know some functions of this core network, specifically episodic autobiographical memory, are impaired in AD, no study has examined whether future episodic simulation is similarly impaired. We tested the ability of sixteen AD patients and sixteen age-matched controls to generate past and future autobiographical events using an adapted version of the Autobiographical Interview. Participants also generated five remote autobiographical memories from across the lifespan. Event transcriptions were segmented into distinct details, classified as either internal (episodic) or external (non-episodic). AD patients exhibited deficits in both remembering past events and simulating future events, generating fewer internal and external episodic details than healthy older controls. The internal and external detail scores were strongly correlated across past and future events, providing further evidence of the close linkages between the mental representations of past and future. PMID:19497331

  15. Development and Testing of Control Laws for the Active Aeroelastic Wing Program

    NASA Technical Reports Server (NTRS)

    Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John

    2005-01-01

    The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed the use of an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool called CONDUIT, which employs a multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this new software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization of this specific software design tool. Predicted performance is also compared to results from flight.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.; Murray, Regan; Bynum, Michael

    Water utilities are vulnerable to a wide variety of human-caused and natural disasters. These disruptive events can result in loss of water service, contaminated water, pipe breaks, and failed equipment. Furthermore, long-term changes in water supply and customer demand can have a large impact on the operating conditions of the network. The ability to maintain drinking water service during and following these types of events is critical. Simulation and analysis tools can help water utilities explore how their network will respond to disruptive events and plan effective mitigation strategies. The U.S. Environmental Protection Agency and Sandia National Laboratories are developing new software tools to meet this need. The Water Network Tool for Resilience (WNTR, pronounced winter) is a Python package designed to help water utilities investigate the resilience of water distribution systems over a wide range of hazardous scenarios and to evaluate resilience-enhancing actions. The following documentation includes installation instructions and examples, a description of software features, and the software license. It is assumed that the reader is familiar with the Python programming language. References are included for additional background on software components. Online documentation, hosted at http://wntr.readthedocs.io/, will be updated as new features are added. The online version includes API documentation and information for developers.
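
    A short usage sketch based on the basic workflow in the WNTR documentation follows. The network file path and pipe name are placeholders, and the disruption shown (closing a pipe to mimic a break) is just one simple instance of the hazardous scenarios the package targets; consult the online documentation for the authoritative API.

```python
import wntr

# Build a model from an EPANET INP file (path is a placeholder), run a
# hydraulic simulation, and extract node pressures as a pandas DataFrame.
wn = wntr.network.WaterNetworkModel('networks/Net3.inp')
results = wntr.sim.WNTRSimulator(wn).run_sim()
pressure = results.node['pressure']

# What-if scenario: rebuild the model, close one pipe to mimic a break
# (pipe name '123' is a placeholder), and re-simulate for comparison.
wn = wntr.network.WaterNetworkModel('networks/Net3.inp')
wn.get_link('123').initial_status = wntr.network.LinkStatus.Closed
pressure_broken = wntr.sim.WNTRSimulator(wn).run_sim().node['pressure']
```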

  17. Mechanical abuse simulation and thermal runaway risks of large-format Li-ion batteries

    NASA Astrophysics Data System (ADS)

    Wang, Hsin; Lara-Curzio, Edgar; Rule, Evan T.; Winchester, Clinton S.

    2017-02-01

    Internal short circuits of large-format Li-ion pouch cells induced by mechanical abuse were simulated using a modified mechanical pinch test. A torsion force was added manually at ∼40% of the maximum compressive loading force during the pinch test: the cell was twisted about 5° to the side by horizontally pulling a wire attached to the anode tab. The combined torsion-compression loading created a small failure in the separator yet allowed testing of fully charged large-format Li-ion cells without triggering thermal runaway. Two types of commercial cells, 18 Ah LiFePO4 (LFP) and 25 Ah Li(NiMnCo)1/3O2 (NMC), were tested using 4-6 cells at each state of charge (SOC), and a thermal runaway risk (TRR) score system was used to evaluate the safety of the cells under the same testing conditions. The aim was to provide cell manufacturers and end users with a tool to compare different designs and safety features.

  18. The impact of range anxiety and home, workplace, and public charging infrastructure on simulated battery electric vehicle lifetime utility

    NASA Astrophysics Data System (ADS)

    Neubauer, Jeremy; Wood, Eric

    2014-07-01

    Battery electric vehicles (BEVs) offer the potential to reduce both oil imports and greenhouse gas emissions, but have a limited utility due to factors including driver range anxiety and access to charging infrastructure. In this paper we apply NREL's Battery Lifetime Analysis and Simulation Tool for Vehicles (BLAST-V) to examine the sensitivity of BEV utility to range anxiety and different charging infrastructure scenarios, including variable time schedules, power levels, and locations (home, work, and public installations). We find that the effects of range anxiety can be significant, but are reduced with access to additional charging infrastructure. We also find that (1) increasing home charging power above that provided by a common 15 A, 120 V circuit offers little added utility, (2) workplace charging offers significant utility benefits to select high mileage commuters, and (3) broadly available public charging can bring many lower mileage drivers to near-100% utility while strongly increasing the achieved miles of high mileage drivers.

  19. Test of the "radical-like polymerization" scheme in molecular dynamics on the behavior of polymers under shock loading

    NASA Astrophysics Data System (ADS)

    Lemarchand, Claire; Bousquet, David; Schnell, Benoît; Pineau, Nicolas

    2017-06-01

    The behavior of polymer melts under shock loading is a question attracting more and more attention because of applications such as polymer-bonded explosives, light-weight armor, and civilian protective equipment, including sports and car equipment. Molecular dynamics (MD) simulations are a very good tool for characterizing the microscopic response of a polymer to a shock wave. To do so, the initial configuration of the polymer melt needs to be realistic. The "radical-like polymerization" scheme is a method for obtaining near-equilibrium configurations of a melt of long polymer chains. It consists of adding one neighboring monomer at a time to each growing chain; between polymerization steps, an MD run is performed to relax the new configuration. We test whether details of our implementation of the "radical-like polymerization" scheme affect Hugoniot curves and changes of chain configuration under shock, and we compare our results to other simulation and experimental results on reference polymers.

  20. Modeling Vortex Generators in the Wind-US Code

    NASA Technical Reports Server (NTRS)

    Dudek, Julianne C.

    2010-01-01

    A source term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied, and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single-vane vortex generator, supersonic flow in a rectangular duct with a counter-rotating vortex generator pair, and subsonic flow in an S-duct with 22 co-rotating vortex generators. The validation results indicate that the source term vortex generator model provides a useful tool for screening vortex generator configurations and gives results comparable to solutions computed using a gridded vane.

  1. Validation of CT dose-reduction simulation

    PubMed Central

    Massoumzadeh, Parinaz; Don, Steven; Hildebolt, Charles F.; Bae, Kyongtae T.; Whiting, Bruce R.

    2009-01-01

    The objective of this research was to develop and validate a custom computed tomography dose-reduction simulation technique for producing images that have an appearance consistent with the same scan performed at a lower mAs (with fixed kVp, rotation time, and collimation). Synthetic noise is added to projection (sinogram) data, incorporating a stochastic noise model that includes energy-integrating detectors, tube-current modulation, bowtie beam filtering, and electronic system noise. Experimental methods were developed to determine the parameters required for each component of the noise model. As a validation, the outputs of the simulations were compared to measurements with cadavers in the image domain and with phantoms in both the sinogram and image domain, using an unbiased root-mean-square relative error metric to quantify agreement in noise processes. Four-alternative forced-choice (4AFC) observer studies were conducted to confirm the realistic appearance of simulated noise, and the effects of various system model components on visual noise were studied. The “just noticeable difference (JND)” in noise levels was analyzed to determine the sensitivity of observers to changes in noise level. Individual detector measurements were shown to be normally distributed (p>0.54), justifying the use of a Gaussian random noise generator for simulations. Phantom tests showed the ability to match original and simulated noise variance in the sinogram domain to within 5.6%±1.6% (standard deviation), which was then propagated into the image domain with errors less than 4.1%±1.6%. Cadaver measurements indicated that image noise was matched to within 2.6%±2.0%. More importantly, the 4AFC observer studies indicated that the simulated images were realistic, i.e., no detectable difference between simulated and original images (p=0.86) was observed. JND studies indicated that observers’ sensitivity to change in noise levels corresponded to a 25% difference in dose, which is far larger than the noise accuracy achieved by simulation. In summary, the dose-reduction simulation tool demonstrated excellent accuracy in providing realistic images. The methodology promises to be a useful tool for researchers and radiologists to explore dose reduction protocols in an effort to produce diagnostic images with radiation dose “as low as reasonably achievable.” PMID:19235386
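
    Stripped of the detector, bowtie, and tube-current-modulation details that the validated model includes, the core operation reduces to injecting zero-mean Gaussian noise into the sinogram so its variance matches the target mAs. The sketch below assumes a per-ray variance estimate for the original scan and uses the approximately 1/mAs scaling of quantum noise; it is a simplification, not the validated simulation tool.

```python
import numpy as np

def simulate_lower_dose(sinogram, sinogram_var, mAs_orig, mAs_target, rng=None):
    """Add zero-mean Gaussian noise so per-ray variance rises from the measured
    sinogram_var to the value expected at mAs_target (quantum noise ~ 1/mAs).
    Tube-current modulation, bowtie filtering, and electronic noise terms of
    the full model are omitted in this simplification."""
    rng = rng or np.random.default_rng(7)
    var_target = sinogram_var * (mAs_orig / mAs_target)
    add_sd = np.sqrt(np.clip(var_target - sinogram_var, 0.0, None))
    return sinogram + rng.standard_normal(sinogram.shape) * add_sd

# Example: make a 200 mAs acquisition look like an 80 mAs scan.
sino = np.zeros((720, 888))                 # placeholder projection data
var = np.full_like(sino, 1.0e-6)            # placeholder per-ray variance map
low_dose = simulate_lower_dose(sino, var, mAs_orig=200.0, mAs_target=80.0)
```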

  2. PICS: SIMULATIONS OF STRONG GRAVITATIONAL LENSING IN GALAXY CLUSTERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Nan; Gladders, Michael D.; Florian, Michael K.

    2016-09-01

    Gravitational lensing has become one of the most powerful tools available for investigating the “dark side” of the universe. Cosmological strong gravitational lensing, in particular, probes the properties of the dense cores of dark matter halos over decades in mass and offers the opportunity to study the distant universe at flux levels and spatial resolutions otherwise unavailable. Studies of strongly lensed variable sources offer even further scientific opportunities. One of the challenges in realizing the potential of strong lensing is to understand the statistical context of both the individual systems that receive extensive follow-up study, as well as that of the larger samples of strong lenses that are now emerging from survey efforts. Motivated by these challenges, we have developed an image simulation pipeline, Pipeline for Images of Cosmological Strong lensing (PICS), to generate realistic strong gravitational lensing signals from group- and cluster-scale lenses. PICS uses a low-noise and unbiased density estimator based on (resampled) Delaunay Tessellations to calculate the density field; lensed images are produced by ray-tracing images of actual galaxies from deep Hubble Space Telescope observations. Other galaxies, similarly sampled, are added to fill in the light cone. The pipeline further adds cluster member galaxies and foreground stars into the lensed images. The entire image ensemble is then observed using a realistic point-spread function that includes appropriate detector artifacts for bright stars. Noise is further added, including such non-Gaussian elements as noise window-paning from mosaiced observations, residual bad pixels, and cosmic rays. The aim is to produce simulated images that appear identical, to the eye (expert or otherwise), to real observations in various imaging surveys.

  3. Numerical simulation of aerobic exercise as a countermeasure in human spaceflight

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    The objective of this work is to analyse the efficacy of long-term regular exercise on relevant cardiovascular parameters when the human body is also exposed to microgravity. Computer simulations are an important tool that may be used to predict and analyse these possible effects and compare them with in-flight experiments. We based our study on an electrical-like computer model (NELME: Numerical Evaluation of Long-term Microgravity Effects), which was developed in our laboratory and validated with the available data, focusing on the cardiovascular parameters affected by changes in gravity exposure. NELME is based on an electrical-like control system model of the physiological changes that are known to take place when gravity changes are applied. The computer implementation has a modular architecture; hence, different output parameters, potential effects, organs, and countermeasures can be easily implemented and evaluated. We added to the existing cardiovascular system module a perturbation module to evaluate the effect of regular exercise on the output parameters previously studied. We therefore simulated a well-known countermeasure, under different exercise protocols, as a pattern of input electrical-like perturbations on the basic module. Different scenarios were numerically simulated for both men and women under different patterns of microgravity, reduced gravity, and time exposure. EVAs were also simulated as perturbations to the system. Results show slight differences between genders, with more risk reduction for women than for men after following an aerobic exercise pattern during a simulated mission. The reduction in risk of a cardiovascular malfunction is also evaluated, with a ceiling effect found in all scenarios. A turning point in vascular resistance for long-term exposure to gravity below 0.4 g is of particular interest. In conclusion, we show that computer simulations are a valuable tool for analysing different effects of long-term microgravity exposure on the human body. Potential countermeasures such as physical exercise can also be evaluated as induced perturbations to the system. The results are compatible with existing data and are of interest for assessing the efficacy of aerobic exercise as a countermeasure in future missions to Mars.

  4. Non-linear dynamical classification of short time series of the Rössler system in high noise regimes.

    PubMed

    Lainscsek, Claudia; Weyhenmeyer, Jonathan; Hernandez, Manuel E; Poizner, Howard; Sejnowski, Terrence J

    2013-01-01

    Time series analysis with delay differential equations (DDEs) reveals non-linear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here, global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of encephalographic (EEG) data recorded from patients with Parkinson's disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b while keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 to -30 dB signal-to-noise ratio (SNR). Structure selection was supervised by selecting the number of terms, the delays, and the order of non-linearity of the DDE model that best linearly separated the two classes of data. The distance d from the linear dividing hyperplane was then used to assess classification performance by computing the area A' under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions and, moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, remaining significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data.

  5. Proceedings of the 1987 conference on tools for the simulation profession

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawkins, R.; Klukis, K.

    1987-01-01

    This book covers the proceedings of the 1987 conference on tools for the simulation profession. Some of the topics are: SIMULACT: a generic tool for simulating distributed systems; ESL language simulation of spacecraft batteries; and Trends in global cadmium levels from increased use of fossil fuels.

  6. Operational Improvements From the Automatic Dependent Surveillance Broadcast In-Trail Procedure in the Pacific Organized Track System

    NASA Technical Reports Server (NTRS)

    Chartrand, Ryan C.; Jones, Kenneth M.; Allen, Bonnie D.

    2012-01-01

    The Federal Aviation Administration's Surveillance and Broadcast Services Program has supported implementation of the Automatic Dependent Surveillance Broadcast (ADS-B) In-Trail Procedure (ITP) on commercial revenue flights. The ADS-B ITP is intended for use in non-radar airspace employing procedural separation. Through the use of onboard tools, pilots are able to make a new type of altitude change request to an Air Traffic Service Provider (ATSP). The FAA, in partnership with United Airlines, is conducting flight trials of the ITP in revenue service in the Pacific. To support the expansion of flight trials to the rest of the US-managed Pacific Airspace Region, a computerized batch study was conducted to investigate the operational impacts and potential benefits of using the ITP in the Pacific Organized Track System (PACOTS). This study, which simulated the Oakland-managed portion of the PACOTS, suggests that potential benefits in the PACOTS are significant, with a considerable increase in time spent at optimum altitude and associated fuel savings.

  7. How Confounder Strength Can Affect Allocation of Resources in Electronic Health Records.

    PubMed

    Lynch, Kristine E; Whitcomb, Brian W; DuVall, Scott L

    2018-01-01

    When electronic health record (EHR) data are used, multiple approaches may be available for measuring the same potentially confounding variable. While additional information may be gleaned and residual confounding reduced through resource-intensive assessment methods such as natural language processing (NLP), whether the added benefits offset the cost of the additional resources is not straightforward. We evaluated the implications of misclassification of a confounder when using EHRs. Using a combination of simulations and real data surrounding hospital readmission, we considered smoking as a potential confounder. We compared ICD-9 diagnostic code assignment, an easily available measure with the possibility of substantial misclassification of smoking status, with NLP, a method of determining smoking status that is more expensive and time-consuming than ICD-9 code assignment but has less potential for misclassification. Classification of smoking status with NLP consistently produced less residual confounding than the use of ICD-9 codes; however, when minimal confounding was present, differences between the approaches were small. When considerable confounding is present, investing in a superior measurement tool becomes advantageous.
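
    The study's logic is easy to reproduce in a toy simulation (all numbers below are invented, not the paper's data): with no true exposure effect, stratifying on a well-measured confounder drives the adjusted risk difference toward zero, while a poorly measured one leaves residual confounding.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
smoker = rng.random(n) < 0.3                            # true confounder
exposure = rng.random(n) < np.where(smoker, 0.5, 0.2)   # confounded exposure
outcome = rng.random(n) < np.where(smoker, 0.3, 0.1)    # no true exposure effect

def adjusted_risk_diff(measured_conf):
    """Risk difference for exposure after stratifying on the measured
    (possibly misclassified) confounder; ~0 means confounding is controlled."""
    rd = 0.0
    for s in (False, True):
        m = measured_conf == s
        d = outcome[m & exposure].mean() - outcome[m & ~exposure].mean()
        rd += d * m.mean()
    return rd

def misclassify(conf, sens, spec):
    """Flip labels according to the measurement's sensitivity/specificity."""
    flip = np.where(conf, rng.random(n) > sens, rng.random(n) > spec)
    return conf ^ flip

print(adjusted_risk_diff(misclassify(smoker, 0.95, 0.98)))  # NLP-like: near 0
print(adjusted_risk_diff(misclassify(smoker, 0.60, 0.95)))  # ICD-9-like: biased
```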

  8. Development of autonomous controller system of high speed UAV from simulation to ready to fly condition

    NASA Astrophysics Data System (ADS)

    Yudhi Irwanto, Herma

    2018-02-01

    Development of the autonomous controller system used in our high speed UAV, called the RKX-200EDF/TJ controlled vehicle, needs to continue as a step toward mastering and developing the control system of LAPAN's satellite launch rocket. The weaknesses of the existing control system in this high speed UAV need to be repaired and replaced with the autonomous controller system. Conversion steps toward a ready-to-fly system involved controlling the X tail fin, adjusting the auto take-off procedure by adding an X-axis sensor, reading waypoints, measuring the distance and heading to the nearest waypoint, developing a user-friendly ground station, and adding tools for safe landing. The development of this autonomous controller system also included a real flight test in Pandanwangi, Lumajang, in November 2016. Unfortunately, the flight test was not successful because the booster rocket blew up right after ignition. However, the system recorded the event and demonstrated that the controller system had worked according to plan.
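
    The waypoint step, computing distance and heading to the nearest waypoint, is standard great-circle navigation. A sketch follows (written in Python as an illustration, not the flight code; the function names are ours):

```python
import math

def distance_heading(lat1, lon1, lat2, lon2):
    """Great-circle distance (m, haversine) and initial bearing (deg, 0-360)
    from the vehicle position to a waypoint."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    brg = math.degrees(math.atan2(math.sin(dlon) * math.cos(p2),
                                  math.cos(p1) * math.sin(p2)
                                  - math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return dist, (brg + 360.0) % 360.0

def nearest_waypoint(pos, waypoints):
    """Pick the waypoint with the smallest great-circle distance."""
    return min(waypoints, key=lambda wp: distance_heading(*pos, *wp)[0])
```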

  9. Smooth Particle Hydrodynamics GPU-Acceleration Tool for Asteroid Fragmentation Simulation

    NASA Astrophysics Data System (ADS)

    Buruchenko, Sergey K.; Schäfer, Christoph M.; Maindl, Thomas I.

    2017-10-01

    The impact threat of near-Earth objects (NEOs) is a concern to the global community, as evidenced by the Chelyabinsk event (caused by a 17-m meteorite) in Russia on February 15, 2013 and a near miss by asteroid 2012 DA14 (~30 m diameter) on the same day. The expected energy, from either a low-altitude air burst or a direct impact, would have severe consequences, especially in populated regions. One method of mitigating this threat is the employment of large kinetic-energy impactors (KEIs). The simulation of asteroid target fragmentation is a challenging task that demands efficient and accurate numerical methods with large computational power. Modern graphics processing units (GPUs) yield a major increase, of 10 times and more, in the performance of computations of astrophysical and high-velocity impacts. This paper presents a new implementation of the smooth particle hydrodynamics (SPH) numerical method using NVIDIA GPUs and the first astrophysical and high-velocity impact applications of the new code. The code allows for a tremendous increase in the speed of astrophysical simulations with SPH and self-gravity at low cost for new hardware. We have implemented the SPH equations to model gases, liquids, and elastic and plastic solid bodies, and added a fragmentation model for brittle materials. Self-gravity may optionally be included in the simulations.

  12. Hyperspectral imaging simulation of object under sea-sky background

    NASA Astrophysics Data System (ADS)

    Wang, Biao; Lin, Jia-xuan; Gao, Wei; Yue, Hui

    2016-10-01

    Remote sensing image simulation plays an important role in spaceborne/airborne payload demonstration and algorithm development. Hyperspectral imaging is valuable in marine monitoring and in search and rescue. To meet the demand for spectral imaging of objects in complex sea scenes, a physics-based method for simulating spectral images of such objects is proposed. By developing an imaging simulation model that accounts for the object, background, atmospheric conditions, and sensor, it is possible to examine the influence of wind speed, atmospheric conditions, and other environmental factors on spectral image quality in complex sea scenes. First, the sea scattering model is established based on the Phillips sea spectral model, rough surface scattering theory, and the volume scattering characteristics of water. The measured bidirectional reflectance distribution function (BRDF) data of objects are fitted to a statistical model. MODTRAN software is used to obtain the solar illumination on the sea, the sky brightness, the atmospheric transmittance from sea to sensor, and the atmospheric backscattered radiance, and a Monte Carlo ray tracing method is used to calculate the composite scattering of the sea-surface object and the spectral image. Finally, the object spectrum is obtained by spatial transformation, radiometric degradation, and the addition of noise. The model connects the spectral image with the environmental parameters, the object parameters, and the sensor parameters, providing a tool for payload demonstration and algorithm development.

  13. Mitral Valve Repair Using ePTFE Sutures for Ruptured Mitral Chordae Tendineae: A Computational Simulation Study

    PubMed Central

    Rim, Yonghoon; Laing, Susan T.; McPherson, David D.; Kim, Hyunggun

    2013-01-01

    Mitral valve repair using expanded polytetrafluoroethylene (ePTFE) sutures is an established and preferred interventional method to resolve the complex pathophysiologic problems associated with chordal rupture. We developed a novel computational evaluation protocol to determine the effect of the artificial sutures on restoring mitral valve function following valve repair. A virtual mitral valve was created using three-dimensional echocardiographic data from a patient with ruptured mitral chordae tendineae. Virtual repairs were designed by adding artificial sutures between the papillary muscles and the posterior leaflet where the native chordae were ruptured. Dynamic finite element simulations were performed to evaluate pre- and post-repair mitral valve function. Abnormal posterior leaflet prolapse and mitral regurgitation were clearly demonstrated in the mitral valve with ruptured chordae. Following virtual repair to reconstruct the ruptured chordae, the severity of the posterior leaflet prolapse decreased and stress concentration was markedly reduced both in the leaflet tissue and in the intact native chordae. Complete leaflet coaptation was restored when four or six sutures were utilized. Computational simulations provided quantitative information on functional improvement following mitral valve repair. This novel simulation strategy may provide a powerful tool for evaluation and prediction of interventional treatment for ruptured mitral chordae tendineae. PMID:24072489

  14. IgSimulator: a versatile immunosequencing simulator.

    PubMed

    Safonova, Yana; Lapidus, Alla; Lill, Jennie

    2015-10-01

    The recent introduction of next-generation sequencing technologies to antibody studies has resulted in a growing number of immunoinformatics tools for antibody repertoire analysis. However, benchmarking these newly emerging tools remains problematic since the gold standard datasets needed to validate them are typically not available. Since simulating antibody repertoires is often the only feasible way to benchmark new immunoinformatics tools, we developed the IgSimulator tool, which addresses various complications in generating realistic antibody repertoires. IgSimulator's code has a modular structure and can easily be adapted to new simulation requirements. IgSimulator is open source and freely available as a C++ and Python program running on all Unix-compatible platforms. The source code is available from yana-safonova.github.io/ig_simulator. safonova.yana@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Dynamic load synthesis for shock numerical simulation in space structure design

    NASA Astrophysics Data System (ADS)

    Monti, Riccardo; Gasbarri, Paolo

    2017-08-01

    From a mechanical point of view, pyroshock loads are the most stressful environment that space equipment experiences during its operational life. In general, the mechanical designer considers the pyroshock analysis a very demanding constraint. Unfortunately, due to the non-linear behaviour of the structure under such loads, only experimental tests can demonstrate whether it is able to withstand these dynamic loads. Taking all the previous considerations into account, some preliminary information about the correctness of the design can be obtained by performing ad-hoc numerical simulations, for example via commercial finite element software (e.g. MSC Nastran). Usually these numerical tools approach the shock solution in two ways: 1) a direct mode, using a time-dependent enforcement and evaluating the time-response and space-response as well as the internal forces; 2) a modal basis approach, considering a frequency-dependent load and evaluating the internal forces in the frequency domain. The main aim of this paper is to develop a numerical tool to synthesize the time-dependent enforcement based on deterministic and/or genetic algorithm optimisers. In particular, starting from a specified spectrum in terms of SRS (Shock Response Spectrum), a time-dependent discrete function, typically an acceleration profile, is obtained to force the equipment by simulating the shock event. The synthesis time and the interface with standard numerical codes are two of the main topics dealt with in the paper. In addition, a congruity and consistency methodology is presented to ensure that the identified time-dependent loads fully match the specified spectrum.
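    The forward problem that any such synthesis loop must solve repeatedly is: given a candidate acceleration time history, compute its SRS and compare it with the target. A minimal sketch follows, using a generic single-degree-of-freedom formulation with an assumed Q of 10 and an invented two-component decaying-sinusoid candidate; the optimiser that would tune the amplitudes is omitted:

    ```python
    import numpy as np
    from scipy import signal

    def srs(accel, dt, freqs, q=10.0):
        """Maximax absolute-acceleration shock response spectrum of `accel`."""
        zeta = 1.0 / (2.0 * q)
        t = np.arange(len(accel)) * dt
        out = []
        for fn in freqs:
            wn = 2.0 * np.pi * fn
            # SDOF base excitation: states are relative displacement/velocity,
            # output is the absolute acceleration of the mass.
            A = [[0.0, 1.0], [-wn**2, -2.0 * zeta * wn]]
            B = [[0.0], [-1.0]]
            C = [[-wn**2, -2.0 * zeta * wn]]
            sys = signal.StateSpace(A, B, C, [[0.0]])
            _, y, _ = signal.lsim(sys, accel, t)
            out.append(np.max(np.abs(y)))
        return np.array(out)

    # Candidate enforcement: a sum of decaying sinusoids whose amplitudes a
    # deterministic or genetic optimiser would tune to match the target SRS.
    dt = 1e-5
    t = np.arange(0.0, 0.05, dt)
    amps, comps, tau = [50.0, 30.0], [500.0, 2000.0], 5e-3
    accel = sum(a * np.exp(-t / tau) * np.sin(2 * np.pi * f * t)
                for a, f in zip(amps, comps))

    freqs = np.geomspace(100.0, 5000.0, 20)
    print(srs(accel, dt, freqs))
    ```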

  16. PlanetPack: A radial-velocity time-series analysis tool facilitating exoplanets detection, characterization, and dynamical simulations

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2013-08-01

    We present PlanetPack, a new software tool that we developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the goals of exoplanet detection, characterization, and basic dynamical N-body simulations. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) use of more efficient maximum-likelihood periodograms that involve the full multi-planet fitting (sometimes called "residual" or "recursive" periodograms); (iv) easily calculable parametric 2D likelihood function level contours, reflecting the asymptotic confidence regions; (v) user-friendly fitting under useful functional constraints; (vi) basic tasks of short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting the data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for determining statistical significance. Further functionality may be added to PlanetPack in the future. During the development of this software, much effort was made to improve the computational speed, especially for CPU-demanding tasks. PlanetPack was written in pure C++ (1998/2003 standard) and is expected to be compilable and usable on a wide range of platforms.
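    As a sketch of item (i), the jitter treatment amounts to adding an unknown variance term in quadrature to the quoted measurement errors inside the likelihood. The toy Python function below assumes a hypothetical circular one-planet model (it is not PlanetPack's actual C++ implementation); maximizing it over all five parameters yields the jitter-aware fit:

    ```python
    import numpy as np

    def rv_loglike(params, t, rv, sigma):
        """Log-likelihood of a circular one-planet RV model with unknown jitter.

        params = (K, P, phi, gamma, s): semi-amplitude, period, phase,
        systemic velocity, and jitter added in quadrature to the quoted errors.
        """
        K, P, phi, gamma, s = params
        model = gamma + K * np.sin(2.0 * np.pi * t / P + phi)
        var = sigma**2 + s**2          # inflate quoted errors by the jitter
        resid = rv - model
        return -0.5 * np.sum(resid**2 / var + np.log(2.0 * np.pi * var))

    # Toy data: one circular planet plus noise exceeding the quoted errors.
    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0.0, 200.0, 60))
    sigma = np.full(60, 2.0)                        # quoted errors (m/s)
    rv = 15.0 * np.sin(2 * np.pi * t / 12.3) + rng.normal(0.0, 3.0, 60)
    print(rv_loglike((15.0, 12.3, 0.0, 0.0, 2.2), t, rv, sigma))
    ```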

  17. Risk Reduction and Training using Simulation Based Tools - 12180

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Irin P.

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to capture every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps toward using simulation based tools in the nuclear domain, the applications described here represent only a small cross-section of the possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize the state of operations in real time. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domain, this may mean the creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors.

  18. A human virus improves diabetes

    USDA-ARS?s Scientific Manuscript database

    A single inoculation of mice with Ad36, a human adenovirus, lastingly improved high-fat-diet-induced diabetes (DID), while Ad2, another human adenovirus, did not. The objective of these two studies was to determine if Ad36 could be used as a tool to reveal novel pathways for improving dysglycemia...

  19. PyFly: A fast, portable aerodynamics simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Daniel; Ghommem, M.; Collier, Nathaniel O.

    Here, we present a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM), namely PyFly. UVLM computes the aerodynamic loads applied on lifting surfaces while capturing unsteady effects such as the added mass forces, the growth of bound circulation, and the wake, while assuming that the flow separation location is known a priori. The method is based on discretizing the body surface into a lattice of vortex rings and relies on the Biot–Savart law to construct the velocity field at every point in the simulated domain. We introduce the pointwise approximation approach to simulate the interactions of the far-field vortices, overcoming the computational burden associated with the classical implementation of UVLM. The computational framework uses the Python programming language to provide an easy-to-handle user interface, while the computational kernels are written in Fortran. The mixed-language approach enables high performance in terms of solution time and great flexibility in adapting the code to different system configurations and applications. The computational tool predicts the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges) subject to incoming air. The aerodynamic simulator can also deal with enclosure effects, multi-body interactions, and B-spline representation of body shapes. Finally, we simulate different aerodynamic problems to illustrate the usefulness and effectiveness of PyFly.
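    The Biot–Savart building block on which such a vortex-ring lattice rests is compact enough to sketch. Below is the classical induced-velocity formula for a straight vortex filament, applied to the four edges of one ring; this is a generic textbook formulation in plain Python, not PyFly's Fortran kernels:

    ```python
    import numpy as np

    def segment_induced_velocity(p, a, b, gamma, eps=1e-8):
        """Velocity induced at point p by a straight vortex filament a->b
        with circulation gamma (classical Biot-Savart result)."""
        r1, r2 = p - a, p - b
        cross = np.cross(r1, r2)
        c2 = np.dot(cross, cross)
        if c2 < eps:                   # point lies on the filament axis
            return np.zeros(3)
        r0 = b - a
        k = np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))
        return gamma / (4.0 * np.pi) * cross / c2 * k

    def ring_induced_velocity(p, corners, gamma):
        """Sum the four filament contributions of one vortex ring."""
        v = np.zeros(3)
        for i in range(4):
            v += segment_induced_velocity(p, corners[i], corners[(i + 1) % 4], gamma)
        return v

    corners = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], float)
    print(ring_induced_velocity(np.array([0.5, 0.5, 0.5]), corners, gamma=1.0))
    ```

    A full UVLM solver evaluates such ring contributions for every panel and wake element, which is exactly the O(N^2) cost that the record's pointwise far-field approximation is designed to reduce.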

  20. Multi-physics CFD simulations in engineering

    NASA Astrophysics Data System (ADS)

    Yamamoto, Makoto

    2013-08-01

    Nowadays, Computational Fluid Dynamics (CFD) software is adopted as a design and analysis tool in a great number of engineering fields, and single-physics CFD can be considered sufficiently mature from a practical point of view. The main target of existing CFD software is single-phase flows such as water and air. However, many multi-physics problems exist in engineering. Most of them consist of flow coupled with other physics, and the interactions between the different physics are very important. Obviously, multi-physics phenomena are critical in developing machines and processes. A multi-physics phenomenon is typically very complex and difficult to predict simply by adding other physics to a flow simulation. Therefore, multi-physics CFD techniques are still under research and development. This is because the processing speed of current computers is not fast enough for conducting multi-physics simulations, and furthermore, physical models other than flow physics have not been suitably established. In the near future, we therefore have to develop various physical models and efficient CFD techniques in order to carry out successful multi-physics simulations in engineering. In the present paper, I describe the present state of multi-physics CFD simulations, and then show some numerical results, such as ice accretion and the electro-chemical machining process of a three-dimensional compressor blade, which were obtained in my laboratory. Multi-physics CFD simulation is likely to be a key technology in the near future.

  1. PyFly: A fast, portable aerodynamics simulator

    DOE PAGES

    Garcia, Daniel; Ghommem, M.; Collier, Nathaniel O.; ...

    2018-03-14

    Here, we present a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM), namely PyFly. UVLM computes the aerodynamic loads applied on lifting surfaces while capturing unsteady effects such as the added mass forces, the growth of bound circulation, and the wake, while assuming that the flow separation location is known a priori. The method is based on discretizing the body surface into a lattice of vortex rings and relies on the Biot–Savart law to construct the velocity field at every point in the simulated domain. We introduce the pointwise approximation approach to simulate the interactions of the far-field vortices, overcoming the computational burden associated with the classical implementation of UVLM. The computational framework uses the Python programming language to provide an easy-to-handle user interface, while the computational kernels are written in Fortran. The mixed-language approach enables high performance in terms of solution time and great flexibility in adapting the code to different system configurations and applications. The computational tool predicts the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges) subject to incoming air. The aerodynamic simulator can also deal with enclosure effects, multi-body interactions, and B-spline representation of body shapes. Finally, we simulate different aerodynamic problems to illustrate the usefulness and effectiveness of PyFly.

  2. Simulation of Ge Dopant Emission in Indirect-Drive ICF Implosion Experiments

    NASA Astrophysics Data System (ADS)

    Macfarlane, Joseph; Golovkin, I.; Regan, S.; Epstein, R.; Mancini, R.; Peterson, K.; Suter, L.

    2012-10-01

    We present results from simulations performed to study the radiative properties of dopants used in inertial confinement fusion indirect-drive capsule implosion experiments on NIF. In Rev5 NIF ignition capsules, a Ge dopant is added to an inner region of the CH ablator to absorb hohlraum x-ray preheat. Spectrally resolved emission from ablator dopants can be used to study the degree of mixing of ablator material into the ignition hot spot. Here, we study the atomic processes that affect the radiative characteristics of these elements using a set of simulation tools to first estimate the evolution of plasma conditions in the compressed target, and then to compute the atomic kinetics of the dopant and the resultant radiative emission. Using estimates of temperature and density profiles predicted by radiation-hydrodynamics simulations, we set up simple plasma grids where we allow dopant material to be embedded in the fuel, and perform multi-dimensional collisional-radiative simulations using SPECT3D to compute non-LTE atomic level populations and spectral signatures from the dopant. Recently improved Stark-broadened line shape modeling for Ge K-shell lines has been included. The goal is to study the radiative and atomic processes that affect the emergent spectra, including the effects of inner-shell photoabsorption and Kα reemission from the dopant, and to study the sensitivity of the emergent spectra to the dopant and the hot spot and ablator conditions.

  3. Anaerobic Digestion

    EPA Pesticide Factsheets

    Inform visitors about the science of AD, the environmental and economic benefits that can be realized through AD projects, and direct visitors to appropriate regulatory information, feasibility tools and technical resources from related EPA programs.

  4. A simulation model to predict the fiscal and public health impact of a change in cigarette excise taxes.

    PubMed

    van Walbeek, Corné

    2010-02-01

    (1) To present a model that predicts changes in cigarette consumption and excise revenue in response to excise tax changes, and (2) to demonstrate that, if the industry has market power, increases in specific taxes have better tobacco control consequences than increases in ad valorem taxes. All model parameters are user-determined. The model calculates likely changes in cigarette consumption, smoking prevalence and excise tax revenues due to an excise tax change. The model is applicable to countries that levy the excise as a specific or an ad valorem tax. For a representative low-income or middle-income country, a 20% excise tax increase decreases cigarette consumption and industry revenue by 5% and increases excise tax revenues by 14%, if there is no change in the net-of-tax price. If the excise tax is levied as a specific tax, the industry has an incentive to raise the net-of-tax price, enhancing the consumption-reducing impact of the tax increase. If the excise tax is levied as an ad valorem tax, the industry has no such incentive. The industry has an incentive to reduce the net-of-tax price in response to an ad valorem excise tax increase, undermining the public health and fiscal benefits of the tax increase. This paper presents a simple web-based tool that allows policy makers and tobacco control advocates to estimate the likely consumption, fiscal and mortality impacts of a change in the cigarette excise tax. If a country wishes to reduce cigarette consumption by increasing the excise tax, a specific tax structure is better than an ad valorem tax structure.
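    The arithmetic behind such a tool can be sketched in a few lines. The function below is an illustrative reconstruction only (the elasticity, tax share, and industry-response values are invented, not the paper's calibrated parameters): it applies a constant price elasticity of demand to the retail price implied by a specific excise change and an assumed shift in the net-of-tax price:

    ```python
    def excise_change(consumption, tax, net_price, tax_increase_pct,
                      elasticity=-0.4, net_price_change_pct=0.0):
        """Rough consumption/revenue arithmetic for a specific excise change.

        Assumes retail price = net-of-tax price + specific excise, demand
        responding to the retail price with a constant elasticity, and an
        optional industry response via the net-of-tax price.
        """
        new_tax = tax * (1 + tax_increase_pct / 100)
        new_net = net_price * (1 + net_price_change_pct / 100)
        price_change = (new_net + new_tax) / (net_price + tax) - 1
        new_consumption = consumption * (1 + elasticity * price_change)
        return {
            "consumption_change_%": 100 * (new_consumption / consumption - 1),
            "excise_revenue_change_%": 100 * (new_tax * new_consumption
                                              / (tax * consumption) - 1),
        }

    # 20% excise increase, tax half of the retail price, no industry response:
    print(excise_change(consumption=100.0, tax=2.0, net_price=2.0,
                        tax_increase_pct=20))
    ```

    With these invented inputs the retail price rises 10%, consumption falls 4% and excise revenue rises about 15%, the same order of magnitude as the representative-country figures quoted in the record.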

  5. Tools and Equipment Modeling for Automobile Interactive Assembling Operating Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu Dianliang; Zhu Hongmin; Shanghai Key Laboratory of Advance Manufacturing Environment

    Tools and equipment play an important role in the simulation of virtual assembly, especially in assembly process simulation and planning. Because of their variety in function and complexity in structure and manipulation, the simulation of tools and equipment remains a challenge for interactive assembly operation. Based on an analysis of the details and characteristics of interactive operations for automobile assembly, the functional requirements for tools and equipment in automobile assembly are given. Then, a unified modeling method for information expression and function realization of general tools and equipment is presented, and the handling methods for manual, semi-automatic, and automatic tools and equipment are discussed. Finally, an application in the assembly simulation of the rear and front suspensions of the Roewe 750 automobile is given. The result shows that the modeling and handling methods are applicable to the interactive simulation of various tools and equipment and can also be used to support assembly process planning in a virtual environment.

  6. Inclusion Detection in Aluminum Alloys Via Laser-Induced Breakdown Spectroscopy

    NASA Astrophysics Data System (ADS)

    Hudson, Shaymus W.; Craparo, Joseph; De Saro, Robert; Apelian, Diran

    2018-04-01

    Laser-induced breakdown spectroscopy (LIBS) has shown promise as a technique to quickly determine molten metal chemistry in real time. Because of these characteristics, LIBS could also be used to detect unwanted inclusions and impurities. Simulated Al2O3 inclusions were added to molten aluminum via a metal-matrix composite. LIBS was performed in situ to determine whether the particles could be detected. Outlier analysis of the oxygen signal was performed on the LIBS data and compared to the oxide volume fraction measured through metallography. It was determined that LIBS could differentiate between melts with different amounts of inclusions by monitoring the fluctuations in signal for elements of interest. LIBS shows promise as an enabling tool for monitoring metal cleanliness.

  7. Earth observing system instrument pointing control modeling for polar orbiting platforms

    NASA Technical Reports Server (NTRS)

    Briggs, H. C.; Kia, T.; Mccabe, S. A.; Bell, C. E.

    1987-01-01

    An approach to instrument pointing control performance assessment for large multi-instrument platforms is described. First, instrument pointing requirements and reference platform control systems for the Eos Polar Platforms are reviewed. Performance modeling tools, including NASTRAN models of two large platforms, a modal selection procedure utilizing a balanced realization method, and reduced-order platform models with core and instrument pointing control loops added, are then described. Time history simulations of instrument pointing and stability performance in response to commanded slewing of adjacent instruments demonstrate the limits of tolerable slew activity. Simplified models of rigid body responses are also developed for comparison. Instrument pointing control methods required in addition to the core platform control system to meet instrument pointing requirements are considered.

  8. Optimal design of active EMC filters

    NASA Astrophysics Data System (ADS)

    Chand, B.; Kut, T.; Dickmann, S.

    2013-07-01

    A recent trend in the automotive industry is adding electrical drive systems to conventional drives. This electrification allows an expansion of energy sources and provides great opportunities for environmentally friendly mobility. The electrical powertrain and its components can also cause disturbances which couple into nearby electronic control units and communication cables, so that communication can be degraded or even permanently disrupted. To minimize these interferences, different approaches are possible. One possibility is to use EMC filters. However, the diversity of filters is very large, and determining an appropriate filter for each application is time-consuming. Therefore, the filter design is determined using a simulation tool that includes an effective optimization algorithm. This method leads to improvements in terms of weight, volume and cost.

  9. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large-scale, complex-geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper also discusses examples of the use of simulation tools at NASA to develop new damage characterization methods, and the associated challenges of validating those methods.

  10. The Value Simulation-Based Learning Added to Machining Technology in Singapore

    ERIC Educational Resources Information Center

    Fang, Linda; Tan, Hock Soon; Thwin, Mya Mya; Tan, Kim Cheng; Koh, Caroline

    2011-01-01

    This study seeks to understand the value simulation-based learning (SBL) added to the learning of Machining Technology in a 15-week core subject course offered to university students. The research questions were: (1) How did SBL enhance classroom learning? (2) How did SBL help participants in their test? (3) How did SBL prepare participants for…

  11. Comments on "Adaptive resolution simulation in equilibrium and beyond" by H. Wang and A. Agarwal

    NASA Astrophysics Data System (ADS)

    Klein, R.

    2015-09-01

    Wang and Agarwal (Eur. Phys. J. Special Topics, this issue, 2015, doi: 10.1140/epjst/e2015-02411-2) discuss variants of Adaptive Resolution Molecular Dynamics Simulations (AdResS), and their applications. Here we comment on their report, addressing scaling properties of the method, artificial forcings implemented to ensure constant density across the full simulation despite changing thermodynamic properties of the simulated media, the possible relation between an AdResS system on the one hand and a phase transition phenomenon on the other, and peculiarities of the SPC/E water model.

  12. Automated Driving System Architecture to Ensure Safe Delegation of Driving Authority

    NASA Astrophysics Data System (ADS)

    YUN, Sunkil; NISHIMURA, Hidekazu

    2016-09-01

    In this paper, the architecture of an automated driving system (ADS) is proposed to ensure safe delegation of driving authority between the ADS and a driver. Limitations of the ADS functions may trigger delegation of driving authority to a driver. However, this can lead to severe consequences in emergency situations where the driver may be drowsy or distracted. To address these issues, first, the concept model for the ADS in the situation of delegation of driving authority is described, taking the driver's behaviour and state into account. Second, the behaviour and state of the driver, the functional flow and state of the ADS, and the interactions between them are modelled to understand the context in which the ADS requests to delegate driving authority to the driver. Finally, the proposed architecture of the ADS is verified in simulations based on emergency braking scenarios. In the verification process, we derived the necessary condition for safe delegation of driving authority: the ADS should continue to assist the driver even after delegating driving authority, when the driver does not yet have sufficient capability to regain control of the driving task.

  13. A model of tungsten anode x-ray spectra.

    PubMed

    Hernández, G; Fernández, F

    2016-08-01

    A semiempirical model for x-ray production in thick tungsten targets was evaluated using a new characterization of the electron fluence. The electron fluence is modeled taking into account both the energy and angular distributions, each of them adjusted to Monte Carlo simulated data. Distances were scaled by the CSDA range to reduce the energy dependence. Bremsstrahlung production was found by integrating the cross section with the fluence in a 1D penetration model. Characteristic radiation was added using a semiempirical law whose validity was checked. The results were compared with the experimental results of Bhat et al., with the SpekCalc numerical tool, and with mcnpx simulation results from the work of Hernandez and Boone. The described model shows better agreement with the experimental results than the SpekCalc predictions, in the sense of the area between the spectra. A general improvement in the prediction of half-value layers is also found. The results are also in good agreement with the simulation results in the 50-640 keV energy range. A complete model for x-ray production in thick bremsstrahlung targets has been developed, improving on the results of previous works and extending the energy range covered to the 50-640 keV interval.

  14. Successful technical trading agents using genetic programming.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Othling, Andrew S.; Kelly, John A.; Pryor, Richard J.

    2004-10-01

    Genetic programming (GP) has proved to be a highly versatile and useful tool for identifying relationships in data for which a more precise theoretical construct is unavailable. In this project, we use a GP search to develop trading strategies for agent-based economic models. These strategies use stock prices and technical indicators, such as the moving average convergence/divergence and various exponentially weighted moving averages, to generate buy and sell signals. We analyze the effect of complexity constraints on the strategies as well as the relative performance of various indicators. We also present innovations in the classical genetic programming algorithm that appear to improve convergence for this problem. Technical strategies developed by our GP algorithm can be used to control the behavior of agents in economic simulation packages, such as ASPEN-D, adding variety to the current market fundamentals approach. The exploitation of arbitrage opportunities by technical analysts may help increase the efficiency of the simulated stock market, as it does in the real world. By improving the behavior of simulated stock markets, we can better estimate the effects of shocks to the economy due to terrorism or natural disasters.
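    One of the indicator primitives named in the record, the moving average convergence/divergence (MACD), can serve as a concrete example of the signal-generation building blocks a GP search would combine. The sketch below is a fixed, hand-written rule on synthetic random-walk prices (illustrative only; the record's GP system evolves such rules rather than hard-coding them):

    ```python
    import numpy as np

    def ema(prices, span):
        """Exponentially weighted moving average (recursive form)."""
        alpha = 2.0 / (span + 1.0)
        out = np.empty_like(prices)
        out[0] = prices[0]
        for i in range(1, len(prices)):
            out[i] = alpha * prices[i] + (1.0 - alpha) * out[i - 1]
        return out

    def macd_signals(prices, fast=12, slow=26, signal=9):
        """+1 buy / -1 sell where the MACD line crosses its signal line."""
        macd = ema(prices, fast) - ema(prices, slow)
        sig = ema(macd, signal)
        above = macd > sig
        crossings = np.diff(above.astype(int))   # +1 up-cross, -1 down-cross
        return np.concatenate([[0], crossings])

    rng = np.random.default_rng(2)
    prices = 100.0 + np.cumsum(rng.normal(0.0, 1.0, 300))  # random-walk stand-in
    s = macd_signals(prices)
    print("buys:", np.flatnonzero(s == 1), "sells:", np.flatnonzero(s == -1))
    ```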

  15. Supervisory control of mobile sensor networks: math formulation, simulation, and implementation.

    PubMed

    Giordano, Vincenzo; Ballal, Prasanna; Lewis, Frank; Turchiano, Biagio; Zhang, Jing Bing

    2006-08-01

    This paper uses a novel discrete-event controller (DEC) for the coordination of cooperating heterogeneous wireless sensor networks (WSNs) containing both unattended ground sensors (UGSs) and mobile sensor robots. The DEC sequences the most suitable tasks for each agent and assigns sensor resources according to the current perception of the environment. A matrix formulation makes this DEC particularly useful for WSNs, where missions change and sensor agents may be added or may fail. WSNs have peculiarities that complicate their supervisory control. Therefore, this paper introduces several new tools for DEC design and operation, including methods for generating the required supervisory matrices based on mission planning, methods for modifying the matrices in the event of failed nodes or nodes entering the network, and a novel dynamic priority assignment weighting approach for selecting the most appropriate and useful sensors for a given mission task. The resulting DEC represents a complete dynamical description of the WSN system, which allows fast programming of deployable WSNs, computer simulation analysis, and efficient implementation. The DEC is implemented on an experimental wireless-sensor-network prototyping system. Both simulation and experimental results are presented to show the effectiveness and versatility of the developed control architecture.

  16. Convolving optically addressed VLSI liquid crystal SLM

    NASA Astrophysics Data System (ADS)

    Jared, David A.; Stirk, Charles W.

    1994-03-01

    We designed, fabricated, and tested an optically addressed spatial light modulator (SLM) that performs a 3 X 3 kernel image convolution using ferroelectric liquid crystal on VLSI technology. The chip contains a 16 X 16 array of current-mirror-based convolvers with a fixed kernel for finding edges. The pixels are located on 75 micron centers, and the modulators are 20 microns on a side. The array successfully enhanced edges in illumination patterns. We developed a high-level simulation tool (CON) for analyzing the performance of convolving SLM designs. CON has a graphical interface and simulates SLM functions using SPICE-like device models. The user specifies the pixel function along with the device parameters and nonuniformities. We discovered through analysis, simulation, and experiment that the operation of current-mirror-based convolver pixels is degraded at low light levels by the variation of transistor threshold voltages inherent to CMOS chips. To function acceptably, the test SLM required the input image to have a minimum irradiance of 10 μW/cm². The minimum required irradiance can be further reduced by adding a photodarlington near the photodetector or by increasing the size of the transistors used to calculate the convolution.
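    For readers unfamiliar with the operation implemented here in analog hardware, the digital equivalent of the chip's fixed-kernel convolution is straightforward. The sketch below uses a common Laplacian-style edge kernel (the record does not specify the chip's actual coefficients, so the kernel here is an assumption) on a 16 x 16 test pattern matching the array size:

    ```python
    import numpy as np

    # A 3x3 Laplacian-style edge kernel, a common choice for edge finding;
    # the chip's fixed analog kernel is only described as "finding edges".
    KERNEL = np.array([[-1, -1, -1],
                       [-1,  8, -1],
                       [-1, -1, -1]], dtype=float)

    def convolve3x3(image, kernel):
        """Direct 3x3 convolution, one output per interior pixel (valid mode)."""
        h, w = image.shape
        out = np.zeros((h - 2, w - 2))
        for i in range(h - 2):
            for j in range(w - 2):
                out[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)
        return out

    # 16x16 test pattern: a bright square on a dark background.
    img = np.zeros((16, 16))
    img[4:12, 4:12] = 1.0
    edges = convolve3x3(img, KERNEL)
    print(np.abs(edges).max())  # strongest response along the square's border
    ```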

  17. The SELGIFS data challenge: generating synthetic observations of CALIFA galaxies from hydrodynamical simulations

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Casado, J.; Ascasibar, Y.; García-Benito, R.; Galbany, L.; Sánchez-Blázquez, P.; Sánchez, S. F.; Rosales-Ortega, F. F.; Scannapieco, C.

    2018-06-01

    In this work we present a set of synthetic observations that mimic the properties of the Integral Field Spectroscopy (IFS) survey CALIFA, generated using radiative transfer techniques applied to hydrodynamical simulations of galaxies in a cosmological context. The simulated spatially-resolved spectra include stellar and nebular emission, kinematic broadening of the lines, and dust extinction and scattering. The results of the radiative transfer simulations have been post-processed to reproduce the main properties of the CALIFA V500 and V1200 observational setups. The data has been further formatted to mimic the CALIFA survey in terms of field of view size, spectral range and sampling. We have included the effect of the spatial and spectral Point Spread Functions affecting CALIFA observations, and added detector noise after characterizing it on a sample of 367 galaxies. The simulated datacubes are suited to be analysed by the same algorithms used on real IFS data. In order to provide a benchmark to compare the results obtained applying IFS observational techniques to our synthetic datacubes, and test the calibration and accuracy of the analysis tools, we have computed the spatially-resolved properties of the simulations. Hence, we provide maps derived directly from the hydrodynamical snapshots or the noiseless spectra, in a way that is consistent with the values recovered by the observational analysis algorithms. Both the synthetic observations and the product datacubes are public and can be found in the collaboration website http://astro.ft.uam.es/selgifs/data_challenge/.

  18. Reconstructing in-vivo reflectance spectrum of pigmented skin lesion by Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Wang, Shuang; He, Qingli; Zhao, Jianhua; Lui, Harvey; Zeng, Haishan

    2012-03-01

    In dermatology applications, diffuse reflectance spectroscopy has been extensively investigated as a promising noninvasive tool to distinguish melanoma from benign pigmented skin lesions (nevi), which are rich in skin chromophores such as melanin and hemoglobin. We carried out a theoretical study to examine the melanin distribution in human skin tissue and establish a practical optical model for further investigation of pigmented skin. The simulation used a junctional nevus as an example. A multi-layer skin optical model was developed based on established skin anatomy and published optical parameters of the different skin layers, blood, and melanin. Monte Carlo simulation was used to model the interaction between the excitation light and skin tissue and to reconstruct the diffuse reflectance process from the skin. A validated methodology was adopted to determine melanin content in human skin from in vivo diffuse reflectance spectra. The reconstructed diffuse reflectance spectra were investigated by adding melanin to different layers of the theoretical model. An in vivo reflectance spectrum from a junctional nevus and its surrounding normal skin was studied by comparing the nevus-to-normal-skin ratio in both the experimental and simulated diffuse reflectance spectra. The simulation showed good agreement with our clinical measurements, indicating that our approach, including the spectral ratio method, the skin optical model, and the modification of melanin content in the model, could be applied in further theoretical simulations of pigmented skin lesions.

  19. Molecular dynamics modeling of bonding two materials by atomic scale friction stir welding

    NASA Astrophysics Data System (ADS)

    Konovalenko S., Iv.; Konovalenko, Ig. S.; Psakhie, S. G.

    2017-12-01

    A molecular dynamics model of atomic-scale friction stir welding has been developed. The formation of a butt joint between two crystallites was modeled by means of a rotating rigid conical tool traveling along the butt joint line. The formed joint had an intermixed atomic structure composed of atoms that initially belonged to the opposite mating pieces of metal. Heat removal was modeled by adding an extra viscous force to the peripheral atomic layers. This technique provides temperature control in the tool-affected zone during welding. An auxiliary vibration action was added to the rotating tool. The model allows varying the tool's angular velocity and the amplitude, frequency, and direction of the auxiliary vibration, so that different welding modes can be modeled.

  20. tkLayout: a design tool for innovative silicon tracking detectors

    NASA Astrophysics Data System (ADS)

    Bianchi, G.

    2014-03-01

    A new CMS tracker is scheduled to become operational for the LHC Phase 2 upgrade in the early 2020s. tkLayout is a software package developed to create 3D models for the design of the CMS tracker and to evaluate its fundamental performance figures. The new tracker will have to cope with much higher luminosity conditions, resulting in increased track density, harsher radiation exposure and, especially, much higher data acquisition bandwidth, such that equipping the tracker with triggering capabilities is envisaged. The design of an innovative detector involves deciding on an architecture offering the best trade-off among many figures of merit, such as tracking resolution, power dissipation, bandwidth, cost and so on. Quantitatively evaluating these figures of merit as early as possible in the design phase is of capital importance, and it is best done with the aid of software models. tkLayout is a flexible modeling tool: new performance estimates and support for different detector geometries can be quickly added, thanks to its modular structure. Moreover, the software executes very quickly (about two minutes), so that many possible architectural variations can be rapidly modeled and compared, to help in the choice of a viable detector layout and then to optimize it. A tracker geometry is generated from simple configuration files defining the module types, layout and materials. Support structures are automatically added and services routed to provide a realistic tracker description. The tracker geometries thus generated can be exported to the standard CMS simulation framework (CMSSW) for full Monte Carlo studies. tkLayout has proven essential in giving guidance to CMS in studying different detector layouts and exploring the feasibility of innovative solutions for tracking detectors, in terms of design, performance and projected costs. This tool has been one of the keys to making important design decisions for over five years now and has also enabled project engineers and simulation experts to focus their efforts on other important or specific issues. Even though tkLayout was designed for the CMS tracker upgrade project, its flexibility makes it experiment-agnostic, so that it could easily be adapted to model other tracking detectors. The technology behind tkLayout is presented, as well as some of the results obtained in the context of the CMS silicon tracker design studies.

  1. Implementation of depolarization due to beam-beam effects in the beam-beam interaction simulation tool GUINEA-PIG++

    NASA Astrophysics Data System (ADS)

    Rimbault, C.; Le Meur, G.; Blampuy, F.; Bambade, P.; Schulte, D.

    2009-12-01

    Depolarization is a new feature in the beam-beam simulation tool GUINEA-PIG++ (GP++). The results of this simulation are studied and compared with another beam-beam simulation tool, CAIN, considering different beam parameters for the International Linear Collider (ILC) with a centre-of-mass energy of 500 GeV.

  2. Does the Market Value Value-Added? Evidence from Housing Prices after a Public Release of School and Teacher Value-Added. Working Paper #47

    ERIC Educational Resources Information Center

    Imberman, Scott; Lovenheim, Michael F.

    2015-01-01

    Value-added data have become an increasingly common evaluation tool for schools and teachers. Many school districts have begun to adopt these methods and have released results publicly. In this paper, we use the unique public release of value-added data in Los Angeles to identify how this measure of school quality is capitalized into housing…

  3. How Can Value-Added Measures Be Used for Teacher Improvement? What We Know Series: Value-Added Methods and Applications. Knowledge Brief 13

    ERIC Educational Resources Information Center

    Loeb, Susanna

    2013-01-01

    The question for this brief is whether education leaders can use value-added measures as tools for improving schooling and, if so, how to do this. Districts, states, and schools can, at least in theory, generate gains in educational outcomes for students using value-added measures in three ways: creating information on effective programs, making…

  4. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on ϕ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  5. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools, including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations of motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in the open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations of motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.

  6. 40 CFR 86.162-00 - Approval of alternative air conditioning test simulations and descriptions of AC1 and AC2.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... exhaust emission results of air conditioning operation in an environmental test cell by adding additional... conditioning operation in an environmental test cell by adding a heat load to the passenger compartment. The... the simulation matches environmental cell test data for the range of vehicles to be covered by the...

  7. CycADS: an annotation database system to ease the development and update of BioCyc databases

    PubMed Central

    Vellozo, Augusto F.; Véron, Amélie S.; Baa-Puyoulet, Patrice; Huerta-Cepas, Jaime; Cottret, Ludovic; Febvay, Gérard; Calevro, Federica; Rahbé, Yvan; Douglas, Angela E.; Gabaldón, Toni; Sagot, Marie-France; Charles, Hubert; Colella, Stefano

    2011-01-01

    In recent years, genomes from an increasing number of organisms have been sequenced, but their annotation remains a time-consuming process. The BioCyc databases offer a framework for the integrated analysis of metabolic networks. The Pathway Tools software suite allows the automated construction of a database starting from an annotated genome, but it requires prior integration of all annotations into a specific summary file or into a GenBank file. To allow the easy creation and update of a BioCyc database starting from the multiple genome annotation resources available over time, we have developed an ad hoc data management system that we call the Cyc Annotation Database System (CycADS). CycADS is centred on a specific database model and on a set of Java programs to import, filter and export relevant information. Data from GenBank and other annotation sources (including, for example, KAAS, PRIAM, Blast2GO and PhylomeDB) are collected into a database to be subsequently filtered and extracted to generate a complete annotation file. This file is then used to build an enriched BioCyc database using the PathoLogic program of Pathway Tools. The CycADS pipeline for annotation management was used to build the AcypiCyc database for the pea aphid (Acyrthosiphon pisum), whose genome was recently sequenced. The AcypiCyc database webpage also includes, for comparative analyses, two other metabolic reconstruction BioCyc databases generated using CycADS: TricaCyc for Tribolium castaneum and DromeCyc for Drosophila melanogaster. Thanks to its flexible design, CycADS offers a powerful software tool for the generation and regular updating of enriched BioCyc databases. The CycADS system is particularly suited for metabolic gene annotation and network reconstruction in newly sequenced genomes. Because of the uniform annotation used for metabolic network reconstruction, CycADS is particularly useful for comparative analysis of the metabolism of different organisms. Database URL: http://www.cycadsys.org PMID:21474551

  8. Automatic Differentiation in Quantum Chemistry with Applications to Fully Variational Hartree-Fock.

    PubMed

    Tamayo-Mendoza, Teresa; Kreisbeck, Christoph; Lindh, Roland; Aspuru-Guzik, Alán

    2018-05-23

    Automatic differentiation (AD) is a powerful tool that allows calculating derivatives of implemented algorithms with respect to all of their parameters up to machine precision, without the need to explicitly add any additional functions. Thus, AD has great potential in quantum chemistry, where gradients are omnipresent but also difficult to obtain, and researchers typically spend a considerable amount of time finding suitable analytical forms when implementing derivatives. Here, we demonstrate that AD can be used to compute gradients with respect to any parameter throughout a complete quantum chemistry method. We present DiffiQult, a Hartree-Fock implementation entirely differentiated with the use of AD tools. DiffiQult is a software package written in plain Python with minimal deviation from standard code, which illustrates the capability of AD to save human effort and time in implementations of exact gradients in quantum chemistry. We leverage the obtained gradients to optimize the parameters of one-particle basis sets in the context of the floating Gaussian framework.
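    The principle the record relies on can be demonstrated in a few lines with forward-mode dual numbers (a toy illustration of AD in general, not DiffiQult's implementation or the AD library it uses): every primitive operation propagates a derivative alongside its value, so the final result carries an exact, machine-precision gradient:

    ```python
    import math

    class Dual:
        """Forward-mode AD value: carries f(x) and f'(x) together."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__

        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val,
                        self.dot * o.val + self.val * o.dot)  # product rule
        __rmul__ = __mul__

    def exp(x):
        """Chain rule through the exponential."""
        return Dual(math.exp(x.val), math.exp(x.val) * x.dot)

    # Differentiate a Gaussian-like value exp(-a * r^2) w.r.t. the exponent a:
    a, r = Dual(0.5, dot=1.0), 1.3   # seed dot=1 marks a as the variable
    f = exp(-1.0 * a * (r * r))
    print(f.val, f.dot)   # f.dot equals the exact -r^2 * exp(-a * r^2)
    ```

    Basis-set optimization as described in the record is the same idea applied through an entire Hartree-Fock energy evaluation, with the resulting gradients fed to a standard optimizer.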

  9. Automatic Differentiation in Quantum Chemistry with Applications to Fully Variational Hartree–Fock

    PubMed Central

    2018-01-01

    Automatic differentiation (AD) is a powerful tool that allows calculating derivatives of implemented algorithms with respect to all of their parameters up to machine precision, without the need to explicitly add any additional functions. Thus, AD has great potential in quantum chemistry, where gradients are omnipresent but also difficult to obtain, and researchers typically spend a considerable amount of time finding suitable analytical forms when implementing derivatives. Here, we demonstrate that AD can be used to compute gradients with respect to any parameter throughout a complete quantum chemistry method. We present DiffiQult, a Hartree–Fock implementation, entirely differentiated with the use of AD tools. DiffiQult is a software package written in plain Python with minimal deviation from standard code which illustrates the capability of AD to save human effort and time in implementations of exact gradients in quantum chemistry. We leverage the obtained gradients to optimize the parameters of one-particle basis sets in the context of the floating Gaussian framework.

  10. Effects of channel tap spacing on delay-lock tracking

    NASA Astrophysics Data System (ADS)

    Dana, Roger A.; Milner, Brian R.; Bogusch, Robert L.

    1995-12-01

    High fidelity simulations of communication links operating through frequency selective fading channels require both accurate channel models and faithful reproduction of the received signal. In modern radio receivers, processing beyond the analog-to-digital converter (A/D) is done digitally, so a high fidelity simulation is actually an emulation of this digital signal processing. The 'simulation' occurs in constructing the output of the A/D. One approach to constructing the A/D output is to convolve the channel impulse response function with the combined impulse response of the transmitted modulation and the A/D. For both link simulations and hardware channel simulators, the channel impulse response function is then generated with a finite number of samples per chip, and the convolution is implemented in a tapped delay line. In this paper we discuss the effects of the channel model tap spacing on the performance of delay locked loops (DLLs) in both direct sequence and frequency hopped spread spectrum systems. A frequency selective fading channel is considered, and the channel impulse response function is constructed with an integer number of taps per modulation symbol or chip. The tracking loop time delay is computed theoretically for this tapped delay line channel model and is compared to the results of high fidelity simulations of actual DLLs. A surprising result is obtained. The performance of the DLL depends strongly on the number of taps per chip. As this number increases the DLL delay approaches the theoretical limit.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raoult, Nina M.; Jupp, Tim E.; Cox, Peter M.

    Land-surface models (LSMs) are crucial components of the Earth system models (ESMs) that are used to make coupled climate-carbon cycle projections for the 21st century. The Joint UK Land Environment Simulator (JULES) is the land-surface model used in the climate and weather forecast models of the UK Met Office. JULES is also extensively used offline as a land-surface impacts tool, forced with climatologies into the future. In this study, JULES is automatically differentiated with respect to JULES parameters using commercial software from FastOpt, resulting in an analytical gradient, or adjoint, of the model. Using this adjoint, the adJULES parameter estimation system has been developed to search for locally optimum parameters by calibrating against observations. This paper describes adJULES in a data assimilation framework and demonstrates its ability to improve the model-data fit using eddy-covariance measurements of gross primary production (GPP) and latent heat (LE) fluxes. adJULES also has the ability to calibrate over multiple sites simultaneously. This feature is used to define new optimised parameter values for the five plant functional types (PFTs) in JULES. The optimised PFT-specific parameters improve the performance of JULES at over 85% of the sites used in the study, at both the calibration and evaluation stages. Furthermore, the new improved parameters for JULES are presented along with the associated uncertainties for each parameter.

  12. Nanotechnology solutions for Alzheimer's disease: advances in research tools, diagnostic methods and therapeutic agents.

    PubMed

    Nazem, Amir; Mansoori, G Ali

    2008-03-01

    A century of research has passed since the discovery and definition of Alzheimer's disease (AD), the most common dementing disorder worldwide. However, AD still lacks definitive diagnostic approaches and an effective cure. Moreover, the currently available diagnostic tools are not sufficient for early screening of AD that would allow preventive approaches to be started. Recently, the emerging field of nanotechnology has promised new techniques to solve some of the AD challenges. Nanotechnology refers to the techniques of designing and manufacturing nanosize (1-100 nm) structures through controlled positional and/or self-assembly of atoms and molecules. In this report, we present the promise that nanotechnology brings to research on AD diagnosis and therapy, including its potential for better understanding of the root-cause molecular mechanisms of AD, its early diagnosis, and effective treatment. The advances in AD research offered by atomic force microscopy, single-molecule fluorescence microscopy and NanoSIMS microscopy are examined here. In addition, the recently proposed applications of nanotechnology for the early diagnosis of AD, including the bio-barcode assay, localized surface plasmon resonance nanosensors, quantum dots and nanomechanical cantilever arrays, are analyzed. Applications of nanotechnology in AD therapy, including neuroprotection against oxidative stress and anti-amyloid therapeutics, neuroregeneration, and drug delivery beyond the blood-brain barrier (BBB), are discussed and analyzed. All of these applications could improve the treatment approach to AD and other neurodegenerative diseases. A complete cure for AD may become feasible through a combination of nanotechnology and other novel approaches, such as stem cell technology.

  13. Simulation and Measurement of Stray Light in the CLASP

    NASA Technical Reports Server (NTRS)

    Narukage, Noriyuki; Kano, Ryohei; Bando, Takamasa; Ishikawa, Ryoko; Kubo, Masahito; Tsuzuki, Toshihiro; Katsukawa, Yukio; Ishikawa, Shin-nosuke; Giono, Gabriel; Suematsu, Yoshinori

    2015-01-01

    We are planning an international rocket experiment, the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), scheduled for 2015, to perform spectro-polarimetric observations of the solar Lyman-alpha (Lyα) line. The purpose of this experiment is to measure the magnetic field of the chromosphere and transition region directly, by detecting the linear polarization of the Lyα line with an accuracy of 0.1% and exploiting the Hanle effect. Because the total visible-light flux of the Sun is overwhelmingly larger, about 200,000 times that of the Lyα wavelength region, even a slight amount of visible stray light would prevent the 0.1% polarimetric accuracy from being achieved. We therefore first carried out a stray-light simulation of CLASP using the illumination design analysis software Light Tools. A feature of this simulation is that it uses the optical design file (ZEMAX format) and the structural design file (STEP format) to reproduce CLASP as realistically as possible for the stray-light study. Then, at the stage where the flight hardware had been provisionally assembled, we fed actual sunlight into CLASP using the coelostat of the National Astronomical Observatory of Japan and measured the stray light (sun test). A pattern that had not appeared in the simulation was observed in the stray-light measurements, requiring countermeasures. Thanks to an additional simulation performed alongside the measurements, this pattern was found to be due to diffracted light at the slit. Countermeasures are currently being taken with reference to the simulation results. In this presentation, we report on the stray-light simulation and the stray-light measurements we have carried out.

  14. SS-mPMG and SS-GA: tools for finding pathways and dynamic simulation of metabolic networks.

    PubMed

    Katsuragi, Tetsuo; Ono, Naoaki; Yasumoto, Keiichi; Altaf-Ul-Amin, Md; Hirai, Masami Y; Sriyudthsak, Kansuporn; Sawada, Yuji; Yamashita, Yui; Chiba, Yukako; Onouchi, Hitoshi; Fujiwara, Toru; Naito, Satoshi; Shiraishi, Fumihide; Kanaya, Shigehiko

    2013-05-01

    Metabolomics analysis tools can provide quantitative information on the concentration of metabolites in an organism. In this paper, we propose the minimum pathway model generator tool for simulating the dynamics of metabolite concentrations (SS-mPMG) and a tool for parameter estimation by genetic algorithm (SS-GA). SS-mPMG can extract a subsystem of the metabolic network from the genome-scale pathway maps to reduce the complexity of the simulation model and automatically construct a dynamic simulator to evaluate the experimentally observed behavior of metabolites. Using this tool, we show that stochastic simulation can reproduce experimentally observed dynamics of amino acid biosynthesis in Arabidopsis thaliana. In this simulation, SS-mPMG extracts the metabolic network subsystem from published databases. The parameters needed for the simulation are determined using a genetic algorithm to fit the simulation results to the experimental data. We expect that SS-mPMG and SS-GA will help researchers to create relevant metabolic networks and carry out simulations of metabolic reactions derived from metabolomics data.
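
    As a rough illustration of the SS-GA idea (not its actual code), the sketch below fits the rate constants of a made-up two-metabolite mass-action model to synthetic concentration data with an elementary genetic algorithm: keep the fittest candidates, mutate them, and repeat.

      # Toy sketch of genetic-algorithm parameter estimation for a small
      # metabolic ODE model (illustrative; not the SS-GA implementation).
      import numpy as np
      from scipy.integrate import odeint

      def dynamics(c, t, k1, k2):
          s, p = c                          # substrate and product
          return [-k1 * s, k1 * s - k2 * p]

      t = np.linspace(0, 10, 25)
      true = odeint(dynamics, [1.0, 0.0], t, args=(0.8, 0.3))  # synthetic data

      def fitness(k):
          sim = odeint(dynamics, [1.0, 0.0], t, args=tuple(k))
          return -np.sum((sim - true) ** 2)       # higher is better

      rng = np.random.default_rng(0)
      pop = rng.uniform(0.01, 2.0, size=(40, 2))  # initial population
      for gen in range(60):
          scores = np.array([fitness(k) for k in pop])
          parents = pop[np.argsort(scores)[-10:]]      # keep the 10 fittest
          children = parents[rng.integers(0, 10, size=(30,))] \
                     + rng.normal(0, 0.05, size=(30, 2))   # mutate
          pop = np.vstack([parents, np.clip(children, 1e-3, None)])
      best = pop[np.argmax([fitness(k) for k in pop])]
      print(best)   # should approach [0.8, 0.3]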

  15. The Impact of Alzheimer's Disease on the Chinese Economy.

    PubMed

    Keogh-Brown, Marcus R; Jensen, Henning Tarp; Arrighi, H Michael; Smith, Richard D

    2016-02-01

    Recent increases in life expectancy may greatly expand future Alzheimer's Disease (AD) burdens. China's demographic profile, aging workforce and predicted increasing burden of AD-related care make its economy vulnerable to AD impacts. Previous economic estimates of AD predominantly focus on health system burdens and omit wider whole-economy effects, potentially underestimating the full economic benefit of effective treatment. AD-related prevalence, morbidity and mortality for 2011-2050 were simulated and were, together with associated caregiver time and costs, imposed on a dynamic Computable General Equilibrium model of the Chinese economy. Both economic and non-economic outcomes were analyzed. Simulated Chinese AD prevalence quadrupled during 2011-2050, from 6 million to 28 million. The cumulative discounted value of eliminating AD equates to China's 2012 GDP (US$8 trillion), and the annual predicted real value approaches US AD cost-of-illness (COI) estimates, exceeding US$1 trillion by 2050 (2011 prices). Lost labor contributes 62% of macroeconomic impacts. Only 10% derives from informal care, challenging previous COI estimates of 56%. Health and macroeconomic models predict an unfolding 2011-2050 Chinese AD epidemic with serious macroeconomic consequences. Significant investment in research and development (medical and non-medical) is warranted and international researchers and national authorities should therefore target development of effective AD treatment and prevention strategies.

  16. The Impact of Alzheimer's Disease on the Chinese Economy

    PubMed Central

    Keogh-Brown, Marcus R.; Jensen, Henning Tarp; Arrighi, H. Michael; Smith, Richard D.

    2015-01-01

    Background Recent increases in life expectancy may greatly expand future Alzheimer's Disease (AD) burdens. China's demographic profile, aging workforce and predicted increasing burden of AD-related care make its economy vulnerable to AD impacts. Previous economic estimates of AD predominantly focus on health system burdens and omit wider whole-economy effects, potentially underestimating the full economic benefit of effective treatment. Methods AD-related prevalence, morbidity and mortality for 2011–2050 were simulated and were, together with associated caregiver time and costs, imposed on a dynamic Computable General Equilibrium model of the Chinese economy. Both economic and non-economic outcomes were analyzed. Findings Simulated Chinese AD prevalence quadrupled during 2011–2050, from 6 million to 28 million. The cumulative discounted value of eliminating AD equates to China's 2012 GDP (US$8 trillion), and the annual predicted real value approaches US AD cost-of-illness (COI) estimates, exceeding US$1 trillion by 2050 (2011 prices). Lost labor contributes 62% of macroeconomic impacts. Only 10% derives from informal care, challenging previous COI estimates of 56%. Interpretation Health and macroeconomic models predict an unfolding 2011–2050 Chinese AD epidemic with serious macroeconomic consequences. Significant investment in research and development (medical and non-medical) is warranted and international researchers and national authorities should therefore target development of effective AD treatment and prevention strategies. PMID:26981556

  17. Updates to the CMAQ Post Processing and Evaluation Tools for 2016

    EPA Science Inventory

    In the spring of 2016, the evaluation tools distributed with the CMAQ model code were updated and new tools were added to the existing set of tools. Observation data files, compatible with the AMET software, were also made available on the CMAS website for the first time with the...

  18. Ventilator caregiver education through the use of high-fidelity pediatric simulators: a pilot study.

    PubMed

    Tofil, Nancy M; Rutledge, Chrystal; Zinkan, J Lynn; Youngblood, Amber Q; Stone, Julie; Peterson, Dawn Taylor; Slayton, Donna; Makris, Chris; Magruder, Terri; White, Marjorie Lee

    2013-11-01

    Introduction. Home ventilator programs (HVP) have been developed to train parents of critically ill children. Simulators are used in health care, but not often for parents. We added simulation to our HVP and assessed parents' responses. Methods. In July 2008, the HVP at Children's of Alabama added simulation to parent training. Debriefing was provided after the training session to reinforce correct skills and critical thinking. Follow-up surveys were completed after training. Results. Fifteen families participated. All parents were confident in changing tracheostomies, recognizing signs of breathing difficulty, and responding to alarms. Seventy-one percent strongly agreed that simulation made them feel better prepared to care for their child, and 86% felt simulation improved their confidence in taking care of their child. Conclusion. Simulators provide a crucial transition between learned skills and application. This novel use of simulation-based education improves parents' confidence in emergencies and may lead to shortened training, resulting in cost savings.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Animesh; Wang, Han, E-mail: han.wang@fu-berlin.de; Site, Luigi Delle, E-mail: dellesite@fu-berlin.de

    We employ the adaptive resolution approach AdResS, in its recently developed Grand Canonical-like version (GC-AdResS) [H. Wang, C. Hartmann, C. Schütte, and L. Delle Site, Phys. Rev. X 3, 011018 (2013)], to calculate the excess chemical potential, μ^ex, of various liquids and mixtures. We compare our results with those obtained from full atomistic simulations using the technique of thermodynamic integration and show a satisfactory agreement. In GC-AdResS, the procedure to calculate μ^ex corresponds to the process of standard initial equilibration of the system; this implies that, independently of the specific aim of the study, μ^ex, for each molecular species, is automatically calculated every time a GC-AdResS simulation is performed.
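
    The thermodynamic-integration reference calculation mentioned above reduces, in outline, to a quadrature over ensemble averages, μ^ex = ∫₀¹ ⟨dU/dλ⟩_λ dλ. A schematic follows, with fabricated numbers standing in for the MD sampling at each λ:

      # Minimal sketch of thermodynamic integration for an excess chemical
      # potential: quadrature over <dU/dλ> sampled at discrete λ values.
      # The curve below is a smooth placeholder, not real MD output.
      import numpy as np

      lambdas = np.linspace(0.0, 1.0, 11)
      du_dlambda = -25.0 + 18.0 * lambdas**2     # fake <dU/dλ>, kJ/mol

      mu_ex = np.trapz(du_dlambda, lambdas)      # trapezoidal quadrature
      print(f"mu_ex ≈ {mu_ex:.2f} kJ/mol")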

  20. The validity of a professional competence tool for physiotherapy students in simulation-based clinical education: a Rasch analysis.

    PubMed

    Judd, Belinda K; Scanlan, Justin N; Alison, Jennifer A; Waters, Donna; Gordon, Christopher J

    2016-08-05

    Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected, 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparison with APPs from 5 week clinical placements in hospital and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1 and 2 week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items recorded greater than 25 % of scores as 'not-assessed' by clinical educators which impacted on the suitability of the APP tool in this simulation format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.

  1. Changes in Gait with Anteriorly Added Mass: A Pregnancy Simulation Study

    PubMed Central

    Ogamba, Maureen I.; Loverro, Kari L.; Laudicina, Natalie M.; Gill, Simone V.; Lewis, Cara L.

    2016-01-01

    During pregnancy, the female body experiences structural changes, such as weight gain. As pregnancy advances, most of the additional mass is concentrated anteriorly on the lower trunk. The purpose of this study is to analyze kinematic and kinetic changes when load is added anteriorly to the trunk, simulating a physical change experienced during pregnancy. Twenty healthy females walked on a treadmill while wearing a custom-made pseudo-pregnancy sac (1 kg) under three load conditions: sac only, a 10-pound condition (4.535 kg added anteriorly), and a 20-pound condition (9.07 kg added anteriorly), used to simulate pregnancy at the second trimester and at full term, respectively. The increase in anterior mass resulted in kinematic changes at the knee, hip, pelvis, and trunk in the sagittal and frontal planes. Additionally, ankle, knee, and hip joint moments normalized to baseline mass increased with increased load; however, these moments decreased when normalized to total mass. These kinematic and kinetic changes may suggest that women modify gait biomechanics to reduce the effect of added load. Furthermore, the increase in joint moments increases stress on the musculoskeletal system and may contribute to musculoskeletal pain. PMID:26958743

  2. Effective dose evaluation of NORM-added consumer products using Monte Carlo simulations and the ICRP computational human phantoms.

    PubMed

    Lee, Hyun Cheol; Yoo, Do Hyeon; Testa, Mauro; Shin, Wook-Geun; Choi, Hyun Joon; Ha, Wi-Ho; Yoo, Jaeryong; Yoon, Seokwon; Min, Chul Hee

    2016-04-01

    The aim of this study is to evaluate the potential hazard of naturally occurring radioactive material (NORM) added consumer products. Using the Monte Carlo method, the radioactive products were simulated with the ICRP reference phantoms and the organ doses were calculated according to the usage scenario. Finally, the annual effective doses were evaluated to be lower than the public dose limit of 1 mSv y(-1) for 44 products. It was demonstrated that NORM-added consumer products could be quantitatively assessed for the safety regulation. Copyright © 2016 Elsevier Ltd. All rights reserved.
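
    The bookkeeping behind such an assessment can be summarized as E = Σ_T w_T·H_T, scaled by the annual usage time from the exposure scenario. The sketch below uses a subset of the ICRP 103 tissue weighting factors with invented organ dose rates and usage hours, purely for illustration:

      # Sketch of the effective-dose bookkeeping implied by the abstract:
      # organ equivalent doses (here invented) are combined with ICRP 103
      # tissue weighting factors and scaled by an assumed usage scenario.
      w = {"lung": 0.12, "stomach": 0.12, "colon": 0.12, "red_marrow": 0.12,
           "breast": 0.12, "gonads": 0.08, "thyroid": 0.04, "liver": 0.04,
           "bladder": 0.04, "skin": 0.01}      # subset of ICRP 103 factors
      h = {"lung": 1.2e-7, "stomach": 0.8e-7, "colon": 0.6e-7,   # Sv per hour
           "red_marrow": 0.9e-7, "breast": 1.0e-7, "gonads": 0.5e-7,
           "thyroid": 1.1e-7, "liver": 0.7e-7, "bladder": 0.5e-7,
           "skin": 2.0e-7}                     # invented organ dose rates

      hours_per_year = 8 * 365                 # usage-scenario assumption
      E = sum(w[t] * h[t] for t in w) * hours_per_year
      print(f"annual effective dose ≈ {E*1e3:.3f} mSv/y  (limit: 1 mSv/y)")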

  3. 10 CFR 434.606 - Simulation tool.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    10 CFR 434.606 - Simulation tool. Title 10 Energy; DEPARTMENT OF ENERGY; ENERGY CONSERVATION; ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS; Building Energy Compliance Alternative. § 434.606 Simulation tool. 606.1 The criteria...

  4. Landsat-7 Simulation and Testing Environments

    NASA Technical Reports Server (NTRS)

    Holmes, E.; Ha, K.; Hawkins, K.; Lombardo, J.; Ram, M.; Sabelhaus, P.; Scott, S.; Phillips, R.

    1999-01-01

    A spacecraft Attitude Control and Determination Subsystem (ACDS) is heavily dependent upon simulation throughout its entire development, implementation, and ground test cycle. Engineering simulation tools are typically developed to design and analyze control systems and validate the design, and software simulation tools are required to qualify the flight software. However, the need for simulation does not end here. Operating the ACDS of a spacecraft on the ground requires the simulation of spacecraft dynamics, disturbance modeling, and celestial body motion. Sensor data must also be simulated and substituted for actual sensor data on the ground so that the spacecraft will respond by sending commands to the actuators as it will on orbit. And finally, the simulator is the primary training tool and test-bed for the Flight Operations Team. In this paper, the various ACDS simulators developed for or used by the Landsat 7 project are described. The paper includes a description of each tool, its unique attributes, and its role in the overall development and testing of the ACDS. Finally, a section discusses how the coordinated use of these simulation tools can maximize the probability of uncovering software, hardware, and operations errors during the ground test process.

  5. Using serious games and virtual worlds in pesticides transport teaching

    NASA Astrophysics Data System (ADS)

    Payraudeau, Sylvain; Alvarez-Zaldivar, Pablo; van Dijk, Paul; Imfeld, Gwenaël

    2017-04-01

    Teaching environmental scenarios, such as the availability and transport of pesticides in catchments, may fail with traditional lectures and tutorials due to the complex and synergistic interplay of soil, land use, compound properties, hydroclimatic forcing, and biogeochemical processes. To tackle and pedagogically enter into this complexity, virtual worlds (i.e., computer-based simulated environments) and serious games (i.e., applied games with added pedagogical value) can efficiently improve the knowledge and know-how of future water management stakeholders and scientists. We have developed an e-learning teaching unit using virtual catchments and serious games, gradually adapting the level of complexity to the targeted public. The first target group is farmers in continuing education centers. We developed a distributed pesticide transport tool in a virtual agricultural catchment to highlight the specific risks of off-site pesticide transport along the crop growing season. Students in this first group can interactively define and combine climatic, land-use and soil-type scenarios with different pesticides to explore the components of worst-case situations and to propose best-management practices depending on the environmental compartments involved, i.e., atmosphere, soil, surface water or groundwater. For Master's degree students, we added a level of complexity with a specific module focusing on pesticide degradation using cutting-edge approaches. With the compound-specific isotope analysis (CSIA) module, students are able to link the 13C/12C signature of pesticides to the ongoing dissipation processes within the catchment. By using and interpreting CSIA data, students can thus efficiently understand the difference between non-destructive (e.g., sorption) and destructive (e.g., biotic and abiotic degradation) processes occurring in a catchment. The CSIA tool applied to a virtual agricultural catchment also allows students to distinguish the dilution effect from the degradation effect in complex agricultural catchments receiving pesticides. We anticipate that our e-learning teaching unit based on serious games and virtual catchments will help future scientists and stakeholders to better understand and manage pesticide transport within catchments.

  6. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    DTIC Science & Technology

    2005-02-01

    The program developed fast time-domain mixed-signal circuit simulation methods (HAARSPICE algorithms), Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools for the IC design flow. Mission Area: Command and Control. Keywords: mixed-signal circuit simulation, parasitic extraction, time-domain simulation, IC design flow, model order reduction.

  7. Screening tool to evaluate the vulnerability of down-gradient receptors to groundwater contaminants from uncapped landfills

    USGS Publications Warehouse

    Baker, Ronald J.; Reilly, Timothy J.; Lopez, Anthony R.; Romanok, Kristin M.; Wengrowski, Edward W

    2015-01-01

    A screening tool was developed and evaluated for quantifying the level of concern posed to down-gradient receptors (streams, wetlands, and residential lots) by contaminants detected in monitoring wells on or near landfills. The tool uses Quick Domenico Multi-scenario (QDM), a spreadsheet implementation of Domenico-based solute transport, to estimate concentrations of contaminants reaching receptors under steady-state conditions from a constant-strength source. Unlike most other available Domenico-based model applications, QDM calculates the time for down-gradient contaminant concentrations to approach steady state and appropriate dispersivity values, and allows for up to fifty simulations on a single spreadsheet. Sensitivity of QDM solutions to critical model parameters was quantified. The screening tool uses QDM results to categorize landfills as having high, moderate, or low levels of concern, based on contaminant concentrations reaching receptors relative to regulatory concentrations. The application of this tool was demonstrated by assessing levels of concern (as defined by the New Jersey Pinelands Commission) for thirty closed, uncapped landfills in the New Jersey Pinelands National Reserve, using historic water-quality data from monitoring wells on and near landfills and hydraulic parameters from regional flow models. Twelve of these landfills are categorized as having high levels of concern, indicating a need for further assessment. This tool is not a replacement for conventional numerically-based transport models or other available Domenico-based applications, but is suitable for quickly assessing the level of concern posed by a landfill or other contaminant point source before expensive and lengthy monitoring or remediation measures are taken. In addition to quantifying the level of concern using historic groundwater-monitoring data, the tool allows for archiving model scenarios and adding refinements as new data become available.
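
    For orientation, one common steady-state centerline form of the Domenico solution, of the kind QDM implements in spreadsheet form, looks as follows. All parameter values here are invented, and the actual QDM formulation may differ in detail:

      # Steady-state centerline Domenico-type solution for a decaying
      # solute from a constant-strength planar source (one common
      # textbook/BIOSCREEN-style form; illustrative only).
      import numpy as np
      from scipy.special import erf

      def domenico_centerline(x, c0, v, ax, ay, az, lam, Y, Z):
          """Concentration at distance x (m) down-gradient of the source."""
          decay = np.exp((x / (2 * ax)) * (1 - np.sqrt(1 + 4 * lam * ax / v)))
          spread_y = erf(Y / (4 * np.sqrt(ay * x)))
          spread_z = erf(Z / (2 * np.sqrt(az * x)))
          return c0 * decay * spread_y * spread_z

      c = domenico_centerline(x=150.0, c0=5.0,        # mg/L at the source
                              v=0.1,                  # seepage velocity, m/day
                              ax=15.0, ay=1.5, az=0.15,  # dispersivities, m
                              lam=1e-3,               # decay rate, 1/day
                              Y=20.0, Z=3.0)          # source width/depth, m
      print(f"predicted concentration at receptor ≈ {c:.3f} mg/L")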

  8. Lean manufacturing analysis to reduce waste on production process of fan products

    NASA Astrophysics Data System (ADS)

    Siregar, I.; Nasution, A. A.; Andayani, U.; Sari, R. M.; Syahputri, K.; Anizar

    2018-02-01

    This research is based on a case study at an electrical company. One of the products studied is the fan; during its production process there is time that is not value-added, including inefficient movement of raw materials and fan molding components. This study aims to reduce waste, or non-value-added activities, and shorten the total lead time using Value Stream Mapping. Lean manufacturing methods were used to analyze and reduce the non-value-added activities: value stream analysis tools, process activity mapping with 5W1H, and the 5-whys tool. The analysis found 647.94 minutes of non-value-added activities in the fan production process, out of a total lead time of 725.68 minutes; the resulting process cycle efficiency is still very low, at 11%. Estimates for the proposed improvements show the total lead time decreasing to 340.9 minutes and the process cycle efficiency increasing to 24%, indicating a better production process.
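
    The process-cycle-efficiency arithmetic in the abstract can be reproduced directly. The "after" figure below assumes the value-added time is unchanged by the improvements, which lands close to the reported 24% (the small gap is presumably rounding in the source):

      # Process cycle efficiency: PCE = value-added time / total lead time.
      lead_time = 725.68                  # minutes
      non_value_added = 647.94            # minutes
      value_added = lead_time - non_value_added

      print(f"PCE before: {value_added / lead_time:.1%}")        # ≈ 10.7% ~ 11%

      lead_time_after = 340.9
      print(f"PCE after:  {value_added / lead_time_after:.1%}")  # ≈ 22.8% ~ 24%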

  9. The development and potential of inverse simulation for the quantitative assessment of helicopter handling qualities

    NASA Technical Reports Server (NTRS)

    Bradley, Roy; Thomson, Douglas G.

    1993-01-01

    In this paper it is proposed that inverse simulation can make a positive contribution to the study of handling qualities. It is shown that mathematical descriptions of the MTEs (Mission Task Elements) defined in ADS-33C may be used to drive an inverse simulation thereby generating, from an appropriate mathematical model, the controls and states of a subject helicopter flying it. By presenting the results of such simulations it is shown that, in the context of inverse simulation, the attitude quickness parameters given in ADS-33C are independent of vehicle configuration. An alternative quickness parameter, associated with the control displacements required to fly the MTE is proposed, and some preliminary results are presented.
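
    As a sketch of the quickness calculation this discussion rests on (the ADS-33 attitude quickness is, in outline, the peak attitude rate divided by the corresponding net attitude change), using a synthetic roll-step time history rather than inverse-simulation output:

      # Attitude-quickness sketch: peak rate over net attitude change,
      # computed from an illustrative critically-damped roll step.
      import numpy as np

      t = np.linspace(0, 3, 301)
      phi = 30.0 * (1 - np.exp(-2 * t) * (1 + 2 * t))   # roll attitude, deg
      p = np.gradient(phi, t)                           # roll rate, deg/s

      delta_phi = phi[-1] - phi[0]                      # net attitude change
      quickness = p.max() / delta_phi                   # units of 1/s
      print(f"attitude quickness ≈ {quickness:.2f} 1/s")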

  10. Simulation of 20-channel, 50-GHz, Si3N4-based arrayed waveguide grating applying three different photonics tools

    NASA Astrophysics Data System (ADS)

    Gajdošová, Lenka; Seyringer, Dana

    2017-02-01

    We present the design and simulation of a 20-channel, 50-GHz, Si3N4-based AWG using three different commercial photonics tools: PHASAR from Optiwave Systems Inc., APSS from Apollo Photonics Inc., and RSoft from Synopsys Inc. For this purpose, we created identical waveguide structures and identical AWG layouts in these tools and performed BPM simulations using the same calculation conditions. The AWGs were designed for TM-polarized light with an AWG central wavelength of 850 nm. The output of all simulations, the transmission characteristics, was used to calculate the transmission parameters defining the optical properties of the simulated AWGs. These parameters were summarized and compared with each other. The results show very good correlation between the tools and are comparable to the parameters designed in the AWG-Parameters tool.

  11. Design and Development of Wireless Power Transmission for Unmanned Air Vehicles

    DTIC Science & Technology

    2012-09-01

    Wireless power transmission for unmanned air vehicles was investigated by a series of simulations using Agilent Advanced Design System (ADS). Tuning elements were added and adjusted in order to optimize the efficiency; a maximum efficiency of 57% was... (Thesis for the degrees of Master of Science in Electronic Warfare Systems Engineering and Master of Science in Electrical Engineering, Naval Postgraduate School, September 2012.)

  12. Structure-mechanism-based engineering of chemical regulators targeting distinct pathological factors in Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Beck, Michael W.; Derrick, Jeffrey S.; Kerr, Richard A.; Oh, Shin Bi; Cho, Woo Jong; Lee, Shin Jung C.; Ji, Yonghwan; Han, Jiyeon; Tehrani, Zahra Aliakbar; Suh, Nayoung; Kim, Sujeong; Larsen, Scott D.; Kim, Kwang S.; Lee, Joo-Yong; Ruotolo, Brandon T.; Lim, Mi Hee

    2016-10-01

    The absence of effective therapeutics against Alzheimer's disease (AD) is a result of the limited understanding of its multifaceted aetiology. Because of the lack of chemical tools to identify pathological factors, investigations into AD pathogenesis have also been insubstantial. Here we report chemical regulators that demonstrate distinct specificity towards targets linked to AD pathology, including metals, amyloid-β (Aβ), metal-Aβ, reactive oxygen species, and free organic radicals. We obtained these chemical regulators through a rational structure-mechanism-based design strategy. We performed structural variations of small molecules for fine-tuning their electronic properties, such as ionization potentials and mechanistic pathways for reactivity towards different targets. We established in vitro and/or in vivo efficacies of the regulators for modulating their targets' reactivities, ameliorating toxicity, reducing amyloid pathology, and improving cognitive deficits. Our chemical tools show promise for deciphering AD pathogenesis and discovering effective drugs.

  13. Structure-mechanism-based engineering of chemical regulators targeting distinct pathological factors in Alzheimer's disease.

    PubMed

    Beck, Michael W; Derrick, Jeffrey S; Kerr, Richard A; Oh, Shin Bi; Cho, Woo Jong; Lee, Shin Jung C; Ji, Yonghwan; Han, Jiyeon; Tehrani, Zahra Aliakbar; Suh, Nayoung; Kim, Sujeong; Larsen, Scott D; Kim, Kwang S; Lee, Joo-Yong; Ruotolo, Brandon T; Lim, Mi Hee

    2016-10-13

    The absence of effective therapeutics against Alzheimer's disease (AD) is a result of the limited understanding of its multifaceted aetiology. Because of the lack of chemical tools to identify pathological factors, investigations into AD pathogenesis have also been insubstantial. Here we report chemical regulators that demonstrate distinct specificity towards targets linked to AD pathology, including metals, amyloid-β (Aβ), metal-Aβ, reactive oxygen species, and free organic radicals. We obtained these chemical regulators through a rational structure-mechanism-based design strategy. We performed structural variations of small molecules for fine-tuning their electronic properties, such as ionization potentials and mechanistic pathways for reactivity towards different targets. We established in vitro and/or in vivo efficacies of the regulators for modulating their targets' reactivities, ameliorating toxicity, reducing amyloid pathology, and improving cognitive deficits. Our chemical tools show promise for deciphering AD pathogenesis and discovering effective drugs.

  14. Using the genome aggregation database, computational pathogenicity prediction tools, and patch clamp heterologous expression studies to demote previously published long QT syndrome type 1 mutations from pathogenic to benign.

    PubMed

    Clemens, Daniel J; Lentino, Anne R; Kapplinger, Jamie D; Ye, Dan; Zhou, Wei; Tester, David J; Ackerman, Michael J

    2018-04-01

    Mutations in the KCNQ1-encoded Kv7.1 potassium channel cause long QT syndrome (LQTS) type 1 (LQT1). It has been suggested that ∼10%-20% of rare LQTS case-derived variants in the literature may have been published erroneously as LQT1-causative mutations and may be "false positives." The purpose of this study was to determine which previously published KCNQ1 case variants are likely false positives. A list of all published, case-derived KCNQ1 missense variants (MVs) was compiled. The occurrence of each MV within the Genome Aggregation Database (gnomAD) was assessed. Eight in silico tools were used to predict each variant's pathogenicity. Case-derived variants that were either (1) too frequently found in gnomAD or (2) absent in gnomAD but predicted to be pathogenic by ≤2 tools were considered potential false positives. Three of these variants were characterized functionally using whole-cell patch clamp technique. Overall, there were 244 KCNQ1 case-derived MVs. Of these, 29 (12%) were seen in ≥10 individuals in gnomAD and are demotable. However, 157 of 244 MVs (64%) were absent in gnomAD. Of these, 7 (4%) were predicted to be pathogenic by ≤2 tools, 3 of which we characterized functionally. There was no significant difference in current density between heterozygous KCNQ1-F127L, -P477L, or -L619M variant-containing channels compared to KCNQ1-WT. This study offers preliminary evidence for the demotion of 32 (13%) previously published LQT1 MVs. Of these, 29 were demoted because of their frequent sighting in gnomAD. Additionally, in silico analysis and in vitro functional studies have facilitated the demotion of 3 ultra-rare MVs (F127L, P477L, L619M). Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
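
    The two demotion filters in the abstract amount to a simple decision rule over gnomAD counts and in silico predictions. A sketch with illustrative records follows (F127L and P477L appear in the study, but all counts below are invented):

      # Sketch of the demotion filters described in the abstract: flag a
      # variant if it is seen in >= 10 gnomAD individuals, or if it is
      # absent from gnomAD but called pathogenic by <= 2 of 8 tools.
      variants = [
          {"mv": "F127L", "gnomad_n": 0,  "tools_pathogenic": 1},
          {"mv": "P477L", "gnomad_n": 0,  "tools_pathogenic": 2},
          {"mv": "X999X", "gnomad_n": 0,  "tools_pathogenic": 8},  # placeholder
          {"mv": "Y888Y", "gnomad_n": 14, "tools_pathogenic": 5},  # placeholder
      ]

      def potential_false_positive(v):
          frequent = v["gnomad_n"] >= 10
          absent_benign_looking = (v["gnomad_n"] == 0
                                   and v["tools_pathogenic"] <= 2)
          return frequent or absent_benign_looking

      for v in variants:
          verdict = "demotable" if potential_false_positive(v) else "retained"
          print(v["mv"], verdict)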

  15. SU-F-P-03: Management of Time to Treatment Initiation: Case for An Electronic Whiteboard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adnani, N

    2016-06-15

    Purpose: To determine if data mining of an electronic whiteboard improves the management of the Time to Treatment Initiation (TTI) in radiation oncology. Methods: An electronic whiteboard, designed to help manage the planning workflow and improve communication regarding patient planning progress, was used to record the dates at which each phase of the planning process began or completed. These are CT Sim date, Plan Start, Physician Review, Physicist Review, Approval for Treatment Delivery, and Setup or Verification of Simulation. Results: During clinical implementation, the electronic whiteboard was able to fulfill its primary objective of providing a transparent account of the planning progress of each patient. Peer pressure also meant that individual tasks, such as contouring, were easily brought to the attention of the responsible party and prioritized accordingly. Data mining of the electronic whiteboard per patient (figure 1), per diagnosis (figure 2), per treatment modality (figure 3), per physician (figure 4), per planner (figure 5), etc., added another sophisticated tool for managing the Time to Treatment Initiation without compromising the quality of the plans being generated. A longer than necessary time between CT Sim and Plan Start can be discussed among the members of the treatment team as an indication of an inadequate/outdated CT simulator, contouring tools, image fusion tools, other imaging studies (MRI, PET/CT) performed, etc. The same applies to the interval from Plan Start to Physician Review, where a longer than expected time may be due to unrealistic planning goals, limited planning system features, etc. Conclusion: An electronic whiteboard in radiation oncology not only helps with organizing the planning workflow, it is also a potent tool that can be used to reduce the Time to Treatment Initiation by providing the clinic with hard data about the duration of each phase of treatment planning as a function of the different variables affecting the planning process. The work is supported by the Global Medical Physics Institute.

  16. Deficits in Attention and Visual Processing but not Global Cognition Predict Simulated Driving Errors in Drivers Diagnosed With Mild Alzheimer's Disease.

    PubMed

    Yamin, Stephanie; Stinchcombe, Arne; Gagnon, Sylvain

    2016-06-01

    This study sought to predict driving performance of drivers with Alzheimer's disease (AD) using measures of attention, visual processing, and global cognition. Simulated driving performance of individuals with mild AD (n = 20) was contrasted with performance of a group of healthy controls (n = 21). Performance on measures of global cognitive function and specific tests of attention and visual processing were examined in relation to simulated driving performance. Strong associations were observed between measures of attention, notably the Test of Everyday Attention (sustained attention; r = -.651, P = .002) and the Useful Field of View (r = .563, P = .010), and driving performance among drivers with mild AD. The Visual Object and Space Perception Test-object was significantly correlated with the occurrence of crashes (r = .652, P = .002). Tests of global cognition did not correlate with simulated driving outcomes. The results suggest that professionals exercise caution when extrapolating driving performance based on global cognitive indicators. © The Author(s) 2015.

  17. SMMP v. 3.0—Simulating proteins and protein interactions in Python and Fortran

    NASA Astrophysics Data System (ADS)

    Meinke, Jan H.; Mohanty, Sandipan; Eisenmenger, Frank; Hansmann, Ulrich H. E.

    2008-03-01

    We describe a revised and updated version of the program package SMMP. SMMP is an open-source FORTRAN package for molecular simulation of proteins within the standard geometry model. It is designed as a simple and inexpensive tool for researchers and students to become familiar with protein simulation techniques. SMMP 3.0 sports a revised API increasing its flexibility, an implementation of the Lund force field, multi-molecule simulations, a parallel implementation of the energy function, Python bindings, and more. Program summary: Title of program: SMMP. Catalogue identifier: ADOJ_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADOJ_v3_0.html. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. Programming language used: FORTRAN, Python. No. of lines in distributed program, including test data, etc.: 52 105. No. of bytes in distributed program, including test data, etc.: 599 150. Distribution format: tar.gz. Computer: Platform independent. Operating system: OS independent. RAM: 2 Mbytes. Classification: 3. Does the new version supersede the previous version?: Yes. Nature of problem: Molecular mechanics computations and Monte Carlo simulation of proteins. Solution method: Utilizes ECEPP2/3, FLEX, and Lund potentials. Includes Monte Carlo simulation algorithms for canonical as well as generalized ensembles. Reasons for new version: API changes and increased functionality. Summary of revisions: Added Lund potential; parameters used in subroutines are now passed as arguments; multi-molecule simulations; parallelized energy calculation for ECEPP; Python bindings. Restrictions: The consumed CPU time increases with the size of the protein molecule. Running time: Depends on the size of the simulated molecule.

  18. Virtual Habitat -a dynamic simulation of closed life support systems -human model status

    NASA Astrophysics Data System (ADS)

    Markus Czupalla, M. Sc.; Zhukov, Anton; Hwang, Su-Au; Schnaitmann, Jonas

    In order to optimize Life Support Systems (LSS) on a system level, stability questions must be investigated. To do so, the exploration group of the Technical University of Munich (TUM) is developing the "Virtual Habitat" (V-HAB) dynamic LSS simulation software. V-HAB shall provide the possibility to conduct dynamic simulations of entire mission scenarios for any given LSS configuration. The Virtual Habitat simulation tool consists of four main modules: • Closed Environment Module (CEM) - monitoring of compounds in a closed environment • Crew Module (CM) - dynamic human simulation • P/C Systems Module (PCSM) - dynamic P/C subsystems • Plant Module (PM) - dynamic plant simulation. The core module of the simulation is the dynamic and environment-sensitive human module. Introduced in its basic version in 2008, the human module has been updated significantly since, increasing its capabilities and maturity. In this paper, three newly added human model subsystems (thermal regulation, digestion, and schedule controller) are introduced, touching also on the human stress subsystem, which is currently under development. Upon the introduction of these new subsystems, their integration into the overall V-HAB human model is discussed, highlighting the impact on the most important interfaces. The overall human model capabilities are further summarized and presented based on meaningful test cases. In addition to the presentation of the results, the correlation strategy for the Virtual Habitat human model is introduced, assessing the model's current confidence level and giving an outlook on the future correlation strategy. Last but not least, the remaining V-HAB modules are introduced briefly, showing how the human model is integrated into the overall simulation.

  19. Web-Based Computational Chemistry Education with CHARMMing II: Coarse-Grained Protein Folding

    PubMed Central

    Schalk, Vinushka; Lerner, Michael G.; Woodcock, H. Lee; Brooks, Bernard R.

    2014-01-01

    A lesson utilizing a coarse-grained (CG) Gō-like model has been implemented into the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org) to the Chemistry at HARvard Macromolecular Mechanics (CHARMM) molecular simulation package. While widely used to model various biophysical processes, such as protein folding and aggregation, CG models can also serve as an educational tool because they can provide qualitative descriptions of complex biophysical phenomena for a relatively cheap computational cost. As a proof of concept, this lesson demonstrates the construction of a CG model of a small globular protein, its simulation via Langevin dynamics, and the analysis of the resulting data. This lesson makes connections between modern molecular simulation techniques and topics commonly presented in an advanced undergraduate lecture on physical chemistry. It culminates in a straightforward analysis of a short dynamics trajectory of a small fast folding globular protein; we briefly describe the thermodynamic properties that can be calculated from this analysis. The assumptions inherent in the model and the data analysis are laid out in a clear, concise manner, and the techniques used are consistent with those employed by specialists in the field of CG modeling. One of the major tasks in building the Gō-like model is determining the relative strength of the nonbonded interactions between coarse-grained sites. New functionality has been added to CHARMMing to facilitate this process. The implementation of these features into CHARMMing helps automate many of the tedious aspects of constructing a CG Gō model. The CG model builder and its accompanying lesson should be a valuable tool to chemistry students, teachers, and modelers in the field. PMID:25058338

  20. Web-based computational chemistry education with CHARMMing II: Coarse-grained protein folding.

    PubMed

    Pickard, Frank C; Miller, Benjamin T; Schalk, Vinushka; Lerner, Michael G; Woodcock, H Lee; Brooks, Bernard R

    2014-07-01

    A lesson utilizing a coarse-grained (CG) Gō-like model has been implemented into the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org) to the Chemistry at HARvard Macromolecular Mechanics (CHARMM) molecular simulation package. While widely used to model various biophysical processes, such as protein folding and aggregation, CG models can also serve as an educational tool because they can provide qualitative descriptions of complex biophysical phenomena for a relatively cheap computational cost. As a proof of concept, this lesson demonstrates the construction of a CG model of a small globular protein, its simulation via Langevin dynamics, and the analysis of the resulting data. This lesson makes connections between modern molecular simulation techniques and topics commonly presented in an advanced undergraduate lecture on physical chemistry. It culminates in a straightforward analysis of a short dynamics trajectory of a small fast folding globular protein; we briefly describe the thermodynamic properties that can be calculated from this analysis. The assumptions inherent in the model and the data analysis are laid out in a clear, concise manner, and the techniques used are consistent with those employed by specialists in the field of CG modeling. One of the major tasks in building the Gō-like model is determining the relative strength of the nonbonded interactions between coarse-grained sites. New functionality has been added to CHARMMing to facilitate this process. The implementation of these features into CHARMMing helps automate many of the tedious aspects of constructing a CG Gō model. The CG model builder and its accompanying lesson should be a valuable tool to chemistry students, teachers, and modelers in the field.
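
    For a minimal flavor of the Langevin dynamics step used in such a lesson, consider a single coarse-grained bead in a harmonic well rather than a full Gō model run in CHARMM (all constants are arbitrary reduced units; the integrator is a plain Euler scheme for brevity):

      # Minimal Langevin dynamics sketch: one bead, harmonic restoring
      # force, friction, and thermal noise, checked against equipartition.
      import numpy as np

      rng = np.random.default_rng(1)
      k, gamma, kT, m, dt = 1.0, 0.5, 0.6, 1.0, 0.01
      x, v = 1.5, 0.0
      positions = []
      for step in range(20000):
          force = -k * x                                   # harmonic well
          noise = np.sqrt(2 * gamma * kT * m / dt) * rng.normal()
          v += dt * (force - gamma * m * v + noise) / m    # Langevin update
          x += dt * v
          positions.append(x)

      print("mean ≈ 0:      ", np.mean(positions))
      print("<x^2> ≈ kT/k:  ", np.var(positions))          # expect ≈ 0.6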

  1. Developing Collective Training for Small Unmanned Aerial Systems Employment

    NASA Technical Reports Server (NTRS)

    Durlach, Paula J.; Priest, Heather; Martin, Glenn A.; Saffold, Jay

    2010-01-01

    The projected use of small unmanned aerial systems (SUAS) in military operations will produce training requirements which go beyond current capabilities. The paper describes the development of prototype training procedures and accompanying research simulations to address this need. We initially constructed a testbed to develop simulation-based training for an SUAS operator equipped with a simulated vertical-lift and land SUAS. However, the required training will go beyond merely training an operator how to pilot an SUAS. In addition to tactics, techniques, and procedures for employment of SUASs, collective training methods must be trained. Moreover, the leader of a unit equipped with SUAS will need to learn how to plan missions which incorporate the SUAS, and take into account air space and frequency management considerations. The demands of the task require the leader to allocate personnel to the SUAS mission, communicate and coordinate with those personnel during the mission, and make use of the information provided. To help address these training issues, we expanded our research testbed to include a command and control node (C2 node), to enable communications between a leader and the SUAS operator. In addition, we added a virtual environment in which dismounted infantry missions can be conducted. This virtual environment provides the opportunity for interactions among human-controlled avatars and non-player characters (NPCs), plus authoring tools to construct scenarios. Using these NPCs, a collective exercise involving friendly, enemy, and civilian personnel can be conducted without the need for a human role-player for every entity. We will describe the results of our first experiment, which examined the ability of players to negotiate use of the C2 node and the virtual environment at the same time, in order to see if this is a feasible combination of tools for training development.

  2. Evaluation of simulation training in cardiothoracic surgery: the Senior Tour perspective.

    PubMed

    Fann, James I; Feins, Richard H; Hicks, George L; Nesbitt, Jonathan C; Hammon, John W; Crawford, Fred A

    2012-02-01

    The study objective was to introduce senior surgeons, referred to as members of the "Senior Tour," to simulation-based learning and evaluate ongoing simulation efforts in cardiothoracic surgery. Thirteen senior cardiothoracic surgeons participated in a 2½-day Senior Tour Meeting. Of 12 simulators, each participant focused on 6 cardiac (small vessel anastomosis, aortic cannulation, cardiopulmonary bypass, aortic valve replacement, mitral valve repair, and aortic root replacement) or 6 thoracic surgical simulators (hilar dissection, esophageal anastomosis, rigid bronchoscopy, video-assisted thoracoscopic surgery lobectomy, tracheal resection, and sleeve resection). The participants provided critical feedback regarding the realism and utility of the simulators, which served as the basis for a composite assessment of the simulators. All participants acknowledged that simulation may not provide a wholly immersive experience. For small vessel anastomosis, the portable chest model is less realistic compared with the porcine model, but is valuable in teaching anastomosis mechanics. The aortic cannulation model allows multiple cannulations and can serve as a thoracic aortic surgery model. The cardiopulmonary bypass simulator provides crisis management experience. The porcine aortic valve replacement, mitral valve annuloplasty, and aortic root models are realistic and permit standardized training. The hilar dissection model is subject to variability of porcine anatomy and fragility of the vascular structures. The realistic esophageal anastomosis simulator presents various approaches to esophageal anastomosis. The exercise associated with the rigid bronchoscopy model is brief, and adding additional procedures should be considered. The tracheal resection, sleeve resection, and video-assisted thoracoscopic surgery lobectomy models are highly realistic and simulate advanced maneuvers. By providing the necessary tools, such as task trainers and assessment instruments, the Senior Tour may be one means to enhance simulation-based learning in cardiothoracic surgery. The Senior Tour members can provide regular programmatic evaluation and critical analyses to ensure that proposed simulators are of educational value. Published by Mosby, Inc.

  3. A Micro-Grid Simulator Tool (SGridSim) using Effective Node-to-Node Complex Impedance (EN2NCI) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udhay Ravishankar; Milos Manic

    2013-08-01

    This paper presents a micro-grid simulator tool useful for implementing and testing multi-agent controllers (SGridSim). As a common engineering practice, it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. This paper presents an SGridSim micro-grid simulator tool that simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of components that typically make up a micro-grid. The term EN2NCI models means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are: 1) simulation of a micro-grid is performed strictly in the complex domain; 2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
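
    Modeling every component as a single node-to-node complex impedance means the whole grid collapses to a nodal admittance system Y·V = I solved in the complex domain. A toy three-node sketch of that calculation follows (values invented; this is standard nodal analysis, not SGridSim code):

      # Complex-domain nodal solve for a toy feeder: build the admittance
      # matrix from node-to-node impedances, fix the slack voltage, and
      # solve the reduced system for the remaining node voltages.
      import numpy as np

      nodes = 3                                      # node 0 is the slack
      z = {(0, 1): 0.5 + 0.8j, (1, 2): 0.4 + 0.6j}   # node-to-node impedances

      Y = np.zeros((nodes, nodes), dtype=complex)
      for (i, j), zij in z.items():
          y = 1 / zij
          Y[i, i] += y; Y[j, j] += y
          Y[i, j] -= y; Y[j, i] -= y

      v0 = 1.0 + 0j                                  # slack voltage, p.u.
      I = np.array([0.0, -0.1 - 0.05j])              # injections at nodes 1, 2
      rhs = I - Y[1:, 0] * v0                        # move slack to the RHS
      V = np.linalg.solve(Y[1:, 1:], rhs)
      print("node voltages:", v0, V)                 # complex phasors, p.u.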

  4. What the Logs Can Tell You: Mediation to Implement Feedback in Training

    NASA Technical Reports Server (NTRS)

    Maluf, David; Wiederhold, Gio; Abou-Khalil, Ali; Norvig, Peter (Technical Monitor)

    2000-01-01

    The problem addressed by Mediation to Implement Feedback in Training (MIFT) is to customize the feedback from training exercises by exploiting knowledge about the training scenario, training objectives, and specific student/teacher needs. We achieve this by inserting an intelligent mediation layer into the information flow from observations collected during training exercises to the display and user interface. Knowledge about training objectives, scenarios, and tasks is maintained in the mediating layer. A design constraint is that domain experts must be able to extend mediators by adding domain-specific knowledge that supports additional aggregations, abstractions, and views of the results of training exercises. The MIFT mediation concept is intended to be integrated with existing military training exercise management tools and reduce the cost of developing and maintaining separate feedback and evaluation tools for every training simulator and every set of customer needs. The MIFT Architecture is designed as a set of independently reusable components which interact with each other through standardized formalisms such as the Knowledge Interchange Format (KIF) and Knowledge Query and Manipulation Language (KQML).

  5. Aircraft as Research Tools

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Aeronautical research usually begins with computers, wind tunnels, and flight simulators, but eventually the theories must fly. This is when flight research begins, and aircraft are the primary tools of the trade. Flight research involves doing precision maneuvers in either a specially built experimental aircraft or an existing production airplane that has been modified. For example, the AD-1 was a unique airplane made only for flight research, while the NASA F-18 High Alpha Research Vehicle (HARV) was a standard fighter aircraft that was transformed into a one-of-a-kind aircraft as it was fitted with new propulsion systems, flight controls, and scientific equipment. All research aircraft are able to perform scientific experiments because of the onboard instruments that record data about its systems, aerodynamics, and the outside environment. Since the 1970's, NASA flight research has become more comprehensive, with flights involving everything from Space Shuttles to ultralights. NASA now flies not only the fastest airplanes, but some of the slowest. Flying machines continue to evolve with new wing designs, propulsion systems, and flight controls. As always, a look at today's experimental research aircraft is a preview of the future.

  6. Mobile robotic sensors for perimeter detection and tracking.

    PubMed

    Clark, Justin; Fierro, Rafael

    2007-02-01

    Mobile robot/sensor networks have emerged as tools for environmental monitoring, search and rescue, exploration and mapping, evaluation of civil infrastructure, and military operations. These networks consist of many sensors each equipped with embedded processors, wireless communication, and motion capabilities. This paper describes a cooperative mobile robot network capable of detecting and tracking a perimeter defined by a certain substance (e.g., a chemical spill) in the environment. Specifically, the contributions of this paper are twofold: (i) a library of simple reactive motion control algorithms and (ii) a coordination mechanism for effectively carrying out perimeter-sensing missions. The decentralized nature of the methodology implemented could potentially allow the network to scale to many sensors and to reconfigure when adding/deleting sensors. Extensive simulation results and experiments verify the validity of the proposed cooperative control scheme.

  7. The Eugene language for synthetic biology.

    PubMed

    Bilitchenko, Lesia; Liu, Adam; Densmore, Douglas

    2011-01-01

    Synthetic biological systems are currently created by an ad hoc, iterative process of design, simulation, and assembly. These systems would greatly benefit from the introduction of a more formalized and rigorous specification of the desired system components as well as constraints on their composition. In order to do so, the creation of robust and efficient design flows and tools is imperative. We present a human readable language (Eugene) which allows for both the specification of synthetic biological designs based on biological parts as well as providing a very expressive constraint system to drive the creation of composite devices from collection of parts. This chapter provides an overview of the language primitives as well as instructions on installation and use of Eugene v0.03b. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. Sensitivity and Nonlinearity of Thermoacoustic Oscillations

    NASA Astrophysics Data System (ADS)

    Juniper, Matthew P.; Sujith, R. I.

    2018-01-01

    Nine decades of rocket engine and gas turbine development have shown that thermoacoustic oscillations are difficult to predict but can usually be eliminated with relatively small ad hoc design changes. These changes can, however, be ruinously expensive to devise. This review explains why linear and nonlinear thermoacoustic behavior is so sensitive to parameters such as operating point, fuel composition, and injector geometry. It shows how nonperiodic behavior arises in experiments and simulations and discusses how fluctuations in thermoacoustic systems with turbulent reacting flow, which are usually filtered or averaged out as noise, can reveal useful information. Finally, it proposes tools to exploit this sensitivity in the future: adjoint-based sensitivity analysis to optimize passive control designs and complex systems theory to warn of impending thermoacoustic oscillations and to identify the most sensitive elements of a thermoacoustic system.

  9. A flexible framework for process-based hydraulic and water ...

    EPA Pesticide Factsheets

    Background Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. While popular, GI models are generally relatively simplistic. However, GI model predictions are being relied upon by many municipalities and state/local agencies to make decisions about grey vs. green infrastructure improvement planning. Adding complexity to GI modeling frameworks may preclude their use in simpler urban planning situations. Therefore, the goal here was to develop a sophisticated yet flexible tool that could be used by design engineers and researchers to capture and explore the effect of design factors and properties of the media used on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool capable of simulating GI system components and the specific biophysical processes affecting contaminants, such as reactions and particle-associated transport, accurately, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. Framework Features The process-based model framework developed here can be used to model a diverse range of GI practices such as green roof, retention pond, bioretention, infiltration trench, permeable pavement and

  10. WRF added value to capture the spatio-temporal drought variability

    NASA Astrophysics Data System (ADS)

    García-Valdecasas Ojeda, Matilde; Quishpe-Vásquez, César; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María

    2017-04-01

    Regional Climate Models (RCMs) have been widely used as a tool to produce high-resolution climate fields in areas with high climate variability such as Spain. However, the outputs provided by downscaling techniques carry many sources of uncertainty in different respects. In this study, the ability of the Weather Research and Forecasting (WRF) model to capture drought conditions has been analyzed. The WRF simulation was carried out for a period that spanned from 1980 to 2010 over a domain centered on the Iberian Peninsula with a spatial resolution of 0.088°, nested in the coarser EURO-CORDEX domain (0.44° spatial resolution). To investigate the spatiotemporal drought variability, the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI) have been computed at two different timescales, 3 and 12 months, due to their suitability for studying agricultural and hydrological droughts. The drought indices computed from WRF outputs were compared with those obtained from the observational (MOTEDAS and MOPREDAS) datasets. In order to assess the added value provided by the downscaled fields, these indices were also computed from the ERA-Interim Re-Analysis database, which provides the lateral boundary conditions of the WRF simulations. Results from this study indicate that WRF provides a noticeable benefit with respect to ERA-Interim for many regions in Spain in terms of drought indices, greater for SPI than for SPEI. The improvement offered by WRF depends on the region, index and timescale analyzed, being greater at longer timescales. These findings demonstrate the reliability of the downscaled fields for detecting drought events and are therefore a remarkable source of knowledge for suitable decision-making related to water-resource management. Keywords: Drought, added value, Regional Climate Models, WRF, SPEI, SPI. Acknowledgements: This work has been financed by the projects P11-RNM-7941 (Junta de Andalucía-Spain) and CGL2013-48539-R (MINECO-Spain, FEDER).
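
    For reference, one standard SPI recipe, of the kind applied per grid cell and timescale in such comparisons, fits a gamma distribution to accumulated precipitation and maps its CDF onto a standard normal deviate. The sketch below uses synthetic data and omits zero-precipitation handling:

      # SPI sketch: fit a gamma distribution to 3-month precipitation
      # accumulations, then transform the CDF to standard-normal z-scores.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      precip = rng.gamma(shape=2.0, scale=30.0, size=360)    # monthly, mm

      window = 3                                             # 3-month SPI
      acc = np.convolve(precip, np.ones(window), mode="valid")

      a, loc, scale = stats.gamma.fit(acc, floc=0)           # fit gamma
      cdf = stats.gamma.cdf(acc, a, loc=loc, scale=scale)
      spi = stats.norm.ppf(cdf)                              # to z-scores
      print("SPI-3 of last month:", round(spi[-1], 2))       # < -1 ~ drought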

  11. A simulation study of spine biofidelity in the hybrid-III 6-year-old ATD.

    PubMed

    Wu, Jun; Cao, Libo; Reed, Matthew P; Hu, Jingwen

    2013-01-01

    Because of the lack of pediatric biomechanical data, Hybrid-III (HIII) child anthropomorphic test devices (ATDs) are essentially scaled from the mid-size male ATD based on geometric considerations. These ATDs inherit a rigid thoracic spine from the adult HIII ATDs, which has been criticized as unrealistic. Therefore, the objective of this study was to explore possible design modifications for improving the spine biofidelity of the HIII 6-year-old ATD. A previously developed and validated HIII 6-year-old MADYMO ATD model was used as the baseline model to investigate the effects of design modifications on the spine biofidelity of the current ATD. Several sets of child volunteer and cadaver test data were considered as the design targets, including child volunteer low-speed crash test data, pediatric cadaver cervical spine tensile test data, and child cadaver crash test data. ATD design modifications included adding a joint to the thoracic spine region and changing the joint characteristics at the cervical and lumbar spine regions. Optimization techniques were used to match simulation results to each set of test results. The results indicate that the translational stiffness of the cervical and lumbar spine in the current child ATD needs to be reduced to achieve realistic spine flexibility. Adding a joint at the thoracic spine region with degrees of freedom in both flexion/extension and tension would significantly improve the ATD biofidelity in terms of predicting the overall spine curvature and head excursion in frontal crashes. Future ATD spine modification should focus on reducing the neck and lumbar tension stiffness and adding flexibility in both flexion/extension and tension at the thoracic spine region. The child ATD model developed in this study can be used as an important tool to improve child ATD biofidelity and child restraint system design in motor vehicle crashes.

  12. Structural Health Monitoring challenges on the 10-MW offshore wind turbine model

    NASA Astrophysics Data System (ADS)

    Di Lorenzo, E.; Kosova, G.; Musella, U.; Manzato, S.; Peeters, B.; Marulo, F.; Desmet, W.

    2015-07-01

    Real-time structural damage detection on large slender structures finds one of its main applications in offshore Horizontal Axis Wind Turbines (HAWT). The renewable energy market is continuously pushing wind turbine sizes and performance upward, which is why offshore wind turbine concepts are now moving toward a 10-MW reference wind turbine model. The aim of this work is to perform operational analyses on the 10-MW reference wind turbine finite element model using an aeroelastic code in order to obtain long-duration, low-cost simulations. The aeroelastic code allows simulating damage in several ways: by reducing the edgewise/flapwise blade stiffness, by adding lumped masses, or by considering progressive mass addition (e.g., ice on the blades). Damage detection is then performed by means of Operational Modal Analysis (OMA) techniques. Virtual accelerometers are placed in order to simulate real measurements and to estimate the modal parameters. The feasibility of robust damage detection has been assessed on the HAWT model in parked conditions. The situation is much more complicated for operating wind turbines, because the time periodicity of the structure needs to be taken into account. Several algorithms have been implemented and tested in the simulation environment. They are needed in order to carry out a damage detection simulation campaign and develop a feasible real-time damage detection method. In addition to these algorithms, harmonic removal tools are needed in order to remove the harmonics due to rotation.

  13. Fast 2D Fluid-Analytical Simulation of IEDs and Plasma Uniformity in Multi-frequency CCPs

    NASA Astrophysics Data System (ADS)

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-10-01

    A fast 2D axisymmetric fluid-analytical model using the finite element tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency argon capacitively coupled plasmas (CCPs). A bulk fluid plasma model which solves the time-dependent plasma fluid equations is coupled with an analytical sheath model which solves for the sheath parameters. The fluid-analytical results are used as input to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the wafer electrode. Each fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 minutes. The 2D multi-frequency fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel plate discharge, showing good agreement. Fluid-analytical simulations of a 2/60/162 MHz argon CCP with a typical asymmetric reactor geometry were also conducted. The low 2 MHz frequency controlled the sheath width and voltage while the higher frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. Adding 2 MHz power to a 60 MHz discharge, or 162 MHz to a dual-frequency 2 MHz/60 MHz discharge, enhanced the plasma uniformity. This work was supported by the Department of Energy Office of Fusion Energy Science Contract DE-SC000193, and in part by gifts from Lam Research Corporation and Micron Corporation.

  14. Simulating ground water-lake interactions: Approaches and insights

    USGS Publications Warehouse

    Hunt, R.J.; Haitjema, H.M.; Krohelski, J.T.; Feinstein, D.T.

    2003-01-01

    Approaches for modeling lake-ground water interactions have evolved significantly from early simulations that used fixed lake stages specified as constant head to sophisticated LAK packages for MODFLOW. Although model input can be complex, the LAK package capabilities and output are superior to methods that rely on a fixed lake stage, and they compare well to other simple methods where lake stage can be calculated. Regardless of the approach, guidelines presented here for model grid size, location of three-dimensional flow, and extent of vertical capture can facilitate the construction of appropriately detailed models that simulate important lake-ground water interactions without adding unnecessary complexity. In addition to MODFLOW approaches, lake simulation has been formulated in terms of analytic elements. The analytic element lake package had acceptable agreement with a published LAK1 problem, even though there were differences in the total lake conductance and the number of layers used in the two models. The grid size used in the original LAK1 problem, however, violated a grid size guideline presented in this paper. Grid sensitivity analyses demonstrated that an appreciable discrepancy in the distribution of stream and lake flux was related to the large grid size used in the original LAK1 problem. This artifact is expected regardless of the MODFLOW LAK package used. When the grid size was reduced, the finite-difference formulation approached the analytic element results. These insights and guidelines can help ensure that the proper lake simulation tool is being selected and applied.

  15. Prediction of population with Alzheimer's disease in the European Union using a system dynamics model.

    PubMed

    Tomaskova, Hana; Kuhnova, Jitka; Cimler, Richard; Dolezal, Ondrej; Kuca, Kamil

    2016-01-01

    Alzheimer's disease (AD) is a slowly progressing neurodegenerative brain disease with irreversible brain effects; it is the most common cause of dementia. With increasing age, the probability of suffering from AD increases. In this research, population growth of the European Union (EU) until the year 2080 and the number of patients with AD are modeled. The aim of this research is to predict the spread of AD in the EU population until the year 2080 using a computer simulation. For the simulation of the EU population and the occurrence of AD in this population, a system dynamics modeling approach has been used. System dynamics is a useful and effective method for the investigation of complex social systems, and over the past decades its applicability has been demonstrated in a wide variety of applications. In this research, the method has been used to investigate the growth of the EU population and predict the number of patients with AD. The model has been calibrated on the population prediction data created by Eurostat. Based on data from Eurostat, the EU population until the year 2080 has been modeled. In 2013, the population of the EU was 508 million and the number of patients with AD was 7.5 million. Based on the prediction, in 2040 the population of the EU will be 524 million and the number of patients with AD will be 13.1 million. By the year 2080, the EU population will be 520 million and the number of patients with AD will be 13.7 million. A system dynamics modeling approach has thus been used for the prediction of the number of patients with AD in the EU population until the year 2080. These results can be used to determine the economic burden of treating these patients. With different input data, the simulation can also be applied to other regions, as well as to predictions for other noncommunicable diseases.
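
    The stock-and-flow logic behind such a system dynamics model can be sketched in a few lines. The rates below are illustrative placeholders, not the Eurostat-calibrated values used in the study; the point is only to show how stocks (healthy population, AD patients) are advanced by integrating flows (births, deaths, new cases).

    ```python
    # Toy stock-and-flow model integrated with explicit Euler steps.
    def simulate(years=67, dt=0.25):
        healthy, ad = 500.5e6, 7.5e6            # 2013-like EU stocks (persons)
        birth_rate, death_rate = 0.010, 0.010   # per person per year (assumed)
        incidence, ad_mortality = 0.0004, 0.10  # flows into/out of AD (assumed)
        t = 0.0
        while t < years:
            births = birth_rate * (healthy + ad)
            new_cases = incidence * healthy
            healthy += dt * (births - death_rate * healthy - new_cases)
            ad += dt * (new_cases - ad_mortality * ad)
            t += dt
        return healthy + ad, ad

    total, ad = simulate()
    print(f"population {total/1e6:.0f} M, AD patients {ad/1e6:.1f} M")
    ```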

  16. Assessment of cognition in early dementia

    PubMed Central

    Silverberg, Nina B.; Ryan, Laurie M.; Carrillo, Maria C.; Sperling, Reisa; Petersen, Ronald C.; Posner, Holly B.; Snyder, Peter J.; Hilsabeck, Robin; Gallagher, Michela; Raber, Jacob; Rizzo, Albert; Possin, Katherine; King, Jonathan; Kaye, Jeffrey; Ott, Brian R.; Albert, Marilyn S.; Wagster, Molly V.; Schinka, John A.; Cullum, C. Munro; Farias, Sarah T.; Balota, David; Rao, Stephen; Loewenstein, David; Budson, Andrew E.; Brandt, Jason; Manly, Jennifer J.; Barnes, Lisa; Strutt, Adriana; Gollan, Tamar H.; Ganguli, Mary; Babcock, Debra; Litvan, Irene; Kramer, Joel H.; Ferman, Tanis J.

    2012-01-01

    Better tools for assessing cognitive impairment in the early stages of Alzheimer's disease (AD) are required to enable diagnosis of the disease before substantial neurodegeneration has taken place and to allow detection of subtle changes in the early stages of disease progression. The National Institute on Aging and the Alzheimer's Association convened a meeting to discuss state-of-the-art methods for cognitive assessment, including computerized batteries, as well as new approaches in the pipeline. Speakers described research using novel tests of object recognition, spatial navigation, attentional control, semantic memory, semantic interference, prospective memory, false memory, and executive function as among the tools that could provide earlier identification of individuals with AD. In addition to early detection, there is a need for assessments that reflect real-world situations in order to better assess functional disability. It is especially important to develop assessment tools that are useful in ethnically, culturally, and linguistically diverse populations, as well as in individuals with neurodegenerative diseases other than AD. PMID:23559893

  17. Seeing Is Believing: The Strategy behind Campaign Imagery and Its Impact on Voters

    ERIC Educational Resources Information Center

    Swigger, Nathaniel

    2009-01-01

    As television ads have become the primary tool of communication in American campaigns, research on campaign effects has focused more and more attention on how these ads influence the electorate. Little attention has been paid, however, to the visual content of these ads. Despite a format that delivers an enormous quantity of visual information,…

  18. Geophysical data analysis and visualization using the Grid Analysis and Display System

    NASA Technical Reports Server (NTRS)

    Doty, Brian E.; Kinter, James L., III

    1995-01-01

    Several problems posed by the rapidly growing volume of geophysical data are described, and a selected set of existing solutions to these problems is outlined. A recently developed desktop software tool called the Grid Analysis and Display System (GrADS) is presented. The GrADS user interface is a natural extension of the standard procedures scientists apply to their geophysical data analysis problems. The basic GrADS operations have defaults that naturally map to data analysis actions, and there is a programmable interface for customizing data access and manipulation. The fundamental concept of the GrADS dimension environment, which defines both the space in which the geophysical data reside and the 'slice' of data being analyzed at a given time, is explained. The GrADS data storage and access model is described. An argument is made in favor of describable data formats rather than standard data formats. The manner in which GrADS users may perform operations on their data and display the results is also described. It is argued that two-dimensional graphics provides a powerful quantitative data analysis tool whose value is underestimated in the current development environment, which emphasizes three-dimensional structure modeling.

  19. The CERAD Neuropsychological Battery in Patients with Frontotemporal Lobar Degeneration

    PubMed Central

    Haanpää, Ramona M.; Suhonen, Noora-Maria; Hartikainen, Päivi; Koivisto, Anne M.; Moilanen, Virpi; Herukka, Sanna-Kaisa; Hänninen, Tuomo; Remes, Anne M.

    2015-01-01

    Background/Aims The diagnosis of frontotemporal lobar degeneration (FTLD) is based on neuropsychological examination in addition to clinical symptoms and brain imaging. There is no simple, validated, cognitive tool available in screening for FTLD. The Consortium to Establish a Registry for Alzheimer's Disease neuropsychological battery (CERAD-NB) was originally devised to identify the early cognitive changes related to Alzheimer's disease (AD). Our aim was to investigate the utility of the CERAD-NB in FTLD. Methods Patients with FTLD (n = 95) and AD (n = 90) were assessed with the CERAD-NB, Trail Making Test parts A and B and single-letter Phonemic Fluency. Results FTLD patients were more severely impaired in the Verbal Fluency subtest in the CERAD-NB and Trail Making Test part A compared to AD patients. In addition, AD patients were more impaired in memory subtests compared to FTLD patients. Conclusion The CERAD-NB may be a useful tool in screening for FTLD. Impaired performance in Verbal Fluency with moderately well-preserved Delayed Recall and Memory Tests may help in identifying patients with probable FTLD and discriminating FTLD from AD. Adding the Trail Making Test to the battery might enhance its value as a screening instrument for FTLD. PMID:25999981

  20. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable for assessing individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS ascent tool (POST2), and a new tool that optimizes the full problem by operating both simulations simultaneously was created.

  1. Guidelines for Finite Element Modeling of Acoustic Radiation Force-Induced Shear Wave Propagation in Tissue-Mimicking Media

    PubMed Central

    Palmeri, Mark L.; Qiang, Bo; Chen, Shigao; Urban, Matthew W.

    2017-01-01

    Ultrasound shear wave elastography is emerging as an important imaging modality for evaluating tissue material properties. In its practice, some systematic biases have been associated with ultrasound frequencies, focal depths and configuration, transducer types (linear versus curvilinear), along with displacement estimation and shear wave speed estimation algorithms. Added to that, soft tissues are not purely elastic, so shear waves will travel at different speeds depending on their spectral content, which can be modulated by the acoustic radiation force excitation focusing, duration and the frequency-dependent stiffness of the tissue. To understand how these different acquisition and material property parameters may affect measurements of shear wave velocity, simulations of the propagation of shear waves generated by acoustic radiation force excitations in viscoelastic media are a very important tool. This article serves to provide an in-depth description of how these simulations are performed. The general scheme is broken into three components: (1) simulation of the three-dimensional acoustic radiation force push beam, (2) applying that force distribution to a finite element model, and (3) extraction of the motion data for post-processing. All three components will be described in detail and combined to create a simulation platform that is powerful for developing and testing algorithms for academic and industrial researchers involved in making quantitative shear wave-based measurements of tissue material properties. PMID:28026760

  2. A permeation theory for single-file ion channels: one- and two-step models.

    PubMed

    Nelson, Peter Hugo

    2011-04-28

    How many steps are required to model permeation through ion channels? This question is investigated by comparing one- and two-step models of permeation with experiment and MD simulation for the first time. In recent MD simulations, the observed permeation mechanism was identified as resembling a Hodgkin and Keynes knock-on mechanism with one voltage-dependent rate-determining step [Jensen et al., PNAS 107, 5833 (2010)]. These previously published simulation data are fitted to a one-step knock-on model that successfully explains the highly non-Ohmic current-voltage curve observed in the simulation. However, these predictions (and the simulations upon which they are based) are not representative of real channel behavior, which is typically Ohmic at low voltages. A two-step association/dissociation (A/D) model is then compared with experiment for the first time. This two-parameter model is shown to be remarkably consistent with previously published permeation experiments through the MaxiK potassium channel over a wide range of concentrations and positive voltages. The A/D model also provides a first-order explanation of permeation through the Shaker potassium channel, but it does not explain the asymmetry observed experimentally. To address this, a new asymmetric variant of the A/D model is developed using the present theoretical framework. It includes a third parameter that represents the value of the "permeation coordinate" (fractional electric potential energy) corresponding to the triply occupied state n of the channel. This asymmetric A/D model is fitted to published permeation data through the Shaker potassium channel at physiological concentrations, and it successfully predicts qualitative changes in the negative current-voltage data (including a transition to super-Ohmic behavior) based solely on a fit to positive-voltage data (that appear linear). The A/D model appears to be qualitatively consistent with a large group of published MD simulations, but no quantitative comparison has yet been made. The A/D model makes a network of predictions for how the elementary steps and the channel occupancy vary with both concentration and voltage. In addition, the proposed theoretical framework suggests a new way of plotting the energetics of the simulated system using a one-dimensional permeation coordinate that uses electric potential energy as a metric for the net fractional progress through the permeation mechanism. This approach has the potential to provide a quantitative connection between atomistic simulations and permeation experiments for the first time.

  3. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation, one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.

  4. The value of SPaCE in delivering patient feedback.

    PubMed

    Clapham, Laura; Allan, Laura; Stirling, Kevin

    2016-02-01

    The use of simulated patients (SPs) within undergraduate medical curricula is an established and valued learning opportunity. Within the context of simulation, it is imperative to capture feedback from all participants in the simulation activity. The Simulated Patient Candidate Evaluation (SPaCE) tool was developed to deliver SP feedback following a simulation activity. SPaCE is a closed feedback tool that allows SPs to rate a student's performance, using a five-point Likert scale, in three domains: attitude; interaction skills; and management. This research study examined the value of the SPaCE tool and how it contributes to the overall feedback that a student receives. Classical test theory was used to determine the reliability of the SPaCE tool. An evaluation of all SP responses was conducted to observe trends in scoring patterns for each question. Qualitative data were collected via a free-text questionnaire and subsequent focus group discussion. Classical test theory determined that the SPaCE tool had a reliability coefficient of 0.89. A total of 13 SPs replied to the questionnaire. A thematic analysis of all questionnaire data identified that the SPaCE tool provides a structure that allows patient feedback to be given effectively following a simulation activity. These themes were discussed further with six SPs who attended the subsequent focus group session. The SPaCE tool has been shown to be a reliable closed feedback tool that allows SPs to discriminate between students based on their performance. The next stage in the development of the SPaCE tool is to test the wider applicability of this feedback tool. © 2015 John Wiley & Sons Ltd.
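
    The abstract does not state which classical-test-theory coefficient yielded the 0.89 figure; Cronbach's alpha is a common choice for multi-item Likert instruments and is sketched here purely as an illustration of how such a reliability coefficient is computed.

    ```python
    # Cronbach's alpha: rows = rated encounters, columns = instrument items.
    import numpy as np

    def cronbach_alpha(scores):
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                       # number of items
        item_vars = scores.var(axis=0, ddof=1)    # per-item variance
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Hypothetical five-point Likert ratings across three domains.
    ratings = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
    print(round(cronbach_alpha(ratings), 2))
    ```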

  5. Characterization of Alzheimer's Protective and Causative Amyloid-beta Variants Using a Combination of Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Das, Payel; Chakraborty, Srirupa; Chacko, Anita; Murray, Brian; Belfort, Georges

    The aggregation of amyloid-beta (Aβ) peptides plays a crucial role in the etiology of Alzheimer's disease (AD). Recently, it has been reported that an A2T mutation in Aβ can protect from AD. Interestingly, an A2V mutation has also been found to offer protection against AD in the heterozygous state. Structural characterization of these natural Aβ variants thus offers an intriguing approach to understanding the molecular mechanism of AD. Toward this goal, we have characterized the conformational landscapes of the intrinsically disordered WT, A2V, and A2T Aβ1-42 variant monomers and dimers by using extensive atomistic molecular dynamics (MD) simulations. The simulations reveal markedly different secondary and tertiary structure at the central and C-terminal hydrophobic regions of the peptide, which play a crucial role in Aβ aggregation and related toxicity. For example, an enhanced double β-hairpin formation was observed in the A2V monomer. In contrast, the A2T mutation enhances disorder of the conformational ensemble due to the formation of atypical long-range interactions. These structural insights obtained from simulations allow understanding of the differential aggregation, oligomer morphology, and LTP inhibition of the variants observed in the experiments and offer a path toward designing and testing aggregation inhibitors.

  6. New Tooling System for Forming Aluminum Beverage Can End Shell

    NASA Astrophysics Data System (ADS)

    Yamazaki, Koetsu; Otsuka, Takayasu; Han, Jing; Hasegawa, Takashi; Shirasawa, Taketo

    2011-08-01

    This paper proposes a new tooling system for forming shells of aluminum beverage can ends. First, the forming process of a conventional tooling system was simulated using three-dimensional finite element models. The simulation results were confirmed to be consistent with those of axisymmetric models, so simulations for further study were performed using axisymmetric models to save computational time. A comparison shows that thinning of the shell formed by the proposed tooling system is improved by about 3.6%. The influences of the tool uppermost surface profiles and tool initial positions in the new tooling system were investigated, and a design optimization method based on the numerical simulations was then applied to search for optimum design points, in order to minimize thinning subject to constraints on the geometrical dimensions of the shell. Finally, the performance of the shell subjected to internal pressure was confirmed to meet design requirements.

  7. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  8. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
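
    As a flavor of what tool-building against the library looks like, here is a minimal sketch using the PyLOOS bindings mentioned above. The calls (loos.createSystem, loos.selectAtoms, loos.pyloos.Trajectory) reflect my understanding of the PyLOOS interface; consult the LOOS documentation for the authoritative API.

    ```python
    # Sketch of a tiny PyLOOS analysis tool (API names are assumptions).
    import loos
    import loos.pyloos

    model = loos.createSystem("system.psf")            # any supported model format
    calphas = loos.selectAtoms(model, 'name == "CA"')  # C-style selection syntax
    traj = loos.pyloos.Trajectory("sim.dcd", model)    # frames update model coords

    for frame in traj:
        # calphas shares atoms with model, so its coordinates track each frame
        print(calphas.centroid())
    ```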

  9. Adoption of Test Driven Development and Continuous Integration for the Development of the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John M.

    2013-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high-fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user-supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint/restore, data recording, interactive variable manipulation (variable server), and an input processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable. The new approach was implemented while rewriting the Trick memory management component, to eliminate a fundamental design flaw. The benefits became obvious almost immediately, not just in the correctness of the individual functions and classes but also in the correctness and flexibility being added to the overall design. Creating code to be testable, and testing it as it was created, resulted not only in better working code, but also in better-organized, flexible, and readable (i.e., articulate) code. This was, in essence, the test-driven development (TDD) methodology created by Kent Beck. Seeing the benefits of Test Driven Development, other Trick components were refactored to make them more testable, and tests were designed and implemented for them.

  10. A study with ESI PAM-STAMP® on the influence of tool deformation on final part quality during a forming process

    NASA Astrophysics Data System (ADS)

    Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David

    2018-05-01

    As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result, the tool deformations can affect the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of the tool deformations on the final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped onto the die structure and a static analysis is performed to check the deflections of the tool. This method, however, does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently so that tool deformations can be included in the drawing simulation in real time, without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented that demonstrates the effect of tool deformations on the final part quality.

  11. Simulation of turbulent separated flows using a novel, evolution-based, eddy-viscosity formulation

    NASA Astrophysics Data System (ADS)

    Castellucci, Paul

    Currently, there exists a lack of confidence in the computational simulation of turbulent separated flows at large Reynolds numbers. The most accurate methods available are too computationally costly to use in engineering applications. Thus, inexpensive models, developed using the Reynolds-averaged Navier-Stokes (RANS) equations, are often extended beyond their applicability. Although these methods will often reproduce integrated quantities within engineering tolerances, such metrics are often insensitive to details within a separated wake, and therefore, poor indicators of simulation fidelity. Using concepts borrowed from large-eddy simulation (LES), a two-equation RANS model is modified to simulate the turbulent wake behind a circular cylinder. This modification involves the computation of one additional scalar field, adding very little to the overall computational cost. When properly inserted into the baseline RANS model, this modification mimics LES in the separated wake, yet reverts to the unmodified form at the cylinder surface. In this manner, superior predictive capability may be achieved without the additional cost of fine spatial resolution associated with LES near solid boundaries. Simulations using modified and baseline RANS models are benchmarked against both LES and experimental data for a circular cylinder wake at Reynolds number 3900. In addition, the computational tool used in this investigation is subject to verification via the Method of Manufactured Solutions. Post-processing of the resultant flow fields includes both mean value and triple-decomposition analysis. These results reveal substantial improvements using the modified system and appear to drive the baseline wake solution toward that of LES, as intended.

  12. ADDING GLOBAL SOILS DATA TO THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL (AGWA)

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA) is a GIS-based hydrologic modeling tool that is available as an extension for ArcView 3.x from the USDA-ARS Southwest Watershed Research Center (www.tucson.ars.ag.gov/agwa). AGWA is designed to facilitate the assessment of...

  13. SLTCAP: A Simple Method for Calculating the Number of Ions Needed for MD Simulation.

    PubMed

    Schmit, Jeremy D; Kariyawasam, Nilusha L; Needham, Vince; Smith, Paul E

    2018-04-10

    An accurate depiction of electrostatic interactions in molecular dynamics requires the correct number of ions in the simulation box to capture screening effects. However, the number of ions that should be added to the box is seldom given by the bulk salt concentration because a charged biomolecule solute will perturb the local solvent environment. We present a simple method for calculating the number of ions that requires only the total solute charge, solvent volume, and bulk salt concentration as inputs. We show that the most commonly used method for adding salt to a simulation results in an effective salt concentration that is too high. These findings are confirmed using simulations of lysozyme. We have established a web server where these calculations can be readily performed to aid simulation setup.
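
    A sketch of the calculation as I read the method: assume the solvent sits at a single mean electrostatic potential, weight the ion populations by Boltzmann factors, and impose overall charge neutrality; the mean potential then satisfies sinh(psi) = Q/(2*N0). The helper below is an independent reimplementation of that reading (with invented example inputs), not the authors' web server code.

    ```python
    # Ion counts for a 1:1 salt from solute charge, solvent volume, and
    # bulk concentration, under the mean-potential/neutrality assumptions above.
    import math

    def ion_counts(solute_charge_e, solvent_volume_L, conc_M):
        """Return (n_cations, n_anions)."""
        N_A = 6.02214076e23
        n0 = conc_M * solvent_volume_L * N_A      # bulk ion pairs in the box
        psi = math.asinh(solute_charge_e / (2.0 * n0))
        n_plus = n0 * math.exp(-psi)              # counterions enriched/depleted
        n_minus = n0 * math.exp(+psi)             # so that the box is neutral
        return round(n_plus), round(n_minus)

    # e.g. a +8 e solute in 3.0e-22 L of solvent at 150 mM salt (assumed values)
    print(ion_counts(+8, 3.0e-22, 0.150))
    ```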

  14. An optimized treatment for algorithmic differentiation of an important glaciological fixed-point problem

    DOE PAGES

    Goldberg, Daniel N.; Narayanan, Sri Hari Krishna; Hascoet, Laurent; ...

    2016-05-20

    We apply an optimized method to the adjoint generation of a time-evolving land ice model through algorithmic differentiation (AD). The optimization involves a special treatment of the fixed-point iteration required to solve the nonlinear stress balance, which differs from a straightforward application of AD software, and leads to smaller memory requirements and in some cases shorter computation times of the adjoint. The optimization is done via implementation of the algorithm of Christianson (1994) for reverse accumulation of fixed-point problems, with the AD tool OpenAD. For test problems, the optimized adjoint is shown to have far lower memory requirements, potentially enabling larger problem sizes on memory-limited machines. In the case of the land ice model, implementation of the algorithm allows further optimization by having the adjoint model solve a sequence of linear systems with identical (as opposed to varying) matrices, greatly improving performance. Finally, the methods introduced here will be of value to other efforts applying AD tools to ice models, particularly ones which solve a hybrid shallow ice/shallow shelf approximation to the Stokes equations.
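
    The core idea of reverse accumulation for fixed points can be shown on a toy problem: instead of differentiating through every forward iteration (which is what makes naive AD memory-hungry), one iterates a separate adjoint fixed-point equation. The snippet below is a deliberately simple linear example, not the OpenAD/land-ice implementation.

    ```python
    # For x* = f(x*, theta), iterate xbar = (df/dx)^T xbar + dJ/dx, then
    # form the gradient via (df/dtheta)^T xbar (identity here).
    import numpy as np

    A = np.array([[0.5, 0.1], [0.0, 0.4]])    # df/dx, spectral radius < 1
    def f(x, theta):                           # contractive fixed-point map
        return A @ x + theta

    theta = np.array([1.0, 2.0])
    x = np.zeros(2)
    for _ in range(200):                       # forward fixed-point iteration
        x = f(x, theta)

    g = 2 * x                                  # dJ/dx for J(x) = x.x
    xbar = np.zeros(2)
    for _ in range(200):                       # adjoint fixed-point iteration
        xbar = A.T @ xbar + g
    grad_theta = xbar                          # df/dtheta is the identity here

    # check against the exact gradient of J(theta) = |(I-A)^-1 theta|^2
    exact = 2 * np.linalg.inv(np.eye(2) - A).T @ x
    print(grad_theta, exact)
    ```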

  15. ADS-B within a Multi-Aircraft Simulation for Distributed Air-Ground Traffic Management

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Palmer, Michael T.; Chung, William W.; Loveness, Ghyrn W.

    2004-01-01

    Automatic Dependent Surveillance Broadcast (ADS-B) is an enabling technology for NASA's Distributed Air-Ground Traffic Management (DAG-TM) concept. DAG-TM has the goal of significantly increasing capacity within the National Airspace System, while maintaining or improving safety. Under DAG-TM, aircraft exchange state and intent information over ADS-B with other aircraft and ground stations. This information supports various surveillance functions including conflict detection and resolution, scheduling, and conformance monitoring. To conduct more rigorous concept feasibility studies, NASA Langley Research Center's PC-based Air Traffic Operations Simulation models a 1090 MHz ADS-B communication structure, based on industry standards for message content, range, and reception probability. The current ADS-B model reflects a mature operating environment and message interference effects are limited to Mode S transponder replies and ADS-B squitters. This model was recently evaluated in a Joint DAG-TM Air/Ground Coordination Experiment with NASA Ames Research Center. Message probability of reception vs. range was lower at higher traffic levels. The highest message collision probability occurred near the meter fix serving as the confluence for two arrival streams. Even the highest traffic level encountered in the experiment was significantly less than the industry standard "LA Basin 2020" scenario. Future studies will account for Mode A and C message interference (a major effect in several industry studies) and will include Mode A and C aircraft in the simulation, thereby increasing the total traffic level. These changes will support ongoing enhancements to separation assurance functions that focus on accommodating longer ADS-B information update intervals.
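
    The interference effect driving the reduced reception probability can be approximated with a textbook unslotted-ALOHA collision model, sketched below. The per-aircraft message rate and the 120-microsecond message length are assumed round numbers, and the model ignores the range, antenna, and decoder effects that a full fidelity model like ATOS captures.

    ```python
    # Probability that a 1090 MHz message escapes overlap, treating all
    # transmissions as Poisson arrivals (vulnerable window = 2 * message length).
    import math

    def p_no_collision(n_aircraft, msgs_per_sec_each=6.0, msg_len_s=120e-6):
        lam = n_aircraft * msgs_per_sec_each      # aggregate channel rate (assumed)
        return math.exp(-2.0 * lam * msg_len_s)

    for n in (50, 200, 800):                      # reception drops as traffic grows
        print(n, round(p_no_collision(n), 3))
    ```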

  16. Data and Tools | Hydrogen and Fuel Cells | NREL

    Science.gov Websites

    NREL collection of data and tools for researchers, developers, investors, and others interested in the viability, analysis, and development of hydrogen and fuel cell technologies, covering topics such as energy use and emissions. Includes the Alternative Fuels Data Center tools collection (tools and calculators that help decision-makers reduce petroleum use) and FASTSim, the Future Automotive Systems Technology Simulator, a simulation tool that

  17. treeman: an R package for efficient and intuitive manipulation of phylogenetic trees.

    PubMed

    Bennett, Dominic J; Sutton, Mark D; Turvey, Samuel T

    2017-01-07

    Phylogenetic trees are hierarchical structures used for representing the inter-relationships between biological entities. They are the most common tool for representing evolution and are essential to a range of fields across the life sciences. The manipulation of phylogenetic trees, in terms of adding or removing tips, is often performed by researchers not just for reasons of management but also for performing simulations in order to understand the processes of evolution. Despite this, the most common programming language among biologists, R, has few class structures well suited to these tasks. We present an R package that contains a new class, called TreeMan, for representing the phylogenetic tree. This class has a list structure allowing phylogenetic trees to be manipulated more efficiently. Computational running times are reduced because of the ready ability to vectorise and parallelise methods. Development is also improved due to fewer lines of code being required for performing manipulation processes. We present three use cases (pinning missing taxa to a supertree, simulating evolution with a tree-growth model, and detecting significant phylogenetic turnover) that demonstrate the new package's speed and simplicity.

  18. Mechanical Abuse Simulation and Thermal Runaway Risks of Large-Format Li-ion Batteries

    DOE PAGES

    Wang, Hsin; Lara-Curzio, Edgar; Rule, Evan; ...

    2017-01-11

    Internal short circuit of large-format Li-ion cells induced by mechanical abuse was simulated using a modified mechanical pinch test. A torsion force was added manually at ~40% maximum compressive loading force during the pinch test. The cell was twisted about 5 degrees to the side by horizontally pulling a wire attached to the anode tab. The combined torsion-compression force created a small enough failure at the separator and allowed testing of fully charged large-format Li-ion cells without triggering thermal runaway. Two types of commercial cells were tested using 4-6 cells at each state-of-charge (SOC). The 18 Ah LiFePO4 (LFP) and 25 Ah Li(NiMnCo)1/3O2 (NMC) cells were tested, and the thermal runaway risk (TRR) score system was used to evaluate the safety risk of the cells under the same testing conditions. The aim is to provide cell manufacturers and end users a tool to compare different designs and safety features.

  19. Mechanical Abuse Simulation and Thermal Runaway Risks of Large-Format Li-ion Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hsin; Lara-Curzio, Edgar; Rule, Evan

    Internal short circuit of large-format Li-ion cells induced by mechanical abuse was simulated using a modified mechanical pinch test. A torsion force was added manually at ~40% maximum compressive loading force during the pinch test. The cell was twisted about 5 degrees to the side by horizontally pulling a wire attached to the anode tab. The combined torsion-compression force created a small enough failure at the separator and allowed testing of fully charged large-format Li-ion cells without triggering thermal runaway. Two types of commercial cells were tested using 4-6 cells at each state-of-charge (SOC). The 18 Ah LiFePO4 (LFP) and 25 Ah Li(NiMnCo)1/3O2 (NMC) cells were tested, and the thermal runaway risk (TRR) score system was used to evaluate the safety risk of the cells under the same testing conditions. The aim is to provide cell manufacturers and end users a tool to compare different designs and safety features.

  20. Experimental identification of a comb-shaped chaotic region in multiple parameter spaces simulated by the Hindmarsh-Rose neuron model

    NASA Astrophysics Data System (ADS)

    Jia, Bing

    2014-03-01

    A comb-shaped chaotic region has been simulated in multiple two-dimensional parameter spaces using the Hindmarsh-Rose (HR) neuron model in many recent studies, which can interpret almost all of the previously simulated bifurcation processes with chaos in neural firing patterns. In the present paper, a comb-shaped chaotic region in a two-dimensional parameter space was reproduced, presenting different processes of period-adding bifurcations with chaos when changing one parameter while fixing the other at different levels. In the biological experiments, different period-adding bifurcation scenarios with chaos induced by decreasing the extracellular calcium concentration were observed from some neural pacemakers at different levels of extracellular 4-aminopyridine concentration, and from other pacemakers at different levels of extracellular caesium concentration. By using nonlinear time series analysis methods, the deterministic dynamics of the experimental chaotic firings were investigated. The period-adding bifurcations with chaos observed in the experiments resembled those simulated in the comb-shaped chaotic region using the HR model. The experimental results show that period-adding bifurcations with chaos are preserved in different two-dimensional parameter spaces, which provides evidence of the existence of the comb-shaped chaotic region and a demonstration of the simulation results in different two-dimensional parameter spaces in the HR neuron model. The results also reveal relationships between different firing patterns in two-dimensional parameter spaces.
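
    For readers wanting to reproduce such bifurcation scenarios, the HR equations are easy to integrate directly; sweeping one parameter while holding the other fixed is then a matter of looping. The sketch below uses the commonly quoted parameter set (a=1, b=3, c=1, d=5, s=4, x_R=-1.6, r=0.006), which may differ from the values used in the cited studies.

    ```python
    # Integrate the Hindmarsh-Rose model and count spikes while sweeping I.
    import numpy as np
    from scipy.integrate import solve_ivp

    def hr(t, u, I, r=0.006):
        x, y, z = u
        dx = y + 3*x**2 - x**3 - z + I     # membrane potential
        dy = 1 - 5*x**2 - y                # fast recovery variable
        dz = r * (4*(x + 1.6) - z)         # slow adaptation variable
        return [dx, dy, dz]

    for I in (1.5, 2.5, 3.0):              # sweep one bifurcation parameter
        sol = solve_ivp(hr, (0, 2000), [-1.6, 0.0, 0.0], args=(I,), max_step=0.1)
        x = sol.y[0][sol.t > 1000]         # discard the transient
        spikes = np.sum((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]) & (x[1:-1] > 0))
        print(f"I={I}: ~{spikes} spikes in window")
    ```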

  1. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    EPA Science Inventory

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  2. StirMark Benchmark: audio watermarking attacks based on lossy compression

    NASA Astrophysics Data System (ADS)

    Steinebach, Martin; Lang, Andreas; Dittmann, Jana

    2002-04-01

    StirMark Benchmark is a well-known evaluation tool for watermarking robustness, and additional attacks are added to it continuously. To enable application-based evaluation, in our paper we address attacks against audio watermarks based on lossy audio compression algorithms to be included in the test environment. We discuss the effect of different lossy compression algorithms like MPEG-2 Audio Layer 3, Ogg or VQF on a selection of audio test data. Our focus is on changes regarding the basic characteristics of the audio data, like spectrum or average power, and on removal of embedded watermarks. Furthermore, we compare results of different watermarking algorithms and show that lossy compression is still a challenge for most of them. There are two strategies for adding evaluation of robustness against lossy compression to StirMark Benchmark: (a) use of existing free compression algorithms; (b) implementation of a generic lossy compression simulation. We discuss how such a model can be implemented based on the results of our tests. This method is less complex, as no real psychoacoustic model has to be applied. Our model can be used for audio watermarking evaluation in numerous application fields. As an example, we describe its importance for e-commerce applications with watermarking security.
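
    One way to realize such a generic lossy-compression simulation is to discard weak spectral coefficients block by block, mimicking a codec's quantization without any psychoacoustic model. The sketch below is one possible realization under that assumption, not the StirMark Benchmark implementation.

    ```python
    # Keep only the strongest 10% of spectral coefficients in each block.
    import numpy as np

    def lossy_sim(signal, block=1024, keep=0.10):
        out = np.empty_like(signal, dtype=float)
        for i in range(0, len(signal) - block + 1, block):
            spec = np.fft.rfft(signal[i:i + block])
            thresh = np.quantile(np.abs(spec), 1.0 - keep)
            spec[np.abs(spec) < thresh] = 0.0         # drop weak coefficients
            out[i:i + block] = np.fft.irfft(spec, n=block)
        return out

    t = np.linspace(0, 1, 8192, endpoint=False)
    audio = np.sin(2*np.pi*440*t) + 0.05*np.random.default_rng(1).standard_normal(t.size)
    degraded = lossy_sim(audio)    # feed this to a watermark detector under test
    ```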

  3. Temperature-Dependent Conformations of Model Viscosity Index Improvers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramasamy, Uma Shantini; Cosimbescu, Lelia; Martini, Ashlie

    2015-05-01

    Lubricants are composed of base oils and additives, where additives are chemicals deliberately added to the oil to enhance properties and inhibit degradation of the base oils. Viscosity index (VI) improvers are an important class of additives that reduce the decline of fluid viscosity with temperature [1], enabling optimum lubricant performance over a wider range of operating temperatures. These additives are typically high-molecular-weight polymers, such as, but not limited to, polyisobutylenes, olefin copolymers, and polyalkylmethacrylates, that are added in concentrations of 2-5% (w/w). Appropriate polymers, when dissolved in base oil, expand from a coiled to an uncoiled state with increasing temperature [2]. The ability of VI additives to increase their molar volume and improve the temperature-viscosity dependence of lubricants suggests there is a strong relationship between molecular structure and additive functionality [3]. In this work, we aim to quantify the changes in polymer size with temperature for four polyisobutylene (PIB) based molecular structures at the nano-scale using molecular simulation tools. As expected, the results show that the polymers adopt more conformations at higher temperatures, and there is a clear indication that the expandability of a polymer is strongly influenced by molecular structure.
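
    The standard metric for quantifying this kind of coil expansion in simulation snapshots is the mass-weighted radius of gyration; the abstract does not specify the exact analysis used, so the sketch below is offered only as the generic calculation.

    ```python
    # Mass-weighted radius of gyration of one simulation snapshot.
    import numpy as np

    def radius_of_gyration(coords, masses):
        coords, masses = np.asarray(coords), np.asarray(masses)
        com = np.average(coords, axis=0, weights=masses)       # center of mass
        sq = np.sum((coords - com)**2, axis=1)                 # squared distances
        return np.sqrt(np.average(sq, weights=masses))

    # Toy snapshot: 100 pseudo-atoms of a PIB-like chain as a random walk.
    rng = np.random.default_rng(2)
    chain = np.cumsum(rng.normal(scale=1.5, size=(100, 3)), axis=0)
    print(radius_of_gyration(chain, np.ones(100)))
    ```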

  4. HEFCE's People Management Self-Assessment Tool: Ticking Boxes or Adding Value? A Case Study

    ERIC Educational Resources Information Center

    McDonald, Claire

    2009-01-01

    This article examines one specific organisational development tool in depth and uses a case study to investigate whether using the tool is more than a tick-box exercise and really can add value and help organisations to develop and improve. The People Management Self-Assessment Tool (SAT) is used to examine higher education institutions' (HEIs)…

  5. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  6. SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration

    ERIC Educational Resources Information Center

    Han, Kyung T.

    2012-01-01

    Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…

  7. Recent Developments in the Code RITRACKS (Relativistic Ion Tracks)

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Ponomarev, Artem L.; Blattnig, Steve R.

    2018-01-01

    The code RITRACKS (Relativistic Ion Tracks) was developed to simulate detailed stochastic radiation track structures of ions of different types and energies. Many new capabilities have been added to the code in recent years. Several options were added to specify the times at which the tracks appear in the irradiated volume, allowing the simulation of dose-rate effects. The code has been used to simulate energy deposition in several targets: spherical, ellipsoidal, and cylindrical. More recently, density changes as well as a spherical shell were implemented for spherical targets, in order to simulate energy deposition in walled tissue-equivalent proportional counters. RITRACKS is used as a part of the new program BDSTracks (Biological Damage by Stochastic Tracks) to simulate several types of chromosome aberrations in various irradiation conditions. The simulation of damage to various DNA structures (linear and chromatin fiber) by direct and indirect effects has been improved and is ongoing. Many improvements were also made to the graphical user interface (GUI), including the addition of several labels allowing changes of units. A new GUI has been added to display the electron ejection vectors. The parallel calculation capabilities, notably the pre- and post-simulation processing on Windows and Linux machines, have been reviewed to make them more portable between different systems. The calculation part is currently maintained in an Atlassian Stash® repository for code tracking and possibly future collaboration.

  8. Review of Real-Time Simulator and the Steps Involved for Implementation of a Model from MATLAB/SIMULINK to Real-Time

    NASA Astrophysics Data System (ADS)

    Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi

    2015-06-01

    Nowadays researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid-twentieth century, and their evolution has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously too cost-prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed", and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGAs. In this article, the steps involved in implementing a model from MATLAB/SIMULINK in real time are described in detail.

  9. A new approach for developing adjoint models

    NASA Astrophysics Data System (ADS)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

    Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and supplies callbacks to compute the action of these operators. The library, called libadjoint, is then capable of symbolically manipulating the forward annotation to automatically assemble the adjoint equations. Libadjoint is open source, and is explicitly designed to be bolted-on to an existing discrete model. It can be applied to any discretisation, steady or time-dependent problems, and both linear and nonlinear systems. Using libadjoint has several advantages. It requires the application of an AD tool only to small pieces of code, making the use of AD far more tractable. As libadjoint derives the adjoint equations, the expertise required to develop an adjoint model is greatly diminished. One major advantage of this approach is that the model developer is freed from implementing complex checkpointing strategies for the adjoint model: libadjoint has sufficient information about the forward model to re-play the entire forward solve when necessary, and thus the checkpointing algorithm can be implemented entirely within the library itself. Examples are shown using the Fluidity/ICOM framework, a complex ocean model under development at Imperial College London.
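
    The elementary manipulation that such a library automates can be seen on a single linear solve: if the forward model computes x from A x = b and we want the gradient of a functional J(x) with respect to b, the adjoint solve uses the transpose of A. The snippet below is a toy numpy illustration of this building block, not libadjoint's API.

    ```python
    # Adjoint of one linear solve: dJ/db = A^{-T} dJ/dx for J(x), A x = b.
    import numpy as np

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = np.linalg.solve(A, b)            # forward solve

    dJdx = 2 * x                         # J(x) = x.x
    lam = np.linalg.solve(A.T, dJdx)     # adjoint solve with the transpose
    dJdb = lam                           # gradient of J with respect to b

    # finite-difference check on the first component of b
    eps = 1e-6
    x2 = np.linalg.solve(A, b + np.array([eps, 0.0]))
    print(dJdb[0], (x2 @ x2 - x @ x) / eps)
    ```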

  10. A Multiple-Sessions Interactive Computer-Based Learning Tool for Ability Cultivation in Circuit Simulation

    ERIC Educational Resources Information Center

    Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.

    2011-01-01

    This paper proposes an interactive, multiple-session computer-based learning tool that teaches students to think critically and helps them recognize the merits and limitations of simulation tools, so as to improve their practical abilities in electrical circuit simulation, built around the case of a power converter with progressively harder problems. The…

  11. Applied Time Domain Stability Margin Assessment for Nonlinear Time-Varying Systems

    NASA Technical Reports Server (NTRS)

    Kiefer, J. M.; Johnson, M. D.; Wall, J. H.; Dominguez, A.

    2016-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation. This technique was implemented by using the Stability Aerospace Vehicle Analysis Tool (SAVANT) computer simulation to evaluate the stability of the SLS system with the Adaptive Augmenting Control (AAC) active and inactive along its ascent trajectory. The gains for which the vehicle maintains apparent time-domain stability define the gain margins, and the time delay similarly defines the phase margin. This method of extracting the control stability margins from the time-domain simulation is relatively straightforward and the resultant margins can be compared to the linearized system results. The sections herein describe the techniques employed to extract the time-domain margins, compare the results between these nonlinear and the linear methods, and provide explanations for observed discrepancies. The SLS ascent trajectory was simulated with SAVANT and the classical linear stability margins were evaluated at one-second intervals. The linear analysis was performed with the AAC algorithm disabled to attain baseline stability margins. At each time point, the system was linearized about the current operating point using Simulink's built-in solver. Each linearized system in time was evaluated for its rigid-body gain margin (high frequency gain margin), rigid-body phase margin, and aero gain margin (low frequency gain margin) for each control axis. Using the stability margins derived from the baseline linearization approach, the time domain derived stability margins were determined by executing time domain simulations in which axis-specific incremental gain and phase adjustments were made to the nominal system about the expected neutral stability point at specific flight times. The baseline stability margin time histories were used to shift the system gain to various values around the zero margin point such that a precise amount of expected gain margin was maintained throughout flight. When assessing the gain margins, the gain was applied starting at the time point under consideration, thereafter following the variation in the margin found in the linear analysis. When assessing the rigid-body phase margin, a constant time delay was applied to the system starting at the time point under consideration. If the baseline stability margins were correctly determined via the linear analysis, the time domain simulation results should contain unstable behavior at certain gain and phase values. Examples will be shown from repeated simulations with variable added gain and phase lag. Faithfulness of margins calculated from the linear analysis to the nonlinear system will be demonstrated.
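
    The gain-scan idea is simple enough to sketch. The toy below (not SAVANT; the plant, gains and delay are invented) scales the loop gain of a delayed PD-controlled double integrator until the time-domain response diverges, and reports the crossing as an approximate gain margin.

```python
# Hedged sketch of the time-domain margin idea: scan a loop-gain multiplier
# upward until a simple delayed closed loop goes unstable.
import numpy as np

def simulate(gain_mult, delay_steps=25, dt=0.01, t_end=30.0):
    """Return True if the closed loop stays bounded over the run."""
    x, v = 1.0, 0.0                      # initial attitude error and rate
    buf = [0.0] * delay_steps            # transport delay on the actuator
    kp, kd = 4.0, 2.0                    # nominal PD gains (illustrative)
    for _ in range(int(t_end / dt)):
        u = buf.pop(0)                   # delayed command reaches the plant
        buf.append(-gain_mult * (kp * x + kd * v))
        v += u * dt                      # double-integrator plant x'' = u
        x += v * dt
        if abs(x) > 1e3:                 # divergence => unstable
            return False
    return True

# Scan the multiplier upward; the crossing is the time-domain gain margin.
mult = 1.0
while simulate(mult) and mult < 100.0:
    mult *= 1.05
print(f"instability onset near x{mult:.2f} "
      f"({20 * np.log10(mult):.1f} dB gain margin)")
```

    Replacing the gain multiplier with an added constant delay, as the abstract describes for the phase margin, needs only a longer actuator buffer.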

  12. Sublethal effects of catch-and-release fishing: measuring capture stress, fish impairment, and predation risk using a condition index

    USGS Publications Warehouse

    Campbell, Matthew D.; Patino, Reynaldo; Tolan, J.M.; Strauss, R.E.; Diamond, S.

    2009-01-01

    The sublethal effects of simulated capture of red snapper (Lutjanus campechanus) were analysed using physiological responses, condition indexing, and performance variables. Simulated catch-and-release fishing included combinations of depth of capture and thermocline exposure reflective of environmental conditions experienced in the Gulf of Mexico. Frequency of occurrence of barotrauma and lack of reflex response exhibited considerable individual variation. When combined into a single condition or impairment index, individual variation was reduced, and impairment showed significant increases as depth increased and with the addition of thermocline exposure. Performance variables, such as burst swimming speed (BSS) and simulated predator approach distance (AD), were also significantly different by depth. BSSs and predator ADs decreased with increasing depth, were lowest immediately after release, and were affected for up to 15 min, with longer recovery times required as depth increased. The impairment score developed was positively correlated with cortisol concentration and negatively correlated with both BSS and simulated predator AD. The impairment index proved to be an efficient method to estimate the overall impairment of red snapper in the laboratory simulations of capture and shows promise for use in field conditions, to estimate release mortality and vulnerability to predation.

  13. Multiscale Molecular Dynamics Simulations of Beta-Amyloid Interactions with Neurons

    NASA Astrophysics Data System (ADS)

    Qiu, Liming; Vaughn, Mark; Cheng, Kelvin

    2012-10-01

    Early events of human beta-amyloid protein interactions with cholesterol-containing membranes are critical to understanding the pathogenesis of Alzheimer's disease (AD) and to exploring new therapeutic interventions for AD. Atomistic molecular dynamics (AMD) simulations have been extensively used to study protein-lipid interactions at high atomic resolution. However, traditional MD simulations are not efficient in sampling the phase space of complex lipid/protein systems with rugged free energy landscapes. Meanwhile, coarse-grained MD (CGD) simulations are efficient in phase-space sampling but suffer from low spatial resolution and from the fact that their energy landscapes are not identical to those of the AMD. Here, a multiscale approach was employed to simulate the protein-lipid interactions of beta-amyloid upon its release from proteolysis while residing in the neuronal membranes. We utilized a forward (AMD to CGD) and reverse (CGD to AMD) strategy to explore new transmembrane and surface protein configurations and to evaluate the stabilization mechanisms by measuring residue-specific protein-lipid interactions and protein conformations. The detailed molecular interactions revealed by this multiscale MD approach will provide new insights into understanding the early molecular events leading to the pathogenesis of AD.

  14. Modeling Abstraction and Simulation Techniques

    DTIC Science & Technology

    2002-12-01

    for data reduction on the patterns stored in a normal database. In [58], J. Marin et al. proposed a hybrid model to profile user behavior by the... conv(Ad) as the feasible set in the "surrogate" continuous state space. When the feasible set Ad is not a polyhedron, the set Ac = conv(Ad) may... and is not necessarily convex. Note also that the definition reduces to Ac = conv(Ad) when Ad is formed by all the discrete points in a polyhedron. Now...

  15. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  16. COPS: A Sensitive and Accurate Tool for Detecting Somatic Copy Number Alterations Using Short-Read Sequence Data from Paired Samples

    PubMed Central

    Krishnan, Neeraja M.; Gaur, Prakhar; Chaudhary, Rakshit; Rao, Arjun A.; Panda, Binay

    2012-01-01

    Copy Number Alterations (CNAs), such as deletions and duplications, compose a larger percentage of genetic variations than single nucleotide polymorphisms or other structural variations in cancer genomes that undergo major chromosomal re-arrangements. It is, therefore, imperative to identify cancer-specific somatic copy number alterations (SCNAs), with respect to matched normal tissue, in order to understand their association with the disease. We have devised an accurate, sensitive, and easy-to-use tool, COPS, COpy number using Paired Samples, for detecting SCNAs. We rigorously tested the performance of COPS using short simulated sequence reads at various sizes and coverage of SCNAs, read depths, read lengths and also with real tumor:normal paired samples. We found COPS to perform better in comparison to other known SCNA detection tools for all evaluated parameters, namely, sensitivity (detection of true positives), specificity (detection of false positives) and size accuracy. COPS performed well for sequencing reads of all lengths when used with most upstream read alignment tools. Additionally, by incorporating a downstream boundary segmentation detection tool, the accuracy of SCNA boundaries was further improved. Here, we report an accurate, sensitive and easy-to-use tool for detecting cancer-specific SCNAs using short-read sequence data. In addition to cancer, COPS can be used for any disease as long as sequence reads from both disease and normal samples from the same individual are available. An added boundary segmentation detection module makes COPS-detected SCNA boundaries more specific for the samples studied. COPS is available at ftp://115.119.160.213 with username "cops" and password "cops". PMID:23110103
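
    The core computation behind paired-sample SCNA detection can be sketched in a few lines (a generic illustration, not the COPS algorithm): bin per-base read depth into windows, normalize tumor against normal, and flag extreme log2 ratios.

```python
# Generic windowed depth-ratio sketch for paired tumor:normal SCNA screening.
import numpy as np

def scna_candidates(tumor_depth, normal_depth, window=1000, log2_cut=0.8):
    n = (len(tumor_depth) // window) * window
    t = tumor_depth[:n].reshape(-1, window).mean(axis=1)
    m = normal_depth[:n].reshape(-1, window).mean(axis=1)
    # Library-size normalization plus a pseudocount to guard against zeros.
    ratio = (t / t.mean() + 1e-6) / (m / m.mean() + 1e-6)
    log2r = np.log2(ratio)
    gains = np.where(log2r > log2_cut)[0]
    losses = np.where(log2r < -log2_cut)[0]
    return gains, losses, log2r

# Toy example: a duplication spanning windows 30-39 of a flat-coverage region.
rng = np.random.default_rng(1)
normal = rng.poisson(30, 100_000).astype(float)
tumor = rng.poisson(30, 100_000).astype(float)
tumor[30_000:40_000] *= 2                       # simulated duplication
gains, losses, _ = scna_candidates(tumor, normal)
print("gain windows:", gains)                   # expect roughly 30..39
```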

  17. Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    NASA Astrophysics Data System (ADS)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple. A normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average grain size may vary with depth. The defect may be a flat-bottomed-hole reference reflector, a spherical void or a spherical inclusion. In future generations of the software, microstructures and defect types will be generalized and oblique incidence inspections will be treated as well. This paper provides an overview of the modeling approach and presents illustrative results output by the first-generation software.
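
    As a minimal illustration of the figure of merit such models report (not the ISU code), the S/N can be taken as the peak defect-signal amplitude over the RMS of the gated grain noise, expressed in dB.

```python
# Toy S/N estimate: peak defect echo over RMS grain-noise floor, in dB.
import numpy as np

def snr_db(defect_signal, grain_noise):
    """S/N in dB from a simulated defect echo and gated grain noise."""
    peak = np.max(np.abs(defect_signal))
    noise_rms = np.sqrt(np.mean(grain_noise ** 2))
    return 20.0 * np.log10(peak / noise_rms)

rng = np.random.default_rng(0)
t = np.linspace(0, 1e-6, 2000)
# Gaussian-windowed tone burst standing in for a flat-bottomed-hole echo.
echo = np.exp(-((t - 5e-7) / 5e-8) ** 2) * np.sin(2 * np.pi * 5e6 * t)
noise = 0.1 * rng.standard_normal(t.size)       # stand-in grain noise
print(f"S/N = {snr_db(echo, noise):.1f} dB")    # ~20 dB for these amplitudes
```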

  18. Adaptive MANET multipath routing algorithm based on the simulated annealing approach.

    PubMed

    Kim, Sungwook

    2014-01-01

    A mobile ad hoc network is a system of wireless mobile nodes that can freely and dynamically self-organize network topologies without any preexisting communication infrastructure. Due to characteristics like temporary topology and absence of centralized authority, routing is one of the major issues in ad hoc networks. In this paper, a new multipath routing scheme is proposed by employing a simulated annealing approach. The proposed metaheuristic approach can achieve greater, reciprocal advantages in hostile, dynamic, real-world network situations, and the proposed routing scheme is therefore a powerful method for finding an effective solution to the mobile ad hoc network routing problem. Simulation results indicate that the proposed paradigm adapts best to the variation of dynamic network situations. The average remaining energy, network throughput, packet loss probability, and traffic load distribution are improved by about 10%, 10%, 5%, and 10%, respectively, over existing schemes.
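
    A toy sketch of simulated annealing applied to route selection follows (the cost function and parameters are invented, not the paper's scheme): relay choices are perturbed at random and accepted by the Metropolis rule under a geometric cooling schedule.

```python
# Simulated annealing over relay selection in a toy ad hoc network:
# energy = path length + penalty for routing through low-battery relays.
import math
import random

random.seed(0)
nodes = [(random.random(), random.random()) for _ in range(30)]
energy_left = [random.uniform(0.2, 1.0) for _ in nodes]   # per-node battery
src, dst = (0.0, 0.0), (1.0, 1.0)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def cost(relays):
    """Energy-aware path cost: total length plus a weak-relay penalty."""
    pts = [src] + [nodes[i] for i in relays] + [dst]
    length = sum(dist(p, q) for p, q in zip(pts, pts[1:]))
    penalty = sum(1.0 / energy_left[i] for i in relays)
    return length + 0.05 * penalty

route = random.sample(range(len(nodes)), 3)               # initial 3-relay path
best, best_cost, temp = route[:], cost(route), 1.0
while temp > 1e-3:
    cand = route[:]
    cand[random.randrange(3)] = random.randrange(len(nodes))  # perturb a relay
    if len(set(cand)) == 3:
        delta = cost(cand) - cost(route)
        if delta < 0 or random.random() < math.exp(-delta / temp):  # Metropolis
            route = cand
            if cost(route) < best_cost:
                best, best_cost = route[:], cost(route)
    temp *= 0.995                                          # geometric cooling
print("best relays:", best, "cost:", round(best_cost, 3))
```

    The Metropolis acceptance of occasional cost-increasing moves is what lets the search escape local minima, which is the property that makes annealing attractive for the rugged cost surfaces of dynamic topologies.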

  19. A Hybrid Wind-Farm Parametrization for Mesoscale and Climate Models

    NASA Astrophysics Data System (ADS)

    Pan, Yang; Archer, Cristina L.

    2018-04-01

    To better understand the potential impact of wind farms on weather and climate at the regional to global scales, a new hybrid wind-farm parametrization is proposed for mesoscale and climate models. The proposed parametrization is a hybrid model because it is not based on physical processes or conservation laws, but on the multiple linear regression of the results of large-eddy simulations (LES) with the geometric properties of the wind-farm layout (e.g., the blockage ratio and blockage distance). The innovative aspect is that each wind turbine is treated individually based on its position in the farm and on the wind direction by predicting the velocity upstream of each turbine. The turbine-induced forces and added turbulence kinetic energy (TKE) are first derived analytically and then implemented in the Weather Research and Forecasting model. Idealized simulations of the offshore Lillgrund wind farm are conducted. The wind-speed deficit and TKE predicted with the hybrid model are in excellent agreement with those from the LES results, while the wind-power production estimated with the hybrid model is within 10% of that observed. Three additional wind farms with larger inter-turbine spacing than at Lillgrund are also considered, and a similar agreement with LES results is found, proving that the hybrid parametrization works well with any wind farm regardless of the spacing between turbines. These results indicate the wind-turbine position, wind direction, and added TKE are essential in accounting for the wind-farm effects on the surroundings, for which the hybrid wind-farm parametrization is a promising tool.
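
    The regression step can be sketched as follows (the data and coefficients are synthetic stand-ins for the LES results): fit per-turbine velocity deficit against the layout predictors with ordinary least squares, then use the fit to predict the velocity upstream of a turbine.

```python
# Multiple linear regression of (synthetic) per-turbine velocity deficits on
# layout geometry, mimicking the hybrid parametrization's fitting step.
import numpy as np

rng = np.random.default_rng(0)
n_turbines = 80
blockage_ratio = rng.uniform(0.0, 0.6, n_turbines)       # geometric predictor
blockage_distance = rng.uniform(2.0, 20.0, n_turbines)   # in rotor diameters

# Stand-in for LES-derived deficits: grows with blockage ratio, decays with
# distance to the blocking turbines, plus scatter (purely synthetic numbers).
deficit = (0.05 + 0.5 * blockage_ratio - 0.01 * blockage_distance
           + 0.02 * rng.standard_normal(n_turbines))

X = np.column_stack([np.ones(n_turbines), blockage_ratio, blockage_distance])
coef, *_ = np.linalg.lstsq(X, deficit, rcond=None)
print("intercept, ratio, distance coefficients:", np.round(coef, 3))

# Parametrization step: predicted hub-height wind upstream of one turbine.
u_freestream = 9.0                                        # m/s, assumed
u_upstream = u_freestream * (1 - X[0] @ coef)
print(f"predicted upstream speed for turbine 0: {u_upstream:.2f} m/s")
```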

  20. Impact of tool wear on cross wedge rolling process stability and on product quality

    NASA Astrophysics Data System (ADS)

    Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric

    2017-10-01

    Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing process of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetrical shape with an accurate distribution of material. This preform is forged into shape in a forging die. In order to improve CWR tool lifecycle and product quality it is essential to understand tool wear evolution and the physical phenomena that change on the CWR process due to the resulting geometry of the tool when undergoing tool wear. In order to understand CWR tool wear behavior, numerical simulations are necessary. Nevertheless, if the simulations are performed with the CAD geometry of the tool, results are limited. To solve this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both up and lower roll) at two different states: (1) before starting lifecycle and (2) end of lifecycle. The tools were 3D measured with ATOS triple scan by GOM® using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of the tool. Each 3D point cloud was digitalized and converted into a STL format. The geometry of the tools in a STL format was input for the 3D simulations. Both simulations were compared. Defects of products obtained in simulation were compared to main defects of products found industrially. Two main defects are: (a) surface defects on the preform that are not fixed in the die forging operation; and (b) Preform bent (no longer straight), with two possible impacts: on the one hand that the robot cannot grab it to take it to the forging stage; on the other hand, an unfilled section in the forging operation.

  1. SolarPILOT | Concentrating Solar Power | NREL

    Science.gov Websites

    Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core (the SolTrace simulation engine) for more detailed simulations.

  2. Coupling the Multizone Airflow and Contaminant Transport Software CONTAM with EnergyPlus Using Co-Simulation.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-08-01

    Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
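
    The co-simulation pattern itself is compact. The toy below (not the CONTAM/EnergyPlus FMI interface; both models and all parameters are invented) shows two solvers advancing in lockstep while exchanging zone temperature and infiltration airflow at every synchronization step.

```python
# Loose-coupling co-simulation toy: an airflow model and an energy model
# exchange their outputs once per synchronization step.

def airflow_model(T_zone, T_out):
    """Stack-effect-like infiltration: grows with |dT| (toy relation)."""
    return 0.02 * abs(T_zone - T_out) ** 0.5            # kg/s

def energy_model(T_zone, T_out, m_dot, dt):
    """Zone heat balance: heater plus infiltration load (toy parameters)."""
    cp, C, q_heat = 1006.0, 5e5, 800.0                  # J/kgK, J/K, W
    q_inf = m_dot * cp * (T_out - T_zone)
    return T_zone + (q_heat + q_inf) * dt / C

T_zone, T_out, dt = 20.0, 0.0, 60.0                      # deg C, deg C, s
for step in range(60):
    m_dot = airflow_model(T_zone, T_out)                 # CONTAM's role
    T_zone = energy_model(T_zone, T_out, m_dot, dt)      # EnergyPlus's role
print(f"airflow {m_dot:.4f} kg/s, zone temperature {T_zone:.2f} C")
```

    The FMI master described in the paper plays the role of this loop, stepping each tool and brokering the exchanged variables through the standardized co-simulation interface.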

  3. Creation and Delphi-method refinement of pediatric disaster triage simulations.

    PubMed

    Cicero, Mark X; Brown, Linda; Overly, Frank; Yarzebski, Jorge; Meckler, Garth; Fuchs, Susan; Tomassoni, Anthony; Aghababian, Richard; Chung, Sarita; Garrett, Andrew; Fagbuyi, Daniel; Adelgais, Kathleen; Goldman, Ran; Parker, James; Auerbach, Marc; Riera, Antonio; Cone, David; Baum, Carl R

    2014-01-01

    There is a need for rigorously designed pediatric disaster triage (PDT) training simulations for paramedics. First, we sought to design three multiple patient incidents for EMS provider training simulations. Our second objective was to determine the appropriate interventions and triage level for each victim in each of the simulations and develop evaluation instruments for each simulation. The final objective was to ensure that each simulation and evaluation tool was free of bias toward any specific PDT strategy. We created mixed-methods disaster simulation scenarios with pediatric victims: a school shooting, a school bus crash, and a multiple-victim house fire. Standardized patients, high-fidelity manikins, and low-fidelity manikins were used to portray the victims. Each simulation had similar acuity of injuries and 10 victims. Examples include children with special health-care needs, gunshot wounds, and smoke inhalation. Checklist-based evaluation tools and behaviorally anchored global assessments of function were created for each simulation. Eight physicians and paramedics from areas with differing PDT strategies were recruited as Subject Matter Experts (SMEs) for a modified Delphi iterative critique of the simulations and evaluation tools. The modified Delphi was managed with an online survey tool. The SMEs provided an expected triage category for each patient. The target for modified Delphi consensus was ≥85%. Using Likert scales and free text, the SMEs assessed the validity of the simulations, including instances of bias toward a specific PDT strategy, clarity of learning objectives, and the correlation of the evaluation tools to the learning objectives and scenarios. After two rounds of the modified Delphi, consensus for expected triage level was >85% for 28 of 30 victims, with the remaining two achieving >85% consensus after three Delphi iterations. To achieve consensus, we amended 11 instances of bias toward a specific PDT strategy and corrected 10 instances of noncorrelation between evaluations and simulation. The modified Delphi process, used to derive novel PDT simulation and evaluation tools, yielded a high degree of consensus among the SMEs, and eliminated biases toward specific PDT strategies in the evaluations. The simulations and evaluation tools may now be tested for reliability and validity as part of a prehospital PDT curriculum.

  4. Evaluating climate models: Should we use weather or climate observations?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oglesby, Robert J; Erickson III, David J

    2009-12-01

    Calling the numerical models that we use for simulations of climate change 'climate models' is a bit of a misnomer. These 'general circulation models' (GCMs, AKA global climate models) and their cousins the 'regional climate models' (RCMs) are actually physically-based weather simulators. That is, these models simulate, either globally or locally, daily weather patterns in response to some change in forcing or boundary condition. These simulated weather patterns are then aggregated into climate statistics, very much as we aggregate observations into 'real climate statistics'. Traditionally, the output of GCMs has been evaluated using climate statistics, as opposed to their ability to simulate realistic daily weather observations. At the coarse global scale this may be a reasonable approach; however, as RCMs downscale to increasingly higher resolutions, the conjunction between weather and climate becomes more problematic. We present results from a series of present-day climate simulations using the WRF ARW for domains that cover North America, much of Latin America, and South Asia. The basic domains are at a 12 km resolution, but several inner domains at 4 km have also been simulated. These include regions of complex topography in Mexico, Colombia, Peru, and Sri Lanka, as well as a region of low topography and fairly homogeneous land surface type (the U.S. Great Plains). Model evaluations are performed using standard climate analyses (e.g., reanalyses; NCDC data) but also using time series of daily station observations. Preliminary results suggest little difference in the assessment of long-term mean quantities, but the variability on seasonal and interannual timescales is better described. Furthermore, the value added by using daily weather observations as an evaluation tool increases with the model resolution.

  5. Development of an OSSE Framework for a Global Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Errico, Ronald M.; Prive, N.

    2012-01-01

    Observing system simulation experiments (OSSEs) are powerful tools for estimating the usefulness of various configurations of envisioned observing systems and data assimilation techniques. Their utility stems from their being conducted in an entirely simulated context, utilizing simulated observations having simulated errors and drawn from a simulation of the earth's environment. Observations are generated by applying physically based algorithms to the simulated state, such as performed during data assimilation or using other appropriate algorithms. Adding realistic instrument plus representativeness errors, including their biases and correlations, can be critical for obtaining realistic assessments of the impact of a proposed observing system or analysis technique. If estimates of the expected accuracy of proposed observations are realistic, then the OSSE can also be used to learn how best to utilize the new information, accelerating its transition to operations once the real data are available. As with any inferences from simulations, however, it is first imperative that some baseline OSSEs are performed and well validated against corresponding results obtained with a real observing system. This talk provides an overview of, and highlights critical issues related to, the development of an OSSE framework for the tropospheric weather prediction component of the NASA GEOS-5 global atmospheric data assimilation system. The framework includes all existing observations having significant impact on short-term forecast skill. Its validity has been carefully assessed using a range of metrics that can be evaluated in both the OSSE and real contexts, including adjoint-based estimates of observation impact. A preliminary application to the Aeolus Doppler wind lidar mission, scheduled for launch by the European Space Agency in 2014, has also been investigated.
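
    The observation-generation step can be sketched generically (this is not the GEOS-5 framework; the operator, bias and noise model are illustrative): apply a forward operator to the nature-run state, then add a bias and serially correlated instrument error.

```python
# Synthetic observations for an OSSE: y = H(x_true) + bias + AR(1) noise.
import numpy as np

rng = np.random.default_rng(0)
x_true = 280.0 + 5.0 * np.sin(np.linspace(0, 2 * np.pi, 200))  # nature run

def simulate_obs(x_true, obs_idx, bias=0.3, sigma=1.0, rho=0.5):
    """Sample the state at obs sites, then add bias and correlated error."""
    h_x = x_true[obs_idx]                       # forward operator H
    eps = np.empty(len(obs_idx))
    eps[0] = rng.normal(0, sigma)
    for k in range(1, len(eps)):                # AR(1) keeps variance sigma^2
        eps[k] = rho * eps[k - 1] + rng.normal(0, sigma * np.sqrt(1 - rho**2))
    return h_x + bias + eps

obs_idx = np.arange(0, 200, 10)                 # simulated observing network
y = simulate_obs(x_true, obs_idx)
print("mean innovation:", np.round(np.mean(y - x_true[obs_idx]), 2))  # ~bias
```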

  6. Secular trends and climate drift in coupled ocean-atmosphere general circulation models

    NASA Astrophysics Data System (ADS)

    Covey, Curt; Gleckler, Peter J.; Phillips, Thomas J.; Bader, David C.

    2006-02-01

    Coupled ocean-atmosphere general circulation models (coupled GCMs) with interactive sea ice are the primary tool for investigating possible future global warming and numerous other issues in climate science. A long-standing problem with such models is that when different components of the physical climate system are linked together, the simulated climate can drift away from observation unless constrained by ad hoc adjustments to interface fluxes. However, 11 modern coupled GCMs, including three that do not employ flux adjustments, behave much better in this respect than the older generation of models. Surface temperature trends in control run simulations (with external climate forcing such as solar brightness and atmospheric carbon dioxide held constant) are small compared with observed trends, which include 20th century climate change due to both anthropogenic and natural factors. Sea ice changes in the models are dominated by interannual variations. Deep ocean temperature and salinity trends are small enough for model control runs to extend over 1000 simulated years or more, but trends in some regions, most notably the Arctic, differ substantially among the models and may be problematic. Methods used to initialize coupled GCMs can mitigate climate drift but cannot eliminate it. Lengthy "spin-ups" of models, made possible by increasing computer power, are one reason for the improvements this paper documents.

  7. A model of tungsten anode x-ray spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernández, G.; Fernández, F., E-mail: fdz@usal.es

    2016-08-15

    Purpose: A semiempirical model for x-ray production in tungsten thick-targets was evaluated using a new characterization of electron fluence. Methods: Electron fluence is modeled taking into account both the energy and angular distributions, each of them adjusted to Monte Carlo simulated data. Distances were scaled by the CSDA range to reduce the energy dependence. Bremsstrahlung production was found by integrating the cross section with the fluence in a 1D penetration model. Characteristic radiation was added using a semiempirical law whose validity was checked. The results were compared with the experimental results of Bhat et al., with the SpekCalc numerical tool, and with MCNPX simulation results from the work of Hernandez and Boone. Results: The model described shows better agreement with the experimental results than the SpekCalc predictions, in the sense of the area between the spectra. A general improvement of the predictions of half-value layers is also found. The results are also in good agreement with the simulation results in the 50–640 keV energy range. Conclusions: A complete model for x-ray production in thick bremsstrahlung targets has been developed, improving the results of previous works and extending the energy range covered to the 50–640 keV interval.

  8. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control) which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.

  9. A Survey of FDG- and Amyloid-PET Imaging in Dementia and GRADE Analysis

    PubMed Central

    Daniela, Perani; Orazio, Schillaci; Alessandro, Padovani; Mariano, Nobili Flavio; Leonardo, Iaccarino; Pasquale Anthony, Della Rosa; Giovanni, Frisoni; Carlo, Caltagirone

    2014-01-01

    PET-based tools can improve the early diagnosis of Alzheimer's disease (AD) and the differential diagnosis of dementia. The importance of identifying individuals at risk of developing dementia among people with subjective cognitive complaints or mild cognitive impairment has clinical, social, and therapeutic implications. Within the two major classes of AD biomarkers currently identified, that is, markers of pathology and neurodegeneration, amyloid- and FDG-PET imaging represent decisive tools for their measurement. As a consequence, the PET tools have been recognized to be of crucial value in the recent guidelines for the early diagnosis of AD and other dementia conditions. The reference-based recommendations, however, rest on a large PET imaging literature based on visual assessment methods, which greatly reduce sensitivity and specificity and lack a clear cut-off between normal and pathological findings. PET imaging can instead be assessed using parametric or voxel-wise analyses by comparing the subject's scan with a normative data set, significantly increasing the diagnostic accuracy. This paper is a survey of the relevant literature on FDG and amyloid-PET imaging aimed at establishing the value of quantification for the early and differential diagnosis of AD. This allowed a meta-analysis and GRADE analysis revealing high values for PET imaging, which might be useful in formulating recommendations. PMID:24772437
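
    A minimal sketch of the voxel-wise quantification the survey advocates (generic, not any specific package): z-score a subject scan against a normative database and threshold the deviation map.

```python
# Voxel-wise z-score map of a subject PET scan against normative scans.
import numpy as np

rng = np.random.default_rng(0)
normals = rng.normal(1.0, 0.08, size=(40, 20, 20, 20))   # 40 normative scans
subject = rng.normal(1.0, 0.08, size=(20, 20, 20))
subject[5:9, 5:9, 5:9] -= 0.4               # simulated hypometabolic region

mu = normals.mean(axis=0)                   # voxel-wise normative mean
sd = normals.std(axis=0, ddof=1)            # voxel-wise normative spread
z = (subject - mu) / sd
hypo = z < -3.0                             # conservative 3-SD cut-off
print("voxels below -3 SD:", int(hypo.sum()))  # mostly the seeded 4x4x4 block
```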

  10. Using Google AdWords for international multilingual recruitment to health research websites.

    PubMed

    Gross, Margaret S; Liu, Nancy H; Contreras, Omar; Muñoz, Ricardo F; Leykin, Yan

    2014-01-20

    Google AdWords, the placement of sponsored links in Google search results, is a potent method of recruitment to Internet-based health studies and interventions. However, the performance of Google AdWords varies considerably depending on the language and the location of the target audience. Our goal was to describe differences in AdWords performance when recruiting participants to the same study conducted in four languages and to determine whether AdWords campaigns can be optimized in order to increase recruitment while decreasing costs. Google AdWords were used to recruit participants to the Mood Screener, a multilingual online depression screening tool available in English, Russian, Spanish, and Chinese. Two distinct recruitment periods are described: (1) "Unmanaged", a 6-month period in which ads were allowed to run using only the AdWords tool itself, with no human intervention, and (2) "Managed", a separate 7-week period during which we systematically sought to optimize our recruitment campaigns. During 6 months of unmanaged recruitment, our ads were shown over 1.3 million times, resulting in over 60,000 site visits. The average click-through rate (ratio of ads clicked to ads displayed) varied from 1.86% for Chinese ads to 8.48% for Russian ads, as did the average cost-per-click (from US $0.20 for Chinese ads to US $0.50 for English ads). Although Chinese speakers' click-through rate was lowest, their rate of consenting to participate was the highest, at 3.62%, with English speakers exhibiting the lowest consent rate (0.97%). The conversion cost (cost to recruit a consenting participant) varied from US $10.80 for Russian speakers to US $51.88 for English speakers. During the 7 weeks of "managed" recruitment, we attempted to improve AdWords' performance with regard to the consent rate and cost by systematically deleting underperforming ads and adjusting keywords. We were able to increase the number of people who consent after coming to the site by 91.8% while also decreasing per-consent cost by 23.3%. Our results illustrate the need to linguistically and culturally adapt Google AdWords campaigns and to manage them carefully to ensure the most cost-effective results.
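
    For reference, the campaign metrics reported above follow from simple ratios. The figures in the sketch below are placeholders chosen only to land near the reported magnitudes (e.g., a $0.50 CPC, a $10.80 per-consent cost); they are not the study's raw counts.

```python
# Campaign arithmetic: click-through rate, cost-per-click, per-consent cost.
def click_through_rate(clicks: int, impressions: int) -> float:
    return clicks / impressions

def cost_per_click(spend: float, clicks: int) -> float:
    return spend / clicks

def conversion_cost(spend: float, consents: int) -> float:
    """Cost to recruit one consenting participant."""
    return spend / consents

# Placeholder campaign totals, not the study's data.
spend, impressions, clicks, consents = 540.0, 30_000, 1_080, 50
print(f"CTR {click_through_rate(clicks, impressions):.2%}, "
      f"CPC ${cost_per_click(spend, clicks):.2f}, "
      f"per-consent ${conversion_cost(spend, consents):.2f}")
```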

  11. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  12. LandCaRe DSS--an interactive decision support system for climate change impact assessment and the analysis of potential agricultural land use adaptation strategies.

    PubMed

    Wenkel, Karl-Otto; Berg, Michael; Mirschel, Wilfried; Wieland, Ralf; Nendel, Claas; Köstner, Barbara

    2013-09-01

    Decision support to develop viable climate change adaptation strategies for agriculture and regional land use management encompasses a wide range of options and issues. Up to now, only a few suitable tools and methods have existed for farmers and regional stakeholders that support the process of decision-making in this field. The interactive model-based spatial information and decision support system LandCaRe DSS attempts to close the existing methodical gap. This system supports interactive spatial scenario simulations, multi-ensemble and multi-model simulations at the regional scale, as well as the complex impact assessment of potential land use adaptation strategies at the local scale. The system is connected to a local geo-database and via the internet to a climate data server. LandCaRe DSS uses a multitude of scale-specific ecological impact models, which are linked in various ways. At the local scale (farm scale), biophysical models are directly coupled with a farm economy calculator. New or alternative simulation models can easily be added, thanks to the innovative architecture and design of the DSS. Scenario simulations can be conducted with a reasonable amount of effort. The interactive LandCaRe DSS prototype also offers a variety of data analysis and visualisation tools, a help system for users and a farmer information system for climate adaptation in agriculture. This paper presents the theoretical background, the conceptual framework, and the structure and methodology behind LandCaRe DSS. Scenario studies at the regional and local scale for the two Eastern German regions of Uckermark (dry lowlands, 2600 km²) and Weißeritz (humid mountain area, 400 km²) were conducted in close cooperation with stakeholders to test the functionality of the DSS prototype. The system is gradually being transformed into a web version (http://www.landcare-dss.de) to ensure the broadest possible distribution of LandCaRe DSS to the public. The system will be continuously developed, updated and used in different research projects and as a learning and knowledge-sharing tool for students. The main objective of LandCaRe DSS is to provide information on the complex long-term impacts of climate change and on potential management options for adaptation by answering "what-if" type questions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Operational Improvements From the In-Trail Procedure in the North Atlantic Organized Track System

    NASA Technical Reports Server (NTRS)

    Chartrand, Ryan C.; Bussink, Frank J. L.; Graff, Thomas J.; Murdoch, Jennifer L.; Jones, Kenneth M.

    2008-01-01

    This paper explains the computerized batch processing experiment examining the operational impacts of the introduction of Automatic Dependent Surveillance-Broadcast (ADS-B) equipment and the In-Trail Procedure (ITP) to the North Atlantic Organized Track System (NATOTS). This experiment was conducted using the Traffic Manager (TMX), a desktop simulation capable of simulating airspace environments and aircraft operations. ADS-B equipment can enable the use of new ground and airborne procedures, such as the ITP. The ITP is among the first of these new procedures, which will make use of improved situation awareness in the local surrounding airspace of ADS-B equipped aircraft to enable more efficient oceanic flight level changes. The data collected were analyzed with respect to multiple operationally relevant parameters including fuel burn, request approval rates, and the distribution of fuel savings. This experiment showed that, through the use of ADS-B alone or ADS-B combined with the ITP, operational improvements and benefits could be achieved.

  14. Operational Improvements From Using the In-Trail Procedure in the North Atlantic Organized Track System

    NASA Technical Reports Server (NTRS)

    Chartrand, Ryan C.; Bussink, Frank J.; Graff, Thomas J.; Jones, Kenneth M.

    2009-01-01

    This paper explains the computerized batch processing experiment examining the operational impacts of the introduction of Automatic Dependent Surveillance-Broadcast (ADS-B) equipment and the In-Trail Procedure (ITP) to the North Atlantic Organized Track System. This experiment was conducted using the Traffic Manager (TMX), a desktop simulation capable of simulating airspace environments and aircraft operations. ADS-B equipment can enable the use of new ground and airborne procedures, such as the ITP. ITP is among the first of these new procedures, which will make use of improved situation awareness in the local surrounding airspace of ADS-B equipped aircraft to enable more efficient oceanic flight level changes. The collected data were analyzed with respect to multiple operationally relevant parameters including fuel burn, request approval rates, and the distribution of fuel savings. This experiment showed that, through the use of ADS-B alone or ADS-B combined with the ITP, operational improvements and benefits could be achieved.

  15. sUAS Position Estimation and Fusion in GPS-Degraded and GPS-Denied Environments using an ADS-B Transponder and Local Area Multilateration

    NASA Astrophysics Data System (ADS)

    Larson, Robert Sherman

    An Unmanned Aerial Vehicle (UAV) and a manned aircraft are tracked using ADS-B transponders and the Local Area Multilateration System (LAMS) in simulated GPS-degraded and GPS-denied environments. Several position estimation and fusion algorithms are developed for use with the Autonomous Flight Systems Laboratory (AFSL) TRansponder based Position Information System (TRAPIS) software. At the lowest level, these estimation and fusion algorithms use raw information from ADS-B and LAMS data streams to provide aircraft position estimates to the ground station user. At the highest level, aircraft position is estimated using a discrete time Kalman filter with real-time covariance updates and fusion involving weighted averaging of ADS-B and LAMS positions. Simulation and flight test results are provided, demonstrating the feasibility of incorporating an ADS-B transponder on a commercially-available UAS and maintaining situational awareness of aircraft positions in GPS-degraded and GPS-denied environments.
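
    A 1-D sketch of the highest-level estimator described above (not the TRAPIS implementation; all noise figures are invented): a constant-velocity Kalman filter whose measurement is the inverse-variance weighted average of the ADS-B and LAMS position fixes.

```python
# Constant-velocity Kalman filter fed by a weighted fusion of two sensors.
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])          # constant-velocity state transition
Q = np.diag([1e-3, 1e-2])                # process noise (assumed)
H = np.array([[1.0, 0.0]])               # we observe position only

def fuse(z_adsb, r_adsb, z_lams, r_lams):
    """Inverse-variance weighted average of the two fixes and its variance."""
    w = np.array([1 / r_adsb, 1 / r_lams])
    z = (w[0] * z_adsb + w[1] * z_lams) / w.sum()
    return z, 1 / w.sum()

x = np.array([0.0, 10.0])                # position (m), velocity (m/s)
P = np.eye(2)
rng = np.random.default_rng(0)
for k in range(100):
    truth = 10.0 * (k + 1) * dt          # aircraft flying at 10 m/s
    z, R = fuse(truth + rng.normal(0, 5.0), 25.0,     # ADS-B fix, var 25
                truth + rng.normal(0, 2.0), 4.0)      # LAMS fix, var 4
    x = F @ x                            # predict
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                  # update with the fused measurement
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
print(f"estimate {x[0]:.1f} m vs truth {truth:.1f} m")
```

    Degrading or denying GPS corresponds here to inflating (or dropping) one sensor's variance, which smoothly shifts the fused measurement toward the remaining source.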

  16. Water Network Tool for Resilience v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-12-09

    WNTR is a python package designed to simulate and analyze resilience of water distribution networks. The software includes: - Pressure driven and demand driven hydraulic simulation - Water quality simulation to track concentration, trace, and water age - Conditional controls to simulate power outages - Models to simulate pipe breaks - A wide range of resilience metrics - Analysis and visualization tools
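
    A minimal usage sketch, assuming the API of recent WNTR releases (the input file path is a placeholder; consult the package documentation if your version differs):

```python
# Load a network model, run hydraulics, and pull node pressures with WNTR.
import wntr

wn = wntr.network.WaterNetworkModel('networks/Net3.inp')  # placeholder path
sim = wntr.sim.EpanetSimulator(wn)                        # demand-driven run
results = sim.run_sim()
pressure = results.node['pressure']                       # DataFrame: time x node
print(pressure.head())

# The pressure-driven mode listed above is provided by the WNTRSimulator
# (it may require selecting the PDD demand model via wn.options.hydraulic).
pdd_sim = wntr.sim.WNTRSimulator(wn)
```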

  17. Estimation of left ventricular blood flow parameters: clinical application of patient-specific CFD simulations from 4D echocardiography

    NASA Astrophysics Data System (ADS)

    Larsson, David; Spühler, Jeannette H.; Günyeli, Elif; Weinkauf, Tino; Hoffman, Johan; Colarieti-Tosti, Massimiliano; Winter, Reidar; Larsson, Matilda

    2017-03-01

    Echocardiography is the most commonly used image modality in cardiology, assessing several aspects of cardiac viability. The importance of cardiac hemodynamics and 4D blood flow motion has recently been highlighted, however such assessment is still difficult using routine echo-imaging. Instead, combining imaging with computational fluid dynamics (CFD)-simulations has proven valuable, but only a few models have been applied clinically. In the following, patient-specific CFD-simulations from transthoracic dobutamine stress echocardiography have been used to analyze the left ventricular 4D blood flow in three subjects: two with normal and one with reduced left ventricular function. At each stress level, 4D-images were acquired using a GE Vivid E9 (4VD, 1.7MHz/3.3MHz) and velocity fields simulated using a pipeline involving endocardial segmentation, valve position identification, and solution of the incompressible Navier-Stokes equation. Flow components defined as direct flow, delayed ejection flow, retained inflow, and residual volume were calculated by particle tracing using 4th-order Runge-Kutta integration. Additionally, systolic and diastolic average velocity fields were generated. Results indicated no major changes in average velocity fields for any of the subjects. For the two subjects with normal left ventricular function, increased direct flow, decreased delayed ejection flow, constant retained inflow, and a considerable drop in residual volume were seen at increasing stress. In contrast, for the subject with reduced left ventricular function, the delayed ejection flow increased whilst the retained inflow decreased at increasing stress levels. This feasibility study represents one of the first clinical applications of an echo-based patient-specific CFD-model at elevated stress levels, and highlights the potential of using echo-based models to capture highly transient flow events, as well as the ability of using simulation tools to study clinically complex phenomena. With larger patient studies planned for the future, and with the possibility of adding more anatomical features into the model framework, the current work demonstrates the potential of patient-specific CFD-models as a tool for quantifying 4D blood flow in the heart.
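
    The particle-tracing step lends itself to a short sketch (the velocity field below is an analytic stand-in for the CFD solution, not the study's solver): advance virtual blood particles with classical 4th-order Runge-Kutta integration.

```python
# Classical RK4 particle tracing through a (stand-in) velocity field.
import numpy as np

def velocity(p, t):
    """Placeholder steady 2-D vortex standing in for the CFD velocity field."""
    x, y = p
    return np.array([-y, x])

def rk4_step(p, t, dt, v):
    k1 = v(p, t)
    k2 = v(p + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = v(p + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = v(p + dt * k3, t + dt)
    return p + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

p, dt = np.array([1.0, 0.0]), 0.01
path = [p]                                # stored pathline for classification
for n in range(628):                      # ~one revolution of the vortex
    p = rk4_step(p, n * dt, dt, velocity)
    path.append(p)
print("end point:", np.round(p, 4))       # returns close to (1, 0)
```

    Classifying each pathline by where it enters and leaves the ventricle is what splits the inflow into the direct flow, delayed ejection flow, retained inflow, and residual volume components named above.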

  18. WiFiSiM: An Educational Tool for the Study and Design of Wireless Networks

    ERIC Educational Resources Information Center

    Mateo Sanguino, T. J.; Serrano Lopez, C.; Marquez Hernandez, F. A.

    2013-01-01

    A new educational simulation tool designed for the generic study of wireless networks, the Wireless Fidelity Simulator (WiFiSim), is presented in this paper. The goal of this work was to create and implement a didactic tool to improve the teaching and learning of computer networks by means of two complementary strategies: simulating the behavior…

  19. Development and psychometric evaluation of the "Neurosurgical Evaluation of Attitudes towards simulation Training" (NEAT) tool for use in neurosurgical education and training.

    PubMed

    Kirkman, Matthew A; Muirhead, William; Nandi, Dipankar; Sevdalis, Nick

    2014-01-01

    Neurosurgical simulation training is becoming increasingly popular. Attitudes toward simulation among residents can contribute to the effectiveness of simulation training, but such attitudes remain poorly explored in neurosurgery with no psychometrically proven measure in the literature. The aim of the present study was to evaluate prospectively a newly developed tool for this purpose: the Neurosurgical Evaluation of Attitudes towards simulation Training (NEAT). The NEAT tool was prospectively developed in 2 stages and psychometrically evaluated (validity and reliability) in 2 administrations with the same participants. The tool comprises a questionnaire with 9 Likert scale items and 2 free-text sections assessing attitudes toward simulation in neurosurgery. The evaluation was completed with 31 neurosurgery residents in London, United Kingdom, who were generally favorable toward neurosurgical simulation. The internal consistency of the questionnaire was high, as demonstrated by the overall Cronbach α values (α=0.899 and α=0.955). All but 2 questionnaire items had "substantial" or "almost perfect" test-retest reliability following repeated survey administrations (median Pearson r correlation=0.688; range, 0.248-0.841). NEAT items were well correlated with each other on both occasions, showing good validity of content within the NEAT tool. There was no significant relationship between either gender or length of neurosurgical experience and item ratings. NEAT is the first psychometrically evaluated tool for evaluating attitudes toward simulation in neurosurgery. Further implementation of NEAT is required in wider neurosurgical populations to establish whether specific population groups differ. Use of NEAT in studies of neurosurgical simulation could offer an additional outcome measure to performance metrics, permitting evaluation of the impact of neurosurgical simulation on attitudes toward simulation both between participants and within the same participants over time. Copyright © 2014 Elsevier Inc. All rights reserved.
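
    For reference, the internal-consistency statistic reported above is straightforward to compute; the sketch below uses synthetic Likert data shaped like the NEAT questionnaire (31 respondents, 9 items), not the study's data.

```python
# Cronbach's alpha for a respondents x items Likert matrix.
import numpy as np

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic data: 31 respondents x 9 items driven by one latent attitude.
rng = np.random.default_rng(0)
latent = rng.normal(0, 1, size=(31, 1))
items = np.clip(np.rint(3 + latent + 0.6 * rng.normal(size=(31, 9))), 1, 5)
print(f"alpha = {cronbach_alpha(items):.3f}")   # high, as in the study
```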

  20. Grid Integration Research | Wind | NREL

    Science.gov Websites

    Computer-generated simulation of a wind turbine. Wind Power Plant Modeling and Simulation: engineers at NREL develop the computer-aided engineering tool, FAST, as well as their wind power plant simulation tool, Wind-Plant…

  1. 76 FR 31631 - Heavy Forged Hand Tools From China; Scheduling of Expedited Five-Year Reviews Concerning the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-01

    ... INTERNATIONAL TRADE COMMISSION [Investigation Nos. 731-TA-457-A-D Third Review] Heavy Forged Hand... Heavy Forged Hand Tools From China. AGENCY: United States International Trade Commission. ACTION: Notice... the antidumping duty orders on heavy forged hand tools from China would be likely to lead to...

  2. Using the Disease State Fingerprint Tool for Differential Diagnosis of Frontotemporal Dementia and Alzheimer's Disease

    PubMed Central

    Muñoz-Ruiz, Miguel Ángel; Hall, Anette; Mattila, Jussi; Koikkalainen, Juha; Herukka, Sanna-Kaisa; Husso, Minna; Hänninen, Tuomo; Vanninen, Ritva; Liu, Yawu; Hallikainen, Merja; Lötjönen, Jyrki; Remes, Anne M.; Alafuzoff, Irina; Soininen, Hilkka; Hartikainen, Päivi

    2016-01-01

    Background Disease State Index (DSI) and its visualization, Disease State Fingerprint (DSF), form a computer-assisted clinical decision making tool that combines patient data and compares them with cases with known outcomes. Aims To investigate the ability of the DSI to diagnose frontotemporal dementia (FTD) and Alzheimer's disease (AD). Methods The study cohort consisted of 38 patients with FTD, 57 with AD and 22 controls. Autopsy verification of FTD with TDP-43 positive pathology was available for 14 and AD pathology for 12 cases. We utilized data from neuropsychological tests, volumetric magnetic resonance imaging, single-photon emission tomography, cerebrospinal fluid biomarkers and the APOE genotype. The DSI classification results were calculated with a combination of leave-one-out cross-validation and bootstrapping. A DSF visualization of a FTD patient is presented as an example. Results The DSI distinguishes controls from FTD (area under the receiver-operator curve, AUC = 0.99) and AD (AUC = 1.00) very well and achieves a good differential diagnosis between AD and FTD (AUC = 0.89). In subsamples of autopsy-confirmed cases (AUC = 0.97) and clinically diagnosed cases (AUC = 0.94), differential diagnosis of AD and FTD performs very well. Conclusions DSI is a promising computer-assisted biomarker approach for aiding in the diagnostic process of dementing diseases. Here, DSI separates controls from dementia and differentiates between AD and FTD. PMID:27703465

  3. [Virtual reality simulation training in gynecology: review and perspectives].

    PubMed

    Ricard-Gauthier, Dominique; Popescu, Silvia; Benmohamed, Naida; Petignat, Patrick; Dubuisson, Jean

    2016-10-26

    Laparoscopic simulation has rapidly become an important tool for learning and acquiring technical skills in surgery. It is based on two complementary pedagogic tools: the box model trainer and the virtual reality simulator. The virtual reality simulator has shown its efficiency by improving surgical skills, decreasing operating time, improving economy of movement and improving self-confidence. The main advantage of this tool is the opportunity to easily organize a regular, structured and uniform training program enabling automated, individualized feedback.

  4. Spacecraft Guidance, Navigation, and Control Visualization Tool

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling (see figure). The tool is developed in MATLAB using Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that multi-body simulation data is generated and placed in the proper format before applying G-View.

  5. FDTD simulation tools for UWB antenna analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
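
    For readers unfamiliar with the method, a 1-D Cartesian sketch of the FDTD leapfrog update follows (the paper derives spherical-coordinate equations; this shows only the update structure, with simple reflecting boundaries and a soft Gaussian source standing in for the UWB pulse).

```python
# Minimal 1-D Yee/FDTD scheme: staggered E and H fields updated in leapfrog.
import numpy as np

nx, nt = 400, 900
c, dx = 3e8, 1e-3
dt = 0.5 * dx / c                          # Courant-stable time step (S = 0.5)
ez = np.zeros(nx)                          # electric field on integer nodes
hy = np.zeros(nx)                          # magnetic field on half nodes

for n in range(nt):
    hy[:-1] += (ez[1:] - ez[:-1]) * dt / (4e-7 * np.pi * dx)   # mu0 update
    ez[1:-1] += (hy[1:-1] - hy[:-2]) * dt / (8.854e-12 * dx)   # eps0 update
    ez[nx // 4] += np.exp(-((n - 80) / 25.0) ** 2)             # UWB-style pulse
    # ez[0] and ez[-1] stay zero: perfect-electric-conductor walls.
print("peak |Ez| after propagation:", float(np.abs(ez).max()))
```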

  6. In silico strain optimization by adding reactions to metabolic models.

    PubMed

    Correia, Sara; Rocha, Miguel

    2012-07-24

    Nowadays, concerns about the environment and the need to increase productivity at low cost demand the search for new ways to produce compounds of industrial interest. Based on the increasing knowledge of biological processes, gained through genome sequencing projects and high-throughput experimental techniques, as well as the available computational tools, the use of microorganisms has been considered as an approach to produce desirable compounds. However, this usually requires manipulating these organisms by genetic engineering and/or changing the environmental conditions to make the production of these compounds possible. In many cases, it is necessary to enrich the genetic material of those microbes with heterologous pathways from other species, consequently adding the potential to produce novel compounds. This paper introduces a new plug-in for the OptFlux Metabolic Engineering platform, aimed at finding suitable sets of reactions to add to the genomes of selected microbes (wild type strain), as well as finding complementary sets of deletions, so that the mutant becomes able to overproduce compounds with industrial interest while preserving viability. The necessity of adding reactions to the metabolic model arises from gaps in the original model or is motivated by the production of new compounds by the organism. The optimization methods used are metaheuristics such as Evolutionary Algorithms and Simulated Annealing. The usefulness of this plug-in is demonstrated by a case study regarding the production of vanillin by the bacterium E. coli.

  8. Simulation framework for electromagnetic effects in plasmonics, filter apertures, wafer scattering, grating mirrors, and nano-crystals

    NASA Astrophysics Data System (ADS)

    Ceperley, Daniel Peter

    This thesis presents a Finite-Difference Time-Domain (FDTD) simulation framework as well as both scientific observations and quantitative design data for emerging optical devices. These emerging applications required the development of simulation capabilities to carefully control numerical experimental conditions, to isolate and quantify specific scattering processes, and to overcome memory and run-time limitations on large device structures. The framework consists of a new version 7 of TEMPEST and auxiliary tools implemented as Matlab scripts. Relative to v6, the geometry representation and absorbing boundary conditions in TEMPEST have been improved while sustaining accuracy, and key extensions have yielded application-specific gains in speed and accuracy. These extensions include pulsed methods, PML for plasmon termination, and plasmon and scattered-field sources. The auxiliary tools include application-specific methods such as signal flow graphs of plasmon couplers, Bloch mode expansions of sub-wavelength grating waves, and back-propagation methods to characterize edge scattering in diffraction masks. Each application posed different numerical hurdles and physical questions for the simulation framework. The Terrestrial Planet Finder Coronagraph required accurate modeling of diffraction mask structures too large for FDTD analysis alone. This analysis was achieved through a combination of targeted TEMPEST simulations and a full-system simulator based on thin-mask scalar diffraction models by Ball Aerospace for JPL. TEMPEST simulation showed that vertical sidewalls were the strongest scatterers, adding nearly 2λ of light per mask edge, which could be reduced by 20° undercuts. TEMPEST assessment of coupling in rapid thermal annealing was complicated by extremely sub-wavelength features and fine meshes. Near-100% coupling and low variability were confirmed even in the presence of unidirectional dense metal gates. Accurate analysis of surface plasmon coupling efficiency by small surface features required capabilities to isolate these features and cleanly illuminate them with plasmons and plane waves. These features were shown to have coupling cross-sections up to and slightly exceeding their physical size. Long run-times for TEMPEST simulations of finite-length gratings were overcome with a signal flow graph method. With these methods a plasmon coupler with a 100% capture length of over 10λ was demonstrated. Simulation of 3D nano-particle arrays utilized TEMPEST v7's pulsed methods to minimize the number of multi-day simulations. These simulations led to the discovery that interstitial plasmons were responsible for resonant absorption and transmission but not reflection. Simulation of a sub-wavelength grating mirror using pulsed sources to map resonant spectra showed that neither coupled guided waves nor coupled isolated resonators accurately described the operation. However, a new model based on vertical propagation of lateral Bloch modes with zero phase progression efficiently characterized the device and provided principles for designing similar devices at other wavelengths.

  9. Visual system manifestations of Alzheimer's disease.

    PubMed

    Kusne, Yael; Wolf, Andrew B; Townley, Kate; Conway, Mandi; Peyman, Gholam A

    2017-12-01

    Alzheimer's disease (AD) is an increasingly common disease with massive personal and economic costs. While it has long been known that AD impacts the visual system, there has recently been an increased focus on understanding both the pathophysiological mechanisms that may be shared between the eye and brain and how related biomarkers could be useful for AD diagnosis. Here, we review pertinent cellular and molecular mechanisms of AD pathophysiology, the presence of AD pathology in the visual system, associated functional changes, and the potential development of diagnostic tools based on the visual system. Additionally, we discuss links between AD and visual disorders, including possible pathophysiological mechanisms and their relevance for improving our understanding of AD. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  10. Simulating the Camp David Negotiations: A Problem-Solving Tool in Critical Pedagogy

    ERIC Educational Resources Information Center

    McMahon, Sean F.; Miller, Chris

    2013-01-01

    This article reflects critically on simulations. Building on the authors' experience simulating the Palestinian-Israeli-American Camp David negotiations of 2000, they argue that simulations are useful pedagogical tools that encourage creative--but not critical--thinking and constructivist learning. However, they can also have the deleterious…

  11. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designated the Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...

  12. doGlycans-Tools for Preparing Carbohydrate Structures for Atomistic Simulations of Glycoproteins, Glycolipids, and Carbohydrate Polymers for GROMACS.

    PubMed

    Danne, Reinis; Poojari, Chetan; Martinez-Seara, Hector; Rissanen, Sami; Lolicato, Fabio; Róg, Tomasz; Vattulainen, Ilpo

    2017-10-23

    Carbohydrates constitute a structurally and functionally diverse group of biological molecules and macromolecules. In cells they are involved in, e.g., energy storage, signaling, and cell-cell recognition. All of these phenomena take place in atomistic scales, thus atomistic simulation would be the method of choice to explore how carbohydrates function. However, the progress in the field is limited by the lack of appropriate tools for preparing carbohydrate structures and related topology files for the simulation models. Here we present tools that fill this gap. Applications where the tools discussed in this paper are particularly useful include, among others, the preparation of structures for glycolipids, nanocellulose, and glycans linked to glycoproteins. The molecular structures and simulation files generated by the tools are compatible with GROMACS.

  13. Simulation of Medical Imaging Systems: Emission and Transmission Tomography

    NASA Astrophysics Data System (ADS)

    Harrison, Robert L.

    Simulation is an important tool in medical imaging research. In patient scans the true underlying anatomy and physiology is unknown, and we have no way of knowing how various factors confound the data in a given scan: statistical noise, biological variability, patient motion, scattered radiation, dead time, and other contaminants. Simulation allows us to isolate a single factor of interest, for instance when researchers perform multiple simulations of the same imaging situation to determine the effect of statistical noise or biological variability. Simulations are also increasingly used as a design optimization tool for tomographic scanners. This article gives an overview of the mechanics of emission and transmission tomography simulation, reviews some of the publicly available simulation tools, and discusses trade-offs between the accuracy and efficiency of simulations.
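
    As a toy illustration of how simulation isolates statistical noise, the sketch below generates repeated Poisson-noise realizations of the same idealized emission scan; the "true" line integrals stand in for a forward projection of a known phantom, and all numbers are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Idealized noise-free line integrals of activity (a stand-in for a
        # forward projection of a known phantom), in expected counts per bin.
        true_counts = 50.0 * np.exp(-0.5 * ((np.arange(128) - 64) / 12.0) ** 2)

        # Repeated simulations of the same scan differ only in Poisson noise, so
        # any variation across realizations is attributable to statistics alone.
        realizations = rng.poisson(true_counts, size=(100, true_counts.size))

        relative_error = realizations.std(axis=0) / np.maximum(true_counts, 1e-9)
        print("peak-bin relative noise: %.3f" % relative_error[64])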

  14. Computational modeling to predict mechanical function of joints: application to the lower leg with simulation of two cadaver studies.

    PubMed

    Liacouras, Peter C; Wayne, Jennifer S

    2007-12-01

    Computational models of musculoskeletal joints and limbs can provide useful information about joint mechanics. Validated models can be used as predictive devices for understanding joint function and serve as clinical tools for predicting the outcome of surgical procedures. A new computational modeling approach was developed for simulating joint kinematics that are dictated by bone/joint anatomy, ligamentous constraints, and applied loading. Three-dimensional computational models of the lower leg were created to illustrate the application of this new approach. Model development began with generating three-dimensional surfaces of each bone from CT images, which were then imported into the three-dimensional solid modeling software SOLIDWORKS and the motion simulation package COSMOSMOTION. Through SOLIDWORKS and COSMOSMOTION, each bone surface file was filled to create a solid object and positioned, necessary components were added, and simulations were executed. Three-dimensional contacts were added to inhibit intersection of the bones during motion. Ligaments were represented as linear springs. Model predictions were then validated by comparison to two different cadaver studies: syndesmotic injury and repair, and ankle inversion following ligament transection. The syndesmotic injury model was able to predict tibial rotation, fibular rotation, and anterior/posterior displacement. In the inversion simulation, calcaneofibular ligament extension and angles of inversion compared well. Some experimental data proved harder to simulate accurately, due to certain software limitations and a lack of complete experimental data. Other parameters that could not be easily obtained experimentally can be predicted and analyzed by the computational simulations. In the syndesmotic injury study, the force generated in the tibionavicular and calcaneofibular ligaments was reduced with the insertion of the staple, indicating how this repair technique changes joint function. After transection of the calcaneofibular ligament in the inversion stability study, a major increase in force was seen in several of the ligaments on the lateral aspect of the foot and ankle, indicating the recruitment of other structures to permit function after injury. Overall, the computational models were able to predict joint kinematics of the lower leg with particular focus on the ankle complex. This same approach can be taken to create models of other limb segments such as the elbow and wrist. Additional parameters can be calculated in the models that are not easily obtained experimentally, such as ligament forces, force transmission across joints, and three-dimensional movement of all bones. Muscle activation can be incorporated in the model through the action of applied forces within the software for future studies.
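
    A minimal sketch of the ligament representation described above, a linear spring acting only in tension between two insertion points; the stiffness and rest length below are illustrative values, not parameters from the study.

        import numpy as np

        def ligament_force(p_origin, p_insertion, k=100.0, rest_length=0.03):
            """Linear tension-only spring force on the insertion point.

            k is a stiffness in N/m and rest_length a slack length in m; both
            are invented for illustration."""
            d = np.asarray(p_insertion, float) - np.asarray(p_origin, float)
            length = np.linalg.norm(d)
            stretch = length - rest_length
            if stretch <= 0.0:                  # ligaments carry no compressive load
                return np.zeros(3)
            return -k * stretch * d / length    # pulls insertion back toward origin

        print(ligament_force([0, 0, 0], [0, 0, 0.035]))   # small tensile pull in -z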

  15. Trace Replay and Network Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acun, Bilge; Jain, Nikhil; Bhatele, Abhinav

    2015-03-23

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  16. Stochastic airspace simulation tool development

    DOT National Transportation Integrated Search

    2009-10-01

    Modeling and simulation is often used to study the physical world when observation may not be practical. The overall goal of a recent and ongoing simulation tool project has been to provide a documented, lifecycle-managed, multi-processor c...

  17. Trace Replay and Network Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, Nikhil; Bhatele, Abhinav; Acun, Bilge

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  18. New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiss, T.; Chaney, L.; Meyer, J.

    Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a tool for developing more efficient A/C systems through a thorough consideration of transient A/C system performance. The dynamic system simulation software MATLAB/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail. Comparison with measured data is provided to demonstrate the validity of the model.

  19. Signs and symptoms preceding the diagnosis of Alzheimer’s disease: a systematic scoping review of literature from 1937 to 2016

    PubMed Central

    Bature, Fidelia; Guinn, Barbara-ann; Pang, Dong; Pappas, Yannis

    2017-01-01

    Objective: Late diagnosis of Alzheimer’s disease (AD) may be due to diagnostic uncertainties. We aimed to determine the sequence and timing of the appearance of established early signs and symptoms in people who are subsequently diagnosed with AD. Methods: We used systematic review methodology to investigate the existing literature. Articles were reviewed in May 2016, using the following databases: MEDLINE, PsycINFO, CINAHL, British Nursing Index, PubMed Central and the Cochrane Library, with no language restriction. Data from the included articles were extracted independently by two authors, and quality assessment was undertaken with the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) tool. Results: We found that depression and cognitive impairment were the first symptoms to appear in 98.5% and 99.1% of individuals in a study of late-onset AD (LOAD), and in 9% and 80%, respectively, in early-onset AD (EOAD). Memory loss presented early and was experienced 12 years before clinically defined AD dementia in LOAD. However, rapidly progressive late-onset AD presented predominantly with 35 non-established focal symptoms and signs, including myoclonus (75%), disturbed gait (66%) and rigidity. These were misdiagnosed as symptoms of Creutzfeldt-Jakob disease (CJD) in all the cases. The participant with the lowest mini-mental state examination score of 25 remained stable for 2 years, which is consistent with the scores of the healthy family members. Conclusions: The findings of this review suggest that neurological and depressive behaviours are an early occurrence in EOAD, with depressive and cognitive symptoms in measures of semantic memory and conceptual formation in LOAD. Misdiagnosis of rapidly progressive AD as CJD and the familial memory score can be confounding factors when establishing a diagnosis. However, the review was limited by the fact that each of the findings was based on a single study. PMID:28851777

  20. A blood-based screening tool for Alzheimer's disease that spans serum and plasma: findings from TARC and ADNI.

    PubMed

    O'Bryant, Sid E; Xiao, Guanghua; Barber, Robert; Huebinger, Ryan; Wilhelmsen, Kirk; Edwards, Melissa; Graff-Radford, Neill; Doody, Rachelle; Diaz-Arrastia, Ramon

    2011-01-01

    There is no rapid and cost-effective tool that can be implemented as a front-line screening tool for Alzheimer's disease (AD) at the population level. Our objective was to generate and cross-validate a blood-based screener for AD that yields acceptable accuracy across both serum and plasma. Analyses of serum biomarker proteins were conducted on 197 AD participants and 199 control participants from the Texas Alzheimer's Research Consortium (TARC), with further analysis conducted on plasma proteins from 112 AD and 52 control participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI). The full algorithm was derived from a biomarker risk score, clinical lab (glucose, triglycerides, total cholesterol, homocysteine), and demographic (age, gender, education, APOE*E4 status) data. The outcome measure was Alzheimer's disease. Eleven proteins met our criteria and were utilized for the biomarker risk score. The random forest (RF) biomarker risk score from the TARC serum samples (training set) yielded adequate accuracy in the ADNI plasma sample (test set) (AUC = 0.70, sensitivity (SN) = 0.54 and specificity (SP) = 0.78), which was below that obtained from ADNI cerebrospinal fluid (CSF) analyses (t-tau/Aβ ratio AUC = 0.92). However, the full algorithm yielded excellent accuracy (AUC = 0.88, SN = 0.75, and SP = 0.91). In the ADNI cohort, the likelihood ratio of having AD given a positive test finding (LR+) was 7.03 (SE = 1.17; 95% CI = 4.49-14.47), the likelihood ratio of not having AD based on the algorithm (LR-) was 3.55 (SE = 1.15; 95% CI = 2.22-5.71), and the odds ratio of AD (OR) was 28.70 (SE = 1.55; 95% CI = 11.86-69.47). It is possible to create a blood-based screening algorithm that works across both serum and plasma and provides screening accuracy comparable to that obtained from CSF analyses.
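
    For readers unfamiliar with the screening statistics quoted above, the snippet below computes sensitivity, specificity and the conventional likelihood ratios from a 2x2 confusion matrix; the counts are invented for illustration and are not the TARC/ADNI data (note the paper reports its LR- on a different convention than the textbook (1-SN)/SP used here).

        def screening_metrics(tp, fn, fp, tn):
            """Standard screening statistics from a 2x2 confusion matrix."""
            sensitivity = tp / (tp + fn)                # P(test+ | disease)
            specificity = tn / (tn + fp)                # P(test- | no disease)
            lr_pos = sensitivity / (1 - specificity)    # positive result raises odds by this factor
            lr_neg = (1 - sensitivity) / specificity    # negative result lowers odds by this factor
            return sensitivity, specificity, lr_pos, lr_neg

        # Invented counts for illustration only.
        sn, sp, lrp, lrn = screening_metrics(tp=84, fn=28, fp=5, tn=47)
        print(f"SN={sn:.2f} SP={sp:.2f} LR+={lrp:.2f} LR-={lrn:.2f}")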

  1. Multiple scattering modeling pipeline for spectroscopy and photometry of airless Solar System objects

    NASA Astrophysics Data System (ADS)

    Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri

    2017-10-01

    We combine numerical tools to analyze the reflectance spectra of granular materials. Our motivation comes from the lack of tools for modeling intimate mixing of materials and space-weathering effects with nano- or micron-sized inclusions. The current practice is to apply semi-physical models such as the Hapke models (e.g., Icarus 195, 2008). These are expressed in closed form, so they are fast to apply. The problem is that the validity of the model is not guaranteed, and the derived properties related to particle scattering can be unrealistic (JQSRT 113, 2012). Our pipeline consists of individual scattering simulation codes and a main program that chains them together. The chain for analyzing a macroscopic target with space-weathered mineral proceeds as follows: (1) the scattering properties of small inclusions inside a host matrix are derived using exact Maxwell equation solvers; from these, the so-called incoherent fields and Mueller matrices are used as input for the next step; (2) scattering by a regolith grain is solved using a geometrical optics method with surface reflections, internal absorption, and internal diffuse scattering; (3) a radiative transfer simulation is executed with the regolith grains from the previous step as the scatterers in a macroscopic planar volume element. For the most realistic asteroid reflectance model, the chain would produce the properties of a planar surface element. Then, a shadowing simulation over the surface elements would be considered, and finally the asteroid phase function would be solved by integrating the bidirectional reflectance distribution function of the planar element over the object's realistic shape model. The tools in the proposed chain already exist, and the practical task for us is to tie them together into an easy-to-use public pipeline. We plan to open the pipeline as a web-based open service on a dedicated server, using the Django application server and a Python environment for the main functionality. The individual programs run within the chain can still be programmed in Fortran, C, or other languages. We acknowledge the ERC AdG No. 320773 'SAEMPL' and the computational resources provided by CSC - IT Center for Science Ltd., Finland.

  2. Using Visual Simulation Tools And Learning Outcomes-Based Curriculum To Help Transportation Engineering Students And Practitioners To Better Understand And Design Traffic Signal Control Systems

    DOT National Transportation Integrated Search

    2012-06-01

    The use of visual simulation tools to convey complex concepts has become a useful technique in education as well as in research. This report describes a project that developed curriculum and visualization tools to train transportation engineering studen...

  3. WFIRST: Data/Instrument Simulation Support at IPAC

    NASA Astrophysics Data System (ADS)

    Laine, Seppo; Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin

    2018-01-01

    As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies using WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides access for the science community to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in CGI surveys) and the Galaxy Survey Exposure Time Calculator. In the future the repository will provide an interface for users to run custom simulations of a wide range of coronagraph instrument (CGI) observations and sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.

  4. Simulation of DKIST solar adaptive optics system

    NASA Astrophysics Data System (ADS)

    Marino, Jose; Carlisle, Elizabeth; Schmidt, Dirk

    2016-07-01

    Solar adaptive optics (AO) simulations are a valuable tool to guide the design and optimization process of current and future solar AO and multi-conjugate AO (MCAO) systems. Solar AO and MCAO systems rely on extended-object cross-correlating Shack-Hartmann wavefront sensors to measure the wavefront. Accurate solar AO simulations require computationally intensive operations, which until recently presented a prohibitive computational cost. We present an update on the status of a solar AO and MCAO simulation tool being developed at the National Solar Observatory. The simulation tool is a multi-threaded application written in C++ that takes advantage of current large multi-core CPU systems and fast ethernet connections to provide accurate full simulation of solar AO and MCAO systems. It interfaces with KAOS, a state-of-the-art solar AO control software developed by the Kiepenheuer-Institut für Sonnenphysik, which provides reliable AO control. We report on the latest results produced by the solar AO simulation tool.
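
    Solar Shack-Hartmann sensors estimate local wavefront slopes by cross-correlating each subaperture image of granulation against a reference image. The sketch below shows that measurement step in its simplest form (FFT-based correlation with a whole-pixel peak search); the images are synthetic stand-ins, and real sensors add subpixel interpolation.

        import numpy as np

        def subaperture_shift(image, reference):
            """Estimate the (dy, dx) shift of image relative to reference via
            FFT-based cross-correlation, to whole-pixel accuracy."""
            f = np.fft.fft2(image) * np.conj(np.fft.fft2(reference))
            corr = np.fft.ifft2(f).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # Wrap indices above N/2 around to negative shifts.
            return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

        rng = np.random.default_rng(1)
        ref = rng.random((32, 32))
        shifted = np.roll(ref, (3, -2), axis=(0, 1))   # synthetic (3, -2) pixel shift
        print(subaperture_shift(shifted, ref))          # expected: (3, -2)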

  5. X-ray system simulation software tools for radiology and radiography education.

    PubMed

    Kengyelics, Stephen M; Treadgold, Laura A; Davies, Andrew G

    2018-02-01

    To develop x-ray simulation software tools to support delivery of radiological science education for a range of learning environments and audiences including individual study, lectures, and tutorials. Two software tools were developed; one simulated x-ray production for a simple two dimensional radiographic system geometry comprising an x-ray source, beam filter, test object and detector. The other simulated the acquisition and display of two dimensional radiographic images of complex three dimensional objects using a ray casting algorithm through three dimensional mesh objects. Both tools were intended to be simple to use, produce results accurate enough to be useful for educational purposes, and have an acceptable simulation time on modest computer hardware. The radiographic factors and acquisition geometry could be altered in both tools via their graphical user interfaces. A comparison of radiographic contrast measurements of the simulators to a real system was performed. The contrast output of the simulators had excellent agreement with measured results. The software simulators were deployed to 120 computers on campus. The software tools developed are easy-to-use, clearly demonstrate important x-ray physics and imaging principles, are accessible within a standard University setting and could be used to enhance the teaching of x-ray physics to undergraduate students. Current approaches to teaching x-ray physics in radiological science lack immediacy when linking theory with practice. This method of delivery allows students to engage with the subject in an experiential learning environment. Copyright © 2017. Published by Elsevier Ltd.
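
    The core physics such simulators demonstrate is exponential attenuation: for a ray crossing materials with linear attenuation coefficients mu_i and thicknesses t_i, the transmitted intensity is I = I0 * exp(-sum(mu_i * t_i)). A minimal sketch with invented coefficients, not values from the paper's simulators:

        import math

        def transmitted_intensity(i0, segments):
            """Beer-Lambert attenuation along one ray.

            segments: list of (mu, thickness) pairs, mu in 1/cm and thickness
            in cm; the coefficients below are rough illustrative numbers."""
            return i0 * math.exp(-sum(mu * t for mu, t in segments))

        i_background = transmitted_intensity(1.0, [(0.2, 10.0)])          # soft tissue only
        i_object = transmitted_intensity(1.0, [(0.2, 9.0), (0.5, 1.0)])   # 1 cm denser object
        contrast = (i_background - i_object) / i_background
        print(f"radiographic contrast: {contrast:.3f}")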

  7. FleetDASH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singer, Mark R

    2017-09-06

    FleetDASH helps federal fleet managers maximize their use of alternative fuel. This presentation explains how the dashboard works and demonstrates the newest capabilities added to the tool. It also reviews complementary online tools available to fleet managers on the Alternative Fuel Data Center.

  8. Pilot symbol-assisted beamforming algorithms in the WCDMA reverse link

    NASA Astrophysics Data System (ADS)

    Kong, Dongkeon; Lee, Jong H.; Chun, Joohwan; Woo, Yeon Sik; Soh, Ju Won

    2001-08-01

    We present a pilot symbol-assisted beamforming algorithm and a simulation tool for smart antennas in the Wideband Code Division Multiple Access (WCDMA) reverse link. In the 3GPP WCDMA system, smart antenna technology has more room to work with than in second-generation wireless mobile systems such as IS-95, because the pilot symbols in the Dedicated Physical Control Channel (DPCCH) can be utilized. First we show a smart antenna structure and adaptation algorithms, and then we explain a low-level smart antenna implementation using Simulink and MATLAB. In the design of our smart antenna system we pay special attention to the ease of interfacing with the baseband modem; our ultimate goal is to implement a baseband smart antenna chip set that can easily be added to existing baseband WCDMA modem units.
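
    Pilot-assisted adaptation can be illustrated with a least-mean-squares (LMS) weight update, in which the known pilot symbols serve as the reference signal. This is a generic LMS beamformer sketch, not the authors' algorithm; the array size, step size, arrival angle and noise level are all invented.

        import numpy as np

        rng = np.random.default_rng(2)
        n_ant, n_sym, mu = 4, 500, 0.05

        # Known pilot symbols (QPSK) arriving from one direction, plus noise.
        pilots = (rng.choice([1, -1], n_sym) + 1j * rng.choice([1, -1], n_sym)) / np.sqrt(2)
        steering = np.exp(1j * np.pi * np.arange(n_ant) * np.sin(np.deg2rad(20.0)))
        x = np.outer(steering, pilots) + 0.1 * (rng.standard_normal((n_ant, n_sym))
                                                + 1j * rng.standard_normal((n_ant, n_sym)))

        w = np.zeros(n_ant, dtype=complex)
        for k in range(n_sym):
            y = np.vdot(w, x[:, k])            # array output: w^H x
            e = pilots[k] - y                  # error against the known pilot
            w += mu * np.conj(e) * x[:, k]     # LMS weight update

        print("beam gain toward pilot direction:", abs(np.vdot(w, steering)))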

  9. Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review

    PubMed Central

    Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.

    2009-01-01

    Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508

  10. National Institute on Aging–Alzheimer’s Association guidelines for the neuropathologic assessment of Alzheimer’s disease

    PubMed Central

    Hyman, Bradley T.; Phelps, Creighton H.; Beach, Thomas G.; Bigio, Eileen H.; Cairns, Nigel J.; Carrillo, Maria C.; Dickson, Dennis W.; Duyckaerts, Charles; Frosch, Matthew P.; Masliah, Eliezer; Mirra, Suzanne S.; Nelson, Peter T.; Schneider, Julie A.; Thal, Dietmar Rudolf; Thies, Bill; Trojanowski, John Q.; Vinters, Harry V.; Montine, Thomas J.

    2011-01-01

    The current consensus criteria for the neuropathologic diagnosis of Alzheimer’s disease (AD), known as the National Institute on Aging/Reagan Institute of the Alzheimer Association Consensus Recommendations for the Postmortem Diagnosis of AD or NIA-Reagan Criteria [1], were published in 1997 (hereafter referred to as “1997 Criteria”). Knowledge of AD and the tools used for clinical investigation of cognitive impairment and dementia have advanced substantially since then and have prompted this update on the neuropathologic assessment of AD. PMID:22265587

  11. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    ERIC Educational Resources Information Center

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…

  12. Massachusetts reservoir simulation tool—User’s manual

    USGS Publications Warehouse

    Levin, Sara B.

    2016-10-06

    The U.S. Geological Survey developed the Massachusetts Reservoir Simulation Tool to examine the effects of reservoirs on natural streamflows in Massachusetts by simulating the daily water balance of reservoirs. The simulation tool was developed to assist environmental managers in better managing water withdrawals from reservoirs and preserving downstream aquatic habitats.
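
    The daily water balance such a tool simulates reduces to a bookkeeping recurrence: tomorrow's storage equals today's storage plus inflow, minus withdrawals and evaporation, with any excess over capacity spilled downstream. A minimal generic sketch with invented numbers, not the USGS tool's actual formulation:

        def simulate_reservoir(storage0, capacity, inflows, withdrawal, evaporation):
            """Daily reservoir water balance; all volumes in the same units."""
            storage, spills = storage0, []
            for q_in in inflows:
                storage = storage + q_in - withdrawal - evaporation
                storage = max(storage, 0.0)             # cannot go below empty
                spill = max(storage - capacity, 0.0)    # excess passes downstream
                storage -= spill
                spills.append(spill)
            return storage, spills

        final, spills = simulate_reservoir(storage0=800.0, capacity=1000.0,
                                           inflows=[50, 300, 30, 5, 0],
                                           withdrawal=20.0, evaporation=3.0)
        print(final, spills)   # spill occurs only on the high-inflow days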

  13. Airborne Turbulence Detection System Certification Tool Set

    NASA Technical Reports Server (NTRS)

    Hamilton, David W.; Proctor, Fred H.

    2006-01-01

    A methodology and a corresponding set of simulation tools for testing and evaluating turbulence detection sensors has been presented. The tool set is available to industry and the FAA for certification of radar based airborne turbulence detection systems. The tool set consists of simulated data sets representing convectively induced turbulence, an airborne radar simulation system, hazard tables to convert the radar observable to an aircraft load, documentation, a hazard metric "truth" algorithm, and criteria for scoring the predictions. Analysis indicates that flight test data supports spatial buffers for scoring detections. Also, flight data and demonstrations with the tool set suggest the need for a magnitude buffer.

  14. Scalability enhancement of AODV using local link repairing

    NASA Astrophysics Data System (ADS)

    Jain, Jyoti; Gupta, Roopam; Bandhopadhyay, T. K.

    2014-09-01

    Dynamic changes in the topology of an ad hoc network make it difficult to design an efficient routing protocol. Scalability of an ad hoc network is also one of the important criteria of research in this field. Most research work on ad hoc networks focuses on routing and medium access protocols and produces simulation results for limited-size networks. Ad hoc on-demand distance vector (AODV) is one of the best reactive routing protocols. In this article, modified routing protocols based on local link repairing of AODV are proposed. A method of finding alternate routes to the next-to-next node in case of link failure is proposed. These protocols are beacon-less, meaning the periodic hello message is removed from basic AODV to improve scalability. A few control packet formats have been changed to accommodate the suggested modifications. The proposed protocols are simulated to investigate scalability performance and compared with the basic AODV protocol. From the simulation results, it is clear that the scalability performance of the routing protocol improves because of the link repairing method. We have tested the protocols for different terrain areas with approximately constant node densities and different traffic loads.

  15. Using force-based adaptive resolution simulations to calculate solvation free energies of amino acid sidechain analogues

    NASA Astrophysics Data System (ADS)

    Fiorentini, Raffaele; Kremer, Kurt; Potestio, Raffaello; Fogarty, Aoife C.

    2017-06-01

    The calculation of free energy differences is a crucial step in the characterization and understanding of the physical properties of biological molecules. In the development of efficient methods to compute these quantities, a promising strategy is that of employing a dual-resolution representation of the solvent, specifically using an accurate model in the proximity of a molecule of interest and a simplified description elsewhere. One such concurrent multi-resolution simulation method is the Adaptive Resolution Scheme (AdResS), in which particles smoothly change their resolution on-the-fly as they move between different subregions. Before using this approach in the context of free energy calculations, however, it is necessary to make sure that the dual-resolution treatment of the solvent does not cause undesired effects on the computed quantities. Here, we show how AdResS can be used to calculate solvation free energies of small polar solutes using Thermodynamic Integration (TI). We discuss how the potential-energy-based TI approach combines with the force-based AdResS methodology, in which no global Hamiltonian is defined. The AdResS free energy values agree with those calculated from fully atomistic simulations to within a fraction of kBT. This is true even for small atomistic regions whose size is on the order of the correlation length, or when the properties of the coarse-grained region are extremely different from those of the atomistic region. These accurate free energy calculations are possible because AdResS allows the sampling of solvation shell configurations which are equivalent to those of fully atomistic simulations. The results of the present work thus demonstrate the viability of the use of adaptive resolution simulation methods to perform free energy calculations and pave the way for large-scale applications where a substantial computational gain can be attained.
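
    Thermodynamic integration estimates a free energy difference by integrating the ensemble average of dH/dλ over the coupling parameter, ΔF = ∫₀¹ ⟨∂H/∂λ⟩_λ dλ. The post-processing step reduces to numerical quadrature over per-window averages, as in the sketch below; the λ grid and ⟨dH/dλ⟩ values are invented stand-ins for simulation output, and this generic recipe is not specific to the AdResS setup.

        import numpy as np

        # λ-window values and the corresponding averaged <dH/dλ> from simulation
        # (invented numbers standing in for per-window output, in kJ/mol).
        lam = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
        dhdl = np.array([-62.1, -48.3, -30.9, -14.2, -3.8])

        # Trapezoidal-rule estimate of the TI integral.
        delta_f = 0.5 * np.sum((dhdl[1:] + dhdl[:-1]) * np.diff(lam))
        print(f"free energy estimate: {delta_f:.1f} kJ/mol")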

  16. Air Traffic Management Technology Demonstration Phase 1 (ATD-1) Interval Management for Near-Term Operations Validation of Acceptability (IM-NOVA) Experiment

    NASA Technical Reports Server (NTRS)

    Kibler, Jennifer L.; Wilson, Sara R.; Hubbs, Clay E.; Smail, James W.

    2015-01-01

    The Interval Management for Near-term Operations Validation of Acceptability (IM-NOVA) experiment was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) in support of the NASA Airspace Systems Program's Air Traffic Management Technology Demonstration-1 (ATD-1). ATD-1 is intended to showcase an integrated set of technologies that provide an efficient arrival solution for managing aircraft using Next Generation Air Transportation System (NextGen) surveillance, navigation, procedures, and automation for both airborne and ground-based systems. The goal of the IM-NOVA experiment was to assess whether procedures outlined by the ATD-1 Concept of Operations were acceptable to and feasible for use by flight crews in a voice communications environment when used with a minimum set of Flight Deck-based Interval Management (FIM) equipment and a prototype crew interface. To investigate an integrated arrival solution using ground-based air traffic control tools and aircraft Automatic Dependent Surveillance-Broadcast (ADS-B) tools, the LaRC FIM system and the Traffic Management Advisor with Terminal Metering and Controller Managed Spacing tools developed at the NASA Ames Research Center (ARC) were integrated into LaRC's Air Traffic Operations Laboratory (ATOL). Data were collected from 10 crews of current 757/767 pilots asked to fly a high-fidelity, fixed-base simulator during scenarios conducted within an airspace environment modeled on the Dallas-Fort Worth (DFW) Terminal Radar Approach Control area. The aircraft simulator was equipped with the Airborne Spacing for Terminal Area Routes (ASTAR) algorithm and a FIM crew interface consisting of electronic flight bags and ADS-B guidance displays. Researchers used "pseudo-pilot" stations to control 24 simulated aircraft that provided multiple air traffic flows into DFW International Airport, and recently retired DFW air traffic controllers served as confederate Center, Feeder, Final, and Tower controllers. Analyses of qualitative data revealed that the procedures used by flight crews to receive and execute interval management (IM) clearances in a voice communications environment were logical, easy to follow, did not contain any missing or extraneous steps, and required an acceptable workload level. The majority of the pilot participants found the IM concept, in addition to the proposed FIM crew procedures, to be acceptable and indicated that the ATD-1 procedures could be successfully executed in a near-term NextGen environment. Analyses of quantitative data revealed that the proposed procedures were feasible for use by flight crews in a voice communications environment. The delivery accuracy at the achieve-by point was within +/-5 sec, and the delivery precision was less than 5 sec. Furthermore, FIM speed commands occurred at a rate of less than one per minute, and pilots found the frequency of the speed commands to be acceptable at all times throughout the experiment scenarios.

  17. MedAd-AppQ: A quality assessment tool for medication adherence apps on iOS and android platforms.

    PubMed

    Ali, Eskinder Eshetu; Teo, Amanda Kai Sin; Goh, Sherlyn Xue Lin; Chew, Lita; Yap, Kevin Yi-Lwern

    2018-02-02

    With the recent proliferation of smartphone medication adherence applications (apps), it is increasingly difficult for patients and clinicians to identify the most useful app. Our aims were to develop a quality assessment tool for medication adherence apps and to evaluate the quality of such apps from the major app stores. In this study, a Medication Adherence App Quality assessment tool (MedAd-AppQ) was developed, and two evaluators independently assessed apps that fulfilled the following criteria: availability in English, at least a medication reminder feature, non-specificity to particular disease conditions (generic apps), freedom from technical malfunctions and availability on both the iPhone Operating System (iOS) and Android platforms. Descriptive statistics, the Mann-Whitney U test, Pearson product-moment correlation and Spearman rank-order correlation were used for statistical analysis. MedAd-AppQ was designed to have 24 items (total 43 points) categorized under three sections: content reliability (11 points), feature usefulness (29 points) and feature convenience (3 points). The three sections of MedAd-AppQ were found to have inter-rater correlation coefficients of 0.801 (p-value < .001) or higher. Based on analysis of 52 apps (27 iOS and 25 Android), quality scores ranged between 7/43 (16.3%) and 28/43 (65.1%). There was no significant difference between the quality scores of the Android and iOS versions. None of the apps had features for self-management of side effects. Only two apps in each platform provided disease-related and/or medication information. MedAd-AppQ can be used to reliably assess the quality of adherence apps. Clinicians can use the tool in selecting apps for use by patients. Developers of adherence apps should consider features that provide therapy-related information and help patients manage their medications and side effects. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. LibKiSAO: a Java library for Querying KiSAO.

    PubMed

    Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas

    2012-09-24

    The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics and about algorithm parameters incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically, an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library to enable querying of KiSAO. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve the reproducibility of computational simulation tasks and facilitate model re-use.

  19. Using Simulation to Teach About Poverty in Nursing Education: A Review of Available Tools.

    PubMed

    Reid, Carol A; Evanson, Tracy A

    2016-01-01

    Poverty is one of the most significant social determinants of health, and as such, it is imperative that nurses have an understanding of the impact that living in poverty has upon one's life and health. A lack of such understanding will impede nurses from providing care that is patient centered, treats all patients fairly, and advocates for social justice. It is essential that nursing educators assure that poverty-related content and effective teaching strategies are used in nursing curricula in order to help students develop this understanding. Several poverty-simulation tools are available and may be able to assist with development of accurate knowledge, skills, and attitudes. Unfortunately, little evidence exists to evaluate most poverty simulation tools. This article will provide an introduction to several poverty-related simulation tools, discuss any related research that evaluates their effectiveness, and make recommendations for integration of such simulation tools into nursing curricula. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Static tool influence function for fabrication simulation of hexagonal mirror segments for extremely large telescopes.

    PubMed

    Kim, Dae Wook; Kim, Sug-Whan

    2005-02-07

    We present a novel simulation technique that offers efficient mass fabrication strategies for 2 m class hexagonal mirror segments of extremely large telescopes. As the first of two studies in a series, we establish the theoretical basis of the tool influence function (TIF) for precessing-tool polishing simulation of non-rotating workpieces. These theoretical TIFs were then used to confirm the reproducibility of the material removal footprints (measured TIFs) of the bulged precessing tooling reported elsewhere. This is followed by a reverse-computation technique that traces the real polishing pressure from the empirical TIF, employing the simplex search method. The technical details, together with the results and implications described here, provide the theoretical material removal tool essential to the successful polishing simulation that will be reported in the second study.
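
    The reverse computation described above is a fitting problem: find the pressure parameter(s) that make a modeled TIF match the measured one, using the simplex (Nelder-Mead) search. A schematic sketch with a deliberately simplified one-parameter removal model; the Gaussian TIF shape and all constants are invented stand-ins for the paper's physical model.

        import numpy as np
        from scipy.optimize import minimize

        r = np.linspace(-1.0, 1.0, 101)             # normalized tool radius
        measured_tif = 0.8 * np.exp(-r**2 / 0.1)    # stand-in for a measured footprint

        def model_tif(pressure):
            # Simplified Preston-like model: removal proportional to pressure.
            return pressure * np.exp(-r**2 / 0.1)

        def misfit(params):
            return np.sum((model_tif(params[0]) - measured_tif) ** 2)

        fit = minimize(misfit, x0=[0.1], method="Nelder-Mead")
        print("recovered pressure parameter:", fit.x[0])   # ~0.8 by construction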

  1. doGlycans–Tools for Preparing Carbohydrate Structures for Atomistic Simulations of Glycoproteins, Glycolipids, and Carbohydrate Polymers for GROMACS

    PubMed Central

    2017-01-01

    Carbohydrates constitute a structurally and functionally diverse group of biological molecules and macromolecules. In cells they are involved in, e.g., energy storage, signaling, and cell–cell recognition. All of these phenomena take place in atomistic scales, thus atomistic simulation would be the method of choice to explore how carbohydrates function. However, the progress in the field is limited by the lack of appropriate tools for preparing carbohydrate structures and related topology files for the simulation models. Here we present tools that fill this gap. Applications where the tools discussed in this paper are particularly useful include, among others, the preparation of structures for glycolipids, nanocellulose, and glycans linked to glycoproteins. The molecular structures and simulation files generated by the tools are compatible with GROMACS. PMID:28906114

  2. Development of a Novel Rabies Simulation Model for Application in a Non-endemic Environment

    PubMed Central

    Dürr, Salome; Ward, Michael P.

    2015-01-01

    Domestic dog rabies is an endemic disease in large parts of the developing world and is also epidemic in previously free regions. For example, it continues to spread in eastern Indonesia and currently threatens adjacent rabies-free regions with high densities of free-roaming dogs, including remote northern Australia. Mathematical and simulation disease models are useful tools to provide insights on the most effective control strategies and to inform policy decisions. Existing rabies models typically focus on long-term control programs in endemic countries. However, simulation models describing the dog rabies incursion scenario in regions where rabies is still exotic are lacking. Here we describe such a stochastic, spatially explicit rabies simulation model that is based on individual dog information collected in two remote regions in northern Australia. Illustrative simulations produced plausible results with epidemic characteristics expected for rabies outbreaks in disease-free regions (a mean R0 of 1.7, epidemic peak 97 days post-incursion, and vaccination as the most effective response strategy). Systematic sensitivity analysis identified that model outcomes were most sensitive to seven of the 30 model parameters tested. This model is suitable for exploring rabies spread and control before an incursion in populations of largely free-roaming dogs that live close together with their owners. It can be used for ad-hoc contingency or response planning prior to and shortly after an incursion of dog rabies in previously free regions. One challenge that remains is model parameterisation, particularly how dogs' roaming, contact, and biting behaviours change following a rabies incursion in a previously rabies-free population. PMID:26114762
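
    The flavor of a stochastic outbreak model can be conveyed with a toy branching process: each infectious dog infects a Poisson-distributed number of others with mean R0, after a fixed serial interval. This is only an illustrative caricature, far simpler than the spatially explicit, individual-based model described above; every parameter below is invented.

        import numpy as np

        def toy_outbreak(r0=1.7, serial_interval=25, max_days=365, seed=0):
            """Branching-process caricature of an incursion (daily case counts)."""
            rng = np.random.default_rng(seed)
            cases = np.zeros(max_days, int)
            cases[0] = 1                                  # single incursion case
            for day in range(max_days - serial_interval):
                secondary = rng.poisson(r0 * cases[day])  # offspring of that day's cases
                cases[day + serial_interval] += secondary
            return cases

        cases = toy_outbreak()
        print("epidemic peak on day", int(cases.argmax()))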

  3. Autism detection in early childhood (ADEC): reliability and validity data for a Level 2 screening tool for autistic disorder.

    PubMed

    Nah, Yong-Hwee; Young, Robyn L; Brewer, Neil; Berlingeri, Genna

    2014-03-01

    The Autism Detection in Early Childhood (ADEC; Young, 2007) was developed as a Level 2 clinician-administered autistic disorder (AD) screening tool that was time-efficient, suitable for children under 3 years, easy to administer, and suitable for persons with minimal training and experience with AD. A best-estimate clinical Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; DSM-IV-TR; American Psychiatric Association, 2000) diagnosis of AD was made for 70 children using all available information and assessment results, except for the ADEC data. A screening study compared these children on the ADEC with 57 children with other developmental disorders and 64 typically developing children. Results indicated high internal consistency (α = .91). Interrater reliability and test-retest reliability of the ADEC were also adequate. ADEC scores reliably discriminated different diagnostic groups after controlling for nonverbal IQ and Vineland Adaptive Behavior Composite scores. Construct validity (using exploratory factor analysis) and concurrent validity using performance on the Autism Diagnostic Observation Schedule (Lord et al., 2000), the Autism Diagnostic Interview-Revised (Le Couteur, Lord, & Rutter, 2003), and DSM-IV-TR criteria were also demonstrated. Signal detection analysis identified the optimal ADEC cutoff score, with the ADEC identifying all children who had an AD (N = 70, sensitivity = 1.0) but overincluding children with other disabilities (N = 13, specificity ranging from .74 to .90). Together, the reliability and validity data indicate that the ADEC has potential to be established as a suitable and efficient screening tool for infants with AD.

  4. Short-term modern life-like stress exacerbates Aβ-pathology and synapse loss in 3xTg-AD mice.

    PubMed

    Baglietto-Vargas, David; Chen, Yuncai; Suh, Dongjin; Ager, Rahasson R; Rodriguez-Ortiz, Carlos J; Medeiros, Rodrigo; Myczek, Kristoffer; Green, Kim N; Baram, Tallie Z; LaFerla, Frank M

    2015-09-01

    Alzheimer's disease (AD) is a progressive neurological disorder that impairs memory and other cognitive functions in the elderly. The social and financial impacts of AD are overwhelming and are escalating exponentially as a result of population aging. Therefore, identifying AD-related risk factors and developing more efficacious therapeutic approaches are critical to curing this neurological disorder. Current epidemiological evidence indicates that life experiences, including chronic stress, are a risk for AD. However, it is unknown if short-term stress, lasting for hours, influences the onset or progression of AD. Here, we determined the effect of short-term, multi-modal 'modern life-like' stress on AD pathogenesis and synaptic plasticity in mice bearing three AD mutations (the 3xTg-AD mouse model). We found that combined emotional and physical stress lasting 5 h severely impaired memory in wild-type mice and tended to impact it in already low-performing 3xTg-AD mice. This stress reduced the number of synapse-bearing dendritic spines in 3xTg-AD mice and increased Aβ levels by augmenting AβPP processing. Thus, short-term stress simulating modern-life conditions may exacerbate cognitive deficits in preclinical AD by accelerating amyloid pathology and reducing synapse numbers. © 2015 International Society for Neurochemistry.

  5. Mapping Ad Hoc Communications Network of a Large Number Fixed-Wing UAV Swarm

    DTIC Science & Technology

    2017-03-01

    ...partitioned sub-swarms. The work covered in this thesis is to build a model of the NPS swarm's communication network in ns-3 simulation software and use...

  6. GAPR2: A DTN Routing Protocol for Communications in Challenged, Degraded, and Denied Environments

    DTIC Science & Technology

    2015-09-01


  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, Bennett

    The Arizona Commerce Authority (ACA) conducted an Innovation in Advanced Manufacturing Grant Competition to support and grow southern and central Arizona's Aerospace and Defense (A&D) industry and its supply chain. The problem statement for this grant challenge was that many A&D machining processes utilize older-generation CNC machine tool technologies that can result in an inefficient use of resources (energy, time and materials) compared to the latest state-of-the-art CNC machines. Competitive awards funded projects to develop innovative new tools and technologies that reduce energy consumption for older-generation machine tools and foster working relationships between industry small to medium-sized manufacturing enterprises and third-party solution providers. During the 42-month term of this grant, 12 competitive awards were made. Final reports have been included with this submission.

  8. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.

    PubMed

    Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego

    2017-09-22

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments able to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.
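
    The synchronization problem that makes co-simulation hard can be illustrated with a minimal lockstep coordinator: two domain simulators advance in fixed time steps and exchange coupled state at each step boundary. This is a generic sketch, not the paper's architecture; the two toy simulators and their coupling are invented.

        class CyberSim:
            """Toy 'cyber' domain: a controller reacting to the sensed value."""
            def __init__(self):
                self.command = 0.0
            def step(self, sensed, dt):
                self.command = -0.5 * sensed          # proportional control law

        class PhysicalSim:
            """Toy 'physical' domain: first-order dynamics driven by the command."""
            def __init__(self):
                self.state = 1.0
            def step(self, command, dt):
                self.state += dt * (command + 0.1)    # simple Euler integration

        def cosimulate(steps=50, dt=0.1):
            cyber, phys = CyberSim(), PhysicalSim()
            for _ in range(steps):
                cyber.step(phys.state, dt)            # exchange state at the boundary...
                phys.step(cyber.command, dt)          # ...then advance the other domain
            return phys.state

        print(cosimulate())   # settles toward the coupled equilibrium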

  9. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators

    PubMed Central

    Sánchez-Picot, Álvaro

    2017-01-01

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. Mainly, the cause of the lack of co-simulation instruments to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users’ needs and requirements and various additional factors such as the development team’s experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal. PMID:28937610

  10. The effects of malicious nodes on performance of mobile ad hoc networks

    NASA Astrophysics Data System (ADS)

    Li, Fanzhi; Shi, Xiyu; Jassim, Sabah; Adams, Christopher

    2006-05-01

    Wireless ad hoc networking offers convenient infrastructureless communication over the shared wireless channel. However, the nature of ad hoc networks makes them vulnerable to security attacks. Unlike their wired counterpart, infrastructureless ad hoc networks do not have a clear line of defense, their topology is dynamically changing, and every mobile node can receive messages from its neighbors and can be contacted by all other nodes in its neighborhood. This poses a great danger to network security if some nodes behave in a malicious manner. The immediate concern about the security in this type of networks is how to protect the network and the individual mobile nodes against malicious act of rogue nodes from within the network. This paper is concerned with security aspects of wireless ad hoc networks. We shall present results of simulation experiments on ad hoc network's performance in the presence of malicious nodes. We shall investigate two types of attacks and the consequences will be simulated and quantified in terms of loss of packets and other factors. The results show that network performance, in terms of successful packet delivery ratios, significantly deteriorates when malicious nodes act according to the defined misbehaving characteristics.
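    To make the packet-loss mechanism concrete, the following minimal Python sketch (not the authors' simulator) models malicious nodes as black holes that silently drop every packet they are asked to forward; the node counts, hop count, and the uniform-random forwarding assumption are all illustrative.

      import random

      def delivery_ratio(n_packets, n_nodes, n_malicious, hops_per_packet=4, seed=1):
          """Estimate packet delivery ratio when malicious nodes silently drop
          every packet they forward (a simple black-hole model). Assumes each
          hop is handled by a node drawn uniformly at random -- a deliberate
          simplification, not the paper's simulation setup."""
          rng = random.Random(seed)
          delivered = 0
          for _ in range(n_packets):
              # A packet survives only if no intermediate hop is malicious.
              if all(rng.randrange(n_nodes) >= n_malicious for _ in range(hops_per_packet)):
                  delivered += 1
          return delivered / n_packets

      # Delivery ratio degrades quickly as the share of malicious nodes grows.
      for bad in (0, 5, 10, 20):
          print(bad, "malicious of 50 nodes ->", delivery_ratio(10_000, 50, bad))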

  11. Potential effectiveness of anti-smoking advertisement types in ten low and middle income countries: do demographics, smoking characteristics and cultural differences matter?

    PubMed

    Durkin, Sarah; Bayly, Megan; Cotter, Trish; Mullin, Sandra; Wakefield, Melanie

    2013-12-01

    Unlike high income countries, there is limited research to guide selection of anti-tobacco mass media campaigns in low and middle income countries, although some work suggests that messages emphasizing serious health harms perform better than other message types. This study aimed to determine whether certain types of anti-smoking advertisements are more likely to be accepted and perceived as effective across smokers in 10 low to middle income countries. 2399 18-34 year old smokers were recruited in Bangladesh, China, Egypt, India, Indonesia, Mexico, Philippines, Russia, Turkey and Vietnam to view and rate 10 anti-tobacco ads. Five ads were shown in all countries and five ads were chosen by country representatives, providing a total of 37 anti-smoking ads across all countries (10 graphic health effects ads, 6 simulated health effects, 8 emotional stories of health effects, 7 other health effects and 6 non-health effects). Smokers rated ads on a series of 5-point scales containing aggregated measures of Message Acceptance and Perceived Effectiveness. All ads and materials were translated into the local language of the testing regions. In multivariate analysis, graphic health effects ads were most likely to be accepted and perceived as effective, followed by simulated health effects ads, health effects stories, other health effects ads, and then non-health effects ads. Interaction analyses indicated that graphic health effects ads were less likely to differ in acceptance or perceived effectiveness across countries, gender, age, education, parental status and amount smoked, and were less likely to be affected by cultural differences between characters and contexts in ads and those within each country. Ads that did not emphasize the health effects of smoking were most prone to inconsistent impact across countries and population subgroups. Graphic ads about the negative health effects of smoking may be most suitable for wide population broadcast in low and middle income countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    NASA Astrophysics Data System (ADS)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

    Improving the resolution of tomographic images is crucial to answering important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are two main choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL was adopted mainly for AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and generate optimized source code for both the CUDA and OpenCL languages, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usages.
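    The single-source, multi-target idea can be illustrated in a few lines of Python. This is not BOAST (a separate meta-programming framework); it merely shows how one kernel description can be emitted as either CUDA or OpenCL source, which is the essence of what the abstract describes.

      # Minimal sketch of single-source kernel generation for CUDA and OpenCL,
      # in the spirit of (but far simpler than) the BOAST tool named above.

      KERNEL_BODY = "c[i] = a[i] + b[i];"

      def generate_kernel(target: str, name: str = "vec_add") -> str:
          """Emit the same element-wise kernel as CUDA or OpenCL source."""
          if target == "cuda":
              head = f"__global__ void {name}(const float* a, const float* b, float* c, int n)"
              index = "int i = blockIdx.x * blockDim.x + threadIdx.x;"
          elif target == "opencl":
              head = (f"__kernel void {name}(__global const float* a, "
                      "__global const float* b, __global float* c, int n)")
              index = "int i = get_global_id(0);"
          else:
              raise ValueError(f"unknown target: {target}")
          return f"{head} {{\n    {index}\n    if (i < n) {KERNEL_BODY}\n}}\n"

      print(generate_kernel("cuda"))
      print(generate_kernel("opencl"))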

  13. Mathematical simulation of the diel O, S, and C biogeochemistry of a hypersaline microbial mat

    NASA Astrophysics Data System (ADS)

    Decker, K.; Potter, C.

    2003-12-01

    The creation of a mathematical simulation model of photosynthetic microbial mats is an important step in our understanding of key biogeochemical cycles that may have altered the atmospheres of early Earth and of other terrestrial planets. A modeling investigation is presented here as a tool to utilize and integrate empirical results from research on hypersaline mats from Baja California, Mexico into a computational system that can be used to simulate biospheric inputs of trace gases to the atmosphere. An early version of our model calculates fluxes and cycling of oxygen, sulfide, and dissolved inorganic carbon (DIC) via abiotic components and via the major bacterial guilds: cyanobacteria (CYA), sulfate-reducing bacteria (SRB), purple sulfur bacteria (PSB) and colorless sulfur bacteria (CSB). We used generalized Monod-type equations that incorporate substrate and energy limits upon maximum rates of metabolic processes such as photosynthesis and sulfate reduction. We ran a simulation using temperature and irradiance inputs from data collected from a microbial mat in Guerrero Negro in Baja California, Mexico. Model oxygen, sulfide, and DIC results compared well with data collected in the field mats. A divergence from the field data was an initial large negative DIC flux early in the morning and little flux into the mat thereafter in the simulation. We hypothesize that this divergence will be reduced or eliminated if the salinity of the water surrounding the mat were used as an environmental input and as a limit on photosynthesis rates. Salinity levels, organic carbon, methane, methanogens and green nonsulfur bacteria will be added to this model before it is incorporated into a global model to simulate geological time scales.
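    A generalized Monod-type rate law of the kind the abstract mentions can be sketched as follows; the functional form is standard, but every parameter value here is illustrative rather than taken from the paper.

      def monod(conc, k_half):
          """Monod saturation term: approaches 1 as the substrate becomes plentiful."""
          return conc / (k_half + conc)

      def photosynthesis_rate(v_max, irradiance, dic, k_light, k_dic):
          """Generalized Monod-type rate: the maximum rate v_max is throttled by
          both light and dissolved inorganic carbon (DIC) availability.
          All parameter values below are illustrative, not from the paper."""
          return v_max * monod(irradiance, k_light) * monod(dic, k_dic)

      # Midday (high light) vs. dawn (low light), same DIC pool:
      print(photosynthesis_rate(v_max=1.0, irradiance=1500.0, dic=2.0, k_light=200.0, k_dic=0.5))
      print(photosynthesis_rate(v_max=1.0, irradiance=50.0, dic=2.0, k_light=200.0, k_dic=0.5))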

  14. Fast 2D fluid-analytical simulation of ion energy distributions and electromagnetic effects in multi-frequency capacitive discharges

    NASA Astrophysics Data System (ADS)

    Kawamura, E.; Lieberman, M. A.; Graves, D. B.

    2014-12-01

    A fast 2D axisymmetric fluid-analytical plasma reactor model using the finite elements simulation tool COMSOL is interfaced with a 1D particle-in-cell (PIC) code to study ion energy distributions (IEDs) in multi-frequency capacitive argon discharges. A bulk fluid plasma model, which solves the time-dependent plasma fluid equations for the ion continuity and electron energy balance, is coupled with an analytical sheath model, which solves for the sheath parameters. The time-independent Helmholtz equation is used to solve for the fields and a gas flow model solves for the steady-state pressure, temperature and velocity of the neutrals. The results of the fluid-analytical model are used as inputs to a PIC simulation of the sheath region of the discharge to obtain the IEDs at the target electrode. Each 2D fluid-analytical-PIC simulation on a moderate 2.2 GHz CPU workstation with 8 GB of memory took about 15-20 min. The multi-frequency 2D fluid-analytical model was compared to 1D PIC simulations of a symmetric parallel-plate discharge, showing good agreement. We also conducted fluid-analytical simulations of a multi-frequency argon capacitively coupled plasma (CCP) with a typical asymmetric reactor geometry at 2/60/162 MHz. The low frequency 2 MHz power controlled the sheath width and sheath voltage while the high frequencies controlled the plasma production. A standing wave was observable at the highest frequency of 162 MHz. We noticed that adding 2 MHz power to a 60 MHz discharge or 162 MHz to a dual frequency 2 MHz/60 MHz discharge can enhance the plasma uniformity. We found that multiple frequencies were not only useful for controlling IEDs but also plasma uniformity in CCP reactors.

  15. A Review on Regional Convection-Permitting Climate Modeling: Demonstrations, Prospects, and Challenges

    NASA Astrophysics Data System (ADS)

    Prein, A. F.; Langhans, W.; Fosser, G.; Ferrone, A.; Ban, N.; Goergen, K.; Keller, M.; Tölle, M.; Gutjahr, O.; Feser, F.; Brisson, E.; Kollet, S. J.; Schmidli, J.; Van Lipzig, N. P. M.; Leung, L. R.

    2015-12-01

    Regional climate modeling using convection-permitting models (CPMs; horizontal grid spacing <4 km) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs; horizontal grid spacing >10 km). CPMs no longer rely on convection parameterization schemes, which had been identified as a major source of errors and uncertainties in LSMs. Moreover, CPMs allow for a more accurate representation of surface and orography fields. The drawback of CPMs is the high demand on computational resources. For this reason, first CPM climate simulations only appeared a decade ago. We aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs such as physical parameterizations and dynamical formulations are discussed critically. An overview of weaknesses and an outlook on required future developments is provided. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Improvements are evident mostly for climate statistics related to deep convection, mountainous regions, or extreme events. The climate change signals of CPM simulations suggest an increase in flash floods, changes in hail storm characteristics, and reductions in the snowpack over mountains. In conclusion, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to advance parameterizations of unresolved physics and to assess the full potential of CPMs.

  16. Optimized biogas-fermentation by neural network control.

    PubMed

    Holubar, P; Zani, L; Hager, M; Fröschl, W; Radak, Z; Braun, R

    2003-01-01

    In this work several feed-forward back-propagation neural networks (FFBP) were trained in order to model, and subsequently control, methane production in anaerobic digesters. To produce data for the training of the neural nets, four anaerobic continuous stirred tank reactors (CSTR) were operated in steady-state conditions at organic loading rates (Br) of about 2 kg·m⁻³·d⁻¹ chemical oxygen demand (COD), and disturbed by pulse-like increases of the organic loading rate. For the pulses, additional carbon sources were added to the basic feed (surplus and primary sludge) to simulate cofermentation and to increase the COD. Measured parameters were: gas composition, methane production rate, volatile fatty acid concentration, pH, redox potential, volatile suspended solids and COD of feed and effluent. A hierarchical system of neural nets was developed and embedded in a Decision Support System (DSS). A 3-3-1 FFBP simulated the pH with a regression coefficient of 0.82. A 9-3-3 FFBP simulated the volatile fatty acid concentration in the sludge with a regression coefficient of 0.86. And a 9-3-2 FFBP simulated the gas production and gas composition with regression coefficients of 0.90 and 0.80, respectively. A lab-scale anaerobic CSTR controlled by this tool was able to maintain a methane concentration of about 60% at a rather high gas production rate of between 5 and 5.6 m³·m⁻³·d⁻¹.
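    As a rough illustration of the 3-3-1 feed-forward architecture described for the pH model, the Python sketch below trains a one-hidden-layer network on synthetic data and reports an R² score analogous to the paper's regression coefficients; the data and the library choice (scikit-learn) are assumptions, not the authors' setup.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # 3 inputs, one hidden layer of 3 neurons, 1 output -- a 3-3-1 FFBP.
      # Training data here is synthetic; the study used measured digester data.
      rng = np.random.default_rng(0)
      X = rng.uniform(size=(200, 3))            # e.g. loading rate, VFA, redox (scaled)
      y = 7.0 - 1.5 * X[:, 1] + 0.3 * X[:, 0]   # invented pH-like response

      net = MLPRegressor(hidden_layer_sizes=(3,), activation="logistic",
                         solver="lbfgs", max_iter=5000, random_state=0)
      net.fit(X[:150], y[:150])

      # R^2 on held-out data, analogous to the paper's reported 0.82 for pH.
      print("R^2:", net.score(X[150:], y[150:]))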

  17. A review on regional convection-permitting climate modeling: Demonstrations, prospects, and challenges.

    PubMed

    Prein, Andreas F; Langhans, Wolfgang; Fosser, Giorgia; Ferrone, Andrew; Ban, Nikolina; Goergen, Klaus; Keller, Michael; Tölle, Merja; Gutjahr, Oliver; Feser, Frauke; Brisson, Erwan; Kollet, Stefan; Schmidli, Juerg; van Lipzig, Nicole P M; Leung, Ruby

    2015-06-01

    Regional climate modeling using convection-permitting models (CPMs; horizontal grid spacing <4 km) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs; horizontal grid spacing >10 km). CPMs no longer rely on convection parameterization schemes, which had been identified as a major source of errors and uncertainties in LSMs. Moreover, CPMs allow for a more accurate representation of surface and orography fields. The drawback of CPMs is the high demand on computational resources. For this reason, first CPM climate simulations only appeared a decade ago. In this study, we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs such as physical parameterizations and dynamical formulations are discussed critically. An overview of weaknesses and an outlook on required future developments is provided. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Improvements are evident mostly for climate statistics related to deep convection, mountainous regions, or extreme events. The climate change signals of CPM simulations suggest an increase in flash floods, changes in hail storm characteristics, and reductions in the snowpack over mountains. In conclusion, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to advance parameterizations of unresolved physics and to assess the full potential of CPMs.

  18. What are the assets and weaknesses of HFO detectors? A benchmark framework based on realistic simulations

    PubMed Central

    Pizzo, Francesca; Bartolomei, Fabrice; Wendling, Fabrice; Bénar, Christian-George

    2017-01-01

    High-frequency oscillations (HFO) have been suggested as biomarkers of epileptic tissues. While visual marking of these short and small oscillations is tedious and time-consuming, automatic HFO detectors have not yet met a large consensus. Even though detectors have been shown to perform well when validated against visual marking, the large number of false detections due to their lack of robustness hinder their clinical application. In this study, we developed a validation framework based on realistic and controlled simulations to quantify precisely the assets and weaknesses of current detectors. We constructed a dictionary of synthesized elements—HFOs and epileptic spikes—from different patients and brain areas by extracting these elements from the original data using discrete wavelet transform coefficients. These elements were then added to their corresponding simulated background activity (preserving patient- and region- specific spectra). We tested five existing detectors against this benchmark. Compared to other studies confronting detectors, we did not only ranked them according their performance but we investigated the reasons leading to these results. Our simulations, thanks to their realism and their variability, enabled us to highlight unreported issues of current detectors: (1) the lack of robust estimation of the background activity, (2) the underestimated impact of the 1/f spectrum, and (3) the inadequate criteria defining an HFO. We believe that our benchmark framework could be a valuable tool to translate HFOs into a clinical environment. PMID:28406919
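    The benchmark idea, adding known events to a realistic background so that ground truth is exact, can be sketched in a few lines of Python. Unlike the paper, which extracts real HFOs with the discrete wavelet transform, this illustration synthesizes the event; the sampling rate, event frequency, and amplitudes are arbitrary.

      import numpy as np

      fs = 2048                      # sampling rate (Hz), illustrative
      n = fs * 2                     # two seconds of background

      # 1/f-like background: shape white noise in the frequency domain.
      rng = np.random.default_rng(0)
      spectrum = np.fft.rfft(rng.standard_normal(n))
      freqs = np.fft.rfftfreq(n, d=1 / fs)
      spectrum[1:] /= freqs[1:]      # impose a 1/f amplitude roll-off
      background = np.fft.irfft(spectrum, n)
      background /= background.std()

      # Synthesized HFO: a short Gaussian-windowed oscillation at 250 Hz.
      t = np.arange(int(0.05 * fs)) / fs
      hfo = 0.5 * np.exp(-((t - 0.025) ** 2) / (2 * 0.008 ** 2)) * np.sin(2 * np.pi * 250 * t)

      signal = background.copy()
      start = fs                     # insert the event at t = 1 s
      signal[start:start + hfo.size] += hfo   # ground-truth position is known exactly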

  19. A review on regional convection-permitting climate modeling: Demonstrations, prospects, and challenges

    NASA Astrophysics Data System (ADS)

    Prein, Andreas F.; Langhans, Wolfgang; Fosser, Giorgia; Ferrone, Andrew; Ban, Nikolina; Goergen, Klaus; Keller, Michael; Tölle, Merja; Gutjahr, Oliver; Feser, Frauke; Brisson, Erwan; Kollet, Stefan; Schmidli, Juerg; van Lipzig, Nicole P. M.; Leung, Ruby

    2015-06-01

    Regional climate modeling using convection-permitting models (CPMs; horizontal grid spacing <4 km) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs; horizontal grid spacing >10 km). CPMs no longer rely on convection parameterization schemes, which had been identified as a major source of errors and uncertainties in LSMs. Moreover, CPMs allow for a more accurate representation of surface and orography fields. The drawback of CPMs is the high demand on computational resources. For this reason, first CPM climate simulations only appeared a decade ago. In this study, we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs such as physical parameterizations and dynamical formulations are discussed critically. An overview of weaknesses and an outlook on required future developments is provided. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Improvements are evident mostly for climate statistics related to deep convection, mountainous regions, or extreme events. The climate change signals of CPM simulations suggest an increase in flash floods, changes in hail storm characteristics, and reductions in the snowpack over mountains. In conclusion, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to advance parameterizations of unresolved physics and to assess the full potential of CPMs.

  20. Talk the talk and walk the walk. Evaluation of autonomy in aging and Alzheimer disease by simulating instrumental activities of daily living: the S-IADL

    PubMed Central

    Gounden, Yannick; Lacot, Emilie; Couvillers, Frédérique; Lions, Amandine; Hainselin, Mathieu

    2016-01-01

    Objective The autonomy of individuals is linked to the achievement of instrumental activities of daily living that require complex behavior. In the elderly, the assessment of autonomy is usually based on questionnaires that have strong subjective constraints. Considering this fact, we tested healthy elderly adults and Alzheimer disease patients using a new measure, the S-IADL (Simulation of Instrumental Activities for Daily Living), to assess the ability to perform activities of daily living effectively. Method The S-IADL shares many items with the well-known IADL questionnaire proposed by Lawton & Brody (1969). However, as opposed to the IADL, the assessment of autonomy is not based on the completion of a questionnaire but requires the realization or simulation of various activities of daily living. Eighty-three participants (69 healthy elderly and 14 Alzheimer disease patients) completed the IADL and performed the S-IADL assessment. Results Results revealed that, like the IADL, the S-IADL is able to identify AD patients who are likely to encounter difficulties in performing everyday activities, and no major differences were found between the IADL and the S-IADL. Conclusions We outline some advantages of preferring, in certain situations, this new tool based on simulation of activities for functional evaluation. Finally, we discuss the main limits of the S-IADL that should be investigated prior to its use by clinicians. PMID:27672491

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prein, Andreas; Langhans, Wolfgang; Fosser, Giorgia

    Regional climate modeling using convection permitting models (CPMs) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs). CPMs do not use convection parameterization schemes, known as a major source of errors and uncertainties, and have more accurate surface and orography fields. The drawback of CPMs is their high demand on computational resources. For this reason, the first CPM climate simulations only appeared a decade ago. In this study we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs, such as physical parameterizations and dynamical formulations, are discussed, and an outlook on required future developments and computer architectures that would support the application of CPMs is given. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Most improvements are found for processes related to deep convection (e.g., precipitation during summer), for mountainous regions, and for the soil-vegetation-atmosphere interactions. The climate change signals of CPM simulations reveal increases in short and extreme rainfall events and an increased ratio of liquid precipitation at the surface (a decrease of hail), potentially leading to more frequent flash floods. Concluding, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to assess their full potential and support their development.

  2. A review on regional convection-permitting climate modeling: Demonstrations, prospects, and challenges

    DOE PAGES

    Prein, Andreas; Langhans, Wolfgang; Fosser, Giorgia; ...

    2015-05-27

    Regional climate modeling using convection permitting models (CPMs) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs). CPMs do not use convection parameterization schemes, known as a major source of errors and uncertainties, and have more accurate surface and orography fields. The drawback of CPMs is their high demand on computational resources. For this reason, the first CPM climate simulations only appeared a decade ago. In this study we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs, such as physical parameterizations and dynamical formulations, are discussed, and an outlook on required future developments and computer architectures that would support the application of CPMs is given. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Most improvements are found for processes related to deep convection (e.g., precipitation during summer), for mountainous regions, and for the soil-vegetation-atmosphere interactions. The climate change signals of CPM simulations reveal increases in short and extreme rainfall events and an increased ratio of liquid precipitation at the surface (a decrease of hail), potentially leading to more frequent flash floods. Concluding, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to assess their full potential and support their development.

  3. Observatory Bibliographies as Research Tools

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Winkelman, S. L.

    2013-01-01

    Traditionally, observatory bibliographies were maintained to provide insight into how successful an observatory is, as measured by its prominence in the (refereed) literature. When we set up the bibliographic database for the Chandra X-ray Observatory (http://cxc.harvard.edu/cgi-gen/cda/bibliography) as part of the Chandra Data Archive (http://cxc.harvard.edu/cda/), very early in the mission, our objective was to make it primarily a useful tool for our user community. To achieve this we are: (1) casting a very wide net in collecting Chandra-related publications; (2) including for each literature reference in the database a wealth of metadata that is useful for the users; and (3) providing specific links between the articles and the datasets in the archive that they use. As a result our users are able to browse the literature and the data archive simultaneously. As an added bonus, the rich metadata content and data links have also allowed us to assemble more meaningful statistics about the scientific efficacy of the observatory. In all this we collaborate closely with the Astrophysics Data System (ADS). Among the plans for future enhancement are the inclusion of press releases and the Chandra image gallery, linking with ADS semantic searching tools, full-text metadata mining, and linking with other observatories' bibliographies. This work is supported by NASA contract NAS8-03060 (CXC) and depends critically on the services provided by the ADS.

  4. 78 FR 6269 - Amendment to the International Traffic in Arms Regulations: Revision of U.S. Munitions List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... remain subject to USML control are modeling or simulation tools that model or simulate the environments... USML revision process, the public is asked to provide specific examples of nuclear-related items whose...) Modeling or simulation tools that model or simulate the environments generated by nuclear detonations or...

  5. NREL: News - Advisor 2002-A Powerful Vehicle Simulation Tool Gets Better

    Science.gov Websites

    Advisor 2002 - A Powerful Vehicle Simulation Tool Gets Better. Golden, Colo., June 11, 2002. A powerful analysis is made possible by co-simulation links to Avant!'s Saber and Ansoft's SIMPLORER®. Transient air conditioning system analysis is possible by co-simulation with C&R Technologies' SINDA/FLUINT

  6. Hygrothermal Simulation: A Tool for Building Envelope Design Analysis

    Treesearch

    Samuel V. Glass; Anton TenWolde; Samuel L. Zelinka

    2013-01-01

    Is it possible to gauge the risk of moisture problems while designing the building envelope? This article provides a brief introduction to computer-based hygrothermal (heat and moisture) simulation, shows how simulation can be useful as a design tool, and points out a number of important considerations regarding model inputs and limitations. Hygrothermal simulation...

  7. A numerical tool for reproducing driver behaviour: experiments and predictive simulations.

    PubMed

    Casucci, M; Marchitto, M; Cacciabue, P C

    2010-03-01

    This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full scale simulator for validating the simulation. Then the predictive potentiality of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.

  8. Real-time micro-modelling of city evacuations

    NASA Astrophysics Data System (ADS)

    Löhner, Rainald; Haug, Eberhard; Zinggerling, Claudio; Oñate, Eugenio

    2018-01-01

    A methodology to integrate geographical information system (GIS) data with large-scale pedestrian simulations has been developed. Advances in automatic data acquisition and archiving from GIS databases, automatic input for pedestrian simulations, as well as scalable pedestrian simulation tools have made it possible to simulate pedestrians at the individual level for complete cities in real time. An example that simulates the evacuation of the city of Barcelona demonstrates that this is now possible. This is the first step towards a fully integrated crowd prediction and management tool that takes into account not only data gathered in real time from cameras, cell phones or other sensors, but also merges these with advanced simulation tools to predict the future state of the crowd.

  9. An Intelligent Crop Planning Tool for Controlled Ecological Life Support Systems

    NASA Technical Reports Server (NTRS)

    Whitaker, Laura O.; Leon, Jorge

    1996-01-01

    This paper describes a crop planning tool developed for the Controlled Ecological Life Support Systems (CELSS) project which is in the research phases at various NASA facilities. The Crop Planning Tool was developed to assist in the understanding of the long term applications of a CELSS environment. The tool consists of a crop schedule generator as well as a crop schedule simulator. The importance of crop planning tools such as the one developed is discussed. The simulator is outlined in detail while the schedule generator is touched upon briefly. The simulator consists of data inputs, plant and human models, and various other CELSS activity models such as food consumption and waste regeneration. The program inputs such as crew data and crop states are discussed. References are included for all nominal parameters used. Activities including harvesting, planting, plant respiration, and human respiration are discussed using mathematical models. Plans provided to the simulator by the plan generator are evaluated for their 'fitness' to the CELSS environment with an objective function based upon daily reservoir levels. Sample runs of the Crop Planning Tool and future needs for the tool are detailed.
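    The paper states only that plan fitness is scored with an objective function based on daily reservoir levels; one plausible form, shown purely as a guess, penalizes the deviation of each reservoir from a target level:

      def plan_fitness(daily_levels, targets):
          """Score a crop plan by how far each reservoir strays from its target
          level over the simulated horizon (lower is better). A guessed form of
          the objective; the paper only says it is based on daily reservoir levels."""
          penalty = 0.0
          for day in daily_levels:
              for reservoir, level in day.items():
                  penalty += abs(level - targets[reservoir])
          return penalty

      # Hypothetical reservoirs and two simulated days of levels:
      targets = {"food": 100.0, "o2": 500.0, "water": 800.0}
      plan_a = [{"food": 95.0, "o2": 480.0, "water": 790.0},
                {"food": 90.0, "o2": 470.0, "water": 785.0}]
      print("plan A penalty:", plan_fitness(plan_a, targets))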

  10. Damping parameter study of a perforated plate with bias flow

    NASA Astrophysics Data System (ADS)

    Mazdeh, Alireza

    One of the main impediments to successful operation of combustion systems in industrial and aerospace applications, including gas turbines, ramjets, rocket motors, afterburners (augmenters) and even large heaters/boilers, is dynamic instability, also known as thermo-acoustic instability. Concerns with this ongoing problem have grown with the introduction of Lean Premixed Combustion (LPC) systems developed to address the environmental concerns associated with conventional combustion systems. The most common way to mitigate thermo-acoustic instability is adding acoustic damping to the combustor using acoustic liners. Recently, the damping properties of bias flow, initially introduced to liners only for cooling purposes, have been recognized and proven to be an asset in enhancing the damping effectiveness of liners. Acoustic liners are currently designed using empirical design rules followed by build-test-improve steps; basically by trial and error. There are growing concerns about the lack of reliability associated with the experimental evaluation of acoustic liners with small-size apertures. The development of physics-based tools to assist the design of such liners has therefore become of great interest to practitioners. This dissertation focuses primarily on how Large-Eddy Simulation (LES) or similar techniques such as Scale-Adaptive Simulation (SAS) can be used to characterize the damping properties of bias flow. The dissertation also reviews assumptions made in the existing analytical, semi-empirical, and numerical models, provides criteria to rank-order the existing models, and identifies the best existing theoretical model. Flow field calculations by LES provide good insight into the mechanisms that lead to acoustic damping. Comparison of simulation results with empirical and analytical studies shows that LES is a viable alternative to the empirical and analytical methods and can accurately predict the damping behavior of liners. Currently the role of LES in research on the damping properties of liners is limited to validation of other empirical or theoretical approaches. This research has shown that LES can go beyond that: it can be used to perform parametric studies characterizing the sensitivity of the acoustic properties of multi-perforated liners to changes in geometry and flow conditions, and thus serve as a tool for designing acoustic liners. The conducted research provides an insightful understanding of the contributions of different flow and geometry parameters such as perforated plate thickness, aperture radius, porosity factor and bias flow velocity. While the study agrees with previous observations obtained by analytical or experimental methods, it also quantifies the impact of these parameters on the acoustic impedance of the perforated plate, a key parameter determining the acoustic performance of any system. The study has also explored the limitations and capabilities of commercial tools when applied to simulation studies of the damping properties of liners. The overall agreement between LES results and previous studies proves that commercial tools can be effectively used for these applications under certain conditions.

  11. The BiolAD-DB system: an informatics system for clinical and genetic data.

    PubMed

    Nielsen, David A; Leidner, Marty; Haynes, Chad; Krauthammer, Michael; Kreek, Mary Jeanne

    2007-01-01

    The Biology of Addictive Diseases-Database (BiolAD-DB) system is a research bioinformatics system for archiving, analyzing, and processing of complex clinical and genetic data. The database schema employs design principles for handling complex clinical information, such as response items in genetic questionnaires. Data access and validation is provided by the BiolAD-DB client application, which features a data validation engine tightly coupled to a graphical user interface. Data integrity is provided by the password-protected BiolAD-DB SQL compliant server and database. BiolAD-DB tools further provide functionalities for generating customized reports and views. The BiolAD-DB system schema, client, and installation instructions are freely available at http://www.rockefeller.edu/biolad-db/.
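    One common design principle for questionnaire data of the kind the abstract alludes to is storing response items as rows rather than fixed columns, so new items require no schema change. The Python sketch below illustrates that pattern with hypothetical table names; it is not the actual BiolAD-DB schema.

      import sqlite3

      # Hypothetical entity-attribute-value layout for questionnaire responses;
      # table and column names are invented, not taken from BiolAD-DB.
      db = sqlite3.connect(":memory:")
      db.executescript("""
      CREATE TABLE subject  (subject_id INTEGER PRIMARY KEY, enrolled DATE);
      CREATE TABLE item     (item_id INTEGER PRIMARY KEY, questionnaire TEXT, prompt TEXT);
      CREATE TABLE response (subject_id REFERENCES subject, item_id REFERENCES item,
                             value TEXT, PRIMARY KEY (subject_id, item_id));
      """)
      db.execute("INSERT INTO subject VALUES (1, '2007-01-15')")
      db.execute("INSERT INTO item VALUES (10, 'family_history', 'Parent with addiction?')")
      db.execute("INSERT INTO response VALUES (1, 10, 'yes')")
      print(db.execute("SELECT COUNT(*) FROM response").fetchone())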

  12. Simulation of the Interactions Between Gamma-Rays and Detectors Using BSIMUL

    NASA Technical Reports Server (NTRS)

    Haywood, S. E.; Rester, A. C., Jr.

    1996-01-01

    Progress made during 1995 on the Monte Carlo gamma-ray spectrum simulation program BSIMUL is discussed. Several features have been added, including the ability to model shields that are tapered cylinders. Several simulations were made of the Near Earth Asteroid Rendezvous detector.

  13. Effects of Heat and Moisture Exchangers and Exhaled Humidity on Aerosol Deposition in a Simulated Ventilator-Dependent Adult Lung Model.

    PubMed

    Ari, Arzu; Alwadeai, Khalid S; Fink, James B

    2017-05-01

    Many in vitro models report higher inhaled dose with dry versus heated humidity. Heat-and-moisture exchangers (HMEs) provide passive humidity in ventilator-dependent patients but act as a barrier to aerosol. HMEs designed to allow aerosol delivery (HME-ADs) have not been well described. The purpose of this study is to determine the impact of HME-ADs on aerosol deposition, with and without active exhaled humidity, in a simulated ventilator-dependent adult model. We used an in vitro lung model consisting of an intubated teaching mannequin with an endotracheal tube of 8.0 mm inner diameter, with bronchi directly attached to a collecting filter and a passive rubber test lung to provide testing without active exhaled humidity. To simulate exhaled humidity, a Cascade humidifier (37°C and 100% relative humidity) was placed between the collecting filter and test lung, simulating body-temperature-and-pressure-saturated exhaled humidity at the bronchi. Albuterol sulfate (2.5 mg/3 mL) was administered with a mesh nebulizer (Aerogen Solo) placed in the inspiratory limb of the ventilator circuit at the Y-piece, with no HME in place (control) and with 3 HME-AD devices, including the CircuVent, Humid-Flo, and AirLife, with and without exhaled humidity. Drug was eluted from the collecting filter and analyzed with spectrophotometry. Student t tests and analysis of variance were used for data analysis (P < .05). The percentage of drug dose delivered (mean ± SD) distal to the bronchi in the control experiments was greater than with all of the HME-ADs, both without exhaled humidity (18 ± 0.7%) and with active exhaled humidity (10.8 ± 0.2%) (P < .005). Without exhaled humidity, aerosol delivery with the CircuVent (12.6 ± 0.8%), Humid-Flo (15.3 ± 0.8%), and AirLife (12.0 ± 0.5%) was less than control (P < .001, P = .01, and P < .001, respectively). In contrast, with exhaled humidity, no difference was found between control and HME-ADs (P = .89). Also, greater variation between control and the 3 HME-ADs was observed without exhaled humidity. Drug delivery without exhaled humidity exceeded aerosol deposition obtained with exhaled humidity in all conditions tested in this study. In this model simulating active exhaled humidity, aerosol drug delivery was lower and more consistent with both control and the HME-ADs than with the standard nonhumidified model. Further studies are needed to determine whether greater deposition in a dry model is an artifact of the model not simulating exhaled humidity. Copyright © 2017 by Daedalus Enterprises.

  14. Using Google AdWords for International Multilingual Recruitment to Health Research Websites

    PubMed Central

    Gross, Margaret S; Liu, Nancy H; Contreras, Omar; Muñoz, Ricardo F

    2014-01-01

    Background Google AdWords, the placement of sponsored links in Google search results, is a potent method of recruitment to Internet-based health studies and interventions. However, the performance of Google AdWords varies considerably depending on the language and the location of the target audience. Objective Our goal was to describe differences in AdWords performance when recruiting participants to the same study conducted in four languages and to determine whether AdWords campaigns can be optimized in order to increase recruitment while decreasing costs. Methods Google AdWords were used to recruit participants to the Mood Screener, a multilingual online depression screening tool available in English, Russian, Spanish, and Chinese. Two distinct recruitment periods are described: (1) “Unmanaged”, a 6-month period in which ads were allowed to run using only the AdWords tool itself, with no human intervention, and (2) “Managed”, a separate 7-week period during which we systematically sought to optimize our recruitment campaigns. Results During 6 months of unmanaged recruitment, our ads were shown over 1.3 million times, resulting in over 60,000 site visits. The average click-through rate (ratio of ads clicked to ads displayed) varied from 1.86% for Chinese ads to 8.48% for Russian ads, as did the average cost-per-click (from US $0.20 for Chinese ads to US $0.50 for English ads). Although Chinese speakers’ click-through rate was lowest, their rate of consenting to participate was the highest, at 3.62%, with English speakers exhibiting the lowest consent rate (0.97%). The conversion cost (cost to recruit a consenting participant) varied from US $10.80 for Russian speakers to US $51.88 for English speakers. During the 7 weeks of “managed” recruitment, we attempted to improve AdWords’ performance in regards to the consent rate and cost by systematically deleting underperforming ads and adjusting keywords. We were able to increase the number of people who consent after coming to the site by 91.8% while also decreasing per-consent cost by 23.3%. Conclusions Our results illustrate the need to linguistically and culturally adapt Google AdWords campaigns and to manage them carefully to ensure the most cost-effective results. PMID:24446166
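    The four metrics the study tracks reduce to simple ratios; the sketch below computes them, with input numbers chosen only to roughly echo the reported Russian-language figures (8.48% click-through rate, US $10.80 per consent), not taken from the study's raw data.

      def campaign_metrics(impressions, clicks, consents, total_cost):
          """Compute the recruitment metrics the abstract reports: click-through
          rate, cost per click, consent rate, and cost per consenting participant."""
          return {
              "click_through_rate": clicks / impressions,
              "cost_per_click": total_cost / clicks,
              "consent_rate": consents / clicks,
              "cost_per_consent": total_cost / consents,
          }

      # Illustrative numbers only, scaled to approximate the Russian-ad figures.
      print(campaign_metrics(impressions=100_000, clicks=8_480,
                             consents=200, total_cost=2_160.0))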

  15. Modeling Nanocomposites for Molecular Dynamics (MD) Simulations

    DTIC Science & Technology

    2015-01-01

    ...Parallel Simulator (LAMMPS) is used as the MD simulator [9]; the coordinates must be formatted for use in LAMMPS. VMD has a set of tools (TopoTools...) that can be used to generate a LAMMPS-readable format [6]. [Figure 4: Ethylene Monomer Produced From Coordinates in PDB and Rendered Using...] ...where i and j are the atom subscripts. Simulations are performed using LAMMPS simulation software. Periodic boundary conditions are

  16. Residential Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, Michael R; Abdelaziz, Omar A; Jackson, Rogerick K

    The Residential Simulation Tool was developed to understand the impact of residential load consumption on utilities, including the role of demand response. This is complicated because many different residential loads exist and are utilized for different purposes. The tool models human behavior and its effect on load utilization, which in turn drives the tool's electrical consumption predictions. The tool integrates a number of databases from the Department of Energy and other government websites to support the load consumption prediction.

  17. Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases

    PubMed Central

    Amos, Christopher I.; Bafna, Vineet; Hauser, Elizabeth R.; Hernandez, Ryan D.; Li, Chun; Liberles, David A.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Papanicolaou, George J.; Peng, Bo; Ritchie, Marylyn D.; Rosenfeld, Gabriel; Witte, John S.

    2014-01-01

    Genetic simulation programs are used to model data under specified assumptions to facilitate the understanding and study of complex genetic systems. Standardized data sets generated using genetic simulation are essential for the development and application of novel analytical tools in genetic epidemiology studies. With continuing advances in high-throughput genomic technologies and generation and analysis of larger, more complex data sets, there is a need for updating current approaches in genetic simulation modeling. To provide a forum to address current and emerging challenges in this area, the National Cancer Institute (NCI) sponsored a workshop, entitled “Genetic Simulation Tools for Post-Genome Wide Association Studies of Complex Diseases” at the National Institutes of Health (NIH) in Bethesda, Maryland on March 11-12, 2014. The goals of the workshop were to: (i) identify opportunities, challenges and resource needs for the development and application of genetic simulation models; (ii) improve the integration of tools for modeling and analysis of simulated data; and (iii) foster collaborations to facilitate development and applications of genetic simulation. During the course of the meeting the group identified challenges and opportunities for the science of simulation, software and methods development, and collaboration. This paper summarizes key discussions at the meeting, and highlights important challenges and opportunities to advance the field of genetic simulation. PMID:25371374

  18. Proceedings of the First Landscape State-and-Transition Simulation Modeling Conference, June 14–16, 2011, Portland, Oregon

    Treesearch

    Becky K. Kerns; Ayn J. Shlisky; Colin J. Daniel

    2012-01-01

    The first ever Landscape State-and-Transition Simulation Modeling Conference was held from June 14–16, 2011, in Portland Oregon. The conference brought together over 70 users of state-and-transition simulation modeling tools—the Vegetation Dynamics Development Tool (VDDT), the Tool for Exploratory Landscape Analysis (TELSA) and the Path Landscape Model. The goal of the...

  19. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include communication data network, smart sensor and actuator nodes, centralized control system (FADEC full authority digital engine control), and the aircraft engine itself. The DECsim tool can allow simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed. Such factors include control system performance, reliability, weight, and bandwidth utilization.
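    To suggest the kind of question such a simulation platform answers, the toy Python loop below adds a fixed sensor-path network delay to a first-order plant under proportional control and reports the peak response; it is purely illustrative and bears no relation to the C-MAPSS40K model or the actual blockset.

      def simulate_engine_speed(delay_steps, n_steps=200, dt=0.01, kp=8.0):
          """Toy first-order 'engine' under proportional control with a fixed
          network delay on the sensor path. Illustrative only; not C-MAPSS40K."""
          speed = 0.0
          history = [0.0] * (delay_steps + 1)   # past measurements
          setpoint = 1.0
          trace = []
          for _ in range(n_steps):
              measured = history[-(delay_steps + 1)]   # delayed measurement
              u = kp * (setpoint - measured)           # proportional control
              speed += dt * (-speed + u)               # first-order plant
              history.append(speed)
              trace.append(speed)
          return max(trace)  # overshoot grows as the delay increases

      for d in (0, 5, 15):
          print(f"delay {d} steps -> peak speed {simulate_engine_speed(d):.2f}")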

  20. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the "must test" functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.
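    The paper's unit tests are not reproduced here, but a minimal example of the kind it proposes, a mass-balance check on a hypothetical depletion function, might look like this in Python:

      import unittest

      def deplete(mass_feed, burnup_fraction):
          """Hypothetical stand-in for one 'essential function' of a fuel cycle
          simulator: split a feed mass into fissioned and discharged streams."""
          fissioned = mass_feed * burnup_fraction
          discharged = mass_feed - fissioned
          return fissioned, discharged

      class MassBalanceTest(unittest.TestCase):
          """Sketch of the kind of unit test the paper proposes: every essential
          function should conserve mass to within numerical tolerance."""
          def test_mass_is_conserved(self):
              fissioned, discharged = deplete(1000.0, 0.05)
              self.assertAlmostEqual(fissioned + discharged, 1000.0, places=9)

      if __name__ == "__main__":
          unittest.main()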
