Sample records for "develop numerical tools"

  1. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well-integrated set of tools for the support of numerical computation. The project entailed the specification, design, and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation, and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken, as well as a uniform set of development tools for the numerical software community.

  2. Selected aspects of microelectronics technology and applications: Numerically controlled machine tools. Technology trends series no. 2

    NASA Astrophysics Data System (ADS)

    Sigurdson, J.; Tagerud, J.

    1986-05-01

    A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from databases. Diagrams, statistics, and a bibliography are included.

  3. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.

  4. Development of early numerical abilities of Spanish-speaking Mexican preschoolers: A new assessment tool.

    PubMed

    Beltrán-Navarro, Beatriz; Abreu-Mendoza, Roberto A; Matute, Esmeralda; Rosselli, Monica

    2018-01-01

    This article presents a tool for assessing the early numerical abilities of Spanish-speaking Mexican preschoolers. The Numerical Abilities Test, from the Evaluación Neuropsicológica Infantil-Preescolar (ENI-P), evaluates four core abilities of number development: magnitude comparison, counting, subitizing, and basic calculation. We evaluated 307 Spanish-speaking Mexican children aged 2 years 6 months to 4 years 11 months. Appropriate internal consistency and test-retest reliability were demonstrated. We also investigated the effect of age, children's school attendance, maternal education, and sex on children's numerical scores. The results showed that the four subtests captured development across ages. Critically, maternal education had an impact on children's performance in three out of the four subtests, but there was no effect associated with children's school attendance or sex. These results suggest that the Numerical Abilities Test is a reliable instrument for Spanish-speaking preschoolers. We discuss the implications of our outcomes for numerical development.

  5. Monitoring Object Library Usage and Changes

    NASA Technical Reports Server (NTRS)

    Owen, R. K.; Craw, James M. (Technical Monitor)

    1995-01-01

    The NASA Ames Numerical Aerodynamic Simulation program Aeronautics Consolidated Supercomputing Facility (NAS/ACSF) supercomputing center serves over 1600 users and has numerous analysts with root access. Several tools have been developed to monitor object library usage and changes. Some of the tools do "noninvasive" monitoring, while other tools implement run-time logging even for object-only libraries. The run-time logging identifies who, when, and what is being used. The benefits are that real usage can be measured, unused libraries can be discontinued, and training and optimization efforts can be focused on those numerical methods that are actually used. An overview of the tools will be given and the results will be discussed.

  6. GIS-MODFLOW: Ein kleines OpenSource-Werkzeug zur Anbindung von GIS-Daten an MODFLOW

    NASA Astrophysics Data System (ADS)

    Gossel, Wolfgang

    2013-06-01

    The numerical model MODFLOW (Harbaugh 2005) is an efficient and up-to-date tool for groundwater flow modelling. On the other hand, Geo-Information-Systems (GIS) provide useful tools for data preparation and visualization that can also be incorporated in numerical groundwater modelling. An interface between both would therefore be useful for many hydrogeological investigations. To date, several integrated stand-alone tools have been developed that rely on MODFLOW, MODPATH and transport modelling tools. Simultaneously, several open source-GIS codes were developed to improve functionality and ease of use. These GIS tools can be used as pre- and post-processors of the numerical model MODFLOW via a suitable interface. Here we present GIS-MODFLOW as an open-source tool that provides a new universal interface by using the ESRI ASCII GRID data format that can be converted into MODFLOW input data. This tool can also treat MODFLOW results. Such a combination of MODFLOW and open-source GIS opens new possibilities to render groundwater flow modelling, and simulation results, available to larger circles of hydrogeologists.
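
    As a sketch of the interface the abstract describes, the ESRI ASCII GRID format can be parsed with a few lines of Python. The parser below is an illustration, not GIS-MODFLOW code; the six header keywords and the top-row-first value order follow the ESRI format, and the sample grid is invented for the example.

```python
# Sketch: parsing an ESRI ASCII GRID (the exchange format GIS-MODFLOW uses).
# File contents are inlined as a string so the example is self-contained.
def parse_ascii_grid(text):
    lines = text.strip().splitlines()
    header = {}
    for line in lines[:6]:                  # ncols, nrows, xllcorner, ...
        key, value = line.split()
        header[key.lower()] = float(value)
    ncols, nrows = int(header["ncols"]), int(header["nrows"])
    values = []
    for line in lines[6:]:
        values.extend(float(v) for v in line.split())
    # Row-major grid, top row first, as specified by the ESRI format.
    grid = [values[r * ncols:(r + 1) * ncols] for r in range(nrows)]
    return header, grid

sample = """ncols 3
nrows 2
xllcorner 0.0
yllcorner 0.0
cellsize 100.0
NODATA_value -9999
1.0 2.0 3.0
4.0 -9999 6.0"""

header, grid = parse_ascii_grid(sample)
print(header["cellsize"], grid[1][2])  # → 100.0 6.0
```

    From here, the nested `grid` list could be reshaped into MODFLOW array input (e.g. hydraulic conductivity per cell), with NODATA cells handled according to the model's inactive-cell convention.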

  7. Analysis of the thermo-mechanical deformations in a hot forging tool by numerical simulation

    NASA Astrophysics Data System (ADS)

    L-Cancelos, R.; Varas, F.; Martín, E.; Viéitez, I.

    2016-03-01

    Although programs have been developed for the design of hot forging tools, their design is still largely based on the experience of the tool maker. This obliges the maker to build test matrices and correct their errors to minimize distortions in the forged piece. This phase prior to mass production consumes time and material resources, which makes the final product more expensive. Forging tools are usually constituted by various parts made of different grades of steel, which in turn have different mechanical properties and therefore suffer different degrees of strain. Furthermore, the tools used in hot forging are exposed to a thermal field that also induces strain or stress depending on the degree of confinement of the piece. The mechanical behaviour of the assembly is therefore determined by the contact between the different pieces. Numerical simulation allows different configurations to be analysed and possible defects to be anticipated before tool making, thus reducing the costs of this preliminary phase. In order to improve the dimensional quality of the manufactured parts, the work presented here focuses on the application of a numerical model to a hot forging manufacturing process in order to predict the areas of the forging die subjected to large deformations. The thermo-mechanical model, developed and implemented with free software (Code-Aster), includes strains of thermal origin, strains during forge impact, and contact effects. The numerical results are validated with experimental measurements on a tooling set that produces forged crankshafts for the automotive industry. The numerical results show good agreement with the experimental tests. A very useful tool for the design of tooling sets for hot forging is thereby achieved.

  8. AN EIGHT WEEK SEMINAR IN AN INTRODUCTION TO NUMERICAL CONTROL ON TWO- AND THREE-AXIS MACHINE TOOLS FOR VOCATIONAL AND TECHNICAL MACHINE TOOL INSTRUCTORS. FINAL REPORT.

    ERIC Educational Resources Information Center

    Boldt, Milton; Pokorny, Harry

    Thirty-three machine shop instructors from 17 states participated in an 8-week seminar to develop the skills and knowledge essential for teaching the operation of numerically controlled machine tools. The seminar was given from June 20 to August 12, 1966, with college credit available through Stout State University. The participants completed an…

  9. Hybrid finite volume-finite element model for the numerical analysis of furrow irrigation and fertigation

    USDA-ARS's Scientific Manuscript database

    Although slowly abandoned in developed countries, furrow irrigation systems continue to be a dominant irrigation method in developing countries. Numerical models represent powerful tools to assess irrigation and fertigation efficiency. While several models have been proposed in the past, the develop...

  10. Numerical modeling tools for chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Jasinski, Thomas J.; Childs, Edward P.

    1992-01-01

    Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring in CVD reactors: fluid flow patterns, temperature and chemical species distributions, and gas-phase and surface deposition. The available physical models are documented, and examples of CVD simulation capabilities are provided.

  11. Development of a Design Tool for Planning Aqueous Amendment Injection Systems

    DTIC Science & Technology

    2012-08-01

    Report contents (from the table of contents) include: chemical oxidation with permanganate (MnO4-); implementation issues; SS design tool development and evaluation; numerical modeling of permanganate distribution; and CDISCO development and evaluation.

  12. Numerical tension adjustment of x-ray membrane to represent goat skin kompang

    NASA Astrophysics Data System (ADS)

    Siswanto, Waluyo Adi; Abdullah, Muhammad Syiddiq Bin

    2017-04-01

    This paper presents a numerical membrane model of the traditional musical instrument kompang, used to find the membrane tension at which an x-ray film membrane represents the classical goat-skin membrane of the kompang. In this study, an experiment on the kompang is first conducted in an acoustical anechoic enclosure; in parallel, a mathematical model of the kompang membrane is developed to simulate its vibration in polar coordinates by implementing the Fourier-Bessel wave function. The wave equation in the polar direction in mode (0,1) is applied to provide the corresponding natural frequencies of the circular membrane. The values of the initial and boundary conditions in the function are determined from experiment to allow the correct development of the numerical equation. The numerical mathematical model is coded in SMath for accurate numerical analysis as well as plotting. Two kompang membrane cases with different membrane materials, i.e. goat-skin and x-ray film membranes with a fixed radius of 0.1 m, are used in the experiment. An alternative kompang membrane made of x-ray film with the appropriate tension setting can be used to represent the sound of the traditional goat-skin kompang; the tension setting of the membrane that resembles the goat skin is 24 N. An effective numerical tool has been developed to help kompang makers set the tension of an x-ray membrane. In future applications, any traditional kompang of a different size can be replaced by another membrane material if the tension is set to the correct value. The developed numerical tool is useful and handy for calculating the tension of the alternative membrane material.
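
    The (0,1) mode mentioned in the abstract has a closed-form frequency for an ideal circular membrane, f01 = (α01 / 2πa) √(T/σ), where α01 ≈ 2.405 is the first zero of the Bessel function J0, a the radius, T the tension per unit length, and σ the surface density. The sketch below evaluates this formula in Python; the tension and surface-density values are illustrative assumptions, not the study's measured kompang parameters.

```python
import math

# First zero of the Bessel function J0, which sets the (0,1) mode of an
# ideal circular membrane: f01 = (alpha01 / (2*pi*a)) * sqrt(T / sigma).
ALPHA_01 = 2.404825557695773

def membrane_f01(radius_m, tension_n_per_m, surface_density_kg_m2):
    """Fundamental (0,1) frequency of an ideal circular membrane, in Hz."""
    wave_speed = math.sqrt(tension_n_per_m / surface_density_kg_m2)
    return ALPHA_01 * wave_speed / (2 * math.pi * radius_m)

# Illustrative numbers only: 0.1 m radius as in the study; the tension
# (N/m) and surface density here are assumptions, not measured values.
print(round(membrane_f01(0.1, 240.0, 0.25), 1))  # → 118.6
```

    In practice the inverse problem is the useful one: sweep the tension until the computed f01 matches the frequency measured for the goat-skin membrane.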

  13. Optics simulations: a Python workshop

    NASA Astrophysics Data System (ADS)

    Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.

    2017-08-01

    Numerical simulations allow teachers and students to indirectly perform sophisticated experiments that would not otherwise be realizable due to cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrently with open-source environments such as the Python software ecosystem. This availability of open-source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools using the Python package ecosystem, concentrating on geometric and physical optics simulations. The advantage of doing hands-on numerical experiments is that the student learner becomes an active participant in the pedagogical/learning process rather than playing a passive role as in the traditional lecture format. Even in laboratory classes, because of constraints of space, lack of equipment, and often large numbers of students, many students play a passive role, since they work in groups of three or more. Furthermore, these new tools help students get a handle on numerical methods as well as simulations, and impart a "feel" for the physics under investigation.
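
    A minimal example of the kind of geometric-optics exercise such a workshop might include is the thin-lens equation, 1/f = 1/d_o + 1/d_i. The numbers below are arbitrary illustrative values, not taken from the paper.

```python
def thin_lens_image(focal_mm, object_dist_mm):
    """Image distance from the thin-lens equation 1/f = 1/d_o + 1/d_i."""
    return 1.0 / (1.0 / focal_mm - 1.0 / object_dist_mm)

f, d_o = 50.0, 200.0            # focal length and object distance (mm)
d_i = thin_lens_image(f, d_o)   # image distance (mm)
m = -d_i / d_o                  # transverse magnification (inverted image)
print(round(d_i, 2), round(m, 3))  # → 66.67 -0.333
```

    A classroom version would wrap this in a plot of image distance versus object distance, letting students see the divergence as the object approaches the focal plane.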

  14. Which benefits in the use of a modeling platform: The VSoil example.

    NASA Astrophysics Data System (ADS)

    Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Maron, Pierre-Alain; Moitrier, Nicolas; Nouguier, Cedric; Moitrier, Nathalie; Beudez, Nicolas

    2015-04-01

    In the environmental community, the need to couple models and the associated knowledge has emerged recently. The development of a coupling tool or a modeling platform is mainly driven by the need to create models that account for multiple processes and for the feedbacks between these processes. Models focusing on a restricted number of processes already exist, so coupling these numerical tools appears to be an efficient and rapid means of filling the identified gaps. Several tools have been proposed: OMS3 (David et al. 2013); the CSDMS framework (Peckham et al. 2013); and the OpenMI project developed within the framework of the European Community (OpenMI, 2011). However, what we should expect from a modeling platform could be more ambitious than merely coupling existing numerical codes. We believe that we need to share easily not only our numerical representations but also the attached knowledge. We need to develop complex models rapidly and easily, so as to have tools that can respond to current issues on soil functioning and soil evolution within the frame of global change. We also need to share, in a common frame, our visions of soil functioning at various scales: on the one hand to strengthen our collaborations, and on the other hand to make them visible to the other communities working on environmental issues. The presentation will briefly introduce the VSoil platform, which manipulates concepts and numerical representations of these processes. The tool helps in assembling modules to create a model and automatically generates an executable code and a GUI. The potential of the tool will be illustrated with a few selected cases.

  15. Numerical tool for tsunami risk assessment in the southern coast of Dominican Republic

    NASA Astrophysics Data System (ADS)

    Macias Sanchez, J.; Llorente Isidro, M.; Ortega, S.; Gonzalez Vida, J. M., Sr.; Castro, M. J.

    2016-12-01

    The southern coast of the Dominican Republic is a densely populated region containing several important cities, including the capital, Santo Domingo. Important activities, including tourism, industry, commercial ports, and energy facilities, are concentrated along the southern coast. According to historical reports, the region has been impacted by large earthquakes accompanied by tsunamis, as in Azua in 1751 and more recently Pedernales in 2010, but their sources are not clearly identified. The aim of the present work is to develop a numerical tool to simulate the impact on the southern coast of the Dominican Republic of tsunamis generated in the Caribbean Sea. This tool, based on the Tsunami-HySEA model from the EDANYA group (University of Malaga, Spain), could be used in the framework of a Tsunami Early Warning System due to the very short computing times when only propagation is computed, or it could be used to assess inundation impact, computing inundation with an initial 5-meter resolution. Numerical results corresponding to three theoretical sources are used to test the numerical tool.

  16. Editing of EIA coded, numerically controlled, machine tool tapes

    NASA Technical Reports Server (NTRS)

    Weiner, J. M.

    1975-01-01

    Editing of numerically controlled (N/C) machine tool tapes (8-level paper tape) using an interactive graphic display processor is described. A rapid technique required for correcting production errors in N/C tapes was developed using the interactive text editor on the IMLAC PDS-ID graphic display system and two special programs resident on disk. The correction technique and special programs for processing N/C tapes coded to EIA specifications are discussed.

  17. Numerical Simulation of Multicomponent Chromatography Using Spreadsheets.

    ERIC Educational Resources Information Center

    Frey, Douglas D.

    1990-01-01

    Illustrated is the use of spreadsheet programs for implementing finite difference numerical simulations of chromatography as an instructional tool in a separations course. Discussed are differential equations, discretization and integration, spreadsheet development, computer requirements, and typical simulation results. (CW)
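
    A rough Python stand-in for the kind of spreadsheet recurrence the article describes: a mixing-cell (plate) model of a chromatography column with a linear isotherm, stepped explicitly down the column. The transfer fraction and retention factors are assumptions for illustration; the article's exact discretization is not reproduced here.

```python
# Mixing-cell (plate) model of a chromatography column with a linear
# isotherm -- the same recurrence one would place in a spreadsheet row.
# Each step, a fraction of each cell's mobile-phase content moves to the
# next cell; a larger retention factor k slows the solute down.
def simulate(n_cells, n_steps, k):
    c = [0.0] * n_cells
    c[0] = 1.0                          # unit injection into the first cell
    mobile_frac = 1.0 / (1.0 + k)       # mobile-phase fraction at equilibrium
    for _ in range(n_steps):
        new = c[:]
        for i in range(n_cells - 1, 0, -1):
            moved = mobile_frac * c[i - 1] * 0.5   # transfer per step (assumed)
            new[i] += moved
            new[i - 1] -= moved
        c = new
    return c

# Two components with different retention factors separate along the column.
fast = simulate(30, 40, k=0.5)
slow = simulate(30, 40, k=3.0)
peak_fast = max(range(30), key=lambda i: fast[i])
peak_slow = max(range(30), key=lambda i: slow[i])
print(peak_fast, peak_slow)  # the less-retained solute travels farther
```

    In a spreadsheet, each row is one time step and each column one cell, which is what makes the method so accessible in a separations course.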

  18. Numerical model updating technique for structures using firefly algorithm

    NASA Astrophysics Data System (ADS)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique for updating numerical models of structures in civil, mechanical, automotive, marine, aerospace engineering, etc. The basic concept behind this technique is to update the numerical model so that it closely matches experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as the optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which helps to correlate the numerical model with the experimental results obtained. The variables for the updating can be material properties, geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show a close match between the experimental and numerical models.
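
    The core of the updating loop can be sketched without the full firefly mechanics: choose a response (here the Euler-Bernoulli tip deflection of a cantilever), then search for the parameter value whose predicted response matches the measurement. A plain random search stands in for the firefly algorithm below (same objective, simpler mechanics), and all numbers are illustrative assumptions.

```python
import random

# Minimal model-updating sketch: tune a cantilever's Young's modulus E so
# the predicted tip deflection matches a "measured" one.
L, I, P = 2.0, 8.0e-6, 1000.0           # length (m), inertia (m^4), load (N)

def tip_deflection(E):
    return P * L**3 / (3.0 * E * I)     # Euler-Bernoulli cantilever formula

measured = tip_deflection(200e9)         # synthetic measurement (steel-like E)

random.seed(0)
best_E, best_err = None, float("inf")
for _ in range(5000):
    E = random.uniform(50e9, 400e9)      # search bounds (assumed)
    err = abs(tip_deflection(E) - measured)
    if err < best_err:
        best_E, best_err = E, err

print(round(best_E / 1e9))               # should land near the true 200 GPa
```

    A firefly implementation replaces the independent random draws with a population whose members move toward brighter (lower-error) solutions, but the objective function is exactly this discrepancy measure.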

  19. Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale

    PubMed Central

    Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv

    2015-01-01

    X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner. PMID:26169570

  20. Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale

    NASA Astrophysics Data System (ADS)

    Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv

    2015-07-01

    X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner.

  1. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools, and data.

  2. Determination of real machine-tool settings and minimization of real surface deviation by computerized inspection

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Kuan, Chihping; Zhang, Yi

    1991-01-01

    A numerical method is developed for the minimization of deviations of real tooth surfaces from the theoretical ones. The deviations are caused by errors of manufacturing, errors in the installment of machine-tool settings, and distortion of surfaces by heat treatment. The deviations are determined by coordinate measurements of gear tooth surfaces. The minimization of deviations is based on the proper correction of the initially applied machine-tool settings. The contents of the accomplished research project cover the following topics: (1) description of the principle of coordinate measurements of gear tooth surfaces; (2) derivation of theoretical tooth surfaces (with examples of surfaces of hypoid gears and references for spiral bevel gears); (3) determination of the reference point and the grid; (4) determination of the deviations of real tooth surfaces at the points of the grid; and (5) determination of the required corrections of machine-tool settings for minimization of deviations. The procedure for minimization of deviations is based on the numerical solution of an overdetermined system of n linear equations in m unknowns (m much less than n), where n is the number of points of measurement and m is the number of parameters of the applied machine-tool settings to be corrected. The developed approach is illustrated with numerical examples.
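
    The correction step described above reduces to ordinary least squares on an overdetermined system. The sketch below solves an n x 2 system via the normal equations, with synthetic, noise-free data so the known setting errors are recovered exactly; it illustrates the linear-algebra step only, not the paper's gear-geometry model.

```python
# Least-squares sketch of the correction step: n measured surface
# deviations d, m << n machine-tool setting corrections x, related through
# a sensitivity matrix A. Solve A x ~ d via the normal equations.
def lstsq_2(A, d):
    """Least-squares solution of an n x 2 system (Cramer's rule on A^T A)."""
    s11 = sum(a[0] * a[0] for a in A)
    s12 = sum(a[0] * a[1] for a in A)
    s22 = sum(a[1] * a[1] for a in A)
    b1 = sum(a[0] * di for a, di in zip(A, d))
    b2 = sum(a[1] * di for a, di in zip(A, d))
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

# Synthetic data: deviations generated by known setting errors (0.3, -0.2)
# with no noise, so the recovered corrections match exactly.
A = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, -1.0), (0.5, 2.0)]
true_x = (0.3, -0.2)
d = [a[0] * true_x[0] + a[1] * true_x[1] for a in A]
x = lstsq_2(A, d)
print(round(x[0], 6), round(x[1], 6))  # → 0.3 -0.2
```

    With real measurements the residual after the fit is what remains uncorrectable by the chosen m settings; in practice one would use a library least-squares routine rather than hand-rolled normal equations.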

  3. DEVELOPMENTS IN GRworkbench

    NASA Astrophysics Data System (ADS)

    Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.

    2006-02-01

    The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.

  4. Numerical tool development of fluid-structure interactions for investigation of obstructive sleep apnea

    NASA Astrophysics Data System (ADS)

    Huang, Chien-Jung; White, Susan; Huang, Shao-Ching; Mallya, Sanjay; Eldredge, Jeff

    2016-11-01

    Obstructive sleep apnea (OSA) is a medical condition characterized by repetitive partial or complete occlusion of the airway during sleep. The soft tissues in the upper airway of OSA patients are prone to collapse under the low pressure loads incurred during breathing. The ultimate goal of this research is the development of a versatile numerical tool for simulation of air-tissue interactions in patient-specific upper airway geometries. This tool is expected to capture several phenomena, including flow-induced vibration (snoring) and large deformations during airway collapse of the complex airway geometry in respiratory flow conditions. Here, we present our ongoing progress toward this goal. To avoid mesh regeneration, the flow model uses a sharp-interface embedded boundary method on Cartesian grids for resolving the fluid-structure interface, while the structural model uses a cut-cell finite element method. Also, to properly resolve large displacements, a non-linear elasticity model is used. The fluid and structure solvers are connected with a strongly coupled iterative algorithm. Parallel computation is achieved with the numerical library PETSc. Some two- and three-dimensional preliminary results are shown to demonstrate the ability of this tool.

  5. Airplane numerical simulation for the rapid prototyping process

    NASA Astrophysics Data System (ADS)

    Roysdon, Paul F.

    Airplane Numerical Simulation for the Rapid Prototyping Process is a comprehensive research investigation into the most up-to-date methods for airplane development and design. Uses of modern engineering software tools, like MATLAB and Excel, are presented, with examples of batch and optimization algorithms which combine the computing power of MATLAB with robust aerodynamic tools like XFOIL and AVL. The resulting data are demonstrated in the development and use of a full non-linear six-degrees-of-freedom simulator. The applications for this numerical toolbox vary from unmanned aerial vehicles to first-order analysis of manned aircraft. A blended-wing-body airplane is used for the analysis to demonstrate the flexibility of the code, from classic wing-and-tail configurations to less common configurations like the blended wing body. This configuration has been shown to have superior aerodynamic performance, in contrast to its classic wing-and-tube-fuselage counterparts, and to have reduced sensitivity to aerodynamic flutter as well as potential for increased engine noise abatement. Of course, without a classic tail elevator to damp the nose-up pitching moment, and a vertical tail rudder to damp the yaw and possible rolling aerodynamics, the challenges in lateral roll and yaw stability, as well as pitching moment, are not insignificant. This thesis work applies the tools necessary to perform airplane development and optimization on a rapid basis, demonstrating the strength of this toolbox through examples and comparison of the results to similar airplane performance characteristics published in the literature.

  6. Technology Integration in Science Classrooms: Framework, Principles, and Examples

    ERIC Educational Resources Information Center

    Kim, Minchi C.; Freemyer, Sarah

    2011-01-01

    A great number of technologies and tools have been developed to support science learning and teaching. However, science teachers and researchers point out numerous challenges to implementing such tools in science classrooms. For instance, guidelines, lesson plans, Web links, and tools teachers can easily find through Web-based search engines often…

  7. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use the capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  8. Data access and decision tools for coastal water resources management

    EPA Science Inventory

    US EPA has supported the development of numerous models and tools to support implementation of environmental regulations. However, transfer of knowledge and methods from detailed technical models to support practical problem solving by local communities and watershed or coastal ...

  9. Benchmark Problems of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; Podgorney, Robert; Kelkar, Sharad M.

A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office has sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams participating in the study represented U.S. national laboratories, universities, and industry, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. Study participants submitted solutions to problems for which their simulation tools were deemed capable or nearly capable. Some participating codes were originally developed for EGS applications, whereas others were designed for different applications but can simulate processes similar to those in EGS; solution submissions from both were encouraged. In some cases, participants made small incremental changes to their numerical simulation codes to address specific elements of the problem, and in other cases participants submitted solutions with existing simulation tools, acknowledging the limitations of the code. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research: stimulation, development, and circulation in two separate reservoirs. The challenge problems had specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners.

  10. Preserving Simplecticity in the Numerical Integration of Linear Beam Optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Christopher K.

    2017-07-01

Presented are mathematical tools and methods for the development of numerical integration techniques that preserve the symplectic condition inherent to mechanics. The intended audience is beam physicists with backgrounds in numerical modeling and simulation, with particular attention to beam optics applications. The paper focuses on Lie methods that are inherently symplectic regardless of the integration accuracy order. Section 2 provides the mathematical tools used in the sequel and necessary for the reader to extend the covered techniques. Section 3 places those tools in the context of charged-particle beam optics; in particular, linear beam optics is presented in terms of a Lie algebraic matrix representation. Section 4 presents numerical stepping techniques with particular emphasis on a third-order leapfrog method. Section 5 discusses the modeling of field imperfections with particular attention to the fringe fields of quadrupole focusing magnets. The direct computation of a third-order transfer matrix for a fringe field is shown.
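The central claim above, that leapfrog-type methods are symplectic regardless of step size, can be checked directly for linear beam optics. The sketch below (an independent illustration, not code from the paper) builds the kick-drift-kick leapfrog transfer matrix for a linear focusing force q'' = -kq and verifies the symplectic condition MᵀJM = J.

```python
import numpy as np

def leapfrog_matrix(k, h):
    """One kick-drift-kick leapfrog step for q'' = -k q, written as a
    2x2 transfer matrix acting on the phase-space vector (q, p)."""
    half_kick = np.array([[1.0, 0.0], [-0.5 * h * k, 1.0]])
    drift = np.array([[1.0, h], [0.0, 1.0]])
    return half_kick @ drift @ half_kick

def is_symplectic(M, tol=1e-12):
    """Check the symplectic condition M^T J M = J."""
    J = np.array([[0.0, 1.0], [-1.0, 0.0]])
    return np.allclose(M.T @ J @ M, J, atol=tol)

M = leapfrog_matrix(k=2.0, h=0.3)
print(is_symplectic(M))  # True: symplectic even at a coarse step size
```

Each factor (kick and drift) has unit determinant, so their product does too; for 2x2 matrices that is exactly the symplectic condition, independent of the accuracy order of the scheme.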

  11. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    PubMed

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, using experimental or numerical techniques. This article presents the laboratory-developed SESAME (Simulation of External Source Accident with MEdical images) tool, specific to the dosimetric reconstruction of radiological accidents through numerical simulations that combine voxel geometry with the MCNP(X) Monte Carlo radiation-material interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.
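As a toy illustration of the voxel-plus-Monte-Carlo bookkeeping behind tools like SESAME (not the MCNP(X) physics), the sketch below tallies the fraction of photons absorbed in a one-dimensional column of voxels, sampling interaction depths from an exponential attenuation law. The attenuation coefficient and geometry are assumed values for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def deposit_in_voxels(n_photons, mu, voxel_size, n_voxels):
    """Toy Monte Carlo: photons enter a 1-D column of voxels and are
    absorbed at depths sampled from an exponential law with linear
    attenuation coefficient mu (1/cm). Returns the fraction of
    photons absorbed in each voxel."""
    depths = rng.exponential(scale=1.0 / mu, size=n_photons)
    idx = (depths / voxel_size).astype(int)
    tally = np.bincount(idx[idx < n_voxels], minlength=n_voxels)
    return tally / n_photons

frac = deposit_in_voxels(n_photons=200_000, mu=0.2, voxel_size=1.0, n_voxels=5)
# Each voxel's fraction approaches exp(-mu*x0) - exp(-mu*x1) analytically.
print(frac)
```

A real reconstruction replaces the exponential absorption model with full coupled photon-electron transport through a 3-D voxel phantom, but the tally structure is the same idea.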

  12. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    NASA Astrophysics Data System (ADS)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  13. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    PubMed

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-07

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  14. Transforming Mobile Platform with KI-SIM Card into an Open Mobile Identity Tool

    NASA Astrophysics Data System (ADS)

    Hyppönen, Konstantin; Hassinen, Marko; Trichina, Elena

Recent introduction of Near Field Communication (NFC) in mobile phones has stimulated the development of new proximity payment and identification services. We present an architecture that facilitates the use of the mobile phone as a personalised electronic identity tool. The tool can work as a replacement for numerous ID cards and licenses. Design-for-privacy principles have been applied, such as minimisation of data collection and informed consent of the user. We describe an implementation of a lightweight version of the mobile identity tool using currently available handset technology and off-the-shelf development tools.

  15. Developing Teaching Material Software Assisted for Numerical Methods

    NASA Astrophysics Data System (ADS)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

The NCTM vision highlights two priorities for school mathematics: knowing the mathematics of the 21st century, and continuing to improve mathematics education to answer the challenges of a changing world. One competency associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing the various tools available for mathematical activity. A significant challenge in mathematics learning is how to teach students abstract concepts; technology, in the form of mathematics learning software, can be used to make such abstract concepts more concrete. In mathematics learning, mathematical software can make high-level mathematical activity more accessible to students, and technology can strengthen student learning by delivering numerical, graphic, and symbolic content without the time spent manually working through complex computations. The purpose of this research is to design and develop software-assisted teaching materials for numerical methods. Development starts with the defining step; the learning material is then designed on the basis of information obtained from the early analysis of learners, materials, and supporting tasks; the final step is development itself. The resulting software-assisted teaching materials for numerical methods are valid in content, and the validators' assessment is good, so the materials can be used with little revision.

  16. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  17. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    NASA Astrophysics Data System (ADS)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.
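A core post-processing step such tools perform is recovering the strain h from the Weyl scalar ψ₄ = d²h/dt². A standard approach in numerical relativity is fixed-frequency integration in the Fourier domain; the minimal sketch below illustrates that idea and is not POWER's actual implementation.

```python
import numpy as np

def ffi_strain(psi4, dt, f_cut):
    """Fixed-frequency integration: recover h from psi4 = d^2h/dt^2
    by dividing by -(omega^2) in the Fourier domain, with |omega|
    clamped above 2*pi*f_cut so that low-frequency numerical noise
    is not amplified into secular drift."""
    n = len(psi4)
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)
    omega_c = np.maximum(np.abs(omega), 2.0 * np.pi * f_cut)
    return np.fft.ifft(np.fft.fft(psi4) / (-omega_c**2))

# Demo: psi4 = -w^2 cos(w t) should integrate back to cos(w t).
n, dt = 256, 1.0
t = np.arange(n) * dt
w = 2.0 * np.pi * 8 / n          # 8 full cycles in the window
h = ffi_strain(-w**2 * np.cos(w * t), dt, f_cut=0.01)
print(np.allclose(h.real, np.cos(w * t), atol=1e-8))  # True
```

The cutoff frequency is chosen below the physical signal band; in practice it is set from the binary's initial orbital frequency.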

  18. Numerical analysis of the performance of rock weirs: Effects of structure configuration on local hydraulics

    USGS Publications Warehouse

    Holmquist-Johnson, C. L.

    2009-01-01

River-spanning rock structures are constructed for water delivery as well as to enable fish passage at barriers and to provide or improve aquatic habitat for endangered fish species. Current design methods are based upon anecdotal information applicable to a narrow range of channel conditions. The complex flow patterns and performance of rock weirs are not well understood, and without an accurate understanding of their hydraulics, designers cannot address the failure mechanisms of these structures. Flow characteristics such as jets, near-bed velocities, recirculation, eddies, and plunging flow govern scour pool development. These detailed flow patterns can be replicated using a 3D numerical model. Numerical studies inexpensively simulate a large number of cases, increasing the range of applicability, in order to develop design tools and predictive capability for analysis and design. The analysis and results of the numerical modeling, laboratory modeling, and field data provide a process-based method for understanding how structure geometry affects flow characteristics, scour development, fish passage, water delivery, and overall structure stability. Results of the numerical modeling allow designers to determine the appropriate geometry for generating desirable flow parameters. The end product of this research will be tools and guidelines for more robust structure design or retrofits based upon predictable engineering and hydraulic performance criteria. © 2009 ASCE.

  19. Numerical Tension Adjustment of X-Ray Membrane to Represent Goat Skin Kompang

    NASA Astrophysics Data System (ADS)

    Syiddiq, M.; Siswanto, W. A.

    2017-01-01

This paper presents a numerical membrane model of the traditional musical instrument kompang, used to find the membrane tension at which an x-ray film membrane represents the classical goat-skin membrane of the kompang. In this study, an experiment on the kompang is first conducted in an acoustical anechoic enclosure, and in parallel a mathematical model of the kompang membrane is developed to simulate its vibration in polar coordinates by implementing a Fourier-Bessel wave function. The wave equation in the polar direction in mode (0,1) provides the corresponding natural frequencies of the circular membrane. The initial and boundary conditions in the function are determined from experiment to allow the correct development of the numerical equation. The numerical mathematical model is coded in SMath, which serves both for accurate numerical analysis and as the plotting tool. Two kompang membrane cases with different membrane materials, i.e. goat-skin and x-ray film membranes with a fixed radius of 0.1 m, are used in the experiment. An alternative kompang membrane made of x-ray film with the appropriate tension setting can be used to represent the sound of the traditional goat-skin kompang; the tension setting at which the x-ray membrane resembles the goat skin is 24 N. The numerical tool thus helps kompang makers set the tension of an x-ray membrane, and in future applications a traditional kompang of any size can have its membrane replaced by another material if the tension is set to the correct value.
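The mode-(0,1) frequency used in such studies follows from the Fourier-Bessel solution of the circular-membrane wave equation: f = j₀,₁/(2πa) · √(T/σ). The sketch below evaluates that formula; the tension and surface density are assumed, illustrative values, not the paper's measurements (the quoted "24 N" setting is reported without a clear per-unit-length basis).

```python
import numpy as np

J01 = 2.404825557695773  # first zero of the Bessel function J0

def fundamental_frequency(tension, surface_density, radius):
    """Mode-(0,1) frequency (Hz) of an ideal circular membrane:
    f = j_{0,1} / (2*pi*a) * sqrt(T / sigma), with T the membrane
    tension per unit length (N/m) and sigma the surface density
    (kg/m^2)."""
    return J01 / (2.0 * np.pi * radius) * np.sqrt(tension / surface_density)

# Assumed values for illustration: a = 0.1 m as in the study,
# sigma = 0.25 kg/m^2, T = 240 N/m.
print(fundamental_frequency(240.0, 0.25, 0.1))  # roughly 119 Hz
```

Note the scaling built into the formula: quadrupling the tension doubles the pitch, which is the knob a kompang maker actually turns.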

  20. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    NASA Astrophysics Data System (ADS)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.

  1. The "Vsoil Platform" : a tool to integrate the various physical, chemical and biological processes contributing to the soil functioning at the local scale.

    NASA Astrophysics Data System (ADS)

Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Moitrier, Nicolas; Balesdent, Jérome; Bruckler, Laurent; Moitrier, Nathalie; Nouguier, Cédric; Richard, Guy

    2014-05-01

Models describing soil functioning are valuable tools for addressing challenging issues related to agricultural production, soil protection or biogeochemical cycles. Coupling models that address different scientific fields is required in order to develop numerical tools able to simulate the complex interactions and feedbacks occurring within a soil profile in interaction with climate and human activities. We present here a component-based modelling platform named "VSoil", which aims at designing, developing, implementing and coupling numerical representations of biogeochemical and physical processes in soil, from the aggregate to the profile scale. The platform consists of four applications: i) Vsoil_Processes, dedicated to the conceptual description of processes and of their inputs and outputs; ii) Vsoil_Modules, devoted to the development of numerical representations of elementary processes as modules; iii) Vsoil_Models, which permits the coupling of modules to create models; and iv) Vsoil_Player, for running the model and the primary analysis of results. The platform is designed to be a collaborative tool, helping scientists to share not only their models, but also the scientific knowledge on which the models are built. It is based on the idea that processes of any kind can be described and characterized by their inputs (state variables required) and their outputs. The links between the processes are automatically detected by the platform. For any process, several numerical representations (modules) can be developed and made available to platform users. When developing modules, the platform takes care of many aspects of the development task so that the user can focus on numerical calculations. Fortran 2008 and C++ are the supported languages, and existing codes can be easily incorporated into platform modules.
Building a model from available modules simply requires selecting the processes to be accounted for and, for each process, a module. During this task, the platform displays available modules and checks the compatibility between them. The model (main program) is automatically created when compatible modules have been selected for all the processes. A GUI is automatically generated to help the user provide parameters and initial conditions. Numerical results can be immediately visualized, archived and exported. The platform also provides facilities to carry out sensitivity analysis. Parameter estimation and links with databases are being developed. The platform can be freely downloaded from the web site (http://www.inra.fr/sol_virtuel/) with a set of processes, variables, modules and models. It is designed so that any user can add their own components. These add-ons can be shared with co-workers through an export/import mechanism using e-mail, and can also be made available to the whole community of platform users at the developer's request. A filtering tool is available to explore the content of the platform (processes, variables, modules, models).
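The input/output matching that such a platform performs when assembling a model can be sketched minimally as follows; module and variable names are hypothetical, not actual Vsoil components.

```python
def check_compatibility(modules, forcings=()):
    """modules maps a process name to (inputs, outputs). Returns the
    set of required inputs that no selected module produces and no
    external forcing supplies; an empty set means a model can be
    assembled from this selection."""
    produced = set(forcings)
    for inputs, outputs in modules.values():
        produced |= set(outputs)
    missing = set()
    for inputs, outputs in modules.values():
        missing |= set(inputs) - produced
    return missing

modules = {
    "water_flow":   (["rainfall"], ["soil_moisture"]),
    "carbon_cycle": (["soil_moisture", "temperature"], ["co2_flux"]),
}
print(check_compatibility(modules, forcings=["rainfall", "temperature"]))
# -> set(): every input is produced or forced, so a model can be built
```

Declaring inputs and outputs explicitly is what lets the platform detect links between processes automatically and reject incompatible module selections before any code is run.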

  2. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    ERIC Educational Resources Information Center

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  3. Making it Easy to Construct Accurate Hydrological Models that Exploit High Performance Computers (Invited)

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Terrel, A.; Certik, O.; Seljebotn, D.

    2013-12-01

This presentation will focus on two barriers to progress in the hydrological modeling community, and research and development conducted to lessen or eliminate them. The first is a barrier to sharing hydrological models among specialized scientists, caused by intertwining the implementation of numerical methods with the implementation of abstract numerical modeling information. In the Proteus toolkit for computational methods and simulation, we have decoupled these two important parts of the computational model through separate "physics" and "numerics" interfaces. More recently we have begun developing the Strong Form Language for easy and direct representation of the mathematical model formulation in a domain-specific language embedded in Python. The second major barrier is sharing ANY scientific software tools that have complex library or module dependencies, as most parallel, multi-physics hydrological models must have. In this setting, users and developers are dependent on an entire distribution, possibly involving multiple compilers and special instructions that depend on the environment of the target machine. To solve these problems we have developed hashdist, a stateless package management tool, and a resulting portable, open source scientific software distribution.
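The physics/numerics decoupling described above can be illustrated schematically: below, an upwind finite-difference step is written against whatever "physics" object supplies a flux, so the two halves can evolve independently. The class names are invented for the sketch and are not the real Proteus API.

```python
import numpy as np

class AdvectionPhysics:
    """'Physics' side: model coefficients and flux only; knows
    nothing about grids or discretizations."""
    def __init__(self, speed):
        self.speed = speed

    def flux(self, u):
        return self.speed * u

class UpwindNumerics:
    """'Numerics' side: a first-order upwind step on a periodic
    domain, usable with any physics object exposing a flux."""
    def step(self, physics, u, dx, dt):
        f = physics.flux(u)
        return u - dt / dx * (f - np.roll(f, 1))

u = np.zeros(100)
u[40:60] = 1.0
u_new = UpwindNumerics().step(AdvectionPhysics(speed=1.0), u, dx=1.0, dt=0.5)
print(np.isclose(u_new.sum(), u.sum()))  # True: the update is conservative
```

Swapping in a different physics (say, a nonlinear flux) or a different numerics (a higher-order scheme) requires touching only one of the two classes, which is the sharing benefit the abstract argues for.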

  4. Common Analysis Tool Being Developed for Aeropropulsion: The National Cycle Program Within the Numerical Propulsion System Simulation Environment

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    1999-01-01

The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines: the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.

  5. Computational fluid dynamics applications to improve crop production systems

    USDA-ARS?s Scientific Manuscript database

Computational fluid dynamics (CFD), a numerical analysis and simulation tool for fluid flow processes, has emerged from the development stage and become a robust design tool. It is widely used to study various transport phenomena involving fluid flow, heat and mass transfer, providing det...

  6. Continued Development of Expert System Tools for NPSS Engine Diagnostics

    NASA Technical Reports Server (NTRS)

    Lewandowski, Henry

    1996-01-01

    The objectives of this grant were to work with previously developed NPSS (Numerical Propulsion System Simulation) tools and enhance their functionality; explore similar AI systems; and work with the High Performance Computing Communication (HPCC) K-12 program. Activities for this reporting period are briefly summarized and a paper addressing the implementation, monitoring and zooming in a distributed jet engine simulation is included as an attachment.

  7. Numerical simulation of tunneling through arbitrary potential barriers applied on MIM and MIIM rectenna diodes

    NASA Astrophysics Data System (ADS)

    Abdolkader, Tarek M.; Shaker, Ahmed; Alahmadi, A. N. M.

    2018-07-01

With the continuous miniaturization of electronic devices, quantum-mechanical effects such as tunneling become more significant in many device applications. In this paper, a numerical simulation tool is developed in a MATLAB environment to calculate the tunneling probability and current through an arbitrary potential barrier, comparing three different numerical techniques: the finite difference method, the transfer matrix method, and the transmission line method. For benchmarking, the tool is applied to several case studies, such as the rectangular single barrier, the rectangular double barrier, and a continuous bell-shaped potential barrier, each compared to analytical solutions and giving the dependence of the error on the number of mesh points. In addition, a thorough study of the J-V characteristics of MIM and MIIM diodes, used as rectifiers for rectenna solar cells, is presented, and simulations are compared to experimental results showing satisfactory agreement. At the undergraduate level, the tool provides deeper insight for students comparing numerical techniques used to solve various tunneling problems and helps them choose a suitable technique for a given application.
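Of the three techniques compared, the transfer matrix method is the easiest to sketch for a piecewise-constant barrier. The code below is an independent illustration (not the paper's tool): it composes plane-wave interface matrices and checks the result against the textbook single-barrier transmission formula.

```python
import numpy as np

def transmission(E, Vs, widths):
    """Transfer-matrix transmission probability through a piecewise-
    constant potential. Natural units hbar = 1, m = 1/2, so the local
    wavevector is k = sqrt(E - V), imaginary inside the barrier."""
    V_all = [0.0] + list(Vs) + [0.0]          # free regions on both sides
    k = [np.sqrt(complex(E - V)) for V in V_all]
    x_if = np.concatenate(([0.0], np.cumsum(widths)))  # interface positions
    M = np.eye(2, dtype=complex)
    for j, x in enumerate(x_if):
        kj, kn = k[j], k[j + 1]
        r = kj / kn
        # Match psi and psi' at x; D maps amplitudes (A, B) across it.
        D = 0.5 * np.array(
            [[(1 + r) * np.exp(1j * (kj - kn) * x),
              (1 - r) * np.exp(-1j * (kj + kn) * x)],
             [(1 - r) * np.exp(1j * (kj + kn) * x),
              (1 + r) * np.exp(-1j * (kj - kn) * x)]])
        M = D @ M
    # No incoming wave from the right: t = det(M)/M[1,1] = 1/M[1,1] here.
    return abs(1.0 / M[1, 1]) ** 2

# Check against the analytic single rectangular barrier (E < V0):
E, V0, a = 1.0, 2.0, 1.5
kappa = np.sqrt(V0 - E)
T_exact = 1.0 / (1.0 + V0**2 * np.sinh(kappa * a)**2 / (4 * E * (V0 - E)))
print(abs(transmission(E, [V0], [a]) - T_exact) < 1e-10)  # True
```

Stacking more segments in `Vs`/`widths` handles double barriers or a staircase approximation of a smooth barrier with no change to the algorithm.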

  8. Evaluating the Zebrafish Embryo Toxicity Test for Pesticide Hazard Screening

    EPA Science Inventory

    Given the numerous chemicals used in society, it is critical to develop tools for accurate and efficient evaluation of potential risks to human and ecological receptors. Fish embryo acute toxicity tests are 1 tool that has been shown to be highly predictive of standard, more reso...

  9. Modelling Student Misconceptions Using Nested Logit Item Response Models

    ERIC Educational Resources Information Center

    Yildiz, Mustafa

    2017-01-01

    Student misconceptions have been studied for decades from a curricular/instructional perspective and from the assessment/test level perspective. Numerous misconception assessment tools have been developed in order to measure students' misconceptions relative to the correct content. Often, these tools are used to make a variety of educational…

  10. Batch mode grid generation: An endangered species

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    1992-01-01

    Non-interactive grid generation schemes should thrive as emphasis shifts from development of numerical analysis and design methods to application of these tools to real engineering problems. A strong case is presented for the continued development and application of non-interactive geometry modeling methods. Guidelines, strategies, and techniques for developing and implementing these tools are presented using current non-interactive grid generation methods as examples. These schemes play an important role in the development of multidisciplinary analysis methods and some of these applications are also discussed.

  11. Computer-Numerical-Control and the EMCO Compact 5 Lathe.

    ERIC Educational Resources Information Center

    Mullen, Frank M.

    This laboratory manual is intended for use in teaching computer-numerical-control (CNC) programming using the Emco Maier Compact 5 Lathe. Developed for use at the postsecondary level, this material contains a short introduction to CNC machine tools. This section covers CNC programs, CNC machine axes, and CNC coordinate systems. The following…

  12. A review of decision support, risk communication and patient information tools for thrombolytic treatment in acute stroke: lessons for tool developers

    PubMed Central

    2013-01-01

Background Tools to support clinical or patient decision-making in the treatment/management of a health condition are used in a range of clinical settings for numerous preference-sensitive healthcare decisions. Their impact in clinical practice is largely dependent on their quality across a range of domains. We critically analysed currently available tools to support decision making or patient understanding in the treatment of acute ischaemic stroke with intravenous thrombolysis, as an exemplar to provide clinicians/researchers with practical guidance on development, evaluation and implementation of such tools for other preference-sensitive treatment options/decisions in different clinical contexts. Methods Tools were identified from bibliographic databases, Internet searches and a survey of UK and North American stroke networks. Two reviewers critically analysed tools to establish: information on benefits/risks of thrombolysis included in tools, and the methods used to convey probabilistic information (verbal descriptors, numerical and graphical); adherence to guidance on presenting outcome probabilities (IPDASi probabilities items) and information content (Picker Institute Checklist); readability (Fog Index); and the extent to which tools had comprehensive development processes. Results Nine of the 26 tools identified included information on a full range of benefits/risks of thrombolysis. Verbal descriptors, frequencies and percentages were used to convey probabilistic information in 20, 19 and 18 tools respectively, whilst nine used graphical methods. Shortcomings in presentation of outcome probabilities (e.g. omitting outcomes without treatment) were identified. Patient information tools had an aggregate median Fog Index score of 10. None of the tools had comprehensive development processes. Conclusions Tools to support decision making or patient understanding in the treatment of acute stroke with thrombolysis have been sub-optimally developed.
Development of tools should utilise mixed methods and strategies to meaningfully involve clinicians, patients and their relatives in an iterative design process; include evidence-based methods to augment interpretability of textual and probabilistic information (e.g. graphical displays showing natural frequencies) on the full range of outcome states associated with available options; and address patients with different levels of health literacy. Implementation of tools will be enhanced when mechanisms are in place to periodically assess the relevance of tools and where necessary, update the mode of delivery, form and information content. PMID:23777368

  13. Numerical model for healthy and injured ankle ligaments.

    PubMed

    Forestiero, Antonella; Carniel, Emanuele Luigi; Fontanella, Chiara Giulia; Natali, Arturo Nicola

    2017-06-01

The aim of this work is to provide a computational tool for the investigation of ankle mechanics under different loading conditions. Attention is focused on the biomechanical role of the ankle ligaments, which are fundamental to joint stability. A finite element model of the human foot is developed from Computed Tomography and Magnetic Resonance Imaging data, with particular attention to the definition of the ankle ligaments. A refined fiber-reinforced visco-hyperelastic constitutive model is assumed to characterize the mechanical response of the ligaments. Numerical analyses that interpret the anterior drawer and talar tilt tests reported in the literature are performed. The numerical results are in agreement with the range of values obtained in experimental tests, confirming the accuracy of the adopted procedure. The increase in the ankle range of motion after rupture of selected ligaments is also evaluated, demonstrating the capability of the numerical models to interpret damage conditions. The developed computational model provides a tool for the investigation of foot and ankle functionality in terms of tissue stress-strain states and ankle motion, considering different types of damage to the ankle ligaments.

  14. Composite use of numerical groundwater flow modeling and geoinformatics techniques for monitoring Indus Basin aquifer, Pakistan.

    PubMed

    Ahmad, Zulfiqar; Ashraf, Arshad; Fryar, Alan; Akhter, Gulraiz

    2011-02-01

The integration of the Geographic Information System (GIS) with groundwater modeling and satellite remote sensing capabilities has provided an efficient way of analyzing and monitoring groundwater behavior and its associated land conditions. A 3-dimensional finite element model (Feflow) has been used for regional groundwater flow modeling of the Upper Chaj Doab in the Indus Basin, Pakistan. The approach of using GIS techniques to partially fulfill the data requirements and define the parameters of existing hydrologic models was adopted. The numerical groundwater flow model is developed to configure the groundwater equipotential surface and hydraulic head gradient, and to estimate the groundwater budget of the aquifer. GIS is used for spatial database development and for integration with remote sensing and numerical groundwater flow modeling capabilities. The thematic layers of soils, land use, hydrology, infrastructure, and climate were developed using GIS. The ArcView GIS software is used as an additional tool to develop supporting data for numerical groundwater flow modeling and for the integration and presentation of image processing and modeling results. The groundwater flow model was calibrated to simulate future changes in piezometric heads for the period 2006 to 2020. Different scenarios were developed to study the impact of extreme climatic conditions (drought/flood) and variable groundwater abstraction on the regional groundwater system. The model results indicated a significant response of the water table to external influencing factors. The developed model provides an effective tool for evaluating management options and monitoring future groundwater development in the study area.

  15. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

A number of methodologies for verifying systems, together with computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: the STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  16. Numerical Flight Mechanics Analysis Of The SHEFEX I Ascent And Re-Entry Phases

    NASA Astrophysics Data System (ADS)

    Bartolome Calvo, Javier; Eggers, Thino

    2011-08-01

The SHarp Edge Flight EXperiment (SHEFEX) I provides a huge amount of scientific data for validating numerical tools in hypersonic flows. These data allow the direct comparison of flight measurements with the current numerical tools available at DLR. Therefore, this paper is devoted to applying a recently developed direct coupling between aerodynamics and flight dynamics to the SHEFEX I flight. In a first step, mission analyses are carried out using the trajectory optimization program REENT 6D coupled to Missile DATCOM. In a second step, the direct coupling between the trajectory program and the DLR TAU code, in which the unsteady Euler equations including rigid body motion are solved, is applied to analyze some interesting parts of the ascent and re-entry phases of the flight experiment. The agreement of the numerical predictions with the obtained flight data is satisfactory, assuming a variable fin deflection angle.

  17. Comparisons of the topographic characteristics and electrical charge distributions among Babesia-infected erythrocytes and extraerythrocytic merozoites using AFM

    USDA-ARS?s Scientific Manuscript database

    Tick-borne Babesia parasites are responsible for costly diseases worldwide. Improved control and prevention tools are urgently needed, but development of such tools is limited by numerous gaps in knowledge of the parasite-host relationships. We hereby used atomic force microscopy (AFM) and Kelvin pr...

  18. Education Faculty Students' Views about Use of E-Books

    ERIC Educational Resources Information Center

    Yalman, Murat

    2015-01-01

    Parallel to technological developments, numerous new tools are now available for people's use. Societies adapt these tools to their professional lives by learning how to use them. In this way, they try to establish more comfortable working environments. Universities giving vocational education are supposed to teach these new technologies to their…

  19. Assessing Probabilistic Reasoning in Verbal-Numerical and Graphical-Pictorial Formats: An Evaluation of the Psychometric Properties of an Instrument

    ERIC Educational Resources Information Center

    Agus, Mirian; Penna, Maria Pietronilla; Peró-Cebollero, Maribel; Guàrdia-Olmos, Joan

    2016-01-01

    Research on the graphical facilitation of probabilistic reasoning has been characterised by the effort expended to identify valid assessment tools. The authors developed an assessment instrument to compare reasoning performances when problems were presented in verbal-numerical and graphical-pictorial formats. A sample of undergraduate psychology…

  20. Interface between a printed circuit board computer aided design tool (Tektronix 4051 based) and a numerical paper tape controlled drill press (Slo-Syn 530: 100 w/ Dumore Automatic Head Number 8391)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heckman, B.K.; Chinn, V.K.

    1981-01-01

The development and use of computer programs written to produce the paper tape needed for the automation, or numerical control, of drill presses employed to fabricate computer-designed printed circuit boards are described. (LCL)

  1. Structural characterization and numerical simulations of flow properties of standard and reservoir carbonate rocks using micro-tomography

    NASA Astrophysics Data System (ADS)

    Islam, Amina; Chevalier, Sylvie; Sassi, Mohamed

    2018-04-01

With advances in imaging techniques and computational power, Digital Rock Physics (DRP) is becoming an increasingly popular tool to characterize reservoir samples and determine their internal structure and flow properties. In this work, we present the details of the imaging, segmentation, and numerical simulation of single-phase flow through a standard homogeneous Silurian dolomite core plug sample as well as through a heterogeneous sample from a carbonate reservoir. We develop a procedure that integrates experimental results into the segmentation step to calibrate the porosity. We also investigate two different numerical tools for the simulation: Avizo Fire Xlab Hydro, which solves the Stokes equations via the finite volume method, and Palabos, which solves the same equations using the Lattice Boltzmann Method. Representative Elementary Volume (REV) and isotropy studies are conducted on the two samples, and we show how DRP can be a useful tool to characterize rock properties that are time-consuming and costly to obtain experimentally.
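Once a pore-scale simulation yields a volumetric flow rate for an imposed pressure drop, the absolute permeability reported by such studies follows from Darcy's law. The sketch below is a minimal illustration of that post-processing step; the sample dimensions and flow values are hypothetical, not taken from this record.

```python
# Back out absolute permeability from a single-phase flow simulation
# using Darcy's law: k = Q * mu * L / (A * dP).
# All inputs are hypothetical SI values for illustration.

MILLIDARCY_PER_M2 = 1.0 / 9.869233e-16  # 1 mD = 9.869233e-16 m^2

def darcy_permeability(Q, mu, L, A, dP):
    """Absolute permeability in m^2.

    Q  : volumetric flow rate through the sample (m^3/s)
    mu : dynamic viscosity of the fluid (Pa.s)
    L  : sample length along the flow direction (m)
    A  : cross-sectional area normal to the flow (m^2)
    dP : pressure drop across the sample (Pa)
    """
    return Q * mu * L / (A * dP)

# Hypothetical micro-plug: 1 mm long, 1 mm^2 cross section, water-like fluid.
k = darcy_permeability(Q=1e-9, mu=1e-3, L=1e-3, A=1e-6, dP=1e3)
print(k, k * MILLIDARCY_PER_M2)  # m^2 and millidarcy
```

Both a finite-volume Stokes solver and a Lattice Boltzmann solver ultimately feed this same relation; only the way Q is computed differs.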

  2. CIM at GE's factory of the future

    NASA Astrophysics Data System (ADS)

    Waldman, H.

Functional features of a highly automated aircraft component batch processing factory are described. The system has processing, working, and methodology components. A rotating-parts operation installed 20 years ago features a high density of numerically controlled machines and is connected to a hierarchical network of data communications and apparatus for moving the rotating parts and tooling of engines. Designs produced at one location in the country are sent by telephone link to other sites for the development of manufacturing plans, tooling, numerical control programs, and process instructions for the rotating parts. Direct numerical control is implemented at the work stations, which have instructions stored on tape as back-up in case the host computer goes down. Each machine is automatically monitored at 48 points, and notice of failure can originate from any point in the system.

  3. Recovery Discontinuous Galerkin Jacobian-free Newton-Krylov Method for all-speed flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HyeongKae Park; Robert Nourgaliev; Vincent Mousseau

    2008-07-01

There is increasing interest in developing the next generation of simulation tools for advanced nuclear energy systems. These tools will utilize state-of-the-art numerical algorithms and computer science technology in order to maximize predictive capability, support advanced reactor designs, reduce uncertainty and increase safety margins. In analyzing nuclear energy systems, we are interested in compressible low-Mach-number, high-heat-flux flows with a wide range of Re, Ra, and Pr numbers. Under these conditions, the focus is placed on turbulent heat transfer, in contrast to other industries whose main interest is in capturing turbulent mixing. Our objective is to develop single-point turbulence closure models for large-scale engineering CFD codes, using Direct Numerical Simulation (DNS) or Large Eddy Simulation (LES) tools, which requires very accurate and efficient numerical algorithms. The focus of this work is placed on fully-implicit, high-order spatiotemporal discretization based on the discontinuous Galerkin method solving the conservative form of the compressible Navier-Stokes equations. The method utilizes a local reconstruction procedure derived from a weak formulation of the problem, which is inspired by the recovery diffusion flux algorithm of van Leer and Nomura [?] and by the piecewise parabolic reconstruction [?] of the finite volume method. The developed methodology is integrated into the Jacobian-free Newton-Krylov framework [?] to allow a fully-implicit solution of the problem.
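The defining trick of a Jacobian-free Newton-Krylov solver is that the Krylov iteration only ever needs Jacobian-vector products, which can be approximated with one extra residual evaluation instead of assembling the Jacobian. A minimal sketch of that idea (not this paper's implementation):

```python
import numpy as np

def jfnk_matvec(F, u, v):
    """Approximate J(u) @ v by a forward finite difference of the residual F,
    avoiding explicit Jacobian assembly: J v ~ (F(u + eps*v) - F(u)) / eps."""
    nv = np.linalg.norm(v)
    if nv == 0.0:
        return np.zeros_like(v)
    # Common perturbation-size heuristic: balance truncation vs round-off error.
    eps = np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(u)) / nv
    return (F(u + eps * v) - F(u)) / eps

# Check against a residual with a known Jacobian: F(u) = A u, so J v = A v.
A = np.array([[2.0, 1.0], [0.0, 3.0]])
F = lambda u: A @ u
u = np.array([1.0, 1.0])
v = np.array([1.0, 2.0])
print(jfnk_matvec(F, u, v))  # ~ A @ v = [4, 6]
```

In a full solver this matvec would be handed to a Krylov method such as GMRES inside each Newton step, with the residual F coming from the discontinuous Galerkin discretization.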

  4. Space-weather assets developed by the French space-physics community

    NASA Astrophysics Data System (ADS)

    Rouillard, A. P.; Pinto, R. F.; Brun, A. S.; Briand, C.; Bourdarie, S.; Dudok De Wit, T.; Amari, T.; Blelly, P.-L.; Buchlin, E.; Chambodut, A.; Claret, A.; Corbard, T.; Génot, V.; Guennou, C.; Klein, K. L.; Koechlin, L.; Lavarra, M.; Lavraud, B.; Leblanc, F.; Lemorton, J.; Lilensten, J.; Lopez-Ariste, A.; Marchaudon, A.; Masson, S.; Pariat, E.; Reville, V.; Turc, L.; Vilmer, N.; Zucarello, F. P.

    2016-12-01

    We present a short review of space-weather tools and services developed and maintained by the French space-physics community. They include unique data from ground-based observatories, advanced numerical models, automated identification and tracking tools, a range of space instrumentation and interconnected virtual observatories. The aim of the article is to highlight some advances achieved in this field of research at the national level over the last decade and how certain assets could be combined to produce better space-weather tools exploitable by space-weather centres and customers worldwide. This review illustrates the wide range of expertise developed nationally but is not a systematic review of all assets developed in France.

  5. Integrated Control Modeling for Propulsion Systems Using NPSS

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Felder, James L.; Lavelle, Thomas M.; Withrow, Colleen A.; Yu, Albert Y.; Lehmann, William V. A.

    2004-01-01

The Numerical Propulsion System Simulation (NPSS), an advanced engineering simulation environment used to design and analyze aircraft engines, has been enhanced by integrating control development tools into it. One of these tools is a generic controller interface that allows NPSS to communicate with control development software environments such as MATLAB and EASY5. The other tool is a linear model generator (LMG) that gives NPSS the ability to generate linear, time-invariant state-space models. Integrating these tools into NPSS enables it to be used for control system development. This paper discusses the development and integration of these tools into NPSS. In addition, it shows a comparison of transient model results of a generic, dual-spool, military-type engine model that has been implemented in both NPSS and Simulink. It also shows the linear model generator's ability to approximate the dynamics of a nonlinear NPSS engine model.
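The abstract does not detail how the linear model generator works internally, but the standard way to extract a linear, time-invariant state-space model from a nonlinear simulation is to perturb each state and input about a trim point with central differences. A hedged sketch of that generic idea (the toy dynamics below are hypothetical, not an NPSS engine model):

```python
import numpy as np

def linearize(f, x0, u0, h=1e-6):
    """Central-difference linearization of dx/dt = f(x, u) about (x0, u0),
    returning the state-space matrices A = df/dx and B = df/du."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        e = np.zeros(n); e[j] = h
        A[:, j] = (f(x0 + e, u0) - f(x0 - e, u0)) / (2 * h)
    for j in range(m):
        e = np.zeros(m); e[j] = h
        B[:, j] = (f(x0, u0 + e) - f(x0, u0 - e)) / (2 * h)
    return A, B

# Toy nonlinear dynamics with a known Jacobian at x = [1, 0], u = [1]:
f = lambda x, u: np.array([-x[0]**2 + u[0], x[0] - 2.0 * x[1]])
A, B = linearize(f, np.array([1.0, 0.0]), np.array([1.0]))
print(A)  # ~ [[-2, 0], [1, -2]]
print(B)  # ~ [[1], [0]]
```

The resulting (A, B) pair is exactly the kind of linear model a control designer would hand to MATLAB or EASY5 for controller synthesis.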

  6. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
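RT1D itself is written in VBA, but the operator-splitting idea behind such one-dimensional reactive transport codes is easy to sketch: advect, apply the boundary condition, then react. With a Courant number of one the upwind advection step is an exact shift, so this toy first-order-decay column reproduces the analytical steady-state profile C(x) = C0·exp(-kx/v). All parameters are illustrative, not taken from the paper's benchmarks.

```python
import numpy as np

# 1D advective transport with first-order decay, solved by operator splitting:
# (1) upwind advection at Courant number v*dt/dx = 1 (an exact shift),
# (2) exact first-order reaction step C <- C * exp(-k*dt).
v, dx = 1.0, 0.1        # velocity (m/s) and cell size (m)
dt = dx / v             # Courant number 1
k, C0 = 0.5, 1.0        # decay rate (1/s) and inlet concentration
n_cells, n_steps = 50, 200

C = np.zeros(n_cells)
for _ in range(n_steps):
    C[1:] = C[:-1]          # advection: shift one cell downstream
    C[0] = C0               # inlet boundary condition
    C *= np.exp(-k * dt)    # reaction: exact decay over dt

# Analytical steady state sampled at the cell outlet faces x = (i+1)*dx.
x_out = dx * np.arange(1, n_cells + 1)
C_exact = C0 * np.exp(-k * x_out / v)
print(np.max(np.abs(C - C_exact)))  # ~ 0 at steady state
```

Real benchmarks replace the single decay step with a biochemical or geochemical reaction network integrated cell by cell, but the splitting structure is the same.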

  7. Development of a behaviour-based measurement tool with defined intervention level for assessing acute pain in cats.

    PubMed

    Calvo, G; Holden, E; Reid, J; Scott, E M; Firth, A; Bell, A; Robertson, S; Nolan, A M

    2014-12-01

To develop a composite measure pain scale tool to assess acute pain in cats and derive an intervention score. To develop the prototype composite measure pain scale-feline, words describing painful cats were collected, grouped into behavioural categories and ranked. To assess prototype validity, two observers independently assigned composite measure pain scale-feline and numerical rating scale scores to 25 hospitalised cats before and after analgesic treatment. Following interim analysis the prototype was revised (revised composite measure pain scale-feline). To determine the intervention score, two observers independently assigned revised composite measure pain scale-feline and numerical rating scale scores to 116 cats. A further observer, a veterinarian, stated whether analgesia was necessary. Mean ± sd decreases in revised composite measure pain scale-feline and numerical rating scale scores following analgesia were 2.4 ± 2.87 and 1.9 ± 2.34, respectively (95% confidence interval for mean change in revised composite measure pain scale-feline between 1.21 and 3.6). Changes in revised composite measure pain scale-feline and numerical rating scale were significantly correlated (r = 0.8) (P < 0.001). An intervention level score of ≥4/16 was derived for the revised composite measure pain scale-feline (26.7% misclassification) and ≥3/10 for the numerical rating scale (14.5% misclassification). A valid instrument with a recommended analgesic intervention level has been developed to assess acute clinical pain in cats that should be readily applicable in practice. © 2014 British Small Animal Veterinary Association.

  8. Seminar on Factual and Numerical Data Banks. Final Report (Rabat, Morocco, February 21-24, 1984).

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific and Cultural Organization, Paris (France). General Information Programme.

    The proceedings of a seminar on factual and numerical data banks are described. Seminar objectives were to: (1) make potential users aware of the value of data banks in their respective disciplines and inform them of the tools available; (2) identify national and regional data bank requirements; and (3) define a strategy for development in this…

  9. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  10. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  11. Servo-controlling structure of five-axis CNC system for real-time NURBS interpolating

    NASA Astrophysics Data System (ADS)

    Chen, Liangji; Guo, Guangsong; Li, Huiying

    2017-07-01

NURBS (Non-Uniform Rational B-Spline) curves are widely used in CAD/CAM (Computer-Aided Design / Computer-Aided Manufacturing) to represent sculptured curves or surfaces. In this paper, we develop a 5-axis NURBS real-time interpolator and realize it in our developing CNC (Computer Numerical Control) system. First, we use two NURBS curves to represent the tool-tip and tool-axis paths respectively. From the feedrate and a Taylor series expansion, the servo-controlling signals of the 5 axes are obtained for each interpolating cycle. Then, the procedure for generating NC (Numerical Control) code with the presented method is introduced, and the method for integrating the interpolator into our developing CNC system is given. The servo-controlling structure of the CNC system is also introduced. The illustration indicates that the proposed method can enhance machining accuracy and that the spline interpolator is feasible for 5-axis CNC systems.
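The feedrate-based parameter update referred to above is commonly the first-order Taylor step u_{k+1} = u_k + F·Ts/|C'(u_k)|, so that each interpolation cycle advances roughly the commanded distance F·Ts along the curve. The sketch below uses a quadratic Bézier segment (a NURBS curve with uniform weights) as a stand-in for the tool-tip path; the geometry and feedrate are hypothetical.

```python
import numpy as np

# Quadratic Bezier segment (a NURBS curve with uniform weights), in mm.
P0, P1, P2 = np.array([0.0, 0.0]), np.array([50.0, 0.0]), np.array([100.0, 50.0])

def C(u):   # curve point
    return (1 - u)**2 * P0 + 2 * u * (1 - u) * P1 + u**2 * P2

def dC(u):  # first derivative with respect to the parameter u
    return 2 * (1 - u) * (P1 - P0) + 2 * u * (P2 - P1)

F, Ts = 60.0, 0.001      # feedrate (mm/s) and interpolation cycle (s)
u, pts = 0.0, [C(0.0)]
while u < 1.0:
    # First-order Taylor feedrate interpolation: du = F*Ts / |C'(u)|.
    u = min(u + F * Ts / np.linalg.norm(dC(u)), 1.0)
    pts.append(C(u))

steps = np.linalg.norm(np.diff(np.array(pts), axis=0), axis=1)
# All interior steps should stay close to the commanded F*Ts = 0.06 mm.
print(steps[:-1].min(), steps[:-1].max())
```

Higher-order Taylor terms reduce the residual feedrate fluctuation further; the last (clamped) step is excluded from the check because it simply lands on u = 1.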

  12. Investigation of the mechanical behaviour of the foot skin.

    PubMed

    Fontanella, C G; Carniel, E L; Forestiero, A; Natali, A N

    2014-11-01

The aim of this work was to provide computational tools for the characterization of the actual mechanical behaviour of foot skin, accounting for results from experimental testing and histological investigation. Such results show the typical features of skin mechanics, such as an anisotropic configuration, almost incompressible behaviour, and material and geometrical nonlinearity. The anisotropic behaviour is mainly determined by the distribution of collagen fibres along specific directions, usually identified as cleavage lines. To evaluate the biomechanical response of foot skin, a refined numerical model of the foot is developed. The overall mechanical behaviour of the skin is interpreted by a fibre-reinforced hyperelastic constitutive model, and the orientation of the cleavage lines is implemented by a specific procedure. Numerical analyses that interpret typical loading conditions of the foot are performed. The influence of fibre orientation and distribution on skin mechanics is also outlined by comparison with results from an isotropic scheme. A specific constitutive formulation is provided to characterize the mechanical behaviour of foot skin. The formulation is applied within a numerical model of the foot to investigate skin functionality during typical foot movements. Numerical analyses accounting for the actual anisotropic configuration of the skin show lower maximum principal stress fields than results from isotropic analyses. The developed computational models provide reliable tools for the investigation of foot tissue functionality. Furthermore, the comparison between numerical results from anisotropic and isotropic models shows the optimal configuration of foot skin. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation

    PubMed Central

    Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee

    2018-01-01

This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact, as well as the state of the tool deformation. The input-output relationship of the model was represented by a radial basis functions network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of the recording and rendering devices and can be easily deployed to an existing simulator. The model also supports complex interactions, such as self-contact and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the perceptual soundness of the proposed model, while the user study revealed the force feedback of the proposed simulator to be realistic. PMID:29342964
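The paper's model maps a six-dimensional contact/deformation state to a force response through a radial basis function network. A generic, numpy-only sketch of that kind of mapping (Gaussian kernels centred on the training samples, with a small ridge term for numerical conditioning; the data here are synthetic stand-ins, not the paper's recordings):

```python
import numpy as np

class RBFNet:
    """Gaussian radial basis function network: y(x) = sum_j w_j * phi(||x - c_j||)."""
    def __init__(self, sigma=1.0, ridge=1e-10):
        self.sigma, self.ridge = sigma, ridge

    def _phi(self, X, C):
        # Pairwise squared distances between query points X and centres C.
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.sigma ** 2))

    def fit(self, X, Y):
        self.C = X                      # centres placed at the training inputs
        Phi = self._phi(X, self.C)
        # Small ridge term keeps the kernel system well conditioned.
        self.W = np.linalg.solve(Phi + self.ridge * np.eye(len(X)), Y)
        return self

    def predict(self, X):
        return self._phi(X, self.C) @ self.W

# Synthetic stand-in data: 6-D input state -> 3-D force vector.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 6))
Y = np.stack([np.sin(X[:, 0]) + X[:, 1],
              np.cos(X[:, 2]) * X[:, 3],
              X[:, 4] - X[:, 5] ** 2], axis=1)

net = RBFNet(sigma=1.0).fit(X, Y)
print(np.abs(net.predict(X) - Y).max())  # near-interpolation of the training data
```

A production model would choose centres and kernel width by cross-validation rather than interpolating every sample, but the structure of the input-output mapping is the same.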

  14. Multiscale Software Tool for Controls Prototyping in Supersonic Combustors

    DTIC Science & Technology

    2004-04-01

and design software (GEMA, NPSS, LES combustion). We are partners with major propulsion system developers (GE, Rolls-Royce, Aerojet), and a...participant in the NASA/GRC Numerical Propulsion System Simulation (NPSS) program. The principal investigator is the primary developer (Pindera, 2001) of a

  15. Thermo-elasto-plastic simulations of femtosecond laser-induced multiple-cavity in fused silica

    NASA Astrophysics Data System (ADS)

    Beuton, R.; Chimier, B.; Breil, J.; Hébert, D.; Mishchik, K.; Lopez, J.; Maire, P. H.; Duchateau, G.

    2018-04-01

The formation and the interaction of multiple cavities, induced by tightly focused femtosecond laser pulses, are studied using a purpose-built numerical tool that includes the thermo-elasto-plastic material response. Simulations are performed in fused silica for cases of one, two, and four spots of laser energy deposition. The relaxation of the heated matter, launching shock waves into the surrounding cold material, leads to cavity formation and the emergence of areas where cracks may be induced. The results show that the laser-induced structure shape depends on the energy deposition configuration and demonstrate the potential of the numerical tool for obtaining a desired structure design or technological process.

  16. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    NASA Astrophysics Data System (ADS)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

Industrial concerns arise regarding the significant cost of cutting tools in the machining process. In particular, an improper replacement policy can lead either to scrap, or to premature tool replacements that discard still-serviceable tools. ISO 3685 provides the flank wear end-of-life criterion. Flank wear is also the nominal type of wear yielding the longest tool lifetimes under optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. In order to aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals were shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed between runs of the model in order to obtain results that are specific to a given level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature failed to provide similar studies in this material, either numerical or experimental. Therefore, further development of this work should include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved through this work.

  17. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua

This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
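The reduced-form step described above — run the complex CGE model many times, then regress the consequence estimate on the threat characteristics — can be illustrated generically. Here a hypothetical function stands in for the CGE model, and ordinary least squares recovers the reduced-form coefficients; none of the numbers below are E-CAT's.

```python
import numpy as np

rng = np.random.default_rng(42)

def cge_model(magnitude, duration):
    """Hypothetical stand-in for an expensive CGE simulation: economic loss
    as a linear function of threat characteristics plus simulation noise."""
    return 2.0 + 1.5 * magnitude + 0.8 * duration + rng.normal(0, 0.01)

# Step 1: generate synthetic data by sampling the "complex model".
mags = rng.uniform(0, 5, 200)
durs = rng.uniform(0, 10, 200)
loss = np.array([cge_model(m, d) for m, d in zip(mags, durs)])

# Step 2: estimate a single reduced-form regression equation by OLS.
X = np.column_stack([np.ones_like(mags), mags, durs])
coef, *_ = np.linalg.lstsq(X, loss, rcond=None)
print(coef)  # ~ [2.0, 1.5, 0.8]

# Step 3: the cheap reduced form now replaces the complex model for rapid estimates.
quick_estimate = X[:1] @ coef
```

The real tool works threat by threat with many more explanatory variables, but this is the essential complex-model-to-regression transformation.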

  18. Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Y; Glascoe, L

The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users across this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
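The analytical-versus-numerical tradeoff described above can be seen on the simplest biodegradation model, first-order decay: the analytical solution C(t) = C0·e^{-kt} is essentially free to evaluate, while an explicit-Euler numerical solution trades step count for accuracy. The rate constant and horizon below are illustrative values, not from the chapter.

```python
import math

# First-order biodegradation of a contaminant: dC/dt = -k * C.
k, C0, t_end = 0.3, 100.0, 10.0   # 1/day, mg/L, days (illustrative values)

# Analytical screening answer: closed form, evaluated instantly.
C_analytical = C0 * math.exp(-k * t_end)

# Numerical answer: explicit Euler, accuracy controlled by the step count.
def euler_decay(n_steps):
    dt = t_end / n_steps
    C = C0
    for _ in range(n_steps):
        C -= k * C * dt
    return C

coarse, fine = euler_decay(20), euler_decay(1000)
print(C_analytical, coarse, fine)
# The finer step converges toward the analytical value; the coarse one
# shows the discretization error a numerical model must pay for.
```

For multi-species reactive transport no closed form exists and the numerical route is mandatory, which is exactly when the extra computational effort buys resolution an analytical screening model cannot provide.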

  19. Model-Driven Useware Engineering

    NASA Astrophysics Data System (ADS)

    Meixner, Gerrit; Seissler, Marc; Breiner, Kai

    User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.

  20. Simulation tools for guided wave based structural health monitoring

    NASA Astrophysics Data System (ADS)

    Mesnil, Olivier; Imperiale, Alexandre; Demaldent, Edouard; Baronian, Vahan; Chapuis, Bastien

    2018-04-01

    Structural Health Monitoring (SHM) is a discipline derived from Non Destructive Evaluation (NDE), based on the integration of sensors onto or into a structure in order to monitor its health without disturbing its regular operating cycle. Guided wave based SHM relies on the propagation of guided waves in plate-like or extruded structures. Using piezoelectric transducers to generate and receive guided waves is one of the most widely accepted paradigms due to the low cost and low weight of those sensors. A wide range of techniques for flaw detection based on this setup is available in the literature, but very few of these techniques have found industrial applications yet. A major difficulty comes from the sensitivity of guided waves to a substantial number of parameters, such as temperature or geometrical singularities, which makes guided wave measurements difficult to analyze. In order to apply guided wave based SHM techniques to a wider spectrum of applications and to transfer those techniques to industry, the CEA LIST develops novel numerical methods. These methods facilitate the evaluation of the robustness of SHM techniques for multiple applicative cases and ease the analysis of the influence of various parameters, such as sensor positioning or environmental conditions. The first numerical tool is the guided wave module integrated into the commercial software CIVA, relying on a hybrid modal-finite element formulation to compute the guided wave response of perturbations (cavities, flaws…) in extruded structures of arbitrary cross section, such as rails or pipes. The second numerical tool is based on the spectral element method [2] and simulates guided waves in both isotropic (metal) and orthotropic (composite) plate-like structures. This tool is designed to match the widely accepted sparse piezoelectric transducer array SHM configuration, in which each embedded sensor acts as both emitter and receiver of guided waves. This tool is under development and will be adapted to simulate complex real-life structures such as curved composite panels with stiffeners. This communication presents these numerical tools and their main functionalities.

  1. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims for Unified Modeling Language (UML) models at different levels of abstraction, arguing that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features: traditional ones provide a model editor and a model repository, while the most advanced ones add a code generator (possibly driven by a scripting or domain-specific language (DSL)), a transformation tool that produces new artifacts from manually created ones, and a transformation definition editor for defining new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) with respect to the level of automation they offer.

  2. Numerical study on the splitting of a vapor bubble in the ultrasonic assisted EDM process with the curved tool and workpiece.

    PubMed

    Shervani-Tabar, M T; Seyed-Sadjadi, M H; Shabgard, M R

    2013-01-01

    Electrical discharge machining (EDM) is a powerful and modern method of machining. In the EDM process, a vapor bubble is generated between the tool and the workpiece in the dielectric liquid due to an electrical discharge. In this process, the dynamic behavior of the vapor bubble affects the machining process. Vibration of the tool surface affects bubble behavior and consequently affects the material removal rate (MRR). In this paper, the dynamic behavior of the vapor bubble in an ultrasonic assisted EDM process after the appearance of the necking phenomenon is investigated; the necking phenomenon occurs when the bubble takes the shape of an hourglass. After the appearance of the necking phenomenon, the vapor bubble splits into two parts, and two liquid jets develop on the boundaries of the upper and lower parts of the vapor bubble. The liquid jet on the upper part of the bubble impinges on the tool, and the liquid jet on the lower part of the bubble impinges on the workpiece. These liquid jets evacuate debris from the gap between the tool and the workpiece and also erode the workpiece and the tool. Curved tool and workpiece surfaces affect the shape and the velocity of the liquid jets during splitting of the vapor bubble. In this paper, the dynamics of the vapor bubble after its splitting near the curved tool and workpiece are investigated in three cases: in the first case the surfaces of the tool and the workpiece are flat, in the second case they are convex, and in the third case they are concave. Numerical results show that in the third case the liquid jets developed on the boundaries of the upper and lower parts of the vapor bubble after its splitting have the highest velocities and broader shapes than in the other cases. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments.

    PubMed

    Marčan, Marija; Pavliha, Denis; Kos, Bor; Forjanič, Tadeja; Miklavčič, Damijan

    2015-01-01

    Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and is best achieved through patient-specific treatment planning. The treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. It enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and a reduction in planning time from 1-2 days to a few hours. The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating the expansion of electroporation-based treatments in the clinic and for ensuring reliable treatment for patients. An additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed.
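
The coverage criterion at the heart of this kind of treatment planning, that the entire tumor volume must see a sufficiently high electric field, can be sketched as a simple check over sampled field values. The voxel values and the 600 V/cm threshold below are hypothetical placeholders, not values from the paper.

```python
def coverage_fraction(field_magnitudes, threshold):
    """Fraction of sampled tumor voxels where |E| meets the electroporation
    threshold -- the coverage criterion a treatment plan must satisfy."""
    covered = sum(1 for e in field_magnitudes if e >= threshold)
    return covered / len(field_magnitudes)

# Hypothetical voxel samples (V/cm) against an assumed 600 V/cm threshold
samples = [450.0, 620.0, 710.0, 590.0, 880.0, 1020.0, 640.0, 705.0]
print(coverage_fraction(samples, 600.0))  # 6 of 8 voxels covered -> 0.75
```

In a real planner this check would run over the full 3D field solution, and the optimizer would adjust electrode positions and voltages until the fraction reaches 1.0.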

  4. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments

    PubMed Central

    2015-01-01

    Background Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and is best achieved through patient-specific treatment planning. The treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. Methods In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. It enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Results Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and a reduction in planning time from 1-2 days to a few hours. Conclusions The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating the expansion of electroporation-based treatments in the clinic and for ensuring reliable treatment for patients. An additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed. PMID:26356007

  5. Development of a radial ventricular assist device using numerical predictions and experimental haemolysis.

    PubMed

    Carswell, Dave; Hilton, Andy; Chan, Chris; McBride, Diane; Croft, Nick; Slone, Avril; Cross, Mark; Foster, Graham

    2013-08-01

    The objective of this study was to demonstrate the potential of Computational Fluid Dynamics (CFD) simulations in predicting the levels of haemolysis in ventricular assist devices (VADs). Three different prototypes of a radial flow VAD were examined experimentally and computationally using CFD modelling to assess device haemolysis. The flow fields were computed using a CFD model developed with the commercial software ANSYS CFX 13 and a set of custom haemolysis analysis tools. Experimental values for the Normalised Index of Haemolysis (NIH) were 0.020 g/100 L, 0.014 g/100 L and 0.0042 g/100 L for the three designs. Numerical analysis predicts an NIH of 0.021 g/100 L, 0.017 g/100 L and 0.0057 g/100 L, respectively. The differences between experimental and numerical results vary between 0.0012 and 0.003 g/100 L, with a variation of 5% for Pump 1 and somewhat larger percentage differences for the other pumps. The work detailed herein demonstrates how CFD simulation and, more importantly, the numerical prediction of haemolysis may be used as an effective tool to help the designers of VADs manage the flow paths within pumps, resulting in a less haemolytic device. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
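
The experimental NIH figures quoted above come from blood-loop measurements. The sketch below uses the ASTM F1841-style definition of NIH as an assumption (the abstract does not state the exact formula used), and all input numbers are hypothetical.

```python
def normalized_index_of_hemolysis(delta_fhb, volume_l, hct_pct, q_lpm, t_min):
    """Normalised Index of Haemolysis in g/100 L, per the ASTM F1841-style
    definition (an assumption here, not quoted from the paper):
        NIH = dfHb * V * (100 - Hct)/100 * 100 / (Q * T)
    delta_fhb: rise in plasma free haemoglobin over the test (g/L)
    volume_l:  blood loop volume (L)
    hct_pct:   hematocrit (%)
    q_lpm:     pump flow rate (L/min)
    t_min:     test duration (min)"""
    return delta_fhb * volume_l * (100.0 - hct_pct) / 100.0 * 100.0 / (q_lpm * t_min)

# Hypothetical 6-hour loop test: 0.02 g/L free-Hb rise, 0.45 L loop,
# 30% hematocrit, 5 L/min flow
nih = normalized_index_of_hemolysis(0.02, 0.45, 30.0, 5.0, 360.0)
```

Comparing such measured values against CFD-predicted ones, as the paper does for its three pump designs, is how the haemolysis model is validated.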

  6. Successes and Challenges of Incompressible Flow Simulation

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2003-01-01

    During the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools have become indispensable in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period and discusses some of the current challenges.

  7. Assessment of the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Iannetti, Anthony; Shih, Tsan-Hsing

    2007-01-01

    The advancements made during the last decade in the areas of combustion modeling, numerical simulation, and computing platforms have greatly facilitated the use of CFD-based tools in the development of combustion technology. Further development of verification, validation, and uncertainty quantification will have a profound impact on the reliability and utility of these CFD-based tools. The objectives of the present effort are to establish a baseline for the National Combustion Code (NCC) against experimental data, as well as to document current capabilities and identify gaps for further improvement.

  8. Finite Element Modelling and Analysis of Conventional Pultrusion Processes

    NASA Astrophysics Data System (ADS)

    Akishin, P.; Barkanov, E.; Bondarchuk, A.

    2015-11-01

    Pultrusion is one of many composite manufacturing techniques and one of the most efficient methods for producing fiber reinforced polymer composite parts with a constant cross-section. Numerical simulation is helpful for understanding the manufacturing process and developing scientific means for the pultrusion tooling design. A numerical technique based on the finite element method has been developed for the simulation of pultrusion processes. It uses the general purpose finite element software ANSYS Mechanical. It is shown that the developed technique predicts temperature and cure profiles that are in good agreement with those published in the open literature.
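
As a hedged illustration of the thermal half of such a model, the sketch below solves the 1-D transient heat equation through a die wall with an explicit finite-difference scheme. The paper's actual ANSYS Mechanical model is three-dimensional and also couples cure kinetics, which are omitted here; all numbers are illustrative.

```python
def heat_profile_1d(t_left, t_right, t_init, alpha, dx, dt, steps, n):
    """Explicit finite-difference solution of the 1-D heat equation
    dT/dt = alpha * d2T/dx2 on n nodes with fixed end temperatures.
    A minimal sketch of the conduction part of a pultrusion die model."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme is unstable for r > 0.5"
    T = [t_init] * n
    T[0], T[-1] = t_left, t_right       # fixed boundary temperatures
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):       # interior update, standard stencil
            Tn[i] = T[i] + r * (T[i+1] - 2.0*T[i] + T[i-1])
        T = Tn
    return T

# Illustrative run: heated side at 100, cooled side at 0, initially 20
profile = heat_profile_1d(100.0, 0.0, 20.0, 1.0, 1.0, 0.25, 2000, 11)
```

After enough steps the profile relaxes to the linear steady state between the two boundary temperatures, a convenient sanity check before adding the exothermic cure source term.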

  9. Numerical Model of Flame Spread Over Solids in Microgravity: A Supplementary Tool for Designing a Space Experiment

    NASA Technical Reports Server (NTRS)

    Shih, Hsin-Yi; Tien, James S.; Ferkul, Paul (Technical Monitor)

    2001-01-01

    The recently developed numerical model of concurrent-flow flame spread over thin solids has been used as a simulation tool to support the design of a space experiment. The two-dimensional and three-dimensional steady forms of the compressible Navier-Stokes equations with chemical reactions are solved. With the coupled multi-dimensional radiative heat transfer solver, the model is capable of answering a number of questions regarding the experiment concept and the hardware design. In this paper, the capabilities of the numerical model are demonstrated by the guidance it provides on several experimental design issues. The test matrix and operating conditions of the experiment are estimated from the modeling results. Three-dimensional calculations are made to simulate the flame-spreading experiment with a realistic hardware configuration, and the computed detailed flame structures provide insight for data collection. In addition, the heating load and the product-exhaust cleanup requirements for the flow tunnel are estimated with the model. We anticipate that using this simulation tool will enable a more efficient and successful space experiment to be conducted.

  10. Appropedia as a Tool for Service Learning in Sustainable Development

    ERIC Educational Resources Information Center

    Pearce, Joshua M.

    2009-01-01

    Numerous studies have demonstrated that university students are capable of contributing to sustainable development while improving their academic skills. Unfortunately for many institutions, the expense of sending large cohorts of students on international service learning trips is prohibitive. Yet, students remain enthusiastic and well equipped…

  11. Operating System For Numerically Controlled Milling Machine

    NASA Technical Reports Server (NTRS)

    Ray, R. B.

    1992-01-01

    OPMILL program is operating system for Kearney and Trecker milling machine providing fast easy way to program manufacture of machine parts with IBM-compatible personal computer. Gives machinist "equation plotter" feature, which plots equations that define movements and converts equations to milling-machine-controlling program moving cutter along defined path. System includes tool-manager software handling up to 25 tools and automatically adjusts to account for each tool. Developed on IBM PS/2 computer running DOS 3.3 with 1 MB of random-access memory.
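
The "equation plotter" idea, converting a user-supplied equation into a cutter path, can be sketched as sampling a parametric curve and emitting straight-line moves. OPMILL's real command format is not documented here, so generic G-code (G0 rapid, G1 feed) is used as a stand-in; the function name and feed rate are illustrative.

```python
import math

def toolpath_gcode(fx, fy, t0, t1, steps, feed=200.0):
    """Sample a parametric curve (x(t), y(t)) and emit straight-line G1
    moves along it -- a sketch of the equation-to-toolpath idea.
    Generic G-code is an assumption; OPMILL's own format may differ."""
    lines = ["G0 X{:.3f} Y{:.3f}".format(fx(t0), fy(t0))]   # rapid to start
    for i in range(1, steps + 1):
        t = t0 + (t1 - t0) * i / steps
        lines.append("G1 X{:.3f} Y{:.3f} F{:.0f}".format(fx(t), fy(t), feed))
    return lines

# Quarter-circle of radius 10 as the cutter path, approximated by 8 chords
path = toolpath_gcode(lambda t: 10 * math.cos(t), lambda t: 10 * math.sin(t),
                      0.0, math.pi / 2, 8)
print("\n".join(path))
```

A real postprocessor would also manage tool-length and radius compensation, which the abstract notes OPMILL's tool manager handles automatically.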

  12. Genomic analysis and geographic visualization of H5N1 and SARS-CoV.

    PubMed

    Hill, Andrew W; Alexandrov, Boyan; Guralnick, Robert P; Janies, Daniel

    2007-10-11

    Emerging infectious diseases and organisms present critical issues of national security, public health, and economic welfare. We still understand little about the zoonotic potential of many viruses. To this end, we are developing novel database tools to manage comparative genomic datasets. These tools add value because they allow us to summarize the direction, frequency, and order of genomic changes. We will perform numerous real-world tests of our tools on both avian influenza and coronaviruses.

  13. Development and testing of a numerical simulation method for thermally nonequilibrium dissociating flows in ANSYS Fluent

    NASA Astrophysics Data System (ADS)

    Shoev, G. V.; Bondar, Ye. A.; Oblapenko, G. P.; Kustova, E. V.

    2016-03-01

    Various issues of numerical simulation of supersonic gas flows with allowance for thermochemical nonequilibrium on the basis of fluid dynamic equations in the two-temperature approximation are discussed. The computational tool for modeling flows with thermochemical nonequilibrium is the commercial software package ANSYS Fluent with an additional user-defined open-code module. A comparative analysis of results obtained by various models of vibration-dissociation coupling in binary gas mixtures of nitrogen and oxygen is performed. Results of numerical simulations are compared with available experimental data.

  14. Experimental and Numerical Examination of the Thermal Transmittance of High Performance Window Frames

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustavsen Ph.D., Arild; Goudey, Howdy; Kohler, Christian

    2010-06-17

    While window frames typically represent 20-30 percent of the overall window area, their impact on the total window heat transfer rates may be much larger. This effect is even greater in low-conductance (highly insulating) windows which incorporate very low conductance glazings. Developing low-conductance window frames requires accurate simulation tools for product research and development. The Passivhaus Institute in Germany states that windows (glazing and frames, combined) should have U-values not exceeding 0.80 W/(m²·K). This has created a niche market for highly insulating frames, with frame U-values typically around 0.7-1.0 W/(m²·K). The U-values reported are often based on numerical simulations according to international simulation standards. It is prudent to check the accuracy of these calculation standards, especially for high performance products, before more manufacturers begin to use them to improve other product offerings. In this paper the thermal transmittance of five highly insulating window frames (three wooden frames, one aluminum frame and one PVC frame), found from numerical simulations and experiments, is compared. Hot box calorimeter results are compared with numerical simulations according to ISO 10077-2 and ISO 15099. In addition, CFD simulations have been carried out in order to use the most accurate tool available to investigate the convection and radiation effects inside the frame cavities. Our results show that available tools commonly used to evaluate window performance, based on ISO standards, give good overall agreement, but specific areas need improvement.
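
The quantity being compared between hot-box measurement and simulation is the thermal transmittance U = Q / (A · ΔT). A minimal sketch with hypothetical hot-box numbers (the measured heat flow, area, and temperatures below are placeholders, not data from the paper):

```python
def frame_u_value(q_watts, area_m2, t_in, t_out):
    """Thermal transmittance U = Q / (A * dT) in W/(m2*K) -- the quantity
    compared between hot-box calorimetry and ISO 10077-2 / ISO 15099
    simulations in the paper."""
    return q_watts / (area_m2 * (t_in - t_out))

# Hypothetical reading: 4.2 W through 0.35 m2 of frame at a 20 K difference,
# which lands in the paper's "highly insulating" 0.7-1.0 range or below
print(frame_u_value(4.2, 0.35, 20.0, 0.0))  # about 0.6 W/(m2*K)
```

In practice the measured Q must first be corrected for flanking losses and for the glazing or calibration panel mounted in the frame, which is where much of the experimental difficulty lies.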

  15. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    ERIC Educational Resources Information Center

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  16. Numerical simulation of electromagnetic waves in Schwarzschild space-time by finite difference time domain method and Green function method

    NASA Astrophysics Data System (ADS)

    Jia, Shouqing; La, Dongsheng; Ma, Xuelian

    2018-04-01

    The finite difference time domain (FDTD) algorithm and Green function algorithm are implemented into the numerical simulation of electromagnetic waves in Schwarzschild space-time. FDTD method in curved space-time is developed by filling the flat space-time with an equivalent medium. Green function in curved space-time is obtained by solving transport equations. Simulation results validate both the FDTD code and Green function code. The methods developed in this paper offer a tool to solve electromagnetic scattering problems.
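
The equivalent-medium idea can be made concrete: in isotropic coordinates, Schwarzschild space-time can be mimicked by a radially varying refractive index filling flat space. The form below is one common choice from the literature; the paper does not state which form its FDTD code uses, so treat this as an assumption.

```python
def equivalent_index(rho, rs):
    """Effective refractive index of the flat-space medium that mimics
    Schwarzschild space-time in isotropic coordinates (one common
    equivalent-medium form, assumed here):
        n(rho) = (1 + rs/(4 rho))^3 / (1 - rs/(4 rho))
    rho: isotropic radial coordinate, rs: Schwarzschild radius."""
    u = rs / (4.0 * rho)
    return (1.0 + u) ** 3 / (1.0 - u)
```

Far from the mass the index tends to 1 (flat space), and in the weak-field limit it reduces to n ≈ 1 + rs/ρ, recovering the classic factor-of-two light bending.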

  17. CLIPS: An expert system building tool

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1991-01-01

    The C Language Integrated Production System (CLIPS) is an expert system building tool, which provides a complete environment for the development and delivery of rule and/or object based expert systems. CLIPS was specifically designed to provide a low cost option for developing and deploying expert system applications across a wide range of hardware platforms. The commercial potential of CLIPS is vast. Currently, CLIPS is being used by over 3,300 individuals throughout the public and private sector. Because the CLIPS source code is readily available, numerous groups have used CLIPS as a basis for their own expert system tools. To date, three commercially available tools have been derived from CLIPS. In general, the development of CLIPS has helped to improve the ability to deliver expert system technology throughout the public and private sectors for a wide range of applications and diverse computing environments.

  18. Fast algorithms for Quadrature by Expansion I: Globally valid expansions

    NASA Astrophysics Data System (ADS)

    Rachh, Manas; Klöckner, Andreas; O'Neil, Michael

    2017-09-01

    The use of integral equation methods for the efficient numerical solution of PDE boundary value problems requires two main tools: quadrature rules for the evaluation of layer potential integral operators with singular kernels, and fast algorithms for solving the resulting dense linear systems. Classically, these tools were developed separately. In this work, we present a unified numerical scheme based on coupling Quadrature by Expansion, a recent quadrature method, to a customized Fast Multipole Method (FMM) for the Helmholtz equation in two dimensions. The method allows the evaluation of layer potentials in linear-time complexity, anywhere in space, with a uniform, user-chosen level of accuracy as a black-box computational method. Providing this capability requires geometric and algorithmic considerations beyond the needs of standard FMMs as well as careful consideration of the accuracy of multipole translations. We illustrate the speed and accuracy of our method with various numerical examples.

  19. CFD applications: The Lockheed perspective

    NASA Technical Reports Server (NTRS)

    Miranda, Luis R.

    1987-01-01

    The Numerical Aerodynamic Simulator (NAS) epitomizes the coming of age of supercomputing and opens exciting horizons in the world of numerical simulation. An overview of supercomputing at Lockheed Corporation in the area of Computational Fluid Dynamics (CFD) is presented. This overview focuses on developments and applications of CFD as an aircraft design tool and attempts to present an assessment, within this context, of the state of the art in CFD methodology.

  20. Algorithms for the Fractional Calculus: A Selection of Numerical Methods

    NASA Technical Reports Server (NTRS)

    Diethelm, K.; Ford, N. J.; Freed, A. D.; Luchko, Yu.

    2003-01-01

    Many recently developed models in areas like viscoelasticity, electrochemistry, diffusion processes, etc. are formulated in terms of derivatives (and integrals) of fractional (non-integer) order. In this paper we present a collection of numerical algorithms for the solution of the various problems arising in this context. We believe that this will give the engineer the necessary tools required to work with fractional models in an efficient way.
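
One of the basic algorithms in this family is the Grünwald-Letnikov scheme, whose binomial weights can be generated by a simple recursion. The sketch below is just one of the several methods such a survey covers, and the step size and test function are illustrative.

```python
def gl_fractional_derivative(f, alpha, t, h=1e-3):
    """Grunwald-Letnikov approximation of the order-alpha fractional
    derivative of f at t (lower terminal 0). The binomial weights follow
    the recursion w_k = w_{k-1} * (1 - (alpha + 1)/k), so no factorials
    or Gamma functions are needed."""
    n = int(round(t / h))
    acc, w = 0.0, 1.0
    for k in range(n + 1):
        acc += w * f(t - k * h)
        w *= 1.0 - (alpha + 1.0) / (k + 1)
    return acc / h ** alpha
```

For integer alpha the weights truncate and the scheme collapses to the ordinary backward difference; for alpha = 0.5 applied to f(t) = t it approaches the known value t^(1/2)/Γ(3/2).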

  1. Interactive visualization of numerical simulation results: A tool for mission planning and data analysis

    NASA Technical Reports Server (NTRS)

    Berchem, J.; Raeder, J.; Walker, R. J.; Ashour-Abdalla, M.

    1995-01-01

    We report on the development of an interactive system for visualizing and analyzing numerical simulation results. This system is based on visualization modules which use the Application Visualization System (AVS) and the NCAR graphics packages. Examples from recent simulations are presented to illustrate how these modules can be used for displaying and manipulating simulation results to facilitate their comparison with phenomenological model results and observations.

  2. The AAPT/ComPADRE Digital Library: Supporting Physics Education at All Levels

    NASA Astrophysics Data System (ADS)

    Mason, Bruce

    For more than a decade, the AAPT/ComPADRE Digital Library has been providing online resources, tools, and services that support broad communities of physics faculty and physics education researchers. This online library provides vetted resources for teachers and students, an environment for authors and developers to share their work, and the collaboration tools for a diverse set of users. This talk will focus on the recent collaborations and developments being hosted on or developed with ComPADRE. Examples include PhysPort, making the tools and resources developed by physics education researchers more accessible, the Open Source Physics project, expanding the use of numerical modeling at all levels of physics education, and PICUP, a community for those promoting computation in the physics curriculum. NSF-0435336, 0532798, 0840768, 0937836.

  3. A Coupled Multiphysics Approach for Simulating Induced Seismicity, Ground Acceleration and Structural Damage

    NASA Astrophysics Data System (ADS)

    Podgorney, Robert; Coleman, Justin; Wilkins, Andrew; Huang, Hai; Veeraraghavan, Swetha; Xia, Yidong; Permann, Cody

    2017-04-01

    Numerical modeling has played an important role in understanding the behavior of coupled subsurface thermal-hydro-mechanical (THM) processes associated with a number of energy and environmental applications since as early as the 1970s. While the ability to rigorously describe all key tightly coupled controlling physics still remains a challenge, there have been significant advances in recent decades. These advances are related primarily to the exponential growth of computational power, the development of more accurate equations of state, improvements in the ability to represent heterogeneity and reservoir geometry, and more robust nonlinear solution schemes. The work described in this paper documents the development and linkage of several fully-coupled and fully-implicit modeling tools. These tools simulate: (1) the dynamics of fluid flow, heat transport, and quasi-static rock mechanics; (2) seismic wave propagation from the sources of energy release through heterogeneous material; and (3) the soil-structural damage resulting from ground acceleration. The tools are developed in Idaho National Laboratory's parallel Multiphysics Object Oriented Simulation Environment and are integrated using a global implicit approach. The governing equations are presented, the numerical approach for simultaneously solving and coupling the three physics tools is discussed, and the data input and output methodology is outlined. An example is presented to demonstrate the capabilities of the coupled multiphysics approach: a system conceptually similar to the geothermal development in Basel, Switzerland, is simulated, and the resultant induced seismicity, ground motion, and structural damage are predicted.

  4. A Pythonic Approach for Computational Geosciences and Geo-Data Processing

    NASA Astrophysics Data System (ADS)

    Morra, G.; Yuen, D. A.; Lee, S. M.

    2016-12-01

    Computational methods and data analysis play a constantly increasing role in the Earth Sciences; however, students and professionals need to climb a steep learning curve before reaching a level that allows them to run effective models. Furthermore, the recent arrival of powerful new machine learning tools such as Torch and TensorFlow has opened new possibilities but also created a new realm of complications related to the completely different technology employed. We present here a series of examples entirely written in Python, a language that combines the simplicity of Matlab with the power and speed of compiled languages such as C, and apply them to a wide range of geological processes such as porous media flow, multiphase fluid dynamics, creeping flow, and many-fault interaction. We also explore ways in which machine learning can be employed in combination with numerical modeling, from immediately interpreting a large number of modeling results to optimizing a set of modeling parameters to obtain a desired simulation. We show that by using Python, undergraduate and graduate students can learn advanced numerical technologies with a minimum of dedicated effort, which in turn encourages them to develop more numerical tools and quickly progress in their computational abilities. We also show how Python allows combining modeling with machine learning like LEGO pieces, thereby simplifying the transition towards a new kind of scientific geo-modeling. The conclusion is that Python is an ideal tool for creating an infrastructure for the geosciences that allows users to quickly develop tools, reuse techniques, and encourage collaborative efforts to interpret and integrate geo-data in profound new ways.
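
As a small example of the kind of porous-media exercise the abstract describes, the sketch below solves steady 1-D Darcy flow through layered cells of different permeability using flux continuity. Unit viscosity and unit cell spacing are simplifying assumptions, and the function name is illustrative.

```python
def steady_darcy_pressure(perm, p_left, p_right):
    """Steady 1-D Darcy flow across layered cells: in series, each cell of
    permeability k contributes a flow resistance 1/k (unit viscosity and
    spacing assumed), and flux continuity fixes one common flux q.
    Returns (q, list of pressures at each cell outlet)."""
    resistances = [1.0 / k for k in perm]
    total = sum(resistances)
    q = (p_left - p_right) / total      # common Darcy flux through all layers
    pressures, p = [], p_left
    for r in resistances:
        p -= q * r                      # pressure drop across each layer
        pressures.append(p)
    return q, pressures

# Two layers, the second four times more permeable: most of the drop
# occurs in the tight layer
q, ps = steady_darcy_pressure([1.0, 4.0], 1.0, 0.0)
```

This is the series-resistance (harmonic-mean) construction that underlies upscaled permeability in layered reservoirs, and it makes a compact first Python exercise before moving to 2-D solvers.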

  5. An Interactive Educational Tool for Compressible Aerodynamics

    NASA Technical Reports Server (NTRS)

    Benson, Thomas J.

    1994-01-01

    A workstation-based interactive educational tool was developed to aid in the teaching of undergraduate compressible aerodynamics. The tool solves for the supersonic flow past a wedge using the equations found in NACA 1135. The student varies the geometry or flow conditions through a graphical user interface and the new conditions are calculated immediately. Various graphical formats present the variation of flow results to the student. One such format leads the student to the generation of some of the graphs found in NACA 1135. The tool includes interactive questions and answers to aid in both the use of the tool and to develop an understanding of some of the complexities of compressible aerodynamics. A series of help screens make the simulator easy to learn and use. This paper will detail the numerical methods used in the tool and describe how it can be used and modified.
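
The core calculation behind such a wedge-flow tool is the theta-beta-M relation from NACA 1135. A hedged sketch that solves it for the weak-shock angle by bisection; the bracketing upper bound of 65 degrees assumes a moderate wedge angle at moderate Mach number, so it is not a general-purpose solver.

```python
import math

def wedge_shock_angle(mach, theta_deg, gamma=1.4):
    """Weak oblique-shock angle (degrees) for a wedge, from the
    theta-beta-M relation (NACA 1135):
        tan(theta) = 2*cot(beta)*(M^2 sin^2 beta - 1)
                     / (M^2*(gamma + cos 2 beta) + 2)
    Solved by bisection on the weak branch, bracketed between the Mach
    angle and an assumed 65-degree upper bound."""
    theta = math.radians(theta_deg)
    def f(beta):
        return (2.0 / math.tan(beta)
                * (mach**2 * math.sin(beta)**2 - 1.0)
                / (mach**2 * (gamma + math.cos(2.0 * beta)) + 2.0)
                - math.tan(theta))
    lo = math.asin(1.0 / mach) + 1e-9   # Mach angle: f < 0 just above it
    hi = math.radians(65.0)             # assumed bound below the strong branch
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return math.degrees(0.5 * (lo + hi))
```

For Mach 2 and a 10-degree wedge this returns the textbook weak-shock angle near 39.3 degrees, and a zero-degree wedge recovers the Mach angle, two checks the educational tool's graphs would also let a student make visually.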

  6. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    NASA Astrophysics Data System (ADS)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    The current paper is dedicated to present browser-based multimedia-rich software tools and e-learning curriculum to support the design and modeling process of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects will be discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop state-of-the art curriculum in Electrical Engineering. Another cooperative project with participation of Japanese, European and Australian institutes focuses especially on developing e-learning curriculum, interactive design and modeling tools, furthermore on development of a virtual laboratory. Snapshots from these two projects will be presented.

  7. A suite of benchmark and challenge problems for enhanced geothermal systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark; Fu, Pengcheng; McClure, Mark

    A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective for the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Teams representing U.S. national laboratories, universities, and industries participated in the study, and each team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems involved two phases of research (stimulation, development, and circulation) in two separate reservoirs. The challenge problems posed specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery. 
Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners. We present the suite of benchmark and challenge problems developed for the GTO-CCS, providing problem descriptions and sample solutions.

  8. Numerical simulations for active tectonic processes: increasing interoperability and performance

    NASA Technical Reports Server (NTRS)

    Donnellan, A.; Fox, G.; Rundle, J.; McLeod, D.; Tullis, T.; Grant, L.

    2002-01-01

    The objective of this project is to produce a system to fully model earthquake-related data. This task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling.

  9. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases: Phase 1 consists of code-to-code verification, and Phase 2 entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  10. A study of unstable rock failures using finite difference and discrete element methods

    NASA Astrophysics Data System (ADS)

    Garvey, Ryan J.

    Case histories in mining have long described pillars or faces of rock failing violently with an accompanying rapid ejection of debris and broken material into the working areas of the mine. These unstable failures have resulted in large losses of life and collapses of entire mine panels. Modern mining operations take significant steps to reduce the likelihood of unstable failure; however, eliminating their occurrence is difficult in practice. Researchers over several decades have supplemented studies of unstable failures through the application of various numerical methods. The direction of the current research is to extend these methods and to develop improved numerical tools with which to study unstable failures in underground mining layouts. An extensive study is first conducted on the expression of unstable failure in discrete element and finite difference methods. Simulated uniaxial compressive strength tests are run on brittle rock specimens. Stable or unstable loading conditions are applied to the brittle specimens by a pair of elastic platens with a range of stiffnesses. Determinations of instability are established through stress and strain histories taken for the specimen and the system. Additional numerical tools are then developed for the finite difference method to analyze unstable failure in larger mine models. Instability identifiers are established for assessing the locations and relative magnitudes of unstable failure through measures of rapid dynamic motion. An energy balance is developed which calculates the excess energy released as a result of unstable equilibria in rock systems. These tools are validated through uniaxial and triaxial compressive strength tests and are extended to models of coal pillars and a simplified mining layout. 
The results of the finite difference simulations reveal that the instability identifiers and excess energy calculations provide a generalized methodology for assessing unstable failures within potentially complex mine models. These combined numerical tools may be applied in future studies to design primary and secondary supports in bump-prone conditions, evaluate retreat mining cut sequences, assess pillar de-stressing techniques, or perform back-analyses of unstable failures in selected mining layouts.
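The platen-stiffness experiments described above reflect the classical loading-system stiffness criterion. As a deliberately simplified sketch (not the dissertation's actual tools), the criterion can be expressed as:

```python
def is_unstable(system_stiffness, post_peak_slope):
    """Classical stiffness criterion for unstable (violent) rock failure.

    system_stiffness: loading-system (platen) stiffness, > 0 (N/m).
    post_peak_slope: slope of the specimen's post-peak force-displacement
        curve, negative for strain-softening rock (N/m).
    Failure is unstable when the loading system is softer than the
    magnitude of the specimen's post-peak slope, i.e. the system cannot
    absorb the energy the softening specimen releases.
    """
    return system_stiffness < abs(post_peak_slope)
```

A stiff platen (the stable case in the simulated tests) suppresses violent failure; a soft platen permits it.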

  11. Inducer analysis/pump model development

    NASA Astrophysics Data System (ADS)

    Cheng, Gary C.

    1994-03-01

    Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.
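For background on the head rise such a pump model predicts (a textbook relation, not part of the FDNS solver), the ideal energy transfer across an impeller is given by the Euler turbomachine equation:

```python
def euler_head(U2, Ctheta2, U1=0.0, Ctheta1=0.0, g=9.81):
    """Ideal (Euler) head rise across an impeller, in meters.

    U1, U2: blade speeds at inlet/outlet (m/s).
    Ctheta1, Ctheta2: tangential components of absolute velocity (m/s).
    Real pumps deliver less head because of slip and losses, which is
    what a CFD model like the one above resolves.
    """
    return (U2 * Ctheta2 - U1 * Ctheta1) / g
```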

  12. Inducer analysis/pump model development

    NASA Technical Reports Server (NTRS)

    Cheng, Gary C.

    1994-01-01

    Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.

  13. Efficient simulation of press hardening process through integrated structural and CFD analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaniswamy, Hariharasudhan; Mondalek, Pamela; Wronski, Maciek

    Press-hardened steel parts are being increasingly used in automotive structures for their higher strength, meeting safety standards while reducing vehicle weight to improve fuel consumption. However, manufacturing sheet metal parts by the press hardening process to achieve desired properties is extremely challenging, as it involves complex interaction of plastic deformation, metallurgical change, thermal distribution, and fluid flow. Numerical simulation is critical for successful design of the process and for understanding the interaction among the numerous process parameters in order to control press hardening and consistently achieve the desired part properties. Until now there has been no integrated commercial software solution that can efficiently model the complete process: forming of the blank, heat transfer between the blank and tool, microstructure evolution in the blank, and heat loss from the tool to the fluid that flows through water channels in the tools. In this study, a numerical solution based on the Altair HyperWorks® product suite, involving RADIOSS®, a non-linear finite element based structural analysis solver, and AcuSolve®, an incompressible fluid flow solver based on the Galerkin Least Squares finite element method, has been developed for complete press hardening process design and analysis. RADIOSS is used to handle the plastic deformation, the heat transfer between the blank and tool, and the microstructure evolution in the blank during cooling, while AcuSolve is used to efficiently model heat loss from the tool to the fluid that flows through the water channels in the tools. The approach is demonstrated through several case studies.

  14. Experimental and numerical research on forging with torsion

    NASA Astrophysics Data System (ADS)

    Petrov, Mikhail A.; Subich, Vadim N.; Petrov, Pavel A.

    2017-10-01

    Increasing the efficiency of technological blank-production operations is closely tied to computer-aided technologies (CAx). On the one hand, the practical result represents reality exactly; on the other hand, developing a new process demands resources that are limited in SMEs. CAx tools were successfully applied to the development of a new process of forging with torsion and to the analysis of the results. It was shown that the theoretical calculations are confirmed both in practice and by numerical simulation. The most commonly used constructional materials were studied, the torque angles were determined, and the simulated results were evaluated experimentally.

  15. Evaluation and Validation (E&V) Team Public Report. Volume 5

    DTIC Science & Technology

    1990-10-31

    aspects, software engineering practices, etc. The E&V requirements which are developed will be used to guide the E&V technical effort. The currently...interoperability of Ada software engineering environment tools and data. The scope of the CAIS-A includes the functionality affecting transportability that is...requirement that they be CAIS conforming tools or data. That is, for example numerous CIVC data exist on special purpose software currently available

  16. Engineering With Nature Geographic Project Mapping Tool (EWN ProMap)

    DTIC Science & Technology

    2015-07-01

    EWN ProMap database provides numerous case studies for infrastructure projects such as breakwaters, river engineering dikes, and seawalls that have...the EWN Project Mapping Tool (EWN ProMap) is to assist users in their search for case study information that can be valuable for developing EWN ideas...Essential elements of EWN include: (1) using science and engineering to produce operational efficiencies supporting sustainable delivery of

  17. Electromagnetic Particle-In-Cell simulation on the impedance of a dipole antenna surrounded by an ion sheath

    NASA Astrophysics Data System (ADS)

    Miyake, Y.; Usui, H.; Kojima, H.; Omura, Y.; Matsumoto, H.

    2008-06-01

    We have developed a new numerical tool for the analysis of antenna impedance in a plasma environment, making use of electromagnetic Particle-In-Cell (PIC) plasma simulations. To validate the developed tool, we first examined the antenna impedance in a homogeneous kinetic plasma and confirmed that the obtained results basically agree with conventional theories. We next applied the tool to examine an ion-sheathed dipole antenna. The results confirmed that the inclusion of ion-sheath effects reduces the capacitance below the electron plasma frequency, and also revealed that the signature of the impedance resonance observed at the plasma frequency is modified by the presence of the sheath. Since the sheath dynamics are solved self-consistently by the PIC scheme throughout the antenna analysis, the developed tool is capable of the more practical and complicated antenna analyses that will be necessary in real space missions.

  18. Pratt and Whitney Space Propulsion NPSS Usage

    NASA Technical Reports Server (NTRS)

    Olson, Dean

    2004-01-01

    This talk presents Pratt and Whitney's space division overview of the Numerical Propulsion System Simulation (NPSS). It examines their reasons for wanting to use the NPSS system, their past activities supporting its development, and their planned future usage. It also gives an overview how different analysis tools fit into their overall product development.

  19. Community health assessment tool: a patterns approach to data collection and diagnosis.

    PubMed

    Kriegler, N F; Harton, M K

    1992-01-01

    Creation of an assessment tool to apply Gordon's functional patterns to the community as a client was a rewarding and stimulating project. Through use of the CHAT, students developed an appreciation of the complexity and inter-relationship of numerous aspects of the community. They completed the nursing process by developing appropriate nursing diagnoses, and planning, implementing, and evaluating a health promotion project. As the students continue to use this tool in the health promotion course, the diagnoses which they generate are being collected. From this accumulated input the plan is to compile a list of common diagnoses which are appropriate to use when the community is the client.

  20. Classical problems in computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    In relation to the expected problems in the development of computational aeroacoustics (CAA), the preliminary applications were to classical problems where the known analytical solutions could be used to validate the numerical results. Such comparisons were used to overcome the numerical problems inherent in these calculations. Comparisons were made between the various numerical approaches to the problems such as direct simulations, acoustic analogies and acoustic/viscous splitting techniques. The aim was to demonstrate the applicability of CAA as a tool in the same class as computational fluid dynamics. The scattering problems that occur are considered and simple sources are discussed.

  1. compuGUT: An in silico platform for simulating intestinal fermentation

    NASA Astrophysics Data System (ADS)

    Moorthy, Arun S.; Eberl, Hermann J.

    The microbiota inhabiting the colon and its effect on health is a topic of significant interest. In this paper, we describe the compuGUT - a simulation tool developed to assist in exploring interactions between intestinal microbiota and their environment. The primary numerical machinery is implemented in C, and the accessory scripts for loading and visualization are prepared in bash (LINUX) and R. SUNDIALS libraries are employed for numerical integration, and googleVis API for interactive visualization. Supplementary material includes a concise description of the underlying mathematical model, and detailed characterization of numerical errors and computing times associated with implementation parameters.

  2. Research on ARM Numerical Control System

    NASA Astrophysics Data System (ADS)

    Wei, Xu; JiHong, Chen

    Computerized Numerical Control (CNC) machine tools are the foundation of modern manufacturing systems, and their advanced digital technology is key to the sustainable development of the machine tool manufacturing industry. This paper presents the design of a CNC system embedded on ARM, covering both the hardware design and the supporting software. On the hardware side, the core driving chip of the motor control unit is the MCX314AL DSP motion controller developed by NOVA Electronics Co., Ltd. of Japan; its excellent performance, simple interface, and easy programming make machine control convenient. On the software side, the open-source uC/OS-II embedded operating system is selected, and the CNC system is broken down into modules in detail. Module priorities are assigned according to actual requirements, and the inter-module communication and interrupt-response mechanisms are designed to guarantee the real-time performance and reliability of the numerical control system. The system therefore not only meets current precision-machining requirements but also offers a good man-machine interface and network support for convenient use by a variety of craftsmen.

  3. Methods, Software and Tools for Three Numerical Applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. R. Jessup

    2000-03-01

    This is a report of the results of the authors' work supported by DOE contract DE-FG03-97ER25325. They proposed to study three numerical problems: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value decomposition (SVD) problems; and (3) the application of linear algebra techniques to an information retrieval method known as latent semantic indexing (LSI).

  4. Implementation of agronomical and geochemical modules into a 3D groundwater code for assessing nitrate storage and transport through unconfined Chalk aquifer

    NASA Astrophysics Data System (ADS)

    Picot-Colbeaux, Géraldine; Devau, Nicolas; Thiéry, Dominique; Pettenati, Marie; Surdyk, Nicolas; Parmentier, Marc; Amraoui, Nadia; Crastes de Paulet, François; André, Laurent

    2016-04-01

    The Chalk aquifer is the main water resource for domestic water supply in many parts of northern France. In some basins, groundwater is frequently affected by quality problems concerning nitrates. Often close to or above drinking water standards, nitrate concentrations in groundwater are mainly due to historical agricultural practices, combined with leakage and aquifer recharge through the vadose zone. The complexity of the processes occurring in such an environment requires drawing on considerable knowledge of agronomy, geochemistry, and hydrogeology in order to understand, model, and predict the spatiotemporal evolution of nitrate content and to provide a decision-support tool for water producers and stakeholders. Meeting this challenge requires developing conceptual and numerical models that accurately represent the specificity of the Chalk aquifer. A multidisciplinary approach is developed to simulate storage and transport from the ground surface to the groundwater. It involves a new agronomic module, "NITRATE" (NItrogen TRansfer for Arable soil to groundwaTEr), a soil-crop model for calculating the nitrogen mass balance in arable soil, and the "PHREEQC" numerical code for geochemical calculations, both coupled with the 3D transient groundwater numerical code "MARTHE". In addition, new developments in the MARTHE code allow the dual-porosity and dual-permeability calculations needed in the fissured Chalk aquifer context. Integrating these existing multidisciplinary tools is a real challenge: the number of parameters must be reduced by selecting the relevant equations and simplifying them without altering the signal. The robustness and validity of these numerical developments are tested step by step with several simulations constrained by climate forcing, land use, and nitrogen inputs over several decades. 
First, simulations are performed on a 1D vertical unsaturated soil column to reproduce experimental vertical nitrate soil profiles (0-30 m depth measurements in the Somme region). Second, the approach is used to simulate, with a 3D model, a drinking-water catchment area in order to compare calculated nitrate content time series with those measured at the domestic water pumping well since 1995 (field site in northern France, Avre Basin region). This numerical tool will support decision-making in all activities related to water use.

  5. Can a GIS toolbox assess the environmental risk of oil spills? Implementation for oil facilities in harbors.

    PubMed

    Valdor, Paloma F; Gómez, Aina G; Velarde, Víctor; Puente, Araceli

    2016-04-01

    Oil spills are one of the most widespread problems in port areas (loading/unloading of bulk liquid, fuel supply). Specific environmental risk analysis procedures for diffuse oil sources that are based on the evolution of oil in the marine environment are needed. Diffuse sources such as oil spills usually present a lack of information, which makes the use of numerical models an arduous and occasionally impossible task. For that reason, a tool that can assess the risk of oil spills in near-shore areas by using Geographical Information System (GIS) is presented. The SPILL Tool provides immediate results by automating the process without miscalculation errors. The tool was developed using the Python and ArcGIS scripting library to build a non-ambiguous geoprocessing workflow. The SPILL Tool was implemented for oil facilities at Tarragona Harbor (NE Spain) and validated showing a satisfactory correspondence (around 0.60 RSR error index) with the results obtained using a 2D calibrated oil transport numerical model. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Yeast synthetic biology toolbox and applications for biofuel production.

    PubMed

    Tsai, Ching-Sung; Kwak, Suryang; Turner, Timothy L; Jin, Yong-Su

    2015-02-01

    Yeasts are efficient biofuel producers with numerous advantages over their bacterial counterparts. While most synthetic biology tools have been developed and customized for bacteria, especially for Escherichia coli, yeast synthetic biology tools have also been exploited to improve yeast for producing fuels and chemicals from renewable biomass. Here we review the current status of synthetic biology tools and their applications for biofuel production, focusing on the model strain Saccharomyces cerevisiae. We describe assembly techniques that have been developed for constructing genes, pathways, and genomes in yeast. Moreover, we discuss synthetic parts allowing precise control of gene expression at both the transcriptional and translational levels. Applications of these synthetic biology approaches have led to the identification of effective gene targets that are responsible for desirable traits, such as cellulosic sugar utilization, advanced biofuel production, and enhanced tolerance against toxic products during biofuel production from renewable biomass. Although an array of synthetic biology tools and devices is available, we observe some gaps in tool development that must be closed to achieve industrial utilization. Looking forward, future tool development should focus on industrial cultivation conditions and industrial strains. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permission@oup.com.

  7. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

    In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  8. Numerical tools to predict the environmental loads for offshore structures under extreme weather conditions

    NASA Astrophysics Data System (ADS)

    Wu, Yanling

    2018-05-01

    In this paper, extreme waves were generated with the open source computational fluid dynamics (CFD) tools OpenFOAM and waves2Foam, using linear and nonlinear NewWave input, and were used to conduct numerical simulations of the wave impact process. Numerical tools based on first-order (with and without stretching) and second-order NewWave are investigated. Simulations predicting the force loading on an offshore platform under extreme weather conditions are implemented and compared.
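The first-order NewWave input used above can be sketched as follows (our illustrative reconstruction, not the paper's CFD setup): each spectral component's amplitude is proportional to the spectral density, and all components are phase-focused so the group peaks with the target crest height at the focus time.

```python
import numpy as np

def newwave_eta(t, A, omega, S, t_focus=0.0):
    """Linear NewWave free-surface elevation at a fixed point.

    A: target crest height at the focus time (m).
    omega: equally spaced angular frequencies (rad/s).
    S: spectral density evaluated at omega.
    """
    t = np.atleast_1d(np.asarray(t, dtype=float))
    d_omega = omega[1] - omega[0]
    a_n = A * S * d_omega / np.sum(S * d_omega)  # amplitudes sum to A
    return np.sum(a_n * np.cos(np.outer(t - t_focus, omega)), axis=1)
```

At the focus time all components crest together and the surface elevation equals A exactly; away from focus the group disperses.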

  9. The MATH--Open Source Application for Easier Learning of Numerical Mathematics

    ERIC Educational Resources Information Center

    Glaser-Opitz, Henrich; Budajová, Kristina

    2016-01-01

    The article introduces a software application (MATH) supporting the teaching of Applied Mathematics, with a focus on Numerical Mathematics. MATH is an easy-to-use tool supporting various numerical method calculations with a graphical user interface and an integrated plotting tool for graphical representation, written in Qt with extensive use of Qwt…

  10. Mentoring New Faculty at a Christian University in the Northeast: Developing a Framework for Programming

    ERIC Educational Resources Information Center

    Cook, Donna M.

    2011-01-01

    Mentoring has been used in various fields as a professional development and acculturation tool (Kram, 1991) and is used extensively in higher education (Cunningham, 1999). However, despite numerous studies based on faculty mentoring, those conducted at Christian institutions of higher education have been limited. The study was framed by several…

  11. Numerical Simulation of Transient Liquid Phase Bonding under Temperature Gradient

    NASA Astrophysics Data System (ADS)

    Ghobadi Bigvand, Arian

    Transient Liquid Phase bonding under a Temperature Gradient (TG-TLP bonding) is a relatively new process in the TLP diffusion bonding family for joining difficult-to-weld aerospace materials. Earlier studies have suggested that, in contrast to the conventional TLP bonding process, liquid state diffusion drives joint solidification in the TG-TLP bonding process. In the present work, a mass-conservative numerical model that considers asymmetry in joint solidification is developed using the finite element method to properly study the TG-TLP bonding process. The numerical results, which are experimentally verified, show that, unlike what has been previously reported, solid state diffusion plays a major role in controlling the solidification behavior during the TG-TLP bonding process. The newly developed model provides a vital tool for further elucidation of the TG-TLP bonding process.

  12. Numerically stable finite difference simulation for ultrasonic NDE in anisotropic composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Quintanilla, Francisco Hernando; Cole, Christina M.

    2018-04-01

    Simulation tools can enable optimized inspection of advanced materials and complex geometry structures. Recent work at NASA Langley is focused on the development of custom simulation tools for modeling ultrasonic wave behavior in composite materials. Prior work focused on the use of a standard staggered grid finite difference type of mathematical approach, by implementing a three-dimensional (3D) anisotropic Elastodynamic Finite Integration Technique (EFIT) code. However, observations showed that the anisotropic EFIT method displays numerically unstable behavior at the locations of stress-free boundaries for some cases of anisotropic materials. This paper gives examples of the numerical instabilities observed for EFIT and discusses the source of instability. As an alternative to EFIT, the 3D Lebedev Finite Difference (LFD) method has been implemented. The paper briefly describes the LFD approach and shows examples of stable behavior in the presence of stress-free boundaries for a monoclinic anisotropy case. The LFD results are also compared to experimental results and dispersion curves.
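The staggered-grid updates at the heart of EFIT-style codes can be illustrated in one dimension (a generic velocity-stress sketch, not NASA's 3D anisotropic implementation): velocity and stress live on interleaved grids and advance in leapfrog fashion.

```python
import numpy as np

def elastic_1d_leapfrog(v, s, rho, C, dt, dx, steps):
    """1D velocity-stress leapfrog on a staggered grid.

    v: particle velocity at N nodes; s: stress at the N-1 midpoints.
    rho: density; C: stiffness modulus. Stable for dt < dx / sqrt(C / rho).
    End velocities are held fixed (rigid boundaries) in this sketch;
    it is the free, stress-free boundaries that trigger the anisotropic
    instabilities discussed above.
    """
    v, s = v.copy(), s.copy()
    for _ in range(steps):
        v[1:-1] += dt / (rho * dx) * (s[1:] - s[:-1])  # momentum update
        s += dt * C / dx * (v[1:] - v[:-1])            # constitutive update
    return v, s
```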

  13. GIS application on modern Mexico

    NASA Astrophysics Data System (ADS)

    Prakash, Bharath

    This is a GIS-based tool for showcasing the history of modern Mexico from the post-colonial era to the elections of 2012. The tool is written in simple language and is flexible so as to allow for future enhancements. The application consists of numerous images, textual information, and links that can be used by primary and high school students to understand the history of modern Mexico, and by tourists to locate international airports and United States consulates. The software depicts the aftermath of the Colonial Era, the period of Spanish rule in Mexico, and covers topics such as wars, politics, important personalities, drug cartels, and violence. All these events are shown on GIS (Geographic Information Systems) maps. The software can be customized to user requirements and is developed using Java and GIS technology. The user interface is created using Java and MOJO, which contributes to effective learning and understanding of the concepts with ease. User interface features include zoom-in, zoom-out, legend editing, a location identifier, a print command, layer addition, and numerous menu items.

  14. Nondestructive surface analysis for material research using fiber optic vibrational spectroscopy

    NASA Astrophysics Data System (ADS)

    Afanasyeva, Natalia I.

    2001-11-01

    Advanced methods of fiber optic vibrational spectroscopy (FOVS) have been developed in conjunction with an interferometer and low-loss, flexible, nontoxic optical fibers, sensors, and probes. The combination of optical fibers and sensors with a Fourier transform (FT) spectrometer has been used in the range from 2.5 to 12 μm. This technique serves as an ideal diagnostic tool for surface analysis of diverse materials such as complex structured materials, fluids, coatings, implants, living cells, plants, and tissue. Such surfaces, as well as living tissue or plants, are very difficult to investigate in vivo by traditional FT infrared or Raman spectroscopy methods. The FOVS technique is nondestructive, noninvasive, fast (15 s), and capable of operating in a remote sampling regime (up to a fiber length of 3 m). Fourier transform infrared (FTIR) and Raman fiber optic spectroscopy operating with optical fibers have been suggested as powerful new tools. These highly sensitive techniques serve structural studies in materials research and various applications in process analysis to determine molecular composition, chemical bonds, and molecular conformations. They could also be developed into a new tool for quality control of numerous materials as well as noninvasive biopsy.

  15. Numerical modeling of friction welding of bi-metal joints for electrical applications

    NASA Astrophysics Data System (ADS)

    Velu, P. Shenbaga; Hynes, N. Rajesh Jesudoss

    2018-05-01

    In the manufacturing industries, and especially in electrical engineering applications, non-ferrous materials play a vital role. Today's engineering applications rely upon significant properties such as good corrosion resistance, good mechanical properties, good heat conductivity, and high electrical conductivity. The copper-aluminum bi-metal joint is one such combination that meets the demanding requirements of electrical applications. In this work, numerical simulation of an AA 6061-T6 alloy/copper joint was carried out under joining conditions. A Finite Element Model was developed using the numerical simulation tool ABAQUS. Using this model, the temperature distribution along the length of the dissimilar joint is predicted and the time-temperature profile has been generated. The developed FEM is helpful in predicting various output parameters during friction welding of this dissimilar joint combination.

  16. ESNIB (European Science Notes Information Bulletin): Reports on Current European/Middle Eastern Science

    DTIC Science & Technology

    1989-11-01

    The TERMOS digital terrain modeling system, a tool for planning, programming, simulating, initiating, and surveying small-scale ..., was developed ... A workshop (FRG) featuring the European Strategic Program for Research and Development in Information Technologies (conference language: English) ... Research and Development in the Numerical Aerodynamic Systems Program, R. Bailey, NASA

  17. Developing Tools for Assessing and Using Commercially Available Reading Software Programs to Promote the Development of Early Reading Skills in Children

    ERIC Educational Resources Information Center

    Wood, Eileen; Gottardo, Alexandra; Grant, Amy; Evans, Mary Ann; Phillips, Linda; Savage, Robert

    2012-01-01

    As computers become an increasingly ubiquitous part of young children's lives there is a need to examine how best to harness digital technologies to promote learning in early childhood education contexts. The development of emergent literacy skills is 1 domain for which numerous software programs are available for young learners. In this study, we…

  18. Numerical simulation of deformation and failure processes of a complex technical object under impact loading

    NASA Astrophysics Data System (ADS)

    Kraus, E. I.; Shabalin, I. I.; Shabalin, T. I.

    2018-04-01

    The main points in the development of numerical tools for simulating deformation and failure of complex technical objects under nonstationary conditions of extreme loading are presented. The possibility of extending the dynamic method for constructing difference grids to the 3D case is shown. A 3D realization of the discrete-continuum approach to the deformation and failure of complex technical objects is carried out. The efficiency of the existing software package for 3D modelling is shown.

  19. MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*

    PubMed Central

    CHAHINE, Georges L.; HSIAO, Chao-Tsung

    2012-01-01

    Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, while avoiding deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication describes a multi-physics approach to study the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696

  20. Interferometric correction system for a numerically controlled machine

    DOEpatents

    Burleson, Robert R.

    1978-01-01

    An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
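    The lag/lead correction logic described in this record can be sketched in a few lines. The pulse size, error threshold, and function name below are illustrative assumptions, not values taken from the patent:

    ```python
    # Hypothetical sketch of the interferometric pulse-correction logic:
    # compare the commanded position (pulse count) with the laser-measured
    # position and add or delete pulses when the error exceeds a threshold.

    PULSE_SIZE = 1.0e-6   # metres of travel per command pulse (assumed)
    ERROR_LIMIT = 3       # maximum tolerated error, in pulses (assumed)

    def correct(commanded_pulses: int, measured_position: float) -> int:
        """Return the number of pulses to add (+) or delete (-) from the train."""
        actual_pulses = round(measured_position / PULSE_SIZE)
        error = commanded_pulses - actual_pulses   # > 0: tool lags the command
        if error > ERROR_LIMIT:
            return error - ERROR_LIMIT     # add pulses to advance the tool
        if error < -ERROR_LIMIT:
            return error + ERROR_LIMIT     # delete pulses so the tool catches down
        return 0
    ```

    In the patent's scheme this comparison runs continuously against the command pulse train; here it is a one-shot calculation for clarity.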

  1. solveME: fast and reliable solution of nonlinear ME models.

    PubMed

    Yang, Laurence; Ma, Ding; Ebrahim, Ali; Lloyd, Colton J; Saunders, Michael A; Palsson, Bernhard O

    2016-09-22

    Genome-scale models of metabolism and macromolecular expression (ME) significantly expand the scope and predictive capabilities of constraint-based modeling. ME models present considerable computational challenges: they are much (>30 times) larger than the corresponding metabolic reconstructions (M models), are multiscale, and growth maximization is a nonlinear programming (NLP) problem, mainly due to macromolecule dilution constraints. Here, we address these computational challenges. We develop a fast and numerically reliable solution method for growth maximization in ME models using a quad-precision NLP solver (Quad MINOS). Our method was up to 45% faster than binary search for six significant digits in growth rate. We also develop a fast, quad-precision flux variability analysis that is accelerated (up to 60× speedup) via solver warm-starts. Finally, we employ the tools developed to investigate growth-coupled succinate overproduction, accounting for proteome constraints. Just as genome-scale metabolic reconstructions have become an invaluable tool for computational and systems biologists, we anticipate that these fast and numerically reliable ME solution methods will accelerate the widespread adoption of ME models by researchers in these fields.
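    For context, the binary-search baseline that the quad-precision method is benchmarked against can be sketched as a bisection on growth rate. Here `feasible` is a hypothetical stand-in for an LP/NLP feasibility solve of the ME model at a fixed growth rate, not the actual solveME API, and the bracket and boundary value are invented:

    ```python
    # Bisection on growth rate mu: keep the largest mu at which the ME model
    # remains feasible. feasible(mu) stands in for a full constraint-based
    # feasibility check; the toy rule below replaces it for illustration.

    def max_growth(feasible, mu_lo=0.0, mu_hi=2.0, digits=6):
        # Halve the bracket until mu is resolved to the requested precision,
        # maintaining the invariant: mu_lo feasible, mu_hi infeasible.
        while mu_hi - mu_lo > 10 ** (-digits):
            mu = 0.5 * (mu_lo + mu_hi)
            if feasible(mu):
                mu_lo = mu
            else:
                mu_hi = mu
        return mu_lo

    # Toy feasibility rule: the model supports growth up to 0.873 1/h.
    mu_star = max_growth(lambda mu: mu <= 0.873)
    ```

    Each `feasible` call hides a full model solve, which is why the paper's direct NLP formulation, needing no outer search loop, is faster.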

  2. The COPERNIC3 project: how AREVA is successfully developing an advanced global fuel rod performance code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garnier, Ch.; Mailhe, P.; Sontheimer, F.

    2007-07-01

    Fuel performance is a key factor for minimizing operating costs in nuclear plants. One of the important aspects of fuel performance is fuel rod design, based upon reliable tools able to verify the safety of current fuel solutions, prevent potential issues in new core managements, and guide the invention of tomorrow's fuels. AREVA is developing its future global fuel rod code COPERNIC3, which is able to calculate the thermal-mechanical behavior of advanced fuel rods in nuclear plants. Some of the best practices to achieve this goal are described by reviewing the three pillars of a fuel rod code: the database, the modelling, and the computer and numerical aspects. First, the COPERNIC3 database content is described, accompanied by the tools developed to effectively exploit the data. An overview of the main modelling aspects is then given, emphasizing the thermal, fission gas release, and mechanical sub-models. In the last part, numerical solutions are detailed that increase the computational performance of the code, with a presentation of software configuration management solutions. (authors)

  3. Actualities and Development of Heavy-Duty CNC Machine Tool Thermal Error Monitoring Technology

    NASA Astrophysics Data System (ADS)

    Zhou, Zu-De; Gui, Lin; Tan, Yue-Gang; Liu, Ming-Yao; Liu, Yi; Li, Rui-Ya

    2017-09-01

    Thermal error monitoring technology is the key technological support for solving the thermal error problem of heavy-duty CNC (computer numerical control) machine tools. There are many review articles introducing thermal error research on CNC machine tools, but they mainly focus on the thermal issues of small and medium-sized CNC machine tools and seldom cover thermal error monitoring technologies. This paper gives an overview of research on the thermal error of CNC machine tools and emphasizes the study of thermal error in heavy-duty CNC machine tools in three areas: the causes of thermal error in heavy-duty CNC machine tools, temperature monitoring technology, and thermal deformation monitoring technology. A new optical measurement technology for heavy-duty CNC machine tools, fiber Bragg grating (FBG) distributed sensing, is introduced in detail; it forms an intelligent sensing and monitoring system for heavy-duty CNC machine tools. This paper fills a gap in the review literature, guiding development in this industrial field and opening up new areas of research on heavy-duty CNC machine tool thermal error.

  4. An Introduction to Intelligent Processing Programs Developed by the Air Force Manufacturing Technology Directorate

    NASA Technical Reports Server (NTRS)

    Sampson, Paul G.; Sny, Linda C.

    1992-01-01

    The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).

  5. A Density Perturbation Method to Study the Eigenstructure of Two-Phase Flow Equation Systems

    NASA Astrophysics Data System (ADS)

    Cortes, J.; Debussche, A.; Toumi, I.

    1998-12-01

    Many interesting and challenging physical mechanisms are connected with the mathematical notion of eigenstructure. In two-fluid models, complex phasic interactions yield a complex eigenstructure, which may raise numerous problems in numerical simulations. In this paper, we develop a perturbation method to examine the eigenvalues and eigenvectors of two-fluid models. This original method, based on the stiffness of the density ratio, provides a convenient tool to study the relevance of pressure momentum interactions and allows us to obtain precise approximations of the whole flow eigendecomposition at minor computational cost. The Roe scheme is successfully implemented and some numerical tests are presented.

  6. Looking forward to genetically edited fruit crops.

    PubMed

    Nagamangala Kanchiswamy, Chidananda; Sargent, Daniel James; Velasco, Riccardo; Maffei, Massimo E; Malnoy, Mickael

    2015-02-01

    The availability of genome sequences for many fruit crops has redefined the boundaries of genetic engineering and genetically modified (GM) crop plants. However commercialization of GM crops is hindered by numerous regulatory and social hurdles. Here, we focus on recently developed genome-editing tools for fruit crop improvement and their importance from the consumer perspective. Challenges and opportunities for the deployment of new genome-editing tools for fruit plants are also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. A finite element analysis modeling tool for solid oxide fuel cell development: coupled electrochemistry, thermal and flow analysis in MARC®

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Lin, Zijing; Singh, Prabhakar

    2004-05-03

    A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed, robust, and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking as input the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rates and the total stack output voltage or current. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and EC is for MARC® to supply the temperature field to EC and for EC to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability, and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristics of the simulation tool.
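    The loosely coupled iteration described above can be illustrated with scalar stand-ins for the two solvers: one map from heat generation to temperature (the thermal analysis) and one from temperature back to heat generation (the EC module), iterated to self-consistency. The linear toy relations, coefficients, and tolerance below are assumptions for the sketch, not the actual MARC/EC models:

    ```python
    # Fixed-point iteration between two single-variable "physics modules",
    # mimicking the MARC <-> EC coupling loop at a vastly simplified level.

    def thermal_solve(q):          # stand-in for the MARC thermal/flow analysis
        return 700.0 + 0.05 * q    # temperature (K) rises with heat generation

    def ec_module(T):              # stand-in for the electrochemistry module
        return 40.0 + 0.2 * T      # heat generation (W) grows with temperature

    def couple(tol=1e-8, max_iter=100):
        q = 0.0
        for _ in range(max_iter):
            T = thermal_solve(q)           # MARC step: heat -> temperature
            q_new = ec_module(T)           # EC step: temperature -> heat
            if abs(q_new - q) < tol:       # self-consistency reached
                return T, q_new
            q = q_new
        raise RuntimeError("coupling did not converge")
    ```

    The loop converges here because the composed map is strongly contracting; in the real tool each "step" is a full 3D solve, which is why the loose coupling saves memory relative to a monolithic system.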

  8. FIELD APPLICATIONS OF ROBOTIC SYSTEMS IN HAZARDOUS WASTE SITE OPERATIONS

    EPA Science Inventory

    The cleanup of hazardous waste sites is a challenging and complex field that offers numerous opportunities for the application of robotic technology. The contamination problem, long in the making, will take decades to resolve. Our ingenuity in developing robotic tools to assist in ...

  9. COMMUNITY-SCALE MODELING FOR AIR TOXICS AND HOMELAND SECURITY

    EPA Science Inventory

    The purpose of this task is to develop and evaluate numerical and physical modeling tools for simulating ambient concentrations of airborne substances in urban settings at spatial scales ranging from <1-10 km. Research under this task will support client needs in human exposure ...

  10. A new statistical model for subgrid dispersion in large eddy simulations of particle-laden flows

    NASA Astrophysics Data System (ADS)

    Muela, Jordi; Lehmkuhl, Oriol; Pérez-Segarra, Carles David; Oliva, Asensi

    2016-09-01

    Dispersed multiphase turbulent flows are present in many industrial and commercial applications such as internal combustion engines, turbofans, dispersion of contaminants, and steam turbines. Therefore, there is a clear interest in the development of models and numerical tools capable of performing detailed and reliable simulations of these kinds of flows. Large Eddy Simulation (LES) offers good accuracy and reliable results together with reasonable computational requirements, making it an attractive basis for numerical tools for particle-laden turbulent flows. Nonetheless, in dispersed multiphase flows additional difficulties arise in LES, since the effect of the unresolved scales of the continuous phase on the dispersed phase is lost due to the filtering procedure. To solve this issue, a model able to reconstruct the subgrid velocity seen by the particles is required. In this work a new model for the reconstruction of subgrid-scale effects on the dispersed phase is presented and assessed. The methodology is based on the reconstruction of statistics via probability density functions (PDFs).
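    The idea of reconstructing the subgrid velocity seen by a particle from a PDF can be illustrated with the simplest common choice, a Gaussian whose variance comes from the subgrid kinetic energy. The Gaussian form, the 2k/3 variance split, and the values used are generic textbook assumptions, not the specific model of this paper:

    ```python
    # Sample a stochastic subgrid velocity contribution for a particle:
    # resolved (filtered) velocity plus a Gaussian fluctuation whose rms
    # per component is sqrt(2*k_sgs/3), k_sgs being subgrid kinetic energy.

    import random
    random.seed(0)  # deterministic for the example

    def subgrid_velocity(u_resolved, k_sgs):
        sigma = (2.0 * k_sgs / 3.0) ** 0.5          # rms of one component
        return [u + random.gauss(0.0, sigma) for u in u_resolved]

    # Velocity "seen" by the particle = filtered field + subgrid sample.
    u_seen = subgrid_velocity([1.0, 0.2, 0.0], k_sgs=0.06)
    ```

    A more sophisticated PDF reconstruction, as in the paper, would shape the distribution with local flow statistics rather than assuming isotropy.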

  11. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  12. Developing a Computational Environment for Coupling MOR Data, Maps, and Models: The Virtual Research Vessel (VRV) Prototype

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; O'Dea, E.; Cushing, J. B.; Cuny, J. E.; Toomey, D. R.; Hackett, K.; Tikekar, R.

    2001-12-01

    The East Pacific Rise (EPR) from 9-10° N is currently our best-studied section of fast-spreading mid-ocean ridge. During several decades of investigation it has been explored by the full spectrum of ridge investigators, including chemists, biologists, geologists, and geophysicists. These studies, and those that are ongoing, provide a wealth of observational data, results, and data-driven theoretical (often numerical) studies that have not yet been fully utilized either by research scientists or by professional educators. While the situation is improving, a large amount of data, results, and related theoretical models still exist either in an inert, non-interactive form (e.g., journal publications) or as unlinked and currently incompatible computer data or algorithms. Infrastructure is needed not just for ready access to data, but for linkage of disparate data sets (data to data) as well as data to models, in order to quantitatively evaluate hypotheses, refine numerical simulations, and explore new relations between observables. The prototype of a computational environment and toolset, called the Virtual Research Vessel (VRV), is being developed to provide scientists and educators with ready access to data, results, and numerical models. While this effort is focused on the EPR 9N region, the resulting software tools and infrastructure should be helpful in establishing similar systems for other sections of the global mid-ocean ridge.
Work in progress includes efforts to develop: (1) a virtual database to incorporate diverse data types with domain-specific metadata into a global schema that allows web queries across different marine geology data sets, and an analogous declarative (database-available) description of tools and models; (2) the ability to move data between GIS and the above DBMS, and tools to encourage data submission to archives; (3) tools for finding and viewing archives, and translating between formats; (4) support for "computational steering" (tool composition) and model coupling (e.g., the ability to run a tool composition locally but access input data from the web, APIs to support coupling such as invoking programs that are running remotely, and help in writing data wrappers to publish programs); (5) support of migration paths for prototyped model coupling; and (6) export of marine geological data and data analysis to the undergraduate classroom (VRV-ET, "Educational Tool"). See the main VRV web site at http://oregonstate.edu/dept/vrv and the VRV-ET web site at: http://www.cs.uoregon.edu/research/vrv-et.

  13. Development and Validation of a Standardized Tool for Prioritization of Information Sources.

    PubMed

    Akwar, Holy; Kloeze, Harold; Mukhi, Shamir

    2016-01-01

    To validate the utility and effectiveness of a standardized tool for prioritization of information sources for early detection of diseases, the tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The Proc Mixed procedure with a random effect statement and SAS macros were used to compute multiple raters' Fleiss kappa agreement and Kendall's coefficient of concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss kappa agreement of 50.7% was obtained for the ten information sources and 72.5% for a sub-set of five sources rated, which is substantial agreement, validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations in which prioritization of numerous information sources is necessary.
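    The agreement statistic used above, Fleiss' kappa, is a standard multi-rater measure and can be computed directly from a subjects-by-categories count matrix (rows: information sources; columns: rating categories; entries: number of raters choosing each category). The example matrices in the checks are invented, not the study's data:

    ```python
    # Fleiss' kappa: chance-corrected agreement for a fixed number of raters
    # assigning categorical ratings to each subject.

    def fleiss_kappa(counts):
        N = len(counts)                     # number of subjects
        n = sum(counts[0])                  # raters per subject (constant)
        k = len(counts[0])                  # number of categories
        # Marginal proportion of each category over all ratings.
        p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
        # Per-subject observed agreement.
        P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
        P_bar = sum(P_i) / N                # mean observed agreement
        P_e = sum(p * p for p in p_j)       # expected agreement by chance
        return (P_bar - P_e) / (1 - P_e)
    ```

    With five raters splitting perfectly across two sources the statistic returns 1.0; random-looking splits drive it toward or below zero.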

  14. Practical Tips and Tools--Using Stories from the Field for Professional Development: "How-To" Guidelines from Reading to Reflection and Practice Integration

    ERIC Educational Resources Information Center

    Talmi, Ayelet

    2013-01-01

    Case studies provide numerous opportunities for professional development and can be particularly helpful in transdiciplinary training. This article offers suggestions for how to use the "Zero to Three" Journal's "Stories From the Field" series of articles across a variety of settings and roles such as clinical practice, program…

  15. Nondestructive testing for assessing wood members in structures : a review

    Treesearch

    R. J. Ross; R. F. Pellerin

    1994-01-01

    Numerous organizations have conducted research to develop nondestructive testing (NDT) techniques for assessing the condition of wood members in structures. A review of this research was published in 1991. This is an update of the 1991 report. It presents a comprehensive review of published research on the development and use of NDT tools for in-place assessment of...

  16. Integrated flexible manufacturing program for manufacturing automation and rapid prototyping

    NASA Technical Reports Server (NTRS)

    Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.

    1993-01-01

    The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.

  17. Using Virtualization to Integrate Weather, Climate, and Coastal Science Education

    NASA Astrophysics Data System (ADS)

    Davis, J. R.; Paramygin, V. A.; Figueiredo, R.; Sheng, Y.

    2012-12-01

    To better understand and communicate the important roles of weather and climate in the coastal environment, a unique publicly available tool is being developed to support research, education, and outreach activities. This tool uses virtualization technologies to facilitate an interactive, hands-on environment in which students, researchers, and the general public can perform their own numerical modeling experiments. While prior efforts focused solely on the study of coastal and estuary environments, this effort incorporates the community-supported weather and climate model (WRF-ARW) into the Coastal Science Educational Virtual Appliance (CSEVA), an education tool used to assist in the learning of coastal transport processes; storm surge and inundation; and evacuation modeling. The Weather Research and Forecasting (WRF) Model is a next-generation, community-developed and -supported mesoscale numerical weather prediction system designed to be used internationally for research, operations, and teaching. It includes two dynamical solvers (ARW, the Advanced Research WRF, and NMM, the Nonhydrostatic Mesoscale Model) as well as a data assimilation system. WRF-ARW is the ARW dynamics solver combined with other components of the WRF system, developed primarily at NCAR, with community support provided by the Mesoscale and Microscale Meteorology (MMM) division of the National Center for Atmospheric Research (NCAR). Included with WRF is the WRF Pre-processing System (WPS), a set of programs that prepares input for real-data simulations. The CSEVA is based on the Grid Appliance (GA) framework and is built using virtual machine (VM) and virtual networking technologies. Virtualization supports integration of an operating system, libraries (e.g. the Fortran, C, Perl, and NetCDF libraries necessary to build WRF), a web server, numerical models/grids/inputs, pre-/post-processing tools (e.g.
WPS / RIP4 or UPS), graphical user interfaces, "Cloud"-computing infrastructure, and other tools into a single ready-to-use package. Thus, the previously onerous task of setting up and compiling these tools becomes obsolete, and the researcher, educator, or student can focus on using the tools to study the interactions between weather, climate, and the coastal environment. The incorporation of WRF into the CSEVA has been designed to be synergistic with the extensive online tutorials and biannual tutorials hosted by NCAR. Included are working examples of the idealized test simulations provided with WRF (2D sea breeze and squalls, a large eddy simulation, a Held and Suarez simulation, etc.). To demonstrate the integration of weather, climate, and coastal science education, example applications are being developed to show how the system can be used to couple a coastal and estuarine circulation, transport, and storm surge model with downscaled reanalysis weather and future climate predictions. Documentation, tutorials, and the enhanced CSEVA itself can be found on the web at: http://cseva.coastal.ufl.edu.

  18. Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couch, R; Becker, R; Rhee, M

    2004-09-24

    Lawrence Livermore National Laboratory participated in a U.S. Department of Energy/Office of Industrial Technology sponsored research project, 'Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery', as Cooperative Agreement TC-02028 with the Alcoa Technical Center (ATC). The objective of the joint project with Alcoa is to develop a numerical modeling capability to optimize the hot rolling process used to produce aluminum plate. Product lost in the rolling process and subsequent recycling wastes resources consumed in the energy-intensive steps of remelting and reprocessing the ingot. The modeling capability developed by the project partners will be used to produce plate more efficiently and with reduced product loss.

  19. Numerical study of read scheme in one-selector one-resistor crossbar array

    NASA Astrophysics Data System (ADS)

    Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin

    2015-12-01

    A comprehensive numerical circuit analysis of read schemes for a one-selector one-resistor (1S1R) resistance change memory crossbar array is carried out. Three schemes, the ground, V/2, and V/3 schemes, are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate all current flows and node voltages within a crossbar array. Understanding such phenomena is essential for successfully evaluating the electrical specifications of selectors that suppress the intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance. This method provides a quantitative tool for the accurate analysis of crossbar arrays and yields guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
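    The sneak-path problem this record analyzes can be conveyed with a deliberately crude estimate: a selected cell read at full bias competes against the leakage of all half-selected cells in a V/2 scheme. Everything below (the exponential selector I-V, the LRS/HRS scaling, the neglect of line resistance) is an invented toy model, far simpler than the paper's full node-voltage iteration:

    ```python
    # Back-of-envelope worst-case read margin for an n x n crossbar read
    # with the V/2 scheme, ignoring line resistance entirely.

    import math

    def cell_current(v, on=True, i0=1e-9, v0=0.15):
        # 1S1R cell folded into one empirical I-V: exponential selector,
        # scaled by r_LRS / r to mimic a 100x LRS/HRS resistance ratio.
        r = 1e4 if on else 1e6
        return math.copysign(i0 * math.expm1(abs(v) / v0), v) * 1e4 / r

    def worst_case_margin(n, v_read=1.0):
        # Selected cell at full bias; 2(n-1) half-selected LRS cells leak at V/2.
        i_on = cell_current(v_read, on=True)
        i_off = cell_current(v_read, on=False)
        i_leak = 2 * (n - 1) * cell_current(v_read / 2, on=True)
        return (i_on - i_off) / i_leak   # > 1: state still distinguishable
    ```

    Even this toy model reproduces the qualitative conclusion: the margin collapses as the array grows, which is why selector nonlinearity and read-scheme choice have to be analyzed together.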

  20. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    USDA-ARS?s Scientific Manuscript database

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  1. Using Genetic Algorithm and MODFLOW to Characterize Aquifer System of Northwest Florida

    EPA Science Inventory

    By integrating a Genetic Algorithm with MODFLOW2005, an optimizing tool is developed to characterize the aquifer system of Region II, Northwest Florida. The history and the newest available observation data of the aquifer system are fitted automatically by using the numerical model c...

  2. Development and characterization of rice mutants for functional genomic studies and breeding

    USDA-ARS?s Scientific Manuscript database

    Mutagenesis is a powerful tool for creating genetic materials for studying functional genomics, breeding, and understanding the molecular basis of disease resistance. Approximately 100,000 putative mutants of rice (Oryza sativa L.) have been generated with mutagens. Numerous mutant genes involved in...

  3. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, a high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, an in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel-based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX through numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  4. The Role of Wakes in Modelling Tidal Current Turbines

    NASA Astrophysics Data System (ADS)

    Conley, Daniel; Roc, Thomas; Greaves, Deborah

    2010-05-01

    The eventual development of arrays of Tidal Current Turbines (TCTs) will require a balance that maximizes power extraction while minimizing environmental impacts. Idealized analytical analogues and simple 2-D models are useful tools for investigating questions of a general nature but are not practical for application to realistic cases. Some form of 3-D numerical simulation will be required for such applications, and the current project is designed to develop a numerical decision-making tool for use in planning large-scale TCT projects. The project is predicated on the use of an existing regional ocean modelling framework (the Regional Ocean Modelling System, ROMS), which is modified to enable the user to account for the effects of TCTs. In such a framework, where mixing processes are highly parametrized, the fidelity of the quantitative results depends critically on the parameter values used. In light of the early stage of TCT development and the lack of field-scale measurements, the calibration of such a model is problematic. In the absence of explicit calibration data sets, the device wake structure has been identified as an efficient feature for model calibration. This presentation will discuss efforts to design an appropriate calibration scheme focused on wake decay; the motivation for this approach, the techniques applied, validation results from simple test cases, and the scheme's limitations will be presented.

  5. Evaluation of Proteus as a Tool for the Rapid Development of Models of Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Farthing, M. W.; Kees, C. E.; Miller, C. T.

    2013-12-01

    Models of modern hydrologic systems can be complex and involve a variety of operators with varying character. The goal is to implement approximations of such models that are both efficient for the developer and computationally efficient, which is a set of naturally competing objectives. Proteus is a Python-based toolbox that supports prototyping of model formulations as well as a wide variety of modern numerical methods and parallel computing. We used Proteus to develop numerical approximations for three models: Richards' equation, a brine flow model derived using the Thermodynamically Constrained Averaging Theory (TCAT), and a multiphase TCAT-based tumor growth model. For Richards' equation, we investigated discontinuous Galerkin solutions with higher order time integration based on the backward difference formulas. The TCAT brine flow model was implemented using Proteus and a variety of numerical methods were compared to hand coded solutions. Finally, an existing tumor growth model was implemented in Proteus to introduce more advanced numerics and allow the code to be run in parallel. From these three example models, Proteus was found to be an attractive open-source option for rapidly developing high quality code for solving existing and evolving computational science models.

  6. Development and experimental assessment of a numerical modelling code to aid the design of profile extrusion cooling tools

    NASA Astrophysics Data System (ADS)

    Carneiro, O. S.; Rajkumar, A.; Fernandes, C.; Ferrás, L. L.; Habla, F.; Nóbrega, J. M.

    2017-10-01

    In the extrusion of thermoplastic profiles, after the forming stage that takes place in the extrusion die, the profile must be cooled in a metallic calibrator. This stage must proceed at a high rate, to assure increased productivity, while avoiding the development of high temperature gradients, in order to minimize the level of induced thermal residual stresses. In this work, we present a new coupled numerical solver, developed in the framework of the OpenFOAM® computational library, that computes the temperature distribution in both domains (metallic calibrator and plastic profile) simultaneously, and whose implementation aimed at minimizing the computational time. The new solver was experimentally assessed with an industrial case study.

  7. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated into the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.

  8. Modelling of peak temperature during friction stir processing of magnesium alloy AZ91

    NASA Astrophysics Data System (ADS)

    Vaira Vignesh, R.; Padmanaban, R.

    2018-02-01

    Friction stir processing (FSP) is a solid-state processing technique with the potential to modify the properties of a material through microstructural modification. The study of heat transfer in FSP aids in the identification of defects such as flash, inadequate heat input, and poor material flow and mixing. In this paper, the transient temperature distribution during FSP of magnesium alloy AZ91 was simulated using finite element modelling. The numerical model results were validated using experimental results from the published literature. The model was used to predict the peak temperature obtained during FSP for various process parameter combinations. The simulated peak temperature results were used to develop a statistical model. The effect of the process parameters, namely tool rotation speed, tool traverse speed and shoulder diameter of the tool, on the peak temperature was investigated using the developed statistical model. It was found that peak temperature was directly proportional to tool rotation speed and shoulder diameter, and inversely proportional to tool traverse speed.

  9. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real-time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industries.

  10. The Generic Spacecraft Analyst Assistant (gensaa): a Tool for Developing Graphical Expert Systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real-time data. The analysts must watch for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As the satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At NASA GSFC, fault-isolation expert systems are in operation supporting this data monitoring task. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will readily support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.

  11. Fast analysis of radionuclide decay chain migration

    NASA Astrophysics Data System (ADS)

    Chen, J. S.; Liang, C. P.; Liu, C. W.; Li, L.

    2014-12-01

    A novel tool for rapidly predicting the long-term plume behavior of a radionuclide decay chain of arbitrary length is presented in this study. The tool is based on compact generalized analytical solutions derived for a set of two-dimensional advection-dispersion equations coupled by sequential first-order decay reactions in a groundwater system. The performance of the developed tool is evaluated against a numerical model using a Laplace transform finite difference scheme. The results of the performance evaluation indicate that the developed model is robust and accurate. The developed model is then used to rapidly examine the transport behavior of a four-member radionuclide decay chain. Results show that the plume extent and concentration level of any target radionuclide are very sensitive to the longitudinal and transverse dispersion, the decay rate constant, and the retardation factor. The developed model is a useful tool for rapidly assessing the ecological and environmental impact of accidental radionuclide releases such as the Fukushima nuclear disaster, where multiple radionuclides leaked from the reactor, subsequently contaminating the local groundwater and ocean seawater in the vicinity of the nuclear plant.
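
    The sequential first-order decay coupling underlying such solutions is, in the well-mixed limit without transport, the classical Bateman solution. A short sketch (assuming distinct decay constants and only the first chain member present initially; this is standard decay-chain theory, not the authors' 2-D transport solution):

```python
import math

def bateman(lams, n0, t):
    """Amounts of each member of a sequential first-order decay chain at
    time t (classical Bateman solution).  Assumes all decay constants in
    `lams` are distinct and only the first member is present at t = 0."""
    out = []
    for n in range(1, len(lams) + 1):
        prod = 1.0
        for lam in lams[: n - 1]:        # production path lambda_1 .. lambda_{n-1}
            prod *= lam
        s = 0.0
        for j in range(n):
            denom = 1.0
            for k in range(n):
                if k != j:
                    denom *= lams[k] - lams[j]
            s += math.exp(-lams[j] * t) / denom
        out.append(n0 * prod * s)
    return out

# three-member chain with hypothetical decay constants (per unit time)
chain = bateman([1.0, 2.0, 0.5], 1.0, 0.5)
```

    For a two-member chain this reduces to the familiar parent-daughter result N2(t) = N0 λ1 (e^(-λ1 t) - e^(-λ2 t)) / (λ2 - λ1).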

  12. Cell chips as new tools for cell biology--results, perspectives and opportunities.

    PubMed

    Primiceri, Elisabetta; Chiriacò, Maria Serena; Rinaldi, Ross; Maruccio, Giuseppe

    2013-10-07

    Cell culture technologies were initially developed as research tools for studying cell functions, but nowadays they are essential for the biotechnology industry, with rapidly expanding applications requiring more and more advancement over traditional tools. Miniaturization and integration of sensors and microfluidic components with cell culture techniques open the way to the development of cellomics as a new field of research targeting innovative analytic platforms for high-throughput studies. This approach enables advanced cell studies under controllable conditions by providing inexpensive, easy-to-operate devices. Thanks to their numerous advantages, cell chips have become a hotspot in the biosensor and bioelectronics fields and have found widely varied applications. In this review, exemplary applications are discussed: cell counting and detection, cytotoxicity assays, migration assays and stem cell studies.

  13. Development and Demonstration of a Computational Tool for the Analysis of Particle Vitiation Effects in Hypersonic Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Perkins, Hugh Douglas

    2010-01-01

    In order to improve the understanding of particle vitiation effects in hypersonic propulsion test facilities, a quasi-one-dimensional numerical tool was developed to efficiently model reacting particle-gas flows over a wide range of conditions. Features of this code include gas-phase finite-rate kinetics, a global porous-particle combustion model, mass, momentum and energy interactions between phases, and subsonic and supersonic particle drag and heat transfer models. The basic capabilities of this tool were validated against available data or other validated codes. To demonstrate the capabilities of the code, a series of computations were performed for a model hypersonic propulsion test facility and scramjet. The parameters studied were simulated flight Mach number, particle size, particle mass fraction and particle material.

  14. Hard Sphere Simulation by Event-Driven Molecular Dynamics: Breakthrough, Numerical Difficulty, and Overcoming the issues

    NASA Astrophysics Data System (ADS)

    Isobe, Masaharu

    Hard sphere/disk systems are among the simplest models and have been used to address numerous fundamental problems in the field of statistical physics. The pioneering numerical works on the solid-fluid phase transition based on Monte Carlo (MC) and molecular dynamics (MD) methods published in 1957 represent historical milestones, which have had a significant influence on the development of computer algorithms and novel tools to obtain physical insights. This chapter addresses Alder's breakthrough works on hard sphere/disk simulation: (i) event-driven molecular dynamics, (ii) the long-time tail, (iii) the molasses tail, and (iv) two-dimensional melting/crystallization. From a numerical viewpoint, there are serious issues that must be overcome for further breakthroughs. Here, we present a brief review of recent progress in this area.

  15. Calculation to experiment comparison of SPND signals in various nuclear reactor environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbot, Loic; Radulovic, Vladimir; Fourmentel, Damien

    2015-07-01

    In the perspective of irradiation experiments in the future Jules Horowitz Reactor (JHR), the Instrumentation Sensors and Dosimetry Laboratory of CEA Cadarache (France) is developing a numerical tool for SPND design, simulation and operation. As part of the qualification of the SPND numerical tool, dedicated experiments have been performed both in the Slovenian TRIGA Mark II reactor (JSI) and, very recently, in the French CEA Saclay OSIRIS reactor, as well as a test of two detectors in the core of the Polish MARIA reactor (NCBJ). A full description of the experimental set-ups and neutron-gamma calculation schemes is provided in the first part of the paper. The calculation-to-experiment comparison of the various SPNDs in the different reactors is thoroughly described and discussed in the second part. The presented comparisons show promising final results. (authors)

  16. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting the aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  17. Layer-oriented simulation tool.

    PubMed

    Arcidiacono, Carmelo; Diolaiti, Emiliano; Tordi, Massimiliano; Ragazzoni, Roberto; Farinato, Jacopo; Vernet, Elise; Marchetti, Enrico

    2004-08-01

    The Layer-Oriented Simulation Tool (LOST) is a numerical simulation code developed for analysis of the performance of multiconjugate adaptive optics modules following a layer-oriented approach. The LOST code computes the atmospheric layers in terms of phase screens and then propagates the phase delays introduced in the natural guide stars' wave fronts by using geometrical optics approximations. These wave fronts are combined in an optical or numerical way, including the effects of wave-front sensors on measurements in terms of phase noise. The LOST code is described, and two applications to layer-oriented modules are briefly presented. We focus on the multiconjugate adaptive optics demonstrator to be mounted on the Very Large Telescope and on the Near-IR-Visible Adaptive Interferometer for Astronomy (NIRVANA) interferometric system to be installed at the combined focus of the Large Binocular Telescope.

  18. Constitutive and numerical modeling of soil and soil-pile interaction for 3D applications and Kealakaha stream bridge case study.

    DOT National Transportation Integrated Search

    2011-12-01

    This study is concerned with developing new modeling tools for predicting the response of the new Kealakaha Stream Bridge to static and dynamic loads, including seismic shaking. The bridge will span 220 meters, with the deck structure being curve...

  19. 3DHYDROGEOCHEM: A 3-DIMENSIONAL MODEL OF DENSITY-DEPENDENT SUBSURFACE FLOW AND THERMAL MULTISPECIES-MULTICOMPONENT HYDROGEOCHEMICAL TRANSPORT

    EPA Science Inventory

    This report presents a three-dimensional finite-element numerical model designed to simulate chemical transport in subsurface systems with temperature effect taken into account. The three-dimensional model is developed to provide (1) a tool of application, with which one is able...

  20. An Interactive Multimedia Dichotomous Key for Teaching Plant Identification

    ERIC Educational Resources Information Center

    Jacquemart, Anne-Laure; Lhoir, Pierre; Binard, Fabian; Descamps, Charlotte

    2016-01-01

    Teaching plant identification includes demonstrating how to use dichotomous keys; this requires knowledge of numerous botanical terms and can be challenging, confusing and frustrating for students. Here, we developed a multimedia tool to help students (1) learn botanical terms, (2) practice, train and test their knowledge of plant identification…

  1. Using Genetic Algorithm and MODFLOW to Characterize Aquifer System of Northwest Florida (Published Proceedings)

    EPA Science Inventory

    By integrating a Genetic Algorithm with MODFLOW2005, an optimizing tool is developed to characterize the aquifer system of Region II, Northwest Florida. The history and the newest available observation data of the aquifer system are fitted automatically by using the numerical model c...

  2. Shoaling develops with age in Zebrafish (Danio rerio)

    PubMed Central

    Buske, Christine; Gerlai, Robert

    2010-01-01

    The biological mechanisms of human social behavior are complex. Animal models may facilitate the understanding of these mechanisms and may help one to develop treatment strategies for abnormal human social behavior, a core symptom in numerous clinical conditions. The zebrafish is perhaps the most social vertebrate among commonly used laboratory species; given its practical features and the numerous genetic tools developed for it, it is a promising model. Zebrafish shoal, i.e., they form tight multi-member groups, but the ontogenesis of this behavior has not been described. Analyzing the development of shoaling is a step towards discovering the mechanisms of this behavior. Here we study age-dependent changes of shoaling in zebrafish from day 7 post-fertilization to over 5 months of age by measuring the distance between all pairs of fish in freely swimming groups of ten subjects. Our longitudinal (repeated-measure, within-subject) and cross-sectional (non-repeated-measure, between-subject) analyses both demonstrated a significant increase of shoaling with age (decreased distance between shoal members). Given the sophisticated genetic and developmental biology methods already available for zebrafish, we argue that our behavioral results open a new avenue towards the understanding of the development of vertebrate social behavior and of its mechanisms and abnormalities. PMID:20837077

  3. Multimodal visualization interface for data management, self-learning and data presentation.

    PubMed

    Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M

    2006-10-01

    Multimodal visualization software, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the topic of visualization and modeling of various aspects of the human anatomy. Numerous tools used in radiology are integrated into the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow analysis of medical imaging data using advanced features outside radiological premises (for example, for patient review, conference presentation or tutorial preparation). The system is free and based on an open-source software development architecture; updates of the system for custom applications are therefore possible.

  4. Numerical Simulations of the Digital Microfluidic Manipulation of Single Microparticles.

    PubMed

    Lan, Chuanjin; Pal, Souvik; Li, Zhen; Ma, Yanbao

    2015-09-08

    Single-cell analysis techniques have been developed as a valuable bioanalytical tool for elucidating cellular heterogeneity at genomic, proteomic, and cellular levels. Cell manipulation is an indispensable process for single-cell analysis. Digital microfluidics (DMF) is an important platform for conducting cell manipulation and single-cell analysis in a high-throughput fashion. However, the manipulation of single cells in DMF has not been quantitatively studied so far. In this article, we investigate the interaction of a single microparticle with a liquid droplet on a flat substrate using numerical simulations. The droplet is driven by capillary force generated from the wettability gradient of the substrate. Considering the Brownian motion of microparticles, we utilize many-body dissipative particle dynamics (MDPD), an off-lattice mesoscopic simulation technique, in this numerical study. The manipulation processes (including pickup, transport, and drop-off) of a single microparticle with a liquid droplet are simulated. Parametric studies are conducted to investigate the effects on the manipulation processes from the droplet size, wettability gradient, wetting properties of the microparticle, and particle-substrate friction coefficients. The numerical results show that the pickup, transport, and drop-off processes can be precisely controlled by these parameters. On the basis of the numerical results, a trap-free delivery of a hydrophobic microparticle to a destination on the substrate is demonstrated in the numerical simulations. The numerical results not only provide a fundamental understanding of interactions among the microparticle, the droplet, and the substrate but also demonstrate a new technique for the trap-free immobilization of single hydrophobic microparticles in the DMF design. Finally, our numerical method also provides a powerful design and optimization tool for the manipulation of microparticles in DMF systems.

  5. Personal computer study of finite-difference methods for the transonic small disturbance equation

    NASA Technical Reports Server (NTRS)

    Bland, Samuel R.

    1989-01-01

    Calculation of unsteady flow phenomena requires careful attention to the numerical treatment of the governing partial differential equations. The personal computer provides a convenient and useful tool for the development of the meshes, algorithms, and boundary conditions needed to provide time-accurate solutions of these equations. The one-dimensional equation considered provides a suitable model for the study of wave propagation in the equations of transonic small disturbance potential flow. Numerical results for the effects of mesh size, extent, and stretching, time step size, and choice of far-field boundary conditions are presented. Analysis of the discretized model problem supports these numerical results. Guidelines for suitable mesh and time step choices are given.
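
    The report's mesh-size and time-step guidelines are not reproduced here, but the basic stability constraint they involve can be illustrated on the simplest model of one-dimensional wave propagation, linear advection with a first-order upwind scheme (an illustrative scheme, not the one in the report):

```python
def upwind_step(u, cfl, inflow=0.0):
    """One explicit first-order upwind update of u_t + c u_x = 0 (c > 0).
    cfl = c * dt / dx must satisfy cfl <= 1 for stability; at cfl = 1 the
    scheme translates the solution exactly one cell per step."""
    return [inflow] + [u[i] - cfl * (u[i] - u[i - 1]) for i in range(1, len(u))]

# a unit pulse advected two steps at the stability limit cfl = 1
u = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
for _ in range(2):
    u = upwind_step(u, 1.0)
```

    Choosing the time step so that the Courant number stays at or below one is the discrete analogue of the mesh/time-step guidance such studies provide; above one, the explicit update amplifies errors and the computation diverges.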

  6. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties, and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically, using the inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
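
    The abstract does not spell out the procedure, but the core Student-T step is the standard small-sample confidence half-width. A sketch with a hypothetical sample of five repeated CFD results (the 95% critical value 2.776 for 4 degrees of freedom is a standard t-table entry):

```python
import math

def t_interval(samples, t_crit):
    """Sample mean and Student-t confidence half-width for a small sample."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)   # unbiased variance
    return mean, t_crit * math.sqrt(var / n)

# five hypothetical repeated CFD estimates of one flow quantity; t = 2.776
# is the two-sided 95% critical value for 4 degrees of freedom
mean, u95 = t_interval([10.1, 10.3, 9.9, 10.2, 10.0], t_crit=2.776)
```

    The Student-t factor widens the interval relative to the normal-distribution value to account for the variance itself being estimated from few samples, which is exactly the small-sample situation repeated CFD runs present.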

  7. Microelectronics: The Nature of Work, Skills and Training. An Analysis of Case Studies from Developed and Developing Countries. Training Discussion Paper No. 51.

    ERIC Educational Resources Information Center

    Acero, Liliana

    Microelectronic technologies have had an impact on the nature of work in industry for both white-collar and blue-collar workers. Evidence from sector- and enterprise-level studies shows changes in skills and job content for blue-collar workers involved with numerically controlled machine tools, robots, and other microelectronics applications.…

  8. Presenting quantitative information about decision outcomes: a risk communication primer for patient decision aid developers.

    PubMed

    Trevena, Lyndal J; Zikmund-Fisher, Brian J; Edwards, Adrian; Gaissmaier, Wolfgang; Galesic, Mirta; Han, Paul K J; King, John; Lawson, Margaret L; Linder, Suzanne K; Lipkus, Isaac; Ozanne, Elissa; Peters, Ellen; Timmermans, Danielle; Woloshin, Steven

    2013-01-01

    Making evidence-based decisions often requires comparison of two or more options. Research-based evidence may exist which quantifies how likely the outcomes are for each option. Understanding these numeric estimates improves patients' risk perception and leads to better informed decision making. This paper summarises current "best practices" in communication of evidence-based numeric outcomes for developers of patient decision aids (PtDAs) and other health communication tools. An expert consensus group of fourteen researchers from North America, Europe, and Australasia identified eleven main issues in risk communication. Two experts for each issue wrote a "state of the art" summary of best evidence, drawing on the PtDA, health, psychological, and broader scientific literature. In addition, commonly used terms were defined and a set of guiding principles and key messages derived from the results. The eleven key components of risk communication were: 1) Presenting the chance an event will occur; 2) Presenting changes in numeric outcomes; 3) Outcome estimates for test and screening decisions; 4) Numeric estimates in context and with evaluative labels; 5) Conveying uncertainty; 6) Visual formats; 7) Tailoring estimates; 8) Formats for understanding outcomes over time; 9) Narrative methods for conveying the chance of an event; 10) Important skills for understanding numerical estimates; and 11) Interactive web-based formats. Guiding principles from the evidence summaries advise that risk communication formats should reflect the task required of the user, should always define a relevant reference class (i.e., denominator) over time, should aim to use a consistent format throughout documents, should avoid "1 in x" formats and variable denominators, should consider the magnitude of numbers used and the possibility of format bias, and should take into account the numeracy and graph literacy of the audience.
A substantial and rapidly expanding evidence base exists for risk communication. Developers of tools to facilitate evidence-based decision making should apply these principles to improve the quality of risk communication in practice.
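
    The denominator principle above (avoid "1 in x" formats; keep a consistent reference class) can be illustrated with a small formatting helper (an illustrative sketch, not a tool from the paper):

```python
def frequency_format(p, denominator=1000):
    """Express a probability as 'N out of <denominator>', keeping the
    denominator fixed rather than using the variable '1 in x' format."""
    return f"{round(p * denominator)} out of {denominator}"

# a 2.3% risk and a 20% risk then share one reference class:
low = frequency_format(0.023)    # "23 out of 1000"
high = frequency_format(0.2)     # "200 out of 1000"
```

    Holding the denominator fixed lets readers compare the numerators directly, whereas "1 in 43" versus "1 in 5" forces a mental inversion that the cited evidence identifies as error-prone.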

  9. Presenting quantitative information about decision outcomes: a risk communication primer for patient decision aid developers

    PubMed Central

    2013-01-01

    Background Making evidence-based decisions often requires comparison of two or more options. Research-based evidence may exist which quantifies how likely the outcomes are for each option. Understanding these numeric estimates improves patients’ risk perception and leads to better informed decision making. This paper summarises current “best practices” in communication of evidence-based numeric outcomes for developers of patient decision aids (PtDAs) and other health communication tools. Method An expert consensus group of fourteen researchers from North America, Europe, and Australasia identified eleven main issues in risk communication. Two experts for each issue wrote a “state of the art” summary of best evidence, drawing on the PtDA, health, psychological, and broader scientific literature. In addition, commonly used terms were defined and a set of guiding principles and key messages derived from the results. Results The eleven key components of risk communication were: 1) Presenting the chance an event will occur; 2) Presenting changes in numeric outcomes; 3) Outcome estimates for test and screening decisions; 4) Numeric estimates in context and with evaluative labels; 5) Conveying uncertainty; 6) Visual formats; 7) Tailoring estimates; 8) Formats for understanding outcomes over time; 9) Narrative methods for conveying the chance of an event; 10) Important skills for understanding numerical estimates; and 11) Interactive web-based formats. Guiding principles from the evidence summaries advise that risk communication formats should reflect the task required of the user, should always define a relevant reference class (i.e., denominator) over time, should aim to use a consistent format throughout documents, should avoid “1 in x” formats and variable denominators, consider the magnitude of numbers used and the possibility of format bias, and should take into account the numeracy and graph literacy of the audience. 
Conclusion A substantial and rapidly expanding evidence base exists for risk communication. Developers of tools to facilitate evidence-based decision making should apply these principles to improve the quality of risk communication in practice. PMID:24625237
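The guidance above on avoiding "1 in x" formats and variable denominators can be illustrated with a small helper that expresses any probability over one fixed reference class. This is only a sketch of the principle; the function name and the choice of denominator are illustrative, not from the paper.

```python
def to_natural_frequency(prob, denominator=1000):
    """Express a probability as 'N out of <denominator>' so that all
    outcomes in a document share one consistent reference class."""
    events = round(prob * denominator)
    return f"{events} out of {denominator}"

# "1 in 8" vs "1 in 12" is hard to compare because the denominators differ;
# over a fixed denominator the comparison is immediate.
risk_a = to_natural_frequency(1 / 8)    # 125 out of 1000
risk_b = to_natural_frequency(1 / 12)   # 83 out of 1000
```

The same fixed denominator should then be reused for every outcome in the document, per the consistency principle summarized above.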

  10. Numerical Activities of Daily Living - Financial (NADL-F): A tool for the assessment of financial capacities.

    PubMed

    Arcara, Giorgio; Burgio, Francesca; Benavides-Varela, Silvia; Toffano, Roberta; Gindri, Patrizia; Tonini, Elisabetta; Meneghello, Francesca; Semenza, Carlo

    2017-09-07

    Financial capacity is the ability to manage one's own finances according to self-interests. Failure in financial decisions and lack of independence when dealing with money can affect people's quality of life and are associated with neuropsychological deficits or clinical conditions such as mild cognitive impairment or Alzheimer's disease. Despite the importance of evaluating financial capacity in the assessment of patients with neuropsychological and psychiatric disorders, only a few tools have been developed. In the present article, the authors introduce the Numerical Activities of Daily Living - Financial (NADL-F) test, a new test to assess financial capacity in clinical populations. The NADL-F is relatively short, yet it encompasses the most common activities involving financial capacities. The NADL-F proved to have satisfactory psychometric properties and overall good validity for measuring financial abilities. Associations with performance on basic neuropsychological tests were investigated, in particular focusing on mathematical abilities as cognitive correlates of financial capacity. Results indicate that the NADL-F could be a useful tool to guide treatments for the enhancement of financial capacities. By sharing all materials and procedures, the authors hope to promote the development of further versions of the NADL-F in different languages, taking into account the necessary adjustments related to different socio-cultural contexts.

  11. Development and Overview of CPAS Sasquatch Airdrop Landing Location Predictor Software

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin J.; Bernatovich, Michael A.

    2015-01-01

    The Capsule Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. CPAS is currently in the Engineering Development Unit (EDU) phase of testing. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish a release point from the aircraft that will ensure that the article and all items released from it during flight will land in a designated safe area. The Sasquatch footprint tool was developed to determine this safe release point and to predict the probable landing locations (footprints) of the payload and all released objects. In 2012, a new version of Sasquatch, called Sasquatch Polygons, was developed that significantly upgraded the capabilities of the footprint tool. Key improvements were an increase in the accuracy of the predictions, and the addition of an interface with the Debris Tool (DT), an in-flight debris avoidance tool for use on the test observation helicopter. Additional enhancements include improved data presentation for communication with test personnel and a streamlined code structure. This paper discusses the development, validation, and performance of Sasquatch Polygons, as well as its differences from the original Sasquatch footprint tool.
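At first order, the footprint problem described above reduces to propagating wind drift during descent. The following is a deliberately simplified sketch (constant descent rate, uniform wind, flat ground); Sasquatch Polygons models far more, and all names and values here are illustrative.

```python
import math

def drift_landing_point(release_xy, altitude_m, descent_rate_ms,
                        wind_speed_ms, wind_dir_rad):
    """First-order landing-point estimate for an object released at altitude:
    time under canopy times a uniform wind gives the horizontal drift."""
    t = altitude_m / descent_rate_ms                    # descent time, s
    dx = wind_speed_ms * t * math.cos(wind_dir_rad)     # downwind drift, m
    dy = wind_speed_ms * t * math.sin(wind_dir_rad)
    return (release_xy[0] + dx, release_xy[1] + dy)

# Release at origin, 7500 m AGL, 7.5 m/s descent, 5 m/s wind along +x:
landing = drift_landing_point((0.0, 0.0), 7500.0, 7.5, 5.0, 0.0)
```

A footprint polygon would then come from repeating this over dispersions in wind, descent rate, and release conditions, which is the Monte Carlo flavor of what the real tool does.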

  12. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.
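Automated sizing of the kind described is often built on the stress-ratio rule of fully stressed design. The sketch below illustrates that general idea only; it is not necessarily the algorithm ASTROS uses, and the numbers are invented.

```python
def stress_ratio_resize(area, axial_force, sigma_allow, min_area=1e-4):
    """Stress-ratio resizing for an axial member: scale the area so the
    member sits exactly at the allowable stress,
    A_new = A_old * (sigma / sigma_allow) = |F| / sigma_allow."""
    sigma = abs(axial_force) / area
    return max(area * sigma / sigma_allow, min_area)

# A member carrying 200 kN at an allowable of 100 MPa (units consistent):
a0 = 1.0
a1 = stress_ratio_resize(a0, 200.0, 100.0)   # overstressed -> area doubles
a2 = stress_ratio_resize(a1, 200.0, 100.0)   # already fully stressed -> unchanged
```

For statically indeterminate structures the member forces change with the areas, so the rule is applied iteratively inside a reanalysis loop; for a determinate member, as here, it converges in one step.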

  13. Tool setting device

    DOEpatents

    Brown, Raymond J.

    1977-01-01

    The present invention relates to a tool setting device for use with numerically controlled machine tools, such as lathes and milling machines. A reference position of the machine tool relative to the workpiece along both the X and Y axes is utilized by the control circuit for driving the tool through its program. This reference position is determined for both axes by displacing a single linear variable displacement transducer (LVDT) with the machine tool through a T-shaped pivotal bar. The use of the T-shaped bar allows the cutting tool to be moved sequentially in the X or Y direction for indicating the actual position of the machine tool relative to the predetermined desired position in the numerical control circuit by using a single LVDT.

  14. The Development and Comparison of Molecular Dynamics Simulation and Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Chen, Jundong

    2018-03-01

    Molecular dynamics is an integrated technology that combines physics, mathematics and chemistry. The molecular dynamics method is a computer simulation technique and a powerful tool for studying condensed matter systems. It not only yields the trajectories of the atoms but also reveals the microscopic details of atomic motion. By studying the numerical integration algorithms used in molecular dynamics simulation, we can analyze the microstructure and the motion of particles, relate them to macroscopic material behaviour, and study the relationship between interactions and macroscopic properties more conveniently. Monte Carlo simulation, similar to molecular dynamics, is a tool for studying the nature of molecules and particles at the micro scale. In this paper, the theoretical background of computer numerical simulation is introduced, and the specific methods of numerical integration are summarized, including the Verlet method, the leap-frog method and the velocity Verlet method. At the same time, the method and principle of Monte Carlo simulation are introduced. Finally, similarities and differences between Monte Carlo simulation and molecular dynamics simulation are discussed.
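The velocity Verlet scheme summarized above can be sketched directly. Here it integrates a unit-mass harmonic oscillator, a standard test case because the scheme's good long-term energy conservation is easy to check; the code is a minimal illustration, not taken from the paper.

```python
def velocity_verlet(x, v, acc, dt, steps):
    """Velocity Verlet: positions updated with current acceleration,
    velocities with the average of old and new accelerations."""
    a = acc(x)
    traj = [x]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt
        a_new = acc(x)
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
        traj.append(x)
    return traj, v

# Harmonic oscillator with unit mass and spring constant: a(x) = -x.
traj, v_end = velocity_verlet(1.0, 0.0, lambda x: -x, dt=0.01, steps=1000)
```

After 1000 steps the total energy 0.5*v^2 + 0.5*x^2 stays very close to its initial value of 0.5, which is the symplectic property that makes this family of integrators standard in molecular dynamics.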

  15. Verification of a Multiphysics Toolkit against the Magnetized Target Fusion Concept

    NASA Technical Reports Server (NTRS)

    Thomas, Scott; Perrell, Eric; Liron, Caroline; Chiroux, Robert; Cassibry, Jason; Adams, Robert B.

    2005-01-01

    In the spring of 2004 the Advanced Concepts team at MSFC embarked on an ambitious project to develop a suite of modeling routines that would interact with one another. Each tool would numerically model a portion of an advanced propulsion system. The tools were divided by physics categories, hence the name multiphysics toolset. Currently most of the anticipated modeling tools have been created and integrated. Results are given in this paper for both a quarter nozzle with chemically reacting flow and the interaction of two plasma jets representative of a Magnetized Target Fusion device. The results have not yet been calibrated against real data, but this paper demonstrates the current capability of the multiphysics tool and planned future enhancements.

  16. Numerical model of solar dynamic radiator for parametric analysis

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer L.

    1989-01-01

    Growth power requirements for Space Station Freedom will be met through the addition of 25 kW solar dynamic (SD) power modules. Extensive thermal and power cycle modeling capabilities have been developed which are powerful tools in Station design and analysis, but which prove cumbersome and costly for simple component preliminary design studies. In order to aid in refining the SD radiator to the mature design stage, a simple and flexible numerical model was developed. The model simulates the heat transfer and fluid flow performance of the radiator and calculates area, mass, and impact survivability for many combinations of flow tube and panel configurations, fluid and material properties, and environmental and cycle variations.
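The sort of first-cut sizing such a model supports can be illustrated with a grey-body radiation balance. This is a sketch only (uniform panel temperature, fixed sink temperature, illustrative numbers); the paper's model resolves flow tubes, panels, and fluid properties.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiator_area(q_reject_w, t_surface_k, t_sink_k, emissivity=0.85):
    """Radiating area needed to reject q_reject_w from a grey surface at
    t_surface_k against an effective sink at t_sink_k:
    A = Q / (eps * sigma * (Ts^4 - Tsink^4))."""
    flux = emissivity * SIGMA * (t_surface_k**4 - t_sink_k**4)
    return q_reject_w / flux

# Rejecting 25 kW of cycle waste heat at 350 K against a 250 K sink:
area_ideal = radiator_area(25000.0, 350.0, 250.0, emissivity=1.0)  # ~40 m^2
```

Dropping the emissivity to a realistic coated-surface value simply scales the area up by 1/emissivity, which is why such one-line estimates are useful before committing to detailed tube-and-panel analysis.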

  17. Optimizing romanian maritime coastline using mathematical model Litpack

    NASA Astrophysics Data System (ADS)

    Anton, I. A.; Panaitescu, M.; Panaitescu, F. V.

    2017-08-01

    There are many methods and tools for studying shoreline change in coastal engineering. LITPACK is a numerical model included in the MIKE software developed by DHI (Danish Hydraulic Institute). With this mathematical model we can simulate coastline evolution and the profile along the beach. Research and methodology: the paper covers the location of the study area, the current status of the Midia-Mangalia shoreline, protection objectives, and the changes in the shoreline after protective constructions were built. Numerical and graphical results obtained with this model for the Romanian maritime coastline in the Midia-Mangalia area are presented: non-cohesive sediment transport, long-shore current and littoral drift, coastline evolution, cross-shore profile evolution, and the development of the coastline position in time.

  18. Center for Extended Magnetohydrodynamics Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramos, Jesus

    This researcher participated in the DOE-funded Center for Extended Magnetohydrodynamics Modeling (CEMM), a multi-institutional collaboration led by the Princeton Plasma Physics Laboratory with Dr. Stephen Jardin as the overall Principal Investigator. This project developed advanced simulation tools to study the non-linear macroscopic dynamics of magnetically confined plasmas. The collaborative effort focused on the development of two large numerical simulation codes, M3D-C1 and NIMROD, and their application to a wide variety of problems. Dr. Ramos was responsible for theoretical aspects of the project, deriving consistent sets of model equations applicable to weakly collisional plasmas and devising test problems for verification of the numerical codes. This activity was funded for twelve years.

  19. The role of 3D visualisation as an analytical tool preparatory to numerical modelling

    NASA Astrophysics Data System (ADS)

    Robins, N. S.; Rutter, H. K.; Dumpleton, S.; Peach, D. W.

    2005-01-01

    Groundwater investigation has long depended on the process of developing a conceptual flow model as a precursor to developing a mathematical model, which in turn may lead in complex aquifers to the development of a numerical approximation model. The assumptions made in the development of the conceptual model depend heavily on the geological framework defining the aquifer, and if the conceptual model is inappropriate then subsequent modelling will also be incorrect. Paradoxically, the development of a robust conceptual model remains difficult, not least because this 3D paradigm is usually reduced to 2D plans and sections. 3D visualisation software is now available to facilitate the development of the conceptual model, to make the model more robust and defensible and to assist in demonstrating the hydraulics of the aquifer system. Case studies are presented to demonstrate the role and cost-effectiveness of the visualisation process.

  20. Fatigue Analysis of Rotating Parts. A Case Study for a Belt Driven Pulley

    NASA Astrophysics Data System (ADS)

    Sandu, Ionela; Tabacu, Stefan; Ducu, Catalin

    2017-10-01

    The present study is focused on the life estimation of a rotating part, namely the coolant pump pulley, as a component of an engine assembly. The goal of the paper is to develop a model, supported by numerical analysis, capable of predicting the lifetime of the part. Starting from the functional drawing, CAD model and technical specifications of the part, a numerical model was developed. MATLAB code was used to develop a tool to apply the load over the selected area. The numerical analysis was performed in two steps. The first simulation concerned the inertia relief due to rotational motion about the shaft (of the pump). Results from this simulation were saved, and the stress-strain state was used as the initial condition for the analysis with the load applied. The lifetime of a good part was estimated. A defect was then created in order to investigate its influence on the working requirements. It was found that there is little influence with respect to the prescribed lifetime.

  1. Thermomechanical conditions and stresses on the friction stir welding tool

    NASA Astrophysics Data System (ADS)

    Atthipalli, Gowtam

    Friction stir welding (FSW) has been commercially used as a joining process for aluminum and other soft materials. However, the use of this process for joining hard alloys is still developing, primarily because of the lack of cost-effective, long-lasting tools. Here I have developed numerical models to understand the thermomechanical conditions experienced by the FSW tool and to improve its reusability. A heat transfer and visco-plastic flow model is used to calculate the torque and traverse force on the tool during FSW. The computed values of torque and traverse force are validated using experimental results for FSW of AA7075, AA2524, AA6061 and Ti-6Al-4V alloys. The computed torque components are used to determine the optimum tool shoulder diameter based on the maximum use of torque and maximum grip of the tool on the plasticized workpiece material. The estimation of the optimum tool shoulder diameter for FSW of AA6061 and AA7075 was verified with experimental results. The computed values of traverse force and torque are used to calculate the maximum shear stress on the tool pin and thus the load-bearing ability of the pin. The load-bearing ability calculations are used to explain the failure of an H13 steel tool during welding of AA7075 and of a commercially pure tungsten tool during welding of L80 steel. Artificial neural network (ANN) models are developed to predict the important FSW output parameters as functions of selected input parameters. These ANNs consider tool shoulder radius, pin radius, pin length, welding velocity, tool rotational speed and axial pressure as input parameters. The total torque, sliding torque, sticking torque, peak temperature, traverse force, maximum shear stress and bending stress are considered as the outputs of the ANN models. These output parameters are selected because they define the thermomechanical conditions around the tool during FSW. 
The developed ANN models are used to understand the effect of various input parameters on the total torque and traverse force during FSW of AA7075 and 1018 mild steel. The ANN models are also used to determine the tool safety factor for a wide range of input parameters. A numerical model is developed to calculate the strain and strain rates along streamlines during FSW; the strain and strain-rate values are calculated for FSW of AA2524. Three simplified models are also developed for quick estimation of output parameters such as the material velocity field, torque and peak temperature. The material velocity field is computed by adapting an analytical method for the flow of an incompressible fluid between two discs, one rotating and the other stationary. The peak temperature is estimated from a non-dimensional correlation with dimensionless heat input, which is computed from known welding parameters and material properties. The torque is computed using an analytical function based on the shear strength of the workpiece material. These simplified models are shown to predict the output parameters successfully.
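The load-bearing calculation for a pin loaded mainly in torsion can be sketched from the solid-shaft torsion formula. This is an illustrative simplification (the thesis combines torque and traverse-force contributions, and the numbers below are invented, not the thesis's data).

```python
import math

def pin_max_shear_stress(torque_nm, pin_diameter_m):
    """Peak shear stress in a solid circular pin under pure torsion:
    tau = T*r/J = 16*T / (pi * d^3)."""
    return 16.0 * torque_nm / (math.pi * pin_diameter_m**3)

def tool_safety_factor(shear_strength_pa, torque_nm, pin_diameter_m):
    """Ratio of the tool material's shear strength at temperature to the
    applied peak shear stress; below 1, pin failure is expected."""
    return shear_strength_pa / pin_max_shear_stress(torque_nm, pin_diameter_m)

# Hypothetical case: 50 N*m carried by a 10 mm pin of a tool material
# with 1 GPa shear strength at the welding temperature.
tau = pin_max_shear_stress(50.0, 0.01)          # ~2.5e8 Pa
sf = tool_safety_factor(1.0e9, 50.0, 0.01)      # ~3.9
```

Because the stress scales as 1/d^3, small increases in pin diameter buy large margins, which is consistent with the trade-off against stirred-zone size discussed in the thesis.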

  2. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    The increasing ship traffic and maritime transport of dangerous substances make it more difficult to significantly reduce the environmental, economic and social risks posed by potential spills, even though security rules are becoming more restrictive (double-hull ships, etc.) and surveillance systems more developed (VTS, AIS). Spills remain a central concern: spill events happen continuously, most of them unknown to the general public because of their small-scale impact, while a much smaller number become authentic media phenomena in this information era, owing to their large dimensions, their environmental and socio-economic impacts on ecosystems and local communities, and the spectacular or shocking images they generate. The adverse consequences of this type of accident increase the pressure to avoid them in the future, or to minimize their impacts, using not only surveillance and monitoring tools but also an increased capacity to predict the fate and behaviour of bodies, objects, or substances in the hours following an accident; numerical models can now play a leading role in operational oceanography applied to safety and pollution response in the ocean because of their predictive potential. Search and rescue operations and risk analysis for oil, inert (ship debris or floating containers), and HNS (hazardous and noxious substances) spills are the main areas where models can be used. Model applications have been widely used in emergency and planning issues associated with pollution risks, and in contingency and mitigation measures. Before a spill, in the planning stage, modelling simulations are used in environmental impact studies and risk maps, using historical data, reference situations, and typical scenarios. 
After a spill, fast and simple modelling applications allow responders to understand the fate and behaviour of the spilt substances, helping to manage the crisis, distribute response resources, and prioritize specific areas. They can also be used to detect pollution sources. However, the resources involved and the scientific and technological expertise needed to operate numerical models have limited the interoperability between operational models, monitoring tools and decision-support software. The increasing capacity to predict metocean conditions and the fate and behaviour of pollutants spilt at sea or in coastal zones, together with monitoring tools like vessel traffic control systems, can provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or on the Portuguese coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both giving the end user control over the model simulations. Parameters such as the date and time of the event, the location and the oil spill volume are provided by the user; these interactive tools also integrate the best available metocean forecasts (waves, meteorology, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or ftp sites, and then automatically interpolated and pre-processed for the simulators. 
The simulation tools developed can also import initial data from, and export results to, remote servers using OGC WFS services. Simulations are delivered to the end user in a matter of seconds and can thus be very useful in emergency situations. The backtracking modelling feature and the ability to import spill locations from remote servers with observed data (for example, from flight surveillance or remote sensing) allow these tools to be applied to the evaluation of possible contamination sources. The third tool developed is an innovative system that dynamically produces quantified risk levels in real time, integrating the best available information from numerical forecasts and existing monitoring tools. This system provides coastal pollution risk levels associated with potential (or real) oil spill incidents, taking into account regional statistical information on vessel accidents and coastal sensitivity indexes (determined in the EROCIPS project), real-time vessel information (position, cargo type, speed and vessel type) obtained from AIS, the best available metocean numerical forecasts (hydrodynamics, meteorology - including visibility - and wave conditions) and scenarios simulated by the oil spill fate and behaviour component of the MOHID Water Modelling System (www.mohid.com). Different spill fate and behaviour simulations are continuously generated and processed in the background (assuming hypothetical spills from vessels), based on variable vessel information and metocean conditions, and the results of these simulations are used to quantify the consequences of potential spills. The Dynamic Risk Tool was not designed to replace conventional mapping tools, but to complement that type of information with an innovative approach to risk mapping. 
Taking advantage of interoperability between forecasting models, oil spill simulations, AIS monitoring systems, statistical data and coastal vulnerability, this software can provide end users with real-time risk levels, allowing an innovative approach to risk mapping, an improved decision-support model for decision-makers, and intelligent risk-based traffic monitoring. For instance, this tool allows the prioritisation of individual ships and geographical areas, and facilitates strategic and dynamic tug positioning. The risk levels are generated in real time, and the historic results are kept in a database, allowing later risk analysis or compilations for specific seasons or regions in order to obtain typical risk maps. The integration of metocean modelling results (instead of typical static scenarios), together with continuous background oil spill modelling, provides a more realistic approach to estimating risk levels - the metocean conditions and oil spill behaviour are always different and specific, and it would be virtually impossible to define those conditions in advance even if several thousand static scenarios were considered. This system was initially implemented in Portugal (ARCOPOL project) for oil spills. Implementation in different regions of the Atlantic and adaptation to chemical spills will be carried out in the scope of the ARCOPOL+ project. 
The numerical model used to compute the fate and behaviour of spilled substances in all the tools developed (the MOHID lagrangian & oil spill model from the MOHID Water Modelling System) was also subject to several adaptations and updates to increase its suitability for the developed tools - horizontal velocity due to Stokes drift, vertical movement of oil substances, modelling of floating containers, backtracking modelling and a multi-solution approach (generating the computational grid on the fly and using the available information from the multiple metocean forecasting solutions) are some of the main features recently implemented. The main purpose of these software tools is to reduce the gap between decision-makers and scientific modellers: although the correct analysis of model results usually requires a specialist, an operational model user should spend their time analysing results and producing different scenarios, not converting and interpolating metocean results, preparing input data files, running models and post-processing results. Harmonization and standardization in the dissemination of numerical model outputs is a strategic effort for the modelling community, because it facilitates the application of model results in decision-support tools like the ones presented here.
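The lagrangian transport at the core of such spill models can be sketched as advection by a current field plus a random-walk term standing in for turbulent diffusion. This is illustrative only; the function names and parameters are invented, and MOHID's lagrangian module is far more complete (weathering, Stokes drift, vertical movement, etc.).

```python
import random

def advect_particles(particles, u, v, dt, steps, kd=0.5, seed=42):
    """Move (x, y) particles through `steps` time steps: deterministic
    advection by the current field (u, v) plus Gaussian random-walk
    displacements of scale kd representing turbulent diffusion."""
    rng = random.Random(seed)
    for _ in range(steps):
        particles = [
            (x + u(x, y) * dt + rng.gauss(0.0, kd) * dt,
             y + v(x, y) * dt + rng.gauss(0.0, kd) * dt)
            for x, y in particles
        ]
    return particles

# Uniform 1 m/s eastward current, no diffusion: a pure-drift sanity check.
cloud = advect_particles([(0.0, 0.0)], lambda x, y: 1.0,
                         lambda x, y: 0.0, dt=1.0, steps=10, kd=0.0)
```

Backtracking, mentioned above for source identification, amounts to running the same loop with the velocity field and time step negated.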

  3. Steps Towards Understanding Large-scale Deformation of Gas Hydrate-bearing Sediments

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Deusner, C.; Haeckel, M.; Kossel, E.

    2016-12-01

    Marine sediments bearing gas hydrates are typically characterized by heterogeneity in the gas hydrate distribution and anisotropy in the sediment-gas hydrate fabric properties. Gas hydrates also contribute to the strength and stiffness of the marine sediment, and any disturbance in the thermodynamic stability of the gas hydrates is likely to affect the geomechanical stability of the sediment. Understanding the mechanisms and triggers of large-strain deformation and failure of marine gas hydrate-bearing sediments is an area of extensive research, particularly in the context of marine slope stability and industrial gas production. The ultimate objective is to predict severe deformation events such as regional-scale slope failure or excessive sand production by using numerical simulation tools. The development of such tools essentially requires a careful analysis of the thermo-hydro-chemo-mechanical behavior of gas hydrate-bearing sediments at lab scale, and its stepwise integration into reservoir-scale simulators through the definition of effective variables, the use of suitable constitutive relations, and the application of scaling laws. One of the focus areas of our research is to understand the bulk coupled behavior of marine gas hydrate systems, with contributions from micro-scale characteristics, transport-reaction dynamics, and structural heterogeneity, through experimental flow-through studies using high-pressure triaxial test systems and advanced tomographical tools (CT, ERT, MRI). We combine these studies to develop mathematical models and numerical simulation tools which can be used to predict the coupled hydro-geomechanical behavior of marine gas hydrate reservoirs in a large-strain framework. Here we will present some of our recent results from closely coordinated experimental and numerical simulation studies, with the objective of capturing the large-deformation behavior relevant to different gas production scenarios. 
We will also report on a variety of mechanically relevant test scenarios, focusing on the effects of dynamic changes in gas hydrate saturation, highly uneven gas hydrate distributions, focused fluid migration, and gas hydrate production through depressurization and CO2 injection.

  4. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  5. Advanced Power System Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.
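The pointing and shadowing factors singled out above can be illustrated with a first-order array power estimate. This is only a sketch; SPACE integrates many more effects (degradation, temperature, battery cycles, orbital geometry), and the names and numbers here are illustrative.

```python
import math

def array_power(p_rated_w, sun_angle_rad, shadow_fraction):
    """First-order photovoltaic array output: rated power reduced by the
    cosine of the off-pointing angle and by the shadowed fraction of the
    array (assumed to produce nothing)."""
    cos_loss = max(math.cos(sun_angle_rad), 0.0)   # no output if sun behind array
    return p_rated_w * cos_loss * (1.0 - shadow_fraction)

# A 25 kW module, pointed 60 degrees off-sun with half the array shadowed:
p = array_power(25000.0, math.pi / 3, 0.5)   # 25 kW * 0.5 * 0.5
```

Multiplying such instantaneous estimates along an orbital timeline, including eclipse periods, is the basic shape of a capability assessment like the ones described for ISS.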

  6. CLIPS: A tool for the development and delivery of expert systems

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1991-01-01

    The C Language Integrated Production System (CLIPS) is a forward-chaining rule-based language developed by the Software Technology Branch at the Johnson Space Center. CLIPS provides a complete environment for the construction of rule-based expert systems. CLIPS was designed specifically to provide high portability, low cost, and easy integration with external systems. Other key features of CLIPS include a powerful rule syntax, an interactive development environment, high performance, extensibility, a verification/validation tool, extensive documentation, and source code availability. The current release of CLIPS, version 4.3, is being used by over 2,500 users throughout the public and private community, including all NASA sites, branches of the military, numerous Federal bureaus, government contractors, 140 universities, and many companies.

  7. Development of a Numerical Model for Orthogonal Cutting. Discussion about the Sensitivity to Friction Problem

    NASA Astrophysics Data System (ADS)

    San Juan, M.; de la Iglesia, J. M.; Martín, O.; Santos, F. J.

    2009-11-01

    Despite the important progress achieved in the understanding of cutting processes, the study of certain aspects, such as temperature gradients, friction, and contact, has suffered from the limitations of experimental means. The development of numerical models is therefore a valid first approach to the study of those problems. In the present work, a calculation model is developed in the Abaqus/Explicit code to represent the orthogonal cutting of AISI 4140 steel. A two-dimensional simulation under plane-strain conditions is chosen, which is considered adiabatic owing to the high speed of the material flow. Chip separation is defined by means of a fracture law that allows complex simulations of tool penetration into the workpiece. The strong influence of friction on cutting is demonstrated: even with well-defined material behaviour laws, an erroneous value of the friction coefficient can notably reduce the reliability of the results. Given the difficulty of validating the friction models used in the simulation against the tests usually carried out, the most effective way to characterize friction would be to combine simulation models with cutting tests.

  8. Prediction of blood pressure and blood flow in stenosed renal arteries using CFD

    NASA Astrophysics Data System (ADS)

    Jhunjhunwala, Pooja; Padole, P. M.; Thombre, S. B.; Sane, Atul

    2018-04-01

    In the present work an attempt is made to develop an inexpensive, in-vitro diagnostic tool for renal artery stenosis (RAS). To analyse the effects of increasing stenosis severity on hypertension and blood flow, haemodynamic parameters are studied through numerical simulation. A total of 16 stenosed models with degrees of stenosis severity ranging from 0-97.11% are assessed numerically. Blood is modelled as a shear-thinning, non-Newtonian fluid using the Carreau model. Computational fluid dynamics (CFD) analysis is carried out to compute flow parameters such as the maximum velocity and maximum pressure attained by blood due to stenosis under pulsatile flow. These values are then used to compute the increase in blood pressure and the decrease in blood flow available to the kidney. The computed available blood flow and secondary hypertension for varying extents of stenosis are fitted by a curve-fitting technique using MATLAB, and a mathematical model is developed. Based on these mathematical models, a quantification tool is developed for tentative prediction of the probable blood flow available to the kidney and of the severity of stenosis when the secondary hypertension is known.
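    As context for the non-Newtonian rheology mentioned above, the Carreau shear-thinning law can be sketched in a few lines. The parameter values below are typical literature values for blood, assumed for illustration rather than taken from this paper:

```python
import numpy as np

def carreau_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345, lam=3.313, n=0.3568):
    """Carreau model for shear-thinning blood viscosity (Pa*s).

    mu0/mu_inf are the zero- and infinite-shear viscosities, lam a time
    constant, n the power-law index (assumed literature values for blood).
    """
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

# Viscosity falls monotonically from mu0 toward mu_inf as shear rate rises
rates = np.logspace(-2, 3, 6)          # shear rates, 1/s
viscosities = carreau_viscosity(rates)
```

    At zero shear rate the model returns mu0 exactly, which makes it easy to check an implementation.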

  9. Verifying the error bound of numerical computation implemented in computer systems

    DOEpatents

    Sawada, Jun

    2013-03-12

    A verification tool receives a finite precision definition for an approximation of an infinite precision numerical function implemented in a processor in the form of a polynomial of bounded functions. The verification tool receives a domain for verifying outputs of segments associated with the infinite precision numerical function. The verification tool splits the domain into at least two segments, wherein each segment is non-overlapping with any other segment, and converts, for each segment, a polynomial of bounded functions for the segment to a simplified formula comprising a polynomial, an inequality, and a constant for a selected segment. The verification tool calculates upper bounds of the polynomial for the at least two segments, beginning with the selected segment, and reports the segments that violate a bounding condition.
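    The split-and-bound procedure described in this record can be illustrated with a toy sketch. The endpoint-based bounding rule and the fixed segment count below are simplifying assumptions for illustration, not the patented method itself:

```python
def poly_upper_bound(coeffs, lo, hi):
    """Crude upper bound of sum(c_k * x**k) on [lo, hi].

    Each term is bounded by its worse endpoint value, which is valid
    because x**k is monotone on intervals that do not straddle 0.
    """
    assert lo >= 0 or hi <= 0, "split at 0 first so each x**k is monotone"
    return sum(max(c * lo ** k, c * hi ** k) for k, c in enumerate(coeffs))

def verify(coeffs, domain, error_bound, num_segments=64):
    """Split the domain into non-overlapping segments and report those
    whose computed upper bound exceeds the required error bound."""
    lo, hi = domain
    width = (hi - lo) / num_segments
    violations = []
    for i in range(num_segments):
        a = lo + i * width
        b = a + width
        if poly_upper_bound(coeffs, a, b) > error_bound:
            violations.append((a, b))
    return violations
```

    For example, verifying x**2 on [0, 1] against a bound of 0.5 flags only the segments near x = 1, where the polynomial actually exceeds the bound.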

  10. Numerical Analysis Objects

    NASA Astrophysics Data System (ADS)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
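    The interface-based tool composition that NAO implements in C++ can be suggested with a small Python sketch. All class and method names here are invented for illustration; they are not NAO's actual API:

```python
from abc import ABC, abstractmethod

# Tools agree on small abstract interfaces (here, a scalar Function and an
# Operator acting on it) so they can be composed without depending on each
# other's internal data structures.
class Function(ABC):
    @abstractmethod
    def __call__(self, x: float) -> float: ...

class Operator(ABC):
    @abstractmethod
    def apply(self, f: Function) -> Function: ...

class Square(Function):
    def __call__(self, x):
        return x * x

class CentralDifference(Operator):
    """A discretization 'tool': maps a function to its approximate derivative."""
    def __init__(self, h=1e-5):
        self.h = h

    def apply(self, f):
        h = self.h
        class Derivative(Function):
            def __call__(self, x):
                return (f(x + h) - f(x - h)) / (2 * h)
        return Derivative()

dsq = CentralDifference().apply(Square())   # approximates d/dx x^2 = 2x
```

    Any tool that accepts a `Function` can now consume `dsq` without knowing it was produced by a differencing operator, which is the composition property the abstract-interface design buys.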

  11. Nonlinear mechanics of composite materials with periodic microstructure

    NASA Technical Reports Server (NTRS)

    Jordan, E. H.; Walker, K. P.

    1991-01-01

    This report summarizes the results of research done under NASA grant NAG3-882, Nonlinear Mechanics of Composites with Periodic Microstructure. The effort involved the development of non-finite-element methods to calculate local stresses around fibers in composite materials. The theory was developed and some promising numerical results were obtained. It is expected that when this approach is fully developed, it will provide an important tool for calculating local stresses and averaged constitutive behavior in composites. NASA currently has a major contractual effort (NAS3-24691) to bring the approach developed under this grant to application readiness. The report has three sections: the general theory, which appeared as a NASA TM; a second section giving greater detail on the theory connecting the Green's function and Fourier series approaches; and a final section presenting numerical results.

  12. [Blended-learning in psychosomatics and psychotherapy - Increasing the satisfaction and knowledge of students with a web-based e-learning tool].

    PubMed

    Ferber, Julia; Schneider, Gudrun; Havlik, Linda; Heuft, Gereon; Friederichs, Hendrik; Schrewe, Franz-Bernhard; Schulz-Steinel, Andrea; Burgmer, Markus

    2014-01-01

    To improve the synergy of established teaching methods, the Department of Psychosomatics and Psychotherapy, University Hospital Münster, developed a web-based e-learning tool using video clips of standardized patients. The effect of this blended-learning approach was evaluated. A multiple-choice test was performed by a naive cohort (without the e-learning tool) and an experimental cohort (with the tool) of medical students to test the groups' expertise in psychosomatics. In addition, participants' satisfaction with the new tool was evaluated (numeric rating scale of 0-10). The experimental cohort was more satisfied with the curriculum and more interested in psychosomatics. Furthermore, the experimental cohort scored significantly better in the multiple-choice test. The new tool proved to be an important addition to the classical curriculum as a blended-learning approach that improves students' satisfaction and knowledge in psychosomatics.

  13. Energy evaluation of protection effectiveness of anti-vibration gloves.

    PubMed

    Hermann, Tomasz; Dobry, Marian Witalis

    2017-09-01

    This article describes an energy method of assessing protection effectiveness of anti-vibration gloves on the human dynamic structure. The study uses dynamic models of the human and the glove specified in Standard No. ISO 10068:2012. The physical models of human-tool systems were developed by combining human physical models with a power tool model. The combined human-tool models were then transformed into mathematical models from which energy models were finally derived. Comparative energy analysis was conducted in the domain of rms powers. The energy models of the human-tool systems were solved using numerical simulation implemented in the MATLAB/Simulink environment. The simulation procedure demonstrated the effectiveness of the anti-vibration glove as a method of protecting human operators of hand-held power tools against vibration. The desirable effect is achieved by lowering the flow of energy in the human-tool system when the anti-vibration glove is employed.
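    The flow of vibratory power through a driven mass-spring-damper, the quantity the rms-power comparison above is built on, can be sketched for a single degree of freedom. The parameter values below are assumed for illustration and are not the ISO 10068:2012 hand-arm model values:

```python
import numpy as np

# 1-DOF sketch: mass-spring-damper driven by a harmonic force at the grip
# (illustrative parameters, not the ISO 10068:2012 multi-mass model).
m, c, k = 1.0, 50.0, 2.0e4         # kg, N*s/m, N/m
omega = 2.0 * np.pi * 100.0        # 100 Hz tool vibration, rad/s
F0 = 10.0                          # force amplitude, N

# Steady-state harmonic response: displacement amplitude X, velocity amplitude V
X = F0 / np.sqrt((k - m * omega**2) ** 2 + (c * omega) ** 2)
V = omega * X

# Time-averaged power dissipated in the damper, written via the rms velocity
P_rms_damper = c * (V / np.sqrt(2.0)) ** 2
```

    Lowering the velocity amplitude transmitted to the hand (e.g. by an anti-vibration glove acting as an extra compliant stage) lowers this dissipated power quadratically, which is the protective effect the energy method quantifies.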

  14. Trajectories for High Specific Impulse High Specific Power Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    Flight times and deliverable masses for electric and fusion propulsion systems are difficult to approximate; numerical integration is required for these continuous-thrust systems. Many scientists are not equipped with the tools and expertise to conduct interplanetary and interstellar trajectory analysis for their concepts. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. They are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well-known trajectory optimization code that performs numerical integration based on the calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. An analytical method derived in the companion paper was also evaluated; its accuracy is discussed in the paper.
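    Cowell's method, i.e. direct numerical integration of the equations of motion with the thrust acceleration simply added to gravity, can be sketched in canonical two-body units. The thrust level and integration settings below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Cowell's method sketch: integrate the planar two-body equations of motion
# with a small continuous tangential thrust (canonical units, mu = 1;
# the thrust acceleration value is illustrative).
MU = 1.0
A_THRUST = 0.01    # continuous thrust acceleration, canonical units

def deriv(state):
    r, v = state[:2], state[2:]
    a_grav = -MU * r / np.linalg.norm(r) ** 3
    a_thrust = A_THRUST * v / np.linalg.norm(v)   # thrust along velocity
    return np.concatenate([v, a_grav + a_thrust])

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Start on a circular orbit of radius 1 and spiral outward under thrust
state = np.array([1.0, 0.0, 0.0, 1.0])
for _ in range(2000):                 # 20 canonical time units
    state = rk4_step(state, 0.01)
radius = np.linalg.norm(state[:2])
```

    Sweeping `A_THRUST` (i.e. specific power) and the burn time in such a loop is how the flight-time and payload-ratio trends shown in the paper's charts arise.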

  15. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a
    numerical study was conducted using both the finite-difference, time-domain method and a frequency- wavenumber method. When the propagation velocity in the borehole was greater than th...

  16. Using Modern Design Tools for Digital Avionics Development

    NASA Technical Reports Server (NTRS)

    Hyde, David W.; Lakin, David R., II; Asquith, Thomas E.

    2000-01-01

    Shrinking development time and the increased complexity of new avionics force the designer to use modern tools and methods during hardware development. Engineers at the Marshall Space Flight Center have successfully upgraded their design flow and used it to develop a Mongoose V based radiation-tolerant processor board for the International Space Station's Water Recovery System. The design flow, based on hardware description languages, simulation, synthesis, hardware models, and full functional software model libraries, allowed designers to fully simulate the processor board, from reset through initialization, before any boards were built. The fidelity of a digital simulation is limited by the accuracy of the models used and how realistically the designer drives the circuit's inputs during simulation. By using the actual silicon during simulation, device modeling errors are reduced. Numerous design flaws were discovered early in the design phase, when they could be easily fixed. The use of hardware models and actual MIPS software loaded into full functional memory models also provided checkout of the software development environment. This paper will describe the design flow used to develop the processor board and give examples of errors that were found using the tools. An overview of the processor board firmware will also be covered.

  17. Analogue and numerical modelling in Volcanology: Development, evolution and future challenges

    NASA Astrophysics Data System (ADS)

    Kavanagh, Janine; Annen, Catherine

    2015-04-01

    Since the inception of volcanology as a science, analogue modelling has been an important methodology to study the formation and evolution of the volcanic system. With the development of computing capacity, numerical modelling has become a widely used tool to explore magmatic processes quantitatively and to try to predict eruptive behaviour. Processes of interest include the development and establishment of the volcanic plumbing system, the propagation of magma to the surface to feed eruptions, the construction of a volcanic edifice and the dynamics of eruptive processes. An important ultimate aim is to characterise and measure the experimental volcanic and magmatic phenomena, to inform and improve eruption forecasting for hazard assessments. In nature, volcanic activity is often unpredictable and occurs in an environment that is highly changeable and forbidding. Volcanic or magmatic activity cannot be repeated at will and has many (often unconstrained) variables. The processes of interest are frequently hidden from view, for example occurring beneath the Earth's surface or within a pyroclastic flow or plume. The challenges of working in volcanic terrains and gathering 'real' volcano data mean that analogue and numerical models have gained significant importance as a method to study the geometry, kinematics, and dynamics of volcano growth and eruption. A huge variety of analogue materials have been used in volcanic modelling, often bringing out the more creative side of the scientific mind. As with all models, the choice of appropriate materials and boundary conditions is critical for assessing the relevance and usefulness of the experimental results. Numerical simulation has proved a useful tool to test the physical plausibility of conceptual models and presents the advantage of being applicable at different scales. It is limited, however, in its predictive power by the number of free parameters needed to describe geological systems. 
In this special symposium we will attempt to review the use and significance of analogue and numerical modelling in volcanological research over the past century to the present day. We introduce some of the new techniques being developed through a multidisciplinary approach, and offer some perspectives on how these might be used to help shape the direction of future research in volcanology.

  18. CFD - Mature Technology?

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2005-01-01

    Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes are encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis) CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.

  19. Managing water in the West: developing new tools for a critical resource

    USGS Publications Warehouse

    Scoppettone, G.G.; Gadomski, D.; Petersen, J.; Hatten, J.

    2005-01-01

    Rapid population growth in the Western United States over the last century has placed increasing strains on our water supplies and aquatic ecosystems. Historically, water rights have been used to determine the allocation of water in the West, but rules and regulations related to endangered species now often drive how water is released from reservoirs in large rivers such as the lower Colorado and the Columbia. In numerous smaller watersheds, communities are trying to balance the water necessary for human use, irrigation, and the conservation of ecosystems. To assist managers in the face of increasing complexity and uncertainty in water management decision-making, the Western Fisheries Research Center (WFRC) is involved in developing a new generation of integrative tools. Below are some examples of the types of tools that already exist within the WFRC.

  20. A new digitized reverse correction method for hypoid gears based on a one-dimensional probe

    NASA Astrophysics Data System (ADS)

    Li, Tianxing; Li, Jubo; Deng, Xiaozhong; Yang, Jianjun; Li, Genggeng; Ma, Wensuo

    2017-12-01

    In order to improve the tooth surface geometric accuracy and transmission quality of hypoid gears, a new digitized reverse correction method is proposed based on the measurement data from a one-dimensional probe. The minimization of tooth surface geometrical deviations is realized from the perspective of mathematical analysis and reverse engineering. Combining the analysis of complex tooth surface generation principles and the measurement mechanism of one-dimensional probes, the mathematical relationship between the theoretical designed tooth surface, the actual machined tooth surface and the deviation tooth surface is established, the mapping relation between machine-tool settings and tooth surface deviations is derived, and the essential connection between the accurate calculation of tooth surface deviations and the reverse correction method of machine-tool settings is revealed. Furthermore, a reverse correction model of machine-tool settings is built, a reverse correction strategy is planned, and the minimization of tooth surface deviations is achieved by means of the method of numerical iterative reverse solution. On this basis, a digitized reverse correction system for hypoid gears is developed by the organic combination of numerical control generation, accurate measurement, computer numerical processing, and digitized correction. Finally, the correctness and practicability of the digitized reverse correction method are proved through a reverse correction experiment. The experimental results show that the tooth surface geometric deviations meet the engineering requirements after two trial cuts and one correction.
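    The iterative reverse solution described above can be suggested with a linearized toy sketch, in which tooth-surface deviations respond to small machine-tool setting changes through a sensitivity matrix that is inverted in the least-squares sense. The matrix and deviation data below are synthetic stand-ins, not data from the paper:

```python
import numpy as np

# Linearized reverse-correction sketch: deviations ~= S @ delta_settings,
# where S is a sensitivity matrix of surface deviations with respect to
# machine-tool settings. S and the "measured" deviations are synthetic.
rng = np.random.default_rng(0)
n_points, n_settings = 45, 6            # measured grid points, settings
S = rng.normal(size=(n_points, n_settings))
x_true = np.array([0.02, -0.01, 0.005, 0.0, 0.015, -0.008])
deviations = S @ x_true                 # simulated measured deviations

x = np.zeros(n_settings)
for _ in range(5):                      # iterative reverse solution
    residual = deviations - S @ x
    dx, *_ = np.linalg.lstsq(S, residual, rcond=None)
    x += dx

final_rms = np.sqrt(np.mean((deviations - S @ x) ** 2))
```

    In the real process the relation is nonlinear, so the sensitivity matrix is re-derived and the correction repeated over successive trial cuts, which matches the "two trial cuts and one correction" outcome reported above.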

  1. MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.

    PubMed

    Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan

    2017-01-01

    Food-webs and other classes of ecological network motifs are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, differing only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin-of-attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population-type models, allowing for rapid assessment of their dynamical and behavioural properties.
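    The kind of steady-state existence and stability analysis MI-Sim automates can be sketched for a two-species competition motif. MI-Sim itself is a MATLAB package built on substrate-based growth functions, so the Lotka-Volterra form and parameter values below are simplified assumptions for illustration:

```python
import numpy as np

# Two-species Lotka-Volterra competition motif (illustrative parameters)
r1, r2 = 1.0, 0.8          # growth rates
a12, a21 = 0.5, 0.4        # competition coefficients

def rhs(x):
    x1, x2 = x
    return np.array([r1 * x1 * (1 - x1 - a12 * x2),
                     r2 * x2 * (1 - x2 - a21 * x1)])

# Steady-state existence: coexistence point solves the linear system
# 1 - x1 - a12*x2 = 0, 1 - x2 - a21*x1 = 0
A = np.array([[1.0, a12], [a21, 1.0]])
x_star = np.linalg.solve(A, np.ones(2))

# Local stability: eigenvalues of the Jacobian at the steady state
x1, x2 = x_star
J = np.array([[r1 * (1 - 2 * x1 - a12 * x2), -r1 * a12 * x1],
              [-r2 * a21 * x2, r2 * (1 - 2 * x2 - a21 * x1)]])
stable = bool(np.all(np.linalg.eigvals(J).real < 0))
```

    For these coefficients the coexistence state exists and all Jacobian eigenvalues have negative real part, so both species persist; the same existence-then-eigenvalue workflow generalizes to the other motifs.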

  2. Simulation of the Physics of Flight

    ERIC Educational Resources Information Center

    Lane, W. Brian

    2013-01-01

    Computer simulations continue to prove to be a valuable tool in physics education. Based on the needs of an Aviation Physics course, we developed the PHYSics of FLIght Simulator (PhysFliS), which numerically solves Newton's second law for an airplane in flight based on standard aerodynamics relationships. The simulation can be used to pique…
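    The core of such a simulator, Newton's second law combined with standard lift and drag relations, can be sketched as follows. The aircraft parameters are rough assumed values (loosely light-aircraft-like), not those used in PhysFliS:

```python
import numpy as np

# Point-mass flight sketch: thrust, drag, lift, gravity (assumed values)
m = 1000.0          # mass, kg
S = 16.2            # wing area, m^2
rho = 1.225         # air density, kg/m^3
CL, CD = 0.5, 0.05  # lift and drag coefficients (held fixed for simplicity)
T = 1200.0          # thrust, N
g = 9.81            # gravity, m/s^2

def step(v, dt):
    """Advance the velocity (vx, vz) by one Euler step of Newton's 2nd law."""
    vx, vz = v
    speed = np.hypot(vx, vz)
    q = 0.5 * rho * speed**2            # dynamic pressure
    drag, lift = q * S * CD, q * S * CL
    # Drag opposes velocity; lift is perpendicular to it (rotated +90 deg)
    ux, uz = vx / speed, vz / speed
    ax = (T * ux - drag * ux - lift * uz) / m
    az = (lift * ux - drag * uz) / m - g
    return np.array([vx + ax * dt, vz + az * dt])

v = np.array([50.0, 0.0])   # start in level flight at 50 m/s
for _ in range(600):        # simulate 60 s
    v = step(v, 0.1)
```

    Starting above the trim speed, the aircraft sheds a little speed and picks up a climb rate, settling toward a steady shallow climb; this is the kind of behaviour students can explore by varying thrust or the coefficients.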

  3. An intercomparison study of TSM, SEBS, and SEBAL using high-resolution imagery and lysimetric data

    USDA-ARS?s Scientific Manuscript database

    Over the past three decades, numerous remote sensing based ET mapping algorithms were developed. These algorithms provided a robust, economical, and efficient tool for ET estimations at field and regional scales. The Two Source Model (TSM), Surface Energy Balance System (SEBS), and Surface Energy Ba...

  4. Functions and the Volume of Vases

    ERIC Educational Resources Information Center

    McCoy, Ann C.; Barger, Rita H.; Barnett, Joann; Combs, Emily

    2012-01-01

    Functions are one of the most important and powerful tools in mathematics because they allow the symbolic, visual, and verbal representation of relationships between variables. The power of functions, as well as the numerous real-world uses of functions, make them an important part of the development of algebraic reasoning in the middle grades.…

  5. Forest adaptation resources: Climate change tools and approaches for land managers

    Treesearch

    Chris Swanston; Maria, eds. Janowiak

    2012-01-01

    The forests of northern Wisconsin, a defining feature of the region's landscape, are expected to undergo numerous changes in response to the changing climate. This document provides a collection of resources designed to help forest managers incorporate climate change considerations into management and devise adaptation tactics. It was developed in northern...

  6. Development of knowledge and tools to enhance resilience of beef grazing systems for sustainable animal protein production

    USDA-ARS?s Scientific Manuscript database

    Ruminant livestock provide an important source of meat and dairy protein that sustain the health and livelihoods for much of the world’s population. Grazinglands that support ruminant livestock provide numerous other ecosystem services, including provision of food, water, and genetic resources; regu...

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogdanov, Yu. I., E-mail: bogdanov-yurii@inbox.ru; Avosopyants, G. V.; Belinskii, L. V.

    We describe a new method for reconstructing the quantum state of the electromagnetic field from the results of mutually complementary optical quadrature measurements. This method is based on the root approach, and displaced squeezed Fock states are used as the basis. Theoretical analysis and numerical experiments demonstrate the considerable advantage of the developed tools over those described in the literature.

  8. The Persian developmental sentence scoring as a clinical measure of morphosyntax in children.

    PubMed

    Jalilevand, Nahid; Kamali, Mohammad; Modarresi, Yahya; Kazemi, Yalda

    2016-01-01

    Background: Developmental Sentence Scoring (DSS) was developed as a numerical measurement and a clinical method based on morphosyntactic acquisition in the English language. The aim of this study was to develop a new numerical tool similar to DSS to assess the morphosyntactic abilities of Persian-speaking children. Methods: In this cross-sectional and comparative study, the language samples of 115 typically developing Persian-speaking children aged 30-65 months were audio-recorded during free play and picture description sessions. The Persian Developmental Sentence Score (PDSS) and the mean length of utterance (MLU) were calculated. Pearson correlation and one-way analysis of variance (ANOVA) were used for data analysis. Results: The correlation between PDSS and MLU in morphemes (convergent validity) was significant, with a correlation coefficient of 0.97 (p < 0.001). Cronbach's alpha (α = 0.79) for the grammatical categories and the split-half coefficient (0.86) indicated acceptable internal consistency reliability. Conclusion: The PDSS can be used as a reliable numerical measurement to estimate syntactic development in Persian-speaking children.
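    The convergent-validity figure reported above reduces to a Pearson correlation between per-child PDSS and MLU scores, which can be sketched with toy data. The score values below are hypothetical, not the study's data:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two score vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm**2).sum() * (ym**2).sum()))

pdss = [4.2, 5.1, 6.3, 7.8, 8.0, 9.5]   # hypothetical per-child PDSS scores
mlu = [2.1, 2.6, 3.2, 3.9, 4.1, 4.8]    # hypothetical MLU in morphemes
r = pearson_r(pdss, mlu)
```

    A value of r near 1, as in the study's reported 0.97, indicates that the two measures rank and scale the children almost identically.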

  10. A Visualization Tool for Integrating Research Results at an Underground Mine

    NASA Astrophysics Data System (ADS)

    Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.

    2016-12-01

    Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.

  11. The Science of and Advanced Technology for Cost-Effective Manufacture of High Precision Engineering Products. Volume 4. Thermal Effects on the Accuracy of Numerically Controlled Machine Tools.

    DTIC Science & Technology

    1985-10-01

    Final report, Vol. 4: Thermal Effects on the Accuracy of Numerically Controlled Machine Tools. Prepared by Raghunath Venugopal and M. M. Barash, October 1985.

  12. Mechanism-Based FE Simulation of Tool Wear in Diamond Drilling of SiCp/Al Composites.

    PubMed

    Xiang, Junfeng; Pang, Siqin; Xie, Lijing; Gao, Feinong; Hu, Xin; Yi, Jie; Hu, Fang

    2018-02-07

    The aim of this work is to analyze the micro mechanisms underlying the wear of macroscale tools during diamond machining of SiCp/Al6063 composites and to develop a mechanism-based diamond wear model in relation to the dominant wear behaviors. During drilling of high-volume-fraction SiCp/Al6063 composites containing Cu, the dominant wear mechanisms of the diamond tool involve thermodynamically activated physicochemical wear, due to diamond-graphite transformation catalyzed by Cu in an air atmosphere, and mechanically driven abrasive wear, due to high-frequency scraping of the hard SiC reinforcement on the tool surface. An analytical diamond wear model coupling the Usui abrasive wear model and an Arrhenius extended graphitization wear model was proposed and implemented through a user-defined subroutine for tool wear estimates. Tool wear estimation in diamond drilling of SiCp/Al6063 composites was achieved by incorporating the combined abrasive-chemical tool wear subroutine into a coupled thermomechanical FE model of 3D drilling. The developed drilling FE model for reproducing diamond tool wear was validated for feasibility and reliability by comparing numerically simulated tool wear morphology with experimentally observed results after drilling a hole using brazed polycrystalline diamond (PCD) and chemical vapor deposition (CVD) diamond-coated tools. A fairly good agreement of experimental and simulated results in cutting forces, chip and tool wear morphologies demonstrates that the developed 3D drilling FE model, combined with a subroutine for diamond tool wear estimation, can provide a more accurate analysis not only of cutting forces and chip shape but also of tool wear behavior during drilling of SiCp/Al6063 composites. 
Once validated and calibrated, the developed diamond tool wear model in conjunction with other machining FE models can be easily extended to the investigation of tool wear evolution with various diamond tool geometries and other machining processes in cutting different workpiece materials.

  13. Mechanism-Based FE Simulation of Tool Wear in Diamond Drilling of SiCp/Al Composites

    PubMed Central

    Xiang, Junfeng; Pang, Siqin; Xie, Lijing; Gao, Feinong; Hu, Xin; Yi, Jie; Hu, Fang

    2018-01-01

    The aim of this work is to analyze the micro-mechanisms underlying the wear of macroscale tools during diamond machining of SiCp/Al6063 composites and to develop a mechanism-based diamond wear model that reflects the dominant wear behaviors. During drilling of high volume fraction SiCp/Al6063 composites containing Cu, the dominant wear mechanisms of the diamond tool involve thermodynamically activated physicochemical wear, due to diamond-graphite transformation catalyzed by Cu in an air atmosphere, and mechanically driven abrasive wear, due to high-frequency scraping of the hard SiC reinforcement on the tool surface. An analytical diamond wear model coupling the Usui abrasive wear model and an Arrhenius-type extended graphitization wear model was proposed and implemented through a user-defined subroutine for tool wear estimation. Tool wear in diamond drilling of SiCp/Al6063 composites was estimated by incorporating the combined abrasive-chemical tool wear subroutine into a coupled thermomechanical FE model of 3D drilling. The developed drilling FE model for reproducing diamond tool wear was validated for feasibility and reliability by comparing numerically simulated tool wear morphology with experimentally observed results after drilling a hole using brazed polycrystalline diamond (PCD) and chemical vapor deposition (CVD) diamond-coated tools. A fairly good agreement between experimental and simulated cutting forces, chip morphologies and tool wear morphologies demonstrates that the developed 3D drilling FE model, combined with the diamond tool wear subroutine, can provide a more accurate analysis not only of cutting forces and chip shape but also of tool wear behavior during drilling of SiCp/Al6063 composites. Once validated and calibrated, the developed diamond tool wear model, in conjunction with other machining FE models, can easily be extended to investigate tool wear evolution with various diamond tool geometries and in other machining processes cutting different workpiece materials. PMID:29414839

  14. Investigating the Potential Impacts of Energy Production in the Marcellus Shale Region Using the Shale Network Database and CUAHSI-Supported Data Tools

    NASA Astrophysics Data System (ADS)

    Brazil, L.

    2017-12-01

    The Shale Network's extensive database of water quality observations enables educational experiences about the potential impacts of resource extraction with real data. Through open source tools that are developed and maintained by the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI), researchers, educators, and citizens can access and analyze the very same data that the Shale Network team has used in peer-reviewed publications about the potential impacts of hydraulic fracturing on water. The development of the Shale Network database has been made possible through collection efforts led by an academic team and involving numerous individuals from government agencies, citizen science organizations, and private industry. Thus far, CUAHSI-supported data tools have been used to engage high school students, university undergraduate and graduate students, as well as citizens so that all can discover how energy production impacts the Marcellus Shale region, which includes Pennsylvania and other nearby states. This presentation will describe these data tools, how the Shale Network has used them in developing educational material, and the resources available to learn more.

  15. Cultural Resources Collection Analysis Albeni Falls Project, Northern Idaho.

    DTIC Science & Technology

    1987-01-01

    A wide range of tools including flaked and ground stone was documented: bifacial tools, drills, gravers, scrapers, numerous pestles and mortars, bolas stones, nephrite adzes, notched pebbles or net weights, an atlatl weight, a zoomorphic pestle fragment, and several unique incised and carved artifacts.

  16. Authorship versus "credit" for participation in research: a case study of potential ethical dilemmas created by technical tools used by researchers and claims for authorship by their creators.

    PubMed

    Welker, James A; McCue, Jack D

    2007-01-01

    The distinction between authorship and other forms of credit for contribution to a publication has been a persisting controversy that has resulted in numerous guidelines outlining the expected contributions of those claiming authorship. While there have been flagrant, well-publicized deviations from widely accepted standards, they are largely outnumbered by cases that are not publicity-worthy, and therefore remain known to only those directly involved with the inappropriate conduct. We discuss the definition and ethical requirements of authorship, offer a case example of the authorship debate created by a technical tool at our institution, and review parallels that support and dispute the authorship claims of our software developers. Ultimately, we conclude that development of a technical tool that enables data collection does not adequately substitute for contributions to study design and manuscript preparation for authorship purposes. Unless the designers of such a technical tool prospectively participate as a part of the project, they would not have an adequate understanding of the publication's genesis to defend it publicly and cannot be listed as authors. Therefore, it is incumbent upon project members to invite tool developers to participate at the beginning of such projects, and for tool developers to contribute to study design and manuscript preparation when they desire authorship listings.

  17. Biopython: freely available Python tools for computational molecular biology and bioinformatics.

    PubMed

    Cock, Peter J A; Antao, Tiago; Chang, Jeffrey T; Chapman, Brad A; Cox, Cymon J; Dalke, Andrew; Friedberg, Iddo; Hamelryck, Thomas; Kauff, Frank; Wilczynski, Bartek; de Hoon, Michiel J L

    2009-06-01

    The Biopython project is a mature open source international collaboration of volunteer developers, providing Python libraries for a wide range of bioinformatics problems. Biopython includes modules for reading and writing different sequence file formats and multiple sequence alignments, dealing with 3D macromolecular structures, interacting with common tools such as BLAST, ClustalW and EMBOSS, accessing key online databases, and providing numerical methods for statistical learning. Biopython is freely available, with documentation and source code at www.biopython.org, under the Biopython license.
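
    The sequence-format handling described above can be illustrated with a minimal, stdlib-only sketch of the kind of routine Biopython wraps behind Bio.SeqIO and Bio.Seq; this is an illustration of the concept, not Biopython's implementation, and the record names are hypothetical.

```python
def parse_fasta(text):
    """Yield (header, sequence) pairs from FASTA-formatted text."""
    header, chunks = None, []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(">"):
            if header is not None:
                yield header, "".join(chunks)
            header, chunks = line[1:], []
        elif line:
            chunks.append(line)
    if header is not None:
        yield header, "".join(chunks)

_COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

def reverse_complement(seq):
    """Reverse-complement a DNA sequence."""
    return seq.translate(_COMPLEMENT)[::-1]

records = dict(parse_fasta(">seq1\nATGCCG\nTTA\n>seq2\nGGGAAA\n"))
print(records["seq1"])                      # ATGCCGTTA
print(reverse_complement(records["seq1"]))  # TAACGGCAT
```

    In Biopython itself the equivalent one-liners are Bio.SeqIO.parse(handle, "fasta") and Seq(...).reverse_complement().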

  18. Making interdisciplinary solid Earth modeling and analysis tools accessible in a diverse undergraduate and graduate classroom

    NASA Astrophysics Data System (ADS)

    Becker, T. W.

    2011-12-01

    I present results from ongoing, NSF-CAREER funded educational and research efforts that center on making numerical tools in seismology and geodynamics more accessible to a broader audience. The goal is not only to train students in quantitative, interdisciplinary research, but also to make methods more easily accessible to practitioners across disciplines. I describe the two main efforts that were funded: the Solid Earth Research and Teaching Environment (SEATREE, geosys.usc.edu/projects/seatree/) and a new Numerical Methods class. SEATREE is a modular and user-friendly software framework that facilitates using solid Earth research tools in the undergraduate and graduate classroom and in interdisciplinary scientific collaboration. We use only open-source software, and most programming is done in the Python language. We strive to use modern software design and development concepts while remaining compatible with traditional scientific coding and existing legacy software. Our goal is to provide a fully contained, yet transparent package that lets users operate in an easy, graphically supported "black box" mode, while also allowing them to look under the hood, for example to conduct numerous forward models to explore parameter space. SEATREE currently has several implemented modules, including global mantle flow, 2D phase-velocity tomography, and 2D mantle convection, and was used at the University of Southern California, Los Angeles, and in a 2010 CIDER summer school tutorial. SEATREE was developed in collaboration with engineering and computer science undergraduate students, some of whom have gone on to work on Earth Science projects. In the long run, we envision SEATREE contributing to new ways of sharing scientific research and making (numerical) experiments truly reproducible again. The other project is a set of lecture notes and Matlab exercises on Numerical Methods in solid Earth science, focusing on finite difference and finite element methods.
The class has been taught several times at USC to a broad audience of Earth science students with very diverse levels of exposure to math and physics. It is our goal to bring everyone up to speed and empower students, and we have seen structural geology students with very little exposure to math go on to construct their own numerical models of pTt-paths in a core-complex setting. This exemplifies the goal of teaching students to both be able to put together simple numerical models from scratch, and, perhaps more importantly, to truly understand the basic concepts, capabilities, and pitfalls of the more powerful community codes that are being increasingly used. SEATREE and the Numerical Methods class material are freely available at geodynamics.usc.edu/~becker.

  19. Tools for controlling protein interactions with light

    PubMed Central

    Tucker, Chandra L.; Vrana, Justin D.; Kennedy, Matthew J.

    2014-01-01

    Genetically-encoded actuators that allow control of protein-protein interactions with light, termed ‘optical dimerizers’, are emerging as new tools for experimental biology. In recent years, numerous new and versatile dimerizer systems have been developed. Here we discuss the design of optical dimerizer experiments, including choice of a dimerizer system, photoexcitation sources, and coordinate use of imaging reporters. We provide detailed protocols for experiments using two dimerization systems we previously developed, CRY2/CIB and UVR8/UVR8, for use controlling transcription, protein localization, and protein secretion with light. Additionally, we provide instructions and software for constructing a pulse-controlled LED light device for use in experiments requiring extended light treatments. PMID:25181301

  20. Development and application of theoretical models for Rotating Detonation Engine flowfields

    NASA Astrophysics Data System (ADS)

    Fievisohn, Robert

    As turbine and rocket engine technology matures, performance increases between successive generations of engine development are becoming smaller. One means of accomplishing significant gains in thermodynamic performance and power density is to use detonation-based heat release instead of deflagration. This work is focused on developing and applying theoretical models to aid in the design and understanding of Rotating Detonation Engines (RDEs). In an RDE, a detonation wave travels circumferentially along the bottom of an annular chamber where continuous injection of fresh reactants sustains the detonation wave. RDEs are currently being designed, tested, and studied as a viable option for developing a new generation of turbine and rocket engines that make use of detonation heat release. One of the main challenges in the development of RDEs is to understand the complex flowfield inside the annular chamber. While simplified models are desirable for obtaining timely performance estimates for design analysis, one-dimensional models may not be adequate as they do not provide flow structure information. In this work, a two-dimensional physics-based model is developed, which is capable of modeling the curved oblique shock wave, exit swirl, counter-flow, detonation inclination, and varying pressure along the inflow boundary. This is accomplished by using a combination of shock-expansion theory, Chapman-Jouguet detonation theory, the Method of Characteristics (MOC), and other compressible flow equations to create a shock-fitted numerical algorithm and generate an RDE flowfield. This novel approach provides a numerically efficient model that can provide performance estimates as well as details of the large-scale flow structures in seconds on a personal computer. Results from this model are validated against high-fidelity numerical simulations that may require a high-performance computing framework to provide similar performance estimates. 
This work provides a designer a new tool to conduct large-scale parametric studies to optimize a design space before conducting computationally-intensive, high-fidelity simulations that may be used to examine additional effects. The work presented in this thesis not only bridges the gap between simple one-dimensional models and high-fidelity full numerical simulations, but it also provides an effective tool for understanding and exploring RDE flow processes.
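
    One ingredient of the model above, Chapman-Jouguet detonation theory, can be sketched in a few lines. Under the standard one-gamma perfect-gas assumption, the CJ Mach number is M_CJ = sqrt(1 + q~) + sqrt(q~) with q~ = (gamma^2 - 1) q / (2 gamma R T1); the input values below (gamma = 1.2, q = 3 MJ/kg, R = 300 J/kg-K, T1 = 300 K) are illustrative placeholders, not numbers from the thesis.

```python
import math

def cj_speed(gamma, q, R, T1):
    """Chapman-Jouguet detonation speed (m/s) for a perfect gas.

    gamma: ratio of specific heats; q: heat release per unit mass (J/kg);
    R: specific gas constant (J/kg-K); T1: upstream temperature (K).
    """
    a1 = math.sqrt(gamma * R * T1)                 # upstream sound speed
    qt = (gamma**2 - 1.0) * q / (2.0 * a1**2)      # nondimensional heat release
    m_cj = math.sqrt(1.0 + qt) + math.sqrt(qt)     # CJ Mach number
    return m_cj * a1

D = cj_speed(gamma=1.2, q=3.0e6, R=300.0, T1=300.0)
print(f"D_CJ ~ {D:.0f} m/s")   # roughly 1.7 km/s, the right order for fuel-air mixtures
```

    With q = 0 the formula correctly degenerates to the upstream sound speed (M_CJ = 1), a quick sanity check on the algebra.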

  1. Leveraging e-Science infrastructure for electrochemical research.

    PubMed

    Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F

    2011-08-28

    As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, ranging from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories: two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.
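
    The Fourier-voltammetry ingredient mentioned above amounts to decomposing a periodic current response into harmonics of the excitation frequency. A minimal sketch with a plain DFT (in practice numpy.fft would be used; the synthetic signal is purely illustrative):

```python
import math

def harmonic_amplitude(samples, k):
    """Amplitude of the k-th harmonic of a signal sampled over one period."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

n = 256
# Synthetic current: fundamental plus a weak 2nd harmonic, the kind of
# nonlinearity marker Fourier voltammetry looks for.
signal = [1.0 * math.sin(2 * math.pi * i / n) +
          0.2 * math.sin(4 * math.pi * i / n) for i in range(n)]
print(round(harmonic_amplitude(signal, 1), 3))   # 1.0
print(round(harmonic_amplitude(signal, 2), 3))   # 0.2
```

    Higher harmonics are nearly free of capacitive background current, which is why the technique resolves faradaic kinetics so well.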

  2. A strip chart recorder pattern recognition tool kit for Shuttle operations

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.

    1993-01-01

    During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers to enhance utility, and to identify capabilities needed in a more general-purpose tool kit.
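
    Of the pattern recognition techniques listed, cross-correlation is the simplest to sketch: slide a step-shaped template along the telemetry trace and score each offset by normalized cross-correlation. This is an illustration of the technique, not the NASA tool kit itself; the trace and template are made up.

```python
import math

def norm_xcorr(trace, template):
    """Normalized cross-correlation score of `template` at each offset of `trace`."""
    n = len(template)
    tm = sum(template) / n
    tdev = [t - tm for t in template]
    tnorm = math.sqrt(sum(d * d for d in tdev)) or 1.0
    scores = []
    for i in range(len(trace) - n + 1):
        win = trace[i:i + n]
        wm = sum(win) / n
        wdev = [w - wm for w in win]
        wnorm = math.sqrt(sum(d * d for d in wdev)) or 1.0  # guard flat windows
        scores.append(sum(a * b for a, b in zip(wdev, tdev)) / (wnorm * tnorm))
    return scores

# Flat current trace with a step (equipment activation) at sample 6:
trace = [1.0] * 6 + [4.0] * 6
template = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # idealized step edge
scores = norm_xcorr(trace, template)
print(scores.index(max(scores)))   # 3: window [3:9] straddles the step perfectly
```

    The normalization makes the score insensitive to signal offset and gain, which matters when the same signature appears at different current levels.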

  3. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free Electron Lasers (FELs) are a solution for providing intense, coherent and bright radiation in the hard X-ray regime. Due to the low wall-plug efficiency of FEL facilities, it is crucial to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in the FEL process has been the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on finite difference time domain / particle in cell (FDTD/PIC) methods in a Lorentz-boosted coordinate system, which enables full-wave simulation of the FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without any of the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, the optical wavelength and the undulator period to the same order. Consequently, FDTD/PIC simulations, in conjunction with efficient parallelization techniques, make full-wave simulation feasible using available computational resources. Several examples of free electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.
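
    The scale-matching argument behind the boosted frame can be made concrete with the standard undulator resonance relation. In the lab frame the radiation wavelength lambda_r = lambda_u (1 + K^2/2) / (2 gamma^2) is ~gamma^2 shorter than the undulator period lambda_u; in the bunch frame the undulator contracts to lambda_u / gamma while the radiation dilates to ~2 gamma lambda_r, so both become comparable. The numbers below (3 cm period, K = 1, gamma = 10^4) are illustrative, not parameters from the paper.

```python
def fel_scales(lambda_u, K, gamma):
    """Return (lab radiation wavelength, boosted undulator period,
    boosted radiation wavelength), all in metres."""
    lambda_r = lambda_u * (1.0 + K**2 / 2.0) / (2.0 * gamma**2)  # lab frame
    undulator_boosted = lambda_u / gamma        # Lorentz length contraction
    radiation_boosted = 2.0 * gamma * lambda_r  # relativistic Doppler shift
    return lambda_r, undulator_boosted, radiation_boosted

lab_r, und_b, rad_b = fel_scales(lambda_u=0.03, K=1.0, gamma=1.0e4)
print(f"lab radiation wavelength : {lab_r:.2e} m")   # ~2e-10 m (hard X-ray)
print(f"boosted undulator period : {und_b:.2e} m")
print(f"boosted radiation length : {rad_b:.2e} m")   # same order as the above
```

    With the two boosted scales within a factor (1 + K^2/2) of each other, a single FDTD grid can resolve both, which is the point of the coordinate transformation.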

  4. Simulation of router action on a lathe to test the cutting tool performance in edge-trimming of graphite/epoxy composite

    NASA Astrophysics Data System (ADS)

    Ramulu, M.; Rogers, E.

    1994-04-01

    The predominant machining application with graphite/epoxy composite materials in the aerospace industry is peripheral trimming. The computer numerically controlled (CNC) high-speed routers required for edge trimming are generally scheduled for production work in industry and are not available for extensive cutter testing. Therefore, an experimental method of simulating the conditions of peripheral trimming on a lathe is developed in this paper. The validity of the test technique is demonstrated by conducting carbide tool wear tests under dry cutting conditions. The experimental results are analyzed to characterize the wear behavior of carbide cutting tools in machining these composite materials.

  5. S&T converging trends in dealing with disaster: A review on AI tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Abu Bakar, E-mail: abakarh@usim.edu.my; Isa, Mohd Hafez Mohd.

    Science and Technology (S&T) has been able to help mankind solve or minimize problems when they arise. Different methodologies, techniques and tools were developed or used for specific cases by researchers, engineers and scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases, such as flash flood, earthquakes, workplace accidents, faults in the aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in the nuclear industry, have been studied. This paper looked at those cases, and their results showed that nearly 60% use artificial intelligence (AI) as a tool. This paper also presents a review that will help young researchers decide which types of AI tools to select, thus indicating the future trends in S&T.

  6. TiConverter: A training image converting tool for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Fadlelmula F., Mohamed M.; Killough, John; Fraim, Michael

    2016-11-01

    TiConverter is a tool developed to ease the application of multiple-point geostatistics, whether with the open-source Stanford Geostatistical Modeling Software (SGeMS) or with other available commercial software. TiConverter has a user-friendly interface and allows the conversion of 2D training images into numerical representations in four different file formats without the need for additional code writing. These are the ASCII (.txt), the geostatistical software library (GSLIB) (.txt), the Isatis (.dat), and the VTK formats. It performs the conversion based on the RGB color system. In addition, TiConverter offers several useful tools, including image resizing, smoothing, and segmenting. The purpose of this study is to introduce TiConverter and to demonstrate its application and advantages with several examples from the literature.
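
    The RGB-based conversion that TiConverter performs can be sketched as a nearest-reference-color classification: each pixel of the training image is mapped to the integer facies code whose reference color is closest, and the grid is written out as plain ASCII. The reference colors and one-value-per-line output layout below are assumptions for illustration, not TiConverter's actual settings.

```python
# Hypothetical facies palette: code -> reference RGB color.
FACIES_COLORS = {0: (255, 255, 255),   # background
                 1: (0, 0, 0),         # channel
                 2: (255, 0, 0)}       # levee

def nearest_facies(pixel):
    """Facies code whose reference color is nearest in squared RGB distance."""
    return min(FACIES_COLORS,
               key=lambda f: sum((p - c) ** 2
                                 for p, c in zip(pixel, FACIES_COLORS[f])))

def image_to_ascii(rows):
    """rows: 2D list of (r, g, b) tuples -> one facies code per output line."""
    return "\n".join(str(nearest_facies(px)) for row in rows for px in row)

image = [[(250, 250, 250), (10, 5, 0)],
         [(240, 20, 30), (0, 0, 0)]]
print(image_to_ascii(image))
```

    Nearest-color matching also makes the conversion robust to the slight color noise that scanned or resized training images typically carry.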

  7. S&T converging trends in dealing with disaster: A review on AI tools

    NASA Astrophysics Data System (ADS)

    Hasan, Abu Bakar; Isa, Mohd. Hafez Mohd.

    2016-01-01

    Science and Technology (S&T) has been able to help mankind solve or minimize problems when they arise. Different methodologies, techniques and tools were developed or used for specific cases by researchers, engineers and scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases, such as flash flood, earthquakes, workplace accidents, faults in the aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in the nuclear industry, have been studied. This paper looked at those cases, and their results showed that nearly 60% use artificial intelligence (AI) as a tool. This paper also presents a review that will help young researchers decide which types of AI tools to select, thus indicating the future trends in S&T.

  8. Mocking the weak lensing universe: The LensTools Python computing package

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-10-01

    We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community, we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in Python (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We made the LensTools code available on the Python Package Index and published its documentation at http://lenstools.readthedocs.io.

  9. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
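
    The translation idea described above, from a high-level specification to PLC ladder logic, can be sketched with a toy generator. XIC (examine-if-closed), XIO (examine-if-open) and OTE (output energize) are standard ladder instruction mnemonics, but the spec format and this translator are illustrative inventions, not KSC's actual tool.

```python
def spec_to_ladder(spec):
    """Translate a tiny boolean spec into ladder-logic-style rungs.

    spec: list of (output, [(input, is_normally_open), ...]) pairs.
    Each rung is a series of examine instructions driving one coil.
    """
    rungs = []
    for output, conditions in spec:
        contacts = " ".join(
            f"{'XIC' if normally_open else 'XIO'} {name}"
            for name, normally_open in conditions)
        rungs.append(f"{contacts} OTE {output}")
    return "\n".join(rungs)

# "Energize VALVE_OPEN when PERMISSIVE is set and ESTOP is not tripped."
spec = [("VALVE_OPEN", [("PERMISSIVE", True), ("ESTOP", False)])]
print(spec_to_ladder(spec))   # XIC PERMISSIVE XIO ESTOP OTE VALVE_OPEN
```

    Generating rungs mechanically from one reviewed spec is exactly where the reliability gain over hand-written ladder logic comes from.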

  10. On the bistable zone of milling processes

    PubMed Central

    Dombovari, Zoltan; Stepan, Gabor

    2015-01-01

    A modal-based model of milling machine tools subjected to time-periodic nonlinear cutting forces is introduced. The model describes the phenomenon of bistability for certain cutting parameters. In engineering, these parameter domains are referred to as unsafe zones, where steady-state milling may switch to chatter for certain perturbations. In mathematical terms, these are the parameter domains where the periodic solution of the corresponding nonlinear, time-periodic delay differential equation is linearly stable, but its domain of attraction is limited due to the existence of an unstable quasi-periodic solution emerging from a secondary Hopf bifurcation. A semi-numerical method is presented to identify the borders of these bistable zones by tracking the motion of the milling tool edges as they might leave the surface of the workpiece during the cutting operation. This requires the tracking of unstable quasi-periodic solutions and the checking of their grazing to a time-periodic switching surface in the infinite-dimensional phase space. As the parameters of the linear structural behaviour of the tool/machine tool system can be obtained by means of standard modal testing, the developed numerical algorithm provides efficient support for the design of milling processes with quick estimates of those parameter domains where chatter can still appear in spite of setting the parameters into linearly stable domains. PMID:26303918

  11. Low Order Modeling Tools for Preliminary Pressure Gain Combustion Benefits Analyses

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.

    2012-01-01

    Pressure gain combustion (PGC) offers the promise of higher thermodynamic cycle efficiency and greater specific power in propulsion and power systems. This presentation describes a model, developed under a cooperative agreement between NASA and AFRL, for preliminarily assessing the performance enhancement and preliminary size requirements of PGC components either as stand-alone thrust producers or coupled with surrounding turbomachinery. The model is implemented in the Numerical Propulsion Simulation System (NPSS) environment allowing various configurations to be examined at numerous operating points. The validated model is simple, yet physics-based. It executes quickly in NPSS, yet produces realistic results.

  12. Spacecraft Charging Calculations: NASCAP-2K and SEE Spacecraft Charging Handbook

    NASA Technical Reports Server (NTRS)

    Davis, V. A.; Neergaard, L. F.; Mandell, M. J.; Katz, I.; Gardner, B. M.; Hilton, J. M.; Minor, J.

    2002-01-01

    For fifteen years, the NASA/Air Force Charging Analyzer Program for Geosynchronous Orbits (NASCAP/GEO) has been the workhorse of spacecraft charging calculations. Two new tools, the Space Environment and Effects (SEE) Spacecraft Charging Handbook (recently released) and Nascap-2K (under development), use improved numerical techniques and modern user interfaces to tackle the same problem. The SEE Spacecraft Charging Handbook provides first-order, lower-resolution solutions, while Nascap-2K provides higher-resolution results appropriate for detailed analysis. This paper illustrates how the improvements in the numerical techniques affect the results.

  13. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    NASA Astrophysics Data System (ADS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
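
    A common building block in such wear models (also cited in record 12 above) is the Usui wear-rate equation, dW/dt = A * sigma_n * V_s * exp(-B / T), with contact normal stress sigma_n, sliding speed V_s and absolute contact temperature T. A minimal sketch follows; the constants A and B and the operating values are placeholders, not calibrated data for WC-Co / AISI 4340.

```python
import math

def usui_wear_rate(sigma_n, v_s, T, A=1.0e-12, B=5000.0):
    """Usui wear rate: A * sigma_n * v_s * exp(-B / T).

    sigma_n: contact normal stress (Pa); v_s: sliding speed (m/s);
    T: absolute contact temperature (K); A, B: empirical constants.
    """
    return A * sigma_n * v_s * math.exp(-B / T)

def integrate_wear(sigma_n, v_s, T, t_total, dt=1.0):
    """Accumulate the wear measure over cutting time t_total (s) by Euler steps."""
    w, t = 0.0, 0.0
    while t < t_total:
        w += usui_wear_rate(sigma_n, v_s, T) * dt
        t += dt
    return w

w = integrate_wear(sigma_n=1.5e9, v_s=2.0, T=1100.0, t_total=60.0)
print(f"wear measure after 60 s: {w:.3e}")
```

    The exponential temperature term is what makes predicted wear so sensitive to the cutting-zone temperature field, which is why wear models are coupled to thermomechanical FE solutions.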

  14. Microcomputer-Based Access to Machine-Readable Numeric Databases.

    ERIC Educational Resources Information Center

    Wenzel, Patrick

    1988-01-01

    Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)

  15. Efficient hybrid-symbolic methods for quantum mechanical calculations

    NASA Astrophysics Data System (ADS)

    Scott, T. C.; Zhang, Wenxing

    2015-06-01

    We present hybrid symbolic-numerical tools to generate optimized numerical code, for rapid prototyping and fast numerical computation, starting from a computer algebra system (CAS) and tailored to any given quantum mechanical problem. Although a major focus concerns the quantum chemistry methods of H. Nakatsuji, which have yielded successful and very accurate eigensolutions for small atoms and molecules, the tools are general and may be applied to any basis-set calculation with a variational principle applied to its linear and non-linear parameters.

  16. Mathematical modelling and numerical simulation of forces in milling process

    NASA Astrophysics Data System (ADS)

    Turai, Bhanu Murthy; Satish, Cherukuvada; Prakash Marimuthu, K.

    2018-04-01

    Machining of material by milling induces forces that act on the workpiece and, in turn, on the cutting tool. The forces involved in the milling process can be quantified, and mathematical models help to predict them. A great deal of research has been carried out in this area in the past few decades. The current research aims at developing a mathematical model to predict the forces, at different factor levels, that arise in machining of Aluminium 6061 alloy. Finite element analysis was used to develop an FE model to predict the cutting forces, and simulations were run for varying cutting conditions. Experiments were designed using the Taguchi method: an L9 orthogonal array was constructed and the output was measured for each experiment. These results were used to develop the mathematical model.
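
    A common textbook form of such a mechanistic milling-force model (not necessarily the exact model of this paper) takes per-tooth chip thickness h(phi) = f_t * sin(phi) and linear cutting/edge coefficients: F_t = K_tc * a_p * h + K_te * a_p and F_r = K_rc * a_p * h + K_re * a_p. The coefficients below are hypothetical placeholders for an Al 6061-like case.

```python
import math

def milling_forces(phi_deg, f_t=0.1, a_p=2.0,
                   K_tc=700.0, K_te=20.0, K_rc=250.0, K_re=8.0):
    """Tangential and radial force (N) on one tooth at immersion angle phi.

    f_t: feed per tooth (mm); a_p: axial depth of cut (mm);
    K_*c: cutting coefficients (N/mm^2); K_*e: edge coefficients (N/mm).
    """
    h = f_t * math.sin(math.radians(phi_deg))   # instantaneous chip thickness
    if h <= 0.0:                                # tooth out of cut
        return 0.0, 0.0
    F_t = K_tc * a_p * h + K_te * a_p
    F_r = K_rc * a_p * h + K_re * a_p
    return F_t, F_r

for phi in (0, 45, 90, 135):
    Ft, Fr = milling_forces(phi)
    print(f"phi={phi:3d} deg  F_t={Ft:6.1f} N  F_r={Fr:5.1f} N")
```

    In practice the coefficients K_tc, K_te, K_rc, K_re are identified from measured average forces over a set of feeds, which is exactly the role a Taguchi-designed experiment plays.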

  17. On Dynamics of Spinning Structures

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.; Ibrahim, A.

    2012-01-01

This paper provides details of developments pertaining to vibration analysis of gyroscopic systems, which involves a finite element structural discretization followed by solution of the resulting matrix eigenvalue problem by a progressive, accelerated simultaneous iteration technique. Coriolis, centrifugal, and geometric stiffness matrices are derived for shell and line elements, followed by eigensolution details and the solution of representative problems that demonstrate the efficacy of the numerical procedures and tools developed.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.

  19. Comparison of methods for developing the dynamics of rigid-body systems

    NASA Technical Reports Server (NTRS)

    Ju, M. S.; Mansour, J. M.

    1989-01-01

    Several approaches for developing the equations of motion for a three-degree-of-freedom PUMA robot were compared on the basis of computational efficiency (i.e., the number of additions, subtractions, multiplications, and divisions). Of particular interest was the investigation of the use of computer algebra as a tool for developing the equations of motion. Three approaches were implemented algebraically: Lagrange's method, Kane's method, and Wittenburg's method. Each formulation was developed in absolute and relative coordinates. These six cases were compared to each other and to a recursive numerical formulation. The results showed that all of the formulations implemented algebraically required fewer calculations than the recursive numerical algorithm. The algebraic formulations required fewer calculations in absolute coordinates than in relative coordinates. Each of the algebraic formulations could be simplified, using patterns from Kane's method, to yield the same number of calculations in a given coordinate system.

  20. The Aviation System Monitoring and Modeling (ASMM) Project: A Documentation of its History and Accomplishments: 1999-2005

    NASA Technical Reports Server (NTRS)

    Statler, Irving C. (Editor)

    2007-01-01

The Aviation System Monitoring and Modeling (ASMM) Project was one of the projects within NASA's Aviation Safety Program from 1999 through 2005. The objective of the ASMM Project was to develop the technologies to enable the aviation industry to undertake a proactive approach to the management of its system-wide safety risks. The ASMM Project entailed four interdependent elements: (1) Data Analysis Tools Development - develop tools to convert numerical and textual data into information; (2) Intramural Monitoring - test and evaluate the data analysis tools in operational environments; (3) Extramural Monitoring - gain insight into the aviation system performance by surveying its front-line operators; and (4) Modeling and Simulations - provide reliable predictions of the system-wide hazards, their causal factors, and the operational risks that may result from the introduction of new technologies, new procedures, or new operational concepts. This report is a documentation of the history of this highly successful project and of its many accomplishments and contributions to improved safety of the aviation system.

  1. The Use of Elgg Social Networking Tool for Students' Project Peer-Review Activity

    ERIC Educational Resources Information Center

    Samardzija, Ana Coric; Bubas, Goran

    2014-01-01

    Numerous e-learning 2.0 studies have advocated the use of social networking sites for educational purposes, but only a few of them have observed social networking sites as an instrument for specific learner skill development. This paper discusses a study which addresses the motivation and challenges associated with the introduction of the social…

  2. Assessing Sensitivity to Unmeasured Confounding Using a Simulated Potential Confounder

    ERIC Educational Resources Information Center

    Carnegie, Nicole Bohme; Harada, Masataka; Hill, Jennifer L.

    2016-01-01

    A major obstacle to developing evidenced-based policy is the difficulty of implementing randomized experiments to answer all causal questions of interest. When using a nonexperimental study, it is critical to assess how much the results could be affected by unmeasured confounding. We present a set of graphical and numeric tools to explore the…

  3. A Web Site that Provides Resources for Assessing Students' Statistical Literacy, Reasoning and Thinking

    ERIC Educational Resources Information Center

    Garfield, Joan; delMas, Robert

    2010-01-01

    The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level but resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…

  4. Spray Cooling Processes for Space Applications

    NASA Technical Reports Server (NTRS)

    Kizito, John P.; VanderWal, Randy L.; Berger, Gordon; Tryggvason, Gretar

    2004-01-01

    The present paper reports ongoing work to develop numerical and modeling tools used to design efficient and effective spray cooling processes and to determine characteristic non-dimensional parametric dependence for practical fluids and conditions. In particular, we present data that will delineate conditions towards control of the impingement dynamics of droplets upon a heated substrate germane to practical situations.

  5. 3DHYDROGEOCHEM: A 3-DIMENSIONAL MODEL OF DENSITY-DEPENDENT SUBSURFACE FLOW AND THERMAL MULTISPECIES-MULTICOMPONENT HYDROGEOCHEMICAL TRANSPORT (EPA/600/SR-98/159)

    EPA Science Inventory

    This report presents a three-dimensional finite-element numerical model designed to simulate chemical transport in subsurface systems with temperature effect taken into account. The three-dimensional model is developed to provide (1) a tool of application, with which one is able ...

  6. Personalized Mobile English Vocabulary Learning System Based on Item Response Theory and Learning Memory Cycle

    ERIC Educational Resources Information Center

    Chen, C. M.; Chung, C. J.

    2008-01-01

    Since learning English is very popular in non-English speaking countries, developing modern assisted-learning tools that support effective English learning is a critical issue in the English-language education field. Learning English involves memorization and practice of a large number of vocabulary words and numerous grammatical structures.…

  7. Cognitive Skills, Domain Knowledge, and Self-Efficacy: Effects on Spreadsheet Quality

    ERIC Educational Resources Information Center

    Adkins, Joni K.

    2011-01-01

    Numerous studies have shown that spreadsheets used in companies often have errors which may affect the quality of the decisions made with these tools. Many businesses are unaware or choose to ignore the risks associated with spreadsheet use. The intent of this study was to learn more about the characteristics of spreadsheet end user developers,…

  8. The Development of a Corpus-Based Tool for Exploring Domain-Specific Collocational Knowledge in English

    ERIC Educational Resources Information Center

    Huang, Ping-Yu; Chen, Chien-Ming; Tsao, Nai-Lung; Wible, David

    2015-01-01

    Since it was published, Coxhead's (2000) Academic Word List (AWL) has been frequently used in English for academic purposes (EAP) classrooms, included in numerous teaching materials, and re-examined in light of various domain-specific corpora. Although well-received, the AWL has been criticized for ignoring some important facts that words still…

  9. Lessons Learned, Innovative Practices, and Emerging Trends: Technology for Teacher Education and Professional Development

    ERIC Educational Resources Information Center

    Donohue, Chip; Fox, Selena

    2012-01-01

    Since 1999, the authors have written numerous articles and books, given hundreds of presentations, served on national eLearning groups, and created new international online programs, all while paying careful attention to the trends, issues, and best practices in the effective use of technology tools and distance learning methods. In this article,…

  10. Biomolecular dynamics by computer analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.

    1984-01-01

As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well-developed study of the hydrogen-bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.

  11. The Development and Implementation of U-Msg for College Students' English Learning

    ERIC Educational Resources Information Center

    Cheng, Yuh-Ming; Kuo, Sheng-Huang; Lou, Shi-Jer; Shih, Ru-Chu

    2016-01-01

    With the advance of mobile technology, mobile devices have become more portable and powerful with numerous useful tools in daily life. Thus, mobile learning has been widely involved in e-learning studies. Many studies point out that it is important to integrate both pedagogical and technical strengths of mobile technology into learning settings.…

  12. Cellular fatty acid analysis as a potential tool for predicting mosquitocidal activity of Bacillus sphaericus strains.

    PubMed Central

    Frachon, E; Hamon, S; Nicolas, L; de Barjac, H

    1991-01-01

    Gas-liquid chromatography of fatty acid methyl esters and numerical analysis were carried out with 114 Bacillus sphaericus strains. Since only two clusters harbored mosquitocidal strains, this technique could be developed in screening programs to limit bioassays on mosquito larvae. It also allows differentiation of highly homologous strains. PMID:1781697

  13. A Novel Cylindrical Representation for Characterizing Intrinsic Properties of Protein Sequences.

    PubMed

    Yu, Jia-Feng; Dou, Xiang-Hua; Wang, Hong-Bo; Sun, Xiao; Zhao, Hui-Ying; Wang, Ji-Hua

    2015-06-22

The composition and sequence order of amino acid residues are the two most important characteristics used to describe a protein sequence. Graphical representations facilitate visualization of biological sequences and produce biologically useful numerical descriptors. In this paper, we propose a novel cylindrical representation obtained by placing the 20 amino acid residue types in a circle and sequence positions along the z axis. This representation allows visualization of the composition and sequence order of amino acids at the same time. Ten numerical descriptors and one weighted numerical descriptor have been developed to quantitatively describe intrinsic properties of protein sequences on the basis of the cylindrical model. Their application to similarity/dissimilarity analysis of nine ND5 proteins indicated that these numerical descriptors are more effective than several classical numerical matrices. Thus, the cylindrical representation provides a useful new tool for visualizing and characterizing protein sequences. An online server is available at http://biophy.dzu.edu.cn:8080/CNumD/input.jsp
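A minimal sketch of the cylindrical mapping described above might look as follows: the 20 residue types are spaced evenly on a unit circle and the sequence position supplies the z coordinate. The centroid descriptor at the end is an illustrative assumption, not one of the paper's ten descriptors:

```python
import math

# Hypothetical sketch of the cylindrical representation: each residue is
# placed at an angle fixed by its type and a height fixed by its position.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
ANGLE = {aa: 2 * math.pi * i / 20 for i, aa in enumerate(AMINO_ACIDS)}

def cylindrical_points(sequence):
    """Map each residue to (x, y, z) on a cylinder of unit radius."""
    return [(math.cos(ANGLE[aa]), math.sin(ANGLE[aa]), k)
            for k, aa in enumerate(sequence)]

def mean_point(points):
    """A toy numerical descriptor: the centroid of the 3-D point cloud."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))
```

Comparing such descriptors between two sequences then gives a simple alignment-free similarity measure of the kind the abstract evaluates.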

  14. Development of Numerical Tools for the Investigation of Plasma Detachment from Magnetic Nozzles

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2007-01-01

A multidimensional numerical simulation framework aimed at investigating the process of plasma detachment from a magnetic nozzle is introduced. An existing numerical code, based on a magnetohydrodynamic formulation of the plasma flow equations that accounts for various dispersive and dissipative processes in plasmas, was significantly enhanced to allow modeling of axisymmetric domains containing three-dimensional momentum and magnetic flux vectors. A separate magnetostatic solver was used to simulate the applied magnetic field topologies found in various nozzle experiments. Numerical results from a magnetic diffusion test problem in which all three components of the magnetic field were present exhibit excellent quantitative agreement with the analytical solution, and the lack of numerical instabilities due to fluctuations in the value of ∇·B indicates that the conservative MHD framework with dissipative effects is well suited for multi-dimensional analysis of magnetic nozzles. Further studies will focus on modeling literature experiments, both to validate the code and to extract physical insight regarding the mechanisms driving detachment.
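The ∇·B diagnostic mentioned above can be illustrated with a toy central-difference divergence check on a 2-D grid. The actual code is a multidimensional MHD solver; this sketch only shows the kind of discrete operator that would be monitored, with grid and fields as illustrative assumptions:

```python
# Toy div(B) diagnostic: for a divergence-free magnetic field the discrete
# divergence should stay near machine zero at interior grid points.
def div_b(bx, by, dx=1.0, dy=1.0):
    """Central-difference divergence of a 2-D vector field (nested lists)."""
    ny, nx = len(bx), len(bx[0])
    out = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            out[j][i] = ((bx[j][i + 1] - bx[j][i - 1]) / (2 * dx)
                         + (by[j + 1][i] - by[j - 1][i]) / (2 * dy))
    return out
```

A uniform field gives zero divergence everywhere; a field whose x component grows linearly in x gives a constant divergence of one, which such a diagnostic would flag if B were meant to be solenoidal.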

  15. VCS: Tool for Visualizing Copy Number Variation and Single Nucleotide Polymorphism.

    PubMed

    Kim, HyoYoung; Sung, Samsun; Cho, Seoae; Kim, Tae-Hun; Seo, Kangseok; Kim, Heebal

    2014-12-01

Copy number variation (CNV) and single nucleotide polymorphism (SNP) data are useful genetic resources for understanding complex phenotypes and disease susceptibility. Although thousands of CNVs and SNPs are currently available in public databases, they are difficult to use for analyses without visualization tools. We developed a web-based tool called VCS (visualization of CNV or SNP) to visualize detected CNVs and SNPs. The VCS tool helps users easily interpret the biological meaning of the numerical values of CNVs and SNPs. VCS provides six visualization tools: i) the enrichment of genome contents in CNV; ii) the physical distribution of CNVs or SNPs on chromosomes; iii) the distribution of log2 ratios of CNVs of interest; iv) the number of CNVs or SNPs per binning unit; v) the distribution of homozygosity of SNP genotypes; and vi) a cytomap of genes within CNV or SNP regions.
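Two of the summaries listed above (per-bin counts and log2 ratio classification) can be sketched in a few lines. The bin size and gain/loss thresholds below are common illustrative choices, not values documented for VCS:

```python
# Hypothetical VCS-style summaries: count CNV calls per fixed-size
# chromosome bin, and bucket log2 ratios into loss/neutral/gain.
def cnvs_per_bin(positions, bin_size=1_000_000):
    """Count variant positions falling in each bin of `bin_size` bases."""
    counts = {}
    for pos in positions:
        counts[pos // bin_size] = counts.get(pos // bin_size, 0) + 1
    return counts

def classify_log2(ratio, loss=-0.3, gain=0.3):
    """Assumed thresholds: <= loss is a deletion, >= gain a duplication."""
    if ratio <= loss:
        return "loss"
    if ratio >= gain:
        return "gain"
    return "neutral"
```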

  16. The numerical simulation tool for the MAORY multiconjugate adaptive optics system

    NASA Astrophysics Data System (ADS)

    Arcidiacono, C.; Schreiber, L.; Bregoli, G.; Diolaiti, E.; Foppiani, I.; Agapito, G.; Puglisi, A.; Xompero, M.; Oberti, S.; Cosentino, G.; Lombini, M.; Butler, R. C.; Ciliegi, P.; Cortecchia, F.; Patti, M.; Esposito, S.; Feautrier, P.

    2016-07-01

The Multiconjugate Adaptive Optics RelaY (MAORY) is an adaptive optics module to be mounted on the ESO European Extremely Large Telescope (E-ELT). It is a hybrid natural and laser guide star system that will correct the atmospheric turbulence volume above the telescope, feeding the Multi-AO Imaging Camera for Deep Observations (MICADO) near-infrared spectro-imager. We developed an end-to-end Monte Carlo adaptive optics simulation tool to investigate the performance of MAORY and its calibration, acquisition, and operation strategies. MAORY will implement multiconjugate adaptive optics combining laser guide star (LGS) and natural guide star (NGS) measurements. The simulation tool implements the various aspects of MAORY in an end-to-end fashion. The code has been developed in IDL and uses C++ and CUDA libraries for efficiency. Here we recall the code architecture, describe the modeled instrument components, and present the control strategies implemented in the code.

  17. Agile Machining and Inspection Non-Nuclear Report (NNR) Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lazarus, Lloyd

This report is a high-level summary of the eight major projects funded by the Agile Machining and Inspection Non-Nuclear Readiness (NNR) project (FY06.0422.3.04.R1). The largest project of the group is the Rapid Response project, whose six major subcategories are summarized. This project focused on the operations of the machining departments that will comprise Special Applications Machining (SAM) in the Kansas City Responsive Infrastructure Manufacturing & Sourcing (KCRIMS) project. The project was aimed at upgrading older machine tools, developing new inspection tools, eliminating Classified Removable Electronic Media (CREM) in the handling of classified Numerical Control (NC) programs by installing the CRONOS network, and developing methods to automatically load Coordinate Measuring Machine (CMM) inspection data into bomb books and product scorecards. Finally, the project personnel leaned operations of some of the machine tool cells and now have a model to continue this activity.

  18. Annual Research Briefs, 2004: Center for Turbulence Research

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Mansour, Nagi N.

    2004-01-01

This report contains the 2004 annual progress reports of the Research Fellows and students of the Center for Turbulence Research in its eighteenth year of operation. Since its inception in 1987, the objective of the CTR has been to advance the physical understanding of turbulent flows and the development of physics-based predictive tools for engineering analysis and turbulence control. Turbulence is ubiquitous in nature and in engineering devices. The studies at CTR have been motivated by applications where turbulence effects are significant; these include a broad range of technical areas such as planetary boundary layers, formation of planets, solar convection, magnetohydrodynamics, environmental and eco-systems, aerodynamic noise, propulsion systems, and high speed transportation. Numerical simulation has been the predominant research tool at CTR, which has required a critical mass of researchers in numerical analysis and computer science in addition to core disciplines such as applied mathematics, chemical kinetics, and fluid mechanics. Maintaining and promoting this interdisciplinary culture has been a hallmark of CTR and has been responsible for the realization of the results of its basic research in applications. The first group of reports in this volume is directed towards the development, analysis, and application of novel numerical methods for flow simulations. Development of methods for large eddy simulation of complex flows has been a central theme in this group. The second group is concerned with turbulent combustion, scalar transport, and multi-phase flows. The final group is devoted to geophysical turbulence, where the problem of solar convection has recently been a focus of considerable attention at CTR.

  19. PERFORM 60 - Prediction of the effects of radiation for reactor pressure vessel and in-core materials using multi-scale modelling - 60 years foreseen plant lifetime

    NASA Astrophysics Data System (ADS)

    Leclercq, Sylvain; Lidbury, David; Van Dyck, Steven; Moinereau, Dominique; Alamo, Ana; Mazouzi, Abdou Al

    2010-11-01

In nuclear power plants, materials may undergo degradation due to severe irradiation conditions that may limit their operational life. Utilities that operate these reactors need to quantify the ageing and potential degradation of essential structures of the power plant to ensure safe and reliable operation. So far, the material databases needed to account for these degradations in the design and safe operation of installations rely mainly on long-term irradiation programs in test reactors and on mechanical or corrosion testing in specialized hot cells. Continuous progress in the physical understanding of the phenomena involved in irradiation damage, together with continuous progress in computer science, has now made possible the development of multi-scale numerical tools able to simulate the effects of irradiation on material microstructure. A first step towards this goal was successfully reached through the development of the RPV-2 and Toughness Module numerical tools by the scientific community created around the FP6 PERFECT project. These tools simulate irradiation effects on the constitutive behaviour of the reactor pressure vessel low-alloy steel and on its failure properties. Building on the existing PERFECT roadmap, the four-year collaborative project PERFORM 60 has as its main objective to develop multi-scale tools for predicting the combined effects of irradiation and corrosion on internals (austenitic stainless steels) and to improve existing tools for the RPV (bainitic steels). PERFORM 60 comprises two technical sub-projects: (i) RPV and (ii) internals. In addition to these technical sub-projects, the Users' Group and Training sub-project allows representatives of constructors, utilities, research organizations… from Europe, the USA, and Japan to receive the information and training needed to form their own appraisal of the limits and potential of the developed tools. An important effort will also be made to train young researchers in the field of materials degradation. PERFORM 60 officially started on March 1st, 2009 with 20 European organizations and universities involved in the nuclear field.

  20. Cement bond evaluation method in horizontal wells using segmented bond tool

    NASA Astrophysics Data System (ADS)

    Song, Ruolong; He, Li

    2018-06-01

Most existing cement evaluation technologies suffer from tool eccentralization due to gravity in highly deviated and horizontal wells. This paper proposes a correction method to lessen the effects of tool eccentralization on cement bond evaluation results obtained with a segmented bond tool, which has an omnidirectional sonic transmitter and eight segmented receivers evenly arranged around the tool 2 ft from the transmitter. Using a 3-D finite-difference parallel numerical simulation method, we investigate the logging responses of centred and eccentred segmented bond tools in a variety of bond conditions. From the numerical results, we find that the tool eccentricity and channel azimuth can be estimated from the measured sector amplitudes. The average of the sector amplitudes when the tool is eccentred can be corrected to the value obtained when the tool is centred, and the corrected amplitude is then used to calculate the channel size. The proposed method is applied to both synthetic and field data. For synthetic data, the method estimates the tool eccentricity with small error, and the bond map is improved after correction. For field data, the estimated tool eccentricity agrees well with the measured well deviation angle. Although the method still suffers from low accuracy in calculating the channel azimuth, the credibility of the corrected bond map is improved, especially in horizontal wells. This offers a way to evaluate bond condition in horizontal wells using an existing logging tool, and the numerical results in this paper can aid understanding of segmented-tool measurements in both vertical and horizontal wells.
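The idea of estimating the offset direction from the eight sector amplitudes can be sketched with a first azimuthal harmonic fit: an eccentred tool perturbs the sector readings roughly as a cosine around the tool, so the phase of the first harmonic points toward the offset and the residual mean is the corrected amplitude. This harmonic model is an assumption for illustration, not the paper's actual correction:

```python
import math

# Toy sector-amplitude analysis: decompose eight azimuthal readings into
# a mean (used for bond evaluation) plus a first harmonic whose magnitude
# and phase indicate the size and direction of the tool offset.
def analyze_sectors(amplitudes):
    n = len(amplitudes)
    mean = sum(amplitudes) / n
    c = sum(a * math.cos(2 * math.pi * k / n) for k, a in enumerate(amplitudes))
    s = sum(a * math.sin(2 * math.pi * k / n) for k, a in enumerate(amplitudes))
    magnitude = 2.0 * math.hypot(c, s) / n   # size of the 1st harmonic
    azimuth = math.atan2(s, c)               # direction of the offset
    return mean, magnitude, azimuth
```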

  1. Towards numerical simulations of fluid-structure interactions for investigation of obstructive sleep apnea

    NASA Astrophysics Data System (ADS)

    Huang, Chien-Jung; White, Susan M.; Huang, Shao-Ching; Mallya, Sanjay; Eldredge, Jeff D.

    2014-11-01

Obstructive sleep apnea (OSA) is a medical condition characterized by repetitive partial or complete occlusion of the airway during sleep. The soft tissues in the airway of OSA patients are prone to collapse under the low pressure loads incurred during breathing. Numerical simulation with patient-specific upper airway models can assist diagnosis and treatment assessment. The eventual goal of this research is the development of a numerical tool for air-tissue interactions in the upper airway of patients with OSA. This tool is expected to capture collapse of the airway in respiratory flow conditions, as well as the effects of various treatment protocols. Here, we present our ongoing progress toward this goal. A sharp-interface embedded boundary method is used on Cartesian grids to resolve the air-tissue interface in the complex patient-specific airway geometries. For the structure simulation, a cut-cell FEM is used, with nonlinear Green strains to properly resolve the large tissue displacements in the soft palate structures. The fluid and structure solvers are strongly coupled. Preliminary results will be shown, including flow simulation inside the 3D rigid upper airway of patients with OSA and several validation problems for the fluid-structure coupling.

  2. Finite-difference time-domain modelling of through-the-Earth radio signal propagation

    NASA Astrophysics Data System (ADS)

    Ralchenko, M.; Svilans, M.; Samson, C.; Roper, M.

    2015-12-01

    This research seeks to extend the knowledge of how a very low frequency (VLF) through-the-Earth (TTE) radio signal behaves as it propagates underground, by calculating and visualizing the strength of the electric and magnetic fields for an arbitrary geology through numeric modelling. To achieve this objective, a new software tool has been developed using the finite-difference time-domain method. This technique is particularly well suited to visualizing the distribution of electromagnetic fields in an arbitrary geology. The frequency range of TTE radio (400-9000 Hz) and geometrical scales involved (1 m resolution for domains a few hundred metres in size) involves processing a grid composed of millions of cells for thousands of time steps, which is computationally expensive. Graphics processing unit acceleration was used to reduce execution time from days and weeks, to minutes and hours. Results from the new modelling tool were compared to three cases for which an analytic solution is known. Two more case studies were done featuring complex geologic environments relevant to TTE communications that cannot be solved analytically. There was good agreement between numeric and analytic results. Deviations were likely caused by numeric artifacts from the model boundaries; however, in a TTE application in field conditions, the uncertainty in the conductivity of the various geologic formations will greatly outweigh these small numeric errors.
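The leapfrog field updates at the heart of the finite-difference time-domain method can be sketched in one dimension. Grid size, Courant number, and source parameters below are illustrative assumptions; the tool described above is a full 3-D, GPU-accelerated implementation:

```python
import math

# Minimal 1-D FDTD sketch (Yee scheme): E and H live on staggered grids
# and are updated alternately from each other's spatial differences.
def fdtd_1d(nx=200, nt=400, source_pos=100):
    ez = [0.0] * nx          # electric field at integer grid points
    hy = [0.0] * (nx - 1)    # magnetic field, staggered half a cell
    for n in range(nt):
        for i in range(nx - 1):        # update H from the curl of E
            hy[i] += 0.5 * (ez[i + 1] - ez[i])
        for i in range(1, nx - 1):     # update E from the curl of H
            ez[i] += 0.5 * (hy[i] - hy[i - 1])
        # soft Gaussian source injected at one grid point
        ez[source_pos] += math.exp(-((n - 30) ** 2) / 100.0)
    return ez
```

The factor 0.5 is the Courant number, kept below 1 for stability; a TTE model would add conductive-loss terms for the rock and run the same loop over a 3-D grid, which is what makes GPU acceleration worthwhile.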

  3. Reminiscence work with older people: the development of a historical reminiscence tool.

    PubMed

    Thorgrimsdottir, Sigrun Huld; Bjornsdottir, Kristin

    2016-03-01

    (i) To explore how reminiscence workers in older people's care define their work and (ii) to describe the development of a historical reminiscence tool containing historical developments from the older person's passage through life, intended to support reminiscence work. Reminiscence work refers to the recall of past occurrences in a client's life with the intention of enhancing well-being, social skills and self-image. The design of the historical reminiscence tool was informed by the model of intervention design developed by van Meijel et al. starting with problem definition followed by the accumulation of building blocks for the intervention, the design of the intervention and, lastly, a validation of the intervention. Two studies were designed to develop the historical reminiscence tool. Study 1 was a focus group interview, conducted in 2008, aimed at generating knowledge about current practice and to develop the historical reminiscence tool. Eighteen women who identified themselves as reminiscence workers participated in three focus groups. Study 2 was a telephone survey, conducted in 2012 by the first author, serving the purpose of validation. The results provided information about the use of such a historical reminiscence tool. Participants understood reminiscence work primarily as meaningful activity, working with personal experience and honouring the individual's memories and life story. The historical reminiscence tool containing information about important historical events and everyday life in the period 1925-1955 was welcomed by the participants. They provided numerous suggestions for improvement of the draft. Reminiscence work in Iceland is of the social or meaningful activity type rather than a therapy. A historical reminiscence tool containing pertinent historical information was considered helpful in strengthening reminiscence workers' knowledge of the social and historical background of their clients and person-centred care. 
Reminiscence tools, such as books or electronic sources containing historical information pertaining to aging individuals, can enhance the care of older people. © 2015 John Wiley & Sons Ltd.

  4. A material based approach to creating wear resistant surfaces for hot forging

    NASA Astrophysics Data System (ADS)

    Babu, Sailesh

Tools and dies used in metal forming are subject to extremely high interface temperatures, high local pressures, and large metal-to-metal sliding. These harsh conditions result in accelerated wear of tooling. Lubrication of tools, done to improve metal flow, drastically quenches the surface layers of the tools and compounds the tool failure problem. This becomes a serious issue when forged parts are complex and expected to meet tight tolerances. Unpredictable, and hence uncontrolled, wear and degradation of tooling result in poor part quality and premature tool failure, which in turn cause high scrap rates, shop downtime, poor efficiency, and high cost. The objective of this dissertation is to develop a computer-based methodology for analyzing the requirements of hot forging tooling to resist wear and plastic deformation and for predicting the life cycle of forge tooling. Development of such a system is complicated by the fact that wear and degradation of tooling are influenced not only by the die material used but also by numerous process variables such as lubricant, dilution ratio, forging temperature, equipment used, and tool geometry. Phenomenological models available in the literature provide good rules of thumb for selecting materials but do not provide a way to evaluate their performance in the field. Once a material is chosen, there are no proven approaches for creating surfaces from it. Coating approaches such as PVD and CVD cannot generate coatings thick enough to withstand the conditions of hot forging. Welding cannot generate complex surfaces without several secondary operations such as heat treating and machining, and if careful procedures are not followed, welds crack and seldom survive forging loads. There is thus a strong need for an approach to selectively, reliably, and precisely deposit a material of choice on an existing surface such that it exhibits not only good tribological properties but also good adhesion to the substrate.
The dissertation outlines the development of a new cyclic contact test designed to recreate the intermittent tempering seen in hot forging. This test has been used to validate the use of tempering parameters in modeling the in-service softening of tool steel surfaces. The dissertation also describes an industrial case study, conducted at a forging company, to validate the wear model, as well as efforts at Ohio State University to deposit nickel aluminide on an AISI H13 substrate using Laser Engineered Net Shaping (LENS). Results are reported from an array of experiments conducted on a LENS 750 machine at various power levels, table speeds, and hatch spacings, covering bond quality, surface finish, compositional gradients, and hardness. A thermal finite element model used to simulate the LENS process is also presented, along with demonstrated results.

  5. Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Santillan, Alfredo; Hernandez--Cervantes, Liliana; Gonzalez--Ponce, Alejandro; Kim, Jongsoo

    The numerical simulations associated with the interaction of High Velocity Clouds (HVC) with the Magnetized Galactic Interstellar Medium (ISM) are a powerful tool for describing the evolution of these objects in our Galaxy. In this work we present a new project, referred to as Theoretical Virtual Observatories, oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool consisting of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this Web site the user can draw on the existing numerical simulations in the database or run a new simulation by introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), in keeping with the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.

  6. Rational development of solid dispersions via hot-melt extrusion using screening, material characterization, and numeric simulation tools.

    PubMed

    Zecevic, Damir E; Wagner, Karl G

    2013-07-01

    Effective and predictive small-scale selection tools are indispensable during the development of a solubility-enhanced drug product. For hot-melt extrusion, this selection process can start with a microscale performance evaluation on a hot-stage microscope (HSM). A batch size of 400 mg can provide sufficient material to assess drug product attributes such as solid-state properties, solubility enhancement, and physical stability, as well as process-related attributes such as the processing temperature in a twin-screw extruder (TSE). Prototype formulations are then fed into a 5 mm TSE (~1-2 g) to confirm the HSM performance under additional shear stress. Stress stability testing at small scale might be performed with these samples or with a larger batch (20-40 g) made on a 9 or 12 mm TSE. Simultaneously, numeric process simulations are performed using process data as well as the rheological and thermal properties of the formulations. Further scale-up work to 16 and 18 mm TSEs confirmed and refined the simulation model. Thus, at the end of laboratory-scale development, not only can the clinical trial supply be manufactured, but one can also make a sound risk assessment to support further scale-up even without decades of process experience. Copyright © 2013 Wiley Periodicals, Inc.

  7. Progress Toward an Efficient and General CFD Tool for Propulsion Design/Analysis

    NASA Technical Reports Server (NTRS)

    Cox, C. F.; Cinnella, P.; Westmoreland, S.

    1996-01-01

    The simulation of propulsive flows inherently involves chemical activity. Recent years have seen substantial strides in the development of numerical schemes for reacting flowfields, in particular those involving finite-rate chemistry. However, finite-rate calculations are computationally intensive and require knowledge of the actual kinetics, which are not always known with sufficient accuracy. Alternatively, flow simulations based on the assumption of local chemical equilibrium can obtain physically reasonable results at far less computational cost. The present study summarizes the development of efficient numerical techniques for the simulation of flows in local chemical equilibrium, whereby a 'Black Box' chemical equilibrium solver is coupled to the usual gasdynamic equations. The generality of the methods enables the modelling of any arbitrary mixture of thermally perfect gases, including air, combustion mixtures, and plasmas. As a demonstration of the potential of these methodologies, several solutions involving reacting and perfect-gas flows are presented, including a preliminary simulation of the SSME startup transient. Future enhancements to the proposed techniques are discussed, including more efficient finite-rate and hybrid (partial equilibrium) schemes. The algorithms that have been developed, and are being optimized, provide an efficient and general tool for the design and analysis of propulsion systems.
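The idea of a 'Black Box' equilibrium solver can be illustrated on a toy problem: given an equilibrium constant, return the composition, without the flow solver knowing any kinetics. The sketch below solves the degree of dissociation for a single reaction A2 ⇌ 2A; the function name and sample values are illustrative assumptions, and a real solver such as the one in the paper handles arbitrary multi-species mixtures.

```python
from scipy.optimize import brentq

def dissociation_fraction(kp, pressure):
    """Equilibrium degree of dissociation alpha for A2 <-> 2A at total
    pressure p: Kp = 4 alpha^2 p / (1 - alpha^2).  A toy stand-in for a
    'black box' equilibrium solver queried by the gasdynamic equations."""
    f = lambda a: 4.0 * a * a * pressure / (1.0 - a * a) - kp
    return brentq(f, 1e-12, 1.0 - 1e-12)  # root-bracketing solve for alpha

# Le Chatelier check: dissociation is suppressed as pressure rises.
print(dissociation_fraction(1.0, 1.0) > dissociation_fraction(1.0, 10.0))
```

The flow solver would call such a routine once per cell per step with local pressure (and, in general, temperature), which is why the equilibrium approach is so much cheaper than integrating finite-rate kinetics.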

  8. A special purpose silicon compiler for designing supercomputing VLSI systems

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.

    1991-01-01

    Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Development of software tools for designing such systems itself becomes complex. Hence a novel design methodology has to be developed. For designing such complex systems a special purpose silicon compiler is needed in which: the computational and communicational structures of different numeric algorithms should be taken into account to simplify the silicon compiler design, the approach is macrocell based, and the software tools at different levels (algorithm down to the VLSI circuit layout) should get integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate get reduced over the silicon compilers based on PLA's, SLA's, and gate arrays. The first two silicon compiler characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLA's, SLA's, and gate arrays is very difficult as they are micro-cell based. A novel GIPOP processor is under development using this special purpose silicon compiler.

  9. Model reduction of the numerical analysis of Low Impact Developments techniques

    NASA Astrophysics Data System (ADS)

    Brunetti, Giuseppe; Šimůnek, Jirka; Wöhling, Thomas; Piro, Patrizia

    2017-04-01

    Mechanistic models have proven to be accurate and reliable tools for the numerical analysis of the hydrological behavior of Low Impact Development (LID) techniques. However, their widespread adoption is limited by their complexity and computational cost. Recent studies have tried to address this issue by investigating the application of new techniques, such as surrogate-based modeling, but current results are still limited and fragmented. One such approach, the Model Order Reduction (MOR) technique, can be a valuable tool for reducing the computational complexity of a numerical problem by computing an approximation of the original model. While this technique has been extensively used in water-related problems, no studies have evaluated its use in LID modeling. Thus, the main aim of this study is to apply the MOR technique to develop a reduced order model (ROM) for the numerical analysis of the hydrologic behavior of LIDs, in particular green roofs. The model should correctly reproduce all the hydrological processes of a green roof while reducing the computational cost. The proposed model decouples the subsurface water dynamics of a green roof into (a) one-dimensional (1D) vertical flow through the green roof itself and (b) 1D saturated lateral flow along the impervious rooftop. The green roof is horizontally discretized into N elements, each representing a vertical domain that can have different properties or boundary conditions. The 1D Richards equation is used to simulate flow in the substrate and drainage layers. The simulated outflow from the vertical domain is used as a recharge term for the saturated lateral flow, which is described using the kinematic wave approximation of the Boussinesq equation. The proposed model has been compared with the mechanistic model HYDRUS-2D, which numerically solves the Richards equation for the whole domain.
The HYDRUS-1D code has been used for the description of vertical flow, while a Finite Volume Scheme has been adopted for lateral flow. Two scenarios involving flat and steep green roofs were analyzed. Results confirmed the accuracy of the reduced order model, which was able to reproduce both subsurface outflow and the moisture distribution in the green roof, significantly reducing the computational cost.
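The lateral-flow half of the decoupled model can be sketched as an explicit upwind finite-volume update of the kinematic-wave equation, with the Richards-equation outflow entering as a recharge term. This is a minimal illustration only; the function and parameter names and all values are assumptions, not those of the paper.

```python
import numpy as np

def kinematic_wave_step(h, recharge, dt, dx, k_sat, slope, porosity):
    """One explicit upwind step of the kinematic-wave approximation for
    saturated lateral flow over an impervious rooftop (hedged sketch).

    h:        saturated thickness in each of the N horizontal elements [m]
    recharge: vertical outflow from the Richards-equation domain [m/s]
    """
    q = k_sat * slope * h                      # unit-width kinematic-wave flux
    inflow = np.concatenate(([0.0], q[:-1]))   # upwind: no inflow at upslope edge
    dhdt = (inflow - q) / dx + recharge / porosity
    return np.maximum(h + dt * dhdt, 0.0)      # no negative thickness

# Uniform recharge builds a saturated layer that thickens toward the outlet.
h = np.zeros(10)
for _ in range(1000):
    h = kinematic_wave_step(h, 1e-6, dt=1.0, dx=0.5,
                            k_sat=1e-3, slope=0.02, porosity=0.3)
print(h[-1] > h[0])  # thickness grows toward the downslope (outflow) edge
```

In the paper's scheme the recharge array would come from N independent HYDRUS-1D vertical columns rather than being a uniform constant.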

  10. Lessons learned in the development of the STOL intelligent tutoring system

    NASA Technical Reports Server (NTRS)

    Seamster, Thomas; Baker, Clifford; Ames, Troy

    1991-01-01

    Lessons learned during the development of the NASA Systems Test and Operations Language (STOL) Intelligent Tutoring System (ITS), under development at NASA Goddard Space Flight Center, are presented. The purpose of the intelligent tutor is to train STOL users by adapting tutoring based on inferred student strengths and weaknesses. The system has been under development for over a year, and numerous lessons learned have emerged. These observations are presented in three sections. The first addresses the methodology employed in the development of the STOL ITS and briefly presents the ITS architecture. The second presents lessons learned in the areas of intelligent tutor development, documentation and reporting, cost and schedule control, and tool and shell effectiveness. The third presents recommendations that may be considered by other ITS developers, addressing access, use, and selection of subject matter experts; the steps involved in ITS development; the use of ITS interface design prototypes as part of knowledge engineering; and tool and shell effectiveness.

  11. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    PubMed

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, both open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing that incorporates all steps from raw data import to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided depend heavily on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile yet flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a graphical user interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification, and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma-separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or with MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime.
Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.

  12. Verification and Validation Strategy for LWRS Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carl M. Stoots; Richard R. Schultz; Hans D. Gougar

    2012-09-01

    One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant's safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models and codes such as RELAP-7 can be used for reactor licensing and relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203 the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is highly relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. These include prioritization of which plants and designs should be studied first, coupling modern supporting experiments to the stringent needs of new high-fidelity models and codes, and scaling of aging effects.

  13. Simulation of the spatial distribution of the acoustic pressure in sonochemical reactors with numerical methods: a review.

    PubMed

    Tudela, Ignacio; Sáez, Verónica; Esclapez, María Deseada; Díez-García, María Isabel; Bonete, Pedro; González-García, José

    2014-05-01

    Numerical methods for calculating the acoustic field inside sonoreactors have rapidly emerged in the last 15 years. This paper summarizes some of the most important early works on this topic along with the diverse numerical works published since then, reviewing the state of the art from a qualitative point of view. In this sense, we illustrate and discuss some of the models recently developed by the scientific community to deal with the complex events that take place in a sonochemical reactor, such as the vibration of the reactor walls and the nonlinear phenomena inherent to the presence of ultrasonic cavitation. In addition, we point out some of the upcoming challenges that must be addressed in order to develop a reliable tool for the proper design of efficient sonoreactors and the scale-up of sonochemical processes. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. The future of EUV lithography: enabling Moore's Law in the next decade

    NASA Astrophysics Data System (ADS)

    Pirati, Alberto; van Schoot, Jan; Troost, Kars; van Ballegoij, Rob; Krabbendam, Peter; Stoeldraijer, Judon; Loopstra, Erik; Benschop, Jos; Finders, Jo; Meiling, Hans; van Setten, Eelco; Mika, Niclas; Dredonx, Jeannot; Stamm, Uwe; Kneer, Bernhard; Thuering, Bernd; Kaiser, Winfried; Heil, Tilmann; Migura, Sascha

    2017-03-01

    While EUV systems equipped with 0.33 Numerical Aperture lenses are readying to start volume manufacturing, ASML and Zeiss are ramping up their development activities on an EUV exposure tool with a Numerical Aperture greater than 0.5. The purpose of this scanner, targeting a resolution of 8 nm, is to extend Moore's law throughout the next decade. A novel anamorphic lens design has been developed to provide the required Numerical Aperture; this lens will be paired with new, faster stages and more accurate sensors to meet the economic requirements of Moore's law, as well as the tight focus and overlay control needed for future process nodes. The tighter focus and overlay control budgets, as well as the anamorphic optics, will drive innovations in imaging and OPC modelling, and possibly in metrology concepts. Furthermore, advances in resist and mask technology will be required to image lithography features at less than 10 nm resolution. This paper presents an overview of the key technology innovations and infrastructure requirements for the next generation of EUV systems.
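The 8 nm target follows directly from the Rayleigh resolution criterion. The k1 value below is an assumed, typical single-exposure process factor, not a number quoted in the paper.

```python
def rayleigh_resolution(k1, wavelength_nm, na):
    """Rayleigh criterion for lithographic half-pitch: res = k1 * lambda / NA."""
    return k1 * wavelength_nm / na

# EUV wavelength is 13.5 nm; k1 = 0.33 is an illustrative assumption.
print(round(rayleigh_resolution(0.33, 13.5, 0.33), 2))  # 0.33-NA tools
print(round(rayleigh_resolution(0.33, 13.5, 0.55), 2))  # high-NA 0.55 lens
```

At the same k1, raising NA from 0.33 to 0.55 takes the half-pitch from 13.5 nm to about 8.1 nm, consistent with the stated 8 nm target.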

  15. Numerical simulation of water hammer in low pressurized pipe: comparison of SimHydraulics and Lax-Wendroff method with experiment

    NASA Astrophysics Data System (ADS)

    Himr, D.

    2013-04-01

    This article describes the simulation of unsteady flow during water hammer with two programs that use different numerical approaches to solve the ordinary one-dimensional differential equations describing the dynamics of hydraulic elements and pipes. The first is Matlab-Simulink-SimHydraulics, commercial software developed to solve the dynamics of general hydraulic systems, which it defines with block elements. The other, called HYDRA, is based on the Lax-Wendroff numerical method, which serves as a tool to solve the momentum and continuity equations; this program was developed in Matlab by Brno University of Technology. Experimental measurements were performed on a simple test rig consisting of an elastic pipe with strong damping connecting two reservoirs, with water hammer induced by rapidly closing a valve. The physical properties of the liquid and the elasticity parameters of the pipe were considered in both simulations, which agree very well with each other and show only minimal differences from the experimental data.
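A minimal sketch of a Lax-Wendroff update applied to the frictionless water-hammer equations is shown below. It is an illustration of the method only: HYDRA itself also treats friction, damping, and boundary conditions, and its implementation details are not given in the abstract.

```python
import numpy as np

def lax_wendroff_step(H, V, a, g, dt, dx):
    """One Lax-Wendroff step for the frictionless water-hammer system
    H_t + (a^2/g) V_x = 0,  V_t + g H_x = 0,
    where H is piezometric head, V velocity, a the pressure-wave speed."""
    lam = dt / (2.0 * dx)
    mu = a * a * dt * dt / (2.0 * dx * dx)   # second-order term; A^2 = a^2*I here
    Hn, Vn = H.copy(), V.copy()
    Hn[1:-1] = (H[1:-1] - lam * (a * a / g) * (V[2:] - V[:-2])
                + mu * (H[2:] - 2.0 * H[1:-1] + H[:-2]))
    Vn[1:-1] = (V[1:-1] - lam * g * (H[2:] - H[:-2])
                + mu * (V[2:] - 2.0 * V[1:-1] + V[:-2]))
    return Hn, Vn

# At Courant number a*dt/dx = 1 the scheme is exact for this linear system:
# an initial head pulse splits into two half-amplitude waves moving at +/- a.
a, g = 1.0, 1.0
H, V = np.zeros(101), np.zeros(101)
H[50] = 1.0
for _ in range(10):
    H, V = lax_wendroff_step(H, V, a, g, dt=1.0, dx=1.0)
print(H[40], H[60])  # two 0.5-amplitude pulses, 10 cells either side
```

The split pulse is the discrete analogue of the Joukowsky pressure waves that travel up and down the pipe after the valve closes.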

  16. SpectraFox: A free open-source data management and analysis tool for scanning probe microscopy and spectroscopy

    NASA Astrophysics Data System (ADS)

    Ruby, Michael

    In recent decades scanning probe microscopy and spectroscopy have become well-established tools in nanotechnology and surface science. This opened the market to many commercial manufacturers, each with different hardware and software standards. Besides the advantage of a wide variety of available hardware, this diversity can complicate data exchange between scientists, and data analysis for groups working with hardware from different manufacturers: not only do file formats differ between manufacturers, but the data also often require further numerical treatment before publication. SpectraFox is an open-source, manufacturer-independent tool that manages, processes, and evaluates scanning probe spectroscopy and microscopy data. It aims at simplifying documentation in parallel with measurement, and it provides solid evaluation tools for large volumes of data.

  17. SAM/SAH Analogs as Versatile Tools for SAM-Dependent Methyltransferases.

    PubMed

    Zhang, Jing; Zheng, Yujun George

    2016-03-18

    S-Adenosyl-L-methionine (SAM) is a sulfonium molecule that is a structural hybrid of methionine and adenosine. As the second largest cofactor in the human body, its major function is to serve as the methyl donor for SAM-dependent methyltransferases (MTases). The resultant transmethylation of biomolecules constitutes a significant biochemical mechanism in epigenetic regulation, cellular signaling, and metabolite degradation. Recently, numerous SAM analogs have been developed as synthetic cofactors to transfer activated groups onto MTase substrates for downstream ligation and identification. Meanwhile, new compounds built upon or derived from the SAM scaffold have been designed and tested as selective inhibitors for important MTase targets. Here, we summarize the recent development and application of SAM analogs as chemical biology tools for MTases.

  18. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools that support control systems engineering design and implementation; identify research issues associated with writing software tools that would provide a design environment to assist engineers in multidisciplinary control design and implementation; formulate a potential investment strategy to resolve the research issues and develop public domain code that can form the core of more powerful engineering design tools; and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  19. Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.

    PubMed

    Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun

    2015-11-07

    In intensity-modulated radiotherapy (IMRT), the optimal plan for each patient is specific to that patient's unique anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's anatomy should be defined and adopted in the treatment planning procedure for plan quality control. The aim of this study is to develop such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated the proposed plan quality control tool. Using the developed tool, six of twenty evaluated plans were identified as suboptimal. After re-optimization, these plans achieved better OAR dose sparing without sacrificing PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, which validates the predictive power of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs for multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.

  20. Numerical simulation on chain-die forming of an AHSS top-hat section

    NASA Astrophysics Data System (ADS)

    Majji, Raju; Xiang, Yang; Ding, Scott; Yang, Chunhui

    2018-05-01

    The applications of Advanced High-Strength Steels (AHSS) in the automotive industry are rapidly increasing due to the demand for lightweight materials that significantly reduce fuel consumption without compromising passenger safety. Automotive manufacturers and material suppliers are expected by consumers to deliver reliable and affordable products, stimulating them to research solutions that meet these requirements. The primary advantage of AHSS is its extremely high strength-to-weight ratio, making it an ideal material for the automotive industry. However, its low ductility is a major disadvantage, in particular when profiles are formed using traditional cold forming processes such as roll forming and deep drawing; consequently, AHSS parts frequently fail during forming. To improve quality and reliability in manufacturing AHSS products, a recently developed incremental cold sheet metal forming technology called Chain-die Forming (CDF) is recognised as a potential solution. The typical CDF process combines bending and roll forming, is equivalent to a roll with a very large deformation radius, and incrementally forms the desired shape with a split die and segments. This study focuses on manufacturing an AHSS top-hat section in a minimum number of passes without geometrical or surface defects, using finite element modelling and simulation. The developed numerical simulation is employed to investigate the influence of the main control parameters of the CDF process when forming AHSS products and to develop new die-punch sets with compensated designs via numerical optimization. In addition, the study addresses tool design to compensate for springback and to reduce friction between the tooling and the sheet metal, which reduces the number of passes and thereby improves productivity while reducing energy consumption and material waste.
This numerical study reveals that CDF forms AHSS products with complex profiles with much lower residual stress, springback, and strain, and with higher geometrical accuracy, than other traditional manufacturing processes.

  1. WEC Design Response Toolbox v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey

    2016-03-30

    The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
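Short-term extreme response methods of the kind listed above typically rest on extreme-value statistics. The sketch below is a generic block-maxima illustration using SciPy, not the WDRT API itself; the response record and all parameters are synthetic assumptions.

```python
import numpy as np
from scipy import stats

# Synthetic "response" record standing in for a simulated WEC load channel:
# 200 short-term records of 50 samples each (all values are made up).
rng = np.random.default_rng(0)
record = rng.gumbel(loc=1.0, scale=0.3, size=(200, 50))
block_maxima = record.max(axis=1)

# Fit a generalized extreme value (GEV) distribution to the block maxima
# and read off a 99th-percentile short-term extreme design response.
shape, loc, scale = stats.genextreme.fit(block_maxima)
design_response = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)
print(round(float(design_response), 3))
```

A design-response toolbox layers environmental characterization and long-term extrapolation on top of this basic block-maxima machinery.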

  2. Numerical Analyses for Low Reynolds Flow in a Ventricular Assist Device.

    PubMed

    Lopes, Guilherme; Bock, Eduardo; Gómez, Luben

    2017-06-01

    Scientific and technological advances in blood pump development have been driven by their importance in cardiac patient treatment and in improving the quality of life of assisted patients. To improve and optimize design and development, numerical tools have been incorporated into the analyses of these devices and have become indispensable to their advancement. This study analyzes flow behavior at low impeller Reynolds numbers, for which there is no consensus on the full development of turbulence in ventricular assist devices (VADs). To support the analyses, computational numerical simulations were carried out in different scenarios at the same rotation speed. Two modeling approaches were applied: laminar flow, and turbulent flow with the standard, RNG, and realizable κ-ε models, the standard and SST κ-ω models, and the Spalart-Allmaras model. The results agree with the literature for VADs and with the range for transitional flows in stirred tanks, with an impeller Reynolds number around 2800 for the tested scenarios. The turbulence models were compared and, based on the expected physical behavior, the RNG κ-ε, the standard and SST κ-ω, and the Spalart-Allmaras models are suggested for numerical analyses at low impeller Reynolds numbers in the tested flow scenarios. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
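The quoted impeller Reynolds number follows the stirred-tank convention Re = ρND²/μ. The blood-property and rotor values below are illustrative assumptions, not figures from the paper, but they land near the reported transitional range.

```python
def impeller_reynolds(density, rev_per_s, diameter, viscosity):
    """Stirred-tank impeller Reynolds number: Re = rho * N * D^2 / mu."""
    return density * rev_per_s * diameter ** 2 / viscosity

# Assumed blood-pump values: blood density ~1060 kg/m^3, viscosity
# ~3.5e-3 Pa.s, a 2000 rpm rotor with a 17 mm impeller.
re_imp = impeller_reynolds(1060.0, 2000.0 / 60.0, 0.017, 3.5e-3)
print(round(re_imp))  # on the order of the reported ~2800
```

Values in this range sit between the laminar and fully turbulent stirred-tank regimes, which is why the paper compares laminar and several turbulence closures rather than assuming one.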

  3. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    PubMed

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in psychology. Reaction times are often modeled with the ex-Gaussian distribution because it provides a good fit to many empirical datasets. The complexity of this distribution makes computational tools essential, so there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. To validate the package, we present an extensive analysis of fits obtained with it, discuss the advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in the literature in this area). The analysis allows one to identify outliers in empirical datasets and to determine judiciously whether data trimming is needed and at which points it should be applied.
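SciPy's `exponnorm` distribution implements the ex-Gaussian density, so the maximum-likelihood fit discussed above can be sketched without ExGUtils itself. The parameter values below are synthetic, chosen only to resemble reaction-time data.

```python
import numpy as np
from scipy import stats

# Ex-Gaussian = Gaussian(mu, sigma) + Exponential(mean tau); in SciPy's
# parameterization exponnorm(K, loc=mu, scale=sigma) with K = tau / sigma.
mu, sigma, tau = 300.0, 40.0, 100.0          # synthetic "RT" parameters, ms
rng = np.random.default_rng(1)
rts = rng.normal(mu, sigma, 5000) + rng.exponential(tau, 5000)

# Maximum-likelihood fit (one of the two estimation routes the paper compares).
k_hat, mu_hat, sigma_hat = stats.exponnorm.fit(rts)
tau_hat = k_hat * sigma_hat
print(round(mu_hat), round(sigma_hat), round(tau_hat))
```

Recovering (mu, sigma, tau) close to the generating values is the kind of check the paper's validation analysis performs at scale, alongside the least-squares alternative.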

  4. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    PubMed Central

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in psychology. Reaction times are often modeled with the ex-Gaussian distribution because it provides a good fit to many empirical datasets. The complexity of this distribution makes computational tools essential, so there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. To validate the package, we present an extensive analysis of fits obtained with it, discuss the advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in the literature in this area). The analysis allows one to identify outliers in empirical datasets and to determine judiciously whether data trimming is needed and at which points it should be applied. PMID:29765345

  5. Using Knowledge Space Theory To Assess Student Understanding of Stoichiometry

    NASA Astrophysics Data System (ADS)

    Arasasingham, Ramesh D.; Taagepera, Mare; Potter, Frank; Lonjers, Stacy

    2004-10-01

    Using the concept of stoichiometry we examined the ability of beginning college chemistry students to make connections among the molecular, symbolic, and graphical representations of chemical phenomena, as well as to conceptualize, visualize, and solve numerical problems. Students took a test designed to follow conceptual development; we then analyzed student responses and the connectivities of their responses, or the cognitive organization of the material or thinking patterns, applying knowledge space theory (KST). The results reveal that the students' logical frameworks of conceptual understanding were very weak and lacked an integrated understanding of some of the fundamental aspects of chemical reactivity. Analysis of response states indicates that the overall thinking patterns began with symbolic representations, moved to numerical problem solving, and then lastly to visualization: the acquisition of visualization skills comes later in the knowledge structure. The results strongly suggest the need for teaching approaches that help students integrate their knowledge by emphasizing the relationships between the different representations and presenting them concurrently during instruction. Also, the results indicate that KST is a useful tool for revealing various aspects of students' cognitive structure in chemistry and can be used as an assessment tool or as a pedagogical tool to address a number of student-learning issues.

  6. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  7. Metabonomics and drug development.

    PubMed

    Ramana, Pranov; Adams, Erwin; Augustijns, Patrick; Van Schepdael, Ann

    2015-01-01

    Metabolites, as end products of metabolism, possess a wealth of information about altered metabolic control and homeostasis that is dependent on numerous variables including age, sex, and environment. Studying significant changes in metabolite patterns has been recognized as a tool to understand crucial aspects of drug development such as drug efficacy and toxicity. The inclusion of metabonomics into the OMICS study platform brings us closer to defining the phenotype and allows us to look at alternatives to improve the diagnosis of diseases. Advancements in the analytical strategies and statistical tools used to study metabonomics allow us to prevent drug failures at early stages of drug development and reduce financial losses during expensive phase II and III clinical trials. This chapter introduces metabonomics along with the instruments used in the study. In addition, relevant examples of the use of metabonomics in the drug development process are discussed, with an emphasis on future directions and the challenges it faces.

  8. A review and evaluation of numerical tools for fractional calculus and fractional order controls

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Lu; Dehghan, Sina; Chen, YangQuan; Xue, Dingyü

    2017-06-01

    In recent years, as fractional calculus has become more and more broadly used in research across different academic disciplines, there are increasing demands for numerical tools for the computation of fractional integration/differentiation and the simulation of fractional order systems. Asked from time to time which tool is suitable for a specific application, the authors decided to carry out this survey to present recapitulative information on the tools available in the literature, in the hope of benefiting researchers with different academic backgrounds. With this motivation, the present article collects the scattered tools into a dashboard view, briefly introduces their usage and algorithms, evaluates their accuracy, compares their performance, and provides informative comments for selection.
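    Many of the tools surveyed in such reviews discretize the Grünwald-Letnikov definition of the fractional derivative. A minimal sketch of that textbook scheme (the function name and recursive-weight formulation are illustrative, not taken from any specific surveyed package):

```python
import numpy as np

def gl_fracdiff(f, alpha, h):
    """Grünwald-Letnikov fractional derivative of order alpha for samples f
    with uniform step h, using the recursive binomial weights
    w_0 = 1, w_k = w_{k-1} * (k - 1 - alpha) / k."""
    n = len(f)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    # convolution of the weights with the reversed history of f
    out = np.array([np.dot(w[:j + 1], f[j::-1]) for j in range(n)])
    return out / h**alpha
```

    Two sanity checks make the scheme's behavior concrete: alpha = 1 reduces to the backward difference (a first-order derivative), and alpha = 0 reduces to the identity.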

  9. Engineering Yarrowia lipolytica for Use in Biotechnological Applications: A Review of Major Achievements and Recent Innovations.

    PubMed

    Madzak, Catherine

    2018-06-25

    Yarrowia lipolytica is an oleaginous saccharomycetous yeast with a long history of industrial use. It aroused interest several decades ago as a host for heterologous protein production. Thanks to the development of numerous molecular and genetic tools, Y. lipolytica is now a recognized system for expressing heterologous genes and secreting the corresponding proteins of interest. As genomic and transcriptomic tools have increased our basic knowledge of this yeast, we can now envision engineering its metabolic pathways for use as a whole-cell factory in various bioconversion processes. Y. lipolytica is currently being developed as a workhorse for biotechnology, notably for single-cell oil production and the upgrading of industrial wastes into valuable products. As it becomes more and more difficult to keep up with an ever-increasing literature on Y. lipolytica engineering technology, this article aims to provide basic and up-to-date knowledge of this research area. The most useful reviews on Y. lipolytica biology, use, and safety are highlighted, together with a summary of the engineering tools available in this yeast. This mini-review then focuses on recently developed tools and engineering strategies, with a particular emphasis on promoter tuning, metabolic pathway assembly, and genome editing technologies.

  10. A fully-implicit high-order system thermal-hydraulics model for advanced non-LWR safety analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    An advanced system analysis tool is being developed for advanced reactor safety analysis. This paper describes the underlying physics and numerical models used in the code, including the governing equations, the stabilization schemes, the high-order spatial and temporal discretization schemes, and the Jacobian-Free Newton Krylov solution method. The effects of the spatial and temporal discretization schemes are investigated. Additionally, a series of verification test problems are presented to confirm the high-order schemes. Furthermore, it is demonstrated that the developed system thermal-hydraulics model can be strictly verified with the theoretical convergence rates, and that it performs very well for a wide range of flow problems with high accuracy, efficiency, and minimal numerical diffusion.

  11. A fully-implicit high-order system thermal-hydraulics model for advanced non-LWR safety analyses

    DOE PAGES

    Hu, Rui

    2016-11-19

    An advanced system analysis tool is being developed for advanced reactor safety analysis. This paper describes the underlying physics and numerical models used in the code, including the governing equations, the stabilization schemes, the high-order spatial and temporal discretization schemes, and the Jacobian-Free Newton Krylov solution method. The effects of the spatial and temporal discretization schemes are investigated. Additionally, a series of verification test problems are presented to confirm the high-order schemes. Furthermore, it is demonstrated that the developed system thermal-hydraulics model can be strictly verified with the theoretical convergence rates, and that it performs very well for a wide range of flow problems with high accuracy, efficiency, and minimal numerical diffusion.
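    The Jacobian-Free Newton Krylov idea named in this record can be illustrated on a toy problem: each Krylov iteration needs only residual evaluations, never an assembled Jacobian. A hedged sketch using SciPy's `newton_krylov` on a 1-D steady nonlinear diffusion equation (the problem and names are mine, not the paper's code):

```python
import numpy as np
from scipy.optimize import newton_krylov

# Toy steady problem: -u'' + u**3 = 1 on (0, 1) with u(0) = u(1) = 0,
# second-order central differences on n interior points.
n = 50
h = 1.0 / (n + 1)

def residual(u):
    # pad with the homogeneous Dirichlet boundary values
    upad = np.concatenate(([0.0], u, [0.0]))
    lap = (upad[:-2] - 2.0 * upad[1:-1] + upad[2:]) / h**2
    return -lap + u**3 - 1.0

# JFNK solve: the Jacobian is only probed through residual calls
u = newton_krylov(residual, np.zeros(n), f_tol=1e-9)
```

    The solution stays close to the linear case u = x(1 - x)/2 because the cubic term is small here; what matters for the illustration is that convergence is reached matrix-free.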

  12. Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems

    DTIC Science & Technology

    2002-08-01

    AFRL-IF-RS-TR-2002-190, Final Technical Report, August 2002. Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems. Authors: Narayan R. Aluru, Jacob White. Computer-Aided Design (CAD) tools for microfluidic components and systems were developed in this effort. Innovative numerical methods and algorithms for mixed

  13. Visualizing Economic Development with ArcGIS Explorer

    ERIC Educational Resources Information Center

    Webster, Megan L.; Milson, Andrew J.

    2011-01-01

    Numerous educators have noted that Geographic Information Systems (GIS) is a powerful tool for social studies teaching and learning. Yet the use of GIS has been hampered by issues such as the cost of the software and the management of large spatial data files. One trend that shows great promise for GIS in education is the move to cloud computing.…

  14. Forest Adaptation Resources: climate change tools and approaches for land managers, 2nd edition

    Treesearch

    Christopher W. Swanston; Maria K. Janowiak; Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; P. Danielle Shannon; Abigail Derby Lewis; Kimberly Hall; Robert T. Fahey; Lydia Scott; Angela Kerber; Jason W. Miesbauer; Lindsay Darling

    2016-01-01

    Forests across the United States are expected to undergo numerous changes in response to the changing climate. This second edition of the Forest Adaptation Resources provides a collection of resources designed to help forest managers incorporate climate change considerations into management and devise adaptation tactics. It was developed as part of the Climate Change...

  15. The Loci Multidisciplinary Simulation System Overview and Status

    NASA Technical Reports Server (NTRS)

    Luke, Edward A.; Tong, Xiao-Ling; Tang, Lin

    2002-01-01

    This paper will discuss the Loci system, an innovative tool for developing tightly coupled multidisciplinary three-dimensional simulations. The presentation will give an overview of some of the unique capabilities of the Loci system to automate the assembly of numerical simulations from libraries of fundamental computational components. We will discuss the demonstration of the Loci system on coupled fluid-structure problems related to RBCC propulsion systems.

  16. Equivalence of live tree carbon stocks produced by three estimation approaches for forests of the western United States

    Treesearch

    Coeli M. Hoover; James E. Smith

    2017-01-01

    The focus on forest carbon estimation accompanying the implementation of increased regulatory and reporting requirements is fostering the development of numerous tools and methods to facilitate carbon estimation. One such well-established mechanism is via the Forest Vegetation Simulator (FVS), a growth and yield modeling system used by public and private land managers...

  17. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  18. Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs

    Treesearch

    Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara

    2017-01-01

    Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based on the fundamental chemistry and physics of combustion and fire spread or statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...

  19. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images.

    PubMed

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    This article presents a way to obtain dose estimates for patients submitted to radiotherapy, based on the analysis of regions of interest in nuclear medicine images. A software package called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the Monte Carlo EGSnrc code. The software was developed with the Microsoft Visual Studio 2010 Service Pack, using the Windows Presentation Foundation project template for the C# programming language. With these tools, the authors obtained the file for optimization of Monte Carlo simulations using the EGSnrc; organization and compaction of dosimetry results for all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity.
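    The "grayscale intensity in regions of interest" step described in this record amounts to averaging pixel values under a boolean mask. A minimal sketch with a hypothetical 4x4 image (the function name and the toy data are mine, not DoRadIo's API):

```python
import numpy as np

def roi_mean_intensity(img, mask):
    """Mean grayscale intensity inside a boolean region-of-interest mask."""
    return float(img[mask].mean())

# hypothetical 4x4 planar image with a uniform bright 2x2 'source organ'
img = np.zeros((4, 4))
img[1:3, 1:3] = 200.0
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
```

    In a real workflow the mask would be drawn over the nuclear medicine image and the mean intensity fed into the source-weighting file for the Monte Carlo run.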

  20. Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McMillan, Michelle L.; Mackie, Scott A.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James L.

    2011-01-01

    Fail-safe, hybrid flow control (HFC) is a promising technology for meeting high-speed cruise efficiency, low-noise signature, and reduced fuel-burn goals for future Hybrid-Wing-Body (HWB) aircraft with embedded engines. This report details the development of HFC technology that enables improved inlet performance in HWB vehicles with highly integrated inlets and embedded engines without adversely affecting vehicle performance. In addition, new test techniques for evaluating Boundary-Layer-Ingesting (BLI)-inlet flow-control technologies developed and demonstrated through this program are documented, including the ability to generate a BLI-like inlet-entrance flow in a direct-connect wind-tunnel facility, as well as the use of D-optimal, statistically designed experiments to optimize test efficiency and enable interpretation of results. Validated improvements in numerical analysis tools and methods accomplished through this program are also documented, including Reynolds-Averaged Navier-Stokes CFD simulations of steady-state flow physics for baseline BLI-inlet diffuser flow, as well as that created by flow-control devices. Finally, numerical methods were employed in a ground-breaking attempt to directly simulate dynamic distortion. The advances in inlet technologies and prediction tools will help to meet and exceed "N+2" project goals for future HWB aircraft.

  1. Calibration of a γ- Re θ transition model and its application in low-speed flows

    NASA Astrophysics Data System (ADS)

    Wang, YunTao; Zhang, YuLun; Meng, DeHong; Wang, GunXue; Li, Song

    2014-12-01

    The prediction of laminar-turbulent transition in the boundary layer is very important for obtaining accurate aerodynamic characteristics with computational fluid dynamics (CFD) tools, because laminar-turbulent transition is directly related to complex flow phenomena in the boundary layer and separated flow in space. Unfortunately, the transition effect is not included in today's major CFD tools because of the non-local calculations involved in transition modeling. In this paper, Menter's γ-Re θ transition model is calibrated and incorporated into a Reynolds-Averaged Navier-Stokes (RANS) code, the Trisonic Platform (TRIP), developed at the China Aerodynamics Research and Development Center (CARDC). Based on experimental flat-plate data from the literature, the empirical correlations involved in the transition model are modified and calibrated numerically. Numerical simulation of the low-speed flow over the Trapezoidal Wing (Trap Wing) is performed and compared with the corresponding experimental data. It is indicated that the γ-Re θ transition model can accurately predict the location of separation-induced transition and natural transition in flow regions with moderate pressure gradient. The transition model effectively improves the simulation accuracy of the boundary layer and the aerodynamic characteristics.

  2. Numerical models for fluid-grains interactions: opportunities and limitations

    NASA Astrophysics Data System (ADS)

    Esteghamatian, Amir; Rahmani, Mona; Wachs, Anthony

    2017-06-01

    In the framework of a multi-scale approach, we develop numerical models for suspension flows. At the micro-scale level, we perform particle-resolved numerical simulations using a Distributed Lagrange Multiplier/Fictitious Domain approach. At the meso-scale level, we use a two-way Euler/Lagrange approach with a Gaussian filtering kernel to model fluid-solid momentum transfer. At both the micro and meso scale levels, particles are individually tracked in a Lagrangian way and all inter-particle collisions are computed by a Discrete Element/Soft-sphere method. These numerical models have been extended to handle particles of arbitrary shape (non-spherical, angular and even non-convex) as well as to treat heat and mass transfer. All simulation tools are fully MPI-parallel with standard domain decomposition and run on supercomputers with satisfactory scalability on up to a few thousand cores. The main asset of multi-scale analysis is the ability to extend our comprehension of the dynamics of suspension flows based on the knowledge acquired from the high-fidelity micro-scale simulations and to use that knowledge to improve the meso-scale model. We illustrate how we can benefit from this strategy for a fluidized bed, where we introduce a stochastic drag force model derived from micro-scale simulations to recover the proper level of particle fluctuations. Conversely, we discuss the limitations of such modelling tools, such as their limited ability to capture lubrication forces and boundary layers in highly inertial flows. We suggest ways to overcome these limitations in order to enhance further the capabilities of the numerical models.
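    The Discrete Element/Soft-sphere collision treatment mentioned in this record is commonly built on a linear spring-dashpot normal force. A generic sketch of that model (parameter values and the function name are illustrative, not the authors' exact formulation):

```python
import numpy as np

def normal_contact_force(xi, xj, vi, vj, ri, rj, kn=1.0e4, eta=10.0):
    """Linear spring-dashpot normal force exerted on particle i by particle j.

    kn  : normal spring stiffness (repels proportionally to overlap)
    eta : normal damping coefficient (dissipates relative normal motion)
    """
    rij = xi - xj
    dist = np.linalg.norm(rij)
    overlap = (ri + rj) - dist
    if overlap <= 0.0:
        return np.zeros_like(rij)            # spheres not in contact
    nij = rij / dist                         # unit normal pointing j -> i
    vn = np.dot(vi - vj, nij)                # relative normal velocity
    return (kn * overlap - eta * vn) * nij   # elastic repulsion + damping
```

    The soft-sphere approach tolerates small overlaps by design, which is what lets many simultaneous contacts be integrated with an ordinary explicit time stepper.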

  3. Dynamic load synthesis for shock numerical simulation in space structure design

    NASA Astrophysics Data System (ADS)

    Monti, Riccardo; Gasbarri, Paolo

    2017-08-01

    Pyroshock loads are, from a mechanical point of view, the most severe environments that a piece of space equipment experiences during its operating life. In general, the mechanical designer considers the pyroshock analysis a very demanding constraint. Unfortunately, due to the non-linear behaviour of the structure under such loads, only experimental tests can demonstrate whether it is able to withstand these dynamic loads. Taking the previous considerations into account, some preliminary information about the correctness of the design can be obtained by performing "ad hoc" numerical simulations, for example via commercial finite element software (i.e. MSC Nastran). Usually these numerical tools approach the shock solution in two ways: 1) a direct mode, by using a time-dependent enforcement and by evaluating the time-response and space-response as well as the internal forces; 2) a modal basis approach, by considering a frequency-dependent load and evaluating internal forces in the frequency domain. The main aim of this paper is to develop a numerical tool to synthesize the time-dependent enforcement based on deterministic and/or genetic algorithm optimisers. In particular, starting from a specified spectrum in terms of SRS (Shock Response Spectrum), a time-dependent discrete function, typically an acceleration profile, is obtained to force the equipment by simulating the shock event. The synthesis time and the interface with standard numerical codes are two of the main topics dealt with in the paper. In addition, a congruity and consistency methodology is presented to ensure that the identified time-dependent loads fully match the specified spectrum.
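    The paper synthesizes a time history from a target SRS; the forward building block any such optimiser needs is the SRS evaluation itself, i.e. the peak absolute-acceleration response of a bank of single-degree-of-freedom oscillators to the base input. A hedged sketch of that standard evaluation (function names and the lsim-based formulation are mine, not the authors' algorithm):

```python
import numpy as np
from scipy.signal import lsim

def srs(acc, t, freqs, Q=10.0):
    """Absolute-acceleration shock response spectrum of a base input acc(t).

    For each natural frequency fn, the SDOF base-excitation transfer
    function H(s) = (2*zeta*wn*s + wn**2) / (s**2 + 2*zeta*wn*s + wn**2)
    maps base acceleration to absolute response acceleration.
    """
    zeta = 1.0 / (2.0 * Q)
    peaks = []
    for fn in freqs:
        wn = 2.0 * np.pi * fn
        system = ([2.0 * zeta * wn, wn**2], [1.0, 2.0 * zeta * wn, wn**2])
        _, y, _ = lsim(system, acc, t)       # simulate the oscillator
        peaks.append(np.max(np.abs(y)))      # maximax absolute acceleration
    return np.array(peaks)
```

    For a short half-sine pulse, soft oscillators respond weakly while oscillators well above the pulse bandwidth track the input peak, which is the qualitative shape a synthesized profile must reproduce against the specified spectrum.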

  4. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    ERIC Educational Resources Information Center

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  5. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  6. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2008-01-01

    An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented into the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high fidelity measurements including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High fidelity numerical results are also obtained from the numerical model and include measured parameters, such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  7. Developing a Social Autopsy Tool for Dengue Mortality: A Pilot Study

    PubMed Central

    Arauz, María José; Ridde, Valéry; Hernández, Libia Milena; Charris, Yaneth; Carabali, Mabel; Villar, Luis Ángel

    2015-01-01

    Background Dengue fever is a public health problem in the tropical and sub-tropical world. Dengue cases have grown dramatically in recent years as well as dengue mortality. Colombia has experienced periodic dengue outbreaks with numerous dengue related-deaths, where the Santander department has been particularly affected. Although social determinants of health (SDH) shape health outcomes, including mortality, it is not yet understood how these affect dengue mortality. The aim of this pilot study was to develop and pre-test a social autopsy (SA) tool for dengue mortality. Methods and Findings The tool was developed and pre-tested in three steps. First, dengue fatal cases and ‘near misses’ (those who recovered from dengue complications) definitions were elaborated. Second, a conceptual framework on determinants of dengue mortality was developed to guide the construction of the tool. Lastly, the tool was designed and pre-tested among three relatives of fatal cases and six near misses in 2013 in the metropolitan zone of Bucaramanga. The tool turned out to be practical in the context of dengue mortality in Colombia after some modifications. The tool aims to study the social, individual, and health systems determinants of dengue mortality. The tool is focused on studying the socioeconomic position and the intermediary SDH rather than the socioeconomic and political context. Conclusions The SA tool is based on the scientific literature, a validated conceptual framework, researchers’ and health professionals’ expertise, and a pilot study. It is the first time that a SA tool has been created for the dengue mortality context. Our work furthers the study on SDH and how these are applied to neglected tropical diseases, like dengue. This tool could be integrated in surveillance systems to provide complementary information on the modifiable and avoidable death-related factors and therefore, be able to formulate interventions for dengue mortality reduction. PMID:25658485

  8. Mass and energy flows between the Solar chromosphere, transition region, and corona

    NASA Astrophysics Data System (ADS)

    Hansteen, V. H.

    2017-12-01

    A number of increasingly sophisticated numerical simulations spanning the convection zone to corona have shed considerable insight into the role of the magnetic field in the structure and energetics of the Sun's outer atmosphere. This development is strengthened by the wealth of observational data now coming on-line from both ground based and space borne observatories. We discuss what numerical models can tell us about the mass and energy flows in the region of the upper chromosphere and lower corona, using a variety of tools, including the direct comparison with data and the use of passive tracer particles (so-called 'corks') inserted into the simulated flows.
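    The passive tracer "corks" mentioned in this record are advected by integrating the local simulated velocity along particle paths. A minimal sketch of that idea with a midpoint (RK2) step (the function name and field interface are illustrative assumptions, not taken from any particular solar-atmosphere code):

```python
import numpy as np

def advect_corks(pos, velocity, dt, steps):
    """Advect passive tracers ('corks') with midpoint (RK2) time steps.

    pos      : (N, d) array of tracer positions
    velocity : callable mapping an (N, d) position array to (N, d) velocities
    """
    for _ in range(steps):
        k1 = velocity(pos)                   # velocity at current positions
        k2 = velocity(pos + 0.5 * dt * k1)   # velocity at midpoint
        pos = pos + dt * k2
    return pos
```

    Because corks carry no dynamics of their own, their trajectories directly visualize the mass flows between the chromosphere, transition region, and corona in the simulation.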

  9. Direct modeling for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Xu, Kun

    2015-06-01

    All fluid dynamic equations are valid under their modeling scales, such as the particle mean free path and mean collision time scale of the Boltzmann equation and the hydrodynamic scale of the Navier-Stokes (NS) equations. The current computational fluid dynamics (CFD) focuses on the numerical solution of partial differential equations (PDEs), and its aim is to get the accurate solution of these governing equations. Under such a CFD practice, it is hard to develop a unified scheme that covers flow physics from kinetic to hydrodynamic scales continuously because there is no such governing equation which could make a smooth transition from the Boltzmann to the NS modeling. The study of fluid dynamics needs to go beyond the traditional numerical partial differential equations. The emerging engineering applications, such as air-vehicle design for near-space flight and flow and heat transfer in micro-devices, do require further expansion of the concept of gas dynamics to a larger domain of physical reality, rather than the traditional distinguishable governing equations. At the current stage, the non-equilibrium flow physics has not yet been well explored or clearly understood due to the lack of appropriate tools. Unfortunately, under the current numerical PDE approach, it is hard to develop such a meaningful tool due to the absence of valid PDEs. In order to construct multiscale and multiphysics simulation methods similar to the modeling process of constructing the Boltzmann or the NS governing equations, the development of a numerical algorithm should be based on the first principle of physical modeling. In this paper, instead of following the traditional numerical PDE path, we introduce direct modeling as a principle for CFD algorithm development. Since all computations are conducted in a discretized space with limited cell resolution, the flow physics to be modeled has to be done in the mesh size and time step scales. Here, the CFD is more or less a direct construction of discrete numerical evolution equations, where the mesh size and time step will play dynamic roles in the modeling process. With the variation of the ratio between mesh size and local particle mean free path, the scheme will capture flow physics from the kinetic particle transport and collision to the hydrodynamic wave propagation. Based on the direct modeling, a continuous dynamics of flow motion will be captured in the unified gas-kinetic scheme. This scheme can be faithfully used to study the unexplored non-equilibrium flow physics in the transition regime.

  10. Understanding online health information: Evaluation, tools, and strategies.

    PubMed

    Beaunoyer, Elisabeth; Arsenault, Marianne; Lomanowska, Anna M; Guitton, Matthieu J

    2017-02-01

    Considering the status of the Internet as a prominent source of health information, assessing online health material has become a central issue in patient education. We describe the strategies available to evaluate the characteristics of online health information, including readability, emotional content, understandability, and usability. Popular tools used in the assessment of readability, emotional content, and comprehensibility of online health information were reviewed. Tools designed to evaluate both printed and online material were considered. Readability tools are widely used in online health material evaluation and are highly covariant. Assessment of the emotional content of online health-related communications via sentiment analysis tools is becoming more popular. Understandability and usability tools have been developed specifically for health-related material, but each tool has important limitations and has been tested on a limited number of health issues. Despite the availability of numerous assessment tools, their overall reliability differs between readability (high) and understandability (low). Approaches combining multiple assessment tools and involving both quantitative and qualitative observations would optimize assessment strategies. Effective assessment of online health information should rely on mixed strategies combining quantitative and qualitative evaluations. Assessment tools should be selected according to their functional properties and compatibility with the target material. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
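    The readability tools this record calls "highly covariant" are mostly formulae over sentence, word, and syllable counts. A hedged sketch of one of them, the Flesch Reading Ease score, with a deliberately crude vowel-group syllable heuristic (real tools use dictionaries or better heuristics, so absolute scores from this sketch are only indicative):

```python
import re

def flesch_reading_ease(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentence)
    - 84.6*(syllables/word), with a crude syllable heuristic."""
    sentences = max(1, len(re.findall(r'[.!?]+', text)))
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        groups = re.findall(r'[aeiouy]+', word.lower())
        count = len(groups)
        if word.lower().endswith('e') and count > 1:
            count -= 1                       # drop a typical silent final 'e'
        return max(1, count)

    total = sum(syllables(w) for w in words)
    n_words = max(1, len(words))
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (total / n_words)
```

    Higher scores mean easier text; patient-education material is often targeted at scores corresponding to roughly a sixth-grade reading level.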

  11. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  12. Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry: Spray Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.

    2009-04-26

The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.

  13. Development of a Comprehensive and Interactive Tool to Inform State Violence and Injury Prevention Plans.

    PubMed

    Wilson, Lauren; Deokar, Angela J; Zaesim, Araya; Thomas, Karen; Kresnow-Sedacca, Marcie-Jo

The Centers for Disease Control and Prevention's Core State Violence and Injury Prevention Program (Core SVIPP) provides an opportunity for states to engage with their partners to implement, evaluate, and disseminate strategies that lead to the reduction and prevention of injury and violence. Core SVIPP requires awardees to develop or update their state injury and violence plans. Currently, literature informing state planning efforts is limited, especially regarding materials related to injury and violence. Presumably, plans that are higher quality result in having a greater impact on preventing injury and violence, and literature to improve quality would benefit prevention programming. (1) To create a comprehensive injury-specific index to aid in the development and revision of state injury and violence prevention plans, and (2) to assess the reliability and utility of this index. Through an iterative development process, a workgroup of subject matter experts created the Violence and Injury Prevention: Comprehensive Index Tool (VIP:CIT). The tool was pilot tested on 3 state injury and violence prevention plans and assessed for initial usability. Following revisions to the tool (ie, a rubric was developed to further delineate consistent criteria for rating; items were added and clarified), the same state plans were reassessed to test interrater reliability and tool utility. For the second assessment, reliability of the VIP:CIT improved, indicating that the rubric was a useful addition. Qualitative feedback from states suggested that the tool significantly helped guide plan development and communicate about planning processes. The final VIP:CIT is a tool that can help increase plan quality, decrease the research-to-practice gap, and increase connectivity to emerging public health paradigms.
The tool provides an example of tailoring guidance materials to reflect academic literature, and it can be easily adapted to other topic areas to promote quality of strategic plans for numerous outcomes.

  14. Development of a Comprehensive and Interactive Tool to Inform State Violence and Injury Prevention Plans

    PubMed Central

    Wilson, Lauren; Deokar, Angela J.; Zaesim, Araya; Thomas, Karen; Kresnow-Sedacca, Marcie-jo

    2018-01-01

Context The Centers for Disease Control and Prevention's Core State Violence and Injury Prevention Program (Core SVIPP) provides an opportunity for states to engage with their partners to implement, evaluate, and disseminate strategies that lead to the reduction and prevention of injury and violence. Core SVIPP requires awardees to develop or update their state injury and violence plans. Currently, literature informing state planning efforts is limited, especially regarding materials related to injury and violence. Presumably, plans that are higher quality result in having a greater impact on preventing injury and violence, and literature to improve quality would benefit prevention programming. Objective (1) To create a comprehensive injury-specific index to aid in the development and revision of state injury and violence prevention plans, and (2) to assess the reliability and utility of this index. Design Through an iterative development process, a workgroup of subject matter experts created the Violence and Injury Prevention: Comprehensive Index Tool (VIP:CIT). The tool was pilot tested on 3 state injury and violence prevention plans and assessed for initial usability. Following revisions to the tool (ie, a rubric was developed to further delineate consistent criteria for rating; items were added and clarified), the same state plans were reassessed to test interrater reliability and tool utility. Results For the second assessment, reliability of the VIP:CIT improved, indicating that the rubric was a useful addition. Qualitative feedback from states suggested that the tool significantly helped guide plan development and communicate about planning processes. Conclusion The final VIP:CIT is a tool that can help increase plan quality, decrease the research-to-practice gap, and increase connectivity to emerging public health paradigms.
The tool provides an example of tailoring guidance materials to reflect academic literature, and it can be easily adapted to other topic areas to promote quality of strategic plans for numerous outcomes. PMID:29189505

  15. A Computational Framework for Efficient Low Temperature Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, and metamaterials. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, based mainly on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with numerical results assessing the accuracy and efficiency of benchmark problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.
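To give a flavor of the fluid (drift-diffusion) description commonly used for LTPs at moderate-to-atmospheric pressure, here is a minimal 1-D sketch for an electron density: explicit upwind differencing for the drift term and central differencing for diffusion, with periodic boundaries. All parameters are illustrative toy values, and this is not the framework's actual solver.

```python
import numpy as np

def drift_diffusion_step(n, E, mu, D, dx, dt):
    """One explicit step of dn/dt = -d(mu*E*n)/dx + D*d2n/dx2 (periodic BCs).

    Upwind differencing for the drift term (flux at cell face i+1/2),
    central differencing for diffusion. Conservative by construction."""
    v = mu * E                                         # drift velocity
    flux = np.where(v > 0, v * n, v * np.roll(n, -1))  # upwind face flux
    div = (flux - np.roll(flux, 1)) / dx
    lap = (np.roll(n, -1) - 2 * n + np.roll(n, 1)) / dx**2
    return n + dt * (-div + D * lap)

# Illustrative (non-physical) parameters: a Gaussian electron pulse drifting
# in a uniform field while diffusing; time step satisfies CFL limits.
nx, dx, dt = 200, 1e-4, 1e-9
x = np.arange(nx) * dx
n = np.exp(-((x - 0.01) / 2e-3) ** 2)
for _ in range(500):
    n = drift_diffusion_step(n, E=1e4, mu=0.03, D=0.1, dx=dx, dt=dt)
print("density conserved:", np.isclose(n.sum(), np.exp(-((x - 0.01) / 2e-3) ** 2).sum()))
```

At low pressures, where the mean free path is no longer small, this fluid picture breaks down and kinetic (particle) methods take over, which is exactly the multiscale split such frameworks must manage.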

  16. Designs and numerical calculations for echo-enabled harmonic generation at very high harmonics

    NASA Astrophysics Data System (ADS)

    Penn, G.; Reinsch, M.

    2011-09-01

    The echo-enabled harmonic generation (EEHG) scheme for driving an FEL using two seeded energy modulations at much longer wavelengths than the output wavelength is a promising concept for future seeded FELs. There are many competing requirements in the design of an EEHG beamline which need careful optimization. Furthermore, revised simulation tools and methods are necessary because of both the high harmonic numbers simulated and the complicated nature of the phase space manipulations which are intrinsic to the scheme. This paper explores the constraints on performance and the required tolerances for reaching wavelengths well below 1/100th of that of the seed lasers, and describes some of the methodology for designing such a beamline. Numerical tools, developed both for the GENESIS and GINGER FEL codes, are presented and used here for more accurate study of the scheme beyond a time-averaged model. In particular, the impact of the local structure in peak current and bunching, which is an inherent part of the EEHG scheme, is evaluated.
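The phase-space manipulation at the heart of EEHG can be reproduced with the standard dimensionless two-modulator, two-chicane map, evaluated here by direct quadrature over the beam distribution rather than analytic Bessel-function formulas. For simplicity this sketch assumes both modulators share one seed wavenumber and a unit relative energy spread; the modulation amplitudes A and chicane strengths B are illustrative, not a designed working point. Scanning the chicane strengths recovers an island of strong bunching at a high harmonic:

```python
import numpy as np

def eehg_bunching(h, A1, A2, B1, B2, n_theta=128, n_p=81):
    """|<exp(i*h*theta_final)>| for the dimensionless EEHG map:
      p1 = p + A1*sin(theta);    theta1 = theta + B1*p1
      p2 = p1 + A2*sin(theta1);  theta2 = theta1 + B2*p2
    averaged over theta uniform on [0, 2*pi) and p ~ N(0, 1) by quadrature
    (both modulators assumed to share one seed wavenumber)."""
    theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)[:, None]
    p = np.linspace(-5.0, 5.0, n_p)[None, :]
    w = np.exp(-0.5 * p**2)                  # Gaussian energy weights
    p1 = p + A1 * np.sin(theta)
    t1 = theta + B1 * p1
    p2 = p1 + A2 * np.sin(t1)
    t2 = t1 + B2 * p2
    return abs(np.sum(w * np.exp(1j * h * t2)) / (n_theta * w.sum()))

# Coarse scan of the two chicane strengths for the 5th harmonic.
results = []
for b1 in np.linspace(0.5, 4.0, 36):
    for b2 in np.linspace(0.1, 1.0, 19):
        results.append((eehg_bunching(5, 3.0, 3.0, b1, b2), b1, b2))
b_max, b1_opt, b2_opt = max(results)
print(f"peak |b_5| = {b_max:.3f} at B1 = {b1_opt:.2f}, B2 = {b2_opt:.2f}")
```

Because the average is taken directly over the mapped phase space, this kind of calculation also captures the local current and bunching structure that time-averaged models miss, which is the point the abstract makes about the GENESIS and GINGER extensions.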

  17. Detailed requirements document for the integrated structural analysis system, phase B

    NASA Technical Reports Server (NTRS)

    Rainey, J. A.

    1976-01-01

The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool for performing a complete and detailed analysis of a complex structural system. This software system will allow for automated interfaces with numerous structural analysis batch programs and for user interaction in the creation, selection, and validation of data. The system will include modifications to the 4 functions developed for ISAS and the development of 25 new functions. The new functions are described.

  18. ESDAPT - APT PROGRAMMING EDITOR AND INTERPRETER

    NASA Technical Reports Server (NTRS)

    Premack, T.

    1994-01-01

ESDAPT is a graphical programming environment for developing APT (Automatically Programmed Tool) programs for controlling numerically controlled machine tools. ESDAPT has a graphical user interface that provides the user with an APT syntax-sensitive text editor and windows for displaying geometry and tool paths. APT geometry statements can also be created using menus and screen picks. ESDAPT interprets APT geometry statements and displays the results in its view windows. Tool paths are generated by batching the APT source to an APT processor (COSMIC P-APT recommended). The tool paths are then displayed in the view windows. Hardcopy output of the view windows is in color PostScript format. ESDAPT is written in C, yacc, lex, and XView for use on Sun4 series computers running SunOS. ESDAPT requires 4Mb of disk space, 7Mb of RAM, and MIT's X Window System, Version 11 Release 4, or OpenWindows version 3 for execution. Program documentation in PostScript format and an executable for OpenWindows version 3 are provided on the distribution media. The standard distribution medium for ESDAPT is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. This program was developed in 1992.

  19. An Evaluation Tool for CONUS-Scale Estimates of Components of the Water Balance

    NASA Astrophysics Data System (ADS)

    Saxe, S.; Hay, L.; Farmer, W. H.; Markstrom, S. L.; Kiang, J. E.

    2016-12-01

Numerous research groups are independently developing data products to represent various components of the water balance (e.g., runoff, evapotranspiration, recharge, snow water equivalent, soil moisture, and climate) at the scale of the conterminous United States. These data products are derived from a range of sources, including direct measurement, remotely sensed measurement, and statistical and deterministic model simulations. An evaluation tool is needed to compare these data products and the components of the water balance they contain, in order to identify gaps in the understanding and representation of continental-scale hydrologic processes. An ideal tool would be an objective, universally agreed-upon framework to address questions related to closing the water balance. This type of generic, model-agnostic evaluation tool would facilitate collaboration amongst different hydrologic research groups and improve modeling capabilities with respect to continental-scale water resources. By adopting a comprehensive framework to consider hydrologic modeling in the context of a complete water balance, it is possible to identify weaknesses in process modeling, data product representation, and regional hydrologic variation. As part of its National Water Census initiative, the U.S. Geological Survey is facilitating this dialogue to develop prototype evaluation tools.

  20. Successful fabrication of a convex platform PMMA cell-counting slide using a high-precision perpendicular dual-spindle CNC machine tool

    NASA Astrophysics Data System (ADS)

    Chen, Shun-Tong; Chang, Chih-Hsien

    2013-12-01

    This study presents a novel approach to the fabrication of a biomedical-mold for producing convex platform PMMA (poly-methyl-meth-acrylate) slides for counting cells. These slides allow for the microscopic examination of urine sediment cells. Manufacturing of such slides incorporates three important procedures: (1) the development of a tabletop high-precision dual-spindle CNC (computerized numerical control) machine tool; (2) the formation of a boron-doped polycrystalline composite diamond (BD-PCD) wheel-tool on the machine tool developed in procedure (1); and (3) the cutting of a multi-groove-biomedical-mold array using the formed diamond wheel-tool in situ on the developed machine. The machine incorporates a hybrid working platform providing wheel-tool thinning using spark erosion to cut, polish, and deburr microgrooves on NAK80 steel directly. With consideration given for the electrical conductive properties of BD-PCD, the diamond wheel-tool is thinned to a thickness of 5 µm by rotary wire electrical discharge machining. The thinned wheel-tool can grind microgrooves 10 µm wide. An embedded design, which inserts a close fitting precision core into the biomedical-mold to create step-difference (concave inward) of 50 µm in height between the core and the mold, is also proposed and realized. The perpendicular dual-spindles and precision rotary stage are features that allow for biomedical-mold machining without the necessity of uploading and repositioning materials until all tasks are completed. A PMMA biomedical-slide with a plurality of juxtaposed counting chambers is formed and its usefulness verified.

  1. Two-Dimensional Model for Reactive-Sorption Columns of Cylindrical Geometry: Analytical Solutions and Moment Analysis.

    PubMed

    Khan, Farman U; Qamar, Shamsul

    2017-05-01

A set of analytical solutions is presented for a model describing the transport of a solute in a fixed-bed reactor of cylindrical geometry subjected to the first (Dirichlet) and third (Danckwerts) type inlet boundary conditions. A linear sorption kinetic process and first-order decay are considered. Cylindrical geometry allows the use of large columns to investigate dispersion, adsorption/desorption, and reaction kinetic mechanisms. The finite Hankel and Laplace transform techniques are adopted to solve the model equations. For further analysis, statistical temporal moments are derived from the Laplace-transformed solutions. The developed analytical solutions are compared with the numerical solutions of a high-resolution finite-volume scheme. Different case studies are presented and discussed for a series of numerical values corresponding to a wide range of mass transfer and reaction kinetics. Good agreement was observed between the analytical and numerical concentration profiles and moments. The developed solutions are efficient tools for analyzing numerical algorithms, performing sensitivity analysis, and simultaneously determining the longitudinal and transverse dispersion coefficients from a laboratory-scale radial column experiment. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
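Temporal moments of the kind used above can also be estimated directly from a simulated or measured breakthrough curve by numerical integration. The sketch below checks such an estimator against a Gaussian pulse whose moments are known exactly; the curve is an illustration, not the paper's actual solution.

```python
import numpy as np

def _trapz(y, t):
    """Trapezoidal rule (avoids NumPy's trapz/trapezoid naming change)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)

def temporal_moments(t, c):
    """Zeroth moment, mean residence time, and variance of arrival times
    of a concentration breakthrough curve c(t)."""
    m0 = _trapz(c, t)
    mu1 = _trapz(t * c, t) / m0              # first normalized moment
    var = _trapz((t - mu1) ** 2 * c, t) / m0  # second central moment
    return m0, mu1, var

# Sanity check: a Gaussian pulse centred at t = 40 with spread sigma = 5,
# the far-field shape produced by a linear advection-dispersion model.
t = np.linspace(0.0, 100.0, 2001)
c = np.exp(-0.5 * ((t - 40.0) / 5.0) ** 2)
m0, mu1, var = temporal_moments(t, c)
print(f"mean arrival = {mu1:.2f}, variance = {var:.2f}")
```

In practice, matching moments like these between an analytic solution and a column experiment is what allows dispersion coefficients to be extracted without fitting the full concentration profile.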

  2. Simulation Tool for Dielectric Barrier Discharge Plasma Actuators at Atmospheric and Sub-Atmospheric Pressures: SBIR Phase I Final Report

    NASA Technical Reports Server (NTRS)

    Likhanskii, Alexandre

    2012-01-01

This report is the final report of an SBIR Phase I project. It is identical to the final report submitted, after some proprietary information of an administrative nature has been removed. The development of a numerical simulation tool for dielectric barrier discharge (DBD) plasma actuators is reported. The objectives of the project were to analyze and predict DBD operation over a wide range of ambient gas pressures. The tool overcomes the limitations of traditional DBD codes, which are limited to low-speed applications and have weak prediction capabilities. The software tool allows DBD actuator analysis and prediction for subsonic to hypersonic flow regimes. The simulation tool is based on the VORPAL code developed by Tech-X Corporation. VORPAL's capability of modeling DBD plasma actuators at low pressures (0.1 to 10 torr) using a kinetic plasma modeling approach, and at moderate to atmospheric pressures (1 to 10 atm) using a hydrodynamic plasma modeling approach, was demonstrated. In addition, results of experiments with a pulsed+bias DBD configuration that were performed for validation purposes are reported.

  3. A group decision-making tool for the application of membrane technologies in different water reuse scenarios.

    PubMed

    Sadr, S M K; Saroj, D P; Kouchaki, S; Ilemobade, A A; Ouki, S K

    2015-06-01

A global challenge of increasing concern is diminishing fresh water resources. A growing practice in many communities to supplement diminishing fresh water availability has been the reuse of water. Novel methods of treating polluted waters, such as membrane-assisted technologies, have recently been developed and successfully implemented in many places. Given the diversity of membrane-assisted technologies available, the current challenge is how to select a reliable alternative among numerous technologies for appropriate water reuse. In this research, a fuzzy-logic-based multi-criteria group decision-making tool has been developed. This tool has been employed in the selection of appropriate membrane treatment technologies for several non-potable and potable reuse scenarios. Robust criteria, covering technical, environmental, economic, and socio-cultural aspects, were selected, while 10 different membrane-assisted technologies were assessed in the tool. The results show this approach to be capable of facilitating systematic and rigorous analysis in the comparison and selection of membrane-assisted technologies for advanced wastewater treatment and reuse. Copyright © 2015 Elsevier Ltd. All rights reserved.
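A minimal sketch of the fuzzy group decision-making idea: experts rate alternatives against criteria with triangular fuzzy numbers, ratings are averaged across the group, each fuzzy rating is defuzzified by its centroid, and a weighted sum ranks the alternatives. The technology names, criteria, weights, and ratings below are hypothetical, not those of the study.

```python
import numpy as np

# Triangular fuzzy ratings (low, mode, high) on a 0-10 scale from three
# hypothetical experts, for two candidate technologies against three
# criteria (cost, treatment performance, social acceptance).
ratings = {
    "MBR":   [[(6, 7, 8), (7, 8, 9), (5, 6, 8)],    # expert 1
              [(5, 6, 7), (8, 9, 10), (6, 7, 8)],   # expert 2
              [(6, 7, 9), (7, 8, 9), (5, 7, 8)]],   # expert 3
    "RO+UF": [[(4, 5, 6), (8, 9, 10), (4, 5, 6)],
              [(3, 4, 6), (9, 9, 10), (4, 5, 7)],
              [(4, 5, 6), (8, 9, 10), (3, 4, 6)]],
}
weights = np.array([0.4, 0.4, 0.2])   # criterion weights (sum to 1)

def score(expert_ratings, weights):
    """Average each criterion's fuzzy rating across experts, defuzzify each
    triangular number by its centroid (l + m + u) / 3, then weight and sum."""
    arr = np.array(expert_ratings, dtype=float)   # (experts, criteria, 3)
    group = arr.mean(axis=0)                      # group fuzzy rating
    crisp = group.mean(axis=1)                    # centroid defuzzification
    return float(weights @ crisp)

for name in sorted(ratings, key=lambda k: -score(ratings[k], weights)):
    print(f"{name}: {score(ratings[name], weights):.2f}")
```

The fuzzy representation lets experts express uncertainty ("about 7, somewhere between 6 and 8") instead of a single number, which is the main appeal of such tools for group decisions.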

  4. Analyzing asteroid reflectance spectra with numerical tools based on scattering simulations

    NASA Astrophysics Data System (ADS)

    Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri

    2017-04-01

We are developing a set of numerical tools that can be used in analyzing the reflectance spectra of granular materials such as the regolith surface of atmosphereless Solar system objects. Our goal is to be able to explain, with realistic numerical scattering models, the spectral features arising when materials are intimately mixed together. We include space-weathering-type effects in our simulations, i.e., locally mixing the host mineral with small inclusions of another material in small proportions. Our motivation for this study comes from the present lack of such tools. The current common practice is to apply a semi-physical approximate model such as some variation of the Hapke models [e.g., 1] or the Shkuratov model [2]. These models are expressed in closed form, so they are relatively fast to apply. They are based on simplifications of radiative transfer theory. The problem is that the validity of such a model is not always guaranteed, and the derived physical properties related to particle scattering can be unrealistic [3]. We base our numerical tool on a chain of scattering simulations. Scattering properties of small inclusions inside an absorbing host matrix can be derived using exact methods solving the Maxwell equations of the system. The next step, scattering by a single regolith grain, is solved using a geometric optics method accounting for surface reflections, internal absorption, and possibly internal diffuse scattering. The third step involves radiative transfer simulations of these regolith grains in a macroscopic planar element. The chain can then be continued with a shadowing simulation over the target surface elements, and finally by integrating the bidirectional reflectance distribution function over the object's shape. Most of the tools in the proposed chain already exist, and one practical task for us is to tie them together into an easy-to-use toolchain that can be publicly distributed.
We plan to release the abovementioned toolchain as a web-based open service. Acknowledgments: The research is funded by ERC Advanced Grant No. 320773 (SAEMPL). References: [1] B. Hapke, Icarus 195, 918-926, 2008. [2] Yu. Shkuratov et al., Icarus 137, 235-246, 1999. [3] Yu. Shkuratov et al., JQSRT 113, 2431-2456, 2012. [4] K. Muinonen et al., JQSRT 110, 1628-1639, 2009.

  5. È VIVO: Virtual eruptions at Vesuvius; A multimedia tool to illustrate numerical modeling to a general public

    NASA Astrophysics Data System (ADS)

    Todesco, Micol; Neri, Augusto; Demaria, Cristina; Marmo, Costantino; Macedonio, Giovanni

    2006-07-01

Dissemination of scientific results to the general public has become increasingly important in our society. When science deals with natural hazards, public outreach is even more important: on the one hand, it contributes to hazard perception and is a necessary step toward preparedness and risk mitigation; on the other hand, it helps establish a positive link of mutual confidence between the scientific community and the population living at risk. The existence of such a link plays a relevant role in hazard communication, which in turn is essential to mitigate the risk. In this work, we present a tool that we have developed to illustrate our scientific results on pyroclastic flow propagation at Vesuvius. This tool, a CD-ROM developed by joining scientific data with appropriate knowledge in communication sciences, is meant to be a first prototype that will be used to test the validity of this approach to public outreach. The multimedia guide contains figures, images of real volcanoes, and computer animations obtained through numerical modeling of pyroclastic density currents. Explanatory text, kept as short and simple as possible, illustrates both the process and the methodology applied to study this very dangerous natural phenomenon. In this first version, the CD-ROM will be distributed among selected categories of end-users together with a short questionnaire that we have drawn up to test its readability. Future releases will include feedback from the users, further advancement of scientific results, as well as a higher degree of interactivity.

  6. The 360 photography: a new anatomical insight of the sphenoid bone. Interest for anatomy teaching and skull base surgery.

    PubMed

    Jacquesson, Timothée; Mertens, Patrick; Berhouma, Moncef; Jouanneau, Emmanuel; Simon, Emile

    2017-01-01

Skull base architecture is difficult to understand because of its complex 3D shape and its numerous foramina, reliefs, and joints. This is especially true for the sphenoid bone, whose central location, hinged with most skull base components, is unique. Recently, technological progress has led to the development of new pedagogical tools. In this way, we obtained a new real-time three-dimensional view of the sphenoid bone that could be useful for the teacher, the student, and the surgeon. High-definition photographs were taken all around an isolated dry skull base bone prepared with Beauchêne's technique. Pictures were then computed to provide an overview with rotation and magnification on demand. From anterior, posterior, lateral, or oblique views, anatomical landmarks and subtleties were described step by step. Thus, the sella turcica, the optic canal, the superior orbital fissure, the sphenoid sinus, the vidian canal, the pterygoid plates, and all foramina were clearly placed relative to one another at each face of the sphenoid bone. In addition to being the first report of the 360 photography tool, the perspectives are promising, such as the development of a real-time interactive tridimensional model of the sphenoid bone. It allows the user to rotate the sphenoid bone and to better understand its special shape, numerous foramina, neurovascular contents, and anatomical relationships. This new technological tool may further apply to surgical planning and, above all, to strengthening the basic anatomical knowledge first introduced.

  7. A Study on Tooling and Its Effect on Heat Generation and Mechanical Properties of Welded Joints in Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Tikader, Sujoy; Biswas, Pankaj; Puri, Asit Baran

    2018-04-01

Friction stir welding (FSW) has become one of the most attractive solid-state welding processes, as it offers numerous advantages such as good mechanical and metallurgical properties. Non-weldable aluminium alloys, such as the 5XXX and 7XXX series, can be readily joined by this process. In the present study a mathematical model has been developed, and experiments were successfully performed, to evaluate the mechanical properties of FSW on similar aluminium alloys (AA1100) for different process parameters and mainly two kinds of tool geometry (straight cylindrical, and conical or tapered cylindrical, pins with a flat shoulder). Tensile strength and microhardness of the welded plate samples are reported for different process parameters. It was noticed that in FSW of similar alloys with a tool made of SS-310 tool steel, friction is the major contributor to the heat generation. It was seen that tool geometry, tool rotational speed, plunging force, and traverse speed have significant effects on the tensile strength and hardness of friction stir welded joints.
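The claim that friction dominates heat generation can be illustrated with a common Schmidt-style analytical estimate that integrates the frictional dissipation ω·r·τ over the tool contact surfaces: the flat shoulder annulus, the cylindrical pin side, and the flat pin tip. The contact shear stress and tool dimensions below are illustrative, not the study's values.

```python
import math

def fsw_heat_components(omega_rpm, tau, r_shoulder, r_pin, pin_height):
    """Frictional heat generation (W) at each tool contact surface, from a
    Schmidt-style sliding model: dQ = omega * r * tau * dA, integrated over
    the shoulder annulus, cylindrical pin side, and flat pin tip.
    tau is the contact shear stress (Pa); radii and heights in metres."""
    omega = omega_rpm * 2 * math.pi / 60.0            # rad/s
    q_shoulder = (2 / 3) * math.pi * omega * tau * (r_shoulder**3 - r_pin**3)
    q_pin_side = 2 * math.pi * omega * tau * r_pin**2 * pin_height
    q_pin_tip = (2 / 3) * math.pi * omega * tau * r_pin**3
    return q_shoulder, q_pin_side, q_pin_tip

# Illustrative numbers: 1000 rpm, 40 MPa contact shear stress,
# 9 mm shoulder radius, 3 mm pin radius, 5 mm pin height.
qs, qp, qt = fsw_heat_components(1000, 40e6, 9e-3, 3e-3, 5e-3)
total = qs + qp + qt
print(f"shoulder {qs:.0f} W, pin side {qp:.0f} W, pin tip {qt:.0f} W "
      f"(shoulder share {100 * qs / total:.0f}%)")
```

Because the shoulder terms scale with the cube of the radius, the shoulder typically supplies most of the heat, which is why shoulder geometry and plunging force matter so much in FSW parameter studies.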

  8. Development of a numerical methodology for flowforming process simulation of complex geometry tubes

    NASA Astrophysics Data System (ADS)

    Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca

    2017-10-01

Nowadays, the incremental flowforming process is being widely explored because the use of complex tubular products is increasing, driven by the light-weighting trend and the use of expensive materials. The enhanced mechanical properties of finished parts, combined with the process efficiency in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. As a promising technology, additional steps are required to extend the existing flowforming limits in the production of tubular products. The objective of the present research is to further expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with thicknesses of up to 60 mm. In this study, the analysis of the backward flowforming process of a 7075 aluminum tubular preform is carried out as a demonstration case to define the optimum process parameters, machine requirements, and tooling geometry. Numerical simulation studies on flowforming of thin-walled tubular components have been considered to increase the knowledge of the technology. The calculation of the rotational movement of the preform mesh, the high thickness-to-length ratio, and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code and including new strategies. Material characterization has also been performed through tensile tests to enable process design. Finally, to check the reliability of the model, flowforming tests have been performed in an industrial environment.

  9. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimates are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.

  10. An Object Model for a Rocket Engine Numerical Simulator

    NASA Technical Reports Server (NTRS)

    Mitra, D.; Bhalla, P. N.; Pratap, V.; Reddy, P.

    1998-01-01

The Rocket Engine Numerical Simulator (RENS) is a package of software which numerically simulates the behavior of a rocket engine. Different parameters of the components of an engine are the inputs to these programs. Depending on these given parameters, the programs output the behaviors of those components. These behavioral values are then used to guide the design of, or to diagnose, a model of a rocket engine "built" by a composition of these programs simulating the different components of the engine system. In order to use this software package effectively, one needs a flexible model of a rocket engine into which the programs simulating the different components can be plugged as a modular representation. Our project is to develop an object-based model of such an engine system. We are following an iterative and incremental approach in developing the model, as is standard practice in the area of object-oriented analysis and design of software. This process involves three stages: object modeling to represent the components and sub-components of a rocket engine, dynamic modeling to capture the temporal and behavioral aspects of the system, and functional modeling to represent the transformational aspects. This article reports on the first phase of our activity under a grant (RENS) from the NASA Lewis Research Center. We have utilized Rumbaugh's object modeling technique and the UML notation for this purpose. The classes of a rocket engine propulsion system are developed and some of them are presented in this report. The next step, developing a dynamic model for RENS, is also touched upon here. In this paper we also discuss the advantages of using object-based modeling for developing this type of integrated simulator over other tools, such as an expert system shell or a procedural language, e.g., FORTRAN. Attempts have been made in the past to use such techniques.
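A minimal sketch of what an object-based engine model of this kind can look like (the class names and the `simulate` interface are hypothetical, not the actual RENS design): component objects hold design parameters, and an engine object composes them so that operating conditions flow downstream from one component to the next.

```python
class Component:
    """Base class: a component holds design parameters and, given upstream
    operating conditions, reports its behavior (hypothetical interface)."""
    def __init__(self, name, **params):
        self.name = name
        self.params = params

    def simulate(self, conditions):
        raise NotImplementedError

class Pump(Component):
    def simulate(self, conditions):
        out = dict(conditions)
        out["pressure"] = conditions["pressure"] + self.params["delta_p"]
        return out

class CombustionChamber(Component):
    def simulate(self, conditions):
        out = dict(conditions)
        out["temperature"] = self.params["flame_temp"]
        return out

class Engine:
    """Composes component objects; conditions flow downstream in order."""
    def __init__(self, *components):
        self.components = components

    def simulate(self, inlet):
        state = inlet
        for comp in self.components:
            state = comp.simulate(state)
        return state

engine = Engine(
    Pump("fuel-pump", delta_p=5.0e6),          # illustrative numbers
    CombustionChamber("main-chamber", flame_temp=3500.0),
)
print(engine.simulate({"pressure": 0.3e6, "temperature": 300.0}))
```

The appeal over a monolithic FORTRAN driver is exactly this composability: each component simulator can be swapped, extended, or diagnosed in isolation behind a common interface.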

  11. ALPHA SMP SYSTEM(S) Final Report CRADA No. TC-1404-97

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seager, M.; Beaudet, T.

    Within the scope of this subcontract, Digital Equipment Corporation (DIGITAL) and the University, through the Lawrence Livermore National Laboratory (LLNL), engaged in joint research and development activities of mutual interest and benefit. The primary objectives of these activities were, for LLNL, to improve its capability to perform its mission, and, for DIGITAL, to develop technical capability complementary to this mission. The collaborative activities had direct manpower investments by DIGITAL and LLNL. The project was divided into four areas of concern, which were handled concurrently. These areas included Gang Scheduling, Numerical Methods, Applications Development and Code Development Tools.

  12. 3D widgets for exploratory scientific visualization

    NASA Technical Reports Server (NTRS)

    Herndon, Kenneth P.; Meyer, Tom

    1995-01-01

    Computational fluid dynamics (CFD) techniques are used to simulate flows of fluids like air or water around such objects as airplanes and automobiles. These techniques usually generate very large amounts of numerical data which are difficult to understand without using graphical scientific visualization techniques. There are a number of commercial scientific visualization applications available today which allow scientists to control visualization tools via textual and/or 2D user interfaces. However, these user interfaces are often difficult to use. We believe that 3D direct-manipulation techniques for interactively controlling visualization tools will provide opportunities for powerful and useful interfaces with which scientists can more effectively explore their datasets. A few systems have been developed which use these techniques. In this paper, we will present a variety of 3D interaction techniques for manipulating parameters of visualization tools used to explore CFD datasets, and discuss in detail various techniques for positioning tools in a 3D scene.

  13. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    PubMed

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meaning in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing-value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly used data visualization methods, including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.

  14. Large robotized turning centers described

    NASA Astrophysics Data System (ADS)

    Kirsanov, V. V.; Tsarenko, V. I.

    1985-09-01

    The introduction of numerical control (NC) machine tools has made it possible to automate machining in series and small-series production. The organization of automated production sections merged NC machine tools with automated transport systems. However, both still require the presence of an operative at the machine for low-skilled operations. Industrial robots perform a number of auxiliary operations, such as equipment loading/unloading and control, changing cutting and auxiliary tools, checking workpieces and parts, and cleaning of location surfaces. When used with a group of equipment, they perform transfer operations between the machine tools. Industrial robots eliminate the need for workers to perform auxiliary operations. This underscores the importance of developing robotized manufacturing centers that provide for minimal human participation in production and create conditions for two- and three-shift operation of equipment. Work carried out at several robotized manufacturing centers for series and small-series production is described.

  15. Forecasting the Risks of Pollution from Ships along the Portuguese Coast

    NASA Astrophysics Data System (ADS)

    Fernandes, Rodrigo; Neves, Ramiro; Lourenço, Filipe; Braunschweig, Frank

    2013-04-01

    Pollution risk assessment in coastal and marine environments is generally based on a static approach, considering historical data, reference situations, and typical scenarios. This approach is quite important at the planning stage. However, an alternative approach can be pursued, thanks to the recent implementation of several real-time monitoring tools and faster generation of numerical forecasts for metocean properties and for trajectories of pollutants spilled at sea or in coastal zones. These developments make it possible to build an integrated support system for better decision-making in emergency or planning issues associated with pollution risks. An innovative methodology to dynamically produce quantified risks in real time, integrating the best available information from numerical forecasts and existing monitoring tools, has been developed and applied to the Portuguese coast. The developed system provides coastal pollution risk levels associated with potential (or real) oil spill incidents from ship collision, grounding or foundering, taking into account regional statistical information on vessel accidents and coastal sensitivity indexes, real-time vessel information (position, cargo type, speed and vessel type) obtained from AIS, best-available metocean numerical forecasts (hydrodynamics, meteorology - including visibility - and wave conditions) and scenarios simulated by the oil spill fate and behaviour component of the MOHID Water Modelling System. Different spill fate and behaviour simulations are continuously generated and processed in the background (assuming hypothetical spills from vessels), based on variable vessel information and metocean conditions. Results from these simulations are used in the quantification of the consequences of potential spills. All historical information is continuously stored in a database (for risk analysis at a later stage).
This dynamic approach improves the accuracy in quantification of consequences to the shoreline, as well as the decision support model, allowing a more effective prioritization of individual ships and geographical areas. This system was initially implemented in Portugal for oil spills. The implementation in other Atlantic Regions (starting in Galician Coast, Spain) is being executed in the scope of ARCOPOL+ project (2011-1/150), as well as other relevant updates. The system is being adapted to include risk modelling of chemical spills, as well as fire & explosion accidents and operational illegal discharges. Also the integration of EMSA's THETIS "ship risk profile" (according to Annex 7 from Paris Memorandum of Understanding) in the risk model is being tested. Finally, a new component is being developed to compute the risk for specific time periods, taking advantage of the information previously stored in the database on the positioning of vessels and / or results of numerical models. This component provides the possibility of obtaining a support tool for detailed characterization of risk profiles in certain periods or a sensitivity analysis on different parameters.

  16. Managing design excellence tools during the development of new orthopaedic implants.

    PubMed

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh matrix coupled with multi-generation planning enabled the development of a strong rationale to activate the project and to set the vision and goals. An improved risk management process and a product usage map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. 
    Among those tools, only certain ones required minimal resources (i.e., business case, multi-generational plan, project value proposition, Pugh matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing techniques). All of the techniques used provided savings exceeding their investment costs. Some other tools were considered and found less relevant. A matrix summarizes the investment costs and the estimated savings generated. Globally, all companies can benefit from using DEX by smartly selecting and estimating those tools with the best return on investment at the start of the project. For this, a good understanding of the available company resources, background and development strategy is needed. In conclusion, it was possible to illustrate that appropriate management of design excellence tools can greatly facilitate the development of new orthopaedic implant systems.

  17. Object Signing in Bamboo

    DTIC Science & Technology

    2000-03-01

    Dynamics. Numerous smaller companies have entered the market in just the past year offering similar PKI security solutions in an effort to take advantage...web of social , legal and business interactions that, in some cases, have taken generations to mature. We use common instruments to establish trust...types of digital certificates on the market today. However, with the recent development of object signing tools, some of the later browsers are incapable

  18. Operational Exploitation of Satellite-Based Sounding Data and Numerical Weather Prediction Models for Directed Energy Applications

    DTIC Science & Technology

    2015-12-01

    Verification Tool for Laser Environmental Effects Definition and Reference (LEEDR) Development ................................... 45 3.5 Gap Filling with NWP... effective cloud cover for all cloud layers within the AIRS field-of-view. ......................................... 59 Figure 37. Average wind...IR Infrared JPL Jet Propulsion Lab LEEDR Laser Environmental Effects Definition and Reference LIDAR Light Detection and Ranging MODIS Moderate

  19. Stability of compressible Taylor-Couette flow

    NASA Technical Reports Server (NTRS)

    Kao, K.; Chow, C.

    1992-01-01

    The objectives of this paper are to: (1) develop both analytical and numerical tools that can be used to predict the onset of instability and subsequently to simulate the transition process by which the originally laminar flow evolves into a turbulent flow; and (2) conduct preliminary investigations aimed at understanding the mechanisms of the vortical structures of the compressible flow between two concentric cylinders.

  20. Seaworthy Quantum Key Distribution Design and Validation (SEAKEY)

    DTIC Science & Technology

    2014-10-30

    to single photon detection, at comparable detection efficiencies. On the other hand, error-correction codes are better developed for small-alphabet...protocol is several orders of magnitude better than the Shapiro protocol, which needs entangled states. The bits/mode performance achieved by our...putting together a software tool implemented in MATLAB , which talks to the MODTRAN database via an intermediate numerical dump of transmission data

  1. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations at extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high-dimensional spaces, resolve stochastic problems with limited smoothness, even those containing discontinuities.

  2. Guide to NavyFOAM V1.0

    DTIC Science & Technology

    2011-04-01

    NavyFOAM has been developed using an open-source CFD software tool-kit ( OpenFOAM ) that draws heavily upon object-oriented programming. The...numerical methods and the physical models in the original version of OpenFOAM have been upgraded in an effort to improve accuracy and robustness of...computational fluid dynamics OpenFOAM , Object Oriented Programming (OOP) (CFD), NavyFOAM, 16. SECURITY CLASSIFICATION OF: a. REPORT UNCLASSIFIED b

  3. A Randomized Controlled Trial of the Social Tools and Rules for Teens (START) Program: An Immersive Socialization Intervention for Adolescents with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Vernon, Ty W.; Miller, Amber R.; Ko, Jordan A.; Barrett, Amy C.; McGarry, Elizabeth S.

    2018-01-01

    Adolescents with ASD face numerous personal and contextual barriers that impede the development of social motivation and core competencies, warranting the need for targeted intervention. A randomized controlled trial was conducted with 40 adolescents to evaluate the merits of a multi-component socialization intervention that places emphasis on…

  4. Cyclic Fatigue of Brittle Materials with an Indentation-Induced Flaw System

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Salem, Jonathan A.

    1996-01-01

    The ratio of static to cyclic fatigue life, or 'h ratio', was obtained numerically for an indentation flaw system subjected to sinusoidal loading conditions. Emphasis was placed on developing a simple, quick lifetime prediction tool. The solution for the h ratio was compared with experimental static and cyclic fatigue data obtained from as-indented 96 wt.% alumina specimens tested in room-temperature distilled water.

  5. Mineral resource of the month: cobalt

    USGS Publications Warehouse

    Shedd, Kim B.

    2009-01-01

    Cobalt is a metal used in numerous commercial, industrial and military applications. On a global basis, the leading use of cobalt is in rechargeable lithium-ion, nickel-cadmium and nickel-metal hydride battery electrodes. Cobalt use has grown rapidly since the early 1990s, with the development of new battery technologies and an increase in demand for portable electronics such as cell phones, laptop computers and cordless power tools.

  6. Computerized Modeling and Loaded Tooth Contact Analysis of Hypoid Gears Manufactured by Face Hobbing Process

    NASA Astrophysics Data System (ADS)

    Nishino, Takayuki

    The face hobbing process has been widely applied in the automotive industry, but so far few analytical tools have been developed, which makes it difficult to optimize gear design. To address this situation, this study aims at developing a computerized tool to predict running performances such as the loaded tooth contact pattern, static transmission error and so on. First, based upon a kinematical analysis of the cutting machine, a mathematical description of tooth surface generation is given. Second, based upon the theory of gearing and differential geometry, conjugate tooth surfaces are studied, and contact lines are generated. Third, the load distribution along contact lines is formulated. Last, the numerical model is validated by measuring the loaded transmission error and loaded tooth contact pattern.

  7. Lightning Tracking Tool for Assessment of Total Cloud Lightning within AWIPS II

    NASA Technical Reports Server (NTRS)

    Burks, Jason E.; Stano, Geoffrey T.; Sperow, Ken

    2014-01-01

    Total lightning (intra-cloud and cloud-to-ground) has been widely researched and shown to be a valuable tool to aid real-time warning forecasters in the assessment of severe weather potential of convective storms. The trend of total lightning has been related to the strength of a storm's updraft. Therefore a rapid increase in total lightning signifies the strengthening of the parent thunderstorm. The assessment of severe weather potential occurs in a time limited environment and therefore constrains the use of total lightning. A tool has been developed at NASA's Short-term Prediction Research and Transition (SPoRT) Center to assist in quickly analyzing the total lightning signature of multiple storms. The development of this tool comes as a direct result of forecaster feedback from numerous assessments requesting a real-time display of the time series of total lightning. This tool also takes advantage of the new architecture available within the AWIPS II environment. SPoRT's lightning tracking tool has been tested in the Hazardous Weather Testbed (HWT) Spring Program and significant changes have been made based on the feedback. In addition to the updates in response to the HWT assessment, the lightning tracking tool may also be extended to incorporate other requested displays, such as the intra-cloud to cloud-to-ground ratio as well as incorporate the lightning jump algorithm.

  8. Numerical simulation of multi-rifled tube drawing - finding proper feedstock dimensions and tool geometry

    NASA Astrophysics Data System (ADS)

    Bella, P.; Buček, P.; Ridzoň, M.; Mojžiš, M.; Parilák, L.'

    2017-02-01

    Production of multi-rifled seamless steel tubes is quite a new technology at Železiarne Podbrezová. Therefore, many technological questions emerge (process technology, input feedstock dimensions, material flow during drawing, etc.). Pilot experiments to fine-tune the process cost a great deal of time and energy. Numerical simulation is thus an alternative route to achieving optimal parameters in the production technology: it reduces the number of experiments needed, lowering the overall costs of development. However, for the numerical results to be considered relevant, it is necessary to verify them against actual plant trials. Searching for the optimal input feedstock dimensions for drawing a multi-rifled tube with dimensions Ø28.6 mm × 6.3 mm is the main topic of this paper. As a secondary task, the effective position of the plug-die couple has been determined via numerical simulation. Comparing the calculated results with actual numbers from plant trials, good agreement was observed.

  9. Final Report of the Project "From the finite element method to the virtual element method"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manzini, Gianmarco; Gyrya, Vitaliy

    The Finite Element Method (FEM) is a powerful numerical tool that is used in a large number of engineering applications. The FEM is constructed on triangular/tetrahedral and quadrilateral/hexahedral meshes. Extending the FEM to general polygonal/polyhedral meshes in a straightforward way turns out to be extremely difficult and leads to very complex and computationally expensive schemes. The reason for this failure is that the construction of the basis functions on elements with a very general shape is a non-trivial and complex task. In this project we developed a new family of numerical methods, dubbed the Virtual Element Method (VEM), for the numerical approximation of partial differential equations (PDEs) of elliptic type suitable to polygonal and polyhedral unstructured meshes. We successfully formulated, implemented and tested these methods and studied both theoretically and numerically their stability, robustness and accuracy for diffusion problems, convection-reaction-diffusion problems, the Stokes equations and the biharmonic equations.
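
    As a reminder of the classical construction that the VEM generalizes, the following is a minimal sketch of a 1D piecewise-linear finite element solve for -u'' = 1 on (0, 1) with homogeneous Dirichlet conditions; the mesh size and right-hand side are illustrative choices. In 1D the linear FEM happens to reproduce the exact solution u(x) = x(1 - x)/2 at the nodes:

```python
import numpy as np

# 1D linear FEM for -u'' = 1, u(0) = u(1) = 0, on a uniform mesh.
n = 9                        # interior nodes
h = 1.0 / (n + 1)            # element length
xs = np.linspace(h, 1.0 - h, n)

# Stiffness matrix from hat basis functions: (1/h) * tridiag(-1, 2, -1)
K = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h

# Load vector: integral of f * hat_i with f = 1 equals h for every node
b = np.full(n, h)

u = np.linalg.solve(K, b)    # nodal values; exact at the nodes for this problem
```

    On general polygonal elements no such explicit basis is available, which is exactly the difficulty the VEM sidesteps by never constructing the basis functions explicitly.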

  10. Application of multi-grid method on the simulation of incremental forging processes

    NASA Astrophysics Data System (ADS)

    Ramadan, Mohamad; Khaled, Mahmoud; Fourment, Lionel

    2016-10-01

    Numerical simulation has become essential in manufacturing large parts by incremental forging processes. It is a splendid tool for revealing the underlying physical phenomena, but behind the scenes an expensive bill must be paid: computational time. That is why many techniques have been developed to decrease the computational time of numerical simulation. The multi-grid method is a numerical procedure that reduces the computational time of a calculation by performing the resolution of the system of equations on several meshes of decreasing size, which allows both the low-frequency and the high-frequency components of the solution to be smoothed quickly. In this paper a multi-grid method is applied to the cogging process in the software Forge 3. The study is carried out using an increasing number of degrees of freedom. The results show that calculation time is divided by two for a mesh of 39,000 nodes. The method is promising, especially if coupled with the multi-mesh method.
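
    The idea of resolving the system on a hierarchy of meshes, so that each frequency band of the error is damped where it is cheapest, can be sketched on a model problem. The following two-grid cycle for the 1D Poisson equation is a minimal illustration; the weighted-Jacobi smoother, transfer operators and grid sizes are generic textbook choices, not those used in Forge 3:

```python
import numpy as np

def poisson_matrix(n, h):
    """Standard second-difference matrix on n interior points, spacing h."""
    return (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

def weighted_jacobi(A, b, x, sweeps=3, w=2.0 / 3.0):
    """Damped Jacobi: cheap, and very effective on high-frequency error."""
    d = np.diag(A)
    for _ in range(sweeps):
        x = x + w * (b - A @ x) / d
    return x

def restrict(r):
    """Full-weighting restriction: coarse j <- fine 2j, 2j+1, 2j+2."""
    return 0.25 * (r[0:-2:2] + 2.0 * r[1:-1:2] + r[2::2])

def prolong(ec, nf):
    """Linear interpolation from nc coarse points to nf = 2*nc + 1 fine points."""
    ef = np.zeros(nf)
    ef[1::2] = ec                            # coarse nodes sit at odd fine nodes
    ef[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])    # even nodes: average of neighbours
    ef[0] = 0.5 * ec[0]
    ef[-1] = 0.5 * ec[-1]
    return ef

def two_grid_cycle(A, b, x, h):
    x = weighted_jacobi(A, b, x)             # pre-smoothing kills high frequencies
    rc = restrict(b - A @ x)                 # smooth residual moves to coarse grid
    Ac = poisson_matrix(rc.size, 2.0 * h)
    ec = np.linalg.solve(Ac, rc)             # exact coarse-grid correction
    x = x + prolong(ec, b.size)
    return weighted_jacobi(A, b, x)          # post-smoothing

nf, h = 31, 1.0 / 32.0
xs = np.linspace(h, 1.0 - h, nf)
A = poisson_matrix(nf, h)
b = np.pi**2 * np.sin(np.pi * xs)            # f chosen so that u = sin(pi*x)
x = np.zeros(nf)
for _ in range(10):
    x = two_grid_cycle(A, b, x, h)
```

    Recursing on the coarse solve instead of solving it directly turns this two-grid cycle into a full multi-grid V-cycle.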

  11. From hacking the human genome to editing organs.

    PubMed

    Tobita, Takamasa; Guzman-Lepe, Jorge; Collin de l'Hortet, Alexandra

    2015-01-01

    In recent decades, human genome engineering has been one of the most interesting research subjects, essentially because it raises new possibilities for personalized medicine and biotechnologies. With the development of engineered nucleases such as the Zinc Finger Nucleases (ZFNs), the Transcription Activator-Like Effector Nucleases (TALENs) and, more recently, the Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), the field of human genome editing has evolved very rapidly. Every new genetic tool broadens the scope of applications on human tissues, even before we can completely master each of these tools. In this review, we will present the recent advances in human genome editing tools, we will discuss the numerous implications they have for research and medicine, and we will mention the limits and concerns about such technologies.

  12. From hacking the human genome to editing organs

    PubMed Central

    Tobita, Takamasa; Guzman-Lepe, Jorge; Collin de l'Hortet, Alexandra

    2015-01-01

    ABSTRACT In recent decades, human genome engineering has been one of the most interesting research subjects, essentially because it raises new possibilities for personalized medicine and biotechnologies. With the development of engineered nucleases such as the Zinc Finger Nucleases (ZFNs), the Transcription Activator-Like Effector Nucleases (TALENs) and, more recently, the Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), the field of human genome editing has evolved very rapidly. Every new genetic tool broadens the scope of applications on human tissues, even before we can completely master each of these tools. In this review, we will present the recent advances in human genome editing tools, we will discuss the numerous implications they have for research and medicine, and we will mention the limits and concerns about such technologies. PMID:26588350

  13. Numerical methods for stochastic differential equations

    NASA Astrophysics Data System (ADS)

    Kloeden, Peter; Platen, Eckhard

    1991-06-01

    The numerical analysis of stochastic differential equations differs significantly from that of ordinary differential equations due to the peculiarities of stochastic calculus. This book provides an introduction to stochastic calculus and stochastic differential equations, in both theory and applications. The main emphasis is placed on the numerical methods needed to solve such equations. It assumes an undergraduate background in the mathematical methods typical of engineers and physicists, though many chapters begin with a descriptive summary which may be accessible to others who only require numerical recipes. To help the reader develop an intuitive understanding of the underlying mathematics and hands-on numerical skills, exercises and over 100 PC exercises (PC: personal computer) are included. The stochastic Taylor expansion provides the key tool for the systematic derivation and investigation of discrete-time numerical methods for stochastic differential equations. The book presents many new results on higher-order methods for strong sample-path approximations and for weak functional approximations, including implicit, predictor-corrector, extrapolation and variance-reduction methods. Besides serving as a basic text on such methods, the book offers the reader ready access to a large number of potential research problems in a field that is just beginning to expand rapidly and is widely applicable.
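
    The simplest of the discrete-time schemes derived from the stochastic Taylor expansion is the Euler-Maruyama method. A minimal sketch for geometric Brownian motion dX = mu*X dt + sigma*X dW, whose exact mean is X0*exp(mu*T); all parameter values below are illustrative:

```python
import numpy as np

def euler_maruyama_gbm(x0, mu, sigma, T, steps, paths, rng):
    """Simulate geometric Brownian motion with the Euler-Maruyama scheme."""
    dt = T / steps
    x = np.full(paths, x0)
    for _ in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt), paths)   # Brownian increments
        x = x + mu * x * dt + sigma * x * dw       # explicit Euler-Maruyama step
    return x

rng = np.random.default_rng(0)
xT = euler_maruyama_gbm(1.0, 0.05, 0.2, 1.0, 100, 20000, rng)
sample_mean = xT.mean()          # should approximate exp(0.05) ~ 1.0513
```

    The higher-order strong schemes the book develops (e.g. Milstein) add further terms of the stochastic Taylor expansion to this update.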

  14. ASTRYD: A new numerical tool for aircraft cabin and environmental noise prediction

    NASA Astrophysics Data System (ADS)

    Berhault, J.-P.; Venet, G.; Clerc, C.

    ASTRYD is an analytical tool, developed originally for underwater applications, that computes acoustic pressure distribution around three-dimensional bodies in closed spaces like aircraft cabins. The program accepts data from measurements or other simulations, processes them in the time domain, and delivers temporal evolutions of the acoustic pressures and accelerations, as well as the radiated/diffracted pressure at arbitrary points located in the external/internal space. A typical aerospace application is prediction of acoustic load on satellites during the launching phase. An aeronautic application is engine noise distribution on a business jet body for prediction of environmental and cabin noise.

  15. Active Control of Fan Noise: Feasibility Study. Volume 5; Numerical Computation of Acoustic Mode Reflection Coefficients for an Unflanged Cylindrical Duct

    NASA Technical Reports Server (NTRS)

    Kraft, R. E.

    1996-01-01

    A computational method to predict modal reflection coefficients in cylindrical ducts has been developed based on the work of Homicz, Lordi, and Rehm, which uses the Wiener-Hopf method to account for the boundary conditions at the termination of a thin cylindrical pipe. The purpose of this study is to develop a computational routine to predict the reflection coefficients of higher-order acoustic modes impinging on the unflanged termination of a cylindrical duct. This effort was conducted under Task Order 5 of the NASA Lewis LET Program, Active Noise Control of Aircraft Engines: Feasibility Study, and will be used as part of the development of an integrated source noise, acoustic propagation, ANC actuator coupling, and control system algorithm simulation. The reflection coefficient prediction will be incorporated into an existing cylindrical duct modal analysis to account for the reflection of modes from the duct termination. This will provide a more accurate, rapid computational design tool for evaluating the effect of reflected waves on active noise control systems mounted in the duct, as well as a tool for the design of acoustic treatment in inlet ducts. As an active noise control system design tool, the method can be used as a preliminary step before more accurate but more numerically intensive acoustic propagation models such as finite element methods. The resulting computer program has been shown to give reasonable results, some examples of which are presented. Reliable data to use for comparison are scarce, so complete checkout is difficult, and further checkout is needed over a wider range of system parameters. In future efforts the method will be adapted as a subroutine to the GEAE segmented cylindrical duct modal analysis program.

  16. Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics

    NASA Astrophysics Data System (ADS)

    Hobley, Daniel E. J.; Adams, Jordan M.; Nudurupati, Sai Siddhartha; Hutton, Eric W. H.; Gasparini, Nicole M.; Istanbulluoglu, Erkan; Tucker, Gregory E.

    2017-01-01

    The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures - including both regular and irregular grids - to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.

  17. Numerical simulations of novel high-power high-brightness diode laser structures

    NASA Astrophysics Data System (ADS)

    Boucke, Konstantin; Rogg, Joseph; Kelemen, Marc T.; Poprawe, Reinhart; Weimann, Guenter

    2001-07-01

    One of the key topics in today's semiconductor laser development activities is increasing the brightness of high-power diode lasers. Although structures showing increased brightness have been developed, specific drawbacks of these structures leave a strong demand for the investigation of alternative concepts. Especially for the investigation of fundamentally novel structures, easy-to-use and fast simulation tools are essential to avoid unnecessary, costly and time-consuming experiments. A diode laser simulation tool based on finite-difference representations of the Helmholtz equation in the 'wide-angle' approximation and of the carrier diffusion equation has been developed. An optimized numerical algorithm leads to short execution times of a few seconds per resonator round trip on a standard PC. After each round trip, characteristics such as optical output power, beam profile and beam parameters are calculated. A graphical user interface allows online monitoring of the simulation results. The simulation tool is used to investigate a novel high-power, high-brightness diode laser structure, the so-called 'Z-Structure'. In this structure an increased brightness is achieved by reducing the divergence angle of the beam by angular filtering: the round-trip path of the beam is folded twice using total internal reflection at surfaces defined by a small index step in the semiconductor material, forming a stretched 'Z'. The sharp decrease of the reflectivity for angles of incidence above the angle of total reflection leads to a narrowing of the angular spectrum of the beam. The simulations of the 'Z-Structure' indicate an increase of the beam quality by a factor of five to ten compared to standard broad-area lasers.

  18. Transposons As Tools for Functional Genomics in Vertebrate Models.

    PubMed

    Kawakami, Koichi; Largaespada, David A; Ivics, Zoltán

    2017-11-01

    Genetic tools and mutagenesis strategies based on transposable elements are currently under development with a vision to link primary DNA sequence information to gene functions in vertebrate models. By virtue of their inherent capacity to insert into DNA, transposons can be developed into powerful tools for chromosomal manipulations. Transposon-based forward mutagenesis screens have numerous advantages including high throughput, easy identification of mutated alleles, and providing insight into genetic networks and pathways based on phenotypes. For example, the Sleeping Beauty transposon has become highly instrumental to induce tumors in experimental animals in a tissue-specific manner with the aim of uncovering the genetic basis of diverse cancers. Here, we describe a battery of mutagenic cassettes that can be applied in conjunction with transposon vectors to mutagenize genes, and highlight versatile experimental strategies for the generation of engineered chromosomes for loss-of-function as well as gain-of-function mutagenesis for functional gene annotation in vertebrate models, including zebrafish, mice, and rats. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Investigating the Potential Impacts of Energy Production in the Marcellus Shale Region Using the Shale Network Database

    NASA Astrophysics Data System (ADS)

    Brantley, S.; Brazil, L.

    2017-12-01

    The Shale Network's extensive database of water quality observations enables educational experiences about the potential impacts of resource extraction with real data. Through tools that are open source and free to use, researchers, educators, and citizens can access and analyze the very same data that the Shale Network team has used in peer-reviewed publications about the potential impacts of hydraulic fracturing on water. The development of the Shale Network database has been made possible through efforts led by an academic team and involving numerous individuals from government agencies, citizen science organizations, and private industry. Thus far, these tools and data have been used to engage high school students, university undergraduate and graduate students, as well as citizens so that all can discover how energy production impacts the Marcellus Shale region, which includes Pennsylvania and other nearby states. This presentation will describe these data tools, how the Shale Network has used them in developing lesson plans, and the resources available to learn more.

  20. Object oriented studies into artificial space debris

    NASA Technical Reports Server (NTRS)

    Adamson, J. M.; Marshall, G.

    1988-01-01

    A prototype simulation is being developed under contract to the Royal Aerospace Establishment (RAE), Farnborough, England, to assist in the discrimination of artificial space objects/debris. The methodology undertaken has been to link Object Oriented programming, intelligent knowledge based system (IKBS) techniques and advanced computer technology with numeric analysis to provide a graphical, symbolic simulation. The objective is to provide an additional layer of understanding on top of conventional classification methods. Use is being made of object and rule based knowledge representation, multiple reasoning, truth maintenance and uncertainty. Software tools being used include Knowledge Engineering Environment (KEE) and SymTactics for knowledge representation. Hooks are being developed within the SymTactics framework to incorporate mathematical models describing orbital motion and fragmentation. Penetration and structural analysis can also be incorporated. SymTactics is an Object Oriented discrete event simulation tool built as a domain specific extension to the KEE environment. The tool provides facilities for building, debugging and monitoring dynamic (military) simulations.

  1. RF Models for Plasma-Surface Interactions in VSim

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.

    2014-10-01

    An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath physics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.
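    As a rough indication of the DC sheath physics such a sub-grid model captures, the textbook collisionless-sheath estimate of the floating-wall potential can be computed directly. This is an illustrative back-of-envelope formula, not VSim's actual sub-grid formulation, and the plasma parameters are assumed.

```python
import math

M_E = 9.109e-31        # electron mass, kg
AMU = 1.661e-27        # atomic mass unit, kg

def floating_potential(te_ev, ion_mass_amu):
    """Textbook floating potential (V) of an insulated wall relative to the
    plasma: V_f = -(T_e/2) * ln(m_i / (2*pi*m_e)), with T_e in eV."""
    m_i = ion_mass_amu * AMU
    return -0.5 * te_ev * math.log(m_i / (2 * math.pi * M_E))

vf_h = floating_potential(3.0, 1.0)    # assumed 3 eV hydrogen plasma
vf_ar = floating_potential(3.0, 40.0)  # same temperature, argon ions
```

    Heavier ions give a more negative wall, which is one reason the multiple-ion-species generalization mentioned above matters for sheath potentials.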

  2. Numerical Study of Magnetic Damping During Unidirectional Solidification

    NASA Technical Reports Server (NTRS)

    Li, Ben Q.

    1997-01-01

    A fully 3-D numerical model is developed to represent magnetic damping of complex fluid flow, heat transfer and electromagnetic field distributions in a melt cavity. The model is based on our in-house finite element code for fluid flow, heat transfer and electromagnetic field calculations. The computer code has been tested against benchmark problems that are solved by commercial codes, as well as against analytical solutions whenever available. The numerical model is also tested against numerical and experimental results for water reported in the literature. With the model so tested, various numerical simulations are carried out for the Sn-35.5% Pb melt convection and temperature distribution in a cylindrical cavity with and without the presence of a transverse magnetic field. Numerical results show that magnetic damping can be effectively applied to reduce turbulence and flow levels in a melt undergoing solidification, and that above a certain threshold a stronger magnetic field produces a greater velocity reduction. It is also found that, for a fully 3-D representation of the magnetic damping effects, the electric field induced in the melt by the applied DC magnetic field does not vanish, as some researchers have suggested, and must be included even for molten metals and semiconductors. In addition, when studying melt flow instability, a sufficiently long simulation time is required to establish the final flow recirculation pattern. Moreover, our numerical results suggest that there exists a threshold value of the applied magnetic field, above which magnetic damping becomes possible and below which convection in the melt is actually enhanced. Because of the limited financial resources allocated to the project, we were unable to carry out an extensive study of this effect, which warrants further theoretical and experimental work.
In that endeavor, the numerical model developed here should be very useful: it can serve as a tool for exploring design parameters when planning magnetic damping experiments and for interpreting the experimental results.
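    A standard first estimate of how strongly a DC field damps melt convection is the Hartmann number, Ha = B*L*sqrt(sigma/(rho*nu)), the ratio of Lorentz (braking) to viscous forces. The property values below are rough assumed figures for a Sn-Pb melt, used only to illustrate the scaling, not data from the study.

```python
import math

# Assumed, order-of-magnitude melt properties (illustrative only):
sigma = 2.0e6      # electrical conductivity, S/m
rho = 8.4e3        # density, kg/m^3
nu = 2.5e-7        # kinematic viscosity, m^2/s
L = 0.02           # characteristic cavity size, m

def hartmann(B):
    """Hartmann number Ha = B*L*sqrt(sigma/(rho*nu)); Ha >> 1 means the
    Lorentz braking force dominates viscous forces."""
    return B * L * math.sqrt(sigma / (rho * nu))

ha_weak = hartmann(0.01)    # 10 mT field
ha_strong = hartmann(0.5)   # 0.5 T field
```

    Because Ha scales linearly with B, a fifty-fold increase in field strength moves the melt from a weakly damped regime (Ha of order a few) well into the strongly damped one (Ha of order hundreds), consistent with the threshold behavior reported above.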

  3. Dances with Membranes: Breakthroughs from Super-resolution Imaging

    PubMed Central

    Curthoys, Nikki M.; Parent, Matthew; Mlodzianoski, Michael; Nelson, Andrew J.; Lilieholm, Jennifer; Butler, Michael B.; Valles, Matthew; Hess, Samuel T.

    2017-01-01

    Biological membrane organization mediates numerous cellular functions and has also been connected with an immense number of human diseases. However, until recently, experimental methodologies have been unable to directly visualize the nanoscale details of biological membranes, particularly in intact living cells. Numerous models explaining membrane organization have been proposed, but testing those models has required indirect methods; the desire to directly image proteins and lipids in living cell membranes is a strong motivation for the advancement of technology. The development of super-resolution microscopy has provided powerful tools for quantification of membrane organization at the level of individual proteins and lipids, and many of these tools are compatible with living cells. Previously inaccessible questions are now being addressed, and the field of membrane biology is developing rapidly. This chapter discusses how the development of super-resolution microscopy has led to fundamental advances in the field of biological membrane organization. We summarize the history and some models explaining how proteins are organized in cell membranes, and give an overview of various super-resolution techniques and methods of quantifying super-resolution data. We discuss the application of super-resolution techniques to membrane biology in general, and also with specific reference to the fields of actin and actin-binding proteins, virus infection, mitochondria, immune cell biology, and phosphoinositide signaling. Finally, we present our hopes and expectations for the future of super-resolution microscopy in the field of membrane biology. PMID:26015281

  4. The method of space-time conservation element and solution element-applications to one-dimensional and two-dimensional time-marching flow problems

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Wang, Xiao-Yen; Chow, Chuen-Yen

    1995-01-01

    A nontraditional numerical method for solving conservation laws is being developed. The new method is designed from a physicist's perspective, i.e., its development is based more on physics than numerics. Even though it uses only the simplest approximation techniques, a 2D time-marching Euler solver developed recently using the new method is capable of generating nearly perfect solutions for a 2D shock reflection problem used by Helen Yee and others. Moreover, a recent application of this solver to computational aeroacoustics (CAA) problems reveals that: (1) the accuracy of its results is comparable to that of a 6th-order compact difference scheme even though the current solver is nominally only 2nd-order accurate; (2) generally, the non-reflecting boundary condition can be implemented in a simple way without involving characteristic variables; and (3) most importantly, the current solver is capable of handling both continuous and discontinuous flows very well and thus provides a unique numerical tool for solving those flow problems where the interactions between sound waves and shocks are important, such as the noise field around a supersonic over- or under-expanded jet.
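    Any solver for conservation laws, the CE/SE method included, must respect the conservative flux form u_i^{n+1} = u_i^n - (dt/dx)(F_{i+1/2} - F_{i-1/2}). The sketch below is not the CE/SE scheme itself, only a minimal first-order upwind finite-volume update for 1-D linear advection that demonstrates exact discrete conservation on a periodic domain.

```python
def step(u, a, dt, dx):
    """One conservative update: u_i -= dt/dx * (F_i - F_{i-1}), with the
    upwind flux F_i = a*u_i (valid for a > 0) and periodic boundaries."""
    flux = [a * ui for ui in u]
    return [u[i] - dt / dx * (flux[i] - flux[i - 1])   # flux[-1] wraps around
            for i in range(len(u))]

dx, a = 0.02, 1.0
dt = 0.5 * dx / a                 # CFL number 0.5
u = [1.0 if 0.2 <= i * dx <= 0.4 else 0.0 for i in range(50)]
total0 = sum(u)
for _ in range(100):
    u = step(u, a, dt, dx)
total1 = sum(u)                    # discrete conservation: total is unchanged
```

    The telescoping flux differences cancel in the sum over cells, so the total is conserved to round-off no matter how many steps are taken, which is the property that lets conservative schemes capture shocks at the correct speed.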

  5. Visualization and Interaction in Research, Teaching, and Scientific Communication

    NASA Astrophysics Data System (ADS)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices provide those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how valuable a little programming ability can free scientists from the constraints of existing tools and can facilitate the development of deeper appreciation data and models. I present examples from a suite of programming languages ranging from C to JavaScript including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available in desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.

  6. Towards a suite of test cases and a pycomodo library to assess and improve numerical methods in ocean models

    NASA Astrophysics Data System (ADS)

    Garnier, Valérie; Honnorat, Marc; Benshila, Rachid; Boutet, Martial; Cambon, Gildas; Chanut, Jérome; Couvelard, Xavier; Debreu, Laurent; Ducousso, Nicolas; Duhaut, Thomas; Dumas, Franck; Flavoni, Simona; Gouillon, Flavien; Lathuilière, Cyril; Le Boyer, Arnaud; Le Sommer, Julien; Lyard, Florent; Marsaleix, Patrick; Marchesiello, Patrick; Soufflet, Yves

    2016-04-01

    The COMODO group (http://www.comodo-ocean.fr) gathers developers of global and limited-area ocean models (NEMO, ROMS_AGRIF, S, MARS, HYCOM, S-TUGO) with the aim of addressing well-identified numerical issues. In order to evaluate existing models, to improve numerical approaches, methods and concepts (such as effective resolution), to assess the behavior of numerical models in complex hydrodynamical regimes, and to propose guidelines for the development of future ocean models, a benchmark suite is proposed that covers both idealized test cases dedicated to targeted properties of numerical schemes and more complex test cases allowing the evaluation of kernel coherence. The benchmark suite is built to study separately, then together, the main components of an ocean model: the continuity and momentum equations, the advection-diffusion of tracers, the vertical coordinate design and the time-stepping algorithms. The test cases are chosen for their simplicity of implementation (analytic initial conditions), for their capacity to focus on a few schemes or parts of the kernel, for the availability of analytical solutions or accurate diagnoses, and lastly for their ability to simulate a key oceanic process in a controlled environment. Idealized test cases (advection-diffusion of tracers, upwelling, lock exchange, baroclinic vortex, adiabatic motion along bathymetry) allow properties of numerical schemes to be verified, while others (trajectory of a barotropic vortex, current-topography interaction) bring to light numerical issues that remain undetected in realistic configurations. When complexity in the simulated dynamics grows (internal waves, unstable baroclinic jet), the sharing of the same experimental designs by different existing models is useful to get a measure of the model sensitivity to numerical choices (Soufflet et al., 2016). Lastly, test cases help in understanding the submesoscale influence on the dynamics (Couvelard et al., 2015).
Such a benchmark suite is a valuable test bed for continued research on numerical approaches, as well as an efficient tool for maintaining an oceanic code and assuring users of a validated model over a certain range of hydrodynamical regimes. Thanks to a common netCDF format, this suite is complemented by a python library that encompasses all the tools and metrics used to assess the efficiency of the numerical methods. References - Couvelard X., F. Dumas, V. Garnier, A.L. Ponte, C. Talandier, A.M. Treguier (2015). Mixed layer formation and restratification in presence of mesoscale and submesoscale turbulence. Ocean Modelling, Vol 96-2, p 243-253. doi:10.1016/j.ocemod.2015.10.004. - Soufflet Y., P. Marchesiello, F. Lemarié, J. Jouanno, X. Capet, L. Debreu, R. Benshila (2016). On effective resolution in ocean models. Ocean Modelling, in press. doi:10.1016/j.ocemod.2015.12.004
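    The kind of metric such a library can provide is sketched below: the discrete L2 error of a numerical field against a test case's analytic solution, and the observed order of convergence from two resolutions. This is an illustration only, not the actual pycomodo API, and the "numerical" fields are fabricated with a known O(dx^2) error term so the recovered order is predictable.

```python
import math

def l2_error(numerical, analytic):
    """Discrete (root-mean-square) L2 error between two fields."""
    n = len(numerical)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(numerical, analytic)) / n)

def observed_order(err_coarse, err_fine, refinement=2.0):
    """Order p recovered from err ~ C*dx^p under grid refinement."""
    return math.log(err_coarse / err_fine) / math.log(refinement)

def fake_run(n):
    """Fabricated 'model output': analytic sine profile plus an O(dx^2) error."""
    dx = 1.0 / n
    analytic = [math.sin(2 * math.pi * i * dx) for i in range(n)]
    numerical = [v + dx ** 2 for v in analytic]
    return numerical, analytic

e_coarse = l2_error(*fake_run(50))
e_fine = l2_error(*fake_run(100))
p = observed_order(e_coarse, e_fine)   # recovers the built-in 2nd order
```

    Run against real model output in a common netCDF format, the same two functions give an objective, model-independent measure of a scheme's accuracy on each test case.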

  7. Linear modeling of human hand-arm dynamics relevant to right-angle torque tool interaction.

    PubMed

    Ay, Haluk; Sommerich, Carolyn M; Luscher, Anthony F

    2013-10-01

    A new protocol was evaluated for identification of stiffness, mass, and damping parameters employing a linear model for human hand-arm dynamics relevant to right-angle torque tool use. Powered torque tools are widely used to tighten fasteners in manufacturing industries. While these tools increase accuracy and efficiency of tightening processes, operators are repetitively exposed to impulsive forces, posing risk of upper extremity musculoskeletal injury. A novel testing apparatus was developed that closely mimics biomechanical exposure in torque tool operation. Forty experienced torque tool operators were tested with the apparatus to determine model parameters and validate the protocol for physical capacity assessment. A second-order hand-arm model with parameters extracted in the time domain met model accuracy criterion of 5% for time-to-peak displacement error in 93% of trials (vs. 75% for frequency domain). Average time-to-peak handle displacement and relative peak handle force errors were 0.69 ms and 0.21%, respectively. Model parameters were significantly affected by gender and working posture. Protocol and numerical calculation procedures provide an alternative method for assessing mechanical parameters relevant to right-angle torque tool use. The protocol more closely resembles tool use, and calculation procedures demonstrate better performance of parameter extraction using time domain system identification methods versus frequency domain. Potential future applications include parameter identification for in situ torque tool operation and equipment development for human hand-arm dynamics simulation under impulsive forces that could be used for assessing torque tools based on factors relevant to operator health (handle dynamics and hand-arm reaction force).
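    The second-order model in question is a single mass-spring-damper, m*x'' + c*x' + k*x = F(t), driven by a brief torque-reaction force pulse; time-to-peak displacement is then read off the simulated response. The sketch below uses assumed parameter values for illustration, not those identified in the study.

```python
def simulate(m, c, k, force, dt=1e-4, t_end=0.5):
    """Integrate m*x'' + c*x' + k*x = force(t) with semi-implicit Euler."""
    x, v, t = 0.0, 0.0, 0.0
    ts, xs = [], []
    while t < t_end:
        a = (force(t) - c * v - k * x) / m
        v += a * dt
        x += v * dt
        t += dt
        ts.append(t)
        xs.append(x)
    return ts, xs

def pulse(t):
    """Assumed 50 N reaction-force pulse lasting 50 ms."""
    return 50.0 if t < 0.05 else 0.0

# Assumed hand-arm parameters (illustrative, not study-identified values):
ts, xs = simulate(m=5.0, c=80.0, k=2000.0, force=pulse)

peak = max(xs)                     # peak handle displacement
t_peak = ts[xs.index(peak)]        # time-to-peak displacement
```

    In the identification protocol this runs in the opposite direction: measured displacement traces are fitted so that m, c and k reproduce the observed time-to-peak and peak force, which is the 5% accuracy criterion mentioned above.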

  8. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    USGS Publications Warehouse

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
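    The simulation-based optimization loop can be sketched abstractly: an optimizer proposes decision variables (here, the area fraction given to one crop), a simulation returns competing objectives, and non-dominated candidates are retained. The "simulation" below is a toy stand-in with invented numbers; it does not reproduce DAKOTA or MF-OWHM.

```python
def simulate_portfolio(fa):
    """Toy stand-in for the water-management simulation: fa is the area
    fraction planted with crop A. Profit peaks at fa = 0.6 (assumed
    agronomic limit); water demand grows with fa. All numbers invented."""
    profit = 1000.0 * fa * (1.2 - fa)     # $/ha
    water = 2.5 + 4.5 * fa                # ML/ha
    return profit, water

candidates = [i / 10 for i in range(11)]
results = {f: simulate_portfolio(f) for f in candidates}

def dominated(f, other):
    """`other` dominates `f` if it is at least as profitable and uses no
    more water, and is strictly better in at least one objective."""
    p1, w1 = results[f]
    p2, w2 = results[other]
    return p2 >= p1 and w2 <= w1 and (p2 > p1 or w2 < w1)

# The Pareto set: candidates no other candidate dominates.
pareto = [f for f in candidates
          if not any(dominated(f, o) for o in candidates if o != f)]
```

    Portfolios beyond the profit peak are dominated (less profit, more water) and drop out, while the remaining set exposes the genuine profit-versus-water trade-off to stakeholders, mirroring the "no single user class is optimally satisfied" finding above.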

  9. CLIPS: The C language integrated production system

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1994-01-01

    Expert systems are computer programs which emulate human expertise in well defined problem domains. The potential payoff from expert systems is high: valuable expertise can be captured and preserved, repetitive and/or mundane tasks requiring human expertise can be automated, and uniformity can be applied in decision making processes. The C Language Integrated Production System (CLIPS) is an expert system building tool, developed at the Johnson Space Center, which provides a complete environment for the development and delivery of rule and/or object based expert systems. CLIPS was specifically designed to provide a low cost option for developing and deploying expert system applications across a wide range of hardware platforms. The commercial potential of CLIPS is vast. Currently, CLIPS is being used by over 5,000 individuals throughout the public and private sector. Because the CLIPS source code is readily available, numerous groups have used CLIPS as the basis for their own expert system tools. To date, three commercially available tools have been derived from CLIPS. In general, the development of CLIPS has helped to improve the ability to deliver expert system technology throughout the public and private sectors for a wide range of applications and diverse computing environments.
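    The production-system idea behind CLIPS can be illustrated with a minimal forward-chaining engine: rules fire when their conditions match facts in working memory, asserting new facts until a fixed point is reached. This toy loop is not CLIPS syntax and omits its efficient Rete pattern matcher; the diagnostic facts and rules are invented for illustration.

```python
def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions are all present until no
    rule adds a new fact (a naive forward-chaining inference loop)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if set(conditions) <= facts and conclusion not in facts:
                facts.add(conclusion)       # rule fires, asserting a new fact
                changed = True
    return facts

# Invented diagnostic rules: (conditions, conclusion)
rules = [
    (("engine-cranks", "no-start"), "suspect-fuel-or-spark"),
    (("suspect-fuel-or-spark", "fuel-ok"), "suspect-ignition"),
    (("suspect-ignition",), "check-spark-plugs"),
]
derived = forward_chain({"engine-cranks", "no-start", "fuel-ok"}, rules)
```

    Chaining three rules from three initial facts yields a concrete recommendation, which is the pattern rule-based expert systems scale up to thousands of rules.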

  10. Assessment of the performance of numerical modeling in reproducing a replenishment of sediments in a water-worked channel

    NASA Astrophysics Data System (ADS)

    Juez, C.; Battisacco, E.; Schleiss, A. J.; Franca, M. J.

    2016-06-01

    The artificial replenishment of sediment is used as a method to re-establish sediment continuity downstream of a dam. However, the impact of this technique on the hydraulic conditions, and the resulting bed morphology, is yet to be understood. Several numerical tools for modeling sediment transport and morphology evolution have been developed in recent years that can be used for this application. These models range from 1D to 3D approaches: the former are overly simplistic for simulating such a complex geometry, while the latter often require a prohibitive computational effort. 2D models, however, are computationally efficient and in these cases may already provide sufficiently accurate predictions of the morphology evolution caused by sediment replenishment in a river. Here, the 2D shallow water equations in combination with the Exner equation are solved by means of a weakly coupled strategy. The classical friction approach used to reproduce bed channel roughness has been modified to take into account the morphological effect of replenishment, which causes a fining of the channel bed. Computational outcomes are compared with four sets of experimental data obtained from several replenishment configurations studied in the laboratory. The experiments differ in terms of placement volume and configuration. A set of analysis parameters is proposed for the experimental-numerical comparison, with particular attention to the spreading, covered surface and travel distance of the placed replenishment grains. The numerical tool is reliable in reproducing the overall tendency shown by the experimental data. The effect of roughness fining is better reproduced with the approach proposed herein. However, it is also highlighted that the sediment clusters found in the experiments are not well reproduced numerically in regions of the channel with a limited number of sediment grains.
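    The weakly coupled bed update can be sketched in one dimension: the flow solver supplies a velocity field, a transport law gives the bedload flux, and the Exner equation dz/dt = -1/(1-p) * dq_s/dx advances the bed. The sketch assumes a Grass-type flux q_s = A*u**3 and illustrative coefficients, not those of the study, and omits the feedback from bed to flow that the full coupling includes.

```python
A_GRASS = 1e-4     # Grass transport coefficient (assumed)
POROSITY = 0.4     # bed porosity p (assumed)

def exner_step(z, u, dx, dt):
    """One explicit Exner update: dz/dt = -1/(1-p) * d(q_s)/dx, with the
    Grass-type bedload flux q_s = A*u**3 and central differences in space."""
    qs = [A_GRASS * ui ** 3 for ui in u]
    z_new = z[:]
    for i in range(1, len(z) - 1):            # end nodes held fixed
        dqs_dx = (qs[i + 1] - qs[i - 1]) / (2 * dx)
        z_new[i] = z[i] - dt * dqs_dx / (1 - POROSITY)
    return z_new

# Decelerating flow (e.g. spreading over a replenishment deposit):
n = 21
u = [1.0 - 0.5 * i / (n - 1) for i in range(n)]   # u drops from 1.0 to 0.5 m/s
z = [0.0] * n
for _ in range(100):
    z = exner_step(z, u, dx=1.0, dt=10.0)
```

    Where the flow decelerates the flux divergence is negative, so the bed aggrades, fastest where the velocity (and hence the flux gradient) is largest, which is the basic mechanism by which placed replenishment volumes spread and deposit downstream.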

  11. Delamination Assessment Tool for Spacecraft Composite Structures

    NASA Astrophysics Data System (ADS)

    Portela, Pedro; Preller, Fabian; Wittke, Henrik; Sinnema, Gerben; Camanho, Pedro; Turon, Albert

    2012-07-01

    Fortunately, only a few cases are known where failure of spacecraft structures due to undetected damage has resulted in the loss of a spacecraft and launcher mission. However, several problems related to damage tolerance, and in particular delamination of composite materials, have been encountered during structure development of various ESA projects and qualification testing. To avoid such costly failures during development, launch or service of spacecraft, launcher and reusable launch vehicle structures, a comprehensive damage tolerance verification approach is needed. In 2009, the European Space Agency (ESA) initiated an activity called “Delamination Assessment Tool”, which is led by the Portuguese company HPS Lda and includes academic and industrial partners. The goal of this study is the development of a comprehensive damage tolerance verification approach for launcher and reusable launch vehicle (RLV) structures, addressing analytical and numerical methodologies; material-, subcomponent- and component-level testing; and non-destructive inspection. The study includes a comprehensive review of current industrial damage tolerance practice resulting from ECSS and NASA standards, the development of new Best Practice Guidelines for analysis, test and inspection methods, and the validation of these with a real industrial case study. The paper describes the main findings of this activity so far and presents a first iteration of a Damage Tolerance Verification Approach, which includes the introduction of novel analytical and numerical tools at an industrial level. This new approach is being put to the test using real industrial case studies provided by the industrial partners, MT Aerospace, RUAG Space and INVENT GmbH.

  12. The development of a flash flood severity index

    NASA Astrophysics Data System (ADS)

    Schroeder, Amanda J.; Gourley, Jonathan J.; Hardy, Jill; Henderson, Jen J.; Parhi, Pradipta; Rahmani, Vahid; Reed, Kimberly A.; Schumacher, Russ S.; Smith, Brianne K.; Taraldsen, Matthew J.

    2016-10-01

    Flash flooding is a high-impact weather event that requires clear communication regarding severity and potential hazards among forecasters, researchers, emergency managers, and the general public. Current standards used to communicate these characteristics include return periods and the United States (U.S.) National Weather Service (NWS) 4-tiered river flooding severity scale. Return periods are largely misunderstood, and the NWS scale is limited to flooding on gauged streams and rivers, often leaving out heavily populated urban corridors. To address these shortcomings, a student-led group of interdisciplinary researchers came together in a collaborative effort to develop an impact-based Flash Flood Severity Index (FFSI). The index was proposed as a damage-based, post-event assessment tool, and preliminary work toward the creation of this index has been completed and presented here. Numerous case studies were analyzed to develop the preliminary outline for the FFSI, and three examples of such cases are included in this paper. The scale includes five impact-based categories ranging from Category 1 (very minor flooding) to Category 5 (catastrophic flooding). Along with the numerous case studies used to develop the initial outline of the scale, semi-structured interviews were conducted with multiple NWS forecasters across the country, and their responses were analyzed to gain more perspective on the complicated nature of flash flood definitions and on which tools forecasters find most useful. The feedback from these interviews suggests the potential for acceptance of such an index if it can account for specific challenges.

  13. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes

    PubMed Central

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D.; Day, Michele E.; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort. PMID:28269947
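    One of the simplest ensemble methods for combining concept extraction from several NLP tools is majority voting, sketched below. The "tool" outputs are fabricated stand-ins for illustration, not output from actual NLP systems.

```python
from collections import Counter

def majority_vote(extractions, min_votes=2):
    """Keep a concept if at least `min_votes` of the tools extracted it."""
    counts = Counter(c for tool_output in extractions for c in set(tool_output))
    return {c for c, n in counts.items() if n >= min_votes}

# Fabricated example outputs from three hypothetical extraction tools:
tool_a = {"diabetes mellitus", "metformin", "hypertension"}
tool_b = {"diabetes mellitus", "metformin", "chest pain"}
tool_c = {"diabetes mellitus", "hypertension"}

consensus = majority_vote([tool_a, tool_b, tool_c])
```

    Voting trades recall for precision: concepts found by only one tool ("chest pain" here) are dropped, which is one reason ensemble performance varies with the cohort, as reported above.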

  15. Model Hosting for continuous updating and transparent Water Resources Management

    NASA Astrophysics Data System (ADS)

    Jódar, Jorge; Almolda, Xavier; Batlle, Francisco; Carrera, Jesús

    2013-04-01

    Numerical models have become a standard tool for water resources management. They are required for water volume bookkeeping and help in decision making. Nevertheless, numerical models are complex and can be used only by highly qualified technicians, who are often far from the decision makers. Moreover, they need to be maintained: their state must be updated by assimilation of measurements, of natural and anthropic actions (e.g., pumping and weather data), and of model parameters. Worse, their very complexity means that they are viewed as obscure and remote, which hinders transparency and governance. We propose internet model hosting as an alternative to overcome these limitations. The basic idea is to keep the model hosted in the cloud. The model is updated as new data (measurements and external forcing) become available, which ensures continuous maintenance with minimal human cost (required only to address modelling problems). Internet access facilitates model use not only by modellers, but also by people responsible for data gathering and by water managers. As a result, the model becomes an institutional tool shared by water agencies to help them not only in decision making for sustainable management of water resources, but also in generating a common discussion platform. By promoting intra-agency sharing, the model becomes the common official position of the agency, which facilitates commitment to its adopted decisions regarding water management. Moreover, by facilitating access for stakeholders and the general public, the state of the aquifer and the impacts of alternative decisions become transparent. We have developed a tool (GAC, Global Aquifer Control) to address the above requirements. The application has been developed using Cloud Computing technologies, which facilitate the above operations.
That is, GAC automatically updates the numerical models with the new available measurements, and then simulates numerous management options as required. To this end the application generates as many computing virtual machines as needed, customizing their size (CPU, memory…) accounting for all the particular requirements of every numerical model. Results are presented from a quantitative point of view (i.e. groundwater as a resource), and also from a qualitative perspective (i.e. the use of solute concentrations in groundwater as an environmental vector). In both cases detailed mass balances time series are obtained which can be used jointly with all the input and output model data to solve water conflicts between the different actors using and/or affecting the groundwater of the aquifer.

  16. Perspectives on the Future of CFD

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2000-01-01

    This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), which in the past has pioneered the field of flow simulation. Over time, CFD has progressed along with computing power: numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are now routinely computed, and direct numerical simulation (DNS) and large eddy simulation (LES) are used to study turbulence. As computing resources shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of heuristic models, and the development of combined CFD and information technology (IT) tools.

  17. Cognitive screening tools for identification of dementia in illiterate and low-educated older adults, a systematic review and meta-analysis.

    PubMed

    Paddick, Stella-Maria; Gray, William K; McGuire, Jackie; Richardson, Jenny; Dotchin, Catherine; Walker, Richard W

    2017-06-01

    The majority of older adults with dementia live in low- and middle-income countries (LMICs). Illiteracy and low educational background are common in older LMIC populations, particularly in rural areas, and cognitive screening tools developed for this setting must reflect this. This study aimed to review published validation studies of cognitive screening tools for dementia in low-literacy settings in order to determine the most appropriate tools for use. A systematic search of major databases was conducted according to PRISMA guidelines. Validation studies of brief cognitive screening tests including illiterate participants or those with elementary education were eligible. Studies were quality assessed using the QUADAS-2 tool. Good or fair quality studies were included in a bivariate random-effects meta-analysis and a hierarchical summary receiver operating characteristic (HSROC) curve constructed. Forty-five eligible studies were quality assessed. A significant proportion utilized a case-control design, resulting in spectrum bias. The area under the ROC (AUROC) curve was 0.937 for community/low prevalence studies, 0.881 for clinic based/higher prevalence studies, and 0.869 for illiterate populations. For the Mini-Mental State Examination (MMSE) (and adaptations), the AUROC curve was 0.853. Numerous tools for assessment of cognitive impairment in low-literacy settings have been developed, and tools developed for use in high-income countries have also been validated in low-literacy settings. Most tools have been inadequately validated, with only MMSE, cognitive abilities screening instrument (CASI), Eurotest, and Fototest having more than one published good or fair quality study in an illiterate or low-literate setting. At present no screening test can be recommended.

  18. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2018 update.

    PubMed

    Afgan, Enis; Baker, Dannon; Batut, Bérénice; van den Beek, Marius; Bouvier, Dave; Cech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Grüning, Björn A; Guerler, Aysam; Hillman-Jackson, Jennifer; Hiltemann, Saskia; Jalili, Vahid; Rasche, Helena; Soranzo, Nicola; Goecks, Jeremy; Taylor, James; Nekrutenko, Anton; Blankenberg, Daniel

    2018-05-22

    Galaxy (homepage: https://galaxyproject.org, main public server: https://usegalaxy.org) is a web-based scientific analysis platform used by tens of thousands of scientists across the world to analyze large biomedical datasets such as those found in genomics, proteomics, metabolomics and imaging. Started in 2005, Galaxy continues to focus on three key challenges of data-driven biomedical science: making analyses accessible to all researchers, ensuring analyses are completely reproducible, and making it simple to communicate analyses so that they can be reused and extended. During the last two years, the Galaxy team and the open-source community around Galaxy have made substantial improvements to Galaxy's core framework, user interface, tools, and training materials. Framework and user interface improvements now enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed. The Galaxy community has led an effort to create numerous high-quality tutorials focused on common types of genomic analyses. The Galaxy developer and user communities continue to grow and be integral to Galaxy's development. The number of Galaxy public servers, developers contributing to the Galaxy framework and its tools, and users of the main Galaxy server have all increased substantially.

  19. Numerical analysis of double chirp effect in tapered and linearly chirped fiber Bragg gratings.

    PubMed

    Markowski, Konrad; Jedrzejewski, Kazimierz; Osuch, Tomasz

    2016-06-10

    In this paper, a theoretical analysis of recently developed tapered chirped fiber Bragg gratings (TCFBG) written in co-directional and counter-directional configurations is presented. In particular, the effects of the synthesis of chirps resulting from both a fused taper profile and a linearly chirped fringe pattern of the induced refractive index changes within the fiber core are extensively examined. For this purpose, a numerical model based on the transfer matrix method (TMM) and the coupled mode theory (CMT) was developed for such a grating. The impact of TCFBG parameters, such as grating length and steepness of the taper transition, as well as the effect of the fringe pattern chirp rate on the spectral properties of the resulting gratings, are presented. Results show that, by using the appropriate design process, TCFBGs with reduced or enhanced resulting chirp, and thus with widely tailored spectral responses, can be easily achieved. In turn, it reveals a great potential application of such structures. The presented numerical approach provides an excellent tool for TCFBG design.
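The TMM/CMT machinery the abstract describes can be sketched compactly. The following is a minimal illustration, not the authors' code: a (possibly linearly chirped) grating is split into short uniform sections, each contributing a 2×2 coupled-mode transfer matrix, and the reflectivity is read off the accumulated matrix. All parameter values (coupling coefficient, length, effective index) are illustrative defaults, not taken from the paper.

```python
import cmath

def fbg_reflectivity(wavelengths, length=0.01, n_eff=1.447, kappa=200.0,
                     lambda0=1550e-9, chirp=0.0, sections=100):
    """Reflectivity spectrum of a fiber Bragg grating via the transfer
    matrix method (TMM) with coupled-mode theory (CMT).

    kappa : coupling coefficient [1/m]
    chirp : linear drift of the local Bragg wavelength along z [m per m]
    """
    dz = length / sections
    spectrum = []
    for lam in wavelengths:
        T = [[1 + 0j, 0j], [0j, 1 + 0j]]          # accumulated 2x2 matrix
        for m in range(sections):
            z = (m + 0.5) * dz
            lam_b = lambda0 + chirp * (z - length / 2)  # local Bragg wavelength
            delta = 2 * cmath.pi * n_eff * (1 / lam - 1 / lam_b)  # detuning
            gamma = cmath.sqrt(kappa**2 - delta**2)
            c, s = cmath.cosh(gamma * dz), cmath.sinh(gamma * dz)
            t = [[c - 1j * (delta / gamma) * s, -1j * (kappa / gamma) * s],
                 [1j * (kappa / gamma) * s, c + 1j * (delta / gamma) * s]]
            # Accumulate T <- T @ t (plain 2x2 multiply, no NumPy needed).
            T = [[T[0][0] * t[0][0] + T[0][1] * t[1][0],
                  T[0][0] * t[0][1] + T[0][1] * t[1][1]],
                 [T[1][0] * t[0][0] + T[1][1] * t[1][0],
                  T[1][0] * t[0][1] + T[1][1] * t[1][1]]]
        spectrum.append(abs(T[1][0] / T[0][0])**2)   # R = |r|^2
    return spectrum
```

For a uniform grating (chirp=0), the peak reflectivity reduces to the closed-form tanh²(κL), which makes a convenient sanity check on the matrix accumulation.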

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Chao; Xu, Jun; Cao, Lei

    The electrodes of lithium-ion batteries (LIB) are known to be brittle and to fail earlier than the separators during an external crush event. Thus, understanding the mechanical failure mechanisms of LIB electrodes (anode and cathode) is critical for the safety design of LIB cells. In this paper, we present experimental and numerical studies on the constitutive behavior and progression of failure in LIB electrodes. Mechanical tests were designed and conducted to evaluate the constitutive properties of porous electrodes. Constitutive models were developed to describe the stress-strain response of electrodes under uniaxial tensile and compressive loads. A failure criterion and a damage model were introduced to model their distinctive tensile and compressive failure behavior. The failure mechanism of LIB electrodes was studied using the blunt rod test on dry electrodes, and numerical models were built to simulate progressive failure. The different failure processes were examined and analyzed in detail numerically and correlated with experimentally observed failure phenomena. Finally, the test results and models improve our understanding of failure behavior in LIB electrodes and provide constructive insights for the future development of physics-based safety design tools for battery structures under mechanical abuse.

  1. Metric Use in the Tool Industry. A Status Report and a Test of Assessment Methodology.

    DTIC Science & Technology

    1982-04-20

    Weights and Measures), CIM - Computer-Integrated Manufacturing, CNC - Computer Numerical Control, DOD - Department of Defense, DODISS - DOD Index of... numerically controlled (CNC) machines that have an inch-millimeter selection switch and a corresponding dual readout scale. The use of both metric... satisfactorily met the demands of both domestic and foreign customers for metric machine tools by providing either metric-capable machines or NC and CNC

  2. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability-based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method can take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. Multiple design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time being capable of handling rank-ordered design objectives or goals. For simulation purposes, the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem in sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using (i) the linear extended interior penalty function method and (ii) Powell's conjugate directions method. Both single- and multi-objective numerical test cases are included, demonstrating the design tool's capabilities as applied to this design problem.
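The sequential unconstrained minimization mode can be illustrated with a toy sizing problem, not the report's pressure vessel case. The report uses a linear extended interior penalty function; the sketch below substitutes a simpler exterior quadratic penalty to show the sequential structure, and the rod-sizing numbers are hypothetical.

```python
import math

def golden_section(f, a, b, tol=1e-9):
    """Derivative-free 1-D minimizer of a unimodal f on [a, b]."""
    phi = (math.sqrt(5) - 1) / 2
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return (a + b) / 2

def sumt(objective, constraint, bounds, r_p=1.0, growth=10.0, rounds=12):
    """Sequential unconstrained minimization: solve a series of
    penalized problems with a growing exterior quadratic penalty
    on a single constraint g(x) <= 0."""
    x = sum(bounds) / 2
    for _ in range(rounds):
        pseudo = lambda x: objective(x) + r_p * max(0.0, constraint(x))**2
        x = golden_section(pseudo, *bounds)
        r_p *= growth   # tighten the penalty each cycle
    return x

# Hypothetical sizing example: choose rod radius r [m] minimizing mass
# while keeping axial stress F/(pi r^2) below an allowable value.
F, sigma_allow, rho, L = 1.0e4, 200.0e6, 7850.0, 1.0
mass = lambda r: rho * math.pi * r**2 * L
g = lambda r: F / (math.pi * r**2) - sigma_allow   # g(r) <= 0 required
r_opt = sumt(mass, g, bounds=(1e-4, 0.05))
```

For this one-variable case the constrained optimum is known analytically, r* = sqrt(F/(π·σ_allow)), so the sequential scheme can be checked against it directly.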

  3. Excitonic terahertz photoconductivity in intrinsic semiconductor nanowires.

    PubMed

    Yan, Jie-Yun

    2018-06-13

    Excitonic terahertz photoconductivity in intrinsic semiconductor nanowires is studied. Based on the excitonic theory, the numerical method to calculate the photoconductivity spectrum in the nanowires is developed, which can simulate optical pump terahertz-probe spectroscopy measurements on real nanowires and thereby calculate the typical photoconductivity spectrum. With the help of the energetic structure deduced from the calculated linear absorption spectrum, the numerically observed shift of the resonant peak in the photoconductivity spectrum is found to result from the dominant exciton transition between excited or continuum states to the ground state, and the quantitative analysis is in good agreement with the quantum plasmon model. Besides, the dependence of the photoconductivity on the polarization of the terahertz field is also discussed. The numerical method and supporting theoretical analysis provide a new tool for experimentalists to understand the terahertz photoconductivity in intrinsic semiconductor nanowires at low temperatures or for nanowires subjected to below bandgap photoexcitation, where excitonic effects dominate.

  4. Importance of inlet boundary conditions for numerical simulation of combustor flows

    NASA Technical Reports Server (NTRS)

    Sturgess, G. J.; Syed, S. A.; Mcmanus, K. R.

    1983-01-01

    Fluid dynamic computer codes for the mathematical simulation of problems in gas turbine engine combustion systems are required as design and diagnostic tools. To eventually achieve a performance standard with these codes of more than qualitative accuracy it is desirable to use benchmark experiments for validation studies. Typical of the fluid dynamic computer codes being developed for combustor simulations is the TEACH (Teaching Elliptic Axisymmetric Characteristics Heuristically) solution procedure. It is difficult to find suitable experiments which satisfy the present definition of benchmark quality. For the majority of the available experiments there is a lack of information concerning the boundary conditions. A standard TEACH-type numerical technique is applied to a number of test-case experiments. It is found that numerical simulations of gas turbine combustor-relevant flows can be sensitive to the plane at which the calculations start and the spatial distributions of inlet quantities for swirling flows.

  5. Excitonic terahertz photoconductivity in intrinsic semiconductor nanowires

    NASA Astrophysics Data System (ADS)

    Yan, Jie-Yun

    2018-06-01

    Excitonic terahertz photoconductivity in intrinsic semiconductor nanowires is studied. Based on the excitonic theory, the numerical method to calculate the photoconductivity spectrum in the nanowires is developed, which can simulate optical pump terahertz-probe spectroscopy measurements on real nanowires and thereby calculate the typical photoconductivity spectrum. With the help of the energetic structure deduced from the calculated linear absorption spectrum, the numerically observed shift of the resonant peak in the photoconductivity spectrum is found to result from the dominant exciton transition between excited or continuum states to the ground state, and the quantitative analysis is in good agreement with the quantum plasmon model. Besides, the dependence of the photoconductivity on the polarization of the terahertz field is also discussed. The numerical method and supporting theoretical analysis provide a new tool for experimentalists to understand the terahertz photoconductivity in intrinsic semiconductor nanowires at low temperatures or for nanowires subjected to below bandgap photoexcitation, where excitonic effects dominate.

  6. Interactive cutting path analysis programs

    NASA Technical Reports Server (NTRS)

    Weiner, J. M.; Williams, D. S.; Colley, S. R.

    1975-01-01

    The operation of numerically controlled machine tools is interactively simulated. Four programs were developed to graphically display the cutting paths for a Monarch lathe, a Cintimatic mill, and a Strippit sheet metal punch, and the wiring path for a Standard wire wrap machine. These programs run on an IMLAC PDS-ID graphic display system under the DOS-3 disk operating system. The cutting path analysis programs accept input via both paper tape and disk files.

  7. Manufacturing Technology Research Needs of the Gear Industry.

    DTIC Science & Technology

    1987-12-31

    Management Shortcomings within the U.S. Precision Gear Industry; 2.2.7 European Gear and Machine Tool Companies; 2.2.8 German... manufacturing becomes more sophisticated, workers are running numerically controlled computer equipment requiring an understanding of math. 2.2.6.9 Management... inefficiencies of the job-shop environment by managing the gear business as a backward integration of the assembly line. o Develop and maintain

  8. Real Time Baseball Database

    NASA Astrophysics Data System (ADS)

    Fukue, Yasuhiro

    The author describes the system outline, features, and operation of the "Nikkan Sports Realtime Baseball Database", which was developed and operated by Nikkan Sports Shimbun, K. K. The system enables numerical data from professional baseball games to be entered as the games proceed, with data updated in real time, just in time. Besides serving as a support tool for preparing newspapers, it is also available to broadcasting media and to general users through NTT Dial Q2 and other channels.

  9. Application for internal dosimetry using biokinetic distribution of photons based on nuclear medicine images*

    PubMed Central

    Leal Neto, Viriato; Vieira, José Wilson; Lima, Fernando Roberto de Andrade

    2014-01-01

    Objective This article presents a way to obtain estimates of dose in patients submitted to radiotherapy with basis on the analysis of regions of interest on nuclear medicine images. Materials and Methods A software called DoRadIo (Dosimetria das Radiações Ionizantes [Ionizing Radiation Dosimetry]) was developed to receive information about source organs and target organs, generating graphical and numerical results. The nuclear medicine images utilized in the present study were obtained from catalogs provided by medical physicists. The simulations were performed with computational exposure models consisting of voxel phantoms coupled with the Monte Carlo EGSnrc code. The software was developed with the Microsoft Visual Studio 2010 Service Pack and the project template Windows Presentation Foundation for C# programming language. Results With the mentioned tools, the authors obtained the file for optimization of Monte Carlo simulations using the EGSnrc; organization and compaction of dosimetry results with all radioactive sources; selection of regions of interest; evaluation of grayscale intensity in regions of interest; the file of weighted sources; and, finally, all the charts and numerical results. Conclusion The user interface may be adapted for use in clinical nuclear medicine as a computer-aided tool to estimate the administered activity. PMID:25741101

  10. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using the Perl script, which translates MATLAB comments in M-files into C-like ones, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for deploying the framework. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
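The translation step at the heart of such a framework can be sketched in a few lines. The following Python rendition (the paper uses Perl) converts MATLAB `%` comments into C++-style `///` comments and function definitions into C-like prototypes, so that Doxygen's existing parsers can process them. This is a hypothetical minimal filter, not a reproduction of the paper's script.

```python
import re

def matlab_to_doxygen(src):
    """Minimal Doxygen input-filter sketch for MATLAB M-files:
    '%' comments become '///' comments, and 'function' lines become
    C-like prototypes; all other code lines are hidden from the parser."""
    out = []
    for line in src.splitlines():
        stripped = line.lstrip()
        if stripped.startswith('%'):
            indent = line[:len(line) - len(stripped)]
            out.append(indent + '///' + stripped[1:])
        else:
            # Keep only function signatures, rewritten as prototypes.
            m = re.match(
                r'\s*function\s+(?:[\w\[\], ]+\s*=\s*)?(\w+)\s*(\([^)]*\))?',
                line)
            if m:
                out.append('function %s%s;' % (m.group(1), m.group(2) or '()'))
    return '\n'.join(out)
```

Run as a filter (e.g. via Doxygen's `INPUT_FILTER` mechanism), such a script lets Doxygen pick up the leading comment block of each M-file as documentation for the corresponding prototype.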

  11. Tools for Interdisciplinary Data Assimilation and Sharing in Support of Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Walker, J.; Suftin, I.; Warren, M.; Kunicki, T.

    2013-12-01

    Information consumed and produced in hydrologic analyses is interdisciplinary and massive. These factors put a heavy information-management burden on the hydrologic science community. The U.S. Geological Survey (USGS) Office of Water Information Center for Integrated Data Analytics (CIDA) seeks to assist hydrologic science investigators with all components of their scientific data management life cycle. Ongoing data publication and software development projects will be presented, demonstrating publicly available data access services and manipulation tools being developed with support from two Department of the Interior initiatives. The USGS-led National Water Census seeks to provide both data and tools in support of nationally consistent water availability estimates. Newly available data include national coverages of radar-indicated precipitation, actual evapotranspiration, water use estimates aggregated by county, and Southeast region estimates of streamflow for 12-digit hydrologic unit code watersheds. Web services making these data available, and applications to access them, will be demonstrated. Web-available processing services able to provide numerous streamflow statistics for any USGS daily flow record or model result time series, along with other National Water Census processing tools, will also be demonstrated. The National Climate Change and Wildlife Science Center is a USGS center leading DOI-funded academic global change adaptation research. It has a mission goal to ensure that data used and produced by funded projects are available via web services and tools that streamline data management tasks in interdisciplinary science. For example, collections of downscaled climate projections, typically large collections of files that must be downloaded to be accessed, are being published using web services that allow access to the entire dataset via simple web-service requests and numerous processing tools.
Recent progress on this front includes data web services for Coupled Model Intercomparison Project Phase 5 based downscaled climate projections, EPA's Integrated Climate and Land Use Scenarios projections of population and land cover metrics, and MODIS-derived land cover parameters from NASA's Land Processes Distributed Active Archive Center. These new services, and ways to discover others, will be presented through demonstration of a recently open-sourced project, from a web application or a scripted workflow. Development and public deployment of server-based processing tools to subset and summarize these and other data is ongoing at the CIDA with partner groups such as 52 Degrees North and Unidata. The latest progress on subsetting, spatial summarization to areas of interest, and temporal summarization via common statistical methods will be presented.

  12. Modeling languages for biochemical network simulation: reaction vs equation based approaches.

    PubMed

    Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya

    2010-01-01

    Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulating, and analyzing models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools arose much earlier. Several general modeling languages, such as Modelica, were developed in the 1990s. Modelica enables an equation-based, modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction-based approach of SBML with the equation-based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility of constraint specification, different modeling flavors, and hierarchical, modular, and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling, and network analysis features is discussed. As a major result it is shown that the choice of modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
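The contrast between the two flavors can be made concrete: a reaction-based specification (a list of stoichiometries and rate laws, as in SBML) is mechanically compiled into the equation-based form dS/dt = N·v(S) and integrated. A minimal sketch with a hypothetical two-step mass-action pathway, using a simple explicit Euler integrator:

```python
def simulate(reactions, species, t_end, dt=1e-3):
    """Compile a reaction-based model (SBML style) into the
    equation-based ODE form dS/dt = N . v(S) and integrate it
    with explicit Euler.

    reactions: list of (stoichiometry dict, rate function) pairs."""
    s = dict(species)
    t = 0.0
    while t < t_end:
        rates = [(stoich, rate(s)) for stoich, rate in reactions]
        for stoich, v in rates:
            for name, coeff in stoich.items():
                s[name] += coeff * v * dt
        t += dt
    return s

# Hypothetical two-step pathway A -> B -> C with mass-action kinetics.
k1, k2 = 1.0, 0.5
net = [
    ({'A': -1, 'B': +1}, lambda s: k1 * s['A']),
    ({'B': -1, 'C': +1}, lambda s: k2 * s['B']),
]
final = simulate(net, {'A': 1.0, 'B': 0.0, 'C': 0.0}, t_end=10.0)
```

Because each reaction's stoichiometric coefficients sum to zero here, total mass is conserved by construction, which is a useful check on the compiled equations.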

  13. Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.

    PubMed

    Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca

    2018-05-01

    CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.

  14. Nucleation and microstructure development in Cr-Mo-V tool steel during gas atomization

    NASA Astrophysics Data System (ADS)

    Behúlová, M.; Grgač, P.; Čička, R.

    2017-11-01

    Nucleation studies of undercooled metallic melts are of essential interest for understanding phase selection, growth kinetics, and microstructure development during rapid non-equilibrium solidification. The paper deals with the modelling of nucleation processes and microstructure development in the hypoeutectic tool steel Ch12MF4, with a chemical composition of 2.37% C, 12.06% Cr, 1.2% Mo, 4.0% V, balance Fe [wt.%], during nitrogen gas atomization. Based on the classical theory of homogeneous nucleation, the nucleation temperature of rapidly cooled molten spherical particles of this alloy, with diameters from 40 μm to 600 μm, is calculated using various estimates of the parameters influencing the nucleation process: the Gibbs free energy difference between the solid and liquid phases and the solid/liquid interfacial energy. The results of the numerical calculations are compared with nucleation temperatures measured during levitation experiments and with the microstructures developed in rapidly solidified powder particles of the investigated alloy.
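The quantities at the heart of classical homogeneous nucleation theory can be computed directly. The sketch below evaluates the critical nucleus radius and the nucleation barrier under the linear (Turnbull-type) approximation for the volumetric driving force; the material constants are illustrative, not the Ch12MF4 values used in the paper.

```python
import math

def nucleation_barrier(sigma, dH_v, T_m, T):
    """Classical homogeneous nucleation: critical radius r* and energy
    barrier dG* for an undercooled melt, using the linear approximation
    dG_v = dH_v * (T_m - T) / T_m for the driving force.

    sigma : solid/liquid interfacial energy [J/m^2]
    dH_v  : volumetric heat of fusion [J/m^3]
    """
    dG_v = dH_v * (T_m - T) / T_m              # driving force per unit volume
    r_star = 2.0 * sigma / dG_v                # critical nucleus radius
    dG_star = 16.0 * math.pi * sigma**3 / (3.0 * dG_v**2)  # barrier
    return r_star, dG_star
```

Deeper undercooling increases the driving force, so both the critical radius and the barrier shrink; the identity dG* = (4π/3)·σ·r*² provides an internal consistency check.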

  15. Rapid processing of data based on high-performance algorithms for solving inverse problems and 3D-simulation of the tsunami and earthquakes

    NASA Astrophysics Data System (ADS)

    Marinin, I. V.; Kabanikhin, S. I.; Krivorotko, O. I.; Karas, A.; Khidasheli, D. G.

    2012-04-01

    We consider new techniques and methods for earthquake- and tsunami-related problems, particularly inverse problems for the determination of tsunami source parameters, numerical simulation of long-wave propagation in soil and water, and tsunami risk estimation. In addition, we touch upon the issues of database management and destruction scenario visualization. New approaches and strategies, as well as mathematical tools and software, are shown. Long joint investigations by researchers of the Institute of Mathematical Geophysics and Computational Mathematics SB RAS and specialists from WAPMERR and Informap have produced special theoretical approaches, numerical methods, and software for tsunami and earthquake modeling (modeling of the propagation and run-up of tsunami waves on coastal areas), visualization, and risk estimation for tsunamis and earthquakes. Algorithms have been developed for the operational determination of the origin and form of the tsunami source. The TSS system numerically simulates the tsunami and/or earthquake source and includes the possibility of solving both the direct and the inverse problem. It becomes possible to involve advanced mathematical results to improve models and to increase the resolution of inverse problems. Via TSS one can construct risk maps, online disaster scenarios, and estimates of potential damage to buildings and roads. One of the main tools for the numerical modeling is the finite volume method (FVM), which allows us to achieve stability with respect to possible input errors as well as optimum computing speed. Our approach to the inverse problem of tsunami and earthquake source determination is based on recent theoretical results concerning the Dirichlet problem for the wave equation. This problem is intrinsically ill-posed. We use an optimization approach to solve it and SVD analysis to estimate the degree of ill-posedness and to find the quasi-solution. 
The software system we developed is intended to realize a "no frost" technology: a steady cycle of direct and inverse problems, i.e., solving the direct problem, visualizing and comparing results with observed data, and solving the inverse problem (correcting the model parameters). The main objective of further work is the creation of an operational emergency workstation tool that could be used by emergency duty personnel in real time.
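The SVD-based quasi-solution step can be illustrated on a toy linear inverse problem (assuming NumPy is available; this is not the TSS implementation): singular values below a threshold are truncated, the retained count measures the numerical rank and hence the degree of ill-posedness, and the result is the minimum-norm quasi-solution.

```python
import numpy as np

def tsvd_solve(A, b, tol=1e-8):
    """Quasi-solution of an ill-posed linear problem A x = b by
    truncated SVD: singular values below tol * s_max are discarded,
    which both regularizes the problem and quantifies its
    ill-posedness via the effective rank."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > tol * s[0]
    # Minimum-norm solution restricted to the retained singular subspace.
    x = Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])
    return x, int(keep.sum())

# Hypothetical rank-deficient source-recovery toy problem.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],     # duplicate information: rank 2
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, -1.0, 2.0])
b = A @ x_true
x_hat, rank = tsvd_solve(A, b)
```

The recovered quasi-solution reproduces the data exactly (b lies in the range of A) while having smaller norm than the original x_true, since any null-space component is unrecoverable and is set to zero.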

  16. Numerical modeling of the fracture process in a three-unit all-ceramic fixed partial denture.

    PubMed

    Kou, Wen; Kou, Shaoquan; Liu, Hongyuan; Sjögren, Göran

    2007-08-01

    The main objectives were to examine the fracture mechanism and fracture process of a ceramic fixed partial denture (FPD) framework under simulated mechanical loading using a recently developed numerical modeling code, the R-T(2D) code, and also to evaluate the suitability of the R-T(2D) code as a tool for this purpose. Using the code, the fracture mechanism and process of a three-unit yttria-tetragonal zirconia polycrystal (Y-TZP) FPD framework were simulated under static loading. In addition, the fracture pattern obtained in the numerical simulation was compared with that obtained in a previous laboratory test. The results revealed that the framework fracture pattern obtained in the numerical simulation agreed with that observed in the laboratory test. Quasi-photoelastic stress fringe patterns and acoustic emission showed that the fracture mechanism was tensile failure and that the crack started at the lower boundary of the framework. The fracture process could be followed both step by step and within each step. Based on the findings of the current study, the R-T(2D) code seems suitable for use as a complement to other tests and clinical observations in studying stress distributions, fracture mechanisms, and fracture processes in ceramic FPD frameworks.

  17. Bed turbulent kinetic energy boundary conditions for trapping efficiency and spatial distribution of sediments in basins.

    PubMed

    Isenmann, Gilles; Dufresne, Matthieu; Vazquez, José; Mose, Robert

    2017-10-01

    The purpose of this study is to develop and validate a numerical tool for evaluating the performance of a settling basin with regard to the trapping of suspended matter. The Euler-Lagrange approach was chosen to model the flow and sediment transport. The numerical model relies on the open-source library OpenFOAM®, enhanced with new particle/wall interaction conditions that restrict sediment deposition to zones with favourable hydrodynamic conditions (shear stress, turbulent kinetic energy). In particular, a new relation is proposed for calculating the turbulent kinetic energy threshold as a function of the properties of each particle (diameter and density). The numerical model is compared with three experimental datasets taken from the literature, collected on scale models of basins. The comparison of the numerical and experimental results shows that the model can predict the trapping of particles in a settling basin with an absolute error in the region of 5% when sediment deposition occurs over the entire bed. When sediment deposition is localised in preferential zones, its distribution is reproduced well by the model and the trapping efficiency is evaluated with an absolute error in the region of 10% (excluding cases of particles with very low density).
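The per-particle deposition test can be sketched as follows. This is a hypothetical stand-in for the paper's relation: here the threshold on bed turbulent kinetic energy is taken proportional to the squared Stokes settling velocity of each particle, with an invented calibration coefficient c.

```python
def settles(k_bed, d, rho_p, rho_f=1000.0, g=9.81, mu=1.0e-3, c=0.15):
    """Particle/wall interaction sketch: a particle reaching the bed is
    trapped only where the bed turbulent kinetic energy k_bed [m^2/s^2]
    is below a per-particle threshold depending on diameter d [m] and
    density rho_p [kg/m^3]. The proportionality coefficient c is a
    hypothetical calibration constant, not the paper's relation.
    """
    # Stokes settling velocity for a small sphere in water.
    w_s = (rho_p - rho_f) * g * d**2 / (18.0 * mu)
    k_crit = c * w_s**2 if w_s > 0 else 0.0   # per-particle TKE threshold
    return k_bed < k_crit
```

With such a criterion, coarse dense particles deposit even on a moderately turbulent bed, while fine particles are re-entrained everywhere except in very calm zones, reproducing the preferential deposition patterns the abstract describes.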

  18. Temperature Measurement and Numerical Prediction in Machining Inconel 718.

    PubMed

    Díaz-Álvarez, José; Tapetado, Alberto; Vázquez, Carmen; Miguélez, Henar

    2017-06-30

    Thermal issues are critical when machining Ni-based superalloy components designed for high-temperature applications. The low thermal conductivity and extreme strain hardening of this family of materials result in elevated temperatures around the cutting area. This elevated temperature could lead to machining-induced damage such as phase changes and residual stresses, resulting in reduced service life of the component. Measurement of temperature during machining is crucial in order to control the cutting process and avoid workpiece damage. On the other hand, the development of predictive tools based on numerical models helps in the definition of machining processes and in obtaining difficult-to-measure parameters such as the penetration of the heated layer. However, the validation of numerical models strongly depends on the accurate measurement of physical parameters such as temperature, ensuring the calibration of the model. This paper focuses on the measurement and prediction of temperature during the machining of Ni-based superalloys. The temperature sensor is a fiber-optic two-color pyrometer developed for localized temperature measurements in the turning of Inconel 718. The sensor is capable of measuring temperatures in the range of 250 to 1200 °C. Temperature evolution is recorded in a lathe at different feed rates and cutting speeds. The measurements were used to calibrate a simplified numerical model for predicting temperature fields during turning.
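
    The two-color principle itself is standard: under Wien's approximation of Planck's law and a gray-body assumption, emissivity cancels in the ratio of the two spectral intensities, so temperature follows from the measured ratio alone. A minimal sketch (the wavelengths below are illustrative, not the sensor's actual bands):

    ```python
    import math

    C2 = 1.4388e-2  # second radiation constant c2 = h*c/k_B, in m*K

    def ratio_temperature(i1, i2, lam1, lam2):
        """Infer temperature (K) from the intensity ratio of a two-color
        pyrometer using Wien's approximation, assuming a gray body so that
        emissivity cancels in the ratio.
        Wien: I(lam, T) ~ lam**-5 * exp(-C2 / (lam * T)), hence
        ln(i1/i2) = 5*ln(lam2/lam1) + (C2/T)*(1/lam2 - 1/lam1)."""
        lhs = math.log(i1 / i2) - 5.0 * math.log(lam2 / lam1)
        return C2 * (1.0 / lam2 - 1.0 / lam1) / lhs
    ```

    Generating the two intensities from Wien's law at a known temperature and feeding them back through `ratio_temperature` recovers that temperature exactly, which is a convenient self-check when calibrating against a blackbody source.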

  19. Machine learning research 1989-90

    NASA Technical Reports Server (NTRS)

    Porter, Bruce W.; Souther, Arthur

    1990-01-01

    Multifunctional knowledge bases offer a significant advance in artificial intelligence because they can support numerous expert tasks within a domain. As a result they amortize the costs of building a knowledge base over multiple expert systems and they reduce the brittleness of each system. Due to the inevitable size and complexity of multifunctional knowledge bases, their construction and maintenance require knowledge engineering and acquisition tools that can automatically identify interactions between new and existing knowledge. Furthermore, their use requires software for accessing those portions of the knowledge base that coherently answer questions. Considerable progress was made in developing software for building and accessing multifunctional knowledge bases. A language was developed for representing knowledge, along with software tools for editing and displaying knowledge, a machine learning program for integrating new information into existing knowledge, and a question answering system for accessing the knowledge base.

  20. NAFLA: a simulation tool for the analytical estimation of contaminant plume lengths

    NASA Astrophysics Data System (ADS)

    Kumar Yadav, Prabhas; Händel, Falk; Müller, Christian; Liedl, Rudolf; Dietrich, Peter

    2013-03-01

    Groundwater pollution with organic contaminants remains a world-wide problem. Before selecting a remediation technique, it is important to pre-assess contaminated sites with respect to the hazard they pose. Several analytical and numerical approaches have been used for this purpose, and for the initial assessment of contaminated sites the MS-Excel© tool "NAFLA" was developed. "NAFLA" allows a quick and straightforward calculation and comparison of several analytical approaches for estimating the maximum plume length under steady-state conditions. These approaches differ from each other in source geometry, model domain orientation, and in whether (bio)chemical reactions within the domain are considered. In this communication, we provide details about the development of "NAFLA", its possible uses and information for users. The tool is designed especially for application in student education and by authorities and consultants.
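
    The specific analytical approaches implemented in "NAFLA" are not reproduced in the abstract. As a hedged illustration of the general idea of a closed-form steady-state plume length, the textbook 1-D case with advection and first-order degradation (dispersion neglected) gives one directly; this is an illustrative approximation, not one of the tool's actual models:

    ```python
    import math

    def plume_length_first_order(v, lam, c0, c_threshold):
        """Illustrative steady-state plume length for 1-D advection with
        first-order degradation, where c(x) = c0 * exp(-lam * x / v):
        returns the distance at which concentration drops to c_threshold.
          v            seepage velocity (m/d)
          lam          first-order degradation rate (1/d)
          c0           source concentration
          c_threshold  regulatory threshold concentration (same units as c0)"""
        return (v / lam) * math.log(c0 / c_threshold)
    ```

    Real screening formulas of the kind "NAFLA" compares additionally account for transverse dispersion and source geometry, which is precisely why the tool juxtaposes several of them.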

  1. Vaccines: the fourth century.

    PubMed

    Plotkin, Stanley A

    2009-12-01

    Vaccine development, which began with Edward Jenner's observations in the late 18th century, has entered its 4th century. From its beginnings, with the use of whole organisms that had been weakened or inactivated, to the modern-day use of genetic engineering, it has taken advantage of the tools discovered in other branches of microbiology. Numerous successful vaccines are in use, but the list of diseases for which vaccines do not exist is long. However, the multiplicity of strategies now available, discussed in this article, portends even more successful development of vaccines.

  2. Quantitative analysis and comparative study of four cities green pattern in API system on the background of big data

    NASA Astrophysics Data System (ADS)

    Xin, YANG; Si-qi, WU; Qi, ZHANG

    2018-05-01

    Beijing, London, Paris and New York are typical world cities, so a comparative study of their green patterns is important for identifying gaps and advantages and for mutual learning. This paper provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns in different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density and Standard Ellipse tools on the ArcGIS platform can process and summarize the data and enable a quantitative analysis of green patterns. The paper summarizes the unique features of the four cities' green patterns, and the reasons for their formation, on the basis of numerical comparison.
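
    As an illustration of the first of these tools, the Average Nearest Neighbor statistic compares the observed mean nearest-neighbour distance with the value expected under complete spatial randomness, 0.5/sqrt(n/area). A minimal re-implementation of the ratio (a sketch, not the ArcGIS code, which also reports a z-score):

    ```python
    import math

    def ann_ratio(points, area):
        """Average Nearest Neighbor ratio: observed mean nearest-neighbour
        distance divided by the expected mean distance 0.5/sqrt(n/area) for
        a random pattern over the given study area.
        Ratio < 1 suggests clustering, > 1 dispersion."""
        n = len(points)
        observed = sum(
            min(math.dist(p, q) for q in points if q is not p) for p in points
        ) / n
        expected = 0.5 / math.sqrt(n / area)
        return observed / expected
    ```

    A regular grid of green spaces yields a ratio above 1 (dispersed), while parks concentrated in one district yield a ratio well below 1 (clustered), which is how the statistic distinguishes the four cities' patterns.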

  3. Quality Improvement Project: Replacing the Numeric Rating Scale with a Clinically Aligned Pain Assessment (CAPA) Tool.

    PubMed

    Topham, Debra; Drew, Debra

    2017-12-01

    CAPA is a multifaceted pain assessment tool that was adopted at a large tertiary Midwest hospital to replace the numeric scale for adult patients who could self-report their pain experience. This article describes the process of implementation and the effect on patient satisfaction scores. Use of the tool is supported by the premise that pain assessment entails more than just pain intensity and that assessment is an exchange of meaning between patients and clinicians dependent on internal and external factors. Implementation of the tool was a transformative process resulting in modest increases in patient satisfaction scores with pain management. Patient reports that "staff did everything to manage pain" had the biggest gains and were sustained for more than 2 years. The CAPA tool meets regulatory requirements for pain assessment. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  4. Itzï (version 17.1): an open-source, distributed GIS model for dynamic flood simulation

    NASA Astrophysics Data System (ADS)

    Guillaume Courty, Laurent; Pedrozo-Acuña, Adrián; Bates, Paul David

    2017-05-01

    Worldwide, floods are acknowledged as one of the most destructive hazards. In human-dominated environments, their negative impacts are ascribed not only to the increase in frequency and intensity of floods but also to a strong feedback between the hydrological cycle and anthropogenic development. In order to advance a more comprehensive understanding of this complex interaction, this paper presents the development of a new open-source tool named Itzï that enables the 2-D numerical modelling of rainfall-runoff processes and surface flows integrated with the open-source geographic information system (GIS) software known as GRASS. It therefore takes advantage of the ability of GIS environments to handle datasets with variations in both temporal and spatial resolution. Furthermore, the tool can handle datasets from different sources with varied spatial resolutions, facilitating the preparation and management of input and forcing data and reducing the preprocessing time usually required by other models. Itzï uses a simplified form of the shallow water equations, the damped partial inertia equation, for the resolution of surface flows, and the Green-Ampt model for infiltration. The source code is now publicly available online, along with complete documentation. The numerical model is verified against three different test cases: firstly, a comparison with an analytic solution of the shallow water equations is introduced; secondly, a hypothetical flooding event in an urban area is implemented, where results are compared to those from an established model using a similar approach; and lastly, the reproduction of a real inundation event that occurred in the city of Kingston upon Hull, UK, in June 2007, is presented. The numerical approach proved its ability to reproduce the analytic and synthetic test cases. Moreover, simulation results of the real flood event showed its suitability for identifying areas affected by flooding, which were verified against those recorded after the event by local authorities.
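
    Of the two process models named above, Green-Ampt is the simpler. A conceptual sketch of its capacity equation, f = Ks * (1 + psi * d_theta / F), integrated with a crude explicit time step (Itzï's own discretisation may differ, and the parameter values in the test are illustrative):

    ```python
    def green_ampt(ks, psi, d_theta, duration, dt=1.0):
        """Cumulative infiltration F (m) under ponded conditions using the
        Green-Ampt capacity equation f = ks * (1 + psi*d_theta/F), integrated
        with a simple explicit time step.
          ks       saturated hydraulic conductivity (m/s)
          psi      wetting-front suction head (m)
          d_theta  soil moisture deficit (dimensionless)
          duration total time (s)"""
        f_cum = ks * dt  # small initial infiltration to avoid division by zero
        t = dt
        while t < duration:
            rate = ks * (1.0 + psi * d_theta / f_cum)  # capacity falls as F grows
            f_cum += rate * dt
            t += dt
        return f_cum
    ```

    The key behaviour, and the reason the model suits flood simulation, is that the infiltration rate starts high and decays toward Ks as the wetting front advances, so early rainfall infiltrates while later rainfall runs off.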

  5. Genetic basis in motor skill and hand preference for tool use in chimpanzees (Pan troglodytes).

    PubMed

    Hopkins, William D; Reamer, Lisa; Mareno, Mary Catherine; Schapiro, Steven J

    2015-02-07

    Chimpanzees are well known for their tool-using abilities. Numerous studies have documented variability in tool use among chimpanzees and the role that social learning and other factors play in its development. There are also findings on hand use in both captive and wild chimpanzees; less well understood, however, are the potential roles of genetic and non-genetic mechanisms in determining individual differences in tool use skill and laterality. Here, we examined heritability in tool use skill and handedness for a probing task in a sample of 243 captive chimpanzees. Quantitative genetic analysis, based on the extant pedigrees, showed that both tool use skill and handedness were significantly heritable. Significant heritability in motor skill was evident in two genetically distinct populations of apes and between two cohorts that received different early social rearing experiences. We further found that motor skill decreased with age and that males were more commonly left-handed than females. Collectively, these data suggest that though non-genetic factors do influence tool use performance and handedness in chimpanzees, genetic factors also play a significant role, as has been reported in humans. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  6. Cohesive Laws and Progressive Damage Analysis of Composite Bonded Joints, a Combined Numerical/Experimental Approach

    NASA Technical Reports Server (NTRS)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2015-01-01

    The results of an experimental/numerical campaign aimed at developing progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and on cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations, in agreement with the experimental tests, indicate that the model is capable of predicting the interactions of the damage modes that lead to failure of the joint.
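
    The J-integral step rests on the standard identity that the cohesive traction is the derivative of J with respect to the crack-tip opening displacement, sigma(delta) = dJ/d(delta). A minimal sketch of recovering the law from tabulated, DIC-derived J data by central finite differences (the data layout is an assumption; the paper's own processing may differ):

    ```python
    def cohesive_law(deltas, j_values):
        """Recover the cohesive traction-separation law from J-integral data:
        sigma(delta) = dJ/d(delta), evaluated by central finite differences
        on the (assumed monotonically increasing) opening displacements.
        Returns the tractions at the interior delta points."""
        tractions = []
        for i in range(1, len(deltas) - 1):
            tractions.append(
                (j_values[i + 1] - j_values[i - 1]) / (deltas[i + 1] - deltas[i - 1])
            )
        return tractions
    ```

    In practice the measured J(delta) curve is smoothed or fitted before differentiation, since numerical differentiation amplifies measurement noise.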

  7. Advanced Computational Modeling of Vapor Deposition in a High-Pressure Reactor

    NASA Technical Reports Server (NTRS)

    Cardelino, Beatriz H.; Moore, Craig E.; McCall, Sonya D.; Cardelino, Carlos A.; Dietz, Nikolaus; Bachmann, Klaus

    2004-01-01

    In search of novel approaches to produce new materials for electro-optic technologies, advances have been achieved in the development of computer models for vapor deposition reactors in space. Numerical simulations are invaluable tools for costly and difficult processes, such as experiments designed for high pressures and microgravity conditions. Indium nitride is a candidate compound for high-speed laser and photodiodes for optical communication systems, as well as for semiconductor lasers operating into the blue and ultraviolet regions. However, InN and other nitride compounds exhibit large thermal decomposition at their optimum growth temperatures. In addition, epitaxy at lower temperatures and subatmospheric pressures incorporates indium droplets into the InN films. Surface stabilization data nevertheless indicate that InN could be grown at 900 K under high nitrogen pressures, and microgravity could provide laminar flow conditions. Numerical models for chemical vapor deposition have been developed, coupling complex chemical kinetics with fluid dynamic properties.

  9. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices

    NASA Astrophysics Data System (ADS)

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-09-01

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes’ (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body.

  10. OVERVIEW OF NEUTRON MEASUREMENTS IN JET FUSION DEVICE.

    PubMed

    Batistoni, P; Villari, R; Obryk, B; Packer, L W; Stamatelatos, I E; Popovichev, S; Colangeli, A; Colling, B; Fonnesu, N; Loreti, S; Klix, A; Klosowski, M; Malik, K; Naish, J; Pillon, M; Vasilopoulou, T; De Felice, P; Pimpinella, M; Quintieri, L

    2017-10-05

    The design and operation of the ITER experimental fusion reactor require the development of neutron measurement techniques and numerical tools to derive the fusion power and the radiation field in the device and in the surrounding areas. Nuclear analyses provide essential input to the conceptual design, optimisation, engineering and safety case in ITER and in power plant studies. The required radiation transport calculations are extremely challenging because of the large physical extent of the reactor plant, the complexity of the geometry, and the combination of deep penetration and streaming paths. This article reports the experimental activities carried out at JET to validate the neutronics measurement methods and numerical tools used in ITER and power plant design. A new deuterium-tritium campaign is proposed at JET in 2019: the unique 14 MeV neutron yields produced will be exploited as much as possible to validate measurement techniques, codes, procedures and data currently used in ITER design, thus reducing the related uncertainties and the associated risks in machine operation. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices

    PubMed Central

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-01-01

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes’ (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body. PMID:27670953

  12. TRENDS: A flight test relational database user's guide and reference manual

    NASA Technical Reports Server (NTRS)

    Bondi, M. J.; Bjorkman, W. S.; Cross, J. L.

    1994-01-01

    This report is designed to be a user's guide and reference manual for users intending to access rotorcraft test data via TRENDS, the relational database system developed as a tool for aeronautical engineers with no programming background. The report has been written to assist both novice and experienced TRENDS users. TRENDS is a complete system for retrieving, searching, and analyzing both numerical and narrative data, and for displaying time-history and statistical data in graphical and numerical formats. This manual provides a 'guided tour' and a 'user's guide' for new and intermediate-skilled users. Examples of the use of each menu item within TRENDS are provided in the Menu Reference section of the manual, including full coverage of TIMEHIST, one of the key tools. The manual is written around the XV-15 Tilt Rotor database, but includes an appendix on the UH-60 Blackhawk database. This user's guide and reference manual establishes a referable source for the research community and augments NASA TM-101025, TRENDS: The Aeronautical Post-Test Database Management System, Jan. 1990, written by the same authors.

  13. Stretchable Materials for Robust Soft Actuators towards Assistive Wearable Devices.

    PubMed

    Agarwal, Gunjan; Besuchet, Nicolas; Audergon, Basile; Paik, Jamie

    2016-09-27

    Soft actuators made from elastomeric active materials can find widespread potential implementation in a variety of applications ranging from assistive wearable technologies targeted at biomedical rehabilitation or assistance with activities of daily living, bioinspired and biomimetic systems, to gripping and manipulating fragile objects, and adaptable locomotion. In this manuscript, we propose a novel two-component soft actuator design and design tool that produces actuators targeted towards these applications with enhanced mechanical performance and manufacturability. Our numerical models developed using the finite element method can predict the actuator behavior at large mechanical strains to allow efficient design iterations for system optimization. Based on two distinctive actuator prototypes' (linear and bending actuators) experimental results that include free displacement and blocked-forces, we have validated the efficacy of the numerical models. The presented extensive investigation of mechanical performance for soft actuators with varying geometric parameters demonstrates the practical application of the design tool, and the robustness of the actuator hardware design, towards diverse soft robotic systems for a wide set of assistive wearable technologies, including replicating the motion of several parts of the human body.

  14. Thermographic Analysis of Stress Distribution in Welded Joints

    NASA Astrophysics Data System (ADS)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses is generally not reliable. The stress distribution in the welded area, affected by geometrical inhomogeneity, irregular weld surfaces and the weld toe radius, is quite complex, so the local (structural) stress concept has been adopted in recent papers. The aim of this paper is to determine the stress distribution in plate-type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigation, and to obtain numerical values of stress concentration factors for practical use. The stress distribution in aluminum butt and fillet welded joints is determined using three different methods: strain gauge measurement, thermal stress analysis and FEM. The results obtained show good agreement: TSA confirmed both the FEM model and the stresses measured by strain gauges. On this basis, it may be stated that TSA, a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints and can help to develop more accurate numerical tools for fatigue life prediction.

  15. Characterizing relationship between optical microangiography signals and capillary flow using microfluidic channels.

    PubMed

    Choi, Woo June; Qin, Wan; Chen, Chieh-Li; Wang, Jingang; Zhang, Qinqin; Yang, Xiaoqi; Gao, Bruce Z; Wang, Ruikang K

    2016-07-01

    Optical microangiography (OMAG) is a powerful optical angiographic tool for visualizing microvascular flow in vivo. Despite numerous demonstrations over the past several years of the qualitative relationship between OMAG and flow, no convincing quantitative relationship has been established. In this paper, we attempt to quantitatively correlate the OMAG signal with flow. Specifically, we develop a simplified analytical model of complex OMAG, suggesting that the OMAG signal is a product of the number of particles in an imaging voxel and the decorrelation of the OCT (optical coherence tomography) signal, which is determined by the flow velocity, the inter-frame time interval, and the wavelength of the light source. Numerical simulation with the proposed model reveals that if the OCT amplitudes are correlated, the OMAG signal is related to the total number of particles crossing the imaging voxel cross-section per unit time (flux); otherwise it saturates, with a strength proportional to the number of particles in the imaging voxel (concentration). The relationship is validated using microfluidic flow phantoms with various preset flow metrics. This work suggests that OMAG is a promising quantitative tool for the assessment of vascular flow.

  16. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  17. What Physicists Should Know About High Performance Computing - Circa 2002

    NASA Astrophysics Data System (ADS)

    Frederick, Donald

    2002-08-01

    High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others, primarily from the disciplines that have been major users of HPC resources: physics, chemistry and engineering, with increasing use by the life sciences. There is a technological dynamic that is powered by economic as well as technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while and will be of use to scientists who either are considering, or have already started down, the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as computing hardware basics, single-CPU optimization, compilers, timing, numerical libraries, debugging and profiling tools, and the emergence of computational grids.

  18. A survey of parallel programming tools

    NASA Technical Reports Server (NTRS)

    Cheng, Doreen Y.

    1991-01-01

    This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with the current and future needs of the Numerical Aerodynamic Simulation (NAS) facility in mind: existing and anticipated NAS supercomputers and workstations, operating systems, programming languages, and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.

  19. Multiscale Mechano-Biological Finite Element Modelling of Oncoplastic Breast Surgery—Numerical Study towards Surgical Planning and Cosmetic Outcome Prediction

    PubMed Central

    Eiben, Bjoern; Hipwell, John H.; Williams, Norman R.; Keshtgar, Mo; Hawkes, David J.

    2016-01-01

    Surgical treatment for early-stage breast carcinoma primarily necessitates breast conserving therapy (BCT), where the tumour is removed while preserving the breast shape. To date, there have been very few attempts to develop accurate and efficient computational tools that could be used in the clinical environment for pre-operative planning and oncoplastic breast surgery assessment. Moreover, from the breast cancer research perspective, there has been very little effort to model complex mechano-biological processes involved in wound healing. We address this by providing an integrated numerical framework that can simulate the therapeutic effects of BCT over the extended period of treatment and recovery. A validated, three-dimensional, multiscale finite element procedure that simulates breast tissue deformations and physiological wound healing is presented. In the proposed methodology, a partitioned, continuum-based mathematical model for tissue recovery and angiogenesis, and breast tissue deformation is considered. The effectiveness and accuracy of the proposed numerical scheme is illustrated through patient-specific representative examples. Wound repair and contraction numerical analyses of real MRI-derived breast geometries are investigated, and the final predictions of the breast shape are validated against post-operative follow-up optical surface scans from four patients. Mean (standard deviation) breast surface distance errors in millimetres of 3.1 (±3.1), 3.2 (±2.4), 2.8 (±2.7) and 4.1 (±3.3) were obtained, demonstrating the ability of the surgical simulation tool to predict, pre-operatively, the outcome of BCT to clinically useful accuracy. PMID:27466815

  20. Comparison of software tools for kinetic evaluation of chemical degradation data.

    PubMed

    Ranke, Johannes; Wöltjen, Janina; Meinecke, Stefan

    2018-01-01

    For evaluating the fate of xenobiotics in the environment, a variety of degradation or environmental metabolism experiments are routinely conducted. The data generated in such experiments are evaluated by optimizing the parameters of kinetic models in a way that the model simulation fits the data. No comparison of the main software tools currently in use has been published to date. This article shows a comparison of numerical results as well as an overall, somewhat subjective comparison based on a scoring system using a set of criteria. The scoring was separately performed for two types of uses. Uses of type I are routine evaluations involving standard kinetic models and up to three metabolites in a single compartment. Evaluations involving non-standard model components, more than three metabolites or more than a single compartment belong to use type II. For use type I, usability is most important, while the flexibility of the model definition is most important for use type II. Test datasets were assembled that can be used to compare the numerical results for different software tools. These datasets can also be used to ensure that no unintended or erroneous behaviour is introduced in newer versions. In the comparison of numerical results, good agreement between the parameter estimates was observed for datasets with up to three metabolites. For the now unmaintained reference software DegKinManager/ModelMaker, and for OpenModel which is still under development, user options were identified that should be taken care of in order to obtain results that are as reliable as possible. Based on the scoring system mentioned above, the software tools gmkin, KinGUII and CAKE received the best scores for use type I. Out of the 15 software packages compared with respect to use type II, again gmkin and KinGUII were the first two, followed by the script based tool mkin, which is the technical basis for gmkin, and by OpenModel. 
    Based on the evaluation using the system of criteria mentioned above and the comparison of numerical results for the suite of test datasets, the software tools gmkin, KinGUII and CAKE are recommended for use type I, and gmkin and KinGUII for use type II. For users who prefer to work with scripts instead of graphical user interfaces, mkin is recommended. For future software evaluations, it is recommended to include a measure of the total time that a typical user needs for a kinetic evaluation in the scoring scheme. It is the hope of the authors that the publication of test data, source code and overall rankings fosters the evolution of useful and reliable software in the field.
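
    As an illustration of the kind of fit these tools perform, the sketch below estimates single first-order (SFO) degradation parameters from a hypothetical parent-decline dataset by log-linear regression. The compared packages (mkin, gmkin, KinGUII, CAKE) use nonlinear optimization and support multi-compartment metabolite models, so this is a conceptual sketch only; the dataset and function names are invented.

```python
import math

# Hypothetical parent-only dataset: time (days) vs. residue (% of applied).
times = [0, 3, 7, 14, 28, 56]
residues = [100.0, 81.0, 62.0, 40.0, 17.0, 3.0]

def fit_sfo(t, c):
    """Least-squares fit of ln C = ln C0 - k*t (single first-order model)."""
    logs = [math.log(x) for x in c]
    n = len(t)
    t_mean = sum(t) / n
    y_mean = sum(logs) / n
    slope = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, logs)) / \
            sum((ti - t_mean) ** 2 for ti in t)
    k = -slope                       # first-order rate constant, 1/day
    c0 = math.exp(y_mean + k * t_mean)
    dt50 = math.log(2) / k           # half-life (DT50), days
    return c0, k, dt50

c0, k, dt50 = fit_sfo(times, residues)
```

    Note that fitting on log-transformed data weights late, low residues more heavily than the direct nonlinear fits these packages perform; the two approaches agree only approximately.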

  1. Simulation of the infiltration process of a ceramic open-pore body with a metal alloy in semi-solid state to design the manufacturing of interpenetrating phase composites

    NASA Astrophysics Data System (ADS)

    Schomer, Laura; Liewald, Mathias; Riedmüller, Kim Rouven

    2018-05-01

    Metal-ceramic Interpenetrating Phase Composites (IPC) belong to a special subcategory of composite materials and reveal enhanced properties compared to conventional composite materials. Currently, IPC are produced by infiltration of a ceramic open-pore body with liquid metal, applying high pressure and/or high temperature to avoid residual porosity. However, these IPC are not able to reach their complete potential because of structural damage and interface reactions occurring during the manufacturing process. In comparison, the manufacturing of IPC using the semi-solid forming technology offers great perspectives due to relatively low processing temperatures and reduced mechanical pressure. In this context, this paper focuses on numerical investigations conducted using the FLOW-3D software to gain a deeper understanding of the infiltration of open-pore bodies with semi-solid materials. For the flow simulation analysis, a geometric model and different porous media drag models have been used; they have been adjusted and compared to obtain a precise description of the infiltration process. Based on these fundamental numerical investigations, this paper also shows numerical investigations that were used for the basic design of a semi-solid forming tool. Thereby, the development of the flow front and of the pressure during the infiltration represents the basis of the evaluation. The use of an open and a closed tool cavity combined with various geometries of the upper die shows different results with respect to these evaluation criteria. Furthermore, different overflows were designed and their effects on the pressure at the end of the infiltration process were investigated. Thus, this paper provides a general guideline for a tool design for the manufacturing of metal-ceramic IPC using semi-solid forming.

  2. The Standard for Clinicians’ Interview in Psychiatry (SCIP): A Clinician-administered Tool with Categorical, Dimensional, and Numeric Output—Conceptual Development, Design, and Description of the SCIP

    PubMed Central

    Aboraya, Ahmed; Nasrallah, Henry; Muvvala, Srinivas; El-Missiry, Ahmed; Mansour, Hader; Hill, Cheryl; Elswick, Daniel; Price, Elizabeth C.

    2016-01-01

    Existing standardized diagnostic interviews (SDIs) were designed for researchers and produce mainly categorical diagnoses. There is an urgent need for a clinician-administered tool that produces dimensional measures, in addition to categorical diagnoses. The Standard for Clinicians’ Interview in Psychiatry (SCIP) is a method of assessment of psychopathology for adults. It is designed to be administered by clinicians and includes the SCIP manual and the SCIP interview. Clinicians use the SCIP questions and rate the responses according to the SCIP manual rules. Clinicians use the patient’s responses to questions, observe the patient’s behaviors and make the final rating of the various signs and symptoms assessed. The SCIP method of psychiatric assessment has three components: 1) the SCIP interview (dimensional) component, 2) the etiological component, and 3) the disorder classification component. The SCIP produces three main categories of clinical data: 1) a diagnostic classification of psychiatric disorders, 2) dimensional scores, and 3) numeric data. The SCIP provides diagnoses consistent with criteria from editions of the Diagnostic and Statistical Manual (DSM) and International Classification of Disease (ICD). The SCIP produces 18 dimensional measures for key psychiatric signs or symptoms: anxiety, posttraumatic stress, obsessions, compulsions, depression, mania, suicidality, suicidal behavior, delusions, hallucinations, agitation, disorganized behavior, negativity, catatonia, alcohol addiction, drug addiction, attention, and hyperactivity. The SCIP produces numeric severity data for use in either clinical care or research. The SCIP was shown to be a valid and reliable assessment tool, and the validity and reliability results were published in 2014 and 2015. The SCIP is compatible with personalized psychiatry research and is in line with the Research Domain Criteria framework. PMID:27800284

  3. The Standard for Clinicians' Interview in Psychiatry (SCIP): A Clinician-administered Tool with Categorical, Dimensional, and Numeric Output-Conceptual Development, Design, and Description of the SCIP.

    PubMed

    Aboraya, Ahmed; Nasrallah, Henry; Muvvala, Srinivas; El-Missiry, Ahmed; Mansour, Hader; Hill, Cheryl; Elswick, Daniel; Price, Elizabeth C

    2016-01-01

    Existing standardized diagnostic interviews (SDIs) were designed for researchers and produce mainly categorical diagnoses. There is an urgent need for a clinician-administered tool that produces dimensional measures, in addition to categorical diagnoses. The Standard for Clinicians' Interview in Psychiatry (SCIP) is a method of assessment of psychopathology for adults. It is designed to be administered by clinicians and includes the SCIP manual and the SCIP interview. Clinicians use the SCIP questions and rate the responses according to the SCIP manual rules. Clinicians use the patient's responses to questions, observe the patient's behaviors and make the final rating of the various signs and symptoms assessed. The SCIP method of psychiatric assessment has three components: 1) the SCIP interview (dimensional) component, 2) the etiological component, and 3) the disorder classification component. The SCIP produces three main categories of clinical data: 1) a diagnostic classification of psychiatric disorders, 2) dimensional scores, and 3) numeric data. The SCIP provides diagnoses consistent with criteria from editions of the Diagnostic and Statistical Manual (DSM) and International Classification of Disease (ICD). The SCIP produces 18 dimensional measures for key psychiatric signs or symptoms: anxiety, posttraumatic stress, obsessions, compulsions, depression, mania, suicidality, suicidal behavior, delusions, hallucinations, agitation, disorganized behavior, negativity, catatonia, alcohol addiction, drug addiction, attention, and hyperactivity. The SCIP produces numeric severity data for use in either clinical care or research. The SCIP was shown to be a valid and reliable assessment tool, and the validity and reliability results were published in 2014 and 2015. The SCIP is compatible with personalized psychiatry research and is in line with the Research Domain Criteria framework.

  4. Numerical Capacities as Domain-Specific Predictors beyond Early Mathematics Learning: A Longitudinal Study

    PubMed Central

    Reigosa-Crespo, Vivian; González-Alemañy, Eduardo; León, Teresa; Torres, Rosario; Mosquera, Raysil; Valdés-Sosa, Mitchell

    2013-01-01

    The first aim of the present study was to investigate whether numerical effects (Numerical Distance Effect, Counting Effect and Subitizing Effect) are domain-specific predictors of mathematics development at the end of elementary school by exploring whether they explain additional variance of later mathematics fluency after controlling for the effects of general cognitive skills, focused on nonnumerical aspects. The second aim was to address the same issues but applied to achievement in mathematics curriculum that requires solutions to fluency in calculation. These analyses assess whether the relationship found for fluency is generalized to mathematics content beyond fluency in calculation. As a third aim, the domain specificity of the numerical effects was examined by analyzing whether they contribute to the development of reading skills, such as decoding fluency and reading comprehension, after controlling for general cognitive skills and phonological processing. Basic numerical capacities were evaluated in children of 3rd and 4th grades (n=49). Mathematics and reading achievements were assessed in these children one year later. Results showed that the size of the Subitizing Effect was a significant domain-specific predictor of fluency in calculation and also in curricular mathematics achievement, but not in reading skills, assessed at the end of elementary school. Furthermore, the size of the Counting Effect also predicted fluency in calculation, although this association only approached significance. These findings contrast with proposals that the core numerical competencies measured by enumeration will bear little relationship to mathematics achievement. We conclude that basic numerical capacities constitute domain-specific predictors and that they are not exclusively “start-up” tools for the acquisition of mathematics, but continue modulating this learning at the end of elementary school. PMID:24255710

  5. Numerical capacities as domain-specific predictors beyond early mathematics learning: a longitudinal study.

    PubMed

    Reigosa-Crespo, Vivian; González-Alemañy, Eduardo; León, Teresa; Torres, Rosario; Mosquera, Raysil; Valdés-Sosa, Mitchell

    2013-01-01

    The first aim of the present study was to investigate whether numerical effects (Numerical Distance Effect, Counting Effect and Subitizing Effect) are domain-specific predictors of mathematics development at the end of elementary school by exploring whether they explain additional variance of later mathematics fluency after controlling for the effects of general cognitive skills, focused on nonnumerical aspects. The second aim was to address the same issues but applied to achievement in mathematics curriculum that requires solutions to fluency in calculation. These analyses assess whether the relationship found for fluency is generalized to mathematics content beyond fluency in calculation. As a third aim, the domain specificity of the numerical effects was examined by analyzing whether they contribute to the development of reading skills, such as decoding fluency and reading comprehension, after controlling for general cognitive skills and phonological processing. Basic numerical capacities were evaluated in children of 3rd and 4th grades (n=49). Mathematics and reading achievements were assessed in these children one year later. Results showed that the size of the Subitizing Effect was a significant domain-specific predictor of fluency in calculation and also in curricular mathematics achievement, but not in reading skills, assessed at the end of elementary school. Furthermore, the size of the Counting Effect also predicted fluency in calculation, although this association only approached significance. These findings contrast with proposals that the core numerical competencies measured by enumeration will bear little relationship to mathematics achievement. We conclude that basic numerical capacities constitute domain-specific predictors and that they are not exclusively "start-up" tools for the acquisition of mathematics, but continue modulating this learning at the end of elementary school.

  6. Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR).

    PubMed

    O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S

    2018-01-09

    The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely reviews, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups, including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward the automation of systematic reviews and to stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow, including (1) fostering better understanding of available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools, such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and to reconfirm ICASR members' commitment to the automation of systematic reviews.

  7. Pathogenesis of cerebral malaria: new diagnostic tools, biomarkers, and therapeutic approaches

    PubMed Central

    Sahu, Praveen K.; Satpathi, Sanghamitra; Behera, Prativa K.; Mishra, Saroj K.; Mohanty, Sanjib; Wassmer, Samuel Crocodile

    2015-01-01

    Cerebral malaria is a severe neuropathological complication of Plasmodium falciparum infection. It results in high mortality and post-recovery neuro-cognitive disorders in children, even after appropriate treatment with effective anti-parasitic drugs. While the complete landscape of the pathogenesis of cerebral malaria still remains to be elucidated, numerous innovative approaches have been developed in recent years in order to improve the early detection of this neurological syndrome and, subsequently, the clinical care of affected patients. In this review, we briefly summarize the current understanding of cerebral malaria pathogenesis, compile the array of new biomarkers and tools available for diagnosis and research, and describe the emerging therapeutic approaches to tackle this pathology effectively. PMID:26579500

  8. Agent Based Modeling of Collaboration and Work Practices Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Acquisti, Alessandro; Sierhuis, Maarten; Clancey, William J.; Bradshaw, Jeffrey M.; Shaffo, Mike (Technical Monitor)

    2002-01-01

    The International Space Station is one of the most complex projects ever, with numerous interdependent constraints affecting productivity and crew safety. This requires planning years before crew expeditions, and the use of sophisticated scheduling tools. Human work practices, however, are difficult to study and represent within traditional planning tools. We present an agent-based model and simulation of the activities and work practices of astronauts onboard the ISS, based on an agent-oriented approach. The model represents 'a day in the life' of the ISS crew and is developed in Brahms, an agent-oriented, activity-based language used to model knowledge in situated action and learning in human activities.

  9. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

    Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capability of VSP is demonstrated for component-based point definition geometries in a conceptual analysis and design framework.

  10. Biopython: freely available Python tools for computational molecular biology and bioinformatics

    PubMed Central

    Cock, Peter J. A.; Antao, Tiago; Chang, Jeffrey T.; Chapman, Brad A.; Cox, Cymon J.; Dalke, Andrew; Friedberg, Iddo; Hamelryck, Thomas; Kauff, Frank; Wilczynski, Bartek; de Hoon, Michiel J. L.

    2009-01-01

    Summary: The Biopython project is a mature open source international collaboration of volunteer developers, providing Python libraries for a wide range of bioinformatics problems. Biopython includes modules for reading and writing different sequence file formats and multiple sequence alignments, dealing with 3D macromolecular structures, interacting with common tools such as BLAST, ClustalW and EMBOSS, accessing key online databases, as well as providing numerical methods for statistical learning. Availability: Biopython is freely available, with documentation and source code at www.biopython.org under the Biopython license. Contact: All queries should be directed to the Biopython mailing lists (see www.biopython.org/wiki/Mailing_lists) or to peter.cock@scri.ac.uk. PMID:19304878
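
    To give a flavour of the record-oriented parsing that Biopython's Bio.SeqIO module provides, here is a minimal pure-Python FASTA parser. It deliberately avoids the Biopython API itself; the function name and sample sequences below are illustrative only.

```python
def parse_fasta(text):
    """Minimal FASTA parser illustrating the kind of record iteration
    that Biopython's Bio.SeqIO offers (this is NOT the Biopython API)."""
    records = {}
    name, chunks = None, []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(">"):
            if name is not None:          # flush the previous record
                records[name] = "".join(chunks)
            name, chunks = line[1:].split()[0], []
        elif line:
            chunks.append(line)
    if name is not None:                  # flush the final record
        records[name] = "".join(chunks)
    return records

fasta = """>seq1 example record
ATGGCC
TTAA
>seq2
GGGCCC
"""
recs = parse_fasta(fasta)
```

    In Biopython proper, the equivalent iteration handles many formats (FASTA, GenBank, FASTQ, ...) behind one uniform record interface, which is the point the abstract makes about unified file-format support.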

  11. Nutrition screening tools: does one size fit all? A systematic review of screening tools for the hospital setting.

    PubMed

    van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W

    2014-02-01

    Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study the construct or criterion validity and the predictive validity of nutrition screening tools for the general hospital setting. A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012). Additional studies were identified by checking reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for hospital setting and adults. Data were extracted independently by 2 authors. Only studies reporting the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity for outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well in establishing the patients' nutritional status. For the elderly, MNA performed fair to good; for adults, MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. No single screening or assessment tool is capable of adequate nutrition screening as well as of predicting poor nutrition-related outcome. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required.

  12. Specialization in the Human Brain: The Case of Numbers

    PubMed Central

    Kadosh, Roi Cohen; Bahrami, Bahador; Walsh, Vincent; Butterworth, Brian; Popescu, Tudor; Price, Cathy J.

    2011-01-01

    How numerical representation is encoded in the adult human brain is important for a basic understanding of human brain organization, its typical and atypical development, its evolutionary precursors, cognitive architectures, education, and rehabilitation. Previous studies have shown that numerical processing activates the same intraparietal regions irrespective of the presentation format (e.g., symbolic digits or non-symbolic dot arrays). This has led to claims that there is a single format-independent, numerical representation. In the current study we used a functional magnetic resonance adaptation paradigm, and effective connectivity analysis to re-examine whether numerical processing in the intraparietal sulci is dependent or independent on the format of the stimuli. We obtained two novel results. First, the whole brain analysis revealed that format change (e.g., from dots to digits), in the absence of a change in magnitude, activated the same intraparietal regions as magnitude change, but to a greater degree. Second, using dynamic causal modeling as a tool to disentangle neuronal specialization across regions that are commonly activated, we found that the connectivity between the left and right intraparietal sulci is format-dependent. Together, this line of results supports the idea that numerical representation is subserved by multiple mechanisms within the same parietal regions. PMID:21808615

  13. An approach to achieve progress in spacecraft shielding

    NASA Astrophysics Data System (ADS)

    Thoma, K.; Schäfer, F.; Hiermaier, S.; Schneider, E.

    2004-01-01

    Progress in shield design against space debris can be achieved only when a combined approach based on several tools is used. This approach depends on the combined application of advanced numerical methods, specific material models and the experimental determination of input parameters for these models. Examples of experimental methods for material characterization are given, covering the range from quasi-static to very high strain rates for materials like Nextel and carbon fiber-reinforced materials. Mesh-free numerical methods have extraordinary capabilities in the simulation of extreme material behaviour, including complete failure with phase changes, combined with shock wave phenomena and the interaction with structural components. In this paper the benefits of combining numerical methods, material modelling and detailed experimental studies for shield design are demonstrated. The following examples are given: (1) Development of a material model for Nextel and Kevlar-Epoxy to enable numerical simulation of hypervelocity impacts on complex heavy protection shields for the International Space Station. (2) The influence of projectile shape on the protection performance of Whipple Shields and how experimental problems in accelerating such shapes can be overcome by systematic numerical simulation. (3) The benefits of using metallic foams in "sandwich bumper shields" for spacecraft and how to approach the systematic characterization of such materials.

  14. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. 
    Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
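
    The operator-overloading approach discussed above can be illustrated with a minimal forward-mode example using dual numbers: each value carries its derivative alongside it, and overloaded arithmetic propagates both. Note this is only a conceptual sketch; the Fortran AD tools compared in the study (TAF, Tapenade, OpenAD, NAGWare) generate reverse-mode (adjoint) code, which this example does not attempt.

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers:
    val is the function value, dot its derivative w.r.t. one input."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Seed the input with dot=1 and read the derivative off the output."""
    return f(Dual(x, 1.0)).dot

# d/dx (3*x*x + 2*x) at x = 2.0 is 6*x + 2 = 14
d = derivative(lambda x: 3 * x * x + 2 * x, 2.0)
```

    Source-transformation tools instead rewrite the program text itself, which is why they can reach much higher performance than this kind of runtime overloading.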

  15. The numerical modelling of falling film thickness flow on horizontal tubes

    NASA Astrophysics Data System (ADS)

    Hassan, I. A.; Sadikin, A.; Isa, N. Mat

    2017-04-01

    This paper presents computational modelling of a water falling film flowing over horizontal tubes. The objective of this study is to use numerical predictions to compare the film thickness along the circumferential direction of the tube on 2-D CFD models. The results are then validated against theoretical results from the previous literature. A comprehensive set of 2-D models has been developed according to the real application and actual configuration of the falling film evaporator as well as previous experimental parameters. The computational modelling of the water falling film is carried out with the aid of the Ansys Fluent software. The Volume of Fluid (VOF) technique is adopted in this analysis since it reliably captures the film thickness on the tube surface. The numerical analysis is carried out at ambient pressure and a temperature of 27 °C. Three CFD numerical models were analyzed in this simulation, with inter-tube spacings of 30 mm, 20 mm and 10 mm respectively. The use of a numerical simulation tool for the water falling film has resulted in a detailed investigation of the film thickness. Based on the simulated results, the average values of the water film thickness for the three models are 0.53 mm, 0.58 mm, and 0.63 mm.
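
    For context, the classical laminar (Nusselt) film-thickness relation, delta = (3*mu*Gamma / (rho^2 * g))^(1/3), is a common analytical baseline for falling-film VOF results of this kind. The property values and film flow rate below are assumed (water near 27 °C), not taken from the paper, so the computed thickness only illustrates the order of magnitude.

```python
# Laminar (Nusselt) falling-film thickness as an analytical baseline.
mu = 8.55e-4      # dynamic viscosity of water near 27 degC, Pa*s (assumed)
rho = 997.0       # density of water, kg/m^3 (assumed)
g = 9.81          # gravitational acceleration, m/s^2
gamma = 0.05      # film flow rate per unit tube length, kg/(m*s) (assumed)

delta = (3.0 * mu * gamma / (rho ** 2 * g)) ** (1.0 / 3.0)  # metres
delta_mm = delta * 1000.0
```

    With these assumed inputs the formula gives a few tenths of a millimetre, the same order as the 0.53-0.63 mm averages the simulations report.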

  16. Numerical Simulation of Rocket Exhaust Interaction with Lunar Soil

    NASA Technical Reports Server (NTRS)

    Liever, Peter; Tosh, Abhijit; Curtis, Jennifer

    2012-01-01

    This technology development originated from the need to assess the debris threat resulting from soil material erosion induced by landing spacecraft rocket plume impingement on extraterrestrial planetary surfaces. The impact of soil debris was observed to be highly detrimental during NASA's Apollo lunar missions and will pose a threat for any future landings on the Moon, Mars, and other exploration targets. The innovation developed under this program provides a simulation tool that combines modeling of the diverse disciplines of rocket plume impingement gas dynamics, granular soil material liberation, and soil debris particle kinetics into one unified simulation system. The Unified Flow Solver (UFS) developed by CFDRC enabled the efficient, seamless simulation of mixed continuum and rarefied rocket plume flow utilizing a novel direct numerical simulation technique of the Boltzmann gas dynamics equation. The characteristics of the soil granular material response and modeling of the erosion and liberation processes were enabled through novel first principle-based granular mechanics models developed by the University of Florida specifically for the highly irregularly shaped and cohesive lunar regolith material. These tools were integrated into a unique simulation system that accounts for all relevant physics aspects: (1) Modeling of spacecraft rocket plume impingement flow under lunar vacuum environment resulting in a mixed continuum and rarefied flow; (2) Modeling of lunar soil characteristics to capture soil-specific effects of particle size and shape composition, soil layer cohesion and granular flow physics; and (3) Accurate tracking of soil-borne debris particles beginning with aerodynamically driven motion inside the plume to purely ballistic motion in lunar far field conditions.
    In the earlier project phase of this innovation, the capabilities of the UFS for mixed continuum and rarefied flow situations were validated and demonstrated for lunar lander rocket plume flow impingement under lunar vacuum conditions. Applications and improvements to the granular flow simulation tools contributed by the University of Florida were tested against Earth environment experimental results. Requirements for developing, validating, and demonstrating this solution environment were clearly identified, and an effective second phase execution plan was devised. In this phase, the physics models were refined and fully integrated into a production-oriented simulation tool set. Three-dimensional simulations of Apollo Lunar Excursion Module (LEM) and Altair landers (including full-scale lander geometry) established the practical applicability of the UFS simulation approach and its advanced performance level for large-scale realistic problems.

  17. MAGIC: A Tool for Combining, Interpolating, and Processing Magnetograms

    NASA Technical Reports Server (NTRS)

    Allred, Joel

    2012-01-01

    Transients in the solar coronal magnetic field are ultimately the source of space weather. Models which seek to track the evolution of the coronal field require magnetogram images to be used as boundary conditions. These magnetograms are obtained by numerous instruments with different cadences and resolutions. A tool is required which allows modelers to find all available data and use them to craft accurate and physically consistent boundary conditions for their models. We have developed a software tool, MAGIC (MAGnetogram Interpolation and Composition), to perform exactly this function. MAGIC can manage the acquisition of magnetogram data, cast it into a source-independent format, and then perform the necessary spatial and temporal interpolation to provide magnetic field values as requested onto model-defined grids. MAGIC has the ability to patch magnetograms from different sources together, providing a more complete picture of the Sun's field than is possible from single magnetograms. In doing this, care must be taken so as not to introduce nonphysical current densities along the seam between magnetograms. We have designed a method which minimizes these spurious current densities. MAGIC also includes a number of post-processing tools which can provide additional information to models. For example, MAGIC includes an interface to the DAVE4VM tool which derives surface flow velocities from the time evolution of the surface magnetic field. MAGIC has been developed as an application of the KAMELEON data formatting toolkit which has been developed by the CCMC.
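
    The abstract does not specify MAGIC's interpolation scheme beyond "spatial and temporal interpolation", so as a generic sketch, resampling a magnetogram onto a model grid can be done one pixel at a time with bilinear interpolation. The field values below are invented, and a real implementation would also need to clamp or pad at the grid edges.

```python
def bilinear(grid, x, y):
    """Bilinear interpolation of a 2-D scalar field (e.g. line-of-sight
    magnetic field in a magnetogram) at fractional pixel coordinate (x, y).
    Assumes (x, y) lies strictly inside the grid (no edge clamping)."""
    x0, y0 = int(x), int(y)
    x1, y1 = x0 + 1, y0 + 1
    fx, fy = x - x0, y - y0       # fractional offsets within the cell
    return (grid[y0][x0] * (1 - fx) * (1 - fy) +
            grid[y0][x1] * fx * (1 - fy) +
            grid[y1][x0] * (1 - fx) * fy +
            grid[y1][x1] * fx * fy)

# Invented 2x2 field values (e.g. gauss) at four neighbouring pixels.
field = [[0.0, 10.0],
         [20.0, 30.0]]
b = bilinear(field, 0.5, 0.5)     # value at the centre of the four pixels
```

    Temporal interpolation between two magnetogram frames can be sketched the same way, as a weighted blend of the two spatially interpolated values.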

  18. [Technical improvement of cohort constitution in administrative health databases: Providing a tool for integration and standardization of data applicable in the French National Health Insurance Database (SNIIRAM)].

    PubMed

    Ferdynus, C; Huiart, L

    2016-09-01

Administrative health databases such as the French National Health Insurance Database (SNIIRAM) are a major tool for answering numerous public health research questions. However, the use of such data requires complex and time-consuming data management. Our objective was to develop and make available a tool to optimize cohort constitution within administrative health databases. We developed a process to extract, transform, and load (ETL) data from various heterogeneous sources into a standardized data warehouse. This data warehouse is structured as a star schema corresponding to the i2b2 star schema model. We then evaluated the performance of this ETL using data from a pharmacoepidemiology research project conducted in the SNIIRAM database. The ETL we developed comprises a set of functionalities for creating SAS scripts. Data can be integrated into a standardized data warehouse. As part of the performance assessment of this ETL, we integrated a dataset from the SNIIRAM comprising more than 900 million rows in less than three hours using a desktop computer. This enables patient selection from the standardized data warehouse within seconds of the request. The ETL described in this paper provides an effective tool compatible with all administrative health databases, without requiring complex database servers. This tool should simplify cohort constitution in health databases; the standardization of warehouse data facilitates collaborative work between research teams. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
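The cohort-selection step described above can be sketched against a toy i2b2-style star schema; the table layout and concept codes below are illustrative assumptions, not the SNIIRAM schema:

```python
import sqlite3

# Minimal i2b2-style fact table: one row per (patient, concept, date) observation.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observation_fact (
    patient_num INTEGER, concept_cd TEXT, start_date TEXT)""")
conn.executemany("INSERT INTO observation_fact VALUES (?, ?, ?)", [
    (1, "ATC:N02BE01", "2015-03-01"),   # hypothetical drug dispensing code
    (2, "ICD10:I21",   "2015-05-10"),   # hypothetical diagnosis code
    (1, "ICD10:I21",   "2015-06-02"),
])

# Cohort constitution: all patients with at least one I21 diagnosis.
cohort = [row[0] for row in conn.execute(
    "SELECT DISTINCT patient_num FROM observation_fact "
    "WHERE concept_cd = 'ICD10:I21' ORDER BY patient_num")]
```

Because every observation lands in one fact table keyed by standardized concept codes, cohort queries stay this simple regardless of the source system.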

  19. Creation of an ensemble of simulated cardiac cases and a human observer study: tools for the development of numerical observers for SPECT myocardial perfusion imaging

    NASA Astrophysics Data System (ADS)

    O'Connor, J. Michael; Pretorius, P. Hendrik; Gifford, Howard C.; Licho, Robert; Joffe, Samuel; McGuiness, Matthew; Mehurg, Shannon; Zacharias, Michael; Brankov, Jovan G.

    2012-02-01

Our previous Single Photon Emission Computed Tomography (SPECT) myocardial perfusion imaging (MPI) research explored the utility of numerical observers. We recently created two hundred and eighty simulated SPECT cardiac cases using Dynamic MCAT (DMCAT) and SIMIND Monte Carlo tools. All simulated cases were then processed with two reconstruction methods: iterative ordered subset expectation maximization (OSEM) and filtered back-projection (FBP). Observer study sets were assembled for both OSEM and FBP methods. Five physicians performed an observer study on one hundred and seventy-nine images from the simulated cases. The observer task was to indicate detection of any myocardial perfusion defect using the American Society of Nuclear Cardiology (ASNC) 17-segment cardiac model and the ASNC five-scale rating guidelines. Human observer Receiver Operating Characteristic (ROC) studies established the guidelines for the subsequent evaluation of numerical model observer (NO) performance. Several NOs were formulated, and their performance was compared with that of the human observers. One type of NO was based on evaluation of a cardiac polar map that had been pre-processed using a gradient-magnitude watershed segmentation algorithm. The second type of NO was also based on analysis of a cardiac polar map, but with the use of an a priori calculated average image derived from an ensemble of normal cases.
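As a generic illustration (not the authors' numerical observer), ordinal ratings on a five-point confidence scale can be converted to an ROC area estimate with the nonparametric Mann-Whitney AUC:

```python
def auc_from_ratings(ratings_pos, ratings_neg):
    """Nonparametric ROC area: fraction of (defect, normal) rating pairs
    ranked correctly, counting ties as half (Mann-Whitney U / (m*n))."""
    wins = ties = 0
    for p in ratings_pos:            # ratings given to defect-present cases
        for n in ratings_neg:        # ratings given to defect-absent cases
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(ratings_pos) * len(ratings_neg))
```

This trapezoidal/rank estimate is what a human-observer ROC study yields before any parametric (e.g., binormal) curve fitting.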

  20. Using Computational and Mechanical Models to Study Animal Locomotion

    PubMed Central

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  1. Numerical study on the electromechanical behavior of dielectric elastomer with the influence of surrounding medium

    NASA Astrophysics Data System (ADS)

    Jia; Lu

    2016-01-01

The considerable electrically induced shape change, together with light weight, high efficiency, and low cost, makes dielectric elastomer a promising soft active material for actuators in broad applications. Although a number of prototype devices have been demonstrated in the past few years, further development of this technology requires adequate analytical and numerical tools. In particular, previous theoretical studies have generally neglected the influence of the surrounding medium. Because of the large deformation and nonlinear equations of state involved, the finite element method (FEM) is the natural choice; however, the few available formulations rely on homemade codes, which are inconvenient to implement. The aim of this work is to present a numerical approach with the commercial FEM package COMSOL to investigate the nonlinear response of dielectric elastomer under electric stimulation. The influence of the surrounding free space on the electric field is analyzed, and the corresponding electric force is taken into account through an electric surface traction on the surrounding boundary. By employing the Maxwell stress tensor as the actuation pressure, the mechanical and electric governing equations for the dielectric elastomer are coupled and then solved simultaneously with the Gent strain-energy model to obtain the electrically induced large deformation as well as the electromechanical instability. The finite element implementation presented here may provide a powerful computational tool to help design and optimize engineering applications of dielectric elastomer.
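The electromechanical instability mentioned above can be illustrated with the simplest ideal-dielectric-elastomer model (neo-Hookean, i.e., the Gent model in the limit of unbounded stretch): under equal-biaxial actuation the dimensionless nominal electric field versus stretch has a maximum, beyond which pull-in occurs. A hedged sketch, not the paper's COMSOL model:

```python
import math

def nominal_field(lam):
    """Dimensionless nominal field V/(H*sqrt(mu/eps)) for an ideal
    neo-Hookean dielectric membrane under equal-biaxial stretch lam."""
    return math.sqrt(lam ** -2 - lam ** -8)

# Scan stretches; the maximum of the voltage-stretch curve marks the
# electromechanical (pull-in) instability.
lams = [1.0 + 0.001 * i for i in range(1, 1000)]
lam_crit = max(lams, key=nominal_field)
```

For this model the critical stretch is 4**(1/6) ≈ 1.26 at a nominal field of about 0.69·sqrt(mu/eps); a finite Gent stretch limit shifts these values.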

  2. Development of a Protocol and a Screening Tool for Selection of DNAPL Source Area Remediation

    DTIC Science & Technology

    2012-02-01

the different remedial time frames used in the modeling case studies. • Matrix Diffusion: Modeling results demonstrated that in fractured rock ... being used for the ISCO, EISB and SEAR fractured-rock numerical simulations at the field scale. Figure 2-4 presents the distribution of intrinsic ... sedimentary limestone, sandstone, and shale; igneous basalts and granites; and metamorphic rock. For the modeling sites, three general geologies are

  3. Flowfield characterization and model development in detonation tubes

    NASA Astrophysics Data System (ADS)

    Owens, Zachary Clark

A series of experiments and numerical simulations are performed to advance the understanding of flowfield phenomena and impulse generation in detonation tubes. Experiments employing laser-based velocimetry, high-speed schlieren imaging, and pressure measurements are used to construct a dataset against which numerical models can be validated. The numerical modeling culminates in the development of a two-dimensional, multi-species, finite-rate-chemistry, parallel Navier-Stokes solver. The resulting model is specifically designed to assess unsteady, compressible, reacting flowfields, and its utility for studying multidimensional detonation structure is demonstrated. A reduced, quasi-one-dimensional model with source terms accounting for wall losses is also developed for rapid parametric assessment. Using these experimental and numerical tools, two primary objectives are pursued. The first objective is to gain an understanding of how nozzles affect unsteady detonation flowfields and how they can be designed to maximize impulse in a detonation-based propulsion system called a pulse detonation engine. It is shown that unlike conventional, steady-flow propulsion systems, where converging-diverging nozzles generate optimal performance, unsteady detonation tube performance during a single cycle is maximized using purely diverging nozzles. The second objective is to identify the primary underlying mechanisms that cause velocity and pressure measurements to deviate from idealized theory. An investigation of the influence of non-ideal losses including wall heat transfer, friction, and condensation leads to the development of improved models that reconcile long-standing discrepancies between predicted and measured detonation tube performance. It is demonstrated for the first time that wall condensation of water vapor in the combustion products can cause significant deviations from ideal theory.

  4. A Comprehensive Look at Polypharmacy and Medication Screening Tools for the Older Cancer Patient

    PubMed Central

    DeGregory, Kathlene A.; Morris, Amy L.; Ramsdale, Erika E.

    2016-01-01

    Inappropriate medication use and polypharmacy are extremely common among older adults. Numerous studies have discussed the importance of a comprehensive medication assessment in the general geriatric population. However, only a handful of studies have evaluated inappropriate medication use in the geriatric oncology patient. Almost a dozen medication screening tools exist for the older adult. Each available tool has the potential to improve aspects of the care of older cancer patients, but no single tool has been developed for this population. We extensively reviewed the literature (MEDLINE, PubMed) to evaluate and summarize the most relevant medication screening tools for older patients with cancer. Findings of this review support the use of several screening tools concurrently for the elderly patient with cancer. A deprescribing tool should be developed and included in a comprehensive geriatric oncology assessment. Finally, prospective studies are needed to evaluate such a tool to determine its feasibility and impact in older patients with cancer. Implications for Practice: The prevalence of polypharmacy increases with advancing age. Older adults are more susceptible to adverse effects of medications. “Prescribing cascades” are common, whereas “deprescribing” remains uncommon; thus, older patients tend to accumulate medications over time. Older patients with cancer are at high risk for adverse drug events, in part because of the complexity and intensity of cancer treatment. Additionally, a cancer diagnosis often alters assessments of life expectancy, clinical status, and competing risk. Screening for polypharmacy and potentially inappropriate medications could reduce the risk for adverse drug events, enhance quality of life, and reduce health care spending for older cancer patients. PMID:27151653

  5. A software tool for analyzing multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems.

  6. Nose-to-tail analysis of an airbreathing hypersonic vehicle using an in-house simplified tool

    NASA Astrophysics Data System (ADS)

    Piscitelli, Filomena; Cutrone, Luigi; Pezzella, Giuseppe; Roncioni, Pietro; Marini, Marco

    2017-07-01

SPREAD (Scramjet PREliminary Aerothermodynamic Design) is a simplified, in-house method developed by CIRA (Italian Aerospace Research Centre) that provides a preliminary estimate of engine/aeroshape performance for airbreathing configurations. It is especially useful for scramjet engines, for which the strong coupling between the external aerothermodynamic and internal propulsive flow fields requires real-time screening of several engine/aeroshape configurations and identification of the most promising ones with respect to user-defined constraints and requirements. The outcome of this tool defines the baseline configuration for further design analyses with more accurate tools, e.g., CFD simulations and wind tunnel testing. The SPREAD tool has been used to perform the nose-to-tail analysis of the LAPCAT-II Mach 8 MR2.4 vehicle configuration. The numerical results demonstrate SPREAD's capability to quickly predict reliable values of aero-propulsive balance (i.e., net thrust) and aerodynamic efficiency in a pre-design phase.

  7. Assessing Bleeding Risk in Patients Taking Anticoagulants

    PubMed Central

    Shoeb, Marwa; Fang, Margaret C.

    2013-01-01

Anticoagulant medications are commonly used for the prevention and treatment of thromboembolism. Although highly effective, they are also associated with significant bleeding risks. Numerous individual clinical factors have been linked to an increased risk of hemorrhage, including older age, anemia, and renal disease. To help quantify hemorrhage risk for individual patients, a number of clinical risk prediction tools have been developed. These risk prediction tools differ in how they were derived and how they identify and weight individual risk factors. At present, their ability to effectively predict anticoagulant-associated hemorrhage remains modest. Use of risk prediction tools to estimate bleeding in clinical practice is most influential when applied to patients at the lower end of the thromboembolic risk spectrum, where the risk of hemorrhage will more strongly affect clinical decisions about anticoagulation. Using risk tools may also help counsel and inform patients about their potential risk for hemorrhage while on anticoagulants, and can identify patients who might benefit from more careful management of anticoagulation. PMID:23479259

  8. Airport Viz - a 3D Tool to Enhance Security Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Daniel B

    2006-01-01

In the summer of 2000, the National Safe Skies Alliance (NSSA) awarded a project to the Applied Visualization Center (AVC) at the University of Tennessee, Knoxville (UTK) to develop a 3D computer tool to assist the Federal Aviation Administration security group, now the Transportation Security Administration (TSA), in evaluating new equipment and procedures to improve airport checkpoint security. A preliminary tool was demonstrated at the 2001 International Aviation Security Technology Symposium. Since then, the AVC went on to construct numerous detection equipment models as well as models of several airports. Airport Viz has been distributed by the NSSA to a number of airports around the country, which are able to incorporate their own CAD models into the software due to its unique open architecture. It provides a checkpoint design and passenger flow simulation function, a layout design and simulation tool for checked baggage and cargo screening, and a means to assist in the vulnerability assessment of airport access points for pedestrians and vehicles.

  9. A reliable algorithm for optimal control synthesis

    NASA Technical Reports Server (NTRS)

    Vansteenwyk, Brett; Ly, Uy-Loi

    1992-01-01

In recent years, powerful design tools for linear time-invariant multivariable control systems have been developed based on direct parameter optimization. In this report, an algorithm for reliable optimal control synthesis using parameter optimization is presented. Specifically, a robust numerical algorithm is developed for the evaluation of the H2-like cost functional and its gradients with respect to the controller design parameters. The method is specifically designed to handle defective degenerate systems and is based on the well-known Pade series approximation of the matrix exponential. Numerical test problems in control synthesis for simple mechanical systems and for a flexible structure with densely packed modes clearly illustrate the reliability of this method compared to one based on diagonalization. Several types of cost functions have been considered: a cost function for robust control consisting of a linear combination of quadratic objectives for deterministic and random disturbances, and one representing an upper bound on the quadratic objective for worst-case initial conditions. Finally, a framework for multivariable control synthesis has been developed combining the concept of closed-loop transfer recovery with numerical parameter optimization. The procedure enables designers to synthesize not only observer-based controllers but also controllers of arbitrary order and structure. Numerical design solutions rely heavily on the robust algorithm due to the high order of the synthesis model and the presence of near-overlapping modes. The design approach is successfully applied to the design of a high-bandwidth control system for a rotorcraft.
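A Pade approximation of the matrix exponential, as mentioned above, can be sketched with a diagonal (1,1) Pade approximant combined with scaling and squaring; this is a generic illustration, not the report's exact series:

```python
import numpy as np

def expm_pade(A, squarings=8):
    """Matrix exponential via (1,1) diagonal Pade with scaling and squaring:
    exp(A/2^s) ~ (I - A/2^(s+1))^-1 (I + A/2^(s+1)), then square s times."""
    n = A.shape[0]
    As = A / 2.0 ** squarings          # scale so the Pade step is accurate
    I = np.eye(n)
    E = np.linalg.solve(I - As / 2.0, I + As / 2.0)
    for _ in range(squarings):         # undo the scaling by repeated squaring
        E = E @ E
    return E
```

Production codes (e.g., scipy.linalg.expm) use higher-order diagonal Pade approximants chosen adaptively from the matrix norm; the structure is the same.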

  10. Consistent Chemical Mechanism from Collaborative Data Processing

    DOE PAGES

    Slavinskaya, Nadezda; Starcke, Jan-Hendrik; Abbasi, Mehdi; ...

    2016-04-01

The Process Informatics Model (PrIMe) numerical toolkit is a mathematically rigorous and numerically efficient approach for the analysis and optimization of chemical systems. It handles heterogeneous data and is scalable to a large number of parameters. The Bound-to-Bound Data Collaboration module of the automated data-centric infrastructure of PrIMe was used for systematic uncertainty and data consistency analyses of the H2/CO reaction model (73/17) and 94 experimental targets (ignition delay times). An empirical rule for the evaluation of shock tube experimental data is proposed. The initial results demonstrate clear benefits of the PrIMe methods for evaluating kinetic data quality and data consistency and for developing predictive kinetic models.

  11. Analysis of Electrowetting Dynamics with Level Set Method

    NASA Astrophysics Data System (ADS)

    Park, Jun Kwon; Hong, Jiwoo; Kang, Kwan Hyoung

    2009-11-01

    Electrowetting is a versatile tool to handle tiny droplets and forms a backbone of digital microfluidics. Numerical analysis is necessary to fully understand the dynamics of electrowetting, especially in designing electrowetting-based liquid lenses and reflective displays. We developed a numerical method to analyze the general contact-line problems, incorporating dynamic contact angle models. The method was applied to the analysis of spreading process of a sessile droplet for step input voltages in electrowetting. The result was compared with experimental data and analytical result which is based on the spectral method. It is shown that contact line friction significantly affects the contact line motion and the oscillation amplitude. The pinning process of contact line was well represented by including the hysteresis effect in the contact angle models.
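For context, the static limit of electrowetting is commonly described by the Young-Lippmann equation; a sketch under the usual assumptions of an ideal dielectric layer and no contact-angle saturation (the parameter values used below are illustrative, not from the paper):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def contact_angle(theta0_deg, V, eps_r, d, gamma):
    """Young-Lippmann: cos(theta) = cos(theta0) + eps0*eps_r*V^2 / (2*d*gamma).
    theta0_deg: zero-voltage contact angle; d: dielectric thickness (m);
    gamma: liquid surface tension (N/m). Saturation effects are ignored."""
    c = math.cos(math.radians(theta0_deg)) + EPS0 * eps_r * V ** 2 / (2.0 * d * gamma)
    return math.degrees(math.acos(min(1.0, c)))  # clamp: full wetting limit
```

Dynamic simulations such as the one described add contact-line friction and hysteresis on top of this equilibrium relation.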

  12. To Collapse or not to Collapse: The Life of a Primordial Black Hole

    NASA Astrophysics Data System (ADS)

    Craig, Robert; Bloomfield, Jolyon; Face, Stephen

    2016-03-01

Primordial black holes offer insights into topics ranging from cosmological questions about inflationary models to astrophysical questions regarding supermassive black holes. Such insights depend on being able to predict the number density of black holes that form from primordial fluctuations. Traditionally this has been done by means of a "rule-of-thumb" developed by Carr in the 1980s, but recent numerical studies have shown that this predictor is a coarse tool at best. We present a two-parameter predictor with much more discrimination power that can be straightforwardly used to compute number densities. We also discuss challenges that face this type of prediction strategy, both analytically and numerically, and possible ways to circumvent them.

  13. Numerical Zooming Between a NPSS Engine System Simulation and a One-Dimensional High Compressor Analysis Code

    NASA Technical Reports Server (NTRS)

    Follen, Gregory; auBuchon, M.

    2000-01-01

    Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer along with the concept of numerical zooming between zero-dimensional to one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS Environment, the subject of this paper is a discussion on numerical zooming between a NPSS engine simulation and higher fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation and a discussion of the results illustrated using an advanced data visualization tool. This type of high fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual test conducted prior to committing the design to hardware.

  14. High resolution modelling of extreme precipitation events in urban areas

    NASA Astrophysics Data System (ADS)

    Siemerink, Martijn; Volp, Nicolette; Schuurmans, Wytze; Deckers, Dave

    2015-04-01

    The present day society needs to adjust to the effects of climate change. More extreme weather conditions are expected, which can lead to longer periods of drought, but also to more extreme precipitation events. Urban water systems are not designed for such extreme events. Most sewer systems are not able to drain the excessive storm water, causing urban flooding. This leads to high economic damage. In order to take appropriate measures against extreme urban storms, detailed knowledge about the behaviour of the urban water system above and below the streets is required. To investigate the behaviour of urban water systems during extreme precipitation events new assessment tools are necessary. These tools should provide a detailed and integral description of the flow in the full domain of overland runoff, sewer flow, surface water flow and groundwater flow. We developed a new assessment tool, called 3Di, which provides detailed insight in the urban water system. This tool is based on a new numerical methodology that can accurately deal with the interaction between overland runoff, sewer flow and surface water flow. A one-dimensional model for the sewer system and open channel flow is fully coupled to a two-dimensional depth-averaged model that simulates the overland flow. The tool uses a subgrid-based approach in order to take high resolution information of the sewer system and of the terrain into account [1, 2]. The combination of using the high resolution information and the subgrid based approach results in an accurate and efficient modelling tool. It is now possible to simulate entire urban water systems using extreme high resolution (0.5m x 0.5m) terrain data in combination with a detailed sewer and surface water network representation. The new tool has been tested in several Dutch cities, such as Rotterdam, Amsterdam and The Hague. We will present the results of an extreme precipitation event in the city of Schiedam (The Netherlands). 
This city deals with significant soil consolidation, and the low-lying areas are prone to urban flooding. The simulation results are compared with measurements in the sewer network. References: [1] Stelling, G.S., 2012. Quadtree flood simulations with subgrid digital elevation models. Water Management 165 (WM1):1329-1354. [2] Casulli, V. and Stelling, G.S., 2013. A semi-implicit numerical model for urban drainage systems. International Journal for Numerical Methods in Fluids 73:600-614. DOI: 10.1002/fld.3817
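The subgrid-based approach described above can be illustrated in miniature: the storage volume of one coarse computational cell is obtained by integrating water depth over the fine-resolution terrain pixels inside it, rather than assuming a flat bed. A hedged sketch, not the 3Di implementation:

```python
import numpy as np

def wet_volume(dem_fine, water_level, pixel_area):
    """Subgrid storage of a coarse cell: V = sum over fine terrain pixels of
    max(0, water_level - ground_elevation) * pixel_area."""
    depth = np.maximum(0.0, water_level - dem_fine)
    return float(depth.sum() * pixel_area)
```

Because the volume-level relation is built from the fine DEM, a coarse grid can still represent which parts of a street flood first.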

  15. Numerical modeling of the traction process in the treatment for Pierre-Robin Sequence.

    PubMed

    Słowiński, Jakub J; Czarnecka, Aleksandra

    2016-10-01

The goal of this numerical study was to evaluate the results of modulated-growth simulation of the mandibular bone during traction in Pierre-Robin Sequence (PRS) treatment. Numerical simulation was conducted in the Ansys 16.2 environment. Two finite element method (FEM) models of a newborn's mandible (a spatial and a flat model) were developed. The procedure simulated a 20-week traction period. The adopted growth measure was mandibular length increase, defined as the distance between the Co-Pog anatomic points used in cephalometric analysis. The simulation calculations conducted on the developed models showed that modulation had a significant influence on the pace of bone growth. In each of the analyzed cases, growth modulation resulted in an increase in pace. The largest increase was 6.91 mm. Modulated growth with the most beneficial load variant increased the basic growth value by as much as 24.6%, and growth with the least beneficial variant increased it by 7.4%. Traction is a simple, minimally invasive, and inexpensive procedure. The proposed algorithm may enable the development of a helpful forecasting tool, which could be of real use to doctors treating Pierre-Robin Sequence and other mandibular deformations in children. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Modeling combined heat transfer in an all solid state optical cryocooler

    NASA Astrophysics Data System (ADS)

    Kuzhiveli, Biju T.

    2017-12-01

Attaining a cooling effect by using laser-induced anti-Stokes fluorescence in solids appears to have several advantages over conventional mechanical systems and has been the topic of recent analysis and experimental work. Using the anti-Stokes fluorescence phenomenon to remove heat from a glass by pumping it with laser light provides the physical basis for solid-state cooling. Cryocooling by fluorescence is a feasible route to compactness and reliability; it occupies a distinct niche in the family of small-capacity cryocoolers and is advancing rapidly. Developing a laser-induced anti-Stokes fluorescent cryocooler requires numerical tools that support the thermal design and provide a thorough analysis of the combined heat transfer mechanisms within the cryocooler. The paper presents the details of the numerical model developed for the cryocooler and the subsequent development of a computer program. The program has been used to understand the various heat transfer mechanisms and is being used for the thermal design of components of an anti-Stokes fluorescent cryocooler.
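A standard back-of-the-envelope relation in this field, and a plausible ingredient of such a thermal model, is the ideal anti-Stokes cooling efficiency: each absorbed pump photon is re-emitted at a shorter mean fluorescence wavelength, carrying away the energy difference as heat lift. The wavelengths below are illustrative assumptions, not values from the paper:

```python
def cooling_efficiency(lambda_pump_nm, lambda_fluor_nm):
    """Ideal anti-Stokes cooling efficiency eta = lambda_pump/lambda_fluor - 1,
    assuming unit external quantum efficiency and no parasitic absorption."""
    return lambda_pump_nm / lambda_fluor_nm - 1.0

# Example: pumping at 1030 nm with mean fluorescence at 1000 nm gives ~3%.
eta = cooling_efficiency(1030.0, 1000.0)
```

Real devices fall below this ideal limit because of nonradiative decay, fluorescence reabsorption, and parasitic background absorption, which is exactly what a combined heat-transfer model must capture.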

  17. Technical Report on Occupations in Numerically Controlled Metal-Cutting Machining.

    ERIC Educational Resources Information Center

    Manpower Administration (DOL), Washington, DC. U.S. Employment Service.

At the present time, only 5 percent of the short-run metal-cutting machining in the United States is done by numerically controlled machine tools, but within the next decade it is expected to increase by 50 percent. Numerically controlled machines use taped data which is converted into instructions that direct the machine to perform certain steps…

  18. Journal of Aeronautics.

    DTIC Science & Technology

    1982-07-21

aerodynamic tool for design of elastic aircraft. Several numerical examples are given and some dynamical problems of elastic aircraft are also discussed... Qiangang, Wu Changlin, Jian Zheng, Northwestern Polytechnical University. Abstract: A numerical method ... for predicting the aerodynamic characteristics... Numerical calculation methods are one important means of present research on the aerodynamic characteristics of elastic aircraft. Because this

  19. Modeling and Analysis of the Reverse Water Gas Shift Process for In-Situ Propellant Production

    NASA Technical Reports Server (NTRS)

    Whitlow, Jonathan E.

    2000-01-01

This report focuses on the development of mathematical models and simulation tools for the Reverse Water Gas Shift (RWGS) process. This process is a candidate technology for oxygen production on Mars under the In-Situ Propellant Production (ISPP) project. An analysis of the RWGS process was performed using a material balance for the system. The material balance is very complex due to the downstream separations and the recycle inherent in the process. A numerical simulation was developed for the RWGS process to provide a tool for analysis and optimization of experimental hardware, which will be constructed later this year at Kennedy Space Center (KSC). Attempts to solve the material balance for the system, which can be defined by 27 nonlinear equations, initially failed. A convergence scheme was developed which led to successful solution of the material balance; however, the simplified equations used for the gas separation membrane were found insufficient. Additional, more rigorous models were successfully developed and solved for the membrane separation. Sample results from these models are included in this report, with recommendations for the experimental work needed for model validation.
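The convergence difficulty described above can be illustrated on a single equilibrium relation: even one RWGS equilibrium equation (CO2 + H2 ⇌ CO + H2O) is nonlinear in the conversion and must be solved iteratively. A minimal sketch; the equimolar-feed assumption and the Keq values used in the test are illustrative, not taken from the report:

```python
def rwgs_conversion(Keq, tol=1e-10):
    """Equilibrium conversion x of CO2 for RWGS with an equimolar CO2/H2 feed:
    Keq = x^2 / (1 - x)^2 (moles cancel, so pressure drops out).
    Solved by bisection, mimicking a material-balance convergence loop."""
    f = lambda x: x * x / ((1.0 - x) ** 2) - Keq   # monotone increasing on (0,1)
    lo, hi = 0.0, 1.0 - 1e-12
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The full 27-equation balance couples many such nonlinear relations with recycle streams, which is why a dedicated convergence scheme was needed.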

  20. Proceedings of the IMOG (Interagency Manufacturing Operations Group) Numerical Systems Group. 62nd Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maes, G.J.

    1993-10-01

This document contains the proceedings of the 62nd Interagency Manufacturing Operations Group (IMOG) Numerical Systems Group. Included are the minutes of the 61st meeting and the agenda for the 62nd meeting. Presentations at the meeting are provided in the appendices to this document. Presentations were: 1992 NSG Annual Report to IMOG Steering Committee; Charter for the IMOG Numerical Systems Group; Y-12 Coordinate Measuring Machine Training Project; IBH NC Controller; Automatically Programmed Metrology Update; Certification of Anvil-5000 for Production Use at the Y-12 Plant; Accord Project; Sandia National Laboratories "Accord"; Demo/Anvil Tool Path Generation 5-Axis; Demo/Video Machine/Robot Animation Dynamics; Demo/Certification of Anvil Tool Path Generation; Tour of the M-60 Inspection Machine; Distributed Numerical Control Certification; Spline Usage Method; Y-12 NC Engineering Status; and Y-12 Manufacturing CAD Systems.

  1. A study on directional resistivity logging-while-drilling based on self-adaptive hp-FEM

    NASA Astrophysics Data System (ADS)

    Liu, Dejun; Li, Hui; Zhang, Yingying; Zhu, Gengxue; Ai, Qinghui

    2014-12-01

    Numerical simulation of resistivity logging-while-drilling (LWD) tool response provides guidance for designing novel logging instruments and for interpreting real-time logging data. In this paper, based on a self-adaptive hp-finite element method (hp-FEM) algorithm, we analyze LWD tool response against model parameters and briefly illustrate the geosteering capabilities of directional resistivity LWD. Numerical simulation results indicate that the source spacing strongly influences the investigation depth and detection precision of the resistivity LWD tool, while changing the operating frequency can improve the resolution of both low-resistivity and high-resistivity formations. The results also indicate that the self-adaptive hp-FEM algorithm offers the convergence speed and calculation accuracy needed to guide geosteering, and that it is well suited to simulating the response of resistivity LWD tools.
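The convergence behaviour such FEM studies rely on can be sanity-checked even in one dimension. The sketch below is not the paper's self-adaptive hp-FEM; it is a plain linear (h-version) finite element solve of -u'' = pi^2 sin(pi x) on [0,1] with homogeneous boundary conditions, included only to show the expected second-order error decay under uniform mesh refinement:

```python
import math

def fem_error(n):
    """Max nodal error of linear FEM (lumped load) for -u'' = pi^2 sin(pi x),
    u(0) = u(1) = 0, on a uniform mesh with n elements. Exact u = sin(pi x)."""
    h = 1.0 / n
    x = [i * h for i in range(1, n)]            # interior nodes
    a = [-1.0 / h] * (n - 2)                    # sub-diagonal of stiffness
    b = [2.0 / h] * (n - 1)                     # diagonal
    c = [-1.0 / h] * (n - 2)                    # super-diagonal
    d = [h * math.pi**2 * math.sin(math.pi * xi) for xi in x]  # lumped load
    # Thomas algorithm for the tridiagonal system
    for i in range(1, n - 1):
        m = a[i - 1] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return max(abs(ui - math.sin(math.pi * xi)) for ui, xi in zip(u, x))
```

Halving h should cut the error by roughly a factor of four; adaptive hp methods improve on this fixed algebraic rate by refining only where an error indicator is large and raising the polynomial order where the solution is smooth.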

  2. Evolving Design Criteria for Very Large Aperture Space Based Telescopes and Their Influence on the Need for Integrated Tools in the Optimization Process

    NASA Technical Reports Server (NTRS)

    Arnold, William R., Sr.

    2015-01-01

    NASA's Advanced Mirror Technology Development (AMTD) program has been developing the means to design and build the future generations of space based telescopes. With the nearing completion of the James Webb Space Telescope (JWST), the astrophysics community is already starting to define the requirements for follow-on observatories. The restrictions of available launch vehicles and the possibilities of planned future vehicles have fueled the competition between monolithic primaries (with better optical quality) and segmented primaries (with larger apertures, but with diffraction, cost and figure control issues). Regardless of the current shroud sizes and lift capacities, these competing architectures share the need for rapid design tools. As part of the AMTD program, a number of tools have been developed and tested to speed up the design process, starting with the Arnold Mirror Modeler, which creates finite element models (FEM) for structural analysis and now also feeds these models into thermal stability analyses. The tools share common file formats and interchangeable results. During the development of the program, numerous trade studies were created for 4-meter and 8-meter monolithic primaries, complete with support systems. Evaluation of these results has led to a better understanding of how the specification drives the results. This paper will show some of the early trade studies for typical specification requirements such as lowest mirror bending frequency and lowest suspension-system frequency. The results use representative allowable stress values for each mirror substrate material and construction method, together with generic material properties. These studies lead to some interesting relationships between feasible designs and the realities of actually trying to build these mirrors. Many traditional specifications were developed for much smaller systems, in which the mass and volume of the primary were a small portion of the overall satellite. JWST shows us that as the aperture grows, the primary takes up the majority of the mass and volume, and the established rules need to be adjusted. For example, a small change in the lowest-frequency requirement can change the cost by millions of dollars. The paper uses numerous trade studies created during the software development phase of the Arnold Mirror Modeler to illustrate the influence of system specifications on the design space. Future telescopes will require better performance, stability and documented feasibility to meet the hurdles of today's budget and schedule realities. AMTD is developing the tools, but basic system planning must also adapt to the requirements of these very large and complex physical structures.
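Why a lowest-frequency requirement bites so hard at large apertures can be seen from the classical thin-plate estimate, where the fundamental bending mode scales as thickness over aperture squared. A rough sketch (the free-plate eigenvalue parameter and the glass-ceramic-like E, rho, nu values are illustrative assumptions, not AMTD data):

```python
import math

# Classical thin-plate estimate of the fundamental bending frequency of a
# free circular mirror blank. lam2 (the first free-free eigenvalue
# parameter, ~5.25) and the material constants are illustrative only.
def first_mode_hz(diameter_m, thickness_m, E=90e9, rho=2530.0, nu=0.24, lam2=5.25):
    a = diameter_m / 2.0
    D = E * thickness_m**3 / (12.0 * (1.0 - nu**2))  # flexural rigidity
    return (lam2 / (2.0 * math.pi)) * math.sqrt(D / (rho * thickness_m * a**4))
```

Since f1 scales as t/a^2, doubling the aperture at fixed thickness cuts the first mode by a factor of four; holding a fixed frequency floor therefore forces a much thicker, heavier blank, which is exactly the spec-driven mass growth the trade studies expose.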

  3. Trabecular bone score (TBS): Method and applications.

    PubMed

    Martineau, P; Leslie, W D

    2017-11-01

    Trabecular bone score (TBS) is a texture index derived from standard lumbar spine dual energy X-ray absorptiometry (DXA) images and provides information about the underlying bone independent of the bone mineral density (BMD). Several salient observations have emerged. Numerous studies have examined the relationship between TBS and fracture risk and have shown that lower TBS values are associated with increased risk for major osteoporotic fracture in postmenopausal women and older men, with this result being independent of BMD values and other clinical risk factors. Therefore, despite being derived from standard DXA images, the information contained in TBS is independent and complementary to the information provided by BMD and the FRAX® tool. A procedure to generate TBS-adjusted FRAX probabilities has become available with the resultant predicted fracture risks shown to be more accurate than the standard FRAX tool. With these developments, TBS has emerged as a clinical tool for improved fracture risk prediction and guiding decisions regarding treatment initiation, particularly for patients with FRAX probabilities around an intervention threshold. In this article, we review the development, validation, clinical application, and limitations of TBS. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. TRACI 2.0 - The Tool for the Reduction and Assessment of ...

    EPA Pesticide Factsheets

    TRACI 2.0, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts 2.0, has been expanded and developed for sustainability metrics, life cycle impact assessment, industrial ecology, and process design impact assessment for developing increasingly sustainable products, processes, facilities, companies, and communities. TRACI 2.0 allows the quantification of stressors that have potential effects, including ozone depletion, global warming, acidification, eutrophication, tropospheric ozone (smog) formation, human health criteria-related effects, human health cancer, human health noncancer, ecotoxicity, and fossil fuel depletion effects. Research is ongoing to quantify the use of land and water in a future version of TRACI. The original version of TRACI, released in August 2002, has been used in many prestigious applications, including the US Green Building Council’s LEED Certification; the National Institute of Standards and Technology’s BEES (Building for Environment and Economic Sustainability), which is used by US EPA for Environmentally Preferable Purchasing; the US Marine Corps’ EKAT (Environmental Knowledge and Assessment Tool) for military and non-military uses; and numerous college curricula in engineering and design departments.
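The characterization step at the heart of any such life-cycle impact assessment tool multiplies each emitted mass by a category-specific characterization factor and sums. A minimal sketch (the factors below are made-up placeholders, not TRACI values):

```python
# Characterization step of a life-cycle impact assessment: each category
# score is the sum of (mass released x characterization factor).
# The factor table here is a made-up placeholder, not TRACI data.
def characterize(inventory, factors):
    return {category: sum(mass * cf.get(substance, 0.0)
                          for substance, mass in inventory.items())
            for category, cf in factors.items()}

inventory = {"CO2": 100.0, "CH4": 2.0}  # kg released
factors = {"global warming (kg CO2-eq)": {"CO2": 1.0, "CH4": 25.0}}
scores = characterize(inventory, factors)
```

Substances with no factor in a category simply contribute nothing, which is how a single inventory can be scored against many impact categories at once.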

  5. Insertional engineering of chromosomes with Sleeping Beauty transposition: an overview.

    PubMed

    Grabundzija, Ivana; Izsvák, Zsuzsanna; Ivics, Zoltán

    2011-01-01

    Novel genetic tools and mutagenesis strategies based on the Sleeping Beauty (SB) transposable element are currently under development with a vision to link primary DNA sequence information to gene functions in vertebrate models. By virtue of its inherent capacity to insert into DNA, the SB transposon can be developed into powerful tools for chromosomal manipulations. Mutagenesis screens based on SB have numerous advantages including high throughput and easy identification of mutated alleles. Forward genetic approaches based on insertional mutagenesis by engineered SB transposons have the advantage of providing insight into genetic networks and pathways based on phenotype. Indeed, the SB transposon has become a highly instrumental tool to induce tumors in experimental animals in a tissue-specific manner with the aim of uncovering the genetic basis of diverse cancers. Here, we describe a battery of mutagenic cassettes that can be applied in conjunction with SB transposon vectors to mutagenize genes, and highlight versatile experimental strategies for the generation of engineered chromosomes for loss-of-function as well as gain-of-function mutagenesis for functional gene annotation in vertebrate models.

  6. Numerical Model Metrics Tools in Support of Navy Operations

    NASA Astrophysics Data System (ADS)

    Dykes, J. D.; Fanguy, P.

    2017-12-01

    Increasing demand for accurate ocean forecasts relevant to Navy mission decision makers requires tools that quickly provide pertinent numerical model metrics to forecasters. Ever-higher-resolution model domains, including coupled and ensemble systems, together with the growing volume of observations and other data sources against which to compare model output, require more tools that enable forecasters to do more with less. These data can be handled in a geographic information system (GIS) and fused to provide useful information and analyses, and ultimately a better understanding of how the pertinent model performs against ground truth. Oceanographic measurements such as surface elevation, profiles of temperature and salinity, and wave height can all be incorporated into a set of layers correlated with geographic information such as bathymetry and topography. In addition, an automated system that runs concurrently with the models on high-performance machines matches routinely available observations to modelled values, forming a database of matchups from which statistics can be calculated and displayed to facilitate validation of forecast state and derived variables. ArcMAP, developed by the Environmental Systems Research Institute, is a GIS application used by the Naval Research Laboratory (NRL) and naval operational meteorological and oceanographic centers to analyse the environment in support of a range of Navy missions. For example, acoustic propagation in the ocean is described with a three-dimensional analysis of sound speed that depends on profiles of temperature, pressure and salinity predicted by the Navy Coastal Ocean Model. The data and model output must include geo-referencing information suitable for accurately placing the data within the ArcMAP framework. NRL has developed tools that facilitate merging these geophysical data and their analyses, including intercomparisons between model predictions as well as comparison to validation data. This methodology produces new insights and facilitates identification of potential problems in ocean prediction.
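The matchup-and-statistics step described above can be sketched in a few lines; this is a hypothetical stand-in for the automated matchup database, not NRL's implementation, using naive nearest-neighbour search over (lat, lon, value) tuples:

```python
import math

# Pair each observation with the nearest model point (naive search) and
# report bias and RMSE of model minus observation. A toy stand-in for an
# automated model-observation matchup database.
def matchup_stats(model, obs, max_dist_deg=0.5):
    errors = []
    for olat, olon, oval in obs:
        nlat, nlon, nval = min(model,
                               key=lambda m: (m[0] - olat)**2 + (m[1] - olon)**2)
        if max(abs(nlat - olat), abs(nlon - olon)) <= max_dist_deg:
            errors.append(nval - oval)   # model minus observation
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse
```

An operational version would index the model grid for speed and match in time as well as space, but the bias/RMSE summary it feeds to the GIS layers is the same.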

  7. The VIDA Framework as an Education Tool: Leveraging Volcanology Data for Educational Purposes

    NASA Astrophysics Data System (ADS)

    Faied, D.; Sanchez, A.

    2009-04-01

    While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and to assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. The gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. The effort to address the gaps produced a system-level framework. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. While the basic intention of VIDA is to support disaster risk reduction efforts, there are several ways to leverage its raw science data to support education across a wide demographic. Basic geophysical data could be used to educate school children about the characteristics of volcanoes, satellite mappings could support informed growth and development of societies in at-risk areas, and raw sensor data could contribute to a wide range of university-level research projects. Satellite maps, basic geophysical data, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research in the subject itself, but also to research in operational tools, system architecture, and human/machine interaction in civil protection and emergency scenarios.

  8. Utilization of FEM model for steel microstructure determination

    NASA Astrophysics Data System (ADS)

    Kešner, A.; Chotěborský, R.; Linda, M.; Hromasová, M.

    2018-02-01

    Agricultural tools used in soil processing are worn by an abrasive wear mechanism caused by hard mineral particles in the soil. The wear rate is influenced by the mechanical properties of the tool material and by the mineral particle content of the soil. The mechanical properties of steel can be tailored through heat treatment, which produces different microstructures. Determining the right treatment experimentally is very expensive; numerical methods such as FEM allow the resulting microstructure to be estimated at low cost, although every numerical model must be verified. The aim of this work is to show a procedure for predicting the microstructure of steel for agricultural tools. Material characterizations of 51CrV4 grade steel, such as the TTT diagram, heat capacity, heat conduction, and other physical properties, were used for the numerical simulation. The relationship between the microstructure predicted by FEM and the real microstructure after heat treatment shows good correlation.
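The basic idea of combining a cooling curve with a TTT diagram can be sketched crudely: if the part cools past the diffusional "nose" faster than the critical rate, martensite forms. A minimal sketch with Newtonian cooling; all numbers here are illustrative assumptions, not 51CrV4 data:

```python
import math

# Newtonian cooling T(t) = Ta + (T0 - Ta) * exp(-t / tau). If the average
# cooling rate down to the martensite-start temperature Ms exceeds a
# critical rate read off a TTT diagram, assume the diffusional nose is
# missed and martensite forms. All constants are illustrative.
def cools_to_martensite(T0, Ta, tau_s, crit_rate=30.0, Ms=300.0):
    t_Ms = tau_s * math.log((T0 - Ta) / (Ms - Ta))  # time to reach Ms (s)
    return (T0 - Ms) / t_Ms >= crit_rate            # average rate, deg C/s
```

A real FEM prediction integrates the local thermal history at every node against the full TTT curves; this scalar check only illustrates why quenching (small tau) and air cooling (large tau) land in different microstructures.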

  9. Shape optimization and CAD

    NASA Technical Reports Server (NTRS)

    Rasmussen, John

    1990-01-01

    Structural optimization has attracted attention since the days of Galileo. Olhoff and Taylor have produced an excellent overview of the classical research within this field. However, interest in structural optimization has increased greatly during the last decade due to the advent of reliable general numerical analysis methods and the computer power necessary to use them efficiently. This has created the possibility of developing general numerical systems for shape optimization. Several authors, e.g., Esping; Braibant & Fleury; Bennet & Botkin; Botkin, Yang, and Bennet; and Stanton, have published practical and successful applications of general optimization systems. Ding and Homlein have produced extensive overviews of available systems. Furthermore, a number of commercial optimization systems based on well-established finite element codes have been introduced; systems like ANSYS, IDEAS, OASIS, and NISAOPT are widely known examples. In parallel to this development, the technology of computer-aided design (CAD) has gained a large influence on the design process in mechanical engineering. CAD technology has already lived through a rapid development driven by the drastically growing capabilities of digital computers. However, today's systems are still considered only the first generation of a long line of computer-integrated manufacturing (CIM) systems. The systems to come will offer an integrated environment for design, analysis, and fabrication of products of almost any character. Thus, the CAD system can be regarded as a database for geometrical information equipped with a number of tools to help the user in the design process. Among these tools are facilities for structural analysis and optimization, alongside standard CAD features like drawing, modeling, and visualization tools.
    The state of the art in structural optimization is that a large number of mathematical and mechanical techniques are available for the solution of individual problems. By implementing collections of the available techniques in general software systems, operational environments for structural optimization have been created. The forthcoming years must bring solutions to the problem of integrating such systems into more general design environments. The result of this work should be CAD systems for rational design in which structural optimization is one important design tool among many others.
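One of the oldest resizing recipes implemented inside such optimization systems is fully stressed design: scale each member's cross-section by the ratio of its stress to the allowable. A minimal sketch for axially loaded bars (a toy, not any of the systems named above):

```python
# Fully stressed design for axially loaded bars: resize each area by the
# ratio of its stress to the allowable stress. For statically determinate
# members (stress = load / area) the update converges in one pass;
# indeterminate structures would need a re-analysis inside the loop.
def fully_stressed_areas(member_loads, allowable_stress, area0=1.0, iters=5):
    areas = [area0] * len(member_loads)
    for _ in range(iters):
        stresses = [P / A for P, A in zip(member_loads, areas)]
        areas = [A * abs(s) / allowable_stress
                 for A, s in zip(areas, stresses)]
    return areas
```

Modern shape optimizers replace this heuristic with gradient-based mathematical programming, but the loop structure, analyze, evaluate criteria, resize, repeat, is the same one the general systems automate.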

  10. XBeach-G: a tool for predicting gravel barrier response to extreme storm conditions

    NASA Astrophysics Data System (ADS)

    Masselink, Gerd; Poate, Tim; McCall, Robert; Roelvink, Dano; Russell, Paul; Davidson, Mark

    2014-05-01

    Gravel beaches protect low-lying back-barrier regions from flooding during storm events and their importance to society is widely acknowledged. Unfortunately, breaching and extensive storm damage have occurred at many gravel sites, and this is likely to increase as a result of sea-level rise and enhanced storminess due to climate change. Limited scientific guidance is currently available to provide beach managers with operational management tools to predict the response of gravel beaches to storms. The New Understanding and Prediction of Storm Impacts on Gravel beaches (NUPSIG) project aims to improve our understanding of storm impacts on gravel coastal environments and to develop a predictive capability by modelling these impacts. The NUPSIG project uses a 5-pronged approach to address its aim: (1) analyse hydrodynamic data collected during a prototype laboratory experiment on a gravel beach; (2) collect hydrodynamic field data on a gravel beach under a range of conditions, including storm waves with wave heights up to 3 m; (3) measure swash dynamics and beach response on 10 gravel beaches during extreme wave conditions with wave heights in excess of 3 m; (4) use the data collected under 1-3 to develop and validate a numerical model to model hydrodynamics and morphological response of gravel beaches under storm conditions; and (5) develop a tool for end-users, based on the model formulated under (4), for predicting storm response of gravel beaches and barriers. The aim of this presentation is to present the key results of the NUPSIG project and introduce the end-user tool for predicting storm response on gravel beaches. The model is based on the numerical model XBeach, and different forcing scenarios (waves and tides), barrier configurations (dimensions) and sediment characteristics are easily uploaded for model simulations using a graphical user interface (GUI).
The model can be used to determine the vulnerability of gravel barriers to storm events, but can also be used to help optimise design criteria for gravel barriers to reduce their vulnerability and enhance their coastal protection ability.

  11. A numerical tool for the calculation of non-equilibrium ionisation states in the solar corona and other astrophysical plasma environments

    NASA Astrophysics Data System (ADS)

    Bradshaw, S. J.

    2009-07-01

    Context: The effects of non-equilibrium processes on the ionisation state of strongly emitting elements in the solar corona can be extremely difficult to assess and yet they are critically important. For example, there is much interest in dynamic heating events localised in the solar corona because they are believed to be responsible for its high temperature and yet recent work has shown that the hottest (≥107 K) emission predicted to be associated with these events can be observationally elusive due to the difficulty of creating the highly ionised states from which the expected emission arises. This leads to the possibility of observing instruments missing such heating events entirely. Aims: The equations describing the evolution of the ionisation state are a very stiff system of coupled, partial differential equations whose solution can be numerically challenging and time-consuming. Without access to specialised codes and significant computational resources it is extremely difficult to avoid the assumption of an equilibrium ionisation state even when it clearly cannot be justified. The aim of the current work is to develop a computational tool to allow straightforward calculation of the time-dependent ionisation state for a wide variety of physical circumstances. Methods: A numerical model comprising the system of time-dependent ionisation equations for a particular element and tabulated values of plasma temperature as a function of time is developed. The tabulated values can be the solutions of an analytical model, the output from a numerical code or a set of observational measurements. An efficient numerical method to solve the ionisation equations is implemented. Results: A suite of tests is designed and run to demonstrate that the code provides reliable and accurate solutions for a number of scenarios including equilibration of the ion population and rapid heating followed by thermal conductive cooling.
It is found that the solver can evolve the ionisation state to recover exactly the equilibrium state found by an independent, steady-state solver for all temperatures, resolve the extremely small ionisation/recombination timescales associated with rapid temperature changes at high densities, and provide stable and accurate solutions for both dominant and minor ion population fractions. Rapid heating and cooling of low to moderate density plasma is characterised by significant non-equilibrium ionisation conditions. The effective ionisation temperatures are significantly lower than the electron temperature and the values found are in close agreement with the previous work of others. At the very highest densities included in the present study an assumption of equilibrium ionisation is found to be robust. Conclusions: The computational tool presented here provides a straightforward and reliable way to calculate ionisation states for a wide variety of physical circumstances. The numerical code gives results that are accurate and consistent with previous studies, has relatively undemanding computational requirements and is freely available from the author.
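The stiffness the paper describes can be illustrated with a two-state toy balance between ionisation and recombination, advanced with an implicit (backward) Euler step. This is a sketch of the stability issue only, not the paper's multi-ion solver; the rate values are arbitrary:

```python
# Two-state sketch of a stiff ionisation balance dn1/dt = C*(1 - n1) - R*n1
# (C = ionisation rate, R = recombination rate). Backward Euler stays stable
# however stiff the rates are: solving
#   n1' = n1 + dt * (C*(1 - n1') - R*n1')
# for n1' gives the closed-form update below.
def evolve_ion_fraction(C, R, n1=0.0, dt=1.0, steps=100):
    for _ in range(steps):
        n1 = (n1 + dt * C) / (1.0 + dt * (C + R))
    return n1
```

Explicit Euler with the same step would be violently unstable whenever dt*(C+R) is large; the implicit update instead relaxes monotonically toward the equilibrium fraction C/(C+R), which is the behaviour a practical non-equilibrium ionisation solver must preserve across many decades of rate coefficients.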

  12. Development of Multi-Layered Floating Floor for Cabin Noise Reduction

    NASA Astrophysics Data System (ADS)

    Song, Jee-Hun; Hong, Suk-Yoon; Kwon, Hyun-Wung

    2017-12-01

    Recently, regulations pertaining to the noise and vibration environment of ship cabins have been strengthened. In this paper, a numerical model of a multi-layered floating floor is developed to predict structure-borne noise in ship cabins. The theoretical model consists of multi-panel structures lined with high-density mineral wool. The predicted structure-borne noise levels with the multi-layered floating floor are compared to measurements made on a mock-up. The comparison of predicted and experimental results shows that the developed model can be an effective tool for predicting structure-borne noise in ship cabins.
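For context on why layered constructions are attractive, the textbook field-incidence mass law for a single homogeneous panel (not the paper's multi-layer model) shows how slowly transmission loss grows with added mass:

```python
import math

# Textbook field-incidence mass law for a single panel:
#   TL ~ 20*log10(f * m) - 47 dB
# with f the frequency in Hz and m the surface mass in kg/m^2.
# This is the classical single-panel approximation, not the multi-layer
# floating-floor model developed in the paper.
def mass_law_tl_db(f_hz, surface_mass):
    return 20.0 * math.log10(f_hz * surface_mass) - 47.0
```

Doubling the surface mass buys only about 6 dB, which is why multi-layer floors with resilient mineral-wool interlayers are used instead of simply adding more steel.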

  13. Numerical prediction of mechanical properties of Pb-Sn solder alloys containing antimony, bismuth, and/or silver ternary trace elements

    NASA Astrophysics Data System (ADS)

    Gadag, Shiva P.; Patra, Susant

    2000-12-01

    Solder joint interconnects are mechanical means of structural support for bridging the various electronic components and providing electrical contacts and a thermal path for heat dissipation. The functionality of the electronic device often relies on the structural integrity of the solder. The dimensional stability of solder joints is numerically predicted based on their mechanical properties. Algorithms to model the kinetics of dissolution and subsequent growth of intermetallic from the complete knowledge of a single history of time-temperature-reflow profile, by considering equivalent isothermal time intervals, have been developed. The information for dissolution is derived during the heating cycle of reflow, and that for the growth process from the cooling curve of the reflow profile. A simple and quick analysis tool to derive tensile stress-strain maps as a function of the reflow temperature of solder and strain rate has been developed by numerical program. The tensile properties are used in modeling thermal strain and thermal fatigue and to predict the overall fatigue life of solder joints. The numerical analysis of the tensile properties as affected by composition and rate of testing has been compiled in this paper. A numerical model using a constitutive equation has been developed to evaluate the interfacial fatigue crack growth rate. The model can assess the effect of cooling rate, which depends on the level of strain energy release rate. Increasing the cooling rate from normalizing to water-quenching enhanced the fatigue resistance to interfacial crack growth by up to 50% at low strain energy release rate. The increased cooling rates enhanced the fatigue crack growth resistance by surface roughening at the interface of the solder joint. This paper highlights salient features of process modeling. Interfacial intermetallic microstructure is affected by cooling rate and thereby affects the mechanical properties.
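The equivalent-isothermal-interval idea can be sketched for diffusion-controlled intermetallic growth, where layer thickness follows x^2 = k(T)*t with an Arrhenius rate constant. This is an illustrative sketch in the spirit of the algorithm described above; k0 and Q are assumed values, not fitted solder data:

```python
import math

# Diffusion-controlled intermetallic growth x^2 = k(T) * t accumulated over
# equivalent isothermal intervals of a reflow profile. profile is a list of
# (time_s, temp_K) samples; k0 and Q are illustrative Arrhenius constants.
def imc_thickness(profile, k0=1e-3, Q=75e3, R_gas=8.314):
    x2 = 0.0
    for (t0, T0), (t1, T1) in zip(profile, profile[1:]):
        T = 0.5 * (T0 + T1)                  # mean temperature of interval
        k = k0 * math.exp(-Q / (R_gas * T))  # Arrhenius growth constant
        x2 += k * (t1 - t0)                  # parabolic-growth increment
    return math.sqrt(x2)
```

Because the increments are accumulated in x^2, the scheme honours parabolic kinetics across a varying temperature history, which is why a hotter or longer reflow peak yields a measurably thicker intermetallic layer.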

  14. A review of laboratory and numerical modelling in volcanology

    NASA Astrophysics Data System (ADS)

    Kavanagh, Janine L.; Engwell, Samantha L.; Martin, Simon A.

    2018-04-01

    Modelling has been used in the study of volcanic systems for more than 100 years, building upon the approach first applied by Sir James Hall in 1815. Informed by observations of volcanological phenomena in nature, including eye-witness accounts of eruptions, geophysical or geodetic monitoring of active volcanoes, and geological analysis of ancient deposits, laboratory and numerical models have been used to describe and quantify volcanic and magmatic processes that span orders of magnitudes of time and space. We review the use of laboratory and numerical modelling in volcanological research, focussing on sub-surface and eruptive processes including the accretion and evolution of magma chambers, the propagation of sheet intrusions, the development of volcanic flows (lava flows, pyroclastic density currents, and lahars), volcanic plume formation, and ash dispersal. When first introduced into volcanology, laboratory experiments and numerical simulations marked a transition in approach from broadly qualitative to increasingly quantitative research. These methods are now widely used in volcanology to describe the physical and chemical behaviours that govern volcanic and magmatic systems. Creating simplified models of highly dynamical systems enables volcanologists to simulate and potentially predict the nature and impact of future eruptions. These tools have provided significant insights into many aspects of the volcanic plumbing system and eruptive processes. The largest scientific advances in volcanology have come from a multidisciplinary approach, applying developments in diverse fields such as engineering and computer science to study magmatic and volcanic phenomena. A global effort to integrate laboratory and numerical volcano modelling is now required to tackle key problems in volcanology; this points towards the importance of benchmarking exercises and the need to develop protocols so that models are routinely tested against real-world data.

  15. Automated benchmarking of peptide-MHC class I binding predictions.

    PubMed

    Trolle, Thomas; Metushi, Imir G; Greenbaum, Jason A; Kim, Yohan; Sidney, John; Lund, Ole; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten

    2015-07-01

    Numerous in silico methods predicting peptide binding to major histocompatibility complex (MHC) class I molecules have been developed over the last decades. However, the multitude of available prediction tools makes it non-trivial for the end-user to select which tool to use for a given task. To provide a solid basis on which to compare different prediction tools, we here describe a framework for the automated benchmarking of peptide-MHC class I binding prediction tools. The framework runs weekly benchmarks on data that are newly entered into the Immune Epitope Database (IEDB), giving the public access to frequent, up-to-date performance evaluations of all participating tools. To overcome potential selection bias in the data included in the IEDB, a strategy was implemented that suggests a set of peptides for which different prediction methods give divergent predictions as to their binding capability. Upon experimental binding validation, these peptides entered the benchmark study. The benchmark has run for 15 weeks and includes evaluation of 44 datasets covering 17 MHC alleles and more than 4000 peptide-MHC binding measurements. Inspection of the results allows the end-user to make educated selections between participating tools. Of the four participating servers, NetMHCpan performed the best, followed by ANN, SMM and finally ARB. Up-to-date performance evaluations of each server can be found online at http://tools.iedb.org/auto_bench/mhci/weekly. All prediction tool developers are invited to participate in the benchmark. Sign-up instructions are available at http://tools.iedb.org/auto_bench/mhci/join. mniel@cbs.dtu.dk or bpeters@liai.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
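A core metric behind comparisons like the one above is the area under the ROC curve. A minimal sketch of the rank-sum (Mann-Whitney) computation; the scoring convention is an assumption for illustration, not the IEDB pipeline's exact evaluation code:

```python
# Area under the ROC curve via the rank-sum identity: the fraction of
# (positive, negative) pairs the predictor ranks correctly, with ties
# counted as half. Higher scores are assumed to mean "predicted binder";
# labels are 1 (binder) or 0 (non-binder).
def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 1.0 means every binder outscores every non-binder, and 0.5 means the ranking is no better than chance; reporting it per dataset and per allele is what lets end-users compare servers such as those in the benchmark.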

  16. Automated benchmarking of peptide-MHC class I binding predictions

    PubMed Central

    Trolle, Thomas; Metushi, Imir G.; Greenbaum, Jason A.; Kim, Yohan; Sidney, John; Lund, Ole; Sette, Alessandro; Peters, Bjoern; Nielsen, Morten

    2015-01-01

    Motivation: Numerous in silico methods predicting peptide binding to major histocompatibility complex (MHC) class I molecules have been developed over the last decades. However, the multitude of available prediction tools makes it non-trivial for the end-user to select which tool to use for a given task. To provide a solid basis on which to compare different prediction tools, we here describe a framework for the automated benchmarking of peptide-MHC class I binding prediction tools. The framework runs weekly benchmarks on data that are newly entered into the Immune Epitope Database (IEDB), giving the public access to frequent, up-to-date performance evaluations of all participating tools. To overcome potential selection bias in the data included in the IEDB, a strategy was implemented that suggests a set of peptides for which different prediction methods give divergent predictions as to their binding capability. Upon experimental binding validation, these peptides entered the benchmark study. Results: The benchmark has run for 15 weeks and includes evaluation of 44 datasets covering 17 MHC alleles and more than 4000 peptide-MHC binding measurements. Inspection of the results allows the end-user to make educated selections between participating tools. Of the four participating servers, NetMHCpan performed the best, followed by ANN, SMM and finally ARB. Availability and implementation: Up-to-date performance evaluations of each server can be found online at http://tools.iedb.org/auto_bench/mhci/weekly. All prediction tool developers are invited to participate in the benchmark. Sign-up instructions are available at http://tools.iedb.org/auto_bench/mhci/join. Contact: mniel@cbs.dtu.dk or bpeters@liai.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25717196

  17. Developing a mapping tool for tablets

    NASA Astrophysics Data System (ADS)

    Vaughan, Alan; Collins, Nathan; Krus, Mike

    2014-05-01

    Digital field mapping offers significant benefits when compared with traditional paper mapping techniques in that it provides closer integration with downstream geological modelling and analysis. It also provides the mapper with the ability to rapidly integrate new data with existing databases without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. In order to achieve these benefits, a number of PC-based digital mapping tools are available which have been developed for specific communities, e.g. the BGS·SIGMA project, Midland Valley's FieldMove®, and a range of solutions based on ArcGIS® software, which can be combined with either traditional or digital orientation and data collection tools. However, with the now widespread availability of inexpensive tablets and smart phones, user-led demand for a fully integrated tablet mapping tool has arisen. This poster describes the development of a tablet-based mapping environment specifically designed for geologists. The challenge was to deliver a system that would feel sufficiently close to the flexibility of paper-based geological mapping while being implemented on a consumer communication and entertainment device. The first release of a tablet-based geological mapping system from this project is illustrated and will be shown as implemented on an iPad during the poster session. Midland Valley is pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field-based projects throughout this year and will be integrating feedback into further developments of this technology.

  18. Modeling Production Plant Forming Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhee, M; Becker, R; Couch, R

    2004-09-22

    Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities such as rolling, casting, and forging. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that minimizes the probability of ingot fracture, thus reducing waste and energy consumption. The work is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in the design of forming processes can decrease time to production, reduce forming trials and associated expenses, and guide development of products with greater uniformity and less scrap.

  19. FireProt: web server for automated design of thermostable proteins

    PubMed Central

    Musil, Milos; Stourac, Jan; Brezovsky, Jan; Prokop, Zbynek; Zendulka, Jaroslav; Martinek, Tomas

    2017-01-01

    Abstract There is continuous interest in increasing protein stability to enhance usability in numerous biomedical and biotechnological applications. A number of in silico tools for predicting the effect of mutations on protein stability have been developed recently. However, the existing tools typically predict only single-point mutations with a small effect on protein stability, and their predictions must be followed by laborious protein expression, purification, and characterization. Here, we present FireProt, a web server for the automated design of multiple-point thermostable mutant proteins that combines structural and evolutionary information in its calculation core. FireProt utilizes sixteen tools and three protein engineering strategies to make reliable protein designs. The server is complemented with an interactive, easy-to-use interface that allows users to directly analyze and optionally modify designed thermostable mutants. FireProt is freely available at http://loschmidt.chemi.muni.cz/fireprot. PMID:28449074
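    The combination of structural and evolutionary information described above can be sketched as a two-stage filter over candidate mutations. The cutoffs, field names, and data here are assumptions for illustration, not FireProt's actual parameters or pipeline:

    ```python
    # Sketch, in the spirit of FireProt's strategy: keep mutations that a
    # structure-based predictor scores as stabilizing, but discard those at
    # highly conserved positions (likely functionally important). Thresholds
    # and field names are invented for this example.

    DDG_CUTOFF = -1.0          # predicted ddG (kcal/mol); negative = stabilizing
    CONSERVATION_CUTOFF = 0.7  # skip positions more conserved than this (0..1)

    def shortlist(mutations):
        """mutations: list of dicts with 'name', 'ddg' (predicted change in
        folding free energy), and 'conservation' (0..1, e.g. from an MSA).
        Returns names of mutations passing both filters."""
        return [m["name"] for m in mutations
                if m["ddg"] <= DDG_CUTOFF
                and m["conservation"] < CONSERVATION_CUTOFF]

    candidates = [
        {"name": "A45L",  "ddg": -1.8, "conservation": 0.2},  # kept
        {"name": "G102W", "ddg": -2.1, "conservation": 0.9},  # too conserved
        {"name": "S77T",  "ddg": -0.3, "conservation": 0.1},  # too weak
    ]
    print(shortlist(candidates))  # -> ['A45L']
    ```

    Filtering on both axes is what lets a multiple-point design combine several individually safe, stabilizing substitutions rather than relying on a single large-effect mutation.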

  20. Measuring Workload Demand of Informatics Systems with the Clinical Case Demand Index

    PubMed Central

    Iyengar, M. Sriram; Rogith, Deevakar; Florez-Arango, Jose F

    2017-01-01

    Introduction: The increasing use of Health Information Technology (HIT) can add substantially to the workload on clinical providers. Current methods for assessing workload do not take into account the nature of clinical cases and the use of HIT tools while solving them. Methods: The Clinical Case Demand Index (CCDI), consisting of a summary score and a visual representation, was developed to meet this need. Consistency with current perceived-workload measures was evaluated in a randomized controlled trial of a mobile health system. Results: CCDI is significantly correlated with existing workload measures and inversely related to provider performance. Discussion: CCDI combines subjective and objective characteristics of clinical cases along with cognitive and clinical dimensions. Applications include evaluation of HIT tools, clinician scheduling, and medical education. Conclusion: CCDI supports comparative effectiveness research on HIT tools. In addition, CCDI could have numerous other applications, including training, clinical trials, and the design of clinical workflows. PMID:29854166
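    A summary score that combines subjective and objective case characteristics, as the CCDI abstract describes, can be sketched as a weighted mean over normalized dimensions. This is a hypothetical illustration; the dimension names and weights below are invented and do not reproduce the paper's actual formulation:

    ```python
    # Hypothetical workload summary index: weighted mean of case dimensions,
    # each normalized to [0, 1]. Dimension names and weights are invented
    # for illustration; this is not the published CCDI formula.

    def summary_index(components, weights):
        """components: dict of dimension -> score in [0, 1];
        weights: dict of dimension -> nonnegative weight.
        Returns the weighted mean, also in [0, 1]."""
        total = sum(weights.values())
        return sum(components[k] * weights[k] for k in components) / total

    case = {"clinical_complexity": 0.8,   # objective: case difficulty
            "hit_interactions": 0.5,      # objective: HIT use during the case
            "perceived_effort": 0.6}      # subjective: provider rating
    w = {"clinical_complexity": 2.0, "hit_interactions": 1.0,
         "perceived_effort": 1.0}
    print(round(summary_index(case, w), 3))  # -> 0.675
    ```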
