Sample records for advanced numerical tools

  1. Interferometric correction system for a numerically controlled machine

    DOEpatents

    Burleson, Robert R.

    1978-01-01

    An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
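    The add/delete-pulse logic described in this abstract can be sketched in a few lines. The following is an illustrative model only, not the patent's implementation; the function name, step size, and tolerance are all hypothetical:

```python
def corrected_pulses(commanded, actual, tol, step=1.0):
    # pulses to add (positive, tool lags) or delete (negative, tool leads)
    # once the interferometer-measured error exceeds the preselected tolerance
    error = commanded - actual
    if abs(error) <= tol:
        return 0
    return round(error / step)

# closed-loop demo: the positioning system realizes only 90% of each
# commanded step, so the uncorrected lag would grow without bound
commanded = actual = 0.0
for _ in range(100):
    commanded += 1.0
    pulses = 1 + corrected_pulses(commanded, actual, tol=2.0)
    actual += 0.9 * pulses
# with the correction active, the lag stays bounded near the tolerance
```

    In the demo the error oscillates in a band around the tolerance instead of accumulating, which is the behavior the patent claims for the feedback scheme.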

  2. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  3. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  4. The Science of and Advanced Technology for Cost-Effective Manufacture of High Precision Engineering Products. Volume 4. Thermal Effects on the Accuracy of Numerically Controlled Machine Tools.

    DTIC Science & Technology

    1985-10-01

    83K0385 Final Report, Vol. 4: Thermal Effects on the Accuracy of Numerically Controlled Machine Tools. Prepared by Raghunath Venugopal and M. M. Barash, October 1985.

  5. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.

  6. An integrated modeling and design tool for advanced optical spacecraft

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1992-01-01

    Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.

  7. Successes and Challenges of Incompressible Flow Simulation

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2003-01-01

    During the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, whereas flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become indispensable in fluid engineering for incompressible and low-speed flow. This paper is intended to review some of the successes made possible by advances in computational technologies during the same period, and to discuss some of the current challenges.

  8. The Advanced Course in Professional Selling

    ERIC Educational Resources Information Center

    Loe, Terry; Inks, Scott

    2014-01-01

    More universities are incorporating sales content into their curricula, and although the introductory courses in professional sales have much common ground and guidance from numerous professional selling texts, instructors teaching the advanced selling course lack the guidance provided by common academic tools and materials. The resulting…

  9. An Introduction to Intelligent Processing Programs Developed by the Air Force Manufacturing Technology Directorate

    NASA Technical Reports Server (NTRS)

    Sampson, Paul G.; Sny, Linda C.

    1992-01-01

    The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).

  10. Advanced Numerical and Theoretical Methods for Photonic Crystals and Metamaterials

    NASA Astrophysics Data System (ADS)

    Felbacq, Didier

    2016-11-01

    This book provides a set of theoretical and numerical tools useful for the study of wave propagation in metamaterials and photonic crystals. While concentrating on electromagnetic waves, most of the material can be used for acoustic (or quantum) waves. For each numerical method presented, MATLAB® code is provided. The codes are limited to 2D problems and can be easily translated into Python or Scilab, and used directly with Octave as well.
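    As a flavor of the kind of tool the book describes, here is a minimal 1D analogue in Python: a transfer-matrix calculation of transmission through a quarter-wave Bragg stack. The indices, thicknesses, and stack layout below are illustrative choices of ours, not examples taken from the book:

```python
import cmath, math

def layer(n, d, lam):
    # characteristic matrix of a homogeneous layer at normal incidence
    delta = 2 * math.pi * n * d / lam
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transmittance(layers, lam, n_in=1.0, n_out=1.52):
    # multiply the layer matrices, then apply the standard thin-film
    # transmission formula for a stack between n_in and n_out
    M = [[1, 0], [0, 1]]
    for n, d in layers:
        M = matmul(M, layer(n, d, lam))
    (m11, m12), (m21, m22) = M
    t = 2 * n_in / (n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22)
    return (n_out / n_in) * abs(t) ** 2

# quarter-wave Bragg mirror: 8 periods of high/low index, designed for 600 nm
lam0 = 600.0
stack = [(2.3, lam0 / (4 * 2.3)), (1.38, lam0 / (4 * 1.38))] * 8
T_center = transmittance(stack, lam0)    # deep inside the stop band
T_edge = transmittance(stack, 2 * lam0)  # well outside the stop band
```

    At the design wavelength the transmission collapses (stop band), while far from it the stack is nearly transparent, which is the defining behavior of a 1D photonic crystal.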

  11. USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Schultz

    2012-09-01

    A recommended protocol for formulating numeric tool specifications and validation needs, in concert with practices accepted by regulatory agencies for advanced reactors, is described. The protocol is based on the plant type and the perceived transient and accident envelopes, which translate to boundary conditions for a process that yields: (a) the key phenomena and figures of merit that must be analyzed to ensure that the advanced plant can be licensed, (b) the specification of the numeric tool capabilities necessary to perform the required analyses, including bounding calculational uncertainties, and (c) the specification of the validation matrices and experiments, including the desired validation data. The result of applying the process is a completely defined program, including costs, for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.

  12. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for the detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual-energy CT (DECT), spectral CT, and CT-based molecular imaging. By harnessing these advances in technology, cardiac CT has moved beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion, and even probing of molecular processes involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  13. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  14. Verification of a Multiphysics Toolkit against the Magnetized Target Fusion Concept

    NASA Technical Reports Server (NTRS)

    Thomas, Scott; Perrell, Eric; Liron, Caroline; Chiroux, Robert; Cassibry, Jason; Adams, Robert B.

    2005-01-01

    In the spring of 2004, the Advanced Concepts team at MSFC embarked on an ambitious project to develop a suite of modeling routines that would interact with one another. The tools would each numerically model a portion of any advanced propulsion system. The tools were divided by physics categories, hence the name multiphysics toolkit. Currently most of the anticipated modeling tools have been created and integrated. Results are given in this paper for both a quarter nozzle with chemically reacting flow and the interaction of two plasma jets representative of a Magnetized Target Fusion device. The results have not yet been calibrated against real data, but this paper demonstrates the current capability of the multiphysics toolkit and planned future enhancements.

  15. Space-weather assets developed by the French space-physics community

    NASA Astrophysics Data System (ADS)

    Rouillard, A. P.; Pinto, R. F.; Brun, A. S.; Briand, C.; Bourdarie, S.; Dudok De Wit, T.; Amari, T.; Blelly, P.-L.; Buchlin, E.; Chambodut, A.; Claret, A.; Corbard, T.; Génot, V.; Guennou, C.; Klein, K. L.; Koechlin, L.; Lavarra, M.; Lavraud, B.; Leblanc, F.; Lemorton, J.; Lilensten, J.; Lopez-Ariste, A.; Marchaudon, A.; Masson, S.; Pariat, E.; Reville, V.; Turc, L.; Vilmer, N.; Zucarello, F. P.

    2016-12-01

    We present a short review of space-weather tools and services developed and maintained by the French space-physics community. They include unique data from ground-based observatories, advanced numerical models, automated identification and tracking tools, a range of space instrumentation and interconnected virtual observatories. The aim of the article is to highlight some advances achieved in this field of research at the national level over the last decade and how certain assets could be combined to produce better space-weather tools exploitable by space-weather centres and customers worldwide. This review illustrates the wide range of expertise developed nationally but is not a systematic review of all assets developed in France.

  16. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  17. Solving PDEs with Intrepid

    DOE PAGES

    Bochev, P.; Edwards, H. C.; Kirby, R. C.; ...

    2012-01-01

    Intrepid is a Trilinos package for advanced discretizations of Partial Differential Equations (PDEs). The package provides a comprehensive set of tools for local, cell-based construction of a wide range of numerical methods for PDEs. This paper describes the mathematical ideas and software design principles incorporated in the package. We also provide representative examples showcasing the use of Intrepid both in the context of numerical PDEs and the more general context of data analysis.

  18. Simulations of binary black hole mergers

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey

    2017-01-01

    Advanced LIGO's observations of merging binary black holes have inaugurated the era of gravitational wave astronomy. Accurate models of binary black holes and the gravitational waves they emit are helping Advanced LIGO to find as many gravitational waves as possible and to learn as much as possible about the waves' sources. These models require numerical-relativity simulations of binary black holes, because near the time when the black holes merge, all analytic approximations break down. Following breakthroughs in 2005, many research groups have built numerical-relativity codes capable of simulating binary black holes. In this talk, I will discuss current challenges in simulating binary black holes for gravitational-wave astronomy, and I will discuss the tremendous progress that has already enabled such simulations to become an essential tool for Advanced LIGO.

  19. A fully-implicit high-order system thermal-hydraulics model for advanced non-LWR safety analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    An advanced system analysis tool is being developed for advanced reactor safety analysis. This paper describes the underlying physics and numerical models used in the code, including the governing equations, the stabilization schemes, the high-order spatial and temporal discretization schemes, and the Jacobian-free Newton-Krylov solution method. The effects of the spatial and temporal discretization schemes are investigated. Additionally, a series of verification test problems are presented to confirm the high-order schemes. Furthermore, it is demonstrated that the developed system thermal-hydraulics model can be strictly verified with the theoretical convergence rates, and that it performs very well for a wide range of flow problems with high accuracy, efficiency, and minimal numerical diffusion.

  20. A fully-implicit high-order system thermal-hydraulics model for advanced non-LWR safety analyses

    DOE PAGES

    Hu, Rui

    2016-11-19

    An advanced system analysis tool is being developed for advanced reactor safety analysis. This paper describes the underlying physics and numerical models used in the code, including the governing equations, the stabilization schemes, the high-order spatial and temporal discretization schemes, and the Jacobian-free Newton-Krylov solution method. The effects of the spatial and temporal discretization schemes are investigated. Additionally, a series of verification test problems are presented to confirm the high-order schemes. Furthermore, it is demonstrated that the developed system thermal-hydraulics model can be strictly verified with the theoretical convergence rates, and that it performs very well for a wide range of flow problems with high accuracy, efficiency, and minimal numerical diffusion.

  1. Optics simulations: a Python workshop

    NASA Astrophysics Data System (ADS)

    Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.

    2017-08-01

    Numerical simulations allow teachers and students to perform, indirectly, sophisticated experiments that would not otherwise be realizable due to cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrently with open-source environments such as Python. This availability of open-source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools in Python, concentrating on geometric and physical optics simulations. The advantage of hands-on numerical experiments is that the learner becomes an active participant in the pedagogical process rather than playing the passive role typical of the traditional lecture format. Even in laboratory classes, because of constraints of space, lack of equipment, and often large enrollments, many students play a passive role since they work in groups of three or more. Furthermore, these new tools help students get a handle on numerical methods as well as simulations, and impart a "feel" for the physics under investigation.
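    A geometric-optics exercise of the kind described can be remarkably short in Python. The sketch below is our own illustration, not material from the workshop; it uses ray-transfer (ABCD) matrices to verify that an object placed at 2f is imaged at 2f with magnification -1:

```python
def propagate(d):
    # free-space propagation over distance d
    return ((1.0, d), (0.0, 1.0))

def thin_lens(f):
    # thin lens of focal length f
    return ((1.0, 0.0), (-1.0 / f, 1.0))

def apply_matrix(M, ray):
    # ray = (height y, angle theta); multiply by the 2x2 element matrix
    y, theta = ray
    return (M[0][0] * y + M[0][1] * theta,
            M[1][0] * y + M[1][1] * theta)

def trace(ray, elements):
    for M in elements:
        ray = apply_matrix(M, ray)
    return ray

# object at 2f, lens of focal length f, screen at 2f behind the lens
f = 100.0
system = [propagate(2 * f), thin_lens(f), propagate(2 * f)]
y_img, _ = trace((1.0, 0.05), system)  # image height is -1 for any angle
```

    Tracing rays with different launch angles and seeing them converge to the same image height makes the imaging condition 1/f = 1/d_o + 1/d_i tangible in a way a lecture slide cannot.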

  2. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  3. Recovery Discontinuous Galerkin Jacobian-free Newton-Krylov Method for all-speed flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HyeongKae Park; Robert Nourgaliev; Vincent Mousseau

    2008-07-01

    There is increasing interest in developing the next generation of simulation tools for advanced nuclear energy systems. These tools will utilize state-of-the-art numerical algorithms and computer science technology in order to maximize predictive capability, support advanced reactor designs, reduce uncertainty, and increase safety margins. In analyzing nuclear energy systems, we are interested in compressible low-Mach-number, high-heat-flux flows with a wide range of Re, Ra, and Pr numbers. Under these conditions, the focus is placed on turbulent heat transfer, in contrast to other industries whose main interest is in capturing turbulent mixing. Our objective is to develop single-point turbulence closure models for large-scale engineering CFD codes, using Direct Numerical Simulation (DNS) or Large Eddy Simulation (LES) tools, which requires very accurate and efficient numerical algorithms. The focus of this work is placed on fully implicit, high-order spatio-temporal discretization based on the discontinuous Galerkin method, solving the conservative form of the compressible Navier-Stokes equations. The method utilizes a local reconstruction procedure derived from a weak formulation of the problem, which is inspired by the recovery diffusion flux algorithm of van Leer and Nomura [?] and by the piecewise parabolic reconstruction [?] of the finite volume method. The developed methodology is integrated into the Jacobian-free Newton-Krylov framework [?] to allow a fully implicit solution of the problem.
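    The Jacobian-free idea at the core of the Newton-Krylov framework is that the solver never forms the Jacobian analytically; it only needs Jacobian-vector products, which a finite difference supplies. The toy two-equation sketch below is ours, not the paper's code, and a production solver would feed these products directly to a Krylov method such as GMRES instead of assembling the small matrix:

```python
def jvec(F, u, v, eps=1e-7):
    # matrix-free Jacobian-vector product: J(u)v ~ (F(u + eps*v) - F(u)) / eps
    fu = F(u)
    fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(a - b) / eps for a, b in zip(fp, fu)]

def newton_matrix_free(F, u, tol=1e-10, maxit=50):
    # Newton iteration for a 2-equation system; the Jacobian is probed only
    # through finite-difference products, one per coordinate direction here
    for _ in range(maxit):
        f = F(u)
        if max(abs(c) for c in f) < tol:
            break
        a, c = jvec(F, u, [1.0, 0.0])
        b, d = jvec(F, u, [0.0, 1.0])
        det = a * d - b * c
        # solve J * delta = -f by Cramer's rule (2x2 only, for illustration)
        du = (-f[0] * d + f[1] * b) / det
        dv = (f[0] * c - f[1] * a) / det
        u = [u[0] + du, u[1] + dv]
    return u

# demo: intersection of the circle x^2 + y^2 = 4 with the line x = y
root = newton_matrix_free(lambda u: [u[0] ** 2 + u[1] ** 2 - 4.0,
                                     u[0] - u[1]],
                          [1.0, 1.0])
```

    The same `jvec` routine is what makes large systems tractable: the residual evaluation F is often the only piece of physics code the nonlinear solver ever touches.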

  4. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  5. 46 CFR 298.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... App. U.S.C. 1101 through 1294). Actual Cost of a Vessel or Shipyard Project means, as of any specified... thereafter, for the construction, reconstruction or reconditioning of such Vessel or Shipyard Project. Advanced Shipbuilding Technology means: (1) Numerically controlled machine tools, robots, automated process...

  6. 46 CFR 298.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... App. U.S.C. 1101 through 1294). Actual Cost of a Vessel or Shipyard Project means, as of any specified... thereafter, for the construction, reconstruction or reconditioning of such Vessel or Shipyard Project. Advanced Shipbuilding Technology means: (1) Numerically controlled machine tools, robots, automated process...

  7. 46 CFR 298.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... App. U.S.C. 1101 through 1294). Actual Cost of a Vessel or Shipyard Project means, as of any specified... thereafter, for the construction, reconstruction or reconditioning of such Vessel or Shipyard Project. Advanced Shipbuilding Technology means: (1) Numerically controlled machine tools, robots, automated process...

  8. 46 CFR 298.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... App. U.S.C. 1101 through 1294). Actual Cost of a Vessel or Shipyard Project means, as of any specified... thereafter, for the construction, reconstruction or reconditioning of such Vessel or Shipyard Project. Advanced Shipbuilding Technology means: (1) Numerically controlled machine tools, robots, automated process...

  9. 46 CFR 298.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... App. U.S.C. 1101 through 1294). Actual Cost of a Vessel or Shipyard Project means, as of any specified... thereafter, for the construction, reconstruction or reconditioning of such Vessel or Shipyard Project. Advanced Shipbuilding Technology means: (1) Numerically controlled machine tools, robots, automated process...

  10. Combustion and Magnetohydrodynamic Processes in Advanced Pulse Detonation Rocket Engines

    DTIC Science & Technology

    2012-10-01

    use of high-order numerical methods can also be a powerful tool in the analysis of such complex flows, but we need to understand the interaction of... dimensional effects with complex reaction kinetics, the simple one-dimensional detonation structure provides a rich spectrum of dynamical features which are...

  11. Structural characterization and numerical simulations of flow properties of standard and reservoir carbonate rocks using micro-tomography

    NASA Astrophysics Data System (ADS)

    Islam, Amina; Chevalier, Sylvie; Sassi, Mohamed

    2018-04-01

    With advances in imaging techniques and computational power, Digital Rock Physics (DRP) is becoming an increasingly popular tool to characterize reservoir samples and determine their internal structure and flow properties. In this work, we present the details of imaging, segmentation, and numerical simulation of single-phase flow through a standard homogeneous Silurian dolomite core plug sample as well as a heterogeneous sample from a carbonate reservoir. We develop a procedure that integrates experimental results into the segmentation step to calibrate the porosity. We also look into using two different numerical tools for the simulation, namely Avizo Fire XLab Hydro, which solves the Stokes equations via the finite volume method, and Palabos, which solves the same equations using the Lattice Boltzmann Method. Representative Elementary Volume (REV) and isotropy studies are conducted on the two samples, and we show how DRP can be a useful tool to characterize rock properties that are time-consuming and costly to obtain experimentally.
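    The porosity-calibration step mentioned above can be illustrated schematically: pick the grey-level threshold so that the segmented porosity reproduces the laboratory-measured value. The sketch below, using a toy image and bisection, is our illustration of the idea, not the paper's workflow:

```python
def porosity(image, threshold):
    # fraction of voxels classified as pore space (grey value below threshold)
    flat = [v for row in image for v in row]
    return sum(v < threshold for v in flat) / len(flat)

def calibrate_threshold(image, target, lo=0.0, hi=255.0, iters=40):
    # bisection on the threshold: porosity is non-decreasing in the
    # threshold, so we bracket the value matching the lab porosity
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if porosity(image, mid) < target:
            lo = mid
        else:
            hi = mid
    return hi  # porosity(image, hi) >= target by construction

# toy "micro-CT slice": grey values 0..99, uniformly distributed
image = [[(10 * r + c) % 100 for c in range(100)] for r in range(10)]
th = calibrate_threshold(image, target=0.30)  # lab porosity, e.g. 30%
```

    On real data the same loop would run over the full segmented volume, and the calibrated threshold is then held fixed for the subsequent flow simulations.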

  12. Cell chips as new tools for cell biology--results, perspectives and opportunities.

    PubMed

    Primiceri, Elisabetta; Chiriacò, Maria Serena; Rinaldi, Ross; Maruccio, Giuseppe

    2013-10-07

    Cell culture technologies were initially developed as research tools for studying cell functions, but nowadays they are essential for the biotechnology industry, with rapidly expanding applications requiring more and more advancement over traditional tools. Miniaturization and integration of sensors and microfluidic components with cell culture techniques open the way to the development of cellomics as a new field of research targeting innovative analytic platforms for high-throughput studies. This approach enables advanced cell studies under controllable conditions by providing inexpensive, easy-to-operate devices. Thanks to their numerous advantages, cell chips have become a hotspot in the biosensors and bioelectronics fields and have been applied in many different areas. In this review, exemplary applications will be discussed, including cell counting and detection, cytotoxicity assays, migration assays, and stem cell studies.

  13. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined to offer an advanced tool for assessing the realistic behaviour, failure, and safety of transport structures. The approach is based on randomization of the nonlinear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. Results can serve as a rational basis for performance and sustainability assessment based on advanced nonlinear computer analysis of structures of transport infrastructure such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. Inverse analysis using artificial neural networks and virtual stochastic simulation is applied to determine the fracture-mechanical parameters of the structural material and its numerical model. Structural response, reliability, and sustainability have been investigated on different types of transport structures made from various materials using the above methodology and tools.
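    Stripped of the expensive nonlinear finite element solve, the randomization idea reduces to crude Monte Carlo estimation of a failure probability: sample the random input parameters, evaluate a limit-state function, and count failures. A minimal sketch, with a trivial resistance-minus-load limit state standing in for the structural model and all distributions and values purely illustrative:

```python
import random

def failure_probability(limit_state, sample, n=200_000, seed=42):
    # crude Monte Carlo: fraction of realizations in the failure domain g < 0
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(*sample(rng)) < 0)
    return failures / n

# toy limit state g = R - S: resistance R and load effect S modeled as
# independent normal random variables (stand-ins for identified material
# parameters and their scatter)
def g(R, S):
    return R - S

def sample(rng):
    return rng.gauss(30.0, 3.0), rng.gauss(20.0, 4.0)

pf = failure_probability(g, sample)  # exact value here is Phi(-2) ~ 0.023
```

    In the methodology described above, each call to the limit state would be one randomized nonlinear finite element run, which is why variance-reduction and small-sample techniques matter in practice.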

  14. Perspectives on the Future of CFD

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2000-01-01

    This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), a field that has pioneered flow simulation. CFD has progressed along with computing power: numerical methods have advanced as CPU speed and memory capacity have increased. Complex configurations are now routinely computed, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources have shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced as well. Examples of potential future (or current) challenges include risk assessment, the limitations of heuristic models, and the development of CFD and information technology (IT) tools.

  15. How Linguistic Frames Affect Motivational Profiles and the Roles of Quantitative versus Qualitative Research Strategies

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2005-01-01

    The combined tools of psycholinguistics and systems analysis have produced advances in motivational profiling, resulting in numerous applications to behavioral engineering. Knowing the way people frame their motives offers leverage in causing behavior change, ranging from persuasive marketing campaigns, forensic profiling, individual psychotherapy,…

  16. EVALUATING HYDROLOGICAL RESPONSE TO FORECASTED LAND-USE CHANGE: SCENARIO TESTING WITH THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL

    EPA Science Inventory

    It is currently possible to measure landscape change over large areas and determine trends in environmental condition using advanced space-based technologies accompanied by geospatial analyses of the remotely sensed data. There are numerous earth-observing satellite platforms fo...

  17. MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*

    PubMed Central

    CHAHINE, Georges L.; HSIAO, Chao-Tsung

    2012-01-01

    Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, while avoiding deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to studying the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696

  18. Using synchrotron light to accelerate EUV resist and mask materials learning

    NASA Astrophysics Data System (ADS)

    Naulleau, Patrick; Anderson, Christopher N.; Baclea-an, Lorie-Mae; Denham, Paul; George, Simi; Goldberg, Kenneth A.; Jones, Gideon; McClinton, Brittany; Miyakawa, Ryan; Mochi, Iacopo; Montgomery, Warren; Rekawa, Seno; Wallow, Tom

    2011-03-01

    As commercialization of extreme ultraviolet lithography (EUVL) progresses, direct industry activities are being focused on near term concerns. The question of long term extendibility of EUVL, however, remains crucial given the magnitude of the investments yet required to make EUVL a reality. Extendibility questions are best addressed using advanced research tools such as the SEMATECH Berkeley microfield exposure tool (MET) and actinic inspection tool (AIT). Utilizing Lawrence Berkeley National Laboratory's Advanced Light Source facility as the light source, these tools benefit from the unique properties of synchrotron light enabling research at nodes generations ahead of what is possible with commercial tools. The MET for example uses extremely bright undulator radiation to enable a lossless fully programmable coherence illuminator. Using such a system, resolution enhancing illuminations achieving k1 factors of 0.25 can readily be attained. Given the MET numerical aperture of 0.3, this translates to an ultimate resolution capability of 12 nm. Using such methods, the SEMATECH Berkeley MET has demonstrated resolution in resist to 16-nm half pitch and below in an imageable spin-on hard mask. At a half pitch of 16 nm, this material achieves a line-edge roughness of 2 nm with a correlation length of 6 nm. These new results demonstrate that the observed stall in ultimate resolution progress in chemically amplified resists is a materials issue rather than a tool limitation. With a resolution limit of 20-22 nm, the CAR champion from 2008 remains as the highest performing CAR tested to date. To enable continued advanced learning in EUV resists, SEMATECH has initiated a plan to implement a 0.5 NA microfield tool at the Advanced Light Source synchrotron facility. This tool will be capable of printing down to 8-nm half pitch.
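
    The resolution figures quoted above follow the standard Rayleigh scaling relation, half-pitch = k1 · λ / NA. A quick sketch, assuming the standard 13.5 nm EUV wavelength (which the abstract does not state explicitly):

```python
def half_pitch_nm(k1, wavelength_nm=13.5, na=0.3):
    """Rayleigh scaling for lithographic resolution: k1 * lambda / NA."""
    return k1 * wavelength_nm / na

# k1 = 0.25 at NA = 0.3 gives ~11.3 nm, consistent with the quoted
# "ultimate resolution capability of 12 nm".
print(half_pitch_nm(0.25))

# The planned 0.5 NA tool at the same k1 would reach ~6.8 nm half pitch,
# in line with the stated goal of printing down to 8-nm half pitch.
print(half_pitch_nm(0.25, na=0.5))
```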

  19. Assessment of the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Liu, Nan-Suey; Iannetti, Anthony; Shih, Tsan-Hsing

    2007-01-01

    The advancements made during the last decade in the areas of combustion modeling, numerical simulation, and computing platform have greatly facilitated the use of CFD based tools in the development of combustion technology. Further development of verification, validation and uncertainty quantification will have profound impact on the reliability and utility of these CFD based tools. The objectives of the present effort are to establish baseline for the National Combustion Code (NCC) and experimental data, as well as to document current capabilities and identify gaps for further improvements.

  20. Handbook of Research on Hybrid Learning Models: Advanced Tools, Technologies, and Applications

    ERIC Educational Resources Information Center

    Wang, Fu Lee, Ed.; Fong, Joseph, Ed.; Kwan, Reggie, Ed.

    2010-01-01

    Hybrid learning is now the single greatest trend in education today, due to the numerous educational advantages when traditional classroom learning and e-learning are implemented together. This handbook collects emerging research and pedagogies related to the convergence of teaching and learning methods. This significant "Handbook of…

  1. The COPERNIC3 project: how AREVA is successfully developing an advanced global fuel rod performance code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garnier, Ch.; Mailhe, P.; Sontheimer, F.

    2007-07-01

    Fuel performance is a key factor for minimizing operating costs in nuclear plants. One important aspect of fuel performance is fuel rod design, based upon reliable tools able to verify the safety of current fuel solutions, prevent potential issues in new core managements, and guide the invention of tomorrow's fuels. AREVA is developing its future global fuel rod code COPERNIC3, which calculates the thermal-mechanical behavior of advanced fuel rods in nuclear plants. Some of the best practices used to achieve this goal are described by reviewing the three pillars of a fuel rod code: the database, the modelling, and the computer and numerical aspects. First, the COPERNIC3 database content is described, together with the tools developed to exploit the data effectively. An overview of the main modelling aspects is then given, emphasizing the thermal, fission gas release and mechanical sub-models. In the last part, numerical solutions for increasing the computational performance of the code are detailed, with a presentation of software configuration management solutions. (authors)

  2. Leveraging e-Science infrastructure for electrochemical research.

    PubMed

    Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F

    2011-08-28

    As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.

  3. Integrated flexible manufacturing program for manufacturing automation and rapid prototyping

    NASA Technical Reports Server (NTRS)

    Brooks, S. L.; Brown, C. W.; King, M. S.; Simons, W. R.; Zimmerman, J. J.

    1993-01-01

    The Kansas City Division of Allied Signal Inc., as part of the Integrated Flexible Manufacturing Program (IFMP), is developing an integrated manufacturing environment. Several systems are being developed to produce standards and automation tools for specific activities within the manufacturing environment. The Advanced Manufacturing Development System (AMDS) is concentrating on information standards (STEP) and product data transfer; the Expert Cut Planner system (XCUT) is concentrating on machining operation process planning standards and automation capabilities; the Advanced Numerical Control system (ANC) is concentrating on NC data preparation standards and NC data generation tools; the Inspection Planning and Programming Expert system (IPPEX) is concentrating on inspection process planning, coordinate measuring machine (CMM) inspection standards and CMM part program generation tools; and the Intelligent Scheduling and Planning System (ISAPS) is concentrating on planning and scheduling tools for a flexible manufacturing system environment. All of these projects are working together to address information exchange, standardization, and information sharing to support rapid prototyping in a Flexible Manufacturing System (FMS) environment.

  4. A Coupled Multiphysics Approach for Simulating Induced Seismicity, Ground Acceleration and Structural Damage

    NASA Astrophysics Data System (ADS)

    Podgorney, Robert; Coleman, Justin; Wilkins, Andrew; Huang, Hai; Veeraraghavan, Swetha; Xia, Yidong; Permann, Cody

    2017-04-01

    Numerical modeling has played an important role in understanding the behavior of coupled subsurface thermal-hydro-mechanical (THM) processes associated with a number of energy and environmental applications since as early as the 1970s. While the ability to rigorously describe all key tightly coupled controlling physics still remains a challenge, there have been significant advances in recent decades. These advances relate primarily to the exponential growth of computational power, the development of more accurate equations of state, improvements in the ability to represent heterogeneity and reservoir geometry, and more robust nonlinear solution schemes. The work described in this paper documents the development and linkage of several fully-coupled and fully-implicit modeling tools. These tools simulate: (1) the dynamics of fluid flow, heat transport, and quasi-static rock mechanics; (2) seismic wave propagation from the sources of energy release through heterogeneous material; and (3) the soil-structural damage resulting from ground acceleration. The tools are developed in Idaho National Laboratory's parallel Multiphysics Object Oriented Simulation Environment and are integrated using a global implicit approach. The governing equations are presented, the numerical approach for simultaneously solving and coupling the three physics tools is discussed, and the data input and output methodology is outlined. An example demonstrates the capabilities of the coupled multiphysics approach: a system conceptually similar to the geothermal development in Basel, Switzerland, is simulated, and the resulting induced seismicity, ground motion, and structural damage are predicted.
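
    The "global implicit approach" mentioned above means assembling all coupled residual equations into one system and solving them simultaneously with Newton's method, rather than iterating between separate physics solvers. A toy sketch with two hypothetical coupled equations and illustrative coefficients (not the actual THM equations):

```python
def coupled_residuals(x):
    """Two hypothetical coupled residual equations R(x) = 0, standing in
    for the flow and mechanics residuals described above."""
    p, u = x
    return [p + 0.1 * u - 1.0,   # "flow" equation, weakly coupled to u
            u - 0.2 * p - 0.5]   # "mechanics" equation, coupled to p

def newton_global_implicit(x0, tol=1e-10, max_iter=50):
    """Solve both equations at once with Newton's method, using the
    analytic 2x2 Jacobian [[1, 0.1], [-0.2, 1]] of the residuals above."""
    p, u = x0
    for _ in range(max_iter):
        r1, r2 = coupled_residuals((p, u))
        if abs(r1) + abs(r2) < tol:
            break
        det = 1.0 * 1.0 - 0.1 * (-0.2)          # Jacobian determinant
        dp = (r1 * 1.0 - 0.1 * r2) / det        # Cramer's rule, 2x2 case
        du = (1.0 * r2 + 0.2 * r1) / det
        p, u = p - dp, u - du
    return p, u

print(newton_global_implicit((0.0, 0.0)))
```

    Because the toy residuals are linear, one Newton step lands on the exact solution; the real THM system is nonlinear, so the iteration, Jacobian assembly, and linear solves become the dominant cost.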

  5. Improvement of electrical resistivity tomography for leachate injection monitoring.

    PubMed

    Clément, R; Descloitres, M; Günther, T; Oxarango, L; Morra, C; Laurent, J-P; Gourc, J-P

    2010-03-01

    Leachate recirculation is a key process in operating municipal waste landfills as bioreactors; it aims to increase the moisture content in order to optimize biodegradation in the landfill. Because liquid flows exhibit complex behaviour in very heterogeneous porous media, in situ monitoring methods are required, and surface time-lapse electrical resistivity tomography (ERT) is usually proposed. Using numerical modelling with typical 2D and 3D injection plume patterns and 2D and 3D inversion codes, we show that erroneous resistivity changes can be computed at depth if standard parameters are used for time-lapse ERT inversion. Major artefacts typically appear as significant increases in resistivity (more than +30%), which can be misinterpreted as gas migration within the waste. To eliminate these artefacts, we tested an advanced time-lapse ERT procedure that includes (i) two advanced inversion tools and (ii) two alternative array geometries. The first advanced tool uses invariant regions in the model; the second uses an inversion with a "minimum length" constraint. The alternative arrays are (i) a pole-dipole array (2D case) and (ii) a star array (3D case). The results show that the two advanced inversion tools and the two alternative arrays remove the artefacts almost completely, to within +/-5%, in both 2D and 3D situations. As a field application, time-lapse ERT using the star array was applied during a 3D leachate injection in a non-hazardous municipal waste landfill. To evaluate the robustness of the two advanced tools, a synthetic model including both true decreases and increases in resistivity was built. The advanced time-lapse ERT procedure eliminates unwanted artefacts while keeping a satisfactory image of the true resistivity variations. This study demonstrates that significant and robust improvements can be obtained for time-lapse ERT monitoring of leachate recirculation in waste landfills.
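
    The +30% artefact threshold discussed above operates on per-cell resistivity changes between the background and monitoring inversions. A minimal illustration with hypothetical model-cell values (not the paper's data or code):

```python
def resistivity_change_percent(rho_background, rho_monitor):
    """Per-cell relative resistivity change (%) between two time-lapse
    ERT inversions of the same model grid."""
    return [100.0 * (m - b) / b for b, m in zip(rho_background, rho_monitor)]

def flag_artefacts(changes, threshold=30.0):
    """Flag cells whose apparent resistivity increase exceeds the level
    the study identifies as a typical inversion artefact (+30%)."""
    return [c > threshold for c in changes]

background = [50.0, 45.0, 60.0]  # ohm.m, hypothetical background model cells
monitor    = [30.0, 44.0, 85.0]  # leachate lowers rho in cell 0; cell 2
                                 # shows an artefact-like increase

changes = resistivity_change_percent(background, monitor)
print([round(c, 1) for c in changes])   # [-40.0, -2.2, 41.7]
print(flag_artefacts(changes))          # [False, False, True]
```

    The paper's contribution is in preventing such spurious increases during inversion (invariant regions, "minimum length" constraint, alternative arrays), not merely flagging them afterwards.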

  6. Improvement of electrical resistivity tomography for leachate injection monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clement, R., E-mail: remi.clement@hmg.inpg.f; Descloitres, M.; Guenther, T., E-mail: Thomas.Guenther@liag-hannover.d

    2010-03-15

    Leachate recirculation is a key process in operating municipal waste landfills as bioreactors; it aims to increase the moisture content in order to optimize biodegradation in the landfill. Because liquid flows exhibit complex behaviour in very heterogeneous porous media, in situ monitoring methods are required, and surface time-lapse electrical resistivity tomography (ERT) is usually proposed. Using numerical modelling with typical 2D and 3D injection plume patterns and 2D and 3D inversion codes, we show that erroneous resistivity changes can be computed at depth if standard parameters are used for time-lapse ERT inversion. Major artefacts typically appear as significant increases in resistivity (more than +30%), which can be misinterpreted as gas migration within the waste. To eliminate these artefacts, we tested an advanced time-lapse ERT procedure that includes (i) two advanced inversion tools and (ii) two alternative array geometries. The first advanced tool uses invariant regions in the model; the second uses an inversion with a 'minimum length' constraint. The alternative arrays are (i) a pole-dipole array (2D case) and (ii) a star array (3D case). The results show that the two advanced inversion tools and the two alternative arrays remove the artefacts almost completely, to within +/-5%, in both 2D and 3D situations. As a field application, time-lapse ERT using the star array was applied during a 3D leachate injection in a non-hazardous municipal waste landfill. To evaluate the robustness of the two advanced tools, a synthetic model including both true decreases and increases in resistivity was built. The advanced time-lapse ERT procedure eliminates unwanted artefacts while keeping a satisfactory image of the true resistivity variations. This study demonstrates that significant and robust improvements can be obtained for time-lapse ERT monitoring of leachate recirculation in waste landfills.

  7. An advanced constitutive model in the sheet metal forming simulation: the Teodosiu microstructural model and the Cazacu Barlat yield criterion

    NASA Astrophysics Data System (ADS)

    Alves, J. L.; Oliveira, M. C.; Menezes, L. F.

    2004-06-01

    Two constitutive models used to describe the plastic behavior of sheet metals in the numerical simulation of sheet metal forming processes are studied: a recently proposed advanced constitutive model, based on the Teodosiu microstructural model and the Cazacu Barlat yield criterion, is compared with a more classical one, based on the Swift law and the Hill 1948 yield criterion. These constitutive models are implemented in DD3IMP, a finite element home code specifically developed to simulate sheet metal forming processes: a 3-D elastoplastic finite element code with an updated Lagrangian formulation, a fully implicit time integration scheme, and large elastoplastic strains and rotations. Solid finite elements and parametric surfaces are used to model the blank sheet and tool surfaces, respectively. Some details of the numerical implementation of the constitutive models are given. Finally, the theory is illustrated with the numerical simulation of the deep drawing of a cylindrical cup. The results show that the proposed advanced constitutive model predicts the final shape (medium height and ears profile) of the formed part more accurately, as can be concluded from comparison with the experimental results.

  8. Facilitating researcher use of flight simulators

    NASA Technical Reports Server (NTRS)

    Russell, C. Ray

    1990-01-01

    Researchers conducting experiments with flight simulators encounter numerous obstacles in bringing their ideas to the simulator. Research into how these simulators could be used more efficiently is presented. The study involved: (1) analyzing the Advanced Concepts Simulator software architecture, (2) analyzing the interaction between the researchers and simulation programmers, and (3) proposing a documentation tool for the researchers.

  9. CAM Highlights (FY 80)

    DTIC Science & Technology

    1980-10-01

    industrialized nations in almost every manufacturing marketplace. Many foreign nations' manufacturing advancements have resulted from...towards planning a computerized data storage and retrieval system based on Group Technology (GT) principles. The data storage and retrieval...several computer languages available on the market to program numerically controlled machine tools. However, there was a need for a document showing

  10. The Development and Implementation of U-Msg for College Students' English Learning

    ERIC Educational Resources Information Center

    Cheng, Yuh-Ming; Kuo, Sheng-Huang; Lou, Shi-Jer; Shih, Ru-Chu

    2016-01-01

    With the advance of mobile technology, mobile devices have become more portable and powerful with numerous useful tools in daily life. Thus, mobile learning has been widely involved in e-learning studies. Many studies point out that it is important to integrate both pedagogical and technical strengths of mobile technology into learning settings.…

  11. Using Computational and Mechanical Models to Study Animal Locomotion

    PubMed Central

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  12. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  13. Group Theory with Applications in Chemical Physics

    NASA Astrophysics Data System (ADS)

    Jacobs, Patrick

    2005-10-01

    Group theory is an indispensable mathematical tool in many branches of chemistry and physics. This book provides a self-contained and rigorous account of the fundamentals and applications of the subject to chemical physics, assuming no prior knowledge of group theory. The first half of the book focuses on elementary topics, such as molecular and crystal symmetry, whilst the latter half is more advanced in nature. Discussions of more complex material, such as space groups, projective representations, magnetic crystals and spinor bases, often omitted from introductory texts, are expertly dealt with. With the inclusion of numerous exercises and worked examples, this book will appeal to advanced undergraduates and beginning graduate students studying the physical sciences, and it is an ideal text for a two-semester course. It comprehensively covers the fundamentals and applications of group theory, and it treats several topics often omitted from introductory texts, such as the rotation group, space groups and spinor bases.

  14. Practical Applications of Digital Pathology.

    PubMed

    Saeed-Vafa, Daryoush; Magliocco, Anthony M

    2015-04-01

    Virtual microscopy and advances in machine learning have paved the way for the ever-expanding field of digital pathology. Multiple image-based computing environments capable of performing automated quantitative and morphological analyses are the foundation on which digital pathology is built. The applications for digital pathology in the clinical setting are numerous and are explored along with the digital software environments themselves, as well as the different analytical modalities specific to digital pathology. Prospective studies, case-control analyses, meta-analyses, and detailed descriptions of software environments were explored that pertained to digital pathology and its use in the clinical setting. Many different software environments have advanced platforms capable of improving digital pathology and potentially influencing clinical decisions. The potential of digital pathology is vast, particularly with the introduction of numerous software environments available for use. With all the digital pathology tools available as well as those in development, the field will continue to advance, particularly in the era of personalized medicine, providing health care professionals with more precise prognostic information as well as helping them guide treatment decisions.

  15. Numerically stable finite difference simulation for ultrasonic NDE in anisotropic composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Quintanilla, Francisco Hernando; Cole, Christina M.

    2018-04-01

    Simulation tools can enable optimized inspection of advanced materials and complex geometry structures. Recent work at NASA Langley is focused on the development of custom simulation tools for modeling ultrasonic wave behavior in composite materials. Prior work focused on the use of a standard staggered grid finite difference type of mathematical approach, by implementing a three-dimensional (3D) anisotropic Elastodynamic Finite Integration Technique (EFIT) code. However, observations showed that the anisotropic EFIT method displays numerically unstable behavior at the locations of stress-free boundaries for some cases of anisotropic materials. This paper gives examples of the numerical instabilities observed for EFIT and discusses the source of instability. As an alternative to EFIT, the 3D Lebedev Finite Difference (LFD) method has been implemented. The paper briefly describes the LFD approach and shows examples of stable behavior in the presence of stress-free boundaries for a monoclinic anisotropy case. The LFD results are also compared to experimental results and dispersion curves.

  16. Generalized Differential Calculus and Applications to Optimization

    NASA Astrophysics Data System (ADS)

    Rector, Robert Blake Hayden

    This thesis contains contributions in three areas: the theory of generalized calculus, numerical algorithms for operations research, and applications of optimization to problems in modern electric power systems. A geometric approach is used to advance the theory and tools used for studying generalized notions of derivatives for nonsmooth functions. These advances specifically pertain to methods for calculating subdifferentials and to expanding our understanding of a certain notion of derivative of set-valued maps, called the coderivative, in infinite dimensions. A strong understanding of the subdifferential is essential for numerical optimization algorithms, which are developed and applied to nonsmooth problems in operations research, including non-convex problems. Finally, an optimization framework is applied to solve a problem in electric power systems involving a smart solar inverter and battery storage system providing energy and ancillary services to the grid.
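
    For nonsmooth objectives of the kind mentioned above, a subgradient stands in for the gradient wherever the function is not differentiable. A minimal sketch for f(x) = |x| with diminishing step sizes (a generic textbook scheme, not the thesis's own algorithm):

```python
def subgradient_abs(x):
    """One subgradient of f(x) = |x|: sign(x). At x = 0 any value in
    [-1, 1] is a valid subgradient; we pick 0."""
    return (x > 0) - (x < 0)

def subgradient_descent(x0, steps=200):
    """Subgradient method with diminishing steps 1/(k+1). Unlike gradient
    descent, the iterates oscillate around the minimizer, with the error
    bounded by the shrinking step size."""
    x = x0
    for k in range(steps):
        x -= subgradient_abs(x) / (k + 1)
    return x

print(abs(subgradient_descent(3.0)))  # close to the minimizer x* = 0
```

    The diminishing (non-summable, square-summable style) step sequence is what guarantees convergence here; a fixed step would leave the iterates bouncing at a fixed distance from the minimizer.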

  17. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  18. McIDAS-V: A Data Analysis and Visualization Tool for Global Satellite Data

    NASA Astrophysics Data System (ADS)

    Achtor, T. H.; Rink, T. D.

    2011-12-01

    The Man-computer Interactive Data Access System (McIDAS-V) is a Java-based, open-source, freely available system for scientists, researchers and algorithm developers working with atmospheric data. The McIDAS-V software tools provide powerful new data manipulation and visualization capabilities, including 4-dimensional displays, an abstract data model with integrated metadata, user-defined computation, and a powerful scripting capability. As such, McIDAS-V is a valuable tool for scientists and researchers within the GEO and GEOSS domains. The advancing polar and geostationary orbit environmental satellite missions conducted by several countries will carry advanced instrumentation and systems that collect and distribute land, ocean, and atmosphere data. These systems provide atmospheric and sea surface temperatures, humidity soundings, cloud and aerosol properties, and numerous other environmental products. This presentation will display and demonstrate some of the capabilities of McIDAS-V to analyze and display high temporal and spectral resolution data, using examples from international environmental satellites.

  19. Using Genetic Mouse Models to Gain Insight into Glaucoma: Past Results and Future Possibilities

    PubMed Central

    Fernandes, Kimberly A.; Harder, Jeffrey M.; Williams, Pete A.; Rausch, Rebecca L.; Kiernan, Amy E.; Nair, K. Saidas; Anderson, Michael G.; John, Simon W.; Howell, Gareth R.; Libby, Richard T.

    2015-01-01

    While all forms of glaucoma are characterized by a specific pattern of retinal ganglion cell death, they are clinically divided into several distinct subclasses, including normal tension glaucoma, primary open angle glaucoma, congenital glaucoma, and secondary glaucoma. For each type of glaucoma there are likely numerous molecular pathways that control susceptibility to the disease. Given this complexity, a single animal model will never precisely model all aspects of all the different types of human glaucoma. Therefore, multiple animal models have been utilized to study glaucoma but more are needed. Because of the powerful genetic tools available to use in the laboratory mouse, it has proven to be a highly useful mammalian system for studying the pathophysiology of human disease. The similarity between human and mouse eyes coupled with the ability to use a combination of advanced cell biological and genetic tools in mice have led to a large increase in the number of studies using mice to model specific glaucoma phenotypes. Over the last decade, numerous new mouse models and genetic tools have emerged, providing important insight into the cell biology and genetics of glaucoma. In this review, we describe available mouse genetic models that can be used to study glaucoma-relevant disease/pathobiology. Furthermore, we discuss how these models have been used to gain insights into ocular hypertension (a major risk factor for glaucoma) and glaucomatous retinal ganglion cell death. Finally, the potential for developing new mouse models and using advanced genetic tools and resources for studying glaucoma are discussed. PMID:26116903

  20. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis.

    PubMed

    Oczkowski, Simon J; Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J

    2016-01-01

    Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. 
The use of structured communication tools increased: the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25-4.26, p = 0.007, low quality evidence) and the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43-2.59, p<0.001, low quality evidence); concordance between AD preferences and subsequent medical orders for use or non-use of life supporting treatment (RR 1.19, 95% CI 1.01-1.39, p = 0.028, very low quality evidence, 1 observational study); and concordance between the care desired and care received by patients (RR 1.17, 95% CI 1.05-1.30, p = 0.004, low quality evidence, 2 RCTs). The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between the care desired and the care received by patients. The use of structured communication tools rather than an ad-hoc approach to end-of-life decision-making should be considered, and the selection and implementation of such tools should be tailored to address local needs and context. PROSPERO CRD42014012913.
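    The pooled effects above are risk ratios with 95% confidence intervals. As a quick illustration of how such a ratio and its Wald-type interval on the log scale are computed (the event counts below are hypothetical, not data from this review):

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of two proportions with a Wald-type 95% CI.

    a/n1 = events/total in the intervention arm, b/n2 in the control arm.
    The CI is built on the log scale using the delta-method standard error.
    """
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 60/100 completed an advance directive with the
# communication tool vs 30/100 with usual care
rr, lo, hi = risk_ratio_ci(60, 100, 30, 100)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # RR = 2.00, 95% CI 1.42-2.81
```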

  1. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis

    PubMed Central

    Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J.

    2016-01-01

    Background Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. Objectives To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. Methods We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Results Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. 
The use of structured communication tools increased: the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25–4.26, p = 0.007, low quality evidence) and the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43–2.59, p<0.001, low quality evidence); concordance between AD preferences and subsequent medical orders for use or non-use of life supporting treatment (RR 1.19, 95% CI 1.01–1.39, p = 0.028, very low quality evidence, 1 observational study); and concordance between the care desired and care received by patients (RR 1.17, 95% CI 1.05–1.30, p = 0.004, low quality evidence, 2 RCTs). Conclusions The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between the care desired and the care received by patients. The use of structured communication tools rather than an ad-hoc approach to end-of-life decision-making should be considered, and the selection and implementation of such tools should be tailored to address local needs and context. Registration PROSPERO CRD42014012913 PMID:27119571

  2. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  3. Unsteady Loss in the Stator Due to the Incoming Rotor Wake in a Highly-Loaded Transonic Compressor

    NASA Technical Reports Server (NTRS)

    Hah, Chunill

    2015-01-01

    The present paper reports an investigation of unsteady loss generation in the stator due to the incoming rotor wake in an advanced GE transonic compressor design, using a high-fidelity numerical method. This advanced compressor, with high reaction and high stage loading, has been investigated both experimentally and analytically in the past. The measured efficiency in this advanced compressor is significantly lower than the design intent. The general understanding is that the current generation of compressor design analysis tools misses some important flow physics in this modern compressor design. To pinpoint the source of the efficiency shortfall, an advanced test with detailed flow traverses was performed for the front one-and-a-half stage at the NASA Glenn Research Center.

  4. Astrophysical Computation in Research, the Classroom and Beyond

    NASA Astrophysics Data System (ADS)

    Frank, Adam

    2009-03-01

    In this talk I review progress in the use of simulations as a tool for astronomical research, for education and public outreach. The talk will include the basic elements of numerical simulations as well as advances in algorithms which have led to recent dramatic progress such as the use of Adaptive Mesh Refinement methods. The scientific focus of the talk will be star formation jets and outflows while the educational emphasis will be on the use of advanced platforms for simulation based learning in lecture and integrated homework. Learning modules for science outreach websites such as DISCOVER magazine will also be highlighted.

  5. Observing system simulations using synthetic radiances and atmospheric retrievals derived for the AMSU and HIRS in a mesoscale model. [Advanced Microwave Sounding Unit

    NASA Technical Reports Server (NTRS)

    Diak, George R.; Huang, Hung-Lung; Kim, Dongsoo

    1990-01-01

    The paper addresses the concept of synthetic satellite imagery as a visualization and diagnostic tool for understanding future satellite sensors, and details preliminary results on the quality of soundings from current sensors. Preliminary results are presented on the quality of soundings from the combination of the High-Resolution Infrared Radiometer Sounder and the Advanced Microwave Sounding Unit. Results are also presented on the first Observing System Simulation Experiment using these data in a mesoscale numerical prediction model.

  6. SAM Theory Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    The System Analysis Module (SAM) is an advanced and modern system analysis tool being developed at Argonne National Laboratory under the U.S. DOE Office of Nuclear Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM development aims for advances in physical modeling, numerical methods, and software engineering to enhance its user experience and usability for reactor transient analyses. To facilitate the code development, SAM utilizes an object-oriented application framework (MOOSE), and its underlying meshing and finite-element library (libMesh) and linear and non-linear solvers (PETSc), to leverage modern advanced software environments and numerical methods. SAM focuses on modeling advanced reactor concepts such as SFRs (sodium fast reactors), LFRs (lead-cooled fast reactors), and FHRs (fluoride-salt-cooled high temperature reactors) or MSRs (molten salt reactors). These advanced concepts are distinguished from light-water reactors in their use of single-phase, low-pressure, high-temperature, and low Prandtl number (sodium and lead) coolants. As a new code development, the initial effort has been focused on modeling and simulation capabilities of heat transfer and single-phase fluid dynamics responses in Sodium-cooled Fast Reactor (SFR) systems. The system-level simulation capabilities of fluid flow and heat transfer in general engineering systems and typical SFRs have been verified and validated. This document provides the theoretical and technical basis of the code to help users understand the underlying physical models (such as governing equations, closure models, and component models), system modeling approaches, numerical discretization and solution methods, and the overall capabilities in SAM. As the code is still under ongoing development, this SAM Theory Manual will be updated periodically to keep it consistent with the state of the development.

  7. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  8. Advanced Computational Modeling of Vapor Deposition in a High-Pressure Reactor

    NASA Technical Reports Server (NTRS)

    Cardelino, Beatriz H.; Moore, Craig E.; McCall, Sonya D.; Cardelino, Carlos A.; Dietz, Nikolaus; Bachmann, Klaus

    2004-01-01

    In search of novel approaches to produce new materials for electro-optic technologies, advances have been achieved in the development of computer models for vapor deposition reactors in space. Numerical simulations are invaluable tools for costly and difficult processes, such as those experiments designed for high pressures and microgravity conditions. Indium nitride is a candidate compound for high-speed laser and photo diodes for optical communication systems, as well as for semiconductor lasers operating in the blue and ultraviolet regions. But InN and other nitride compounds exhibit large thermal decomposition at their optimum growth temperatures. In addition, epitaxy at lower temperatures and subatmospheric pressures incorporates indium droplets into the InN films. However, surface stabilization data indicate that InN could be grown at 900 K under high nitrogen pressures, and microgravity could provide laminar flow conditions. Numerical models for chemical vapor deposition have been developed, coupling complex chemical kinetics with fluid dynamic properties.

  9. Advanced Computational Modeling of Vapor Deposition in a High-pressure Reactor

    NASA Technical Reports Server (NTRS)

    Cardelino, Beatriz H.; Moore, Craig E.; McCall, Sonya D.; Cardelino, Carlos A.; Dietz, Nikolaus; Bachmann, Klaus

    2004-01-01

    In search of novel approaches to produce new materials for electro-optic technologies, advances have been achieved in the development of computer models for vapor deposition reactors in space. Numerical simulations are invaluable tools for costly and difficult processes, such as those experiments designed for high pressures and microgravity conditions. Indium nitride is a candidate compound for high-speed laser and photo diodes for optical communication systems, as well as for semiconductor lasers operating in the blue and ultraviolet regions. But InN and other nitride compounds exhibit large thermal decomposition at their optimum growth temperatures. In addition, epitaxy at lower temperatures and subatmospheric pressures incorporates indium droplets into the InN films. However, surface stabilization data indicate that InN could be grown at 900 K under high nitrogen pressures, and microgravity could provide laminar flow conditions. Numerical models for chemical vapor deposition have been developed, coupling complex chemical kinetics with fluid dynamic properties.

  10. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  11. Advanced Concepts Theory Annual Report 1983.

    DTIC Science & Technology

    1984-05-18

    variety of theoretical models, tools, and computational strategies to understand, guide, and predict the behavior of high brightness, laboratory x-ray... theoretical models must treat hard and soft x-ray emission from different electron configurations with K, L, and M shells, and they must include... theoretical effort has been devoted to elucidating the effects of opacity on the numerical results... basis for comprehending the trends which appear in the...

  12. Advances of Proteomic Sciences in Dentistry.

    PubMed

    Khurshid, Zohaib; Zohaib, Sana; Najeeb, Shariq; Zafar, Muhammad Sohail; Rehman, Rabia; Rehman, Ihtesham Ur

    2016-05-13

    Applications of proteomics tools revolutionized various biomedical disciplines such as genetics, molecular biology, medicine, and dentistry. The aim of this review is to highlight the major milestones in proteomics in dentistry during the last fifteen years. Human oral cavity contains hard and soft tissues and various biofluids including saliva and crevicular fluid. Proteomics has brought revolution in dentistry by helping in the early diagnosis of various diseases identified by the detection of numerous biomarkers present in the oral fluids. This paper covers the role of proteomics tools for the analysis of oral tissues. In addition, dental materials proteomics and their future directions are discussed.

  13. Data-driven multi-scale multi-physics models to derive process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-05-01

    Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to measure experimentally, numerical modeling is a powerful tool for understanding the underlying physical mechanisms. This paper presents our latest work in this regard, based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods, used in the process-structure and structure-property phases and in the design phase that connects them, would allow for a design loop for AM processing and materials. We hope this article will provide a road map toward a fundamental understanding of AM, enabling monitoring and advanced diagnostics of AM processing.

  14. Data-driven multi-scale multi-physics models to derive process-structure-property relationships for additive manufacturing

    NASA Astrophysics Data System (ADS)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Lian, Yanping; Yu, Cheng; Liu, Zeliang; Yan, Jinhui; Wolff, Sarah; Wu, Hao; Ndip-Agbor, Ebot; Mozaffar, Mojtaba; Ehmann, Kornel; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-01-01

    Additive manufacturing (AM) possesses appealing potential for manipulating material compositions, structures and properties in end-use products with arbitrary shapes without the need for specialized tooling. Since the physical process is difficult to measure experimentally, numerical modeling is a powerful tool for understanding the underlying physical mechanisms. This paper presents our latest work in this regard, based on comprehensive material modeling of process-structure-property relationships for AM materials. The numerous influencing factors that emerge from the AM process motivate the need for novel rapid design and optimization approaches. For this, we propose data-mining as an effective solution. Such methods, used in the process-structure and structure-property phases and in the design phase that connects them, would allow for a design loop for AM processing and materials. We hope this article will provide a road map toward a fundamental understanding of AM, enabling monitoring and advanced diagnostics of AM processing.

  15. HEAT - Habitat Evaluation and Assessment Tools for Effective Environmental Evaluations: User’s Guide

    DTIC Science & Technology

    2012-12-01

    surrounding landscape (e.g., plants, animals, detritus, soil, the atmosphere, etc.) interact through a variety of physical, chemical, and... interactive geographic information system. Ecological Modelling 114:287–304. Ray, N., and M. A. Burgman. 2006. Subjective uncertainties in habitat... environmental impacts on ecological systems at numerous scales with varying degrees of success. Advances in technology have led many agencies to automate

  16. Concern-driven integrated approaches for the grouping, testing and assessment of nanomaterials.

    PubMed

    Landsiedel, Robert

    2016-11-01

    The potential of nanomaterials (NMs) to induce adverse effects in humans or the environment is being addressed in numerous research projects, and methods and tools for NM hazard identification and risk assessment are advancing. This article describes how integrated approaches for the testing and assessment of NMs can ensure the safety of nanomaterials while adhering to the 3Rs principle. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
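    The core idea, fitting a "reduced form" regression to synthetic data generated by a complex model, can be sketched as follows. The closed-form stand-in model, the variable names, and the coefficients below are illustrative assumptions, not E-CAT's actual CGE model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the CGE model: economic loss as a function of two threat
# characteristics (duration, severity) and one background condition
# (resilience). The real E-CAT runs a full CGE simulation here.
def cge_model(duration, severity, resilience):
    return 0.5 * duration + 2.0 * severity - 1.5 * resilience \
        + 0.1 * duration * severity

# Step 1: sample inputs and run the "model" to build synthetic data
X = rng.uniform(0.0, 1.0, size=(500, 3))
loss = np.array([cge_model(*row) for row in X])

# Step 2: estimate the reduced-form regression (one interaction term)
design = np.column_stack([np.ones(len(X)), X, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(design, loss, rcond=None)
print(np.round(coef, 3))  # recovers [0, 0.5, 2.0, -1.5, 0.1]
```

Once the regression coefficients are in hand, a user can approximate the complex model's output for new threat profiles without rerunning it, which is the "rapid turnaround" the abstract describes.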

  18. Integrated Control Modeling for Propulsion Systems Using NPSS

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Felder, James L.; Lavelle, Thomas M.; Withrow, Colleen A.; Yu, Albert Y.; Lehmann, William V. A.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS), an advanced engineering simulation environment used to design and analyze aircraft engines, has been enhanced by integrating control development tools into it. One of these tools is a generic controller interface that allows NPSS to communicate with control development software environments such as MATLAB and EASY5. The other tool is a linear model generator (LMG) that gives NPSS the ability to generate linear, time-invariant state-space models. Integrating these tools into NPSS enables it to be used for control system development. This paper will discuss the development and integration of these tools into NPSS. In addition, it will show a comparison of transient model results of a generic, dual-spool, military-type engine model that has been implemented in NPSS and Simulink. It will also show the linear model generator's ability to approximate the dynamics of a nonlinear NPSS engine model.
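    A linear model generator of the kind described typically perturbs a nonlinear model about an operating point to extract linear, time-invariant state-space matrices. A minimal finite-difference sketch of that step (the toy "engine" dynamics below are invented for illustration and are not NPSS's model):

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference linearization of x' = f(x, u) about (x0, u0),
    returning the state-space matrices A = df/dx and B = df/du."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy nonlinear "engine": two states (spool speed, temperature), one
# input (fuel flow); purely illustrative dynamics
def f(x, u):
    return np.array([-x[0] ** 2 + 2.0 * u[0], x[0] - 0.5 * x[1]])

A, B = linearize(f, np.array([1.0, 2.0]), np.array([0.5]))
print(A)  # approx. [[-2, 0], [1, -0.5]]
print(B)  # approx. [[2], [0]]
```

The resulting (A, B) pair is exactly the kind of linear model a control designer would hand to MATLAB-style tools for controller synthesis.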

  19. Numerical Analyses for Low Reynolds Flow in a Ventricular Assist Device.

    PubMed

    Lopes, Guilherme; Bock, Eduardo; Gómez, Luben

    2017-06-01

    Scientific and technological advances in blood pump development have been driven by their importance in cardiac patient treatment and in improving the quality of life of assisted people. To improve and optimize design and development, numerical tools were incorporated into the analyses of these mechanisms and have become indispensable to their advancement. This study analyzes flow behavior at low impeller Reynolds numbers, for which there is no consensus on the full development of turbulence in ventricular assist devices (VADs). To support the analyses, computational numerical simulations were carried out in different scenarios with the same rotation speed. Two modeling approaches were applied: laminar flow, and turbulent flow with the standard, RNG, and realizable κ-ε models, the standard and SST κ-ω models, and the Spalart-Allmaras model. The results agree with the literature for VADs and with the range for transitional flows in stirred tanks, with an impeller Reynolds number around 2800 for the tested scenarios. The turbulent models were compared, and based on the expected physical behavior, the use of the RNG κ-ε, standard and SST κ-ω, and Spalart-Allmaras models is suggested for numerical analyses at low impeller Reynolds numbers in the tested flow scenarios. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
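    For reference, the impeller Reynolds number used in stirred-tank (and, by analogy, VAD impeller) analyses is Re = ρND²/μ, with N the rotation rate in rev/s and D the impeller diameter. A small sketch (the pump dimensions, speed, and blood properties below are hypothetical, chosen only to land near the transitional range discussed):

```python
def impeller_reynolds(rho, n_rev_s, d, mu):
    """Impeller Reynolds number Re = rho * N * D^2 / mu (stirred-tank
    convention: N in rev/s, D the impeller diameter in m)."""
    return rho * n_rev_s * d ** 2 / mu

# Hypothetical values: blood density ~1060 kg/m^3, viscosity ~0.0035 Pa.s,
# 30 mm impeller spinning at 600 rpm
re = impeller_reynolds(1060.0, 600 / 60, 0.030, 0.0035)
print(round(re))  # 2726, near the ~2800 transitional range cited
```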

  20. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-based software generation systems have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation-definition editor for defining new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  1. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process

    PubMed Central

    Fraser, Kirk A.; St-Georges, Lyne; Kiss, Laszlo I.

    2014-01-01

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), being able to model the process to determine the highest speed of advance possible that will not cause unwanted welding defects is desirable. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 workpieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time. PMID:28788627
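    The workflow described, Monte-Carlo sampling of a thermal model, fitting a polynomial response surface, then constrained maximization of the advance speed, can be sketched as below. The closed-form peak-temperature function, the 500 °C limit, and the penalized grid search (a crude stand-in for the paper's finite-difference solver and exterior penalty method) are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the finite-difference thermal model: peak weld temperature
# (deg C) vs advance speed v (mm/s) and rotational speed w (rpm)
def peak_temp(v, w):
    return 300.0 + 0.4 * w - 60.0 * v + 0.05 * v * w

# Step 1: Monte-Carlo sampling of the process window
v = rng.uniform(1.0, 10.0, 400)
w = rng.uniform(500.0, 1500.0, 400)
t = peak_temp(v, w)

# Step 2: fit a polynomial response surface T(v, w)
A = np.column_stack([np.ones_like(v), v, w, v * w])
c, *_ = np.linalg.lstsq(A, t, rcond=None)

def surface(vv, ww):
    return c[0] + c[1] * vv + c[2] * ww + c[3] * vv * ww

# Step 3: maximize the advance speed subject to T <= 500 deg C via a
# penalized grid search (standing in for the exterior penalty method)
def cost(x):
    vv, ww = x
    return -vv + 1e3 * max(0.0, surface(vv, ww) - 500.0) ** 2

best = min(((vi, wi) for vi in np.linspace(1, 10, 91)
            for wi in np.linspace(500, 1500, 101)), key=cost)
print(best)
```

With these made-up coefficients the search returns the fastest feasible advance speed in the window along with a rotational speed that keeps the surface below the temperature limit.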

  2. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process.

    PubMed

    Fraser, Kirk A; St-Georges, Lyne; Kiss, Laszlo I

    2014-04-30

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), being able to model the process to determine the highest speed of advance possible that will not cause unwanted welding defects is desirable. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 workpieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time.

  3. Joint pricing and production management: a geometric programming approach with consideration of cubic production cost function

    NASA Astrophysics Data System (ADS)

    Sadjadi, Seyed Jafar; Hamidi Hesarsorkh, Aghil; Mohammadi, Mehdi; Bonyadi Naeini, Ali

    2015-06-01

    Coordination and harmony between different departments of a company can be an important factor in achieving competitive advantage, provided the company aligns the strategies of its different departments. This paper presents an integrated decision model based on recent advances in the geometric programming technique. The demand for a product is considered a power function of factors such as the product's price, marketing expenditures, and consumer service expenditures. Furthermore, the production cost is considered a cubic function of output. The model is solved using recent advances in convex optimization tools. Finally, the solution procedure is illustrated by a numerical example.
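    A power-function demand combined with a cubic production cost can be evaluated directly. The sketch below (exponents, cost coefficients, and expenditure levels are hypothetical, and a simple price grid replaces the paper's geometric-programming solution) illustrates the profit structure of such a model:

```python
def demand(p, m, s, k=1000.0, a=1.5, b=0.2, c=0.1):
    """Power-function demand: decreasing in price p, increasing in
    marketing (m) and consumer-service (s) expenditures.
    All exponents here are hypothetical."""
    return k * p ** -a * m ** b * s ** c

def profit(p, m, s):
    q = demand(p, m, s)
    # Cubic production cost in output q (coefficients hypothetical)
    production_cost = 0.001 * q ** 3 - 0.05 * q ** 2 + 2.0 * q + 10.0
    return p * q - production_cost - m - s

# Scan a coarse price grid at fixed marketing and service budgets
best = max((profit(p, 50.0, 20.0), p) for p in range(5, 101, 5))
print(best)  # best price on this grid is 15
```

In the actual paper the joint price/marketing/production decision is posed as a geometric program and solved with convex optimization rather than a grid scan; the grid merely shows how the cubic cost penalizes both very low prices (high output) and very high prices (low revenue).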

  4. Robust Neighboring Optimal Guidance for the Advanced Launch System

    NASA Technical Reports Server (NTRS)

    Hull, David G.

    1993-01-01

    In recent years, optimization has become an engineering tool through the availability of numerous successful nonlinear programming codes. Optimal control problems are converted into parameter optimization (nonlinear programming) problems by assuming the control to be piecewise linear, making the unknowns the nodes or junction points of the linear control segments. Once the optimal piecewise linear (suboptimal) control is known, a guidance law for operating near the suboptimal path is the neighboring optimal piecewise linear control (neighboring suboptimal control). Research conducted under this grant has been directed toward the investigation of neighboring suboptimal control as a guidance scheme for an advanced launch system.
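    The guidance idea reduces to a first-order feedback correction about the stored suboptimal path, u = u* - K (x - x*). A minimal sketch, with toy gains and states that are assumptions rather than Advanced Launch System data:

```python
# Minimal sketch of neighboring (sub)optimal guidance: fly the stored nominal
# suboptimal control and correct it to first order with precomputed feedback
# gains applied to the deviation from the nominal state. Gains and states are
# toy placeholders.

def neighboring_control(u_nominal, gains, x, x_nominal):
    """First-order control correction about the suboptimal path: u* - K (x - x*)."""
    deviation = [xi - xni for xi, xni in zip(x, x_nominal)]
    return u_nominal - sum(k * d for k, d in zip(gains, deviation))
```

    On the nominal path the correction vanishes and the stored suboptimal control is flown unchanged; off the path, the precomputed gains steer the vehicle back without re-solving the optimization in flight.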

  5. Polymer nanoparticles for drug and small silencing RNA delivery to treat cancers of different phenotypes

    PubMed Central

    Devulapally, Rammohan; Paulmurugan, Ramasamy

    2013-01-01

    Advances in nanotechnology have provided powerful and efficient tools in the development of cancer diagnosis and therapy. Numerous nanocarriers are currently approved for clinical use in cancer therapy. In recent years, biodegradable polymer nanoparticles (NPs) have attracted considerable attention for their ability to function as possible carriers for target-specific delivery of various drugs, genes, proteins, peptides, vaccines, and other biomolecules in humans without much toxicity. This review will specifically focus on recent advances in polymer-based nanocarriers for the loading and delivery of various drugs and small silencing RNAs to treat different types of cancer. PMID:23996830

  6. Multimodal visualization interface for data management, self-learning and data presentation.

    PubMed

    Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M

    2006-10-01

    Multimodal visualization software, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the topic of visualization and modeling of various aspects of the human anatomy. Numerous tools used in radiology are integrated in the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow analysis of medical imaging data using advanced features outside radiological premises (for example, for patient review, conference presentation or tutorial preparation). The system is free and based on an open-source software development architecture; updates of the system for custom applications are therefore possible.

  7. Advances of Proteomic Sciences in Dentistry

    PubMed Central

    Khurshid, Zohaib; Zohaib, Sana; Najeeb, Shariq; Zafar, Muhammad Sohail; Rehman, Rabia; Rehman, Ihtesham Ur

    2016-01-01

    Applications of proteomics tools have revolutionized various biomedical disciplines such as genetics, molecular biology, medicine, and dentistry. The aim of this review is to highlight the major milestones of proteomics in dentistry during the last fifteen years. The human oral cavity contains hard and soft tissues and various biofluids including saliva and crevicular fluid. Proteomics has revolutionized dentistry by helping in the early diagnosis of various diseases, identified by the detection of numerous biomarkers present in the oral fluids. This paper covers the role of proteomics tools in the analysis of oral tissues. In addition, dental materials proteomics and its future directions are discussed. PMID:27187379

  8. European Conference on Advanced Materials and Processes Held in Aachen, Federal Republic of Germany on November 22-24 1989. Abstracts

    DTIC Science & Technology

    1989-11-24

    However, the combination of increasing circuit complexity, customization, size, speed and heat flux is leading to a crisis in packaging technology(1...material properties and tooling restrictions, * production by an economic single-step sintering technique with subsequent heat treatment, * achievement of...programme, page 16. Numerical Modelling of Heat Transfer at Interfaces: Finite Element Approaches, Testing and Examples I W. Schafer, MAGM

  9. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  10. Advanced Secure Optical Image Processing for Communications

    NASA Astrophysics Data System (ADS)

    Al Falou, Ayman

    2018-04-01

    New image processing tools and data-processing network systems have considerably increased the volume of transmitted information, such as high-resolution 2D and 3D images. Thus, more complex networks and longer processing times become necessary, and high image quality and transmission speeds are required for an increasing number of applications. To satisfy these two requirements, numerical and optical solutions have each been offered separately. This book explores both alternatives and describes research that is converging towards optical/numerical hybrid solutions for high-volume signal and image processing and transmission. Without being limited to hybrid approaches, the latter are particularly investigated in this book with the aim of combining the advantages of both techniques. Additionally, purely numerical or optical solutions are also considered, since they emphasize the advantages of each of the two approaches separately.

  11. Computational fluid dynamics uses in fluid dynamics/aerodynamics education

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1994-01-01

    The field of computational fluid dynamics (CFD) has advanced to the point where it can now be used for the purpose of fluid dynamics physics education. Because of the tremendous wealth of information available from numerical simulation, certain fundamental concepts can be efficiently communicated using an interactive graphical interrogation of the appropriate numerical simulation data base. In other situations, a large amount of aerodynamic information can be communicated to the student by interactive use of simple CFD tools on a workstation or even in a personal computer environment. The emphasis in this presentation is to discuss ideas for how this process might be implemented. Specific examples, taken from previous publications, will be used to highlight the presentation.

  12. PIFEX: An advanced programmable pipelined-image processor

    NASA Technical Reports Server (NTRS)

    Gennery, D. B.; Wilcox, B.

    1985-01-01

    PIFEX is a pipelined-image processor being built in the JPL Robotics Lab. It will operate on digitized raster-scanned images (at 60 frames per second for images up to about 300 by 400 and at lesser rates for larger images), performing a variety of operations simultaneously under program control. It thus is a powerful, flexible tool for image processing and low-level computer vision. It also has applications in other two-dimensional problems such as route planning for obstacle avoidance and the numerical solution of two-dimensional partial differential equations (although its low numerical precision limits its use in the latter field). The concept and design of PIFEX are described herein, and some examples of its use are given.

  13. Center for Extended Magnetohydrodynamics Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramos, Jesus

    This researcher participated in the DOE-funded Center for Extended Magnetohydrodynamics Modeling (CEMM), a multi-institutional collaboration led by the Princeton Plasma Physics Laboratory with Dr. Stephen Jardin as the overall Principal Investigator. This project developed advanced simulation tools to study the non-linear macroscopic dynamics of magnetically confined plasmas. The collaborative effort focused on the development of two large numerical simulation codes, M3D-C1 and NIMROD, and their application to a wide variety of problems. Dr. Ramos was responsible for theoretical aspects of the project, deriving consistent sets of model equations applicable to weakly collisional plasmas and devising test problems for verification of the numerical codes. This activity was funded for twelve years.

  14. From hacking the human genome to editing organs.

    PubMed

    Tobita, Takamasa; Guzman-Lepe, Jorge; Collin de l'Hortet, Alexandra

    2015-01-01

    In recent decades, human genome engineering has been one of the major research subjects of interest, essentially because it raises new possibilities for personalized medicine and biotechnologies. With the development of engineered nucleases such as Zinc Finger Nucleases (ZFNs), Transcription Activator-Like Effector Nucleases (TALENs) and, more recently, Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), the field of human genome editing has evolved very rapidly. Every new genetic tool is broadening the scope of applications on human tissues, even before we can completely master each of these tools. In this review, we present the recent advances in human genome editing tools, discuss the numerous implications they have for research and medicine, and mention the limits of and concerns about such technologies.

  15. From hacking the human genome to editing organs

    PubMed Central

    Tobita, Takamasa; Guzman-Lepe, Jorge; Collin de l'Hortet, Alexandra

    2015-01-01

    ABSTRACT In recent decades, human genome engineering has been one of the major research subjects of interest, essentially because it raises new possibilities for personalized medicine and biotechnologies. With the development of engineered nucleases such as Zinc Finger Nucleases (ZFNs), Transcription Activator-Like Effector Nucleases (TALENs) and, more recently, Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), the field of human genome editing has evolved very rapidly. Every new genetic tool is broadening the scope of applications on human tissues, even before we can completely master each of these tools. In this review, we present the recent advances in human genome editing tools, discuss the numerous implications they have for research and medicine, and mention the limits of and concerns about such technologies. PMID:26588350

  16. Research on ARM Numerical Control System

    NASA Astrophysics Data System (ADS)

    Wei, Xu; JiHong, Chen

    Computerized Numerical Control (CNC) machine tools are the foundation of modern manufacturing systems, and their advanced digital technology is key to the sustainable development of the machine tool manufacturing industry. This paper presents the design of a CNC system embedded on ARM, covering both the hardware design and the supporting software. On the hardware side, the core of the motor control unit is the MCX314AL DSP motion-control chip developed by NOVA Electronics Co., Ltd. of Japan; its excellent performance, simple interface, and easy programming make machine control convenient. On the software side, the open-source uC/OS-II is selected as the embedded operating system, and the CNC system is broken down into modules in detail. Module priorities are assigned according to their actual requirements, and the inter-module communication mechanisms and interrupt responses are designed to guarantee the real-time behavior and reliability of the numerical control system. The system therefore not only meets current requirements for precision machining but also offers a good man-machine interface and network support to facilitate use by a variety of operators.

  17. A Pythonic Approach for Computational Geosciences and Geo-Data Processing

    NASA Astrophysics Data System (ADS)

    Morra, G.; Yuen, D. A.; Lee, S. M.

    2016-12-01

    Computational methods and data analysis play a constantly increasing role in Earth Sciences; however, students and professionals must climb a steep learning curve before reaching a level that allows them to run effective models. Furthermore, the recent arrival of powerful new machine learning tools such as Torch and TensorFlow has opened new possibilities but also created a new realm of complications related to the completely different technology employed. We present here a series of examples written entirely in Python, a language that combines the simplicity of Matlab with the power and speed of compiled languages such as C, and apply them to a wide range of geological processes such as porous media flow, multiphase fluid dynamics, creeping flow and many-fault interaction. We also explore ways in which machine learning can be employed in combination with numerical modelling, from immediately interpreting a large number of modeling results to optimizing a set of modeling parameters to obtain a desired simulation. We show that by using Python, undergraduate and graduate students can learn advanced numerical technologies with minimal dedicated effort, which in turn encourages them to develop more numerical tools and progress quickly in their computational abilities. We also show how Python allows combining modeling with machine learning like pieces of LEGO, thereby simplifying the transition towards a new kind of scientific geo-modelling. The conclusion is that Python is an ideal tool for creating an infrastructure for geosciences that allows users to quickly develop tools, reuse techniques and encourage collaborative efforts to interpret and integrate geo-data in profound new ways.
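    In the spirit of the Python examples described above, a minimal sketch of one of the listed processes: steady one-dimensional pressure diffusion in a homogeneous porous medium (Darcy flow), solved by Jacobi iteration. Grid size and boundary pressures are illustrative choices.

```python
# Minimal pure-Python sketch of steady 1-D Darcy flow: fixed pressures at the
# two boundaries, Jacobi iteration on the interior. For constant permeability
# the converged pressure profile is linear between the boundary values.

def solve_pressure_1d(n=21, p_left=1.0, p_right=0.0, iters=5000):
    p = [0.0] * n
    p[0], p[-1] = p_left, p_right
    for _ in range(iters):
        interior = [(p[i - 1] + p[i + 1]) / 2.0 for i in range(1, n - 1)]
        p = [p[0]] + interior + [p[-1]]
    return p
```

    Variable permeability, extra dimensions, or time dependence extend the same stencil idea, which is why such small examples transfer well to classroom use.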

  18. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, a high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  19. The Emergence of Contextual Social Psychology.

    PubMed

    Pettigrew, Thomas F

    2018-07-01

    Social psychology experiences recurring so-called "crises." This article maintains that these episodes actually mark advances in the discipline; these "crises" have enhanced relevance and led to greater methodological and statistical sophistication. New statistical tools have allowed social psychologists to begin to achieve a major goal: placing psychological phenomena in their larger social contexts. This growing trend is illustrated with numerous recent studies; they demonstrate how cultures and social norms moderate basic psychological processes. Contextual social psychology is finally emerging.

  20. Dances with Membranes: Breakthroughs from Super-resolution Imaging

    PubMed Central

    Curthoys, Nikki M.; Parent, Matthew; Mlodzianoski, Michael; Nelson, Andrew J.; Lilieholm, Jennifer; Butler, Michael B.; Valles, Matthew; Hess, Samuel T.

    2017-01-01

    Biological membrane organization mediates numerous cellular functions and has also been connected with an immense number of human diseases. However, until recently, experimental methodologies have been unable to directly visualize the nanoscale details of biological membranes, particularly in intact living cells. Numerous models explaining membrane organization have been proposed, but testing those models has required indirect methods; the desire to directly image proteins and lipids in living cell membranes is a strong motivation for the advancement of technology. The development of super-resolution microscopy has provided powerful tools for quantification of membrane organization at the level of individual proteins and lipids, and many of these tools are compatible with living cells. Previously inaccessible questions are now being addressed, and the field of membrane biology is developing rapidly. This chapter discusses how the development of super-resolution microscopy has led to fundamental advances in the field of biological membrane organization. We summarize the history and some models explaining how proteins are organized in cell membranes, and give an overview of various super-resolution techniques and methods of quantifying super-resolution data. We discuss the application of super-resolution techniques to membrane biology in general, and also with specific reference to the fields of actin and actin-binding proteins, virus infection, mitochondria, immune cell biology, and phosphoinositide signaling. Finally, we present our hopes and expectations for the future of super-resolution microscopy in the field of membrane biology. PMID:26015281

  1. Recent advances in the development and application of nanoelectrodes.

    PubMed

    Fan, Yunshan; Han, Chu; Zhang, Bo

    2016-10-07

    Nanoelectrodes have key advantages compared to electrodes of conventional size and are the tool of choice for numerous applications in both fundamental electrochemistry research and bioelectrochemical analysis. This Minireview summarizes recent advances in the development, characterization, and use of nanoelectrodes in nanoscale electroanalytical chemistry. Methods of nanoelectrode preparation include laser-pulled glass-sealed metal nanoelectrodes, mass-produced nanoelectrodes, carbon nanotube based and carbon-filled nanopipettes, and tunneling nanoelectrodes. Several new topics of their recent application are covered, which include the use of nanoelectrodes for electrochemical imaging at ultrahigh spatial resolution, imaging with nanoelectrodes and nanopipettes, electrochemical analysis of single cells, single enzymes, and single nanoparticles, and the use of nanoelectrodes to understand single nanobubbles.

  2. Tinkering with meiosis

    PubMed Central

    Crismani, Wayne; Girard, Chloé; Mercier, Raphael

    2013-01-01

    Meiosis is at the heart of Mendelian heredity. Recently, much progress has been made in the understanding of this process, in various organisms. In the last fifteen years, the functional characterization of numerous genes involved in meiosis has dramatically deepened our knowledge of key events, including recombination, cell cycle and chromosome distribution. Through a constantly advancing tool set and knowledge base, a number of advances have been made that will allow manipulation of meiosis from a plant breeding perspective. This review focuses on the aspects of meiosis that can be tinkered with to create and propagate new varieties. We would like to dedicate this review to the memory of Simon W. Chan (1974-2012) http://www.plb.ucdavis.edu/labs/srchan/ PMID:23136169

  3. Nondestructive surface analysis for material research using fiber optic vibrational spectroscopy

    NASA Astrophysics Data System (ADS)

    Afanasyeva, Natalia I.

    2001-11-01

    Advanced methods of fiber optic vibrational spectroscopy (FOVS) have been developed in conjunction with an interferometer and low-loss, flexible, and nontoxic optical fibers, sensors, and probes. The combination of optical fibers and sensors with a Fourier Transform (FT) spectrometer has been used in the range from 2.5 to 12 micrometers. This technique serves as an ideal diagnostic tool for surface analysis of numerous diverse materials such as complex structured materials, fluids, coatings, implants, living cells, plants, and tissue. Such surfaces, as well as living tissue or plants, are very difficult to investigate in vivo by traditional FT infrared or Raman spectroscopy methods. The FOVS technique is nondestructive, noninvasive, fast (15 sec), and capable of operating in a remote sampling regime (up to a fiber length of 3 m). Fourier transform infrared (FTIR) and Raman fiber optic spectroscopy operating with optical fibers have been suggested as new powerful tools. These are highly sensitive techniques for structural studies in material research and for various applications in process analysis to determine molecular composition, chemical bonds, and molecular conformations. They could also be developed as a new tool for quality control of numerous materials as well as for noninvasive biopsy.

  4. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
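    The flavor of such time-series pre-processing can be sketched in plain Python. The CSV column names ("t", "value") and the smoothing method below are assumptions for illustration, not the actual FREEWAT/OAT API.

```python
import csv
import io

# Hedged sketch of OAT-style time-series handling: parse (time, value)
# records from CSV text and apply a centered moving average. Column names and
# the smoothing rule are illustrative assumptions; the real tool exposes an
# expandable library of processing methods inside QGIS.

def parse_series(text):
    """Parse CSV text with columns 't' and 'value' into (t, value) tuples."""
    rows = csv.DictReader(io.StringIO(text))
    return [(float(r["t"]), float(r["value"])) for r in rows]

def moving_average(series, window=3):
    """Centered moving average; the window shrinks at the series edges."""
    vals = [v for _, v in series]
    out = []
    for i, (t, _) in enumerate(series):
        lo = max(0, i - window // 2)
        hi = min(len(vals), i + window // 2 + 1)
        out.append((t, sum(vals[lo:hi]) / (hi - lo)))
    return out
```

    A pipeline of such small, composable processing steps is what makes a sensor record usable as a calibration observation.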

  5. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools into the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with high inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime from the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
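    The aggregation step can be illustrated with a structured-grid stand-in; the actual prototype operates on ICON's icosahedral grid, so the block-mean below is only a sketch of the idea.

```python
# Illustrative stand-in for the online aggregation diagnostic: average a fine
# 2-D field onto a user-defined regular coarse grid by block averaging. The
# real implementation works on ICON's icosahedral grid; a structured
# factor x factor block mean keeps the idea visible.

def block_mean(field, factor):
    """Average non-overlapping factor x factor blocks of a 2-D field."""
    n, m = len(field), len(field[0])
    assert n % factor == 0 and m % factor == 0, "grid must tile evenly"
    return [[sum(field[i + di][j + dj]
                 for di in range(factor) for dj in range(factor)) / factor**2
             for j in range(0, m, factor)]
            for i in range(0, n, factor)]
```

    Writing the coarse field instead of the fine one is what shrinks the model output and relieves the I/O bottleneck.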

  6. New efficient optimizing techniques for Kalman filters and numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis

    2016-06-01

    The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has increased in recent years owing to the great number of applications directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, and questions of global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of the model bias and the reduction of the error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work is due to the solid mathematical background adopted, making use of information geometry and statistical techniques, new versions of Kalman filters and state-of-the-art numerical analysis tools.
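    One standard form of the bias-elimination step referred to above is a scalar Kalman filter that tracks the systematic forecast error from the stream of forecast-observation pairs. The noise variances below are illustrative tuning parameters, not values from this study.

```python
# Hedged sketch of Kalman-filter bias correction for a numerical forecast:
# model the systematic error as a random walk and update its estimate from
# each forecast-observation pair. q (process noise) and r (observation noise)
# are illustrative tuning parameters.

def kalman_bias(forecasts, observations, q=0.01, r=1.0):
    """Return the running bias estimate for the error y = obs - forecast."""
    bias, p = 0.0, 1.0                 # state estimate and its error variance
    estimates = []
    for f, o in zip(forecasts, observations):
        p += q                          # predict: random-walk bias model
        k = p / (p + r)                 # Kalman gain
        bias += k * ((o - f) - bias)    # correct with the innovation
        p *= 1.0 - k
        estimates.append(bias)
    return estimates
```

    Subtracting the current bias estimate from the next raw forecast yields the debiased local prediction; the newer filter versions mentioned in the abstract refine the state model and gain adaptation, not this basic loop.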

  7. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Paul T; Yin, Shengjun; Klasky, Hilda B

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management of non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs, taking into account the capabilities of the test facility while satisfying the test objectives. These advanced fracture mechanics models will then be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite-element solutions and to assess the level of confidence that can be placed in the best-estimate finite-element solutions.
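    The flavor of such a probabilistic assessment can be sketched as a direct Monte Carlo estimate of a failure probability with a sampling-uncertainty bound. The distributions and limit state below are made-up placeholders, not SIAM-PFM's fracture modules.

```python
import math
import random

# Illustrative sketch of a probabilistic fracture assessment of the kind a
# PFM framework automates: sample an uncertain crack-driving force and
# material resistance, count limit-state exceedances, and report the failure
# probability with a 95% normal-approximation half-width. All distributions
# and parameters are assumed placeholders.

def failure_probability(n=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        resistance = rng.gauss(100.0, 10.0)   # e.g. fracture toughness
        driving = rng.gauss(70.0, 15.0)       # e.g. crack-driving force
        if driving > resistance:
            failures += 1
    p = failures / n
    half_width = 1.96 * math.sqrt(p * (1.0 - p) / n)  # 95% CI half-width
    return p, half_width
```

    The half-width is exactly the kind of uncertainty bound that frames how much confidence to place in a single best-estimate deterministic solution.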

  8. CFD - Mature Technology?

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2005-01-01

    Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes is encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis), CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.

  9. Integrated multidisciplinary analysis of segmented reflector telescopes

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Needels, Laura

    1992-01-01

    The present multidisciplinary telescope-analysis approach, which encompasses thermal, structural, control and optical considerations, is illustrated for the case of an IR telescope in LEO; attention is given to end-to-end evaluations of the effects of mechanical disturbances and thermal gradients in measures of optical performance. Both geometric ray-tracing and surface-to-surface diffraction approximations are used in the telescope's optical model. Also noted is the role played by NASA-JPL's Integrated Modeling of Advanced Optical Systems computation tool, in view of numerical samples.

  10. Molecular Genetics of Mycobacteriophages

    PubMed Central

    HATFULL, GRAHAM F.

    2014-01-01

    Mycobacteriophages have provided numerous essential tools for mycobacterial genetics, including delivery systems for transposons, reporter genes, and allelic exchange substrates, and components for plasmid vectors and mutagenesis. Their genetically diverse genomes also reveal insights into the broader nature of the phage population and the evolutionary mechanisms that give rise to it. The substantial advances in our understanding of the biology of mycobacteriophages including a large collection of completely sequenced genomes indicates a rich potential for further contributions in tuberculosis genetics and beyond. PMID:25328854

  11. Aligning Food Systems Policies to Advance Public Health

    PubMed Central

    Muller, Mark; Tagtow, Angie; Roberts, Susan L.; MacDougall, Erin

    2009-01-01

    The involvement of public health professionals in food and agricultural policy provides tremendous opportunities for advancing the public's health. It is particularly challenging, however, for professionals to understand and consider the numerous policy drivers that impact the food system, which range from agricultural commodity policies to local food safety ordinances. Confronted with this complexity in the food system, policy advocates often focus on narrow objectives with disregard for the larger system. This commentary contends that, in order to be most effective, public health professionals need to consider the full range of interdependent policies that affect the system. Food policy councils have proven to be an effective tool, particularly at the local and state level, for developing comprehensive food systems policies that can improve public health. PMID:23144671

  12. A Survey of Challenges in Aerodynamic Exhaust Nozzle Technology for Aerospace Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Shyne, Rickey J.

    2002-01-01

    The current paper discusses aerodynamic exhaust nozzle technology challenges for aircraft and space propulsion systems. Technology advances in computational and experimental methods have led to more accurate design and analysis tools, but many major challenges continue to exist in nozzle performance, jet noise and weight reduction. New generations of aircraft and space vehicle concepts dictate that exhaust nozzles have optimum performance, low weight and acceptable noise signatures. Numerous innovative nozzle concepts have been proposed for advanced subsonic, supersonic and hypersonic vehicle configurations such as ejector, mixer-ejector, plug, single expansion ramp, altitude compensating, lobed and chevron nozzles. This paper will discuss the technology barriers that exist for exhaust nozzles as well as current research efforts in place to address the barriers.

  13. An advanced probabilistic structural analysis method for implicit performance functions

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
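    The AMV idea described above can be sketched numerically. The following is a minimal illustration, not NESSUS code: the response is linearized at the mean of standard normal inputs to obtain the mean-value (MV) percentile estimate, and the AMV correction re-evaluates the exact response once more at the most probable point of the linearized response. The response function `g` below is a hypothetical stand-in for an implicit (e.g., finite element) performance function.

```python
import math
from statistics import NormalDist

def amv_percentile(g, n, p, h=1e-6):
    """Mean-value (MV) and advanced mean value (AMV) estimates of the
    p-th percentile of Z = g(U), with U a vector of n independent
    standard normal variables."""
    mu = [0.0] * n
    g0 = g(mu)
    # forward-difference gradient of g at the mean
    grad = []
    for i in range(n):
        u = mu[:]
        u[i] += h
        grad.append((g(u) - g0) / h)
    norm = math.sqrt(sum(a * a for a in grad))
    z = NormalDist().inv_cdf(p)
    z_mv = g0 + norm * z                   # first-order (MV) percentile
    u_star = [a / norm * z for a in grad]  # MPP of the linearized response
    z_amv = g(u_star)                      # AMV: one extra exact evaluation
    return z_mv, z_amv

# hypothetical nonlinear response function standing in for an implicit model
g = lambda u: math.exp(u[0]) + 0.5 * u[1]
z_mv, z_amv = amv_percentile(g, 2, 0.95)
```

For a convex response like this one, the AMV estimate tracks the true upper percentile far more closely than the linearized MV estimate, at the cost of a single additional function evaluation, which mirrors the efficiency claim in the abstract.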

  14. Advanced Power System Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.

  15. [Clinical Application of Non-invasive Diagnostic Tests for Liver Fibrosis].

    PubMed

    Shin, Jung Woo; Park, Neung Hwa

    2016-07-25

    The diagnostic assessment of liver fibrosis is an important step in the management of patients with chronic liver diseases. Liver biopsy is considered the gold standard to assess necroinflammation and fibrosis. However, recent technical advances have introduced numerous serum biomarkers and imaging tools using elastography as noninvasive alternatives to biopsy. Serum markers can be direct or indirect markers of the fibrosis process. The elastography-based studies include transient elastography, acoustic radiation force imaging, supersonic shear wave imaging and magnetic resonance elastography. As accumulation of clinical data shows that noninvasive tests provide prognostic information of clinical relevance, non-invasive diagnostic tools have been incorporated into clinical guidelines and practice. Here, the authors review noninvasive tests for the diagnosis of liver fibrosis.

  16. Performance and Weight Estimates for an Advanced Open Rotor Engine

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Tong, Michael T.

    2012-01-01

    NASA's Environmentally Responsible Aviation Project and Subsonic Fixed Wing Project are focused on developing concepts and technologies which may enable dramatic reductions to the environmental impact of future generation subsonic aircraft. The open rotor concept (also historically referred to as an unducted fan or advanced turboprop) may allow for the achievement of this objective by reducing engine fuel consumption. To evaluate the potential impact of open rotor engines, cycle modeling and engine weight estimation capabilities have been developed. The initial development of the cycle modeling capabilities in the Numerical Propulsion System Simulation (NPSS) tool was presented in a previous paper. Following that initial development, further advancements have been made to the cycle modeling and weight estimation capabilities for open rotor engines and are presented in this paper. The developed modeling capabilities are used to predict the performance of an advanced open rotor concept using modern counter-rotating propeller designs. Finally, performance and weight estimates for this engine are presented and compared to results from a previous NASA study of advanced geared and direct-drive turbofans.

  17. Genome editing in plants: Advancing crop transformation and overview of tools.

    PubMed

    Shah, Tariq; Andleeb, Tayyaba; Lateef, Sadia; Noor, Mehmood Ali

    2018-05-07

    Genome manipulation technology is an emerging field that is bringing about a real revolution in genetic engineering and biotechnology. Targeted editing of genomes paves the way to address a wide range of goals, not only improving the quality and productivity of crops but also permitting investigation of the fundamental workings of biological systems. These goals include the creation of plants with valued compositional properties and with characters that confer resistance to numerous biotic and abiotic stresses. Numerous novel genome editing systems have been introduced during the past few years; these comprise zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and clustered regularly interspaced short palindromic repeats/Cas9 (CRISPR/Cas9). Genome editing techniques can reliably improve average yield to meet the growing food demands of the world's population and to establish a feasible and environmentally safe agricultural scheme that is more specific, productive, cost-effective, and eco-friendly. These exciting novel methods, concisely reviewed herein, have proven themselves efficient and reliable tools for the genetic improvement of plants. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  18. Basic and Advanced Numerical Performances Relate to Mathematical Expertise but Are Fully Mediated by Visuospatial Skills

    PubMed Central

    2016-01-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study, mathematicians and nonmathematicians performed a basic number line task, which required mapping positive and negative numbers on a physical horizontal line, and which has been shown to correlate with more advanced numerical abilities and mathematical achievement. We found that mathematicians were more accurate than nonmathematicians when mapping positive, but not negative, numbers, which are considered numerical primitives and cultural artifacts, respectively. Moreover, performance on positive number mapping could predict whether one is a mathematician or not, and was mediated by more advanced mathematical skills. This finding might suggest a link between basic and advanced mathematical skills. However, when we included visuospatial skills, as measured by the block design subtest, the mediation analysis revealed that the relation between performance in the number line task and group membership was explained by non-numerical visuospatial skills. These results demonstrate that the relation between basic, even specific, numerical skills and advanced mathematical achievement can be artifactual and explained by visuospatial processing. PMID:26913930

  19. Blood oxygenation level-dependent MRI for assessment of renal oxygenation

    PubMed Central

    Neugarten, Joel; Golestaneh, Ladan

    2014-01-01

    Blood oxygen level-dependent magnetic resonance imaging (BOLD MRI) has recently emerged as an important noninvasive technique to assess intrarenal oxygenation under physiologic and pathophysiologic conditions. Although this tool represents a major addition to our armamentarium of methodologies to investigate the role of hypoxia in the pathogenesis of acute kidney injury and progressive chronic kidney disease, numerous technical limitations confound interpretation of data derived from this approach. BOLD MRI has been utilized to assess intrarenal oxygenation in numerous experimental models of kidney disease and in human subjects with diabetic and nondiabetic chronic kidney disease, acute kidney injury, renal allograft rejection, contrast-associated nephropathy, and obstructive uropathy. However, confidence in conclusions based on data derived from BOLD MRI measurements will require continuing advances and technical refinements in the use of this technique. PMID:25473304

  20. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As the technology node shrinks from 14 nm to 7 nm, the reliability of tool monitoring techniques in advanced semiconductor fabs to achieve high yield and quality becomes more critical. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to detect particles, defects, and tool instability in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner on a periodic basis to ensure proper tool stability. The focus measurement on YIELDSTAR by real-time or library-based reconstruction of critical dimensions (CD) and side-wall angle (SWA) has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user processes. To further improve metrology and matching performance, Diffraction Based Focus (DBF) metrology, which enables accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring and control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared with the previous BaseLiner methodology. Matching of <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of setting a baseline reference. This baseline technique, with either the conventional BaseLiner low numerical aperture mode (NA = 1.20) or the advanced-illumination high-NA mode (NA = 1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus control and monitoring across multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposures for new product/layer best focus (BF) setup.

  1. An approach to achieve progress in spacecraft shielding

    NASA Astrophysics Data System (ADS)

    Thoma, K.; Schäfer, F.; Hiermaier, S.; Schneider, E.

    2004-01-01

    Progress in shield design against space debris can be achieved only when a combined approach based on several tools is used. This approach depends on the combined application of advanced numerical methods, specific material models and experimental determination of input parameters for these models. Examples of experimental methods for material characterization are given, covering the range from quasi static to very high strain rates for materials like Nextel and carbon fiber-reinforced materials. Mesh free numerical methods have extraordinary capabilities in the simulation of extreme material behaviour including complete failure with phase changes, combined with shock wave phenomena and the interaction with structural components. In this paper the benefits from combining numerical methods, material modelling and detailed experimental studies for shield design are demonstrated. The following examples are given: (1) Development of a material model for Nextel and Kevlar-Epoxy to enable numerical simulation of hypervelocity impacts on complex heavy protection shields for the International Space Station. (2) The influence of projectile shape on protection performance of Whipple Shields and how experimental problems in accelerating such shapes can be overcome by systematic numerical simulation. (3) The benefits of using metallic foams in "sandwich bumper shields" for spacecraft and how to approach systematic characterization of such materials.

  2. Coupling the System Analysis Module with SAS4A/SASSYS-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanning, T. H.; Hu, R.

    2016-09-30

    SAS4A/SASSYS-1 is a simulation tool used to perform deterministic analysis of anticipated events as well as design basis and beyond design basis accidents for advanced reactors, with an emphasis on sodium fast reactors. SAS4A/SASSYS-1 has been under development and in active use for nearly forty-five years, and is currently maintained by the U.S. Department of Energy under the Office of Advanced Reactor Technology. Although SAS4A/SASSYS-1 contains a very capable primary and intermediate system modeling component, PRIMAR-4, it also has some shortcomings: outdated data management and code structure make extension of the PRIMAR-4 module somewhat difficult. The user input format for PRIMAR-4 also limits the number of volumes and segments that can be used to describe a given system. The System Analysis Module (SAM) is a fairly new code development effort being carried out under the U.S. DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. SAM is being developed with advanced physical models, numerical methods, and software engineering practices; however, it is currently somewhat limited in the system components and phenomena that can be represented. For example, component models for electromagnetic pumps and multi-layer stratified volumes have not yet been developed. Nor is there support for a balance-of-plant model. Similarly, system-level phenomena such as control-rod driveline expansion and vessel elongation are not represented. This report documents fiscal year 2016 work that was carried out to couple the transient safety analysis capabilities of SAS4A/SASSYS-1 with the system modeling capabilities of SAM under the joint support of the ART and NEAMS programs. The coupling effort was successful and is demonstrated by evaluating an unprotected loss-of-flow transient for the Advanced Burner Test Reactor (ABTR) design.
There are differences between the stand-alone SAS4A/SASSYS-1 simulations and the coupled SAS/SAM simulations, but these are mainly attributed to the limited maturity of the SAM development effort. The severe accident modeling capabilities in SAS4A/SASSYS-1 (sodium boiling, fuel melting and relocation) will continue to play a vital role for a long time. Therefore, the SAS4A/SASSYS-1 modernization effort should remain a high-priority task under the ART program to ensure continued participation in domestic and international SFR safety collaborations and design optimizations. On the other hand, SAM provides an advanced system analysis tool, with improved numerical solution schemes, data management, code flexibility, and accuracy. SAM is still in the early stages of development and will require continued support from NEAMS to fulfill its potential and to mature into a production tool for advanced reactor safety analysis. The effort to couple SAS4A/SASSYS-1 and SAM is the first step toward the integration of these modeling capabilities.

  3. Advanced Image Processing of Aerial Imagery

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn; Jobson, Daniel J.; Rahman, Zia-ur; Hines, Glenn

    2006-01-01

    Aerial imagery of the Earth is an invaluable tool for the assessment of ground features, especially during times of disaster. Researchers at the NASA Langley Research Center have developed techniques which have proven to be useful for such imagery. Aerial imagery from various sources, including Langley's Boeing 757 Aries aircraft, has been studied extensively. This paper discusses these studies and demonstrates that better-than-observer imagery can be obtained even when visibility is severely compromised. A real-time, multi-spectral experimental system will be described and numerous examples will be shown.

  4. Approaches for assessing and discovering protein interactions in cancer

    PubMed Central

    Mohammed, Hisham; Carroll, Jason S.

    2013-01-01

    Significant insight into the function of proteins can be delineated by discovering and characterising interacting proteins. There are numerous methods for the discovery of unknown associated protein networks, with purification of the bait (the protein of interest) followed by mass spectrometry (MS) as a common theme. In recent years, advances have permitted the purification of endogenous proteins and methods for scaling down starting material. As such, approaches for rapid, unbiased identification of protein interactomes are becoming a standard tool in the researcher's toolbox, rather than a technique that is only available to specialists. This review will highlight some of the recent technical advances in proteomics-based discovery approaches, the pros and cons of various methods, and some of the key findings in cancer-related systems. PMID:24072816

  5. Valley Fever: Earth Observations for Risk Reduction

    NASA Astrophysics Data System (ADS)

    Sprigg, W. A.

    2012-12-01

    Advances in satellite Earth observation systems, numerical weather prediction, and dust storm modeling yield new tools for public health warnings, advisories and epidemiology of illnesses associated with airborne desert dust. Valley Fever, endemic from California through the US/Mexico border region into Central and South America, is triggered by inhalation of soil-dwelling fungal spores. The path from fungal growth to airborne threat depends on environmental conditions observable from satellite. And space-based sensors provide initial conditions for dust storm forecasts and baselines for the epidemiology of Valley Fever and other dust-borne aggravation of respiratory and cardiovascular disease. A new Pan-American Center for the World Meteorological Organization Sand and Dust Storm Warning Advisory and Assessment System creates an opportunity to advance Earth science applications in public health.

  6. Basic and advanced numerical performances relate to mathematical expertise but are fully mediated by visuospatial skills.

    PubMed

    Sella, Francesco; Sader, Elie; Lolliot, Simon; Cohen Kadosh, Roi

    2016-09-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study, mathematicians and nonmathematicians performed a basic number line task, which required mapping positive and negative numbers on a physical horizontal line, and which has been shown to correlate with more advanced numerical abilities and mathematical achievement. We found that mathematicians were more accurate than nonmathematicians when mapping positive, but not negative, numbers, which are considered numerical primitives and cultural artifacts, respectively. Moreover, performance on positive number mapping could predict whether one is a mathematician or not, and was mediated by more advanced mathematical skills. This finding might suggest a link between basic and advanced mathematical skills. However, when we included visuospatial skills, as measured by the block design subtest, the mediation analysis revealed that the relation between performance in the number line task and group membership was explained by non-numerical visuospatial skills. These results demonstrate that the relation between basic, even specific, numerical skills and advanced mathematical achievement can be artifactual and explained by visuospatial processing. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
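    The mediation logic behind this finding can be sketched with simulated data (hypothetical effect sizes; this is not the authors' analysis): if a predictor explains the outcome on its own, but its regression coefficient collapses toward zero once the mediator (here standing in for visuospatial skill) enters the model while the mediator's coefficient stays large, the effect is said to be fully mediated.

```python
import random

def ols(y, cols):
    """Ordinary least squares via normal equations (tiny, illustrative).
    cols is a list of predictor columns; returns one coefficient per column."""
    k = len(cols)
    xtx = [[sum(a * b for a, b in zip(cols[i], cols[j])) for j in range(k)]
           for i in range(k)]
    xty = [sum(a * b for a, b in zip(cols[i], y)) for i in range(k)]
    # Gauss-Jordan elimination on the augmented system [X'X | X'y]
    for i in range(k):
        piv = xtx[i][i]
        xtx[i] = [v / piv for v in xtx[i]]
        xty[i] /= piv
        for r in range(k):
            if r != i:
                f = xtx[r][i]
                xtx[r] = [v - f * w for v, w in zip(xtx[r], xtx[i])]
                xty[r] -= f * xty[i]
    return xty

random.seed(1)
n = 5000
x = [random.gauss(0, 1) for _ in range(n)]        # predictor (e.g., expertise proxy)
m = [xi + random.gauss(0, 0.5) for xi in x]       # mediator (e.g., visuospatial skill)
y = [mi + random.gauss(0, 0.5) for mi in m]       # outcome (e.g., mapping accuracy)
ones = [1.0] * n

b0_total, b_total = ols(y, [ones, x])             # total effect of x on y
b0, b_direct, b_mediator = ols(y, [ones, x, m])   # direct effect, controlling for m
```

Because the simulated outcome depends on the predictor only through the mediator, the total effect is large while the direct effect is near zero, which is the full-mediation pattern the abstract reports for visuospatial skills.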

  7. The future of EUV lithography: enabling Moore's Law in the next decade

    NASA Astrophysics Data System (ADS)

    Pirati, Alberto; van Schoot, Jan; Troost, Kars; van Ballegoij, Rob; Krabbendam, Peter; Stoeldraijer, Judon; Loopstra, Erik; Benschop, Jos; Finders, Jo; Meiling, Hans; van Setten, Eelco; Mika, Niclas; Dredonx, Jeannot; Stamm, Uwe; Kneer, Bernhard; Thuering, Bernd; Kaiser, Winfried; Heil, Tilmann; Migura, Sascha

    2017-03-01

    While EUV systems equipped with 0.33 numerical aperture lenses are readying to start volume manufacturing, ASML and Zeiss are ramping up their development activities on an EUV exposure tool with a numerical aperture greater than 0.5. The purpose of this scanner, targeting a resolution of 8 nm, is to extend Moore's law throughout the next decade. A novel anamorphic lens design has been developed to provide the required numerical aperture; this lens will be paired with new, faster stages and more accurate sensors, enabling the economic requirements of Moore's law as well as the tight focus and overlay control needed for future process nodes. The tighter focus and overlay control budgets, as well as the anamorphic optics, will drive innovations in imaging and OPC modelling, and possibly in metrology concepts. Furthermore, advances in resist and mask technology will be required to image lithography features with less than 10 nm resolution. This paper presents an overview of the key technology innovations and infrastructure requirements for the next generation of EUV systems.

  8. Numerical evaluation of longitudinal motions of Wigley hulls advancing in waves by using Bessho form translating-pulsating source Green's function

    NASA Astrophysics Data System (ADS)

    Xiao, Wenbin; Dong, Wencai

    2016-06-01

    In the framework of 3D potential flow theory, Bessho form translating-pulsating source Green's function in frequency domain is chosen as the integral kernel in this study and hybrid source-and-dipole distribution model of the boundary element method is applied to directly solve the velocity potential for advancing ship in regular waves. Numerical characteristics of the Green function show that the contribution of local-flow components to velocity potential is concentrated at the nearby source point area and the wave component dominates the magnitude of velocity potential in the far field. Two kinds of mathematical models, with or without local-flow components taken into account, are adopted to numerically calculate the longitudinal motions of Wigley hulls, which demonstrates the applicability of translating-pulsating source Green's function method for various ship forms. In addition, the mesh analysis of discrete surface is carried out from the perspective of ship-form characteristics. The study shows that the longitudinal motion results by the simplified model are somewhat greater than the experimental data in the resonant zone, and the model can be used as an effective tool to predict ship seakeeping properties. However, translating-pulsating source Green function method is only appropriate for the qualitative analysis of motion response in waves if the ship geometrical shape fails to satisfy the slender-body assumption.

  9. Yeast synthetic biology toolbox and applications for biofuel production.

    PubMed

    Tsai, Ching-Sung; Kwak, Suryang; Turner, Timothy L; Jin, Yong-Su

    2015-02-01

    Yeasts are efficient biofuel producers with numerous advantages over their bacterial counterparts. While most synthetic biology tools have been developed and customized for bacteria, especially for Escherichia coli, yeast synthetic biology tools have been exploited for improving yeast to produce fuels and chemicals from renewable biomass. Here we review the current status of synthetic biology tools and their applications for biofuel production, focusing on the model strain Saccharomyces cerevisiae. We describe assembly techniques that have been developed for constructing genes, pathways, and genomes in yeast. Moreover, we discuss synthetic parts for allowing precise control of gene expression at both transcriptional and translational levels. Applications of these synthetic biology approaches have led to identification of effective gene targets that are responsible for desirable traits, such as cellulosic sugar utilization, advanced biofuel production, and enhanced tolerance against toxic products for biofuel production from renewable biomass. Although an array of synthetic biology tools and devices is available, we observed some gaps in tool development to achieve industrial utilization. Looking forward, future tool development should focus on industrial cultivation conditions utilizing industrial strains. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permission@oup.com.

  10. SGRAPH (SeismoGRAPHer): Seismic waveform analysis and integrated tools in seismology

    NASA Astrophysics Data System (ADS)

    Abdelwahed, Mohamed F.

    2012-03-01

    Although numerous seismological programs are currently available, most of them suffer from the inability to manipulate different data formats and the lack of embedded seismological tools. SeismoGRAPHer, or simply SGRAPH, is a new system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH was intended to be a tool sufficient for performing basic waveform analysis and solving advanced seismological problems. The graphical user interface (GUI) utilities and the Windows functionalities, such as dialog boxes, menus, and toolbars, simplify the user interaction with the data. SGRAPH supports common data formats, such as SAC, SEED, GSE, ASCII, and Nanometrics Y-format, and provides the ability to solve many seismological problems with built-in inversion tools. Loaded traces are maintained, processed, plotted, and saved as SAC, ASCII, or PS (post script) file formats. SGRAPH includes Generalized Ray Theory (GRT), genetic algorithm (GA), least-square fitting, auto-picking, fast Fourier transforms (FFT), and many additional tools. This program provides rapid estimation of earthquake source parameters, location, attenuation, and focal mechanisms. Advanced waveform modeling techniques are provided for crustal structure and focal mechanism estimation. SGRAPH has been employed in the Egyptian National Seismic Network (ENSN) as a tool assisting with routine work and data analysis. More than 30 users have been using previous versions of SGRAPH in their research for more than 3 years. The main features of this application are ease of use, speed, small disk space requirements, and the absence of third-party developed components. Because of its architectural structure, SGRAPH can be interfaced with newly developed methods or applications in seismology. A complete setup file, including the SGRAPH package with the online user guide, is available.
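    As a flavor of the signal-processing utilities such packages bundle (a generic sketch, not SGRAPH's actual API), a textbook radix-2 FFT can turn a waveform trace into a one-sided amplitude spectrum:

```python
import cmath
import math

def fft(x):
    """Radix-2 Cooley-Tukey FFT; the input length must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

# synthetic "trace": an 8 Hz sinusoid sampled at 64 Hz for 1 s
fs, n = 64, 64
trace = [math.sin(2 * math.pi * 8 * i / fs) for i in range(n)]
spec = fft(trace)
amps = [abs(c) / n * 2 for c in spec[:n // 2]]  # one-sided amplitude spectrum
```

With a 1 s window the frequency resolution is 1 Hz, so the energy of the synthetic signal lands in the single bin at 8 Hz.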

  11. A multimedia perioperative record keeper for clinical research.

    PubMed

    Perrino, A C; Luther, M A; Phillips, D B; Levin, F L

    1996-05-01

    To develop a multimedia perioperative recordkeeper that provides: 1. synchronous, real-time acquisition of multimedia data, 2. on-line access to the patient's chart data, and 3. advanced data analysis capabilities through integrated multimedia database and analysis applications. To minimize cost and development time, the system design utilized industry-standard hardware components and graphical software development tools. The system was configured to use a Pentium PC complemented with a variety of hardware interfaces to external data sources. These sources included physiologic monitors with data in digital, analog, video, and audio as well as paper-based formats. The development process was guided by trials in over 80 clinical cases and by critiques from numerous users. As a result of this process, a suite of custom software applications was created to meet the design goals. The Perioperative Data Acquisition application manages data collection from a variety of physiological monitors. The Charter application provides for rapid creation of an electronic medical record from the patient's paper-based chart and investigator's notes. The Multimedia Medical Database application provides a relational database for the organization and management of multimedia data. The Triscreen application provides an integrated data analysis environment with simultaneous, full-motion data display. With recent technological advances in PC power, data acquisition hardware, and software development tools, the clinical researcher now has the ability to collect and examine a more complete perioperative record. It is hoped that the description of the MPR and its development process will assist and encourage others to advance these tools for perioperative research.

  12. Analyzing asteroid reflectance spectra with numerical tools based on scattering simulations

    NASA Astrophysics Data System (ADS)

    Penttilä, Antti; Väisänen, Timo; Markkanen, Johannes; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri

    2017-04-01

    We are developing a set of numerical tools that can be used in analyzing the reflectance spectra of granular materials such as the regolith surface of atmosphereless Solar system objects. Our goal is to be able to explain, with realistic numerical scattering models, the spectral features arising when materials are intimately mixed together. We include space-weathering-type effects in our simulations, i.e., mixing the host mineral locally with small inclusions of another material in small proportions. Our motivation for this study comes from the present lack of such tools. The current common practice is to apply a semi-physical approximate model such as some variation of the Hapke models [e.g., 1] or the Shkuratov model [2]. These models are expressed in a closed form, so they are relatively fast to apply. They are based on simplifications of radiative transfer theory. The problem is that the validity of the model is not always guaranteed, and the derived physical properties related to particle scattering can be unrealistic [3]. We base our numerical tool on a chain of scattering simulations. Scattering properties of small inclusions inside an absorbing host matrix can be derived using exact methods solving the Maxwell equations of the system. The next step, scattering by a single regolith grain, is solved using a geometric optics method accounting for surface reflections, internal absorption, and possibly internal diffuse scattering. The third step involves radiative transfer simulations of these regolith grains in a macroscopic planar element. The chain can then be continued with a shadowing simulation over the target surface elements, and finally by integrating the bidirectional reflectance distribution function over the object's shape. Most of the tools in the proposed chain already exist, and one practical task for us is to tie these together into an easy-to-use toolchain that can be publicly distributed.
We plan to open the above-mentioned toolchain as a web-based open service. Acknowledgments: The research is funded by ERC Advanced Grant No. 320773 (SAEMPL). References: [1] B. Hapke, Icarus 195, 918-926, 2008. [2] Yu. Shkuratov et al., Icarus 137, 235-246, 1999. [3] Yu. Shkuratov et al., JQSRT 113, 2431-2456, 2012. [4] K. Muinonen et al., JQSRT 110, 1628-1639, 2009.
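As a deliberately simplified stand-in for the radiative-transfer step in such a chain (not the authors' toolchain), the sketch below estimates the reflectance of a semi-infinite, isotropically scattering medium with a photon random walk; all parameter values are illustrative assumptions.

```python
import math
import random

def slab_reflectance(albedo, n_photons=5000, seed=1):
    """Fraction of photons that escape back out of a semi-infinite,
    isotropically scattering half-space (simple Monte Carlo random walk)."""
    rng = random.Random(seed)
    reflected = 0
    for _ in range(n_photons):
        # mu: direction cosine (positive = deeper), tau: optical depth below surface
        mu, tau = 1.0, 0.0
        while True:
            # Exponentially distributed free path to the next interaction
            tau += -math.log(1.0 - rng.random()) * mu
            if tau < 0.0:
                reflected += 1   # crossed the surface: photon escaped upward
                break
            if rng.random() < albedo:
                mu = 2.0 * rng.random() - 1.0   # isotropic re-scattering
            else:
                break            # absorbed inside the medium
    return reflected / n_photons
```

As expected physically, the estimated reflectance rises with the single-scattering albedo, which is the kind of qualitative behavior semi-analytic models such as Hapke's approximate in closed form.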

  13. Towards effective interactive three-dimensional colour postprocessing

    NASA Technical Reports Server (NTRS)

    Bailey, B. C.; Hajjar, J. F.; Abel, J. F.

    1986-01-01

    Recommendations for the development of effective three-dimensional, graphical color postprocessing are made. First, the evaluation of large, complex numerical models demands that a postprocessor be highly interactive. A menu of available functions should be provided, and these operations should be performed quickly so that a sense of continuity and spontaneity exists during the postprocessing session. Second, an agenda for three-dimensional color postprocessing is proposed. A postprocessor must be versatile with respect to application, and its basic algorithms must be designed to be flexible. A complete selection of tools is necessary to allow arbitrary specification of views, extraction of qualitative information, and access to detailed quantitative and problem information. Finally, full use of advanced display hardware is necessary if interactivity is to be maximized and effective postprocessing of today's numerical simulations is to be achieved.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Hong; Liu, Jian; Xiao, Jianyuan

    Particle-in-cell (PIC) simulation is the most important numerical tool in plasma physics. However, its long-term accuracy has not been established. To overcome this difficulty, we developed a canonical symplectic PIC method for the Vlasov-Maxwell system by discretising its canonical Poisson bracket. A fast local algorithm to solve the symplectic implicit time advance is discovered without root searching or global matrix inversion, enabling applications of the proposed method to very large-scale plasma simulations with many, e.g. 10^9, degrees of freedom. The long-term accuracy and fidelity of the algorithm enable us to numerically confirm Mouhot and Villani's theory and conjecture on nonlinear Landau damping over several orders of magnitude using the PIC method, and to calculate the nonlinear evolution of the reflectivity during the mode conversion process from extraordinary waves to Bernstein waves.
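The long-term fidelity that symplectic discretisation buys can be illustrated on the simplest Hamiltonian system. The sketch below is not the paper's Vlasov-Maxwell algorithm; it merely contrasts a symplectic leapfrog (kick-drift-kick) integrator with explicit Euler on a harmonic oscillator, where only the symplectic scheme keeps the energy bounded over many periods.

```python
def energy(q, p):
    # Harmonic-oscillator Hamiltonian H = p^2/2 + q^2/2
    return 0.5 * (p * p + q * q)

def euler(q, p, h, n):
    # Explicit Euler: energy grows by a factor (1 + h^2) every step
    for _ in range(n):
        q, p = q + h * p, p - h * q
    return q, p

def leapfrog(q, p, h, n):
    # Symplectic kick-drift-kick update: preserves phase-space structure,
    # so the energy error stays bounded for all time
    for _ in range(n):
        p -= 0.5 * h * q
        q += h * p
        p -= 0.5 * h * q
    return q, p

h, n = 0.1, 10000          # 10,000 steps, many oscillation periods
qe, pe = euler(1.0, 0.0, h, n)
ql, pl = leapfrog(1.0, 0.0, h, n)
drift_euler = abs(energy(qe, pe) - 0.5)      # enormous secular drift
drift_leapfrog = abs(energy(ql, pl) - 0.5)   # stays at the O(h^2) level
```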

  15. The Langley Stability and Transition Analysis Code (LASTRAC) : LST, Linear and Nonlinear PSE for 2-D, Axisymmetric, and Infinite Swept Wing Boundary Layers

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2003-01-01

    During the past two decades, our understanding of laminar-turbulent transition flow physics has advanced significantly, owing in large part to NASA program support such as the National Aerospace Plane (NASP), High-Speed Civil Transport (HSCT), and Advanced Subsonic Technology (AST) programs. Experimental, theoretical, and computational efforts on various issues, such as receptivity and the linear and nonlinear evolution of instability waves, have broadened our knowledge base for this intricate flow phenomenon. Despite all these advances, transition prediction remains a nontrivial task for engineers due to the lack of a widely available, robust, and efficient prediction tool. The design and development of the LASTRAC code is aimed at providing one such engineering tool that is easy to use and yet capable of dealing with a broad range of transition-related issues. LASTRAC was written from scratch based on state-of-the-art numerical methods for stability analysis and modern software technologies. At low fidelity, it allows users to perform linear stability analysis and N-factor transition correlation for a broad range of flow regimes and configurations by using either the linear stability theory (LST) or the linear parabolized stability equations (LPSE) method. At high fidelity, users may use nonlinear PSE to track finite-amplitude disturbances until the rise in skin friction. Coupled with the built-in receptivity model that is currently under development, the nonlinear PSE method offers a synergistic approach to predicting transition onset for a given disturbance environment based on first principles. This paper describes the governing equations, numerical methods, code development, and case studies for the current release of LASTRAC.
Practical applications of LASTRAC are demonstrated for linear stability calculations, N-factor transition correlation, non-linear breakdown simulations, and controls of stationary crossflow instability in supersonic swept wing boundary layers.

  16. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large, specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should provide a tool for designing better aerospace vehicles while reducing development costs, both by performing computations using Navier-Stokes solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers also needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  17. Micromagnetic computer simulations of spin waves in nanometre-scale patterned magnetic elements

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Koog

    2010-07-01

    Current needs for further advances in the nanotechnologies of information-storage and -processing devices have attracted a great deal of interest in spin (magnetization) dynamics in nanometre-scale patterned magnetic elements. For instance, the unique dynamic characteristics of non-uniform magnetic microstructures such as various types of domain walls, magnetic vortices and antivortices, as well as spin wave dynamics in laterally restricted thin-film geometries, have been at the centre of extensive and intensive research. Understanding the fundamentals of their unique spin structures as well as their robust and novel dynamic properties allows us to implement new functionalities into existing or future devices. Although experimental tools and theoretical approaches are effective means of understanding the fundamentals of spin dynamics and of gaining new insights into them, the limitations of those same tools and approaches have left gaps of unresolved questions in the pertinent physics. As an alternative, micromagnetic modelling and numerical simulation have recently emerged as a powerful tool for the study of a variety of phenomena related to the spin dynamics of nanometre-scale magnetic elements. In this review paper, I summarize recent results of simulations of the excitation, propagation, and other novel wave characteristics of spin waves, highlighting how the micromagnetic computer simulation approach contributes to an understanding of the spin dynamics of nanomagnetism, and considering some of the merits of numerical simulation studies. Many examples of micromagnetic modelling for numerical calculations, employing various dimensions and shapes of patterned magnetic elements, are given. The current limitations of continuum micromagnetic modelling and of simulations based on the Landau-Lifshitz-Gilbert equation of motion of magnetization are also discussed, along with further research directions for spin-wave studies.
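As a toy illustration of the Landau-Lifshitz-Gilbert dynamics that such simulations integrate, the sketch below relaxes a single macrospin toward an applied field using the explicit (Landau-Lifshitz) form of the equation. It is a hypothetical single-spin example with assumed unit parameters, not a micromagnetic simulation of a patterned element.

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def llg_relax(m, H, alpha=0.1, gamma=1.0, dt=0.01, steps=10000):
    """Integrate the explicit Landau-Lifshitz-Gilbert form
    dm/dt = -gamma/(1+alpha^2) * (m x H + alpha * m x (m x H))
    with forward Euler steps plus renormalisation of |m| = 1."""
    pref = -gamma / (1.0 + alpha * alpha)
    for _ in range(steps):
        mxH = cross(m, H)
        mxmxH = cross(m, mxH)
        dm = tuple(pref * (mxH[i] + alpha * mxmxH[i]) for i in range(3))
        m = tuple(m[i] + dt * dm[i] for i in range(3))
        norm = math.sqrt(sum(c * c for c in m))
        m = tuple(c / norm for c in m)   # keep the magnetisation unit-length
    return m

# Spin starts along +x; damping relaxes it toward the field along +z
m_final = llg_relax((1.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

The spin precesses about the field while the damping term (weighted by alpha) spirals it into alignment, the basic mechanism underlying spin-wave relaxation in full micromagnetic codes.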

  18. Steps Towards Understanding Large-scale Deformation of Gas Hydrate-bearing Sediments

    NASA Astrophysics Data System (ADS)

    Gupta, S.; Deusner, C.; Haeckel, M.; Kossel, E.

    2016-12-01

    Marine sediments bearing gas hydrates are typically characterized by heterogeneity in the gas hydrate distribution and anisotropy in the sediment-gas hydrate fabric properties. Gas hydrates also contribute to the strength and stiffness of the marine sediment, and any disturbance in the thermodynamic stability of the gas hydrates is likely to affect the geomechanical stability of the sediment. Understanding the mechanisms and triggers of large-strain deformation and failure of marine gas hydrate-bearing sediments is an area of extensive research, particularly in the context of marine slope stability and industrial gas production. The ultimate objective is to predict severe deformation events, such as regional-scale slope failure or excessive sand production, by using numerical simulation tools. The development of such tools requires a careful analysis of the thermo-hydro-chemo-mechanical behavior of gas hydrate-bearing sediments at lab scale, and its stepwise integration into reservoir-scale simulators through the definition of effective variables, the use of suitable constitutive relations, and the application of scaling laws. One focus area of our research is to understand the bulk coupled behavior of marine gas hydrate systems, with contributions from micro-scale characteristics, transport-reaction dynamics, and structural heterogeneity, through experimental flow-through studies using high-pressure triaxial test systems and advanced tomographic tools (CT, ERT, MRI). We combine these studies to develop mathematical models and numerical simulation tools that can be used to predict the coupled hydro-geomechanical behavior of marine gas hydrate reservoirs in a large-strain framework. Here we will present some of our recent results from closely coordinated experimental and numerical simulation studies, with the objective of capturing the large-deformation behavior relevant to different gas production scenarios.
We will also report on a variety of mechanically relevant test scenarios focusing on effects of dynamic changes in gas hydrate saturation, highly uneven gas hydrate distributions, focused fluid migration and gas hydrate production through depressurization and CO2 injection.

  19. Inlet Flow Control and Prediction Technologies for Embedded Propulsion Systems

    NASA Technical Reports Server (NTRS)

    McMillan, Michelle L.; Mackie, Scott A.; Gissen, Abe; Vukasinovic, Bojan; Lakebrink, Matthew T.; Glezer, Ari; Mani, Mori; Mace, James L.

    2011-01-01

    Fail-safe, hybrid flow control (HFC) is a promising technology for meeting high-speed cruise efficiency, low-noise signature, and reduced fuel-burn goals for future Hybrid-Wing-Body (HWB) aircraft with embedded engines. This report details the development of HFC technology that enables improved inlet performance in HWB vehicles with highly integrated inlets and embedded engines without adversely affecting vehicle performance. In addition, new test techniques for evaluating Boundary-Layer-Ingesting (BLI)-inlet flow-control technologies developed and demonstrated through this program are documented, including the ability to generate a BLI-like inlet-entrance flow in a direct-connect wind-tunnel facility, as well as the use of D-optimal, statistically designed experiments to optimize test efficiency and enable interpretation of results. Validated improvements in numerical analysis tools and methods accomplished through this program are also documented, including Reynolds-Averaged Navier-Stokes CFD simulations of steady-state flow physics for baseline BLI-inlet diffuser flow, as well as that created by flow-control devices. Finally, numerical methods were employed in a ground-breaking attempt to directly simulate dynamic distortion. The advances in inlet technologies and prediction tools will help to meet and exceed "N+2" project goals for future HWB aircraft.

  20. Investigating the Potential Impacts of Energy Production in the Marcellus Shale Region Using the Shale Network Database and CUAHSI-Supported Data Tools

    NASA Astrophysics Data System (ADS)

    Brazil, L.

    2017-12-01

    The Shale Network's extensive database of water quality observations enables educational experiences about the potential impacts of resource extraction with real data. Through open source tools that are developed and maintained by the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI), researchers, educators, and citizens can access and analyze the very same data that the Shale Network team has used in peer-reviewed publications about the potential impacts of hydraulic fracturing on water. The development of the Shale Network database has been made possible through collection efforts led by an academic team and involving numerous individuals from government agencies, citizen science organizations, and private industry. Thus far, CUAHSI-supported data tools have been used to engage high school students, university undergraduate and graduate students, as well as citizens so that all can discover how energy production impacts the Marcellus Shale region, which includes Pennsylvania and other nearby states. This presentation will describe these data tools, how the Shale Network has used them in developing educational material, and the resources available to learn more.

  1. A group decision-making tool for the application of membrane technologies in different water reuse scenarios.

    PubMed

    Sadr, S M K; Saroj, D P; Kouchaki, S; Ilemobade, A A; Ouki, S K

    2015-06-01

    A global challenge of increasing concern is diminishing fresh water resources. A growing practice in many communities to supplement diminishing fresh water availability has been the reuse of water. Novel methods of treating polluted waters, such as membrane-assisted technologies, have recently been developed and successfully implemented in many places. Given the diversity of membrane-assisted technologies available, the current challenge is how to select a reliable alternative among numerous technologies for appropriate water reuse. In this research, a fuzzy-logic-based, multi-criteria group decision-making tool has been developed. This tool has been employed in the selection of appropriate membrane treatment technologies for several non-potable and potable reuse scenarios. Robust criteria, covering technical, environmental, economic and socio-cultural aspects, were selected, and 10 different membrane-assisted technologies were assessed in the tool. The results show that this approach is capable of facilitating systematic and rigorous analysis in the comparison and selection of membrane-assisted technologies for advanced wastewater treatment and reuse. Copyright © 2015 Elsevier Ltd. All rights reserved.
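The core of such a fuzzy multi-criteria group scoring scheme can be sketched in a few lines: experts rate each alternative on each criterion with triangular fuzzy numbers, the ratings are averaged across the group, defuzzified by the centroid, and combined with criteria weights. The alternative names, weights, and ratings below are hypothetical illustrations, not the paper's data or its exact method.

```python
def avg_tfn(tfns):
    # Average triangular fuzzy numbers (l, m, u) across the expert group
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))

def defuzzify(t):
    # Centroid of a triangular fuzzy number (l + m + u) / 3
    return sum(t) / 3.0

def rank(ratings, weights):
    # ratings[alternative][criterion] = list of expert TFNs on a 0-10 scale
    scores = {}
    for alt, crit in ratings.items():
        scores[alt] = sum(weights[c] * defuzzify(avg_tfn(ts))
                          for c, ts in crit.items())
    return scores

weights = {"technical": 0.4, "economic": 0.3, "environmental": 0.3}
ratings = {
    "MBR": {"technical":     [(7, 8, 9), (6, 8, 9)],
            "economic":      [(5, 6, 7), (6, 7, 8)],
            "environmental": [(7, 8, 9), (8, 9, 10)]},
    "NF":  {"technical":     [(5, 6, 7), (6, 7, 8)],
            "economic":      [(6, 7, 8), (7, 8, 9)],
            "environmental": [(4, 5, 6), (5, 6, 7)]},
}
scores = rank(ratings, weights)   # higher score = preferred alternative
```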

  2. Therapeutic application of RNAi: is mRNA targeting finally ready for prime time?

    PubMed Central

    Grimm, Dirk; Kay, Mark A.

    2007-01-01

    With unprecedented speed, RNA interference (RNAi) has advanced from its basic discovery in lower organisms to becoming a powerful genetic tool and perhaps our single most promising biotherapeutic for a wide array of diseases. Numerous studies document RNAi efficacy in laboratory animals, and the first clinical trials are underway and thus far suggest that RNAi is safe to use in humans. Yet substantial hurdles have also surfaced and must be surmounted before therapeutic RNAi applications can become a standard therapy. Here we review the most critical roadblocks and concerns for clinical RNAi transition, delivery, and safety. We highlight emerging solutions and concurrently discuss novel therapeutic RNAi-based concepts. The current rapid advances create realistic optimism that the establishment of RNAi as a new and potent clinical modality in humans is near. PMID:18060021

  3. CFD Analysis in Advance of the NASA Juncture Flow Experiment

    NASA Technical Reports Server (NTRS)

    Lee, H. C.; Pulliam, T. H.; Neuhart, D. H.; Kegerise, M. A.

    2017-01-01

    NASA, through its Transformational Tools and Technologies Project (TTT) under the Advanced Air Vehicle Program, is supporting a substantial effort to investigate the formation and origin of separation bubbles found on wing-body juncture zones. The flow behavior in these regions is highly complex, difficult to measure experimentally, and challenging to model numerically. Multiple wing configurations were designed and evaluated using Computational Fluid Dynamics (CFD), and a series of wind tunnel risk reduction tests were performed to further down-select the candidates for the final experiment. This paper documents the CFD analysis done in conjunction with the 6 percent scale risk reduction experiment performed in NASA Langley's 14- by 22-Foot Subsonic Tunnel. The combined CFD and wind tunnel results ultimately helped the Juncture Flow committee select the wing configurations for the final experiment.

  4. Physics the Google Way

    NASA Astrophysics Data System (ADS)

    Ward, David W.

    2005-09-01

    Are we smarter now than Socrates was in his time? Society as a whole certainly enjoys a higher degree of education, but humans as a species probably don't get intrinsically smarter with time. Our knowledge base, however, continues to grow at an unprecedented rate, so how then do we keep up? The printing press was one of the earliest technological advances that expanded our memory and made possible our present intellectual capacity. We are now faced with a new technological advance of the same magnitude, the Internet, but how do we use it effectively? A new tool is available on Google™ (http://www.google.com) that allows a user not only to numerically evaluate equations but also to automatically perform unit analysis and conversion, with most of the fundamental physical constants built in. This paper describes some of its capabilities.
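The unit analysis such a calculator performs can be mimicked with a minimal value-with-dimensions type that tracks exponents of base units through arithmetic. The sketch below is an illustrative assumption about how dimensional bookkeeping works in general, not Google's implementation.

```python
class Quantity:
    """Minimal value-with-dimensions type for unit analysis (illustrative)."""
    def __init__(self, value, dims):
        self.value = value
        self.dims = dict(dims)   # e.g. {"m": 1, "s": -1} for metres/second

    def _merge(self, other, sign):
        # Add (sign=+1) or subtract (sign=-1) the other quantity's exponents
        dims = dict(self.dims)
        for k, v in other.dims.items():
            dims[k] = dims.get(k, 0) + sign * v
        return {k: v for k, v in dims.items() if v}

    def __mul__(self, other):
        return Quantity(self.value * other.value, self._merge(other, +1))

    def __truediv__(self, other):
        return Quantity(self.value / other.value, self._merge(other, -1))

    def __add__(self, other):
        if self.dims != other.dims:
            raise ValueError("incompatible dimensions")
        return Quantity(self.value + other.value, self.dims)

    def to(self, unit):
        # Express this quantity as a multiple of `unit` (dimensions must match)
        if self.dims != unit.dims:
            raise ValueError("incompatible dimensions")
        return self.value / unit.value

def scalar(x):
    return Quantity(x, {})

m = Quantity(1.0, {"m": 1})
s = Quantity(1.0, {"s": 1})
mile = scalar(1609.344) * m
hour = scalar(3600.0) * s
speed = (scalar(60.0) * mile) / hour   # "60 miles per hour in m/s"
```

Adding quantities with mismatched dimensions raises an error, which is exactly the kind of sanity check automatic unit analysis provides.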

  5. Advanced optical manufacturing and testing; Proceedings of the Meeting, San Diego, CA, July 9-11, 1990

    NASA Astrophysics Data System (ADS)

    Sanger, Gregory M.; Reid, Paul B.; Baker, Lionel R.

    1990-11-01

    Consideration is given to advanced optical fabrication, profilometry and thin films, and metrology. Particular attention is given to automation for optics manufacturing, 3D contouring on a numerically controlled grinder, laser-scanning lens configurations, a noncontact precision measurement system, novel noncontact profiler design for measuring synchrotron radiation mirrors, laser-diode technologies for in-process metrology, measurements of X-ray reflectivities of Au-coatings at several energies, platinum coating of an X-ray mirror for SR lithography, a Hilbert transform algorithm for fringe-pattern analysis, structural error sources during fabrication of the AXAF optical elements, an in-process mirror figure qualification procedure for large deformable mirrors, interferometric evaluation of lenslet arrays for 2D phase-locked laser diode sources, and manufacturing and metrology tooling for the solar-A soft X-ray telescope.

  6. A Concept for the Inclusion of Analytical and Computational Capability in Existing Systems for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Clinton; Cooper, Anita E.; Powers, W. T.

    2005-01-01

    For approximately two decades, efforts have been sponsored by NASA's Marshall Space Flight Center to make possible high-speed, automated classification and quantification of constituent materials in various harsh environments. MSFC, along with the Air Force/Arnold Engineering Development Center, has led the work, developing and implementing systems that employ principles of emission and absorption spectroscopy to monitor molecular and atomic particulates in the gas plasma of rocket engine flow fields. One such system identifies species and quantifies mass loss rates in H2/O2 rocket plumes. Other gases have been examined, and the physics of their detection under numerous conditions was made a part of the knowledge base for the MSFC/USAF team. Additionally, efforts are underway to hardware-encode components of the data analysis tools in order to address real-time operational requirements for health monitoring and management. NASA has a significant investment in these systems, warranting a spiral approach that meshes current tools and experience with technological advancements. This paper addresses current systems - the Optical Plume Anomaly Detector (OPAD) and the Engine Diagnostic Filtering System (EDIFIS) - and discusses what is considered a natural progression: a concept for migrating them towards detection of high-energy particles, including neutrons and gamma rays. The proposal outlines system development to date, basic concepts for future advancements, and recommendations for accomplishing them.

  7. On constraining pilot point calibration with regularization in PEST

    USGS Publications Warehouse

    Fienen, M.N.; Muffels, C.T.; Hunt, R.J.

    2009-01-01

    Ground water model calibration has made great advances in recent years, with practical tools such as PEST being instrumental in making the latest techniques available to practitioners. As models and calibration tools become more sophisticated, however, the power of these tools can be misapplied, resulting in poor parameter estimates and/or nonoptimally calibrated models that do not suit their intended purpose. Here, we focus on an increasingly common technique for calibrating highly parameterized numerical models - pilot point parameterization with Tikhonov regularization. Pilot points are a popular method for spatially parameterizing complex hydrogeologic systems; however, the additional flexibility offered by pilot points can become problematic if not constrained by Tikhonov regularization. The objective of this work is to explain and illustrate the specific roles played by control variables in the PEST software for Tikhonov regularization applied to pilot points. A recent study encountered difficulties implementing this approach, but through examination of that analysis, insight into the underlying sources of potential misapplication can be gained, and some guidelines for overcoming them can be developed. © 2009 National Ground Water Association.
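The stabilizing role of Tikhonov regularization for over-flexible parameterizations can be seen in a two-parameter toy problem: with nearly collinear sensitivities, the unregularized least-squares solution oscillates wildly, while a small regularization weight pulls the parameters toward smooth, moderate values. This is a generic sketch of the mathematics, unrelated to PEST's actual implementation or control variables.

```python
def solve2(M, b):
    # Solve a 2x2 linear system by Cramer's rule
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    x0 = (b[0] * M[1][1] - M[0][1] * b[1]) / det
    x1 = (M[0][0] * b[1] - b[0] * M[1][0]) / det
    return [x0, x1]

def tikhonov(A, b, lam):
    # Minimise ||Ax - b||^2 + lam^2 ||x||^2 via the normal equations
    # (A^T A + lam^2 I) x = A^T b, for a two-parameter problem
    AtA = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(2)]
           for i in range(2)]
    Atb = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(2)]
    for i in range(2):
        AtA[i][i] += lam * lam
    return solve2(AtA, Atb)

# Nearly collinear columns: an ill-posed two-parameter estimation problem
A = [[1.0, 1.0], [1.0, 1.001]]
b = [1.001, 1.0]                 # slightly "noisy" observations
x_free = tikhonov(A, b, 0.0)     # unregularised: large, oscillating values
x_reg  = tikhonov(A, b, 0.1)     # Tikhonov-constrained: smooth and moderate
```

Both solutions fit the data almost equally well, but only the regularized one is geologically plausible in spirit: that trade-off between fit and plausibility is what the regularization weight controls.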

  8. Object oriented studies into artificial space debris

    NASA Technical Reports Server (NTRS)

    Adamson, J. M.; Marshall, G.

    1988-01-01

    A prototype simulation is being developed under contract to the Royal Aerospace Establishment (RAE), Farnborough, England, to assist in the discrimination of artificial space objects/debris. The methodology undertaken has been to link Object-Oriented programming, intelligent knowledge-based system (IKBS) techniques and advanced computer technology with numerical analysis to provide a graphical, symbolic simulation. The objective is to provide an additional layer of understanding on top of conventional classification methods. Use is being made of object- and rule-based knowledge representation, multiple reasoning, truth maintenance and uncertainty. Software tools being used include the Knowledge Engineering Environment (KEE) and SymTactics for knowledge representation. Hooks are being developed within the SymTactics framework to incorporate mathematical models describing orbital motion and fragmentation. Penetration and structural analysis can also be incorporated. SymTactics is an Object-Oriented discrete event simulation tool built as a domain-specific extension to the KEE environment. The tool provides facilities for building, debugging and monitoring dynamic (military) simulations.

  9. Prostate cancer diagnostics: Clinical challenges and the ongoing need for disruptive and effective diagnostic tools.

    PubMed

    Sharma, Shikha; Zapatero-Rodríguez, Julia; O'Kennedy, Richard

    The increased incidence and the significant health burden associated with carcinoma of the prostate have led to substantial changes in its diagnosis over the past century. Despite technological advancements, the management of prostate cancer has become progressively more complex and controversial for both early and late-stage disease. The limitations and potential harms associated with the use of prostate-specific antigen (PSA) as a diagnostic marker have stimulated significant investigation of numerous novel biomarkers that demonstrate varying capacities to detect prostate cancer and can decrease unnecessary biopsies. However, only a few of these markers have been approved for specific clinical settings, while the others have not been adequately validated for use. This review systematically and critically assesses ongoing issues and emerging challenges in the current state of prostate cancer diagnostics, and argues the need for disruptive, next-generation tools based on analysis of combinations of these biomarkers to enhance predictive accuracy, to the benefit of clinical diagnostics and patient welfare. Copyright © 2016. Published by Elsevier Inc.

  10. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.

  11. Metabolic engineering of yeast for lignocellulosic biofuel production.

    PubMed

    Jin, Yong-Su; Cate, Jamie Hd

    2017-12-01

    Production of biofuels from lignocellulosic biomass remains an unsolved challenge in industrial biotechnology. Efforts to use yeast for conversion face the question of which host organism to use, counterbalancing the ease of genetic manipulation with the promise of robust industrial phenotypes. Saccharomyces cerevisiae remains the premier host for metabolic engineering of biofuel pathways, due to its many genetic, systems and synthetic biology tools. Numerous engineering strategies for expanding substrate ranges and diversifying products of S. cerevisiae have been developed. Other yeasts generally lack these tools, yet harbor superior phenotypes that could be exploited in the harsh processes required for lignocellulosic biofuel production. These include thermotolerance, resistance to toxic compounds generated during plant biomass deconstruction, and wider carbon consumption capabilities. Although promising, these yeasts have yet to be widely exploited. By contrast, oleaginous yeasts such as Yarrowia lipolytica capable of producing high titers of lipids are rapidly advancing in terms of the tools available for their metabolic manipulation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. 3rd congress on applied synthetic biology in Europe (Costa da Caparica, Portugal, February 2016).

    PubMed

    Cueva, Miguel

    2017-03-25

    The third meeting organised by the European Federation of Biotechnology (EFB) on advances in Applied Synthetic Biotechnology in Europe (ASBE) was held in Costa da Caparica, Portugal, in February 2016. Abundant novel applications in synthetic biology were described in the six sessions of the meeting, which was divided into technology and tools for synthetic biology (I, II and III), bionanoscience, biosynthetic pathways and enzyme synthetic biology, and metabolic engineering and chemical manufacturing. The meeting presented numerous methods for the development of novel synthetic strains, synthetic biological tools and synthetic biology applications. With the aid of synthetic biology, production costs of chemicals, metabolites and food products are expected to decrease, by generating sustainable biochemical production of such resources. Also, such synthetic biological advances could be applied for medical purposes, as in pharmaceuticals and for biosensors. Recurrent, linked themes throughout the meeting were the shortage of resources, the world's transition into a bioeconomy, and how synthetic biology is helping tackle these issues through cutting-edge technologies. While there are still limitations in synthetic biology research, innovation is propelling the development of technology, the standardisation of synthetic biological tools and the use of suitable host organisms. These developments are laying a foundation to providing a future where cutting-edge research could generate potential solutions to society's pressing issues, thus incentivising a transition into a bioeconomy. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Prompt and Precise Prototyping

    NASA Technical Reports Server (NTRS)

    2003-01-01

    For Sanders Design International, Inc., of Wilton, New Hampshire, every passing second between the concept and realization of a product is essential to success in the rapid prototyping industry, where, amid heavy competition, faster time-to-market means more business. To separate itself from its rivals, Sanders Design aligned with NASA's Marshall Space Flight Center to develop what it considers to be the most accurate rapid prototyping machine for fabrication of extremely precise tooling prototypes. The company's Rapid ToolMaker System has revolutionized production of high-quality, small-to-medium-sized prototype patterns and tooling molds with an exactness that surpasses that of computer numerically controlled (CNC) machining devices. Created with funding and support from Marshall under a Small Business Innovation Research (SBIR) contract, the Rapid ToolMaker is a dual-use technology with applications in both commercial and military aerospace fields. The advanced technology provides cost savings in the design and manufacturing of automotive, electronic, and medical parts, as well as in other areas of consumer interest, such as jewelry and toys. For aerospace applications, the Rapid ToolMaker enables fabrication of high-quality turbine and compressor blades for jet engines on unmanned air vehicles, aircraft, and missiles.

  14. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.

  15. Evaluation of Proteus as a Tool for the Rapid Development of Models of Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Farthing, M. W.; Kees, C. E.; Miller, C. T.

    2013-12-01

    Models of modern hydrologic systems can be complex and involve a variety of operators of varying character. The goal is to implement approximations of such models that are both efficient for the developer to produce and computationally efficient to run, a set of naturally competing objectives. Proteus is a Python-based toolbox that supports prototyping of model formulations as well as a wide variety of modern numerical methods and parallel computing. We used Proteus to develop numerical approximations for three models: Richards' equation, a brine flow model derived using the Thermodynamically Constrained Averaging Theory (TCAT), and a multiphase TCAT-based tumor growth model. For Richards' equation, we investigated discontinuous Galerkin solutions with higher-order time integration based on the backward difference formulas. The TCAT brine flow model was implemented using Proteus, and a variety of numerical methods were compared to hand-coded solutions. Finally, an existing tumor growth model was implemented in Proteus to introduce more advanced numerics and allow the code to be run in parallel. From these three example models, Proteus was found to be an attractive open-source option for rapidly developing high-quality code for solving existing and evolving computational science models.
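    The backward difference formulas mentioned above are attractive precisely because the discretized equations are stiff. A minimal sketch (not from Proteus; the test problem and all names here are invented for illustration) of BDF2 applied to a stiff linear test equation:

    ```python
    def bdf2_linear(lam, y0, h, n_steps):
        """Integrate y' = lam*y with BDF2; the first step uses backward Euler."""
        ys = [y0]
        ys.append(y0 / (1.0 - h * lam))          # bootstrap step: backward Euler
        for _ in range(n_steps - 1):
            # BDF2: y_{n+1} = (4/3) y_n - (1/3) y_{n-1} + (2/3) h lam y_{n+1}
            y_next = (4.0 * ys[-1] - ys[-2]) / (3.0 - 2.0 * h * lam)
            ys.append(y_next)
        return ys

    # Very stiff decay: with h = 0.1 and lam = -1000, explicit Euler would
    # blow up (|1 + h*lam| = 99), while BDF2 stays stable and decays to zero.
    ys = bdf2_linear(lam=-1000.0, y0=1.0, h=0.1, n_steps=100)
    print(abs(ys[-1]))
    ```

    The same unconditional stability is what lets an implicit BDF discretization of Richards' equation take time steps set by accuracy rather than by the fastest local dynamics.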

  16. Machine learning research 1989-90

    NASA Technical Reports Server (NTRS)

    Porter, Bruce W.; Souther, Arthur

    1990-01-01

    Multifunctional knowledge bases offer a significant advance in artificial intelligence because they can support numerous expert tasks within a domain. As a result they amortize the costs of building a knowledge base over multiple expert systems and they reduce the brittleness of each system. Due to the inevitable size and complexity of multifunctional knowledge bases, their construction and maintenance require knowledge engineering and acquisition tools that can automatically identify interactions between new and existing knowledge. Furthermore, their use requires software for accessing those portions of the knowledge base that coherently answer questions. Considerable progress was made in developing software for building and accessing multifunctional knowledge bases. A language was developed for representing knowledge, along with software tools for editing and displaying knowledge, a machine learning program for integrating new information into existing knowledge, and a question answering system for accessing the knowledge base.

  17. Free Radical Addition Polymerization Kinetics without Steady-State Approximations: A Numerical Analysis for the Polymer, Physical, or Advanced Organic Chemistry Course

    ERIC Educational Resources Information Center

    Iler, H. Darrell; Brown, Amber; Landis, Amanda; Schimke, Greg; Peters, George

    2014-01-01

    A numerical analysis of the free radical addition polymerization system is described that provides those teaching polymer, physical, or advanced organic chemistry courses the opportunity to introduce students to numerical methods in the context of a simple but mathematically stiff chemical kinetic system. Numerical analysis can lead students to an…
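    As a rough illustration of why this kinetic system is stiff, the classic scheme (initiator decomposition, radical balance, monomer consumption) can be advanced with an implicit (backward) Euler step even when the step size far exceeds the radical lifetime; each species' implicit update happens to be solvable in closed form here. The rate constants below are order-of-magnitude guesses typical for free-radical polymerization, not values from the article:

    ```python
    import math

    # Hypothetical rate constants (illustrative orders of magnitude only)
    kd, f, kp, kt = 1.0e-5, 0.5, 1.0e2, 1.0e8   # s^-1, -, L/mol/s, L/mol/s
    I, M, R = 1.0e-2, 1.0, 0.0                  # initiator, monomer, radicals (mol/L)
    h = 10.0                                    # s, far larger than the radical lifetime

    for _ in range(1000):
        I = I / (1.0 + kd * h)                  # d[I]/dt = -kd [I]
        # d[R]/dt = 2 f kd [I] - 2 kt [R]^2; implicit step gives a quadratic in R
        b = R + 2.0 * f * kd * h * I
        R = (-1.0 + math.sqrt(1.0 + 8.0 * kt * h * b)) / (4.0 * kt * h)
        M = M / (1.0 + kp * h * R)              # d[M]/dt = -kp [M][R]

    print(R, math.sqrt(f * kd * I / kt))        # numerical [R] vs. steady-state value
    ```

    The computed radical concentration tracks the quasi-steady value sqrt(f kd [I]/kt), which is exactly the relationship students can verify once they no longer assume it.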

  18. Application of the finite element method in orthopedic implant design.

    PubMed

    Saha, Subrata; Roychowdhury, Amit

    2009-01-01

    The finite element method (FEM) was first introduced to the field of orthopedic biomechanics in the early 1970s to evaluate stresses in human bones. By the early 1980s, the method had become well established as a tool for basic research and design analysis. Since the late 1980s and early 1990s, FEM has also been used to study bone remodeling. Today, it is one of the most reliable simulation tools for evaluating wear, fatigue, crack propagation, and so forth, and is used in many types of preoperative testing. Since the introduction of FEM to orthopedic biomechanics, there have been rapid advances in computer processing speeds, the finite element and other numerical methods, understanding of mechanical properties of soft and hard tissues and their modeling, and image-processing techniques. In light of these advances, it is accepted today that FEM will continue to contribute significantly to further progress in the design and development of orthopedic implants, as well as in the understanding of other complex systems of the human body. In the following article, the main application areas of finite element simulation are reviewed, beginning with total hip joint arthroplasty and followed by the knee, spine, shoulder, and elbow.
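    As a generic illustration of the FEM workflow the review surveys (assemble a stiffness matrix, apply loads and boundary conditions, solve, recover stresses), consider the simplest possible case, a 1D elastic bar of two linear elements; this toy example is ours, not from the article:

    ```python
    import numpy as np

    # Arbitrary illustrative values, not tissue or implant properties
    E, A, L = 200e9, 1e-4, 1.0      # Young's modulus (Pa), area (m^2), length (m)
    n_el = 2
    le = L / n_el
    k = E * A / le * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness

    K = np.zeros((n_el + 1, n_el + 1))
    for e in range(n_el):           # assemble global stiffness
        K[e:e + 2, e:e + 2] += k

    F = np.zeros(n_el + 1)
    F[-1] = 1000.0                  # 1 kN axial tip load
    u = np.zeros(n_el + 1)          # fix node 0, solve the reduced system
    u[1:] = np.linalg.solve(K[1:, 1:], F[1:])

    stress = E * np.diff(u) / le    # element stresses
    print(u[-1], stress)            # tip displacement and uniform stress F/A
    ```

    The same assemble-solve-postprocess loop, with 3D solid elements and patient-specific geometry and materials, underlies the implant analyses discussed above.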

  19. Experimental and Numerical Investigations of Applying Tip-bottomed Tool for Bending Advanced Ultra-high Strength Steel Sheet

    NASA Astrophysics Data System (ADS)

    Mitsomwang, Pusit; Borrisutthekul, Rattana; Klaiw-awoot, Ken; Pattalung, Aran

    2017-09-01

    This research investigated the application of a tip-bottomed tool for bending an advanced ultra-high strength steel sheet. V-die bending experiments on a dual-phase steel (DP980) sheet with a thickness of 1.6 mm were carried out using a conventional punch and a tip-bottomed punch. Experimental results revealed that the springback of the bent workpiece was smaller with the tip-bottomed punch than with the conventional punch. To examine the bending characteristics further, a finite element (FE) model was developed and used to simulate the bending of the workpiece. The FE analysis showed that the tip-bottomed punch promoted plastic deformation at the bending region, and consequently the springback of the workpiece was reduced. In addition, the width of the punch tip was found to affect the deformation at the bending region and thus determine the springback of the bent workpiece. Moreover, the tip-bottomed punch produced a noticeable increase in the surface hardness of the bent workpiece compared with bending by the conventional punch.

  20. Cognitive correlates of performance in advanced mathematics.

    PubMed

    Wei, Wei; Yuan, Hongbo; Chen, Chuansheng; Zhou, Xinlin

    2012-03-01

    Much research has been devoted to understanding cognitive correlates of elementary mathematics performance, but little such research has been done for advanced mathematics (e.g., modern algebra, statistics, and mathematical logic). To promote mathematical knowledge among college students, it is necessary to understand what factors (including cognitive factors) are important for acquiring advanced mathematics. We recruited 80 undergraduates from four universities in Beijing. The current study investigated the associations between students' performance on a test of advanced mathematics and a battery of 17 cognitive tasks on basic numerical processing, complex numerical processing, spatial abilities, language abilities, and general cognitive processing. The results showed that spatial abilities were significantly correlated with performance in advanced mathematics after controlling for other factors. In addition, certain language abilities (i.e., comprehension of words and sentences) also made unique contributions. In contrast, basic numerical processing and computation were generally not correlated with performance in advanced mathematics. Results suggest that spatial abilities and language comprehension, but not basic numerical processing, may play an important role in advanced mathematics. These results are discussed in terms of their theoretical significance and practical implications. ©2011 The British Psychological Society.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Mark D.; McPherson, Brian J.; Grigg, Reid B.

    Numerical simulation is an invaluable analytical tool for scientists and engineers making predictions about the fate of carbon dioxide injected into deep geologic formations for long-term storage. Current numerical simulators for assessing storage in deep saline formations can model strongly coupled processes involving multifluid flow, heat transfer, chemistry, and rock mechanics in geologic media. Except under moderate pressure conditions, numerical simulators for deep saline formations only require tracking two immiscible phases and a limited number of phase components beyond those comprising the geochemical reactive system. The requirements for numerically simulating the utilization and storage of carbon dioxide in partially depleted petroleum reservoirs are more numerous than those for deep saline formations: the minimum number of immiscible phases increases to three, the number of phase components may easily increase fourfold, and the coupled processes of heat transfer, geochemistry, and geomechanics remain. Public and scientific confidence in numerical simulators used for carbon dioxide sequestration in deep saline formations has advanced via a natural progression of proving the simulators against benchmark problems, code comparisons, laboratory-scale experiments, pilot-scale injections, and commercial-scale injections. This paper describes a new numerical simulator for the scientific investigation of carbon dioxide utilization and storage in partially depleted petroleum reservoirs, with an emphasis on its unique features for scientific investigations. It documents the numerical simulation of carbon dioxide utilization for enhanced oil recovery in the western section of the Farnsworth Unit and represents an early stage in the progression of numerical simulators for carbon utilization and storage in depleted oil reservoirs.

  2. Aberration measurement technique based on an analytical linear model of a through-focus aerial image.

    PubMed

    Yan, Guanyong; Wang, Xiangzhao; Li, Sikun; Yang, Jishuo; Xu, Dongbo; Erdmann, Andreas

    2014-03-10

    We propose an in situ aberration measurement technique based on an analytical linear model of through-focus aerial images. The aberrations are retrieved from aerial images of six isolated space patterns, which have the same width but different orientations. The imaging formulas of the space patterns are investigated and simplified, and then an analytical linear relationship between the aerial image intensity distributions and the Zernike coefficients is established. The linear relationship is composed of linear fitting matrices and rotation matrices, which can be calculated numerically in advance and utilized to retrieve Zernike coefficients. Numerical simulations using the lithography simulators PROLITH and Dr.LiTHO demonstrate that the proposed method can measure wavefront aberrations up to Z(37). Experiments on a real lithography tool confirm that our method can monitor lens aberration offset with an accuracy of 0.7 nm.
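    The retrieval step described above amounts to a linear least-squares problem once the fitting matrices are known. A schematic sketch with a synthetic sensitivity matrix (the real matrices come from lithographic imaging simulation, which is not reproduced here; all names and sizes are illustrative):

    ```python
    import numpy as np

    # If intensity deviations depend approximately linearly on the Zernike
    # coefficients, dI = A @ z, then z follows from a precomputed fitting
    # matrix via least squares. A and z_true are synthetic stand-ins.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 9))               # 200 intensity samples, 9 Zernike terms
    z_true = rng.normal(scale=0.02, size=9)     # "unknown" aberration coefficients
    dI = A @ z_true                             # simulated intensity deviation (noise-free)

    z_est, *_ = np.linalg.lstsq(A, dI, rcond=None)
    print(np.max(np.abs(z_est - z_true)))       # essentially zero in this noise-free sketch
    ```

    The advantage the paper exploits is that the matrices can be computed numerically in advance, so the in situ measurement reduces to this fast linear solve.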

  3. Numerical Simulation and Mechanical Design for TPS Electron Beam Position Monitors

    NASA Astrophysics Data System (ADS)

    Hsueh, H. P.; Kuan, C. K.; Ueng, T. S.; Hsiung, G. Y.; Chen, J. R.

    2007-01-01

    Comprehensive studies of the mechanical design and numerical simulation of high-resolution electron beam position monitors are key steps in building the newly proposed third-generation synchrotron radiation research facility, the Taiwan Photon Source (TPS). With an advanced electromagnetic simulation tool such as MAFIA, tailored specifically for particle accelerators, the design of the high-resolution electron beam position monitors can be tested in simulation before being tested experimentally. The design goal of our high-resolution electron beam position monitors is to obtain the best resolution through sensitivity and signal optimization. The definitions of, and differences between, the resolution and sensitivity of electron beam position monitors are explained, as are the design considerations. A prototype design has been carried out, and the related simulations were performed with MAFIA. The results are presented here. A sensitivity as high as 200 has been achieved in the x direction at 500 MHz.

  4. Advancing Data Assimilation in Operational Hydrologic Forecasting: Progresses, Challenges, and Emerging Opportunities

    NASA Technical Reports Server (NTRS)

    Liu, Yuqiong; Weerts, A.; Clark, M.; Hendricks Franssen, H.-J; Kumar, S.; Moradkhani, H.; Seo, D.-J.; Schwanenberg, D.; Smith, P.; van Dijk, A. I. J. M.; et al.

    2012-01-01

    Data assimilation (DA) holds considerable potential for improving hydrologic predictions, as demonstrated in numerous research studies. However, advances in hydrologic DA research have not been adequately or promptly implemented in operational forecast systems to improve the skill of forecasts for better informed real-world decision making. This is due in part to a lack of mechanisms to properly quantify the uncertainty in observations and forecast models in real-time forecasting situations and to conduct the merging of data and models in a way that is adequately efficient and transparent to operational forecasters. The need for effective DA of useful hydrologic data into the forecast process has become increasingly recognized in recent years. This motivated a hydrologic DA workshop in Delft, the Netherlands, in November 2010, which focused on advancing DA in operational hydrologic forecasting and water resources management. As an outcome of the workshop, this paper reviews, in relevant detail, the current status of DA applications in both hydrologic research and operational practice, and discusses the existing or potential hurdles and challenges in transitioning hydrologic DA research into cost-effective operational forecasting tools, as well as the potential pathways and newly emerging opportunities for overcoming these challenges. Several related aspects are discussed, including (1) theoretical or mathematical aspects of DA algorithms, (2) the estimation of different types of uncertainty, (3) new observations and their objective use in hydrologic DA, (4) the use of DA for real-time control of water resources systems, and (5) the development of community-based, generic DA tools for hydrologic applications.
It is recommended that the cost-effective transition of hydrologic DA from research to operations be supported by developing community-based, generic modeling and DA tools or frameworks, and by fostering collaborative efforts among hydrologic modellers, DA developers, and operational forecasters.
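As a concrete example of the kind of DA algorithm reviewed here, a minimal ensemble Kalman filter (EnKF) analysis step for a directly observed scalar state can be sketched as follows; the numbers and ensemble size are illustrative only, not from any operational system:

```python
import numpy as np

rng = np.random.default_rng(42)

ens = rng.normal(loc=2.0, scale=1.0, size=500)   # forecast (prior) ensemble
obs, obs_err = 4.0, 0.5                          # observation and its std. dev.

# Kalman gain from the ensemble variance (state observed directly, H = 1)
var_f = np.var(ens, ddof=1)
K = var_f / (var_f + obs_err**2)

# perturbed-observation update: each member sees a noisy copy of the observation
ens_a = ens + K * (obs + rng.normal(scale=obs_err, size=ens.size) - ens)
print(ens.mean(), ens_a.mean())   # analysis mean moves toward the observation
```

Because the prior here is more uncertain than the observation, the gain is large and the analysis mean lands much closer to the observation; operational systems apply the same weighting logic with high-dimensional states and observation operators.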

  5. Numerical study on wave loads and motions of two ships advancing in waves by using three-dimensional translating-pulsating source

    NASA Astrophysics Data System (ADS)

    Xu, Yong; Dong, Wen-Cai

    2013-08-01

    A frequency domain analysis method based on the three-dimensional translating-pulsating (3DTP) source Green function is developed to investigate wave loads and free motions of two ships advancing on parallel courses in waves. Two experiments were carried out to measure, respectively, the wave loads and the free motions of a pair of side-by-side ship models advancing at an identical speed in head regular waves. For comparison, each model was also tested alone. Predictions obtained by the present solution are found to be in favorable agreement with the model tests and are more accurate than those of the traditional method based on the three-dimensional pulsating (3DP) source Green function. Numerical resonances and peak shifts can be found in the 3DP predictions, which result from wave energy trapped in the gap between the two ships and the extremely inhomogeneous wave load distribution on each hull. However, they are eliminated by 3DTP, in which the speed affects the free surface and most of the wave energy can escape from the gap. Both the experiments and the present predictions show that hydrodynamic interaction effects on wave loads and free motions are significant. The present solver may serve as a validated tool to predict wave loads and motions of two vessels under replenishment at sea, and may help to evaluate the effects of hydrodynamic interaction on ship safety in replenishment operations.

  6. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE - A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gordon Tibbitts; Arnis Judzis

    2002-07-01

    This document details progress on the OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE -- A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING contract for the quarter from April 2002 through June 2002. Although the optimization portion of the testing program is still pending, accomplishments include the following: (1) Presentation material was provided to the DOE/NETL project manager (Dr. John Rogers) for the DOE exhibit at the 2002 Offshore Technology Conference. (2) Two meetings at Smith International and one at Andergauge in Houston were held to investigate their interest in joining the Mud Hammer Performance study. (3) SDS Digger Tools (Task 3 benchmarking participant) apparently has not negotiated a commercial deal with Halliburton on the supply of fluid hammers to the oil and gas business. (4) TerraTek is awaiting progress by Novatek (a DOE contractor) on the redesign and development of their next hammer tool; their delay will require an extension to TerraTek's contracted program. (5) Smith International has sufficient interest in the program to start engineering and chroming of collars for testing at TerraTek. (6) Shell's Brian Tarr has agreed to join the Industry Advisory Group for the DOE project. His addition is welcomed, as he has many years of experience with the Novatek tool and was involved in the early tests in Europe while with Mobil Oil. (7) Conoco's field trial of the Smith fluid hammer for an application in Vietnam was organized and has contributed to increased interest in their tool.

  7. High-NA EUV lithography enabling Moore's law in the next decade

    NASA Astrophysics Data System (ADS)

    van Schoot, Jan; Troost, Kars; Bornebroek, Frank; van Ballegoij, Rob; Lok, Sjoerd; Krabbendam, Peter; Stoeldraijer, Judon; Loopstra, Erik; Benschop, Jos P.; Finders, Jo; Meiling, Hans; van Setten, Eelco; Kneer, Bernhard; Kuerz, Peter; Kaiser, Winfried; Heil, Tilmann; Migura, Sascha; Neumann, Jens Timo

    2017-10-01

    While EUV systems equipped with 0.33 numerical aperture lenses are readying to start volume manufacturing, ASML and Zeiss are ramping up their activities on an EUV exposure tool with a numerical aperture of 0.55. The purpose of this scanner, targeting an ultimate resolution of 8 nm, is to extend Moore's law throughout the next decade. A novel, anamorphic lens design capable of providing the required numerical aperture has been investigated; this lens will be paired with new, faster stages and more accurate sensors to meet the economic requirements of Moore's law, as well as the tight focus and overlay control needed for future process nodes. The tighter focus and overlay control budgets, as well as the anamorphic optics, will drive innovations in imaging and OPC modelling. Furthermore, advances in resist and mask technology will be required to image lithography features with less than 10 nm resolution. This paper presents an overview of the target specifications, key technology innovations, and imaging simulations demonstrating the advantages compared to 0.33 NA and showing the capabilities of the next-generation EUV systems.
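    A back-of-the-envelope check using only the Rayleigh criterion R = k1 * λ / NA and the 13.5 nm EUV wavelength shows why the 8 nm target motivates the move from 0.33 to 0.55 NA (single-exposure imaging requires k1 above the theoretical limit of 0.25):

    ```python
    # Rayleigh criterion: resolution R = k1 * wavelength / NA, so the k1
    # required for an 8 nm half-pitch is k1 = R * NA / wavelength.
    wavelength = 13.5   # nm, EUV
    for na in (0.33, 0.55):
        k1_at_8nm = 8.0 * na / wavelength
        print(f"NA={na}: 8 nm resolution requires k1 = {k1_at_8nm:.2f}")
    ```

    At 0.33 NA the required k1 falls below the 0.25 single-exposure limit, whereas at 0.55 NA it sits at a workable value near 0.33.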

  8. A microkernel design for component-based parallel numerical software systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.

    1999-01-13

    What is the minimal software infrastructure, and what type of conventions are needed, to simplify the development of sophisticated parallel numerical application codes using a variety of software components that are not necessarily available as source code? We propose an opaque object-based model where the objects are dynamically loadable from the file system or network. The microkernel required to manage such a system needs to include, at most: (1) a few basic services, namely a mechanism for loading objects at run time via dynamic link libraries, and consistent schemes for error handling and memory management; and (2) selected methods that all objects share, to deal with object life (destruction, reference counting, relationships) and object observation (viewing, profiling, tracing). We are experimenting with these ideas in the context of extensible numerical software within the ALICE (Advanced Large-scale Integrated Computational Environment) project, where we are building the microkernel to manage the interoperability among various tools for large-scale scientific simulations. This paper presents some preliminary observations and conclusions from our work with microkernel design.

  9. Training the next generation of scientists in Weather Forecasting: new approaches with real models

    NASA Astrophysics Data System (ADS)

    Carver, Glenn; Váňa, Filip; Siemen, Stephan; Kertesz, Sandor; Keeley, Sarah

    2014-05-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) operationally produces medium-range forecasts using what is internationally acknowledged as the world-leading global weather forecast model. Future development of this scientifically advanced model relies on the continued availability of experts in meteorological science with high-level software skills. ECMWF therefore has a vested interest in young scientists and university graduates developing the necessary skills in numerical weather prediction, covering both scientific and technical aspects. The OpenIFS project at ECMWF maintains a portable version of the ECMWF forecast model (known as IFS) for use in education and research at universities, national meteorological services, and other research and education organisations. OpenIFS models can be run on desktop or high-performance computers to produce weather forecasts in a similar way to the operational forecasts at ECMWF. ECMWF also provides the Metview desktop application, a modern, graphical, easy-to-use tool for analysing and visualising forecasts that is routinely used by scientists and forecasters at ECMWF and other institutions. The combination of Metview with the OpenIFS models has the potential to deliver classroom-friendly tools allowing students to apply their theoretical knowledge to real-world examples using a world-leading weather forecasting model. In this paper we describe how the OpenIFS model has been used for teaching. We describe the use of Linux-based 'virtual machines' pre-packaged on USB sticks that provide a technically easy and safe way of delivering 'classroom-on-a-stick' learning environments for advanced training in numerical weather prediction. We welcome discussions with interested parties.

  10. Phonon-tunnelling dissipation in mechanical resonators

    PubMed Central

    Cole, Garrett D.; Wilson-Rae, Ignacio; Werbach, Katharina; Vanner, Michael R.; Aspelmeyer, Markus

    2011-01-01

    Microscale and nanoscale mechanical resonators have recently emerged as ubiquitous devices for use in advanced technological applications, for example, in mobile communications and inertial sensors, and as novel tools for fundamental scientific endeavours. Their performance is in many cases limited by the deleterious effects of mechanical damping. In this study, we report a significant advancement towards understanding and controlling support-induced losses in generic mechanical resonators. We begin by introducing an efficient numerical solver, based on the 'phonon-tunnelling' approach, capable of predicting the design-limited damping of high-quality mechanical resonators. Further, through careful device engineering, we isolate support-induced losses and perform a rigorous experimental test of the strong geometric dependence of this loss mechanism. Our results are in excellent agreement with the theory, demonstrating the predictive power of our approach. In combination with recent progress on complementary dissipation mechanisms, our phonon-tunnelling solver represents a major step towards accurate prediction of the mechanical quality factor. PMID:21407197

  11. Recent advances in molecular biology of parasitic viruses.

    PubMed

    Banik, Gouri Rani; Stark, Damien; Rashid, Harunor; Ellis, John T

    2014-01-01

    The numerous protozoa that can inhabit the human gastrointestinal tract are known, yet little is understood of the viruses that infect these protozoa. The discovery, morphologic details, purification methods for virus-like particles, and the genomes and proteomes of the viruses of the parasites Entamoeba histolytica, Giardia lamblia, Trichomonas vaginalis, and Eimeria sp. are described in this review. The protozoan viruses share many common features: most are RNA or double-stranded RNA viruses with genomes ranging between 5 and 8 kilobases, and they are spherical or icosahedral in shape with an average diameter of 30-40 nm. These viruses may influence the function and pathogenicity of the protozoa they infect, and may be important to investigate from a clinical perspective. The viruses may also be used as specific genetic transfection vectors for the parasites and may represent a research tool. This review provides an overview of recent advances in the field of protozoan viruses.

  12. Advanced computational simulations of water waves interacting with wave energy converters

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
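    The pressure Poisson equation highlighted above is typically the dominant cost in such incompressible-flow solvers. A hedged sketch of the Krylov-solver idea, using plain conjugate gradient on a 1D -u'' = f model problem (the paper's solver is multigrid-preconditioned and fully 3D; this toy setup is ours):

    ```python
    import numpy as np

    # 1D Poisson model problem: -u'' = 1 on (0,1), u(0) = u(1) = 0,
    # discretized with second-order central differences.
    n = 100
    h = 1.0 / (n + 1)
    f = np.ones(n)
    A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2

    # Conjugate gradient, unpreconditioned for clarity
    x = np.zeros(n)
    r = f - A @ x
    p = r.copy()
    for _ in range(500):                # plenty of iterations for this small problem
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < 1e-10:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new

    print(np.linalg.norm(A @ x - f))    # residual, essentially zero
    ```

    The exact solution is u(x) = x(1-x)/2, with a maximum of 0.125; a multigrid preconditioner, as used in the paper, is what keeps the iteration count low as the 3D grid is refined.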

  13. Adapting California’s ecosystems to a changing climate

    USGS Publications Warehouse

    Elizabeth Chornesky,; David Ackerly,; Paul Beier,; Frank Davis,; Flint, Lorraine E.; Lawler, Joshua J.; Moyle, Peter B.; Moritz, Max A.; Scoonover, Mary; Byrd, Kristin B.; Alvarez, Pelayo; Heller, Nicole E.; Micheli, Elisabeth; Weiss, Stuart

    2017-01-01

    Significant efforts are underway to translate improved understanding of how climate change is altering ecosystems into practical actions for sustaining ecosystem functions and benefits. We explore this transition in California, where adaptation and mitigation are advancing relatively rapidly, through four case studies that span large spatial domains and encompass diverse ecological systems, institutions, ownerships, and policies. The case studies demonstrate the context specificity of societal efforts to adapt ecosystems to climate change and involve applications of diverse scientific tools (e.g., scenario analyses, downscaled climate projections, ecological and connectivity models) tailored to specific planning and management situations (alternative energy siting, wetland management, rangeland management, open space planning). They illustrate how existing institutional and policy frameworks provide numerous opportunities to advance adaptation related to ecosystems and suggest that progress is likely to be greatest when scientific knowledge is integrated into collective planning and when supportive policies and financing enable action.

  14. Application of Genomic In Situ Hybridization in Horticultural Science

    PubMed Central

    Ramzan, Fahad; Lim, Ki-Byung

    2017-01-01

    Molecular cytogenetic techniques such as in situ hybridization are admirable tools for analyzing genomic structure and function, chromosome constituents, recombination patterns, alien gene introgression, genome evolution, aneuploidy, and polyploidy, as well as for visualizing genome constitution and discriminating chromosomes from different genomes in allopolyploids of various horticultural crops. GISH with multicolor detection is a significant approach for analyzing the small and numerous chromosomes of fruit species, for example Diospyros hybrids. The technique has proved to be a precise and effective way to confirm hybrid status and helps remarkably to distinguish donor parental genomes in hybrids such as the ornamental hybrids of Clivia, Rhododendron, and Lycoris. Genome characterization facilitates the selection of hybrids with desirable characteristics early in hybridization breeding, as the technique expedites the detection of introgressed chromosomes. This review epitomizes the applications and advancements of genomic in situ hybridization (GISH) techniques in horticultural plants. PMID:28459054

  15. An integrated approach for prioritizing pharmaceuticals found in the environment for risk assessment, monitoring and advanced research.

    PubMed

    Caldwell, Daniel J; Mastrocco, Frank; Margiotta-Casaluci, Luigi; Brooks, Bryan W

    2014-11-01

    Numerous active pharmaceutical ingredients (APIs), approved prior to enactment of detailed environmental risk assessment (ERA) guidance in the EU in 2006, have been detected in surface waters as a result of advancements in analytical technologies. Without adequate knowledge of the potential hazards these APIs may pose, assessing their environmental risk is challenging. As it would be impractical to commence hazard characterization and ERA en masse, several approaches to prioritizing substances for further attention have been published. Here, through the combination of three presentations given at a recent conference, "Pharmaceuticals in the Environment, Is there a problem?" (Nîmes, France, June 2013) we review several of these approaches, identify salient components, and present available techniques and tools that could facilitate a pragmatic, scientifically sound approach to prioritizing APIs for advanced study or ERA and, where warranted, fill critical data gaps through targeted, intelligent testing. We further present a modest proposal to facilitate future prioritization efforts and advanced research studies that incorporates mammalian pharmacology data (e.g., adverse outcomes pathways and the fish plasma model) and modeled exposure data based on pharmaceutical use. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. High-power disk lasers: advances and applications

    NASA Astrophysics Data System (ADS)

    Havrilla, David; Ryba, Tracey; Holzer, Marco

    2012-03-01

    Though the genesis of the disk laser concept dates to the early 1990s, the disk laser continues to demonstrate the flexibility and certain future of a breakthrough technology. Ongoing increases in power per disk, together with improvements in beam quality and efficiency, continue to validate the disk laser concept. To date, the disk principle has not reached any fundamental limits on output power per disk or beam quality, and it offers numerous advantages over other high-power resonator concepts, especially over monolithic architectures. With about 2,000 high-power disk laser installations, and demand upwards of 1,000 lasers per year, the disk laser has proven to be a robust and reliable industrial tool. With advances in running cost, investment cost, and footprint, manufacturers continue to implement disk laser technology with more vigor than ever. This paper explains recent advances in disk laser technology and process-relevant features of the laser, such as pump diode arrangement, resonator design, and integrated beam guidance. In addition, advances in thick-sheet applications and in very cost-efficient, high-productivity applications such as remote welding, remote cutting, and cutting of thin sheets are discussed.

  17. Advanced Prosthetic Gait Training Tool

    DTIC Science & Technology

    2014-10-01

    AWARD NUMBER: W81XWH-10-1-0870. TITLE: Advanced Prosthetic Gait Training Tool. REPORT DATE: October 2014. REPORT TYPE: Annual Report. DATES COVERED: 20 Sep 2013 to 19 Sep 2014. PURPOSE: to produce a computer-based Advanced Prosthetic Gait Training Tool to aid in the training of clinicians at military treatment facilities providing care

  18. È VIVO: Virtual eruptions at Vesuvius; A multimedia tool to illustrate numerical modeling to a general public

    NASA Astrophysics Data System (ADS)

    Todesco, Micol; Neri, Augusto; Demaria, Cristina; Marmo, Costantino; Macedonio, Giovanni

    2006-07-01

    Dissemination of scientific results to the general public has become increasingly important in our society. When science deals with natural hazards, public outreach is even more important: on the one hand, it contributes to hazard perception and is a necessary step toward preparedness and risk mitigation; on the other hand, it helps establish a positive link of mutual confidence between the scientific community and the population living at risk. The existence of such a link plays a relevant role in hazard communication, which in turn is essential to mitigating the risk. In this work, we present a tool that we have developed to illustrate our scientific results on pyroclastic flow propagation at Vesuvius. This tool, a CD-ROM developed by joining scientific data with appropriate expertise in communication sciences, is meant to be a first prototype that will be used to test the validity of this approach to public outreach. The multimedia guide contains figures, images of real volcanoes, and computer animations obtained through numerical modeling of pyroclastic density currents. Explanatory text, kept as short and simple as possible, illustrates both the process and the methodology applied to study this very dangerous natural phenomenon. In this first version, the CD-ROM will be distributed among selected categories of end-users together with a short questionnaire that we have drawn up to test its readability. Future releases will incorporate feedback from the users, further advancement of scientific results, as well as a higher degree of interactivity.

  19. Bridging groundwater models and decision support with a Bayesian network

    USGS Publications Warehouse

    Fienen, Michael N.; Masterson, John P.; Plant, Nathaniel G.; Gutierrez, Benjamin T.; Thieler, E. Robert

    2013-01-01

    Resource managers need to make decisions to plan for future environmental conditions, particularly sea level rise, in the face of substantial uncertainty. Many interacting processes factor into the decisions they face. Advances in process models and the quantification of uncertainty have made models a valuable tool for this purpose. Long simulation runtimes and, often, numerical instability make linking process models impractical in many cases. A method for emulating the important connections between model input and forecasts, while propagating uncertainty, has the potential to provide a bridge between complicated numerical process models and the efficiency and stability needed for decision making. We explore this using a Bayesian network (BN) to emulate a groundwater flow model. We expand on previous approaches to validating a BN by calculating forecasting skill using cross validation of a groundwater model of Assateague Island in Virginia and Maryland, USA. This BN emulation was shown to capture the important groundwater-flow characteristics and uncertainty of the groundwater system arising from its connection to island morphology and sea level. Forecast power metrics associated with the validation of multiple alternative BN designs guided the selection of an optimal level of BN complexity. Assateague Island is an ideal test case for exploring a forecasting tool based on current conditions because the unique hydrogeomorphological variability of the island includes a range of settings indicative of past, current, and future conditions. The resulting BN is a valuable tool for exploring the response of groundwater conditions to sea level rise in decision support.
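    The cross-validated forecast-skill idea can be illustrated independently of any particular Bayesian network: skill compares held-out prediction error against a "climatology" (mean) baseline. The leave-one-out procedure below uses a one-parameter linear model and synthetic data as stand-ins for the BN emulator; all names and numbers are illustrative assumptions, not from the study.

```python
# Hedged sketch: leave-one-out forecast skill vs. a mean ("climatology") baseline.
# The linear predictor and data are synthetic stand-ins for the BN emulator.

def loo_skill(xs, ys):
    """Leave-one-out skill: 1 - MSE(model) / MSE(mean baseline)."""
    n = len(xs)
    se_model = se_base = 0.0
    for k in range(n):
        tx = [x for i, x in enumerate(xs) if i != k]  # training inputs
        ty = [y for i, y in enumerate(ys) if i != k]  # training targets
        # one-parameter least-squares fit through the origin: y ~ b*x
        b = sum(x * y for x, y in zip(tx, ty)) / sum(x * x for x in tx)
        se_model += (ys[k] - b * xs[k]) ** 2          # model error on held-out point
        se_base += (ys[k] - sum(ty) / len(ty)) ** 2   # baseline error on held-out point
    return 1.0 - se_model / se_base

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]  # roughly y = 2x, with noise
skill = loo_skill(xs, ys)
```

A skill near 1 indicates the emulator adds substantial forecast power over climatology; a skill near 0 or below indicates it does not.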

  20. Metabonomics and drug development.

    PubMed

    Ramana, Pranov; Adams, Erwin; Augustijns, Patrick; Van Schepdael, Ann

    2015-01-01

    Metabolites, as end products of metabolism, possess a wealth of information about altered metabolic control and homeostasis that is dependent on numerous variables including age, sex, and environment. Studying significant changes in metabolite patterns has been recognized as a tool to understand crucial aspects of drug development such as drug efficacy and toxicity. The inclusion of metabonomics in the OMICS study platform brings us closer to defining the phenotype and allows us to look at alternatives to improve the diagnosis of diseases. Advances in the analytical strategies and statistical tools used to study metabonomics allow us to prevent drug failures at early stages of drug development and reduce financial losses during expensive phase II and III clinical trials. This chapter introduces metabonomics along with the instruments used in its study; in addition, relevant examples of the use of metabonomics in the drug development process are discussed, with an emphasis on future directions and the challenges the field faces.

  1. Microfluidic inertial focusing fundamentals, limitations and applications for biomedical sample processing

    NASA Astrophysics Data System (ADS)

    Reece, Amy E.

    The microfabrication of microfluidic control systems and advances in molecular amplification tools have enabled the miniaturization of single-cell analytical platforms for the efficient, highly selective enumeration and molecular characterization of rare and diseased cells from clinical samples. In many cases, the high-throughput nature of microfluidic inertial focusing has enabled the popularization of this new class of lab-on-a-chip devices that exhibit numerous advantages over conventional methods as prognostic and diagnostic tools. Inertial focusing is the passive, sheathless alignment of particles and cells to precise spatiotemporal equilibrium positions that arise from a force balance between opposing inertial lift forces and hydrodynamic repulsions. The applicability of inertial focusing to a spectrum of filtration, separation and encapsulation challenges places heavy emphasis upon the accurate description of the hydrodynamic forces responsible for predictable inertial focusing behavior. These inertial focusing fundamentals, limitations and their applications are studied extensively throughout this work.
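    The force-balance regime described above is conventionally judged through dimensionless numbers. The sketch below computes the standard channel and particle Reynolds numbers used in inertial-microfluidics scaling arguments; the velocities, dimensions, fluid properties, and the "Rp of order 1" rule of thumb are illustrative assumptions, not values from this work.

```python
# Hedged sketch of inertial-focusing scaling estimates (assumed values,
# following standard microfluidics scaling arguments).

def particle_reynolds(channel_velocity, hydraulic_diameter, particle_diameter,
                      kinematic_viscosity=1e-6):
    """Channel Reynolds number Re and particle Reynolds number Rp = Re*(a/Dh)^2."""
    re = channel_velocity * hydraulic_diameter / kinematic_viscosity
    rp = re * (particle_diameter / hydraulic_diameter) ** 2
    return re, rp

# Example: a 10 um cell in a 50 um channel at 0.5 m/s in a water-like fluid.
re, rp = particle_reynolds(0.5, 50e-6, 10e-6)
# Inertial lift is typically significant when Rp is of order 1 or larger.
```

Such estimates only indicate whether focusing is plausible; the precise equilibrium positions require the detailed lift-force analysis the thesis describes.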

  2. Computational Fluid Dynamics (CFD): Future role and requirements as viewed by an applied aerodynamicist. [computer systems design

    NASA Technical Reports Server (NTRS)

    Yoshihara, H.

    1978-01-01

    The problem of designing the wing-fuselage configuration of an advanced transonic commercial airliner and the optimization of a supercruiser fighter are sketched, pointing out the essential fluid-mechanical phenomena that play an important role. Such problems suggest that, for a numerical method to be useful, it must be able to treat highly three-dimensional turbulent separations, flows with jet engine exhausts, and complex vehicle configurations. Weaknesses of the two principal tools of the aerodynamicist, the wind tunnel and the computer, suggest a complementary, combined use of these tools, which is illustrated by the case of the transonic wing-fuselage design. The anticipated difficulties in developing an adequate turbulent transport model suggest that such an approach may have to suffice for an extended period. In the longer term, experimentation on turbulent transport in meaningful cases must be intensified to provide a data base for both modeling and theory validation purposes.

  3. Trash to treasure: production of biofuels and commodity chemicals via syngas fermenting microorganisms.

    PubMed

    Latif, Haythem; Zeidan, Ahmad A; Nielsen, Alex T; Zengler, Karsten

    2014-06-01

    Fermentation of syngas is a means through which unutilized organic waste streams can be converted biologically into biofuels and commodity chemicals. Despite recent advances, several issues remain that limit implementation of industrial-scale syngas fermentation processes. At the cellular level, the energy conservation mechanism of syngas-fermenting microorganisms has not yet been entirely elucidated. Furthermore, genetic tools to study and ultimately enhance their metabolic capabilities were, until recently, lacking. Recently, substantial progress has been made in understanding the intricate energy conservation mechanisms of these microorganisms. Given the complex relationship between energy conservation and metabolism, strain design greatly benefits from systems-level approaches. Numerous genetic manipulation tools have also been developed, paving the way for the use of metabolic engineering and systems biology approaches. Rational strain designs can now be deployed, resulting in desirable phenotypic traits for large-scale production.

  4. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives.

    PubMed

    Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies for detecting large regions subject to copy number changes in genomes until recently, when high-resolution sequence data became available through next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review recent advances in computational methods pertaining to CNV detection using whole-genome and whole-exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development.
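    As a toy illustration of the read-depth principle many of these tools build on, the sketch below flags genomic bins whose coverage deviates strongly from the genome-wide mean. All depths and thresholds are hypothetical; real CNV callers add GC-content normalization, segmentation, and statistical modeling far beyond this.

```python
# Hedged sketch of read-depth CNV calling on binned coverage (hypothetical data).
from statistics import mean, stdev

def call_cnv_bins(depths, z_cut=3.0):
    """Flag bins whose read depth deviates strongly from the overall mean."""
    mu, sd = mean(depths), stdev(depths)
    calls = []
    for i, d in enumerate(depths):
        z = (d - mu) / sd
        if z >= z_cut:
            calls.append((i, "gain"))   # putative duplication
        elif z <= -z_cut:
            calls.append((i, "loss"))   # putative deletion
    return calls

# Bin 6 has elevated coverage (possible gain); bin 9 is depleted (possible loss).
depths = [100, 98, 102, 101, 99, 100, 150, 97, 103, 50, 100, 101]
calls = call_cnv_bins(depths, z_cut=2.0)
```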

  5. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies for detecting large regions subject to copy number changes in genomes until recently, when high-resolution sequence data became available through next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review recent advances in computational methods pertaining to CNV detection using whole-genome and whole-exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169

  6. Advanced Computational Modeling Approaches for Shock Response Prediction

    NASA Technical Reports Server (NTRS)

    Derkevorkian, Armen; Kolaini, Ali R.; Peterson, Lee

    2015-01-01

    Motivation: (1) The activation of pyroshock devices such as explosives, separation nuts, pin-pullers, etc. produces high-frequency transient structural response, typically from a few tens of Hz to several hundreds of kHz. (2) Lack of reliable analytical tools makes the prediction of appropriate design and qualification test levels a challenge. (3) In the past few decades, several attempts have been made to develop methodologies that predict the structural responses to shock environments. (4) Currently, there is no validated approach that is viable to predict shock environments over the full frequency range (i.e., 100 Hz to 10 kHz). Scope: (1) Model, analyze, and interpret space structural systems with complex interfaces and discontinuities, subjected to shock loads. (2) Assess the viability of a suite of numerical tools to simulate transient, non-linear solid mechanics and structural dynamics problems, such as shock wave propagation.

  7. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  8. Insecticidal activity of plant lectins and potential application in crop protection.

    PubMed

    Macedo, Maria Lígia R; Oliveira, Caio F R; Oliveira, Carolina T

    2015-01-27

    Lectins constitute a complex group of proteins found in different organisms. These proteins constitute an important field for research, as their structural diversity and affinity for several carbohydrates makes them suitable for numerous biological applications. This review addresses the classification and insecticidal activities of plant lectins, providing an overview of the applicability of these proteins in crop protection. The likely target sites in insect tissues, the mode of action of these proteins, as well as the use of lectins as biotechnological tools for pest control are also described. The use of initial bioassays employing artificial diets has led to the most recent advances in this field, such as plant breeding and the construction of fusion proteins, using lectins for targeting the delivery of toxins and to potentiate expected insecticide effects. Based on the data presented, we emphasize the contribution that plant lectins may make as tools for the development of integrated insect pest control strategies.

  9. Fundamental Research on Percussion Drilling: Improved rock mechanics analysis, advanced simulation technology, and full-scale laboratory investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael S. Bruno

    This report summarizes the research efforts on the DOE supported research project Percussion Drilling (DE-FC26-03NT41999), which is to significantly advance the fundamental understandings of the physical mechanisms involved in combined percussion and rotary drilling, and thereby facilitate more efficient and lower cost drilling and exploration of hard-rock reservoirs. The project has been divided into multiple tasks: literature reviews, analytical and numerical modeling, full scale laboratory testing and model validation, and final report delivery. Literature reviews document the history, pros and cons, and rock failure physics of percussion drilling in oil and gas industries. Based on the current understandings, a conceptual drilling model is proposed for modeling efforts. Both analytical and numerical approaches are deployed to investigate drilling processes such as drillbit penetration with compression, rotation and percussion, rock response with stress propagation, damage accumulation and failure, and debris transportation inside the annulus after disintegrated from rock. For rock mechanics modeling, a dynamic numerical tool has been developed to describe rock damage and failure, including rock crushing by compressive bit load, rock fracturing by both shearing and tensile forces, and rock weakening by repetitive compression-tension loading. Besides multiple failure criteria, the tool also includes a damping algorithm to dissipate oscillation energy and a fatigue/damage algorithm to update rock properties during each impact. From the model, Rate of Penetration (ROP) and rock failure history can be estimated. For cuttings transport in annulus, a 3D numerical particle flowing model has been developed with aid of analytical approaches. The tool can simulate cuttings movement at particle scale under laminar or turbulent fluid flow conditions and evaluate the efficiency of cutting removal.
To calibrate the modeling efforts, a series of full-scale fluid hammer drilling tests, as well as single impact tests, have been designed and executed. Both Berea sandstone and Mancos shale samples are used. In single impact tests, three impacts are sequentially loaded at the same rock location to investigate rock response to repetitive loadings. The crater depth and width are measured, as well as the displacement and force in the rod and the force in the rock. Various pressure differences across the rock-indentor interface (i.e., bore pressure minus pore pressure) are used to investigate the pressure effect on rock penetration. For hammer drilling tests, an industrial fluid hammer is used to drill under both underbalanced and overbalanced conditions. Besides calibrating the modeling tool, the data and cuttings collected from the tests indicate several other important applications. For example, different rock penetrations during single impact tests may reveal why a fluid hammer behaves differently with diverse rock types and under various pressure conditions at the hole bottom. On the other hand, the shape of the cuttings from fluid hammer tests, compared to those from traditional rotary drilling methods, may help to identify the dominant failure mechanism that percussion drilling relies on. If so, encouraging such a failure mechanism may improve hammer performance. The project is summarized in this report. Instead of compiling the information contained in the previous quarterly or other technical reports, this report focuses on the descriptions of tasks, findings, and conclusions, as well as the efforts on promoting percussion drilling technologies to industries, including site visits, presentations, and publications. As part of the final deliveries, the 3D numerical model for rock mechanics is also attached.
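The fatigue/damage idea in the record above, where repetitive impacts progressively weaken rock until it fails, can be sketched with a minimal per-impact damage update. The failure rule, damage increment, and all numbers below are purely illustrative assumptions, not the project's actual model.

```python
# Hedged sketch of repetitive-impact damage accumulation (illustrative only).

def apply_impact(damage, stress, strength, exponent=2.0):
    """Accumulate damage from one impact; rock fails when damage reaches 1."""
    # Crushing failure: applied stress exceeds the damage-reduced strength.
    if stress >= strength * (1.0 - damage):
        return 1.0
    # Otherwise accumulate fatigue damage from the sub-critical load.
    damage += (stress / strength) ** exponent
    return min(damage, 1.0)

# A 60 MPa impact on 100 MPa rock: sub-critical at first, then the weakened
# rock crushes outright.
d, impacts = 0.0, 0
while d < 1.0:
    d = apply_impact(d, stress=60.0, strength=100.0)
    impacts += 1
```

This captures the qualitative behavior described, repeated compression-tension loading weakening rock that a single blow could not break, without claiming anything about the project's constitutive laws.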

  10. Nonlinear Stochastic PDEs: Analysis and Approximations

    DTIC Science & Technology

    2016-05-23

    Main theoretical and experimental advances include the introduction of a number of effective approaches to numerical analysis of ...Stokes and Euler SPDEs, the quasi-geostrophic SPDE, the Ginzburg-Landau SPDE, and the Duffing oscillator, and a comparison of their numerical performance.

  11. STAGS Example Problems Manual

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Rankin, Charles C.

    2006-01-01

    This document summarizes the STructural Analysis of General Shells (STAGS) development effort, STAGS performance for selected demonstration problems, and STAGS application problems illustrating selected advanced features available in STAGS Version 5.0. Each problem is discussed, including selected background information and reference solutions when available. The modeling and solution approach for each problem is described and illustrated. Numerical results are presented and compared with reference solutions, test data, and/or results obtained from mesh refinement studies. These solutions provide an indication of the overall capabilities of the STAGS nonlinear finite element analysis tool and provide users with representative cases, including input files, to explore these capabilities, which may then be tailored to other applications.

  12. Surgical research using national databases

    PubMed Central

    Leland, Hyuma; Heckmann, Nathanael

    2016-01-01

    Recent changes in healthcare and advances in technology have increased the use of large-volume national databases in surgical research. These databases have been used to develop perioperative risk stratification tools, assess postoperative complications, calculate costs, and investigate numerous other topics across multiple surgical specialties. The results of these studies contain variable information but are subject to unique limitations. The use of large-volume national databases is increasing in popularity, and thorough understanding of these databases will allow for a more sophisticated and better educated interpretation of studies that utilize such databases. This review will highlight the composition, strengths, and weaknesses of commonly used national databases in surgical research. PMID:27867945

  13. Surgical research using national databases.

    PubMed

    Alluri, Ram K; Leland, Hyuma; Heckmann, Nathanael

    2016-10-01

    Recent changes in healthcare and advances in technology have increased the use of large-volume national databases in surgical research. These databases have been used to develop perioperative risk stratification tools, assess postoperative complications, calculate costs, and investigate numerous other topics across multiple surgical specialties. The results of these studies contain variable information but are subject to unique limitations. The use of large-volume national databases is increasing in popularity, and thorough understanding of these databases will allow for a more sophisticated and better educated interpretation of studies that utilize such databases. This review will highlight the composition, strengths, and weaknesses of commonly used national databases in surgical research.

  14. Fast micromagnetic simulations on GPU—recent advances made with mumax3

    NASA Astrophysics Data System (ADS)

    Leliaert, J.; Dvornik, M.; Mulkers, J.; De Clercq, J.; Milošević, M. V.; Van Waeyenberge, B.

    2018-03-01

    In the last twenty years, numerical modeling has become an indispensable part of magnetism research. It has become a standard tool both for the exploration of new systems and for the interpretation of experimental data. In the last five years, the capabilities of micromagnetic modeling have dramatically increased due to the deployment of graphical processing units (GPUs), which have sped up calculations by a factor of up to 200. This has enabled many studies which were previously unfeasible. In this topical review, we give an overview of this modeling approach and show how it has contributed to the forefront of current magnetism research.

  15. Oxygen Mass Transport in Stented Coronary Arteries.

    PubMed

    Murphy, Eoin A; Dunne, Adrian S; Martin, David M; Boyle, Fergal J

    2016-02-01

    Oxygen deficiency, known as hypoxia, in arterial walls has been linked to increased intimal hyperplasia, which is the main adverse biological process causing in-stent restenosis. Stent implantation has significant effects on the oxygen transport into the arterial wall. Elucidating these effects is critical to optimizing future stent designs. In this study the most advanced oxygen transport model developed to date was assessed in two test cases and used to compare three coronary stent designs. Additionally, the predicted results from four simplified blood oxygen transport models are compared in the two test cases. The advanced model showed good agreement with experimental measurements within the mass-transfer boundary layer and at the luminal surface; however, more work is needed in predicting the oxygen transport within the arterial wall. Simplifying the oxygen transport model within the blood flow produces significant errors in predicting the oxygen transport in arteries. This study can be used as a guide for all future numerical studies in this area and the advanced model could provide a powerful tool in aiding design of stents and other cardiovascular devices.

  16. Mobile Air Quality Studies (MAQS)-an international project.

    PubMed

    Groneberg, David A; Scutaru, Cristian; Lauks, Mathias; Takemura, Masaya; Fischer, Tanja C; Kölzow, Silvana; van Mark, Anke; Uibel, Stefanie; Wagner, Ulrich; Vitzthum, Karin; Beck, Fabian; Mache, Stefanie; Kreiter, Carolin; Kusma, Bianca; Friedebold, Annika; Zell, Hanna; Gerber, Alexander; Bock, Johanna; Al-Mutawakl, Khaled; Donat, Johannes; Geier, Maria Victoria; Pilzner, Carolin; Welker, Pia; Joachim, Ricarda; Bias, Harald; Götting, Michael; Sakr, Mohannad; Addicks, Johann P; Börger, Julia-Annik; Jensen, Anna-Maria; Grajewski, Sonja; Shami, Awfa; Neye, Niko; Kröger, Stefan; Hoffmann, Sarah; Kloss, Lisa; Mayer, Sebastian; Puk, Clemens; Henkel, Ulrich; Rospino, Robert; Schilling, Ute; Krieger, Evelyn; Westphal, Gesa; Meyer-Falcke, Andreas; Hupperts, Hagen; de Roux, Andrés; Tropp, Salome; Weiland, Marco; Mühlbach, Janette; Steinberg, Johannes; Szerwinski, Anne; Falahkohan, Sepiede; Sudik, Claudia; Bircks, Anna; Noga, Oliver; Dickgreber, Nicolas; Dinh, Q Thai; Golpon, Heiko; Kloft, Beatrix; Groneberg, Rafael Neill B; Witt, Christian; Wicker, Sabine; Zhang, Li; Springer, Jochen; Kütting, Birgitta; Mingomataj, Ervin C; Fischer, Axel; Schöffel, Norman; Unger, Volker; Quarcoo, David

    2010-04-09

    Due to an increasing awareness of the potential hazardousness of air pollutants, new laws, rules and guidelines have recently been implemented globally. In this respect, numerous studies have so far addressed traffic-related exposure to particulate matter using stationary technology. By contrast, only a few studies have used the advanced technology of mobile exposure analysis. The Mobile Air Quality Study (MAQS) addresses the issue of air pollutant exposure by combining advanced high-granularity spatial-temporal analysis with vehicle-mounted, person-mounted and roadside sensors. The MAQS platform will be used by international collaborators in order 1) to assess air pollutant exposure in relation to road structure, 2) to assess air pollutant exposure in relation to traffic density, 3) to assess air pollutant exposure in relation to weather conditions, 4) to compare exposure within vehicles between front and back seat (children) positions, and 5) to evaluate "traffic zone" exposure in relation to non-"traffic zone" exposure. Primarily, the MAQS platform will focus on particulate matter. With the establishment of advanced mobile analysis tools, it is planned to extend the analysis to other pollutants including NO2, SO2, nanoparticles and ozone.

  17. High power disk lasers: advances and applications

    NASA Astrophysics Data System (ADS)

    Havrilla, David; Holzer, Marco

    2011-02-01

    Though the genesis of the disk laser concept dates to the early 1990s, the disk laser continues to demonstrate the flexibility and the certain future of a breakthrough technology. On-going increases in power per disk, and improvements in beam quality and efficiency, continue to validate the genius of the disk laser concept. As of today, the disk principle has not reached any fundamental limits regarding output power per disk or beam quality, and it offers numerous advantages over other high-power resonator concepts, especially over monolithic architectures. With well over 1,000 high-power disk laser installations, the disk laser has proven to be a robust and reliable industrial tool. With advancements in running cost, investment cost and footprint, manufacturers continue to implement disk laser technology with more vigor than ever. This paper will explain important details of the TruDisk laser series and process-relevant features of the system, such as pump diode arrangement, resonator design and integrated beam guidance. In addition, advances in applications in the thick-sheet area and very cost-efficient, high-productivity applications such as remote welding, remote cutting and cutting of thin sheets will be discussed.

  18. A strip chart recorder pattern recognition tool kit for Shuttle operations

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Moebes, Travis A.; Shelton, Robert O.; Savely, Robert T.

    1993-01-01

    During Space Shuttle operations, Mission Control personnel monitor numerous mission-critical systems such as electrical power; guidance, navigation, and control; and propulsion by means of paper strip chart recorders. For example, electrical power controllers monitor strip chart recorder pen traces to identify onboard electrical equipment activations and deactivations. Recent developments in pattern recognition technologies coupled with new capabilities that distribute real-time Shuttle telemetry data to engineering workstations make it possible to develop computer applications that perform some of the low-level monitoring now performed by controllers. The number of opportunities for such applications suggests a need to build a pattern recognition tool kit to reduce software development effort through software reuse. We are building pattern recognition applications while keeping such a tool kit in mind. We demonstrated the initial prototype application, which identifies electrical equipment activations, during three recent Shuttle flights. This prototype was developed to test the viability of the basic system architecture, to evaluate the performance of several pattern recognition techniques including those based on cross-correlation, neural networks, and statistical methods, to understand the interplay between an advanced automation application and human controllers to enhance utility, and to identify capabilities needed in a more general-purpose tool kit.
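    One of the techniques named above, cross-correlation, can be illustrated with a minimal matched-filter step detector on a synthetic telemetry trace. The step-edge template, trace, and threshold-free "best score" logic are assumptions for illustration, not the tool kit's actual implementation or Shuttle data.

```python
# Hedged sketch: detect an equipment-activation step by cross-correlating the
# trace with a step-edge template (synthetic data, not Shuttle telemetry).

def detect_step(trace, half_width=3):
    """Return the sample index where a rising step most strongly correlates."""
    template = [-1.0] * half_width + [1.0] * half_width  # step-edge kernel
    best_i, best_score = None, float("-inf")
    for i in range(len(trace) - len(template) + 1):
        score = sum(t * x for t, x in zip(template, trace[i:i + len(template)]))
        if score > best_score:
            best_i, best_score = i + half_width, score  # center of the window
    return best_i

# Current jumps from 0 A to 5 A when a device activates at sample 10.
trace = [0.0] * 10 + [5.0] * 10
```

In practice a detector like this would also need a score threshold so that traces with no activation are not forced to yield a detection.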

  19. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing code base of a numerical simulator makes the code itself increasingly difficult to support. The problem of adequate documentation of astronomical software for adaptive optics simulators can complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions of what is implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB comments in M-files into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
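    The comment-translation step at the heart of this framework can be mimicked in a few lines. The toy Python version below (the paper's actual script is Perl and handles many more cases) rewrites a leading MATLAB `%` comment as a C++-style `//` comment so a C-mode parser like Doxygen can read it; the sample MATLAB lines are invented for illustration.

```python
# Hedged sketch of MATLAB-to-C comment translation for Doxygen consumption.

def matlab_comment_to_c(line):
    """Rewrite a leading MATLAB '%' comment marker as a C++ '//' marker."""
    stripped = line.lstrip()
    if stripped.startswith("%"):
        indent = line[: len(line) - len(stripped)]  # preserve indentation
        return indent + "//" + stripped[1:]
    return line  # code lines pass through unchanged

src = ["% @brief Wavefront reconstruction", "phi = G * slopes;"]
converted = [matlab_comment_to_c(l) for l in src]
```

A real translator must also handle block comments (`%{ ... %}`), comments trailing code, and `%` characters inside strings, which is why the authors used a dedicated script rather than a one-liner.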

  20. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  1. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open-source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The MathWorks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command-line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by other instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or as time windows triggered to some event.
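    The event-triggered windowing that such toolboxes provide can be sketched as follows. This is a Python illustration of the concept only (MEA-Tools itself is MATLAB code); the function name and the channels-by-samples array layout are assumptions.

    ```python
    import numpy as np

    def triggered_windows(data, trigger_idx, pre, post):
        """Cut event-triggered windows out of a continuous multi-channel
        recording (channels x samples). Triggers whose window would run
        past either end of the recording are dropped. Returns an array
        of shape (events, channels, pre + post)."""
        wins = []
        for i in trigger_idx:
            if i - pre >= 0 and i + post <= data.shape[1]:
                wins.append(data[:, i - pre:i + post])
        return np.stack(wins)
    ```

    Averaging the result over the first axis would give the event-triggered mean response per channel, a common first step in local field potential analysis.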

  2. Advanced Numerical-Algebraic Thinking: Constructing the Concept of Covariation as a Prelude to the Concept of Function

    ERIC Educational Resources Information Center

    Hitt, Fernando; Morasse, Christian

    2009-01-01

    Introduction: In this document we stress the importance of developing in children a structure for advanced numerical-algebraic thinking that can provide an element of control when solving mathematical situations. We analyze pupils' conceptions that induce errors in algebra due to a lack of control in connection with their numerical thinking. We…

  3. Technology Tools to Support Reading in the Digital Age

    ERIC Educational Resources Information Center

    Biancarosa, Gina; Griffiths, Gina G.

    2012-01-01

    Advances in digital technologies are dramatically altering the texts and tools available to teachers and students. These technological advances have created excitement among many for their potential to be used as instructional tools for literacy education. Yet with the promise of these advances come issues that can exacerbate the literacy…

  4. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
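    The Wishart construction for randomizing a nominal system matrix (e.g. a mass or stiffness matrix of a multi-body model) can be sketched as follows; the function name, the dispersion parameter, and its default value are illustrative assumptions rather than the paper's implementation.

    ```python
    import numpy as np

    def wishart_perturb(M, n_dof_scale=64, rng=None):
        """Draw a random symmetric positive-definite perturbation of a
        nominal matrix M using a Wishart-type construction whose mean
        equals M; larger `n_dof_scale` means less dispersion about the
        nominal. Illustrative sketch of the random-matrix idea in the
        abstract, not the authors' code."""
        rng = np.random.default_rng(rng)
        L = np.linalg.cholesky(M)            # M must be positive definite
        n = M.shape[0]
        p = n_dof_scale
        G = rng.standard_normal((n, p))
        W = (G @ G.T) / p                    # Wishart sample with mean I_n
        return L @ W @ L.T                   # random matrix with mean M
    ```

    Sampling this repeatedly and re-running the multi-body simulation for each draw yields the statistical distribution of vibration results from which source rankings under uncertainty can be derived.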

  5. Engineering bacterial translation initiation - Do we have all the tools we need?

    PubMed

    Vigar, Justin R J; Wieden, Hans-Joachim

    2017-11-01

    Reliable tools that allow precise and predictable control over gene expression are critical for the success of nearly all bioengineering applications. Translation initiation is the most regulated phase during protein biosynthesis, and is therefore a promising target for exerting control over gene expression. At the translational level, the copy number of a protein can be fine-tuned by altering the interaction between the translation initiation region of an mRNA and the ribosome. These interactions can be controlled by modulating the mRNA structure using numerous approaches, including small molecule ligands, RNAs, or RNA-binding proteins. A variety of naturally occurring regulatory elements have been repurposed, facilitating advances in synthetic gene regulation strategies. The pursuit of a comprehensive understanding of mechanisms governing translation initiation provides the framework for future engineering efforts. Here we outline state-of-the-art strategies used to predictably control translation initiation in bacteria. We also discuss current limitations in the field and future goals. Due to its function as the rate-determining step, initiation is the ideal point to exert effective translation regulation. Several engineering tools are currently available to rationally design the initiation characteristics of synthetic mRNAs. However, improvements are required to increase the predictability, effectiveness, and portability of these tools. Predictable and reliable control over translation initiation will allow greater predictability when designing, constructing, and testing genetic circuits. The ability to build more complex circuits predictably will advance synthetic biology and contribute to our fundamental understanding of the underlying principles of these processes. This article is part of a Special Issue entitled "Biochemistry of Synthetic Biology - Recent Developments"; Guest Editors: Dr. Ilka Heinemann and Dr. Patrick O'Donoghue.
Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation; to identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation; to formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools; and to recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  7. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken as well as a uniform set of development tools for the numerical software community.

  8. Geochemistry and the understanding of ground-water systems

    USGS Publications Warehouse

    Glynn, Pierre D.; Plummer, Niel

    2005-01-01

    Geochemistry has contributed significantly to the understanding of ground-water systems over the last 50 years. Historic advances include development of the hydrochemical facies concept, application of equilibrium theory, investigation of redox processes, and radiocarbon dating. Other hydrochemical concepts, tools, and techniques have helped elucidate mechanisms of flow and transport in ground-water systems, and have helped unlock an archive of paleoenvironmental information. Hydrochemical and isotopic information can be used to interpret the origin and mode of ground-water recharge, refine estimates of time scales of recharge and ground-water flow, decipher reactive processes, provide paleohydrological information, and calibrate ground-water flow models. Progress needs to be made in obtaining representative samples. Improvements are needed in the interpretation of the information obtained, and in the construction and interpretation of numerical models utilizing hydrochemical data. The best approach will ensure an optimized iterative process between field data collection and analysis, interpretation, and the application of forward, inverse, and statistical modeling tools. Advances are anticipated from microbiological investigations, the characterization of natural organics, isotopic fingerprinting, applications of dissolved gas measurements, and the fields of reaction kinetics and coupled processes. A thermodynamic perspective is offered that could facilitate the comparison and understanding of the multiple physical, chemical, and biological processes affecting ground-water systems.

  9. Annual Research Briefs, 2004: Center for Turbulence Research

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Mansour, Nagi N.

    2004-01-01

    This report contains the 2004 annual progress reports of the Research Fellows and students of the Center for Turbulence Research in its eighteenth year of operation. Since its inception in 1987, the objective of the CTR has been to advance the physical understanding of turbulent flows and the development of physics-based predictive tools for engineering analysis and turbulence control. Turbulence is ubiquitous in nature and in engineering devices. The studies at CTR have been motivated by applications where turbulence effects are significant; these include a broad range of technical areas such as planetary boundary layers, formation of planets, solar convection, magnetohydrodynamics, environmental systems and ecosystems, aerodynamic noise, propulsion systems, and high-speed transportation. Numerical simulation has been the predominant research tool at CTR, which has required a critical mass of researchers in numerical analysis and computer science in addition to core disciplines such as applied mathematics, chemical kinetics, and fluid mechanics. Maintaining and promoting this interdisciplinary culture has been a hallmark of CTR and has been responsible for the realization of the results of its basic research in applications. The first group of reports in this volume is directed towards the development, analysis, and application of novel numerical methods for flow simulations. Development of methods for large eddy simulation of complex flows has been a central theme in this group. The second group is concerned with turbulent combustion, scalar transport, and multi-phase flows. The final group is devoted to geophysical turbulence, where the problem of solar convection has recently been a focus of considerable attention at CTR.

  10. Numerical tools to predict the environmental loads for offshore structures under extreme weather conditions

    NASA Astrophysics Data System (ADS)

    Wu, Yanling

    2018-05-01

    In this paper, extreme waves were generated using the open-source computational fluid dynamics (CFD) tools OpenFOAM and Waves2FOAM with linear and nonlinear NewWave input, and were used to conduct numerical simulations of the wave impact process. Numerical tools based on first-order NewWave (with and without stretching) and second-order NewWave are investigated. Simulations predicting the force loading on an offshore platform under extreme weather conditions are implemented and compared.
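    A first-order (linear) NewWave input prescribes the most probable extreme surface elevation for a given sea-state spectrum: each spectral component is phase-focused so that all crests coincide at the focus time. A minimal sketch, in Python with illustrative names, assuming a discrete one-sided spectrum S(omega):

    ```python
    import numpy as np

    def newwave_elevation(t, omega, S, A, t0=0.0):
        """Linear NewWave free-surface elevation for a discrete spectrum
        S evaluated at angular frequencies omega: component amplitudes
        are proportional to the spectral density, normalized so the crest
        amplitude at the focus time t0 equals A. Illustrative sketch of
        the first-order NewWave input named in the abstract."""
        dw = np.gradient(omega)                 # frequency bin widths
        a_n = A * S * dw / np.sum(S * dw)       # component amplitudes
        # All components in phase at t = t0 (focused crest):
        return np.sum(a_n[:, None] * np.cos(np.outer(omega, t - t0)), axis=0)
    ```

    Away from the focus time the components de-phase, so the elevation decays toward the spectrum's autocorrelation envelope; this time series would be imposed at the wave-maker boundary of the CFD domain.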

  11. The MATH--Open Source Application for Easier Learning of Numerical Mathematics

    ERIC Educational Resources Information Center

    Glaser-Opitz, Henrich; Budajová, Kristina

    2016-01-01

    The article introduces a software application (MATH) supporting an education of Applied Mathematics, with focus on Numerical Mathematics. The MATH is an easy to use tool supporting various numerical methods calculations with graphical user interface and integrated plotting tool for graphical representation written in Qt with extensive use of Qwt…

  12. A Comparison of Satellite Conjunction Analysis Screening Tools

    DTIC Science & Technology

    2011-09-01

    visualization tool. Version 13.1.4 for Linux was tested. The SOAP conjunction analysis function does not have the capacity to perform the large...was examined by SOAP to confirm the conjunction. STK Advanced CAT STK Advanced CAT (Conjunction Analysis Tools) is an add-on module for the STK ...run with each tool. When attempting to perform the seven day all vs all analysis with STK Advanced CAT, the program consistently crashed during report

  13. Monitoring Object Library Usage and Changes

    NASA Technical Reports Server (NTRS)

    Owen, R. K.; Craw, James M. (Technical Monitor)

    1995-01-01

    The NASA Ames Numerical Aerodynamic Simulation program Aeronautics Consolidated Supercomputing Facility (NAS/ACSF) supercomputing center serves over 1600 users and has numerous analysts with root access. Several tools have been developed to monitor object library usage and changes. Some of the tools do "noninvasive" monitoring, and other tools implement run-time logging even for object-only libraries. The run-time logging identifies who, when, and what is being used. The benefits are that real usage can be measured, unused libraries can be discontinued, and training and optimization efforts can be focused on those numerical methods that are actually used. An overview of the tools will be given and the results will be discussed.

  14. Verification and Validation Strategy for LWRS Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carl M. Stoots; Richard R. Schultz; Hans D. Gougar

    2012-09-01

    One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203 the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high-fidelity models / codes, and scaling of aging effects.

  15. Prediction of material strength and fracture of glass using the SPHINX smooth particle hydrodynamics code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandell, D.A.; Wingate, C.A.

    1994-08-01

    The design of many military devices involves numerical predictions of the material strength and fracture of brittle materials. The materials of interest include ceramics, which are used in armor packages; glass, which is used in truck and jeep windshields and in helicopters; and rock and concrete, which are used in underground bunkers. As part of a program to develop advanced hydrocode design tools, the authors have implemented a brittle fracture model for glass into the SPHINX smooth particle hydrodynamics code. The authors have evaluated this model and the code by predicting data from one-dimensional flyer plate impacts into glass, and data from tungsten rods impacting glass. Since fractured glass properties, which are needed in the model, are not available, the authors performed sensitivity studies of these properties, as well as sensitivity studies to determine the number of particles needed in the calculations. The numerical results are in good agreement with the data.

  16. An Object-Oriented Finite Element Framework for Multiphysics Phase Field Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael R Tonks; Derek R Gaston; Paul C Millett

    2012-01-01

    The phase field approach is a powerful and popular method for modeling microstructure evolution. In this work, advanced numerical tools are used to create a phase field framework that facilitates rapid model development. This framework, called MARMOT, is based on Idaho National Laboratory's finite element Multiphysics Object-Oriented Simulation Environment. In MARMOT, the system of phase field partial differential equations (PDEs) is solved simultaneously with PDEs describing additional physics, such as solid mechanics and heat conduction, using the Jacobian-free Newton-Krylov method. An object-oriented architecture is created by taking advantage of commonalities in phase field models to facilitate development of new models with very little written code. In addition, MARMOT provides access to mesh and time-step adaptivity, reducing the cost of performing simulations with large disparities in both spatial and temporal scales. In this work, phase separation simulations are used to show the numerical performance of MARMOT. Deformation-induced grain growth and void growth simulations are included to demonstrate the multiphysics capability.

  17. Hosted Services for Advanced V and V Technologies: An Approach to Achieving Adoption without the Woes of Usage

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.

    2003-01-01

    Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false-positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.

  18. Selected aspects of microelectronics technology and applications: Numerically controlled machine tools. Technology trends series no. 2

    NASA Astrophysics Data System (ADS)

    Sigurdson, J.; Tagerud, J.

    1986-05-01

    A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, the definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from data bases. Diagrams, statistics, and a bibliography are included.

  19. Advancing Measurement Science to Assess Monitoring, Diagnostics, and Prognostics for Manufacturing Robotics

    PubMed Central

    Qiao, Guixiu; Weiss, Brian A.

    2016-01-01

    Unexpected equipment downtime is a ‘pain point’ for manufacturers, especially in that this event usually translates to financial losses. To minimize this pain point, manufacturers are developing new health monitoring, diagnostic, prognostic, and maintenance (collectively known as prognostics and health management (PHM)) techniques to advance the state-of-the-art in their maintenance strategies. The manufacturing community has a wide range of needs with respect to the advancement and integration of PHM technologies to enhance manufacturing robotic system capabilities. Numerous researchers, including personnel from the National Institute of Standards and Technology (NIST), have identified a broad landscape of barriers and challenges to advancing PHM technologies. One such challenge is the verification and validation of PHM technology through the development of performance metrics, test methods, reference datasets, and supporting tools. Besides documenting and presenting the research landscape, NIST personnel are actively researching PHM for robotics to promote the development of innovative sensing technology and prognostic decision algorithms and to produce a positional accuracy test method that emphasizes the identification of static and dynamic positional accuracy. The test method development will provide manufacturers with a methodology that will allow them to quickly assess the positional health of their robot systems along with supporting the verification and validation of PHM techniques for the robot system. PMID:28058172

  20. Advancing Measurement Science to Assess Monitoring, Diagnostics, and Prognostics for Manufacturing Robotics.

    PubMed

    Qiao, Guixiu; Weiss, Brian A

    2016-01-01

    Unexpected equipment downtime is a 'pain point' for manufacturers, especially in that this event usually translates to financial losses. To minimize this pain point, manufacturers are developing new health monitoring, diagnostic, prognostic, and maintenance (collectively known as prognostics and health management (PHM)) techniques to advance the state-of-the-art in their maintenance strategies. The manufacturing community has a wide range of needs with respect to the advancement and integration of PHM technologies to enhance manufacturing robotic system capabilities. Numerous researchers, including personnel from the National Institute of Standards and Technology (NIST), have identified a broad landscape of barriers and challenges to advancing PHM technologies. One such challenge is the verification and validation of PHM technology through the development of performance metrics, test methods, reference datasets, and supporting tools. Besides documenting and presenting the research landscape, NIST personnel are actively researching PHM for robotics to promote the development of innovative sensing technology and prognostic decision algorithms and to produce a positional accuracy test method that emphasizes the identification of static and dynamic positional accuracy. The test method development will provide manufacturers with a methodology that will allow them to quickly assess the positional health of their robot systems along with supporting the verification and validation of PHM techniques for the robot system.

  1. Left ventricular fluid mechanics: the long way from theoretical models to clinical applications.

    PubMed

    Pedrizzetti, Gianni; Domenichini, Federico

    2015-01-01

    The flow inside the left ventricle is characterized by the formation of vortices that smoothly accompany blood from the mitral inlet to the aortic outlet. Computational fluid dynamics has shed some light on the fundamental processes involved in vortex motion. More recently, patient-specific numerical simulations are becoming an increasingly feasible tool that can be integrated with developing imaging technologies. The existing computational methods are reviewed from the perspective of their potential role as a novel aid for advanced clinical analysis. The current results obtained by simulation methods, either alone or in combination with medical imaging, are summarized. Open problems are highlighted and prospective clinical applications are discussed.

  2. Monitoring environmental pollutants by microchip capillary electrophoresis with electrochemical detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Gang; Lin, Yuehe; Wang, Joseph

    2006-01-15

    This is a review article. During the past decade, significant progress in the development of miniaturized microfluidic systems has occurred due to the numerous advantages of microchip analysis. This review focuses on recent advances and the key strategies in microchip capillary electrophoresis (CE) with electrochemical detection (ECD) for separating and detecting a variety of environmental pollutants. The subjects covered include the fabrication of microfluidic chips, ECD, typical applications of microchip CE with ECD in environmental analysis, and future prospects. It is expected that microchip CE-ECD will become a powerful tool in the environmental field and will lead to the creation of truly portable devices.

  3. Application of an enhanced discrete element method to oil and gas drilling processes

    NASA Astrophysics Data System (ADS)

    Ubach, Pere Andreu; Arrufat, Ferran; Ring, Lev; Gandikota, Raju; Zárate, Francisco; Oñate, Eugenio

    2016-03-01

    The authors present results on the use of the discrete element method (DEM) for the simulation of drilling processes typical in the oil and gas exploration industry. The numerical method uses advanced DEM techniques using a local definition of the DEM parameters and combined FEM-DEM procedures. This paper presents a step-by-step procedure to build a DEM model for analysis of the soil region coupled to a FEM model for discretizing the drilling tool that reproduces the drilling mechanics of a particular drill bit. A parametric study has been performed to determine the model parameters in order to maintain accurate solutions with reduced computational cost.

  4. Chiral reagents in glycosylation and modification of carbohydrates.

    PubMed

    Wang, Hao-Yuan; Blaszczyk, Stephanie A; Xiao, Guozhi; Tang, Weiping

    2018-02-05

    Carbohydrates play a significant role in numerous biological events, and the chemical synthesis of carbohydrates is vital for further studies to understand their various biological functions. Due to the structural complexity of carbohydrates, the stereoselective formation of glycosidic linkages and the site-selective modification of hydroxyl groups are very challenging and at the same time extremely important. In recent years, the rapid development of chiral reagents including both chiral auxiliaries and chiral catalysts has significantly improved the stereoselectivity for glycosylation reactions and the site-selectivity for the modification of carbohydrates. These new tools will greatly facilitate the efficient synthesis of oligosaccharides, polysaccharides, and glycoconjugates. In this tutorial review, we will summarize these advances and highlight the most recent examples.

  5. Carbofluoresceins and Carborhodamines as Scaffolds for High-Contrast Fluorogenic Probes

    PubMed Central

    2013-01-01

    Fluorogenic molecules are important tools for advanced biochemical and biological experiments. The extant collection of fluorogenic probes is incomplete, however, leaving regions of the electromagnetic spectrum unutilized. Here, we synthesize green-excited fluorescent and fluorogenic analogues of the classic fluorescein and rhodamine 110 fluorophores by replacement of the xanthene oxygen with a quaternary carbon. These anthracenyl “carbofluorescein” and “carborhodamine 110” fluorophores exhibit excellent fluorescent properties and can be masked with enzyme- and photolabile groups to prepare high-contrast fluorogenic molecules useful for live cell imaging experiments and super-resolution microscopy. Our divergent approach to these red-shifted dye scaffolds will enable the preparation of numerous novel fluorogenic probes with high biological utility. PMID:23557713

  6. Engineering hurdles in contact and intraocular lens lathe design: the view ahead

    NASA Astrophysics Data System (ADS)

    Bradley, Norman D.; Keller, John R.; Ball, Gary A.

    1994-05-01

    Current trends in contact and intraocular lens design suggest ever-increasing demand for aspheric lens geometries - multisurface and/or toric surfaces - in a variety of new materials. As computer numerically controlled (CNC) lathes and mills continue to evolve with the ophthalmic market, engineering hurdles present themselves to designers: Can hardware based upon single-point diamond turning accommodate the demands of software-driven designs? What are the limits of CNC resolution and repeatability in high-throughput production? What are the controlling factors in lathed, polish-free surface production? Emerging technologies in the lathed biomedical optics field are discussed along with their limitations, including refined diamond tooling, vibrational control, automation, and advanced motion control systems.

  7. Advances in Diagnosis, Surveillance, and Monitoring of Zika Virus: An Update

    PubMed Central

    Singh, Raj K.; Dhama, Kuldeep; Karthik, Kumaragurubaran; Tiwari, Ruchi; Khandia, Rekha; Munjal, Ashok; Iqbal, Hafiz M. N.; Malik, Yashpal S.; Bueno-Marí, Rubén

    2018-01-01

    Zika virus (ZIKV) is associated with numerous human health-related disorders, including fetal microcephaly, neurological signs, and autoimmune disorders such as Guillain-Barré syndrome (GBS). In view of the losses associated with ZIKV, the World Health Organization (WHO) declared it a global public health emergency in 2016. In consequence, an upsurge in research on ZIKV was seen around the globe, with significant attainments in developing effective diagnostics, drugs, therapies, and vaccines countering this life-threatening virus at an early step. State-of-the-art tools have allowed researchers to explore the virus at the molecular level and to conduct in-depth epidemiological investigations to understand the reasons for its increased pathogenicity and varied clinical manifestations. These days, ZIKV infection is diagnosed based on clinical manifestations along with serological and molecular detection tools. Because isolation of ZIKV is a tedious task, molecular assays such as reverse transcription-polymerase chain reaction (RT-PCR), real-time qRT-PCR, loop-mediated isothermal amplification (LAMP), lateral flow assays (LFAs), biosensors, nucleic acid sequence-based amplification (NASBA) tests, and strand invasion-based amplification tests, as well as immunoassays like the enzyme-linked immunosorbent assay (ELISA), are in use to ascertain ZIKV infection or Zika fever. Herein, this review highlights the recent advances in the diagnosis, surveillance, and monitoring of ZIKV. These new insights can aid in the rapid and definitive detection of this virus and/or Zika fever, and the summarized information will aid the design and adoption of effective prevention and control strategies to counter this viral pathogen of great public health concern. PMID:29403448

  8. Hydroforming Of Patchwork Blanks — Numerical Modeling And Experimental Validation

    NASA Astrophysics Data System (ADS)

    Lamprecht, Klaus; Merklein, Marion; Geiger, Manfred

    2005-08-01

    In comparison to the commonly applied technology of tailored blanks, the concept of patchwork blanks offers a number of additional advantages. Potential application areas for patchwork blanks in the automotive industry include local reinforcement of automotive closures, structural reinforcement of rails and pillars, and shock towers. Yet even though there is significant application potential for patchwork blanks in automobile production, industrial realization of this innovative technique has been slowed by a lack of knowledge regarding the forming behavior and the numerical modeling of patchwork blanks. Especially for the numerical simulation of hydroforming processes, where one part of the forming tool is replaced by a fluid under pressure, advanced modeling techniques are required to ensure an accurate prediction of the blanks' forming behavior. The objective of this contribution is to provide an appropriate model for the numerical simulation of patchwork-blank forming processes. Different finite element modeling techniques for patchwork blanks are therefore presented. In addition to basic shell element models, a combined finite element model consisting of shell and solid elements is defined. Special emphasis is placed on the modeling of the weld seam. For this purpose the local mechanical properties of the weld metal, which have been determined by means of Martens-hardness measurements and uniaxial tensile tests, are integrated into the finite element models. The results obtained from the numerical simulations are compared to experimental data from a hydraulic bulge test, with the focus on laser- and spot-welded patchwork blanks.

  9. Numerical Zooming Between a NPSS Engine System Simulation and a One-Dimensional High Compressor Analysis Code

    NASA Technical Reports Server (NTRS)

    Follen, Gregory; auBuchon, M.

    2000-01-01

    Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines, such as aerodynamics, structures, and heat transfer, along with the concept of numerical zooming from zero-dimensional to one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS environment, the subject of this paper is numerical zooming between an NPSS engine simulation and higher-fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation, and a discussion of the results illustrated using an advanced data visualization tool. This type of high-fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual testing conducted prior to committing the design to hardware.
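
    The zooming step described here, folding row-by-row 1-D compressor results back into the single operating point that a 0-D cycle simulation consumes, can be sketched as follows. The row data, averaging rule, and printout are invented for illustration and do not represent NPSS's actual interfaces:

```python
# Hypothetical row-by-row (1-D) compressor results: (pressure ratio,
# polytropic efficiency) per blade row. All values are assumed.
rows = [(1.35, 0.91), (1.32, 0.90), (1.30, 0.89), (1.28, 0.90)]

# "Zoom out": collapse the 1-D results into the two scalars a 0-D
# engine cycle model expects for the whole compressor.
overall_pr = 1.0
for pr, _ in rows:
    overall_pr *= pr  # pressure ratios multiply through the rows

# Simple arithmetic mean as a placeholder averaging rule; a real zooming
# step would use a thermodynamically consistent (e.g., work-weighted) average.
overall_eff = sum(eff for _, eff in rows) / len(rows)

print(overall_pr, overall_eff)
```

    In the paper's terms, the 0-D NPSS simulation would then re-balance the engine cycle using these zoomed quantities in place of its original component map values.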

  10. Analysis of the thermo-mechanical deformations in a hot forging tool by numerical simulation

    NASA Astrophysics Data System (ADS)

    L-Cancelos, R.; Varas, F.; Martín, E.; Viéitez, I.

    2016-03-01

    Although programs have been developed for the design of hot forging tools, their design is still largely based on the experience of the tool maker. This makes it necessary to build test matrices and correct their errors to minimize distortions in the forged piece, a phase prior to mass production that consumes time and material resources and makes the final product more expensive. Forging tools are usually composed of various parts made of different grades of steel, which in turn have different mechanical properties and therefore suffer different degrees of strain. Furthermore, the tools used in hot forging are exposed to a thermal field that also induces strain or stress depending on the degree of confinement of the piece. The mechanical behaviour of the assembly is therefore determined by the contact between the different pieces. Numerical simulation makes it possible to analyse different configurations and anticipate possible defects before tool making, thus reducing the costs of this preliminary phase. In order to improve the dimensional quality of the manufactured parts, the work presented here focuses on the application of a numerical model to a hot forging manufacturing process in order to predict the areas of the forging die subjected to large deformations. The thermo-mechanical model, developed and implemented with free software (Code_Aster), includes strains of thermal origin, strains during forge impact, and contact effects. The numerical results are validated with experimental measurements on a tooling set that produces forged crankshafts for the automotive industry and show good agreement with the experimental tests. A very useful tool for the design of tooling sets for hot forging is thereby achieved.
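
    For the strains of thermal origin mentioned above, the simplest fully constrained limit reduces to ε = αΔT and σ = EαΔT; a minimal sketch with generic tool-steel values (assumed here, not taken from the paper):

```python
# Fully constrained thermal stress estimate for a die surface.
# All material values are generic tool-steel assumptions.
E = 210e9       # Young's modulus, Pa
alpha = 12e-6   # coefficient of thermal expansion, 1/K
dT = 400.0      # temperature rise of the die surface, K

eps_thermal = alpha * dT   # free thermal strain
sigma = E * eps_thermal    # stress if the expansion is fully suppressed

print(f"strain = {eps_thermal:.4f}, stress = {sigma/1e6:.0f} MPa")
```

    The large resulting stress (on the order of 1 GPa for a 400 K rise) is why the degree of confinement of the piece, captured by the contact model, dominates the behaviour of the assembly.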

  11. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  12. EUV microexposures at the ALS using the 0.3-NA MET projectionoptics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naulleau, Patrick; Goldberg, Kenneth A.; Anderson, Erik

    2005-09-01

    The recent development of high-numerical-aperture (NA) EUV optics such as the 0.3-NA Micro Exposure Tool (MET) optic has given rise to a new class of ultra-high-resolution microexposure stations. One such printing station has been developed and implemented at Lawrence Berkeley National Laboratory's Advanced Light Source. This flexible printing station utilizes a programmable coherence illuminator providing real-time pupil-fill control for advanced EUV resist and mask development, enabling several unique capabilities. Using dipole illumination out to σ = 1, the Berkeley tool supports equal-line-space printing down to 12 nm, well beyond the capabilities of similar tools. Using small-sigma illumination combined with the central obscuration of the MET optic enables the system to print feature sizes that are twice as small as those coded on the mask. In this configuration, the effective 10x demagnification for equal lines and spaces reduces the mask fabrication burden for ultra-high-resolution printing. The illuminator facilitates coherence studies, such as the impact of coherence on line-edge roughness (LER) and flare, and enables novel print-based aberration monitoring techniques as described elsewhere in these proceedings. Here we describe the capabilities of the new MET printing station, present system characterization results, and present the latest printing results obtained in experimental resists. Limited by the availability of high-resolution photoresists, equal line-space printing down to 25 nm has been demonstrated, as well as isolated-line printing down to 29 nm with an LER approaching 3 nm.
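
    The link between the 0.3 NA and the quoted feature sizes follows the Rayleigh scaling, half-pitch = k1·λ/NA; a minimal sketch (the k1 values below are illustrative assumptions, not figures reported by the authors):

```python
# Rayleigh-style half-pitch estimate for an EUV projection optic.
def half_pitch(k1, wavelength_nm=13.5, na=0.3):
    """Smallest printable half-pitch in nm for a given process factor k1."""
    return k1 * wavelength_nm / na

# Dipole illumination out to sigma = 1 approaches the coherent k1 = 0.25 limit,
# consistent with the ~12 nm equal-line-space figure quoted above:
print(half_pitch(0.25))   # 11.25 nm
print(half_pitch(0.5))    # 22.5 nm, a conventional-illumination regime
```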

  13. A Comprehensive Look at Polypharmacy and Medication Screening Tools for the Older Cancer Patient

    PubMed Central

    DeGregory, Kathlene A.; Morris, Amy L.; Ramsdale, Erika E.

    2016-01-01

    Inappropriate medication use and polypharmacy are extremely common among older adults. Numerous studies have discussed the importance of a comprehensive medication assessment in the general geriatric population. However, only a handful of studies have evaluated inappropriate medication use in the geriatric oncology patient. Almost a dozen medication screening tools exist for the older adult. Each available tool has the potential to improve aspects of the care of older cancer patients, but no single tool has been developed for this population. We extensively reviewed the literature (MEDLINE, PubMed) to evaluate and summarize the most relevant medication screening tools for older patients with cancer. Findings of this review support the use of several screening tools concurrently for the elderly patient with cancer. A deprescribing tool should be developed and included in a comprehensive geriatric oncology assessment. Finally, prospective studies are needed to evaluate such a tool to determine its feasibility and impact in older patients with cancer. Implications for Practice: The prevalence of polypharmacy increases with advancing age. Older adults are more susceptible to adverse effects of medications. “Prescribing cascades” are common, whereas “deprescribing” remains uncommon; thus, older patients tend to accumulate medications over time. Older patients with cancer are at high risk for adverse drug events, in part because of the complexity and intensity of cancer treatment. Additionally, a cancer diagnosis often alters assessments of life expectancy, clinical status, and competing risk. Screening for polypharmacy and potentially inappropriate medications could reduce the risk for adverse drug events, enhance quality of life, and reduce health care spending for older cancer patients. PMID:27151653

  14. Optimization of Process Parameters for High Efficiency Laser Forming of Advanced High Strength Steels within Metallurgical Constraints

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, Ghazal; Griffiths, Jonathan; Dearden, Geoff; Edwardson, Stuart P.

    Laser forming (LF) has been shown to be a viable alternative for forming automotive-grade advanced high strength steels (AHSS). Due to their high strength and heat sensitivity, these steels have low conventional formability, exhibiting early fracture, large springback, batch-to-batch inconsistency, and high tool wear. In this paper, optimisation of the LF process parameters has been conducted to further understand the impact of a surface heat treatment on DP1000. An FE numerical simulation has been developed to analyse the dynamic thermo-mechanical effects and has been verified against empirical data. The goal of the optimisation has been to develop a usable process window for the LF of AHSS within strict metallurgical constraints. Results indicate that it is possible to laser form this material; however, a complex relationship has been found between the generation and maintenance of hardness values in the heated zone. A laser surface hardening effect has been observed that could be beneficial to the efficiency of the process.

  15. Exploring a Multiphysics Resolution Approach for Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Estupinan Donoso, Alvaro Antonio; Peters, Bernhard

    2018-06-01

    Metal additive manufacturing (AM) is a fast-evolving technology aiming to efficiently produce complex parts while saving resources. Worldwide, active research is being performed to solve the existing challenges of this growing technique. Constant computational advances have enabled multiscale and multiphysics numerical tools that complement traditional physical experimentation. In this contribution, an advanced discrete-continuous concept is proposed to address the physical phenomena involved during laser powder bed fusion. The concept treats the powder as discrete by the extended discrete element method, which predicts the thermodynamic state and phase change for each particle. The surrounding fluid is solved with multiphase computational fluid dynamics techniques to determine momentum, heat, gas, and liquid transfer. The results thus track the positions and thermochemical history of individual particles in conjunction with the temperature and composition of the prevailing fluid phases. It is believed that this methodology can complement experimental research through analysis of the comprehensive results that can be extracted from it, enabling optimization of AM processes for parts qualification.

  16. Advances in the in Vivo Raman Spectroscopy of Malignant Skin Tumors Using Portable Instrumentation

    PubMed Central

    Kourkoumelis, Nikolaos; Balatsoukas, Ioannis; Moulia, Violetta; Elka, Aspasia; Gaitanis, Georgios; Bassukas, Ioannis D.

    2015-01-01

    Raman spectroscopy has emerged as a promising tool for real-time clinical diagnosis of malignant skin tumors offering a number of potential advantages: it is non-intrusive, it requires no sample preparation, and it features high chemical specificity with minimal water interference. However, in vivo tissue evaluation and accurate histopathological classification remain a challenging task for the successful transition from laboratory prototypes to clinical devices. In the literature, there are numerous reports on the applications of Raman spectroscopy to biomedical research and cancer diagnostics. Nevertheless, cases where real-time, portable instrumentations have been employed for the in vivo evaluation of skin lesions are scarce, despite their advantages in use as medical devices in the clinical setting. This paper reviews the advances in real-time Raman spectroscopy for the in vivo characterization of common skin lesions. The translational momentum of Raman spectroscopy towards the clinical practice is revealed by (i) assembling the technical specifications of portable systems and (ii) analyzing the spectral characteristics of in vivo measurements. PMID:26132563

  17. Gene therapy and genome surgery in the retina.

    PubMed

    DiCarlo, James E; Mahajan, Vinit B; Tsang, Stephen H

    2018-06-01

    Precision medicine seeks to treat disease with molecular specificity. Advances in genome sequence analysis, gene delivery, and genome surgery have allowed clinician-scientists to treat genetic conditions at the level of their pathology. As a result, progress in treating retinal disease using genetic tools has advanced tremendously over the past several decades. Breakthroughs in gene delivery vectors, both viral and nonviral, have allowed the delivery of genetic payloads in preclinical models of retinal disorders and have paved the way for numerous successful clinical trials. Moreover, the adaptation of CRISPR-Cas systems for genome engineering has enabled the correction of both recessive and dominant pathogenic alleles, expanding the disease-modifying power of gene therapies. Here, we highlight the translational progress of gene therapy and genome editing for several retinal disorders, including RPE65-, CEP290-, and GUCY2D-associated Leber congenital amaurosis, as well as choroideremia, achromatopsia, Mer tyrosine kinase (MERTK)- and RPGR-associated X-linked retinitis pigmentosa, Usher syndrome, neovascular age-related macular degeneration, X-linked retinoschisis, Stargardt disease, and Leber hereditary optic neuropathy.

  18. REACH: Real-Time Data Awareness in Multi-Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Maks, Lori; Coleman, Jason; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    NASA's Advanced Architectures and Automation Branch at the Goddard Space Flight Center (Code 588) saw the potential to reduce the cost of constellation missions by creating new user interfaces to ground system health-and-safety data. The goal is to enable a small Flight Operations Team (FOT) to remain aware of and responsive to the increased amount of ground system information in a multi-spacecraft environment. Rather than abandon the tried and true, these interfaces were developed to run alongside existing ground system software to provide additional support to the FOT. These new user interfaces have been combined in a tool called REACH, the Real-time Evaluation and Analysis of Consolidated Health: a software product that uses advanced visualization techniques to make spacecraft anomalies easy to spot, no matter how many spacecraft are in the constellation. REACH reads numerous real-time streams of data from the ground system(s) and displays synthesized information to the FOT such that anomalies are easy to pick out and investigate.

  19. Recent Advances in Algal Genetic Tool Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Dahlin, Lukas; T. Guarnieri, Michael

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  20. Recent Advances in Algal Genetic Tool Development

    DOE PAGES

    R. Dahlin, Lukas; T. Guarnieri, Michael

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  1. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations.

  2. Designing drugs on the internet? Free web tools and services supporting medicinal chemistry.

    PubMed

    Ertl, Peter; Jelfs, Stephen

    2007-01-01

    The drug discovery process is supported by a multitude of freely available tools on the Internet. This paper summarizes some of the databases and tools that are of particular interest to medicinal chemistry. These include numerous data collections that provide access to valuable chemical data resources, allowing complex queries of compound structures, associated physicochemical properties, and biological activities to be performed and, in many cases, providing links to commercial chemical suppliers. Further applications are available for searching protein-ligand complexes and identifying the important binding interactions that occur. This is particularly useful for understanding the molecular recognition of ligands in the lead optimization process. The Internet also provides access to databases detailing metabolic pathways and transformations, which can provide insight into disease mechanisms, identify new target entities, or reveal the potential off-target effects of a drug candidate. Furthermore, sophisticated online cheminformatics tools are available for processing chemical structures, predicting properties, and generating 2D or 3D structure representations, steps often required prior to more advanced analyses. The Internet provides a wealth of valuable resources that, if fully exploited, can greatly benefit the drug discovery community. In this paper, we provide an overview of some of the more important of these and, in particular, the freely accessible resources that are currently available.

  3. Application of the gene editing tool, CRISPR-Cas9, for treating neurodegenerative diseases.

    PubMed

    Kolli, Nivya; Lu, Ming; Maiti, Panchanan; Rossignol, Julien; Dunbar, Gary L

    2018-01-01

    Increased accumulation of transcribed protein from damaged DNA and reduced DNA repair capability contribute to numerous neurological diseases for which effective treatments are lacking. Gene editing techniques provide new hope for replacing defective genes and DNA associated with neurological diseases. With advances in editing tools such as zinc finger nucleases (ZFNs), meganucleases, and transcription activator-like effector nucleases (TALENs), scientists are able to design DNA-binding proteins that can make precise double-strand breaks (DSBs) at the target DNA. The recently developed CRISPR-Cas9 gene-editing technology has proven more precise and efficient than most other gene-editing techniques. Two repair pathways, non-homologous end joining (NHEJ) and homology-directed repair (HDR), are exploited in the CRISPR-Cas9 system to efficiently excise defective genes and incorporate exogenous DNA at the target site. In this review article, we provide an overview of the CRISPR-Cas9 methodology, including its molecular mechanism, with a focus on how this gene-editing tool can be used to counteract certain genetic defects associated with neurological diseases. Detailed understanding of this new tool could help researchers design specific gene editing strategies to repair genetic disorders in selective neurological diseases. Copyright © 2017 Elsevier Ltd. All rights reserved.
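
    To make the targeting step concrete, the toy sketch below scans a DNA string for 20-nt protospacers followed by an NGG PAM, the sequence requirement that positions Cas9's double-strand break. The sequence and function are invented for illustration; real guide design relies on genome-wide specificity screening:

```python
# Toy Cas9 target-site finder: report every 20-nt protospacer whose
# 3' neighbour is an NGG PAM. Purely illustrative, not a design tool.
def find_cas9_sites(seq):
    sites = []
    for i in range(len(seq) - 22):
        pam = seq[i + 20:i + 23]
        if pam[1:] == "GG":  # NGG PAM immediately 3' of the protospacer
            sites.append((i, seq[i:i + 20], pam))
    return sites

dna = "ATGCTTACGGATCCATTGCAACTGGCATTGACGGTACCTAGG"  # made-up sequence
for start, protospacer, pam in find_cas9_sites(dna):
    print(start, protospacer, pam)  # Cas9 cuts ~3 bp 5' of each PAM
```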

  4. Advancing MODFLOW Applying the Derived Vector Space Method

    NASA Astrophysics Data System (ADS)

    Herrera, G. S.; Herrera, I.; Lemus-García, M.; Hernandez-Garcia, G. D.

    2015-12-01

    The most effective domain decomposition methods (DDM) are non-overlapping DDMs. Recently a new approach, the DVS framework, based on an innovative discretization method that uses a non-overlapping system of nodes (the derived nodes), was introduced and developed by I. Herrera et al. [1, 2]. Using the DVS approach, a group of four algorithms, referred to as the 'DVS algorithms', which fulfill the DDM paradigm (i.e., the solution of global problems is obtained by the resolution of local problems exclusively), has been derived. Such procedures are applicable to any boundary-value problem, or system of such equations, for which a standard discretization method is available, and software with a high degree of parallelization can then be constructed. In a parallel talk at this AGU Fall Meeting, Ismael Herrera will introduce the general DVS methodology. The application of the DVS algorithms has been demonstrated in the solution of several boundary-value problems of interest in geophysics. Numerical examples for a single equation, for the cases of symmetric, non-symmetric, and indefinite problems, were demonstrated before [1, 2]. For these problems the DVS algorithms exhibited significantly improved numerical performance with respect to standard versions of DDM algorithms. In view of these results, our research group is applying the DVS method to a widely used simulator for the first time; here we present the advances of the application of this method to the parallelization of MODFLOW. Efficiency results for a group of tests will be presented. References: [1] I. Herrera, L.M. de la Cruz and A. Rosas-Medina, "Non-overlapping discretization methods for partial differential equations," Numer. Meth. Part. D. E. (2013). [2] I. Herrera and I. Contreras, "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity," Geofísica Internacional, 2015 (in press).
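
    The DDM paradigm invoked above (the global solution obtained exclusively from independent local solves plus a small interface problem) can be illustrated with the classical non-overlapping Schur-complement construction on a 1-D Poisson problem. This is a textbook sketch of the idea the DVS algorithms generalize, not the DVS method itself:

```python
import numpy as np

# -u'' = f on (0,1), u(0) = u(1) = 0, standard second-order finite differences.
n = 9                           # interior nodes; the middle node is the interface
h = 1.0 / (n + 1)
f = np.ones(n)
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

i = n // 2                      # interface node
I1 = np.arange(0, i)            # left subdomain (local problem 1)
I2 = np.arange(i + 1, n)        # right subdomain (local problem 2)

# Independent local solves -- the parallelizable pieces of the DDM paradigm.
A11, A22 = A[np.ix_(I1, I1)], A[np.ix_(I2, I2)]
A1g, Ag1 = A[np.ix_(I1, [i])], A[np.ix_([i], I1)]
A2g, Ag2 = A[np.ix_(I2, [i])], A[np.ix_([i], I2)]
x1, y1 = np.linalg.solve(A11, f[I1]), np.linalg.solve(A11, A1g)
x2, y2 = np.linalg.solve(A22, f[I2]), np.linalg.solve(A22, A2g)

# Small interface (Schur complement) problem -- 1x1 in this toy case.
S = A[i, i] - Ag1 @ y1 - Ag2 @ y2
g = f[i] - Ag1 @ x1 - Ag2 @ x2
ug = np.linalg.solve(S, g)

# Back-substitute the interface value into the local solutions.
u = np.empty(n)
u[i] = ug[0]
u[I1] = x1 - y1 @ ug
u[I2] = x2 - y2 @ ug

print(np.allclose(u, np.linalg.solve(A, f)))   # True: matches the global solve
```

    With many subdomains the same pattern yields a small global interface system whose local pieces are computed fully in parallel, which is the property the abstract attributes to the DVS algorithms.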

  5. Utility of the advanced chronic kidney disease patient management tools: case studies.

    PubMed

    Patwardhan, Meenal B; Matchar, David B; Samsa, Gregory P; Haley, William E

    2008-01-01

    Appropriate management of advanced chronic kidney disease (CKD) delays or limits its progression. The Advanced CKD Patient Management Toolkit was developed using a process-improvement technique to assist patient management and address CKD-specific management issues. We pilot-tested the toolkit in 2 community nephrology practices, assessed the utility of individual tools, and evaluated the impact on conformance to an advanced CKD guideline through patient chart abstraction. Tool use was distinct in the 2 sites and depended on the site champion's involvement, the extent of process reconfiguration demanded by a tool, and its perceived value. Baseline conformance varied across guideline recommendations (averaging 54%). Posttrial conformance increased in all clinical areas (averaging 59%). Valuable features of the toolkit in real-world settings were its ability to facilitate tool selection, direct implementation efforts in response to a baseline performance audit, and allow selection and customization of tool versions. Our results suggest that systematically created, multifaceted, and customizable tools can promote guideline conformance.

  6. Advances and unresolved challenges in the structural characterization of isomeric lipids.

    PubMed

    Hancock, Sarah E; Poad, Berwyck L J; Batarseh, Amani; Abbott, Sarah K; Mitchell, Todd W

    2017-05-01

    As the field of lipidomics grows and its application becomes wide and varied, it is important that we do not forget its foundation, i.e. the identification and measurement of molecular lipids. Advances in liquid chromatography and the emergence of ion mobility as a useful tool in lipid analysis are allowing greater separation of lipid isomers than ever before. At the same time, novel ion activation techniques, such as ozone-induced dissociation, are pushing lipid structural characterization by mass spectrometry to new levels. Nevertheless, the quantitative capacity of these techniques is yet to be proven, and further refinements are required to unravel the high level of lipid complexity found in biological samples. At present there is no one technique capable of providing full structural characterization of lipids from a biological sample. There are, however, numerous techniques now available (as discussed in this review) that could be deployed in a targeted approach. Moving forward, the combination of advanced separation and ion activation techniques is likely to provide mass spectrometry-based lipidomics with its best opportunity to achieve complete molecular-level lipid characterization and measurement from complex mixtures. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.

  7. Advancing the field of 3D biomaterial printing.

    PubMed

    Jakus, Adam E; Rutz, Alexandra L; Shah, Ramille N

    2016-01-11

    3D biomaterial printing has emerged as a potentially revolutionary technology, promising to transform both research and medical therapeutics. Although there has been recent progress in the field, on-demand fabrication of functional and transplantable tissues and organs is still a distant reality. To advance to this point, there are two major technical challenges that must be overcome. The first is expanding upon the limited variety of available 3D printable biomaterials (biomaterial inks), which currently do not adequately represent the physical, chemical, and biological complexity and diversity of tissues and organs within the human body. Newly developed biomaterial inks and the resulting 3D printed constructs must meet numerous interdependent requirements, including those that lead to optimal printing, structural, and biological outcomes. The second challenge is developing and implementing comprehensive biomaterial ink and printed structure characterization combined with in vitro and in vivo tissue- and organ-specific evaluation. This perspective outlines considerations for addressing these technical hurdles that, once overcome, will facilitate rapid advancement of 3D biomaterial printing as an indispensable tool for both investigating complex tissue and organ morphogenesis and for developing functional devices for a variety of diagnostic and regenerative medicine applications.

  8. Aura of technology and the cutting edge: a history of lasers in neurosurgery.

    PubMed

    Ryan, Robert W; Spetzler, Robert F; Preul, Mark C

    2009-09-01

    In this historical review the authors examine the important developments that have led to the availability of laser energy to neurosurgeons as a unique and sometimes invaluable tool. They review the physical science behind the function of lasers, as well as how and when various lasers based on different lasing media were discovered. They also follow the close association between advances in laser technology and their application in biomedicine, from early laboratory experiments to the first clinical experiences. Because opinions on the appropriate role of lasers in neurosurgery vary widely, the historical basis for some of these views is explored. Initial enthusiasm for a technology that appears to have innate advantages for safe resections has often given way to the strict limitations and demands of the neurosurgical operating theater. However, numerous creative solutions to improve laser delivery, power, safety, and ergonomics demonstrate the important role that technological advances in related scientific fields continue to offer neurosurgery. Benefiting from the most recent developments in materials science, current CO2 laser delivery systems provide a useful addition to the neurosurgical armamentarium when applied in the correct circumstances and reflect the important historical advances that come about from the interplay between neurosurgery and technology.

  9. Advances in Machine Learning and Data Mining for Astronomy

    NASA Astrophysics Data System (ADS)

    Way, Michael J.; Scargle, Jeffrey D.; Ali, Kamal M.; Srivastava, Ashok N.

    2012-03-01

    Advances in Machine Learning and Data Mining for Astronomy documents numerous successful collaborations among computer scientists, statisticians, and astronomers who illustrate the application of state-of-the-art machine learning and data mining techniques in astronomy. Due to the massive amount and complexity of data in most scientific disciplines, the material discussed in this text transcends traditional boundaries between various areas in the sciences and computer science. The book's introductory part provides context to issues in the astronomical sciences that are also important to health, social, and physical sciences, particularly probabilistic and statistical aspects of classification and cluster analysis. The next part describes a number of astrophysics case studies that leverage a range of machine learning and data mining technologies. In the last part, developers of algorithms and practitioners of machine learning and data mining show how these tools and techniques are used in astronomical applications. With contributions from leading astronomers and computer scientists, this book is a practical guide to many of the most important developments in machine learning, data mining, and statistics. It explores how these advances can solve current and future problems in astronomy and looks at how they could lead to the creation of entirely new algorithms within the data mining community.

  10. Numerical simulation on chain-die forming of an AHSS top-hat section

    NASA Astrophysics Data System (ADS)

    Majji, Raju; Xiang, Yang; Ding, Scott; Yang, Chunhui

    2018-05-01

    The applications of Advanced High-Strength Steels (AHSS) in the automotive industry are rapidly increasing due to demand for a lightweight material that significantly reduces fuel consumption without compromising passenger safety. Consumers expect automotive manufacturers and material suppliers to deliver reliable and affordable products, which stimulates research into solutions that meet these requirements. The primary advantage of AHSS is its extremely high strength-to-weight ratio, making it an ideal material for the automotive industry. However, its low ductility is a major disadvantage, in particular when traditional cold forming processes such as roll forming and deep drawing are used to form profiles; as a result, AHSS parts frequently fail during forming. To improve the quality and reliability of manufactured AHSS products, a recently developed incremental cold sheet-metal forming technology called Chain-die Forming (CDF) is recognised as a potential solution. The typical CDF process combines bending and roll forming, is equivalent to a roll with a very large deforming radius, and incrementally forms the desired shape with split dies and segments. This study focuses on manufacturing an AHSS top-hat section in a minimum number of passes, without geometrical or surface defects, using finite element modelling and simulation. The developed numerical simulation is employed to investigate the influence of the main control parameters of the CDF process when forming AHSS products and to develop new die-punch sets with compensation designed via a numerical optimisation process. In addition, the study addresses tool design that compensates for spring-back and reduces friction between the tooling and the sheet metal; this reduces the number of passes, thereby improving productivity and reducing energy consumption and material waste. This numerical study reveals that CDF forms AHSS products with complex profiles with much less residual stress, lower spring-back, lower strain, and higher geometrical accuracy than other traditional manufacturing processes.

  11. Itzï (version 17.1): an open-source, distributed GIS model for dynamic flood simulation

    NASA Astrophysics Data System (ADS)

    Courty, Laurent Guillaume; Pedrozo-Acuña, Adrián; Bates, Paul David

    2017-05-01

    Worldwide, floods are acknowledged as one of the most destructive hazards. In human-dominated environments, their negative impacts are ascribed not only to the increase in frequency and intensity of floods but also to a strong feedback between the hydrological cycle and anthropogenic development. In order to advance a more comprehensive understanding of this complex interaction, this paper presents the development of a new open-source tool named Itzï that enables the 2-D numerical modelling of rainfall-runoff processes and surface flows integrated with the open-source geographic information system (GIS) software known as GRASS. It therefore takes advantage of the ability of GIS environments to handle datasets with variations in both temporal and spatial resolutions. Furthermore, the presented numerical tool can handle datasets from different sources with varied spatial resolutions, facilitating the preparation and management of input and forcing data. This ability reduces the preprocessing time usually required by other models. Itzï uses a simplified form of the shallow water equations, the damped partial inertia equation, for the resolution of surface flows, and the Green-Ampt model for infiltration. The source code is now publicly available online, along with complete documentation. The numerical model is verified against three different test cases: firstly, a comparison with an analytic solution of the shallow water equations is introduced; secondly, a hypothetical flooding event in an urban area is implemented, where results are compared to those from an established model using a similar approach; and lastly, the reproduction of a real inundation event that occurred in the city of Kingston upon Hull, UK, in June 2007, is presented. The numerical approach proved its ability to reproduce the analytic and synthetic test cases. Moreover, simulation results of the real flood event showed its suitability for identifying areas affected by flooding, which were verified against those recorded after the event by local authorities.
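
    The Green-Ampt infiltration model named in the abstract can be sketched in a few lines. The parameter values and the simple explicit time-stepping scheme below are illustrative assumptions for this record only, not Itzï's actual implementation (which couples infiltration to the 2-D surface-flow solver):

```python
# Sketch of Green-Ampt infiltration: the potential infiltration rate decays
# as the cumulative infiltrated depth F grows. Parameter values are
# illustrative, not taken from Itzi.

def green_ampt_rate(K, psi, d_theta, F):
    """Potential infiltration rate f = K * (1 + psi * d_theta / F)."""
    return K * (1.0 + psi * d_theta / F)

def simulate(K=1e-6, psi=0.11, d_theta=0.3, dt=60.0, steps=120):
    """Step cumulative infiltration F forward with explicit Euler.
    A production model would prefer an implicit update for stability."""
    F = 1e-3  # small seed depth [m] to avoid division by zero at t = 0
    depths = []
    for _ in range(steps):
        f = green_ampt_rate(K, psi, d_theta, F)
        F += f * dt
        depths.append(F)
    return depths

depths = simulate()
# As the wetting front deepens, the rate f falls toward K, so successive
# increments of F shrink monotonically.
```

    Because f is strictly positive and strictly decreasing in F, the cumulative depth always rises while each step's increment is smaller than the last, which is the qualitative behaviour the model is designed to capture.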

  12. Using Virtualization to Integrate Weather, Climate, and Coastal Science Education

    NASA Astrophysics Data System (ADS)

    Davis, J. R.; Paramygin, V. A.; Figueiredo, R.; Sheng, Y.

    2012-12-01

    To better understand and communicate the important roles of weather and climate in the coastal environment, a unique publicly available tool is being developed to support research, education, and outreach activities. This tool uses virtualization technologies to facilitate an interactive, hands-on environment in which students, researchers, and the general public can perform their own numerical modeling experiments. While prior efforts focused solely on the study of the coastal and estuary environments, this effort incorporates the community-supported weather and climate model WRF-ARW into the Coastal Science Educational Virtual Appliance (CSEVA), an education tool used to assist in the learning of coastal transport processes; storm surge and inundation; and evacuation modeling. The Weather Research and Forecasting (WRF) Model is a next-generation, community-developed and -supported mesoscale numerical weather prediction system designed to be used internationally for research, operations, and teaching. It includes two dynamical solvers (ARW - Advanced Research WRF and NMM - Nonhydrostatic Mesoscale Model) as well as a data assimilation system. WRF-ARW combines the ARW dynamics solver, developed primarily at the National Center for Atmospheric Research (NCAR), with other components of the WRF system; community support is provided by NCAR's Mesoscale and Microscale Meteorology (MMM) division. Included with WRF is the WRF Pre-processing System (WPS), a set of programs that prepares input for real-data simulations. The CSEVA is based on the Grid Appliance (GA) framework and is built using virtual machine (VM) and virtual networking technologies. Virtualization supports integration of an operating system, libraries (e.g., the Fortran, C, Perl, and NetCDF libraries necessary to build WRF), a web server, numerical models/grids/inputs, pre-/post-processing tools (e.g., WPS / RIP4 or UPS), graphical user interfaces, "Cloud"-computing infrastructure, and other tools into a single ready-to-use package. Thus, the previously onerous task of setting up and compiling these tools becomes obsolete, and the researcher, educator, or student can focus on using the tools to study the interactions between weather, climate, and the coastal environment. The incorporation of WRF into the CSEVA has been designed to be synergistic with the extensive online tutorials and biannual tutorials hosted by NCAR. Included are working examples of the idealized test simulations provided with WRF (2D sea breeze and squalls, a large eddy simulation, a Held and Suarez simulation, etc.). To demonstrate the integration of weather, climate, and coastal science education, example applications are being developed that show how the system can be used to couple a coastal and estuarine circulation, transport, and storm surge model with downscaled reanalysis weather and future climate predictions. Documentation, tutorials, and the enhanced CSEVA itself can be found on the web at: http://cseva.coastal.ufl.edu.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt R.

    BASTet is an advanced software library written in Python that serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data, ii) storage of derived analysis data, iii) provenance of analyses, and iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces that enable developers to directly integrate their analyses with OpenMSI's web-based viewing infrastructure without having to know OpenMSI. BASTet also provides numerous helper classes and tools to assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and describe methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.

  14. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    NASA Technical Reports Server (NTRS)

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  15. The UEA sRNA Workbench (version 4.4): a comprehensive suite of tools for analyzing miRNAs and sRNAs.

    PubMed

    Stocks, Matthew B; Mohorianu, Irina; Beckers, Matthew; Paicu, Claudia; Moxon, Simon; Thody, Joshua; Dalmay, Tamas; Moulton, Vincent

    2018-05-02

    RNA interference, a highly conserved regulatory mechanism, is mediated via small RNAs. Recent technical advances have enabled the analysis of larger, more complex datasets and the investigation of microRNAs and the less well known small interfering RNAs. However, the size and intricacy of current data require a comprehensive set of tools able to discriminate patterns from low-level, noise-like variation; because numerous and varied suggestions from the community represent an invaluable source of ideas for future tools, the ability of the community to contribute to this software is essential. We present a new version of the UEA sRNA Workbench, reconfigured to allow easy insertion of new tools and workflows. In its released form, it comprises a suite of tools in a user-friendly environment, with enhanced capabilities for comprehensive processing of sRNA-seq data, e.g., tools for accurate prediction of sRNA loci (CoLIde) and miRNA loci (miRCat2), as well as workflows that guide users through the common first steps in sRNA-seq analyses, such as quality checking of the input data, normalization of abundances, and detection of differential expression. The UEA sRNA Workbench is available at: http://srna-workbench.cmp.uea.ac.uk The source code is available at: https://github.com/sRNAworkbenchuea/UEA_sRNA_Workbench. Contact: v.moulton@uea.ac.uk.

  16. The Canadian Astronomy Data Centre

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.; Schade, D.; Astronomy Data Centre, Canadian

    2011-01-01

    The Canadian Astronomy Data Centre (CADC) is the world's largest astronomical data center, holding over 0.5 petabytes of information and serving nearly 3000 astronomers worldwide. Its current data collections include BLAST, CFHT, CGPS, FUSE, Gemini, HST, JCMT, MACHO, MOST, and numerous other archives and services. It provides extensive data archiving, curation, and processing expertise via projects such as MegaPipe, and enables substantial day-to-day collaboration between resident astronomers and computer specialists. It is a stable, powerful, persistent, and properly supported environment for the storage and processing of large volumes of data, a condition that is now absolutely vital for their science potential to be exploited by the community. Through initiatives such as the Common Archive Observation Model (CAOM), the Canadian Virtual Observatory (CVO), and the Canadian Advanced Network for Astronomical Research (CANFAR), the CADC is at the global forefront of advancing astronomical research through improved data services. The CAOM aims to provide homogeneous data access, and hence viable interoperability between a potentially unlimited number of different data collections, at many wavelengths. It is active in the definition of numerous emerging standards within the International Virtual Observatory, and several datasets are already available. The CANFAR project is an initiative to make cloud computing for storage and data-intensive processing available to the community. It does this via a virtual machine environment that is equivalent to managing a local desktop. Several groups are already processing science data. CADC is also at the forefront of advanced astronomical data analysis, driven by the science requirements of astronomers both locally and further afield. The emergence of 'Astroinformatics' promises not only to provide utility items like object classifications, but also to directly enable new science by accessing previously undiscovered or intractable information. We are currently in the early stages of implementing Astroinformatics tools, such as machine learning, on CANFAR.

  17. FY15 Report on Thermomechanical Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Francis D.; Buchholz, Stuart

    2015-08-01

    Sandia is participating in the third phase of a United States (US)-German Joint Project that compares constitutive models and simulation procedures on the basis of model calculations of the thermomechanical behavior and healing of rock salt (Salzer et al. 2015). The first goal of the project is to evaluate the ability of numerical modeling tools to correctly describe the relevant deformation phenomena in rock salt under various influences. Among the numerical modeling tools required to address this are constitutive models that are used in computer simulations for the description of the thermal, mechanical, and hydraulic behavior of the host rock under various influences and for the long-term prediction of this behavior. Achieving this goal will lead to increased confidence in the results of numerical simulations related to the secure disposal of radioactive wastes in rock salt. Results of the Joint Project may ultimately be used to make various assertions regarding stability analysis of an underground repository in salt during the operating phase as well as the long-term integrity of the geological barrier in the post-operating phase. A primary evaluation of constitutive model capabilities comes by way of predicting large-scale field tests. The Joint Project partners decided to model Waste Isolation Pilot Plant (WIPP) Rooms B and D, which are full-scale rooms with the same dimensions. Room D deformed under natural, ambient conditions while Room B was thermally driven by an array of waste-simulating heaters (Munson et al. 1988; 1990). Existing laboratory test data for WIPP salt were carefully scrutinized, and the partners decided that additional testing would be needed to help evaluate advanced features of the constitutive models. The German partners performed over 140 laboratory tests on WIPP salt at no charge to the US Department of Energy (DOE).

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  19. Critical brain regions for tool-related and imitative actions: a componential analysis

    PubMed Central

    Shapiro, Allison D.; Coslett, H. Branch

    2014-01-01

    Numerous functional neuroimaging studies suggest that widespread bilateral parietal, temporal, and frontal regions are involved in tool-related and pantomimed gesture performance, but the role of these regions in specific aspects of gestural tasks remains unclear. In the largest prospective study of apraxia-related lesions to date, we performed voxel-based lesion–symptom mapping with data from 71 left hemisphere stroke participants to assess the critical neural substrates of three types of actions: gestures produced in response to viewed tools, imitation of tool-specific gestures demonstrated by the examiner, and imitation of meaningless gestures. Thus, two of the three gesture types were tool-related, and two of the three were imitative, enabling pairwise comparisons designed to highlight commonalities and differences. Gestures were scored separately for postural (hand/arm positioning) and kinematic (amplitude/timing) accuracy. Lesioned voxels in the left posterior temporal gyrus were significantly associated with lower scores on the posture component for both of the tool-related gesture tasks. Poor performance on the kinematic component of all three gesture tasks was significantly associated with lesions in left inferior parietal and frontal regions. These data enable us to propose a componential neuroanatomic model of action that delineates the specific components required for different gestural action tasks. Thus, visual posture information and kinematic capacities are differentially critical to the three types of actions studied here: the kinematic aspect is particularly critical for imitation of meaningless movement, the capacity for tool-action posture representations is particularly necessary for pantomimed gestures to the sight of tools, and both capacities inform imitation of tool-related movements. These distinctions enable us to advance traditional accounts of apraxia. PMID:24776969
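
    The core of voxel-based lesion-symptom mapping can be illustrated with a toy sketch: at each voxel, behavioral scores of participants whose lesion covers that voxel are compared against scores of those whose lesion spares it. The use of Welch's t-test, the group-size cutoff, and the data below are illustrative assumptions, not the study's actual statistical pipeline:

```python
# Toy voxel-based lesion-symptom mapping (VLSM): per-voxel two-sample test
# of scores for lesioned vs. spared participants. Welch's t and the data
# are illustrative choices only.

import math
import statistics as st

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    num = st.mean(a) - st.mean(b)
    den = math.sqrt(st.variance(a) / len(a) + st.variance(b) / len(b))
    return num / den

def vlsm(lesion_masks, scores, n_voxels, min_group=2):
    """Per-voxel t-map; None where either group is too small to test."""
    tmap = []
    for v in range(n_voxels):
        lesioned = [s for m, s in zip(lesion_masks, scores) if v in m]
        spared = [s for m, s in zip(lesion_masks, scores) if v not in m]
        if len(lesioned) < min_group or len(spared) < min_group:
            tmap.append(None)
        else:
            tmap.append(welch_t(lesioned, spared))
    return tmap

# Six toy participants: lesions as sets of voxel ids, plus gesture scores.
masks = [{0}, {0, 1}, {0}, {1}, {2}, {1, 2}]
scores = [40.0, 45.0, 42.0, 80.0, 85.0, 78.0]
tmap = vlsm(masks, scores, n_voxels=3)
# Voxel 0 is lesioned only in the low scorers, so its t is strongly negative.
```

    A strongly negative t at a voxel means damage there is associated with worse scores; real studies additionally correct for multiple comparisons across tens of thousands of voxels.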

  20. Critical brain regions for tool-related and imitative actions: a componential analysis.

    PubMed

    Buxbaum, Laurel J; Shapiro, Allison D; Coslett, H Branch

    2014-07-01

    Numerous functional neuroimaging studies suggest that widespread bilateral parietal, temporal, and frontal regions are involved in tool-related and pantomimed gesture performance, but the role of these regions in specific aspects of gestural tasks remains unclear. In the largest prospective study of apraxia-related lesions to date, we performed voxel-based lesion-symptom mapping with data from 71 left hemisphere stroke participants to assess the critical neural substrates of three types of actions: gestures produced in response to viewed tools, imitation of tool-specific gestures demonstrated by the examiner, and imitation of meaningless gestures. Thus, two of the three gesture types were tool-related, and two of the three were imitative, enabling pairwise comparisons designed to highlight commonalities and differences. Gestures were scored separately for postural (hand/arm positioning) and kinematic (amplitude/timing) accuracy. Lesioned voxels in the left posterior temporal gyrus were significantly associated with lower scores on the posture component for both of the tool-related gesture tasks. Poor performance on the kinematic component of all three gesture tasks was significantly associated with lesions in left inferior parietal and frontal regions. These data enable us to propose a componential neuroanatomic model of action that delineates the specific components required for different gestural action tasks. Thus, visual posture information and kinematic capacities are differentially critical to the three types of actions studied here: the kinematic aspect is particularly critical for imitation of meaningless movement, the capacity for tool-action posture representations is particularly necessary for pantomimed gestures to the sight of tools, and both capacities inform imitation of tool-related movements. These distinctions enable us to advance traditional accounts of apraxia. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    NASA Astrophysics Data System (ADS)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods that can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special-purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antennae Galaxies. The testbed framework is available as open source to assist other researchers and educators. Recommendations are made for testbed enhancements.
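
    The two components the record highlights, numerical integration of the equations of motion and the particle-particle force calculation (the O(N^2) kernel that hardware like MD-GRAPE2 accelerates), can be sketched as a minimal kick-drift-kick leapfrog. This is an illustrative example in natural units (G = 1), not NBodyLab or NEMO code:

```python
# Direct-sum softened gravity plus a kick-drift-kick leapfrog step.
# The O(N^2) loop in accelerations() is the part special-purpose
# hardware accelerates.

import math

def accelerations(pos, mass, eps=0.01):
    """Softened direct-sum gravitational accelerations (G = 1)."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + eps * eps
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += mass[j] * dx[k] * inv_r3
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick step: second order and symplectic."""
    acc = accelerations(pos, mass)
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += 0.5 * dt * acc[i][k]  # half kick
            pos[i][k] += dt * vel[i][k]        # drift
    acc = accelerations(pos, mass)
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += 0.5 * dt * acc[i][k]  # half kick

def total_energy(pos, vel, mass, eps=0.01):
    """Kinetic plus softened potential energy."""
    n = len(pos)
    ke = 0.5 * sum(mass[i] * sum(v * v for v in vel[i]) for i in range(n))
    pe = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r2 = sum((pos[j][k] - pos[i][k]) ** 2 for k in range(3))
            pe -= mass[i] * mass[j] / math.sqrt(r2 + eps * eps)
    return ke + pe

# Equal-mass binary on a near-circular orbit.
pos = [[-0.5, 0.0, 0.0], [0.5, 0.0, 0.0]]
vel = [[0.0, -0.5, 0.0], [0.0, 0.5, 0.0]]
mass = [0.5, 0.5]
e0 = total_energy(pos, vel, mass)
for _ in range(1000):
    leapfrog_step(pos, vel, mass, dt=0.01)
e1 = total_energy(pos, vel, mass)
```

    The symplectic leapfrog keeps the total energy of the binary bounded over many orbits, which is why schemes of this family are the default in stellar-dynamics toolkits; a plain Euler step would show steady energy drift on the same problem.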

  2. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogenous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to data from major sources such as USGS, NOAA, the World Bank, and the World Health Organization has tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes at local and global levels. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses the challenges of properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16,000+ attributes covering 200+ countries over 50 years from over 30 major sources, and 2) a novel online ST exploration and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.

  3. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    PubMed

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  4. Fundamentals of Clinical Outcomes Assessment for Spinal Disorders: Clinical Outcome Instruments and Applications

    PubMed Central

    Vavken, Patrick; Ganal-Antonio, Anne Kathleen B.; Quidde, Julia; Shen, Francis H.; Chapman, Jens R.; Samartzis, Dino

    2015-01-01

    Study Design: A broad narrative review. Objectives: Outcome assessment in spinal disorders is imperative to help monitor the safety and efficacy of treatment in an effort to change clinical practice and improve patient outcomes. The following article, part two of a two-part series, discusses the various outcome tools and instruments utilized to address spinal disorders and their management. Methods: A thorough review of the peer-reviewed literature was performed, irrespective of language, addressing outcome research, instruments and tools, and applications. Results: Numerous articles addressing the development and implementation of health-related quality-of-life, neck and low back pain, overall pain, spinal deformity, and other condition-specific outcome instruments have been reported. Their applications in the context of clinical trial studies, economic analyses, and overall evidence-based orthopedics have been noted. Additional issues regarding the problems and potential sources of bias in utilizing outcome scales and the concept of minimal clinically important difference were discussed. Conclusion: Continuing research needs to assess the outcome instruments and tools used in clinical outcome assessment for spinal disorders. Understanding the fundamental principles in spinal outcome assessment may also advance the field of “personalized spine care.” PMID:26225283

  5. The Connectome Viewer Toolkit: An Open Source Framework to Manage, Analyze, and Visualize Connectomes

    PubMed Central

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit – a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/ PMID:21713110
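
    To make the container idea concrete, here is a minimal, purely illustrative Python sketch of parsing an XML metadata file of the kind an XML-based connectome container might hold. The tag names and fields below are invented for illustration and are not the actual Connectome File Format schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML metadata of the kind a container format might store.
# Tag names ("connectome", "metadata", "network") are invented examples.
doc = """<connectome version="1.0">
  <metadata><creator>example-lab</creator><species>Homo sapiens</species></metadata>
  <network name="structural" src="network_1.graphml"/>
  <network name="functional" src="network_2.graphml"/>
</connectome>"""

root = ET.fromstring(doc)
creator = root.findtext("metadata/creator")                     # structured annotation
networks = [n.attrib["name"] for n in root.findall("network")]  # listed data files
print(creator, networks)
```

    A real connectome file would bundle such metadata together with the network and imaging data it describes; the point is only that standard XML tooling suffices to read the structured annotations.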

  6. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical packages designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are under way to implement components of the EDIFIS in hardware in order to address real-time operational requirements for health monitoring and management. This paper addresses OPAD and its tool suite, and discusses what is considered a natural progression: a concept for migrating OPAD toward detection of high-energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitoring the internal and external environment of space vehicles.

  7. Animal models: an important tool in mycology.

    PubMed

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables when running a model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing the type of human disease to mimic, the parameters to follow, and the collection of data appropriate to the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and contribute to our deeper understanding of how these infections occur, progress, and can be controlled and eliminated.

  8. Visualization and Interaction in Research, Teaching, and Scientific Communication

    NASA Astrophysics Data System (ADS)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices give those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches, and illustrate how a little programming ability can free scientists from the constraints of existing tools and facilitate a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available on desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but they advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.

  9. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  10. Rapid processing of data based on high-performance algorithms for solving inverse problems and 3D-simulation of the tsunami and earthquakes

    NASA Astrophysics Data System (ADS)

    Marinin, I. V.; Kabanikhin, S. I.; Krivorotko, O. I.; Karas, A.; Khidasheli, D. G.

    2012-04-01

    We consider new techniques and methods for earthquake- and tsunami-related problems, in particular inverse problems for the determination of tsunami source parameters, numerical simulation of long-wave propagation in soil and water, and tsunami risk estimation. In addition, we touch upon database management and destruction-scenario visualization. New approaches and strategies, as well as mathematical tools and software, are presented. Long joint investigations by researchers of the Institute of Computational Mathematics and Mathematical Geophysics SB RAS and specialists from WAPMERR and Informap have produced dedicated theoretical approaches, numerical methods, and software for tsunami and earthquake modeling (propagation and run-up of tsunami waves on coastal areas), visualization, and risk estimation. Algorithms have been developed for the operational determination of the origin and form of the tsunami source. The TSS system numerically simulates the tsunami and/or earthquake source and can solve both the direct and the inverse problem, which makes it possible to bring advanced mathematical results to bear on improving models and increasing the resolution of inverse problems. Via TSS one can construct risk maps, online disaster scenarios, and estimates of potential damage to buildings and roads. One of the main tools for the numerical modeling is the finite volume method (FVM), which allows us to achieve stability with respect to possible input errors as well as optimal computing speed. Our approach to the inverse problem of tsunami and earthquake determination is based on recent theoretical results concerning the Dirichlet problem for the wave equation. This problem is intrinsically ill-posed. We use an optimization approach to solve it and SVD analysis to estimate the degree of ill-posedness and to find the quasi-solution. The software system we developed is intended to implement a «no frost» technology that realizes a steady pipeline of direct and inverse problems: solving the direct problem, visualizing and comparing with observed data, and solving the inverse problem (correcting the model parameters). The main objective of further work is the creation of an operational workstation tool that an emergency duty officer could use in real time.
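
    The SVD-based quasi-solution idea mentioned above can be sketched in a few lines. The 2×2 system below is a toy stand-in for the (much larger) discretized source-determination operator, and the truncation threshold is an assumed value for illustration: small singular values, whose noise amplification dominates, are simply dropped.

```python
import numpy as np

# Toy ill-posed system A x = b; A is nearly rank-deficient, so naive
# inversion amplifies the noise in b.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b = np.array([2.0, 2.0001])

U, s, Vt = np.linalg.svd(A)
tol = 1e-3 * s[0]                          # truncation threshold (assumed)
s_inv = np.where(s > tol, 1.0 / s, 0.0)    # drop small singular values
x_quasi = Vt.T @ (s_inv * (U.T @ b))       # truncated-SVD quasi-solution
print(np.round(x_quasi, 3))
```

    The ratio of the largest to the smallest retained singular value gives the practical measure of ill-posedness that the SVD analysis provides.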

  11. Tool setting device

    DOEpatents

    Brown, Raymond J.

    1977-01-01

    The present invention relates to a tool setting device for use with numerically controlled machine tools, such as lathes and milling machines. A reference position of the machine tool relative to the workpiece along both the X and Y axes is utilized by the control circuit for driving the tool through its program. This reference position is determined for both axes by displacing a single linear variable displacement transducer (LVDT) with the machine tool through a T-shaped pivotal bar. The use of the T-shaped bar allows the cutting tool to be moved sequentially in the X or Y direction for indicating the actual position of the machine tool relative to the predetermined desired position in the numerical control circuit by using a single LVDT.

  12. Numerical tool for tsunami risk assessment in the southern coast of Dominican Republic

    NASA Astrophysics Data System (ADS)

    Macias Sanchez, J.; Llorente Isidro, M.; Ortega, S.; Gonzalez Vida, J. M., Sr.; Castro, M. J.

    2016-12-01

    The southern coast of the Dominican Republic is a very populated region with several important cities, including Santo Domingo, the capital. Important activities are rooted in the southern coast, including tourism, industry, commercial ports, and energy facilities, among others. According to historical reports, it has been impacted by large earthquakes accompanied by tsunamis, as in Azua in 1751 and recently Pedernales in 2010, but their sources are not clearly identified. The aim of the present work is to develop a numerical tool to simulate the impact on the southern coast of the Dominican Republic of tsunamis generated in the Caribbean Sea. This tool, based on the Tsunami-HySEA model from the EDANYA group (University of Malaga, Spain), could be used in the framework of a Tsunami Early Warning System due to the very short computing times when only propagation is computed, or it could be used to assess inundation impact, computing inundation at an initial 5-meter resolution. Numerical results corresponding to three theoretical sources are used to test the numerical tool.

  13. GIS-MODFLOW: A Small Open-Source Tool for Linking GIS Data to MODFLOW

    NASA Astrophysics Data System (ADS)

    Gossel, Wolfgang

    2013-06-01

    The numerical model MODFLOW (Harbaugh 2005) is an efficient and up-to-date tool for groundwater flow modelling. Geo-information systems (GIS), in turn, provide useful tools for data preparation and visualization that can also be incorporated in numerical groundwater modelling, so an interface between the two would be useful for many hydrogeological investigations. To date, several integrated stand-alone tools have been developed that rely on MODFLOW, MODPATH and transport modelling tools. Simultaneously, several open-source GIS codes were developed to improve functionality and ease of use. These GIS tools can be used as pre- and post-processors of the numerical model MODFLOW via a suitable interface. Here we present GIS-MODFLOW as an open-source tool that provides a new universal interface by using the ESRI ASCII GRID data format, which can be converted into MODFLOW input data. This tool can also process MODFLOW results. Such a combination of MODFLOW and open-source GIS opens new possibilities for making groundwater flow modelling and simulation results available to a wider circle of hydrogeologists.
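
    As a minimal sketch, assuming nothing beyond the standard six-line ESRI ASCII GRID header, here is how such a grid can be read into an array that a MODFLOW pre-processor could then write into an input package:

```python
import io
import numpy as np

# A tiny ESRI ASCII GRID: six standard header lines, then row-major values.
ascii_grid = """ncols 3
nrows 2
xllcorner 0.0
yllcorner 0.0
cellsize 100.0
NODATA_value -9999
1.0 2.0 3.0
4.0 -9999 6.0"""

lines = ascii_grid.splitlines()
header = {key.lower(): float(val)
          for key, val in (line.split() for line in lines[:6])}
data = np.loadtxt(io.StringIO("\n".join(lines[6:])))
data[data == header["nodata_value"]] = np.nan   # mask NODATA cells
print(data.shape)
```

    Because the format is plain text with a fixed header, the same few lines also work in reverse for exporting MODFLOW results back to GIS.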

  14. Biomechanics of Wheat/Barley Straw and Corn Stover

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher T. Wright; Peter A. Pryfogle; Nathan A. Stevens

    2005-03-01

    The lack of understanding of the mechanical characteristics of cellulosic feedstocks is a limiting factor in economically collecting and processing crop residues, primarily wheat and barley stems and corn stover. Several testing methods, including compression, tension, and bend tests, have been investigated to increase our understanding of the biomechanical behavior of cellulosic feedstocks. Biomechanical data from these tests can provide required input to numerical models and help advance harvesting, handling, and processing techniques. In addition, integrating the models with the complete data set from this study can identify potential tools for manipulating the biomechanical properties of plant varieties in such a manner as to optimize their physical characteristics to produce higher-value biomass and more energy-efficient harvesting practices.

  15. High-charge and multiple-star vortex coronagraphy from stacked vector vortex phase masks.

    PubMed

    Aleksanyan, Artur; Brasselet, Etienne

    2018-02-01

    Optical vortex phase masks are now installed at many ground-based large telescopes for high-contrast astronomical imaging. To date, such instrumental advances have been restricted to the use of helical phase masks of the lowest even order, while future giant telescopes will require high-order masks. Here we propose a single-stage on-axis scheme to create high-order vortex coronagraphs based on second-order vortex phase masks. By extending our approach to an off-axis design, we also explore the implementation of multiple-star vortex coronagraphy. An experimental laboratory demonstration is reported and supported by numerical simulations. These results offer a practical roadmap to the development of future coronagraphic tools with enhanced performances.

  16. Current achievements and future directions in genetic engineering of European plum (Prunus domestica L.).

    PubMed

    Petri, Cesar; Alburquerque, Nuria; Faize, Mohamed; Scorza, Ralph; Dardick, Chris

    2018-06-01

    In most woody fruit species, transformation and regeneration are difficult. However, European plum (Prunus domestica) has been shown to be amenable to genetic improvement technologies ranging from classical hybridization, to genetic engineering, to rapid cycle crop breeding ('FasTrack' breeding). Since the first report on European plum transformation with marker genes in the early 1990s, numerous manuscripts have been published reporting the generation of new clones with agronomically interesting traits, such as resistance to pests, diseases, and/or abiotic stress, a shorter juvenile period, dwarfing, continuous flowering, etc. This review focuses on the main advances in genetic transformation of European plum achieved to date, and the lines of work that are turning genetic engineering into a contemporary breeding tool for this species.

  17. Diagnosis of the Computer-Controlled Milling Machine, Definition of the Working Errors and Input Corrections on the Basis of Mathematical Model

    NASA Astrophysics Data System (ADS)

    Starikov, A. I.; Nekrasov, R. Yu; Teploukhov, O. J.; Soloviev, I. V.; Narikov, K. A.

    2016-10-01

    Machinery and equipment improve constructively as science and technology advance, and the requirements for quality and longevity grow with them. In particular, the requirements for the surface quality and manufacturing precision of oil and gas equipment parts are constantly increasing. Production of oil and gas engineering products on modern machine tools with computer numerical control is a complex synthesis of the mechanical and electrical parts of the equipment as well as the processing procedure. The mechanical part of the machine wears during operation, and mathematical errors accumulate in the electrical part. These disadvantages in any part of the metalworking equipment affect the manufacturing process as a whole and, as a result, lead to flaws.

  18. AN EIGHT WEEK SEMINAR IN AN INTRODUCTION TO NUMERICAL CONTROL ON TWO- AND THREE-AXIS MACHINE TOOLS FOR VOCATIONAL AND TECHNICAL MACHINE TOOL INSTRUCTORS. FINAL REPORT.

    ERIC Educational Resources Information Center

    BOLDT, MILTON; POKORNY, HARRY

    THIRTY-THREE MACHINE SHOP INSTRUCTORS FROM 17 STATES PARTICIPATED IN AN 8-WEEK SEMINAR TO DEVELOP THE SKILLS AND KNOWLEDGE ESSENTIAL FOR TEACHING THE OPERATION OF NUMERICALLY CONTROLLED MACHINE TOOLS. THE SEMINAR WAS GIVEN FROM JUNE 20 TO AUGUST 12, 1966, WITH COLLEGE CREDIT AVAILABLE THROUGH STOUT STATE UNIVERSITY. THE PARTICIPANTS COMPLETED AN…

  19. Verifying the error bound of numerical computation implemented in computer systems

    DOEpatents

    Sawada, Jun

    2013-03-12

    A verification tool receives a finite precision definition for an approximation of an infinite precision numerical function implemented in a processor in the form of a polynomial of bounded functions. The verification tool receives a domain for verifying outputs of segments associated with the infinite precision numerical function. The verification tool splits the domain into at least two segments, wherein each segment is non-overlapping with any other segment and converts, for each segment, a polynomial of bounded functions for the segment to a simplified formula comprising a polynomial, an inequality, and a constant for a selected segment. The verification tool calculates upper bounds of the polynomial for the at least two segments, beginning with the selected segment and reports the segments that violate a bounding condition.
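
    As a rough illustration of the segment-and-bound scheme described in the abstract (not the patented formulas themselves), the toy Python sketch below splits a domain into non-overlapping segments, computes a crude upper bound of a polynomial on each, and reports any segment that violates a bounding condition. The polynomial and the bound value are invented examples.

```python
# Crude per-segment upper bound for p(x) = sum c_k * x**k via endpoint
# (and zero-crossing) evaluation of each monomial.
def poly_upper_bound(coeffs, lo, hi):
    bound = 0.0
    for k, c in enumerate(coeffs):
        points = [lo**k, hi**k] + ([0.0] if lo < 0.0 < hi else [])
        bound += max(c * p for p in points)   # monomial-wise worst case
    return bound

coeffs = [1.0, -0.5, 0.125]           # p(x) = 1 - x/2 + x^2/8
segments = [(0.0, 0.5), (0.5, 1.0)]   # non-overlapping cover of [0, 1]
bounds = [poly_upper_bound(coeffs, a, b) for a, b in segments]
violations = [seg for seg, ub in zip(segments, bounds) if ub > 1.2]
print(bounds, violations)
```

    Bounding each monomial separately over-estimates the true maximum, which is safe for verification: a reported violation may be spurious, but a verified bound is sound.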

  20. Numerical Analysis Objects

    NASA Astrophysics Data System (ADS)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
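
    The interface-driven design the abstract describes can be suggested with a small Python sketch (the original library is C++; the class names echo the abstract's examples, but every method here is invented for illustration): tools written against the abstract interfaces compose without sharing data structures.

```python
from abc import ABC, abstractmethod

class Function(ABC):
    """Abstract interface: anything that maps a float to a float."""
    @abstractmethod
    def __call__(self, x: float) -> float: ...

class Operator(ABC):
    """Abstract interface: anything that maps a Function to a Function."""
    @abstractmethod
    def apply(self, f: Function) -> Function: ...

class Square(Function):
    def __call__(self, x: float) -> float:
        return x * x

class CentralDifference(Operator):
    """A differentiation tool that depends only on the interfaces above."""
    def __init__(self, h: float = 1e-5):
        self.h = h
    def apply(self, f: Function) -> Function:
        h = self.h
        class Derivative(Function):
            def __call__(self, x: float) -> float:
                return (f(x + h) - f(x - h)) / (2.0 * h)
        return Derivative()

d = CentralDifference().apply(Square())
print(round(d(3.0), 6))  # derivative of x**2 at x = 3 is 6
```

    Any other Operator (a quadrature rule, a discretizer) could consume the same Function objects unchanged, which is the reuse the project aims at.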

  1. Basic and Advanced Numerical Performances Relate to Mathematical Expertise but Are Fully Mediated by Visuospatial Skills

    ERIC Educational Resources Information Center

    Sella, Francesco; Sader, Elie; Lolliot, Simon; Cohen Kadosh, Roi

    2016-01-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study mathematicians and nonmathematicians performed a basic…

  2. Optical-thermal light-tissue interactions during photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Gould, Taylor; Wang, Quanzeng; Pfefer, T. Joshua

    2014-03-01

    Photoacoustic imaging (PAI) has grown rapidly as a biomedical imaging technique in recent years, with key applications in cancer diagnosis and oximetry. In spite of these advances, the literature provides little insight into the thermal tissue interactions involved in PAI. To elucidate these basic phenomena, we have developed, validated, and implemented a three-dimensional numerical model of tissue photothermal (PT) response to repetitive laser pulses. The model calculates energy deposition, fluence distributions, and transient temperature and damage profiles in breast tissue with blood vessels and generalized perfusion. A parametric evaluation of these outputs vs. vessel diameter and depth, optical beam diameter, wavelength, and irradiance was performed. For a constant radiant exposure level, increasing beam diameter led to a significant increase in subsurface heat generation rate. Increasing vessel diameter resulted in two competing effects: reduced mean energy deposition in the vessel due to light attenuation, and greater thermal superpositioning due to reduced thermal relaxation. Maximum temperatures occurred either at the surface or in subsurface regions of the dermis, depending on vessel geometry and position. Results are discussed in terms of established exposure limits and levels used in prior studies. While additional experimental and numerical study is needed, numerical modeling represents a powerful tool for elucidating the effect of PA imaging devices on biological tissue.
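
    One quantity such a model computes is the adiabatic per-pulse temperature rise from absorbed laser energy, dT = mu_a * phi / (rho * c). The sketch below uses illustrative literature-ballpark values for a blood vessel, not the paper's inputs:

```python
# Illustrative ballpark values (assumptions, not the paper's inputs):
mu_a = 2.0e4        # optical absorption coefficient [1/m] (~200 1/cm)
phi = 200.0         # radiant exposure per pulse [J/m^2] (= 20 mJ/cm^2)
rho = 1000.0        # tissue density [kg/m^3]
c = 4000.0          # specific heat capacity [J/(kg*K)]

dT = mu_a * phi / (rho * c)   # adiabatic per-pulse temperature rise [K]
print(dT)  # 1.0 K for these inputs
```

    The full model then superposes such per-pulse rises across the pulse train, which is where the thermal-relaxation effects described above enter.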

  3. IT Security Support for the Spaceport Command Control Systems Development Ground Support Development Operations

    NASA Technical Reports Server (NTRS)

    Branch, Drew A.

    2014-01-01

    Security is one of the most important areas today, if not the most important. After the several attacks on the United States, security everywhere has heightened, from airports to communication among the military branches. With advanced persistent threats (APTs) on the rise following Stuxnet, government branches and agencies are required, more than ever, to follow several standards, policies and procedures to reduce the likelihood of a breach. Attack vectors today are very advanced and will continue to grow more advanced as security controls advance. This creates a need for networks and systems to be kept in an updated and secured state in a launch control system environment. FISMA is a law that government agencies are mandated to follow when securing networks and devices. My role on this project is to ensure network devices and systems are in compliance with NIST, as outlined in FISMA. I will achieve this by providing assistance with security plan documentation and collection, system hardware and software inventory, malicious code and malware scanning, and configuration of network devices, i.e., routers and IDSs/IPSs. In addition, I will be completing security assessments on software and hardware, vulnerability assessments and reporting, and conducting patch management and risk assessments. A guideline that will help with compliance with NIST is the SANS Top 20 Critical Controls. The SANS Top 20 Critical Controls, as well as numerous security tools, security software, and research, will be used to successfully complete the tasks given to me. This will ensure compliance with FISMA and NIST, secure systems, and a secured network. By the end of this project, I hope to have carried out the tasks stated above as well as gained an immense knowledge of compliance, security tools, networks and network devices, and policies and procedures.

  4. IT Security Support for the Spaceport Command Control Systems Development Ground Support Development Operations

    NASA Technical Reports Server (NTRS)

    Branch, Drew

    2013-01-01

    Security is one of the most important areas today, if not the most important. After the several attacks on the United States, security everywhere was heightened, from airports to communication among the military branches. With advanced persistent threats (APTs) on the rise following Stuxnet, government branches and agencies are required, more than ever, to follow several standards, policies and procedures to reduce the likelihood of a breach. Attack vectors today are very advanced and will continue to grow more advanced as security controls advance. This creates a need for networks and systems to be kept in an updated and secured state in a launch control system environment. FISMA is a law that government agencies are mandated to follow when securing networks and devices. My role on this project is to ensure network devices and systems are in compliance with NIST, as outlined in FISMA. I will achieve this by providing assistance with security plan documentation and collection, system hardware and software inventory, malicious code and malware scanning, and configuration of network devices, i.e., routers and IDSs/IPSs. In addition, I will be completing security assessments on software and hardware, vulnerability assessments and reporting, and conducting patch management and risk assessments. A guideline that will help with compliance with NIST is the SANS Top 20 Critical Controls. The SANS Top 20 Critical Controls, as well as numerous security tools, security software, and research, will be used to successfully complete the tasks given to me. This will ensure compliance with FISMA and NIST, secure systems, and a secured network. By the end of this project, I hope to have carried out the tasks stated above as well as gained an immense knowledge of compliance, security tools, networks and network devices, and policies and procedures.

  5. Application of a coupled smoothed particle hydrodynamics (SPH) and coarse-grained (CG) numerical modelling approach to study three-dimensional (3-D) deformations of single cells of different food-plant materials during drying.

    PubMed

    Rathnayaka, C M; Karunasena, H C P; Senadeera, W; Gu, Y T

    2018-03-14

    Numerical modelling has gained popularity in many science and engineering streams due to its economic feasibility and advanced analytical features compared to conventional experimental and theoretical models. Food drying is one of the areas where numerical modelling is increasingly applied to improve drying process performance and product quality. This investigation applies a three-dimensional (3-D) Smoothed Particle Hydrodynamics (SPH) and Coarse-Grained (CG) numerical approach to predict the morphological changes of different categories of food-plant cells, such as apple, grape, potato and carrot, during drying. To validate the model predictions, experimental findings from in-house experimental procedures (for apple) and sources in the literature (for grape, potato and carrot) have been utilised. The subsequent comparison indicates that the model predictions are in reasonable agreement with the experimental findings, both qualitatively and quantitatively. In this numerical model, a higher computational accuracy has been maintained by limiting the consistency error to below 1% for all four cell types. The proposed meshfree-based approach is well equipped to predict the morphological changes of plant cellular structure over a wide range of moisture contents (10% to 100% dry basis). Compared to the previous 2-D meshfree-based models developed for plant cell drying, the proposed model can yield more useful insights into morphological behaviour due to its 3-D nature. In addition, the proposed computational modelling approach has high potential to be used as a comprehensive tool in many other tissue-morphology-related investigations.
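
    As background for the SPH side of the approach, here is the standard 3-D cubic-spline smoothing kernel used in many SPH formulations; the abstract does not state the paper's exact kernel choice, so this is a generic illustration rather than the authors' implementation.

```python
import math

def cubic_spline_W(r: float, h: float) -> float:
    """Standard 3-D cubic-spline SPH kernel with compact support 2h."""
    sigma = 1.0 / (math.pi * h ** 3)   # 3-D normalization constant
    q = r / h
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0                          # outside the support radius

h = 0.1
# The kernel peaks at the particle and vanishes beyond 2h:
print(cubic_spline_W(0.0, h) > cubic_spline_W(h, h), cubic_spline_W(2 * h, h))
```

    In an SPH drying model, field quantities at a particle are weighted sums over neighbours within the support radius, so the kernel's compact support is what keeps the method local and computationally tractable.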

  6. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...

  7. Apes produce tools for future use.

    PubMed

    Bräuer, Juliane; Call, Josep

    2015-03-01

    There is now growing evidence that some animal species are able to plan for the future. For example, great apes save and exchange tools for future use. Here we raise the question of whether chimpanzees, orangutans, and bonobos would produce tools for future use. Subjects only had access to a baited apparatus for a limited duration and therefore should use the time preceding this access to create the appropriate tools in order to get the rewards. The apes were tested in three conditions depending on the need for pre-prepared tools: either eight tools, one tool, or no tools were needed to retrieve the reward. The apes prepared tools in advance for future use, and they produced them mainly in conditions in which they were really needed. The fact that apes were able to solve this new task indicates that their planning skills are flexible. However, in the condition in which eight tools were needed, apes produced fewer than two tools per trial in advance, although they used their chance to produce additional tools in the tool-use phase, thus often obtaining most of the reward from the apparatus. Increased pressure to prepare more tools in advance did not have an effect on their performance. © 2014 Wiley Periodicals, Inc.

  8. High satisfaction and low decisional conflict with advance care planning among chronically ill patients with advanced chronic obstructive pulmonary disease or heart failure using an online decision aid: A pilot study.

    PubMed

    Van Scoy, Lauren J; Green, Michael J; Dimmock, Anne Ef; Bascom, Rebecca; Boehmer, John P; Hensel, Jessica K; Hozella, Joshua B; Lehman, Erik B; Schubart, Jane R; Farace, Elana; Stewart, Renee R; Levi, Benjamin H

    2016-09-01

    Many patients with chronic illnesses report a desire for increased involvement in medical decision-making. This pilot study aimed to explore how patients with exacerbation-prone disease trajectories such as advanced heart failure or chronic obstructive pulmonary disease experience advance care planning using an online decision aid and to compare whether patients with different types of exacerbation-prone illnesses had varied experiences using the tool. Pre-intervention questionnaires measured advance care planning knowledge. Post-intervention questionnaires measured: (1) advance care planning knowledge; (2) satisfaction with tool; (3) decisional conflict; and (4) accuracy of the resultant advance directive. Comparisons were made between patients with heart failure and chronic obstructive pulmonary disease. Over 90% of the patients with heart failure (n = 24) or chronic obstructive pulmonary disease (n = 25) reported being "satisfied" or "highly satisfied" with the tool across all satisfaction domains; over 90% of participants rated the resultant advance directive as "very accurate." Participants reported low decisional conflict. Advance care planning knowledge scores rose by 18% (p < 0.001) post-intervention. There were no significant differences between participants with heart failure and chronic obstructive pulmonary disease. Patients with advanced heart failure and chronic obstructive pulmonary disease were highly satisfied after using an online advance care planning decision aid and had increased knowledge of advance care planning. This tool can be a useful resource for time-constrained clinicians whose patients wish to engage in advance care planning. © The Author(s) 2016.

  9. Improving governance action by an advanced water modelling system applied to the Po river basin in Italy

    NASA Astrophysics Data System (ADS)

    Alessandrini, Cinzia; Del Longo, Mauro; Pecora, Silvano; Puma, Francesco; Vezzani, Claudia

    2013-04-01

    In spite of the historical abundance of water from rainfall and the huge storage capacity provided by alpine lakes, the Po river basin, the most important Italian water district, experienced five drought/water-scarcity events in the past ten years: the summers of 2003, 2006, 2007 and 2012, and the 2011-2012 winter season. The basic approach to these crises had been observation and post-event evaluation; from 2007, an advanced numerical modelling system, the Drought Early Warning System for the Po River (DEWS-Po), was developed, providing advanced tools to simulate the hydrological and anthropic processes that affect river flows and allowing events to be followed with real-time evaluations. In early 2012 the same system also enabled forecasts. The DEWS-Po system gives a real-time representation of water distribution across the basin, which is characterized by high anthropogenic pressure, using specific tools to optimize water allocation in competing situations. The system represents an innovative approach to drought forecasting and water resource management in the Po basin, feeding deterministic and probabilistic meteorological forecasts into a chain of distributed numerical hydrological and hydraulic simulations. The system architecture is designed to receive as input observed and forecasted hydro-meteorological variables: deterministic meteorological forecasts with a fifteen-day lead time, withdrawal data for different uses, and storage and release data for natural and artificial reservoirs. The model is highly detailed, also simulating the interaction between the Adriatic Sea and the Po river in the delta area in terms of salt-intrusion forecasting. Calculation of return periods through the run method and of stochastic drought indicators is enabled to assess the characteristics of the ongoing and forecasted event. 
An Inter-institutional Technical Board has been constituted within the Po River Basin Authority since 2008 and meets regularly during water crises to take decisions on water management in order to prevent major impacts. The Board is made up of experts from public administrations, with strong involvement of stakeholder representatives of the different water uses. The DEWS-Po was used intensively by the Technical Board as a decision support system during the 2012 summer event, providing tools to understand the ongoing situation of water availability and use across the basin and helping to evaluate water management choices in an objective way, through what-if scenarios considering withdrawal reductions and increased releases from the regulated Alpine lakes. A description of the use of the DEWS-Po system within the Technical Board is given, focusing especially on those elements, which can be considered "good management indicators", that proved most useful in ensuring the success of governance action. Strengths and improvement needs of the system are then described.

  10. In defense of the stethoscope.

    PubMed

    Murphy, Raymond Lh

    2008-03-01

    The stethoscope is widely considered to be an unreliable instrument. Many studies document the significant observer variability in its use. Numerous other diagnostic tools are available that are generally regarded to provide more reliable diagnostic information. Some even argue that teaching of the ancient art should be de-emphasized in medical schools. Yet auscultation with an acoustic stethoscope can provide important, even life-saving, information. The purpose of this article is to present evidence that supports the use of the stethoscope in clinical medicine. The argument for the stethoscope will be made by presenting relevant investigations, including clinical studies acknowledged to meet the criteria of evidence-based medicine. It will focus on studies that have employed computerized acoustic technology to correlate lung sounds with disease states. This technology has advanced in recent years, which has stimulated a resurgence of interest in auscultation. Numerous studies have been done that utilized objective methods that circumvented the problem of observer variability. There is now a good deal of scientific evidence to support the hypothesis that lung sounds contain information that is clinically useful. This technology also allows this information to be collected more efficiently than previously possible. Advances in educational technology have made it possible to impart information on auscultation much more easily than was possible in the past. Contrary to predictions, the stethoscope is not likely to be relegated to the museum shelf in the near future. Computer technology is making it an even more useful clinical instrument.

  11. Cultural Resources Collection Analysis Albeni Falls Project, Northern Idaho.

    DTIC Science & Technology

    1987-01-01

    A varied assemblage of tools including flaked and ground stone was documented: bifacial tools, drills, gravers, scrapers, numerous pestles and mortars, bolas stones, nephrite adzes, notched pebbles or net weights, an atlatl weight, and several unique incised and carved items, among them a zoomorphic pestle fragment.

  12. Preface to advances in numerical simulation of plasmas

    NASA Astrophysics Data System (ADS)

    Parker, Scott E.; Chacon, Luis

    2016-10-01

    This Journal of Computational Physics Special Issue, titled "Advances in Numerical Simulation of Plasmas," presents a snapshot of the international state of the art in the field of computational plasma physics. The articles herein are a subset of the topics presented as invited talks at the 24th International Conference on the Numerical Simulation of Plasmas (ICNSP), August 12-14, 2015 in Golden, Colorado. The choice of papers was highly selective. The ICNSP is held every other year and is the premier scientific meeting in the field of computational plasma physics.

  13. Approximating a retarded-advanced differential equation that models human phonation

    NASA Astrophysics Data System (ADS)

    Teodoro, M. Filomena

    2017-11-01

    In [1, 2, 3] we obtained the numerical solution of a linear mixed-type functional differential equation (MTFDE), introduced initially in [4], considering the autonomous and non-autonomous cases with collocation, least squares and finite element methods using B-spline basis sets. The present work introduces a numerical scheme using the least squares method (LSM) and Gaussian basis functions to solve numerically a nonlinear mixed-type equation with symmetric delay and advance which models human phonation. The preliminary results are promising: we obtain an accuracy comparable with the previous results.
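    The least-squares-with-Gaussian-basis idea the abstract describes can be illustrated on a deliberately simpler problem. The sketch below applies LSM with Gaussian basis functions to the toy equation u'(t) + u(t) = 0, u(0) = 1, standing in for the mixed-type phonation equation; the number of centres, the width s, the collocation grid and the boundary weight are all illustrative assumptions.

```python
import numpy as np

def gaussian_basis(t, centers, s):
    """phi_k(t) = exp(-(t - c_k)^2 / (2 s^2)) for every centre c_k."""
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * s**2))

def gaussian_basis_deriv(t, centers, s):
    """d/dt of phi_k(t)."""
    return gaussian_basis(t, centers, s) * (-(t[:, None] - centers[None, :]) / s**2)

def lsm_gaussian_ode(a=1.0, T=2.0, n_centers=12, s=0.25, n_colloc=80, bc_w=10.0):
    """Least-squares collocation for u'(t) + a u(t) = 0, u(0) = 1.

    The trial solution u(t) = sum_k c_k phi_k(t) minimises the squared
    ODE residual at the collocation points, with the initial condition
    appended as one extra weighted equation.
    """
    centers = np.linspace(0.0, T, n_centers)
    t = np.linspace(0.0, T, n_colloc)
    A = gaussian_basis_deriv(t, centers, s) + a * gaussian_basis(t, centers, s)
    b = np.zeros(n_colloc)
    A = np.vstack([A, bc_w * gaussian_basis(np.array([0.0]), centers, s)])
    b = np.append(b, bc_w * 1.0)
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return lambda tq: gaussian_basis(tq, centers, s) @ coef
```

    The returned callable can be compared against the exact solution exp(-t); the same residual-minimisation structure carries over when the residual also involves delayed and advanced arguments.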

  14. The Yambo code: a comprehensive tool to perform ab-initio simulations of equilibrium and out-of-equilibrium properties

    NASA Astrophysics Data System (ADS)

    Marini, Andrea

    Density functional theory and many-body perturbation theory methods (such as GW and the Bethe-Salpeter equation) are standard approaches to the equilibrium ground- and excited-state properties of condensed matter systems, surfaces, molecules and several other kinds of materials. At the same time, ultra-fast optical spectroscopy is becoming a widely used and powerful tool for the observation of out-of-equilibrium dynamical processes. Here the theoretical tools (such as the Baym-Kadanoff equations) are well known but have only recently been merged with the ab-initio approach, and for this reason highly parallel and efficient codes are lacking. Nevertheless, the combination of these two areas of research represents, for the ab-initio community, a challenging perspective, as it requires the development of advanced theoretical, methodological and numerical tools. Yambo is a popular community software implementing the above methods using plane waves and pseudo-potentials. Yambo is available to the community as open-source software and is oriented to high-performance computing. The Yambo project aims at making the simulation of these equilibrium and out-of-equilibrium complex processes available to a wide community of users; indeed, the code is used in many countries, well beyond the European borders. Yambo is a member of the suite of codes of the MAX European Center of Excellence (Materials design at the exascale). It is also used by the user facilities of the European Spectroscopy Facility and of the NFFA European Center (nanoscience foundries & fine analysis). In this talk I will discuss some recent numerical and methodological developments that have been implemented in Yambo towards the exploitation of next-generation HPC supercomputers. In particular, I will present the hybrid MPI+OpenMP parallelization and the specific case of the response-function calculation. 
I will also discuss the future plans of the Yambo project and its potential use as a tool for science dissemination, including in developing countries. ETSF, MAX European Center of Excellence and NFFA European Center.

  15. Frequency domain finite-element and spectral-element acoustic wave modeling using absorbing boundaries and perfectly matched layer

    NASA Astrophysics Data System (ADS)

    Rahimi Dalkhani, Amin; Javaherian, Abdolrahim; Mahdavi Basir, Hadi

    2018-04-01

    Wave propagation modeling, a vital tool in seismology, can be carried out with several different numerical methods, among them the finite-difference, finite-element, and spectral-element methods (FDM, FEM and SEM). Some advanced applications in seismic exploration benefit from frequency-domain modeling. Given their flexibility with complex geological models and in handling the free-surface boundary condition, we studied the frequency-domain acoustic wave equation using FEM and SEM. The results demonstrated that the frequency-domain FEM and SEM achieve good accuracy and numerical efficiency with second-order interpolation polynomials. Furthermore, we developed the second-order Clayton and Engquist absorbing boundary condition (CE-ABC2) and compared it with the perfectly matched layer (PML) for the frequency-domain FEM and SEM. Unlike the PML method, CE-ABC2 does not add any computational cost to the modeling beyond assembling the boundary matrices. As a result, CE-ABC2 is more efficient than PML for frequency-domain acoustic wave propagation modeling, especially when the computational cost is high and a high level of absorbing performance is unnecessary.
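    A minimal frequency-domain analogue of the schemes discussed above can be sketched in one dimension: the Helmholtz equation with a point source, discretized by finite differences, with first-order absorbing (Sommerfeld-type) conditions at both ends. This is a generic illustration under assumed grid parameters, not the paper's FEM/SEM implementation or its CE-ABC2/PML formulations.

```python
import numpy as np

def helmholtz_1d(n=401, length=1.0, wavelengths=5.0):
    """Solve u'' + k^2 u = delta(x - x_s) on [0, L] with outgoing-wave
    (first-order absorbing) boundary conditions, by finite differences."""
    dx = length / (n - 1)
    k = 2.0 * np.pi * wavelengths / length
    A = np.zeros((n, n), dtype=complex)
    b = np.zeros(n, dtype=complex)
    for j in range(1, n - 1):            # interior: centred second difference
        A[j, j - 1] = 1.0 / dx**2
        A[j, j] = -2.0 / dx**2 + k**2
        A[j, j + 1] = 1.0 / dx**2
    # absorbing ends: du/dx + i k u = 0 (left), du/dx - i k u = 0 (right)
    A[0, 0] = -1.0 / dx + 1j * k
    A[0, 1] = 1.0 / dx
    A[-1, -2] = -1.0 / dx
    A[-1, -1] = 1.0 / dx - 1j * k
    b[n // 2] = 1.0 / dx                 # discrete point source at the centre
    return np.linalg.solve(A, b), dx, k
```

    The exact outgoing solution exp(ik|x - x_s|)/(2ik) has constant magnitude 1/(2k); with effective absorbing ends the computed |u| stays nearly flat, whereas reflecting ends would produce a pronounced standing-wave pattern.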

  16. A mathematical model for simulating noise suppression of lined ejectors

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.

    1994-01-01

    A mathematical model containing the essential features embodied in the noise suppression of lined ejectors is presented. Although some simplification of the physics is necessary to render the model mathematically tractable, the present model is the most versatile and technologically advanced currently available. A system of linearized equations and the boundary conditions governing the sound field are derived starting from the equations of fluid dynamics. A nonreflecting boundary condition is developed. In view of the complex nature of the equations, a parametric study requires the use of numerical techniques and modern computers. A finite element algorithm that solves the differential equations coupled with the boundary condition is then introduced. The numerical method results in a matrix equation with several hundred thousand degrees of freedom that is solved efficiently on a supercomputer. The model is validated by comparing results either with exact solutions or with approximate solutions from other works. In each case, excellent correlations are obtained. The usefulness of the model as an optimization tool and the importance of variable-impedance liners as a mechanism for achieving broadband suppression within a lined ejector are demonstrated.

  17. Modeling combined heat transfer in an all solid state optical cryocooler

    NASA Astrophysics Data System (ADS)

    Kuzhiveli, Biju T.

    2017-12-01

    Attaining a cooling effect by using laser-induced anti-Stokes fluorescence in solids appears to have several advantages over conventional mechanical systems and has been the topic of recent analysis and experimental work. Using the anti-Stokes fluorescence phenomenon to remove heat from a glass by pumping it with laser light provides a sound physical basis for solid-state cooling. Cryocooling by fluorescence is a feasible route to compactness and reliability. It occupies a distinct niche in the family of small-capacity cryocoolers and is undergoing revolutionary advances. In pursuit of developing a laser-induced anti-Stokes fluorescent cryocooler, numerical tools are required that support the thermal design by providing a thorough analysis of the combined heat-transfer mechanisms within the cryocooler. The paper presents the details of the numerical model developed for the cryocooler and the subsequent development of a computer program. The program has been used to understand the various heat-transfer mechanisms and is being used for the thermal design of components of an anti-Stokes fluorescent cryocooler.

  18. Numerical Modelling of Staged Combustion Aft-Injected Hybrid Rocket Motors

    NASA Astrophysics Data System (ADS)

    Nijsse, Jeff

    The staged combustion aft-injected hybrid (SCAIH) rocket motor is a promising design for the future of hybrid rocket propulsion. Advances in computational fluid dynamics and scientific computing have made computational modelling an effective tool in hybrid rocket motor design and development. The focus of this thesis is the numerical modelling of the SCAIH rocket motor in a turbulent combustion, high-speed, reactive flow framework accounting for solid soot transport and radiative heat transfer. The SCAIH motor is modelled with a shear coaxial injector with liquid oxygen injected in the center at sub-critical conditions: 150 K and 150 m/s (Mach ≈ 0.9), and a gas-generator gas-solid mixture of one-third carbon soot by mass injected in the annular opening at 1175 K and 460 m/s (Mach ≈ 0.6). Flow conditions in the near-injector region and the flame-anchoring mechanism are of particular interest. Overall, the flow is shown to exhibit instabilities, and the flame is shown to anchor directly on the injector faceplate with temperatures in excess of 2700 K.

  19. How to identify dislocations in molecular dynamics simulations?

    NASA Astrophysics Data System (ADS)

    Li, Duo; Wang, FengChao; Yang, ZhenYu; Zhao, YaPu

    2014-12-01

    Dislocations are of great importance in revealing the underlying mechanisms of deformed crystalline solids. With the development of computational facilities and technologies, observations of dislocations at the atomic level through numerical simulations have become possible. Molecular dynamics (MD) simulation suggests itself as a powerful tool for understanding and visualizing the creation of dislocations as well as the evolution of crystal defects. However, the raw results of large-scale MD simulations are not very illuminating by themselves, and various techniques exist for analyzing dislocations and deformed crystal structures. It is thus a considerable challenge for beginners in this community to choose a proper method with which to start their investigations. In this review, we summarize and discuss twelve existing structure-characterization methods used in MD simulations of deformed crystalline solids. A comprehensive comparison is made between the advantages and disadvantages of these typical techniques. We also examine some recent advances in the dynamics of dislocations related to hydraulic fracturing. It was found that dislocation emission has a significant effect on the propagation and bifurcation of the crack tip in hydraulic fracturing.
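    Among the simplest of the characterization measures such reviews cover is the centrosymmetry parameter (Kelchner et al.), which vanishes for an atom in a perfect centrosymmetric environment such as FCC and grows near defects. The sketch below is a simplified single-atom version; the greedy "sum the N/2 smallest pair values" shortcut and the test lattice are illustrative assumptions, not a drop-in replacement for production analysis codes.

```python
import numpy as np

def centrosymmetry(neighbors):
    """Simplified centrosymmetry parameter for one atom.

    neighbors : (N, 3) array of bond vectors from the atom to its N
    nearest neighbours (N = 12 for FCC).  For every pair (i, j) compute
    |r_i + r_j|^2 and sum the N/2 smallest values; a perfectly
    centrosymmetric environment pairs each bond with its exact
    opposite, giving a parameter of zero.
    """
    n = len(neighbors)
    pair_vals = []
    for i in range(n):
        for j in range(i + 1, n):
            d = neighbors[i] + neighbors[j]
            pair_vals.append(float(d @ d))
    return float(np.sum(np.sort(pair_vals)[: n // 2]))

def fcc_neighbors(a=1.0):
    """The 12 nearest-neighbour bond vectors of an FCC lattice site."""
    base = [(1, 1, 0), (1, 0, 1), (0, 1, 1),
            (-1, 1, 0), (-1, 0, 1), (0, -1, 1)]
    vecs = []
    for v in base:
        vecs.append(np.array(v, dtype=float) * (a / 2.0))
        vecs.append(-np.array(v, dtype=float) * (a / 2.0))
    return np.array(vecs)
```

    Displacing a single neighbour, as a dislocation core or other defect would, immediately makes the parameter nonzero, which is exactly how the measure is used to flag defective atoms in MD snapshots.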

  20. Recent advances in systems metabolic engineering tools and strategies.

    PubMed

    Chae, Tong Un; Choi, So Young; Kim, Je Woong; Ko, Yoo-Sung; Lee, Sang Yup

    2017-10-01

    Metabolic engineering has been playing increasingly important roles in developing microbial cell factories for the production of various chemicals and materials toward a sustainable chemical industry. Nowadays, many tools and strategies are available for systems metabolic engineering, which enables metabolic engineering at the systems level in more sophisticated and diverse ways by adopting the rapidly advancing methodologies and tools of systems biology, synthetic biology and evolutionary engineering. As an outcome, the development of more efficient microbial cell factories has become possible. Here, we review recent advances in systems metabolic engineering tools and strategies together with accompanying application examples. In addition, we describe how these tools and strategies work together in simultaneous and synergistic ways to develop novel microbial cell factories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. State of the art of sonic boom modeling

    NASA Astrophysics Data System (ADS)

    Plotkin, Kenneth J.

    2002-01-01

    Based on fundamental theory developed through the 1950s and 1960s, sonic boom modeling has evolved into practical tools. Over the past decade, there have been requirements for design tools for an advanced supersonic transport, and for tools for environmental assessment of various military and aerospace activities. This has resulted in a number of advances in the understanding of the physics of sonic booms, including shock wave rise times, propagation through turbulence, and blending sonic boom theory with modern computational fluid dynamics (CFD) aerodynamic design methods. This article reviews the early fundamental theory, recent advances in theory, and the application of these advances to practical models.

  3. Tsunami-induced boulder transport - combining physical experiments and numerical modelling

    NASA Astrophysics Data System (ADS)

    Oetjen, Jan; Engel, Max; May, Simon Matthias; Schüttrumpf, Holger; Brueckner, Helmut; Prasad Pudasaini, Shiva

    2016-04-01

    Coasts are crucial areas for living, the economy, recreation, transportation, and various sectors of industry. Many of them are exposed to high-energy wave events. Given the ongoing population growth in low-elevation coastal areas, there is an urgent need to develop suitable management measures, especially for hazards like tsunamis. These measures require supporting tools which allow an exact estimation of impact parameters such as inundation height, inundation area, and wave energy. Focusing on tsunamis, geological archives can provide essential information on frequency and magnitude on longer time scales in order to support coastal hazard management. While fine-grained deposits may quickly be altered after deposition, multi-ton coarse clasts (boulders) represent an information source on past tsunami events with a much higher preservation potential. Applying numerical hydrodynamically coupled boulder transport models (BTM) is a commonly used approach to analyse characteristics (e.g. wave height, flow velocity) of the corresponding tsunami. Correct computations of tsunamis and the induced boulder transport can provide essential event-specific information, including wave heights, runup and direction. Although several valuable numerical models for tsunami-induced boulder transport exist (e.g. Goto et al., 2007; Imamura et al., 2008), some important basic aspects of both tsunami hydrodynamics and the corresponding boulder transport have not yet been entirely understood. Our project therefore addresses four crucial aspects of boulder transport by a tsunami: (i) the influence of sediment load, (ii) the influence of complex boulder shapes other than idealized rectangular shapes, (iii) momentum transfer between multiple boulders, and (iv) the influence of non-uniform bathymetries and topographies on both the tsunami and the boulder. 
The investigation of these aspects in physical experiments and their correct implementation in an advanced model is an urgent need, since they have been largely neglected. In order to close these gaps, we develop a novel BTM in two steps. First, scaled physical experiments are performed to determine the exact hydrodynamic processes within a tsunami during boulder transport. These experiments also provide the basis for calibrating the numerical BTM. The BTM is based on the numerical two-phase mass-flow model of Pudasaini (2012), which employs an advanced and unified high-resolution computational tool for mixtures consisting of solid and fluid components and their interactions. This allows the motion of the boulder to be resolved while it interacts with the particle-laden tsunami on the inundated coastal plain, as a function of the total fluid and solid stresses. Our approach leads to fundamentally new insights into the essential physical processes in BTM. Goto, K., Chavanich, S.A., Imamura, F., Kunthasap, P., Matsui, T., Minoura, K., Sugawara, D. and Yanagisawa, H.: Distribution, origin and transport process of boulders deposited by the 2004 Indian Ocean tsunami at Pakarang Cape, Thailand. Sediment. Geol., 202, 821-837, 2007. Imamura, F., Goto, K. and Ohkubo, S.: A numerical model of the transport of a boulder by tsunami. J. Geophys. Res. Oceans, 113, C01008, 2008. Pudasaini, S.P.: A general two-phase debris flow model. J. Geophys. Res. Earth Surf., 117, F03010, 2012.

  4. A randomized, controlled trial of in situ pediatric advanced life support recertification ("pediatric advanced life support reconstructed") compared with standard pediatric advanced life support recertification for ICU frontline providers*.

    PubMed

    Kurosawa, Hiroshi; Ikeyama, Takanari; Achuff, Patricia; Perkel, Madeline; Watson, Christine; Monachino, Annemarie; Remy, Daphne; Deutsch, Ellen; Buchanan, Newton; Anderson, Jodee; Berg, Robert A; Nadkarni, Vinay M; Nishisaki, Akira

    2014-03-01

    Recent evidence shows poor retention of Pediatric Advanced Life Support provider skills. Frequent refresher training and in situ simulation are promising interventions. We developed a "Pediatric Advanced Life Support-reconstructed" recertification course by deconstructing the training into six 30-minute in situ simulation scenario sessions delivered over 6 months. We hypothesized that in situ Pediatric Advanced Life Support-reconstructed implementation is feasible and as effective as standard Pediatric Advanced Life Support recertification. A prospective randomized, single-blinded trial. Single-center, large, tertiary PICU in a university-affiliated children's hospital. Nurses and respiratory therapists in PICU. Simulation-based modular Pediatric Advanced Life Support recertification training. Simulation-based pre- and postassessment sessions were conducted to evaluate participants' performance. Video-recorded sessions were rated by trained raters blinded to allocation. The primary outcome was skill performance measured by a validated Clinical Performance Tool, and the secondary outcome was behavioral performance measured by a Behavioral Assessment Tool. A mixed-effect model was used to account for baseline differences. Forty participants were prospectively randomized to Pediatric Advanced Life Support reconstructed versus standard Pediatric Advanced Life Support with no significant difference in demographics. Clinical Performance Tool score was similar at baseline in both groups and improved after Pediatric Advanced Life Support reconstructed (pre, 16.3 ± 4.1 vs post, 22.4 ± 3.9; p < 0.001), but not after standard Pediatric Advanced Life Support (pre, 14.3 ± 4.7 vs post, 14.9 ± 4.4; p = 0.59). Improvement of Clinical Performance Tool was significantly higher in Pediatric Advanced Life Support reconstructed compared with standard Pediatric Advanced Life Support (p = 0.006). 
Behavioral Assessment Tool improved in both groups: Pediatric Advanced Life Support reconstructed (pre, 33.3 ± 4.5 vs post, 35.9 ± 5.0; p = 0.008) and standard Pediatric Advanced Life Support (pre, 30.5 ± 4.7 vs post, 33.6 ± 4.9; p = 0.02), with no significant difference of improvement between both groups (p = 0.49). For PICU-based nurses and respiratory therapists, simulation-based "Pediatric Advanced Life Support-reconstructed" in situ training is feasible and more effective than standard Pediatric Advanced Life Support recertification training for skill performance. Both Pediatric Advanced Life Support recertification training courses improved behavioral performance.

  5. Analysis instruments for the performance of Advanced Practice Nursing.

    PubMed

    Sevilla-Guerra, Sonia; Zabalegui, Adelaida

    2017-11-29

    Advanced Practice Nursing has been a reality in the international context for several decades, and new nursing profiles that follow this model have recently been developed in Spain as well. The consolidation of these advanced practice roles has also led to the creation of tools that attempt to define and evaluate their functions. This study aims to identify and explore the existing instruments that enable the domains of Advanced Practice Nursing to be defined. A review of existing international questionnaires and instruments was undertaken, including an analysis of the design process, the domains/dimensions defined, the main results and an exploration of clinimetric properties. Seven studies were analysed, but not all proved to be valid, stable or reliable tools. One included tool was able to differentiate between the functions of the general nurse and the advanced practice nurse by the level of activities undertaken within the five domains described. Such tools are necessary to evaluate the scope of advanced practice in new nursing roles that correspond to other international models of competencies and practice domains. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  6. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computational algorithms, as well as on high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computational domain is used, there are external boundaries, on which boundary conditions simulating the solution outside the computational domain are to be imposed. Inside the computational domain, there may be internal boundaries, on which boundary conditions simulating the presence of an object or a surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.

  7. Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR).

    PubMed

    O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S

    2018-01-09

    The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely review, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow including (1) fostering better understanding about available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and reconfirm ICASR members' commitment toward systematic reviews' automation.

  8. Visualization of multiple influences on ocellar flight control in giant honeybees with the data-mining tool Viscovery SOMine.

    PubMed

    Kastberger, G; Kranner, G

    2000-02-01

    Viscovery SOMine is a software tool for advanced analysis and monitoring of numerical data sets. It was developed for professional use in business, industry, and science, and supports dependency analysis, deviation detection, unsupervised clustering, nonlinear regression, data association, pattern recognition, and animated monitoring. Based on the concept of self-organizing maps (SOMs), it employs a robust variant of unsupervised neural networks, namely Kohonen's Batch-SOM, further enhanced with a new scaling technique that speeds up the learning process. This tool provides a powerful means of analyzing complex data sets without prior statistical knowledge. The data representation contained in the trained SOM is systematically converted for use in a spectrum of visualization techniques, such as evaluating dependencies between components, investigating geometric properties of the data distribution, searching for clusters, or monitoring new data. We have used this software tool to analyze and visualize multiple influences of the ocellar system on free-flight behavior in giant honeybees. Occlusion of the ocelli affects orienting reactivities in relation to flight target, level of disturbance, and position of the bee in the flight chamber; it induces phototaxis and makes orienting imprecise and dependent on motivational settings. Ocelli permit the adjustment of orienting strategies to environmental demands by enforcing abilities such as centering or flight kinetics and by providing independent control of posture and flight course.
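    The Batch-SOM variant described above replaces the per-sample learning rate of the classical Kohonen update with an epoch-wise, neighborhood-weighted mean over all samples. A minimal sketch of that idea (function and parameter names are ours, and SOMine's proprietary scaling speed-up is not reproduced):

```python
import numpy as np

def batch_som(data, grid_w=4, grid_h=4, epochs=20, sigma0=1.5, seed=0):
    """Minimal batch SOM: per epoch, (1) assign each sample to its
    best-matching unit (BMU), then (2) move every unit to the
    neighborhood-weighted mean of the data, with a shrinking radius."""
    rng = np.random.default_rng(seed)
    n = len(data)
    n_units = grid_w * grid_h
    # Grid coordinates of the units, used only for the neighborhood kernel.
    coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)], float)
    grid_d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    codebook = data[rng.choice(n, n_units, replace=False)].astype(float)
    for epoch in range(epochs):
        # Geometric shrink of the Gaussian neighborhood radius.
        sigma = sigma0 * (0.01 / sigma0) ** (epoch / max(epochs - 1, 1))
        dists = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        bmu = dists.argmin(1)                    # step (1): BMU per sample
        h = np.exp(-grid_d2 / (2 * sigma ** 2))  # unit-to-unit neighborhood
        w = h[bmu]                               # (n_samples, n_units)
        denom = w.sum(0)[:, None]
        # Step (2): each unit becomes a weighted mean of nearby-mapped data.
        codebook = np.where(denom > 0,
                            w.T @ data / np.maximum(denom, 1e-12),
                            codebook)
    dists = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return codebook, dists.argmin(1)
```

    No per-sample learning rate appears anywhere: each epoch is a closed-form weighted-mean update, which is what makes the batch formulation robust and fast.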

  9. Upstream-advancing waves generated by three-dimensional moving disturbances

    NASA Astrophysics Data System (ADS)

    Lee, Seung-Joon; Grimshaw, Roger H. J.

    1990-02-01

    The wave field resulting from a surface pressure or a bottom topography in a horizontally unbounded domain is studied. Upstream-advancing waves successively generated by various forcing disturbances moving with near-resonant speeds are found by numerically solving a forced Kadomtsev-Petviashvili (fKP) equation, which shows in its simplest form the interplay of a basic linear wave operator, longitudinal and transverse dispersion, nonlinearity, and forcing. Curved solitary waves are found as a slowly varying similarity solution of the Kadomtsev-Petviashvili (KP) equation, and are favorably compared with the upstream-advancing waves numerically obtained.
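    For reference, the forced KP balance described above combines a KdV-type longitudinal operator, transverse dispersion, nonlinearity, and forcing. In one common normalization it can be written as follows (the coefficients depend on the chosen scaling, so the constants below are illustrative rather than those used in the paper):

```latex
\left( u_t + c\,u_x + \alpha\, u u_x + \beta\, u_{xxx} + f_x \right)_x + \gamma\, u_{yy} = 0
```

    where \(u\) is the free-surface elevation and \(f\) represents the moving surface-pressure or topography disturbance.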

  10. The atmospheric boundary layer — advances in knowledge and application

    NASA Astrophysics Data System (ADS)

    Garratt, J. R.; Hess, G. D.; Physick, W. L.; Bougeault, P.

    1996-02-01

    We summarise major activities and advances in boundary-layer knowledge in the 25 years since 1970, with emphasis on the application of this knowledge to surface and boundary-layer parametrisation schemes in numerical models of the atmosphere. Progress in three areas is discussed: (i) the mesoscale modelling of selected phenomena; (ii) numerical weather prediction; and (iii) climate simulations. Future trends are identified, including the incorporation into models of advanced cloud schemes and interactive canopy schemes, and the nesting of high resolution boundary-layer schemes in global climate models.

  11. Modeling rapidly spinning, merging black holes with numerical relativity for the era of first gravitational-wave observations

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey; Simulating eXtreme Collaboration; LIGO Scientific Collaboration

    2016-03-01

    The Advanced Laser Interferometer Gravitational-Wave Observatory (Advanced LIGO) began searching for gravitational waves in September 2015, with three times the sensitivity of the initial LIGO experiment. Merging black holes are among the most promising sources of gravitational waves for Advanced LIGO, but near the time of merger, the emitted waves can only be computed using numerical relativity. In this talk, I will present new numerical-relativity simulations of merging black holes, made using the Spectral Einstein Code [black-holes.org/SpEC.html], including cases with black-hole spins that are nearly as fast as possible. I will discuss how such simulations will be able to rapidly follow up gravitational-wave observations, improving our understanding of the waves' sources.

  12. Numerical simulation of coupled electrochemical and transport processes in battery systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, B.Y.; Gu, W.B.; Wang, C.Y.

    1997-12-31

    Advanced numerical modeling to simulate dynamic battery performance characteristics for several types of advanced batteries is being conducted using computational fluid dynamics (CFD) techniques. The CFD techniques provide efficient algorithms to solve a large set of highly nonlinear partial differential equations that represent the complex battery behavior governed by coupled electrochemical reactions and transport processes. The authors have recently successfully applied such techniques to model advanced lead-acid, Ni-Cd and Ni-MH cells. In this paper, the authors briefly discuss how the governing equations were numerically implemented, show some preliminary modeling results, and compare them with other modeling or experimental data reported in the literature. The authors describe the advantages and implications of using the CFD techniques and their capabilities in future battery applications.
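    The transport half of such coupled models reduces, in its simplest one-dimensional form, to a diffusion equation per species. A toy finite-volume sketch of that ingredient (illustrative only: production battery codes couple many such equations to electrochemical reaction kinetics and use implicit time integration for stiffness):

```python
import numpy as np

def diffuse_1d(c0, D=1e-9, dx=1e-5, dt=0.02, steps=500):
    """Explicit finite-volume update for dc/dt = D * d2c/dx2 with
    zero-flux walls; conserves total species mass exactly."""
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme would be unstable"
    c = np.asarray(c0, float).copy()
    for _ in range(steps):
        flux = np.diff(c)                      # proportional to face fluxes
        c[1:-1] += r * (flux[1:] - flux[:-1])  # interior volumes
        c[0] += r * flux[0]                    # zero-flux left wall
        c[-1] -= r * flux[-1]                  # zero-flux right wall
    return c
```

    The stability bound r <= 1/2 on the explicit scheme is one reason real codes prefer implicit solvers for the stiff, coupled battery equations.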

  13. Rotary Percussive Auto-Gopher for Deep Drilling and Sampling

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Badescu, Mircea; Sherrit, Stewart

    2009-01-01

    The term "rotary percussive auto-gopher" denotes a proposed addition to a family of apparatuses, based on ultrasonic/sonic drill corers (USDCs), that have been described in numerous previous NASA Tech Briefs articles. These apparatuses have been designed, variously, for boring into, and/or acquiring samples of, rock or other hard, brittle materials of geological interest. In the case of the rotary percussive auto-gopher, the emphasis would be on developing an apparatus capable of penetrating to, and acquiring samples at, depths that could otherwise be reached only by use of much longer, heavier, conventional drilling-and-sampling apparatuses. To recapitulate from the prior articles about USDCs: A USDC can be characterized as a lightweight, low-power jackhammer in which a piezoelectrically driven actuator generates ultrasonic vibrations and is coupled to a tool bit through a free mass. The bouncing of the free mass between the actuator horn and the drill bit converts the ultrasonic vibrations of the actuator into sonic hammering of the drill bit. The combination of ultrasonic and sonic vibrations gives rise to a hammering action (and a resulting chiseling action at the tip of the tool bit) that is more effective for drilling than is the microhammering action of ultrasonic vibrations alone. The hammering and chiseling actions are so effective that the axial force needed to make the tool bit advance into soil, rock, or another material of interest is much smaller than in ordinary rotary drilling, ordinary hammering, or ordinary steady pushing. The predecessor of the rotary percussive auto-gopher is an apparatus, now denoted an ultrasonic/sonic gopher and previously denoted an ultrasonic gopher, described in "Ultrasonic/Sonic Mechanism for Drilling and Coring" (NPO-30291), NASA Tech Briefs Vol. 27, No. 9 (September 2003), page 65. The ultrasonic/sonic gopher is intended for use mainly in acquiring cores.
The name of the apparatus reflects the fact that, like a gopher, it periodically stops advancing at the end of the hole to bring excavated material (in this case, a core sample) to the surface, then re-enters the hole to resume the advance of the end of the hole. By use of a cable suspended from a reel on the surface, the gopher is lifted from the hole to remove a core sample, then lowered into the hole to resume the advance and acquire the next core sample.

  14. Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning

    DTIC Science & Technology

    2009-06-01

    Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning. Primary Topic: Track 5, Experimentation and Analysis. Walter A. Powell [STUDENT], GMU. Abstract: Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet

  15. Domain decomposition and matching for time-domain analysis of motions of ships advancing in head sea

    NASA Astrophysics Data System (ADS)

    Tang, Kai; Zhu, Ren-chuan; Miao, Guo-ping; Fan, Ju

    2014-08-01

    A domain decomposition and matching method in the time domain is outlined for simulating the motions of ships advancing in waves. The flow field is decomposed into inner and outer domains by an imaginary control surface; the Rankine source method is applied to the inner domain, while the transient Green function method is used in the outer domain. The two initial boundary value problems are matched on the control surface. The corresponding numerical codes are developed, and the added masses, wave exciting forces, and motions of ships advancing in head sea are presented and verified for the Series 60 ship and the S175 containership. Good agreement is obtained when the numerical results are compared with experimental data and other references. The method is efficient because panel discretization is required only in the inner domain during the numerical calculation, and good numerical stability is demonstrated, avoiding the divergence problem for ships with flare.

  16. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    NASA Astrophysics Data System (ADS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on cutting tool surface in turning processes due to wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along cutting tool surface can be analysed, and the worn surface shape during the workpiece machining can be determined. The proposed model analyses the gradual degradation of cutting tool during turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained for description of material loss on cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from bibliography and experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
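    The abstract does not give the wear-rate laws themselves; as a hedged illustration of how a wear-land history VB(t) can be obtained by integrating a rate law over cutting time, here is a sketch using a generic Usui-type adhesive wear rate (the constants A and B are placeholders, not values calibrated for WC-Co on AISI 4340, and the integral is only a proxy for the geometric wear-land width):

```python
import math

def usui_wear_rate(normal_stress, sliding_velocity, temperature,
                   A=7.8e-13, B=2.5e3):
    """Generic Usui-type adhesive wear rate dW/dt = A * sigma_n * v_s * exp(-B/T).

    A and B are material-pair constants (placeholder values here);
    temperature is the absolute tool-chip interface temperature in kelvin."""
    return A * normal_stress * sliding_velocity * math.exp(-B / temperature)

def flank_wear_history(sigma_n, v_s, T, dt=1.0, steps=600):
    """Integrate the wear rate over cutting time; the running integral
    is treated as a proxy for the wear-land width VB(t)."""
    vb, history = 0.0, []
    for _ in range(steps):
        vb += usui_wear_rate(sigma_n, v_s, T) * dt
        history.append(vb)
    return history
```

    The exponential temperature dependence is what couples wear growth to cutting conditions: hotter interfaces wear faster for the same contact stress and sliding speed.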

  17. Microcomputer-Based Access to Machine-Readable Numeric Databases.

    ERIC Educational Resources Information Center

    Wenzel, Patrick

    1988-01-01

    Describes the use of microcomputers and relational database management systems to improve access to numeric databases by the Data and Program Library Service at the University of Wisconsin. The internal records management system, in-house reference tools, and plans to extend these tools to the entire campus are discussed. (3 references) (CLB)

  18. Designing and Operating Through Compromise: Architectural Analysis of CKMS for the Advanced Metering Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duren, Mike; Aldridge, Hal; Abercrombie, Robert K

    2013-01-01

    Compromises attributable to the Advanced Persistent Threat (APT) highlight the necessity for constant vigilance. The APT provides a new perspective on security metrics (e.g., statistics-based cyber security) and quantitative risk assessments. We consider design principles and models/tools that provide high assurance for energy delivery systems (EDS) operations regardless of the state of compromise. Cryptographic keys must be securely exchanged, then held and protected on either end of a communications link. This is challenging for a utility with numerous substations that must secure the intelligent electronic devices (IEDs) that may comprise a complex control system of systems. For example, distribution and management of keys among the millions of intelligent meters within the Advanced Metering Infrastructure (AMI) is being implemented as part of the National Smart Grid initiative. Without a means for a secure cryptographic key management system (CKMS), no cryptographic solution can be widely deployed to protect the EDS infrastructure from cyber-attack. We consider 1) how security modeling is applied to key management and cyber security concerns on a continuous basis from design through operation, 2) how trusted models and key management architectures greatly impact failure scenarios, and 3) how hardware-enabled trust is a critical element in detecting, surviving, and recovering from attack.

  19. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Chacón, L.; Cappello, S.

    2010-08-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacón, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.

  20. Nonlinear three-dimensional verification of the SPECYL and PIXIE3D magnetohydrodynamics codes for fusion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonfiglio, Daniele; Chacon, Luis; Cappello, Susanna

    2010-01-01

    With the increasing impact of scientific discovery via advanced computation, there is presently a strong emphasis on ensuring the mathematical correctness of computational simulation tools. Such endeavor, termed verification, is now at the center of most serious code development efforts. In this study, we address a cross-benchmark nonlinear verification study between two three-dimensional magnetohydrodynamics (3D MHD) codes for fluid modeling of fusion plasmas, SPECYL [S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996)] and PIXIE3D [L. Chacon, Phys. Plasmas 15, 056103 (2008)], in their common limit of application: the simple viscoresistive cylindrical approximation. SPECYL is a serial code in cylindrical geometry that features a spectral formulation in space and a semi-implicit temporal advance, and has been used extensively to date for reversed-field pinch studies. PIXIE3D is a massively parallel code in arbitrary curvilinear geometry that features a conservative, solenoidal finite-volume discretization in space, and a fully implicit temporal advance. The present study is, in our view, a first mandatory step in assessing the potential of any numerical 3D MHD code for fluid modeling of fusion plasmas. Excellent agreement is demonstrated over a wide range of parameters for several fusion-relevant cases in both two- and three-dimensional geometries.

  1. Efficient hybrid-symbolic methods for quantum mechanical calculations

    NASA Astrophysics Data System (ADS)

    Scott, T. C.; Zhang, Wenxing

    2015-06-01

    We present hybrid symbolic-numerical tools to generate optimized numerical code for rapid prototyping and fast numerical computation, starting from a computer algebra system (CAS) and tailored to any given quantum mechanical problem. Although a major focus concerns the quantum chemistry methods of H. Nakatsuji, which have yielded successful and very accurate eigensolutions for small atoms and molecules, the tools are general and may be applied to any basis-set calculation with a variational principle applied to its linear and non-linear parameters.
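    As a toy illustration of the CAS-to-numerical-code idea (not the authors' toolchain: the expression-tree encoding, function names, and the example trial energy are ours), a symbolic expression can be translated into source text and compiled into a fast numerical callable:

```python
import math

# Tiny stand-in for a CAS: expressions are nested tuples,
#   ('v', name)  variable,  ('c', value)  constant,
#   ('sin', arg) unary call, ('+'|'-'|'*'|'/', lhs, rhs) binary op.
def emit(e):
    """Translate a symbolic expression tree into Python source text."""
    op = e[0]
    if op == 'v':
        return e[1]
    if op == 'c':
        return repr(e[1])
    if op == 'sin':
        return "math.sin(%s)" % emit(e[1])
    return "(%s %s %s)" % (emit(e[1]), op, emit(e[2]))

def compile_expr(e, *params):
    """Compile the generated source into a numerical callable:
    a very loose analogue of CAS-driven code generation."""
    src = "lambda %s: %s" % (", ".join(params), emit(e))
    return eval(src, {"math": math})

# Example: a one-parameter trial "energy" E(a) = a*a - 2*sin(a).
expr = ('-', ('*', ('v', 'a'), ('v', 'a')),
             ('*', ('c', 2.0), ('sin', ('v', 'a'))))
energy = compile_expr(expr, 'a')
```

    Generating a closed-over callable once and evaluating it many times is the payoff: the symbolic layer is touched only at setup, while the inner numerical loop runs compiled expressions.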

  2. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus, although the cost of solar PV is decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices, and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive, and markets are still being designed to leverage their full potential and mitigate their limitations (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output), thereby giving the grid advance warning to schedule ancillary generation more accurately or to curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by grid operators, we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy, and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first area required atmospheric science and engineering research, while the second required detailed knowledge of energy markets and power engineering. Motivated by this background, we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting, especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market.
(b) Development of a sky imager to provide short term forecasts (0-20 min ahead) to improve optimization and control of equipment on distribution feeders with high penetration of solar. Leveraging such tools that have seen extensive use in the atmospheric sciences supports the development of accurate physics-based solar forecast models. Directions for future research are also provided.

  3. Acquaintance to Artificial Neural Networks and use of artificial intelligence as a diagnostic tool for tuberculosis: A review.

    PubMed

    Dande, Payal; Samant, Purva

    2018-01-01

    Tuberculosis [TB] has afflicted numerous nations in the world. As per a report by the World Health Organization [WHO], an estimated 1.4 million TB deaths occurred in 2015, along with an additional 0.4 million deaths from TB among people living with HIV. Most TB deaths can be prevented if the disease is detected at an early stage. The existing diagnostic processes, such as blood tests or sputum tests, are not only tedious but also take a long time to analyze and cannot differentiate between different drug-resistant stages of TB. The search for new, prompt methods of disease detection has been aided by the latest Artificial Intelligence [AI] tools. The Artificial Neural Network [ANN] is one of the important tools being used widely in the diagnosis and evaluation of medical conditions. This review aims to provide a brief introduction to the various AI tools used in TB detection and gives a detailed description of the utilization of ANNs as an efficient diagnostic technique. The paper also provides a critical assessment of ANNs and the existing techniques for the diagnosis of TB. Researchers and practitioners in the field are looking forward to using ANNs and other upcoming AI tools, such as fuzzy logic, genetic algorithms, and artificial intelligence simulation, as promising current and future technologies for tackling the global menace of tuberculosis. The latest advancements in the diagnostic field include the combined use of ANNs with various other AI tools like fuzzy logic, which has led to an increase in the efficacy and specificity of diagnostic techniques.

  4. OSTI.GOV | OSTI, US Dept of Energy Office of Scientific and Technical

    Science.gov Websites


  5. Spray combustion experiments and numerical predictions

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Bulzan, Daniel L.; Chen, Kuo-Huey

    1993-01-01

    The next generation of commercial aircraft will include turbofan engines with performance significantly better than those in the current fleet. Control of particulate and gaseous emissions will also be an integral part of the engine design criteria. These performance and emission requirements present a technical challenge for the combustor: control of the fuel and air mixing and control of the local stoichiometry will have to be maintained much more rigorously than with combustors in current production. A better understanding of the flow physics of liquid fuel spray combustion is necessary. This paper describes recent experiments on spray combustion where detailed measurements of the spray characteristics were made, including local drop-size distributions and velocities. Also, an advanced combustor CFD code has been under development and predictions from this code are compared with experimental results. Studies such as these will provide information to the advanced combustor designer on fuel spray quality and mixing effectiveness. Validation of new fast, robust, and efficient CFD codes will also enable the combustor designer to use them as additional design tools for optimization of combustor concepts for the next generation of aircraft engines.

  6. An Exploratory Analysis for the Selection and Implementation of Advanced Manufacturing Technology by Fuzzy Multi-criteria Decision Making Methods: A Comparative Study

    NASA Astrophysics Data System (ADS)

    Nath, Surajit; Sarkar, Bijan

    2017-08-01

    Advanced Manufacturing Technologies (AMTs) offer opportunities for manufacturing organizations to sharpen their competitiveness and, in turn, their manufacturing effectiveness. Proper selection and evaluation of AMTs is among the most significant tasks in today's modern world. But it involves considerable uncertainty and vagueness, as many conflicting criteria must be dealt with, so the task of selecting and evaluating AMTs becomes very tedious for evaluators, who cannot provide crisp data for the criteria. Fuzzy Multi-criteria Decision Making (MCDM) methods help greatly in dealing with this problem. This paper focuses on the application of two highly capable Fuzzy MCDM methods, COPRAS-G and EVAMIX, and a comparative study between them on some rarely mentioned criteria. Each of the two methods is a powerful evaluation tool in its own right. Although the two methods perform at almost the same level, each takes a quite distinct approach. This uniqueness is revealed by introducing a numerical example of the selection of an AMT.

  7. Analysis of the Yukawa gravitational potential in f(R) gravity. II. Relativistic periastron advance

    NASA Astrophysics Data System (ADS)

    De Laurentis, Mariafelicia; De Martino, Ivan; Lazkoz, Ruth

    2018-05-01

    Alternative theories of gravity may serve to overcome several shortcomings of the standard cosmological model, but in their weak-field limit general relativity must be recovered so as to match the tight constraints at the Solar System scale. Therefore, testing such alternative models at the scale of stellar systems could give a unique opportunity to confirm or rule them out. One of the most straightforward modifications is represented by analytical f(R)-gravity models that introduce a Yukawa-like modification to the Newtonian potential, thus modifying the dynamics of particles. Using the geodesic equations, we have illustrated the amplitude of these modifications. First, we numerically integrated the equations of motion, showing the orbital precession of a particle around a massive object. Second, we computed an analytic expression for the periastron advance of systems whose semimajor axis is much shorter than the Yukawa scale length. Finally, we extended our results to the case of a binary system composed of two massive objects. Our analysis provides a powerful tool to obtain constraints on the underlying theory of gravity using current and forthcoming data sets.
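    A hedged sketch of the first step described, numerically integrating the planar equations of motion in a Yukawa-corrected potential Phi(r) = -(GM/r)(1 + alpha*exp(-r/lambda)) and reading off the periastron advance per orbit (units, initial conditions, and parameter values are illustrative, not those of the paper):

```python
import math

def yukawa_accel(x, y, gm=1.0, alpha=0.0, lam=1.0):
    """Planar acceleration from Phi(r) = -(gm/r)*(1 + alpha*exp(-r/lam));
    alpha = 0 recovers the Newtonian potential."""
    r = math.hypot(x, y)
    yuk = alpha * math.exp(-r / lam)
    dphi_dr = gm * (1 + yuk) / r**2 + gm * yuk / (lam * r)
    return -dphi_dr * x / r, -dphi_dr * y / r

def periastron_advance(alpha=0.0, lam=2.0, dt=2e-4, steps=200000):
    """RK4-integrate an eccentric orbit and return the angle swept between
    two successive periastron passages, minus 2*pi (the shift per orbit)."""
    s = (1.5, 0.0, 0.0, 0.6)   # start at apoapsis (gm = 1, illustrative units)
    def deriv(st):
        px, py, vx, vy = st
        ax, ay = yukawa_accel(px, py, 1.0, alpha, lam)
        return (vx, vy, ax, ay)
    prev_r, turning_down, passages = None, False, []
    for _ in range(steps):
        k1 = deriv(s)
        k2 = deriv(tuple(a + 0.5 * dt * b for a, b in zip(s, k1)))
        k3 = deriv(tuple(a + 0.5 * dt * b for a, b in zip(s, k2)))
        k4 = deriv(tuple(a + dt * b for a, b in zip(s, k3)))
        s = tuple(a + dt / 6 * (b + 2 * c + 2 * d + e)
                  for a, b, c, d, e in zip(s, k1, k2, k3, k4))
        r = math.hypot(s[0], s[1])
        if prev_r is not None:
            if r < prev_r:
                turning_down = True
            elif turning_down:          # r stopped decreasing: periastron
                passages.append(math.atan2(s[1], s[0]))
                turning_down = False
                if len(passages) == 2:
                    break
        prev_r = r
    # Shift per radial period, wrapped into (-pi, pi].
    return ((passages[1] - passages[0] + math.pi) % (2 * math.pi)) - math.pi
```

    With alpha = 0 the orbit closes (Kepler), so any residual shift measures the integrator error; switching on the Yukawa term produces a clearly nonzero precession per orbit, which is the observable the paper constrains.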

  8. Synthetic biology strategies toward heterologous phytochemical production.

    PubMed

    Kotopka, Benjamin J; Li, Yanran; Smolke, Christina D

    2018-06-13

    Covering: 2006 to 2018. Phytochemicals are important sources for the discovery and development of agricultural and pharmaceutical compounds, such as pesticides and medicines. However, these compounds are typically present in low abundance in nature, and the biosynthetic pathways for most phytochemicals are not fully elucidated. Heterologous production of phytochemicals in plant, bacterial, and yeast hosts has been pursued as a potential approach to address sourcing issues associated with many valuable phytochemicals, and more recently has been utilized as a tool to aid in the elucidation of plant biosynthetic pathways. Due to the structural complexity of certain phytochemicals and the associated biosynthetic pathways, reconstitution of plant pathways in heterologous hosts can encounter numerous challenges. Synthetic biology approaches have been developed to address these challenges in areas such as precise control over heterologous gene expression, improving functional expression of heterologous enzymes, and modifying central metabolism to increase the supply of precursor compounds into the pathway. These strategies have been applied to advance plant pathway reconstitution and phytochemical production in a wide variety of heterologous hosts. Here, we review synthetic biology strategies that have been recently applied to advance complex phytochemical production in heterologous hosts.

  9. Particle and nuclear physics instrumentation and its broad connections

    DOE PAGES

    Demarteau, Marcel; Lipton, Ron; Nicholson, Howard; ...

    2016-12-20

    Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. Finally, this symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.

  10. Particle and nuclear physics instrumentation and its broad connections

    NASA Astrophysics Data System (ADS)

    Demarteau, M.; Lipton, R.; Nicholson, H.; Shipsey, I.

    2016-10-01

    Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. This symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.

  11. Particle and nuclear physics instrumentation and its broad connections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demarteau, Marcel; Lipton, Ron; Nicholson, Howard

    Subatomic physics shares with other basic sciences the need to innovate, invent, and develop tools, techniques, and technologies to carry out its mission to explore the nature of matter, energy, space, and time. In some cases, entire detectors or technologies developed specifically for particle physics research have been adopted by other fields of research or in commercial applications. In most cases, however, the development of new devices and technologies by particle physics for its own research has added value to other fields of research or to applications beneficial to society by integrating them in the existing technologies. Thus, detector research and development has not only advanced the current state of technology for particle physics, but has often advanced research in other fields of science and has underpinned progress in numerous applications in medicine and national security. At the same time particle physics has profited immensely from developments in industry and applied them to great benefit for the use of particle physics detectors. Finally, this symbiotic relationship has seen strong mutual benefits with sometimes unexpected far reach.

  12. Ultrastrong coupling of a single artificial atom to an electromagnetic continuum in the nonperturbative regime

    NASA Astrophysics Data System (ADS)

    Forn-Díaz, P.; García-Ripoll, J. J.; Peropadre, B.; Orgiazzi, J.-L.; Yurtalan, M. A.; Belyansky, R.; Wilson, C. M.; Lupascu, A.

    2017-01-01

    The study of light-matter interaction has led to important advances in quantum optics and enabled numerous technologies. Over recent decades, progress has been made in increasing the strength of this interaction at the single-photon level. More recently, a major achievement has been the demonstration of the so-called strong coupling regime, a key advancement enabling progress in quantum information science. Here, we demonstrate light-matter interaction over an order of magnitude stronger than previously reported, reaching the nonperturbative regime of ultrastrong coupling (USC). We achieve this using a superconducting artificial atom tunably coupled to the electromagnetic continuum of a one-dimensional waveguide. For the largest coupling, the spontaneous emission rate of the atom exceeds its transition frequency. In this USC regime, the description of atom and light as distinct entities breaks down, and a new description in terms of hybrid states is required. Beyond light-matter interaction itself, the tunability of our system makes it a promising tool to study a number of important physical systems, such as the well-known spin-boson and Kondo models.

  13. High Fidelity Modeling of Field Reversed Configuration (FRC) Thrusters

    DTIC Science & Technology

    2016-06-01

    ...studies of the physical characteristics of Field Reversed Configuration (FRC) plasma for advanced space propulsion. This effort consists of numerical model development, physical model development, and systematic studies of the non-linear plasma...FRCs for propulsion application. Two of the most advanced designs are based on the theta-pinch formation and the RMF formation mechanism, which...

  14. Regional Sediment Management (RSM) Modeling Tools: Integration of Advanced Sediment Transport Tools into HEC-RAS

    DTIC Science & Technology

    2014-06-01

    Integration of Advanced Sediment Transport Tools into HEC-RAS, by Paul M. Boyd and Stanford A. Gibson. PURPOSE: This Coastal and Hydraulics Engineering Technical Note (CHETN) summarizes the development and initial testing of new sediment transport and modeling tools developed by the U.S. Army Corps...sediment transport within the USACE HEC River Analysis System (HEC-RAS) software package and to determine its applicability to Regional Sediment...

  15. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control. Air Traffic Control is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. Also, this should dispel forever any notion that advanced flow control is the inspirational master valve scheme to be used on the Alaskan Oil Pipeline.

  16. Next-generation sequencing for endocrine cancers: Recent advances and challenges.

    PubMed

    Suresh, Padmanaban S; Venkatesh, Thejaswini; Tsutsumi, Rie; Shetty, Abhishek

    2017-05-01

    Contemporary molecular biology research tools have enriched numerous areas of biomedical research that address challenging diseases, including endocrine cancers (pituitary, thyroid, parathyroid, adrenal, testicular, ovarian, and neuroendocrine cancers). These tools have placed several intriguing clues before the scientific community. Endocrine cancers pose a major challenge in health care and research despite considerable attempts by researchers to understand their etiology. Microarray analyses have provided gene signatures from many cells, tissues, and organs that can differentiate healthy states from diseased ones, and even show patterns that correlate with stages of a disease. Microarray data can also elucidate the responses of endocrine tumors to therapeutic treatments. The rapid progress in next-generation sequencing methods has overcome many of the initial challenges of these technologies, and their advantages over microarray techniques have enabled them to emerge as valuable aids for clinical research applications (prognosis, identification of drug targets, etc.). A comprehensive review describing the recent advances in next-generation sequencing methods and their application in the evaluation of endocrine and endocrine-related cancers is lacking. The main purpose of this review is to illustrate the concepts that collectively constitute our current view of the possibilities offered by next-generation sequencing technological platforms, challenges to relevant applications, and perspectives on the future of clinical genetic testing of patients with endocrine tumors. We focus on recent discoveries in the use of next-generation sequencing methods for clinical diagnosis of endocrine tumors in patients and conclude with a discussion on persisting challenges and future objectives.

  17. The quiet revolution of numerical weather prediction.

    PubMed

    Bauer, Peter; Thorpe, Alan; Brunet, Gilbert

    2015-09-03

    Advances in numerical weather prediction represent a quiet revolution because they have resulted from a steady accumulation of scientific knowledge and technological advances over many years that, with only a few exceptions, have not been associated with the aura of fundamental physics breakthroughs. Nonetheless, the impact of numerical weather prediction is among the greatest of any area of physical science. As a computational problem, global weather prediction is comparable to the simulation of the human brain and of the evolution of the early Universe, and it is performed every day at major operational centres across the world.

  18. Cement bond evaluation method in horizontal wells using segmented bond tool

    NASA Astrophysics Data System (ADS)

    Song, Ruolong; He, Li

    2018-06-01

    Most existing cement evaluation technologies suffer from tool eccentralization due to gravity in highly deviated and horizontal wells. This paper proposes a correction method to lessen the effects of tool eccentralization on cement bond evaluation results obtained with a segmented bond tool, which has an omnidirectional sonic transmitter and eight segmented receivers evenly arranged around the tool 2 ft from the transmitter. Using a 3-D finite-difference parallel numerical simulation method, we investigate the logging responses of the centred and eccentred segmented bond tool in a variety of bond conditions. From the numerical results, we find that the tool eccentricity and channel azimuth can be estimated from the measured sector amplitudes. The average sector amplitude measured with an eccentred tool can be corrected to the value that would be measured with a centred tool, and the corrected amplitude is then used to calculate the channel size. The proposed method is applied to both synthetic and field data. For synthetic data, the method estimates the tool eccentricity with small error, and the bond map is improved after correction. For field data, the estimated tool eccentricity agrees well with the measured well deviation angle. Though the method still suffers from low accuracy in calculating the channel azimuth, the credibility of the corrected bond map is improved, especially in horizontal wells. It offers a way to evaluate the bond condition in horizontal wells using an existing logging tool. The numerical results in this paper can aid the interpretation of segmented-tool measurements in both vertical and horizontal wells.
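    The correction hinges on separating the mean sector amplitude from its azimuthal modulation. As a rough illustration of that idea (not the authors' algorithm; the sector count of eight matches the tool, but the first-harmonic fit and all numbers below are assumptions), one can extract the mean level and the first azimuthal Fourier harmonic of the eight sector amplitudes, whose phase points toward the eccentring direction:

```python
import math

def first_harmonic(amplitudes):
    """Estimate mean level plus first-harmonic magnitude/azimuth of
    azimuthally sampled sector amplitudes (hypothetical sketch; the
    paper's actual correction procedure may differ)."""
    n = len(amplitudes)
    mean = sum(amplitudes) / n
    # Discrete first Fourier harmonic over the n sectors
    a1 = sum(a * math.cos(2 * math.pi * k / n) for k, a in enumerate(amplitudes)) * 2 / n
    b1 = sum(a * math.sin(2 * math.pi * k / n) for k, a in enumerate(amplitudes)) * 2 / n
    magnitude = math.hypot(a1, b1)
    azimuth = math.degrees(math.atan2(b1, a1)) % 360.0
    return mean, magnitude, azimuth

# Synthetic example: a centred tool would read ~100 on all 8 sectors; an
# eccentred tool superimposes a cosine modulation peaking at sector 2 (90 deg).
sectors = [100 + 15 * math.cos(2 * math.pi * (k - 2) / 8) for k in range(8)]
mean, mag, az = first_harmonic(sectors)
print(round(mean, 1), round(mag, 1), round(az, 1))  # mean ≈ 100, modulation ≈ 15, azimuth ≈ 90 deg
```

    In this sketch the mean plays the role of the "centred" amplitude used for channel sizing, while the harmonic magnitude and azimuth estimate the eccentricity and its direction.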

  19. Spike-train spectra and network response functions for non-linear integrate-and-fire neurons.

    PubMed

    Richardson, Magnus J E

    2008-11-01

    Reduced models have long been used as a tool for the analysis of the complex activity taking place in neurons and their coupled networks. Recent advances in experimental and theoretical techniques have further demonstrated the usefulness of this approach. Despite the often gross simplification of the underlying biophysical properties, reduced models can still present significant difficulties in their analysis, with the majority of exact and perturbative results available only for the leaky integrate-and-fire model. Here an elementary numerical scheme is demonstrated which can be used to calculate a number of biologically important properties of the general class of non-linear integrate-and-fire models. Exact results for the first-passage-time density and spike-train spectrum are derived, as well as the linear response properties and emergent states of recurrent networks. Given that the exponential integrate-and-fire model has recently been shown to agree closely with the experimentally measured response of pyramidal cells, the methodology presented here promises to provide a convenient tool to facilitate the analysis of cortical-network dynamics.
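    The paper's scheme works at the level of first-passage-time and response-function calculations; as a minimal, self-contained illustration of the model class itself, the following is a plain forward-Euler integration of the exponential integrate-and-fire neuron (all parameter values are illustrative assumptions, not taken from the paper):

```python
import math

def simulate_eif(I=20.0, tau=20.0, v_rest=-65.0, delta_t=2.0, v_t=-50.0,
                 v_reset=-70.0, v_spike=0.0, dt=0.01, t_max=500.0):
    """Forward-Euler integration of the exponential integrate-and-fire model:
        tau dV/dt = -(V - v_rest) + delta_t*exp((V - v_t)/delta_t) + I.
    With these illustrative values the rheobase is ~13, so I = 20 gives
    tonic firing. Times in ms, voltages in mV."""
    v = v_rest
    spikes = []
    for i in range(int(t_max / dt)):
        dv = (-(v - v_rest) + delta_t * math.exp((v - v_t) / delta_t) + I) / tau
        v += dt * dv
        if v >= v_spike:          # spike detected: record time and reset
            spikes.append(i * dt)
            v = v_reset
    return spikes

spikes = simulate_eif()
print(f"{len(spikes)} spikes; mean ISI ≈ {(spikes[-1] - spikes[0]) / (len(spikes) - 1):.1f} ms")
```

    The exponential term makes the spike upswing explosive, which is why the numerical threshold `v_spike` can sit well above `v_t` without affecting the spike times appreciably.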

  20. Programming biological models in Python using PySB.

    PubMed

    Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K

    2013-01-01

    Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa, and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
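    The macro idea can be sketched without PySB itself: a macro is an ordinary Python function that expands a high-level action into low-level rules, and larger macros are built by calling smaller ones. The sketch below is a hypothetical, stdlib-only illustration; real PySB macros such as `bind` and `catalyze` operate on `Monomer` objects and generate `Rule` instances rather than plain dictionaries:

```python
# Minimal sketch of the "models are programs" idea behind PySB: a macro is
# just a Python function that expands a high-level biochemical action into
# low-level reaction rules. (Illustrative only; real PySB has a richer API.)

def bind(a, b, kf, kr):
    """Macro: reversible binding A + B <-> A:B expands to two rules."""
    complex_name = f"{a}:{b}"
    return [
        {"name": f"bind_{a}_{b}", "reactants": [a, b], "products": [complex_name], "rate": kf},
        {"name": f"unbind_{a}_{b}", "reactants": [complex_name], "products": [a, b], "rate": kr},
    ]

def catalyze(enzyme, substrate, product, kf, kr, kcat):
    """Macro: E + S <-> E:S -> E + P, built by reusing the bind macro."""
    rules = bind(enzyme, substrate, kf, kr)
    rules.append({"name": f"cat_{substrate}_to_{product}",
                  "reactants": [f"{enzyme}:{substrate}"],
                  "products": [enzyme, product], "rate": kcat})
    return rules

# Because the model is a program, composition is ordinary function calls:
model = catalyze("C8", "Bid", "tBid", 1e-6, 1e-3, 1.0)
for rule in model:
    print(rule["name"], rule["reactants"], "->", rule["products"])
```

    The payoff described in the abstract follows from exactly this structure: models built from function calls can be versioned, diffed, tested, and composed like any other source code.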

  1. The Monarch Initiative: an integrative data and analytic platform connecting phenotypes to genotypes across species

    PubMed Central

    Mungall, Christopher J.; McMurry, Julie A.; Köhler, Sebastian; Balhoff, James P.; Borromeo, Charles; Brush, Matthew; Carbon, Seth; Conlin, Tom; Dunn, Nathan; Engelstad, Mark; Foster, Erin; Gourdine, J.P.; Jacobsen, Julius O.B.; Keith, Dan; Laraway, Bryan; Lewis, Suzanna E.; NguyenXuan, Jeremy; Shefchek, Kent; Vasilevsky, Nicole; Yuan, Zhou; Washington, Nicole; Hochheiser, Harry; Groza, Tudor; Smedley, Damian; Robinson, Peter N.; Haendel, Melissa A.

    2017-01-01

    The correlation of phenotypic outcomes with genetic variation and environmental factors is a core pursuit in biology and biomedicine. Numerous challenges impede our progress: patient phenotypes may not match known diseases, candidate variants may be in genes that have not been characterized, model organisms may not recapitulate human or veterinary diseases, filling evolutionary gaps is difficult, and many resources must be queried to find potentially significant genotype–phenotype associations. Non-human organisms have proven instrumental in revealing biological mechanisms. Advanced informatics tools can identify phenotypically relevant disease models in research and diagnostic contexts. Large-scale integration of model organism and clinical research data can provide a breadth of knowledge not available from individual sources and can provide contextualization of data back to these sources. The Monarch Initiative (monarchinitiative.org) is a collaborative, open science effort that aims to semantically integrate genotype–phenotype data from many species and sources in order to support precision medicine, disease modeling, and mechanistic exploration. Our integrated knowledge graph, analytic tools, and web services enable diverse users to explore relationships between phenotypes and genotypes across species. PMID:27899636

  2. The Monarch Initiative: an integrative data and analytic platform connecting phenotypes to genotypes across species

    DOE PAGES

    Mungall, Christopher J.; McMurry, Julie A.; Köhler, Sebastian; ...

    2016-11-29

    The correlation of phenotypic outcomes with genetic variation and environmental factors is a core pursuit in biology and biomedicine. Numerous challenges impede our progress: patient phenotypes may not match known diseases, candidate variants may be in genes that have not been characterized, model organisms may not recapitulate human or veterinary diseases, filling evolutionary gaps is difficult, and many resources must be queried to find potentially significant genotype-phenotype associations. Nonhuman organisms have proven instrumental in revealing biological mechanisms. Advanced informatics tools can identify phenotypically relevant disease models in research and diagnostic contexts. Large-scale integration of model organism and clinical research data can provide a breadth of knowledge not available from individual sources and can provide contextualization of data back to these sources. The Monarch Initiative (monarchinitiative.org) is a collaborative, open science effort that aims to semantically integrate genotype-phenotype data from many species and sources in order to support precision medicine, disease modeling, and mechanistic exploration. Our integrated knowledge graph, analytic tools, and web services enable diverse users to explore relationships between phenotypes and genotypes across species.

  3. Programming biological models in Python using PySB

    PubMed Central

    Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K

    2013-01-01

    Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa, and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis. PMID:23423320

  4. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Marshall Clint; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flowfields/plumes. The Optical Plume Anomaly Detector (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical packages designed to utilize information collected by OPAD is known as the Engine Diagnostic Filtering System (EDiFiS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Capabilities for real-time processing are being advanced on several fronts, including an effort to hardware-encode components of the EDiFiS for health monitoring and management. This paper addresses OPAD with its tool suites and discusses what is considered a natural progression: a concept for taking OPAD to the next logical level of high-energy physics, incorporating fermion and boson particle analyses in the measurement of neutron flux.

  5. MRI of human hair.

    PubMed

    Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael

    2009-06-01

    Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
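    A T2* value of this kind is conventionally obtained by fitting a mono-exponential decay S(TE) = S0·exp(-TE/T2*) to the multi-echo signal. The sketch below shows a standard log-linear least-squares fit on synthetic, noise-free data built around the reported cortex value (the echo times and the fitting route are illustrative assumptions, not the authors' exact procedure):

```python
import math

def fit_t2star(echo_times_ms, signals):
    """Log-linear least-squares fit of S(TE) = S0 * exp(-TE / T2*).
    Standard mono-exponential decay model; a sketch, not the authors'
    exact fitting procedure."""
    y = [math.log(s) for s in signals]
    n = len(echo_times_ms)
    mx = sum(echo_times_ms) / n
    my = sum(y) / n
    slope = (sum((x - mx) * (yi - my) for x, yi in zip(echo_times_ms, y))
             / sum((x - mx) ** 2 for x in echo_times_ms))
    return -1.0 / slope  # T2* in ms

# Synthetic scan series mimicking the reported cortex value of 2.6 ms
tes = [0.5, 1.0, 1.5, 2.0, 3.0]
sig = [100.0 * math.exp(-te / 2.6) for te in tes]
print(round(fit_t2star(tes, sig), 2))  # → 2.6 on this noise-free data
```

    With real, noisy magnitude data a nonlinear least-squares fit with a noise floor term would be the more robust choice; the log-linear form is shown here for transparency.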

  6. Spliceosome Profiling Visualizes Operations of a Dynamic RNP at Nucleotide Resolution.

    PubMed

    Burke, Jordan E; Longhurst, Adam D; Merkurjev, Daria; Sales-Lee, Jade; Rao, Beiduo; Moresco, James J; Yates, John R; Li, Jingyi Jessica; Madhani, Hiten D

    2018-05-03

    Tools to understand how the spliceosome functions in vivo have lagged behind advances in the structural biology of the spliceosome. Here, methods are described to globally profile spliceosome-bound pre-mRNA, intermediates, and spliced mRNA at nucleotide resolution. These tools are applied to three yeast species that span 600 million years of evolution. The sensitivity of the approach enables the detection of canonical and non-canonical events, including interrupted, recursive, and nested splicing. This application of statistical modeling uncovers independent roles for the size and position of the intron and the number of introns per transcript in substrate progression through the two catalytic stages. These include species-specific inputs suggestive of spliceosome-transcriptome coevolution. Further investigations reveal the ATP-dependent discard of numerous endogenous substrates after spliceosome assembly in vivo and connect this discard to intron retention, a form of splicing regulation. Spliceosome profiling is a quantitative, generalizable global technology used to investigate an RNP central to eukaryotic gene expression.

  7. Three-dimensional anthropometric techniques applied to the fabrication of burn masks and the quantification of wound healing

    NASA Astrophysics Data System (ADS)

    Whitestone, Jennifer J.; Geisen, Glen R.; McQuiston, Barbara K.

    1997-03-01

    Anthropometric surveys conducted by the military provide comprehensive human body measurement data that are human interface requirements for successful mission performance of weapon systems, including cockpits, protective equipment, and clothing. The application of human body dimensions to model humans and human-machine performance begins with engineering anthropometry. There are two critical elements to engineering anthropometry: data acquisition and data analysis. First, the human body is captured dimensionally with either traditional anthropometric tools, such as calipers and tape measures, or with advanced image acquisition systems, such as a laser scanner. Next, numerous statistical analysis tools, such as multivariate modeling and feature envelopes, are used to effectively transition these data for design and evaluation of equipment and work environments. Recently, Air Force technology transfer allowed researchers at the Computerized Anthropometric Research and Design (CARD) Laboratory at Wright-Patterson Air Force Base to work with the Dayton, Ohio area medical community in assessing the rate of wound healing and improving the fit of total contact burn masks. This paper describes the successful application of CARD Lab engineering anthropometry to two medically oriented human interface problems.

  8. Development of a Robust and Efficient Parallel Solver for Unsteady Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    West, Jeff; Wright, Jeffrey; Thakur, Siddharth; Luke, Ed; Grinstead, Nathan

    2012-01-01

    The traditional design and analysis practice for advanced propulsion systems relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. In the design of advanced propulsion systems, CFD plays a major role in defining the required performance over the entire flight regime, as well as in testing the sensitivity of the design to the different modes of operation. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment. The turbomachinery simulation capability presented here is being developed in a computational tool called Loci-STREAM [1]. It integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci [2] which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective is to be able to routinely simulate problems involving complex geometries requiring large unstructured grids and complex multidisciplinary physics. An immediate application of interest is simulation of unsteady flows in rocket turbopumps, particularly in cryogenic liquid rocket engines. 
The key components of the overall methodology presented in this paper are the following: (a) high fidelity unsteady simulation capability based on Detached Eddy Simulation (DES) in conjunction with second-order temporal discretization, (b) compliance with Geometric Conservation Law (GCL) in order to maintain conservative property on moving meshes for second-order time-stepping scheme, (c) a novel cloud-of-points interpolation method (based on a fast parallel kd-tree search algorithm) for interfaces between turbomachinery components in relative motion which is demonstrated to be highly scalable, and (d) demonstrated accuracy and parallel scalability on large grids (approx 250 million cells) in full turbomachinery geometries.

  9. How can we probe the atom mass currents induced by synthetic gauge fields?

    NASA Astrophysics Data System (ADS)

    Paramekanti, Arun; Killi, Matthew; Trotzky, Stefan

    2013-05-01

    Ultracold atomic fermions and bosons in an optical lattice can have quantum ground states which support equilibrium currents in the presence of synthetic magnetic fields or spin orbit coupling. As a tool to uncover these mass currents, we propose using an anisotropic quantum quench of the optical lattice which dynamically converts the current patterns into measurable density patterns. Using analytical calculations and numerical simulations, we show that this scheme can probe diverse equilibrium bulk current patterns in Bose superfluids and Fermi fluids induced by synthetic magnetic fields, as well as detect the chiral edge currents in topological states of atomic matter such as quantum Hall and quantum spin Hall insulators. This work is supported by NSERC of Canada and the Canadian Institute for Advanced Research.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santhanagopalan, Shriram; White, Ralph E.

    Rotating ring-disc electrode (RRDE) experiments are a classic tool for investigating the kinetics of electrochemical reactions. Several standardized methods exist for extracting transport parameters and reaction rate constants from RRDE measurements. In this work, we compare some approximate solutions to the convective diffusion problem popularly used in the literature against a rigorous numerical solution of the Nernst-Planck equations coupled to the three-dimensional flow problem. In light of these computational advancements, we explore design aspects of the RRDE that will help improve the sensitivity of our parameter estimation procedure to experimental data. We use oxygen reduction in acidic media, involving three charge transfer reactions and a chemical reaction, as an example, and identify ways to isolate reaction currents for the individual processes in order to accurately estimate the exchange current densities.
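    The best known of the approximate convective-diffusion solutions referenced above is the Levich equation for the diffusion-limited current at a rotating disc. A quick sketch (the electrode area, diffusivity, rotation rate, and concentration below are illustrative assumptions, not values from the paper):

```python
import math

def levich_current(n, area_cm2, D_cm2_s, omega_rad_s, nu_cm2_s, c_mol_cm3):
    """Levich equation for the diffusion-limited current at a rotating disc
    electrode: i_L = 0.620 n F A D^(2/3) w^(1/2) v^(-1/6) C, with the
    rotation rate w in rad/s and CGS-style units elsewhere."""
    F = 96485.0  # Faraday constant, C/mol
    return (0.620 * n * F * area_cm2 * D_cm2_s ** (2 / 3)
            * math.sqrt(omega_rad_s) * nu_cm2_s ** (-1 / 6) * c_mol_cm3)

# Illustrative numbers (assumed, not from the paper): 4-electron O2 reduction,
# 0.196 cm2 disc, D = 1.9e-5 cm2/s, 1600 rpm, nu = 0.01 cm2/s, C = 1.2e-6 mol/cm3
omega = 1600 * 2 * math.pi / 60  # rpm -> rad/s
i_lim = levich_current(4, 0.196, 1.9e-5, omega, 0.01, 1.2e-6)
print(f"limiting current ≈ {i_lim * 1e3:.2f} mA")
```

    The paper's point is that such closed-form approximations can be benchmarked against, and where necessary replaced by, the full numerical Nernst-Planck/flow solution.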

  11. Dengue: Knowledge gaps, unmet needs and research priorities

    PubMed Central

    Katzelnick, Leah C.; Coloma, Josefina; Harris, Eva

    2018-01-01

    Dengue virus (DENV) is a mosquito-borne pathogen that causes up to ~100 million dengue cases each year, placing a major public health, social and economic burden on numerous low- and middle-income countries (LMICs). Major advances by scientists, vaccine developers, and affected communities are revealing new insights and enabling novel interventions and approaches to dengue prevention and control. Such research has highlighted further questions about both the basic understanding of dengue and efforts to develop new tools. We discuss existing approaches to dengue diagnostics, disease prognosis, surveillance, and vector control in LMICs as well as potential consequences of vaccine introduction. We also summarize current knowledge and recent insights into dengue epidemiology, immunology, and pathogenesis, and their implications for understanding natural infection and current and future vaccines. PMID:28185868

  12. Microbial ecology of the skin in the era of metagenomics and molecular microbiology.

    PubMed

    Hannigan, Geoffrey D; Grice, Elizabeth A

    2013-12-01

    The skin is the primary physical barrier between the body and the external environment and is also a substrate for the colonization of numerous microbes. Previously, dermatological microbiology research was dominated by culture-based techniques, but significant advances in genomic technologies have enabled the development of less-biased, culture-independent approaches to characterize skin microbial communities. These molecular microbiology approaches illustrate the great diversity of microbiota colonizing the skin and highlight unique features such as site specificity, temporal dynamics, and interpersonal variation. Disruptions in skin commensal microbiota are associated with the progression of many dermatological diseases. A greater understanding of how skin microbes interact with each other and with their host, and how we can therapeutically manipulate those interactions, will provide powerful tools for treating and preventing dermatological disease.

  13. Printing Technologies for Medical Applications.

    PubMed

    Shafiee, Ashkan; Atala, Anthony

    2016-03-01

    Over the past 15 years, printers have been increasingly utilized for biomedical applications in various areas of medicine and tissue engineering. This review discusses the current and future applications of 3D bioprinting. Several 3D printing tools with broad applications, from surgical planning to 3D models such as liver replicas and intermediate splints, are being created. Numerous researchers are exploring this technique to pattern cells or fabricate several different tissues and organs, such as blood vessels or cardiac patches. Current investigations in bioprinting applications are yielding further advances. As one of the fastest-growing areas of manufacturing, 3D additive manufacturing will change techniques across biomedical applications, from research and testing models to surgical planning, device manufacturing, and tissue or organ replacement.

  14. Computerized Design of Low-noise Face-milled Spiral Bevel Gears

    NASA Technical Reports Server (NTRS)

    Litvin, Faydor L.; Zhang, YI; Handschuh, Robert F.

    1994-01-01

    An advanced design methodology is proposed for the face-milled spiral bevel gears with modified tooth surface geometry that provides a reduced level of noise and has a stabilized bearing contact. The approach is based on the local synthesis of the gear drive that provides the 'best' machine-tool settings. The theoretical aspects of the local synthesis approach are based on the application of a predesigned parabolic function for absorption of undesirable transmission errors caused by misalignment and the direct relations between principal curvatures and directions for mating surfaces. The meshing and contact of the gear drive is synthesized and analyzed by a computer program. The generation of gears with the proposed geometry design can be accomplished by application of existing equipment. A numerical example that illustrates the proposed theory is presented.
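    The role of the predesigned parabolic function can be checked numerically: adding a linear transmission error (the typical effect of misalignment) to a parabola yields another parabola with the same second derivative, so the predesigned function "absorbs" the linear error and the level of transmission errors is unchanged. A small sketch with illustrative coefficients (not values from the paper):

```python
a = 0.5   # magnitude of the predesigned parabolic function of transmission errors
b = 0.3   # slope of a linear error caused by misalignment (illustrative)

def total_error(phi):
    # predesigned parabola plus misalignment-induced linear error
    return -a * phi ** 2 + b * phi

# The central second difference estimates the second derivative; it stays
# -2a at every point, so the sum is again a parabola with the same
# coefficient: the linear error is absorbed, only the vertex shifts.
h = 1e-3
seconds = []
for phi in (-1.0, 0.0, 1.0):
    d2 = (total_error(phi + h) - 2 * total_error(phi) + total_error(phi - h)) / h ** 2
    seconds.append(d2)
print([round(s, 6) for s in seconds])  # each ≈ -1.0 = -2a
```

    Algebraically, -a·phi² + b·phi = -a·(phi - b/2a)² + b²/4a, which makes the same point in closed form.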

  15. Nature Neuroscience Review

    PubMed Central

    Maze, Ian; Shen, Li; Zhang, Bin; Garcia, Benjamin A.; Shao, Ningyi; Mitchell, Amanda; Sun, HaoSheng; Akbarian, Schahram; Allis, C. David; Nestler, Eric J.

    2014-01-01

    Over the past decade, rapid advances in epigenomics research have extensively characterized critical roles for chromatin regulatory events during normal periods of eukaryotic cell development and plasticity, as well as part of aberrant processes implicated in human disease. Application of such approaches to studies of the central nervous system (CNS), however, is more recent. Here, we provide a comprehensive overview of currently available tools to analyze neuroepigenomics data, as well as a discussion of pending challenges specific to the field of neuroscience. Integration of numerous unbiased genome-wide and proteomic approaches will be necessary to fully understand the neuroepigenome and the extraordinarily complex nature of the human brain. This will be critical to the development of future diagnostic and therapeutic strategies aimed at alleviating the vast array of heterogeneous and genetically distinct disorders of the CNS. PMID:25349914

  16. Computerized design of low-noise face-milled spiral bevel gears

    NASA Astrophysics Data System (ADS)

    Litvin, Faydor L.; Zhang, Yi; Handschuh, Robert F.

    1994-08-01

    An advanced design methodology is proposed for the face-milled spiral bevel gears with modified tooth surface geometry that provides a reduced level of noise and has a stabilized bearing contact. The approach is based on the local synthesis of the gear drive that provides the 'best' machine-tool settings. The theoretical aspects of the local synthesis approach are based on the application of a predesigned parabolic function for absorption of undesirable transmission errors caused by misalignment and the direct relations between principal curvatures and directions for mating surfaces. The meshing and contact of the gear drive is synthesized and analyzed by a computer program. The generation of gears with the proposed geometry design can be accomplished by application of existing equipment. A numerical example that illustrates the proposed theory is presented.

  17. Airborne Trailblazer: Two decades with NASA Langley's 737 flying laboratory

    NASA Technical Reports Server (NTRS)

    Wallace, Lane E.

    1994-01-01

    This book is the story of a unique aircraft and the contributions it has made to the air transportation industry. NASA's Boeing 737-100 Transport Systems Research Vehicle started life as the prototype for Boeing's 737 series of aircraft. The airplane was acquired by LaRC in 1974 to conduct research into advanced transport aircraft technologies. In the twenty years that followed, the airplane participated in more than twenty different research projects, evolving from a research tool for a specific NASA program into a national airborne research facility. It played a critical role in developing and gaining acceptance for numerous significant transport technologies including 'glass cockpits,' airborne windshear detection systems, data links for air traffic control communications, the microwave landing system, and the satellite-based global positioning system (GPS).

  18. A design tool for direct and non-stochastic calculations of near-field radiative transfer in complex structures: The NF-RT-FDTD algorithm

    NASA Astrophysics Data System (ADS)

    Didari, Azadeh; Pinar Mengüç, M.

    2017-08-01

    Advances in nanotechnology and nanophotonics are inextricably linked with the need for reliable computational algorithms that can be adapted as design tools for the development of new concepts in energy harvesting, radiative cooling, nanolithography and nano-scale manufacturing, among others. In this paper, we provide an outline for such a computational tool, named NF-RT-FDTD, to determine the near-field radiative transfer between structured surfaces using the Finite Difference Time Domain (FDTD) method. NF-RT-FDTD is a direct and non-stochastic algorithm that accounts for the statistical nature of thermal radiation and is easily applicable to any arbitrary geometry at thermal equilibrium. We present a review of the fundamental relations for far- and near-field radiative transfer between different geometries with nano-scale surface and volumetric features and gaps, and then discuss the details of the NF-RT-FDTD formulation, its application to sample geometries, and its future expansion to more complex geometries. In addition, we briefly discuss some recent numerical works for direct and indirect calculations of near-field thermal radiation transfer, including the Scattering Matrix method, FDTD, Wiener Chaos Expansion, Fluctuating Surface Current (FSC), Fluctuating Volume Current (FVC) and Thermal Discrete Dipole Approximation (TDDA) methods.
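    The core of any FDTD-based tool is the leapfrog Yee update; NF-RT-FDTD builds on it by adding thermally fluctuating current sources, which are omitted in this minimal one-dimensional sketch in normalized units.

```python
import numpy as np

# Minimal 1D Yee-grid FDTD update in vacuum (normalized units).  The
# fluctuational thermal sources used in NF-RT-FDTD are not included;
# a deterministic Gaussian pulse stands in as the excitation.
n, steps = 200, 300
S = 0.5                      # Courant number c*dt/dx (stable for S <= 1)
ez = np.zeros(n)             # electric field on integer grid points
hy = np.zeros(n - 1)         # magnetic field on staggered half cells

for t in range(steps):
    hy += S * np.diff(ez)                        # Faraday's law update
    ez[1:-1] += S * np.diff(hy)                  # Ampere's law update
    ez[50] += np.exp(-((t - 30) / 10.0) ** 2)    # soft Gaussian source

print(np.abs(ez).max())
```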

  19. Advances in the quantification of mitochondrial function in primary human immune cells through extracellular flux analysis.

    PubMed

    Nicholas, Dequina; Proctor, Elizabeth A; Raval, Forum M; Ip, Blanche C; Habib, Chloe; Ritou, Eleni; Grammatopoulos, Tom N; Steenkamp, Devin; Dooms, Hans; Apovian, Caroline M; Lauffenburger, Douglas A; Nikolajczyk, Barbara S

    2017-01-01

    Numerous studies show that mitochondrial energy generation determines the effectiveness of immune responses. Furthermore, changes in mitochondrial function may regulate lymphocyte function in inflammatory diseases like type 2 diabetes. Analysis of lymphocyte mitochondrial function has been facilitated by the introduction of 96-well format extracellular flux (XF96) analyzers, but the technology remains imperfect for analysis of human lymphocytes. Limitations of XF technology include the lack of practical protocols for the analysis of archived human cells, and inadequate data analysis tools that require manual quality checks. Current analysis tools for XF outcomes are also unable to automatically assess data quality and delete untenable data from the relatively high number of biological replicates needed to power complex human cell studies. The objectives of the work presented herein are to test the impact of common cellular manipulations on XF outcomes, and to develop and validate a new automated tool that objectively analyzes a virtually unlimited number of samples to quantitate mitochondrial function in immune cells. We present significant improvements on previous XF analyses of primary human cells that will be essential for testing the prediction that changes in immune cell mitochondrial function and fuel sources support immune dysfunction in chronic inflammatory diseases like type 2 diabetes.
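    The automated quality-check step described above can be illustrated with a robust outlier filter over replicate wells. The median-absolute-deviation rule and the threshold below are a common generic choice, not the specific algorithm of the authors' tool, and the OCR values are made up.

```python
import numpy as np

def flag_outlier_wells(ocr, threshold=3.5):
    """Flag replicate wells whose basal oxygen consumption rate (OCR)
    deviates from the plate median by more than `threshold` robust
    z-scores (median absolute deviation).  Returns a boolean mask of
    wells to exclude from downstream analysis."""
    ocr = np.asarray(ocr, dtype=float)
    med = np.median(ocr)
    mad = np.median(np.abs(ocr - med))
    robust_z = 0.6745 * (ocr - med) / mad
    return np.abs(robust_z) > threshold

# Eight replicate wells; well 5 failed (e.g. cells detached).
basal_ocr = [102.0, 98.5, 105.1, 99.9, 101.2, 12.3, 103.4, 100.8]
print(flag_outlier_wells(basal_ocr))
```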

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Santillan, Alfredo; Hernandez-Cervantes, Liliana; Gonzalez-Ponce, Alejandro; Kim, Jongsoo

    The numerical simulations associated with the interaction of High Velocity Clouds (HVC) with the magnetized galactic interstellar medium (ISM) are a powerful tool to describe the evolution of these objects in our Galaxy. In this work we present a new project, referred to as Theoretical Virtual Observatories, oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool consisting of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this website the user can either make use of the existing numerical simulations from the database or run a new simulation by introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), following the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.
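    The "reuse or run" logic of such a simulation service can be sketched as a cache keyed by the initial conditions: a request either hits the database of completed runs or triggers a new one. The schema, parameter names, and `run_zeus3d` stand-in below are hypothetical, not the actual TVO interface.

```python
# Hypothetical sketch: results keyed by (T_ism, n_ism, v_hvc, B_ism);
# a repeated request is served from the database instead of re-running.
database = {}

def run_zeus3d(params):
    # Placeholder for launching an MHD run; returns a fake record here.
    return {"status": "completed", "params": params}

def get_simulation(T_ism, n_ism, v_hvc, B_ism):
    key = (T_ism, n_ism, v_hvc, B_ism)
    if key not in database:            # cache miss: run a new simulation
        database[key] = run_zeus3d(key)
    return database[key]

first = get_simulation(1e4, 0.3, 100.0, 3e-6)   # triggers a new run
again = get_simulation(1e4, 0.3, 100.0, 3e-6)   # served from the database
print(first is again)
```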

  2. A review of laboratory and numerical modelling in volcanology

    NASA Astrophysics Data System (ADS)

    Kavanagh, Janine L.; Engwell, Samantha L.; Martin, Simon A.

    2018-04-01

    Modelling has been used in the study of volcanic systems for more than 100 years, building upon the approach first applied by Sir James Hall in 1815. Informed by observations of volcanological phenomena in nature, including eye-witness accounts of eruptions, geophysical or geodetic monitoring of active volcanoes, and geological analysis of ancient deposits, laboratory and numerical models have been used to describe and quantify volcanic and magmatic processes that span orders of magnitudes of time and space. We review the use of laboratory and numerical modelling in volcanological research, focussing on sub-surface and eruptive processes including the accretion and evolution of magma chambers, the propagation of sheet intrusions, the development of volcanic flows (lava flows, pyroclastic density currents, and lahars), volcanic plume formation, and ash dispersal. When first introduced into volcanology, laboratory experiments and numerical simulations marked a transition in approach from broadly qualitative to increasingly quantitative research. These methods are now widely used in volcanology to describe the physical and chemical behaviours that govern volcanic and magmatic systems. Creating simplified models of highly dynamical systems enables volcanologists to simulate and potentially predict the nature and impact of future eruptions. These tools have provided significant insights into many aspects of the volcanic plumbing system and eruptive processes. The largest scientific advances in volcanology have come from a multidisciplinary approach, applying developments in diverse fields such as engineering and computer science to study magmatic and volcanic phenomena. A global effort in the integration of laboratory and numerical volcano modelling is now required to tackle key problems in volcanology and points towards the importance of benchmarking exercises and the need for protocols to be developed so that models are routinely tested against real world data.

  3. Use of advanced analysis tools to support freeway corridor freight planning.

    DOT National Transportation Integrated Search

    2010-07-22

    Advanced corridor freight management and pricing strategies are increasingly being chosen to : address freight mobility challenges. As a result, evaluation tools are needed to assess the benefits : of these strategies as compared to other alternative...

  4. SmartWay Truck Tool-Advanced Class: Getting the Most out of Your SmartWay Participation

    EPA Pesticide Factsheets

    This EPA presentation provides information on the Advanced SmartWay Truck Tool: its background, development, participation, data collection, usage, fleet categories, emission metrics, ranking system, performance data, reports, and schedule for 2017.

  5. Multiscale and Multiphysics Modeling of Additive Manufacturing of Advanced Materials

    NASA Technical Reports Server (NTRS)

    Liou, Frank; Newkirk, Joseph; Fan, Zhiqiang; Sparks, Todd; Chen, Xueyang; Fletcher, Kenneth; Zhang, Jingwei; Zhang, Yunlu; Kumar, Kannan Suresh; Karnati, Sreekar

    2015-01-01

    The objective of this proposed project is to research and develop a prediction tool for advanced additive manufacturing (AAM) processes for advanced materials, and to develop experimental methods that provide fundamental properties and establish validation data. Aircraft structures and engines demand materials that are stronger, usable at much higher temperatures, provide less acoustic transmission, and enable more aeroelastic tailoring than those currently used. Significant improvements in properties can only be achieved by processing the materials under nonequilibrium conditions, such as AAM processes. AAM processes encompass a class of processes that use a focused heat source to create a melt pool on a substrate; examples include Electron Beam Freeform Fabrication and Direct Metal Deposition. These additive processes enable fabrication of parts directly from CAD drawings. To achieve the desired material properties and geometries of the final structure, it is necessary to assess the impact of process parameters and to predict optimized conditions with numerical modeling as an effective prediction tool. The targets for the processing are multiple and at different spatial scales, and the associated physical phenomena are multiscale and multiphysics in nature. In this project, AAM processes were therefore modeled with a multiscale, multiphysics approach. A macroscale model was developed to investigate the residual stresses and distortion in AAM processes. A sequentially coupled, thermomechanical, finite element model was developed and validated experimentally. The results showed the temperature distribution, residual stress, and deformation within the formed deposits and substrates. A mesoscale model was developed to include heat transfer, phase change with a mushy zone, incompressible free surface flow, solute redistribution, and surface tension. Because of the excessive computing time needed, a parallel computing approach was also tested.
    In addition, after investigating various methods, a Smoothed Particle Hydrodynamics (SPH) model was developed to model the wire feeding process. Its computational efficiency and simple architecture make it more robust and flexible than other models, although more research on material properties may be needed to model AAM processes realistically. A microscale model was developed to investigate heterogeneous nucleation, dendritic grain growth, epitaxial growth of columnar grains, the columnar-to-equiaxed transition, grain transport in the melt, and other phenomena. The orientations of the columnar grains were almost perpendicular to the direction of laser motion. Compared to similar studies in the literature, the multiple-grain morphology modeling results are of the same order of magnitude as the optical morphologies observed in experiments. Experimental work was conducted to validate the different models. An infrared camera was incorporated as a process monitoring and validation tool to identify the solidus and mushy zones during deposition; the images were successfully processed to identify these regions. This research project has investigated the multiscale and multiphysics nature of the complex AAM processes, leading to an advanced understanding of these processes. The project has also developed several modeling tools and experimental validation tools that will be critical to the future qualification and certification of AAM processes.
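    The thermal side of such a macroscale deposition model can be sketched with an explicit finite-difference scheme: a heat source traverses a bar, as a focused beam does a substrate. The material properties, source strength, and traverse speed below are illustrative placeholders, not calibrated to any AAM process.

```python
import numpy as np

# 1D explicit heat conduction with a moving heat source (illustrative).
nx, nt = 101, 400
dx, dt = 1e-3, 1e-4          # grid spacing (m) and time step (s)
alpha = 1e-5                 # thermal diffusivity (m^2/s), assumed
r = alpha * dt / dx**2       # stability parameter, must stay below 0.5
speed = 0.5                  # traverse speed of the heat source (m/s)

T = np.full(nx, 300.0)       # initial temperature field (K)
for step in range(nt):
    src = int(speed * step * dt / dx)                  # cell under source
    T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])    # explicit diffusion
    if src < nx:
        T[src] += 50.0       # heat deposited per step (illustrative)

print(T.max())
```

With these values r = 0.001, well inside the explicit stability limit, and a hot trail forms behind the source as it traverses the first cells.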

  6. Zephyr: Open-source Parallel Seismic Waveform Inversion in an Integrated Python-based Framework

    NASA Astrophysics Data System (ADS)

    Smithyman, B. R.; Pratt, R. G.; Hadden, S. M.

    2015-12-01

    Seismic Full-Waveform Inversion (FWI) is an advanced method to reconstruct wave properties of materials in the Earth from a series of seismic measurements. These methods have been developed by researchers since the late 1980s and now see significant interest from the seismic exploration industry. As researchers move towards implementing advanced numerical modelling (e.g., 3D, multi-component, anisotropic and visco-elastic physics), it is desirable to take a modular approach, minimizing the effort of developing a new set of tools for each new numerical problem. SimPEG (http://simpeg.xyz) is an open source project aimed at constructing a general framework to enable geophysical inversion in various domains. In this abstract we describe Zephyr (https://github.com/bsmithyman/zephyr), a coupled research project focused on parallel FWI in the seismic context. The software is built on top of Python, Numpy and IPython, which enables very flexible testing and implementation of new features. Zephyr is an open source project, and is released freely to enable reproducible research. We currently implement a parallel, distributed seismic forward modelling approach that solves the 2.5D (two-and-one-half dimensional) viscoacoustic Helmholtz equation at a range of modelling frequencies, generating forward solutions for a given source behaviour and gradient solutions for a given set of observed data. Solutions are computed in a distributed manner on a set of heterogeneous workers. The researcher's frontend computer may be separated from the worker cluster by a network link, enabling full support for computation on remote clusters from individual workstations or laptops. The present codebase introduces a numerical discretization equivalent to that used by FULLWV, a well-known seismic FWI research codebase. This makes it straightforward to compare results from Zephyr directly with FULLWV.
The flexibility introduced by the use of a Python programming environment makes extension of the codebase with new methods much more straightforward. This enables comparison and integration of new efforts with existing results.
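    The frequency-domain structure behind such a solver — one linear system per modelling frequency — can be shown on a toy problem. This is a 1D homogeneous Helmholtz equation with Dirichlet boundaries assembled as a dense system, not Zephyr's 2.5D viscoacoustic formulation or its sparse solvers.

```python
import numpy as np

# Toy 1D Helmholtz solve: d2u/dx2 + k^2 u = -f on [0, 1], u(0)=u(1)=0,
# with a point source.  One matrix solve corresponds to one frequency.
n = 201
dx = 1.0 / (n - 1)
k = 30.0                                  # wavenumber for this frequency
A = np.zeros((n, n), dtype=complex)
f = np.zeros(n, dtype=complex)
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = 1.0 / dx**2      # second-difference stencil
    A[i, i] = -2.0 / dx**2 + k**2
A[0, 0] = A[-1, -1] = 1.0                 # Dirichlet boundary rows
f[n // 2] = 1.0 / dx                      # point source at the centre
u = np.linalg.solve(A, -f)
print(np.abs(u).max())
```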

  7. Pseudo-shock waves and their interactions in high-speed intakes

    NASA Astrophysics Data System (ADS)

    Gnani, F.; Zare-Behtash, H.; Kontis, K.

    2016-04-01

    In an air-breathing engine the flow deceleration from supersonic to subsonic conditions takes place inside the isolator through a gradual compression consisting of a series of shock waves. The wave system, referred to as a pseudo-shock wave or shock train, establishes the combustion chamber entrance conditions and therefore influences the performance of the entire propulsion system. The characteristics of the pseudo-shock depend on a number of variables, which makes this flow phenomenon particularly challenging to analyse. Difficulties in experimentally obtaining accurate flow quantities at high speeds, and discrepancies between numerical approaches and measured data, have been widely reported. Understanding the flow physics in the presence of the interaction of numerous shock waves with the boundary layer in internal flows is essential to developing prediction methods and control strategies. To counteract the negative effects of shock wave/boundary layer interactions, which are responsible for the engine unstart process, multiple flow control methodologies have been proposed. Improved analytical models, advanced experimental methodologies and numerical simulations have allowed a more in-depth analysis of the flow physics. The present paper aims to bring together the main results on the shock train structure and its associated phenomena inside isolators, studied using the aforementioned tools. Several promising flow control techniques that have more recently been applied to manipulate the shock wave/boundary layer interaction are also examined in this review.

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational speciality areas within the U.S. machine tool and metals-related…

  13. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  14. A review and evaluation of numerical tools for fractional calculus and fractional order controls

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Lu; Dehghan, Sina; Chen, YangQuan; Xue, Dingyü

    2017-06-01

    In recent years, as fractional calculus becomes more and more broadly used in research across different academic disciplines, there are increasing demands for numerical tools for the computation of fractional integration/differentiation and for the simulation of fractional order systems. Asked from time to time which tool is suitable for a specific application, the authors decided to carry out this survey and present recapitulative information on the tools available in the literature, in the hope of benefiting researchers with different academic backgrounds. With this motivation, the present article collects the scattered tools into a dashboard view, briefly introduces their usage and algorithms, evaluates their accuracy, compares their performance, and provides informative comments for selection.
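    The computation such tools perform can be illustrated with the simplest scheme they typically implement, the Grünwald-Letnikov approximation of a fractional derivative; the recursive-weight form below is the standard textbook construction, not taken from any particular surveyed package.

```python
import math

def gl_fractional_derivative(f, alpha, t, h=1e-3):
    # Grunwald-Letnikov estimate of the order-alpha derivative of f at t,
    # step h.  The weights w_k = (-1)^k * C(alpha, k) follow the
    # recursion w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k).
    n = int(round(t / h))
    total, w = 0.0, 1.0
    for k in range(n + 1):
        total += w * f(t - k * h)
        w *= 1.0 - (alpha + 1.0) / (k + 1)
    return total / h ** alpha

# Half-derivative of f(t) = t at t = 1; the exact value is 2*sqrt(t/pi).
approx = gl_fractional_derivative(lambda t: t, 0.5, 1.0)
print(approx, 2.0 / math.sqrt(math.pi))
```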

  15. Numerical Hydrodynamics in General Relativity.

    PubMed

    Font, José A

    2003-01-01

    The current status of numerical solutions for the equations of ideal general relativistic hydrodynamics is reviewed. With respect to an earlier version of the article, the present update provides additional information on numerical schemes, and extends the discussion of astrophysical simulations in general relativistic hydrodynamics. Different formulations of the equations are presented, with special mention of conservative and hyperbolic formulations well-adapted to advanced numerical methods. A large sample of available numerical schemes is discussed, paying particular attention to solution procedures based on schemes exploiting the characteristic structure of the equations through linearized Riemann solvers. A comprehensive summary of astrophysical simulations in strong gravitational fields is presented. These include gravitational collapse, accretion onto black holes, and hydrodynamical evolutions of neutron stars. The material contained in these sections highlights the numerical challenges of various representative simulations. It also follows, to some extent, the chronological development of the field, concerning advances on the formulation of the gravitational field and hydrodynamic equations and the numerical methodology designed to solve them. Supplementary material is available for this article at 10.12942/lrr-2003-4.

  16. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    ERIC Educational Resources Information Center

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  17. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollingsworth, Jeff

    2014-07-31

    The purpose of this project was to develop tools and techniques to improve the ability of computational scientists to investigate and correct problems (bugs) in their programs. Specifically, the University of Maryland component of this project focused on the problems associated with the finite number of bits available in a computer to represent numeric values. In large scale scientific computation, numbers are frequently added to and multiplied with each other billions of times, so even small errors due to the representation of numbers can accumulate into big errors. However, using too many bits to represent a number results in additional computation, memory, and energy costs. Thus it is critical to find the right size for numbers. This project focused on several aspects of this general problem. First, we developed a tool to look for cancellations: the catastrophic loss of precision that occurs when two numbers whose actual values are close to each other, and whose representations in a computer are identical or nearly so, are subtracted. Second, we developed a suite of tools to allow programmers to identify exactly how much precision is required for each operation in their program. These tools allow programmers both to verify that enough precision is available and, more importantly, to find cases where extra precision could be eliminated to allow the program to use less memory, computer time, or energy. The tools use advanced binary modification techniques to allow the analysis of actual optimized code. The system, called Craft, has been applied to a number of benchmarks and real applications.
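    The cancellation phenomenon the tool hunts for is easy to reproduce: the textbook quadratic formula subtracts two nearly equal numbers and loses most of its significant bits, while an algebraically equivalent form does not. The exponent-difference check below is only a Craft-style illustration; the actual tool works via binary instrumentation, not this Python sketch.

```python
import math

def small_root_naive(b, c):
    # Smaller-magnitude root of x^2 + b*x + c = 0 (b < 0 assumed),
    # via the textbook formula: subtracts two nearly equal numbers.
    return (-b - math.sqrt(b * b - 4.0 * c)) / 2.0

def small_root_stable(b, c):
    # Compute the large root first, then use the product of the roots
    # (x1 * x2 = c), so no cancellation occurs.
    big = (-b + math.sqrt(b * b - 4.0 * c)) / 2.0
    return c / big

def cancellation_bits(x, y):
    # Craft-style check: binary exponent drop between the larger
    # operand of x + y and the result approximates the bits cancelled.
    result = x + y
    if result == 0.0:
        return 53
    return max(0, math.frexp(max(abs(x), abs(y)))[1]
                  - math.frexp(abs(result))[1])

b, c = -1e8, 1.0    # roots near 1e8 and 1e-8
print(small_root_naive(b, c), small_root_stable(b, c))
print(cancellation_bits(-b, -math.sqrt(b * b - 4.0 * c)))
```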

  18. Systems-Level Synthetic Biology for Advanced Biofuel Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruffing, Anne; Jensen, Travis J.; Strickland, Lucas Marshall

    2015-03-01

    Cyanobacteria have been shown to be capable of producing a variety of advanced biofuels; however, product yields remain well below those necessary for large scale production. New genetic tools and high throughput metabolic engineering techniques are needed to optimize cyanobacterial metabolisms for enhanced biofuel production. Towards this goal, this project advances the development of a multiple promoter replacement technique for systems-level optimization of gene expression in a model cyanobacterial host: Synechococcus sp. PCC 7002. To realize this multiple-target approach, key capabilities were developed, including a high throughput detection method for advanced biofuels, enhanced transformation efficiency, and genetic tools for Synechococcus sp. PCC 7002. Moreover, several additional obstacles were identified for realization of this multiple promoter replacement technique. The techniques and tools developed in this project will help to enable future efforts in the advancement of cyanobacterial biofuels.

  19. Measurement and Prediction of the Thermomechanical Response of Shape Memory Alloy Hybrid Composite Beams

    NASA Technical Reports Server (NTRS)

    Davis, Brian; Turner, Travis L.; Seelecke, Stefan

    2008-01-01

    An experimental and numerical investigation into the static and dynamic responses of shape memory alloy hybrid composite (SMAHC) beams is performed to provide quantitative validation of a recently commercialized numerical analysis/design tool for SMAHC structures. The SMAHC beam specimens consist of a composite matrix with embedded pre-strained SMA actuators, which act against the mechanical boundaries of the structure when thermally activated to adaptively stiffen the structure. Numerical results are produced from the numerical model as implemented in the commercial finite element code ABAQUS. A rigorous experimental investigation is undertaken to acquire high fidelity measurements, including infrared thermography and projection moire interferometry for full-field temperature and displacement measurements, respectively. High fidelity numerical results are also obtained from the model, which incorporates measured parameters such as geometric imperfection and thermal load. Excellent agreement is achieved between the predicted and measured results of the static and dynamic thermomechanical response, thereby providing quantitative validation of the numerical tool.

  20. Optimization-Based Inverse Identification of the Parameters of a Concrete Cap Material Model

    NASA Astrophysics Data System (ADS)

    Král, Petr; Hokeš, Filip; Hušek, Martin; Kala, Jiří; Hradil, Petr

    2017-10-01

    Advanced numerical analysis of concrete building structures in sophisticated computing systems currently requires the involvement of nonlinear mechanics tools. Efforts to design safer, more durable and, above all, more economically efficient concrete structures are supported by the use of advanced nonlinear concrete material models and the geometrically nonlinear approach. The application of nonlinear mechanics tools undoubtedly represents another step towards approximating the real behaviour of concrete building structures within computer numerical simulations. However, the success of this application depends on a thorough understanding of the behaviour of the concrete material models used and of the meaning of their parameters. The effective application of nonlinear concrete material models within computer simulations often becomes problematic because these models frequently contain parameters (material constants) whose values are difficult to obtain, yet obtaining correct parameter values is essential to ensure the proper function of the material model. One possibility that permits a successful solution of this problem is the use of optimization algorithms for optimization-based inverse material parameter identification. Parameter identification goes hand in hand with experimental investigation: it seeks the parameter values of the material model for which the results of the computer simulation best approximate the experimental data. This paper is focused on the optimization-based inverse identification of the parameters of a concrete cap material model known as the Continuous Surface Cap Model.
    Within this paper, the material parameters of the model are identified on the basis of the interaction between nonlinear computer simulations, gradient-based and nature-inspired optimization algorithms, and experimental data, the latter taking the form of a load-extension curve obtained from the evaluation of uniaxial tensile test results. The aim of this research was to obtain material model parameters corresponding to quasi-static tensile loading, which may be further used in research involving dynamic and high-speed tensile loading. Based on the obtained results it can be concluded that this goal has been reached.
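    The inverse-identification loop described above can be sketched on a toy problem: recover the parameters of a model curve from "experimental" data by damped Gauss-Newton least squares. The two-parameter exponential model and its values are hypothetical stand-ins for the Continuous Surface Cap Model parameters and the measured load-extension curve.

```python
import numpy as np

# Synthetic "experiment": y = a * (1 - exp(-b * x)) with known a, b.
x = np.linspace(0.0, 2.0, 40)
a_true, b_true = 2.0, 3.0
y_exp = a_true * (1.0 - np.exp(-b_true * x))

# Damped Gauss-Newton (Levenberg-Marquardt with fixed small damping):
# iterate until the simulated curve best fits the experimental one.
p = np.array([1.0, 1.0])                       # initial guess (a, b)
for _ in range(100):
    a, b = p
    e = np.exp(-b * x)
    r = y_exp - a * (1.0 - e)                  # residual vector
    J = np.column_stack([1.0 - e, a * x * e])  # d(model)/d(a, b)
    step = np.linalg.solve(J.T @ J + 1e-6 * np.eye(2), J.T @ r)
    p = p + step

print(p)
```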

  1. Combination of Ultrasonic Vibration and Cryogenic Cooling for Cutting Performance Improvement of Inconel 718 Turning

    NASA Astrophysics Data System (ADS)

    Lin, S. Y.; Chung, C. T.; Cheng, Y. Y.

    2011-01-01

    The main objective of this study is to develop a thermo-elastic-plastic coupling model, combining ultrasonically assisted cutting with cryogenic cooling, for the large-deformation machining of Inconel 718 alloy. The extent of the improvement in cutting performance and tool life may be examined from this investigation. The critical value of the strain energy density of the workpiece will be utilized as the criterion for chip separation and discontinuous chip segmentation. Forced convection cooling and a hydrodynamic lubrication model will be considered and formulated in the model. The finite element method will be applied to create a complete numerical solution for this ultrasonic vibration cutting model. During the analysis, the cutting tool is incrementally advanced with superimposed ultrasonic vibration in a back-and-forth, step-by-step manner, from the incipient stage of tool-workpiece engagement to a steady state of chip formation; a whole simulation of the orthogonal cutting process under plane-strain deformation is thus undertaken. From the developed model, the fluctuation of the shear angle induced by the high shear strength, the high shear strain rate, the variation of chip types and chip morphology, the variation of the tool-chip contact length, the temperature distributions within the workpiece, chip and tool, and the periodic fluctuation in cutting forces can all be determined. A complete comparison of machining characteristics between different combinations of ultrasonically assisted cutting and cryogenic cooling and the conventional cutting operation can be acquired. Finally, high-speed turning experiments on Inconel 718 alloy will be conducted in the laboratory to validate the accuracy of the model; the progressive flank wear, crater wear, notching and chipping of the tool edge will also be measured in these experiments.
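    The superposition of ultrasonic vibration on the tool feed can be illustrated with a small kinematics sketch; the cutting speed, frequency and amplitude below are assumed values, not taken from the study:

```python
import math

# Kinematics sketch: in ultrasonically assisted cutting the tool advances
# at the nominal cutting speed v while an axial vibration A*sin(2*pi*f*t)
# is superimposed.  Intermittent tool-chip separation within each cycle is
# possible only when the peak vibration speed 2*pi*f*A exceeds v.
# All numerical values are illustrative assumptions.

v = 0.5     # nominal cutting speed, m/s (assumed)
f = 20e3    # vibration frequency, Hz (assumed)
A = 10e-6   # vibration amplitude, m (assumed)

v_peak = 2 * math.pi * f * A       # peak vibration speed
print(v_peak > v)                  # separation possible per cycle?

# Fraction of each cycle in which the tool moves backwards relative to
# the workpiece, i.e. v + v_peak*cos(theta) < 0 (defined when v_peak > v):
theta = math.acos(-v / v_peak)
fraction = 1 - theta / math.pi
print(round(fraction, 3))
```

    When the peak vibration speed falls below the cutting speed the tool never disengages and the process degenerates to conventional cutting, which is why the vibration parameters matter for the chip-segmentation behaviour the model studies.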

  2. Working with the superabrasives industry to optimize tooling for grinding brittle materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, J.S.; Piscotty, M.A.; Blaedel, K.L.

    1996-05-01

    The optics manufacturing industry is undertaking a significant modernization, as computer-numeric-controlled (CNC) equipment is joining or replacing open-loop equipment and hand lapping/polishing on the shop floor. Several prototype CNC lens grinding platforms employing ring tools are undergoing development and demonstration at the Center for Optics Manufacturing in Rochester, NY, and several machine tool companies have CNC product lines aimed at the optics industry. Benefits of using CNC ring tool grinding equipment include: essentially unlimited flexibility in selecting radii of curvature without special radiused tooling, the potential for CIM linkages to CAD workstations, and the cultural shift from craftsmen with undocumented procedures to CNC machine operators employing computerized routines for process control. In recent years, these developments have inspired a number of US optics companies to invest in CNC equipment and participate in process development activities involving bound diamond tooling. This modernization process extends beyond large optics companies that have historically embraced advanced equipment to also include smaller optical shops, where a shift to CNC equipment requires a significant company commitment. This paper addresses our efforts to optimize fine grinding wheels to support the new generation of CNC equipment. We begin with a discussion of how fine grinding fits into the optical production process, and then describe an initiative for improving the linkage between the optics industry and the grinding wheel industry. For the purposes of this paper, we define fine wheels to have diamond sizes below 20 micrometers, which includes wheels used for what is sometimes called medium grinding (e.g. 10-20 micrometer diamond) and for fine grinding (e.g. 2-4 micrometer diamond).

  3. Preserving Symplecticity in the Numerical Integration of Linear Beam Optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Christopher K.

    2017-07-01

    Presented are mathematical tools and methods for the development of numerical integration techniques that preserve the symplectic condition inherent to mechanics. The intended audience is beam physicists with backgrounds in numerical modeling and simulation, with particular attention to beam optics applications. The paper focuses on Lie methods that are inherently symplectic regardless of the integration accuracy order. Section 2 provides the mathematical tools used in the sequel and necessary for the reader to extend the covered techniques. Section 3 places those tools in the context of charged-particle beam optics; in particular, linear beam optics is presented in terms of a Lie algebraic matrix representation. Section 4 presents numerical stepping techniques with particular emphasis on a third-order leapfrog method. Section 5 discusses the modeling of field imperfections with particular attention to the fringe fields of quadrupole focusing magnets. The direct computation of a third-order transfer matrix for a fringe field is shown.
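    The symplectic condition referred to above, MᵀJM = J for a linear transfer matrix M, can be checked directly. The drift and thin-lens quadrupole matrices below are standard textbook forms with illustrative numerical values:

```python
# Sketch: verifying the symplectic condition M^T J M = J for linear
# beam-optics transfer matrices in one transverse plane (2x2 case).
# Drift and thin-lens matrices are standard; the length and focal
# length used here are illustrative values.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(a):
    return [[a[j][i] for j in range(2)] for i in range(2)]

def is_symplectic(m, tol=1e-12):
    J = [[0.0, 1.0], [-1.0, 0.0]]
    mtjm = matmul(transpose(m), matmul(J, m))
    return all(abs(mtjm[i][j] - J[i][j]) < tol
               for i in range(2) for j in range(2))

drift = [[1.0, 0.8], [0.0, 1.0]]          # drift of length 0.8 m
quad = [[1.0, 0.0], [-1.0 / 0.5, 1.0]]    # thin lens, f = 0.5 m

cell = matmul(drift, quad)                 # composition is symplectic too
print(is_symplectic(drift), is_symplectic(quad), is_symplectic(cell))
```

    Symplectic integrators are designed so that every step, and hence every composition of steps, passes exactly this test regardless of the step size or accuracy order.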

  4. Advances in edge-diffraction modeling for virtual-acoustic simulations

    NASA Astrophysics Data System (ADS)

    Calamia, Paul Thomas

    In recent years there has been growing interest in modeling sound propagation in complex, three-dimensional (3D) virtual environments. With diverse applications for the military, the gaming industry, psychoacoustics researchers, architectural acousticians, and others, advances in computing power and 3D audio-rendering techniques have driven research and development aimed at closing the gap between the auralization and visualization of virtual spaces. To this end, this thesis focuses on improving the physical and perceptual realism of sound-field simulations in virtual environments through advances in edge-diffraction modeling. To model sound propagation in virtual environments, acoustical simulation tools commonly rely on geometrical-acoustics (GA) techniques that assume asymptotically high frequencies, large flat surfaces, and infinitely thin ray-like propagation paths. Such techniques can be augmented with diffraction modeling to compensate for the effect of surface size on the strength and directivity of a reflection, to allow for propagation around obstacles and into shadow zones, and to maintain soundfield continuity across reflection and shadow boundaries. Using a time-domain, line-integral formulation of the Biot-Tolstoy-Medwin (BTM) diffraction expression, this thesis explores various aspects of diffraction calculations for virtual-acoustic simulations. Specifically, we first analyze the periodic singularity of the BTM integrand and describe the relationship between the singularities and higher-order reflections within wedges with open angle less than 180°. Coupled with analytical approximations for the BTM expression, this analysis allows for accurate numerical computations and a continuous sound field in the vicinity of an arbitrary wedge geometry insonified by a point source. Second, we describe an edge-subdivision strategy that allows for fast diffraction calculations with low error relative to a numerically more accurate solution. 
Third, to address the considerable increase in propagation paths due to diffraction, we describe a simple procedure for identifying and culling insignificant diffraction components during a virtual-acoustic simulation. Finally, we present a novel method to find GA components using diffraction parameters that ensures continuity at reflection and shadow boundaries.

  5. Flowfield characterization and model development in detonation tubes

    NASA Astrophysics Data System (ADS)

    Owens, Zachary Clark

    A series of experiments and numerical simulations are performed to advance the understanding of flowfield phenomena and impulse generation in detonation tubes. Experiments employing laser-based velocimetry, high-speed schlieren imaging and pressure measurements are used to construct a dataset against which numerical models can be validated. The numerical modeling culminates in the development of a two-dimensional, multi-species, finite-rate-chemistry, parallel, Navier-Stokes solver. The resulting model is specifically designed to assess unsteady, compressible, reacting flowfields, and its utility for studying multidimensional detonation structure is demonstrated. A reduced, quasi-one-dimensional model with source terms accounting for wall losses is also developed for rapid parametric assessment. Using these experimental and numerical tools, two primary objectives are pursued. The first objective is to gain an understanding of how nozzles affect unsteady, detonation flowfields and how they can be designed to maximize impulse in a detonation based propulsion system called a pulse detonation engine. It is shown that unlike conventional, steady-flow propulsion systems where converging-diverging nozzles generate optimal performance, unsteady detonation tube performance during a single-cycle is maximized using purely diverging nozzles. The second objective is to identify the primary underlying mechanisms that cause velocity and pressure measurements to deviate from idealized theory. An investigation of the influence of non-ideal losses including wall heat transfer, friction and condensation leads to the development of improved models that reconcile long-standing discrepancies between predicted and measured detonation tube performance. It is demonstrated for the first time that wall condensation of water vapor in the combustion products can cause significant deviations from ideal theory.

  6. Computational Prediction of miRNA Genes from Small RNA Sequencing Data

    PubMed Central

    Kang, Wenjing; Friedländer, Marc R.

    2015-01-01

    Next-generation sequencing now allows researchers, for the first time, to gauge the depth and variation of entire transcriptomes. However, now that rare transcripts present in cells at single copies can be detected, more advanced computational tools are needed to accurately annotate and profile them. microRNAs (miRNAs) are 22-nucleotide small RNAs (sRNAs) that post-transcriptionally reduce the output of protein-coding genes. They have established roles in numerous biological processes, including cancers and other diseases. During miRNA biogenesis, the sRNAs are sequentially cleaved from precursor molecules that have a characteristic hairpin RNA structure. The vast majority of newly discovered miRNA genes are mined from small RNA sequencing (sRNA-seq), which can detect more than a billion RNAs in a single run. However, given that many of the detected RNAs are degradation products from all types of transcripts, the accurate identification of miRNAs remains a non-trivial computational problem. Here, we review the tools available to predict animal miRNAs from sRNA sequencing data. We present tools for generalist and specialist use cases, including prediction from massively pooled data or in species without a reference genome. We also present wet-lab methods used to validate predicted miRNAs, and approaches to computationally benchmark prediction accuracy. For each tool, we reference validation experiments and benchmarking efforts. Last, we discuss the future of the field. PMID:25674563

  7. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    NASA Astrophysics Data System (ADS)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphical user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  8. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mike; Cipiti, Ben; Demuth, Scott Francis

    2017-01-30

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  9. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durkee, Joe W.; Cipiti, Ben; Demuth, Scott Francis

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  10. Microbial ecology to manage processes in environmental biotechnology.

    PubMed

    Rittmann, Bruce E

    2006-06-01

    Microbial ecology and environmental biotechnology are inherently tied to each other. The concepts and tools of microbial ecology are the basis for managing processes in environmental biotechnology; and these processes provide interesting ecosystems to advance the concepts and tools of microbial ecology. Revolutionary advancements in molecular tools to understand the structure and function of microbial communities are bolstering the power of microbial ecology. A push from advances in modern materials along with a pull from a societal need to become more sustainable is enabling environmental biotechnology to create novel processes. How do these two fields work together? Five principles illuminate the way: (i) aim for big benefits; (ii) develop and apply more powerful tools to understand microbial communities; (iii) follow the electrons; (iv) retain slow-growing biomass; and (v) integrate, integrate, integrate.

  11. Development of a CFD Code for Analysis of Fluid Dynamic Forces in Seals

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh M.; Przekwas, Andrzej J.; Singhal, Ashok K.

    1991-01-01

    The aim is to develop a 3-D computational fluid dynamics (CFD) code for the analysis of fluid flow in cylindrical seals and evaluation of the dynamic forces on the seals. This code is expected to serve as a scientific tool for detailed flow analysis as well as a check for the accuracy of the 2D industrial codes. The features necessary in the CFD code are outlined. The initial focus was to develop or modify and implement new techniques and physical models. These include collocated grid formulation, rotating coordinate frames and moving grid formulation. Other advanced numerical techniques include higher order spatial and temporal differencing and an efficient linear equation solver. These techniques were implemented in a 2D flow solver for initial testing. Several benchmark test cases were computed using the 2D code, and the results of these were compared to analytical solutions or experimental data to check the accuracy. Tests presented here include planar wedge flow, flow due to an enclosed rotor, and flow in a 2D seal with a whirling rotor. Comparisons between numerical and experimental results for an annular seal and a 7-cavity labyrinth seal are also included.
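    As a minimal illustration of the kind of iterative linear solver such a CFD code relies on, here is a successive over-relaxation (SOR) sketch applied to a small 1D Poisson test system; the test problem and relaxation factor are illustrative, not taken from the paper:

```python
# Sketch of a successive over-relaxation (SOR) linear solver of the kind
# used as a building block in CFD codes.  The 1D Poisson test system and
# the relaxation factor omega are illustrative choices.

def sor(a, b, omega=1.5, iters=200):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(a[i][j] * x[j] for j in range(n) if j != i)
            x[i] += omega * ((b[i] - s) / a[i][i] - x[i])
    return x

# -u'' = 1 on four interior points of a unit grid (h = 0.2) with
# u(0) = u(1) = 0: a small tridiagonal system.
h = 0.2
a = [[2.0 if i == j else -1.0 if abs(i - j) == 1 else 0.0
      for j in range(4)] for i in range(4)]
b = [h * h] * 4
u = sor(a, b)
print([round(v, 4) for v in u])  # matches u(x) = x(1-x)/2 at the nodes
```

    In a full Navier-Stokes solver the same relaxation sweep is applied to the much larger discretized momentum and pressure-correction systems; its convergence rate is what makes an "efficient linear equation solver" a distinct development item.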

  12. Supersymmetry on the Lattice

    NASA Astrophysics Data System (ADS)

    Schaich, David

    2016-03-01

    Lattice field theory provides a non-perturbative regularization of strongly interacting systems, which has proven crucial to the study of quantum chromodynamics among many other theories. Supersymmetry plays prominent roles in the study of physics beyond the standard model, both as an ingredient in model building and as a tool to improve our understanding of quantum field theory. Attempts to apply lattice techniques to supersymmetric field theories have a long history, but until recently these efforts have generally encountered insurmountable difficulties related to the interplay of supersymmetry with the lattice discretization of spacetime. In recent years these difficulties have been overcome for a class of theories that includes the particularly interesting case of maximally supersymmetric Yang-Mills (N = 4 SYM) in four dimensions, which is a cornerstone of AdS/CFT duality. In combination with computational advances this progress enables practical numerical investigations of N = 4 SYM on the lattice, which can address questions that are difficult or impossible to handle through perturbation theory, AdS/CFT duality, or the conformal bootstrap program. I will briefly review some of the new ideas underlying this recent progress, and present some results from ongoing large-scale numerical calculations, including comparisons with analytic predictions.

  13. Foundations of a query and simulation system for the modeling of biochemical and biological processes.

    PubMed

    Antoniotti, M; Park, F; Policriti, A; Ugel, N; Mishra, B

    2003-01-01

    The analysis of large amounts of data, produced as (numerical) traces of in vivo, in vitro and in silico experiments, has become a central activity for many biologists and biochemists. Recent advances in the mathematical modeling and computation of biochemical systems have moreover increased the prominence of in silico experiments; such experiments typically involve the simulation of sets of Differential Algebraic Equations (DAE), e.g., Generalized Mass Action systems (GMA) and S-systems. In this paper we reason about the necessary theoretical and pragmatic foundations for a query and simulation system capable of analyzing large amounts of such trace data. To this end, we propose to combine in a novel way several well-known tools from numerical analysis (approximation theory), temporal logic and verification, and visualization. The result is a preliminary prototype system: simpathica/xssys. When dealing with simulation data, simpathica/xssys exploits the special structure of the underlying DAE and reduces the search space in an efficient way so as to facilitate queries about the traces. The proposed system is designed to give the user the possibility to systematically analyze and simultaneously query different possible timed evolutions of the modeled system.
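    The kind of temporal query such a system answers over numerical traces can be sketched with a few simple operators; the operators, predicates and trace below are illustrative, not the actual simpathica/xssys query language:

```python
# Sketch of temporal-logic style queries over a simulated numerical
# trace, in the spirit of simpathica/xssys.  The operator set, the
# predicates and the toy kinetics are illustrative only.

def always(trace, pred):
    return all(pred(x) for x in trace)

def eventually(trace, pred):
    return any(pred(x) for x in trace)

def until(trace, p, q):
    """p holds at every point until q first becomes true (and q does)."""
    for x in trace:
        if q(x):
            return True
        if not p(x):
            return False
    return False

# Trace of a decaying species concentration, x' = -0.5*x, Euler steps:
x, dt, trace = 1.0, 0.1, []
for _ in range(100):
    trace.append(x)
    x += dt * (-0.5 * x)

print(always(trace, lambda v: v > 0.0))       # stays positive
print(eventually(trace, lambda v: v < 0.01))  # eventually nearly depleted
print(until(trace, lambda v: v > 0.005, lambda v: v < 0.01))
```

    Queries of this shape ("does the concentration ever cross this threshold, and does it stay bounded until then?") are exactly what becomes tedious to answer by eye once thousands of traces are produced.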

  14. A Figure of Merit: Quantifying the Probability of a Nuclear Reactor Accident.

    PubMed

    Wellock, Thomas R

    In recent decades, probabilistic risk assessment (PRA) has become an essential tool in risk analysis and management in many industries and government agencies. The origins of PRA date to the 1975 publication of the U.S. Nuclear Regulatory Commission's (NRC) Reactor Safety Study led by MIT professor Norman Rasmussen. The "Rasmussen Report" inspired considerable political and scholarly disputes over the motives behind it and the value of its methods and numerical estimates of risk. The Report's controversies have overshadowed the deeper technical origins of risk assessment. Nuclear experts had long sought to express risk in a "figure of merit" to verify the safety of weapons and, later, civilian reactors. By the 1970s, technical advances in PRA gave the methodology the potential to serve political ends, too. The Report, it was hoped, would prove nuclear power's safety to a growing chorus of critics. Subsequent attacks on the Report's methods and numerical estimates damaged the NRC's credibility. PRA's fortunes revived when the 1979 Three Mile Island accident demonstrated PRA's potential for improving the safety of nuclear power and other technical systems. Nevertheless, the Report's controversies endure in mistrust of PRA and its experts.

  15. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  16. Solving ODE Initial Value Problems With Implicit Taylor Series Methods

    NASA Technical Reports Server (NTRS)

    Scott, James R.

    2000-01-01

    In this paper we introduce a new class of numerical methods for integrating ODE initial value problems. Specifically, we propose an extension of the Taylor series method which significantly improves its accuracy and stability while also increasing its range of applicability. To advance the solution from t_n to t_(n+1), we expand a series about the intermediate point t_(n+μ) := t_n + μh, where h is the stepsize and μ is an arbitrary parameter called an expansion coefficient. We show that, in general, a Taylor series of degree k has exactly k expansion coefficients which raise its order of accuracy. The accuracy is raised by one order if k is odd, and by two orders if k is even. In addition, if k is three or greater, local extrapolation can be used to raise the accuracy two additional orders. We also examine stability for the problem y' = λy, Re(λ) < 0, and identify several A-stable schemes. Numerical results are presented for both fixed and variable stepsizes. It is shown that implicit Taylor series methods provide an effective integration tool for most problems, including stiff systems and ODEs with a singular point.
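    The A-stability analysis for y' = λy can be illustrated with the amplification factor of the simplest scheme of this family: a degree-1 expansion about μ = 1/2 reduces to the implicit midpoint rule, whose factor R(z) = (1 + z/2)/(1 − z/2) satisfies |R(z)| ≤ 1 throughout the left half-plane. This is a sketch of that check, not of the paper's higher-degree schemes:

```python
# Sketch: checking A-stability for y' = lambda*y via the amplification
# factor of the implicit midpoint rule, R(z) = (1 + z/2) / (1 - z/2).
# A-stability means |R(z)| <= 1 whenever Re(z) <= 0.

def amplification(z):
    return (1 + z / 2) / (1 - z / 2)

# Sample points in the left half-plane Re(z) < 0:
stable = True
for re in [-0.01, -0.5, -2.0, -50.0]:
    for im in [0.0, 1.0, -3.0, 100.0]:
        if abs(amplification(complex(re, im))) > 1.0:
            stable = False
print(stable)

# One step of the stiff problem y' = -1000*y stays bounded even at h = 1:
h, lam, y = 1.0, -1000.0, 1.0
y = amplification(h * lam).real * y
print(abs(y) <= 1.0)
```

    An explicit method applied to the same stiff problem at h = 1 would blow up immediately, which is the practical content of the A-stability claims in the abstract.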

  17. Development of a 3D numerical methodology for fast prediction of gun blast induced loading

    NASA Astrophysics Data System (ADS)

    Costa, E.; Lagasco, F.

    2014-05-01

    In this paper, the development of a methodology, based on semi-empirical models from the literature, for the 3D prediction of the pressure loading on surfaces adjacent to a weapon system during firing is presented. This loading results from the impact of the blast wave generated by the projectile exiting the muzzle bore. When a pressure threshold level is exceeded, the loading can induce unwanted damage to nearby hard structures as well as to frangible panels or electronic equipment. The implemented model can quickly predict the distribution of the blast wave parameters over three-dimensional complex geometry surfaces when the weapon design and emplacement data, as well as the propellant and projectile characteristics, are available. Given these capabilities, the proposed methodology is envisaged for use in the preliminary design phase of the combat system to predict adverse effects and to identify the most appropriate countermeasures. By providing a preliminary but sensitive estimate of the operative environmental loading, this numerical tool represents a good alternative to more powerful but time-consuming advanced computational fluid dynamics tools, whose use can thus be limited to the final phase of the design.
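    Semi-empirical blast models of this kind are generally built on Hopkinson-Cranz scaling, Z = R/W^(1/3): charges detonated at the same scaled distance produce the same peak overpressure. The sketch below illustrates the scaling only; the overpressure decay law is a purely hypothetical placeholder for the real fitted curves:

```python
# Sketch of the Hopkinson-Cranz scaling underlying most semi-empirical
# blast models: equal scaled distance Z = R / W**(1/3) implies equal
# peak overpressure.  The power-law decay p(Z) below is a hypothetical
# placeholder, not a fitted curve from the literature.

def scaled_distance(r_m, w_kg):
    return r_m / w_kg ** (1.0 / 3.0)

def peak_overpressure(z):
    return 1000.0 / z ** 2   # illustrative decay law, kPa

z_small = scaled_distance(2.0, 1.0)   # 1 kg charge at 2 m
z_large = scaled_distance(4.0, 8.0)   # 8 kg charge at 4 m: same Z
print(round(z_small, 6) == round(z_large, 6))
print(peak_overpressure(z_small))
```

    Evaluating such closed-form relations over every facet of a 3D surface mesh is what makes this class of methods orders of magnitude faster than a CFD simulation of the muzzle blast.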

  18. Investigation Study on Determination of Fracture Strain and Fracture Forming Limit Curve Using Different Experimental and Numerical Methods

    NASA Astrophysics Data System (ADS)

    Farahnak, P.; Urbanek, M.; Džugan, J.

    2017-09-01

    The Forming Limit Curve (FLC) is a well-known tool for the evaluation of failure in sheet metal forming processes. However, its experimental determination and evaluation are rather complex. From a theoretical point of view, the FLC describes the initiation of instability, not fracture. In recent years, Digital Image Correlation (DIC) techniques have been developed extensively. Throughout this paper, all measurements were done using DIC, and, as reported in the literature, different approaches to capturing necking and fracture phenomena were investigated: the Cross Section Method (CSM), the Time Dependent Method (TDM) and the Thinning Method (TM). Each of these methods has advantages and disadvantages. Moreover, a cruciform specimen was used, as an alternative to the Nakajima test, in order to cover the whole FLC in the range between uniaxial and equi-biaxial tension. Given the above-mentioned uncertainty about the fracture strain, some advanced numerical failure models can describe necking and fracture phenomena accurately while accounting for anisotropic effects. In this paper, dog-bone, notched and circular disk specimens are used to calibrate the Johnson-Cook (J-C) fracture model. The results are discussed for the mild steel DC01.
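    The Johnson-Cook fracture model being calibrated has the standard form ε_f = [D1 + D2·exp(D3·η)]·[1 + D4·ln(ε̇*)]·[1 + D5·T*], with η the stress triaxiality. The sketch below uses illustrative placeholder constants, not the DC01 calibration from the paper:

```python
import math

# Sketch of the standard Johnson-Cook fracture-strain model.  The model
# form is the published one (Johnson & Cook, 1985); the D1..D5 values
# below are illustrative placeholders, not the DC01 calibration.

def jc_fracture_strain(triaxiality, strain_rate_ratio=1.0,
                       homologous_temp=0.0,
                       d=(0.1, 0.8, -1.5, 0.01, 0.0)):
    d1, d2, d3, d4, d5 = d
    return ((d1 + d2 * math.exp(d3 * triaxiality))
            * (1.0 + d4 * math.log(strain_rate_ratio))
            * (1.0 + d5 * homologous_temp))

# With D3 < 0, fracture strain drops as triaxiality rises from uniaxial
# tension (1/3) toward equi-biaxial tension (2/3), the range covered by
# the cruciform specimen:
eps_uni = jc_fracture_strain(1.0 / 3.0)
eps_biax = jc_fracture_strain(2.0 / 3.0)
print(round(eps_uni, 4), round(eps_biax, 4))
print(eps_uni > eps_biax)
```

    Calibration then amounts to fitting the D constants so that the predicted fracture strains match those measured on the dog-bone, notched and disk specimens, each of which probes a different triaxiality.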

  19. A new model for volume recombination in plane-parallel chambers in pulsed fields of high dose-per-pulse

    NASA Astrophysics Data System (ADS)

    Gotz, M.; Karsch, L.; Pawelke, J.

    2017-11-01

    In order to describe the volume recombination in a pulsed radiation field of high dose-per-pulse this study presents a numerical solution of a 1D transport model of the liberated charges in a plane-parallel ionization chamber. In addition, measurements were performed on an Advanced Markus ionization chamber in a pulsed electron beam to obtain suitable data to test the calculation. The experiment used radiation pulses of 4 μs duration and variable dose-per-pulse values up to about 1 Gy, as well as pulses of variable duration up to 308 μs at constant dose-per-pulse values between 85 mGy and 400 mGy. Those experimental data were compared to the developed numerical model and existing descriptions of volume recombination. At low collection voltages the observed dose-per-pulse dependence of volume recombination can be approximated by the existing theory using effective parameters. However, at high collection voltages large discrepancies are observed. The developed numerical model shows much better agreement with the observations and is able to replicate the observed behavior over the entire range of dose-per-pulse values and collection voltages. Using the developed numerical model, the differences between observation and existing theory are shown to be the result of a large fraction of the charge being collected as free electrons and the resultant distortion of the electric field inside the chamber. Furthermore, the numerical solution is able to calculate recombination losses for arbitrary pulse durations in good agreement with the experimental data, an aspect not covered by current theory. Overall, the presented numerical solution of the charge transport model should provide a more flexible tool to describe volume recombination for high dose-per-pulse values as well as for arbitrary pulse durations and repetition rates.
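    Why the collected fraction drops as dose-per-pulse grows can be caricatured with a zero-dimensional rate model. Note that the paper solves a full 1D transport model with field distortion; this toy, with illustrative rate constants, only shows the competition between collection and volume recombination:

```python
# Toy 0-D sketch of volume recombination in an ionization chamber.
# Charges are removed by collection (rate k_col*n) and lost to
# recombination (rate k_rec*n**2); the quadratic loss term makes the
# collected fraction fall as the initial charge density (i.e. the
# dose-per-pulse) rises.  All rate constants are illustrative.

def collected_fraction(n0, k_col=5.0, k_rec=1.0, dt=1e-4, steps=20000):
    n, collected = n0, 0.0
    for _ in range(steps):          # explicit Euler integration
        dn_col = k_col * n * dt
        dn_rec = k_rec * n * n * dt
        collected += dn_col
        n -= dn_col + dn_rec
    return collected / n0

low = collected_fraction(1.0)     # low dose-per-pulse
high = collected_fraction(100.0)  # high dose-per-pulse
print(round(low, 3), round(high, 3))
print(low > high)
```

    The real chamber adds the effects the abstract highlights: free-electron collection and the space-charge distortion of the field, which is exactly what the simple saturation theory misses at high collection voltages.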

  20. A new model for volume recombination in plane-parallel chambers in pulsed fields of high dose-per-pulse.

    PubMed

    Gotz, M; Karsch, L; Pawelke, J

    2017-11-01

    In order to describe the volume recombination in a pulsed radiation field of high dose-per-pulse this study presents a numerical solution of a 1D transport model of the liberated charges in a plane-parallel ionization chamber. In addition, measurements were performed on an Advanced Markus ionization chamber in a pulsed electron beam to obtain suitable data to test the calculation. The experiment used radiation pulses of 4 μs duration and variable dose-per-pulse values up to about 1 Gy, as well as pulses of variable duration up to 308 μs at constant dose-per-pulse values between 85 mGy and 400 mGy. Those experimental data were compared to the developed numerical model and existing descriptions of volume recombination. At low collection voltages the observed dose-per-pulse dependence of volume recombination can be approximated by the existing theory using effective parameters. However, at high collection voltages large discrepancies are observed. The developed numerical model shows much better agreement with the observations and is able to replicate the observed behavior over the entire range of dose-per-pulse values and collection voltages. Using the developed numerical model, the differences between observation and existing theory are shown to be the result of a large fraction of the charge being collected as free electrons and the resultant distortion of the electric field inside the chamber. Furthermore, the numerical solution is able to calculate recombination losses for arbitrary pulse durations in good agreement with the experimental data, an aspect not covered by current theory. Overall, the presented numerical solution of the charge transport model should provide a more flexible tool to describe volume recombination for high dose-per-pulse values as well as for arbitrary pulse durations and repetition rates.

  1. DEVELOPMENTS IN GRworkbench

    NASA Astrophysics Data System (ADS)

    Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.

    2006-02-01

    The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.
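
    GRworkbench itself is written in C++; to keep a single language across the examples in this document, the Python sketch below (hypothetical names, not GRworkbench's API) illustrates the functional-programming idea of representing differential-geometric objects, here a metric and a world-line, directly as function variables that compose numerically.

```python
import numpy as np

def tangent(curve, lam, h=1e-6):
    """Tangent vector of a parametrised world-line, itself a function."""
    return (np.asarray(curve(lam + h)) - np.asarray(curve(lam - h))) / (2 * h)

def norm_sq(metric, point, vec):
    """g_ab v^a v^b at a point, with the metric supplied as a function."""
    g = metric(point)
    return float(vec @ g @ vec)

# Flat (Minkowski) metric, signature (-,+,+,+), held in a function variable
minkowski = lambda x: np.diag([-1.0, 1.0, 1.0, 1.0])

# A timelike world-line, also just a plain function of its parameter
curve = lambda lam: (lam, 0.5 * lam, 0.0, 0.0)

v = tangent(curve, 1.0)
print(norm_sq(minkowski, curve(1.0), v))   # ≈ -0.75, i.e. timelike
```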

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Advances in bioluminescence imaging: new probes from old recipes.

    PubMed

    Yao, Zi; Zhang, Brendan S; Prescher, Jennifer A

    2018-06-04

    Bioluminescent probes are powerful tools for visualizing biology in live tissues and whole animals. Recent years have seen a surge in the number of new luciferases, luciferins, and related tools available for bioluminescence imaging. Many were crafted using classic methods of optical probe design and engineering. Here we highlight recent advances in bioluminescent tool discovery and development, along with applications of the probes in cells, tissues, and organisms. Collectively, these tools are improving in vivo imaging capabilities and bolstering new research directions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Advanced Tools Webinar Series Presents: Regulatory Issues and Case Studies of Advanced Tools

    EPA Science Inventory

    U.S. EPA has released A Guide for Assessing Biodegradation and Source Identification of Organic Ground Water Contaminants using Compound Specific Isotope Analysis (CSIA) [EPA 600/R-08/148 | December 2008 | www.epa.gov/ada]. The Guide provides recommendations for sample collecti...

  10. Visualization of small scale structures on high resolution DEMs

    NASA Astrophysics Data System (ADS)

    Kokalj, Žiga; Zakšek, Klemen; Pehani, Peter; Čotar, Klemen; Oštir, Krištof

    2015-04-01

    Knowledge of the terrain morphology is very important for observation of numerous processes and events, and digital elevation models are therefore one of the most important datasets in geographic analyses. Furthermore, recognition of natural and anthropogenic microrelief structures, which can be observed on detailed terrain models derived from aerial laser scanning (lidar) or structure-from-motion photogrammetry, is of paramount importance in many applications. In this paper we thus examine and evaluate methods of raster lidar data visualization for the determination (recognition) of microrelief features and present a series of strategies to assist in selecting the preferred visualization for structures of various shapes and sizes, set in varied landscapes. Often the answer is not definite, and frequently a combination of techniques has to be used to map a very diverse landscape. Only recently have researchers been able to benefit from free software for the calculation of advanced visualization techniques. These tools are often difficult to understand, have numerous options that confuse the user, or require and produce non-standard data formats, because they were written for specific purposes. We therefore designed the Relief Visualization Toolbox (RVT) as a free, easy-to-use, standalone application to create visualisations from high-resolution digital elevation data. It is tailored to beginners in relief interpretation, but it can also be used by more advanced users in data processing and geographic information systems. It offers a range of techniques, such as simple hillshading and its derivatives, slope gradient, trend removal, positive and negative openness, sky-view factor, and anisotropic sky-view factor. All included methods have been proven effective for the detection of small scale features, and the default settings are optimised to accomplish this task. However, the usability of the tool goes beyond computation for visualization purposes, as sky-view factor, for example, is an essential variable in many fields, e.g. in meteorology. RVT produces two types of results: 1) the original files have a full range of values and are intended for further analyses in geographic information systems; 2) the simplified versions are histogram stretched for visualization purposes and saved as 8-bit GeoTIFF files. This means that they can be explored in non-GIS software, e.g. with simple picture viewers, which is essential when a larger community of non-specialists needs to be considered, e.g. in public collaborative projects. The tool recognizes all frequently used single band raster formats and supports elevation raster file data conversion.
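
    As an indication of how light such computations can be, the simplest technique in such a toolbox, analytical hillshading, fits in a few lines (the formula is the classic one; the parameter defaults here are generic assumptions, not RVT's):

```python
import numpy as np

def hillshade(dem, cellsize=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Classic analytical hillshading of a DEM given as a 2D array.

    Returns illumination values in [0, 1] for a light source at the
    given azimuth (clockwise from north) and altitude above the horizon.
    """
    az = np.radians(360.0 - azimuth_deg + 90.0)   # convert to math convention
    alt = np.radians(altitude_deg)
    dzdy, dzdx = np.gradient(dem, cellsize)        # terrain derivatives
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(-dzdx, dzdy)
    shade = (np.sin(alt) * np.cos(slope)
             + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shade, 0.0, 1.0)
```

    On flat terrain the result is uniformly sin(altitude), which makes the routine easy to sanity-check before applying it to real lidar rasters.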

  11. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  12. Estimating parameters from rotating ring disc electrode measurements

    DOE PAGES

    Santhanagopalan, Shriram; White, Ralph E.

    2017-10-21

    Rotating ring disc electrode (RRDE) experiments are a classic tool for investigating the kinetics of electrochemical reactions. Several standardized methods exist for extracting transport parameters and reaction rate constants from RRDE measurements. In this work, we compare some approximate solutions of the convective-diffusion problem popularly used in the literature to a rigorous numerical solution of the Nernst-Planck equations coupled to the three-dimensional flow problem. In light of these computational advancements, we explore design aspects of the RRDE that help improve the sensitivity of our parameter estimation procedure to experimental data. We use the oxygen reduction in acidic media, involving three charge transfer reactions and a chemical reaction, as an example, and identify ways to isolate reaction currents for the individual processes in order to accurately estimate the exchange current densities.
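
    For context, the best-known approximate solution of the convective-diffusion problem at a rotating disc is Levich's limiting-current expression, i_lim = 0.620 n F A D^(2/3) ω^(1/2) ν^(-1/6) C. A sketch with illustrative parameter values (the function name and the numbers used in the check are assumptions, not taken from the paper):

```python
F = 96485.0  # Faraday constant, C/mol

def levich_current(n, area_cm2, D_cm2_s, omega_rad_s, nu_cm2_s, C_mol_cm3):
    """Levich limiting current (A) at a rotating disc electrode.

    n: electrons transferred; D: diffusivity; omega: rotation rate;
    nu: kinematic viscosity; C: bulk concentration (CGS-style units).
    """
    return (0.620 * n * F * area_cm2 * D_cm2_s ** (2.0 / 3.0)
            * omega_rad_s ** 0.5 * nu_cm2_s ** (-1.0 / 6.0) * C_mol_cm3)
```

    The characteristic signature of this approximation is the square-root dependence on rotation rate: quadrupling ω doubles the limiting current, which is one of the behaviours a rigorous Nernst-Planck solution can be benchmarked against.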

  13. DebugIT for patient safety - improving the treatment with antibiotics through multimedia data mining of heterogeneous clinical data.

    PubMed

    Lovis, Christian; Colaert, Dirk; Stroetmann, Veli N

    2008-01-01

    The concepts and architecture underlying a large-scale integrating project funded within the 7th EU Framework Programme (FP7) are discussed. The main objective of the project is to build a tool that will have a significant impact on the monitoring and control of infectious diseases and antimicrobial resistance in Europe. This will be realized by building a technical and semantic infrastructure able to share heterogeneous clinical data sets from different hospitals in different countries, with different languages and legislations; to analyze large amounts of this clinical data with advanced multimedia data mining; and finally to apply the obtained knowledge to clinical decisions and outcome monitoring. There are numerous challenges in this project at all levels (technical, semantic, legal, and ethical) that will have to be addressed.

  14. Application of stable isotope ratio analysis for biodegradation monitoring in groundwater

    USGS Publications Warehouse

    Hatzinger, Paul B.; Böhlke, John Karl; Sturchio, Neil C.

    2013-01-01

    Stable isotope ratio analysis is increasingly being applied as a tool to detect, understand, and quantify biodegradation of organic and inorganic contaminants in groundwater. An important feature of this approach is that it allows degradative losses of contaminants to be distinguished from those caused by non-destructive processes such as dilution, dispersion, and sorption. Recent advances in analytical techniques, and new approaches for interpreting stable isotope data, have expanded the utility of this method while also exposing complications and ambiguities that must be considered in data interpretations. Isotopic analyses of multiple elements in a compound, and multiple compounds in the environment, are being used to distinguish biodegradative pathways by their characteristic isotope effects. Numerical models of contaminant transport, degradation pathways, and isotopic composition are improving quantitative estimates of in situ contaminant degradation rates under realistic environmental conditions.
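
    The standard quantitative link between an observed isotope shift and the extent of degradation is the Rayleigh fractionation model; in its common simplified δ-notation form, ln f = (δ − δ₀)/ε, the fraction degraded follows directly. A sketch (an approximation of the exact Rayleigh equation, with illustrative numbers in the usage note):

```python
import math

def fraction_degraded(delta0_permil, delta_permil, epsilon_permil):
    """Rayleigh-model estimate of the fraction of a contaminant degraded.

    delta0/delta: initial and measured isotope ratios in per-mil delta
    notation; epsilon: isotopic enrichment factor in per mil (negative
    for normal isotope effects). Uses the simplified form
    ln f = (delta - delta0) / epsilon, valid for small shifts.
    """
    f_remaining = math.exp((delta_permil - delta0_permil) / epsilon_permil)
    return 1.0 - f_remaining
```

    For example, an enrichment from δ₀ = −27‰ to δ = −20‰ with ε = −5‰ implies roughly three quarters of the compound has been degraded, independent of any dilution or sorption losses, which is precisely what makes the isotopic approach attractive.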

  15. Peptide-mediated vectorization of metal complexes: conjugation strategies and biomedical applications.

    PubMed

    Soler, Marta; Feliu, Lidia; Planas, Marta; Ribas, Xavi; Costas, Miquel

    2016-08-16

    The rich chemical and structural versatility of transition metal complexes provides numerous novel paths to be pursued in the design of molecules that exert particular chemical or physicochemical effects that could operate over specific biological targets. However, the poor cell permeability of metallodrugs represents an important barrier for their therapeutic use. The conjugation between metal complexes and a functional peptide vector can be regarded as a versatile and potential strategy to improve their bioavailability and accumulation inside cells, and the site selectivity of their effect. This perspective lies in reviewing the recent advances in the design of metallopeptide conjugates for biomedical applications. Additionally, we highlight the studies where this approach has been directed towards the incorporation of redox active metal centers into living organisms for modulating the cellular redox balance, as a tool with application in anticancer therapy.

  16. Development and implementation of (Q)SAR modeling within the CHARMMing web-user interface.

    PubMed

    Weidlich, Iwona E; Pevzner, Yuri; Miller, Benjamin T; Filippov, Igor V; Woodcock, H Lee; Brooks, Bernard R

    2015-01-05

    Recent availability of large publicly accessible databases of chemical compounds and their biological activities (PubChem, ChEMBL) has inspired us to develop a web-based tool for structure-activity relationship and quantitative structure-activity relationship modeling to add to the services provided by CHARMMing (www.charmming.org). This new module implements some of the most recent advances in modern machine learning algorithms: Random Forest, Support Vector Machine, Stochastic Gradient Descent, Gradient Tree Boosting, and so forth. A user can import training data from PubChem BioAssay data collections directly from our interface, or upload his or her own SD files containing structures and activity information to create new models (either categorical or numerical). A user can then track the model generation process and run models on new data to predict activity. © 2014 Wiley Periodicals, Inc.
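
    As a hedged sketch of the kind of workflow such a module automates, numeric descriptors in, categorical activity model out, here is a minimal Random Forest example with synthetic data standing in for PubChem BioAssay descriptors (this is not CHARMMing's code):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                    # 200 compounds x 5 descriptors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # toy active/inactive label

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X[:150], y[:150])          # train on the first 150 compounds
acc = model.score(X[150:], y[150:])  # hold-out accuracy on the remaining 50
```

    A hold-out split like this is the minimum needed to judge whether a (Q)SAR model generalizes beyond its training compounds.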

  17. Selective Plasma Etching of Polymeric Substrates for Advanced Applications

    PubMed Central

    Puliyalil, Harinarayanan; Cvelbar, Uroš

    2016-01-01

    In today’s nanoworld, there is a strong need to manipulate and process materials on an atom-by-atom scale with new tools such as reactive plasma, which in some states enables high selectivity of interaction between plasma species and materials. These interactions first involve preferential interactions with precise bonds in materials and later cause etching. This selectivity typically derives from differences in material stability, which lead to preferential etching of one material over another. This process is especially interesting for polymeric substrates with increasing complexity and a “zoo” of bonds, which are used in numerous applications. In this comprehensive summary, we encompass the complete selective etching of polymers and polymer matrix micro-/nanocomposites with plasma and unravel the mechanisms behind the scenes, which ultimately leads to the enhancement of surface properties and device performance. PMID:28335238

  18. Geometry Modeling and Grid Generation for Computational Aerodynamic Simulations Around Iced Airfoils and Wings

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Slater, John W.; Vickerman, Mary B.; VanZante, Judith F.; Wadel, Mary F. (Technical Monitor)

    2002-01-01

    Issues associated with analysis of 'icing effects' on airfoil and wing performances are discussed, along with accomplishments and efforts to overcome difficulties with ice. Because of infinite variations of ice shapes and their high degree of complexity, computational 'icing effects' studies using available software tools must address many difficulties in geometry acquisition and modeling, grid generation, and flow simulation. The value of each technology component needs to be weighed from the perspective of the entire analysis process, from geometry to flow simulation. Even though CFD codes are yet to be validated for flows over iced airfoils and wings, numerical simulation, when considered together with wind tunnel tests, can provide valuable insights into 'icing effects' and advance our understanding of the relationship between ice characteristics and their effects on performance degradation.

  19. Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index

    PubMed Central

    Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy

    2012-01-01

    Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ-secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the usage of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124
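
    The abstract does not state the index's closed form, so the sketch below is a purely illustrative stand-in (not necessarily the authors' definition): weight the frequency of cells performing i of n simultaneous functions by (i/n)^q, so that a single number summarizes a whole combinatorial profile.

```python
def polyfunctionality_index(freqs, q=1.0):
    """Illustrative polyfunctionality summary (NOT necessarily the
    index defined in the paper).

    freqs[i]: frequency of cells performing i simultaneous functions,
    for i = 0..n. Each frequency is weighted by (i/n)**q, so a wholly
    monofunctional population scores 1/n and a wholly n-functional
    population scores 1.
    """
    n = len(freqs) - 1
    total = sum(freqs)
    return sum(f * (i / n) ** q for i, f in enumerate(freqs)) / total
```

    Unlike a pie chart, a scalar like this can be compared across donors with ordinary parametric or non-parametric tests, which is the gap the abstract describes.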

  20. Gravitational waves from Scorpius X-1: A comparison of search methods and prospects for detection with advanced detectors

    NASA Astrophysics Data System (ADS)

    Messenger, C.; Bulten, H. J.; Crowder, S. G.; Dergachev, V.; Galloway, D. K.; Goetz, E.; Jonker, R. J. G.; Lasky, P. D.; Meadors, G. D.; Melatos, A.; Premachandra, S.; Riles, K.; Sammut, L.; Thrane, E. H.; Whelan, J. T.; Zhang, Y.

    2015-07-01

    The low-mass X-ray binary Scorpius X-1 (Sco X-1) is potentially the most luminous source of continuous gravitational-wave radiation for interferometers such as LIGO and Virgo. For low-mass X-ray binaries this radiation would be sustained by active accretion of matter from its binary companion. With the Advanced Detector Era fast approaching, work is underway to develop an array of robust tools for maximizing the science and detection potential of Sco X-1. We describe the plans and progress of a project designed to compare the numerous independent search algorithms currently available. We employ a mock-data challenge in which the search pipelines are tested for their relative proficiencies in parameter estimation, computational efficiency, robustness, and most importantly, search sensitivity. The mock-data challenge data contains an ensemble of 50 Sco X-1 type signals, simulated within a frequency band of 50-1500 Hz. Simulated detector noise was generated assuming the expected best strain sensitivity of Advanced LIGO [1] and Advanced VIRGO [2] (4 × 10^-24 Hz^-1/2). A distribution of signal amplitudes was then chosen so as to allow a useful comparison of search methodologies. A factor of 2 in strain separates the quietest detected signal, at 6.8 × 10^-26 strain, from the torque-balance limit at a spin frequency of 300 Hz, although this limit could range from 1.2 × 10^-25 (25 Hz) to 2.2 × 10^-26 (750 Hz) depending on the unknown frequency of Sco X-1. With future improvements to the search algorithms and using advanced detector data, our expectations for probing below the theoretical torque-balance strain limit are optimistic.
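
    The quoted range follows from the torque-balance argument, in which the limiting strain scales as h ∝ ν^(-1/2) with spin frequency ν at fixed X-ray flux. A quick consistency check of the numbers in the abstract (the helper name is ours, not from the paper):

```python
import math

def h_tb(nu_hz, h_ref=1.2e-25, nu_ref=25.0):
    """Scale the quoted 25 Hz torque-balance strain limit to another
    spin frequency using h proportional to nu**(-1/2)."""
    return h_ref * math.sqrt(nu_ref / nu_hz)

print(h_tb(750.0))   # ≈ 2.2e-26, matching the quoted 750 Hz limit
print(h_tb(300.0))   # ≈ 3.5e-26, about half the quietest detected 6.8e-26
```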

  1. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    PubMed

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, using experimental or numerical techniques. This article presents the laboratory-developed SESAME (Simulation of External Source Accident with MEdical images) tool, which is specific to the dosimetric reconstruction of radiological accidents through numerical simulations combining voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.

  2. Identifying opportunities to advance practice at a large academic medical center using the ASHP Ambulatory Care Self-Assessment Tool.

    PubMed

    Martirosov, Amber Lanae; Michael, Angela; McCarty, Melissa; Bacon, Opal; DiLodovico, John R; Jantz, Arin; Kostoff, Diana; MacDonald, Nancy C; Mikulandric, Nancy; Neme, Klodiana; Sulejmani, Nimisha; Summers, Bryant B

    2018-05-29

    The use of the ASHP Ambulatory Care Self-Assessment Tool to advance pharmacy practice at 8 ambulatory care clinics of a large academic medical center is described. The ASHP Ambulatory Care Self-Assessment Tool was developed to help ambulatory care pharmacists assess how their current practices align with the ASHP Practice Advancement Initiative. The Henry Ford Hospital Ambulatory Care Advisory Group (ACAG) opted to use the "Practitioner Track" sections of the tool to assess pharmacy practices within each of 8 ambulatory care clinics individually. The responses to self-assessment items were then compiled and discussed by ACAG members. The group identified best practices and ways to implement action items to advance ambulatory care practice throughout the institution. Three recommended action items were common to most clinics: (1) identify and evaluate solutions to deliver financially viable services, (2) develop technology to improve patient care, and (3) optimize the role of pharmacy technicians and support personnel. The ACAG leadership met with pharmacy administrators to discuss how action items that were both feasible and deemed likely to have a medium-to-high impact aligned with departmental goals and used this information to develop an ambulatory care strategic plan. This process informed and enabled initiatives to advance ambulatory care pharmacy practice within the system. The ASHP Ambulatory Care Self-Assessment Tool was useful in identifying opportunities for practice advancement in a large academic medical center. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  3. Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by Environmental Protection Agency to evaluate the Greenhouse gas emissions and fuel efficiency from light-duty vehicles. It is a physics-based, forward-looking, full vehicle computer simulator, which is cap...

  4. Advanced Computing Tools and Models for Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  5. Serpentinomics-an emerging new field of study

    Treesearch

    Jessica Wright; Eric von Wettberg

    2009-01-01

    "Serpentinomics" is an emerging field of study which has the potential to greatly advance our understanding of serpentine ecology. Several newly developing –omic fields, often using high-throughput tools developed for molecular biology, will advance the field of serpentine ecology, or, "serpentinomics." Using tools from the...

  6. Conceptual Assessment Tool for Advanced Undergraduate Electrodynamics

    ERIC Educational Resources Information Center

    Baily, Charles; Ryan, Qing X.; Astolfi, Cecilia; Pollock, Steven J.

    2017-01-01

    As part of ongoing investigations into student learning in advanced undergraduate courses, we have developed a conceptual assessment tool for upper-division electrodynamics (E&M II): the Colorado UppeR-division ElectrodyNamics Test (CURrENT). This is a free response, postinstruction diagnostic with 6 multipart questions, an optional 3-question…

  7. Advances in In Vitro and In Silico Tools for Toxicokinetic Dose Modeling and Predictive Toxicology (WC10)

    EPA Science Inventory

    Recent advances in vitro assays, in silico tools, and systems biology approaches provide opportunities for refined mechanistic understanding for chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical lan...

  8. Some research perspectives in galloping phenomena: critical conditions and post-critical behavior

    NASA Astrophysics Data System (ADS)

    Piccardo, Giuseppe; Pagnini, Luisa Carlotta; Tubino, Federica

    2015-01-01

    This paper gives an overview of wind-induced galloping phenomena, describing its manifold features and the many advances that have taken place in this field. Starting from a quasi-steady model of aeroelastic forces exerted by the wind on a rigid cylinder with three degrees of freedom (two translations and a rotation in the plane of the model cross section), the fluid-structure interaction forces are described in simple terms, yet suited to the complexity of mechanical systems, both in the linear and in the nonlinear field, thus allowing investigation of a wide range of structural typologies and their dynamic behavior. The paper is driven by some key concerns. A great effort is made in underlining the strengths and weaknesses of the classic quasi-steady theory as well as of the simplistic assumptions that are introduced in order to investigate such complex phenomena through simple engineering models. A second aspect, which is crucial to the authors' approach, is to take into account and harmonize the engineering, physical and mathematical perspectives in an interdisciplinary way, something that does not happen often. The authors underline that the quasi-steady approach is an irreplaceable tool, though approximate and simple, for performing engineering analyses; at the same time, the study of this phenomenon gives rise to numerous problems that make the application of high-level mathematical solutions particularly attractive. Finally, the paper discusses a wide range of features of the galloping theory and its practical use which deserve further attention and refinement, pointing to the great potential represented by new fields of application and advanced analysis tools.
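
    The classic quasi-steady result for transverse galloping is the Den Hartog criterion: instability sets in when dC_L/dα + C_D < 0. A minimal sketch with hypothetical, bluff-body-like coefficient data (the numbers are illustrative, not from the paper):

```python
import numpy as np

def den_hartog_unstable(cl, cd, alpha_deg):
    """Quasi-steady (Den Hartog) galloping criterion: a section is prone
    to transverse galloping where dC_L/dalpha + C_D < 0. Lift and drag
    coefficients are sampled at angles of attack alpha (degrees)."""
    alpha = np.radians(np.asarray(alpha_deg, dtype=float))
    dcl = np.gradient(np.asarray(cl, dtype=float), alpha)  # per radian
    return dcl + np.asarray(cd, dtype=float) < 0.0

# Hypothetical square-section-like data near alpha = 0: a strongly
# negative lift slope that drag cannot stabilize
alpha_deg = [-4.0, -2.0, 0.0, 2.0, 4.0]
cl = [0.28, 0.14, 0.0, -0.14, -0.28]
cd = [2.0, 2.0, 2.0, 2.0, 2.0]
print(den_hartog_unstable(cl, cd, alpha_deg))
```

    The criterion is exactly the kind of simple engineering model the paper weighs against higher-level mathematical treatments: cheap to evaluate, but built on the quasi-steady assumption.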

  9. Directed Chemical Evolution with an Outsized Genetic Code

    PubMed Central

    Krusemark, Casey J.; Tilmans, Nicolas P.; Brown, Patrick O.; Harbury, Pehr B.

    2016-01-01

    The first demonstration that macromolecules could be evolved in a test tube was reported twenty-five years ago. That breakthrough meant that billions of years of chance discovery and refinement could be compressed into a few weeks, and provided a powerful tool that now dominates all aspects of protein engineering. A challenge has been to extend this scientific advance into synthetic chemical space: to enable the directed evolution of abiotic molecules. The problem has been tackled in many ways. These include expanding the natural genetic code to include unnatural amino acids, engineering polyketide and polypeptide synthases to produce novel products, and tagging combinatorial chemistry libraries with DNA. Importantly, there is still no small-molecule analog of directed protein evolution, i.e. a substantiated approach for optimizing complex (≥ 10^9 diversity) populations of synthetic small molecules over successive generations. We present a key advance towards this goal: a tool for genetically-programmed synthesis of small-molecule libraries from large chemical alphabets. The approach accommodates alphabets that are one to two orders of magnitude larger than any in Nature, and facilitates evolution within the chemical spaces they create. This is critical for small molecules, which are built up from numerous and highly varied chemical fragments. We report a proof-of-concept chemical evolution experiment utilizing an outsized genetic code, and demonstrate that fitness traits can be passed from an initial small-molecule population through to the great-grandchildren of that population. The results establish the practical feasibility of engineering synthetic small molecules through accelerated evolution. PMID:27508294

  10. Reliability of CBCT as an assessment tool for mandibular molars furcation defects

    NASA Astrophysics Data System (ADS)

    Marinescu, Adrian George; Boariu, Marius; Rusu, Darian; Stratul, Stefan-Ioan; Ogodescu, Alexandru

    2014-01-01

    Introduction. In numerous clinical situations an exact clinical evaluation of furcation defects is not possible. Recently, the use of CBCT in periodontology has led to increased diagnostic precision. Aim. To determine the accuracy of CBCT as a diagnostic tool for furcation defects. Material and method. 19 patients with generalised advanced chronic periodontitis were included in this study, presenting a total of 25 lower molars with different degrees of furcation defects. Clinical and digital measurements (in mm) were performed on all the molars involved, and the data obtained were compared and statistically analysed. Results. The analysis of the primary data demonstrated that all grade II and III furcation defects were revealed by the CBCT technique. For the incipient defects (grade I Hamp < 3 mm), the dimensions measured on CBCT images were slightly larger. The results showed that 84% of the defects detected by CBCT were confirmed by clinical measurements, in line with other studies. Conclusions. The use of the CBCT technique in the evaluation and diagnosis of human mandibular furcation defects can provide important information regarding the size and aspect of the interradicular defect, efficiently and noninvasively. CBCT is more effective in detecting advanced furcation defects than incipient ones. However, at this stage of development, the CBCT examination cannot replace clinical measurements, especially intraoperative ones, which are considered the "gold standard" in this domain.

  11. PlanetPack: A radial-velocity time-series analysis tool facilitating exoplanets detection, characterization, and dynamical simulations

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2013-08-01

    We present PlanetPack, a new software tool developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the purposes of exoplanet detection, characterization, and basic dynamical N-body simulation. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) more efficient maximum-likelihood periodograms that involve full multi-planet fitting (sometimes called "residual" or "recursive" periodograms); (iv) easy calculation of parametric 2D likelihood-level contours reflecting the asymptotic confidence regions; (v) user-friendly fitting under useful functional constraints; (vi) basic short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting of data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for assessing statistical significance. Further functionality may be added to PlanetPack in the future. During development, considerable effort went into improving computational speed, especially for CPU-demanding tasks. PlanetPack is written in pure C++ (1998/2003 standard) and is expected to be compilable and usable on a wide range of platforms.
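
    The maximum-likelihood treatment of unknown RV jitter mentioned in item (i) is a standard idea that can be sketched independently of PlanetPack: the jitter is an extra white-noise term added in quadrature to the quoted measurement errors and fitted together with the model. A minimal Python illustration on synthetic data follows (a circular one-planet model at a known period; all names and numbers are our own assumptions, not PlanetPack code):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_like(params, t, rv, sig, P):
    """-ln(likelihood) of a circular one-planet RV model at a known
    period P, with an unknown white-noise 'jitter' s added in quadrature
    to the quoted measurement errors and fitted alongside the model."""
    A, B, v0, ln_s = params                       # sine/cosine amplitudes,
    w = 2.0 * np.pi / P                           # offset, log-jitter
    model = v0 + A * np.cos(w * t) + B * np.sin(w * t)
    var = sig**2 + np.exp(2.0 * ln_s)             # jitter-inflated variance
    return 0.5 * np.sum((rv - model)**2 / var + np.log(2.0 * np.pi * var))

# synthetic data set; every number here is an illustrative assumption
rng = np.random.default_rng(42)
P_true, K_true, s_true = 12.5, 30.0, 3.0          # days, m/s, m/s
t = np.sort(rng.uniform(0.0, 50.0, 60))
sig = np.full(t.size, 2.0)                        # quoted errors (m/s)
rv = K_true * np.sin(2.0 * np.pi * t / P_true) \
     + rng.normal(0.0, np.sqrt(sig**2 + s_true**2))

fit = minimize(neg_log_like, x0=[10.0, 10.0, 0.0, 0.0],
               args=(t, rv, sig, P_true), method="Nelder-Mead",
               options={"maxiter": 50000, "maxfev": 50000,
                        "xatol": 1e-8, "fatol": 1e-8})
A, B, v0, ln_s = fit.x
K_fit, s_fit = np.hypot(A, B), np.exp(ln_s)       # recovered K and jitter
```

    Ignoring the jitter term would understate the error bars and distort the fit; parameterizing it as ln s keeps the variance positive and the likelihood well behaved.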

  12. Development of early numerical abilities of Spanish-speaking Mexican preschoolers: A new assessment tool.

    PubMed

    Beltrán-Navarro, Beatriz; Abreu-Mendoza, Roberto A; Matute, Esmeralda; Rosselli, Monica

    2018-01-01

    This article presents a tool for assessing the early numerical abilities of Spanish-speaking Mexican preschoolers. The Numerical Abilities Test, from the Evaluación Neuropsicológica Infantil-Preescolar (ENI-P), evaluates four core abilities of number development: magnitude comparison, counting, subitizing, and basic calculation. We evaluated 307 Spanish-speaking Mexican children aged 2 years 6 months to 4 years 11 months. Appropriate internal consistency and test-retest reliability were demonstrated. We also investigated the effect of age, children's school attendance, maternal education, and sex on children's numerical scores. The results showed that the four subtests captured development across ages. Critically, maternal education had an impact on children's performance in three out of the four subtests, but there was no effect associated with children's school attendance or sex. These results suggest that the Numerical Abilities Test is a reliable instrument for Spanish-speaking preschoolers. We discuss the implications of our outcomes for numerical development.

  13. Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollias, Pavlos

    2016-09-06

    This is the final report for DE-SC0007096, Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales (PI: Pavlos Kollias). It outlines the main findings of the research conducted under this award in the area of cloud research, from the cloud scale (10-100 m) to the mesoscale (20-50 km).

  14. Estimation of the influence of tool wear on force signals: A finite element approach in AISI 1045 orthogonal cutting

    NASA Astrophysics Data System (ADS)

    Equeter, Lucas; Ducobu, François; Rivière-Lorphèvre, Edouard; Abouridouane, Mustapha; Klocke, Fritz; Dehombreux, Pierre

    2018-05-01

    Industrial concerns arise regarding the significant cost of cutting tools in machining processes. In particular, an improper replacement policy can lead either to scrapped parts or to premature replacement of still-serviceable tools. ISO 3685 provides the flank-wear end-of-life criterion, and flank wear is the nominal wear mode governing tool lifetime under optimal cutting conditions. Its consequences include poor surface roughness and dimensional discrepancies. To aid the replacement decision process, several tool condition monitoring techniques have been suggested. Force signals have been shown in the literature to be strongly linked with tool flank wear. It can therefore be assumed that force signals are highly relevant for monitoring the condition of cutting tools and providing decision-aid information in the framework of their maintenance and replacement. The objective of this work is to correlate tool flank wear with numerically computed force signals. The present work uses a Finite Element Model with a Coupled Eulerian-Lagrangian approach. The geometry of the tool is changed for different runs of the model in order to obtain results specific to a given level of wear. The model is assessed by comparison with experimental data gathered earlier on fresh tools. Using the model at constant cutting parameters, force signals under different tool wear states are computed for each studied tool geometry. These signals are qualitatively compared with relevant data from the literature. At this point, no quantitative comparison could be performed on worn tools because the reviewed literature does not provide similar studies in this material, either numerical or experimental. Further development of this work should therefore include experimental campaigns aimed at collecting cutting force signals and assessing the numerical results achieved here.

  15. The Sequenced Angiosperm Genomes and Genome Databases.

    PubMed

    Chen, Fei; Dong, Wei; Zhang, Jiawei; Guo, Xinyue; Chen, Junhao; Wang, Zhengjia; Lin, Zhenguo; Tang, Haibao; Zhang, Liangsheng

    2018-01-01

    Angiosperms, the flowering plants, provide essential resources for human life, such as food, energy, oxygen, and materials; they also shaped the evolution of humans, animals, and the planet Earth. Despite the numerous genome reports and advances in sequencing technologies, no review covers all the released angiosperm genomes and the genome databases available for data sharing. Building on the rapid advances and innovations in database construction over the last few years, here we provide a comprehensive review of three major types of angiosperm genome databases: databases for a single species, for a specific angiosperm clade, and for multiple angiosperm species. The scope, tools, and data of each type of database and their features are concisely discussed. Genome databases for a single species or a clade are especially popular with specific groups of researchers, while a regularly updated comprehensive database is more powerful for addressing major scientific questions at the genome scale. Considering the low coverage of flowering plants in any available database, we propose the construction of a comprehensive database to facilitate large-scale comparative studies of angiosperm genomes and to promote collaborative studies of important questions in plant biology.

  16. Reliable in vitro studies require appropriate ovarian cancer cell lines

    PubMed Central

    2014-01-01

    Ovarian cancer is the fifth most common cause of cancer death in women and the leading cause of death from gynaecological malignancies. Of the 75% of women diagnosed with locally advanced or disseminated disease, only 30% will survive five years following treatment. This poor prognosis is due to the following reasons: limited understanding of the tumor origin, unclear initiating events and early developmental stages of ovarian cancer, lack of reliable ovarian cancer-specific biomarkers, and drug resistance in advanced cases. In the past, in vitro studies using cell line models have been an invaluable tool for basic, discovery-driven cancer research. However, numerous issues, including misidentification and cross-contamination of cell lines, have hindered research efforts. In this study we examined all ovarian cancer cell lines available from cell banks. In doing so, we identified inconsistent reporting, difficulties in identifying the cell origin or clinical data of the donor patients, restricted representation of ethnicities and histological types, and a lack of tubal and peritoneal cancer cell lines. We recommend that all cell lines be distributed via official cell banks only, with strict guidelines regarding the minimum information required, to improve the quality of ovarian cancer research in the future. PMID:24936210

  17. Advances in aptamer screening and small molecule aptasensors.

    PubMed

    Kim, Yeon Seok; Gu, Man Bock

    2014-01-01

    It has been 20 years since aptamers and SELEX (systematic evolution of ligands by exponential enrichment) were described independently by Andrew Ellington and Larry Gold. Owing to the great advantages of aptamers, numerous aptamers have since been isolated for various targets and actively applied as therapeutic and analytical tools. Over 2,000 papers related to aptamers or SELEX have been published, attesting to their broad usefulness and applicability. SELEX methods have been modified or re-created over the years to enable aptamer isolation with higher affinity and selectivity in a more labor- and time-efficient manner, including through automation. Initially, most aptamer studies focused on protein targets with physiological functions in the body and on their applications as therapeutic agents or diagnostic receptors. However, aptamers for small molecules such as organic or inorganic compounds, drugs, antibiotics, and metabolites have not been studied as thoroughly, despite the ever-increasing need for rapid and simple analytical methods for various chemical targets in medical diagnostics, environmental monitoring, food safety, and national defense, including defense against chemical warfare agents. This review covers not only recent advances in aptamer screening methods but also their analytical applications for small molecules.

  18. Integrating hydrology into catchment scale studies - need for new paradigms?

    NASA Astrophysics Data System (ADS)

    Teutsch, G.

    2009-04-01

    Until the seventies, scientific development in groundwater hydrology concentrated mainly on a better understanding of the physics of subsurface flow in homogeneous or simply stratified porous and fractured media. From the mid-seventies onward, a much more complex vision of groundwater hydrology gradually developed: a more realistic description of the subsurface, including its heterogeneity and predominant physico-chemical-biological reactions, together with technologies for the efficient clean-up of contaminants, emerged over the past 30 years, much facilitated by advances in numerical modelling techniques and the boost in computer power. Even though the advances in this field have been very significant, a new grand challenge has evolved during the past 10 years: bringing together the fields needed to build Integrated Watershed Management Systems (IWMS). The fundamental conceptual question is: do we need new approaches to groundwater hydrology, maybe even new paradigms, in order to successfully build IWMS, or can we simply extrapolate our existing concepts and tool-sets to the scale of catchments and watersheds and add some interfaces to adjacent disciplines such as economics and ecology? This lecture tries to provide some of the answers by describing successful examples.

  19. Genetics of Adiposity in Large Animal Models for Human Obesity-Studies on Pigs and Dogs.

    PubMed

    Stachowiak, M; Szczerbal, I; Switonski, M

    2016-01-01

    The role of domestic mammals in the development of human biomedical sciences has been widely documented. Among these model species, the pig and the dog are of special importance, and both are useful for studies on the etiology of human obesity. The genome sequences of both species are known, and advanced genetic tools [e.g., SNP microarrays for genome-wide association studies (GWAS), next-generation sequencing (NGS), etc.] are commonly used in such studies. In the domestic pig the accumulation of adipose tissue is an important trait that influences meat quality and fattening efficiency. Numerous quantitative trait loci (QTLs) for pig fatness traits have been identified, and gene polymorphisms associated with these traits have been described. The situation is different in the dog population: excessive accumulation of adipose tissue is generally considered, as in humans, a complex disease, yet research on the genetic background of canine obesity is still in its infancy. Between-breed differences in adipose tissue accumulation are well known in both species. In this review we present recent advances in studies on adipose tissue accumulation in pigs and dogs, and their potential importance for studies on human obesity.

  20. Improvement of the Prediction of Drugs Demand Using Spatial Data Mining Tools.

    PubMed

    Ramos, M Isabel; Cubillas, Juan José; Feito, Francisco R

    2016-01-01

    The continued availability of products at any store is a major factor in good customer service. If the store is a pharmacy, the matter takes on greater importance, as a drug being out of stock during a period of high demand causes problems and tensions in the healthcare system. There are numerous studies of the impact this issue has on patients. The lack of a given drug in a pharmacy in certain seasons is very common, especially when external factors favor the occurrence of certain diseases. This study focuses on a particular drug consumed in the city of Jaen, in southern Andalucia, Spain; our goal is to forecast Salbutamol demand in advance. Advanced data mining techniques have been used together with spatial variables, which play a key role in generating an effective model. Using the attributes associated with Salbutamol demand, we generated an accurate prediction model with a mean absolute error of 5.78%. This is very encouraging considering that the consumption of this drug in Jaen varies by 500% from one period to another.
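
    The abstract does not spell out the study's model or attributes, but the general recipe (predict demand from attributes that include spatial coordinates) can be sketched on fabricated data. The sketch below uses a k-nearest-neighbour regressor in plain NumPy as a simple spatial data-mining baseline; the coordinates, the seasonal demand curve, and all numbers are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in records: (x_km, y_km, month) -> monthly Salbutamol
# units, with a winter seasonal peak and one spatial demand hotspot.
n = 400
X = np.column_stack([rng.uniform(0.0, 10.0, n),      # x coordinate (km)
                     rng.uniform(0.0, 10.0, n),      # y coordinate (km)
                     rng.integers(1, 13, n)])        # month of the year
season = 1.0 + 0.5 * np.cos(2.0 * np.pi * (X[:, 2] - 1.0) / 12.0)
hotspot = np.exp(-((X[:, 0] - 5.0)**2 + (X[:, 1] - 5.0)**2) / 8.0)
y = (80.0 + 40.0 * hotspot) * season * rng.normal(1.0, 0.02, n)

def knn_predict(Xtr, ytr, Xte, k=5):
    """k-nearest-neighbour regression on standardized features: a simple
    spatial data-mining baseline, not the (unpublished) model of the paper."""
    mu, sd = Xtr.mean(axis=0), Xtr.std(axis=0)
    A, B = (Xtr - mu) / sd, (Xte - mu) / sd
    d = np.linalg.norm(B[:, None, :] - A[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]           # k closest records
    return ytr[nearest].mean(axis=1)

# hold out 100 records and report mean absolute percentage error
pred = knn_predict(X[:300], y[:300], X[300:])
mape = 100.0 * np.mean(np.abs(pred - y[300:]) / y[300:])
```

    A crude baseline like this will not reach the study's reported 5.78% error, but it shows why spatial variables matter: nearby pharmacies in the same season see similar demand, so spatial neighbours carry predictive signal.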

  1. The Sequenced Angiosperm Genomes and Genome Databases

    PubMed Central

    Chen, Fei; Dong, Wei; Zhang, Jiawei; Guo, Xinyue; Chen, Junhao; Wang, Zhengjia; Lin, Zhenguo; Tang, Haibao; Zhang, Liangsheng

    2018-01-01

    Angiosperms, the flowering plants, provide essential resources for human life, such as food, energy, oxygen, and materials; they also shaped the evolution of humans, animals, and the planet Earth. Despite the numerous genome reports and advances in sequencing technologies, no review covers all the released angiosperm genomes and the genome databases available for data sharing. Building on the rapid advances and innovations in database construction over the last few years, here we provide a comprehensive review of three major types of angiosperm genome databases: databases for a single species, for a specific angiosperm clade, and for multiple angiosperm species. The scope, tools, and data of each type of database and their features are concisely discussed. Genome databases for a single species or a clade are especially popular with specific groups of researchers, while a regularly updated comprehensive database is more powerful for addressing major scientific questions at the genome scale. Considering the low coverage of flowering plants in any available database, we propose the construction of a comprehensive database to facilitate large-scale comparative studies of angiosperm genomes and to promote collaborative studies of important questions in plant biology. PMID:29706973

  2. Numerical modeling of the physical phenomena of friction stir welding and fatigue behavior of welded joints in 7075-T6 aluminum

    NASA Astrophysics Data System (ADS)

    Gemme, Frederic

    The aim of the present research project is to increase the amount of fundamental knowledge regarding the process by getting a better understanding of the physical phenomena involved in friction stir welding (FSW). Such knowledge is required to improve the process in the context of industrial applications. In order to do so, the first part of the project is dedicated to a theoretical study of the process, while the microstructure and the mechanical properties of welded joints obtained in different welding conditions are measured and analyzed in the second part. The combination of the tool rotating and translating movements induces plastic deformation and heat generation of the welded material. The material thermomechanical history is responsible for metallurgical phenomena occurring during FSW such as recrystallization and precipitate dissolution and coarsening. Process modelling is used to reproduce this thermomechanical history in order to predict the influence of welding on the material microstructure. It is helpful to study heat generation and heat conduction mechanisms and to understand how joint properties are related to them. In the current work, a finite element numerical model based on solid mechanics has been developed to compute the thermomechanical history of the welded material. The computation results were compared to reference experimental data in order to validate the model and to calibrate unknown physical parameters. The model was used to study the effect of the friction coefficient on the thermomechanical history. Results showed that contact conditions at the workpiece/tool interface have a strong effect on relative amounts of heat generated by friction and by plastic deformation. The comparison with the experimental torque applied by the tool for different rotational speeds has shown that the friction coefficient decreases when the rotational speed increases. 
Consequently, heat generation is far greater near the material/tool interface and the material deformation is shallower, increasing the probability of a lack-of-penetration defect. The variation of thermomechanical conditions with rotational speed is responsible for the variation of the nugget shape, as recrystallization conditions are not reached in the same volume of material. The second part of the research project was dedicated to characterizing the microstructure and mechanical properties of the welded joints. Sound joints were obtained using a manufacturing procedure involving process parameter optimization and quality control of joint integrity. Five combinations of rotational and advancing speeds were studied. Microstructure observations showed that the rotational speed affects recrystallization conditions through the variation of the contact conditions at the material/tool interface, whereas the advancing speed has a strong effect on the precipitation state in the heat affected zone (HAZ): the heat input increases when the advancing speed decreases, so the material softening in the HAZ is more pronounced. Mechanical testing of the welded joints showed that the fatigue resistance increases when the rotational speed increases and the advancing speed decreases. The fatigue resistance of FSW joints mainly depends on the ratio of the advancing speed to the rotational speed, called the welding pitch k. When the welding pitch is high (k ≥ 0.66 mm/rev), the fatigue resistance is governed by crack initiation at the root of the circular grooves left by the tool on the weld surface; the size of these grooves is directly related to the welding pitch. When the welding pitch is low (k ≤ 0.2 mm/rev), the heat input is high and the fatigue resistance is limited by the HAZ softening. The fatigue resistance is optimized when k lies in the 0.25-0.30 mm/rev range. 
Outside that range, the presence of small lateral lips is critical. The characterization results showed that the effect of the applied vertical force on the formation of lateral lips warrants further investigation. Eliminating the lateral lips, which could be achieved with a more precise adjustment of the vertical force, and eliminating the circular grooves left by the tool, which may be achieved by developing an appropriate surfacing technique, could both lead to improved fatigue resistance without reducing the advancing speed. (Abstract shortened by UMI.)

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oehler, G.C.

    As dramatic as the recent political changes in Eastern Europe and the Soviet Union are, other factors are having at least as important an impact on the intelligence community's business. For example, new and more global problems have arisen, such as the proliferation of advanced weapons, economic competitiveness, and environmental concerns. Intelligence requirements are clearly on the increase. For the intelligence community, whose business is information gathering and processing, advanced information management tools are needed. Fortunately, recent technical advances offer these tools. Some of the more notable advances in information documentation, storage, and retrieval are described.

  4. Nanomagnet Logic: Architectures, design, and benchmarking

    NASA Astrophysics Data System (ADS)

    Kurtz, Steven J.

    Nanomagnet Logic (NML) is an emerging technology being studied as a possible replacement or supplementary device for Complementary Metal-Oxide-Semiconductor (CMOS) Field-Effect Transistors (FET) by the year 2020. NML devices offer numerous potential advantages including: low-energy operation, steady-state non-volatility, radiation hardness, and a clear path to fabrication and integration with CMOS. However, maintaining both low-energy operation and non-volatility while scaling from the device to the architectural level is non-trivial, as (i) nearest-neighbor interactions within NML circuits complicate the modeling of ensemble nanomagnet behavior and (ii) the energy-intensive clock structures required for re-evaluation, together with NML's relatively high latency, challenge its ability to offer system-level performance wins against other emerging nanotechnologies. Thus, further research efforts are required to model more complex circuits while also identifying circuit design techniques that balance low-energy operation with steady-state non-volatility. In addition, further work is needed to design and model low-power on-chip clocks while simultaneously identifying application spaces where NML systems (including clock overhead) offer sufficient energy savings to merit their inclusion in future processors. This dissertation presents research advancing the understanding and modeling of NML at all levels including devices, circuits, and line clock structures, while also benchmarking NML against both scaled CMOS and tunneling FET (TFET) devices. This is accomplished through the development of design tools and methodologies for (i) quantifying both energy and stability in NML circuits and (ii) evaluating line-clocked NML system performance. The application of these newly developed tools improves the understanding of ideal design criteria (i.e., magnet size, clock wire geometry, etc.) for NML architectures. 
Finally, the system-level performance evaluation tool offers the ability to project what advancements are required for NML to realize performance improvements over scaled-CMOS hardware equivalents at the functional unit and/or application-level.

  5. Turbomachinery noise

    NASA Astrophysics Data System (ADS)

    Groeneweg, John F.; Sofrin, Thomas G.; Rice, Edward J.; Gliebe, Phillip R.

    1991-08-01

    Summarized here are key advances in experimental techniques and theoretical applications which point the way to a broad understanding and control of turbomachinery noise. On the experimental side, the development of effective inflow control techniques makes it possible to conduct, in ground-based facilities, definitive experiments on internally controlled blade row interactions. Results can now be valid indicators of flight behavior and can provide a firm base for comparison with analytical results. Inflow control coupled with detailed diagnostic tools such as blade pressure measurements can be used to uncover more subtle mechanisms, such as rotor-strut interaction, which can set tone levels for some engine configurations. Initial mappings of rotor wake-vortex flow fields have provided a data base for a first-generation semiempirical flow disturbance model. Laser velocimetry offers a nonintrusive method for validating and improving the model. Digital data systems and signal processing algorithms are bringing mode measurement closer to a working tool that can be frequently applied to a real machine such as a turbofan engine. On the analytical side, models of most of the links in the chain from turbomachine blade source to far-field observation point have been formulated. Three-dimensional lifting surface theory for blade rows, including source noncompactness and cascade effects, blade row transmission models incorporating mode and frequency scattering, and modal radiation calculations, including hybrid numerical-analytical approaches, are tools which await further application.

  6. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes

    PubMed Central

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D.; Day, Michele E.; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort. PMID:28269947
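
    The abstract does not define its seven ensemble methods, but the basic idea of combining the concept sets returned by several extractors can be illustrated with three common voting rules. The tools, concepts, and gold standard below are hypothetical stand-ins, not the paper's pipeline:

```python
from collections import Counter

def ensemble(extractions, method="majority"):
    """Combine concept sets produced by several NLP tools.

    `extractions` is a list of sets, one per tool.  These three voting
    rules are illustrative stand-ins for the paper's seven (unspecified)
    ensemble methods.
    """
    n = len(extractions)
    votes = Counter(c for s in extractions for c in s)
    if method == "union":                                # favors recall
        return set(votes)
    if method == "intersection":                         # favors precision
        return {c for c, v in votes.items() if v == n}
    return {c for c, v in votes.items() if v > n / 2}    # majority vote

def precision_recall(pred, gold):
    tp = len(pred & gold)
    return (tp / len(pred) if pred else 0.0,
            tp / len(gold) if gold else 0.0)

# three hypothetical extractors run on the same clinical note
tools = [{"diabetes", "hypertension", "cough"},
         {"diabetes", "hypertension"},
         {"diabetes", "asthma"}]
gold = {"diabetes", "hypertension", "asthma"}
p, r = precision_recall(ensemble(tools), gold)
```

    In this toy example the majority vote reaches perfect precision at the cost of missing the concept only one tool found; union and intersection trade off in the opposite directions, which is why the choice of ensemble rule can vary by cohort.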

  7. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes.

    PubMed

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D; Day, Michele E; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort.

  8. Numerical simulation of tunneling through arbitrary potential barriers applied on MIM and MIIM rectenna diodes

    NASA Astrophysics Data System (ADS)

    Abdolkader, Tarek M.; Shaker, Ahmed; Alahmadi, A. N. M.

    2018-07-01

    With the continuous miniaturization of electronic devices, quantum-mechanical effects such as tunneling become more significant in many device applications. In this paper, a numerical simulation tool is developed in a MATLAB environment to calculate the tunneling probability and current through an arbitrary potential barrier, comparing three different numerical techniques: the finite difference method, transfer matrix method, and transmission line method. For benchmarking, the tool is applied to several case studies, such as the rectangular single barrier, rectangular double barrier, and continuous bell-shaped potential barrier, each compared to analytical solutions and giving the dependence of the error on the number of mesh points. In addition, a thorough study of the J-V characteristics of MIM and MIIM diodes, used as rectifiers for rectenna solar cells, is presented, and simulations are compared to experimental results showing satisfactory agreement. At the undergraduate level, the tool provides deeper insight for students comparing numerical techniques used to solve various tunneling problems and helps students choose a suitable technique for a given application.
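
    Of the three techniques compared, the transfer matrix method is the easiest to reproduce compactly: slice the barrier into piecewise-constant segments, propagate the two plane-wave amplitudes across each interface with a 2x2 matrix, and read the transmission off the accumulated product. The sketch below is our own minimal version (units with ħ²/2m = 1 and zero potential on both sides), not the paper's MATLAB tool, benchmarked against the analytic rectangular-barrier formula as in the paper's first case study:

```python
import numpy as np

def transmission(E, edges, V):
    """Transmission probability through a piecewise-constant barrier via
    the transfer matrix method.  Conventions: hbar^2/2m = 1 and V = 0 on
    both sides, so the local wavevector is k = sqrt(E - V) (complex
    when E < V).  `edges` lists the interface positions (len(V) + 1 of
    them); `V` gives the potential on each interior segment."""
    pot = np.concatenate(([0.0], np.asarray(V, float), [0.0]))
    k = np.sqrt(E - pot + 0j)             # complex sqrt handles E < V
    M = np.eye(2, dtype=complex)
    for j, x0 in enumerate(edges):        # match psi, psi' at interface x0
        k1, k2 = k[j], k[j + 1]
        r = k1 / k2
        D = 0.5 * np.array(
            [[(1 + r) * np.exp(1j * (k1 - k2) * x0),
              (1 - r) * np.exp(-1j * (k1 + k2) * x0)],
             [(1 - r) * np.exp(1j * (k1 + k2) * x0),
              (1 + r) * np.exp(-1j * (k1 - k2) * x0)]])
        M = D @ M
    t = np.linalg.det(M) / M[1, 1]        # det(M) = k_in / k_out = 1 here
    return abs(t) ** 2

# benchmark against the analytic rectangular barrier (V0 = 1, width a = 1)
V0, a, E = 1.0, 1.0, 0.5
kappa = np.sqrt(V0 - E)
T_exact = 1.0 / (1.0 + V0**2 * np.sinh(kappa * a)**2 / (4.0 * E * (V0 - E)))
T_num = transmission(E, [0.0, a], [V0])
```

    An arbitrary smooth barrier is handled by slicing it into many thin constant segments. Note that the matrix product can overflow for thick, strongly attenuating barriers, a known limitation of the method.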

  9. Case-Based Capture and Reuse of Aerospace Design Rationale

    NASA Technical Reports Server (NTRS)

    Leake, David B.

    1998-01-01

    The goal of this project is to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project applies case-based reasoning (CBR) and concept mapping (CMAP) tools to the task of capturing, organizing, and interactively accessing experiences or "cases" encapsulating the methods and rationale underlying expert aerospace design. As stipulated in the award, Indiana University and Ames personnel are collaborating on the conduct and direction of the research to ensure that the project focuses on high-value tasks. In the first five months of the project, we have made two visits to Ames Research Center to consult with our NASA collaborators, to learn about the advanced aerospace design tools being developed there, and to identify specific needs for intelligent design support. These meetings identified a number of task areas for applying CBR and concept mapping technology. We jointly selected a first task area to focus on: acquiring the convergence criteria that experts use to guide the selection of useful data from a set of numerical simulations of high-lift systems. During the first funding period, we developed two software systems. First, we adapted a CBR system developed at Indiana University into a prototype case-based reasoning shell to capture and retrieve information about design experiences, with the sample task of capturing and reusing experts' intuitive criteria for determining convergence (work conducted at Indiana University). Second, we adapted and refined existing concept mapping tools that will be used to clarify and capture the rationale underlying those experiences, to facilitate understanding of the expert's reasoning and guide future reuse of captured information (work conducted at the University of West Florida). 
The tools we have developed are designed to be the basis for a general framework for facilitating tasks within systems developed by the Advanced Design Technologies Testbed (ADTT) project at ARC. The tenets of our framework are (1) that the systems developed should leverage a designer's knowledge, rather than attempting to replace it; (2) that learning and user feedback must play a central role, so that the system can adapt to how it is used, and (3) that the learning and feedback processes must be as natural and as unobtrusive as possible. In the second funding period we will extend our current work, applying the tools to capturing higher-level design rationale.

  10. Periastron advance in spinning black hole binaries: Gravitational self-force from numerical relativity

    NASA Astrophysics Data System (ADS)

    Le Tiec, Alexandre; Buonanno, Alessandra; Mroué, Abdul H.; Pfeiffer, Harald P.; Hemberger, Daniel A.; Lovelace, Geoffrey; Kidder, Lawrence E.; Scheel, Mark A.; Szilágyi, Bela; Taylor, Nicholas W.; Teukolsky, Saul A.

    2013-12-01

    We study the general relativistic periastron advance in spinning black hole binaries on quasicircular orbits, with spins aligned or antialigned with the orbital angular momentum, using numerical-relativity simulations, the post-Newtonian approximation, and black hole perturbation theory. By imposing a symmetry by exchange of the bodies' labels, we devise an improved version of the perturbative result and use it as the leading term of a new type of expansion in powers of the symmetric mass ratio. This allows us to measure, for the first time, the gravitational self-force effect on the periastron advance of a nonspinning particle orbiting a Kerr black hole of mass M and spin S = -0.5M², down to separations of order 9M. Comparing the predictions of our improved perturbative expansion with the exact results from numerical simulations of equal-mass and equal-spin binaries, we find a remarkable agreement over a wide range of spins and orbital separations.
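
    For orientation, the textbook leading-order (1PN) periastron advance of a test body, Δφ = 6πGM / [c²a(1−e²)] per orbit, can be evaluated in a few lines; applied to Mercury with standard solar-system values it recovers the classic ~43 arcsec/century (a far simpler calculation than the self-force expansion developed in the paper):

```python
import math

G_M_SUN = 1.32712440018e20   # G * M_sun [m^3/s^2]
C = 2.99792458e8             # speed of light [m/s]

def periastron_advance_per_orbit(gm, a, e):
    """Periastron shift per orbit, in radians, at leading post-Newtonian order."""
    return 6.0 * math.pi * gm / (C**2 * a * (1.0 - e**2))

# Mercury: semi-major axis [m], eccentricity, orbital period [days]
a_mercury, e_mercury, period_days = 5.7909e10, 0.2056, 87.969
dphi = periastron_advance_per_orbit(G_M_SUN, a_mercury, e_mercury)
orbits_per_century = 36525.0 / period_days
arcsec = dphi * orbits_per_century * (180.0 / math.pi) * 3600.0
print(f"{arcsec:.1f} arcsec/century")  # ~43, the classic GR value
```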

  11. What are the benefits of using a modeling platform? The VSoil example

    NASA Astrophysics Data System (ADS)

    Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Maron, Pierre-Alain; Moitrier, Nicolas; Nouguier, Cedric; Moitrier, Nathalie; Beudez, Nicolas

    2015-04-01

    In the environmental community, the need to couple models and the associated knowledge has emerged recently. The development of a coupling tool or a modeling platform is mainly driven by the need to create models that account for multiple processes and for the feedbacks between these processes. Models focusing on a restricted number of processes already exist, so coupling these numerical tools appeared to be an efficient and rapid means of filling the identified gaps. Several tools have been proposed: OMS3 (David et al. 2013); the CSDMS framework (Peckham et al. 2013); and the OpenMI project developed within the framework of the European Community (OpenMI, 2011). However, what we should expect from a modeling platform could be more ambitious than merely coupling existing numerical codes. We believe that we need to share easily not only our numerical representations but also the attached knowledge. We need to develop complex models rapidly and easily, so as to have tools that can address current issues concerning soil functioning and soil evolution within the frame of global change. We also need to share, in a common framework, our visions of soil functioning at various scales, on the one hand to strengthen our collaborations and, on the other hand, to make them visible to the other communities working on environmental issues. The presentation will briefly introduce the VSoil platform, which is able to manipulate both concepts of soil processes and their numerical representations. The tool helps in assembling modules to create a model and automatically generates an executable code and a GUI. The potential of the tool will be illustrated with a few selected cases.

  12. Screening of multiple potential control genes for use in caste and body region comparisons using RT-qPCR in Coptotermes formosanus

    USDA-ARS?s Scientific Manuscript database

    Formosan subterranean termites, Coptotermes formosanus, are an important worldwide pest. Molecular gene expression is an important tool for understanding the physiology of organisms. The recent advancement of molecular tools for Coptotermes formosanus is leading to advancement of the understanding ...

  13. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  14. Editing of EIA coded, numerically controlled, machine tool tapes

    NASA Technical Reports Server (NTRS)

    Weiner, J. M.

    1975-01-01

    Editing of numerically controlled (N/C) machine tool tapes (8-level paper tape) using an interactive graphic display processor is described. A rapid technique required for correcting production errors in N/C tapes was developed using the interactive text editor on the IMLAC PDS-ID graphic display system and two special programs resident on disk. The correction technique and special programs for processing N/C tapes coded to EIA specifications are discussed.

  15. Metric Use in the Tool Industry. A Status Report and a Test of Assessment Methodology.

    DTIC Science & Technology

    1982-04-20

    Weights and Measures) CIM - Computer-Integrated Manufacturing CNC - Computer Numerical Control DOD - Department of Defense DODISS - DOD Index of...numerically-controlled (CNC) machines that have an inch-millimeter selection switch and a corresponding dual readout scale. The use of both metric...satisfactorily met the demands of both domestic and foreign customers for metric machine tools by providing either metric-capable machines or NC and CNC

  16. Recent advances on terrain database correlation testing

    NASA Astrophysics Data System (ADS)

    Sakude, Milton T.; Schiavone, Guy A.; Morelos-Borja, Hector; Martin, Glenn; Cortes, Art

    1998-08-01

    Terrain database correlation is a major requirement for interoperability in distributed simulation. There are numerous situations in which terrain database correlation problems can occur and, in turn, lead to a lack of interoperability in distributed training simulations. Examples are the use of different run-time terrain databases derived from inconsistent source data, the use of different resolutions, and the use of different data models between databases for both terrain and culture data. IST has been developing a suite of software tools, named ZCAP, to address terrain database interoperability issues. In this paper we discuss recent enhancements made to this suite, including improved algorithms for sampling and calculating line-of-sight, an improved method for measuring terrain roughness, and the application of a sparse matrix method to the terrain remediation solution developed at the Visual Systems Lab of the Institute for Simulation and Training. We review the application of some of these new algorithms to the terrain correlation measurement processes. The new algorithms improve our support for very large terrain databases and provide the capability to perform test replications to estimate the sampling error of the tests. With this set of tools, a user can quantitatively assess the degree of correlation between large terrain databases.
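
    As a generic illustration of one ingredient of such tests, a sampling-based line-of-sight check over a gridded terrain database can be sketched as follows (a simple Python sketch with made-up terrain, positions and eye height, not the ZCAP implementation):

```python
import numpy as np

def line_of_sight(height, p0, p1, eye=2.0, n_samples=200):
    """Sampling-based line-of-sight test over a regular elevation grid.

    height: 2D array of terrain elevations, one value per grid post.
    p0, p1: (row, col) observer and target positions.
    eye:    height of observer and target above the terrain surface.
    Returns True if the straight sight line clears the terrain at every sample.
    """
    (r0, c0), (r1, c1) = p0, p1
    z0 = height[r0, c0] + eye
    z1 = height[r1, c1] + eye
    for t in np.linspace(0.0, 1.0, n_samples)[1:-1]:
        r = r0 + t * (r1 - r0)
        c = c0 + t * (c1 - c0)
        # bilinear interpolation of the terrain under the sight line
        ri, ci = int(r), int(c)
        fr, fc = r - ri, c - ci
        ground = (height[ri, ci]*(1-fr)*(1-fc) + height[ri+1, ci]*fr*(1-fc)
                  + height[ri, ci+1]*(1-fr)*fc + height[ri+1, ci+1]*fr*fc)
        if z0 + t * (z1 - z0) <= ground:
            return False
    return True

# flat terrain with a ridge across the middle row
terrain = np.zeros((20, 20))
terrain[10, :] = 50.0
print(line_of_sight(terrain, (2, 2), (18, 18)))  # False: the ridge blocks the line
print(line_of_sight(terrain, (2, 2), (8, 18)))   # True: same side of the ridge
```

    Correlation problems arise when two simulators run this kind of test against differently derived databases of the same area and disagree; replicated sampling, as in the paper, lets one estimate the error introduced by the sampling itself.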

  17. Stellar and Binary Evolution in Star Clusters

    NASA Technical Reports Server (NTRS)

    McMillan, Stephen L. W.

    2001-01-01

    This paper presents a final report on research activities covering Stellar and Binary Evolution in Star Clusters. Substantial progress was made in the development and dissemination of the "Starlab" software environment. Significant improvements were made to "kira," an N-body simulation program tailored to the study of dense stellar systems such as star clusters and galactic nuclei. Key advances include (1) the inclusion of stellar and binary evolution in a self-consistent manner, (2) proper treatment of the anisotropic Galactic tidal field, (3) numerous technical enhancements in the treatment of binary dynamics and interactions, and (4) full support for the special-purpose GRAPE-4 hardware, boosting the program's performance by a factor of 10-100 over the unaccelerated version. The data-reduction and analysis tools in Starlab were also substantially expanded. A Starlab Web site (http://www.sns.ias.edu/~starlab) was created and developed. The site contains detailed information on the structure and function of the various tools that comprise the package, as well as download information, "how to" tips and examples of common operations, demonstration programs, animations, etc. All versions of the software are freely distributed to all interested users, along with detailed installation instructions.
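
    The orbital dynamics at the core of an N-body integrator such as kira can be illustrated with a much simpler sketch. The Python example below is a minimal shared-timestep leapfrog integrator (kira itself uses a Hermite scheme with block timesteps, optionally on GRAPE hardware); it follows an equal-mass binary on a circular orbit and checks energy conservation:

```python
import numpy as np

def accelerations(pos, mass):
    """Pairwise Newtonian accelerations (G = 1), direct summation."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        for j in range(len(mass)):
            if i != j:
                d = pos[j] - pos[i]
                acc[i] += mass[j] * d / (d @ d)**1.5
    return acc

def leapfrog(pos, vel, mass, dt, n_steps):
    """Kick-drift-kick leapfrog: symplectic, with good long-term energy behaviour."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

def energy(pos, vel, mass):
    kin = 0.5 * np.sum(mass * np.sum(vel**2, axis=1))
    pot = 0.0
    for i in range(len(mass)):
        for j in range(i + 1, len(mass)):
            pot -= mass[i] * mass[j] / np.linalg.norm(pos[i] - pos[j])
    return kin + pot

# equal-mass binary on a circular orbit (G = 1, total mass 1, separation 1)
m = np.array([0.5, 0.5])
pos = np.array([[0.5, 0.0], [-0.5, 0.0]])
vel = np.array([[0.0, 0.5], [0.0, -0.5]])  # circular-orbit speed for this setup
e0 = energy(pos, vel, m)
pos, vel = leapfrog(pos, vel, m, dt=0.01, n_steps=1000)
print(abs(energy(pos, vel, m) - e0))  # drift stays tiny for a symplectic scheme
```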

  18. Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.

    PubMed

    Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O

    2006-03-01

    The analysis of complex biochemical networks is commonly conducted within two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction-rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, generating counts of molecules of the chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although numerous tools are available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourages experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
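
    The stochastic half of such a toolbox can be sketched compactly. The following Python version (not the paper's MATLAB code) implements Gillespie's exact stochastic simulation algorithm for a hypothetical birth-death species; its time-averaged copy number can be checked against the deterministic ODE steady state k_birth/k_death:

```python
import math
import random

def gillespie_birth_death(k_birth, k_death, n0, t_end, rng=random.Random(42)):
    """Exact stochastic simulation (Gillespie SSA) of the reactions
    0 -> A (rate k_birth) and A -> 0 (rate k_death * n).
    Returns the time-averaged copy number of A over [0, t_end]."""
    t, n = 0.0, n0
    weighted_sum = 0.0
    while t < t_end:
        a1 = k_birth           # propensity of the birth reaction
        a2 = k_death * n       # propensity of the death reaction
        a0 = a1 + a2
        dt = -math.log(rng.random()) / a0  # exponential waiting time
        dt = min(dt, t_end - t)
        weighted_sum += n * dt
        t += dt
        if t >= t_end:
            break
        n += 1 if rng.random() * a0 < a1 else -1  # pick which reaction fires
    return weighted_sum / t_end

# deterministic (ODE) steady state: dn/dt = k_birth - k_death*n = 0
k_birth, k_death = 10.0, 0.5
n_ode = k_birth / k_death            # 20 molecules
n_ssa = gillespie_birth_death(k_birth, k_death, n0=0, t_end=2000.0)
print(n_ode, round(n_ssa, 1))  # the SSA time average fluctuates around 20
```

    The deterministic ODE gives the mean of this process exactly for linear kinetics; for nonlinear networks the two frameworks can diverge, which is why a toolbox offering both is useful.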

  19. Review of hardware-in-the-loop simulation and its prospects in the automotive area

    NASA Astrophysics Data System (ADS)

    Fathy, Hosam K.; Filipi, Zoran S.; Hagena, Jonathan; Stein, Jeffrey L.

    2006-05-01

    Hardware-in-the-loop (HIL) simulation is rapidly evolving from a control prototyping tool to a system modeling, simulation, and synthesis paradigm that synergistically combines many advantages of both physical and virtual prototyping. This paper provides a brief overview of the key enablers and numerous applications of HIL simulation, focusing on its metamorphosis from a control validation tool into a system development paradigm. It then describes a state-of-the-art engine-in-the-loop (EIL) simulation facility that highlights the use of HIL simulation for the system-level experimental evaluation of powertrain interactions and the development of strategies for clean and efficient propulsion. The facility comprises a real diesel engine coupled to accurate real-time driver, driveline, and vehicle models through a highly responsive dynamometer. This enables the verification of both performance and fuel economy predictions for different conventional and hybrid powertrains. Furthermore, the facility can both replicate the highly dynamic interactions occurring within a real powertrain and measure their influence on transient emissions and visual signature through state-of-the-art instruments. The viability of this facility for integrated powertrain system development is demonstrated through a case study exploring the development of advanced High Mobility Multipurpose Wheeled Vehicle (HMMWV) powertrains.

  20. Virtual laboratories: new opportunities for collaborative water science

    NASA Astrophysics Data System (ADS)

    Ceola, Serena; Arheimer, Berit; Bloeschl, Guenter; Baratti, Emanuele; Capell, Rene; Castellarin, Attilio; Freer, Jim; Han, Dawei; Hrachowitz, Markus; Hundecha, Yeshewatesfa; Hutton, Christopher; Lindström, Goran; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Toth, Elena; Viglione, Alberto; Wagener, Thorsten

    2015-04-01

    Reproducibility and repeatability of experiments are fundamental prerequisites that allow researchers to validate results and share hydrological knowledge, experience and expertise in the light of global water management problems. Virtual laboratories offer new opportunities to meet these prerequisites, since they allow experimenters to share data, tools and pre-defined experimental procedures (i.e. protocols). Here we present the outcomes of a first collaborative numerical experiment undertaken in a virtual laboratory by five international research groups to address the key issues of reproducibility and repeatability. Starting from the definition of accurate and detailed experimental protocols, a rainfall-runoff model was independently applied to 15 European catchments by the research groups, and model results were collectively examined through a web-based discussion. We found that a detailed modelling protocol was crucial to ensure the comparability and reproducibility of the proposed experiment across groups. Our results suggest that sharing comprehensive and precise protocols, and running the experiments within a controlled environment (e.g. a virtual laboratory), is as fundamental as sharing data and tools for ensuring experiment repeatability and reproducibility across the broad scientific community, and thus for advancing hydrology in a more coherent way.
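
    The kind of rainfall-runoff model applied in such an experiment can be sketched in deliberately minimal form as a single linear reservoir. The Python toy below (made-up parameters, not the model or protocol used in the study) shows the basic storage-discharge bookkeeping that a shared protocol would have to pin down exactly (units, time step, initial storage) for results to be comparable across groups:

```python
def linear_reservoir_runoff(rain, k=0.2, s0=0.0):
    """Toy daily rainfall-runoff model: a single linear reservoir.

    Storage S is updated as S <- S + P, then drained by Q = k * S.
    rain: daily precipitation depths; returns the simulated daily discharge."""
    s, flow = s0, []
    for p in rain:
        s += p          # add the day's rainfall to storage
        q = k * s       # linear-reservoir outflow
        s -= q
        flow.append(q)
    return flow

# three wet days followed by a week-long dry recession
rain = [10.0, 5.0, 20.0] + [0.0] * 7
q = linear_reservoir_runoff(rain)
print([round(v, 2) for v in q])  # recession decays by a factor (1 - k) per dry day
```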

  1. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, and the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the "Neuro-IT Interoperability of Simulators" workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
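
    The first approach listed, portable model descriptions, can be illustrated schematically: the model is published as plain data rather than simulator-specific code, so any tool can parse it. The JSON layout below is hypothetical and greatly simplified (real standards such as NeuroML use richer, XML-based schemas):

```python
import json

# A minimal, hypothetical portable description of a leaky integrate-and-fire
# cell: plain data with explicit units, no simulator-specific code.
model = {
    "type": "leaky_integrate_and_fire",
    "parameters": {"C_m": 1.0, "g_L": 0.1, "E_L": -65.0, "V_th": -50.0},
    "units": {"C_m": "nF", "g_L": "uS", "E_L": "mV", "V_th": "mV"},
}

text = json.dumps(model, indent=2)   # publish / serialise the model
restored = json.loads(text)          # a different simulator re-reads it

# both simulators now agree on every parameter of the published model
assert restored == model
tau_m = restored["parameters"]["C_m"] / restored["parameters"]["g_L"]
print(tau_m)  # membrane time constant implied by the description: 10.0 (ms)
```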

  2. Software Testing and Verification in Climate Model Development

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated, and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance of systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
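
    As an example of the fine-grained testing advocated here, a unit test can verify a numerical kernel's order of accuracy directly, instead of diffing the output of a full simulation. The Python sketch below is illustrative only (the routine and tolerances are made up):

```python
import math

def centered_diff(f, x, h):
    """Second-order centered finite-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def test_centered_diff_is_second_order():
    """Fine-grained 'unit' test of the numerical kernel itself: halving h
    should cut the error by ~4x for a second-order scheme, checked against
    a function whose derivative is known exactly."""
    x = 0.7
    exact = math.cos(x)                                     # d/dx sin(x)
    err_h = abs(centered_diff(math.sin, x, 1e-2) - exact)
    err_half = abs(centered_diff(math.sin, x, 5e-3) - exact)
    assert 3.5 < err_h / err_half < 4.5

test_centered_diff_is_second_order()   # would normally be collected by pytest
print("convergence-order test passed")
```

    A test like this runs in milliseconds, pins down the defect to a single routine when it fails, and catches a whole class of discretization bugs that a system-level regression diff can only flag as "output changed".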

  3. Pluripotent stem cells reveal the developmental biology of human megakaryocytes and provide a source of platelets for clinical application.

    PubMed

    Takayama, Naoya; Eto, Koji

    2012-10-01

    Human pluripotent stem cells [PSCs, including human embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs)] can proliferate indefinitely in vitro and are easily accessible for gene manipulation. Megakaryocytes (MKs) and platelets can be created from human ESCs and iPSCs in vitro and represent a potential source of blood cells for transfusion as well as a promising tool for studying human thrombopoiesis. Moreover, disease-specific iPSCs are a powerful tool for elucidating the pathogenesis of hematological diseases and for drug screening. In that context, we and other groups have developed in vitro MK and platelet differentiation systems from human PSCs. Combining this co-culture system with a drug-inducible gene expression system enabled us to clarify the novel role played by c-MYC during human thrombopoiesis. In the next decade, technical advances (e.g., high-throughput genomic sequencing) will likely enable the identification of numerous gene mutations associated with abnormal thrombopoiesis. Combined with such technology, an in vitro system for differentiating human PSCs into MKs and platelets could provide a novel platform for studying human gene functions associated with thrombopoiesis.

  4. Advancing User Supports with a Structured How-To Knowledge Base for Earth Science Data

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Acker, James G.; Lynnes, Christopher S.; Beaty, Tammy; Lighty, Luther; Kempler, Steven J.

    2016-01-01

    It is a challenge to access and process fast-growing Earth science data from satellites and numerical models, which may be archived in very different data formats and structures. NASA data centers, managed by the Earth Observing System Data and Information System (EOSDIS), have developed a rich and diverse set of data services and tools with features intended to simplify finding, downloading, and working with these data. Although most data services and tools have user guides, many users still experience difficulties with accessing or reading data due to varying levels of familiarity with data services, tools, and/or formats. A type of structured online document, the data recipe, was created beginning in 2013 by the Goddard Earth Sciences Data and Information Services Center (GES DISC). A data recipe is a How-To document created from a fixed template, containing step-by-step instructions with screenshots and examples of accessing and working with real data. The recipes have been found to be very helpful, especially to first-time users of particular data services, tools, or data products, and online traffic to some recipe pages is significant. In 2014, the NASA Earth Science Data System Working Group (ESDSWG) for data recipes was established, aiming to initiate an EOSDIS-wide campaign to leverage the distributed knowledge within EOSDIS and its user communities regarding their respective services and tools. The ESDSWG data recipe group started with an inventory and analysis of existing EOSDIS-wide online help documents, and provided recommendations and guidelines for writing and grouping data recipes. This presentation will give an overview of the activities of creating How-To documents at GES DISC and the ESDSWG. We encourage feedback and contributions from users for improving the data How-To knowledge base.

  5. Advancing User Supports with Structured How-To Knowledge Base for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Shen, S.; Acker, J. G.; Lynnes, C.; Lighty, L.; Beaty, T.; Kempler, S.

    2016-12-01

    It is a challenge to access and process fast-growing Earth science data from satellites and numerical models, which may be archived in very different data formats and structures. NASA data centers, managed by the Earth Observing System Data and Information System (EOSDIS), have developed a rich and diverse set of data services and tools with features intended to simplify finding, downloading, and working with these data. Although most data services and tools have user guides, many users still experience difficulties with accessing or reading data due to varying levels of familiarity with data services, tools, and/or formats. A type of structured online document, the "data recipe", was created beginning in 2013 by the Goddard Earth Sciences Data and Information Services Center (GES DISC). A data recipe is a "How-To" document created from a fixed template, containing step-by-step instructions with screenshots and examples of accessing and working with real data. The recipes have been found to be very helpful, especially to first-time users of particular data services, tools, or data products, and online traffic to some recipe pages is significant. In 2014, the NASA Earth Science Data System Working Group (ESDSWG) for data recipes was established, aiming to initiate an EOSDIS-wide campaign to leverage the distributed knowledge within EOSDIS and its user communities regarding their respective services and tools. The ESDSWG data recipe group started with an inventory and analysis of existing EOSDIS-wide online help documents, and provided recommendations and guidelines for writing and grouping data recipes. This presentation will give an overview of the activities of creating How-To documents at GES DISC and the ESDSWG. We encourage feedback and contributions from users for improving the data How-To knowledge base.

  6. ATST telescope mount: telescope or machine tool

    NASA Astrophysics Data System (ADS)

    Jeffers, Paul; Stolz, Günter; Bonomi, Giovanni; Dreyer, Oliver; Kärcher, Hans

    2012-09-01

    The Advanced Technology Solar Telescope (ATST) will be the largest solar telescope in the world and will provide the sharpest views ever taken of the solar surface. The telescope has a 4 m aperture primary mirror; however, due to the off-axis nature of the optical layout, the telescope mount has proportions similar to an 8-meter-class telescope. The technology normally used in this class of telescope is well understood in the telescope community and has been successfully implemented in numerous projects. The world of large machine tools has developed in a separate realm, with similar levels of performance requirements but different boundary conditions; in addition, the competitive nature of private industry has encouraged the development and use of more cost-effective solutions, in both initial capital cost and through-life operating cost. Telescope mounts move relatively slowly, with requirements for high stability under external environmental influences such as wind buffeting. Large machine tools operate under high-speed requirements, coupled with high application of force through the machine, but with little or no external environmental influence. The benefits of these parallel development paths and the ATST system requirements are being combined in the ATST Telescope Mount Assembly (TMA). The process of balancing the system requirements with new technologies is based on the experience of the ATST project team, Ingersoll Machine Tools, who are the main contractor for the TMA, and MT Mechatronics, who are their design subcontractors. This paper highlights a number of proven technologies from the commercially driven machine tool world that are being introduced into the TMA design. The challenges of integrating these technologies, and of ensuring that the differences in application requirements are accounted for in the design, are also discussed.

  7. The Employment Retention and Advancement Project: How Effective Are Different Approaches Aiming to Increase Employment Retention and Advancement? Final Impacts for Twelve Models. Executive Summary

    ERIC Educational Resources Information Center

    Hendra, Richard; Dillman, Keri-Nicole; Hamilton, Gayle; Lundquist, Erika; Martinson, Karin; Wavelet, Melissa

    2010-01-01

    This report summarizes the final impact results for the national Employment Retention and Advancement (ERA) project. This project tested, using a random assignment design, the effectiveness of numerous programs intended to promote steady work and career advancement. All the programs targeted current and former welfare recipients and other low-wage…

  8. Integrating advanced visualization technology into the planetary Geoscience workflow

    NASA Astrophysics Data System (ADS)

    Huffman, John; Forsberg, Andrew; Loomis, Andrew; Head, James; Dickson, James; Fassett, Caleb

    2011-09-01

    Recent advances in computer visualization have allowed us to develop new tools for analyzing the data gathered during planetary missions, which is important, since these data sets have grown exponentially in recent years to tens of terabytes in size. As part of the Advanced Visualization in Solar System Exploration and Research (ADVISER) project, we utilize several advanced visualization techniques created specifically with planetary image data in mind. The Geoviewer application allows real-time active stereo display of images, which in aggregate have billions of pixels. The ADVISER desktop application platform allows fast three-dimensional visualization of planetary images overlain on digital terrain models. Both applications include tools for easy data ingest and real-time analysis in a programmatic manner. Incorporation of these tools into our everyday scientific workflow has proved important for scientific analysis, discussion, and publication, and enabled effective and exciting educational activities for students from high school through graduate school.

  9. Numerical Analysis of AHSS Fracture in a Stretch-bending Test

    NASA Astrophysics Data System (ADS)

    Luo, Meng; Chen, Xiaoming; Shi, Ming F.; Shih, Hua-Chu

    2010-06-01

    Advanced High Strength Steels (AHSS) are increasingly used in the automotive industry due to their superior strength and substantial weight reduction advantage. However, their limited ductility gives rise to numerous manufacturing issues. One of them is the so-called 'shear fracture' often observed on tight radii during stamping processes. Since traditional approaches, such as the Forming Limit Diagram (FLD), are unable to predict this type of fracture, efforts have been made to develop failure criteria that can predict shear fractures. In this paper, a recently developed Modified Mohr-Coulomb (MMC) ductile fracture criterion [1] is adopted to analyze the failure behavior of a Dual Phase (DP) steel sheet during stretch bending operations. The plasticity and ductile fracture of the sheet are fully characterized by the Hill'48 orthotropic model and the MMC fracture model, respectively. Finite element models with three different element types (3D, shell, and plane strain) were built for a Stretch Forming Simulator (SFS) test, and numerical simulations with four different R/t ratios (die radius normalized by sheet thickness) were performed. The 3D and shell element models accurately predict the failure location/mode, the upper-die load-displacement responses, and the wall stress and wrap angle at the onset of fracture for all R/t ratios. Furthermore, a series of parametric studies was conducted on the 3D element model, investigating the effects of tension level (clamping distance) and tooling friction on the failure modes and locations.

  10. Geodynamics for Everyone: Robust Finite-Difference Heat Transfer Models using MS Excel 2007 Spreadsheets

    NASA Astrophysics Data System (ADS)

    Grose, C. J.

    2008-05-01

    Numerical geodynamics models of heat transfer are typically thought of as specialized research topics requiring knowledge of dedicated modelling software, Linux platforms, and state-of-the-art finite-element codes. I have implemented analytical and numerical finite-difference techniques in Microsoft Excel 2007 spreadsheets to solve complex solid-earth heat transfer problems, for use by students, teachers, and practicing scientists without specialty in geodynamics modelling techniques and applications. While the implementation of equations in Excel spreadsheets is occasionally cumbersome, once the boundary structure and node equations for a case are developed, spreadsheet manipulation becomes routine. Experimenting with a model by modifying parameter values, geometry, and grid resolution makes Excel a useful tool, whether in the classroom at the undergraduate or graduate level or for more engaging student projects. Furthermore, the ability to incorporate complex geometries and heat-transfer characteristics makes it suitable for first-order, and occasionally higher-order, geodynamics simulations that help to understand and constrain the results of professional field research, in a setting that does not impose the overhead of state-of-the-art modelling codes. The straightforward expression and manipulation of model equations in Excel can also serve as a medium for better understanding the often confusing notation of advanced mathematical problems. To illustrate the power and robustness of computation and visualization in spreadsheet models, I focus primarily on one-dimensional analytical and two-dimensional numerical solutions to two case problems: (i) the cooling of oceanic lithosphere and (ii) temperatures within subducting slabs. Excel source documents will be made available.
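
    The first case problem, the cooling of oceanic lithosphere, translates directly into the kind of explicit finite-difference scheme the spreadsheets implement. The Python sketch below (not the Excel workbook; parameter values are illustrative) compares an FTCS solution against the analytical half-space erf profile:

```python
import math

SEC_PER_MYR = 3.156e13  # seconds per million years

def ftcs_halfspace(T_surf, T_mantle, kappa, depth, nz, t_end, r=0.4):
    """Explicit (FTCS) finite-difference solution of 1D heat conduction for
    a cooling half-space column: fixed temperature at the surface, mantle
    temperature at the base.  Stability requires r = kappa*dt/dz^2 <= 0.5."""
    dz = depth / (nz - 1)
    dt = r * dz**2 / kappa
    n_steps = int(t_end / dt)
    T = [T_mantle] * nz
    T[0] = T_surf
    for _ in range(n_steps):
        Tn = T[:]
        for i in range(1, nz - 1):
            Tn[i] = T[i] + r * (T[i+1] - 2.0*T[i] + T[i-1])
        T = Tn
    return T, dz, n_steps * dt

# 50 Myr-old oceanic lithosphere: kappa ~ 1e-6 m^2/s, 200 km deep column
T_num, dz, t = ftcs_halfspace(T_surf=0.0, T_mantle=1350.0, kappa=1e-6,
                              depth=200e3, nz=201, t_end=50*SEC_PER_MYR)
# analytical half-space cooling solution: T = T_m * erf(z / (2*sqrt(kappa*t)))
T_exact = [1350.0 * math.erf(i*dz / (2.0*math.sqrt(1e-6*t))) for i in range(201)]
worst = max(abs(a - b) for a, b in zip(T_num, T_exact))
print(f"max error {worst:.2f} C")  # the grid solution tracks the erf profile closely
```

    The same update rule, one cell per grid node with each cell referencing its neighbors, is what the spreadsheet encodes; changing grid resolution or the stability ratio r and watching the solution respond is exactly the kind of experimentation the abstract describes.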

  11. Total Quality Management: Analysis, Evaluation and Implementation Within ACRV Project Teams

    NASA Technical Reports Server (NTRS)

    Raiman, Laura B.

    1991-01-01

    Total quality management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The Assured Crew Return Vehicle (ACRV) Project Office was identified as an excellent project in which to demonstrate the applications and benefits of TQM processes. As the ACRV Program moves through its various stages of development, it is vital that effectiveness and efficiency be maintained in order to provide the Space Station Freedom (SSF) crew an affordable, on-time assured return to Earth. A critical factor for the success of the ACRV is attaining the maximum benefit from the resources applied to the program. Through a series of four tutorials on various quality improvement techniques, and through numerous one-on-one sessions during the author's 10-week term in the project office, results were obtained that are aiding the ACRV Office in implementing a disciplined, ongoing process for generating the fundamental decisions and actions that shape and guide the organization. Significant advances were made in improving the processes of two particular groups: the correspondence distribution team and the WATER Test team. Numerous people from across JSC took part in the various team activities, including engineering, man systems, and safety. The work also included significant interaction with the support contractor to the ACRV Project. The results of the improvement activities can be used as models for other organizations desiring to operate under a system of continuous improvement. In particular, they have advanced the ACRV Project teams further down the path of continuous improvement, in support of a working philosophy of TQM.

  12. Digital Rocks Portal: a sustainable platform for imaged dataset sharing, translation and automated analysis

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Hanlon, M.; Nanda, G.; Agarwal, P.

    2015-12-01

    Recent advances in imaging have provided a wealth of 3D datasets that reveal pore-space microstructure (nm to cm length scales) and allow investigation of nonlinear flow and mechanical phenomena from first principles using numerical approaches, a framework popularly called "digital rock physics". Researchers, however, have trouble storing and sharing these datasets, both because of their size and because of the lack of standardized image types and associated metadata for volumetric datasets. This impedes scientific cross-validation of the numerical approaches that characterize large-scale porous media properties, as well as development of the multiscale approaches required for correct upscaling. A single research group typically specializes in one imaging modality and/or related modeling on a single length scale, and the lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open, and easy-to-use repository called the Digital Rocks Portal that (1) organizes images and related experimental measurements of different porous materials and (2) improves access to them for a wider community of geoscience and engineering researchers not necessarily trained in computer science or data analysis. Once widely accepted, the repository will jumpstart productivity and enable scientific inquiry and engineering decisions founded on a data-driven basis. This is the first repository of its kind. We show initial results on incorporating essential software tools and pipelines that make it easier for researchers to store and reuse data, and for educators to quickly visualize and illustrate concepts to a wide audience. For data sustainability and continuous access, the portal is implemented within the reliable, continuously maintained high-performance computing infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyberinfrastructure initiative.

  13. Enhanced Representation of Turbulent Flow Phenomena in Large-Eddy Simulations of the Atmospheric Boundary Layer using Grid Refinement with Pseudo-Spectral Numerics

    NASA Astrophysics Data System (ADS)

    Torkelson, G. Q.; Stoll, R., II

    2017-12-01

    Large Eddy Simulation (LES) is a tool commonly used to study the turbulent transport of momentum, heat, and moisture in the Atmospheric Boundary Layer (ABL). For a wide range of ABL LES applications, representing the full range of turbulent length scales in the flow field is a challenge. This is an acute problem in regions of the ABL with strong velocity or scalar gradients, which are typically poorly resolved by standard computational grids (e.g., near the ground surface, in the entrainment zone). Most efforts to address this problem have focused on advanced sub-grid scale (SGS) turbulence model development, or on the use of massive computational resources. While some work exists using embedded meshes, very little has been done on the use of grid refinement. Here, we explore the benefits of grid refinement in a pseudo-spectral LES numerical code. The code utilizes both uniform refinement of the grid in the horizontal directions and stretching of the grid in the vertical direction. Combining the two techniques allows us to refine areas of the flow while maintaining an acceptable grid aspect ratio. In tests that used only refinement of the vertical grid spacing, large grid aspect ratios were found to cause a significant unphysical spike in the stream-wise velocity variance near the ground surface. This was especially problematic in simulations of stably stratified ABL flows. The use of advanced SGS models was not sufficient to alleviate this issue. The new refinement technique is evaluated using a series of idealized simulation test cases of neutrally and stably stratified ABLs. These test cases illustrate the ability of grid refinement to increase computational efficiency without loss in the representation of statistical features of the flow field.
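    Vertical grid stretching of the kind described can be illustrated with a simple tanh clustering map; the mapping below is a generic example for intuition, not the code's actual scheme, and the aspect-ratio helper shows the quantity the combined refinement is meant to keep acceptable:

```python
import math

def stretched_grid(n, H, beta=2.0):
    """Vertically stretched grid on [0, H] with tanh clustering toward the
    ground (z = 0), a common way to refine near-surface gradients.
    beta controls clustering strength (illustrative mapping only)."""
    return [H * (1.0 - math.tanh(beta * (1.0 - i / n)) / math.tanh(beta))
            for i in range(n + 1)]

def aspect_ratios(dx, z):
    """Grid aspect ratio dx/dz for each vertical cell of the grid z."""
    return [dx / (z[i + 1] - z[i]) for i in range(len(z) - 1)]
```

    With horizontal refinement reducing dx near the surface, the aspect ratio of the finest near-wall cells can stay bounded even as the vertical spacing shrinks.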

  14. The ADER-DG method for seismic wave propagation and earthquake rupture dynamics

    NASA Astrophysics Data System (ADS)

    Pelties, Christian; Gabriel, Alice; Ampuero, Jean-Paul; de la Puente, Josep; Käser, Martin

    2013-04-01

    We will present the Arbitrary high-order DERivatives Discontinuous Galerkin (ADER-DG) method for solving the combined elastodynamic wave propagation and dynamic rupture problem. The ADER-DG method achieves high-order accuracy in space and time while being implemented on unstructured tetrahedral meshes. A tetrahedral element discretization provides rapid, automated mesh generation as well as geometrical flexibility. Features such as mesh coarsening and local time-stepping schemes can be applied to reduce computational effort without introducing numerical artifacts. The method is well suited for parallelization and large-scale high-performance computing, since only directly neighboring elements exchange information via numerical fluxes. The concept of fluxes is a key ingredient of the numerical scheme, as it governs the numerical dispersion and diffusion properties and allows the scheme to accommodate boundary conditions, empirical friction laws of dynamic rupture processes, or the combination of different element types and non-conforming mesh transitions. After introducing fault dynamics into the ADER-DG framework, we will demonstrate its specific advantages in benchmark test scenarios provided by the SCEC/USGS Spontaneous Rupture Code Verification Exercise. An important result of the benchmark is that the ADER-DG method avoids spurious high-frequency contributions in the slip-rate spectra and therefore does not require artificial Kelvin-Voigt damping, filtering, or other modifications of the produced synthetic seismograms. To demonstrate the capabilities of the proposed scheme we simulate an earthquake scenario, inspired by the 1992 Landers earthquake, that includes branching and curved fault segments. Furthermore, topography is respected in the discretized model to capture surface waves correctly. The advanced geometrical flexibility combined with enhanced accuracy will make the ADER-DG method a useful tool for studying earthquake dynamics on complex fault systems in realistic rheologies.

  15. Modelling river bank erosion processes and mass failure mechanisms using 2-D depth averaged numerical model

    NASA Astrophysics Data System (ADS)

    Die Moran, Andres; El kadi Abderrezzak, Kamal; Tassi, Pablo; Herouvet, Jean-Michel

    2014-05-01

    Bank erosion is a key process that may cause a large number of economic and environmental problems (e.g., land loss, damage to structures and aquatic habitat). Stream-bank erosion (toe erosion and mass failure) represents an important form of channel morphology change and a significant source of sediment. With the advances made in computational techniques, two-dimensional (2-D) numerical models have become valuable tools for investigating flow and sediment transport in open channels at large temporal and spatial scales. However, the implementation of mass-failure processes in 2-D numerical models is still a challenging task. In this paper, a simple, innovative algorithm is implemented in the Telemac-Mascaret modeling platform to handle bank failure: failure occurs when the actual slope of a given bed element exceeds the internal friction angle. The unstable bed elements are rotated around an appropriate axis, ensuring mass conservation. Mass failure of a bank due to slope instability is applied at the end of each sediment transport evolution iteration, once the bed evolution due to bed load (and/or suspended load) has been computed, but before the global sediment mass balance is verified. This bank failure algorithm is successfully tested against two laboratory experimental cases. Then, bank failure in a 1:40 scale physical model of the Rhine River composed of non-uniform material is simulated. The main features of the bank erosion and failure are correctly reproduced in the numerical simulations, namely the mass wasting at the bank toe, followed by failure at the bank head, and subsequent transport of the mobilised material in an aggradation front. The computed volumes of eroded material are of the same order of magnitude as the volumes measured during the laboratory tests.
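    A one-dimensional toy version of this failure criterion (a slope exceeding the internal friction angle triggers a mass-conserving redistribution) might look like the following; the actual Telemac-Mascaret algorithm rotates unstable bed elements in two dimensions:

```python
import math

def relax_bank(z, dx, phi_deg, max_iter=1000):
    """Toy 1-D analogue of the mass-failure step: wherever the local bed slope
    between adjacent cells exceeds the internal friction angle phi, transfer
    material between the two cells so the slope relaxes to phi, conserving
    total mass. Illustrative sketch only."""
    z = list(z)
    smax = math.tan(math.radians(phi_deg))
    for _ in range(max_iter):
        moved = False
        for i in range(len(z) - 1):
            slope = (z[i] - z[i + 1]) / dx
            if abs(slope) > smax + 1e-12:
                # split the excess height between the two cells: sum unchanged
                excess = 0.5 * (abs(slope) - smax) * dx
                hi, lo = (i, i + 1) if slope > 0 else (i + 1, i)
                z[hi] -= excess
                z[lo] += excess
                moved = True
        if not moved:
            break
    return z
```

    Sweeping repeatedly lets an over-steepened bank cascade until every cell pair satisfies the friction-angle criterion.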

  16. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools, and data.

  17. Quality Improvement Project: Replacing the Numeric Rating Scale with a Clinically Aligned Pain Assessment (CAPA) Tool.

    PubMed

    Topham, Debra; Drew, Debra

    2017-12-01

    CAPA is a multifaceted pain assessment tool that was adopted at a large tertiary Midwest hospital to replace the numeric scale for adult patients who could self-report their pain experience. This article describes the process of implementation and the effect on patient satisfaction scores. Use of the tool is supported by the premise that pain assessment entails more than just pain intensity and that assessment is an exchange of meaning between patients and clinicians dependent on internal and external factors. Implementation of the tool was a transformative process resulting in modest increases in patient satisfaction scores with pain management. Patient reports that "staff did everything to manage pain" had the biggest gains and were sustained for more than 2 years. The CAPA tool meets regulatory requirements for pain assessment. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  18. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production

    PubMed Central

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such tools by stakeholders. PMID:28804490

  19. Automated shock detection and analysis algorithm for space weather application

    NASA Astrophysics Data System (ADS)

    Vorotnikov, Vasiliy S.; Smith, Charles W.; Hu, Qiang; Szabo, Adam; Skoug, Ruth M.; Cohen, Christina M. S.

    2008-03-01

    Space weather applications have grown steadily as real-time data have become increasingly available. Numerous industrial applications have arisen, with safeguarding of the power distribution grids being of particular interest. NASA uses short-term and long-term space weather predictions in its launch facilities. Researchers studying ionospheric, auroral, and magnetospheric disturbances use real-time space weather services to determine launch times. Commercial airlines, communication companies, and the military use space weather measurements to manage their resources and activities. As the effects of solar transients upon the Earth's environment and society grow with the increasing complexity of technology, better tools are needed to monitor and evaluate the characteristics of incoming disturbances. In particular, there is a need for automated shock detection and analysis methods applicable to in situ measurements upstream of the Earth. Such tools can provide advance warning of approaching disturbances that have significant space weather impacts. Knowledge of the shock strength and speed can also provide insight into the nature of an approaching solar transient prior to its arrival at the magnetopause. We report on efforts to develop a tool that can find and analyze shocks in interplanetary plasma data without operator intervention. The method runs with sufficient speed to be a practical space weather tool, providing useful shock information within 1 min of receipt of the necessary data on the ground. The ability to run without human intervention frees space weather operators to perform other vital services. We describe ways of handling upstream data that minimize the frequency of false-positive alerts while providing the most complete description of approaching disturbances that is reasonably possible.
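    The abstract does not give the detection criteria, so as a deliberately simple stand-in, an automated shock candidate detector can be sketched as a jump test on windowed means of, e.g., solar-wind speed:

```python
def detect_shocks(v, window=5, jump=50.0):
    """Flag candidate shock-arrival indices in a solar-wind speed series
    (km/s): compare the mean of the `window` samples just before each point
    with the mean of the `window` samples from that point on, and report
    indices where the increase exceeds `jump`. A toy criterion, not the
    paper's algorithm."""
    hits = []
    for i in range(window, len(v) - window):
        before = sum(v[i - window:i]) / window
        after = sum(v[i:i + window]) / window
        if after - before > jump:
            hits.append(i)
    return hits
```

    A real detector would combine several plasma and field quantities and apply Rankine-Hugoniot-style consistency checks to suppress false positives.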

  20. "We actually care and we want to make the parks better": A qualitative study of youth experiences and perceptions after conducting park audits.

    PubMed

    Gallerani, David G; Besenyi, Gina M; Wilhelm Stanis, Sonja A; Kaczynski, Andrew T

    2017-02-01

    This study explored youths' experiences and perceptions about community engagement as a result of participating in a community-based data collection project using paper and mobile technology park environmental audit tools. In July 2014, youth (ages 11-18, n=50) were recruited to participate in nine focus groups after auditing two parks each using paper, electronic, or both versions of the Community Park Audit Tool in Greenville County, SC. The focus groups explored the youths' experiences participating in the project, changes as a result of participation, suggested uses of park audit data collected, and who should use the tools. Four themes emerged related to youths' project participation experiences: two positive (fun and new experiences) and two negative (uncomfortable/unsafe and travel issues). Changes described as a result of participating in the project fell into four themes: increased awareness, motivation for further action, physical activity benefits, and no change. Additionally, youth had numerous suggestions for utilizing the data collected that were coded into six themes: maintenance & aesthetics, feature/amenity addition, online park information, park rating/review system, fundraising, and organizing community projects. Finally, six themes emerged regarding who the youth felt could use the tools: frequent park visitors, community groups/organizations, parks and recreation professionals, adults, youth, and everyone. This study revealed a wealth of information about youth experiences conducting park audits for community health promotion. Understanding youth attitudes and preferences can help advance youth empowerment and civic engagement efforts to promote individual and community health. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Data fusion for CD metrology: heterogeneous hybridization of scatterometry, CDSEM, and AFM data

    NASA Astrophysics Data System (ADS)

    Hazart, J.; Chesneau, N.; Evin, G.; Largent, A.; Derville, A.; Thérèse, R.; Bos, S.; Bouyssou, R.; Dezauzier, C.; Foucher, J.

    2014-04-01

    The manufacturing of next-generation semiconductor devices demands an exceptional effort from metrology tool providers to meet the requirements for precision, accuracy, and throughput stated in the ITRS. In recent years hybrid metrology (based on data fusion theories) has been investigated as a new methodology for advanced metrology [1][2][3]. This paper provides a new point of view on data fusion for metrology through experiments and simulations. The techniques are presented concretely in terms of the equations to be solved. The first point of view is High Level Fusion, in which each tool reports simple numbers with associated uncertainties that are then post-processed. In this paper it is divided into two stages: one for calibration to reach accuracy, and a second to reach precision through Bayesian fusion. From our perspective, the first stage is mandatory before applying the second, which is the one commonly presented [1]. However, a reference metrology system is necessary for this fusion, so precision can be improved if and only if the tools to be fused are perfectly matched, at least for some parameters. We provide a methodology, similar to a multidimensional TMU, able to perform this matching exercise; it is demonstrated on a 28 nm node back-end lithography case. The second point of view is Deep Level Fusion, which works instead with raw data and their combination. In the approach presented here, the analysis of each tool's raw data is based on a parametric model and on connections between the parameters of each tool. To allow OCD/SEM Deep Level Fusion, a SEM compact model derived from [4] has been developed and compared to AFM. As far as we know, this is the first time such techniques have been coupled at the Deep Level. A numerical study on a simple lithography stack is performed. We show strict equivalence of Deep Level Fusion and High Level Fusion when the tools are sensitive and the models are perfect. When one of the tools can be considered a reference and the second is biased, High Level Fusion is far superior to standard Deep Level Fusion. Otherwise, only the second stage of High Level Fusion (Bayesian fusion) is possible, and it does not provide a substantial advantage. Finally, when OCD is equipped with methods for bias detection [5], Deep Level Fusion outclasses the two-stage High Level Fusion and will benefit the industry for the most advanced production nodes.
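    In the simplest Gaussian case, the Bayesian (second-stage) fusion of calibrated tool readings reduces to an inverse-variance weighted combination; a minimal sketch under that assumption:

```python
def fuse(measurements):
    """Bayesian fusion of calibrated CD measurements under a Gaussian
    assumption: inverse-variance weighted combination of (value, uncertainty)
    pairs from matched tools. Returns the fused value and its reduced
    uncertainty. Illustrative only."""
    weights = [1.0 / (u * u) for _, u in measurements]
    value = sum(w * x for (x, _), w in zip(measurements, weights)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return value, sigma
```

    The fused uncertainty is always smaller than that of the best single tool, which is precisely why the prior calibration stage matters: fusing unmatched (biased) tools would tighten the error bars around a wrong value.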

  3. The Beck Depression Inventory (BDI-II) and a single screening question as screening tools for depressive disorder in Dutch advanced cancer patients.

    PubMed

    Warmenhoven, Franca; van Rijswijk, Eric; Engels, Yvonne; Kan, Cornelis; Prins, Judith; van Weel, Chris; Vissers, Kris

    2012-02-01

    Depression is highly prevalent in advanced cancer patients, but the diagnosis of depressive disorder in patients with advanced cancer is difficult. Screening instruments could facilitate diagnosing depressive disorder in patients with advanced cancer. The aim of this study was to determine the validity of the Beck Depression Inventory (BDI-II) and a single screening question as screening tools for depressive disorder in advanced cancer patients. Patients with advanced metastatic disease, visiting the outpatient palliative care department, were asked to fill out a self-report questionnaire containing the Beck Depression Inventory (BDI-II) and the single screening question "Are you feeling depressed?" The mood section of the PRIME-MD was used as a gold standard. Sixty-one patients with advanced metastatic disease were eligible to be included in the study. Complete data were obtained from 46 patients. The area under the curve of the receiver operating characteristics analysis of the BDI-II was 0.82. The optimal cut-off point of the BDI-II was 16, with a sensitivity of 90% and a specificity of 69%. The single screening question showed a sensitivity of 50% and a specificity of 94%. The BDI-II seems to be an adequate screening tool for a depressive disorder in advanced cancer patients. The sensitivity of a single screening question is poor.
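    The reported sensitivity and specificity at a cutoff follow directly from the confusion counts against the gold standard; a small illustrative helper (hypothetical, not the study's analysis code):

```python
def screen_metrics(scores, has_disorder, cutoff):
    """Sensitivity and specificity of a screening score (e.g. BDI-II) against
    a gold-standard diagnosis, treating score >= cutoff as a positive screen."""
    tp = sum(1 for s, d in zip(scores, has_disorder) if d and s >= cutoff)
    fn = sum(1 for s, d in zip(scores, has_disorder) if d and s < cutoff)
    tn = sum(1 for s, d in zip(scores, has_disorder) if not d and s < cutoff)
    fp = sum(1 for s, d in zip(scores, has_disorder) if not d and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)
```

    Sweeping the cutoff and plotting sensitivity against (1 - specificity) traces the ROC curve whose area the study reports as 0.82.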

  4. Advanced Combustion Numerics and Modeling - FY18 First Quarter Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitesides, R. A.; Killingsworth, N. J.; McNenly, M. J.

    This project is focused on early-stage research and development of numerical methods and models to improve advanced engine combustion concepts and systems. The current focus is on development of new mathematics and algorithms to reduce the time to solution for advanced combustion engine design using detailed fuel chemistry. The research is prioritized towards the most time-consuming workflow bottlenecks (computer and human) and accuracy gaps that slow ACS program members. Zero-RK, the fast and accurate chemical kinetics solver software developed in this project, is central to the research efforts and continues to be developed to address the current and emerging needs of engine designers, engine modelers, and fuel mechanism developers.
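    Detailed fuel chemistry is numerically stiff, which is why dedicated kinetics solvers such as Zero-RK matter. The classic illustration (not Zero-RK's algorithm) is that an implicit update remains stable at step sizes where an explicit one diverges:

```python
def backward_euler_decay(y0, k, dt, steps):
    """Implicit (backward) Euler for the stiff linear decay dy/dt = -k*y.
    Solving y_new = y + dt*(-k*y_new) gives y_new = y / (1 + k*dt), stable
    for any dt; the explicit update y += -k*y*dt diverges when k*dt > 2."""
    y = y0
    for _ in range(steps):
        y = y / (1.0 + k * dt)
    return y
```

    Real kinetics solvers apply the same idea to large coupled nonlinear systems, which requires solving a linear system with the chemical Jacobian at every implicit step.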

  6. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  7. Synthetic biology advances for pharmaceutical production

    PubMed Central

    Breitling, Rainer; Takano, Eriko

    2015-01-01

    Synthetic biology enables a new generation of microbial engineering for the biotechnological production of pharmaceuticals and other high-value chemicals. This review presents an overview of recent advances in the field, describing new computational and experimental tools for the discovery, optimization and production of bioactive molecules, and outlining progress towards the application of these tools to pharmaceutical production systems. PMID:25744872

  8. Error analysis of numerical gravitational waveforms from coalescing binary black holes

    NASA Astrophysics Data System (ADS)

    Fong, Heather; Chu, Tony; Kumar, Prayush; Pfeiffer, Harald; Boyle, Michael; Hemberger, Daniel; Kidder, Lawrence; Scheel, Mark; Szilagyi, Bela; SXS Collaboration

    2016-03-01

    The Advanced Laser Interferometer Gravitational-wave Observatory (Advanced LIGO) has finished a successful first observation run and will commence its second run this summer. Detection of compact object binaries utilizes matched-filtering, which requires a vast collection of highly accurate gravitational waveforms. This talk will present a set of about 100 new aligned-spin binary black hole simulations. I will discuss their properties, including a detailed error analysis, which demonstrates that the numerical waveforms are sufficiently accurate for gravitational wave detection purposes, as well as for parameter estimation purposes.
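    Matched filtering, as used in such searches, correlates the data stream against each waveform template; a toy white-noise version for a single template (illustrative, not a LIGO pipeline):

```python
def matched_filter_snr(data, template):
    """White-noise matched filter: slide a normalized template along the data,
    take the inner product at each offset, and return the peak statistic.
    A toy version of the matched filtering used in compact-binary searches."""
    norm = sum(t * t for t in template) ** 0.5
    best = 0.0
    for off in range(len(data) - len(template) + 1):
        s = sum(data[off + j] * template[j] for j in range(len(template))) / norm
        best = max(best, s)
    return best
```

    This is why template accuracy matters: systematic errors in the numerical waveforms directly reduce the recovered statistic and bias parameter estimates.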

  9. Advanced graphical user interface for multi-physics simulations using AMST

    NASA Astrophysics Data System (ADS)

    Hoffmann, Florian; Vogel, Frank

    2017-07-01

    Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code has been the lack of a graphical user interface (GUI), meaning that all pre-processing had to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.

  10. Numerical modelling of river morphodynamics: Latest developments and remaining challenges

    NASA Astrophysics Data System (ADS)

    Siviglia, Annunziato; Crosato, Alessandra

    2016-07-01

    Numerical morphodynamic models provide scientific frameworks for advancing our understanding of river systems, and research on the topics involved is an important and socially relevant undertaking for our environment. Nowadays numerical models are used for purposes ranging from basic morphodynamic research to the management of complex river engineering problems. Owing to increasing computer power and the development of advanced numerical techniques, morphodynamic models are increasingly used to predict the evolution of bed patterns over a broad spectrum of spatial and temporal scales. The development and successful application of such models draw upon a wide range of disciplines, from applied mathematics for the numerical solution of the equations to geomorphology for the physical interpretation of the results. In this light we organized this special issue (SI), soliciting multidisciplinary contributions encompassing any aspect of the development and application of such models. Most of the papers in the SI stem from contributions to session HS9.5/GM7.11 on numerical modelling and experiments in river morphodynamics at the European Geosciences Union (EGU) General Assembly held in Vienna, 27 April to 2 May 2014.

  11. Identification of drug-resistant subpopulations in canine hemangiosarcoma

    PubMed Central

    Khammanivong, A.; Gorden, B. H.; Frantz, A. M.; Graef, A. J.; Dickerson, E. B.

    2017-01-01

    Canine hemangiosarcoma is a rapidly progressive disease that is poorly responsive to conventional chemotherapy. Despite numerous attempts to advance treatment options and improve outcomes, drug resistance remains a hurdle to successful therapy. To address this problem, we used recently characterized progenitor cell populations derived from canine hemangiosarcoma cell lines and grown as non-adherent spheres to identify potential drug resistance mechanisms as well as drug-resistant cell populations. Cells from sphere-forming cultures displayed enhanced resistance to chemotherapy drugs, expansion of dye-excluding side populations and altered ATP-binding cassette (ABC) transporter expression. Invasion studies demonstrated variability between cell lines as well as between sphere and monolayer cell populations. Collectively, our results suggest that sphere cell populations contain distinct subpopulations of drug-resistant cells that utilize multiple mechanisms to evade cytotoxic drugs. Our approach represents a new tool for the study of drug resistance in hemangiosarcoma, which could alter approaches for treating this disease. PMID:25112808

  12. Engineering nucleic acid structures for programmable molecular circuitry and intracellular biocomputation

    NASA Astrophysics Data System (ADS)

    Li, Jiang; Green, Alexander A.; Yan, Hao; Fan, Chunhai

    2017-11-01

    Nucleic acids have attracted widespread attention due to the simplicity with which they can be designed to form discrete structures and programmed to perform specific functions at the nanoscale. The advantages of DNA/RNA nanotechnology offer numerous opportunities for in-cell and in-vivo applications, and the technology holds great promise to advance the growing field of synthetic biology. Many elegant examples have revealed the potential in integrating nucleic acid nanostructures in cells and in vivo where they can perform important physiological functions. In this Review, we summarize the current abilities of DNA/RNA nanotechnology to realize applications in live cells and then discuss the key problems that must be solved to fully exploit the useful properties of nanostructures. Finally, we provide viewpoints on how to integrate the tools provided by DNA/RNA nanotechnology and related new technologies to construct nucleic acid nanostructure-based molecular circuitry for synthetic biology.

  13. Space Shuttle Propulsion Systems Plume Modeling and Simulation for the Lift-Off Computational Fluid Dynamics Model

    NASA Technical Reports Server (NTRS)

    Strutzenberg, L. L.; Dougherty, N. S.; Liever, P. A.; West, J. S.; Smith, S. D.

    2007-01-01

    This paper details advances being made in the development of Reynolds-Averaged Navier-Stokes numerical simulation tools, models, and methods for the integrated Space Shuttle Vehicle at launch. The conceptual model and modeling approach described includes the development of multiple computational models to appropriately analyze the potential debris transport for critical debris sources at Lift-Off. The conceptual model described herein involves the integration of propulsion analysis for the nozzle/plume flow with the overall 3D vehicle flowfield at Lift-Off. Debris Transport Analyses are being performed using the Shuttle Lift-Off models to assess the risk to the vehicle from Lift-Off debris and to prioritize mitigation of potential debris sources, continuing to reduce vehicle risk. These integrated simulations are being used to evaluate plume-induced debris environments where the multi-plume interactions with the launch facility can potentially accelerate debris particles toward the vehicle.
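
    At its simplest, the debris transport step described above amounts to integrating ballistic particle motion under aerodynamic drag through a prescribed plume velocity field. The following minimal 1D sketch is illustrative only; the function names, the uniform flow field, and all parameter values are hypothetical and are not drawn from the Shuttle Lift-Off models.

    ```python
    import math

    def drag_accel(v_particle, v_flow, rho_gas, diameter, rho_particle, cd=0.47):
        """Acceleration on a spherical debris particle from aerodynamic drag."""
        rel = v_flow - v_particle
        area = math.pi * (diameter / 2) ** 2
        mass = rho_particle * (4.0 / 3.0) * math.pi * (diameter / 2) ** 3
        # quadratic drag, signed so the particle is pushed toward the flow velocity
        return 0.5 * rho_gas * cd * abs(rel) * rel * area / mass

    def transport(x0, v0, flow_field, rho_gas, d, rho_p, dt=1e-3, steps=1000):
        """Explicit-Euler integration of 1D particle motion through a plume."""
        x, v = x0, v0
        for _ in range(steps):
            a = drag_accel(v, flow_field(x), rho_gas, d, rho_p)
            v += a * dt
            x += v * dt
        return x, v
    ```

    In a uniform 100 m/s plume, a particle released from rest accelerates toward (but never beyond) the local gas velocity, which is the qualitative behavior the debris analyses rely on.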

  14. High-order continuum kinetic method for modeling plasma dynamics in phase space

    DOE PAGES

    Vogman, G. V.; Colella, P.; Shumlak, U.

    2014-12-15

    Continuum methods offer a high-fidelity means of simulating plasma kinetics. While computationally intensive, these methods are advantageous because they can be cast in conservation-law form, are not susceptible to noise, and can be implemented using high-order numerical methods. Advances in continuum method capabilities for modeling kinetic phenomena in plasmas require the development of validation tools in higher dimensional phase space and an ability to handle non-Cartesian geometries. To that end, a new benchmark for validating Vlasov-Poisson simulations in 3D (x, v_x, v_y) phase space is presented. The benchmark is based on the Dory-Guest-Harris instability and is successfully used to validate a continuum finite volume algorithm. To address challenges associated with non-Cartesian geometries, unique features of cylindrical phase space coordinates are described. Preliminary results of continuum kinetic simulations in 4D (r, z, v_r, v_z) phase space are presented.
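
    As a toy illustration of the conservation-law form mentioned above, a first-order finite-volume upwind step (far simpler than the high-order scheme validated in the paper) updates each cell average by a difference of face fluxes, which conserves the discrete integral of f to machine precision on a periodic grid:

    ```python
    import numpy as np

    def upwind_advect(f, speed, dx, dt):
        """One conservative first-order upwind finite-volume step for
        df/dt + speed * df/dx = 0 on a periodic 1D grid.

        flux[i] holds the face flux at i+1/2; the update is the
        difference of the two face fluxes bounding each cell."""
        flux = speed * (f if speed >= 0 else np.roll(f, -1))
        return f - (dt / dx) * (flux - np.roll(flux, 1))
    ```

    Because every face flux appears once with each sign, summing the update over all cells telescopes to zero; with a CFL number speed*dt/dx <= 1 the scheme also preserves positivity of f, a property the abstract highlights (no susceptibility to noise) for continuum kinetic methods.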

  15. Application of chemical biology in target identification and drug discovery.

    PubMed

    Zhu, Yue; Xiao, Ting; Lei, Saifei; Zhou, Fulai; Wang, Ming-Wei

    2015-09-01

    Drug discovery and development is vital to the well-being of mankind and sustainability of the pharmaceutical industry. Using chemical biology approaches to discover drug leads has become a widely accepted path partially because of the completion of the Human Genome Project. Chemical biology mainly solves biological problems through searching previously unknown targets for pharmacologically active small molecules or finding ligands for well-defined drug targets. It is a powerful tool to study how these small molecules interact with their respective targets, as well as their roles in signal transduction, molecular recognition and cell functions. There have been an increasing number of new therapeutic targets being identified and subsequently validated as a result of advances in functional genomics, which in turn led to the discovery of numerous active small molecules via a variety of high-throughput screening initiatives. In this review, we highlight some applications of chemical biology in the context of drug discovery.

  16. FROGS: Find, Rapidly, OTUs with Galaxy Solution.

    PubMed

    Escudié, Frédéric; Auer, Lucas; Bernard, Maria; Mariadassou, Mahendra; Cauquil, Laurent; Vidal, Katia; Maman, Sarah; Hernandez-Raquet, Guillermina; Combes, Sylvie; Pascal, Géraldine

    2018-04-15

    Metagenomics leads to major advances in microbial ecology, and biologists need user-friendly tools to analyze their data on their own. This Galaxy-supported pipeline, called FROGS, is designed to analyze large sets of amplicon sequences and produce abundance tables of Operational Taxonomic Units (OTUs) and their taxonomic affiliation. The clustering uses Swarm. The chimera removal uses VSEARCH, combined with original cross-sample validation. The taxonomic affiliation returns an innovative multi-affiliation output to highlight database conflicts and uncertainties. Statistical results and numerous graphical illustrations are produced along the way to monitor the pipeline. FROGS was tested for the detection and quantification of OTUs on real and in silico datasets and proved to be rapid, robust and highly sensitive. It compares favorably with the widespread mothur, UPARSE and QIIME. Source code and installation instructions: https://github.com/geraldinepascal/FROGS.git. Companion website: http://frogs.toulouse.inra.fr. Contact: geraldine.pascal@inra.fr. Supplementary data are available at Bioinformatics online.
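
    The OTU abundance table mentioned above can be illustrated with a minimal sketch. The input format (a mapping from sample name to per-read OTU labels) and the function name are hypothetical stand-ins, not the FROGS interface:

    ```python
    from collections import Counter

    def abundance_table(assignments):
        """Build an OTU-by-sample abundance table.

        assignments: dict mapping sample name -> list of per-read OTU labels.
        Returns (header, rows), where each row is [otu, count_sample1, ...]."""
        samples = sorted(assignments)
        counts = {s: Counter(assignments[s]) for s in samples}
        all_otus = sorted({otu for c in counts.values() for otu in c})
        header = ["OTU"] + samples
        rows = [[otu] + [counts[s][otu] for s in samples] for otu in all_otus]
        return header, rows
    ```

    `Counter` returns 0 for OTUs absent from a sample, so every row is complete even when an OTU appears in only one sample.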

  17. Recent advances in applications of nanomaterials for sample preparation.

    PubMed

    Xu, Linnan; Qi, Xiaoyue; Li, Xianjiang; Bai, Yu; Liu, Huwei

    2016-01-01

    Sample preparation is a key step for qualitative and quantitative analysis of trace analytes in complicated matrices. Along with the rapid development of nanotechnology in material science, numerous nanomaterials have been developed with particularly useful applications in analytical chemistry. Benefitting from their high specific areas, increased surface activities, and unprecedented physical/chemical properties, the potential of nanomaterials for rapid and efficient sample preparation has been exploited extensively. In this review, recent progress of novel nanomaterials applied in sample preparation is summarized and discussed. Both nanoparticles and nanoporous materials are evaluated for their unusual performance in sample preparation. Various compositions and functionalizations have extended the applications of nanomaterials in sample preparation, and distinct size and shape selectivity is generated from the diversified pore structures of nanoporous materials. Such variety makes nanomaterials versatile tools in sample preparation for almost all categories of analytes. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE PAGES

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    2018-02-02

    The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.
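
    The dynamic event tree idea — enumerating scenario branches whose order and timing matter, each weighted by its branching probability — can be sketched as follows. The interfaces here are hypothetical stand-ins, not the ADAPT or SAS4A/SASSYS-1 APIs:

    ```python
    def expand_event_tree(initial_state, branch_points, evolve):
        """Enumerate the end states of a small dynamic event tree.

        branch_points: list of (time, [(outcome_label, probability), ...]),
                       in chronological order.
        evolve: stand-in for the simulator; advances a state to the branch
                time and applies the branch outcome.
        Returns a list of (path, probability, final_state) scenarios."""
        scenarios = [([], 1.0, initial_state)]
        for time, outcomes in branch_points:
            scenarios = [
                (path + [label], prob * p, evolve(state, time, label))
                for path, prob, state in scenarios
                for label, p in outcomes
            ]
        return scenarios
    ```

    Because each branching distributes a scenario's probability over its outcomes, the end-state probabilities always sum to one, and low-probability orderings (e.g. an early pump failure followed by a failed scram) are enumerated explicitly rather than lost in a static fault tree.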

  19. mRNA Cancer Vaccines-Messages that Prevail.

    PubMed

    Grunwitz, Christian; Kranz, Lena M

    2017-01-01

    During the last decade, mRNA became increasingly recognized as a versatile tool for the development of new innovative therapeutics. Especially for vaccine development, mRNA is of outstanding interest and numerous clinical trials have been initiated. Strikingly, all of these studies have proven that large-scale GMP production of mRNA is feasible and concordantly report a favorable safety profile of mRNA vaccines. Induction of T-cell immunity is a multi-faceted process comprising antigen acquisition, antigen processing and presentation, as well as immune stimulation. The effectiveness of mRNA vaccines is critically dependent on making the antigen(s) of interest available to professional antigen-presenting cells, especially dendritic cells (DCs). Efficient delivery of mRNA into DCs in vivo remains a major challenge in the mRNA vaccine field. This review summarizes the principles of mRNA vaccines and highlights the importance of in vivo mRNA delivery and recent advances in harnessing their therapeutic potential.

  20. Investigation of the current yaw engineering models for simulation of wind turbines in BEM and comparison with CFD and experiment

    NASA Astrophysics Data System (ADS)

    Rahimi, H.; Hartvelt, M.; Peinke, J.; Schepers, J. G.

    2016-09-01

    The aim of this work is to investigate the capabilities of current engineering tools based on Blade Element Momentum (BEM) and free vortex wake codes for the prediction of key aerodynamic parameters of wind turbines in yawed flow. The axial induction factor and aerodynamic loads of three wind turbines (NREL VI, AVATAR and INNWIND.EU) were investigated using wind tunnel measurements and numerical simulations at 0 and 30 degrees of yaw. Results indicated that for axial conditions there is good agreement between all codes in terms of mean values of aerodynamic parameters; in yawed flow, however, significant deviations were observed. This was due to unsteady phenomena such as the advancing and retreating blade effect and the skewed wake effect. These deviations were most visible in the variation of aerodynamic parameters with rotor azimuth angle for sections at the root and tip, where the skewed wake effect plays a major role.
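
    For the axial (unyawed) case discussed above, the BEM axial induction factor is typically obtained by a fixed-point iteration like the one sketched below. This bare-bones version neglects tangential induction, tip loss, and the Glauert high-load correction, and uses an illustrative thin-airfoil lift curve; it is not any of the engineering codes compared in the paper.

    ```python
    import math

    def axial_induction(tsr_local, sigma, cl_func, twist, iters=100):
        """Fixed-point BEM iteration for the axial induction factor a of
        one blade annulus in axial inflow.

        tsr_local: local tip-speed ratio (omega * r / U_inf)
        sigma:     local solidity
        cl_func:   lift coefficient as a function of angle of attack [rad]
        twist:     local twist + pitch angle [rad]"""
        a = 0.0
        for _ in range(iters):
            phi = math.atan((1.0 - a) / tsr_local)   # inflow angle, a' neglected
            alpha = phi - twist
            cn = cl_func(alpha) * math.cos(phi)      # normal force coefficient
            a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (sigma * cn) + 1.0)
            a = 0.5 * (a + a_new)                    # under-relaxation for stability
        return a
    ```

    With a thin-airfoil lift curve (cl = 2*pi*alpha) and moderate loading, the iteration settles on an induction factor in the range where momentum theory is valid (a < 0.5); yawed inflow is precisely where this simple momentum balance breaks down, motivating the skewed-wake corrections the paper evaluates.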
