Science.gov

Sample records for advanced numerical tools

  1. The science of and advanced technology for cost-effective manufacture of high precision engineering products. Volume 4. Thermal effects on the accuracy of numerically controlled machine tool

    NASA Astrophysics Data System (ADS)

    Venugopal, R.; Barash, M. M.; Liu, C. R.

    1985-10-01

    Thermal effects on the accuracy of numerically controlled machine tools are especially important in the context of unmanned manufacture or under conditions of precision metal cutting. Removal of the operator from direct control of the metal cutting process has created problems in maintaining accuracy. The objective of this research is to study thermal effects on the accuracy of numerically controlled machine tools. The initial part of the report is concerned with the analysis of a hypothetical machine, whose thermal characteristics are studied. Numerical methods for evaluating the errors exhibited by the slides of the machine are proposed, and the possibility of predicting thermally induced errors by the use of regression equations is investigated. A method for computing the workspace error is also presented. The final part is concerned with the actual measurement of errors on a modern CNC machining center; thermal influences on these errors are the main focus of the experimental work. Thermal influences on the errors of machine tools are shown to be predictable, and techniques for determining thermal effects on machine tools at the design stage are also presented. Keywords: Error models and prediction; Metrology; Automation.
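
    As a rough illustration of the regression-based prediction mentioned above (not the report's actual model), thermally induced slide error can be fitted against temperature readings by ordinary least squares; the sensor count, data, and linear model form below are hypothetical.

    ```python
    # Hypothetical sketch: predict a thermally induced slide error from
    # temperature readings with a linear least-squares regression.
    # The data, sensor count, and model form are illustrative only.
    import numpy as np

    # Each row: temperatures (deg C) at three points on the machine structure.
    temps = np.array([
        [20.0, 20.5, 21.0],
        [22.0, 23.1, 24.0],
        [25.0, 26.4, 27.5],
        [28.0, 29.8, 31.0],
    ])
    # Measured positioning error of the slide (micrometres) at those states.
    error_um = np.array([1.0, 4.2, 8.9, 13.5])

    # Regression matrix with an intercept column.
    X = np.hstack([np.ones((temps.shape[0], 1)), temps])
    coeffs, *_ = np.linalg.lstsq(X, error_um, rcond=None)

    def predict_error(temperature_readings):
        """Predict the thermally induced error (um) for a new thermal state."""
        return float(np.hstack([[1.0], temperature_readings]) @ coeffs)

    print(predict_error([24.0, 25.2, 26.0]))
    ```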

  2. Numerical tools for atomistic simulations.

    SciTech Connect

    Fang, H.; Gullett, Philip Michael; Slepoy, Alexander; Horstemeyer, Mark F.; Baskes, Michael I.; Wagner, Gregory John; Li, Mo

    2004-01-01

    The final report for a Laboratory Directed Research and Development project entitled 'Parallel Atomistic Computing for Failure Analysis of Micromachines' is presented. In this project, atomistic algorithms for parallel computers were developed to assist in quantification of microstructure-property relations related to weapon micro-components. With these and other serial computing tools, we are performing atomistic simulations of various sizes, geometries, materials, and boundary conditions. These tools provide the capability to handle the different size-scale effects required to predict failure. Nonlocal continuum models have been proposed to address this problem; however, they are phenomenological in nature and are difficult to validate for micro-scale components. Our goal is to separately quantify damage nucleation, growth, and coalescence mechanisms to provide a basis for macro-scale continuum models that will be used for micromachine design. Because micro-component experiments are difficult, a systematic computational study that employs Monte Carlo methods, molecular statics, and molecular dynamics (EAM and MEAM) simulations to compute continuum quantities will provide mechanism-property relations associated with the following parameters: specimen size, number of grains, crystal orientation, strain rates, temperature, defect nearest neighbor distance, void/crack size, chemical state, and stress state. This study will quantify size-scale effects from nanometers to microns in terms of damage progression and thus potentially allow for optimized micro-machine designs that are more reliable and have higher fidelity in terms of strength. In order to accomplish this task, several atomistic methods needed to be developed and evaluated to cover the range of defects, strain rates, temperatures, and sizes that a material may see in micro-machines. Therefore, we are providing a complete set of tools for large-scale atomistic simulations that include pre-processing of

  3. Numerically Controlled Machine Tools and Worker Skills.

    ERIC Educational Resources Information Center

    Keefe, Jeffrey H.

    1991-01-01

    Analysis of data from "Industry Wage Surveys of Machinery Manufacturers" on the skill levels of 57 machining jobs found that the introduction of numerically controlled machine tools has resulted in a very small reduction in skill levels or no significant change, supporting neither the deskilling argument nor the argument that skill levels increase with…

  4. Numerical modeling tools for chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Jasinski, Thomas J.; Childs, Edward P.

    1992-01-01

    Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring within CVD reactors: fluid flow patterns, temperature and chemical species distribution, and gas phase and surface deposition. The available physical models are documented, and examples of CVD simulation capabilities are provided.

  5. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  6. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: the demonstration of the ash transformation model, upgrading spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties for fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity by incorporating grey-scale binning of the SEM image was developed. With image analysis, a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
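
    As a minimal sketch of the Newton-Raphson iteration mentioned for the spreadsheet balance calculations, the snippet below solves a placeholder energy balance for an unknown temperature; the residual function, constants, and names are illustrative assumptions, not the EERC formulation.

    ```python
    # Hypothetical sketch of a Newton-Raphson iteration of the kind used for
    # balance calculations; the residual below is a placeholder energy balance.

    def newton_raphson(residual, x0, tol=1e-8, max_iter=50, h=1e-6):
        """Solve residual(x) = 0 by Newton-Raphson with a numerical derivative."""
        x = x0
        for _ in range(max_iter):
            f = residual(x)
            if abs(f) < tol:
                return x
            dfdx = (residual(x + h) - residual(x - h)) / (2.0 * h)
            x -= f / dfdx
        raise RuntimeError("Newton-Raphson did not converge")

    # Placeholder balance: sensible heat of the products minus the heat released.
    def energy_balance(t_kelvin, heat_release=2.0e6, cp=1.2e3, mass_flow=1.0):
        return mass_flow * cp * (t_kelvin - 298.15) - heat_release

    t_adiabatic = newton_raphson(energy_balance, x0=1000.0)
    print(f"approximate adiabatic flame temperature: {t_adiabatic:.1f} K")
    ```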

  7. Advances of implementing NC machine tools discussed

    NASA Astrophysics Data System (ADS)

    Kukuyev, Y. P.; Trukhan, Y. V.

    1984-11-01

    Numerically controlled (NC) machine tools, one of the principal means of re-equipping, mechanizing, and automating small-series and series production in machine building, are examined. The continually increasing volume of NC machine tools being produced and introduced makes their efficient commissioning and the organization of their effective use economically significant. Organizational and technical measures were directed at solving these problems. To ensure the fastest introduction of NC machine tools into operation and their technical maintenance, a number of setting-up organizations was established. Setting-up services are also provided by the plants that manufacture the NC machine tools, which create appropriate subdivisions for this purpose.

  8. Rapid medical advances challenge the tooling industry.

    PubMed

    Conley, B

    2008-01-01

    The requirement for greater performance in smaller spaces has increased demands for product and process innovation in tubing and other medical products. In turn, these developments have placed greater demands on the producers of the advanced tooling for these products. Tooling manufacturers must now continuously design equipment with much tighter tolerances for more sophisticated coextrusions and for newer generations of multilumen and multilayer tubing.

  9. Brush seal numerical simulation: Concepts and advances

    NASA Technical Reports Server (NTRS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-01-01

    The development of the brush seal is considered the most promising among the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, limiting the 'unwanted' leakage flows between stages or various engine cavities. This type of sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and to brush compliance with the rotor motions. In the design of modern aerospace turbomachinery, leakage flows between the stages must be minimal, thus contributing to the higher efficiency of the engine. Use of the brush seal instead of the labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals have also been found to enhance dynamic performance, cost less, and weigh less than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow the investigation of the effects of brush morphology (bristle arrangement) or brush arrangement (number of brushes, spacing between them) on the pressure drops and flow leakage. An increase in the brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and most likely it can be solved only by means of a numerically distributed model.
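
    For context on the bulk-flow approaches that the abstract contrasts with a distributed model, a very rough leakage estimate can treat the bristle pack as a porous (Darcy) resistance; the property values and the simple one-dimensional form below are illustrative assumptions only.

    ```python
    # Hypothetical sketch of a bulk-flow (Darcy) leakage estimate through the
    # bristle pack; all property values are illustrative placeholders.
    def brush_leakage_darcy(dp_pa, permeability_m2, area_m2, thickness_m,
                            viscosity_pa_s):
        """Volumetric leakage (m^3/s) through the bristle pack via Darcy's law."""
        return permeability_m2 * area_m2 * dp_pa / (viscosity_pa_s * thickness_m)

    q = brush_leakage_darcy(dp_pa=3.0e5,           # pressure drop across the seal
                            permeability_m2=1.0e-12,
                            area_m2=2.0e-4,         # annular flow area
                            thickness_m=1.5e-3,     # axial bristle pack thickness
                            viscosity_pa_s=1.8e-5)  # air viscosity
    print(f"estimated leakage: {q * 1000:.2f} L/s")
    ```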

  10. Brush seal numerical simulation: Concepts and advances

    NASA Astrophysics Data System (ADS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-07-01

    The development of the brush seal is considered the most promising among the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, limiting the 'unwanted' leakage flows between stages or various engine cavities. This type of sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and to brush compliance with the rotor motions. In the design of modern aerospace turbomachinery, leakage flows between the stages must be minimal, thus contributing to the higher efficiency of the engine. Use of the brush seal instead of the labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals have also been found to enhance dynamic performance, cost less, and weigh less than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow the investigation of the effects of brush morphology (bristle arrangement) or brush arrangement (number of brushes, spacing between them) on the pressure drops and flow leakage. An increase in the brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and most likely it can be solved only by means of a numerically distributed model.

  11. Advanced genetic tools for plant biotechnology

    SciTech Connect

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  12. Interpolator for numerically controlled machine tools

    DOEpatents

    Bowers, Gary L.; Davenport, Clyde M.; Stephens, Albert E.

    1976-01-01

    A digital differential analyzer circuit is provided that depending on the embodiment chosen can carry out linear, parabolic, circular or cubic interpolation. In the embodiment for parabolic interpolations, the circuit provides pulse trains for the X and Y slide motors of a two-axis machine to effect tool motion along a parabolic path. The pulse trains are generated by the circuit in such a way that parabolic tool motion is obtained from information contained in only one block of binary input data. A part contour may be approximated by one or more parabolic arcs. Acceleration and initial velocity values from a data block are set in fixed bit size registers for each axis separately but simultaneously and the values are integrated to obtain the movement along the respective axis as a function of time. Integration is performed by continual addition at a specified rate of an integrand value stored in one register to the remainder temporarily stored in another identical size register. Overflows from the addition process are indicative of the integral. The overflow output pulses from the second integration may be applied to motors which position the respective machine slides according to a parabolic motion in time to produce a parabolic machine tool motion in space. An additional register for each axis is provided in the circuit to allow "floating" of the radix points of the integrand registers and the velocity increment to improve position accuracy and to reduce errors encountered when the acceleration integrand magnitudes are small when compared to the velocity integrands. A divider circuit is provided in the output of the circuit to smooth the output pulse spacing and prevent motor stall, because the overflow pulses produced in the binary addition process are spaced unevenly in time. The divider has the effect of passing only every nth motor drive pulse, with n being specifiable. The circuit inputs (integrands, rates, etc.) are scaled to give exactly n times the
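
    The cascaded-integrator behaviour described in the abstract can be illustrated with a small digital differential analyzer (DDA) sketch: each tick, an integrand is added into a fixed-size remainder register and the carry feeds the next stage, with the second stage's overflow acting as a motor step pulse. The register width and integrand values below are hypothetical, not taken from the patent.

    ```python
    # Hypothetical DDA sketch: two cascaded fixed-size accumulators whose carries
    # implement the double integration (acceleration -> velocity -> position).
    REG_BITS = 12
    MODULUS = 1 << REG_BITS

    def dda_axis(acceleration, initial_velocity, ticks):
        """Return the output pulse train (one entry per clock tick) for one axis."""
        velocity = initial_velocity      # integrand of the second integrator
        vel_remainder = 0                # remainder register, first integrator
        pos_remainder = 0                # remainder register, second integrator
        pulses = []
        for _ in range(ticks):
            # First integration: acceleration -> velocity (carry bumps velocity).
            vel_remainder += acceleration
            velocity += vel_remainder // MODULUS
            vel_remainder %= MODULUS
            # Second integration: velocity -> position; the overflow is a pulse.
            pos_remainder += velocity
            pulses.append(pos_remainder // MODULUS)
            pos_remainder %= MODULUS
        return pulses

    # A constant acceleration integrand gives a pulse rate that grows with time,
    # i.e. parabolic motion along this axis when paired with a linear second axis.
    x_pulses = dda_axis(acceleration=40, initial_velocity=0, ticks=20000)
    print(sum(x_pulses), "X steps issued")
    ```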

  13. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. Two sources of uncertainty in geometric discretization are addressed in this paper; both need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
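
    A toy version of the convergence test described above is sketched below: a dose-depth curve is sampled on progressively finer thickness grids and the interpolation error is measured against a dense reference. The exponential dose model and grid sizes are made-up placeholders, not actual transport results.

    ```python
    # Hypothetical sketch of convergence testing for dose-vs-depth interpolation.
    import numpy as np

    def dose(depth_g_cm2):
        """Placeholder dose-depth relation (arbitrary units), not transport output."""
        return 50.0 * np.exp(-depth_g_cm2 / 8.0) + 2.0

    reference_depths = np.linspace(0.0, 40.0, 2001)
    reference_dose = dose(reference_depths)

    for n_points in (5, 9, 17, 33, 65):
        grid = np.linspace(0.0, 40.0, n_points)          # shield thickness grid
        interpolated = np.interp(reference_depths, grid, dose(grid))
        max_err = np.max(np.abs(interpolated - reference_dose))
        print(f"{n_points:3d} thicknesses -> max interpolation error {max_err:.4f}")
    ```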

  14. Program Helps Specify Paths For Numerically Controlled Tools

    NASA Technical Reports Server (NTRS)

    Premack, Timothy; Poland, James, Jr.

    1996-01-01

    ESDAPT computer program provides graphical programming environment for developing APT (Automatically Programmed Tool) programs for controlling numerically controlled machine tools. Establishes graphical user interface providing user with APT syntax-sensitive text-editing subprogram and windows for displaying geometry and tool paths. APT geometry statements also created by use of menus and screen picks. Written in C language, yacc, lex, and XView for use on Sun4-series computers running SunOS.

  15. Self-advancing step-tap tool

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R. (Inventor); Penner, Ronald K. (Inventor); Franklin, Larry D. (Inventor); Camarda, Charles J. (Inventor)

    2008-01-01

    Methods and tool for simultaneously forming a bore in a work piece and forming a series of threads in said bore. In an embodiment, the tool has a predetermined axial length, a proximal end, and a distal end, said tool comprising: a shank located at said proximal end; a pilot drill portion located at said distal end; and a mill portion intermediately disposed between said shank and said pilot drill portion. The mill portion is comprised of at least two drill-tap sections of predetermined axial lengths and at least one transition section of predetermined axial length, wherein each of said at least one transition section is sandwiched between a distinct set of two of said at least two drill-tap sections. The at least two drill-tap sections are formed of one or more drill-tap cutting teeth spirally increasing along said at least two drill-tap sections, wherein said tool is self-advanced in said work piece along said formed threads, and wherein said tool simultaneously forms said bore and said series of threads along a substantially similar longitudinal axis.

  16. Analytical and numerical methods; advanced computer concepts

    SciTech Connect

    Lax, P D

    1991-03-01

    This past year, two projects have been completed and a new one is under way. First, in joint work with R. Kohn, we developed a numerical algorithm to study the blowup of solutions to equations with certain similarity transformations. In the second project, the adaptive mesh refinement code of Berger and Colella for shock hydrodynamic calculations has been parallelized, and numerical studies using two different shared memory machines have been done. My current effort is directed toward the development of Cartesian mesh methods to solve PDEs with complicated geometries. Most of the coming year will be spent on this project, which is joint work with Prof. Randy LeVeque at the University of Washington in Seattle.

  17. Advances in numerical and applied mathematics

    NASA Technical Reports Server (NTRS)

    South, J. C., Jr. (Editor); Hussaini, M. Y. (Editor)

    1986-01-01

    This collection of papers covers some recent developments in numerical analysis and computational fluid dynamics. Some of these studies are of a fundamental nature. They address basic issues such as intermediate boundary conditions for approximate factorization schemes, existence and uniqueness of steady states for time dependent problems, and pitfalls of implicit time stepping. The other studies deal with modern numerical methods such as total variation diminishing schemes, higher order variants of vortex and particle methods, spectral multidomain techniques, and front tracking techniques. There is also a paper on adaptive grids. The fluid dynamics papers treat the classical problems of incompressible flows in helically coiled pipes, vortex breakdown, and transonic flows.

  18. Numerical Tool Path Optimization for Conventional Sheet Metal Spinning Processes

    NASA Astrophysics Data System (ADS)

    Rentsch, Benedikt; Manopulo, Niko; Hora, Pavel

    2016-08-01

    To this day, conventional sheet metal spinning processes are designed with a very low degree of automation. They are usually executed by experienced personnel, who actively adjust the tool paths during production. The practically unlimited freedom in designing the tool paths enables the efficient manufacturing of complex geometries on the one hand, but is challenging to translate into a standardized procedure on the other. The present study proposes a systematic methodology, based on a 3D FEM model combined with a numerical optimization strategy, for designing tool paths. The accurate numerical modelling of the spinning process is discussed first, followed by an analysis of appropriate objective functions and constraints required to obtain a failure-free tool path design.
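
    One way to read the proposed methodology is as an optimization loop wrapped around the FEM model. The sketch below is a hedged illustration: the tool path is parameterized by a handful of roller radii per pass, a stand-in cost function replaces the 3D FEM evaluation, and a derivative-free optimizer searches for a penalized, failure-free design. All names, values, and the cost model are assumptions.

    ```python
    # Hypothetical sketch of tool path optimization around an FEM model.
    import numpy as np
    from scipy.optimize import minimize

    def simulate_spinning(control_radii):
        """Stand-in for the FEM run: returns (thinning measure, wrinkling measure)."""
        thinning = float(np.sum(np.diff(control_radii) ** 2))        # fake metric
        wrinkling = float(np.max(np.abs(control_radii - 50.0)) / 50.0)
        return thinning, wrinkling

    def cost(control_radii, wrinkle_limit=0.3, penalty=1.0e3):
        thinning, wrinkling = simulate_spinning(control_radii)
        # Penalize tool paths whose wrinkling measure exceeds the allowed limit.
        return thinning + penalty * max(0.0, wrinkling - wrinkle_limit) ** 2

    x0 = np.linspace(80.0, 40.0, 6)             # initial roller radii per pass (mm)
    result = minimize(cost, x0, method="Nelder-Mead")
    print("optimized pass radii:", np.round(result.x, 2))
    ```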

  19. Advanced Numerical Model for Irradiated Concrete

    SciTech Connect

    Giorla, Alain B.

    2015-03-01

    In this report, we establish a numerical model for concrete exposed to irradiation to address these three critical points. The model accounts for creep in the cement paste and its coupling with damage, temperature and relative humidity. The shift in failure mode with the loading rate is also properly represented. The numerical model for creep has been validated and calibrated against different experiments in the literature [Wittmann, 1970, Le Roy, 1995]. Results from a simplified model are shown to showcase the ability of numerical homogenization to simulate irradiation effects in concrete. In future work, the complete model will be applied to the analysis of the irradiation experiments of Elleuch et al. [1972] and Kelly et al. [1969]. This requires a careful examination of the experimental environmental conditions, as in both cases certain critical information is missing, including the relative humidity history. A sensitivity analysis will be conducted to provide lower and upper bounds of the concrete expansion under irradiation, and to check whether the scatter in the simulated results matches that found in experiments. The numerical and experimental results will be compared in terms of expansion and loss of mechanical stiffness and strength. Both effects should be captured accordingly by the model to validate it. Once the model has been validated on these two experiments, it can be applied to simulate concrete from nuclear power plants. To do so, the materials used in these concretes must be as well characterized as possible. The main parameters required are the mechanical properties of each constituent in the concrete (aggregates, cement paste), namely the elastic modulus, the creep properties, the tensile and compressive strength, the thermal expansion coefficient, and the drying shrinkage. These can be either measured experimentally, estimated from the initial composition in the case of cement paste, or back-calculated from mechanical tests on concrete. If some

  20. Development of Advanced Tools for Cryogenic Integration

    NASA Astrophysics Data System (ADS)

    Bugby, D. C.; Marland, B. C.; Stouffer, C. J.; Kroliczek, E. J.

    2004-06-01

    This paper describes four advanced devices (or tools) that were developed to help solve problems in cryogenic integration. The four devices are: (1) an across-gimbal nitrogen cryogenic loop heat pipe (CLHP); (2) a miniaturized neon CLHP; (3) a differential thermal expansion (DTE) cryogenic thermal switch (CTSW); and (4) a dual-volume nitrogen cryogenic thermal storage unit (CTSU). The across-gimbal CLHP provides a low torque, high conductance solution for gimbaled cryogenic systems wishing to position their cryocoolers off-gimbal. The miniaturized CLHP combines thermal transport, flexibility, and thermal switching (at 35 K) into one device that can be directly mounted to both the cooler cold head and the cooled component. The DTE-CTSW, designed and successfully tested in a previous program using a stainless steel tube and beryllium (Be) end-pieces, was redesigned with a polymer rod and high-purity aluminum (Al) end-pieces to improve performance and manufacturability while still providing a miniaturized design. Lastly, the CTSU was designed with a 6063 Al heat exchanger and integrally welded, segmented, high purity Al thermal straps for direct attachment to both a cooler cold head and a Be component whose peak heat load exceeds its average load by 2.5 times. For each device, the paper will describe its development objective, operating principles, heritage, requirements, design, test data and lessons learned.

  1. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    SciTech Connect

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-07

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on the surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
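
    As a hedged illustration of estimating tool wear as a function of cutting time, the sketch below integrates a simple wear-rate law for the flank wear-land width VB; the rate expression and constants are placeholders, not the model or calibrated parameters of this paper.

    ```python
    # Hypothetical sketch: integrate a wear-rate law to get flank wear VB vs time.
    import numpy as np

    def flank_wear_vb(cutting_time_min, k_abrasion=0.015, k_adhesion=0.004):
        """Return (times, VB history) in minutes and mm; constants are made up."""
        dt = 0.01                                   # time step (min)
        times = np.arange(0.0, cutting_time_min, dt)
        vb, history = 0.0, []
        for _ in times:
            # Abrasive wear slows as the land widens; adhesive wear stays constant.
            rate = k_abrasion / (1.0 + 10.0 * vb) + k_adhesion
            vb += rate * dt
            history.append(vb)
        return times, np.array(history)

    t, vb = flank_wear_vb(20.0)
    print(f"VB after {t[-1]:.1f} min of cutting: {vb[-1]:.3f} mm")
    ```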

  2. Advanced cryogenics for cutting tools. Final report

    SciTech Connect

    Lazarus, L.J.

    1996-10-01

    The purpose of the investigation was to determine if cryogenic treatment improved the life and cost effectiveness of perishable cutting tools over other treatments or coatings. Test results showed that in five of seven of the perishable cutting tools tested there was no improvement in tool life. The other two tools showed a small gain in tool life, but not as much as when switching manufacturers of the cutting tool. The following conclusions were drawn from this study: (1) titanium nitride coatings are more effective than cryogenic treatment in increasing the life of perishable cutting tools made from all cutting tool materials, (2) cryogenic treatment may increase tool life if the cutting tool is improperly heat treated during its origination, and (3) cryogenic treatment was only effective on those tools made from less sophisticated high speed tool steels. As a part of a recent detailed investigation, four cutting tool manufacturers and two cutting tool laboratories were queried and none could supply any data to substantiate cryogenic treatment of perishable cutting tools.

  3. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computation algorithms as well as high quality numerical boundary treatments. This paper focuses on the recent developments of numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much needed research in numerical boundary conditions for CAA.

  4. Numerical Relativity as a tool for studying the Early Universe

    NASA Astrophysics Data System (ADS)

    Garrison, David

    2013-04-01

    Numerical simulations are becoming a more effective tool for conducting detailed investigations into the evolution of our universe. In this presentation, I show how the framework of numerical relativity can be used for studying cosmological models. We are working to develop a large-scale simulation of the dynamical processes in the early universe. These take into account interactions of dark matter, scalar perturbations, gravitational waves, magnetic fields and a turbulent plasma. The code described in this report is a GRMHD code based on the Cactus framework and is structured to utilize one of several different differencing methods chosen at run-time. It is being developed and tested on the Texas Learning and Computation Center's Xanadu cluster.

  5. Advanced Reach Tool (ART): development of the mechanistic model.

    PubMed

    Fransman, Wouter; Van Tongeren, Martie; Cherrie, John W; Tischer, Martin; Schneider, Thomas; Schinkel, Jody; Kromhout, Hans; Warren, Nick; Goede, Henk; Tielemans, Erik

    2011-11-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe. The ART mechanistic model is based on a conceptual framework that adopts a source-receptor approach, which describes the transport of a contaminant from the source to the receptor and defines seven independent principal modifying factors: substance emission potential, activity emission potential, localized controls, segregation, personal enclosure, surface contamination, and dispersion. ART currently differentiates between three different exposure types: vapours, mists, and dust (fumes, fibres, and gases are presently excluded). Various sources were used to assign numerical values to the multipliers for each modifying factor. The evidence used to underpin this assessment procedure was based on chemical and physical laws. In addition, empirical data obtained from the literature were used. Where this was not possible, expert elicitation was applied for the assessment procedure. Multipliers for all modifying factors were peer reviewed by leading experts from industry, research institutes, and public authorities across the globe. In addition, several workshops with experts were organized to discuss the proposed exposure multipliers. The mechanistic model is a central part of the ART tool and, with advancing knowledge on exposure determinants, will require updates and refinements on a continuous basis, for example regarding the effect of worker behaviour on personal exposure, 'best practice' values that describe the maximum achievable effectiveness of control measures, the intrinsic emission potential of various solid objects (e.g. metal, glass, plastics, etc.), and the extension of the applicability domain to certain types of exposures (e.g. gas, fume, and fibre exposure).
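
    The source-receptor structure described above can be pictured as an exposure score built from multiplicative modifying factors. The sketch below is only a structural illustration with invented factor values; the actual ART multipliers, their calibration, and the full algorithm are defined by the project's publications.

    ```python
    # Hypothetical sketch: relative exposure score as a product of the seven
    # principal modifying factors; the numbers are invented for illustration.
    modifying_factors = {
        "substance_emission_potential": 0.1,   # e.g. low-volatility liquid
        "activity_emission_potential": 3.0,    # e.g. spraying rather than pouring
        "localized_controls": 0.3,             # e.g. local exhaust ventilation
        "segregation": 1.0,                    # source not segregated
        "personal_enclosure": 1.0,             # worker not separated
        "surface_contamination": 1.2,          # fugitive re-emission
        "dispersion": 0.7,                     # room size / general ventilation
    }

    exposure_score = 1.0
    for name, multiplier in modifying_factors.items():
        exposure_score *= multiplier

    print(f"relative exposure score: {exposure_score:.3f}")
    ```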

  6. Preface to advances in numerical simulation of plasmas

    NASA Astrophysics Data System (ADS)

    Parker, Scott E.; Chacon, Luis

    2016-10-01

    This Journal of Computational Physics Special Issue, titled "Advances in Numerical Simulation of Plasmas," presents a snapshot of the international state of the art in the field of computational plasma physics. The articles herein are a subset of the topics presented as invited talks at the 24th International Conference on the Numerical Simulation of Plasmas (ICNSP), August 12-14, 2015 in Golden, Colorado. The choice of papers was highly selective. The ICNSP is held every other year and is the premier scientific meeting in the field of computational plasma physics.

  7. Alternative Fuel and Advanced Vehicle Tools (AFAVT), AFDC (Fact Sheet)

    SciTech Connect

    Not Available

    2010-01-01

    The Alternative Fuels and Advanced Vehicles Web site offers a collection of calculators, interactive maps, and informational tools to assist fleets, fuel providers, and others looking to reduce petroleum consumption in the transportation sector.

  8. Some recent advances in the numerical solution of differential equations

    NASA Astrophysics Data System (ADS)

    D'Ambrosio, Raffaele

    2016-06-01

    The purpose of the talk is the presentation of some recent advances in the numerical solution of differential equations, with special emphasis on reaction-diffusion problems, Hamiltonian problems and ordinary differential equations with discontinuous right-hand side. As a special case, in this short paper we focus on the solution of reaction-diffusion problems by means of special purpose numerical methods particularly adapted to the problem: indeed, following a problem-oriented approach, we propose a modified method of lines based on the use of finite differences shaped on the qualitative behavior of the solutions. Constructive issues and a brief analysis are presented, together with some numerical experiments showing the effectiveness of the approach and a comparison with existing solvers.
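
    A generic method-of-lines discretization of a reaction-diffusion problem is sketched below (Fisher-KPP reaction term, central differences in space, a stiff ODE solver in time). It illustrates the general approach only; the problem-adapted finite differences proposed in the talk are not reproduced here.

    ```python
    # Hypothetical method-of-lines sketch for a 1D reaction-diffusion problem:
    # u_t = D u_xx + u (1 - u), central differences in space, BDF in time.
    import numpy as np
    from scipy.integrate import solve_ivp

    nx, length, diffusivity = 101, 10.0, 0.1
    dx = length / (nx - 1)
    x = np.linspace(0.0, length, nx)
    u0 = np.exp(-((x - 2.0) ** 2))               # initial pulse

    def rhs(_t, u):
        dudt = np.zeros_like(u)
        # Interior points: central-difference Laplacian plus logistic reaction.
        dudt[1:-1] = (diffusivity * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
                      + u[1:-1] * (1.0 - u[1:-1]))
        # Endpoints are held fixed (simple boundary treatment for the sketch).
        return dudt

    solution = solve_ivp(rhs, (0.0, 5.0), u0, method="BDF", rtol=1e-6)
    print("peak value at final time:", float(solution.y[:, -1].max()))
    ```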

  9. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  10. DRIVER TO SUPPORT USE OF NUMERICAL SIMULATION TOOLS

    2001-02-13

    UNIPACK is a computer interface that simplifies and enhances the use of numerical simulation tools to design a primary geometry and/or a forming die for a powder compact and/or to design the pressing process used to shape a powder by compaction. More particularly, it is an interface that utilizes predefined generic geometric configurations to simplify the use of finite element method modeling software to more simply and efficiently design: (1) the shape and size of a powder compact; (2) a forming die to shape a powder compact; and/or (3) the pressing process used to form a powder compact. UNIPACK is a user interface for a predictive model for powder compaction that incorporates unprecedented flexibility to design powder press tooling and powder pressing processes. UNIPACK works with the Sandia National Laboratories (SNL) Engineering Analysis Code Access System (SEACAS) to generate a finite element (FE) mesh and automatically perform an FE analysis of powder compaction. UNIPACK was developed to allow a non-expert with minimal training to quickly and easily design/construct a variable dimension component or die in real time on a desktop or laptop personal computer.

  11. Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    2001-01-01

    The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.

  12. Terahertz Tools Advance Imaging for Security, Industry

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Picometrix, a wholly owned subsidiary of Advanced Photonix Inc. (API), of Ann Arbor, Michigan, invented the world's first commercial terahertz system. The company improved the portability and capabilities of their systems through Small Business Innovation Research (SBIR) agreements with Langley Research Center to provide terahertz imaging capabilities for inspecting the space shuttle external tanks and orbiters. Now API's systems make use of the unique imaging capacity of terahertz radiation on manufacturing floors, for thickness measurements of coatings, pharmaceutical tablet production, and even art conservation.

  13. Advances in nanocrystallography as a proteomic tool.

    PubMed

    Pechkova, Eugenia; Bragazzi, Nicola Luigi; Nicolini, Claudio

    2014-01-01

    In order to overcome the difficulties and hurdles so often encountered in crystallizing a protein with conventional techniques, our group has introduced the innovative Langmuir-Blodgett (LB)-based crystallization, a major advance in the field of both structural and functional proteomics, thus pioneering the emerging field of so-called nanocrystallography or nanobiocrystallography. This approach uniquely combines protein crystallography and nanotechnologies within an integrated, coherent framework that allows one to obtain highly stable protein crystals and to fully characterize them at the nano- and subnanoscale. A variety of experimental techniques and theoretical/semi-theoretical approaches, ranging from atomic force microscopy, circular dichroism, Raman spectroscopy and other spectroscopic methods, and microbeam grazing-incidence small-angle X-ray scattering to in silico simulations, bioinformatics, and molecular dynamics, has been exploited in order to study the LB films and to investigate the kinetics and the main features of LB-grown crystals. When compared to classical hanging-drop crystallization, the LB technique appears strikingly superior and yields results comparable with crystallization in microgravity environments. Therefore, the achievement of LB-based crystallography can have a tremendous impact in the field of industrial and clinical/therapeutic applications, opening new perspectives for personalized medicine. These implications are envisaged and discussed in the present contribution.

  14. Advanced tool kits for EPR security.

    PubMed

    Blobel, B

    2000-11-01

    Responding to the challenge of efficient and high-quality health care, the shared care paradigm must be established in health care. In that context, information systems such as electronic patient records (EPR) have to support this paradigm by enabling communication and interoperation between the health care establishments (HCE) and health professionals (HP) involved. Due to the sensitivity of personal medical information, this co-operation must be provided in a trustworthy way. To enable the different views of HCE and HP, ranging from management, doctors and nurses up to systems administrators and IT professionals, a set of models for the analysis, design and implementation of secure distributed EPR has been developed and introduced. The approach is based on the popular UML methodology and the component paradigm for open, interoperable systems. Easy-to-use tool kits deal with both application security services and communication security services, but also with the security infrastructure needed. Regarding the requirements for distributed multi-user EPRs, the modelling and implementation of policy agreements, authorisation and access control are especially considered. Current developments for a security infrastructure in health care based on cryptographic algorithms, such as health professional cards (HPC), security services employing digital signatures, and health-related TTP services, are discussed. CEN and ISO initiatives for health informatics standards in the context of secure and communicable EPR are also mentioned. PMID:11154968

  15. Advanced CAN (Controller Area Network) Tool

    SciTech Connect

    Terry, D.J.

    2000-03-17

    The CAN interface cards that are currently in use are PCMCIA based and use a microprocessor and CAN chip that are no longer in production. The long-term support of the SGT CAN interface is of concern due to this issue along with performance inadequacies and technical support. The CAN bus is at the heart of the SGT trailer. If the CAN bus in the SGT trailer cannot be maintained adequately, then the trailer itself cannot be maintained adequately. These concerns led to the need for a CRADA to help develop a new product that would be called the ''Gryphon'' CAN tool. FM and T provided manufacturing expertise along with design criteria to ensure SGT compatibility and long-term support. FM and T also provided resources for software support. Dearborn provided software and hardware design expertise to implement the necessary requirements. Both partners worked around heavy internal workloads to support completion of the project. This CRADA establishes a US source for an item that is very critical to support the SGT project. The Dearborn Group had the same goal to provide a US alternative to German suppliers. The Dearborn Group was also interested in developing a CAN product that has performance characteristics that place the Gryphon in a class by itself. This enhanced product not only meets and exceeds SGT requirements; it has opened up options that were not even considered before the project began. The cost of the product is also less than the European options.

  16. Numerical analysis of the V-Y shaped advancement flap.

    PubMed

    Remache, D; Chambert, J; Pauchot, J; Jacquet, E

    2015-10-01

    The V-Y advancement flap is a common technique for the closure of skin defects. A triangular flap is incised adjacent to a skin defect of rectangular shape. As the flap is advanced to close the initial defect, two smaller defects in the shape of a parallelogram are formed with respect to a reflection symmetry. The height of the defects depends on the apex angle of the flap, and the closure efforts are related to the defect height. Andrades et al. (2005) performed a geometrical analysis of the V-Y flap technique in order to reach a compromise between the flap size and the defect width. However, the geometrical approach does not consider the mechanical properties of the skin. The present analysis, based on the finite element method, is proposed as a complement to the geometrical one. This analysis aims to highlight the major role of skin elasticity in a full analysis of the V-Y advancement flap. Furthermore, the study of this technique shows that closing at the flap apex seems mechanically the most interesting step. Thus different strategies of defect closure at the flap apex stemming from the surgeon's know-how have been tested by numerical simulations. PMID:26342442

  17. Advanced numerical methods in mesh generation and mesh adaptation

    SciTech Connect

    Lipnikov, Konstantine; Danilov, A; Vassilevski, Y; Agonzal, A

    2010-01-01

    Numerical solution of partial differential equations requires appropriate meshes, efficient solvers and robust and reliable error estimates. Generation of high-quality meshes for complex engineering models is a non-trivial task. This task is made more difficult when the mesh has to be adapted to a problem solution. This article is focused on a synergistic approach to mesh generation and mesh adaptation, where the best properties of various mesh generation methods are combined to efficiently build simplicial meshes. First, the advancing front technique (AFT) is combined with the incremental Delaunay triangulation (DT) to build an initial mesh. Second, the metric-based mesh adaptation (MBA) method is employed to improve the quality of the generated mesh and/or to adapt it to a problem solution. We demonstrate with numerical experiments that the combination of all three methods is required for robust meshing of complex engineering models. The key to successful mesh generation is the high quality of the triangles in the initial front. We use a black-box technique to improve surface meshes exported from an unattainable CAD system. The initial surface mesh is refined into a shape-regular triangulation which approximates the boundary with the same accuracy as the CAD mesh. The DT method adds robustness to the AFT. The resulting mesh is topologically correct but may contain a few slivers. The MBA uses seven local operations to modify the mesh topology. It significantly improves the mesh quality. The MBA method is also used to adapt the mesh to a problem solution to minimize computational resources required for solving the problem. The MBA has a solid theoretical background. In the first two experiments, we consider the convection-diffusion and elasticity problems. We demonstrate the optimal reduction rate of the discretization error on a sequence of adaptive strongly anisotropic meshes. The key element of the MBA method is construction of a tensor metric from hierarchical edge
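
    As a small, generic illustration of the Delaunay-triangulation step mentioned above (not the authors' AFT/DT/MBA implementation), the snippet below triangulates a random planar point set with SciPy and reports a crude shape-regularity measure; the point cloud and quality metric are assumptions.

    ```python
    # Hypothetical sketch: Delaunay triangulation of a point set plus a crude
    # per-triangle quality check (shortest edge over longest edge).
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)
    points = rng.random((200, 2))                 # random planar point cloud
    tri = Delaunay(points)

    def edge_ratio(simplex):
        """Shortest over longest edge of one triangle (1.0 is near-equilateral)."""
        p = points[simplex]
        edges = [np.linalg.norm(p[i] - p[j]) for i, j in ((0, 1), (1, 2), (2, 0))]
        return min(edges) / max(edges)

    quality = np.array([edge_ratio(s) for s in tri.simplices])
    print(f"{len(tri.simplices)} triangles, worst edge ratio {quality.min():.3f}")
    ```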

  18. Advanced numerics for multi-dimensional fluid flow calculations

    SciTech Connect

    Vanka, S.P.

    1984-04-01

    In recent years, there has been a growing interest in the development and use of mathematical models for the simulation of fluid flow, heat transfer and combustion processes in engineering equipment. The equations representing the multi-dimensional transport of mass, momenta and species are numerically solved by finite-difference or finite-element techniques. However, despite the multitude of differencing schemes and solution algorithms, and the advancement of computing power, the calculation of multi-dimensional flows, especially three-dimensional flows, remains a mammoth task. The following discussion is concerned with the author's recent work on the construction of accurate discretization schemes for the partial derivatives, and the efficient solution of the set of nonlinear algebraic equations resulting after discretization. The present work has been jointly supported by the Ramjet Engine Division of the Wright Patterson Air Force Base, Ohio, and the NASA Lewis Research Center.

  19. Advanced numerics for multi-dimensional fluid flow calculations

    NASA Technical Reports Server (NTRS)

    Vanka, S. P.

    1984-01-01

    In recent years, there has been a growing interest in the development and use of mathematical models for the simulation of fluid flow, heat transfer and combustion processes in engineering equipment. The equations representing the multi-dimensional transport of mass, momenta and species are numerically solved by finite-difference or finite-element techniques. However, despite the multitude of differencing schemes and solution algorithms, and the advancement of computing power, the calculation of multi-dimensional flows, especially three-dimensional flows, remains a mammoth task. The following discussion is concerned with the author's recent work on the construction of accurate discretization schemes for the partial derivatives, and the efficient solution of the set of nonlinear algebraic equations resulting after discretization. The present work has been jointly supported by the Ramjet Engine Division of the Wright Patterson Air Force Base, Ohio, and the NASA Lewis Research Center.

  20. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  1. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; O'Neil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  2. Numerical Modeling and Inverse Scattering in Nondestructive Testing: Recent Applications and Advances

    NASA Astrophysics Data System (ADS)

    Marklein, R.; Langenberg, K. J.; Mayer, K.; Shlivinski, A.; Miao, J.; Zimmer, A.; Müller, W.; Schmitz, V.; Kohl, C.; Mletzko, U.

    2005-04-01

    This paper presents recent advances and future challenges of the application of different numerical modeling tools and linear and nonlinear inversion algorithms in ultrasonics and electromagnetics applied in NDE. The inversion methods considered in the presented work vary from linear schemes, e.g. SAFT/InASAFT and Diffraction Tomography/FT-SAFT, to nonlinear schemes, e.g. the Contrast Source Inversion. Inversion results are presented and compared for modeled and measured ultrasonic and electromagnetic data to locate voids and cracks as well as to locate aluminum tendon ducts in concrete, which is a typical GPR problem. Finite Integration Technique (FIT) and Domain Integral Equation (DIE) solvers are used as modeling tools.

  3. Anvil Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe, III; Bauman, William, III; Keen, Jeremy

    2007-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. In order for the Anvil Tool to remain available to the meteorologists, the AMU was tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). This report describes the work done by the AMU to develop the Anvil Tool for AWIPS to create a graphical overlay depicting the threat from thunderstorm anvil clouds. The AWIPS Anvil Tool is based on the previously deployed AMU MIDDS Anvil Tool. SMG and 45 WS forecasters have used the MIDDS Anvil Tool during launch and landing operations. SMG's primary weather analysis and display system is now AWIPS and the 45 WS has plans to replace MIDDS with AWIPS. The Anvil Tool creates a graphic that users can overlay on satellite or radar imagery to depict the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on an average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 nm standoff circles centered at the location of interest, in addition to one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30 degree sector width based on a previous AMU study which determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level (300- to 150-mb) wind direction. This report briefly describes the history of the MIDDS Anvil Tool and then explains how the initial development of the AWIPS Anvil Tool was carried out. After testing was
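
    The geometry behind the overlay can be sketched as follows: given an average upper-level wind, place one-, two-, and three-hour arcs upwind of the point of interest across a 30 degree sector (plus or minus 15 degrees). The flat-earth coordinate handling, function names, and wind values below are illustrative assumptions, not the AWIPS implementation.

    ```python
    # Hypothetical sketch of the anvil threat arc geometry (offsets in n mi).
    import math

    def anvil_arc_points(wind_dir_deg, wind_speed_kt, hours, n_points=7):
        """Return (east, north) offsets for the arc 'hours' upwind of the site.

        wind_dir_deg is the direction the wind blows FROM, so the upwind
        direction from the site is toward that bearing.
        """
        distance_nm = wind_speed_kt * hours
        points = []
        for i in range(n_points):
            # Sweep the 30 degree sector, plus/minus 15 degrees about upwind.
            bearing = math.radians(wind_dir_deg - 15.0 + 30.0 * i / (n_points - 1))
            points.append((distance_nm * math.sin(bearing),
                           distance_nm * math.cos(bearing)))
        return points

    for hrs in (1, 2, 3):
        arc = anvil_arc_points(wind_dir_deg=250.0, wind_speed_kt=35.0, hours=hrs)
        east, north = arc[0]
        print(f"{hrs} h arc, first point offset: {east:.1f} E, {north:.1f} N (n mi)")
    ```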

  4. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY '93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  5. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was well suited for determining the reliability of the complex nodal networks used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.
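
    As a point of reference for the kind of computation such tools automate, the classical combinatorial result for a triple-modular-redundant (TMR) stage with independent, exponentially distributed module failures is easy to write down. The sketch below is a textbook illustration, not CARE III or ARIES 82 logic.

    ```python
    import math

    def tmr_reliability(lam, t):
        """Reliability of a 2-of-3 (TMR) stage with a perfect voter:
        R_TMR = 3*R**2 - 2*R**3, where R = exp(-lam*t) is module reliability."""
        r = math.exp(-lam * t)
        return 3.0 * r**2 - 2.0 * r**3

    # e.g. lam = 1e-4 failures/hour over a 10-hour flight:
    # R_module ~ 0.999, R_TMR ~ 0.999997
    ```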

  6. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30-degree sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.

  7. Methods, Software and Tools for Three Numerical Applications. Final report

    SciTech Connect

    E. R. Jessup

    2000-03-01

    This is a report of the results of the author's work supported by DOE contract DE-FG03-97ER25325. The author proposed to study three numerical problems: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value decomposition (SVD) problems; and (3) the application of techniques of linear algebra to an information retrieval technique known as latent semantic indexing (LSI).
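
    For the third task, latent semantic indexing is conventionally built on a truncated singular value decomposition of the term-document matrix. The snippet below is a generic LSI sketch under that standard formulation (it is not the project's software); the rank k and the fold-in of the query follow the usual textbook treatment.

    ```python
    import numpy as np

    def lsi_scores(term_doc, query_vec, k=2):
        """Generic latent semantic indexing sketch: rank-k truncated SVD of the
        term-document matrix, then cosine similarity in the latent space."""
        U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
        Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]
        docs_k = (sk[:, None] * Vtk).T      # documents in the k-dim latent space
        q_k = (Uk.T @ query_vec) / sk       # fold the query into the same space
        def cos(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
        return [cos(q_k, d) for d in docs_k]
    ```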

  8. Advanced Electric Submersible Pump Design Tool for Geothermal Applications

    SciTech Connect

    Xuele Qi; Norman Turnquist; Farshad Ghasripoor

    2012-05-31

    Electrical Submersible Pumps (ESPs) offer higher efficiency and larger production rates, and can be operated in deeper wells, than other geothermal artificial-lift systems. Enhanced Geothermal Systems (EGS) applications call for lifting 300 °C geothermal water at an 80 kg/s flow rate in a wellbore of at most 10-5/8-inch diameter to improve cost-effectiveness. In this paper, an advanced ESP design tool comprising a 1D theoretical model and a 3D CFD analysis has been developed to design ESPs for geothermal applications. Design of Experiments was also performed to optimize the geometry and performance. The designed mixed-flow centrifugal impeller and diffuser exhibit high efficiency and head rise under simulated EGS conditions. The design tool has been validated by comparing its predictions to experimental data from an existing ESP product.
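
    A 1D theoretical model of a centrifugal stage is typically anchored in the Euler turbomachinery equation. The sketch below shows that relation for an ideal impeller with no inlet swirl; the function, its geometry values, and the no-slip idealization are our illustration, not the paper's design model.

    ```python
    import math

    G = 9.81  # gravitational acceleration [m/s^2]

    def euler_head(rpm, r2_m, b2_m, q_m3s, beta2_deg):
        """Ideal (Euler) head of a centrifugal impeller, no inlet swirl:
        H = u2 * cu2 / g, with cu2 from the outlet velocity triangle."""
        omega = 2.0 * math.pi * rpm / 60.0
        u2 = omega * r2_m                              # blade speed at outlet
        cm2 = q_m3s / (2.0 * math.pi * r2_m * b2_m)    # meridional velocity
        cu2 = u2 - cm2 / math.tan(math.radians(beta2_deg))  # tangential component
        return u2 * cu2 / G

    # e.g. euler_head(3600, 0.1, 0.015, 0.05, 25) is roughly 100 m of ideal head;
    # real stage heads are lower because of slip and hydraulic losses.
    ```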

  9. Advances in Mass Spectrometric Tools for Probing Neuropeptides

    NASA Astrophysics Data System (ADS)

    Buchberger, Amanda; Yu, Qing; Li, Lingjun

    2015-07-01

    Neuropeptides are important mediators in the functionality of the brain and other neurological organs. Because neuropeptides exist over a wide range of concentrations, appropriate characterization methods are needed to provide dynamic, chemical, and spatial information. Mass spectrometry and compatible tools have been a popular choice for analyzing neuropeptides. There have been several advances and challenges, both of which are the focus of this review. Discussions range from sample collection to bioinformatic tools, including avenues such as quantitation and imaging. Further development of the presented methods for neuropeptidomic mass spectrometric analysis is inevitable, which will lead to a deeper understanding of the complex interplay of neuropeptides and other signaling molecules in the nervous system.

  10. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  11. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2010-12-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre and SiC fibre; the matrices are resin, metal and ceramic. Advanced composite materials have higher specific strength and higher specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry because of their high specific strength, high specific modulus, excellent ductility, corrosion resistance, heat insulation, sound insulation, shock absorption and resistance to both high and low temperatures. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards and tails, etc. Their hardness is up to 62-65 HRC. When drilling unidirectional laminates, hole quality is strongly affected by the fibre laminate direction of carbon fibre reinforced composites because of their anisotropy. Burrs and splits appear at the exit because of stress concentration; in addition, delamination occurs and the hole tends to be undersized. Burrs are caused by poor sharpness of the cutting edge, while delamination, tearing and splitting are caused by the high stress induced by high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and, at the same time, higher drilling force. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. Meanwhile, the number of holes and the difficulty of making holes in composites have also increased. This requires high-performance drills that do not introduce defects and that have long tool life. It has become a trend to develop superhard-material tools and tools with special geometry for drilling

  12. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2011-05-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials, represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates), are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre and SiC fibre; the matrices are resin, metal and ceramic. Advanced composite materials have higher specific strength and higher specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry because of their high specific strength, high specific modulus, excellent ductility, corrosion resistance, heat insulation, sound insulation, shock absorption and resistance to both high and low temperatures. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards and tails, etc. Their hardness is up to 62-65 HRC. When drilling unidirectional laminates, hole quality is strongly affected by the fibre laminate direction of carbon fibre reinforced composites because of their anisotropy. Burrs and splits appear at the exit because of stress concentration; in addition, delamination occurs and the hole tends to be undersized. Burrs are caused by poor sharpness of the cutting edge, while delamination, tearing and splitting are caused by the high stress induced by high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and, at the same time, higher drilling force. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc., and thrust force. Meanwhile, the number of holes and the difficulty of making holes in composites have also increased. This requires high-performance drills that do not introduce defects and that have long tool life. It has become a trend to develop superhard-material tools and tools with special geometry for drilling

  13. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  14. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-09-15

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  15. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  16. A new clinical tool for assessing numerical abilities in neurological diseases: numerical activities of daily living.

    PubMed

    Semenza, Carlo; Meneghello, Francesca; Arcara, Giorgio; Burgio, Francesca; Gnoato, Francesca; Facchini, Silvia; Benavides-Varela, Silvia; Clementi, Maurizio; Butterworth, Brian

    2014-01-01

    The aim of this study was to build an instrument, the numerical activities of daily living (NADL), designed to identify the specific impairments in numerical functions that may cause problems in everyday life. These impairments go beyond what can be inferred from the available scales evaluating activities of daily living in general, and are not adequately captured by measures of the general deterioration of cognitive functions as assessed by standard clinical instruments like the MMSE and MoCA. We assessed a control group (n = 148) and a patient group affected by a wide variety of neurological conditions (n = 175), with NADL along with IADL, MMSE, and MoCA. The NADL battery was found to have satisfactory construct validity and reliability, across a wide age range. This enabled us to calculate appropriate criteria for impairment that took into account age and education. It was found that neurological patients tended to overestimate their abilities as compared to the judgment made by their caregivers, assessed with objective tests of numerical abilities. PMID:25126077

  17. A new clinical tool for assessing numerical abilities in neurological diseases: numerical activities of daily living

    PubMed Central

    Semenza, Carlo; Meneghello, Francesca; Arcara, Giorgio; Burgio, Francesca; Gnoato, Francesca; Facchini, Silvia; Benavides-Varela, Silvia; Clementi, Maurizio; Butterworth, Brian

    2014-01-01

    The aim of this study was to build an instrument, the numerical activities of daily living (NADL), designed to identify the specific impairments in numerical functions that may cause problems in everyday life. These impairments go beyond what can be inferred from the available scales evaluating activities of daily living in general, and are not adequately captured by measures of the general deterioration of cognitive functions as assessed by standard clinical instruments like the MMSE and MoCA. We assessed a control group (n = 148) and a patient group affected by a wide variety of neurological conditions (n = 175), with NADL along with IADL, MMSE, and MoCA. The NADL battery was found to have satisfactory construct validity and reliability, across a wide age range. This enabled us to calculate appropriate criteria for impairment that took into account age and education. It was found that neurological patients tended to overestimate their abilities as compared to the judgment made by their caregivers, assessed with objective tests of numerical abilities. PMID:25126077

  18. A new clinical tool for assessing numerical abilities in neurological diseases: numerical activities of daily living.

    PubMed

    Semenza, Carlo; Meneghello, Francesca; Arcara, Giorgio; Burgio, Francesca; Gnoato, Francesca; Facchini, Silvia; Benavides-Varela, Silvia; Clementi, Maurizio; Butterworth, Brian

    2014-01-01

    The aim of this study was to build an instrument, the numerical activities of daily living (NADL), designed to identify the specific impairments in numerical functions that may cause problems in everyday life. These impairments go beyond what can be inferred from the available scales evaluating activities of daily living in general, and are not adequately captured by measures of the general deterioration of cognitive functions as assessed by standard clinical instruments like the MMSE and MoCA. We assessed a control group (n = 148) and a patient group affected by a wide variety of neurological conditions (n = 175), with NADL along with IADL, MMSE, and MoCA. The NADL battery was found to have satisfactory construct validity and reliability, across a wide age range. This enabled us to calculate appropriate criteria for impairment that took into account age and education. It was found that neurological patients tended to overestimate their abilities as compared to the judgment made by their caregivers, assessed with objective tests of numerical abilities.

  19. Recent advances in two-phase flow numerics

    SciTech Connect

    Mahaffy, J.H.; Macian, R.

    1997-07-01

    The authors review three topics in the broad field of numerical methods that may be of interest to individuals modeling two-phase flow in nuclear power plants. The first topic is the iterative solution of the linear equations created during the solution of finite volume equations. The second is numerical tracking of macroscopic liquid interfaces. The final area surveyed is the use of higher-order spatial differencing techniques.
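
    For the first topic, the linear systems arising from implicit finite-volume discretizations are usually attacked with Krylov-subspace iterations. The snippet below is a minimal unpreconditioned conjugate-gradient solver for a symmetric positive definite system, included purely to illustrate the class of methods; production thermal-hydraulic codes typically use preconditioned, nonsymmetric variants such as GMRES or BiCGSTAB.

    ```python
    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
        """Minimal unpreconditioned CG for A x = b, A symmetric positive definite."""
        x = np.zeros_like(b)
        r = b - A @ x
        p = r.copy()
        rs = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x
    ```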

  20. An Advanced Decision Support Tool for Electricity Infrastructure Operations

    SciTech Connect

    Chen, Yousu; Huang, Zhenyu; Wong, Pak C.; Mackey, Patrick S.; Allwardt, Craig H.; Ma, Jian; Greitzer, Frank L.

    2010-01-31

    Electricity infrastructure, as one of the most critical infrastructures in the U.S., plays an important role in modern societies. Its failure would lead to significant disruption of people’s lives, industry and commercial activities, and result in massive economic losses. Reliable operation of electricity infrastructure is an extremely challenging task because human operators need to consider thousands of possible configurations in near real-time to choose the best option and operate the network effectively. In today’s practice, electricity infrastructure operation is largely based on operators’ experience with very limited real-time decision support, resulting in inadequate management of complex predictions and the inability to anticipate, recognize, and respond to situations caused by human errors, natural disasters, or cyber attacks. Therefore, a systematic approach is needed to manage the complex operational paradigms and choose the best option in a near-real-time manner. This paper proposes an advanced decision support tool for electricity infrastructure operations. The tool has the functions of turning large amounts of data into actionable information to help operators monitor power grid status in real time; performing trend analysis to identify system trends at the regional or system level to help the operator foresee and discern emergencies; performing clustering analysis to assist operators in identifying the relationships between system configurations and affected assets; and interactively evaluating alternative remedial actions to aid operators in making effective and timely decisions. This tool can provide significant decision support on electricity infrastructure operations and lead to better reliability in power grids. This paper presents examples with actual electricity infrastructure data to demonstrate the capability of this tool.

  1. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary Objectives To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. Method We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As last year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor has evaluated separately the set of 1,594 articles and the evaluation results were merged for retaining 15 articles for peer-review. Results The selection and evaluation process of this Yearbook’s section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP which combines six performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support. The authors present datamining methods of large-scale datasets of past transplants. The objective is to identify chances of survival. Conclusions The current research activities still attest the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care. Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  2. Clinical holistic health: advanced tools for holistic medicine.

    PubMed

    Ventegodt, Søren; Clausen, Birgitte; Nielsen, May Lyck; Merrick, Joav

    2006-02-24

    According to holistic medical theory, the patient will heal when old painful moments, the traumatic events of life that are often called "gestalts", are integrated in the present "now". The advanced holistic physician's expanded toolbox has many different tools to induce this healing, some that are more dangerous and potentially traumatic than others. The more intense the therapeutic technique, the more emotional energy will be released and contained in the session, but the higher also is the risk for the therapist to lose control of the session and lose the patient to his or her own dark side. To avoid harming the patient must be the highest priority in holistic existential therapy, making sufficient education and training an issue of highest importance. The concept of "stepping up" the therapy by using more and more "dramatic" methods to get access to repressed emotions and events has led us to a "therapeutic staircase" with ten steps: (1) establishing the relationship; (2) establishing intimacy, trust, and confidentiality; (3) giving support and holding; (4) taking the patient into the process of physical, emotional, and mental healing; (5) social healing of being in the family; (6) spiritual healing--returning to the abstract wholeness of the soul; (7) healing the informational layer of the body; (8) healing the three fundamental dimensions of existence: love, power, and sexuality in a direct way using, among other techniques, "controlled violence" and "acupressure through the vagina"; (9) mind-expanding and consciousness-transformative techniques like psychotropic drugs; and (10) techniques transgressing the patient's borders and, therefore, often traumatizing (for instance, the use of force against the will of the patient). We believe that the systematic use of the staircase will greatly improve the power and efficiency of holistic medicine for the patient and we invite a broad cooperation in scientifically testing the efficiency of the advanced holistic

  3. Sandia Advanced MEMS Design Tools, Version 2.0

    2002-06-13

    Sandia Advanced MEMS Design Tools is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process; b) Provide enabling educational information (including pictures, videos, technical information); c) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) Facilitate the process of having MEMS fabricated at SNL; e) Facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software AutoCAD, these files are not intended for use independent of the CD. NOTE: THE CUSTOMER MUST PURCHASE HIS/HER OWN COPY OF AutoCAD TO USE WITH THESE FILES.

  4. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  5. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and to process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT provides a means of investigating combinations of such subsystem technologies and thereby assists in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that are operated either dynamically or at steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
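
    ALSSAT ranks technology combinations by Equivalent System Mass (ESM), which folds volume, power, cooling, and crew-time requirements into a single mass-like figure of merit. The sketch below shows the general form of such a roll-up as commonly described in ALS literature; the equivalency factors used here are placeholders for illustration only, not ALSSAT values.

    ```python
    def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                               crewtime_h_per_yr, duration_yr,
                               v_eq=10.0, p_eq=250.0, c_eq=60.0, ct_eq=0.5):
        """Generic ESM roll-up (placeholder equivalency factors):
        ESM = M + V*Veq + P*Peq + C*Ceq + CT*D*CTeq, in kg-equivalent."""
        return (mass_kg
                + volume_m3 * v_eq                       # kg per m^3 of volume
                + power_kw * p_eq                        # kg per kW of power
                + cooling_kw * c_eq                      # kg per kW of heat rejection
                + crewtime_h_per_yr * duration_yr * ct_eq)  # kg per crew-hour
    ```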

  6. Tools for the advancement of undergraduate statistics education

    NASA Astrophysics Data System (ADS)

    Schaffner, Andrew Alan

    To keep pace with advances in applied statistics and to maintain literate consumers of quantitative analyses, statistics educators stress the need for change in the classroom (Cobb, 1992; Garfield, 1993, 1995; Moore, 1991a; Snee, 1993; Steinhorst and Keeler, 1995). These authors stress a more concept oriented undergraduate introductory statistics course which emphasizes true understanding over mechanical skills. Drawing on recent educational research, this dissertation attempts to realize this vision by developing tools and pedagogy to assist statistics instructors. This dissertation describes statistical facets, pieces of statistical understanding that are building blocks of knowledge, and discusses DIANA, a World-Wide Web tool for diagnosing facets. Further, I show how facets may be incorporated into course design through the development of benchmark lessons based on the principles of collaborative learning (diSessa and Minstrell, 1995; Cohen, 1994; Reynolds et al., 1995; Bruer, 1993; von Glasersfeld, 1991) and activity based courses (Jones, 1991; Yackel, Cobb and Wood, 1991). To support benchmark lessons and collaborative learning in large classes I describe Virtual Benchmark Instruction, benchmark lessons which take place on a structured hypertext bulletin board using the technology of the World-Wide Web. Finally, I present randomized experiments which suggest that these educational developments are effective in a university introductory statistics course.

  7. Advanced REACH Tool: a Bayesian model for occupational exposure assessment.

    PubMed

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-06-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper is able to utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110
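
    The core Bayesian idea can be illustrated with a much simpler toy than the full ART model: treating log-transformed full-shift exposures as normally distributed with a known within-scenario spread, a mechanistic prior on the log geometric mean is updated by measurements via the conjugate normal rule. The function below is that toy update, our simplification rather than the ART algorithm, which models several variance components and partially analogous data.

    ```python
    import numpy as np

    def update_log_exposure(prior_mu, prior_sd, log_measurements, sigma_within):
        """Toy conjugate normal update on log-exposure (not the ART model).

        prior_mu, prior_sd : prior mean/sd of the log geometric-mean exposure
        log_measurements   : ln of measured full-shift exposures
        sigma_within       : assumed within-scenario sd of the log measurements
        """
        n = len(log_measurements)
        precision = 1.0 / prior_sd**2 + n / sigma_within**2
        post_var = 1.0 / precision
        post_mu = post_var * (prior_mu / prior_sd**2 +
                              np.sum(log_measurements) / sigma_within**2)
        return post_mu, np.sqrt(post_var)

    # exp(post_mu) is the posterior geometric mean; percentiles of the exposure
    # distribution follow from the lognormal assumption.
    ```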

  8. Sandia Advanced MEMS Design Tools, Version 2.2.5

    SciTech Connect

    Yarberry, Victor; Allen, James; Lantz, Jeffery; Priddy, Brian; & Westling, Belinda

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process; b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: a check has been added to catch invalid block names. DRC features: added username/password validation and a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: the documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  9. Sandia Advanced MEMS Design Tools, Version 2.2.5

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process; b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: a check has been added to catch invalid block names. DRC features: added username/password validation and a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: the documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  10. Recent advances in numerical analysis of structural eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1973-01-01

    A wide range of eigenvalue problems encountered in practical structural engineering analyses is defined, in which the structures are assumed to be discretized by any suitable technique such as the finite-element method. A review of the usual numerical procedures for the solution of such eigenvalue problems is presented and is followed by an extensive account of recently developed eigenproblem solution procedures. Particular emphasis is placed on the new numerical algorithms and associated computer programs based on the Sturm sequence method. Eigenvalue algorithms developed for efficient solution of natural frequency and buckling problems of structures are presented, as well as some eigenvalue procedures formulated in connection with the solution of quadratic matrix equations associated with free vibration analysis of structures. A new algorithm is described for natural frequency analysis of damped structural systems.
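
    The Sturm sequence approach emphasized in the review reduces eigenvalue location to counting sign changes. For a symmetric tridiagonal matrix, such as results after reduction of the structural eigenproblem, a minimal count-below-sigma routine looks like the sketch below; combined with bisection it isolates individual natural frequencies. This is a generic illustration, not the paper's algorithm.

    ```python
    import numpy as np

    def sturm_count(d, e, sigma):
        """Number of eigenvalues of the symmetric tridiagonal matrix with
        diagonal d and off-diagonal e that are strictly less than sigma,
        via the classic Sturm-sequence / LDL^T recurrence."""
        count = 0
        q = d[0] - sigma
        if q < 0:
            count += 1
        for i in range(1, len(d)):
            if q == 0.0:
                q = 1e-300          # guard against an exact zero pivot
            q = (d[i] - sigma) - e[i - 1] ** 2 / q
            if q < 0:
                count += 1
        return count

    # Bisection on sigma with sturm_count brackets individual natural frequencies
    # of a discretized structure once the generalized problem has been reduced
    # to standard tridiagonal form.
    ```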

  11. Numerical tools for the characterization of microelectromechanical systems by digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pagliarulo, Vito; Russo, Tiziana; Miccio, Lisa; Ferraro, Pietro

    2015-10-01

    Digital holography (DH) in microscopy became an important interferometric tool in optical metrology when camera sensors reached a higher pixel number with smaller size and high-speed computers became able to process the acquired images. This allowed the investigation of engineered surfaces on microscale, such as microelectromechanical systems (MEMS). In DH, numerical tools perform the reconstruction of the wave field. This offers the possibility of retrieving not only the intensity of the acquired wavefield, but also the phase distribution. This review describes the principles of DH and shows the most important numerical tools discovered and applied to date in the field of MEMS. Both the static and the dynamic regimes can be analyzed by means of DH. Whereas the first one is mostly related to the characterization after the fabrication process, the second one is a useful tool to characterize the actuation of the MEMS.
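
    One of the basic numerical tools in DH is numerical propagation of the recorded complex wave field; a common choice is the angular spectrum method. The sketch below is a minimal version under standard assumptions (uniform pixel pitch, evanescent components discarded); retrieving a MEMS surface profile additionally requires phase unwrapping and calibration, which are omitted here.

    ```python
    import numpy as np

    def angular_spectrum_propagate(field, wavelength, dx, z):
        """Propagate a complex optical field by a distance z using the
        angular spectrum method. field is a 2-D complex array with pixel pitch dx."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=dx)
        fy = np.fft.fftfreq(ny, d=dx)
        FX, FY = np.meshgrid(fx, fy)
        k = 2.0 * np.pi / wavelength
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        kz = k * np.sqrt(np.maximum(arg, 0.0))
        H = np.exp(1j * kz * z) * (arg > 0)   # drop evanescent components
        return np.fft.ifft2(np.fft.fft2(field) * H)

    # np.angle(...) of the propagated field gives the wrapped phase; unwrapping
    # converts it into the optical path difference, hence the surface profile.
    ```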

  12. Analysis of the thermo-mechanical deformations in a hot forging tool by numerical simulation

    NASA Astrophysics Data System (ADS)

    L-Cancelos, R.; Varas, F.; Martín, E.; Viéitez, I.

    2016-03-01

    Although programs have been developed for the design of hot forging tools, their design is still largely based on the experience of the tool maker. This makes it necessary to build test matrices and correct their errors in order to minimize distortions in the forged piece. This phase prior to mass production consumes time and material resources, which makes the final product more expensive. Forging tools are usually made up of various parts in different grades of steel, which in turn have different mechanical properties and therefore suffer different degrees of strain. Furthermore, the tools used in hot forging are exposed to a thermal field that also induces strain or stress depending on the degree of confinement of the piece. Therefore, the mechanical behaviour of the assembly is determined by the contact between the different pieces. Numerical simulation makes it possible to analyse different configurations and anticipate possible defects before tool making, thus reducing the costs of this preliminary phase. In order to improve the dimensional quality of the manufactured parts, the work presented here focuses on the application of a numerical model to a hot forging manufacturing process in order to predict the areas of the forging die subjected to large deformations. The thermo-mechanical model, developed and implemented with free software (Code-Aster), includes the strains of thermal origin, the strains during the forge impact, and contact effects. The numerical results are validated with experimental measurements on a tooling set that produces forged crankshafts for the automotive industry. The numerical results show good agreement with the experimental tests. A very useful tool for the design of tooling sets for hot forging is thereby achieved.
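
    In the linear thermo-elastic setting, the strains of thermal origin included in such a model are simply proportional to the temperature rise. The two helpers below illustrate that contribution and the stress a fully constrained die region would see; the material values in the comment are rough, generic hot-work tool steel figures, not the paper's data.

    ```python
    def thermal_strain(alpha, T, T_ref):
        """Isotropic thermal strain used in thermo-elastic models."""
        return alpha * (T - T_ref)

    def constrained_thermal_stress(E, alpha, dT):
        """1-D fully constrained bar: mechanical strain cancels thermal strain."""
        return -E * alpha * dT

    # e.g. a hot-work tool steel insert (E ~ 210 GPa, alpha ~ 1.2e-5 1/K) heated
    # by 300 K while fully constrained sees roughly -E*alpha*dT = -756 MPa.
    ```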

  13. Basic and advanced numerical performances relate to mathematical expertise but are fully mediated by visuospatial skills.

    PubMed

    Sella, Francesco; Sader, Elie; Lolliot, Simon; Cohen Kadosh, Roi

    2016-09-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study, mathematicians and nonmathematicians performed a basic number line task, which required mapping positive and negative numbers on a physical horizontal line, and which has been shown to correlate with more advanced numerical abilities and mathematical achievement. We found that mathematicians were more accurate than nonmathematicians when mapping positive, but not negative, numbers, which are considered numerical primitives and cultural artifacts, respectively. Moreover, performance on positive number mapping could predict whether one is a mathematician or not, and was mediated by more advanced mathematical skills. This finding might suggest a link between basic and advanced mathematical skills. However, when we included visuospatial skills, as measured by the block design subtest, the mediation analysis revealed that the relation between performance in the number line task and group membership was explained by non-numerical visuospatial skills. These results demonstrate that the relation between basic, even specific, numerical skills and advanced mathematical achievement can be artifactual and explained by visuospatial processing. (PsycINFO Database Record

  14. Basic and Advanced Numerical Performances Relate to Mathematical Expertise but Are Fully Mediated by Visuospatial Skills

    PubMed Central

    2016-01-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study, mathematicians and nonmathematicians performed a basic number line task, which required mapping positive and negative numbers on a physical horizontal line, and which has been shown to correlate with more advanced numerical abilities and mathematical achievement. We found that mathematicians were more accurate than nonmathematicians when mapping positive, but not negative, numbers, which are considered numerical primitives and cultural artifacts, respectively. Moreover, performance on positive number mapping could predict whether one is a mathematician or not, and was mediated by more advanced mathematical skills. This finding might suggest a link between basic and advanced mathematical skills. However, when we included visuospatial skills, as measured by the block design subtest, the mediation analysis revealed that the relation between performance in the number line task and group membership was explained by non-numerical visuospatial skills. These results demonstrate that the relation between basic, even specific, numerical skills and advanced mathematical achievement can be artifactual and explained by visuospatial processing. PMID:26913930

  15. Basic and advanced numerical performances relate to mathematical expertise but are fully mediated by visuospatial skills.

    PubMed

    Sella, Francesco; Sader, Elie; Lolliot, Simon; Cohen Kadosh, Roi

    2016-09-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study, mathematicians and nonmathematicians performed a basic number line task, which required mapping positive and negative numbers on a physical horizontal line, and which has been shown to correlate with more advanced numerical abilities and mathematical achievement. We found that mathematicians were more accurate than nonmathematicians when mapping positive, but not negative, numbers, which are considered numerical primitives and cultural artifacts, respectively. Moreover, performance on positive number mapping could predict whether one is a mathematician or not, and was mediated by more advanced mathematical skills. This finding might suggest a link between basic and advanced mathematical skills. However, when we included visuospatial skills, as measured by the block design subtest, the mediation analysis revealed that the relation between performance in the number line task and group membership was explained by non-numerical visuospatial skills. These results demonstrate that the relation between basic, even specific, numerical skills and advanced mathematical achievement can be artifactual and explained by visuospatial processing. (PsycINFO Database Record PMID:26913930

  16. Querator: an advanced multi-archive data mining tool

    NASA Astrophysics Data System (ADS)

    Pierfederici, Francesco

    2001-11-01

    In recent years, the operation of large telescopes with wide-field detectors - such as the European Southern Observatory (ESO) Wide Field Imager (WFI) on the 2.2-m telescope at La Silla, Chile - has dramatically increased the amount of astronomical data produced each year. The next survey telescopes, such as the ESO VST, will continue this trend, producing extremely large datasets. Astronomy, therefore, has become an incredibly data-rich field requiring new tools and new strategies to efficiently handle huge archives and fully exploit their scientific content. At the Space Telescope European Coordinating Facility we are working on a new project, code-named Querator (http://archive.eso.org/querator/). Querator is an advanced multi-archive search engine built to address the needs of astronomers looking for multicolor imaging data across different astronomical data-centers. Querator returns sets of images of a given astronomical object or search region. A set contains exposures in a number of different wave bands. The user constrains the number of desired wave bands by selecting from a set of instruments and filters or by specifying actual physical units. As far as present-day data-centers are concerned, Querator points out the need for: - a uniform and standard description of archival data and - a uniform and standard description of how the data was acquired (i.e. instrument and observation characteristics). Clearly, these pieces of information will constitute an intermediate layer between the data itself and the data mining tools operating on it. This layered structure is a prerequisite to real data-center inter-operability and, hence, to Virtual Observatories. A detailed description of Querator's design, of the required data structures, of the problems encountered so far and of the proposed solutions will be given in the following pages. Throughout this paper we'll favor the term data-center over archive to stress the need to look at raw-pixels' archives

  17. Numerical modeling of spray combustion with an advanced VOF method

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Shang, Huan-Min; Shih, Ming-Hsin; Liaw, Paul

    1995-01-01

    This paper summarizes the technical development and validation of a multiphase computational fluid dynamics (CFD) numerical method using the volume-of-fluid (VOF) model and a Lagrangian tracking model which can be employed to analyze general multiphase flow problems with free surface mechanism. The gas-liquid interface mass, momentum and energy conservation relationships are modeled by continuum surface mechanisms. A new solution method is developed such that the present VOF model can be applied for all-speed flow regimes. The objectives of the present study are to develop and verify the fractional volume-of-fluid cell partitioning approach into a predictor-corrector algorithm and to demonstrate the effectiveness of the present approach by simulating benchmark problems including laminar impinging jets, shear coaxial jet atomization and shear coaxial spray combustion flows.
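
    At its simplest, the VOF model advects a volume fraction F between 0 (gas) and 1 (liquid) with the flow. The one-dimensional upwind sketch below conveys only the bookkeeping; it is not the paper's predictor-corrector, all-speed algorithm, and practical VOF schemes reconstruct the interface geometrically to limit numerical smearing.

    ```python
    import numpy as np

    def advect_vof_1d(F, u, dx, dt, steps):
        """First-order upwind advection of a volume fraction field F
        (0 <= F <= 1) by a constant positive velocity u on a periodic domain."""
        c = u * dt / dx                 # Courant number; keep <= 1 for stability
        assert 0.0 < c <= 1.0
        for _ in range(steps):
            F = F - c * (F - np.roll(F, 1))
            F = np.clip(F, 0.0, 1.0)    # keep the fraction physically bounded
        return F
    ```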

  18. DSA hole defectivity analysis using advanced optical inspection tool

    NASA Astrophysics Data System (ADS)

    Harukawa, Ryota; Aoki, Masami; Cross, Andrew; Nagaswami, Venkat; Tomita, Tadatoshi; Nagahara, Seiji; Muramatsu, Makoto; Kawakami, Shinichiro; Kosugi, Hitoshi; Rathsack, Benjamen; Kitano, Takahiro; Sweis, Jason; Mokhberi, Ali

    2013-04-01

    This paper discusses the defect density detection and analysis methodology using advanced optical wafer inspection capability to enable accelerated development of a DSA process/process tools and the required inspection capability to monitor such a process. The defectivity inspection methodologies are optimized for grapho-epitaxy directed self-assembly (DSA) contact holes with 25 nm sizes. A defect test reticle with programmed defects on guide patterns is designed for improved optimization of defectivity monitoring. Using this reticle, resist guide holes with a variety of sizes and shapes are patterned using an ArF immersion scanner. The negative tone development (NTD) type thermally stable resist guide is used for DSA of a polystyrene-b-poly(methyl methacrylate) (PS-b-PMMA) block copolymer (BCP). Using a variety of defects intentionally made by changing guide pattern sizes, the detection rates of each specific defectivity type have been analyzed. It is found in this work that to maximize sensitivity, a two pass scan with bright field (BF) and dark field (DF) modes provides the best overall defect type coverage and sensitivity. The performance of the two pass scan with BF and DF modes is also revealed by defect analysis for baseline defectivity on a wafer processed with nominal process conditions.

  19. Astonishing advances in mouse genetic tools for biomedical research.

    PubMed

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data. PMID:26513700

  20. European regulatory tools for advanced therapy medicinal products.

    PubMed

    Flory, Egbert; Reinhardt, Jens

    2013-12-01

    Increasing scientific knowledge and technical innovations in the areas of cell biology, biotechnology and medicine have resulted in the development of promising therapeutic approaches for the prevention and treatment of human diseases. Advanced therapy medicinal products (ATMPs) are a complex and innovative class of biopharmaceuticals, as these products are highly research-driven, characterised by innovative manufacturing processes and heterogeneous with regard to their origin, type and complexity. This class of ATMPs comprises gene therapy medicinal products, somatic cell therapy medicinal products and tissue-engineered products, which are often individualised and patient-specific. Multiple challenges arise from the nature of ATMPs, which are often developed by micro, small and medium-sized enterprises, universities and academia, for whom regulatory experience is limited and regulatory requirements are challenging. Regulatory guidance, such as the reflection paper on classification of ATMPs and guidelines highlighting product-specific issues, supports academic research groups and pharmaceutical companies in fostering the development of safe and effective ATMPs. This review provides an overview of the European regulatory aspects of ATMPs and highlights specific regulatory tools such as the ATMP classification procedure, a discussion on the hospital exemption for selected ATMPs, as well as borderline issues towards transplants/transfusion products.

  1. Astonishing advances in mouse genetic tools for biomedical research.

    PubMed

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.

  2. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a
    numerical study was conducted using both the finite-difference, time-domain method and a frequency- wavenumber method. When the propagation velocity in the borehole was greater than th...

  3. Advanced Numerical Methods and Software Approaches for Semiconductor Device Simulation

    DOE PAGES

    Carey, Graham F.; Pardhanani, A. L.; Bova, S. W.

    2000-01-01

    In this article we concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of “upwind” and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), “entropy” variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. We have included numerical examples from our recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and we emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, we briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the SANDIA National Laboratory framework SIERRA.

  4. Advanced numerical methods and software approaches for semiconductor device simulation

    SciTech Connect

    CAREY,GRAHAM F.; PARDHANANI,A.L.; BOVA,STEVEN W.

    2000-03-23

    In this article the authors concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of upwind and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), entropy variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. The authors have included numerical examples from their recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and they emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, they briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the SANDIA National Laboratory framework SIERRA.
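
    The Scharfetter-Gummel scheme named in both records above exponentially fits the carrier density between mesh nodes so that the discrete flux remains stable in drift-dominated regions. The following sketch is a generic 1D illustration of that flux, assuming normalized parameters, a particular sign convention and the Bernoulli-function form B(x) = x/(exp(x) - 1); it is not code from the article.

```python
import numpy as np

def bernoulli(x):
    # B(x) = x / (exp(x) - 1); the removable singularity at x = 0 is handled explicitly
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-12
    safe = np.where(small, 1.0, x)            # avoids 0/0 in the vectorized branch
    return np.where(small, 1.0, safe / np.expm1(safe))

def sg_flux(n_left, n_right, v_left, v_right, diff, h, vt=0.0259):
    """Scharfetter-Gummel electron flux between two adjacent nodes (one common sign
    convention); vt is the thermal voltage in volts, h the mesh spacing."""
    dv = (v_right - v_left) / vt
    return (diff / h) * (bernoulli(-dv) * n_right - bernoulli(dv) * n_left)

# With equal node potentials the flux reduces to plain diffusion, (D/h) * (n_right - n_left)
print(sg_flux(1.0e16, 2.0e16, 0.0, 0.0, diff=36.0, h=1.0e-4))
```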

  5. STRING 3: An Advanced Groundwater Flow Visualization Tool

    NASA Astrophysics Data System (ADS)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] focused solely on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution, continuing the work of Michel et al. [2], we discuss how to extend this approach to a full 3D tool and the challenges that entails. This elevates STRING from a post-production to an exploration tool for experts. In STRING, moving pathlets provide an intuition of the velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D poses many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining raytraced rendering of the volume with regular OpenGL geometries. This is achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain. Hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this the silhouette based on the angle of
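
    The moving pathlets described above are a Lagrangian construct: particle positions are integrated through the velocity field and rendered as short moving segments. The snippet below is only a generic illustration of that integration step on an analytic 2D vortex field; it is not the STRING/FPM implementation, and the field, time step and step count are arbitrary choices.

```python
import numpy as np

def velocity(p):
    # Analytic test field (a rigid-body vortex); a real tool would instead interpolate
    # simulated groundwater velocities at the particle position.
    x, y = p
    return np.array([-y, x])

def advect_pathlet(p0, dt=0.01, steps=200):
    """Trace one pathlet with classical 4th-order Runge-Kutta through a steady field."""
    path = [np.asarray(p0, dtype=float)]
    for _ in range(steps):
        p = path[-1]
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * dt * k1)
        k3 = velocity(p + 0.5 * dt * k2)
        k4 = velocity(p + dt * k3)
        path.append(p + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(path)

print(advect_pathlet((1.0, 0.0))[-1])   # stays close to the unit circle, as expected
```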

  6. Numerical Propulsion System Simulation (NPSS): An Award Winning Propulsion System Simulation Tool

    NASA Technical Reports Server (NTRS)

    Stauber, Laurel J.; Naiman, Cynthia G.

    2002-01-01

    The Numerical Propulsion System Simulation (NPSS) is a full propulsion system simulation tool used by aerospace engineers to predict and analyze the aerothermodynamic behavior of commercial jet aircraft, military applications, and space transportation. The NPSS framework was developed to support aerospace, but other applications are already leveraging the initial capabilities, such as aviation safety, ground-based power, and alternative energy conversion devices such as fuel cells. By using the framework and developing the necessary components, future applications that NPSS could support include nuclear power, water treatment, biomedicine, chemical processing, and marine propulsion. NPSS will dramatically reduce the time, effort, and expense necessary to design and test jet engines. It accomplishes that by generating sophisticated computer simulations of an aerospace object or system, thus enabling engineers to "test" various design options without having to conduct costly, time-consuming real-life tests. The ultimate goal of NPSS is to create a numerical "test cell" that enables engineers to create complete engine simulations overnight on cost-effective computing platforms. Using NPSS, engine designers will be able to analyze different parts of the engine simultaneously, perform different types of analysis simultaneously (e.g., aerodynamic and structural), and perform analysis in a more efficient and less costly manner. NPSS will cut the development time of a new engine in half, from 10 years to 5 years. And NPSS will have a similar effect on the cost of development: new jet engines will cost about a billion dollars to develop rather than two billion. NPSS is also being applied to the development of space transportation technologies, and it is expected that similar efficiencies and cost savings will result. Advancements of NPSS in fiscal year 2001 included enhancing the NPSS Developer's Kit to easily integrate external components of varying fidelities, providing

  7. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade studies, which require a reference cost basis to support adequate analytical rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), the “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. This report supports the application of the reference cost data in the cost and econometric systems analysis models. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market (domestic and international) and its impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from
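
    One methodology listed above is facility first-of-a-kind to nth-of-a-kind learning. A common way to express such learning is the Wright learning curve, in which each doubling of cumulative units multiplies the unit cost by a fixed progress ratio. The sketch below evaluates that relation with invented numbers; it is not drawn from the AFC Cost Basis report or from VISION.ECON/G4-ECONS.

```python
import math

def unit_cost(foak_cost, unit_number, progress_ratio=0.9):
    """Wright learning curve: each doubling of cumulative units multiplies
    the unit cost by the progress ratio (all values here are illustrative)."""
    b = math.log(progress_ratio, 2)          # learning exponent (negative for ratio < 1)
    return foak_cost * unit_number ** b

foak = 4.0e9                                 # hypothetical first-of-a-kind cost in dollars
for n in (1, 2, 4, 8):
    print(f"unit {n}: ${unit_cost(foak, n):,.0f}")
```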

  8. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. We compared in an experiment the design processes and learning outcomes of 24…

  9. New advanced radio diagnostics tools for Space Weather Program

    NASA Astrophysics Data System (ADS)

    Krankowski, A.; Rothkaehl, H.; Atamaniuk, B.; Morawski, M.; Zakharenkova, I.; Cherniak, I.; Otmianowska-Mazur, K.

    2013-12-01

    To give a more detailed and complete understanding of the physical plasma processes that govern solar-terrestrial space, and to develop qualitative and quantitative models of magnetosphere-ionosphere-thermosphere coupling, it is necessary to design and build the next generation of instruments for space diagnostics and monitoring. Novel ground-based wide-area sensor networks, such as the LOFAR (Low Frequency Array) radar facility, comprising wide-band and vector-sensing radio receivers, together with multi-spacecraft plasma diagnostics, should help solve outstanding problems of space physics and describe long-term environmental changes. The LOw Frequency ARray - LOFAR - is a new fully digital radio telescope, located in Europe, designed for frequencies between 30 MHz and 240 MHz. Three new LOFAR stations will be installed in Poland by summer 2015. The LOFAR facilities in Poland will be distributed among three sites: Lazy (east of Krakow), Borowiec near Poznan and Baldy near Olsztyn. All of them will be connected to Poznan via dedicated PIONIER links. Each site will host one LOFAR station (96 high-band + 96 low-band antennas). Most of the time they will work as part of the European network; however, when less heavily loaded, they can operate as a national network. The new digital radio frequency analyzer (RFA) on board the low-orbiting RELEC satellite was designed to monitor and investigate ionospheric plasma properties. This two-point diagnostic, ground-based and located in the topside ionosphere, can be a useful new tool for monitoring and diagnosing turbulent plasma properties. The RFA on board the RELEC satellite is the first in a series of experiments planned to be launched into the near-Earth environment. In order to resolve and validate the large-scale and small-scale ionospheric structures, we will use the GPS observations collected at the IGS/EPN network to reconstruct diurnal variations of TEC using all satellite passes over individual GPS stations and the

  10. Introduction to NuMAD: A numerical manufacturing and design tool

    SciTech Connect

    Laird, D.L.; Ashwill, T.D.

    1997-11-01

    Given the complex geometry of most wind turbine blades, structural modeling using the finite element method is generally performed using a unique model for each particular blade analysis. Development time (often considerable) spent creating a model for one blade may not aid in the development of a model for a different blade. In an effort to reduce model development time and increase the usability of advanced finite element analysis capabilities, a new software tool, NuMAD, is being developed.

  11. Numerical Stability and Accuracy of Temporally Coupled Multi-Physics Modules in Wind-Turbine CAE Tools

    SciTech Connect

    Gasmi, A.; Sprague, M. A.; Jonkman, J. M.; Jones, W. B.

    2013-02-01

    In this paper we examine the stability and accuracy of numerical algorithms for coupling time-dependent multi-physics modules relevant to computer-aided engineering (CAE) of wind turbines. This work is motivated by an in-progress major revision of FAST, the National Renewable Energy Laboratory's (NREL's) premier aero-elastic CAE simulation tool. We employ two simple examples as test systems, while algorithm descriptions are kept general. Coupled-system governing equations are framed in monolithic and partitioned representations as differential-algebraic equations. Explicit and implicit loose partition coupling is examined. In explicit coupling, partitions are advanced in time from known information. In implicit coupling, there is dependence on other-partition data at the next time step; coupling is accomplished through a predictor-corrector (PC) approach. Numerical time integration of coupled ordinary-differential equations (ODEs) is accomplished with one of three, fourth-order fixed-time-increment methods: Runge-Kutta (RK), Adams-Bashforth (AB), and Adams-Bashforth-Moulton (ABM). Through numerical experiments it is shown that explicit coupling can be dramatically less stable and less accurate than simulations performed with the monolithic system. However, PC implicit coupling restored stability and fourth-order accuracy for ABM; only second-order accuracy was achieved with RK integration. For systems without constraints, explicit time integration with AB and explicit loose coupling exhibited desired accuracy and stability.
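
    To make the explicit versus predictor-corrector distinction concrete, the toy sketch below loosely couples two scalar partitions of a harmonic oscillator, once with purely explicit data exchange and once with a single predict-correct pass. It deliberately uses first-order Euler updates for brevity, so only the coupling pattern, not the fourth-order accuracy of the RK/AB/ABM integrators studied in the paper, is reproduced.

```python
import numpy as np

# Two "partitions" coupled through each other's state: x' = -y, y' = x.
# The exact monolithic solution stays on the unit circle; loose coupling drifts.

def explicit_coupling(x, y, dt):
    # Each partition advances using only the other's value from the previous step.
    return x + dt * (-y), y + dt * x

def predictor_corrector_coupling(x, y, dt):
    # Predict with old data, then correct once using the predicted partner state.
    x_pred = x + dt * (-y)
    y_pred = y + dt * x
    x_new = x + dt * (-(y + y_pred) / 2.0)
    y_new = y + dt * ((x + x_pred) / 2.0)
    return x_new, y_new

dt, steps = 0.01, 1000
for scheme in (explicit_coupling, predictor_corrector_coupling):
    x, y = 1.0, 0.0
    for _ in range(steps):
        x, y = scheme(x, y, dt)
    print(scheme.__name__, "radius after 10 s:", np.hypot(x, y))
```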

  12. Numerical approximation of a nonlinear delay-advance functional differential equation by a finite element method

    NASA Astrophysics Data System (ADS)

    Teodoro, M. F.

    2012-09-01

    We are particularly interested in the numerical solution of functional differential equations with symmetric delay and advance. In this work, we consider a nonlinear forward-backward equation, the FitzHugh-Nagumo equation. A scheme is presented which extends the algorithm introduced in [1]. A computational method using Newton's method, the finite element method and the method of steps is developed.
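
    The method of steps mentioned above advances the solution one delay interval at a time, so that the retarded argument always refers to an already-computed segment. The fragment below illustrates only that ingredient, on the classical pure-delay test equation x'(t) = -x(t - 1) with constant history, using forward Euler; the mixed delay-advance FitzHugh-Nagumo problem treated in the paper is considerably harder and is solved there with finite elements and Newton's method.

```python
import numpy as np

dt = 0.001
tau = 1.0
lag = int(round(tau / dt))              # grid points per delay interval
n_steps = 3 * lag                       # integrate over t in (0, 3]

x = np.empty(n_steps + lag + 1)
x[:lag + 1] = 1.0                       # constant history x(t) = 1 on [-1, 0]

for i in range(lag, lag + n_steps):     # index i corresponds to time t = (i - lag) * dt
    x[i + 1] = x[i] + dt * (-x[i - lag])    # forward Euler using the delayed value

print("x(1), x(2), x(3) ~", x[2 * lag], x[3 * lag], x[4 * lag])
```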

  13. Laser Hardening Prediction Tool Based On a Solid State Transformations Numerical Model

    SciTech Connect

    Martinez, S.; Ukar, E.; Lamikiz, A.

    2011-01-17

    This paper presents a tool to predict the hardened layer in selective laser hardening processes, where the laser beam heats the part locally while the bulk acts as a heat sink. The tool to accurately predict the temperature field in the workpiece is a numerical model that combines a three-dimensional transient numerical solution for heating in which it is possible to introduce different laser sources. The thermal field was modeled using a kinetic model based on the Johnson-Mehl-Avrami equation. Considering this equation, an experimental adjustment of transformation parameters was carried out to obtain the continuous heating transformation (CHT) diagrams. With the temperature field and CHT diagrams the model predicts the percentage of base material converted into austenite. These two parameters are used as a first step to estimate the depth of the hardened layer in the part. The model has been adjusted and validated with experimental data for DIN 1.2379, a cold work tool steel typically used in the mold and die making industry. This steel presents solid-state diffusive transformations at relatively low temperature. These transformations must be considered in order to get good accuracy of the temperature field prediction during the heating phase. For model validation, the surface temperature measured by pyrometry, the thermal field, as well as the hardened layer obtained from a metallographic study, were compared with the model data, showing good agreement.

  14. Laser Hardening Prediction Tool Based On a Solid State Transformations Numerical Model

    NASA Astrophysics Data System (ADS)

    Martínez, S.; Ukar, E.; Lamikiz, A.; Liebana, F.

    2011-01-01

    This paper presents a tool to predict the hardened layer in selective laser hardening processes, where the laser beam heats the part locally while the bulk acts as a heat sink. The tool to accurately predict the temperature field in the workpiece is a numerical model that combines a three-dimensional transient numerical solution for heating in which it is possible to introduce different laser sources. The thermal field was modeled using a kinetic model based on the Johnson-Mehl-Avrami equation. Considering this equation, an experimental adjustment of transformation parameters was carried out to obtain the continuous heating transformation (CHT) diagrams. With the temperature field and CHT diagrams the model predicts the percentage of base material converted into austenite. These two parameters are used as a first step to estimate the depth of the hardened layer in the part. The model has been adjusted and validated with experimental data for DIN 1.2379, a cold work tool steel typically used in the mold and die making industry. This steel presents solid-state diffusive transformations at relatively low temperature. These transformations must be considered in order to get good accuracy of the temperature field prediction during the heating phase. For model validation, the surface temperature measured by pyrometry, the thermal field, as well as the hardened layer obtained from a metallographic study, were compared with the model data, showing good agreement.
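
    The Johnson-Mehl-Avrami kinetics used in both records above express the isothermally transformed phase fraction as X(t) = 1 - exp(-k t^n), with k and n calibrated from the CHT experiments. The snippet below merely evaluates that expression with arbitrary constants; it does not use the DIN 1.2379 calibration from the papers.

```python
import numpy as np

def jmak_fraction(t, k, n):
    """Johnson-Mehl-Avrami(-Kolmogorov) transformed fraction X(t) = 1 - exp(-k * t**n)."""
    return 1.0 - np.exp(-k * np.power(t, n))

# Hypothetical kinetic constants; real values come from the calibrated CHT diagrams.
t = np.linspace(0.0, 5.0, 6)               # seconds at an (assumed) constant temperature
print(jmak_fraction(t, k=0.8, n=2.0))      # fraction of base material transformed
```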

  15. Restricted numerical range: A versatile tool in the theory of quantum information

    NASA Astrophysics Data System (ADS)

    Gawron, Piotr; Puchała, Zbigniew; Miszczak, Jarosław Adam; Skowronek, Łukasz; Życzkowski, Karol

    2010-10-01

    The numerical range of a Hermitian operator X is defined as the set of all possible expectation values of this observable over normalized quantum states. We analyze a modification of this definition in which the expectation value is taken over a certain subset of the set of all quantum states. One considers, for instance, the set of real states, the set of product states, separable states, or the set of maximally entangled states. We show exemplary applications of these algebraic tools in the theory of quantum information: analysis of k-positive maps and entanglement witnesses, as well as study of the minimal output entropy of a quantum channel. The product numerical range of a unitary operator is used to solve the problem of local distinguishability of a family of two unitary gates.
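
    As a concrete illustration of a restricted numerical range, the snippet below estimates the product numerical range of a two-qubit Hermitian operator by sampling random product states and collecting their expectation values. This is a crude Monte Carlo picture rather than the algebraic machinery of the paper, and the diagonal test operator is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_state(dim):
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def sample_product_numerical_range(X, dims, samples=10000):
    """Monte Carlo sample of <a x b| X |a x b> over random product states on C^d1 (x) C^d2."""
    d1, d2 = dims
    vals = []
    for _ in range(samples):
        psi = np.kron(random_state(d1), random_state(d2))
        vals.append(np.vdot(psi, X @ psi))
    return np.array(vals)

# Hermitian test operator on two qubits; for Hermitian X the sampled values are real.
X = np.diag([1.0, 0.0, 0.0, -1.0])
vals = sample_product_numerical_range(X, (2, 2)).real
print("sampled product numerical range within [", vals.min(), ",", vals.max(), "]")
```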

  16. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  17. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  18. A numerical tool for reproducing driver behaviour: experiments and predictive simulations.

    PubMed

    Casucci, M; Marchitto, M; Cacciabue, P C

    2010-03-01

    This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full scale simulator for validating the simulation. Then the predictive potentiality of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions. PMID:19249745

  19. Advanced Numerical Imaging Procedure Accounting for Non-Ideal Effects in GPR Scenarios

    NASA Astrophysics Data System (ADS)

    Comite, Davide; Galli, Alessandro; Catapano, Ilaria; Soldovieri, Francesco

    2015-04-01

    The capability to provide fast and reliable imaging of targets and interfaces in non-accessible probed scenarios is a topic of great scientific interest, and many investigations have shown that Ground Penetrating Radar (GPR) can provide an efficient technique to conduct this kind of analysis in various applications of geophysical nature and civil engineering. In these cases, the development of an efficient and accurate imaging procedure is strongly dependent on the capability of accounting for the incident field that activates the scattering phenomenon. In this frame, based on a suitable implementation of an electromagnetic (EM) CAD tool (CST Microwave Studio), it has been possible to accurately and efficiently model the radiation pattern of real antennas in environments typically considered in GPR surveys [1]. A typical scenario of our interest is constituted by targets hidden in a ground medium, described by certain EM parameters and probed by a movable GPR using interfacial antennas [2]. The transmitting and receiving antennas considered here are Vivaldi ones, but a wide variety of other antennas can be modeled and designed, similar to those ones available in commercial GPR systems. Hence, an advanced version of a well-known microwave tomography approach (MTA) [3] has been implemented, both in the canonical 2D scalar case and in the more realistic 3D vectorial one. Such an approach is able to account for the real distribution of the radiated and scattered EM fields. Comparisons of results obtained by means of a 'conventional' implementation of the MTA, where the antennas are modeled as ideal line sources, and by means of our 'advanced' approach, which instead takes into account the radiation features of the chosen antenna type, have been carried out and discussed. Since the antenna radiation patterns are modified by the probed environment, whose EM features and the possible stratified structure usually are not exactly known, the imaging capabilities of the MTA

  20. Left Ventricular Flow Analysis: Recent Advances in Numerical Methods and Applications in Cardiac Ultrasound

    PubMed Central

    Borazjani, Iman; Westerdale, John; McMahon, Eileen M.; Rajaraman, Prathish K.; Heys, Jeffrey J.

    2013-01-01

    The left ventricle (LV) pumps oxygenated blood from the lungs to the rest of the body through systemic circulation. The efficiency of such a pumping function is dependent on blood flow within the LV chamber. It is therefore crucial to accurately characterize LV hemodynamics. Improved understanding of LV hemodynamics is expected to provide important clinical diagnostic and prognostic information. We review the recent advances in numerical and experimental methods for characterizing LV flows and focus on analysis of intraventricular flow fields by echocardiographic particle image velocimetry (echo-PIV), due to its potential for broad and practical utility. Future research directions to advance patient-specific LV simulations include development of methods capable of resolving heart valves, higher temporal resolution, automated generation of three-dimensional (3D) geometry, and incorporating actual flow measurements into the numerical solution of the 3D cardiovascular fluid dynamics. PMID:23690874

  1. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for detection of pollution sources. However, the resources involved, and the scientific and technological levels needed in the manipulation of numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can both provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of them allowing the end user to have control over the model simulations. Parameters such as date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or ftp sites, and then automatically interpolated and pre-processed to be available for the simulators. These simulation tools can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus can be very

  2. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or in prioritizing specific areas. They can also be used for detection of pollution sources. However, the resources involved, and the scientific and technological levels needed in the manipulation of numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can both provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or the Portuguese coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of them allowing the end user to have control over the model simulations. Parameters such as date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or ftp sites, and then automatically interpolated and pre-processed to be available for the simulators. These simulation tools can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus can be very
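
    Both records above describe pulling metocean forecasts from remote THREDDS servers via OPENDAP before interpolating them for the spill simulators. The fragment below shows what such a remote read can look like with the xarray library; the server URL and variable names are placeholders, since the actual ARCOPOL/EASYCO endpoints are not given in the abstract.

```python
import xarray as xr

# Placeholder THREDDS/OPeNDAP endpoint and variable names -- not the real project servers.
URL = "https://example-thredds-server/thredds/dodsC/forecasts/hydrodynamics_latest.nc"

def load_surface_currents(url=URL):
    """Open a remote forecast lazily via OPeNDAP and return the surface current components."""
    ds = xr.open_dataset(url)                 # data are streamed over the network on demand
    return ds["u_surface"], ds["v_surface"]   # hypothetical variable names

if __name__ == "__main__":
    u, v = load_surface_currents()            # requires a reachable OPeNDAP server
    print(u.sizes, v.sizes)
```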

  3. Penetration of cutting tool into cortical bone: experimental and numerical investigation of anisotropic mechanical behaviour.

    PubMed

    Li, Simin; Abdel-Wahab, Adel; Demirci, Emrah; Silberschmidt, Vadim V

    2014-03-21

    The anisotropic mechanical behaviour of cortical bone and its intrinsic hierarchical microstructure act as protective mechanisms to prevent catastrophic failure under natural loading conditions; however, they increase the complexity of the penetration process in the case of orthopaedic surgery. Experimental results available in the literature provide only limited information about processes in the vicinity of the tool-bone interaction zone. Also, available numerical models of the bone-cutting process do not account for material anisotropy or the effect of damage mechanisms. In this study, both experimental and numerical studies were conducted to address these issues and to elucidate the effect of the anisotropic mechanical behaviour of cortical bone tissue on penetration of a sharp cutting tool. First, a set of tool-penetration experiments was performed in directions parallel and perpendicular to the bone axis. These experiments also included bone samples cut from four different cortices to evaluate the effect of spatial variability and material anisotropy on the penetration processes. Distinct deformation and damage mechanisms linked to different microstructure orientations were captured using a micro-lens high-speed camera. Then, a novel hybrid FE model employing a smoothed-particle-hydrodynamics domain embedded into a continuum FE one was developed based on the experimental configuration to characterise the anisotropic deformation and damage behaviour of cortical bone under a penetration process. The results of our study revealed a clear anisotropic material behaviour of the studied cortical bone tissue and the influence of the underlying microstructure. The proposed FE model adequately reflected the experimental results and demonstrated the need for an anisotropic and damage material model to analyse cutting of cortical-bone tissue.

  4. Ductile damage prediction in metal forming processes: Advanced modeling and numerical simulation

    NASA Astrophysics Data System (ADS)

    Saanouni, K.

    2013-05-01

    This paper describes the needs of modern virtual metal forming, including both sheet and bulk metal forming of mechanical components. These concern the advanced modeling of thermo-mechanical behavior, including the multiphysical phenomena and their interaction or strong coupling, as well as the associated numerical aspects using fully adaptive simulation strategies. First, a survey of advanced constitutive equations accounting for the main thermomechanical phenomena, such as thermo-elasto-plastic finite strains with isotropic and kinematic hardening fully coupled with ductile damage, will be presented. Only the macroscopic phenomenological approach with state variables (monoscale approach) will be discussed, in the general framework of rational thermodynamics for generalized micromorphic continua. The micro-macro (multi-scale) approach in the framework of polycrystalline inelasticity is not presented here for the sake of brevity but will be covered in the oral presentation. The main numerical aspects related to the solution of the associated initial and boundary value problem will be outlined. A fully adaptive numerical methodology will be briefly described and some numerical examples will be given in order to show the high predictive capabilities of this adaptive methodology for virtual metal forming simulations.

  5. Water quality management of aquifer recharge using advanced tools.

    PubMed

    Lazarova, Valentina; Emsellem, Yves; Paille, Julie; Glucina, Karl; Gislette, Philippe

    2011-01-01

    Managed aquifer recharge (MAR) with recycled water or other alternative resources is one of the most rapidly growing techniques and is viewed as a necessity in water-short areas. In order to better control the health and environmental effects of MAR, this paper presents two case studies demonstrating how to improve water quality, enable reliable tracing of injected water, and better control and manage MAR operation in the case of indirect and direct aquifer recharge. Two water quality management strategies are illustrated on two full-scale case studies, including the results of the combination of non-conventional and advanced technologies for water quality improvement, comprehensive sampling and monitoring programs including emerging pollutants, tracer studies using boron isotopes, and integrative 3D GIS hydraulic and hydrodispersive aquifer modelling.

  6. Advanced Epi Tools for Gallium Nitride Light Emitting Diode Devices

    SciTech Connect

    Patibandla, Nag; Agrawal, Vivek

    2012-12-01

    Over the course of this program, Applied Materials, Inc., with generous support from the United States Department of Energy, developed a world-class three-chamber III-Nitride epi cluster tool for low-cost, high-volume GaN growth for the solid state lighting industry. One of the major achievements of the program was to design, build, and demonstrate the world’s largest wafer-capacity HVPE chamber suitable for repeatable high-volume III-Nitride template and device manufacturing. Applied Materials’ experience in developing deposition chambers for the silicon chip industry over many decades resulted in many orders of magnitude reductions in the price of transistors. That experience and understanding were used in developing this GaN epi deposition tool. The multi-chamber approach, which continues to be unique in the ability of each chamber to deposit a section of the full device structure, unlike other cluster tools, allows for extreme flexibility in the manufacturing process. This robust architecture is suitable not just for the LED industry, but for GaN power devices as well, both horizontal and vertical designs. The new HVPE technology developed allows GaN to be grown at a rate unheard of with MOCVD, up to 20x the typical MOCVD rate of 3 µm per hour, with bulk crystal quality better than the highest-quality commercial GaN films grown by MOCVD, at a much cheaper overall cost. This is a unique development, as the HVPE process has been known for decades but never successfully commercially developed for high-volume manufacturing. This research shows the potential of the first commercial-grade HVPE chamber, an elusive goal for III-V researchers and those wanting to capitalize on the promise of HVPE. Additionally, in the course of this program, Applied Materials built two MOCVD chambers, in addition to the HVPE chamber, and a robot that moves wafers between them. The MOCVD chambers demonstrated industry-leading wavelength yield for GaN-based LED wafers and industry

  7. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    SciTech Connect

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings are collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  8. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems

    NASA Astrophysics Data System (ADS)

    Mitin, D.; Grobis, M.; Albrecht, M.

    2016-02-01

    An advanced scanning magnetoresistive microscopy (SMRM) — a robust magnetic imaging and probing technique — will be presented, which utilizes state-of-the-art recording heads of a hard disk drive as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is nowadays comparable to the more commonly used magnetic force microscopes. Important advantages of SMRM are the ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, negligible sensor stray fields, and the ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated by presenting various examples, including a temperature dependent recording study on hard magnetic L10 FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses.

  9. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems.

    PubMed

    Mitin, D; Grobis, M; Albrecht, M

    2016-02-01

    An advanced scanning magnetoresistive microscopy (SMRM) - a robust magnetic imaging and probing technique - will be presented, which utilizes state-of-the-art recording heads of a hard disk drive as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is nowadays comparable to the more commonly used magnetic force microscopes. Important advantages of SMRM are the ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, negligible sensor stray fields, and the ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated by presenting various examples, including a temperature dependent recording study on hard magnetic L1(0) FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses. PMID:26931856

  10. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control. Air Traffic Control is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. Also, this should dispel forever any notion that advanced flow control is the inspirational master valve scheme to be used on the Alaskan Oil Pipeline.

  11. Advanced tools for astronomical time series and image analysis

    NASA Astrophysics Data System (ADS)

    Scargle, Jeffrey D.

    The algorithms described here, which I have developed for applications in X-ray and γ-ray astronomy, will hopefully be of use in other ways, perhaps aiding in the exploration of modern astronomy's data cornucopia. The goal is to describe principled approaches to some ubiquitous problems, such as detection and characterization of periodic and aperiodic signals, estimation of time delays between multiple time series, and source detection in noisy images with noisy backgrounds. The latter problem is related to detection of clusters in data spaces of various dimensions. A goal of this work is to achieve a unifying view of several related topics: signal detection and characterization, cluster identification, classification, density estimation, and multivariate regression. In addition to being useful for analysis of data from space-based and ground-based missions, these algorithms may be a basis for a future automatic science discovery facility, and in turn provide analysis tools for the Virtual Observatory. This chapter has ties to those by Larry Bretthorst, Tom Loredo, Alanna Connors, Fionn Murtagh, Jim Berger, David van Dyk, Vicent Martinez & Enn Saar.
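
    Among the ubiquitous problems listed is the detection of periodic signals in unevenly sampled time series. The snippet below applies the standard Lomb-Scargle periodogram from SciPy to a synthetic, irregularly sampled light curve; it is only a generic illustration of that problem, not an implementation of the chapter's specific algorithms.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)

# Unevenly sampled sinusoid plus noise, mimicking an irregular light curve.
t = np.sort(rng.uniform(0.0, 100.0, size=300))
true_omega = 0.7                                   # angular frequency [rad per time unit]
y = np.sin(true_omega * t) + 0.5 * rng.normal(size=300)

omegas = np.linspace(0.05, 3.0, 2000)              # angular frequencies to scan
power = lombscargle(t, y - y.mean(), omegas)

print("periodogram peak at angular frequency ~", omegas[np.argmax(power)])
```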

  12. Benchmark of numerical tools simulating beam propagation and secondary particles in ITER NBI

    NASA Astrophysics Data System (ADS)

    Sartori, E.; Veltri, P.; Dlougach, E.; Hemsworth, R.; Serianni, G.; Singh, M.

    2015-04-01

    Injection of high energy beams of neutral particles is a method for plasma heating in fusion devices. The ITER injector, and its prototype MITICA (Megavolt ITER Injector and Concept Advancement), are large extrapolations from existing devices: therefore numerical modeling is needed to set thermo-mechanical requirements for all beam-facing components. As the power and charge deposition originates from several sources (primary beam, co-accelerated electrons, and secondary production by beam-gas, beam-surface, and electron-surface interaction), the beam propagation along the beam line is simulated by comprehensive 3D models. This paper presents a comparative study between two codes: BTR has been used for several years in the design of the ITER HNB/DNB components; SAMANTHA code was independently developed and includes additional phenomena, such as secondary particles generated by collision of beam particles with the background gas. The code comparison is valuable in the perspective of the upcoming experimental operations, in order to prepare a reliable numerical support to the interpretation of experimental measurements in the beam test facilities. The power density map calculated on the Electrostatic Residual Ion Dump (ERID) is the chosen benchmark, as it depends on the electric and magnetic fields as well as on the evolution of the beam species via interaction with the gas. Finally the paper shows additional results provided by SAMANTHA, like the secondary electrons produced by volume processes accelerated by the ERID fringe-field towards the Cryopumps.

  13. Benchmark of numerical tools simulating beam propagation and secondary particles in ITER NBI

    SciTech Connect

    Sartori, E. Veltri, P.; Serianni, G.; Dlougach, E.; Hemsworth, R.; Singh, M.

    2015-04-08

    Injection of high energy beams of neutral particles is a method for plasma heating in fusion devices. The ITER injector, and its prototype MITICA (Megavolt ITER Injector and Concept Advancement), are large extrapolations from existing devices: therefore numerical modeling is needed to set thermo-mechanical requirements for all beam-facing components. As the power and charge deposition originates from several sources (primary beam, co-accelerated electrons, and secondary production by beam-gas, beam-surface, and electron-surface interaction), the beam propagation along the beam line is simulated by comprehensive 3D models. This paper presents a comparative study between two codes: BTR has been used for several years in the design of the ITER HNB/DNB components; SAMANTHA code was independently developed and includes additional phenomena, such as secondary particles generated by collision of beam particles with the background gas. The code comparison is valuable in the perspective of the upcoming experimental operations, in order to prepare a reliable numerical support to the interpretation of experimental measurements in the beam test facilities. The power density map calculated on the Electrostatic Residual Ion Dump (ERID) is the chosen benchmark, as it depends on the electric and magnetic fields as well as on the evolution of the beam species via interaction with the gas. Finally the paper shows additional results provided by SAMANTHA, like the secondary electrons produced by volume processes accelerated by the ERID fringe-field towards the Cryopumps.
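
    Much of the benchmark above concerns how the beam species evolve through interaction with the background gas along the beamline. The toy calculation below shows the simplest ingredient of that evolution, exponential attenuation of fast particles over a path length for a single constant-cross-section loss channel; the density, cross-section and path values are invented, and the actual codes (BTR, SAMANTHA) track far more physics.

```python
import numpy as np

def beam_survival(z, gas_density, cross_section):
    """Fraction of beam particles surviving a path length z through background gas,
    assuming one loss channel with a constant cross-section (illustrative only)."""
    return np.exp(-gas_density * cross_section * z)

z = np.linspace(0.0, 25.0, 6)          # metres along the beamline (hypothetical)
n_gas = 1.0e18                         # background gas density [m^-3] (hypothetical)
sigma = 1.0e-20                        # loss cross-section [m^2] (hypothetical)
print(beam_survival(z, n_gas, sigma))
```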

  14. Advancing alternate tools: why science education needs CRP and CRT

    NASA Astrophysics Data System (ADS)

    Dodo Seriki, Vanessa

    2016-09-01

    Ridgeway and Yerrick's paper, Whose banner are we waving?: exploring STEM partnerships for marginalized urban youth, unearthed the tensions that existed between a local community "expert" and a group of students and their facilitator in an afterschool program. Those of us who work with youth who are traditionally marginalized, understand the importance of teaching in culturally relevant ways, but far too often—as Ridgeway and Yerrick shared—community partners have beliefs, motives, and ideologies that are incompatible to the program's mission and goals. Nevertheless, we often enter partnerships assuming that the other party understands the needs of the students or community; understands how in U.S. society White is normative while all others are deficient; and understands how to engage with students in culturally relevant ways. This forum addresses the underlying assumption, described in the Ridgeway and Yerrick article, that educators—despite their background and experiences—are able to teach in culturally relevant ways. Additionally, I assert based on the finding in the article that just as Ladson-Billings and Tate (Teach Coll Rec 97(1):47-68, 1995) asserted, race in the U.S. society, as a scholarly pursuit, was under theorized. The same is true of science education; race in science education is under theorized and the use of culturally relevant pedagogy and critical race theory as a pedagogical model and analytical tool, respectively, in science education is minimal. The increased use of both would impact our understanding of who does science, and how to broaden participation among people of color.

  15. Exploring the nonequilibrium dynamics of ultracold quantum gases by using numerical tools

    NASA Astrophysics Data System (ADS)

    Heidrich-Meisner, Fabian

    Numerical tools such as exact diagonalization or the density matrix renormalization group method have been vital for the study of the nonequilibrium dynamics of strongly correlated many-body systems. Moreover, they have provided unique insight for the interpretation of quantum gas experiments whenever a direct comparison with theory is possible. By considering the example of the experiment by Ronzheimer et al., in which both an interaction quench and the release of bosons from a trap into an empty optical lattice (sudden expansion) were realized, I discuss several nonequilibrium effects of strongly interacting quantum gases. These include the thermalization of a closed quantum system and its connection to the eigenstate thermalization hypothesis, nonequilibrium mass transport, dynamical fermionization, and transient phenomena such as quantum distillation or dynamical quasicondensation. I highlight the role of integrability in giving rise to ballistic transport in strongly interacting 1D systems and in determining the asymptotic state after a quantum quench. The talk concludes with a perspective on open questions concerning 2D systems and the numerical simulation of their nonequilibrium dynamics. Supported by Deutsche Forschungsgemeinschaft (DFG) via FOR 801.
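
    Exact diagonalization, one of the numerical tools named above, represents the full many-body Hamiltonian as a matrix in a finite basis and evolves the wavefunction exactly. The sketch below does this for a tiny XX spin chain prepared with two adjacent up spins in the centre, loosely mimicking a sudden-expansion quench; the system size, model and parameters are toy choices, not those of the Ronzheimer et al. experiment.

```python
import numpy as np
from functools import reduce

L = 6                                           # number of lattice sites (kept tiny)
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
eye = np.eye(2, dtype=complex)

def site_op(op, i):
    ops = [eye] * L
    ops[i] = op
    return reduce(np.kron, ops)

# XX chain: H = sum_i (Sx_i Sx_{i+1} + Sy_i Sy_{i+1}), i.e. pure hopping of up spins.
H = sum(site_op(sx, i) @ site_op(sx, i + 1) + site_op(sy, i) @ site_op(sy, i + 1)
        for i in range(L - 1))

up, down = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)
psi0 = reduce(np.kron, [down, down, up, up, down, down])   # two particles in the centre

evals, evecs = np.linalg.eigh(H)                # full diagonalization
def evolve(psi, t):
    return evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi))

densities = [site_op(sz, i) + 0.5 * np.eye(2 ** L) for i in range(L)]
psi_t = evolve(psi0, 3.0)
print([round(float(np.vdot(psi_t, n @ psi_t).real), 3) for n in densities])
```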

  16. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper-division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease of use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; multi-aperture analyses of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed

  17. Numerical study of electromagnetic waves generated by a prototype dielectric logging tool

    USGS Publications Warehouse

    Ellefsen, K.J.; Abraham, J.D.; Wright, D.L.; Mazzella, A.T.

    2004-01-01

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than that in the formation (e.g., an air-filled borehole in the unsaturated zone), only a guided wave propagated along the borehole. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave radiated electromagnetic energy into the formation, causing its amplitude to decrease. When the propagation velocity in the borehole was less than that in the formation (e.g., a water-filled borehole in the saturated zone), both a refracted wave and a guided wave propagated along the borehole. The velocity of the refracted wave equaled the phase velocity of a plane wave in the formation, and the refracted wave preceded the guided wave. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave did not radiate electromagnetic energy into the formation. To analyze traces recorded by the prototype tool during laboratory tests, they were compared to traces calculated with the finite-difference method. The first parts of both the recorded and the calculated traces were similar, indicating that guided and refracted waves indeed propagated along the prototype tool. © 2004 Society of Exploration Geophysicists. All rights reserved.
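
    The velocity comparison underlying the two regimes above is simply that of a plane wave in each medium, v = c / sqrt(eps_r): an air-filled borehole is fast relative to the formation, while a water-filled one is slow. The snippet below evaluates that relation for illustrative relative permittivities; the values are assumptions, not those used in the study.

```python
import math

C0 = 299_792_458.0                     # speed of light in vacuum [m/s]

def plane_wave_velocity(rel_permittivity):
    """Phase velocity of a plane EM wave in a low-loss dielectric: v = c / sqrt(eps_r)."""
    return C0 / math.sqrt(rel_permittivity)

# Illustrative permittivities (assumed values)
for name, eps in [("air-filled borehole", 1.0),
                  ("unsaturated formation", 5.0),
                  ("water-filled borehole", 81.0)]:
    print(f"{name:>22}: {plane_wave_velocity(eps):.3e} m/s")
```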

  18. Experimental and numerical approaches for application of density and thermal neutron tools in slim borehole

    NASA Astrophysics Data System (ADS)

    Hwang, Seho; Shin, Jehyun; Won, Byeongho; Kim, Jongman

    2015-04-01

    To perform groundwater investigations, geological surveys and geotechnical investigations, a borehole of about 3 inches in diameter is generally drilled, and PVC or steel casing with a 50 mm inner diameter is installed to prevent borehole collapse in the case of shallow unconsolidated formations or fractured zones. In this case, well logging for formation evaluation has many limitations, and radioactive tools with large diameters are especially difficult to apply. The radioactive logs that can be applied within the casing are the natural gamma ray log, the density log and neutron logs. The natural gamma ray log is used for estimation of shale volume and for stratigraphic and facies classification (e.g. shale versus sandstone), and most borehole environments can be corrected for using manufacturer's charts. In the case of a small-diameter borehole, such as a 50 mm diameter cased borehole, we must apply small-diameter radioactive logging tools. However, the measured data are generally counts per second, so we must convert the measured count rate into meaningful physical properties such as density or neutron porosity according to the strength of the radioactive source, the distance between the source and the detector, the mud and casing type, and so on. In this study, experimental and numerical methods are used to convert the measured count rate into density and neutron porosity for density and neutron logging tools having one detector. 1 Ci Am-Be single-detector neutron logs were compared with 3 Ci Am-Be dual-detector neutron logs in the same boreholes, and an empirical relationship between the single and dual neutron logs was derived. The diameters and lithologies of the target boreholes are 3 inches and granite, sandstone, mud, etc. The response characteristics of the very small diameter, non-oriented radioactive source density logging (4-pi omni-directional source) were analyzed using MCNP. Numerical modeling was performed while varying the distance of the radioactive source - detector
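
    The central conversion problem described above, turning a raw count rate into a bulk density for a single-detector tool, is often handled with a calibration against reference blocks, assuming an approximately exponential count-versus-density response. The sketch below shows such a two-point calibration with invented numbers; the study itself derives its conversions from borehole experiments and MCNP modelling.

```python
import math

# Hypothetical two-point calibration: (bulk density [g/cm^3], count rate [cps]).
CAL_POINTS = [(2.20, 9500.0), (2.71, 4200.0)]

(rho1, c1), (rho2, c2) = CAL_POINTS
slope = (math.log(c2) - math.log(c1)) / (rho2 - rho1)   # assumed exponential response

def counts_to_density(cps):
    """Invert the assumed response: rho = rho1 + (ln(cps) - ln(c1)) / slope."""
    return rho1 + (math.log(cps) - math.log(c1)) / slope

print(counts_to_density(6000.0))   # density estimate for a measured 6000 cps
```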

  19. Advances in Coupling of Kinetics and Molecular Scale Tools to Shed Light on Soil Biogeochemical Processes

    SciTech Connect

    Sparks, Donald

    2014-09-02

    Biogeochemical processes in soils such as sorption, precipitation, and redox play critical roles in the cycling and fate of nutrients, metal(loid)s and organic chemicals in soil and water environments. Advanced analytical tools enable soil scientists to track these processes in real-time and at the molecular scale. Our review focuses on recent research that has employed state-of-the-art molecular scale spectroscopy, coupled with kinetics, to elucidate the mechanisms of nutrient and metal(loid) reactivity and speciation in soils. We found that by coupling kinetics with advanced molecular and nano-scale tools major advances have been made in elucidating important soil chemical processes including sorption, precipitation, dissolution, and redox of metal(loids) and nutrients. Such advances will aid in better predicting the fate and mobility of nutrients and contaminants in soils and water and enhance environmental and agricultural sustainability.

  20. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    DOE PAGES

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.

  1. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    SciTech Connect

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.
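    As an illustration of the curvature-via-heights idea highlighted in the two records above, the sketch below evaluates the height-function curvature formula kappa = H''/(1 + H'^2)^(3/2) on local interface heights in three adjacent columns (standing in for column sums of volume fractions); the grid spacing, stencil, and analytically computed heights are assumptions of this sketch, not the reviewed algorithms themselves.

```python
import numpy as np

# Height-function curvature sketch on a circular interface of curvature magnitude 1/R.
h = 1.0 / 64                        # cell size (assumed)
R, xc, yc = 0.3, 0.5, 0.5           # circle parameters (assumed)

def column_height(xl, xr, y0):
    """Average fluid height below y = yc + sqrt(R^2 - (x-xc)^2) over column [xl, xr],
    measured from the stencil base y0 (proxy for a column sum of volume fractions)."""
    xs = np.linspace(xl, xr, 200)
    ys = yc + np.sqrt(np.maximum(R**2 - (xs - xc) ** 2, 0.0))
    return float(np.mean(ys - y0))

i = 34                              # a column near the top of the circle
y0 = yc + R - 3.5 * h               # base of a 7-cell-tall stencil
H = np.array([column_height((i + d) * h, (i + d + 1) * h, y0) for d in (-1, 0, 1)])
Hx = (H[2] - H[0]) / (2.0 * h)
Hxx = (H[2] - 2.0 * H[1] + H[0]) / h**2
kappa = Hxx / (1.0 + Hx**2) ** 1.5
print("estimated curvature:", kappa, "  exact (-1/R):", -1.0 / R)
```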

  2. Numerical study of Alfvén eigenmodes in the Experimental Advanced Superconducting Tokamak

    SciTech Connect

    Hu, Youjun; Li, Guoqiang; Yang, Wenjun; Zhou, Deng; Ren, Qilong; Gorelenkov, N. N.; Cai, Huishan

    2014-05-15

    Alfvén eigenmodes in up-down asymmetric tokamak equilibria are studied by a new magnetohydrodynamic eigenvalue code. The code is verified with the NOVA code for the Solovév equilibrium and is then used to study Alfvén eigenmodes in an up-down asymmetric equilibrium of the Experimental Advanced Superconducting Tokamak. The frequency and mode structure of toroidicity-induced Alfvén eigenmodes are calculated. It is demonstrated numerically that up-down asymmetry induces phase variation in the eigenfunction across the major radius on the midplane.

  3. Simulation studies of the impact of advanced observing systems on numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Atlas, R.; Kalnay, E.; Susskind, J.; Reuter, D.; Baker, W. E.; Halem, M.

    1984-01-01

    To study the potential impact of advanced passive sounders and lidar temperature, pressure, humidity, and wind observing systems on large-scale numerical weather prediction, a series of realistic simulation studies between the European Centre for Medium-Range Weather Forecasts, the National Meteorological Center, and the Goddard Laboratory for Atmospheric Sciences is conducted. The project attempts to avoid the unrealistic character of earlier simulation studies. The previous simulation studies and real-data impact tests are reviewed and the design of the current simulation system is described. Consideration is given to the simulation of observations of space-based sounding systems.

  4. Numerical Weather Prediction Models on Linux Boxes as tools in meteorological education in Hungary

    NASA Astrophysics Data System (ADS)

    Gyongyosi, A. Z.; Andre, K.; Salavec, P.; Horanyi, A.; Szepszo, G.; Mille, M.; Tasnadi, P.; Weidiger, T.

    2012-04-01

    Numerical modelling has become a common tool in the daily practice of weather forecasters due to i) increasing user demand for weather data, ii) the growth in computer resources, iii) numerical weather prediction systems available for integration on affordable, off-the-shelf computers, and iv) the availability of input data (from ECMWF or NCEP) for model integrations. Beside learning the theoretical basis, since last year students working on their MSc or BSc theses or on student research projects have had the opportunity to run numerical models and to analyze the outputs for different purposes, including wind energy estimation, simulation of the dynamics of a polar low and of subtropical cyclones, analysis of the isentropic potential vorticity field, examination of coupled atmospheric dispersion models, etc. A special course on the application of numerical modelling has been held (and is being announced for the upcoming semester) for our students in order to improve their skills in this field. Several numerical model systems (NRIPR, ETA and WRF) have been adapted at the University and tested and used for the geographical region of the Carpathian Basin. Recently ALADIN/CHAPEAU, the academic version of the ARPEGE ALADIN cy33t1 meso-scale numerical weather prediction model system (the operational forecasting tool of our National Weather Service), has been installed at our Institute. ALADIN is the operational forecasting model of the Hungarian Meteorological Service and is developed in the framework of the international ALADIN co-operation. Our main objectives are i) the analysis of different typical weather situations, ii) fine tuning of parameterization schemes and iii) comparison of the ALADIN/CHAPEAU and WRF model outputs based on case studies. The necessary hardware and software innovations have been done. In the presentation the

  5. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  6. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  7. Foot-ankle simulators: A tool to advance biomechanical understanding of a complex anatomical structure.

    PubMed

    Natsakis, Tassos; Burg, Josefien; Dereymaeker, Greta; Jonkers, Ilse; Vander Sloten, Jos

    2016-05-01

    In vitro gait simulations have been available to researchers for more than two decades and have become an invaluable tool for understanding fundamental foot-ankle biomechanics. This has been realised through several incremental technological and methodological developments, such as the actuation of muscle tendons, the increase in controlled degrees of freedom and the use of advanced control schemes. Furthermore, in vitro experimentation enabled performing highly repeatable and controllable simulations of gait during simultaneous measurement of several biomechanical signals (e.g. bone kinematics, intra-articular pressure distribution, bone strain). Such signals cannot always be captured in detail using in vivo techniques, and the importance of in vitro experimentation is therefore highlighted. The information provided by in vitro gait simulations enabled researchers to answer numerous clinical questions related to pathology, injury and surgery. In this article, first an overview of the developments in design and methodology of the various foot-ankle simulators is presented. Furthermore, an overview of the conducted studies is outlined and an example of a study aiming at understanding the differences in kinematics of the hindfoot, ankle and subtalar joints after total ankle arthroplasty is presented. Finally, the limitations and future perspectives of in vitro experimentation and in particular of foot-ankle gait simulators are discussed. It is expected that the biofidelic nature of the controllers will be improved in order to make them more subject-specific and to link foot motion to the simulated behaviour of the entire missing body, providing additional information for understanding the complex anatomical structure of the foot. PMID:27160562

  8. Advanced Numerical-Algebraic Thinking: Constructing the Concept of Covariation as a Prelude to the Concept of Function

    ERIC Educational Resources Information Center

    Hitt, Fernando; Morasse, Christian

    2009-01-01

    Introduction: In this document we stress the importance of developing in children a structure for advanced numerical-algebraic thinking that can provide an element of control when solving mathematical situations. We analyze pupils' conceptions that induce errors in algebra due to a lack of control in connection with their numerical thinking. We…

  9. DynamiX, numerical tool for design of next-generation x-ray telescopes.

    PubMed

    Chauvin, Maxime; Roques, Jean-Pierre

    2010-07-20

    We present a new code aimed at the simulation of grazing-incidence x-ray telescopes subject to deformations and demonstrate its ability with two test cases: the Simbol-X and the International X-ray Observatory (IXO) missions. The code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, accounting for the x-ray interactions and for the telescope motion and deformation. The simulation produces images and spectra for any telescope configuration using Wolter I mirrors and semiconductor detectors. This numerical tool allows us to study the telescope performance in terms of angular resolution, effective area, and detector efficiency, accounting for the telescope behavior. We have implemented an image reconstruction method based on the measurement of the detector drifts by an optical sensor metrology. Using an accurate metrology, this method allows us to recover the loss of angular resolution induced by the telescope instability. In the framework of the Simbol-X mission, this code was used to study the impacts of the parameters on the telescope performance. In this paper we present detailed performance analysis of Simbol-X, taking into account the satellite motions and the image reconstruction. To illustrate the versatility of the code, we present an additional performance analysis with a particular configuration of IXO.

  10. Pantograph catenary dynamic optimisation based on advanced multibody and finite element co-simulation tools

    NASA Astrophysics Data System (ADS)

    Massat, Jean-Pierre; Laurent, Christophe; Bianchi, Jean-Philippe; Balmès, Etienne

    2014-05-01

    This paper presents recent developments undertaken by the SNCF Innovation & Research Department on numerical modelling of pantograph-catenary interaction. It aims at describing an efficient co-simulation process between finite element (FE) and multibody (MB) modelling methods. FE catenary models are coupled with a fully flexible MB representation with pneumatic actuation of the pantograph. These advanced functionalities allow new kinds of numerical analyses, such as dynamic improvements based on innovative pneumatic suspensions or assessment of crash risks in crossing areas, that demonstrate the powerful capabilities of this computing approach.

  11. Numerical approach for the voloxidation process of an advanced spent fuel conditioning process (ACP)

    SciTech Connect

    Park, Byung Heung; Jeong, Sang Mun; Seo, Chung-Seok

    2007-07-01

    A voloxidation process is adopted as the first step of an advanced spent fuel conditioning process in order to prepare the spent fuel (SF) oxide to be reduced in the following electrolytic reduction process. A semi-batch type voloxidizer was devised to transform a SF pellet into powder. In this work, a simple reactor model was developed as a numerical approach for correlating the gas-phase flow rate with the operation time. Under the assumption that the solid and gas phases are homogeneous in the reactor, an oxidation reaction rate was introduced into a mass balance equation. The resulting equation can describe the change in the outlet oxygen concentration, including the case in which the gas flow is not sufficient to sustain the reaction at its maximum rate. (authors)
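    To make the reactor balance described above concrete, the sketch below integrates a hedged version of a well-mixed gas-phase oxygen balance, V dC/dt = Q(C_in - C) - r, with the consumption rate r capped either by the kinetic maximum or by the oxygen actually supplied; all symbols and numerical values are illustrative assumptions, not the authors' model.

```python
# Hedged semi-batch oxygen balance sketch; every value below is illustrative.
V = 2.0e-3       # gas volume in the reactor [m^3]
Q = 5.0e-5       # volumetric gas flow rate [m^3/s]
C_in = 8.7       # inlet O2 concentration [mol/m^3] (roughly 21% O2 at ambient conditions)
r_max = 4.0e-4   # maximum kinetically limited O2 consumption rate [mol/s]
dt, t_end = 1.0, 3600.0

C = C_in
for _ in range(int(t_end / dt)):
    supply = Q * C_in + V * C / dt                     # O2 available per unit time this step [mol/s]
    r = min(r_max, supply)                             # the reaction cannot consume more than is supplied
    C = max(C + dt / V * (Q * (C_in - C) - r), 0.0)    # explicit Euler update of the outlet concentration

print("outlet O2 concentration after 1 h [mol/m^3]:", round(C, 3))
```

    With these placeholder numbers the inflow barely exceeds the maximum kinetic demand, so the outlet concentration settles near C_in - r_max/Q, the regime in which the flow limitation described in the abstract becomes visible.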

  12. Synthetic biology and molecular genetics in non-conventional yeasts: Current tools and future advances.

    PubMed

    Wagner, James M; Alper, Hal S

    2016-04-01

    Coupling the tools of synthetic biology with traditional molecular genetic techniques can enable the rapid prototyping and optimization of yeast strains. While the era of yeast synthetic biology began in the well-characterized model organism Saccharomyces cerevisiae, it is swiftly expanding to include non-conventional yeast production systems such as Hansenula polymorpha, Kluyveromyces lactis, Pichia pastoris, and Yarrowia lipolytica. These yeasts already have roles in the manufacture of vaccines, therapeutic proteins, food additives, and biorenewable chemicals, but recent synthetic biology advances have the potential to greatly expand and diversify their impact on biotechnology. In this review, we summarize the development of synthetic biological tools (including promoters and terminators) and enabling molecular genetics approaches that have been applied in these four promising alternative biomanufacturing platforms. An emphasis is placed on synthetic parts and genome editing tools. Finally, we discuss examples of synthetic tools developed in other organisms that can be adapted or optimized for these hosts in the near future.

  13. Measuring political commitment and opportunities to advance food and nutrition security: piloting a rapid assessment tool.

    PubMed

    Fox, Ashley M; Balarajan, Yarlini; Cheng, Chloe; Reich, Michael R

    2015-06-01

    Lack of political commitment has been identified as a primary reason for the low priority that food and nutrition interventions receive from national governments relative to the high disease burden caused by malnutrition. Researchers have identified a number of factors that contribute to food and nutrition's 'low-priority cycle' on national policy agendas, but few tools exist to rapidly measure political commitment and identify opportunities to advance food and nutrition on the policy agenda. This article presents a theory-based rapid assessment approach to gauging countries' level of political commitment to food and nutrition security and identifying opportunities to advance food and nutrition on the policy agenda. The rapid assessment tool was piloted among food and nutrition policymakers and planners in 10 low- and middle-income countries in April to June 2013. Food and nutrition commitment and policy opportunity scores were calculated for each country and strategies to advance food and nutrition on policy agendas were designed for each country. The article finds that, in a majority of countries, political leaders had verbally and symbolically committed to addressing food and nutrition, but adequate financial resources were not allocated to implement specific programmes. In addition, whereas the low cohesion of the policy community has been viewed a major underlying cause of the low-priority status of food and nutrition, the analysis finds that policy community cohesion and having a well thought-out policy alternative were present in most countries. This tool may be useful to policymakers and planners providing information that can be used to benchmark and/or evaluate advocacy efforts to advance reforms in the food and nutrition sector; furthermore, the results can help identify specific strategies that can be employed to move the food and nutrition agenda forward. This tool complements others that have been recently developed to measure national commitment to

  14. Measuring political commitment and opportunities to advance food and nutrition security: piloting a rapid assessment tool.

    PubMed

    Fox, Ashley M; Balarajan, Yarlini; Cheng, Chloe; Reich, Michael R

    2015-06-01

    Lack of political commitment has been identified as a primary reason for the low priority that food and nutrition interventions receive from national governments relative to the high disease burden caused by malnutrition. Researchers have identified a number of factors that contribute to food and nutrition's 'low-priority cycle' on national policy agendas, but few tools exist to rapidly measure political commitment and identify opportunities to advance food and nutrition on the policy agenda. This article presents a theory-based rapid assessment approach to gauging countries' level of political commitment to food and nutrition security and identifying opportunities to advance food and nutrition on the policy agenda. The rapid assessment tool was piloted among food and nutrition policymakers and planners in 10 low- and middle-income countries in April to June 2013. Food and nutrition commitment and policy opportunity scores were calculated for each country and strategies to advance food and nutrition on policy agendas were designed for each country. The article finds that, in a majority of countries, political leaders had verbally and symbolically committed to addressing food and nutrition, but adequate financial resources were not allocated to implement specific programmes. In addition, whereas the low cohesion of the policy community has been viewed a major underlying cause of the low-priority status of food and nutrition, the analysis finds that policy community cohesion and having a well thought-out policy alternative were present in most countries. This tool may be useful to policymakers and planners providing information that can be used to benchmark and/or evaluate advocacy efforts to advance reforms in the food and nutrition sector; furthermore, the results can help identify specific strategies that can be employed to move the food and nutrition agenda forward. This tool complements others that have been recently developed to measure national commitment to

  15. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    NASA Astrophysics Data System (ADS)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  16. The role of numerical simulation for the development of an advanced HIFU system

    NASA Astrophysics Data System (ADS)

    Okita, Kohei; Narumi, Ryuta; Azuma, Takashi; Takagi, Shu; Matumoto, Yoichiro

    2014-10-01

    High-intensity focused ultrasound (HIFU) has been used clinically and is in clinical trials to treat various diseases. An advanced HIFU system employs ultrasound techniques for guidance during HIFU treatment instead of the magnetic resonance imaging used in current HIFU systems. HIFU beam imaging for monitoring the HIFU beam and localized motion imaging for treatment validation of tissue are introduced briefly as real-time ultrasound monitoring techniques. Numerical simulations have a great impact on the development of real-time ultrasound monitoring as well as on the improvement of the safety and efficacy of treatment in advanced HIFU systems. A HIFU simulator was developed to reproduce ultrasound propagation through the body in consideration of the elasticity of tissue, and was validated by comparison with in vitro experiments in which the ultrasound emitted from the phased-array transducer propagates through an acrylic plate acting as a bone phantom. As a result, the defocus and distortion of the ultrasound propagating through the acrylic plate in the simulation agree quantitatively with those in the experimental results. Therefore, the HIFU simulator accurately reproduces the ultrasound propagation through a medium whose shape and physical properties are well known. In addition, it is experimentally confirmed that simulation-assisted focus control of the phased-array transducer enables efficient assignment of the focus to the target. Simulation-assisted focus control can contribute to the design of transducers and to treatment planning.

  17. Advanced Engineering Tools for Structural Analysis of Advanced Power Plants Application to the GE ESBWR Design

    SciTech Connect

    Gamble, R.E.; Fanning, A.; Diaz Llanos, M.; Moreno, A.; Carrasco, A.

    2002-07-01

    Experience in the design of nuclear reactors for power generation shows that the plant structures and buildings involved are one of the major contributors to plant capital investment. Consequently, the design of these elements must be optimised if cost reductions in future reactors are to be achieved. The benefits of using the 'Best Estimate Approach' are well known in the area of core and systems design. This consists of developing accurate models of a plant's phenomenology and behaviour, minimising the margins. Different safety margins have been applied in the past when performing structural analyses. Three of these margins can be identified: - increasing the value of the load by a factor that depends on the load frequency; - decreasing the structure's resistance, and - safety margins introduced through two-step analysis. The first two types of margins are established in the applicable codes in order to provide design safety margins. The third one derives from limitations in tools which, in the past, did not allow obtaining an accurate model in which both the dynamic and static loads could be evaluated simultaneously. Nowadays, improvements in hardware and software have eliminated the need for two-step calculations in structural analysis (dynamic plus static), allowing the creation of one-through finite element models in which all loads, both dynamic and static, are combined without the determination of the equivalent static loads from the dynamic loads. This paper summarizes how these models and methods have been applied to optimize the Reactor Building structural design of the General Electric (GE) ESBWR Passive Plant. The work has focused on three areas: - the design of the Gravity Driven Cooling System (GDCS) Pools as pressure boundary between the Drywell and the Wet-well; - the evaluation of the thickness of the Reactor Building foundation slab, and - the global structural evaluation of the Reactor Building.

  18. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings for exploring the correlation between the underlying models of Advanced Risk Reduction Tool (ARRT) relative to how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  19. An Introduction to the Advanced Tracking and Resource Tool for Archive Collections (ATRAC)

    NASA Astrophysics Data System (ADS)

    Roberts, K.; Ritchey, N. A.; Jones, P.; Brown, H.

    2011-12-01

    The National Climatic Data Center (NCDC) has stepped up to meet the demand of today's exponential growth of archive projects and datasets by creating a web-based tool for managing and tracking data archiving, the Advanced Tracking and Resource tool for Archive Collections (ATRAC). ATRAC allows users to enter, share and display information for an archive project. User-friendly forms collect new input or use existing components of information in the system. The tool generates archive documents in various formats from the input and can automatically notify stakeholders of important project milestones. Current information on projects, tasks and events is displayed in configurable timeframes with viewing rights set by the project stakeholders. This presentation will demonstrate ATRAC's latest features and how the capabilities of ATRAC can improve project communication and work flow.

  20. BOOK REVIEW: Advanced Topics in Computational Partial Differential Equations: Numerical Methods and Diffpack Programming

    NASA Astrophysics Data System (ADS)

    Katsaounis, T. D.

    2005-02-01

    The scope of this book is to present well known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required by the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, a prior knowledge or usage of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models and the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well known numerical methods for solving the basic types of PDEs. Further, programming techniques on the serial as well as on the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. The first chapter is an introduction to parallel processing. It covers fundamentals of parallel processing in a simple and concrete way and no prior knowledge of the subject is required. Examples of parallel implementation of basic linear algebra operations are presented using the Message Passing Interface (MPI) programming environment. Here, some knowledge of MPI routines is required by the reader. Examples solving in parallel simple PDEs using

  1. Simulation tools for computer-aided design and numerical investigations of high-power gyrotrons

    NASA Astrophysics Data System (ADS)

    Damyanova, M.; Balabanova, E.; Kern, S.; Illy, S.; Sabchevski, S.; Thumm, M.; Vasileva, E.; Zhelyazkov, I.

    2012-03-01

    Modelling and simulation are essential tools for computer-aided design (CAD), analysis and optimization of high-power gyrotrons used as radiation sources for electron cyclotron resonance heating (ECRH) and current drive (ECCD) of magnetically confined plasmas in the thermonuclear reactor ITER. In this communication, we present the current status of our simulation tools and discuss their further development.

  2. Numerical Evaluation of Fluid Mixing Phenomena in Boiling Water Reactor Using Advanced Interface Tracking Method

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Takase, Kazuyuki

    Thermal-hydraulic design of the current boiling water reactor (BWR) is performed with subchannel analysis codes that incorporate correlations based on empirical results, including actual-size tests. Likewise, for the Innovative Water Reactor for Flexible Fuel Cycle (FLWR) core, an actual-size test of an embodiment of its design is required to confirm or modify such correlations. In this situation, a method that enables the thermal-hydraulic design of nuclear reactors without these actual-size tests is desired, because such tests take a long time and entail great cost. For this reason, we developed an advanced thermal-hydraulic design method for FLWRs using innovative two-phase flow simulation technology. In this study, a detailed two-phase flow simulation code using an advanced interface tracking method, TPFIT, was developed to calculate the detailed information of the two-phase flow. In this paper, we first verify the TPFIT code by comparing it with existing two-channel air-water mixing experimental results. Secondly, the TPFIT code is applied to the simulation of steam-water two-phase flow in a model of two subchannels of current BWR and FLWR rod bundles. Fluid mixing was observed at the gap between the subchannels. The existing two-phase flow correlation for fluid mixing is evaluated using the detailed numerical simulation data. These data indicate that the pressure difference between the fluid channels is responsible for the fluid mixing, and thus the effects of the time-averaged pressure difference and its fluctuations must be incorporated in the two-phase flow correlation for fluid mixing. When the inlet quality ratio of the subchannels is relatively large, the evaluation precision of the existing two-phase flow correlations for fluid mixing is relatively low.

  3. Advanced gradient-index lens design tools to maximize system performance and reduce SWaP

    NASA Astrophysics Data System (ADS)

    Campbell, Sawyer D.; Nagar, Jogender; Brocker, Donovan E.; Easum, John A.; Turpin, Jeremiah P.; Werner, Douglas H.

    2016-05-01

    GRadient-INdex (GRIN) lenses have long been of interest due to their potential for providing levels of performance unachievable with traditional homogeneous lenses. While historically limited by a lack of suitable materials, rapid advancements in manufacturing techniques, including 3D printing, have recently kindled a renewed interest in GRIN optics. Further increasing the desire for GRIN devices has been the advent of Transformation Optics (TO), which provides the mathematical framework for representing the behavior of electromagnetic radiation in a given geometry by "transforming" it to an alternative, usually more desirable, geometry through an appropriate mapping of the constituent material parameters. Using TO, aspherical lenses can be transformed to simpler spherical and flat geometries or even rotationally-asymmetric shapes which result in true 3D GRIN profiles. Meanwhile, there is a critical lack of suitable design tools which can effectively evaluate the optical wave propagation through 3D GRIN profiles produced by TO. Current modeling software packages for optical lens systems also lack advanced multi-objective global optimization capability which allows the user to explicitly view the trade-offs between all design objectives such as focus quality, FOV, Δn, and focal drift due to chromatic aberrations. When coupled with advanced design methodologies such as TO, wavefront matching (WFM), and analytical achromatic GRIN theory, these tools provide a powerful framework for maximizing SWaP (Size, Weight and Power) reduction in GRIN-enabled optical systems. We provide an overview of our advanced GRIN design tools and examples which minimize the presence of mono- and polychromatic aberrations in the context of reducing SWaP.

  4. SUSPNDRS: a numerical simulation tool for the nonlinear transient analysis of cable support bridge structures, part 1: theoretical development

    SciTech Connect

    McCallen, D.; Astaneh-Asl, A.

    1997-06-01

    The work reported on herein was aimed at developing methodologies and tools for efficient and accurate numerical simulation of the seismic response of suspension and cable-stayed structures. A special purpose finite element program has been constructed, and the underlying theory and demonstration example problems are presented. A companion report [Ref 1] discusses the application of this technology to a major suspension bridge structure.

  5. Basic and Advanced Numerical Performances Relate to Mathematical Expertise but Are Fully Mediated by Visuospatial Skills

    ERIC Educational Resources Information Center

    Sella, Francesco; Sader, Elie; Lolliot, Simon; Cohen Kadosh, Roi

    2016-01-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study mathematicians and nonmathematicians performed a basic…

  6. A review on recent advances in numerical modelling of bone cutting.

    PubMed

    Marco, Miguel; Rodríguez-Millán, Marcos; Santiuste, Carlos; Giner, Eugenio; Henar Miguélez, María

    2015-04-01

    Common surgical treatments in orthopaedics and traumatology involve cutting processes of bone. These operations introduce a risk of thermo-mechanical damage, since the threshold of critical temperature producing thermal osteonecrosis is very low. Therefore, it is important to develop predictive tools capable of accurately simulating the increase of temperature during bone cutting, and the modelling of these processes remains a challenge. In addition, the prediction of cutting forces and mechanical damage is also important during machining operations. As the accuracy of simulations depends greatly on the proper choice of the thermo-mechanical properties, an essential part of the numerical model is the constitutive behaviour of the bone tissue, which is considered in different ways in the literature. This paper focuses on the review of the main contributions in modelling of bone cutting with special attention to the bone mechanical behaviour. The aim is to give the reader a complete vision of the approaches commonly presented in the literature in order to help in the development of accurate models for bone cutting. PMID:25676359

  7. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS, Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.

  8. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  9. Constrained numerical gradients and composite gradients: Practical tools for geometry optimization and potential energy surface navigation.

    PubMed

    Stenrup, Michael; Lindh, Roland; Fdez Galván, Ignacio

    2015-08-15

    A method is proposed to easily reduce the number of energy evaluations required to compute numerical gradients when constraints are imposed on the system, especially in connection with rigid fragment optimization. The method is based on the separation of the coordinate space into a constrained and an unconstrained space, and the numerical differentiation is done exclusively in the unconstrained space. The decrease in the number of energy calculations can be very important if the system is significantly constrained. The performance of the method is tested on systems that can be considered as composed of several rigid groups or molecules, and the results show that the error with respect to conventional optimizations is of the order of the convergence criteria. Comparison with another method designed for rigid fragment optimization proves the present method to be competitive. The proposed method can also be applied to combine numerical and analytical gradients computed at different theory levels, allowing an unconstrained optimization with numerical differentiation restricted to the most significant degrees of freedom. This approach can be a practical alternative when analytical gradients are not available at the desired computational level and full numerical differentiation is not affordable.
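    A minimal sketch of the subspace idea described above follows: numerical differentiation is carried out only along an orthonormal basis of the unconstrained space, and the result is lifted back to full coordinates, so a system with many frozen degrees of freedom needs far fewer energy evaluations. The energy function and basis below are toy assumptions, not the authors' implementation.

```python
import numpy as np

def constrained_numerical_gradient(energy, x0, U, h=1.0e-4):
    """Central-difference gradient restricted to span(U), returned in full coordinates."""
    g_u = np.zeros(U.shape[1])
    for k in range(U.shape[1]):           # two energy evaluations per unconstrained direction
        d = U[:, k]
        g_u[k] = (energy(x0 + h * d) - energy(x0 - h * d)) / (2.0 * h)
    return U @ g_u                        # components along constrained directions stay zero

# Toy system: six coordinates, only the first two are free to move (assumption).
def energy(x):
    return float(np.sum(x**2) + 0.5 * x[0] * x[1])

U = np.eye(6)[:, :2]                      # orthonormal basis of the unconstrained subspace
x0 = np.ones(6)
print(constrained_numerical_gradient(energy, x0, U))  # approximately [2.5, 2.5, 0, 0, 0, 0]
```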

  10. Folder: A numerical tool to simulate the development of structures in layered media

    NASA Astrophysics Data System (ADS)

    Adamuszek, Marta; Dabrowski, Marcin; Schmid, Daniel W.

    2016-03-01

    We present Folder, a numerical toolbox for modelling deformation in layered media subject to layer parallel shortening or extension in two dimensions. The toolbox includes a range of features that ensure maximum flexibility to configure model geometry, define material parameters, specify numerical parameters, and choose the plotting options. Folder builds on an efficient finite element method model and implements state of the art iterative and time integration schemes. We describe the basic Folder features and present several case studies of single and multilayer stacks subject to layer parallel shortening and extension. Folder additionally comprises an application that illustrates various analytical solutions of growth rates calculated for the cases of layer parallel shortening and extension of a single layer with interfaces perturbed with a single sinusoidal waveform. We further derive two novel analytical expressions for the growth rate in the cases of layer parallel shortening and extension of a linear viscous layer embedded in a linear viscous medium of a finite thickness. These solutions help understand mechanical instabilities in layered rocks and provide a unique opportunity for benchmarking of numerical codes. We demonstrate how Folder can be used for benchmarking of numerical codes. We test the accuracy of single-layer folding simulations using various 1) spatial and temporal resolutions, 2) iterative algorithms for non-linear materials, and 3) time integration schemes. The accuracy of the numerical results is quantified by: 1) comparing them to analytical solutions, if available, or 2) running convergence tests. As a result, we provide a map of the most optimal choice of grid size, time step, and number of iterations to keep the results of the numerical simulations below a given error for a given time integration scheme. Folder is an open source MATLAB application and comes with a user-friendly graphical interface. Folder is suitable for both educational

  11. Recent advances in theoretical and numerical studies of wire array Z-pinch in the IAPCM

    SciTech Connect

    Ding, Ning; Zhang, Yang; Xiao, Delong; Wu, Jiming; Huang, Jun; Yin, Li; Sun, Shunkai; Xue, Chuang; Dai, Zihuan; Ning, Cheng; Shu, Xiaojian; Wang, Jianguo; Li, Hua

    2014-12-15

    Fast Z-pinch has produced the most powerful X-ray radiation source in the laboratory and also shows the possibility to drive inertial confinement fusion (ICF). Recent advances in wire-array Z-pinch research at the Institute of Applied Physics and Computational Mathematics are presented in this paper. A typical wire array Z-pinch process has three phases: wire plasma formation and ablation, implosion and MRT instability development, and stagnation and radiation. A mass injection model with an azimuthal modulation coefficient is used to describe the wire initiation, and the dynamics of ablated plasmas of wire-array Z-pinches in (r, θ) geometry is numerically studied. In the implosion phase, a two-dimensional (r, z) three-temperature radiation MHD code, MARED, has been developed to investigate the development of the Magneto-Rayleigh-Taylor (MRT) instability. We also analyze the implosion modes of nested wire-arrays and find that the inner wire-array is hardly affected before the impact of the outer wire-array. While the plasma accelerated to high speed in the implosion stage stagnates on the axis, abundant x-ray radiation is produced. The energy spectrum of the radiation and the production mechanism are investigated. The computational x-ray pulse shows a reasonable agreement with the experimental result. We also suggest that using alloyed wire-arrays can increase the multi-keV K-shell yield by decreasing the opacity of K-shell lines. In addition, we use a detailed circuit model to study the energy coupling between the generator and the Z-pinch implosion. Recently, we have been concentrating on the problems of Z-pinch driven ICF, such as dynamic hohlraum and capsule implosions. Our numerical investigations on the interaction of wire-array Z-pinches with foam convertors show qualitative agreement with experimental results on the “Qiangguang I” facility. An integrated two-dimensional simulation of dynamic hohlraum driven capsule implosion provides us the physical insights of wire

  12. Recent advances in theoretical and numerical studies of wire array Z-pinch in the IAPCM

    NASA Astrophysics Data System (ADS)

    Ding, Ning; Zhang, Yang; Xiao, Delong; Wu, Jiming; Huang, Jun; Yin, Li; Sun, Shunkai; Xue, Chuang; Dai, Zihuan; Ning, Cheng; Shu, Xiaojian; Wang, Jianguo; Li, Hua

    2014-12-01

    Fast Z-pinch has produced the most powerful X-ray radiation source in the laboratory and also shows the possibility to drive inertial confinement fusion (ICF). Recent advances in wire-array Z-pinch research at the Institute of Applied Physics and Computational Mathematics are presented in this paper. A typical wire array Z-pinch process has three phases: wire plasma formation and ablation, implosion and MRT instability development, and stagnation and radiation. A mass injection model with an azimuthal modulation coefficient is used to describe the wire initiation, and the dynamics of ablated plasmas of wire-array Z-pinches in (r, θ) geometry is numerically studied. In the implosion phase, a two-dimensional (r, z) three-temperature radiation MHD code, MARED, has been developed to investigate the development of the Magneto-Rayleigh-Taylor (MRT) instability. We also analyze the implosion modes of nested wire-arrays and find that the inner wire-array is hardly affected before the impact of the outer wire-array. While the plasma accelerated to high speed in the implosion stage stagnates on the axis, abundant x-ray radiation is produced. The energy spectrum of the radiation and the production mechanism are investigated. The computational x-ray pulse shows a reasonable agreement with the experimental result. We also suggest that using alloyed wire-arrays can increase the multi-keV K-shell yield by decreasing the opacity of K-shell lines. In addition, we use a detailed circuit model to study the energy coupling between the generator and the Z-pinch implosion. Recently, we have been concentrating on the problems of Z-pinch driven ICF, such as dynamic hohlraum and capsule implosions. Our numerical investigations on the interaction of wire-array Z-pinches with foam convertors show qualitative agreement with experimental results on the "Qiangguang I" facility. An integrated two-dimensional simulation of dynamic hohlraum driven capsule implosion provides us the physical insights of wire

  13. Development of Numerical Tools for the Investigation of Plasma Detachment from Magnetic Nozzles

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2007-01-01

    A multidimensional numerical simulation framework aimed at investigating the process of plasma detachment from a magnetic nozzle is introduced. An existing numerical code based on a magnetohydrodynamic formulation of the plasma flow equations that accounts for various dispersive and dissipative processes in plasmas was significantly enhanced to allow for the modeling of axisymmetric domains containing three-dimensional momentum and magnetic flux vectors. A separate magnetostatic solver was used to simulate the applied magnetic field topologies found in various nozzle experiments. Numerical results from a magnetic diffusion test problem in which all three components of the magnetic field were present exhibit excellent quantitative agreement with the analytical solution, and the lack of numerical instabilities due to fluctuations in the value of ∇·B indicate that the conservative MHD framework with dissipative effects is well-suited for multi-dimensional analysis of magnetic nozzles. Further studies will focus on modeling literature experiments both for the purpose of code validation and to extract physical insight regarding the mechanisms driving detachment.
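    As a small illustration of the kind of ∇·B diagnostic mentioned above, the sketch below evaluates a central-difference divergence of a divergence-free test field on a uniform grid; the test field and grid are assumptions of this sketch, not output of the simulation framework.

```python
import numpy as np

# Discrete del·B check on a 2D uniform grid (illustrative field and resolution).
n = 64
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")

# Divergence-free field from a vector potential A_z = sin(pi x) sin(2 pi y):
# B = (dA_z/dy, -dA_z/dx).
Bx = 2.0 * np.pi * np.sin(np.pi * X) * np.cos(2.0 * np.pi * Y)
By = -np.pi * np.cos(np.pi * X) * np.sin(2.0 * np.pi * Y)

divB = ((Bx[2:, 1:-1] - Bx[:-2, 1:-1]) +
        (By[1:-1, 2:] - By[1:-1, :-2])) / (2.0 * h)

# Should remain small (second-order truncation error), i.e. no spurious monopoles.
print("max |div B| =", float(np.abs(divB).max()))
```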

  14. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  15. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  16. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  17. Science-Based Approach for Advancing Marine and Hydrokinetic Energy: Integrating Numerical Simulations with Experiments

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, F.; Kang, S.; Chamorro, L. P.; Hill, C.

    2011-12-01

    experimentally in the St. Anthony Falls Laboratory Main Channel. The experiments and simulations are compared with each other and shown to be in very good agreement both in terms of the mean flow and the turbulence statistics. The results are analyzed to study the structure of turbulence in the wake of the turbine and also identify the effects of turbulent fluctuations in the approach flow on the power produced by the turbine. Overall our results make a strong case that high-resolution numerical modeling, validated with detailed laboratory measurements, is a viable tool for assessing and optimizing the performance of MHK devices.

  18. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    SciTech Connect

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division; Purdue Univ.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  19. An Analysis of Energy Savings Possible Through Advances in Automotive Tooling Technology

    SciTech Connect

    Rick Schmoyer, RLS

    2004-12-03

    The use of lightweight and highly formable advanced materials in automobile and truck manufacturing has the potential to save fuel. Advances in tooling technology would promote the use of these materials. This report describes an energy savings analysis performed to approximate the potential fuel savings and consequential carbon-emission reductions that would be possible because of advances in tooling in the manufacturing of, in particular, non-powertrain components of passenger cars and heavy trucks. Separate energy analyses are performed for cars and heavy trucks. Heavy trucks are considered to be Class 7 and 8 trucks (trucks rated over 26,000 lbs gross vehicle weight). A critical input to the analysis is a set of estimates of the percentage reductions in weight and drag that could be achieved by the implementation of advanced materials, as a consequence of improved tooling technology, which were obtained by surveying tooling industry experts who attended a DOE Workshop, Tooling Technology for Low-Volume Vehicle Production, held in Seattle and Detroit in October and November 2003. The analysis is also based on 2001 fuel consumption totals and on energy-audit component proportions of fuel use due to drag, rolling resistance, and braking. The consumption proportions are assumed constant over time, but an allowance is made for fleet growth. The savings for a particular component is then the product of total fuel consumption, the percentage reduction of the component, and the energy-audit component proportion. Fuel savings estimates for trucks also account for weight-limited versus volume-limited operations. Energy savings are assumed to be of two types: (1) direct energy savings incurred through reduced forces that must be overcome to move the vehicle or to slow it down in braking, and (2) indirect energy savings through reductions in the required engine power, the production and transmission of which incur thermodynamic losses, internal friction, and other
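
    The savings arithmetic described above (savings = total fuel consumption x fractional component reduction x energy-audit proportion) can be sketched in a few lines of Python; the numbers in the example are hypothetical and are not taken from the report.

      def component_fuel_savings(total_fuel, pct_reduction, audit_proportion):
          """Savings = total fuel consumption x fractional reduction in the component
          (e.g., weight or drag) x energy-audit proportion of fuel attributed to it."""
          return total_fuel * pct_reduction * audit_proportion

      # Hypothetical numbers, for illustration only (not from the report):
      # 120 billion gallons/yr total, 5% weight reduction, 20% of fuel use due to rolling resistance
      print(component_fuel_savings(120e9, 0.05, 0.20))  # -> 1.2e9 gallons/yr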

  20. Numerical Simulations and Tracer Studies as a Tool to Support Water Circulation Modeling in Breeding Reservoirs

    NASA Astrophysics Data System (ADS)

    Zima, Piotr

    2014-12-01

    The article presents a proposal of a method for computer-aided design and analysis of breeding reservoirs in zoos and aquariums. The method applied involves the use of computer simulations of water circulation in breeding pools. A mathematical model of a pool was developed, and a tracer study was carried out. A simplified model of two-dimensional flow in the form of a biharmonic equation for the stream function (converted into components of the velocity vector) was adopted to describe the flow field. This equation, supplemented by appropriate boundary conditions, was solved numerically by the finite difference method. Next, a tracer migration equation was solved, which was a two-dimensional advection-dispersion equation describing the unsteady transport of a non-active, permanent solute. In order to obtain a proper solution, a tracer study (with rhodamine WT as a tracer) was conducted in situ. The results of these measurements were compared with the numerical solutions obtained. The results of numerical simulations made it possible to reconstruct water circulation in the breeding pool and to identify still-water zones, where water circulation was impeded.
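
    As an illustration of the transport step described above, the following Python sketch advances a 2D advection-dispersion equation by one explicit finite-difference step on a uniform grid with constant coefficients. It is a simplified stand-in, not the study's code: the study first solved a biharmonic equation for the stream function and used the resulting velocity field, whereas here the velocity components u and v are assumed constant and non-negative.

      import numpy as np

      def advect_disperse_step(c, u, v, D, dx, dy, dt):
          """One explicit step (upwind advection + central dispersion) of
          dc/dt + u dc/dx + v dc/dy = D (d2c/dx2 + d2c/dy2)
          on a uniform grid; constant u, v, D assumed for simplicity."""
          cn = c.copy()
          # backward (upwind) differences for advection, valid for u, v >= 0
          dcdx = (c[1:-1, 1:-1] - c[1:-1, :-2]) / dx
          dcdy = (c[1:-1, 1:-1] - c[:-2, 1:-1]) / dy
          # central second differences for dispersion
          d2cdx2 = (c[1:-1, 2:] - 2 * c[1:-1, 1:-1] + c[1:-1, :-2]) / dx**2
          d2cdy2 = (c[2:, 1:-1] - 2 * c[1:-1, 1:-1] + c[:-2, 1:-1]) / dy**2
          cn[1:-1, 1:-1] = c[1:-1, 1:-1] + dt * (-u * dcdx - v * dcdy
                                                 + D * (d2cdx2 + d2cdy2))
          return cn  # boundary cells left unchanged in this minimal sketch

      c0 = np.zeros((50, 50)); c0[25, 10] = 1.0   # hypothetical point release
      c1 = advect_disperse_step(c0, u=0.5, v=0.2, D=0.01, dx=1.0, dy=1.0, dt=0.5)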

  1. Code package MAG: a user tool for numerical modeling of 1D shock wave and dynamic processes in solids

    NASA Astrophysics Data System (ADS)

    Rudenko, Vladimir; Shaburov, Michail

    1999-06-01

    Design and theoretical and numerical preparation of shock wave experiments require, as a rule, a large amount of calculations. Usually the preparation of a problem for numerical solution, the calculation, and the processing of the results are done by programmer-mathematicians. The appearance of powerful personal computers and interface tools makes it possible to develop user-oriented programs that a researcher can handle without the help of a mathematician, even without a special programming background. The code package MAG is intended for the numerical solution of the 1D system of equations of hydrodynamics, elastoplasticity, heat conduction and magnetohydrodynamics. A number of modern models of elastoplasticity and kinetics of power materials are implemented in it. The package includes libraries of equations of state and of thermal-physical and electromagnetic properties of substances. The code package is an interactive visual medium providing the user with the following capabilities: input and edit initial data for a problem; calculate separate problems as well as series of problems with automatic variation of parameters; view the modeled phenomena dynamically using visualization tools; control the calculation process (terminate the calculation, change parameters, perform express processing of the results, continue the calculation, etc.); process the numerical results into final plots and tables; record and store numerical results in databases, including formats supported by Microsoft Word, Access, and Excel; make dynamic visual comparisons of the results of several simultaneous calculations; and carry out automatic numerical optimization of a selected experimental scheme. The package is easy to use and allows prompt input and convenient information processing. The validity of numerical results obtained with the MAG package has been confirmed by numerous hydrodynamic experiments and by comparisons with numerical results from similar programs. The package was

  2. Development of a carburizing and quenching simulation tool: Numerical simulations of rings and gears

    SciTech Connect

    Anderson, C.; Goldman, P.; Rangaswamy, P.

    1996-06-24

    This paper describes a calculational procedure using the ABAQUS finite element code that simulates a carburizing and quench heat treat cycle for automotive gears. The procedure features a numerically efficient 2-phase constitutive model to represent transformational plasticity effects for the austenite/martensite transformation together with refined finite element meshes to capture the steep gradients in stress and composition near the gear surfaces. The procedure is illustrated on carburizing and quenching of a thick ring, and comparison of model predictions for distortion, phase distribution, and residual stress with experiment is discussed. Sensitivity of predictions to mesh refinement is studied.

  3. Contemporary molecular tools in microbial ecology and their application to advancing biotechnology.

    PubMed

    Rashid, Mamoon; Stingl, Ulrich

    2015-12-01

    Novel methods in microbial ecology are revolutionizing our understanding of the structure and function of microbes in the environment, but concomitant advances in applications of these tools to biotechnology are mostly lagging behind. After more than a century of efforts to improve microbial culturing techniques, about 70-80% of microbial diversity - recently called the "microbial dark matter" - remains uncultured. In early attempts to identify and sample these so far uncultured taxonomic lineages, methods that amplify and sequence ribosomal RNA genes were extensively used. Recent developments in cell separation techniques, DNA amplification, and high-throughput DNA sequencing platforms have now made the discovery of genes/genomes of uncultured microorganisms from different environments possible through the use of metagenomic techniques and single-cell genomics. When used synergistically, these metagenomic and single-cell techniques create a powerful tool to study microbial diversity. These genomics techniques have already been successfully exploited to identify sources for i) novel enzymes or natural products for biotechnology applications, ii) novel genes from extremophiles, and iii) whole genomes or operons from uncultured microbes. More can be done to utilize these tools more efficiently in biotechnology.

  4. Advances in Analytical and Numerical Dispersion Modeling of Pollutants Releasing from an Area-source

    NASA Astrophysics Data System (ADS)

    Nimmatoori, Praneeth

    The air quality near agricultural activities such as tilling, plowing, harvesting, and manure application is of particular concern because these activities release fine particulate matter into the atmosphere. These releases are modeled as area-sources in the air quality modeling research. None of the currently available dispersion models relate and incorporate physical characteristics and meteorological conditions for modeling the dispersion and deposition of particulates emitted from such area-sources. This knowledge gap was addressed by developing the advanced analytical and numerical methods for modeling the dispersion of particulate matter. The development, application, and evaluation of new dispersion modeling methods are discussed in detail in this dissertation. In the analytical modeling, a ground-level area-source analytical dispersion model known as particulate matter deposition -- PMD was developed for predicting the concentrations of different particle sizes. Both the particle dynamics (particle physical characteristics) and meteorological conditions, which have a significant effect on the dispersion of particulates, were related and incorporated in the PMD model using the formulations of particle gravitational settling and dry deposition velocities. The modeled particle size concentrations of the PMD model were evaluated statistically after applying it to particulates released from a biosolid-applied agricultural field. The evaluation of the PMD model using the statistical criteria concluded effective and successful inclusion of dry deposition theory for modeling particulate matter concentrations. A comprehensive review of analytical area-source dispersion models, which do not account for dry deposition and treat pollutants as gases, was conducted and determined three models -- the Shear, the Parker, and the Smith. A statistical evaluation of these dispersion models was conducted after applying them to two different field data sets and the statistical results concluded that
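
    One ingredient mentioned above, the gravitational settling velocity, is commonly taken from the Stokes-regime formula for small spheres. The Python sketch below shows only that standard formula and is not the PMD model's formulation; the particle diameter and density values in the example are illustrative.

      def stokes_settling_velocity(d_p, rho_p, rho_air=1.2, mu_air=1.8e-5, g=9.81):
          """Terminal settling velocity (m/s) of a small sphere in the Stokes regime:
          v_s = (rho_p - rho_air) * g * d_p**2 / (18 * mu_air), with d_p in metres."""
          return (rho_p - rho_air) * g * d_p**2 / (18.0 * mu_air)

      # 10-micron particle of density 2650 kg/m^3 (illustrative values)
      print(stokes_settling_velocity(10e-6, 2650.0))  # roughly 8e-3 m/s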

  5. Numerical simulation of extrusion: A good tool for troubleshooting extrusion problems

    NASA Astrophysics Data System (ADS)

    Touré, Birane; Svabik, Jiri; Veaux, Michael; Bahloul, Walid; Mascia, Jean-Pierre; Abéguilé, Mikael; Seux, Thierry; Hauko, Stig-Jrale

    2013-04-01

    The paper describes different studies linked to extrusion problems encountered during production in the cable industry. Extrusion simulation or rheology analysis can be used to understand the origin of the problems in cable manufacturing and to set up durable industrial solutions by selecting the appropriate compounds, optimizing screw profiles, modifying the geometries of the tooling or adapting the processing conditions. Our investigations, based on the COMPUPLAST® Virtual Extrusion Laboratory™ software, have brought considerable understanding, and they are cheaper and faster alternatives to real experiments (trial and error) when analyzing and optimizing extrusion processes. Three examples are presented in the paper.

  6. FOLDER: A numerical tool to simulate the development of structures in layered media

    NASA Astrophysics Data System (ADS)

    Adamuszek, Marta; Dabrowski, Marcin; Schmid, Daniel W.

    2015-04-01

    FOLDER is a numerical toolbox for modelling deformation in layered media during layer parallel shortening or extension in two dimensions. FOLDER builds on MILAMIN [1], a finite element method based mechanical solver, with a range of utilities included from the MUTILS package [2]. The numerical mesh is generated using the Triangle software [3]. The toolbox includes features that allow for: 1) designing complex structures such as multi-layer stacks, 2) accurately simulating large-strain deformation of linear and non-linear viscous materials, 3) post-processing of various physical fields such as velocity (total and perturbing), rate of deformation, finite strain, stress, deviatoric stress, pressure, apparent viscosity. FOLDER is designed to ensure maximum flexibility to configure model geometry, define material parameters, specify range of numerical parameters in simulations and choose the plotting options. FOLDER is an open source MATLAB application and comes with a user-friendly graphical interface. The toolbox additionally comprises an educational application that illustrates various analytical solutions of growth rates calculated for the cases of folding and necking of a single layer with interfaces perturbed with a single sinusoidal waveform. We further derive two novel analytical expressions for the growth rate in the cases of folding and necking of a linear viscous layer embedded in a linear viscous medium of a finite thickness. We use FOLDER to test the accuracy of single-layer folding simulations using various 1) spatial and temporal resolutions, 2) time integration schemes, and 3) iterative algorithms for non-linear materials. The accuracy of the numerical results is quantified by: 1) comparing them to analytical solutions, if available, or 2) running convergence tests. As a result, we provide a map of the most optimal choice of grid size, time step, and number of iterations to keep the results of the numerical simulations below a given error for a given time
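
    As a point of reference for the analytical growth-rate discussion above, the classical Biot dominant wavelength for folding of a single linear viscous layer embedded in an infinitely thick, less viscous matrix can be evaluated in a few lines of Python. Note that this is the textbook infinite-medium result, not the finite-thickness expressions derived in the paper, and the example numbers are illustrative.

      import math

      def biot_dominant_wavelength(h, mu_layer, mu_matrix):
          """Classical Biot result for single-layer folding in an infinite linear
          viscous matrix: L_d = 2*pi*h * (mu_layer / (6*mu_matrix))**(1/3)."""
          return 2.0 * math.pi * h * (mu_layer / (6.0 * mu_matrix)) ** (1.0 / 3.0)

      # e.g. a 1 m thick layer with a viscosity contrast of 100
      print(biot_dominant_wavelength(1.0, 1e22, 1e20))  # about 16 m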

  7. Numerical Modeling Tools for the Prediction of Solution Migration Applicable to Mining Site

    SciTech Connect

    Martell, M.; Vaughn, P.

    1999-01-06

    Mining has always had an important influence on cultures and traditions of communities around the globe and throughout history. Today, because mining legislation places heavy emphasis on environmental protection, there is great interest in having a comprehensive understanding of ancient mining and mining sites. Multi-disciplinary approaches (i.e., Pb isotopes as tracers) are being used to explore the distribution of metals in natural environments. Another successful approach is to model solution migration numerically. A proven method for simulating solution migration in natural rock salt has been applied to project, over 10,000 years, the system performance and solution concentrations surrounding a proposed nuclear waste repository. This capability is readily adaptable to simulate solution migration around mining sites.

  8. Advanced material modelling in numerical simulation of primary acetabular press-fit cup stability.

    PubMed

    Souffrant, R; Zietz, C; Fritsche, A; Kluess, D; Mittelmeier, W; Bader, R

    2012-01-01

    Primary stability of artificial acetabular cups, used for total hip arthroplasty, is required for the subsequent osteointegration and good long-term clinical results of the implant. Although closed-cell polymer foams represent an adequate bone substitute in experimental studies investigating primary stability, correct numerical modelling of this material depends on the parameter selection. Material parameters necessary for crushable foam plasticity behaviour were derived from numerical simulations matched with experimental tests of the polymethacrylimide raw material. Experimental primary stability tests of acetabular press-fit cups, consisting of static shell assembly with consecutive pull-out and lever-out testing, were subsequently simulated using finite element analysis. Identified and optimised parameters allowed the accurate numerical reproduction of the raw material tests. Correlation between experimental tests and the numerical simulation of primary implant stability depended on the value of interference fit. However, the validated material model provides the opportunity for subsequent parametric numerical studies.

  9. Free Radical Addition Polymerization Kinetics without Steady-State Approximations: A Numerical Analysis for the Polymer, Physical, or Advanced Organic Chemistry Course

    ERIC Educational Resources Information Center

    Iler, H. Darrell; Brown, Amber; Landis, Amanda; Schimke, Greg; Peters, George

    2014-01-01

    A numerical analysis of the free radical addition polymerization system is described that provides those teaching polymer, physical, or advanced organic chemistry courses the opportunity to introduce students to numerical methods in the context of a simple but mathematically stiff chemical kinetic system. Numerical analysis can lead students to an…
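
    A minimal sketch of the kind of stiff kinetic system the article refers to is shown below: initiator decomposition, propagation, and termination are integrated directly with a stiff (BDF) solver, i.e. without invoking the steady-state approximation for the radical concentration. The rate constants and initial concentrations are illustrative values, not taken from the article.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Illustrative rate constants (L/mol/s except kd in 1/s); not from the article.
      kd, kp, kt, f = 1.0e-5, 2.0e2, 1.0e7, 0.5

      def rhs(t, y):
          I, M, R = y                       # initiator, monomer, total radicals
          dI = -kd * I                      # initiator decomposition
          dR = 2.0 * f * kd * I - 2.0 * kt * R**2   # radical generation - termination
          dM = -kp * M * R                  # monomer consumed by propagation
          return [dI, dM, dR]

      # Integrate with a stiff (BDF) method; no quasi-steady-state assumption on R.
      sol = solve_ivp(rhs, (0.0, 3600.0), [0.01, 1.0, 0.0], method="BDF",
                      rtol=1e-8, atol=1e-14)
      print(sol.y[:, -1])                   # concentrations after one hour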

  10. A decision support tool for synchronizing technology advances with strategic mission objectives

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.

    1992-01-01

    Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a decision-support data-derived tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize the technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for manned Mars missions, but can be adapted to support other high-technology long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives that make the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.

  11. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. Examined are cost drivers and whether technology investments can dramatically affect the life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is put forward, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  12. Common Analysis Tool Being Developed for Aeropropulsion: The National Cycle Program Within the Numerical Propulsion System Simulation Environment

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    1999-01-01

    The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines: the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.

  13. Experimental and Numerical Studies on Rock Breaking with TBM Tools under High Stress Confinement

    NASA Astrophysics Data System (ADS)

    Innaurato, N.; Oggeri, C.; Oreste, P. P.; Vinai, R.

    2007-10-01

    The understanding of rock breaking and chipping due to the TBM cutter disks mechanism in deep tunnels is considered in this paper. The interest stems from the use of TBMs for the excavation of long Trans-Alpine tunnels. Some tests that simulate the disk cutter action at the tunnel face by means of an indenter acting on a rock specimen are proposed. The rock specimen is confined through a flat-jack, and a confinement-free area on one side of the specimen simulates the formation of a groove near the indenter, as occurs in TBM excavation conditions. Results show a limited influence of the confinement stress on the thrust increment required for breaking the rock between the indenter and the free side of the specimen. Numerical modelling of the cutter disk action on confined material has also been carried out in order to investigate further aspects of the fracture initiation. Also in this case the importance of the relative position between disk cutter and groove is pointed out.

  14. Development of a carburizing and quenching simulation tool: numerical simulations of rings and gears

    SciTech Connect

    Anderson, C.; Goldman, P.; Rangaswamy, P.

    1996-10-01

    The ability to accurately calculate temperatures, stresses and metallurgical transformations in a single calculation or in a sequence of calculations is the key to prediction of distortion, residual stress and phase distribution in quench hardened automotive parts. Successful predictions in turn rely on the adequacy of the input data to the calculational procedure. These data include mechanical and thermal properties of the alloy phases over the range of temperature and strain rates experienced during the heat treat process, the mathematical description of the transformation kinetics, and the accuracy of the heat transfer boundary conditions. In this presentation we describe a calculational procedure using the ABAQUS finite element code that simulates a carburizing and quench heat treat cycle for automotive gears. The calculational procedure features a numerically efficient 2-phase constitutive model, developed as part of the NCMS-Heat Treatment Distortion Prediction program, to represent transformational plasticity effects for the austenite/martensite transformation together with refined finite element meshes to capture the steep gradients in stress and composition near the gear surfaces. The calculational procedure is illustrated on carburizing and quenching of a thick ring, and comparison of model predictions for distortion, phase distribution, and residual stress with experimental measurements is discussed. Included in this model study is an investigation of the sensitivity of the predictions to mesh refinement.

  15. Numerical simulation as an important tool in developing novel hypersonic technologies

    NASA Astrophysics Data System (ADS)

    Bocharov, A. N.; Balakirev, B. A.; Bityurin, V. A.; Gryaznov, V. K.; Golovin, N. N.; Iosilevskiy, I. L.; Evstigneev, N. M.; Medin, S. A.; Naumov, N. D.; Petrovskiy, V. P.; Ryabkov, O. I.; Solomonov, Yu S.; Tatarinov, A. V.; Teplyakov, I. O.; Tikhonov, A. A.; Fortov, V. E.

    2015-11-01

    Development of novel hypersonic technologies necessarily requires the development of methods for analyzing the motion of hypervelocity vehicles. This paper can be considered the initial stage in the development of a complex computational model for studying flows around hypervelocity vehicles of arbitrary shape. An essential part of the model is the solution of three-dimensional transport equations for mass, momentum and energy for the medium in both LTE (local thermodynamic equilibrium) and non-LTE states. One of the primary requirements for the developed model is implementation on modern heterogeneous computer systems including both CPUs and GPUs. The paper presents the first results on numerical simulation of hypersonic flow. The first problem considered is three-dimensional flow around a curved body at an angle of attack. The performance of a heterogeneous 4-GPU computer system is tested. The second problem highlights the capabilities of the developed model for studying heat and mass transfer problems. Namely, an interior heating problem is considered that takes into account ablation of the thermal protection system and variation of the surface shape of the vehicle.

  16. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    NASA Technical Reports Server (NTRS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.
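
    The grouping step described above can be illustrated with a much-simplified Python sketch: given a pairwise compatibility matrix between service areas, a greedy first-fit pass collects mutually compatible groups. This is only an illustration of the idea; NASARC itself exhaustively enumerates compatible groups and selects among them with a heuristic, which the sketch does not reproduce.

      def greedy_compatible_groups(names, compatible):
          """Group items so every pair inside a group is mutually compatible.
          `compatible[i][j]` is True when items i and j can share an arc segment.
          A greedy first-fit sketch, far simpler than NASARC's exhaustive search."""
          groups = []
          for i, name in enumerate(names):
              for group in groups:
                  if all(compatible[i][j] for j in group):
                      group.append(i)
                      break
              else:
                  groups.append([i])
          return [[names[i] for i in g] for g in groups]

      admins = ["A", "B", "C", "D"]          # hypothetical administrations
      compat = [[True, True, False, True],
                [True, True, True, False],
                [False, True, True, True],
                [True, False, True, True]]
      print(greedy_compatible_groups(admins, compat))  # -> [['A', 'B'], ['C', 'D']]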

  17. Advanced numerical methods for three dimensional two-phase flow calculations

    SciTech Connect

    Toumi, I.; Caruge, D.

    1997-07-01

    This paper is devoted to new numerical methods developed for both one and three dimensional two-phase flow calculations. These methods are finite volume numerical methods and are based on the use of Approximate Riemann Solvers concepts to define convective fluxes versus mean cell quantities. The first part of the paper presents the numerical method for a one dimensional hyperbolic two-fluid model including differential terms such as added mass and interface pressure. This numerical solution scheme makes use of the Riemann problem solution to define backward and forward differencing to approximate spatial derivatives. The construction of this approximate Riemann solver uses an extension of Roe's method that has been successfully used to solve gas dynamic equations. Since the two-fluid model is hyperbolic, this numerical method seems very efficient for the numerical solution of two-phase flow problems. The scheme was applied both to shock tube problems and to standard tests for two-fluid computer codes. The second part describes the numerical method in the three dimensional case. The authors also discuss some improvements performed to obtain a fully implicit solution method that provides fast running steady state calculations. Such a scheme is now implemented in a thermal-hydraulic computer code devoted to 3-D steady-state and transient computations. Some results obtained for Pressurised Water Reactors concerning upper plenum calculations and a steady state flow in the core with rod bow effect evaluation are presented. In practice these new numerical methods have proved to be stable on non-staggered grids and capable of generating accurate non-oscillating solutions for two-phase flow calculations.
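
    To illustrate the approximate-Riemann-solver idea in its simplest setting (not the authors' two-fluid scheme), the Python sketch below applies a Roe-type upwind flux to the scalar Burgers equation and advances the solution with a conservative finite-volume update on a periodic grid.

      import numpy as np

      def burgers_roe_flux(uL, uR):
          """Roe-type flux for the scalar Burgers equation u_t + (u^2/2)_x = 0.
          The Roe-averaged wave speed a = (uL + uR)/2 selects the upwind state."""
          fL, fR = 0.5 * uL**2, 0.5 * uR**2
          a = 0.5 * (uL + uR)               # Roe average of df/du = u
          return np.where(a >= 0.0, fL, fR)

      def step(u, dx, dt):
          """One conservative finite-volume update with periodic boundaries."""
          uL, uR = u, np.roll(u, -1)        # left/right states at each interface
          flux = burgers_roe_flux(uL, uR)
          return u - dt / dx * (flux - np.roll(flux, 1))

      x = np.linspace(0.0, 1.0, 200, endpoint=False)
      u = np.sin(2 * np.pi * x) + 1.5       # smooth, positive initial data
      for _ in range(100):
          u = step(u, dx=1.0 / 200, dt=0.001)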

  18. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.

  19. Analysis of the Source Physics Experiment SPE4 Prime Using State-of-the-Art Parallel Numerical Tools.

    NASA Astrophysics Data System (ADS)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2015-12-01

    This work describes a methodology used for large-scale modeling of wave propagation from underground chemical explosions conducted in fractured granitic rock at the Nevada National Security Site (NNSS). We show that the discrete nature of rock masses as well as the spatial variability of the fabric of rock properties are very important for understanding ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface, we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NNSS as well as historical data from the characterization of underground nuclear tests conducted at the NNSS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key geologic features specific to fractured media, mainly the joints characterized at the NNSS. We have also explored features common to both geological environments, such as saturation and topography, and assessed which characteristics most affect the ground motion in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented into LLNL's Geodyn-L hydrocode. Simulations were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges for the recently executed SPE4 prime experiment. We have also conducted a comparative study between SPE4 prime and the previous experiments SPE1 and SPE3 to assess similarities and differences and to draw conclusions for designing SPE5.

  20. DNA technological progress toward advanced diagnostic tools to support human hookworm control.

    PubMed

    Gasser, R B; Cantacessi, C; Loukas, A

    2008-01-01

    Blood-feeding hookworms are parasitic nematodes of major human health importance. Currently, it is estimated that 740 million people are infected worldwide, and more than 80 million of them are severely affected clinically by hookworm disease. In spite of the health problems caused and the advances toward the development of vaccines against some hookworms, limited attention has been paid to the need for improved, practical methods of diagnosis. Accurate diagnosis and genetic characterization of hookworms is central to their effective control. While traditional diagnostic methods have considerable limitations, there has been some progress toward the development of molecular-diagnostic tools. The present article provides a brief background on hookworm disease of humans, reviews the main methods that have been used for diagnosis and describes progress in establishing polymerase chain reaction (PCR)-based methods for the specific diagnosis of hookworm infection and the genetic characterisation of the causative agents. This progress provides a foundation for the rapid development of practical, highly sensitive and specific diagnostic and analytical tools to be used in improved hookworm prevention and control programmes.

  1. MATISSE: Multi-purpose Advanced Tool for Instruments for the Solar System Exploration.

    NASA Astrophysics Data System (ADS)

    Zinzi, A.; Capria, M. T.; Antonelli, L. A.

    In planetary sciences, designing, assembling and launching onboard instruments are only preliminary steps toward the final aim of converting data into scientific knowledge, as the real challenge is the data analysis and interpretation. Up to now, data have generally been stored in "old style" archives, i.e. common ftp servers where the user can manually search for data by browsing directories organized in a time-ordered manner. However, as the datasets to be stored and searched become particularly large, this latter task absorbs a great part of the time, subtracting it from the real scientific work. In order to reduce the time spent searching for and analyzing data, MATISSE (Multi-purpose Advanced Tool for Instruments for the Solar System Exploration), a new set of software tools developed together with the scientific teams of the instruments involved, is under development at ASDC (ASI Science Data Center), whose experience in space mission data management is well known (e.g., Verrecchia et al. 2007; Pittori et al. 2009; Giommi et al. 2009; Massaro et al. 2011); its features and aims are presented here.

  2. Numerous Numerals.

    ERIC Educational Resources Information Center

    Henle, James M.

    This pamphlet consists of 17 brief chapters, each containing a discussion of a numeration system and a set of problems on the use of that system. The numeration systems used include Egyptian fractions, ordinary continued fractions and variants of that method, and systems using positive and negative bases. The book is informal and addressed to…

  3. Sea Surface Salinity spectra: a validation tool for satellite, numerical simulations and in-situ data

    NASA Astrophysics Data System (ADS)

    Hoareau, Nina; Portabella, Marcos; García Ladona, Emilio; Turiel, Antonio; Ballabrera, Joaquim

    2014-05-01

    Satellite remote sensing measurements have been used in oceanography since the mid-1970s. Thanks to satellite imagery, the research community has been able to better interpret surface structures, such as meandering fronts or eddies, which became apparent in instantaneous views of the ocean. Moreover, satellite altimeter and sea surface temperature (SST) observations evidenced the high percentage of ocean energy accumulated at the intermediate scales (tens to hundreds of km, days-weeks), i.e., the oceanic mesoscale. Today, thanks to the launch of the Soil Moisture and Ocean Salinity (SMOS) mission (2009) and the Aquarius mission (2011), we have more than four years of satellite-derived Sea Surface Salinity (SSS) observations with the objectives of improving seasonal and interannual climate prediction, ocean rainfall estimates and hydrologic budgets, and monitoring large-scale salinity events and thermohaline convection (Lagerloef, 2001). A study by Reynolds and Chelton (2010) compared six different SST products using spatial power density spectra in three regions of the ocean at different periods (January and July 2007-2008). The results showed that the spatial spectra vary geographically and temporally, and from one product to the next. Here, a similar study is presented for the first time with SSS data to help understand the spatial signature of the SSS variability and validate the different data sources. Thanks to the increased maturity of remote sensing estimations of SSS, the spatial spectra of the SSS fields provided by numerical models can now be compared with observations. In this work, we focus on the North Atlantic Ocean for January and July of 2011 and 2012. The data used in this work come from satellites (Aquarius and/or SMOS Level 2), outputs of an ocean model (NEMO-OPA, configuration DRAKKAR-NATL025), in-situ observations collected during the Barcelona World Race (BWR 2010), and the climatology of Levitus (WOA09). The results show that
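
    For illustration, a spatial power density spectrum of the kind compared in this type of study can be estimated from an along-track transect with Welch's method; the snippet below uses a synthetic salinity record and an assumed 25 km sample spacing, not data from the paper.

      import numpy as np
      from scipy.signal import welch

      # Synthetic along-track SSS transect sampled every 25 km (illustrative only)
      dx_km = 25.0
      n = 512
      rng = np.random.default_rng(1)
      sss = 35.0 + np.cumsum(rng.standard_normal(n)) * 0.01   # red-noise-like salinity

      # Welch periodogram in cycles per km; wavelength = 1 / wavenumber
      wavenumber, psd = welch(sss - sss.mean(), fs=1.0 / dx_km, nperseg=256)
      print(wavenumber[1:4], psd[1:4])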

  4. A review of recent advances in numerical simulations of microscale fuel processor for hydrogen production

    NASA Astrophysics Data System (ADS)

    Holladay, J. D.; Wang, Y.

    2015-05-01

    Microscale (<5 W) reformers for hydrogen production have been investigated for over a decade. These devices are intended to provide hydrogen for small fuel cells. Due to the reformer's small size, numerical simulations are critical to understand heat and mass transfer phenomena occurring in the systems and help guide further improvements. This paper reviews the development of the numerical codes and details the reaction equations used. The majority of the devices utilized methanol as the fuel due to methanol's low reforming temperature and high conversion, although there are several methane-fueled systems. The increased computational power and more complex codes have led to improved accuracy of numerical simulations. Initial models focused on the reformer, while more recently, the simulations began including other unit operations such as vaporizers, inlet manifolds, and combustors. These codes are critical for developing the next-generation systems. The systems reviewed included plate reactors, microchannel reactors, and annulus reactors for both wash-coated and packed-bed systems.
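
    For context, the global reactions usually included in methanol micro-reformer models are listed below; the specific kinetic expressions differ between the reviewed codes, and this grouping is a generic summary rather than a list taken from the review.

      CH3OH + H2O  ->  CO2 + 3 H2     (methanol steam reforming, endothermic)
      CH3OH        ->  CO  + 2 H2     (methanol decomposition, endothermic)
      CO + H2O    <->  CO2 + H2       (water-gas shift, mildly exothermic)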

  5. Recent advances in numerical simulation and control of asymmetric flows around slender bodies

    NASA Technical Reports Server (NTRS)

    Kandil, Osama A.; Wong, Tin-Chee; Sharaf, Hazem H.; Liu, C. H.

    1992-01-01

    The problems of asymmetric flow around slender bodies and its control are formulated using the unsteady, compressible, thin-layer or full Navier-Stokes equations which are solved using an implicit, flux-difference splitting, finite-volume scheme. The problem is numerically simulated for both locally-conical and three-dimensional flows. The numerical applications include studies of the effects of relative incidence, Mach number and Reynolds number on the flow asymmetry. For the control of flow asymmetry, the numerical simulations cover passive and active control methods. For the passive control, the effectiveness of vertical fins placed in the leeward plane of geometric symmetry and side strakes with different orientations is studied. For the active control, the effectiveness of normal and tangential flow injection and surface heating and a combination of these methods is studied.

  6. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  7. Development and experimental validation of a numerical tool for structural health and usage monitoring systems based on chirped grating sensors.

    PubMed

    Bettini, Paolo; Guerreschi, Erika; Sala, Giuseppe

    2015-01-01

    The interest of the aerospace industries in structural health and usage monitoring systems is continuously increasing. Among the techniques available in the literature, those based on Fibre Bragg Grating sensors are very promising thanks to their peculiarities. Different Chirped Bragg Grating sensor configurations have been investigated in this paper. Starting from a numerical model capable of simulating the spectral response of a grating subjected to a generic strain profile (direct problem), a new code has been developed that allows strain reconstruction from the measured spectrum (inverse problem); experimental validation of the program was carried out through different loading cases applied on a chirped grating. The wavelength of the reflection spectrum for a chirped FBG has a one-to-one correspondence to the position along the gauge section, thus allowing strain reconstruction over the entire sensor length. Tests conducted on chirped FBGs also evidenced their potential for SHM applications, if coupled with appropriate numerical strain reconstruction tools. Finally, a new class of sensors - Draw Tower Grating arrays - has been studied. These sensors are applicable to distributed sensing and load reconstruction over large structures, thanks to their greater length. Three configurations have been evaluated, having different spatial and spectral characteristics, in order to explore possible applications of such sensors to SHM systems.
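
    The wavelength-to-position idea described above can be sketched as follows for a linearly chirped grating: each position z along the gauge has its own design Bragg wavelength, so a measured local wavelength shift maps to local strain through the usual FBG strain sensitivity. The Python snippet is an illustration with assumed values (grating length, wavelength range, photo-elastic coefficient), not the reconstruction code developed by the authors.

      import numpy as np

      def chirped_fbg_strain(z, lam_measured, lam0_start, lam0_end, p_e=0.22):
          """Map measured local Bragg wavelengths along a linearly chirped FBG to
          strain: eps(z) = (lam_meas - lam0(z)) / (lam0(z) * (1 - p_e)),
          where lam0(z) is the unstrained (design) Bragg wavelength at position z
          and p_e is the effective photo-elastic coefficient (~0.22 for silica)."""
          lam0 = lam0_start + (lam0_end - lam0_start) * z / z[-1]   # linear chirp
          return (lam_measured - lam0) / (lam0 * (1.0 - p_e))

      z = np.linspace(0.0, 50e-3, 11)                    # assumed 50 mm gauge length
      lam0 = 1545e-9 + (1555e-9 - 1545e-9) * z / z[-1]   # unstrained wavelength profile
      lam_meas = lam0 * (1.0 + (1 - 0.22) * 500e-6)      # uniform 500 microstrain case
      print(chirped_fbg_strain(z, lam_meas, 1545e-9, 1555e-9))   # ~5e-4 everywhere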

  8. Development and Experimental Validation of a Numerical Tool for Structural Health and Usage Monitoring Systems Based on Chirped Grating Sensors

    PubMed Central

    Bettini, Paolo; Guerreschi, Erika; Sala, Giuseppe

    2015-01-01

    The interest of the aerospace industries in structural health and usage monitoring systems is continuously increasing. Among the techniques available in the literature, those based on Fibre Bragg Grating sensors are very promising thanks to their peculiarities. Different Chirped Bragg Grating sensor configurations have been investigated in this paper. Starting from a numerical model capable of simulating the spectral response of a grating subjected to a generic strain profile (direct problem), a new code has been developed that allows strain reconstruction from the measured spectrum (inverse problem); experimental validation of the program was carried out through different loading cases applied on a chirped grating. The wavelength of the reflection spectrum for a chirped FBG has a one-to-one correspondence to the position along the gauge section, thus allowing strain reconstruction over the entire sensor length. Tests conducted on chirped FBGs also evidenced their potential for SHM applications, if coupled with appropriate numerical strain reconstruction tools. Finally, a new class of sensors—Draw Tower Grating arrays—has been studied. These sensors are applicable to distributed sensing and load reconstruction over large structures, thanks to their greater length. Three configurations have been evaluated, having different spatial and spectral characteristics, in order to explore possible applications of such sensors to SHM systems. PMID:25587979

  9. Development and experimental validation of a numerical tool for structural health and usage monitoring systems based on chirped grating sensors.

    PubMed

    Bettini, Paolo; Guerreschi, Erika; Sala, Giuseppe

    2015-01-01

    The interest of the aerospace industries in structural health and usage monitoring systems is continuously increasing. Among the techniques available in the literature, those based on Fibre Bragg Grating sensors are very promising thanks to their peculiarities. Different Chirped Bragg Grating sensor configurations have been investigated in this paper. Starting from a numerical model capable of simulating the spectral response of a grating subjected to a generic strain profile (direct problem), a new code has been developed that allows strain reconstruction from the measured spectrum (inverse problem); experimental validation of the program was carried out through different loading cases applied on a chirped grating. The wavelength of the reflection spectrum for a chirped FBG has a one-to-one correspondence to the position along the gauge section, thus allowing strain reconstruction over the entire sensor length. Tests conducted on chirped FBGs also evidenced their potential for SHM applications, if coupled with appropriate numerical strain reconstruction tools. Finally, a new class of sensors - Draw Tower Grating arrays - has been studied. These sensors are applicable to distributed sensing and load reconstruction over large structures, thanks to their greater length. Three configurations have been evaluated, having different spatial and spectral characteristics, in order to explore possible applications of such sensors to SHM systems. PMID:25587979

  10. Evolution of the design methodologies for the next generation of RPV Extensive role of the thermal-hydraulics numerical tools

    SciTech Connect

    Goreaud, Nicolas; Nicaise, Norbert; Stoudt, Roger

    2004-07-01

    The thermal-hydraulic design of the first PWRs was mainly based on an experimental approach, with a large series of tests on the main equipment (control rod guide tubes, RPV plenums, etc.) to check its performance. The development of CFD codes and computers now allows for complex simulations of hydraulic phenomena. Provided adequate qualification, these numerical tools are efficient means to determine the hydraulics of a given design and to perform sensitivity studies for optimization of new designs. Experiments always play their role, first for qualification and then for validation at the last stage of the design. The design of the European Pressurized water Reactor (EPR) is based on both hydraulic calculations and experiments, handled in a complementary approach. This paper describes the effort launched by Framatome-ANP on hydraulic calculations for the Reactor Pressure Vessel (RPV) of the EPR reactor. It concerns 3D calculations of the RPV inlet including cold legs, the RPV downcomer and lower plenum, and the RPV upper plenum up to and including the hot legs. It covers normal operating conditions, but also accident conditions such as PTS (Pressurized Thermal Shock) in a small-break loss-of-coolant accident (SB-LOCA). These hydraulic studies have provided much useful information for the mechanical design of the RPV internals. (authors)

  11. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  12. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative has been conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio, over the period August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year-by-year basis. All publications resulting from the project are listed at the end of this report.

  13. A numerical technique for calculation of the noise of high-speed propellers with advanced blade geometry

    NASA Technical Reports Server (NTRS)

    Nystrom, P. A.; Farassat, F.

    1980-01-01

    A numerical technique and computer program were developed for the prediction of the noise of propellers with advanced geometry. The blade upper and lower surfaces are described by a curvilinear coordinate system, which was also used to divide the blade surfaces into panels. Two different acoustic formulations in the time domain were used to improve the speed and efficiency of the noise calculations: an acoustic formulation with the Doppler factor singularity for panels moving at subsonic speeds and the collapsing sphere formulation for panels moving at transonic or supersonic speeds. This second formulation involves a sphere which is centered at the observer position and whose radius decreases at the speed of sound. The acoustic equation consisted of integrals over the curve of intersection of the sphere and the panels on the blade. Algorithms used in some parts of the computer program are discussed. Comparisons with measured acoustic data for two model high-speed propellers with advanced geometry are also presented.

  14. A review of recent advances of numerical simulations of microscale fuel processors for hydrogen production

    SciTech Connect

    Holladay, Jamelyn D.; Wang, Yong

    2015-05-01

    Microscale (<5 W) reformers for hydrogen production have been investigated for over a decade. These devices are intended to provide hydrogen for small fuel cells. Due to the reformer’s small size, numerical simulations are critical to understand heat and mass transfer phenomena occurring in the systems. This paper reviews the development of the numerical codes and details the reaction equations used. The majority of the devices utilized methanol as the fuel due to methanol’s low reforming temperature and high conversion, although there are several methane-fueled systems. As computational power has decreased in cost and increased in availability, the codes have increased in complexity and accuracy. Initial models focused on the reformer, while more recently the simulations began including other unit operations such as vaporizers, inlet manifolds, and combustors. These codes are critical for developing the next-generation systems. The systems reviewed included plate reactors, microchannel reactors, and annulus reactors, covering both wash-coated and packed-bed systems.

  15. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2013-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent "go-to" group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  16. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Creech, Dennis M.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2012-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent go-to group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  17. Numerical evaluation of longitudinal motions of Wigley hulls advancing in waves by using Bessho form translating-pulsating source Green's function

    NASA Astrophysics Data System (ADS)

    Xiao, Wenbin; Dong, Wencai

    2016-06-01

    In the framework of 3D potential flow theory, the Bessho form translating-pulsating source Green's function in the frequency domain is chosen as the integral kernel in this study, and a hybrid source-and-dipole distribution model of the boundary element method is applied to directly solve the velocity potential for an advancing ship in regular waves. Numerical characteristics of the Green's function show that the contribution of local-flow components to the velocity potential is concentrated in the area near the source point, while the wave component dominates the magnitude of the velocity potential in the far field. Two kinds of mathematical models, with or without local-flow components taken into account, are adopted to numerically calculate the longitudinal motions of Wigley hulls, which demonstrates the applicability of the translating-pulsating source Green's function method for various ship forms. In addition, a mesh analysis of the discretized surface is carried out from the perspective of ship-form characteristics. The study shows that the longitudinal motion results from the simplified model are somewhat greater than the experimental data in the resonant zone, and the model can be used as an effective tool to predict ship seakeeping properties. However, the translating-pulsating source Green's function method is only appropriate for qualitative analysis of the motion response in waves if the ship's geometrical shape fails to satisfy the slender-body assumption.

  18. Ares First Stage "Systemology" - Combining Advanced Systems Engineering and Planning Tools to Assure Mission Success

    NASA Technical Reports Server (NTRS)

    Seiler, James; Brasfield, Fred; Cannon, Scott

    2008-01-01

    Ares is an integral part of NASA's Constellation architecture that will provide crew and cargo access to the International Space Station as well as low Earth orbit support for lunar missions. Ares replaces the Space Shuttle in the post-2010 time frame. Ares I is an in-line, two-stage rocket topped by the Orion Crew Exploration Vehicle, its service module, and a launch abort system. The Ares I first stage is a single, five-segment reusable solid rocket booster derived from the Space Shuttle Program's reusable solid rocket motor. The Ares second, or upper, stage is propelled by a J-2X main engine fueled with liquid oxygen and liquid hydrogen. This paper describes the advanced systems engineering and planning tools being utilized for the design, test, and qualification of the Ares I first stage element. Included are descriptions of the current first stage design, the milestone schedule requirements, and the marriage of systems engineering, detailed planning efforts, and roadmapping employed to achieve these goals.

  19. Recent advances in i-Gene tools and analysis: microarrays, next generation sequencing and mass spectrometry.

    PubMed

    Moorhouse, Michael J; Sharma, Hari S

    2011-08-01

    Recent advances in technology and associated methodology have made the current period one of the most exciting in molecular biology and medicine. Underlying these is an appreciation that modern research is driven by increasingly large amounts of data being interpreted by interdisciplinary collaborative teams which are often geographically dispersed. The availability of cheap computing power, high-speed informatics networks and high-quality analysis software has been essential to this, as has the application of modern quality assurance methodologies. In this review, we discuss the application of modern 'High-Throughput' molecular biological technologies such as 'Microarrays' and 'Next Generation Sequencing' to scientific and biomedical research as we have observed it. Furthermore, we offer some guidance to help the reader understand certain features of these technologies as well as new strategies, and to apply these i-Gene tools successfully in their own endeavours. Collectively, we term this 'i-Gene Analysis'. We also offer predictions as to the developments that are anticipated in the near and more distant future.

  20. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus

    PubMed Central

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field. PMID:23847628

  1. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus.

    PubMed

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field.

  2. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  3. Advanced Techniques for Seismic Protection of Historical Buildings: Experimental and Numerical Approach

    SciTech Connect

    Mazzolani, Federico M.

    2008-07-08

    The seismic protection of historical and monumental buildings, namely those dating from the ancient age up to the 20th Century, is being looked at with greater and greater interest, above all in the Euro-Mediterranean area, whose cultural heritage is strongly susceptible to severe damage or even collapse due to earthquakes. The cultural importance of historical and monumental constructions limits, in many cases, the possibility of upgrading them from the seismic point of view, due to the fear of using intervention techniques which could have detrimental effects on their cultural value. Consequently, great interest is growing in the development of sustainable methodologies for the use of Reversible Mixed Technologies (RMTs) in the seismic protection of existing constructions. RMTs, in fact, are conceived to exploit the peculiarities of innovative materials and special devices, and they allow ease of removal when necessary. This paper deals with the experimental and numerical studies, framed within the EC PROHITECH research project, on the application of RMTs to historical and monumental constructions mainly belonging to the cultural heritage of the Euro-Mediterranean area. The experimental tests and the numerical analyses are carried out at five different levels, namely full-scale models, large-scale models, sub-systems, devices, and materials and elements.

  4. Numerical Study on Crossflow Printed Circuit Heat Exchanger for Advanced Small Modular Reactors

    SciTech Connect

    Yoon, Su-Jong; Sabharwall, Piyush; Kim, Eung-Soo

    2014-03-01

    Various fluids such as water, gases (helium), molten salts (FLiNaK, FLiBe) and liquid metals (sodium) are used as coolants of advanced small modular reactors (SMRs). The printed circuit heat exchanger (PCHE) has been adopted as the intermediate and/or secondary heat exchanger of SMR systems because this heat exchanger is compact and effective. The size and cost of a PCHE depend on the coolant type of each SMR. In this study, a crossflow PCHE analysis code for advanced small modular reactors has been developed for the thermal design and cost estimation of the heat exchanger. The analytical solution of a single-pass crossflow heat exchanger model with both fluids unmixed was employed to calculate a two-dimensional temperature profile of a crossflow PCHE. The analytical solution of the crossflow heat exchanger was implemented simply by using built-in functions of the MATLAB program. The effect of fluid property uncertainty on the calculation results was evaluated. In addition, the effect of heat transfer correlations on the calculated temperature profile was analyzed by taking into account possible combinations of primary and secondary coolants in the SMR systems. The size and cost of the heat exchanger were evaluated for the given temperature requirement of each SMR.
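
    As a minimal illustration of the kind of crossflow heat exchanger calculation described above, the following Python sketch applies the standard effectiveness-NTU approximation for a single-pass crossflow exchanger with both fluids unmixed. It is a simplified stand-in for the paper's two-dimensional analytical solution and MATLAB implementation; the fluid properties, flow rates and UA value are illustrative assumptions, not taken from the study.

        import numpy as np

        def crossflow_effectiveness(ntu, c_r):
            # Standard approximate effectiveness correlation for a single-pass
            # crossflow heat exchanger with both fluids unmixed.
            return 1.0 - np.exp((1.0 / c_r) * ntu**0.22 * (np.exp(-c_r * ntu**0.78) - 1.0))

        def outlet_temperatures(m_h, cp_h, t_h_in, m_c, cp_c, t_c_in, ua):
            # Hot/cold outlet temperatures and duty from the effectiveness-NTU method.
            c_h, c_c = m_h * cp_h, m_c * cp_c            # heat capacity rates [W/K]
            c_min, c_max = min(c_h, c_c), max(c_h, c_c)
            ntu = ua / c_min
            eff = crossflow_effectiveness(ntu, c_min / c_max)
            q = eff * c_min * (t_h_in - t_c_in)          # transferred heat duty [W]
            return t_h_in - q / c_h, t_c_in + q / c_c, q

        # Purely illustrative operating point (hypothetical helium-to-molten-salt duty)
        print(outlet_temperatures(m_h=1.2, cp_h=5193.0, t_h_in=750.0,
                                  m_c=10.0, cp_c=1880.0, t_c_in=500.0, ua=2.0e4))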

  5. An Efficient and Imperfect Model for Gravel-Bed Braided River Morphodynamics: Numerical Simulations as Exploratory Tools

    NASA Astrophysics Data System (ADS)

    Kasprak, A.; Brasington, J.; Hafen, K.; Wheaton, J. M.

    2015-12-01

    Numerical models that predict channel evolution through time are an essential tool for investigating processes that occur over timescales which render field observation intractable. However, available morphodynamic models generally take one of two approaches to the complex problem of computing morphodynamics, resulting in oversimplification of the relevant physics (e.g. cellular models) or faithful, yet computationally intensive, representations of the hydraulic and sediment transport processes at play. The practical implication of these approaches is that river scientists must often choose between unrealistic results, in the case of the former, or computational demands that render modeling realistic spatiotemporal scales of channel evolution impossible. Here we present a new modeling framework that operates at the timescale of individual competent flows (e.g. floods), and uses a highly-simplified sediment transport routine that moves volumes of material according to morphologically-derived characteristic transport distances, or path lengths. Using this framework, we have constructed an open-source morphodynamic model, termed MoRPHED, which is here applied, and its validity investigated, at timescales ranging from a single event to a decade on two braided rivers in the UK and New Zealand. We do not purport that MoRPHED is the best, nor even an adequate, tool for modeling braided river dynamics at this range of timescales. Rather, our goal in this research is to explore the utility, feasibility, and sensitivity of an event-scale, path-length-based modeling framework for predicting braided river dynamics. To that end, we further explore (a) which processes are naturally emergent and which must be explicitly parameterized in the model, (b) the sensitivity of the model to the choice of particle travel distance, and (c) whether an event-scale model timestep is adequate for producing braided channel dynamics. The results of this research may inform techniques for future
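
    As a rough illustration of the event-scale, path-length-based transport idea described above, the following Python sketch displaces eroded material downstream by distances drawn from a characteristic path-length distribution at each competent-flow event. The one-dimensional grid, the uniform erosion rule and the exponential path-length distribution are placeholder assumptions for illustration only, not MoRPHED's actual routines.

        import numpy as np

        rng = np.random.default_rng(0)

        def event_step(elevation, erodible_depth, mean_path_length, dx):
            # One competent-flow event on a 1-D profile: strip a thin layer from
            # every cell and re-deposit each parcel downstream at a distance drawn
            # from an exponential path-length distribution (placeholder rule).
            n = elevation.size
            new_elevation = elevation - erodible_depth
            for i in range(n):
                step = rng.exponential(mean_path_length)      # travel distance [m]
                j = min(n - 1, i + int(round(step / dx)))     # destination cell
                new_elevation[j] += erodible_depth            # deposit the parcel
            return new_elevation

        # Synthetic sloping bed and 25 competent events, illustration only
        elevation = np.linspace(10.0, 0.0, 200)
        for _ in range(25):
            elevation = event_step(elevation, erodible_depth=0.02,
                                   mean_path_length=15.0, dx=1.0)
        print(elevation[:5], elevation[-5:])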

  6. Numerical modelling of the groundwater inflow to an advancing open pit mine: Kolahdarvazeh pit, Central Iran.

    PubMed

    Bahrami, Saeed; Doulati Ardejani, Faramarz; Aslani, Soheyla; Baafi, Ernest

    2014-12-01

    The groundwater inflow into a mine during its life and after ceasing operations is one of the most important concerns of the mining industry. This paper presents a hydrogeological assessment of the Irankuh Zn-Pb mine, 20 km south of Esfahan and 1 km northeast of Abnil in west-Central Iran. During mine excavation, the upper impervious bed of a confined aquifer was broken and water at high pressure flowed into an open pit mine associated with the Kolahdarvazeh deposit. The inflow rates were 6.7 and 1.4 m(3)/s at the maximum and minimum, respectively. The permeability, storage coefficient, thickness and initial head of the fully saturated confined aquifer were 3.5 × 10(-4) m/s, 0.2, 30 m and 60 m, respectively. The hydraulic heads as a function of time were monitored at four observation wells in the vicinity of the pit over 19 weeks and at an observation well near a test well over 21 h. In addition, by measuring the rate of pumping out from the pit sump at a constant head (usually equal to the height of the pit floor), the real inflow rates to the pit were monitored. The main innovations of this work were to compare numerical modelling, using finite element software called SEEP/W, with actual inflow data, and to extend the applicability of the numerical model. This model was further used to estimate the hydraulic heads at the observation wells around the pit over 19 weeks during mining operations. Data from a pump-out test and observation wells were used for model calibration and verification. In order to evaluate the model efficiency, the modelling results for inflow quantity and hydraulic heads were compared to those from analytical solutions, as well as the field data. The mean percent error relative to field data for the inflow quantity was 0.108. It varied between 1.16 and 1.46 for the hydraulic head predictions, which are much lower values than the mean percent errors resulting from the analytical solutions (from 1.8 to 5
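
    The model skill reported above is expressed as a mean percent error against field observations. A minimal Python sketch of that metric is given below; the inflow values are hypothetical placeholders (not the paper's data), and whether the error is reported as a fraction or a percentage is an assumption here.

        import numpy as np

        def mean_percent_error(simulated, observed):
            # Mean absolute percent error of simulated values relative to observations.
            simulated = np.asarray(simulated, dtype=float)
            observed = np.asarray(observed, dtype=float)
            return np.mean(np.abs(simulated - observed) / np.abs(observed)) * 100.0

        # Hypothetical weekly pit-inflow rates [m^3/s], for illustration only
        observed_inflow = [6.7, 5.9, 4.8, 3.6, 2.2, 1.4]
        simulated_inflow = [6.6, 5.8, 4.9, 3.7, 2.3, 1.5]
        print(mean_percent_error(simulated_inflow, observed_inflow))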

  8. Predictive Modeling of Estrogen Receptor Binding Agents Using Advanced Cheminformatics Tools and Massive Public Data

    PubMed Central

    Ribay, Kathryn; Kim, Marlene T.; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao

    2016-01-01

    Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing the duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models using chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which became available to us after the models were developed. The cross-validation results for the training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity for the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each pair of compounds. The nearest neighbors for each compound within the set were then identified and its ERα binding potential was predicted from its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR
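
    To make the read-across step concrete, the following Python sketch predicts a compound's binder/non-binder label from its most biosimilar neighbors in a training set and scores the result with a correct classification rate (CCR, taken here as the mean of sensitivity and specificity). The correlation-based similarity, the choice of k and the synthetic example data are assumptions for illustration; the paper's actual similarity index and workflow differ.

        import numpy as np

        def ccr(y_true, y_pred):
            # Correct Classification Rate: mean of sensitivity and specificity.
            y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
            sensitivity = np.mean(y_pred[y_true == 1] == 1)
            specificity = np.mean(y_pred[y_true == 0] == 0)
            return 0.5 * (sensitivity + specificity)

        def knn_biosimilarity_predict(profiles_train, y_train, profiles_test, k=5):
            # Predict each test compound's class by majority vote of its k most
            # "biosimilar" training compounds (similarity = correlation of
            # bioassay response profiles; an assumption, not the paper's index).
            predictions = []
            for profile in profiles_test:
                sims = np.array([np.corrcoef(profile, t)[0, 1] for t in profiles_train])
                nearest = np.argsort(-sims)[:k]
                predictions.append(int(np.mean(y_train[nearest]) >= 0.5))
            return np.array(predictions)

        # Synthetic bioassay-response profiles, for illustration only
        rng = np.random.default_rng(1)
        X_train, y_train = rng.normal(size=(50, 20)), rng.integers(0, 2, size=50)
        X_test = rng.normal(size=(10, 20))
        y_test = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1])
        print(ccr(y_test, knn_biosimilarity_predict(X_train, y_train, X_test)))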

  10. Evaluation of Temperature Gradient in Advanced Automated Directional Solidification Furnace (AADSF) by Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Bune, Andris V.; Gillies, Donald C.; Lehoczky, Sandor L.

    1996-01-01

    A numerical model of heat transfer using combined conduction, radiation and convection in the AADSF was used to evaluate temperature gradients in the vicinity of the crystal/melt interface for a variety of hot and cold zone set point temperatures, specifically for the growth of mercury cadmium telluride (MCT). Reverse usage of the hot and cold zones was simulated to aid the choice of proper orientation of the crystal/melt interface with respect to the residual acceleration vector, without actually changing the furnace location on board the orbiter. It appears that an additional booster heater would be extremely helpful to ensure the desired temperature gradient when the hot and cold zones are reversed. Further efforts are required to investigate the advantages and disadvantages of a symmetrical furnace design (i.e., with hot and cold zones of similar length).

  11. Advanced friction simulation of standardized friction tests: a numerical and experimental demonstrator

    NASA Astrophysics Data System (ADS)

    Hol, J.; Wiebenga, J. H.; Hörning, M.; Dietrich, F.; Dane, C.

    2016-08-01

    For the characterization of friction conditions under sheet metal forming process conditions, different friction test set-ups are being used in industry. However, different friction tests and test set-ups are known to produce scattered friction results. In this work, the TriboForm software is utilized to numerically model the frictional behavior. The simulated coefficients of friction are experimentally validated using friction results from a standardized strip drawing friction test set-up. The experimental and simulation results of the friction behavior show good overall agreement. This demonstrates that the TriboForm software enables the simulation of friction conditions for varying tribology conditions, resulting in a generally applicable approach for friction characterization under industrial sheet metal forming process conditions.

  12. Numerical simulation of the reactive flow in advanced (HSR) combustors using KIVA-2

    NASA Technical Reports Server (NTRS)

    Winowich, Nicholas S.

    1991-01-01

    Recent work has been done with the goal of establishing ultralow-emission aircraft gas turbine combustors. A significant portion of the effort is the development of three-dimensional computational combustor models. The KIVA-II computer code, which is based on the Implicit Continuous Eulerian Difference mesh Arbitrary Lagrangian Eulerian (ICED-ALE) numerical scheme, is one of the codes selected by NASA to achieve these goals. This report involves a simulation of jet injection through slanted slots within the Rich burn/Quick quench/Lean burn (RQL) baseline experimental rig. The RQL combustor distinguishes three regions of combustion. This work specifically focuses on modeling the quick quench mixer region, in which secondary injection air is introduced radially through 12 equally spaced slots around the mixer circumference. Steady state solutions are achieved with modifications to the KIVA-II program. Work currently underway will evaluate thermal mixing as a function of injection air velocity and angle of inclination of the slots.

  13. A numerical investigation on the efficiency of range extending systems using Advanced Vehicle Simulator

    NASA Astrophysics Data System (ADS)

    Varnhagen, Scott; Same, Adam; Remillard, Jesse; Park, Jae Wan

    2011-03-01

    Series plug-in hybrid electric vehicles of varying engine configuration and battery capacity are modeled using the Advanced Vehicle Simulator (ADVISOR). The performance of these vehicles is analyzed on the basis of energy consumption and greenhouse gas emissions on the tank-to-wheel and well-to-wheel paths. Both city and highway driving conditions are considered during the simulation. When simulated on the well-to-wheel path, it is shown that the range extender with a Wankel rotary engine consumes less energy and emits fewer greenhouse gases compared to the other systems with reciprocating engines during many driving cycles. The rotary engine has a higher power-to-weight ratio and lower noise, vibration and harshness compared to conventional reciprocating engines, although it performs less efficiently. The benefits of a Wankel engine make it an attractive option for use as a range extender in a plug-in hybrid electric vehicle.
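
    The kind of drive-cycle energy accounting referred to above can be sketched very roughly as follows: a speed trace is converted to road load, wheel power and then fuel power through an assumed engine/generator efficiency, so that a lighter but less efficient engine can be compared against a heavier, more efficient one. This is a toy tank-to-wheel calculation only; ADVISOR's component models are far more detailed, and the speed trace, masses and efficiencies below are illustrative assumptions.

        import numpy as np

        G, RHO_AIR = 9.81, 1.2

        def cycle_fuel_energy_kwh(speed_mps, dt_s, mass_kg, engine_eff,
                                  c_rr=0.009, cd_a=0.65, drivetrain_eff=0.85):
            # Toy tank-to-wheel accounting for a series hybrid in charge-sustaining
            # mode: road load -> wheel power -> fuel power via engine/generator
            # efficiency, integrated over the cycle (regenerative braking ignored).
            v = np.asarray(speed_mps, dtype=float)
            accel = np.gradient(v, dt_s)
            force = mass_kg * accel + c_rr * mass_kg * G + 0.5 * RHO_AIR * cd_a * v**2
            wheel_power = np.clip(force * v, 0.0, None)
            fuel_power = wheel_power / (drivetrain_eff * engine_eff)
            return np.sum(fuel_power) * dt_s / 3.6e6      # J -> kWh

        # Hypothetical urban speed trace and vehicle parameters, illustration only
        speed = 14.0 * np.clip(np.sin(np.linspace(0.0, 12.0 * np.pi, 1200)), 0.0, None)
        print("lighter, less efficient engine: ",
              cycle_fuel_energy_kwh(speed, 1.0, mass_kg=1450.0, engine_eff=0.26))
        print("heavier, more efficient engine:",
              cycle_fuel_energy_kwh(speed, 1.0, mass_kg=1550.0, engine_eff=0.30))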

  14. Theoretical and numerical methods used as design tool for an aircraft: Application on three real-world configurations

    NASA Astrophysics Data System (ADS)

    Anton, Nicoleta

    The mathematical models needed to represent the various dynamics phenomena have been conceived in many disciplines related to aerospace engineering. Major aerospace companies have developed their own codes to estimate aerodynamic characteristics and aircraft stability in the conceptual phase, in parallel with universities that have developed various codes for educational and research purposes. This paper presents a design tool that includes FDerivatives code, the new weight functions method and the continuity algorithm. FDerivatives code, developed at the LARCASE laboratory, is dedicated to the analytical and numerical calculations of the aerodynamic coefficients and their corresponding stability derivatives in the subsonic regime. It was developed as part of two research projects. The first project was initiated by CAE Inc. and the Consortium for Research and Innovation in Aerospace in Quebec (CRIAQ), and the second project was funded by NATO in the framework of the NATO RTO AVT-161 "Assessment of Stability and Control Prediction Methods for NATO Air and Sea Vehicles" program. Presagis gave the "Best Simulation Award" to the LARCASE laboratory for FDerivatives and data FLSIM applications. The new method, called the weight functions method, was used as an extension of the former project. Stability analysis of three different aircraft configurations was performed with the weight functions method and validated for longitudinal and lateral motions with the root locus method. The model, tested with the continuity algorithm, is the High Incidence Research Aircraft Model (HIRM) developed by the Swedish Defense Research Agency and implemented in the Aero-Data Model In Research Environment (ADMIRE).

  15. Advancing predictive models for particulate formation in turbulent flames via massively parallel direct numerical simulations

    PubMed Central

    Bisetti, Fabrizio; Attili, Antonio; Pitsch, Heinz

    2014-01-01

    Combustion of fossil fuels is likely to continue for the near future due to the growing trends in energy consumption worldwide. The increase in efficiency and the reduction of pollutant emissions from combustion devices are pivotal to achieving meaningful levels of carbon abatement as part of the ongoing climate change efforts. Computational fluid dynamics featuring adequate combustion models will play an increasingly important role in the design of more efficient and cleaner industrial burners, internal combustion engines, and combustors for stationary power generation and aircraft propulsion. Today, turbulent combustion modelling is hindered severely by the lack of data that are accurate and sufficiently complete to assess and remedy model deficiencies effectively. In particular, the formation of pollutants is a complex, nonlinear and multi-scale process characterized by the interaction of molecular and turbulent mixing with a multitude of chemical reactions with disparate time scales. The use of direct numerical simulation (DNS) featuring a state of the art description of the underlying chemistry and physical processes has contributed greatly to combustion model development in recent years. In this paper, the analysis of the intricate evolution of soot formation in turbulent flames demonstrates how DNS databases are used to illuminate relevant physico-chemical mechanisms and to identify modelling needs. PMID:25024412

  16. Advancing predictive models for particulate formation in turbulent flames via massively parallel direct numerical simulations.

    PubMed

    Bisetti, Fabrizio; Attili, Antonio; Pitsch, Heinz

    2014-08-13

    Combustion of fossil fuels is likely to continue for the near future due to the growing trends in energy consumption worldwide. The increase in efficiency and the reduction of pollutant emissions from combustion devices are pivotal to achieving meaningful levels of carbon abatement as part of the ongoing climate change efforts. Computational fluid dynamics featuring adequate combustion models will play an increasingly important role in the design of more efficient and cleaner industrial burners, internal combustion engines, and combustors for stationary power generation and aircraft propulsion. Today, turbulent combustion modelling is hindered severely by the lack of data that are accurate and sufficiently complete to assess and remedy model deficiencies effectively. In particular, the formation of pollutants is a complex, nonlinear and multi-scale process characterized by the interaction of molecular and turbulent mixing with a multitude of chemical reactions with disparate time scales. The use of direct numerical simulation (DNS) featuring a state of the art description of the underlying chemistry and physical processes has contributed greatly to combustion model development in recent years. In this paper, the analysis of the intricate evolution of soot formation in turbulent flames demonstrates how DNS databases are used to illuminate relevant physico-chemical mechanisms and to identify modelling needs.

  18. State of the art: diagnostic tools and innovative therapies for treatment of advanced thymoma and thymic carcinoma.

    PubMed

    Ried, Michael; Marx, Alexander; Götz, Andrea; Hamer, Okka; Schalke, Berthold; Hofmann, Hans-Stefan

    2016-06-01

    In this review article, state-of-the-art diagnostic tools and innovative treatments of thymoma and thymic carcinoma (TC) are described with special respect to advanced tumour stages. Complete surgical resection (R0) remains the standard therapeutic approach for almost all a priori resectable mediastinal tumours as defined by preoperative standard computed tomography (CT). If lymphoma or germ-cell tumours are differential diagnostic considerations, biopsy may be indicated. Resection status is the most important prognostic factor in thymoma and TC, followed by tumour stage. Advanced (Masaoka-Koga stage III and IVa) tumours require interdisciplinary therapy decisions based on distinctive findings of preoperative CT scan and ancillary investigations [magnetic resonance imaging (MRI)] to select cases for primary surgery or neoadjuvant strategies with optional secondary resection. In neoadjuvant settings, octreotide scans and histological evaluation of pretherapeutic needle biopsies may help to choose between somatostatin agonist/prednisolone regimens and neoadjuvant chemotherapy as first-line treatment. Finally, a multimodality treatment regime is recommended for advanced and unresectable thymic tumours. In conclusion, advanced stage thymoma and TC should preferably be treated in experienced centres in order to provide all modern diagnostic tools (imaging, histology) and innovative therapy techniques. Systemic and local (hyperthermic intrathoracic chemotherapy) medical treatments together with extended surgical resections have increased the therapeutic options in patients with advanced or recurrent thymoma and TC.

  19. Numerical Simulations of Optical Turbulence Using an Advanced Atmospheric Prediction Model: Implications for Adaptive Optics Design

    NASA Astrophysics Data System (ADS)

    Alliss, R.

    2014-09-01

    Optical turbulence (OT) acts to distort light in the atmosphere, degrading imagery from astronomical telescopes and reducing the data quality of optical imaging and communication links. Some of the degradation due to turbulence can be corrected by adaptive optics. However, the severity of optical turbulence, and thus the amount of correction required, is largely dependent upon the turbulence at the location of interest. Therefore, it is vital to understand the climatology of optical turbulence at such locations. In many cases, it is impractical and expensive to set up instrumentation to characterize the climatology of OT, so numerical simulations become a less expensive and more convenient alternative. The strength of OT is characterized by the refractive index structure parameter Cn2, which in turn is used to calculate atmospheric seeing parameters. While attempts have been made to characterize Cn2 using empirical models, Cn2 can be calculated more directly from Numerical Weather Prediction (NWP) simulations using pressure, temperature, thermal stability, vertical wind shear, turbulent Prandtl number, and turbulence kinetic energy (TKE). In this work we use the Weather Research and Forecasting (WRF) NWP model to generate Cn2 climatologies in the planetary boundary layer and free atmosphere, allowing for both point-to-point and ground-to-space seeing estimates of the Fried coherence length (r0) and other seeing parameters. Simulations are performed on a multi-node Linux cluster based on the Intel chip architecture. The WRF model is configured to run at 1 km horizontal resolution and is centered on the Mauna Loa Observatory (MLO) on the Big Island. The vertical resolution varies from 25 meters in the boundary layer to 500 meters in the stratosphere. The model top is 20 km. The Mellor-Yamada-Janjic (MYJ) TKE scheme has been modified to diagnose the turbulent Prandtl number as a function of the Richardson number, following observations by Kondo and others. This modification
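
    For reference, once a Cn2 profile is available (from WRF or otherwise), the Fried coherence length for a ground-to-space path follows from a standard path integral. The Python sketch below uses the usual plane-wave expression r0 = [0.423 k^2 sec(z) * integral(Cn2 dh)]^(-3/5); the Cn2 profile shown is a synthetic placeholder, not a WRF output.

        import numpy as np

        def fried_parameter(cn2, heights_m, wavelength_m=500e-9, zenith_angle_rad=0.0):
            # Fried coherence length r0 [m] for a ground-to-space path, plane-wave
            # form: r0 = [0.423 * k^2 * sec(z) * integral(Cn2 dh)]^(-3/5).
            k = 2.0 * np.pi / wavelength_m
            path_integral = np.trapz(cn2, heights_m)
            return (0.423 * k**2 * path_integral / np.cos(zenith_angle_rad)) ** (-3.0 / 5.0)

        # Synthetic Cn2 profile with plausible magnitudes (not a WRF product)
        heights = np.linspace(0.0, 20e3, 400)                 # [m] above the site
        cn2 = 1e-15 * np.exp(-heights / 1000.0) + 1e-17 * np.exp(-heights / 6000.0)
        print(f"r0 ~ {fried_parameter(cn2, heights) * 100:.1f} cm at 500 nm, zenith")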

  20. Advancing Satellite-Based Flood Prediction in Complex Terrain Using High-Resolution Numerical Weather Prediction

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Anagnostou, E. N.; Nikolopoulos, E. I.; Bartsotas, N. S.

    2015-12-01

    Floods constitute one of the most significant and frequent natural hazards in mountainous regions. Satellite-based precipitation products offer, in many cases, the only available source of quantitative precipitation estimates (QPE). However, satellite-based QPE over complex terrain suffers from significant bias that limits its applicability for hydrologic modeling. In this work we investigate the potential of a new correction procedure, which involves the use of high-resolution numerical weather prediction (NWP) model simulations to adjust satellite QPE. The adjustment is based on matching the probability distributions (pdfs) of the satellite and NWP (used as reference) precipitation. The impact of the correction procedure on the simulated hydrologic response is examined for 15 storm events that generated floods over the mountainous Upper Adige region of Northern Italy. Atmospheric simulations were performed at 1 km resolution with a state-of-the-art atmospheric model (RAMS/ICLAMS). The proposed error correction procedure was then applied to the widely used TRMM 3B42 satellite precipitation product, and the evaluation of the correction was based on independent in situ precipitation measurements from a dense rain gauge network (1 gauge / 70 km2) available in the study area. Satellite QPE, before and after correction, are used to simulate the flood response using ARFFS (Adige River Flood Forecasting System), a semi-distributed hydrologic model which is used for operational flood forecasting in the region. Results showed that the bias in satellite QPE before correction was significant and had a tremendous impact on the simulation of the flood peak; however, the correction procedure was able to reduce the bias in QPE and therefore considerably improve the simulated flood hydrograph.
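
    A common way to implement this kind of distribution (pdf) matching is empirical quantile mapping: each satellite value is replaced by the reference value at the same non-exceedance probability. The Python sketch below illustrates the idea on synthetic rainfall samples; the gamma-distributed data and variable names are assumptions, not the TRMM 3B42 or RAMS/ICLAMS data used in the study.

        import numpy as np

        def quantile_map(satellite, reference):
            # Empirical CDF matching: map each satellite value to the reference
            # value at the same non-exceedance probability.
            satellite = np.asarray(satellite, dtype=float)
            reference = np.asarray(reference, dtype=float)
            ranks = np.searchsorted(np.sort(satellite), satellite, side="right") / satellite.size
            return np.quantile(reference, np.clip(ranks, 0.0, 1.0))

        # Synthetic example where the satellite underestimates rainfall
        rng = np.random.default_rng(2)
        nwp_rain = rng.gamma(shape=0.9, scale=8.0, size=5000)      # reference [mm]
        sat_rain = 0.6 * rng.gamma(shape=0.9, scale=8.0, size=5000)
        corrected = quantile_map(sat_rain, nwp_rain)
        print(sat_rain.mean(), corrected.mean(), nwp_rain.mean())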

  1. Advanced Hydraulic Tomography Analysis Strategies--A Numerical Study based on Field Observations

    NASA Astrophysics Data System (ADS)

    Tso, C. M.; Yeh, T. J.

    2013-12-01

    This report presents a discussion of some of the unexplored issues pertaining to the application of hydraulic tomography to interpret pumping test data collected in the field. Using numerical experiments, we probe a few new strategies for analyzing pumping test results for multi-layer aquifers. First of all, we study the averaging of heads over packer intervals of a wellbore: how does the length of the packers reduce the resolution of the estimated hydraulic conductivity (K) field? Next we investigate the effect of conditioning the estimated K field on hard data (a.k.a. primary information or K measurements). Does the conditioning constrain the solution better and, if so, by how much? Then we examine the effect of the initial guess of the K field on the inversion results. Currently, our hydraulic tomography approaches (SSLE, Yeh and Liu (2000), and SimSLE, Xiang et al. (2009)) assume a homogeneous K field as the initial guess by default. What if we use a random field as the initial guess? What about assigning different zones in the domain and designating a different homogeneous initial guess value for each of them? Finally, updating and storing the covariance matrix heavily consumes computation time during the inversion process and can sometimes be prohibitive when solving large problems. In fact, it is often the most time-consuming part of the hydraulic tomography analysis. We study the effects on the hydraulic tomography results of (1) whether the covariance matrix is updated after each iteration and (2) whether the full matrix or only its diagonal terms are stored. The investigation outlined above will shed light on the development of more effective and reliable hydraulic tomography analysis practices and algorithms.
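
    To give a feel for why the covariance question above matters, the short Python sketch below compares the double-precision memory footprint of storing the full N x N parameter covariance against storing only its diagonal; the problem sizes are arbitrary examples, and actual inversion codes face the analogous O(N^2) versus O(N) update cost as well.

        def covariance_storage_gb(n_unknowns, full=True):
            # Memory (GB) to hold the parameter covariance in double precision:
            # full N x N matrix versus diagonal-only approximation.
            n_entries = n_unknowns**2 if full else n_unknowns
            return n_entries * 8 / 1e9

        for n in (10_000, 100_000, 1_000_000):
            print(n,
                  f"full: {covariance_storage_gb(n):,.2f} GB",
                  f"diagonal: {covariance_storage_gb(n, full=False):.6f} GB")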

  2. Numerical Investigation of a Cascaded Longitudinal Space-Charge Amplifier at the Fermilab's Advanced Superconducting Test Accelerator

    SciTech Connect

    Halavanau, A.; Piot, P.

    2015-06-01

    In a cascaded longitudinal space-charge amplifier (LSCA), initial density noise in a relativistic e-beam is amplified via the interplay of longitudinal space-charge forces and properly located dispersive sections. This type of amplification process was shown to potentially result in large final density modulations [1] compatible with the production of broadband electromagnetic radiation. The technique was recently demonstrated in the optical domain [2]. In this paper we investigate, via numerical simulations, the performance of a cascaded LSCA beamline at Fermilab's Advanced Superconducting Test Accelerator (ASTA). We especially explore the properties of the produced broadband radiation. Our studies have been conducted with a grid-less three-dimensional space-charge algorithm.

  3. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote the interest of the public in these areas. These educational tools can benefit from new advanced computer animation software and 3D technologies, as these make such documentaries even more attractive. However, special care must be taken in order to guarantee that the information contained in them is serious and objective. In this sense, additional value is provided when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, which has been entirely developed by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  4. Advancing Efficient All-Electron Electronic Structure Methods Based on Numeric Atom-Centered Orbitals for Energy Related Materials

    NASA Astrophysics Data System (ADS)

    Blum, Volker

    This talk describes recent advances in a general, efficient, accurate all-electron electronic structure theory approach based on numeric atom-centered orbitals; emphasis is placed on developments related to materials for energy conversion and their discovery. For total energies and electron band structures, we show that the overall accuracy is on par with the best benchmark-quality codes for materials, but scalable to large system sizes (1,000s of atoms) and amenable to both periodic and non-periodic simulations. A recent localized resolution-of-identity approach for the Coulomb operator enables O(N) hybrid-functional-based descriptions of the electronic structure of non-periodic and periodic systems, shown for supercell sizes up to 1,000 atoms; the same approach yields accurate results for many-body perturbation theory as well. For molecular systems, we also show how many-body perturbation theory for charged and neutral quasiparticle excitation energies can be applied efficiently yet accurately using basis sets of computationally manageable size. Finally, the talk highlights applications to the electronic structure of hybrid organic-inorganic perovskite materials, as well as to graphene-based substrates for possible future transition metal compound based electrocatalyst materials. All methods described here are part of the FHI-aims code. VB gratefully acknowledges contributions by numerous collaborators at Duke University, Fritz Haber Institute Berlin, TU Munich, USTC Hefei, Aalto University, and many others around the globe.

  5. Numerical Viscous Flow Analysis of an Advanced Semispan Diamond-Wing Model at High-Lift Conditions

    NASA Technical Reports Server (NTRS)

    Ghaffari, F.; Biedron, R. T.; Luckring, J. M.

    2002-01-01

    Turbulent Navier-Stokes computational results are presented for an advanced diamond wing semispan model at low-speed, high-lift conditions. The numerical results are obtained in support of a wind-tunnel test that was conducted in the National Transonic Facility (NTF) at the NASA Langley Research Center. The model incorporated a generic fuselage and was mounted on the tunnel sidewall using a constant-width standoff. The analyses include: (1) numerical simulation of the empty-tunnel flow characteristics of the NTF; (2) the semispan high-lift model with the standoff in the tunnel environment; (3) the semispan high-lift model with the standoff and viscous sidewall in free air; and (4) the semispan high-lift model without the standoff in free air. The computations were performed at conditions that correspond to a nominal approach and landing configuration. The wing surface pressure distributions computed for the model in both the tunnel and in free air agreed well with the corresponding experimental data, and both indicated small increments due to the wall interference effects. However, the wall interference effects were found to be more pronounced in the total measured and computed lift, drag and pitching moment due to standoff-induced up-flow effects. Although the magnitudes of the computed forces and moment were slightly off compared to the measured data, the increments due to the wall interference effects were predicted well. Numerical predictions are also presented for the combined effects of the tunnel sidewall boundary layer and the standoff geometry on the fuselage fore-body pressure distributions and the resulting impact on the overall configuration longitudinal aerodynamic characteristics.

  6. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  7. 3-D Numerical Modeling as a Tool for Managing Mineral Water Extraction from a Complex Groundwater Basin in Italy

    NASA Astrophysics Data System (ADS)

    Zanini, A.; Tanda, M.

    2007-12-01

    The groundwater in Italy plays an important role as drinking water; in fact, it covers about 30% of the national demand (70% in Northern Italy). The distribution of mineral water in Italy is an important business with increasing demand from abroad. The mineral water companies have a great interest in increasing water extraction, but because of the delicate and complex geology of the subsoil in which such high-quality waters are contained, particular attention must be paid in order to avoid an excessive lowering of the groundwater reservoirs or large changes in the groundwater flow directions. A large water company asked our University to set up a numerical model of the groundwater basin, in order to obtain a useful tool which allows it to evaluate the strength of the aquifer and to design new extraction wells. The study area is located along the Appennini Mountains and covers a surface of about 18 km2; the topography ranges from 200 to 600 m a.s.l. In ancient times only a spring with naturally sparkling water was known in the area, but at present the mineral water is extracted from deep pumping wells. The area is characterized by a very complex geology: the subsoil structure is described by a sequence of layers of silt-clay, marl-clay, travertine and alluvial deposits. Different groundwater layers are present, and the one with the best quality flows in the travertine layer; the natural flow rate seems not to be subject to seasonal variations. The water age analysis revealed very old water, which means that the mineral aquifers are not directly connected with the meteoric recharge. The Company's geologists suggest that the water supply of the mineral aquifers comes from a carbonate unit located in the deep layers of the mountains bordering the spring area. The valley is crossed by a river that shows no connection to the mineral aquifers. Inside the area there are about 30 pumping wells that extract water at different depths. We built a 3

  8. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, for which analytical tools are specifically sought, is a regime that is currently handled empirically for the Space Shuttle External Tank (ET) by simulated service testing of pre-cracked panels.
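
    As a simple point of reference for what a crack-growth life prediction tool computes, the Python sketch below integrates the linear-elastic Paris law from an initial to a critical crack length. It is deliberately the conventional LEFM calculation, not the elastic-plastic analysis the presentation is concerned with, and the material constants and loading are illustrative assumptions.

        import numpy as np

        def cycles_to_failure(a0_m, ac_m, delta_sigma_pa, c, m,
                              geometry_factor=1.0, n_steps=20000):
            # Integrate the Paris law da/dN = C * (dK)^m between crack lengths,
            # with dK = Y * dSigma * sqrt(pi * a) (linear-elastic approximation).
            a = np.linspace(a0_m, ac_m, n_steps)
            delta_k = geometry_factor * delta_sigma_pa * np.sqrt(np.pi * a)  # [Pa*m^0.5]
            dn_da = 1.0 / (c * delta_k**m)
            return np.trapz(dn_da, a)                                        # cycles

        # Illustrative aluminium-like constants (C given for dK in Pa*m^0.5)
        n_life = cycles_to_failure(a0_m=1e-3, ac_m=20e-3, delta_sigma_pa=100e6,
                                   c=1e-29, m=3.0)
        print(f"estimated life: {n_life:.2e} cycles")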

  9. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about: (1) a framework for multi-body dynamics, the Geometry Manipulation Protocol (GMP); (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2; (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, and the script library); and (4) future work.

  10. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  11. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxy...

  12. CRISPR/Cas9: an advanced tool for editing plant genomes.

    PubMed

    Samanta, Milan Kumar; Dey, Avishek; Gayen, Srimonta

    2016-10-01

    To meet current challenges in agriculture, genome editing using sequence-specific nucleases (SSNs) is a powerful tool for basic and applied plant biology research. Here, we describe the principle and application of the available genome editing tools, including zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs) and the clustered regularly interspaced short palindromic repeat (CRISPR)-associated Cas9 (CRISPR/Cas9) system. Among these SSNs, CRISPR/Cas9 is the most recently characterized and rapidly developing genome editing technology, and has been successfully utilized in a wide variety of organisms. This review specifically illustrates the power of CRISPR/Cas9 as a tool for plant genome engineering, and describes the strengths and weaknesses of the CRISPR/Cas9 technology compared to two well-established genome editing tools, ZFNs and TALENs. PMID:27012546

  13. Advancing lighting and daylighting simulation: The transition from analysis to design aid tools

    SciTech Connect

    Hitchcock, R.J.

    1995-05-01

    This paper explores three significant software development requirements for making the transition from stand-alone lighting simulation/analysis tools to simulation-based design aid tools. These requirements include specialized lighting simulation engines, facilitated methods for creating detailed simulatable building descriptions, and automated techniques for providing lighting design guidance. Initial computer implementations meant to address each of these requirements are discussed to further elaborate these requirements and to illustrate work in progress.

  14. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  15. Advanced repair solution of clear defects on HTPSM by using nanomachining tool

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Kim, Munsik; Jung, Hoyong; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    As mask specifications become tighter for low-k1 lithography, more aggressive repair accuracy is required below the sub-20 nm technology node. To meet tight defect specifications, many maskshops select effective repair tools according to defect type. Normally, pattern defects are repaired by the e-beam repair tool and soft defects such as particles are repaired by the nanomachining tool. It is difficult for an e-beam repair tool to remove particle defects because it relies on a chemical reaction between gas and electrons, while a nanomachining tool, which uses a physical interaction between a nano-tip and defects, cannot be applied to repairing clear defects. Generally, a film deposition process is widely used for repairing clear defects. However, the deposited film has weak cleaning durability, so it is easily removed by accumulated cleaning processes. Although the deposited film is strongly attached to the MoSiN (or Qz) film, the adhesive strength between the deposited Cr film and the MoSiN (or Qz) film becomes progressively weaker due to the energy accumulated when masks are exposed in a scanner tool, because of the different coefficients of thermal expansion of the materials. Therefore, whenever a re-pellicle process is needed for a mask, all deposited repair points have to be inspected to confirm whether the deposited films are damaged; if a deposition point is damaged, the repair process has to be performed again, which makes the overall process longer and more complex. In this paper, the basic theory and principle of recovering clear defects by using a nanomachining tool are introduced, and the evaluated results are reviewed for dense line (L/S) patterns and contact hole (C/H) patterns. The results using nanomachining are also compared with those using an e-beam repair tool, including the cleaning durability evaluated by the accumulated cleaning process. In addition, we discuss the phase shift issue and the solution to the image placement error caused by phase error.

  16. Numerical study on the splitting of a vapor bubble in the ultrasonic assisted EDM process with the curved tool and workpiece.

    PubMed

    Shervani-Tabar, M T; Seyed-Sadjadi, M H; Shabgard, M R

    2013-01-01

    Electrical discharge machining (EDM) is a powerful and modern method of machining. In the EDM process, a vapor bubble is generated between the tool and the workpiece in the dielectric liquid due to an electrical discharge. The dynamic behavior of this vapor bubble affects the machining process, and vibration of the tool surface affects the bubble behavior and consequently the material removal rate (MRR). In this paper, the dynamic behavior of the vapor bubble in an ultrasonic assisted EDM process after the appearance of the necking phenomenon is investigated. The necking phenomenon occurs when the bubble takes the shape of an hourglass. After necking appears, the vapor bubble splits into two parts and two liquid jets develop on the boundaries of the upper and lower parts of the bubble. The jet developed on the upper part impinges on the tool and the jet developed on the lower part impinges on the workpiece. These liquid jets evacuate debris from the gap between the tool and the workpiece and also erode the workpiece and the tool. The curvature of the tool and workpiece affects the shape and velocity of the liquid jets during splitting of the vapor bubble. The dynamics of the vapor bubble after splitting near the curved tool and workpiece is investigated for three cases: in the first case the surfaces of the tool and workpiece are flat, in the second they are convex, and in the third they are concave. Numerical results show that in the concave case the liquid jets developed on the upper and lower parts of the split bubble have the highest velocities and broader shapes than in the other cases.
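    The paper's axisymmetric boundary-element simulation is not reproduced here, but the sketch below illustrates the simpler, spherically symmetric bubble-dynamics problem (the Rayleigh-Plesset equation) that underlies such studies; all parameter values are assumed for demonstration only.

    ```python
    # Minimal sketch: spherical bubble dynamics via the Rayleigh-Plesset equation.
    # This is an illustrative stand-in, NOT the paper's axisymmetric boundary-element
    # model; all parameter values are assumed for demonstration only.
    import numpy as np
    from scipy.integrate import solve_ivp

    rho   = 998.0      # dielectric liquid density, kg/m^3 (assumed water-like)
    p_inf = 101325.0   # ambient pressure, Pa
    p_b   = 2.0e5      # assumed initial bubble gas pressure, Pa
    R0    = 50e-6      # initial bubble radius, m

    def rayleigh_plesset(t, y):
        R, Rdot = y
        # Gas inside the bubble assumed to behave adiabatically (gamma = 1.4).
        p_gas = p_b * (R0 / R) ** (3 * 1.4)
        Rddot = (p_gas - p_inf) / (rho * R) - 1.5 * Rdot**2 / R
        return [Rdot, Rddot]

    sol = solve_ivp(rayleigh_plesset, (0.0, 30e-6), [R0, 0.0], max_step=1e-8)
    print(f"max radius: {sol.y[0].max()*1e6:.1f} um")
    ```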

  18. Handbook of Research on Hybrid Learning Models: Advanced Tools, Technologies, and Applications

    ERIC Educational Resources Information Center

    Wang, Fu Lee, Ed.; Fong, Joseph, Ed.; Kwan, Reggie, Ed.

    2010-01-01

    Hybrid learning is now the single-greatest trend in education today due to the numerous educational advantages when both traditional classroom learning and e-learning are implemented collectively. This handbook collects emerging research and pedagogies related to the convergence of teaching and learning methods. This significant "Handbook of…

  19. A Clinical Assessment Tool for Advanced Theory of Mind Performance in 5 to 12 Year Olds

    ERIC Educational Resources Information Center

    O'Hare, Anne E.; Bremner, Lynne; Nash, Marysia; Happe, Francesca; Pettigrew, Luisa M.

    2009-01-01

    One hundred forty typically developing 5- to 12-year-old children were assessed with a test of advanced theory of mind employing Happe's strange stories. There was no significant difference in performance between boys and girls. The stories discriminated performance across the different ages with the lowest performance being in the younger…

  20. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  1. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced state-of-the-art technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs [azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, m...

  2. Advanced Technologies as Educational Tools in Science: Concepts, Applications, and Issues. Monograph Series Number 8.

    ERIC Educational Resources Information Center

    Kumar, David D.; And Others

    Systems incorporating two advanced technologies, hypermedia systems and intelligent tutors, are examined with respect to their potential impact on science education. The conceptual framework underlying these systems is discussed first. Applications of systems are then presented with examples of each in operation within the context of science…

  3. Just-in-Time Teaching: A Tool for Enhancing Student Engagement in Advanced Foreign Language Learning

    ERIC Educational Resources Information Center

    Abreu, Laurel; Knouse, Stephanie

    2014-01-01

    Scholars have indicated a need for further research on effective pedagogical strategies designed for advanced foreign language courses in the postsecondary setting, especially in light of decreased enrollments at this level and the elimination of foreign language programs altogether in some institutions (Paesani & Allen, 2012). This article…

  4. Recent advances in microbial production of fuels and chemicals using tools and strategies of systems metabolic engineering.

    PubMed

    Cho, Changhee; Choi, So Young; Luo, Zi Wei; Lee, Sang Yup

    2015-11-15

    The advent of various systems metabolic engineering tools and strategies has enabled more sophisticated engineering of microorganisms for the production of industrially useful fuels and chemicals. Advances in systems metabolic engineering have been made in overproducing natural chemicals and in producing novel non-natural chemicals. In this paper, we review the tools and strategies of systems metabolic engineering employed for the development of microorganisms for the production of various industrially useful chemicals, including fuels, building block chemicals, and specialty chemicals, focusing in particular on those reported in the last three years. The review aims to provide the current landscape of systems metabolic engineering and to suggest directions for addressing future challenges in establishing processes for the bio-based production of fuels and chemicals from renewable resources.

  5. Advanced Algorithms and Automation Tools for Discrete Ordinates Methods in Parallel Environments

    SciTech Connect

    Alireza Haghighat

    2003-05-07

    This final report discusses major accomplishments of a 3-year project under the DOE's NEER Program. The project has developed innovative and automated algorithms, codes, and tools for solving the discrete ordinates particle transport method efficiently in parallel environments. Using a number of benchmark and real-life problems, the performance and accuracy of the new algorithms have been measured and analyzed.

  6. Continuous Symmetry and Chemistry Teachers: Learning Advanced Chemistry Content through Novel Visualization Tools

    ERIC Educational Resources Information Center

    Tuvi-Arad, Inbal; Blonder, Ron

    2010-01-01

    In this paper we describe the learning process of a group of experienced chemistry teachers in a specially designed workshop on molecular symmetry and continuous symmetry. The workshop was based on interactive visualization tools that allow molecules and their symmetry elements to be rotated in three dimensions. The topic of continuous symmetry is…

  7. Advances in omics and bioinformatics tools for systems analyses of plant functions.

    PubMed

    Mochida, Keiichi; Shinozaki, Kazuo

    2011-12-01

    Omics and bioinformatics are essential to understanding the molecular systems that underlie various plant functions. Recent game-changing sequencing technologies have revitalized sequencing approaches in genomics and have produced opportunities for various emerging analytical applications. Driven by technological advances, several new omics layers such as the interactome, epigenome and hormonome have emerged. Furthermore, in several plant species, the development of omics resources has progressed to address particular biological properties of individual species. Integration of knowledge from omics-based research is an emerging issue as researchers seek to identify significance, gain biological insights and promote translational research. From these perspectives, we provide this review of the emerging aspects of plant systems research based on omics and bioinformatics analyses together with their associated resources and technological advances.

  8. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
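    The swath-footprint discovery problem described above can be illustrated with a minimal bounding-box prefilter in plain Python; the footprint records, field names, and query region below are hypothetical, and the production TCIS system instead relies on one of the evaluated database back-ends (MySQL, MySQL+Solr, MongoDB, or PostgreSQL).

    ```python
    # Minimal sketch of a bounding-box prefilter for swath-footprint discovery.
    # Footprint records and the query region are hypothetical placeholders.
    footprints = [
        {"file": "swath_001.h5", "lon_min": -80.0, "lon_max": -70.0,
         "lat_min": 20.0, "lat_max": 30.0},
        {"file": "swath_002.h5", "lon_min": -60.0, "lon_max": -50.0,
         "lat_min": 10.0, "lat_max": 18.0},
    ]

    def bbox_intersects(fp, lon_min, lon_max, lat_min, lat_max):
        """True if the footprint's bounding box overlaps the query box."""
        return not (fp["lon_max"] < lon_min or fp["lon_min"] > lon_max or
                    fp["lat_max"] < lat_min or fp["lat_min"] > lat_max)

    # Query: which files overlap a storm-centered box?
    hits = [fp["file"] for fp in footprints
            if bbox_intersects(fp, -75.0, -65.0, 22.0, 28.0)]
    print(hits)   # -> ['swath_001.h5']
    ```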

  9. Anvil Forecast Tool in the Advanced Weather Interactive Processing System, Phase II

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display Systems (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input.

  10. ADVANCEMENT OF NUCLEIC ACID-BASED TOOLS FOR MONITORING IN SITU REDUCTIVE DECHLORINATION

    SciTech Connect

    Vangelas, K.; Edwards, Elizabeth; Loffler, Frank; Looney, Brian

    2006-11-17

    Regulatory protocols generally recognize that destructive processes are the most effective mechanisms that support natural attenuation of chlorinated solvents. In many cases, these destructive processes will be biological processes and, for chlorinated compounds, will often be reductive processes that occur under anaerobic conditions. The existing EPA guidance (EPA, 1998) provides a list of parameters that provide indirect evidence of reductive dechlorination processes. In an effort to gather direct evidence of these processes, scientists have identified key microorganisms and are currently developing tools to measure the abundance and activity of these organisms in subsurface systems. Drs. Edwards and Loffler are two recognized leaders in this field. The research described herein continues their development efforts to provide a suite of tools to enable direct measures of biological processes related to the reductive dechlorination of TCE and PCE. This study investigated the strengths and weaknesses of the 16S rRNA gene-based approach to characterizing the natural attenuation capabilities in samples. The results suggested that an approach based solely on 16S rRNA may not provide sufficient information to document the natural attenuation capabilities in a system because it does not distinguish between strains of organisms that have different biodegradation capabilities. The results of the investigations provided evidence that tools focusing on relevant enzymes for functionally desired characteristics may be useful adjuncts to the 16S rRNA methods.

  11. From beginners to trained users: an advanced tool to guide experimenters in basic applied fluorescence

    NASA Astrophysics Data System (ADS)

    Pingand, Philippe B.; Lerner, Dan A.

    1993-05-01

    UPY-F is a software package dedicated to solving various queries raised by end-users of spectrofluorimeters when they encounter a problem in the course of an experiment. The main goal is to provide a diagnostic for non-pertinent use of a spectrofluorimeter. Many artifacts can mislead the operator and, except for experts, simple manipulation of the controls of a fluorimeter produces effects that are not always fully appreciated. The solution retained is an association between a powerful hypermedia tool and an expert system. A straight expert system offers a number of well-known advantages, but it is not well accepted by users because of the many moves required between the spectrofluorimeter and the diagnostic tool. In our hypermedia tool, knowledge is displayed by means of visual concepts through which one can browse and navigate; the user still perceives the problem as a whole, which may not be the case with a straight expert system. We demonstrate typical situations in which an event triggers a chain of reasoning leading to the debugging of the problem. The system is not only meant to help beginners but can also adapt itself to guide a well-trained experimenter. We think that its functionalities and user-friendly interface are very attractive and open new vistas in the way future users may be trained, whether they work in research labs or industrial settings, since it could cut down on the time spent on their training.

  12. A US Perspective on Selected Biotechnological Advancements in Fish Health Part II: Genetic stock improvement, biosecurity tools and alternative protein sources in fish diets

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remarkable biotechnological advancements have been made in the aquaculture industry in the past five years. Advancements, in areas such as fish vaccines, improved genetic stock, biosecurity tools and alternative protein sources in fish diets, are necessary to meet the rapid growth of the aquacultur...

  13. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and displays the predicted performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values agree well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott d-index (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
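    As a hedged illustration of the two goodness-of-fit metrics quoted above, the snippet below computes the mean relative error and Willmott's index of agreement from hypothetical observed/predicted arrays, assuming the standard definitions of both metrics.

    ```python
    # Sketch: the two fit metrics reported for the PWWT.VB model, computed from
    # hypothetical observed/predicted arrays using their standard definitions.
    import numpy as np

    observed  = np.array([0.82, 0.77, 0.69, 0.91, 0.85])   # hypothetical data
    predicted = np.array([0.80, 0.79, 0.66, 0.93, 0.84])

    # Mean relative error
    re = np.mean(np.abs(predicted - observed) / np.abs(observed))

    # Willmott's index of agreement:
    # d = 1 - sum((P-O)^2) / sum((|P - Obar| + |O - Obar|)^2)
    obar = observed.mean()
    d = 1.0 - np.sum((predicted - observed) ** 2) / \
              np.sum((np.abs(predicted - obar) + np.abs(observed - obar)) ** 2)

    print(f"RE = {re:.3f}, Willmott d = {d:.3f}")
    ```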

  15. Numerical and structural aberrations in advanced neuroblastoma tumours by CGH analysis; survival correlates with chromosome 17 status

    PubMed Central

    Cunsolo, C Lo; Bicocchi, M P; Petti, A R; Tonini, G P

    2000-01-01

    Rapid tumour progression in neuroblastoma is associated with MYCN amplification, deletion of the short arm of chromosome 1 and gain of 17q. However, patients with advanced disease without MYCN amplification and/or 1p deletion also have a very poor outcome, which suggests that other genetic defects may predict an unfavourable prognosis. We employed CGH to study 22 tumours of patients at stages 3 and 4 over one year of age (6 and 16 cases respectively). Patients were divided into two groups: (A) long-term survivors and (B) short-term survivors. CGH showed a total of 226 chromosome imbalances (110 in group A and 116 in group B). The neuroblastoma cells of long-term survivors showed a preponderance of numerical aberrations (54% vs 43%), particularly gains of entire chromosomes 1 (P < 0.03), 7 (P < 0.04) and 19 (P < 0.05). An extra copy of chromosome 17 was detected in 6/8 (75%) samples of group A and only 1/14 (7%) samples of group B (P < 0.002). Conversely, tumours of patients who died from disease progression displayed a higher frequency of structural abnormalities (43% vs 35%), including loss of 1p, 9p, 11q, 15q and 18q and gain of 12q, although the difference was not significant (P = 0.24). Unbalanced gain of 17q was detected in 8/14 (57%) tumours of group B and only 1/8 (13%) tumours of group A (P < 0.05). The genetic differences observed in the tumours of long- and short-term survivors may have prognostic relevance. PMID:11044353

  16. Advanced techniques in IR thermography as a tool for the pest management professional

    NASA Astrophysics Data System (ADS)

    Grossman, Jon L.

    2006-04-01

    Within the past five years, the pest management industry has become aware that IR thermography can aid in the detection of pest infestations and locate other conditions that are within the purview of the industry. This paper reviews the applications that can be utilized by the pest management professional and discusses the advanced techniques that may be required, in conjunction with thermal imaging, to locate insect and other pest infestations and moisture within structures, to verify data, and to address the special challenges associated with the inspection process.

  17. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Beers, Benjamin; Philips, Alan; Holt, James B.; Threet, Grady E., Jr.

    2013-01-01

    The Earth to Orbit (ETO) Team of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the preeminent group for pre-phase A and phase A concept definition. The ACO team has been at the forefront of a multitude of launch vehicle studies determining the future direction of the Agency as a whole due, in part, to its rapid turnaround time in analyzing concepts and its ability to cover broad trade spaces of vehicles in that limited timeframe. Each completed vehicle concept includes a full mass breakdown of each vehicle to tertiary subsystem components, along with a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta-v capability. Additionally, a structural analysis of the vehicle based on material properties and geometries is performed, as well as an analysis to determine the flight loads based on the trajectory outputs. As mentioned, the ACO Earth to Orbit Team prides itself on rapid turnaround and often must fulfill customer requests within a limited schedule or on little advance notice. Working in this fast-paced environment, the ETO team has developed finely honed skills and methods to maximize its delivery capability and meet customer needs. This paper describes the interfaces between the three primary disciplines used in the design process (weights and sizing, trajectory, and structural analysis), as well as the approach each discipline employs to streamline its piece of the design process.

  18. Recent advances in developing molecular tools for targeted genome engineering of mammalian cells.

    PubMed

    Lim, Kwang-il

    2015-01-01

    Various biological molecules naturally existing in diversified species including fungi, bacteria, and bacteriophage have functionalities for DNA binding and processing. These biological molecules have recently been actively engineered for use in customized genome editing of mammalian cells as the molecule-encoding DNA sequence information and the underlying mechanisms by which the molecules work are unveiled. Excitingly, multiple novel methods based on the newly constructed artificial molecular tools have enabled modifications of specific endogenous genetic elements in the genome context at efficiencies much higher than those of conventional homologous recombination-based methods. This minireview introduces the most recently spotlighted molecular genome engineering tools with their key features and ongoing modifications for better performance. Such ongoing efforts have mainly focused on the removal of the inherent DNA sequence recognition rigidity from the original molecular platforms, the addition of newly tailored targeting functions into the engineered molecules, and the enhancement of their targeting specificity. Effective targeted genome engineering of mammalian cells will enable not only sophisticated genetic studies in the context of the genome, but also widely applicable universal therapeutics based on the pinpointing and correction of the disease-causing genetic elements within the genome in the near future.

  19. Neuron-Miner: An Advanced Tool for Morphological Search and Retrieval in Neuroscientific Image Databases.

    PubMed

    Conjeti, Sailesh; Mesbah, Sepideh; Negahdar, Mohammadreza; Rautenberg, Philipp L; Zhang, Shaoting; Navab, Nassir; Katouzian, Amin

    2016-10-01

    The steadily growing amount of digital neuroscientific data demands a reliable, systematic, and computationally effective retrieval algorithm. In this paper, we present Neuron-Miner, a tool for fast and accurate reference-based retrieval within neuron image databases. The proposed algorithm is built upon a hashing (search and retrieval) technique employing multiple unsupervised random trees, collectively called Hashing Forests (HF). The HF are trained to parse the neuromorphological space hierarchically and to preserve the inherent neuron neighborhoods while encoding with compact binary codewords. We further introduce an inverse-coding formulation within HF to effectively mitigate pairwise neuron similarity comparisons, thus allowing scalability to massive databases with little additional time overhead. The proposed hashing tool approximates the true neuromorphological neighborhood better, with improved retrieval and ranking performance, in comparison to existing generalized hashing methods. This is exhaustively validated by quantifying the results over 31266 neuron reconstructions from the Neuromorpho.org dataset curated from 147 different archives. We envisage that finding and ranking similar neurons through reference-based querying via Neuron-Miner will assist neuroscientists in objectively understanding the relationship between neuronal structure and function for applications in comparative anatomy or diagnosis. PMID:27155864
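    The following sketch conveys the flavor of tree-based binary codes with Hamming-distance retrieval; it uses scikit-learn's RandomTreesEmbedding as a stand-in and random placeholder feature vectors, and is not the authors' Hashing Forests implementation.

    ```python
    # Sketch in the spirit of tree-based binary codes with Hamming-distance retrieval.
    # RandomTreesEmbedding is a stand-in, NOT the authors' Hashing Forests code, and
    # the feature vectors are random placeholders for neuron morphometric descriptors.
    import numpy as np
    from sklearn.ensemble import RandomTreesEmbedding

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 20))          # 500 neurons x 20 morphometric features

    encoder = RandomTreesEmbedding(n_estimators=16, max_depth=5, random_state=0)
    codes = encoder.fit_transform(X).toarray().astype(np.uint8)   # binary codewords

    def retrieve(query_idx, k=5):
        """Return indices of the k nearest neurons by Hamming distance."""
        dist = np.count_nonzero(codes != codes[query_idx], axis=1)
        order = np.argsort(dist)
        return order[order != query_idx][:k]

    print(retrieve(0))
    ```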

  20. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  2. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  4. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  5. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    NASA Astrophysics Data System (ADS)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

    Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time- and cost-consuming. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications on using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.
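    As an illustration of the PoD analysis into which such simulated signals would feed, the sketch below fits a hit/miss PoD curve by logistic regression on log crack size and extracts the a90 value; the data are hypothetical and the standard log-odds PoD model is assumed rather than the specific analysis used in this study.

    ```python
    # Sketch of a hit/miss PoD fit: logistic regression of detection outcome on
    # log(crack size), then the size detected with 90% probability (a90).
    # Crack sizes and outcomes are hypothetical; the standard log-odds PoD model
    # is assumed, not the specific analysis used in this study.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    a   = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5, 2.0])  # crack size, mm
    hit = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ])  # detected?

    X = np.log(a).reshape(-1, 1)
    model = LogisticRegression(C=1e6).fit(X, hit)          # near-unpenalised fit

    b0, b1 = model.intercept_[0], model.coef_[0, 0]
    a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)            # size where PoD = 0.9
    print(f"a90 ~ {a90:.2f} mm")
    ```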

  6. QCanvas: An Advanced Tool for Data Clustering and Visualization of Genomics Data.

    PubMed

    Kim, Nayoung; Park, Herin; He, Ningning; Lee, Hyeon Young; Yoon, Sukjoon

    2012-12-01

    We developed a user-friendly, interactive program to simultaneously cluster and visualize omics data, such as DNA and protein array profiles. This program provides diverse algorithms for the hierarchical clustering of two-dimensional data. The clustering results can be interactively visualized and optimized on a heatmap. The present tool does not require any prior knowledge of scripting languages to carry out the data clustering and visualization. Furthermore, the heatmaps allow the selective display of data points satisfying user-defined criteria. For example, a clustered heatmap of experimental values can be differentially visualized based on statistical values, such as p-values. Including diverse menu-based display options, QCanvas provides a convenient graphical user interface for pattern analysis and visualization with high-quality graphics.
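    The generic workflow that QCanvas wraps in a graphical interface, hierarchical clustering of a two-dimensional matrix followed by a reordered heatmap, can be sketched with SciPy and Matplotlib as below; this is not QCanvas code and the data matrix is random.

    ```python
    # Sketch of the generic workflow QCanvas wraps in a GUI: hierarchically cluster
    # a two-dimensional omics matrix and display the reordered heatmap.
    # This is NOT QCanvas code; the data matrix here is random.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.cluster.hierarchy import linkage, leaves_list

    rng = np.random.default_rng(1)
    data = rng.normal(size=(30, 12))                  # 30 genes x 12 samples

    row_order = leaves_list(linkage(data,   method="average", metric="euclidean"))
    col_order = leaves_list(linkage(data.T, method="average", metric="euclidean"))
    clustered = data[np.ix_(row_order, col_order)]

    plt.imshow(clustered, aspect="auto", cmap="RdBu_r")
    plt.colorbar(label="expression (z-score)")
    plt.xlabel("samples (clustered)"); plt.ylabel("genes (clustered)")
    plt.savefig("heatmap.png", dpi=150)
    ```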

  7. Advances in ion trap mass spectrometry: Photodissociation as a tool for structural elucidation

    SciTech Connect

    Stephenson, J.L. Jr.; Booth, M.M.; Eyler, J.R.; Yost, R.A.

    1995-12-01

    Photo-induced dissociation (PID) is the next most frequently used method (after collisional activation) for activation of polyatomic ions in tandem mass spectrometry. The range of internal energies present after the photon absorption process is much narrower than that obtained with collisional energy transfer; therefore, the usefulness of PID for the study of ion structures is greatly enhanced. The long storage times and instrumental configuration of the ion trap mass spectrometer are ideally suited for photodissociation experiments. This presentation focuses on both the fundamental and analytical applications of CO2 lasers in conjunction with ion trap mass spectrometry. The first portion examines the fundamental issues of wavelength dependence, chemical kinetics, photoabsorption cross section, and collisional effects on photodissociation efficiency. The second half looks at novel instrumentation for electrospray/ion trap mass spectrometry, with the concurrent development of photodissociation as a tool for structural elucidation of organic compounds and antibiotics.

  8. Microfluidic chips with multi-junctions: an advanced tool in recovering proteins from inclusion bodies.

    PubMed

    Yamaguchi, Hiroshi; Miyazaki, Masaya

    2015-01-01

    Active recombinant proteins are used for studying the biological functions of genes and for the development of therapeutic drugs. Overexpression of recombinant proteins in bacteria often results in the formation of inclusion bodies, which are protein aggregates with non-native conformations. Protein refolding is an important process for obtaining active recombinant proteins from inclusion bodies. However, the conventional refolding method of dialysis or dilution is time-consuming and recovered active protein yields are often low, and a cumbersome trial-and-error process is required to achieve success. To circumvent these difficulties, we used controllable diffusion through laminar flow in microchannels to regulate the denaturant concentration. This method largely aims at reducing protein aggregation during the refolding procedure. This Commentary introduces the principles of the protein refolding method using microfluidic chips and the advantage of our results as a tool for rapid and efficient recovery of active recombinant proteins from inclusion bodies.
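    The approach hinges on diffusion across laminar streams being slow and controllable; a back-of-envelope estimate of the diffusion time across a stream of assumed width, using a typical small-molecule diffusivity, is sketched below.

    ```python
    # Back-of-envelope sketch: time for a denaturant to diffuse across a laminar
    # stream in a microchannel, t ~ L^2 / (2 D). Width and diffusivity are assumed.
    L = 50e-6        # m, half-width of the denaturant stream (assumed)
    D = 1.0e-9       # m^2/s, small-molecule diffusivity in water (typical order)
    t = L**2 / (2 * D)
    print(f"characteristic diffusion time ~ {t:.2f} s")   # ~1.25 s
    ```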

  9. Advanced Prediction of Tool Wear by Taking the Load History into Consideration

    NASA Astrophysics Data System (ADS)

    Ersoy, K.; Nuernberg, G.; Herrmann, G.; Hoffmann, H.

    2007-04-01

    A disadvantage of the conventional methods of simulating the wear occurring in deep drawing processes is that the wear coefficient, and thus wear, is considered constant along the loading duration, which, in the case of deep drawing, corresponds to sliding distance and number of punch strokes. In reality, however, wear development is not constant over time. In previous studies the authors presented a method that makes it possible to consider the number of punch strokes in the simulation of wear. Another enhancement of this method is introduced in this paper: it is proposed to consider wear as a function of wear work instead of the number of punch strokes. Using this approach, the wear coefficients are implemented as a function of wear work and fully take into account the load history of the respective node. This enhancement makes it possible to apply the variable wear coefficients to completely different geometries, where one punch stroke involves different sliding distances or pressure values than the experiments with which the wear coefficients were determined. In this study, deep drawing experiments with a cylindrical cup geometry were carried out, in which the characteristic wear coefficient values as well as their gradients along the life cycle were determined; the die was produced via rapid tooling techniques. The prediction of tool wear is carried out with REDSY, a wear simulation software developed at the Institute of Metal Forming and Casting, TU-Muenchen. The wear predictions made by this software are based on the results of a conventional deep drawing simulation. A modified Archard model was used for the wear modelling, as sketched below.
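    A minimal sketch of the idea, an incremental Archard-type wear law whose coefficient depends on the accumulated wear work rather than being constant, is given below; every constant and the assumed k(W) curve are illustrative and this is not the REDSY implementation.

    ```python
    # Sketch: incremental Archard-type wear with a coefficient that depends on the
    # accumulated wear work (the load history), instead of a constant k.
    # All constants and the k(W) curve are hypothetical; this is not the REDSY code.
    import numpy as np

    def k_of_wear_work(W):
        """Assumed wear coefficient vs accumulated wear work: higher running-in
        wear that settles to a lower steady-state value."""
        return 1e-5 * np.exp(-W / 500.0) + 2e-6

    hardness = 6.0e9          # Pa, assumed tool hardness
    pressure = 50e6           # Pa, assumed contact pressure at a node
    slide_per_stroke = 0.05   # m of sliding per punch stroke at that node

    W = 0.0                   # accumulated (scaled) wear work at the node
    depth = 0.0               # accumulated wear depth, m
    for stroke in range(1, 5001):
        depth += k_of_wear_work(W) * pressure * slide_per_stroke / hardness
        W += pressure * slide_per_stroke / 1e6    # scaled wear-work increment

    print(f"wear depth after 5000 strokes: {depth*1e6:.2f} um")
    ```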

  10. OCT corneal epithelial topographic asymmetry as a sensitive diagnostic tool for early and advancing keratoconus

    PubMed Central

    Kanellopoulos, Anastasios John; Asimellis, George

    2014-01-01

    Purpose: To investigate epithelial thickness-distribution characteristics in a large group of keratoconic patients and their correlation to normal eyes employing anterior-segment optical coherence tomography (AS-OCT). Materials and methods: The study group (n=160 eyes) consisted of clinically diagnosed keratoconus eyes; the control group (n=160) consisted of nonkeratoconic eyes. Three separate, three-dimensional epithelial thickness maps were obtained employing AS-OCT, enabling investigation of the pupil center, average, mid-peripheral, superior, inferior, maximum, minimum, and topographic epithelial thickness variability. Intraindividual repeatability of measurements was assessed. We introduced correlation of the epithelial data via newly defined indices. The epithelial thickness indices were then correlated, as validation, with two Scheimpflug imaging-derived AS-irregularity indices that are highly sensitive to early and advancing keratoconus diagnosis: the index of height decentration and the index of surface variance. Results: Intraindividual repeatability of epithelial thickness measurement in the keratoconic group was on average 1.67 μm. For the control group, repeatability was on average 1.13 μm. In the keratoconic group, pupil-center epithelial thickness was 51.75±7.02 μm, while maximum and minimum epithelial thickness were 63.54±8.85 μm and 40.73±8.51 μm. In the control group, epithelial thickness at the center was 52.54±3.23 μm, with maximum 55.33±3.27 μm and minimum 48.50±3.98 μm epithelial thickness. Topographic variability was 6.07±3.55 μm in the keratoconic group, while for the control group it was 1.59±0.79 μm. In keratoconus, topographic epithelial thickness change from normal correlated tightly with the topometric asymmetry indices of IHD and ISV derived from Scheimpflug imaging. Conclusion: Simple, OCT-derived epithelial mapping appears to have critical potential in early and advancing keratoconus diagnosis, confirmed with its correlation

  11. Myositis registries and biorepositories: powerful tools to advance clinical, epidemiologic and pathogenic research

    PubMed Central

    Rider, Lisa G.; Dankó, Katalin; Miller, Frederick W.

    2016-01-01

    Purpose of review: Clinical registries and biorepositories have proven extremely useful in many studies of diseases, especially rare diseases. Given their rarity and diversity, the idiopathic inflammatory myopathies, or myositis syndromes, have benefited from individual researchers’ collections of cohorts of patients. Major efforts are being made to establish large registries and biorepositories that will allow many additional studies to be performed that were not possible before. Here we describe the registries developed by investigators and patient support groups that are currently available for collaborative research purposes. Recent findings: We have identified 46 myositis research registries, including many with biorepositories, which have been developed for a wide variety of purposes and have resulted in great advances in understanding the range of phenotypes, clinical presentations, risk factors, pathogenic mechanisms, outcome assessment, therapeutic responses, and prognoses. These are now available for collaborative use to undertake additional studies. Two myositis patient registries have been developed for research, and myositis patient support groups maintain demographic registries with large numbers of patients available to be contacted for potential research participation. Summary: Investigator-initiated myositis research registries and biorepositories have proven extremely useful in understanding many aspects of these rare and diverse autoimmune diseases. These registries and biorepositories, in addition to those developed by myositis patient support groups, deserve continued support to maintain the momentum in this field as they offer major opportunities to improve understanding of the pathogenesis and treatment of these diseases in cost-effective ways. PMID:25225838

  12. Propulsion Simulations Using Advanced Turbulence Models with the Unstructured Grid CFD Tool, TetrUSS

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Frink, Neal T.; Deere, Karen A.; Pandya, Mohangna J.

    2004-01-01

    A computational investigation has been completed to assess the capability of TetrUSS for exhaust nozzle flows. Three configurations were chosen for this study (1) an axisymmetric supersonic jet, (2) a transonic axisymmetric boattail with solid sting operated at different Reynolds number and Mach number, and (3) an isolated non-axisymmetric nacelle with a supersonic cruise nozzle. These configurations were chosen because existing experimental data provided a means for measuring the ability of TetrUSS for simulating complex nozzle flows. The main objective of this paper is to validate the implementation of advanced two-equation turbulence models in the unstructured-grid CFD code USM3D for propulsion flow cases. USM3D is the flow solver of the TetrUSS system. Three different turbulence models, namely, Menter Shear Stress Transport (SST), basic k epsilon, and the Spalart-Allmaras (SA) are used in the present study. The results are generally in agreement with other implementations of these models in structured-grid CFD codes. Results indicate that USM3D provides accurate simulations for complex aerodynamic configurations with propulsion integration.

  13. Advanced semi-active engine and transmission mounts: tools for modelling, analysis, design, and tuning

    NASA Astrophysics Data System (ADS)

    Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy

    2014-02-01

    This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.
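    A toy, single-degree-of-freedom stand-in for such a semi-active mount (elastomeric stiffness and damping plus a current-controlled MR force under base excitation) can be integrated with SciPy as sketched below; every parameter value is assumed and the paper's detailed multi-physics model is not reproduced.

    ```python
    # Minimal lumped-parameter sketch of a semi-active mount: one-DOF mass on an
    # elastomeric spring/damper plus a controllable MR force that scales with the
    # applied current. A toy stand-in for the paper's multi-physics model; every
    # parameter value is assumed.
    import numpy as np
    from scipy.integrate import solve_ivp

    m, k, c  = 40.0, 2.0e5, 300.0      # kg, N/m, N*s/m (assumed)
    f_mr_max = 400.0                   # N, MR force at full current (assumed)
    f_base   = 25.0                    # Hz, base-excitation frequency (assumed)

    def rhs(t, y, current):
        x, v = y
        xb = 1e-3 * np.sin(2 * np.pi * f_base * t)                    # base motion
        vb = 1e-3 * 2 * np.pi * f_base * np.cos(2 * np.pi * f_base * t)
        f_mr = f_mr_max * current * np.tanh(50.0 * (v - vb))          # semi-active force
        a = (-k * (x - xb) - c * (v - vb) - f_mr) / m
        return [v, a]

    sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0], args=(0.6,), max_step=1e-4)
    print(f"peak displacement: {np.abs(sol.y[0]).max()*1e3:.3f} mm")
    ```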

  14. SERS as an advanced tool for investigating chloroethyl nitrosourea derivatives complexation with DNA.

    PubMed

    Agarwal, Shweta; Ray, Bhumika; Mehrotra, Ranjana

    2015-11-01

    We report surface-enhanced Raman spectroscopic (SERS) studies on free calf thymus DNA and its complexes with anti-tumor chloroethyl nitrosourea derivatives; semustine and nimustine. Since, first incident of SERS in 1974, it has rapidly established into an analytical tool, which can be used for the trace detection and characterization of analytes. Here, we depict yet another application of SERS in the field of drug-DNA interaction and thereby, its promising role in rational designing of new chemotherapeutic agents. Vibrational spectral analysis has been performed in an attempt to delineate the anti-cancer action mechanism of above mentioned nitrosourea derivatives. Strong SERS bands associated with the complexation of DNA with semustine and nimustine have been observed, which reveal binding of nitrosourea derivatives with heterocyclic nitrogenous base pair of DNA duplex. Formation of dG-dC interstrand cross-link in DNA double helices is also suggested by the SERS spectral outcomes of CENUs-DNA adduct. Results, demonstrated here, reflect recent progress in the newly developing field of drug-DNA interaction analysis via SERS.

  15. New advances and validation of knowledge management tools for critical care using classifier techniques.

    PubMed Central

    Frize, M.; Wang, L.; Ennett, C. M.; Nickerson, B. G.; Solven, F. G.; Stevenson, M.

    1998-01-01

    An earlier version (2.0) of the case-based reasoning (CBR) tool called IDEAS for ICUs allowed users to compare the ten closest matching cases to the newest patient admission, using a large database of intensive care patient records and physician-selected matching weights [1,2]. The new version incorporates matching weights that have been determined quantitatively, and a faster CBR matching engine has also been incorporated. In a second approach, a back-propagation, feed-forward artificial neural network estimated two classes of the outcome "duration of artificial ventilation" for a subset of the database used for the CBR work. Weight elimination was successfully applied to reduce the number of input variables and speed up the estimation of outcomes. New experiments examined the impact of using different numbers of input variables on the performance of the ANN, measured by the correct classification rate (CCR) and the average squared error (ASE). PMID:9929280
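    The weight-elimination technique mentioned above is commonly formulated as a penalty added to the training loss; the sketch below shows that standard formulation and its gradient, not the authors' implementation.

    ```python
    # Sketch of the standard weight-elimination penalty used to prune network weights:
    # penalty = lambda * sum( (w/w0)^2 / (1 + (w/w0)^2) ), added to the training loss.
    # This is the commonly cited formulation, not the authors' implementation.
    import numpy as np

    def weight_elimination_penalty(weights, w0=1.0, lam=1e-3):
        r2 = (weights / w0) ** 2
        return lam * np.sum(r2 / (1.0 + r2))

    def weight_elimination_grad(weights, w0=1.0, lam=1e-3):
        """Gradient of the penalty w.r.t. each weight, to be added to the
        back-propagated error gradient during training."""
        r2 = (weights / w0) ** 2
        return lam * (2.0 * weights / w0**2) / (1.0 + r2) ** 2

    w = np.array([0.01, -0.5, 2.3, 0.0, -1.2])
    print(weight_elimination_penalty(w), weight_elimination_grad(w))
    ```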

  16. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
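    As a hedged illustration of discrete-time survival modelling at the individual level, the sketch below expands subjects into person-period rows and fits the per-interval hazard with a logistic regression standing in for OSA's artificial neural network; the toy data are hypothetical.

    ```python
    # Sketch of a discrete-time survival model: expand each subject into
    # person-period rows and fit a model for the hazard in each interval.
    # A logistic regression stands in for OSA's ANN; the toy data are hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # (follow-up intervals observed, event occurred in last interval?, covariate x)
    subjects = [(3, 1, 0.2), (5, 0, -1.0), (2, 1, 1.5), (4, 0, 0.3)]

    rows, labels = [], []
    for t_obs, event, x in subjects:
        for t in range(1, t_obs + 1):
            rows.append([t, x])                       # interval index + covariate
            labels.append(1 if (event and t == t_obs) else 0)

    model = LogisticRegression().fit(np.array(rows), np.array(labels))

    # Predicted survival curve for a new subject with covariate x = 0.5
    hazard = model.predict_proba(np.array([[t, 0.5] for t in range(1, 6)]))[:, 1]
    survival = np.cumprod(1.0 - hazard)
    print(np.round(survival, 3))
    ```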

  18. STED-FLCS: An Advanced Tool to Reveal Spatiotemporal Heterogeneity of Molecular Membrane Dynamics.

    PubMed

    Vicidomini, Giuseppe; Ta, Haisen; Honigmann, Alf; Mueller, Veronika; Clausen, Mathias P; Waithe, Dominic; Galiani, Silvia; Sezgin, Erdinc; Diaspro, Alberto; Hell, Stefan W; Eggeling, Christian

    2015-09-01

    Heterogeneous diffusion dynamics of molecules play an important role in many cellular signaling events, such as of lipids in plasma membrane bioactivity. However, these dynamics can often only be visualized by single-molecule and super-resolution optical microscopy techniques. Using fluorescence lifetime correlation spectroscopy (FLCS, an extension of fluorescence correlation spectroscopy, FCS) on a super-resolution stimulated emission depletion (STED) microscope, we here extend previous observations of nanoscale lipid dynamics in the plasma membrane of living mammalian cells. STED-FLCS allows an improved determination of spatiotemporal heterogeneity in molecular diffusion and interaction dynamics via a novel gated detection scheme, as demonstrated by a comparison between STED-FLCS and previous conventional STED-FCS recordings on fluorescent phosphoglycerolipid and sphingolipid analogues in the plasma membrane of live mammalian cells. The STED-FLCS data indicate that biophysical and biochemical parameters such as the affinity for molecular complexes strongly change over space and time within a few seconds. Drug treatment for cholesterol depletion or actin cytoskeleton depolymerization not only results in the already previously observed decreased affinity for molecular interactions but also in a slight reduction of the spatiotemporal heterogeneity. STED-FLCS specifically demonstrates a significant improvement over previous gated STED-FCS experiments and with its improved spatial and temporal resolution is a novel tool for investigating how heterogeneities of the cellular plasma membrane may regulate biofunctionality. PMID:26235350

  19. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Csank, Jeffrey T.

    2016-01-01

    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to the transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on transient engine performance earlier in the design cycle.

  20. Picosecond laser fabrication of micro cutting tool geometries on polycrystalline diamond composites using a high-numerical aperture micro scanning system

    NASA Astrophysics Data System (ADS)

    Eberle, Gregory; Dold, Claus; Wegener, Konrad

    2015-03-01

    The generation of microsized components found in LEDs, watches, molds as well as other types of micromechanics and microelectronics require a corresponding micro cutting tool in order to be manufactured, typically by milling or turning. Micro cutting tools are made of cemented tungsten carbide and are conventionally fabricated either by electrical discharge machining (EDM) or by grinding. An alternative method is proposed through a laser-based solution operating in the picosecond pulse duration whereby the beam is deflected using a modified galvanometer-driven micro scanning system exhibiting a high numerical aperture. A micro cutting tool material which cannot be easily processed using conventional methods is investigated, which is a fine grain polycrystalline diamond composite (PCD). The generation of various micro cutting tool relevant geometries, such as chip breakers and cutting edges, are demonstrated. The generated geometries are subsequently evaluated using scanning electron microscopy (SEM) and quality is measured in terms of surface roughness and cutting edge sharpness. Additionally, two processing strategies in which the laser beam processes tangentially and orthogonally are compared in terms of quality.

  1. The STREON Recirculation Chamber: An Advanced Tool to Quantify Stream Ecosystem Metabolism in the Benthic Zone

    NASA Astrophysics Data System (ADS)

    Brock, J. T.; Utz, R.; McLaughlin, B.

    2013-12-01

    The STReam Experimental Observatory Network is a large-scale experimental effort that will investigate the effects of eutrophication and loss of large consumers in stream ecosystems. STREON represents the first experimental effort undertaken and supported by the National Ecological Observatory Network (NEON). Two treatments will be applied at 10 NEON sites and maintained for 10 years in the STREON program: the addition of nitrate and phosphate to enrich concentrations by five times ambient levels and electrical fields that exclude top consumers (i.e., fish or invertebrates) of the food web from the surface of buried sediment baskets. Following a 3-5 week period, the sediment baskets will be extracted and incubated in closed, recirculating metabolic chambers to measure rates of respiration, photosynthesis, and nutrient uptake. All STREON-generated data will be open access and available on the NEON web portal. The recirculation chamber represents a critical infrastructural component of STREON. Although researchers have applied such chambers for metabolic and nutrient uptake measurements in the past, the scope of STREON demands a novel design that addresses multiple processes often neglected by earlier models. The STREON recirculation chamber must be capable of: 1) incorporating hyporheic exchange into the flow field to ensure measurements of respiration include the activity of subsurface biota, 2) operating consistently with heterogeneous sediments from sand to cobble, 3) minimizing heat exchange from the motor and external environment, 4) delivering a reproducible uniform flow field over the surface of the sediment basket, and 5) allowing efficient assembly/disassembly with minimal use of tools. The chamber also required a means of accommodating an optical dissolved oxygen probe and a means to inject/extract water. A prototype STREON chamber has been designed and thoroughly tested. The flow field within the chamber has been mapped using particle imaging velocimetry (PIV

  2. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  3. Randomized Controlled Trial of a Video Decision Support Tool for Cardiopulmonary Resuscitation Decision Making in Advanced Cancer

    PubMed Central

    Volandes, Angelo E.; Paasche-Orlow, Michael K.; Mitchell, Susan L.; El-Jawahri, Areej; Davis, Aretha Delight; Barry, Michael J.; Hartshorn, Kevan L.; Jackson, Vicki Ann; Gillick, Muriel R.; Walker-Corkery, Elizabeth S.; Chang, Yuchiao; López, Lenny; Kemeny, Margaret; Bulone, Linda; Mann, Eileen; Misra, Sumi; Peachey, Matt; Abbo, Elmer D.; Eichler, April F.; Epstein, Andrew S.; Noy, Ariela; Levin, Tomer T.; Temel, Jennifer S.

    2013-01-01

    Purpose Decision making regarding cardiopulmonary resuscitation (CPR) is challenging. This study examined the effect of a video decision support tool on CPR preferences among patients with advanced cancer. Patients and Methods We performed a randomized controlled trial of 150 patients with advanced cancer from four oncology centers. Participants in the control arm (n = 80) listened to a verbal narrative describing CPR and the likelihood of successful resuscitation. Participants in the intervention arm (n = 70) listened to the identical narrative and viewed a 3-minute video depicting a patient on a ventilator and CPR being performed on a simulated patient. The primary outcome was participants' preference for or against CPR measured immediately after exposure to either modality. Secondary outcomes were participants' knowledge of CPR (score range of 0 to 4, with higher score indicating more knowledge) and comfort with video. Results The mean age of participants was 62 years (standard deviation, 11 years); 49% were women, 44% were African American or Latino, and 47% had lung or colon cancer. After the verbal narrative, in the control arm, 38 participants (48%) wanted CPR, 41 (51%) wanted no CPR, and one (1%) was uncertain. In contrast, in the intervention arm, 14 participants (20%) wanted CPR, 55 (79%) wanted no CPR, and 1 (1%) was uncertain (unadjusted odds ratio, 3.5; 95% CI, 1.7 to 7.2; P < .001). Mean knowledge scores were higher in the intervention arm than in the control arm (3.3 ± 1.0 v 2.6 ± 1.3, respectively; P < .001), and 65 participants (93%) in the intervention arm were comfortable watching the video. Conclusion Participants with advanced cancer who viewed a video of CPR were less likely to opt for CPR than those who listened to a verbal narrative. PMID:23233708
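
    The unadjusted odds ratio reported in the abstract can be reproduced approximately from the counts given; a short sketch of the standard 2x2 calculation with a Wald confidence interval follows (the "uncertain" responses are excluded here, so the figures differ slightly from the published 3.5 and 1.7 to 7.2):

```python
import math

# 2x2 table from the abstract (uncertain responses excluded)
#                  declined CPR   wanted CPR
video_arm      = (55, 14)
narrative_arm  = (41, 38)

a, b = video_arm
c, d = narrative_arm

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # about 3.64 (1.75, 7.59)
```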

  4. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    SciTech Connect

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-15

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  5. Earth remote sensing as an effective tool for the development of advanced innovative educational technologies

    NASA Astrophysics Data System (ADS)

    Mayorova, Vera; Mayorov, Kirill

    2009-11-01

    The current educational system faces a contradiction between the fundamental character of engineering education and the need to extend applied learning, which requires new training methods that balance academic and practical knowledge. As a result, a number of innovations are being developed and implemented in the educational process, aimed at optimizing the quality of the entire educational system. Among the wide range of innovative educational technologies, an especially important subset involves learning through hands-on scientific and technical projects. The purpose of this paper is to describe the implementation of educational technologies based on the development of small satellites and on the use of Earth remote sensing data acquired from these satellites. The growing public attention to education through Earth remote sensing stems from the concern that, despite great progress in the development of new methods of Earth imagery and remote sensing data acquisition, the question of practical applications of this kind of data remains largely open. It is important to develop a new way of thinking in the new generation, so that people understand that they are the masters of their own planet and are responsible for its state. They should want, and be able, to use a powerful set of tools based on modern and prospective Earth remote sensing. For example, NASA sponsors the "Classroom of the Future" project. The Universities Space Research Association in the United States provides a mechanism through which US universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology, and to promote education in these areas. It also aims at understanding the Earth as a system and promoting the role of humankind in the destiny of their own planet. The Association has founded a Journal of Earth System

  6. A comprehensive tool for image-based generation of fetus and pregnant women mesh models for numerical dosimetry studies

    NASA Astrophysics Data System (ADS)

    Dahdouh, S.; Varsier, N.; Serrurier, A.; De la Plata, J.-P.; Anquez, J.; Angelini, E. D.; Wiart, J.; Bloch, I.

    2014-08-01

    Fetal dosimetry studies require the development of accurate numerical 3D models of the pregnant woman and the fetus. This paper proposes a 3D articulated fetal growth model covering the main phases of pregnancy and a pregnant woman model combining the utero-fetal structures and a deformable non-pregnant woman body envelope. The structures of interest were automatically or semi-automatically (depending on the stage of pregnancy) segmented from a database of images and surface meshes were generated. By interpolating linearly between fetal structures, each one can be generated at any age and in any position. A method is also described to insert the utero-fetal structures in the maternal body. A validation of the fetal models is proposed, comparing a set of biometric measurements to medical reference charts. The usability of the pregnant woman model in dosimetry studies is also investigated, with respect to the influence of the abdominal fat layer.
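
    The abstract notes that fetal structures can be generated at any gestational age by interpolating linearly between segmented instances; a minimal sketch of vertex-wise linear interpolation between two corresponding surface meshes, assuming a shared vertex ordering (all names and values are illustrative):

```python
import numpy as np

def interpolate_mesh(vertices_a, age_a, vertices_b, age_b, target_age):
    """Vertex-wise linear interpolation between two meshes with matching topology."""
    va = np.asarray(vertices_a, dtype=float)
    vb = np.asarray(vertices_b, dtype=float)
    if va.shape != vb.shape:
        raise ValueError("meshes must share vertex count and ordering")
    t = (target_age - age_a) / (age_b - age_a)
    return (1.0 - t) * va + t * vb

# Toy example: a 3-vertex "mesh" segmented at 26 and 30 weeks, interpolated to 28 weeks
m26 = [[0, 0, 0], [10, 0, 0], [0, 12, 0]]
m30 = [[0, 0, 0], [14, 0, 0], [0, 16, 0]]
print(interpolate_mesh(m26, 26, m30, 30, 28))
```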

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Baker, John G.

    2009-01-01

    Recent advances in numerical relativity have fueled an explosion of progress in understanding the predictions of Einstein's theory of gravity, General Relativity, for the strong field dynamics, the gravitational radiation wave forms, and consequently the state of the remnant produced from the merger of compact binary objects. I will review recent results from the field, focusing on mergers of two black holes.

  9. Analysis of line-and-space resist patterns with sub-20 nm half-pitch fabricated using high-numerical-aperture exposure tool of extreme ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro

    2016-09-01

    The resolution of resist processes for extreme ultraviolet (EUV) lithography has been steadily improved and has reached the sub-20 nm half-pitch region. Currently, the resist materials capable of resolving 11 nm half-pitch line-and-space patterns are being developed in industrial fields. In this study, the line-and-space resist patterns with sub-20 nm half-pitches were fabricated using a high-numerical-aperture (NA) EUV exposure tool and analyzed by the Monte Carlo simulation. The scanning electron microscopy (SEM) images of resist patterns after their development were compared with the latent images calculated on the basis of the sensitization and reaction mechanisms of chemically amplified EUV resists. The approximate relationship between resist patterns and latent images was clarified for the sub-20 nm half-pitch region. For the realization of 11 nm half-pitch fabrication, the suppression of the stochastic effects in the development process is an important consideration.

  10. The Standard for Clinicians’ Interview in Psychiatry (SCIP): A Clinician-administered Tool with Categorical, Dimensional, and Numeric Output—Conceptual Development, Design, and Description of the SCIP

    PubMed Central

    Nasrallah, Henry; Muvvala, Srinivas; El-Missiry, Ahmed; Mansour, Hader; Hill, Cheryl; Elswick, Daniel; Price, Elizabeth C.

    2016-01-01

    Existing standardized diagnostic interviews (SDIs) were designed for researchers and produce mainly categorical diagnoses. There is an urgent need for a clinician-administered tool that produces dimensional measures, in addition to categorical diagnoses. The Standard for Clinicians’ Interview in Psychiatry (SCIP) is a method of assessment of psychopathology for adults. It is designed to be administered by clinicians and includes the SCIP manual and the SCIP interview. Clinicians use the SCIP questions and rate the responses according to the SCIP manual rules. Clinicians use the patient’s responses to questions, observe the patient’s behaviors and make the final rating of the various signs and symptoms assessed. The SCIP method of psychiatric assessment has three components: 1) the SCIP interview (dimensional) component, 2) the etiological component, and 3) the disorder classification component. The SCIP produces three main categories of clinical data: 1) a diagnostic classification of psychiatric disorders, 2) dimensional scores, and 3) numeric data. The SCIP provides diagnoses consistent with criteria from editions of the Diagnostic and Statistical Manual (DSM) and International Classification of Disease (ICD). The SCIP produces 18 dimensional measures for key psychiatric signs or symptoms: anxiety, posttraumatic stress, obsessions, compulsions, depression, mania, suicidality, suicidal behavior, delusions, hallucinations, agitation, disorganized behavior, negativity, catatonia, alcohol addiction, drug addiction, attention, and hyperactivity. The SCIP produces numeric severity data for use in either clinical care or research. The SCIP was shown to be a valid and reliable assessment tool, and the validity and reliability results were published in 2014 and 2015. The SCIP is compatible with personalized psychiatry research and is in line with the Research Domain Criteria framework. PMID:27800284

  11. Numerical Modeling for Hole-Edge Cracking of Advanced High-Strength Steels (AHSS) Components in the Static Bend Test

    NASA Astrophysics Data System (ADS)

    Kim, Hyunok; Mohr, William; Yang, Yu-Ping; Zelenak, Paul; Kimchi, Menachem

    2011-08-01

    Numerical modeling of local formability, such as hole-edge cracking and shear fracture in bending of AHSS, is one of the challenging issues for simulation engineers in predicting and evaluating the stamping and crash performance of materials. This is because continuum-mechanics-based finite element method (FEM) modeling requires additional input data, "failure criteria", to predict the local formability limit of materials, in addition to the material flow stress data input for simulation. This paper presents a numerical modeling approach for predicting hole-edge failures during static bend tests of AHSS structures. A local-strain-based failure criterion and a stress-triaxiality-based failure criterion were developed and implemented in the LS-DYNA simulation code to predict hole-edge failures in component bend tests. The holes were prepared using two different methods: mechanical punching and water-jet cutting. In the component bend tests, the water-jet-trimmed hole showed delayed fracture at the hole edges, while the mechanically punched hole showed early fracture as the bending angle increased. In comparing the numerical modeling and test results, the load-displacement curve, the displacement at the onset of cracking, and the final crack shape/length were used. Both failure criteria also enable the numerical model to differentiate between the local formability limits of mechanically punched and water-jet-trimmed holes. The failure criteria and static bend test developed here are useful for evaluating the local formability limit at a structural component level for automotive crash tests.
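
    The abstract mentions a stress-triaxiality-based failure criterion but gives no formula; a minimal sketch of the usual triaxiality measure (hydrostatic stress over von Mises equivalent stress) and a simple damage-accumulation rule, purely to illustrate the quantities involved (the failure-strain curve below is invented, not the paper's calibration):

```python
import numpy as np

def triaxiality(stress):
    """Stress triaxiality: mean (hydrostatic) stress divided by von Mises stress."""
    s = np.asarray(stress, dtype=float)          # 3x3 Cauchy stress tensor
    mean = np.trace(s) / 3.0
    dev = s - mean * np.eye(3)
    von_mises = np.sqrt(1.5 * np.sum(dev * dev))
    return mean / von_mises

def damage_increment(stress, dplastic_strain, strain_to_failure):
    """Accumulate damage as plastic-strain increment over a triaxiality-dependent
    failure strain; failure is flagged when the running sum reaches 1."""
    eta = triaxiality(stress)
    return dplastic_strain / strain_to_failure(eta)

# Example: uniaxial tension (eta ~ 1/3) with a made-up failure-strain curve
strain_to_failure = lambda eta: max(0.05, 0.6 * np.exp(-1.5 * eta))
sigma = np.diag([400.0, 0.0, 0.0])               # MPa
print(triaxiality(sigma))                         # ~0.333
print(damage_increment(sigma, 0.01, strain_to_failure))
```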

  12. Advanced numerical methods for the simulation of flows in heterogeneous porous media and their application to parallel computing

    SciTech Connect

    Rame, M.

    1990-01-01

    Flows in highly heterogeneous porous media arise in a variety of processes including enhanced oil recovery, in situ bioremediation of underground contaminants, transport in underground aquifers and transport through biological membranes. The common denominator of these processes is the transport (and possibly reaction) of a multi-component fluid in several phases. A new numerical methodology for the analysis of flows in heterogeneous porous media is presented. Cases of miscible and immiscible displacement are simulated to investigate the influence of the local heterogeneities on the flow paths. This numerical scheme allows for a fine description of the flowing medium and the concentration and saturation distributions thus generated show low numerical dispersion. If the size of the area of interest is a square of a thousand feet per side, geological information on the porous medium can be incorporated to a length scale of about one to two feet. The technique here introduced, Operator Splitting on Multiple Grids, solves the elliptic operators by a higher-order finite-element technique on a coarse grid that proves efficient and accurate in incorporating different scales of heterogeneities. This coarse solution is interpolated to a fine grid by a splines-under-tension technique. The equations for the conservation of species are solved on this fine grid (of approximately half a million cells) by a finite-difference technique yielding numerical dispersions of less than ten feet. Cases presented herein involve a single phase miscible flow, and liquid-phase immiscible displacements. Cases are presented for model distributions of physical properties and for porosity and permeability data taken from a real reservoir. Techniques for the extension of the methods to compressible flow situations and compositional simulations are discussed.
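
    The Operator Splitting on Multiple Grids approach described above couples a coarse elliptic solve with spline interpolation onto a much finer transport grid; a minimal sketch of that grid-to-grid step is shown below, with standard bicubic splines standing in for the splines-under-tension mentioned in the abstract and a synthetic coarse "pressure" field in place of the finite-element solution:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Synthetic coarse-grid solution of the elliptic (pressure) operator, 32 x 32 cells
xc = np.linspace(0.0, 1000.0, 32)           # feet
yc = np.linspace(0.0, 1000.0, 32)
Xc, Yc = np.meshgrid(xc, yc, indexing="ij")
pressure_coarse = np.exp(-((Xc - 400.0)**2 + (Yc - 600.0)**2) / 1.0e5)

# Interpolate to a fine grid (roughly 1-2 ft resolution) on which the
# finite-difference species-conservation equations would be solved
spline = RectBivariateSpline(xc, yc, pressure_coarse, kx=3, ky=3)
xf = np.linspace(0.0, 1000.0, 700)
yf = np.linspace(0.0, 1000.0, 700)
pressure_fine = spline(xf, yf)               # (700, 700): ~half a million cells
print(pressure_fine.shape)
```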

  13. Creation of an ensemble of simulated cardiac cases and a human observer study: tools for the development of numerical observers for SPECT myocardial perfusion imaging

    NASA Astrophysics Data System (ADS)

    O'Connor, J. Michael; Pretorius, P. Hendrik; Gifford, Howard C.; Licho, Robert; Joffe, Samuel; McGuiness, Matthew; Mehurg, Shannon; Zacharias, Michael; Brankov, Jovan G.

    2012-02-01

    Our previous Single Photon Emission Computed Tomography (SPECT) myocardial perfusion imaging (MPI) research explored the utility of numerical observers. We recently created two hundred and eighty simulated SPECT cardiac cases using Dynamic MCAT (DMCAT) and SIMIND Monte Carlo tools. All simulated cases were then processed with two reconstruction methods: iterative ordered subset expectation maximization (OSEM) and filtered back-projection (FBP). Observer study sets were assembled for both OSEM and FBP methods. Five physicians performed an observer study on one hundred and seventy-nine images from the simulated cases. The observer task was to indicate detection of any myocardial perfusion defect using the American Society of Nuclear Cardiology (ASNC) 17-segment cardiac model and the ASNC five-scale rating guidelines. Human observer Receiver Operating Characteristic (ROC) studies established the guidelines for the subsequent evaluation of numerical model observer (NO) performance. Several NOs were formulated and their performance was compared with the human observer performance. One type of NO was based on evaluation of a cardiac polar map that had been pre-processed using a gradient-magnitude watershed segmentation algorithm. The second type of NO was also based on analysis of a cardiac polar map but with use of a priori calculated average image derived from an ensemble of normal cases.

  14. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  15. A numerical tool for the calculation of non-equilibrium ionisation states in the solar corona and other astrophysical plasma environments

    NASA Astrophysics Data System (ADS)

    Bradshaw, S. J.

    2009-07-01

    Context: The effects of non-equilibrium processes on the ionisation state of strongly emitting elements in the solar corona can be extremely difficult to assess and yet they are critically important. For example, there is much interest in dynamic heating events localised in the solar corona because they are believed to be responsible for its high temperature and yet recent work has shown that the hottest (≥10^7 K) emission predicted to be associated with these events can be observationally elusive due to the difficulty of creating the highly ionised states from which the expected emission arises. This leads to the possibility of observing instruments missing such heating events entirely. Aims: The equations describing the evolution of the ionisation state are a very stiff system of coupled, partial differential equations whose solution can be numerically challenging and time-consuming. Without access to specialised codes and significant computational resources it is extremely difficult to avoid the assumption of an equilibrium ionisation state even when it clearly cannot be justified. The aim of the current work is to develop a computational tool to allow straightforward calculation of the time-dependent ionisation state for a wide variety of physical circumstances. Methods: A numerical model comprising the system of time-dependent ionisation equations for a particular element and tabulated values of plasma temperature as a function of time is developed. The tabulated values can be the solutions of an analytical model, the output from a numerical code or a set of observational measurements. An efficient numerical method to solve the ionisation equations is implemented. Results: A suite of tests is designed and run to demonstrate that the code provides reliable and accurate solutions for a number of scenarios including equilibration of the ion population and rapid heating followed by thermal conductive cooling. It is found that the solver can evolve the ionisation
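
    The abstract describes a stiff system of time-dependent ionisation equations driven by a tabulated temperature history; a minimal sketch of that structure for a toy two-state ion population, using an implicit (BDF) integrator as a stand-in for the code's own method, is given below. The temperature history and rate coefficients are invented purely for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d

# Tabulated temperature history (rapid heating then conductive cooling) -- invented
t_tab = np.array([0.0, 10.0, 20.0, 200.0, 1000.0])   # s
T_tab = np.array([1e6, 1e7, 1e7, 3e6, 1e6])          # K
T_of_t = interp1d(t_tab, T_tab)

# Toy temperature-dependent ionisation / recombination rates (s^-1) -- invented
ionise = lambda T: 1e-2 * (T / 1e7)
recomb = lambda T: 1e-2 * (1e6 / T)

def rhs(t, n):
    """dn/dt for a two-state (neutral, ionised) population at temperature T(t)."""
    T = float(T_of_t(t))
    n0, n1 = n
    return [-ionise(T) * n0 + recomb(T) * n1,
             ionise(T) * n0 - recomb(T) * n1]

# Start from the low-temperature mixture, then follow the heating/cooling event
sol = solve_ivp(rhs, (0.0, 1000.0), [0.5, 0.5], method="BDF", rtol=1e-8, atol=1e-12)
print(sol.y[:, -1])   # non-equilibrium ion fractions at the end of the run
```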

  16. Image navigation and registration performance assessment tool set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    NASA Astrophysics Data System (ADS)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-05-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24 hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.
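
    The abstract defines the 3-sigma INR metric as an estimate of the 99.73rd percentile of the errors accumulated over a 24-hour period; the computation of that metric from a set of per-registration errors is a one-liner, as the sketch below shows (the error magnitudes here are simulated, not IPATS data):

```python
import numpy as np

rng = np.random.default_rng(0)
nav_errors_urad = rng.normal(0.0, 5.0, size=20000)   # simulated 24 h of NAV errors

# 99.73rd percentile of the error magnitudes; for Gaussian errors this is ~3 sigma (~15)
metric_3sigma = np.percentile(np.abs(nav_errors_urad), 99.73)
print(f"3-sigma NAV metric: {metric_3sigma:.2f} microradians")
```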

  17. Image Navigation and Registration Performance Assessment Tool Set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    NASA Technical Reports Server (NTRS)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.

  18. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    SciTech Connect

    Liu, H; Liang, X; Kalbasi, A; Lin, A; Ahn, P; Both, S

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: proton PBS plan using 2 posterior oblique fields (2F), proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.
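
    The abstract compares DVH indicators between verification (V), contour-propagation (CP) and dose-deformation (DD) plans head-to-head using Student's t-test; a minimal sketch of one such paired comparison is shown below (the DVH values are placeholders, and the paired form of the test is an assumption based on the head-to-head design):

```python
import numpy as np
from scipy import stats

# Example DVH indicator (e.g. an OAR mean dose in Gy) across verification scans -- placeholders
v_plan  = np.array([24.1, 26.3, 22.8, 25.0, 27.2, 23.5])
cp_plan = np.array([24.4, 26.0, 23.1, 25.3, 27.0, 23.9])

t_stat, p_value = stats.ttest_rel(v_plan, cp_plan)    # paired: same scans, two plan types
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")          # p > 0.05 -> no detected difference
```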

  19. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    SciTech Connect

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi; Zhang, Dingkang

    2013-11-29

    This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady- state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  20. Numerical simulation of heat exchanger

    SciTech Connect

    Sha, W.T.

    1985-01-01

    Accurate and detailed knowledge of the fluid flow field and thermal distribution inside a heat exchanger becomes invaluable as a large, efficient, and reliable unit is sought. This information is needed to provide proper evaluation of the thermal and structural performance characteristics of a heat exchanger. It is to be noted that an analytical prediction method, when properly validated, will greatly reduce the need for model testing, facilitate interpolating and extrapolating test data, aid in optimizing heat-exchanger design and performance, and provide scaling capability. Thus tremendous savings of cost and time are realized. With the advent of large digital computers and advances in the development of computational fluid mechanics, it has become possible to predict analytically, through numerical solution, the conservation equations of mass, momentum, and energy for both the shellside and tubeside fluids. The numerical modeling technique will be a valuable, cost-effective design tool for development of advanced heat exchangers.

  1. Numerical Modeling for Springback Predictions by Considering the Variations of Elastic Modulus in Stamping Advanced High-Strength Steels (AHSS)

    NASA Astrophysics Data System (ADS)

    Kim, Hyunok; Kimchi, Menachem

    2011-08-01

    This paper presents a numerical modeling approach for predicting springback that accounts for the variation of elastic modulus in stamping AHSS. Various stamping tests and finite-element method (FEM) simulation codes were used in this study. Cyclic loading-unloading tensile tests were conducted to determine the variation of elastic modulus for dual-phase (DP) 780 sheet steel. The biaxial bulge test was used to obtain plastic flow stress data. The non-linear reduction of elastic modulus with increasing plastic strain was formulated using the Yoshida model, which was implemented in FEM simulations of springback. To understand the effects of material properties on springback, experiments were conducted with a simple geometry, U-shape bending, and more complex geometries, curved flanging and S-rail stamping. Different measurement methods were used to confirm the final part geometry. Two commercial FEM codes, LS-DYNA and DEFORM, were used for comparison with the experiments. The variable elastic modulus improved springback predictions in the U-shape bending and curved flanging tests compared to FEM with a constant elastic modulus. However, in the S-rail stamping tests, both FEM models with the isotropic hardening model showed limitations in predicting the sidewall curl of the S-rail part after springback. To account for the kinematic hardening and Bauschinger effects that result from material bending-unbending in S-rail stamping, the Yoshida model was used for FEM simulation of S-rail stamping and springback. The FEM predictions showed good improvement in correlating with the experiments.
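
    The Yoshida-type relation cited in the abstract describes the decay of the unloading (chord) modulus with accumulated plastic strain, E(p) = E0 - (E0 - Ea)(1 - exp(-xi*p)), which is easy to tabulate for an FEM material card. The sketch below uses illustrative constants for a DP780-class steel, not the paper's fitted values:

```python
import numpy as np

def chord_modulus(plastic_strain, e0, ea, xi):
    """Yoshida-type decay of the unloading (chord) modulus with plastic strain."""
    p = np.asarray(plastic_strain, dtype=float)
    return e0 - (e0 - ea) * (1.0 - np.exp(-xi * p))

# Illustrative constants (GPa); xi controls how quickly the modulus saturates at Ea
e0, ea, xi = 207.0, 165.0, 60.0
for p in (0.0, 0.02, 0.05, 0.10):
    print(f"plastic strain {p:.2f}: E = {chord_modulus(p, e0, ea, xi):.1f} GPa")
```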

  2. A New Method For Advanced Virtual Design Of Stamping Tools For Automotive Industry: Application To Nodular Cast Iron EN-GJS-600-3

    NASA Astrophysics Data System (ADS)

    Ben-Slima, Khalil; Penazzi, Luc; Mabru, Catherine; Ronde-Oustau, François; Rezaï-Aria, Farhad

    2011-05-01

    This contribution presents an approach combining numerical stamping-process simulation and structural analysis in order to improve the design and optimize tool fatigue life. The method consists in simulating the stamping process with AutoForm® (or any FEM code), treating the tool as a perfectly rigid body. The estimated contact pressure is then used as a boundary condition for the FEM structural loading analysis. The result of this analysis is used for life prediction of the tool based on an S-N fatigue curve. If the prescribed tool-life requirements are not satisfied, the critical region of the tool is redesigned and the whole simulation procedure is repeated. This optimization method is applied to the nodular cast iron EN-GJS-600-3 as a candidate stamping tool material. The room-temperature fatigue S-N curves of this alloy were established in the laboratory through uniaxial push/pull cyclic experiments on cylindrical specimens at a load ratio of R (σmin/σmax) = -2.
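
    The final step of the approach checks tool life against the measured S-N curve using the FEM stress at the critical region; a minimal sketch of such a lookup with a Basquin-type fit of an S-N curve is given below. The fit constants and stress amplitude are invented, not the EN-GJS-600-3 data from the paper:

```python
# Basquin fit of an S-N curve: sigma_a = sf * (2N)**b   (constants invented)
sf = 900.0    # MPa, fatigue strength coefficient
b  = -0.09    # fatigue strength exponent

def cycles_to_failure(stress_amplitude_mpa):
    """Invert the Basquin relation for the allowable number of cycles."""
    return 0.5 * (stress_amplitude_mpa / sf) ** (1.0 / b)

# Stress amplitude at the tool's critical region, taken from the structural FEM step
sigma_a = 320.0   # MPa
n_allow = cycles_to_failure(sigma_a)
print(f"predicted life ~ {n_allow:.2e} cycles")
# If n_allow is below the required number of stamping strokes, the critical
# region is redesigned and the simulation chain is rerun, as in the abstract.
```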

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  4. Towards Direct Numerical Simulation of mass and energy fluxes at the soil-atmospheric interface with advanced Lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Krafczyk, Manfred; Geier, Martin; Schönherr, Martin

    2014-05-01

    The quantification of soil evaporation and of soil water content dynamics near the soil surface are critical in the physics of land-surface processes on many scales and are dominated by multi-component and multi-phase mass and energy fluxes between the ground and the atmosphere. Although it is widely recognized that both liquid and gaseous water movement are fundamental factors in the quantification of soil heat flux and surface evaporation, their computation has only started to be taken into account using simplified macroscopic models. As the flow field over the soil can be safely considered as turbulent, it would be natural to study the detailed transient flow dynamics by means of Large Eddy Simulation (LES [1]) where the three-dimensional flow field is resolved down to the laminar sub-layer. Yet this requires very fine resolved meshes allowing a grid resolution of at least one order of magnitude below the typical grain diameter of the soil under consideration. In order to gain reliable turbulence statistics, up to several hundred eddy turnover times have to be simulated which adds up to several seconds of real time. Yet, the time scale of the receding saturated water front dynamics in the soil is on the order of hours. Thus we are faced with the task of solving a transient turbulent flow problem including the advection-diffusion of water vapour over the soil-atmospheric interface represented by a realistic tomographic reconstruction of a real porous medium taken from laboratory probes. Our flow solver is based on the Lattice Boltzmann method (LBM) [2] which has been extended by a Cumulant approach similar to the one described in [3,4] to minimize the spurious coupling between the degrees of freedom in previous LBM approaches and can be used as an implicit LES turbulence model due to its low numerical dissipation and increased stability at high Reynolds numbers. The kernel has been integrated into the research code Virtualfluids [5] and delivers up to 30% of the

  5. Use of advanced earth observation tools for the analyses of recent surface changes in Kalahari pans and Namibian coastal lagoons

    NASA Astrophysics Data System (ADS)

    Behling, Robert; Milewski, Robert; Chabrillat, Sabine; Völkel, Jörg

    2016-04-01

    The remote sensing analyses in the BMBF-SPACES collaborative project Geoarchives - Signals of Climate and Landscape Change preserved in Southern African Geoarchives - focus on the use of recent and upcoming Earth Observation Tools for the study of climate and land-use changes and their impact on the ecosystem. They aim at demonstrating the potential of recently available advanced optical remote sensing imagery, with its extended spectral coverage and temporal resolution, for the identification and mapping of sediment features associated with paleo-environmental archives as well as their recent dynamics. In this study we focus on the analysis of two ecosystems of major interest, the Kalahari salt pans and the lagoons on Namibia's west coast, which exhibit high dynamics caused by combined hydrological and surface processes linked to climatic events. Multitemporal remote sensing techniques allow us to derive the recent surface dynamics of the salt pans and also provide opportunities to gain a detailed understanding of the spatiotemporal development of the coastal lagoons. Furthermore, spaceborne hyperspectral analysis can give insight into the current surface mineralogy of the salt pans on a physical basis and provide the intra-pan distribution of evaporites. The soils and sediments of the Kalahari salt pans, such as the Omongwa pan, are a potentially significant store of carbon and also function as an important terrestrial climate archive. Thus far the surface distribution of evaporites has been assessed only mono-temporally and on a coarse regional scale, and the dynamics of the salt pans, especially the formation of evaporites, are still uncertain and poorly understood. For the salt pan analyses, change detection is applied using the Iteratively Reweighted Multivariate Alteration Detection (IR-MAD) method to identify and investigate surface changes based on a Landsat time series covering the period 1984-2015. Furthermore the current spatial distribution of

  6. Implementation of an advanced hybrid MPC-PID control system using PAT tools into a direct compaction continuous pharmaceutical tablet manufacturing pilot plant.

    PubMed

    Singh, Ravendra; Sahay, Abhishek; Karry, Krizia M; Muzzio, Fernando; Ierapetritou, Marianthi; Ramachandran, Rohit

    2014-10-01

    It is desirable for a pharmaceutical final dosage form to be manufactured through a quality-by-design (QbD) approach rather than a quality-by-testing (QbT) approach. An automatic feedback control system coupled with PAT tools, part of the QbD paradigm shift, has the potential to ensure that the pre-defined end-product quality attributes are met in a time- and cost-efficient manner. In this work, an advanced hybrid MPC-PID control architecture coupled with real-time inline/online monitoring tools and an additional supervisory control layer based on principal component analysis (PCA) is proposed for a continuous direct-compaction tablet manufacturing process. The advantages of both MPC and PID are utilized in a hybrid scheme. The integration of the control hardware and software and the implementation of the control system are demonstrated using the feeder and blending unit operations of a continuous tablet manufacturing pilot plant together with an NIR-based PAT tool. The advanced hybrid MPC-PID control scheme leads to enhanced control-loop performance for the critical quality attributes in comparison with a regulatory (e.g., PID) control scheme, indicating its potential to improve pharmaceutical product quality. PMID:24974987
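
    The hybrid MPC-PID architecture is not detailed in the abstract; one common reading is a cascade in which a receding-horizon optimiser supplies setpoints that a fast PID loop then tracks. The sketch below is a deliberately simplified illustration of that idea for a first-order blending-style unit; the plant model, tuning and horizon are all invented and are not the paper's design:

```python
import numpy as np

class PID:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt, self.integral = kp, ki, dt, 0.0
    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        return self.kp * err + self.ki * self.integral

def mpc_setpoint(x, target, horizon=10, candidates=np.linspace(0.0, 1.0, 21)):
    """Naive receding-horizon search: pick the constant setpoint whose predicted
    trajectory (first-order plant model) best tracks the quality target."""
    best, best_cost = candidates[0], np.inf
    for sp in candidates:
        xk, cost = x, 0.0
        for _ in range(horizon):
            xk = xk + 0.1 * (sp - xk)          # invented first-order model
            cost += (xk - target) ** 2
        if cost < best_cost:
            best, best_cost = sp, cost
    return best

# Closed loop: outer MPC-like layer updates the setpoint, inner PID tracks it
pid, x, target = PID(kp=2.0, ki=0.1, dt=1.0), 0.2, 0.8
for _ in range(30):
    sp = mpc_setpoint(x, target)               # supervisory / MPC layer
    u = pid.step(sp, x)                        # regulatory PID layer
    x = x + 0.1 * (np.clip(u, 0.0, 1.0) - x)   # plant response to the actuator
print(round(float(x), 3))
```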

  7. Final Progress Report: Collaborative Research: Decadal-to-Centennial Climate & Climate Change Studies with Enhanced Variable and Uniform Resolution GCMs Using Advanced Numerical Techniques

    SciTech Connect

    Fox-Rabinovitz, M; Cote, J

    2009-06-05

    The joint U.S.-Canadian project has been devoted to: (a) decadal climate studies using the state-of-the-art GCMs (General Circulation Models) developed with enhanced variable and uniform resolution; (b) development and implementation of advanced numerical techniques; (c) research in parallel computing and associated numerical methods; (d) atmospheric chemistry experiments related to climate issues; (e) validation of regional climate modeling strategies for nested- and stretched-grid models. The variable-resolution stretched-grid (SG) GCMs produce accurate and cost-efficient regional climate simulations with mesoscale resolution. The advantage of the stretched-grid approach is that it allows us to preserve the high quality of both global and regional circulations while providing consistent interactions between global and regional scales and phenomena. The major accomplishment of the project has been the successful international SGMIP-1 and SGMIP-2 (Stretched-Grid Model Intercomparison Project, phase-1 and phase-2) based on these research developments and activities. The SGMIP provides unique high-resolution regional and global multi-model ensembles beneficial for regional climate modeling and the broader modeling community. The U.S. SGMIP simulations have been produced using SciDAC ORNL supercomputers. Collaborations with the other international participants, M. Deque (Meteo-France) and J. McGregor (CSIRO, Australia), and their centers and groups have been beneficial for the strong joint effort, especially for the SGMIP activities. The WMO/WCRP/WGNE endorsed the SGMIP activities in 2004-2008. This project reflects a trend in the modeling and broader communities to move towards regional and sub-regional assessments and applications important for the U.S. and Canadian public, business and policy decision makers, as well as for international collaborations on regional and especially climate-related issues.

  8. Using Advanced Monitoring Tools to Evaluate PM2.5 in San Joaquin Valley

    EPA Science Inventory

    One of the primary data deficiencies that prevent the advance of policy relevant research on particulate matter, ozone, and associated precursors is the lack of measurement data and knowledge on the true vertical profile and synoptic-scale spatial distributions of the pollutants....

  9. FACILITATING ADVANCED URBAN METEOROLOGY AND AIR QUALITY MODELING CAPABILITIES WITH HIGH RESOLUTION URBAN DATABASE AND ACCESS PORTAL TOOLS

    EPA Science Inventory

    Information of urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called National Urban Database with Access Portal Tool, (NUDAPT) that addresses this nee...

  10. Demonstrating Advancements in 3D Analysis and Prediction Tools for Space Weather Forecasting utilizing the Enlil Model

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2012-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Analysis and prediction tools for post processing and visualizing simulation results greatly enhance the utility of these models in aiding space weather forecasters to predict the terrestrial consequences of these events. The Center For Integrated Space Weather Modeling (CISM) Knowledge Transfer (KT) group is making significant progress on an integrated post-processing and analysis and prediction tool based on the ParaView open source visualization application for space weather prediction. These tools will provide space weather forecasters with the ability to use 3D situational awareness of the solar wind, CME, and eventually the geospace environments. Current work focuses on bringing new 3D analysis and prediction tools for the Enlil heliospheric model to space weather forecasters. In this effort we present a ParaView-based model interface that will provide forecasters with an interactive system for analyzing complete 3D datasets from modern space weather models.

  11. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and displays the performance of the integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater. PMID:23982824

  13. Development of a numerical tool to study the mixing phenomenon occurring during mode one operation of a multi-mode ejector-augmented pulsed detonation rocket engine

    NASA Astrophysics Data System (ADS)

    Dawson, Joshua

    simple and as a result of the rapid combustion process the engine cycle is more efficient compared to its combined cycle counterparts. The flow path geometry consists of an inlet system, followed just downstream by a mixing chamber where an ejector structure is placed within the flow path. Downstream of the ejector structure is a duct leading to a convergent-divergent nozzle. During mode one operation and within the ejector, products from the detonation of a stoichiometric hydrogen/air mixture are exhausted directly into the surrounding secondary air stream. Mixing then occurs between both the primary and secondary flow streams, at which point the air mass containing the high pressure, high temperature reaction products is convected downstream towards the nozzle. The engine cycle is engineered to a specific number of detonations per second, creating the pulsating characteristic of the primary flow. The pulsing nature of the primary flow serves as a momentum augmentation, enhancing the thrust and specific impulse at low speeds. Consequently it is necessary to understand the transient mixing process between the primary and secondary flow streams occurring during mode one operation. Using OpenFOAM®, an analytic tool is developed to simulate the dynamics of the turbulent detonation process along with detailed chemistry in order to understand the physics involved with the stream interactions. The computational code has been developed within the framework of OpenFOAM®, an open-source alternative to commercial CFD software. A conservative formulation of the Favre-averaged Navier-Stokes equations is implemented to facilitate programming and numerical stability. Time discretization is accomplished by using the Crank-Nicolson method, achieving second-order convergence in time. Species mass fraction transport equations are implemented and a Seulex ODE solver was used to resolve the system of ordinary differential equations describing the hydrogen-air reaction mechanism detailed
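
    The abstract notes Crank-Nicolson time discretization (second-order in time) for the transport equations, with a separate stiff ODE solver handling the chemistry; a minimal sketch of a Crank-Nicolson step for a 1D species diffusion equation is given below as a stand-in for the full Favre-averaged system. The grid, diffusivity and boundary treatment are simplified assumptions, not the thesis' solver:

```python
import numpy as np
from scipy.linalg import solve_banded

# 1D diffusion of a species mass fraction, Y_t = D * Y_xx, Crank-Nicolson in time
nx, dx, dt, D = 101, 1e-3, 1e-5, 1e-3
r = D * dt / (2.0 * dx * dx)

# Tridiagonal left-hand matrix (I - r*L) in banded storage; Dirichlet end points
ab = np.zeros((3, nx))
ab[0, 1:] = -r                      # superdiagonal
ab[1, :] = 1.0 + 2.0 * r            # diagonal
ab[2, :-1] = -r                     # subdiagonal
ab[1, 0] = ab[1, -1] = 1.0          # boundary rows: identity
ab[0, 1] = ab[2, -2] = 0.0

Y = np.exp(-((np.arange(nx) * dx - 0.05) ** 2) / 1e-4)   # initial mass-fraction pulse

def cn_step(Y):
    """One Crank-Nicolson step: (I - r*L) Y_new = (I + r*L) Y_old."""
    rhs = Y.copy()
    rhs[1:-1] = Y[1:-1] + r * (Y[2:] - 2.0 * Y[1:-1] + Y[:-2])
    return solve_banded((1, 1), ab, rhs)

for _ in range(200):
    # A stiff chemistry sub-step (e.g. SciPy's Radau/BDF on the reaction ODEs)
    # would be operator-split with this transport step in the full solver.
    Y = cn_step(Y)
print(Y.max())
```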

  14. Laser Nano-Neurosurgery from Gentle Manipulation to Nano-Incision of Neuronal Cells and Scaffolds: An Advanced Neurotechnology Tool

    PubMed Central

    Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco

    2016-01-01

    Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single molecules is becoming a well-established methodology, thus marking the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery. PMID:27013962

  15. The Advanced Course in Professional Selling

    ERIC Educational Resources Information Center

    Loe, Terry; Inks, Scott

    2014-01-01

    More universities are incorporating sales content into their curriculums, and although the introductory courses in professional sales have much common ground and guidance from numerous professional selling texts, instructors teaching the advanced selling course lack the guidance provided by common academic tools and materials. The resulting…

  16. Investigation of Alien Wavelength Quality in Live Multi-Domain, Multi-Vendor Link Using Advanced Simulation Tool

    NASA Astrophysics Data System (ADS)

    Nordal Petersen, Martin; Nuijts, Roeland; Lange Bjørn, Lars

    2014-05-01

    This article presents an advanced optical model for simulation of alien wavelengths in multi-domain and multi-vendor dense wavelength-division multiplexing networks. The model aids optical network planners with a better understanding of the non-linear effects present in dense wavelength-division multiplexing systems and better utilization of alien wavelengths in future applications. The limiting physical effects for alien wavelengths are investigated in relation to power levels, channel spacing, and other factors. The simulation results are verified through experimental setup in live multi-domain dense wavelength-division multiplexing systems between two national research networks: SURFnet in Holland and NORDUnet in Denmark.

  17. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  18. Recent advances in elementary flux modes and yield space analysis as useful tools in metabolic network studies.

    PubMed

    Horvat, Predrag; Koller, Martin; Braunegg, Gerhart

    2015-09-01

    A review of the use of elementary flux modes (EFMs) and their applications in metabolic engineering, combined with yield space analysis (YSA), is presented. EFMs are an invaluable tool in mathematical modeling of biochemical processes. They are described from their inception in 1994, followed by various improvements of their computation in later years. YSA constitutes another valuable tool for metabolic network modeling, and is presented in detail along with EFMs in this article. The application of these techniques is discussed for several case studies of metabolic network modeling provided in the respective original articles. The article concludes with some case studies in which the application of EFMs and YSA turned out to be most useful, such as the analysis of intracellular polyhydroxyalkanoate (PHA) formation and consumption in Cupriavidus necator, including the constraint-based description of the steady-state flux cone of the strain's metabolic network, the profound analysis of a continuous five-stage bioreactor cascade for PHA production by C. necator using EFMs and, finally, the study of metabolic fluxes in the metabolic network of C. necator cultivated on glycerol.
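    The defining property of an elementary flux mode, a non-negative flux vector v satisfying S·v = 0 for the stoichiometric matrix S, can be illustrated on a toy network; the sketch below uses an invented two-metabolite network, not the C. necator model discussed in the review.

```python
import numpy as np

# Toy metabolic network (not the C. necator model from the review):
# internal metabolites A, B; reactions R1: ->A, R2: A->B, R3: B->, R4: A->
S = np.array([[ 1, -1,  0, -1],   # balance of A
              [ 0,  1, -1,  0]])  # balance of B

# Candidate flux vectors; a mode must be non-negative with S.v = 0.
candidates = [np.array([1, 1, 1, 0]),   # route A -> B -> out
              np.array([1, 0, 0, 1]),   # route A -> out
              np.array([1, 1, 0, 0])]   # not at steady state

for v in candidates:
    ok = np.allclose(S @ v, 0) and np.all(v >= 0)
    print(v, "steady-state mode" if ok else "violates S.v = 0")
```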

  19. MicroRNA-based diagnostic tools for advanced fibrosis and cirrhosis in patients with chronic hepatitis B and C

    PubMed Central

    Appourchaux, Kevin; Dokmak, Safi; Resche-Rigon, Matthieu; Treton, Xavier; Lapalus, Martine; Gattolliat, Charles-Henry; Porchet, Emmanuelle; Martinot-Peignoux, Michelle; Boyer, Nathalie; Vidaud, Michel; Bedossa, Pierre; Marcellin, Patrick; Bièche, Ivan; Estrabaud, Emilie; Asselah, Tarik

    2016-01-01

    Staging fibrosis is crucial for prognosis and for determining how urgently treatment is needed in patients with chronic hepatitis B (CHB) and C (CHC). The expression of 13 fibrosis-related microRNAs (miRNAs) (miR-20a, miR-21, miR-27a, miR-27b, miR-29a, miR-29c, miR-92a, miR-122, miR-146a, miR-155, miR-221, miR-222, and miR-224) was analyzed in 194 serum samples and 177 liver biopsies of patients with either CHB or CHC to develop models to diagnose advanced fibrosis and cirrhosis (Metavir F3-F4). In CHB patients, the model (serum miR-122, serum miR-222, platelet count and alkaline phosphatase) was more accurate than APRI and FIB-4 in discriminating between mild-to-moderate fibrosis (F1-F2) and F3-F4 (AUC of CHB model: 0.85 vs APRI: 0.70 and FIB-4: 0.81). In CHC patients, the model (hepatic miR-122, hepatic miR-224, platelet count, albumin and alanine aminotransferase) was more accurate than both APRI and FIB-4 in discriminating between patients with F3-F4 and those with F1-F2 (AUC of the CHC model = 0.93 vs APRI: 0.86 and FIB-4: 0.79). Most of the miRNAs tested were differentially expressed in patients with CHB and CHC. In particular, serum miR-122 was 28-fold higher in patients with CHB than in those with CHC. Both the CHB and CHC models may help in the diagnosis of advanced fibrosis and cirrhosis (F3-F4). PMID:27731343
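    The models above combine miRNA levels with routine laboratory values and are compared via the area under the ROC curve; the sketch below shows this kind of AUC evaluation on synthetic data (scikit-learn assumed available; predictors and outcomes are entirely hypothetical).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors standing in for e.g. serum miR-122, miR-222,
# platelet count and alkaline phosphatase; y = 1 marks advanced fibrosis (F3-F4).
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"apparent AUC = {auc:.2f}")   # a real study would cross-validate
```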

  20. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    SciTech Connect

    Naqvi, S

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as
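    A bare-bones version of the second tool's physics, relativistic motion of an electron under the Lorentz force, might look like the sketch below; a teaching code such as the one described would use a more careful integrator (e.g. Boris) and space/time-dependent fields.

```python
import numpy as np

# Minimal relativistic push of a single electron in static E and B fields.
# Field values and the step count are illustrative only.
q, m, c = -1.602e-19, 9.109e-31, 2.998e8
E = np.array([0.0, 0.0, 1e5])       # V/m
B = np.array([0.0, 0.0, 1e-2])      # T

x = np.zeros(3)
p = np.array([1e-22, 0.0, 0.0])     # relativistic momentum, kg m/s
dt = 1e-12

for _ in range(1000):
    gamma = np.sqrt(1.0 + np.dot(p, p) / (m * c) ** 2)
    v = p / (gamma * m)
    p = p + q * (E + np.cross(v, B)) * dt   # Lorentz force update
    x = x + v * dt
print("final position (m):", x)
```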

  1. Process variation monitoring (PVM) by wafer inspection tool as a complementary method to CD-SEM for mapping field CDU on advanced production devices

    NASA Astrophysics Data System (ADS)

    Kim, Dae Jong; Yoo, Hyung Won; Kim, Chul Hong; Lee, Hak Kwon; Kim, Sung Su; Bae, Koon Ho; Spielberg, Hedvi; Lee, Yun Ho; Levi, Shimon; Bustan, Yariv; Rozentsvige, Moshe

    2010-03-01

    As design rules shrink, Critical Dimension Uniformity (CDU) and Line Edge Roughness (LER) have a dramatic effect on the final printed lines, and hence the need to control these parameters increases. Sources of CDU and LER variations include scanner auto-focus accuracy and stability, layer stack thickness, composition variations, and exposure variations. Process variations in advanced VLSI production designs, specifically in memory devices, attributed to CDU and LER affect cell-to-cell parametric variations. These variations significantly impact device performance and die yield. Traditionally, measurements of LER are performed by CD-SEM or OCD metrology tools. Typically, these measurements require a relatively long time to set up and cover only selected points of the wafer area. In this paper we present the results of a collaborative work of the Process Diagnostic & Control Business Unit of Applied Materials and Hynix Semiconductor Inc. on the implementation of a method complementary to the CD-SEM and OCD tools, to monitor defect density and post-litho-develop CDU and LER on production wafers. The method, referred to as Process Variation Monitoring (PVM), is based on measuring variations in the scattered light from periodic structures. The application is demonstrated using an Applied Materials DUV bright field (BF) wafer inspection tool under optimized illumination and collection conditions. The UVision™ tool has already passed a successful feasibility study on DRAM products with 66 nm and 54 nm design rules. The tool has shown high sensitivity to variations across an FEM wafer in both the exposure and focus axes. In this article we show how PVM can help detect field-to-field variations on DRAM wafers with a 44 nm design rule during normal production runs. The complex die layout and the shrink in cell dimensions require high sensitivity to local variations within dies or fields. During normal scans of production wafers, local process variations are translated into GL (grey level) values

  2. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  3. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  5. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  6. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  9. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  13. Magnetic resonance imaging: A potential tool in assessing the addition of hyperthermia to neoadjuvant therapy in patients with locally advanced breast cancer

    PubMed Central

    CRACIUNESCU, OANA I.; THRALL, DONALD E.; VUJASKOVIC, ZELJKO; DEWHIRST, MARK W.

    2010-01-01

    The poor overall survival of patients with locally advanced breast cancers has led over the past decade to the introduction of numerous neoadjuvant combined therapy regimens to down-stage the disease before surgery. At the same time, more evidence suggests the need for treatment individualisation, with a wide variety of new targets for cancer therapeutics and also multi-modality therapies. In this context, early determination of whether the patient will fail to respond can enable the use of alternative therapies that can be more beneficial. The purpose of this review is to examine the potential role of magnetic resonance imaging (MRI) in early prediction of treatment response and prognosis of overall survival in locally advanced breast cancer patients enrolled on multi-modality therapy trials that include hyperthermia. The material is organised with a review of dynamic contrast (DCE)-MRI and diffusion weighted (DW)-MRI for characterisation of phenomenological parameters of tumour physiology and their potential role in estimating therapy response. Most of the work published in this field has focused on responses to neoadjuvant chemotherapy regimens alone, so the emphasis will be there; however, the available data that involve the addition of hyperthermia to the regimen will also be discussed. The review also covers future directions, including the potential use of MRI imaging techniques in establishing the role of hyperthermia alone in modifying the breast tumour microenvironment, together with specific challenges related to performing such studies. PMID:20849258

  14. Interferometric correction system for a numerically controlled machine

    DOEpatents

    Burleson, Robert R.

    1978-01-01

    An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
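    The correction logic described in the patent can be sketched as a simple comparison between commanded and interferometer-measured positions; the pulse size and error threshold below are illustrative values, not those of the patent.

```python
# Conceptual sketch of the correction logic: command pulses drive the slide, a laser
# interferometer reports the true position, and pulses are added to or deleted from
# the pulse train when the error exceeds a preselected magnitude. Values are assumed.
PULSE = 1.0e-3          # mm of travel per command pulse (assumed)
MAX_ERROR = 2.5e-3      # preselected error threshold, mm (assumed)

def correct(commanded_mm, measured_mm):
    """Return number of pulses to add (+) or delete (-) from the pulse train."""
    error = commanded_mm - measured_mm
    if abs(error) <= MAX_ERROR:
        return 0                       # within tolerance, no correction
    return int(round(error / PULSE))   # lagging -> add pulses, leading -> delete

print(correct(10.000, 9.994))   # tool lags commanded position -> +6 pulses
print(correct(10.000, 10.004))  # tool leads commanded position -> -4 pulses
```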

  15. From Numerical Problem Solving to Model-Based Experimentation Incorporating Computer-Based Tools of Various Scales into the ChE Curriculum

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Cutlip, Michael B.; Brauner, Neima

    2009-01-01

    A continuing challenge to the undergraduate chemical engineering curriculum is the time-effective incorporation and use of computer-based tools throughout the educational program. Computing skills in academia and industry require some proficiency in programming and effective use of software packages for solving 1) single-model, single-algorithm…

  16. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine whether the industrial processes study program of the Technological University of Chihuahua, one year after it was certified by CACEI, continues to achieve the established indicators and ISO 9001:2008 requirements by implementing quality tools; monitoring of essential indicators is determined, flow charts are…

  17. Development and implementation of a portable grating interferometer system as a standard tool for testing optics at the Advanced Photon Source beamline 1-BM.

    PubMed

    Assoufid, Lahsen; Shi, Xianbo; Marathe, Shashidhara; Benda, Erika; Wojcik, Michael J; Lang, Keenan; Xu, Ruqing; Liu, Wenjun; Macrander, Albert T; Tischler, Jon Z

    2016-05-01

    We developed a portable X-ray grating interferometer setup as a standard tool for testing optics at the Advanced Photon Source (APS) beamline 1-BM. The interferometer can be operated in phase-stepping, Moiré, or single-grating harmonic imaging mode with 1-D or 2-D gratings. All of the interferometer motions are motorized; hence, it is much easier and quicker to switch between the different modes of operation. A novel aspect of this new instrument is its designed portability. While the setup is designed to be primarily used as a standard tool for testing optics at 1-BM, it could be potentially deployed at other APS beamlines for beam coherence and wavefront characterization or imaging. The design of the interferometer system is described in detail and coherence measurements obtained at the APS 34-ID-E beamline are presented. The coherence was probed in two directions using a 2-D checkerboard, a linear, and a circular grating at X-ray energies of 8 keV, 11 keV, and 18 keV.
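    In phase-stepping mode, the phase of the stepping curve at each pixel is recovered from its first Fourier component; the sketch below demonstrates the retrieval on a synthetic stepping curve (the instrument's actual acquisition and analysis software is not shown).

```python
import numpy as np

# Phase-stepping retrieval for one detector pixel: the grating is stepped over one
# period in N steps and the phase of the intensity oscillation is recovered from the
# first Fourier component. All numbers below are synthetic.
N = 8
k = np.arange(N)
true_phase = 0.7                                         # rad, fringe shift
I = 100 + 40 * np.cos(2 * np.pi * k / N + true_phase)    # stepping curve

c1 = np.sum(I * np.exp(-2j * np.pi * k / N))             # first Fourier coefficient
retrieved = np.angle(c1)
mean, visibility = I.mean(), 2 * np.abs(c1) / (N * I.mean())
print(f"phase = {retrieved:.3f} rad, mean = {mean:.1f}, visibility = {visibility:.2f}")
```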

  18. Numerical simulations of epitaxial growth process in MOVPE reactor as a tool for design of modern semiconductors for high power electronics

    SciTech Connect

    Skibinski, Jakub; Wejrzanowski, Tomasz; Caban, Piotr; Kurzydlowski, Krzysztof J.

    2014-10-06

    In the present study, numerical simulations of the epitaxial growth of gallium nitride in the Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S are addressed. Epitaxial growth means crystal growth that progresses while inheriting the laminar structure and orientation of the substrate crystals. One of the technological problems is to obtain a homogeneous growth rate over the main deposit area. Since there are many agents influencing the reaction over the crystal area, such as temperature, pressure, gas flow or reactor geometry, it is difficult to design an optimal process. Given that it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during crystal growth, modeling is the only way to understand the process precisely. Numerical simulations make it possible to understand the epitaxial process by calculating the heat and mass transfer distribution during the growth of gallium nitride. Including chemical reactions in the numerical model makes it possible to calculate the growth rate on the substrate and to estimate the optimal process conditions for obtaining the most homogeneous product.
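    As a toy stand-in for the heat-transfer part of such a simulation, the sketch below solves 1-D transient heat conduction with explicit finite differences; geometry, material properties and boundary temperatures are invented and far simpler than the coupled reactor CFD described above.

```python
import numpy as np

# Toy 1-D transient heat-conduction solve (explicit finite differences), standing in
# for the far more elaborate coupled heat/mass-transfer CFD described in the abstract.
L, nx, alpha = 0.1, 51, 1e-5            # domain (m), grid points, diffusivity (m^2/s)
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha                # stable explicit time step
T = np.full(nx, 300.0)                  # initial temperature (K)
T[0], T[-1] = 1300.0, 300.0             # hot susceptor side vs. cooled wall (assumed)

for _ in range(2000):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
print(np.round(T[::10], 1))             # developing temperature profile across the gap
```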

  19. Dynamic drag force based on iterative density mapping: A new numerical tool for three-dimensional analysis of particle trajectories in a dielectrophoretic system.

    PubMed

    Knoerzer, Markus; Szydzik, Crispin; Tovar-Lopez, Francisco Javier; Tang, Xinke; Mitchell, Arnan; Khoshmanesh, Khashayar

    2016-02-01

    Dielectrophoresis is a widely used means of manipulating suspended particles within microfluidic systems. In order to efficiently design such systems for a desired application, various numerical methods exist that enable particle trajectory plotting in two or three dimensions based on the interplay of hydrodynamic and dielectrophoretic forces. While various models are described in the literature, few are capable of modeling interactions between particles as well as their surrounding environment, as these interactions are complex, multifaceted, and computationally expensive to the point of being prohibitive when considering a large number of particles. In this paper, we present a numerical model designed to enable spatial analysis of the physical effects exerted upon particles within microfluidic systems employing dielectrophoresis. The model presents a means of approximating the effects of the presence of large numbers of particles through dynamically adjusting hydrodynamic drag force based on particle density, thereby introducing a measure of emulated particle-particle and particle-liquid interactions. This model is referred to as "dynamic drag force based on iterative density mapping." The resultant numerical model is used to simulate and predict particle trajectory and velocity profiles within a microfluidic system incorporating curved dielectrophoretic microelectrodes. The simulated data compare favorably with experimental data gathered using microparticle image velocimetry, and are contrasted against simulated data generated using the traditional "effective moment Stokes-drag method," showing more accurate particle velocity profiles in areas of high particle density.
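    For orientation, the baseline "effective moment Stokes-drag" step that the paper improves upon can be sketched as below; the density-dependent drag correction that defines the new model is not reproduced here, and all parameter values are illustrative.

```python
import numpy as np

# Minimal "effective moment / Stokes drag" particle step, the baseline method the
# paper extends with its density-dependent drag correction (not reproduced here).
eps_m = 80 * 8.854e-12      # medium permittivity (water)
r, eta = 5e-6, 1e-3         # particle radius (m), dynamic viscosity (Pa s)
Re_CM = -0.45               # real part of Clausius-Mossotti factor (assumed)

def dep_force(grad_E2):
    """Time-averaged DEP force, F = 2*pi*eps_m*r^3*Re[CM]*grad(|E|^2)."""
    return 2 * np.pi * eps_m * r**3 * Re_CM * grad_E2

def step(pos, u_fluid, grad_E2, dt):
    """Advance a particle assuming it instantly reaches its Stokes terminal velocity."""
    v = u_fluid + dep_force(grad_E2) / (6 * np.pi * eta * r)
    return pos + v * dt

pos = np.array([0.0, 0.0])
pos = step(pos, u_fluid=np.array([1e-3, 0.0]), grad_E2=np.array([0.0, 5e12]), dt=1e-3)
print(pos)
```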

  20. Evaluation of contaminant removal of reverse osmosis and advanced oxidation in full-scale operation by combining passive sampling with chemical analysis and bioanalytical tools.

    PubMed

    Escher, Beate I; Lawrence, Michael; Macova, Miroslava; Mueller, Jochen F; Poussade, Yvan; Robillot, Cedric; Roux, Annalie; Gernjak, Wolfgang

    2011-06-15

    Advanced water treatment of secondary treated effluent requires stringent quality control to achieve a water quality suitable for augmenting drinking water supplies. The removal of micropollutants such as pesticides, industrial chemicals, endocrine disrupting chemicals (EDC), pharmaceuticals, and personal care products (PPCP) is paramount. As the concentrations of individual contaminants are typically low, frequent analytical screening is both laborious and costly. We propose and validate an approach for continuous monitoring by applying passive sampling with Empore disks in vessels that were designed to slow down the water flow, and thus uptake kinetics, and ensure that the uptake is only marginally dependent on the chemicals' physicochemical properties over a relatively narrow molecular size range. This design not only assured integrative sampling over 27 days for a broad range of chemicals but also permitted the use of a suite of bioanalytical tools as sum parameters, representative of mixtures of chemicals with a common mode of toxic action. Bioassays proved to be more sensitive than chemical analysis to assess the removal of organic micropollutants by reverse osmosis, followed by UV/H₂O₂ treatment, as many individual compounds fell below the quantification limit of chemical analysis, yet still contributed to the observed mixture toxicity. Nonetheless, in several cases, the responses in the bioassays were also below their quantification limits and therefore only three bioassays were evaluated here, representing nonspecific toxicity and two specific end points for estrogenicity and photosynthesis inhibition. Chemical analytical techniques were able to quantify 32 pesticides, 62 PPCPs, and 12 EDCs in reverse osmosis concentrate. However, these chemicals could explain only 1% of the nonspecific toxicity in the Microtox assay in the reverse osmosis concentrate and 0.0025% in the treated water. Likewise only 1% of the estrogenic effect in the E-SCREEN could be

  2. A seamless flash-flood early warning tool based on IDF-curves and coupling of weather-radar with numerical weather predictions

    NASA Astrophysics Data System (ADS)

    Liechti, Kaethi; Knechtl, Valentin; Andres, Norina; Sideris, Ioannis; Zappa, Massimiliano

    2014-05-01

    A flash flood is a flood that develops rapidly after a heavy precipitation event. Flash-flood forecasting is an important field of research because flash floods cause many fatalities and much damage. A flash-flood early warning tool is developed based on precipitation statistics. Our target areas are small ungauged areas of southern Switzerland. A total of 759 sub-catchments was considered. In a first step, intensity-duration-frequency (IDF) curves for each catchment were calculated based on: A) gridded precipitation products for the period 1961 to 2012 and B) gridded reforecasts of the COSMO-LEPS NWP for the period 1971-2000. These catchment-level IDF curves, in combination with precipitation forecasts, are the basis for the flash-flood early warning tool. The forecast models used are COSMO-2 (deterministic, updated every three hours and with a lead time of 24 hours) and COSMO-LEPS (probabilistic, 16 members and with a lead time of five days). In operational mode, COSMO-2 is nudged to real-time weather-radar precipitation obtained by blending the radar QPE with information from a national network of precipitation gauges. This product is called COMBIPRECIP. The flash-flood early warning tool has been evaluated against observed events. These events are either discharge peaks in gauged sub-areas or reports of damage caused by flash-flood events. The hypothesis that it is possible to detect hydrological events with the flash-flood early warning tool can be partly confirmed. The highest skill is obtained if the return period of the weather-radar QPE is assessed at the hourly time scale. With this it was possible to confirm most of the damage events that occurred in 2010 and 2011. The prototype tool is affected by several false alarms, because the initial conditions of the soils are not considered. Further steps will therefore focus on the addition of real-time hydrological information as obtained from the application of high-resolution distributed
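    The core of an IDF-based warning threshold is turning a precipitation intensity into a return period; a minimal sketch, assuming a Gumbel fit to synthetic annual maxima rather than the Swiss data sets used in the study, is given below.

```python
import numpy as np

# Sketch of the IDF-style threshold logic: fit a Gumbel distribution to annual
# maxima of (here, hourly) precipitation for one catchment, then convert an
# observed or forecast intensity into a return period. Data below are synthetic.
rng = np.random.default_rng(1)
annual_max_mmh = rng.gumbel(loc=20.0, scale=6.0, size=50)   # stand-in for 1961-2012 maxima

# Method-of-moments Gumbel fit.
beta = np.sqrt(6) * annual_max_mmh.std(ddof=1) / np.pi
mu = annual_max_mmh.mean() - 0.5772 * beta

def return_period(intensity_mmh):
    """Return period (years) of an hourly intensity under the fitted Gumbel law."""
    p_exceed = 1.0 - np.exp(-np.exp(-(intensity_mmh - mu) / beta))
    return 1.0 / p_exceed

for x in (25.0, 35.0, 45.0):
    print(f"{x:.0f} mm/h  ->  ~{return_period(x):.0f}-year event")
```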

  3. Development of a numerical scheme to predict geomagnetic storms after intense solar events and geomagnetic activity 27 days in advance. Final report, 6 Aug 86-16 Nov 90

    SciTech Connect

    Akasofu, S.I.; Lee, L.H.

    1991-02-01

    The modern geomagnetic storm prediction scheme should be based on a numerical simulation method, rather than on a statistical result. Furthermore, the scheme should be able to predict the geomagnetic storm indices, such as the Dst and AE indices, as a function of time. By recognizing that geomagnetic storms are powered by the solar wind-magnetosphere generator and that its power is given in terms of the solar wind speed, the interplanetary magnetic field (IMF) magnitude and polar angle, the authors have made a major advance in predicting both flare-induced storms and recurrent storms. Furthermore, it is demonstrated that the prediction scheme can be calibrated using the interplanetary scintillation (IPS) observation, when the solar disturbance advances about half-way to the earth. It is shown, however, that we are still far from a reliable prediction scheme. The prediction of the IMF polar angle requires future advance in understanding characteristics of magnetic clouds.
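    The coupling of solar wind speed, IMF magnitude and polar angle to input power is conventionally expressed through Akasofu's epsilon parameter; the sketch below evaluates that standard formula (the exact coupling function used in the authors' scheme is not specified in the abstract).

```python
import numpy as np

# Akasofu's epsilon parameter for the solar wind-magnetosphere generator power:
#   epsilon = (4*pi/mu0) * v * B^2 * sin^4(theta/2) * l0^2,   l0 ~ 7 Earth radii.
mu0 = 4e-7 * np.pi
l0 = 7 * 6.371e6                      # m

def epsilon_watts(v_kms, B_nT, theta_deg):
    v = v_kms * 1e3                   # solar wind speed, m/s
    B = B_nT * 1e-9                   # IMF magnitude, T
    theta = np.radians(theta_deg)     # IMF polar (clock) angle
    return (4 * np.pi / mu0) * v * B**2 * np.sin(theta / 2) ** 4 * l0**2

# Quiet vs. strongly southward-IMF conditions (illustrative values).
print(f"{epsilon_watts(400, 5, 45):.2e} W")
print(f"{epsilon_watts(700, 15, 180):.2e} W")
```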

  4. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA s high level launch system studies from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now Space Launch System (SLS). The Launch Vehicle Team s approach to rapid turn-around and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been developing versions of their vetted tools used on large launch vehicles and repackaged the process and capability to apply to smaller more responsive launch vehicles. Along this development path the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and as an additional development driver, have begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO s launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would provide a great benefit to the customer in order to reduce that large trade space to a select few alternatives that should best fit the customer s payload needs.

  5. Numerical simulation of linear and nonlinear quantum optics as a design tool for free-space quantum communications and quantum imaging

    NASA Astrophysics Data System (ADS)

    Meyers, Ronald E.; Deacon, Keith S.; Rosen, D.

    2002-12-01

    A new quantum optics tool for simulating quantum probability density functions resulting from the linear and nonlinear interaction of photons with atoms and with other photons is developed and presented. It can be used to design and simulate quantum optics experiments used in quantum communications, quantum computing, and quantum imaging. Examples of a photon interacting with linear systems of mirrors and beamsplitters are simulated. Nonlinear simulations of the interaction of three photons resulting in photon momentum entanglement are presented. The wavefunction is expanded in Fock states. Fock states cannot be represented by classical modeling and therefore the results of our modeling can in general represent phenomena, in both the linear and nonlinear cases, which cannot be modeled by classical linear optics. The modeling presented here is more general than classical linear optics. Models of atmospheric turbulence and their simulations are presented and demonstrate the potential for first-principles quantum optics simulations through turbulence in realistic environments.
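    A minimal example of Fock-state modeling of a linear element is a 50/50 beamsplitter acting on a two-photon input, which reproduces Hong-Ou-Mandel bunching that classical linear optics cannot; the sketch below is a generic truncated-Fock-space calculation, not the authors' code.

```python
import numpy as np
from scipy.linalg import expm

# Two-mode Fock-space sketch of a 50/50 beamsplitter (truncated at 2 photons/mode).
d = 3                                        # Fock states |0>, |1>, |2> per mode
a = np.diag(np.sqrt(np.arange(1, d)), 1)     # annihilation operator
I = np.eye(d)
A = np.kron(a, I)                            # mode 1
B = np.kron(I, a)                            # mode 2

theta = np.pi / 4                            # 50/50 splitting ratio
U = expm(theta * (A.conj().T @ B - A @ B.conj().T))

ket_11 = np.zeros(d * d); ket_11[1 * d + 1] = 1.0   # one photon in each input port
out = U @ ket_11
probs = np.abs(out.reshape(d, d)) ** 2
print(np.round(probs, 3))    # P(2,0) = P(0,2) = 0.5, P(1,1) = 0 -> photons bunch
```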

  6. The use of exploratory experimental designs combined with thermal numerical modelling to obtain a predictive tool for hybrid laser/MIG welding and coating processes

    NASA Astrophysics Data System (ADS)

    Bidi, Lyes; Mattei, Simone; Cicala, Eugen; Andrzejewski, Henri; Le Masson, Philippe; Schroeder, Jeanne

    2011-04-01

    While hybrid laser welding and coating processes involve a large number of physical phenomena, it is currently impossible to predict, for a given set of influencing factors, the shape of the molten zone and the history of the temperature fields inside the parts. This remains true for complex processes, such as the hybrid laser/MIG welding process, which consists in combining a laser beam with a MIG torch. The gains obtained result essentially from the synergy of the associated processes: the stability of the process, the quality of the seam realized, and the productivity are increased. This article shows how, by means of a reduced number of experiments (8), it is possible to predict the shape of the molten zone and the temperature field inside the parts, for a given window of influencing factors. The method consists in combining exploratory experimental designs with a numerical model of the thermal phenomena that occur during the process, using the 'heat equivalent source' approach [1-4]. Two validations of this method have been carried out: the first for a set of parameters inside the experimental design, and the other for a set of parameters that lies outside the experimental design but inside the domain investigated.
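    The "reduced number of experiments (8)" corresponds to a two-level, three-factor full factorial design; the sketch below generates such a design, with factor names and levels that are purely illustrative rather than those of the original study.

```python
from itertools import product

# A 2^3 full-factorial exploratory design gives exactly the 8 runs mentioned above;
# factor names and levels here are invented for illustration.
factors = {"laser_power_kW": (2.0, 4.0),
           "wire_feed_m_min": (4.0, 8.0),
           "travel_speed_m_min": (0.5, 1.0)}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(i, run)   # each run would feed the thermal model's equivalent heat source
```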

  7. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: each next-generation station measures all parameters needed for flux computations; the field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; final fluxes, radiation, weather and soil data are merged into a single quality-controlled file; multiple flux stations are linked into an automated time-synchronized network; the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; the PI can assign rights and allow or restrict access to stations and data, so that selected stations can be shared via rights-managed access internally or with external institutions; and researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from
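    At the heart of the on-station flux computation is an eddy-covariance estimate, the covariance of vertical wind and scalar fluctuations over an averaging period; the sketch below shows only that bare calculation on synthetic data, without the rotations, spectral corrections and other steps the stations apply.

```python
import numpy as np

# Bare-bones eddy-covariance flux estimate: the covariance of vertical wind (w)
# and gas concentration (c) fluctuations over a 30-minute averaging period.
rng = np.random.default_rng(2)
n = 10 * 60 * 30                             # 30 min of 10 Hz data (synthetic)
w = rng.normal(0.0, 0.3, n)                  # vertical wind, m/s
c = 400 + 5 * rng.normal(size=n) + 2.0 * w   # CO2, ppm, correlated with w

w_prime = w - w.mean()
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)            # kinematic flux, ppm m/s
print(f"kinematic CO2 flux = {flux:.3f} ppm m/s")
```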

  8. Application of metabolite profiling tools and time-of-flight mass spectrometry in the identification of transformation products of iopromide and iopamidol during advanced oxidation.

    PubMed

    Singh, Randolph R; Lester, Yaal; Linden, Karl G; Love, Nancy G; Atilla-Gokcumen, G Ekin; Aga, Diana S

    2015-03-01

    The efficiency of wastewater treatment systems in removing pharmaceuticals is often assessed on the basis of the decrease in the concentration of the parent compound. However, what is perceived as "removal" during treatment may not necessarily mean mineralization of the pharmaceutical compound but simply conversion into different transformation products (TPs). Using liquid chromatography coupled to a quadrupole time-of-flight mass spectrometer (LC-QToF-MS), we demonstrated conversion of iopromide in wastewater to at least 14 TPs after an advanced oxidation process (AOP) using UV (fluence = 1500 mJ/cm(2)) and H2O2 (10 mg/L). Due to the complexity of the wastewater matrix, the initial experiments were performed using a high concentration (10 mg/L) of iopromide in order to facilitate the identification of TPs. Despite the high concentration of iopromide used, cursory inspection of UV and mass spectra revealed only four TPs in the chromatograms of the post-AOP samples. However, the use of the METLIN database and statistics-based profiling tools commonly used in metabolomics proved effective in discriminating between background signals and TPs derived from iopromide. High-resolution mass data allowed one to predict molecular formulas of putative TPs with errors below 5 ppm relative to the observed m/z. Tandem mass spectrometry (MS/MS) data and isotope pattern comparisons provided necessary information that allowed one to elucidate the structure of iopromide TPs. The presence of the proposed iopromide TPs was determined in unspiked wastewater from a municipal wastewater treatment plant, but neither iopromide nor its TPs were detected. Using analogous structural modifications and oxidation that results from the AOP treatment of iopromide, the potential TPs of iopamidol (a structurally similar compound to iopromide) were predicted. The same mass fragmentation pattern observed in iopromide TPs was applied to the predicted iopamidol TPs. LC-QToF-MS revealed the presence of two iopamidol
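    The "errors below 5 ppm" criterion is the usual mass-accuracy check for formula assignment; a small sketch of the calculation, with illustrative m/z values rather than the iopromide TP masses from the study, is given below.

```python
# Mass-accuracy check of the kind used to assign molecular formulas to
# transformation products: error in ppm between observed and theoretical m/z.
def ppm_error(observed_mz, theoretical_mz):
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

# Illustrative values only (not the iopromide TP masses from the study).
obs, theo = 791.8712, 791.8745
err = ppm_error(obs, theo)
print(f"{err:+.1f} ppm", "-> formula plausible" if abs(err) < 5 else "-> reject formula")
```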

  9. Numerical modeling of late Glacial Laurentide advance of ice across Hudson Strait: Insights into terrestrial and marine geology, mass balance, and calving flux

    USGS Publications Warehouse

    Pfeffer, W.T.; Dyurgerov, M.; Kaplan, M.; Dwyer, J.; Sassolas, C.; Jennings, A.; Raup, B.; Manley, W.

    1997-01-01

    A time-dependent finite element model was used to reconstruct the advance of ice from a late Glacial dome on northern Quebec/Labrador across Hudson Strait to Meta Incognita Peninsula (Baffin Island) and subsequently to the 9.9-9.6 ka 14C Gold Cove position on Hall Peninsula. Terrestrial geological and geophysical information from Quebec and Labrador was used to constrain initial and boundary conditions, and the model results are compared with terrestrial geological information from Baffin Island and considered in the context of the marine event DC-0 and the Younger Dryas cooling. We conclude that advance across Hudson Strait from Ungava Bay to Baffin Island is possible using realistic glacier physics under a variety of reasonable boundary conditions. Production of ice flux from a dome centered on northeastern Quebec and Labrador sufficient to deliver geologically inferred ice thickness at Gold Cove (Hall Peninsula) appears to require extensive penetration of sliding south from Ungava Bay. The discharge of ice into the ocean associated with advance and retreat across Hudson Strait does not peak at a time coincident with the start of the Younger Dryas and is less than minimum values proposed to influence North Atlantic thermohaline circulation; nevertheless, a significant fraction of freshwater input to the North Atlantic may have been provided abruptly and at a critical time by this event.

  10. Numerical Analysis Objects

    NASA Astrophysics Data System (ADS)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
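    The interface-first idea can be sketched (here in Python rather than the project's C++) as tools written against abstract base classes for functions and operators, so that implementations can be swapped freely; the class names below are invented for illustration and are not the NAO interfaces themselves.

```python
from abc import ABC, abstractmethod

# Sketch of the interface-based approach: a tool (the operator) is written against an
# abstract Function interface, so any concrete implementation can be plugged in.
class Function(ABC):
    @abstractmethod
    def value(self, x: float) -> float: ...

class Operator(ABC):
    @abstractmethod
    def apply(self, f: Function, x: float) -> float: ...

class Square(Function):
    def value(self, x): return x * x

class CentralDifference(Operator):
    def __init__(self, h=1e-5): self.h = h
    def apply(self, f, x):                       # works for any Function implementation
        return (f.value(x + self.h) - f.value(x - self.h)) / (2 * self.h)

print(CentralDifference().apply(Square(), 3.0))  # ~6.0
```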

  11. Advanced Tsunami Numerical Simulations and Energy Considerations by use of 3D-2D Coupled Models: The October 11, 1918, Mona Passage Tsunami

    NASA Astrophysics Data System (ADS)

    López-Venegas, Alberto M.; Horrillo, Juan; Pampell-Manis, Alyssa; Huérfano, Victor; Mercado, Aurelio

    2015-06-01

    The most recent tsunami observed along the coast of the island of Puerto Rico occurred on October 11, 1918, after a magnitude 7.2 earthquake in the Mona Passage. The earthquake was responsible for initiating a tsunami that mostly affected the northwestern coast of the island. Runup values from a post-tsunami survey indicated the waves reached up to 6 m. A controversy regarding the source of the tsunami has resulted in several numerical simulations involving either fault rupture or a submarine landslide as the most probable cause of the tsunami. Here we follow up on previous simulations of the tsunami from a submarine landslide source off the western coast of Puerto Rico as initiated by the earthquake. Improvements on our previous study include: (1) higher-resolution bathymetry; (2) a 3D-2D coupled numerical model specifically developed for the tsunami; (3) use of the non-hydrostatic numerical model NEOWAVE (non-hydrostatic evolution of ocean WAVE) featuring two-way nesting capabilities; and (4) comprehensive energy analysis to determine the time of full tsunami wave development. The three-dimensional Navier-Stokes model tsunami solution using the Navier-Stokes algorithm with multiple interfaces for two fluids (water and landslide) was used to determine the initial wave characteristic generated by the submarine landslide. Use of NEOWAVE enabled us to solve for coastal inundation, wave propagation, and detailed runup. Our results were in agreement with previous work in which a submarine landslide is favored as the most probable source of the tsunami, and improvement in the resolution of the bathymetry yielded inundation of the coastal areas that compare well with values from a post-tsunami survey. Our unique energy analysis indicates that most of the wave energy is isolated in the wave generation region, particularly at depths near the landslide, and once the initial wave propagates from the generation region its energy begins to stabilize.

  12. Next-Generation Ion Thruster Design Tool

    NASA Technical Reports Server (NTRS)

    Stolz, Peter

    2015-01-01

    Computational tools that accurately predict the performance of electric propulsion devices are highly desirable and beneficial to NASA and the broader electric propulsion community. The current state of the art in electric propulsion modeling relies heavily on empirical data and numerous computational "knobs." In Phase I of this project, Tech-X Corporation developed the most detailed ion engine discharge chamber model that currently exists. This kinetic model simulates all particles in the discharge chamber along with a physically correct simulation of the electric fields. In addition, kinetic erosion models are included for modeling the ion-impingement effects on thruster component erosion. In Phase II, Tech-X developed a user-friendly computer program for NASA and other governmental and industry customers. Tech-X has implemented a number of advanced numerical routines to bring the computational time down to a commercially acceptable level. NASA now has a highly sophisticated, user-friendly ion engine discharge chamber modeling tool.

  13. Tool setting device

    DOEpatents

    Brown, Raymond J.

    1977-01-01

    The present invention relates to a tool setting device for use with numerically controlled machine tools, such as lathes and milling machines. A reference position of the machine tool relative to the workpiece along both the X and Y axes is utilized by the control circuit for driving the tool through its program. This reference position is determined for both axes by displacing a single linear variable displacement transducer (LVDT) with the machine tool through a T-shaped pivotal bar. The use of the T-shaped bar allows the cutting tool to be moved sequentially in the X or Y direction for indicating the actual position of the machine tool relative to the predetermined desired position in the numerical control circuit by using a single LVDT.

  14. Numerical Development

    ERIC Educational Resources Information Center

    Siegler, Robert S.; Braithwaite, David W.

    2016-01-01

    In this review, we attempt to integrate two crucial aspects of numerical development: learning the magnitudes of individual numbers and learning arithmetic. Numerical magnitude development involves gaining increasingly precise knowledge of increasing ranges and types of numbers: from non-symbolic to small symbolic numbers, from smaller to larger…

  15. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses the emerging issues, such as how learning effectiveness can be understood in relation to…

  16. Self-imposed evaluation of the Helmholtz Research School MICMoR as a tool for quality assurance and advancement of a structured graduate programme

    NASA Astrophysics Data System (ADS)

    Elija Bleher, Bärbel; Schmid, Hans Peter; Scholz, Beate

    2015-04-01

    The Helmholtz Research School MICMoR (Mechanisms and Interactions of Climate Change in Mountain Regions) offers a structured graduate programme for doctoral students in the field of climate change research. It is hosted by the Institute of Meteorology and Climate Research (KIT/IMK-IFU) in Garmisch-Partenkirchen, in collaboration with 7 Bavarian partner universities and research institutions. Hence, MICMoR brings together a considerably large network with currently 20 doctoral students and 55 scientists. MICMoR offers scientific and professional skills training, provides a state-of-the-art supervision concept, and fosters international exchange and interdisciplinary collaboration. In order to develop and advance its programme, MICMoR has committed itself to a self-imposed mid-term review in its third year, to monitor to which extent its original objectives have been reached, and to explore and identify where MICMoR has room for improvement. The evaluation especially focused on recruitment, supervision, training, networking and cooperation. Carried out by an external expert (Beate Scholz from scholz ctc), the evaluation was based on a mixed methods approach, i.e. combining a quantitative survey involving all doctoral candidates as well as their supervisors and focus groups with different MICMoR stakeholders. The evaluation has brought forward some highly interesting results, pinpointing challenges and opportunities of setting up a structured doctoral programme. Overall, the evaluation proved to be a useful tool for evidence-based programme and policy planning, and demonstrated a high level of satisfaction of supervisors and fellows. Supervision, with facets ranging from disciplinary feedback to career advice, is demanding and requires strong commitment and adequate human resources development by all parties involved. Thus, MICMoR plans to offer mentor coaching and calls on supervisors and mentors to form a community of learners with their doctoral students. To

  17. Numerical nebulae

    NASA Astrophysics Data System (ADS)

    Rijkhorst, Erik-Jan

    2005-12-01

    The late stages of evolution of stars like our Sun are dominated by several episodes of violent mass loss. Space based observations of the resulting objects, known as Planetary Nebulae, show a bewildering array of highly symmetric shapes. The interplay between gasdynamics and radiative processes determines the morphological outcome of these objects, and numerical models for astrophysical gasdynamics have to incorporate these effects. This thesis presents new numerical techniques for carrying out high-resolution three-dimensional radiation hydrodynamical simulations. Such calculations require parallelization of computer codes, and the use of state-of-the-art supercomputer technology. Numerical models in the context of the shaping of Planetary Nebulae are presented, providing insight into their origin and fate.

  18. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
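
    As a rough illustration of the kind of measurement such a package automates, the sketch below estimates the azimuthally averaged power spectrum of a square convergence map with plain numpy. The function name, map size and binning are illustrative assumptions, not the LensTools API, and the normalisation shown is one common flat-sky convention.

```python
# Azimuthally averaged flat-sky power spectrum of a convergence map, sketched in
# plain numpy.  Function name, map size and binning are illustrative assumptions,
# not the LensTools API; the normalisation is one common convention.
import numpy as np

def convergence_power_spectrum(kappa, side_deg, n_bins=20):
    n = kappa.shape[0]
    side = np.deg2rad(side_deg)                        # map side length in radians
    power_2d = np.abs(np.fft.fft2(kappa)) ** 2 * side**2 / n**4
    lx = 2.0 * np.pi * np.fft.fftfreq(n, d=side / n)   # multipoles along each axis
    ell = np.hypot(*np.meshgrid(lx, lx, indexing="ij"))
    edges = np.linspace(ell[ell > 0].min(), ell.max(), n_bins + 1)
    idx = np.digitize(ell.ravel(), edges)
    p_l = np.array([power_2d.ravel()[idx == i].mean() if np.any(idx == i) else np.nan
                    for i in range(1, n_bins + 1)])
    return 0.5 * (edges[1:] + edges[:-1]), p_l

# Example: a Gaussian random field standing in for a simulated convergence map.
kappa = np.random.normal(size=(256, 256))
ell, p_ell = convergence_power_spectrum(kappa, side_deg=3.5)
```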

  19. NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL (NUDAPT): FACILITATING ADVANCEMENTS IN URBAN METEOROLOGY AND CLIMATE MODELING WITH COMMUNITY-BASED URBAN DATABASES

    EPA Science Inventory

    We discuss the initial design and application of the National Urban Database and Access Portal Tool (NUDAPT). This new project is sponsored by the USEPA and involves collaborations and contributions from many groups from federal and state agencies, and from private and academic i...

  20. Automatically-Programed Machine Tools

    NASA Technical Reports Server (NTRS)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  1. Numerical Integration

    ERIC Educational Resources Information Center

    Sozio, Gerry

    2009-01-01

    Senior secondary students cover numerical integration techniques in their mathematics courses. In particular, students would be familiar with the "midpoint rule," the elementary "trapezoidal rule" and "Simpson's rule." This article derives these techniques by methods which secondary students may not be familiar with and an approach that…
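
    For reference, the three rules named above are easy to state in code. The sketch below applies each to the integral of sin(x) over [0, pi], whose exact value is 2; the function, interval and number of subintervals are arbitrary choices for illustration.

```python
# Minimal implementations of the midpoint, trapezoidal and (composite) Simpson's
# rules, compared against a known integral.
import math

def midpoint(f, a, b, n):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def trapezoidal(f, a, b, n):
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

def simpson(f, a, b, n):          # n must be even
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

exact = 2.0                        # integral of sin(x) over [0, pi]
for rule in (midpoint, trapezoidal, simpson):
    print(rule.__name__, abs(rule(math.sin, 0.0, math.pi, 64) - exact))
```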

  2. Research on ARM Numerical Control System

    NASA Astrophysics Data System (ADS)

    Wei, Xu; JiHong, Chen

    Computerized Numerical Control (CNC) machine tools are the foundation of modern manufacturing systems, and their advanced digital technology is key to the sustainable development of the machine-tool manufacturing industry. This paper presents the design of a CNC system embedded on ARM, covering both the hardware design and the supporting software. On the hardware side, the core of the motor control unit is the MCX314AL DSP motion-control chip developed by NOVA Electronics Co., Ltd. of Japan; its excellent performance, simple interface and easy programming make machine control convenient. On the software side, the open-source uC/OS-2 is selected as the embedded operating system, and the CNC system is broken down into modules in detail, with task priorities assigned according to their actual requirements. Different mechanisms for inter-module communication and interrupt response are used where appropriate, which guarantees the real-time behaviour and reliability of the numerical control system. As a result, the system not only meets current precision-machining requirements but also offers a good man-machine interface and network support, making it convenient for a variety of machinists to use.

  3. Drug-induced sleep endoscopy as a selection tool for mandibular advancement therapy by oral device in patients with mild to moderate obstructive sleep apnoea.

    PubMed

    De Corso, E; Bastanza, G; Della Marca, G; Grippaudo, C; Rizzotto, G; Marchese, M R; Fiorita, A; Sergi, B; Meucci, D; Di Nardo, W; Paludetti, G; Scarano, E

    2015-12-01

    Nowadays oral appliance therapy is recognised as an effective therapy for many patients with primary snoring and mild to moderate obstructive sleep apnoea (OSA), as well as those with more severe OSA who cannot tolerate positive airway pressure (PAP) therapies. For this reason, it is important to focus on objective criteria to indicate which subjects may benefit from treatment with a mandibular advancement device (MAD). Various anthropometric and polysomnographic predictors have been described in the literature, whereas there is still controversy about the role of drug-induced sleep endoscopy (DISE) and the bimanual advancement manoeuvre as predictors of treatment outcome with an oral device. Herein, we report our experience in the treatment of mild to moderate OSA with an oral appliance selected by DISE. We performed a single institution, longitudinal prospective evaluation of a consecutive group of patients with mild to moderate obstructive sleep apnoea syndrome who underwent DISE. During sleep endoscopy, a gentle mandibular advancement manoeuvre of less than 5 mm was performed. In 30 of 65 patients (46.2%) the improvement in airway patency was unsatisfactory, whereas in 35 of 65 patients (53.8%) the improvement was successful and these patients were considered suitable for oral device application. Because 7 of 35 patients were excluded due to conditions interfering with oral appliance therapy, we finally treated 28 patients. After 3 months of treatment, we observed a significant improvement in the mean Epworth index (7.35 ± 2.8 versus 4.1 ± 2.2, p < 0.05), in mean AHI (21.4 ± 6 versus 8.85 ± 6.9 events per hour, p < 0.05) and in mean ODI (18.6 ± 8 versus 7 ± 5.8 events per hour, p < 0.05). We observed that the apnoea/hypopnoea index (AHI) improved by up to 50% from baseline in 71.4% of patients selected after DISE for MAD therapy. In the current study, mandibular advancement splint therapy was successfully prescribed on the basis not only of severity of disease, as

  4. Advanced Numerical Modeling of Turbulent Atmospheric Flows

    NASA Astrophysics Data System (ADS)

    Kühnlein, Christian; Dörnbrack, Andreas; Gerz, Thomas

    The present chapter introduces the method of computational simulation to predict and study turbulent atmospheric flows. This includes a description of the fundamental approach to computational simulation and the practical implementation using the technique of large-eddy simulation. In addition, selected contributions from IPA scientists to computational model development and various examples for applications are given. These examples include homogeneous turbulence, convective boundary layers, heated forest canopy, buoyant thermals, and large-scale flows with baroclinic wave instability.

  5. Numerical simulations of glass impacts using smooth particle hydrodynamics

    SciTech Connect

    Mandell, D.A.; Wingate, C.A.

    1996-05-01

    As part of a program to develop advanced hydrocode design tools, we have implemented a brittle fracture model for glass into the SPHINX smooth particle hydrodynamics code. We have evaluated this model and the code by predicting data from one-dimensional flyer plate impacts into glass. Since fractured glass properties, which are needed in the model, are not available, we did sensitivity studies of these properties, as well as sensitivity studies to determine the number of particles needed in the calculations. The numerical results are in good agreement with the data. © 1996 American Institute of Physics.

  6. Numerical simulations of glass impacts using smooth particle hydrodynamics

    SciTech Connect

    Mandell, D.A.; Wingate, C.A.

    1995-07-01

    As part of a program to develop advanced hydrocode design tools, we have implemented a brittle fracture model for glass into the SPHINX smooth particle hydrodynamics code. We have evaluated this model and the code by predicting data from one-dimensional flyer plate impacts into glass. Since fractured glass properties, which are needed in the model, are not available, we did sensitivity studies of these properties, as well as sensitivity studies to determine the number of particles needed in the calculations. The numerical results are in good agreement with the data.

  7. Recent Advances in Vibroacoustics

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Mark E.

    2002-01-01

    Numerous vibroacoustics advances and impacts in the aerospace industry have occurred over the last 15 years. This article addresses some of these that developed from engineering programmatic task-work at the NASA Glenn Research Center at Lewis Field.

  8. How we used a patient visit tracker tool to advance experiential learning in systems-based practice and quality improvement in a medical student clinic.

    PubMed

    Chen, Chen Amy; Park, Ryan J; Hegde, John V; Jun, Tomi; Christman, Mitalee P; Yoo, Sun M; Yamasaki, Alisa; Berhanu, Aaron; Vohra-Khullar, Pamela; Remus, Kristin; Schwartzstein, Richard M; Weinstein, Amy R

    2016-01-01

    Poorly designed healthcare systems increase costs and preventable medical errors. To address these issues, systems-based practice (SBP) education provides future physicians with the tools to identify systemic errors and implement quality improvement (QI) initiatives to enhance the delivery of cost-effective, safe and multi-disciplinary care. Although SBP education is being implemented in residency programs and is mandated by the Accreditation Council for Graduate Medical Education (ACGME) as one of its core competencies, it has largely not been integrated into undergraduate medical education. We propose that Medical Student-Faculty Collaborative Clinics (MSFCCs) may be the ideal environment in which to train medical students in SBPs and QI initiatives, as they allow students to play pivotal roles in project development, administration, and management. Here we describe a process of experiential learning that was developed within a newly established MSFCC, which challenged students to identify inefficiencies, implement interventions, and track the results. After identifying bottlenecks in clinic operations, our students designed a patient visit tracker tool to monitor clinic flow and implemented solutions to decrease patient visit times. Our model allowed students to drive their own active learning in a practical clinical setting, providing early and unique training in crucial QI skills.

  9. Regional Arctic System Model (RASM): A Tool to Address the U.S. Priorities and Advance Capabilities for Arctic Climate Modeling and Prediction

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Cassano, J. J.; Gutowski, W. J., Jr.; Nijssen, B.; Osinski, R.; Zeng, X.; Brunke, M.; Duvivier, A.; Hamman, J.; Hossainzadeh, S.; Hughes, M.; Seefeldt, M. W.

    2015-12-01

    The Arctic is undergoing some of the most coordinated rapid climatic changes currently occurring anywhere on Earth, including the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with and feedbacks between atmosphere, ocean and land. While historical reconstructions from Earth System Models (ESMs) are in broad agreement with these changes, the rate of change in ESMs generally remains outpaced by observations. Reasons for that relate to a combination of coarse resolution, inadequate parameterizations, under-represented processes and a limited knowledge of physical interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) in addressing some of the ESM limitations in simulating observed variability and trends in arctic surface climate. RASM is a high resolution, pan-Arctic coupled climate model with the sea ice and ocean model components configured at an eddy-permitting resolution of 1/12o and the atmosphere and land hydrology model components at 50 km resolution, which are all coupled at 20-minute intervals. RASM is an example of limited-area, process-resolving, fully coupled ESM, which due to the constraints from boundary conditions facilitates detailed comparisons with observational statistics that are not possible with ESMs. The overall goal of RASM is to address key requirements published in the Navy Arctic Roadmap: 2014-2030 and in the Implementation Plan for the National Strategy for the Arctic Region, regarding the need for advanced modeling capabilities for operational forecasting and strategic climate predictions through 2030. The main science objectives of RASM are to advance understanding and model representation of critical physical processes and feedbacks of importance to sea ice thickness and area distribution. RASM results are presented to quantify relative contributions by (i) resolved processes and feedbacks as well as (ii) sensitivity to space dependent sub-grid parameterizations to better

  10. Final Progress Report submitted via the DOE Energy Link (E-Link) in June 2009 [Collaborative Research: Decadal-to-Centennial Climate & Climate Change Studies with Enhanced Variable and Uniform Resolution GCMs Using Advanced Numerical Techniques

    SciTech Connect

    Fox-Rabinovitz, M; Cote, J

    2009-10-09

    The joint U.S-Canadian project has been devoted to: (a) decadal climate studies using developed state-of-the-art GCMs (General Circulation Models) with enhanced variable and uniform resolution; (b) development and implementation of advanced numerical techniques; (c) research in parallel computing and associated numerical methods; (d) atmospheric chemistry experiments related to climate issues; (e) validation of regional climate modeling strategies for nested- and stretched-grid models. The variable-resolution stretched-grid (SG) GCMs produce accurate and cost-efficient regional climate simulations with mesoscale resolution. The advantage of the stretched grid approach is that it allows us to preserve the high quality of both global and regional circulations while providing consistent interactions between global and regional scales and phenomena. The major accomplishment for the project has been the successful international SGMIP-1 and SGMIP-2 (Stretched-Grid Model Intercomparison Project, phase-1 and phase-2) based on this research developments and activities. The SGMIP provides unique high-resolution regional and global multi-model ensembles beneficial for regional climate modeling and broader modeling community. The U.S SGMIP simulations have been produced using SciDAC ORNL supercomputers. The results of the successful SGMIP multi-model ensemble simulations of the U.S. climate are available at the SGMIP web site (http://essic.umd.edu/~foxrab/sgmip.html) and through the link to the WMO/WCRP/WGNE web site: http://collaboration.cmc.ec.gc.ca/science/wgne. Collaborations with other international participants M. Deque (Meteo-France) and J. McGregor (CSIRO, Australia) and their centers and groups have been beneficial for the strong joint effort, especially for the SGMIP activities. The WMO/WCRP/WGNE endorsed the SGMIP activities in 2004-2008. This project reflects a trend in the modeling and broader communities to move towards regional and sub-regional assessments and

  11. The effects of using screencasting as a multimedia pre-training tool to manage the intrinsic cognitive load of chemical equilibrium instruction for advanced high school chemistry students

    NASA Astrophysics Data System (ADS)

    Musallam, Ramsey

    Chemistry is a complex knowledge domain. Specifically, research notes that Chemical Equilibrium presents greater cognitive challenges than other topics in chemistry. Cognitive Load Theory describes the impact a subject, and the learning environment, have on working memory. Intrinsic load is the facet of Cognitive Load Theory that explains the complexity innate to complex subjects. The purpose of this study was to build on the limited research into intrinsic cognitive load, by examining the effects of using multimedia screencasts as a pre-training technique to manage the intrinsic cognitive load of chemical equilibrium instruction for advanced high school chemistry students. A convenience sample of 62 fourth-year high school students enrolled in an advanced chemistry course from a co-ed high school in urban San Francisco were given a chemical equilibrium concept pre-test. Upon conclusion of the pre-test, students were randomly assigned to two groups: pre-training and no pre-training. The pre-training group received a 10 minute and 52 second pre-training screencast that provided definitions, concepts and an overview of chemical equilibrium. After pre-training both group received the same 50-minute instructional lecture. After instruction, all students were given a chemical equilibrium concept post-test. Independent sample t-tests were conducted to examine differences in performance and intrinsic load. No significant differences in performance or intrinsic load, as measured by ratings of mental effort, were observed on the pre-test. Significant differences in performance, t(60)=3.70, p=.0005, and intrinsic load, t(60)=5.34, p=.0001, were observed on the post-test. A significant correlation between total performance scores and total mental effort ratings was also observed, r(60)=-0.44, p=.0003. Because no significant differences in prior knowledge were observed, it can be concluded that pre-training was successful at reducing intrinsic load. Moreover, a significant

  12. Magnetospheric ULF wave studies in the frame of Swarm mission: new advanced tools for automated detection of pulsations in magnetic and electric field observations

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Georgiou, Marina; Giamini, Sigiava A.; Sandberg, Ingmar; Haagmans, Roger

    2014-05-01

    The rekindling of the interest in space science in the last 15 years has led to many successful satellite missions in the Earth's magnetosphere and topside ionosphere, which were able to provide the scientific community with high-quality data on the magnetic and electric fields surrounding our planet. This data pool will be further enriched by the measurements of ESA's Swarm mission, a constellation of three satellites in different polar orbits, flying at altitudes from 400 to 550 km, which was launched on the 22nd of November 2013. Aiming at the best scientific exploitation of this corpus of accumulated data, we have developed a set of analysis tools that can cope with measurements of various spacecraft, at various regions of the magnetosphere and in the topside ionosphere. Our algorithms are based on a combination of wavelet spectral methods and artificial neural network techniques and are suited for the detection of waves and wave-like disturbances as well as the extraction of several physical parameters. Our recent work demonstrates the applicability of our developed analysis tools, both for individual case studies and statistical analysis of ultra low frequency (ULF) waves. We provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz), Pc4 (7-22 mHz) and Pc5 (1-7 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA, GIMA and IMAGE magnetometer networks. Our study shows that the same wave event, characterized by increased activity in the high end of the Pc3 band, was simultaneously observed by all three satellite missions and by certain stations of ground networks. This observation provides a strong argument in favour of the

  13. New advanced tools for combined ULF wave analysis of multipoint space-borne and ground observations: application to single event and statistical studies

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Papadimitriou, C.; Daglis, I. A.; Georgiou, M.; Giamini, S. A.

    2013-12-01

    In the past decade, a critical mass of high-quality scientific data on the electric and magnetic fields in the Earth's magnetosphere and topside ionosphere has been progressively collected. This data pool will be further enriched by the measurements of the upcoming ESA/Swarm mission, a constellation of three satellites in three different polar orbits between 400 and 550 km altitude, which is expected to be launched in November 2013. New analysis tools that can cope with measurements of various spacecraft at various regions of the magnetosphere and in the topside ionosphere as well as ground stations will effectively enhance the scientific exploitation of the accumulated data. Here, we report on a new suite of algorithms based on a combination of wavelet spectral methods and artificial neural network techniques and demonstrate the applicability of our recently developed analysis tools both for individual case studies and statistical studies of ultra-low frequency (ULF) waves. First, we provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz) and Pc4-5 (1-22 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA and GIMA magnetometer networks. Then, we perform a statistical study of Pc3 wave events observed by CHAMP for the full decade (2001-2010) of the satellite vector magnetic data: the creation of a database of such events enabled us to derive valuable statistics for many important physical properties relating to the spatio-temporal location of these waves, the wave power and frequency, as well as other parameters and their correlation with solar wind conditions, magnetospheric indices, electron density data, ring current decay

  14. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application

  15. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
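
    As a hedged illustration of the phasor representation such a tool relies on, the sketch below demodulates sinusoidal steady-state signals into complex phasors (amplitude and phase); the signal names, operating frequency and sample values are made-up examples, not taken from the ASC model or the MATLAB GUI.

```python
# Generic sketch: turn sinusoidal steady-state signals into complex phasors, the
# representation underlying phasor diagrams of forces and voltages.  The signals,
# operating frequency and amplitudes below are illustrative assumptions.
import numpy as np

f0 = 80.0                                # assumed operating frequency, Hz
t = np.arange(0, 1.0, 1.0 / 8000.0)      # one second sampled at 8 kHz

def phasor(signal, t, f0):
    """Complex amplitude A*exp(j*phi) of the f0 component of a real signal."""
    ref = np.exp(-2j * np.pi * f0 * t)
    return 2.0 * np.mean(signal * ref)    # demodulate and average over whole periods

voltage = 120.0 * np.cos(2 * np.pi * f0 * t + 0.3)
current = 4.0 * np.cos(2 * np.pi * f0 * t - 0.8)
v_ph, i_ph = phasor(voltage, t, f0), phasor(current, t, f0)
print(abs(v_ph), np.angle(v_ph))          # ~120, ~0.3
print(abs(i_ph), np.angle(i_ph))          # ~4, ~-0.8
```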

  16. Numerical model representation and validation strategies

    SciTech Connect

    Dolin, R.M.; Hefele, J.

    1997-10-01

    This paper describes model representation and validation strategies for use in numerical tools that define models in terms of topology, geometry, or topography. Examples of such tools include Computer-Assisted Engineering (CAE), Computer-Assisted Manufacturing (CAM), Finite Element Analysis (FEA), and Virtual Environment Simulation (VES) tools. These tools represent either physical objects or conceptual ideas using numerical models for the purpose of posing a question, performing a task, or generating information. Dependence on these numerical representations requires that models be precise, consistent across different applications, and verifiable. This paper describes a strategy for ensuring precise, consistent, and verifiable numerical model representations in a topographic framework. The main assertion put forth is that topographic model descriptions are more appropriate for numerical applications than topological or geometrical descriptions. A topographic model verification and validation methodology is presented.

  17. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    NASA Astrophysics Data System (ADS)

    Chen, Ji; Wu, Yiping

    2012-02-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.

  18. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    USGS Publications Warehouse

    Chen, J.; Wu, Y.

    2012-01-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.

  19. Cell electrospinning: a novel tool for functionalising fibres, scaffolds and membranes with living cells and other advanced materials for regenerative biology and medicine.

    PubMed

    Jayasinghe, Suwan N

    2013-04-21

    Recent years have seen interest in approaches for directly generating fibers and scaffolds following a rising trend for their exploration in the health sciences. In this review the author wishes to briefly highlight the many approaches explored to date for generating such structures, while underlining their advantages and disadvantages, and their contribution in particular to the biomedical sciences. Such structures have been demonstrated as having implications in both the laboratory and the clinic, as they mimic the native extra cellular matrix. Interestingly the only materials investigated until very recently for generating fibrous architectures employed either natural or synthetic polymers with or without the addition of functional molecule(s). Arguably although such constructs have been demonstrated to have many applications, they lack the one unit most important for carrying out the ability to directly reconstruct a three-dimensional functional tissue, namely living cells. Therefore recent findings have demonstrated the ability to directly form cell-laden fibers and scaffolds in useful quantities from which functional three-dimensional living tissues can be conceived. These recent developments have far-reaching ramifications to many areas of research and development, a few of which range from tissue engineering and regenerative medicine, a novel approach to analyzing cell behavior and function in real time in three-dimensions, to the advanced controlled and targeted delivery of experimental and/or medical cells and/or genes for localized treatment. At present these developments have passed all in vitro and in vivo mouse model based challenge trials and are now spearheading their journey towards initiating human clinical trials.

  20. Molecular tools for chemical biotechnology

    PubMed Central

    Galanie, Stephanie; Siddiqui, Michael S.; Smolke, Christina D.

    2013-01-01

    Biotechnological production of high value chemical products increasingly involves engineering in vivo multi-enzyme pathways and host metabolism. Recent approaches to these engineering objectives have made use of molecular tools to advance de novo pathway identification, tunable enzyme expression, and rapid pathway construction. Molecular tools also enable optimization of single enzymes and entire genomes through diversity generation and screening, whole cell analytics, and synthetic metabolic control networks. In this review, we focus on advanced molecular tools and their applications to engineered pathways in host organisms, highlighting the degree to which each tool is generalizable. PMID:23528237

  1. The General Comments on HIV adopted by the African Commission on Human and Peoples' Rights as a tool to advance the sexual and reproductive rights of women in Africa.

    PubMed

    Durojaye, Ebenezer

    2014-12-01

    The present article examines the contents and importance of the General Comments adopted by the African Commission on Human and Peoples' Rights on Article 14 (1) (d) and (e) of the Protocol to the African Charter on the Rights of Women in Africa as a tool for advancing women's rights in the context of HIV. Given that discriminatory practices in all facets of life have continued to limit African women's enjoyment of their sexual and reproductive rights and render them susceptible to HIV infection, it becomes vital that African governments adopt appropriate measures to address this challenge. The provisions of the Protocol on the Rights of Women in Africa present great opportunities for this to be realized. The radical and progressive provisions of the Protocol will be of no use to women unless policymakers and other stakeholders have a clear understanding of them and are able to implement them effectively. The adoption of the General Comments is a welcome development, and states and civil society groups must maximize it to advance women's rights.

  2. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    -Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance especially in terms of minimizing PB compared to the single objective model calibration. The third study builds upon the first two studies by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration within the Upper Neuse watershed. In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally-demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina. The third study leveraged the
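
    For orientation, the two calibration objectives named above are commonly computed as in the sketch below; sign conventions for percent bias differ between authors, so this should be read as one reasonable definition rather than the exact formulation used in the dissertation.

```python
# Nash-Sutcliffe efficiency (E) and Percent Bias (PB) in their commonly used forms;
# the example flow values are arbitrary and only demonstrate the calculation.
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

def percent_bias(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 100.0 * np.sum(observed - simulated) / np.sum(observed)

q_obs = np.array([12.0, 15.0, 30.0, 22.0, 18.0])   # example daily flows
q_sim = np.array([10.0, 16.0, 27.0, 24.0, 17.0])
print(nash_sutcliffe(q_obs, q_sim), percent_bias(q_obs, q_sim))
```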

  3. Double diameter boring tool

    DOEpatents

    Ashbaugh, Fred N.; Murry, Kenneth R.

    1988-12-27

    A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting edges formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first cutting edge tip to the axis of rotation plus the distance from the second cutting edge tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second cutting edge tip to the axis of rotation minus one-half the distance from the first cutting edge tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.
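
    Writing r1 and r2 for the distances from the first and second cutting edge tips to the axis of rotation, the wordy relations above reduce (approximately, per the "substantially equal" language of the patent) to the following, where D is the diameter of the elongated tool body and e is the offset of the axis of rotation from the tool centerline.

```latex
% r_1, r_2: tip-to-axis distances of the first and second cutting edges (shorthand)
% D: diameter of the elongated tool body, e: centerline-to-axis offset
D \approx r_1 + r_2, \qquad e \approx \frac{r_2}{2} - \frac{r_1}{2} = \frac{r_2 - r_1}{2}
```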

  4. Double diameter boring tool

    DOEpatents

    Ashbaugh, F.A.; Murry, K.R.

    1986-02-10

    A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting flutes formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first flute tip to the axis of rotation plus the distance from the second flute tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second flute tip to the axis of rotation minus one-half the distance from the first flute tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.

  5. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  6. Sheet Bending using Soft Tools

    NASA Astrophysics Data System (ADS)

    Sinke, J.

    2011-05-01

    Sheet bending is usually performed by air bending and V-die bending processes. Both processes apply rigid tools. These solid tools facilitate the generation of software for the numerical control of those processes. When the lower rigid die is replaced with a soft or rubber tool, the numerical control becomes much more difficult, since the soft tool deforms too. Compared to other bending processes the rubber-backed bending process has some distinct advantages, like large radius-to-thickness ratios, applicability to materials with topcoats, well defined radii, and the feasibility of forming details (ridges, beads). These advantages may give the process exclusive benefits over conventional bending processes, not only for industries related to mechanical engineering and sheet metal forming, but also for other disciplines like Architecture and Industrial Design. The largest disadvantage is that the soft (rubber) tool also deforms. Although the tool deformation is elastic and recovers after each process cycle, the applied force during bending is related to the deformation of the metal sheet and the deformation of the rubber. The deformation of the rubber interacts with the process but also with sheet parameters. This makes the numerical control of the process much more complicated. This paper presents a model for the bending of sheet materials using a rubber lower die. This model can be implemented in software in order to control the bending process numerically. The model itself is based on numerical and experimental research. In this research a number of variables related to the tooling and the material have been evaluated. The numerical part of the research was used to investigate the influence of the features of the soft lower tool, like the hardness and dimensions, and the influence of the sheet thickness, which also interacts with the soft tool deformation. The experimental research was focused on the relation between the machine control parameters and the most

  7. A survey of parallel programming tools

    NASA Technical Reports Server (NTRS)

    Cheng, Doreen Y.

    1991-01-01

    This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with current and future needs of the Numerical Aerodynamic Simulator (NAS) in mind: existing and anticipated NAS supercomputers and workstations; operating systems; programming languages; and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.

  8. Space Station robotics planning tools

    NASA Technical Reports Server (NTRS)

    Testa, Bridget Mintz

    1992-01-01

    The concepts are described for the set of advanced Space Station Freedom (SSF) robotics planning tools for use in the Space Station Control Center (SSCC). It is also shown how planning for SSF robotics operations is an international process, and baseline concepts are indicated for that process. Current SRMS methods provide the backdrop for this SSF theater of multiple robots, long operating time-space, advanced tools, and international cooperation.

  9. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) evolved into a powerful diagnostic tool and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for detection of CAD, and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred—including dual energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing the advances in technology, cardiac CT has advanced beyond the mere evaluation of coronary stenosis to an imaging modality tool that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  10. Percussion tool

    DOEpatents

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing and which is operable to repeatedly strike the tool bit; and a reciprocally moveable piston enclosed within the hammer and which imparts reciprocal movement to the reciprocally moveable hammer.

  11. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  12. Welding mechanics for advanced component safety assessment

    NASA Astrophysics Data System (ADS)

    Siegele, Dieter

    2011-06-01

    Numerical methods are nowadays a useful tool for the calculation of distortion and residual stresses resulting from the welding process. Modern finite element codes not only allow for calculation of deformations and stresses due to the welding process but also take into account the change of microstructure due to different heating and cooling rates. As an extension to the pure welding simulation, the field of welding mechanics combines the mechanics and the material behaviour from the welding process with the assessment of service behaviour of welded components. In the paper, new results of experimental and numerical work in the field of welding mechanics are described. Through examples from automotive, nuclear and pipe-line applications it is demonstrated that an equilibrated treatment and a close interaction of "process", "properties" and "defects" are necessary to come up with an advanced fitness-for-service assessment of welded components.

  13. Advances in HIV Prevention for Serodiscordant Couples

    PubMed Central

    Muessig, Kathryn E.; Cohen, Myron S.

    2014-01-01

    Serodiscordant couples play an important role in maintaining the global HIV epidemic. This review summarizes biobehavioral and biomedical HIV prevention options for serodiscordant couples focusing on advances in 2013 and 2014, including World Health Organization guidelines and best-evidence for couples counseling, couples-based interventions, and the use of antiviral agents for prevention. In the past few years marked advances have been made in HIV prevention for serodiscordant couples and numerous ongoing studies are continuously expanding HIV prevention tools, especially in the area of pre-exposure prophylaxis. Uptake and adherence to antiviral therapy remains a key challenge. Additional research is needed to develop evidence-based interventions for couples, and especially for male-male couples. Randomized trials have demonstrated the prevention benefits of antiretroviral-based approaches among serodiscordant couples; however, residual transmission observed in recognized serodiscordant couples represents an important and resolvable challenge in HIV prevention. PMID:25145645

  14. Operationalizing the TANIC and NICA-L3/L4 Tools to Improve Informatics Competencies.

    PubMed

    Sipes, Carolyn; McGonigle, Dee; Hunter, Kathy; Hebda, Toni; Hill, Taryn; Lamblin, Jean

    2016-01-01

    Two tools were developed for nurses to self-assess different levels of informatics competencies. The TANIC is used for all nurses to self-assess; the NICA-L3/L4 is a tool for the informatics nurse specialist (INS) to self-assess skill levels. There are 167 informatics items in the TANIC and 178 advanced informatics items in the NICA-L3/L4. These tools were piloted; the results are presented here. Based on the evaluation, the tools have been integrated into informatics courses at the BSN and MSN programs at Chamberlain College of Nursing, and presented in two AACN webinars and other national conferences. Numerous requests have been honored to provide the tools for other schools of nursing to use in their courses, including DNP programs. Other requests include those from CNIOs and managers wishing to include the tools in their job descriptions for informatics nurses. PMID:27332209

  15. PV Hourly Simulation Tool

    SciTech Connect

    Dean, Jesse; Metzger, Ian

    2010-12-31

    This software requires inputs of simple general building characteristics and usage information to calculate the energy and cost benefits of solar PV. This tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. This tool includes the option for advanced system design inputs if they are known. This tool calculates energy savings, demand reduction, cost savings, incentives and building life cycle costs including: simple payback, discounted payback, net-present value, and savings to investment ratio. In addition this tool also displays the environmental benefits of a project.
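
    The life cycle cost figures listed above follow standard engineering-economics formulas; the sketch below shows one generic way to compute them and is not the tool's internal implementation (the input values are arbitrary examples).

```python
# Generic life-cycle-cost metrics (simple and discounted payback, NPV, and
# savings-to-investment ratio) using textbook formulas; example inputs only.

def life_cycle_metrics(capital_cost, annual_savings, years, discount_rate):
    simple_payback = capital_cost / annual_savings
    discounted = [annual_savings / (1.0 + discount_rate) ** t for t in range(1, years + 1)]
    npv = sum(discounted) - capital_cost
    sir = sum(discounted) / capital_cost
    # discounted payback: first year in which cumulative discounted savings cover the cost
    cumulative, discounted_payback = 0.0, None
    for year, saving in enumerate(discounted, start=1):
        cumulative += saving
        if discounted_payback is None and cumulative >= capital_cost:
            discounted_payback = year
    return simple_payback, discounted_payback, npv, sir

print(life_cycle_metrics(capital_cost=20000.0, annual_savings=2500.0,
                         years=25, discount_rate=0.05))
```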

  16. PV Hourly Simulation Tool

    2010-12-31

    This software requires inputs of simple general building characteristics and usage information to calculate the energy and cost benefits of solar PV. This tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. This tool includes the option for advanced system design inputs if they are known. This tool calculates energy savings, demand reduction, cost savings, incentives and building life cycle costs including: simple payback, discounted payback, net-present value, and savings to investment ratio. In addition this tool also displays the environmental benefits of a project.

  17. Numerical wave propagation in ImageJ.

    PubMed

    Piedrahita-Quintero, Pablo; Castañeda, Raul; Garcia-Sucerquia, Jorge

    2015-07-20

    An ImageJ plugin for numerical wave propagation is presented. The plugin provides ImageJ, the well-known software for image processing, with the capability of computing numerical wave propagation by the use of angular spectrum, Fresnel, and Fresnel-Bluestein algorithms. The plugin enables numerical wave propagation within the robust environment provided by the complete set of built-in tools for image processing available in ImageJ. The plugin can be used for teaching and research purposes. We illustrate its use to numerically recreate Poisson's spot and Babinet's principle, and in the numerical reconstruction of digitally recorded holograms from millimeter-sized and pure phase microscopic objects.
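
    The plugin itself is Java code running inside ImageJ, but the angular spectrum algorithm it offers is straightforward to sketch with numpy, as below; the wavelength, pixel pitch, aperture size and propagation distance are arbitrary example values, not parameters of the plugin.

```python
# Numpy sketch of the angular spectrum propagation method, one of the three
# algorithms listed above; all numerical values are illustrative assumptions.
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z (same units as wavelength and dx)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    fxx, fyy = np.meshgrid(fx, fx, indexing="ij")
    arg = 1.0 - (wavelength * fxx) ** 2 - (wavelength * fyy) ** 2
    kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)      # evanescent components suppressed
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Circular aperture illuminated by a plane wave; an opaque disc treated the same
# way reproduces the on-axis bright point of Poisson's spot mentioned above.
n, dx, wavelength = 512, 10e-6, 633e-9
x = (np.arange(n) - n / 2) * dx
xx, yy = np.meshgrid(x, x, indexing="ij")
aperture = (np.hypot(xx, yy) < 0.5e-3).astype(complex)
intensity = np.abs(angular_spectrum(aperture, wavelength, dx, z=0.05)) ** 2
```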

  18. Hydroforming Of Patchwork Blanks — Numerical Modeling And Experimental Validation

    NASA Astrophysics Data System (ADS)

    Lamprecht, Klaus; Merklein, Marion; Geiger, Manfred

    2005-08-01

    In comparison to the commonly applied technology of tailored blanks the concept of patchwork blanks offers a number of additional advantages. Potential application areas for patchwork blanks in automotive industry are e.g. local reinforcements of automotive closures, structural reinforcements of rails and pillars as well as shock towers. But even if there is a significant application potential for patchwork blanks in automobile production, industrial realization of this innovative technique is decelerated due to a lack of knowledge regarding the forming behavior and the numerical modeling of patchwork blanks. Especially for the numerical simulation of hydroforming processes, where one part of the forming tool is replaced by a fluid under pressure, advanced modeling techniques are required to ensure an accurate prediction of the blanks' forming behavior. The objective of this contribution is to provide an appropriate model for the numerical simulation of patchwork blanks' forming processes. Therefore, different finite element modeling techniques for patchwork blanks are presented. In addition to basic shell element models a combined finite element model consisting of shell and solid elements is defined. Special emphasis is placed on the modeling of the weld seam. For this purpose the local mechanical properties of the weld metal, which have been determined by means of Martens-hardness measurements and uniaxial tensile tests, are integrated in the finite element models. The results obtained from the numerical simulations are compared to experimental data from a hydraulic bulge test. In this context the focus is laid on laser- and spot-welded patchwork blanks.

  19. Astronomer's Proposal Tool

    NASA Technical Reports Server (NTRS)

    Krueger, Tony

    2005-01-01

    Astronomer's Proposal Tool (APT) is a computer program that assists astronomers in preparing their Phase 1 and Phase 2 Hubble Space Telescope science programs. APT is a successor to the Remote Proposal Submission System 2 (RPS2) program, which has been rendered obsolete by more recent advances in computer software and hardware. APT exploits advances associated with widespread use of the Internet, multiplatform visual development software tools, and overall increases in the power of desktop computer hardware, all in such a way as to make the preparation and submission of proposals more intuitive and make observatory operations less cumbersome. APT provides documentation and help that are friendly, up to date, and easily accessible to users of varying levels of expertise, while defining an extensible framework that is responsive to changes in both technology and observatory operations. APT consists of two major components: (1) a set of software tools that are intuitive, visual, and responsive and (2) an integrated software environment that unifies all the tools and makes them interoperable. The APT tools include the Visual Target Tuner, Proposal Editor, Exposure Planner, Bright Object Checker, and Visit Planner.

  20. Numerical Modelling of Gelating Aerosols

    SciTech Connect

    Babovsky, Hans

    2008-09-01

    The numerical simulation of the gel phase transition of an aerosol system is an interesting and demanding task. Here, we follow an approach first discussed in [6, 8] which turns out to be a useful numerical tool. We investigate several improvements and generalizations. At the center of interest are coagulation-diffusion systems, where the aerosol dynamics is supplemented with diffusive spreading in physical space. This leads to a variety of scenarios (depending on the coagulation kernel and the diffusion model) for the spatial evolution of the gelation area.
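
    A minimal, spatially homogeneous sketch of the kind of system involved is given below: a truncated Smoluchowski coagulation equation with the multiplicative kernel K(i, j) = i*j, the classic gelling case. The diffusive spreading in physical space discussed in the record is omitted, and the truncation size, time step and initial condition are arbitrary choices.

```python
# Truncated Smoluchowski coagulation system with the multiplicative kernel,
# integrated with explicit Euler steps; spatial diffusion is not modelled here.
import numpy as np

K_MAX, DT, STEPS = 200, 1e-3, 1500
n = np.zeros(K_MAX + 1)          # n[k] = number density of k-mers; index 0 unused
n[1] = 1.0                        # monodisperse initial condition

for _ in range(STEPS):
    m = np.arange(K_MAX + 1) * n                      # mass-weighted densities k*n_k
    gain = 0.5 * np.convolve(m, m)[: K_MAX + 1]       # formation: sum over i+j=k of i*n_i*j*n_j
    loss = m * m.sum()                                # removal by coagulation with any cluster
    n = np.maximum(n + DT * (gain - loss), 0.0)       # explicit Euler step

# Total mass held in finite clusters; for this kernel it starts leaking towards
# the "gel" (clusters beyond the truncation) around t = 1.
print((np.arange(K_MAX + 1) * n).sum())
```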

  1. New Instrumental Tools for Advanced Astrochemical Applications

    NASA Astrophysics Data System (ADS)

    Steber, Amanda; Zinn, Sabrina; Schnell, Melanie; Rijs, Anouk

    2015-06-01

    Astrochemistry has been a growing field over the past several years. As data from the Atacama Large Millimeter Array (ALMA) become publicly available, new and fast techniques for analyzing the data will need to be developed, along with fast, sensitive laboratory techniques. Our laboratory is building up instrumentation dedicated to the measurement of astrochemically relevant species, in both the microwave and the millimeter-wave regimes. Discharge experiments, laser ablation experiments, and time-of-flight measurements will be possible with this instrumentation. Coupled with these instrumentation capabilities will be new software aimed at speeding up the analysis. The laboratory data will be used to search for new molecular signatures in the interstellar medium (ISM) and to help elucidate molecular reaction pathways occurring in the ISM.

  2. Handling geophysical flows: Numerical modelling using Graphical Processing Units

    NASA Astrophysics Data System (ADS)

    Garcia-Navarro, Pilar; Lacasta, Asier; Juez, Carmelo; Morales-Hernandez, Mario

    2016-04-01

    Computational tools may help engineers in the assessment of sediment transport during decision-making processes. The main requirements are that the numerical results be accurate and that the simulation models be fast. The present work is based on the 2D shallow water equations in combination with the 2D Exner equation [1]. The accuracy of the resulting numerical model was already discussed in previous work. Regarding the speed of the computation, the Exner equation slows down the already costly 2D shallow water model, as the number of variables to solve is increased and the numerical stability is more restrictive. On the other hand, the movement of poorly sorted material over steep areas constitutes a hazardous environmental problem, and computational tools help in the prediction of such landslides [2]. In order to overcome this problem, this work proposes the use of Graphical Processing Units (GPUs) to decrease the simulation time significantly [3, 4]. The numerical scheme implemented on the GPU is based on a finite volume scheme. The mathematical model and the numerical implementation are compared against experimental and field data. In addition, the computational times obtained with the graphical hardware technology are compared against single-core (sequential) and multi-core (parallel) CPU implementations. References [Juez et al. (2014)] Juez, C., Murillo, J., & García-Navarro, P. (2014) A 2D weakly-coupled and efficient numerical model for transient shallow flow and movable bed. Advances in Water Resources. 71 93-109. [Juez et al. (2013)] Juez, C., Murillo, J., & García-Navarro, P. (2013) 2D simulation of granular flow over irregular steep slopes using global and local coordinates. Journal of Computational Physics. 225 166-204. [Lacasta et al. (2014)] Lacasta, A., Morales-Hernández, M., Murillo, J., & García-Navarro, P. (2014) An optimized GPU implementation of a 2D free surface simulation model on unstructured meshes. Advances in Engineering Software. 78 1-15. [Lacasta
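
    For readers unfamiliar with the numerics, the following is a minimal, hypothetical sketch (not the authors' code) of one explicit finite-volume update of the 1D shallow-water equations with a Rusanov flux. The cell-local structure of this update is what makes such schemes map naturally onto GPU threads; the Exner bed-evolution equation would add a further conserved variable to the same loop.

      import numpy as np

      g = 9.81

      def rusanov_step(h, hu, dx, dt):
          """One finite-volume step for the 1D shallow-water equations."""
          u = hu / np.maximum(h, 1e-12)
          F = np.array([hu, hu * u + 0.5 * g * h**2])              # physical fluxes
          c = np.abs(u) + np.sqrt(g * h)                           # local wave-speed bound
          a = np.maximum(c[:-1], c[1:])                            # interface dissipation speed
          U = np.array([h, hu])
          Fhat = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
          Unew = U.copy()
          Unew[:, 1:-1] -= dt / dx * (Fhat[:, 1:] - Fhat[:, :-1])  # conservative update of interior cells
          return Unew[0], Unew[1]

      # usage: idealized dam break on a flat, fixed bed
      x = np.linspace(0.0, 10.0, 401)
      h = np.where(x < 5.0, 2.0, 1.0)
      hu = np.zeros_like(h)
      for _ in range(200):
          h, hu = rusanov_step(h, hu, dx=x[1] - x[0], dt=0.002)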

  3. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  4. Sarcopenia and physical function in overweight patients with advanced cancer.

    PubMed

    Prado, Carla M M; Lieffers, Jessica R; Bowthorpe, Lindsay; Baracos, Vickie E; Mourtzakis, Marina; McCargar, Linda J

    2013-01-01

    Advanced cancer is associated with numerous metabolic abnormalities that may lead to significant body composition changes, particularly muscle loss or sarcopenia. Sarcopenia in cancer has been associated with poor clinical outcomes, including poor physical function. Accurate tools to assess body composition are expensive and not readily available in clinical settings. Unfortunately, little is known about the efficacy of affordable and portable techniques to assess functional status in patients with cancer. We investigated the prevalence of sarcopenia and its association with different portable and low-cost functional status measurement tools (i.e., handgrip strength testing, a two-minute walking test, and a self-report questionnaire) in overweight/obese patients (body mass index ≥ 25 kg/m²) with advanced cancer. Twenty-eight patients (68% men) aged 64.5 ± 9.5 years with advanced lung or colorectal cancer were included. Sarcopenia was assessed by measuring appendicular skeletal muscle (ASM) adjusted by height (ASM index), using dual energy X-ray absorptiometry. Approximately 36% of patients had sarcopenia. Average handgrip strength was greater in men without sarcopenia than in men with it (p=0.035). In men, ASM index was positively correlated with average (r=0.535, p=0.018) and peak handgrip strength (r=0.457, p=0.049). No differences were observed among female patients. Handgrip strength was associated with sarcopenia in male patients with advanced cancer, and therefore it may be used as a portable and simple nutritional screening tool.

  5. GRIPPING TOOL

    DOEpatents

    Sandrock, R.J.

    1961-12-12

    A self-actuated gripping tool is described for transferring fuel elements and the like into reactors and other inaccessible locations. The tool will grasp or release the load only when properly positioned for this purpose. In addition, the load cannot be released except when unsupported by the tool, so that jarring or contact will not bring about accidental release of the load. The gripping members or jaws of the device are cam-actuated by an axially slidable shaft which has two lockable positions. A spring urges the shaft into one position and a solenoid is provided to overcome the spring and move it into the other position. The weight of the tool operates a sleeve to lock the shaft in its existing position. Only when the cable supporting the tool is slack is the device capable of being actuated either to grasp or release its load. (AEC)

  6. Omics Tools

    SciTech Connect

    Schaumberg, Andrew

    2012-12-21

    The Omics Tools package provides several small, trivial tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package; Omics Tools does not contain Infernal, which may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, whereas cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop; Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command lists the currently available tools:

      schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
      Known commands are:
        cmgbk           : compare cmsearch and GenBank Infernal hits
        cmgff           : compare hits among two GFF (version 3) files
        cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
        cmsearch.local  : find Infernal hits in a genome, on your workstation
        fastats         : FASTA stats, e.g. # bases, GC content
        pal             : stem-loop motif detection by palindromic sequence search (code stub)
        randgrp         : random subsample without replacement, of groups
        randgrpr        : random subsample with replacement, of groups (fast)
        randsub         : random subsample without replacement, of file lines
      For more help regarding a particular command, use: java -jar omics.jar command help
      Usage: java -jar omics.jar command args

  8. Numerical Modeling in Geodynamics: Success, Failure and Perspective

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2005-12-01

    Real success in the numerical modeling of the dynamics of the Earth can be achieved only by multidisciplinary research teams of experts in geodynamics, applied and pure mathematics, and computer science. Success in numerical modeling is based on the following basic, but simple, rules. (i) People need simplicity most, but they understand intricacies best (B. Pasternak, writer). Start from a simple numerical model, which describes the basic physical laws by a set of mathematical equations, and then move to a complex model. Never start from a complex model, because you cannot understand the contribution of each term of the equations to the modeled geophysical phenomenon. (ii) Study the numerical methods behind your computer code. Otherwise it becomes difficult to distinguish true from erroneous solutions to the geodynamic problem, especially when your problem is complex enough. (iii) Test your model against analytical and asymptotic solutions and simple 2D and 3D model examples. Develop benchmark analyses of different numerical codes and compare numerical results with laboratory experiments. Remember that the numerical tool you employ is not perfect, and there are small bugs in every computer code; therefore testing is the most important part of your numerical modeling. (iv) Prove (if possible) or learn relevant statements concerning the existence, uniqueness, and stability of the solution to the mathematical and discrete problems. Otherwise you may be solving an ill-posed problem, and the results of the modeling will be far from the true solution of your model problem. (v) Try to analyze numerical models of a geological phenomenon using as few tuning variables as possible. Even two tuning variables give enough freedom to constrain your model reasonably well with respect to observations. Data fitting is sometimes quite attractive and can take you far from the principal aim of your numerical modeling: to understand geophysical phenomena. (vi) If the number of

  9. Evaluation of dietary assessment tools used to assess the diet of adults participating in the Communities Advancing the Studies of Tribal Nations Across the Lifespan (CoASTAL) cohort

    PubMed Central

    Fialkowski, Marie K.; McCrory, Megan A.; Roberts, Sparkle M.; Tracy, J. Kathleen; Grattan, Lynn M.

    2011-01-01

    Background: Accurate assessment of dietary intake is essential for researchers and public health practitioners to make advancements in health. This is especially important in Native Americans, who display disease prevalence rates that are dramatically higher than those of the general U.S. population. Objective: The objective of this study was to evaluate three dietary assessment tools: 1) dietary records, 2) a food frequency questionnaire (FFQ), and 3) a shellfish assessment survey (SAS) among Native American adults from the Communities Advancing Studies of Tribal Nations Across the Lifespan (CoASTAL) cohort. Design: CoASTAL comprised randomly selected individuals from three tribal registries of Pacific Northwest Tribal Nations. This cross-sectional study used data from the baseline of CoASTAL and was restricted to the non-pregnant adults (18+ yr) who completed the SAS (n=500), an FFQ (n=518), dietary records (n=444), weight measures (n=493), and height measures (n=496). Paired t-tests, Pearson correlation coefficients, and percent agreement were used to evaluate the dietary records and the FFQ with and without accounting for plausibility of reported energy intake (rEI). Sensitivity and specificity, as well as Spearman correlation coefficients, were used to evaluate the SAS and the FFQ compared with dietary records. Results: Statistically significant correlations between the FFQ and dietary records for selected nutrients were not the same by gender. Accounting for plausibility of rEI for the dietary records and the FFQ improved the strength of the correlations for percent energy from protein, percent energy from carbohydrate, and calcium for both men and women. In addition, significant associations between rEI (dietary records and FFQ) and weight were more apparent when using only rEI considered plausible. The SAS was found to assess shellfish consumption similarly to the FFQ. Conclusion: These results support the benefit of multiple measures of diet, including regional

  10. Eclipse Parallel Tools Platform

    2005-02-18

    Designing and developing parallel programs is an inherently complex task. Developers must choose from the many parallel architectures and programming paradigms that are available, and face a plethora of tools that are required to execute, debug, and analyze parallel programs in these environments. Few, if any, of these tools provide any degree of integration, or indeed any commonality in their user interfaces at all. This further complicates the parallel developer's task, hampering software engineering practices and ultimately reducing productivity. One consequence of this complexity is that best practice in parallel application development has not advanced to the same degree as more traditional programming methodologies. The result is that there is currently no open-source, industry-strength platform that provides a highly integrated environment specifically designed for parallel application development. Eclipse is a universal tool-hosting platform that is designed to provide a robust, full-featured, commercial-quality, industry platform for the development of highly integrated tools. It provides a wide range of core services for tool integration that allow tool producers to concentrate on their tool technology rather than on platform-specific issues. The Eclipse Integrated Development Environment is an open-source project that is supported by over 70 organizations, including IBM, Intel and HP. The Eclipse Parallel Tools Platform (PTP) plug-in extends the Eclipse framework by providing support for a rich set of parallel programming languages and paradigms, and a core infrastructure for the integration of a wide variety of parallel tools. The first version of the PTP is a prototype that provides only minimal functionality for parallel tool integration and support for a small number of parallel architectures

  11. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for Dallas-Fort Worth, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and climb to cruise altitude along the most efficient routes.

  12. 3-D Numerical Modeling of a Complex Salt Structure

    SciTech Connect

    House, L.; Larsen, S.; Bednar, J.B.

    2000-02-17

    Reliably processing, imaging, and interpreting seismic data from areas with complicated structures, such as sub-salt, requires a thorough understanding of elastic as well as acoustic wave propagation. Elastic numerical modeling is an essential tool to develop that understanding. While 2-D elastic modeling is in common use, 3-D elastic modeling has been too computationally intensive to be used routinely. Recent advances in computing hardware, including commodity-based hardware, have substantially reduced computing costs. These advances are making 3-D elastic numerical modeling more feasible. A series of example 3-D elastic calculations were performed using a complicated structure, the SEG/EAGE salt structure. The synthetic traces show that the effects of shear wave propagation can be important for imaging and interpretation of images, and also for AVO and other applications that rely on trace amplitudes. Additional calculations are needed to better identify and understand the complex wave propagation effects produced in complicated structures, such as the SEG/EAGE salt structure.
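
    As a point of reference for the kind of calculation involved, the sketch below advances the 1D scalar wave equation u_tt = c^2 u_xx with a second-order finite-difference stencil. It is an illustrative toy only; 3-D elastic codes such as the one discussed here add shear waves, heterogeneous media, sources, and absorbing boundaries, and the operation count grows accordingly.

      import numpy as np

      nx, nt = 400, 800
      dx, c = 10.0, 3000.0                  # grid spacing [m], wave speed [m/s] (placeholders)
      dt = 0.5 * dx / c                     # satisfies the CFL stability limit
      u_prev = np.zeros(nx)
      u_curr = np.zeros(nx)
      u_curr[nx // 2] = 1.0                 # crude point-source initial condition

      for _ in range(nt):
          lap = np.zeros(nx)
          lap[1:-1] = u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]
          u_next = 2.0 * u_curr - u_prev + (c * dt / dx) ** 2 * lap
          u_prev, u_curr = u_curr, u_next   # advance the two-level time stencil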

  13. Advances in Laryngoscopy.

    PubMed

    Aziz, Michael

    2015-01-01

    Recent technological advances have made airway management safer. Because difficult intubation remains challenging to predict, having tools readily available that can be used to manage a difficult airway in any setting is critical. Fortunately, video technology has improved intubation performance across various means of laryngoscopy. These technologies have been applied to rigid optical stylets, flexible intubation scopes, and, most notably, rigid laryngoscopes. These tools have proven effective for the anticipated difficult airway as well as the unanticipated difficult airway.

  14. Modern industrial simulation tools: Kernel-level integration of high performance parallel processing, object-oriented numerics, and adaptive finite element analysis. Final report, July 16, 1993--September 30, 1997

    SciTech Connect

    Deb, M.K.; Kennon, S.R.

    1998-04-01

    A cooperative R&D effort between industry and the US government, this project, under the HPPP (High Performance Parallel Processing) initiative of the Dept. of Energy, started the investigation into parallel object-oriented (OO) numerics. The basic goal was to research and utilize emerging technologies to create a physics-independent computational kernel for applications using the adaptive finite element method. The industrial team included Computational Mechanics Co., Inc. (COMCO) of Austin, TX (as the primary contractor), Scientific Computing Associates, Inc. (SCA) of New Haven, CT, Texaco, and CONVEX. Sandia National Laboratory (Albuquerque, NM) was the technology partner from the government side. COMCO was responsible for the main kernel design and development, SCA had the lead in parallel solver technology, and guidance on OO technologies was Sandia's main contribution to this venture. CONVEX and Texaco supported the partnership with hardware resources and application knowledge, respectively. As such, a minimum of fifty-percent cost-sharing was provided by the industry partnership during this project. This report describes the R&D activities and provides some details about the prototype kernel and example applications.

  15. The Numerical Tokamak Project (NTP) simulation of turbulent transport in the core plasma: A grand challenge in plasma physics

    SciTech Connect

    Not Available

    1993-12-01

    The long-range goal of the Numerical Tokamak Project (NTP) is the reliable prediction of tokamak performance using physics-based numerical tools describing tokamak physics. The NTP is developing the most advanced particle and extended-fluid models on massively parallel processing (MPP) environments as part of a multi-institutional, multi-disciplinary numerical study of tokamak core fluctuations. The NTP is a continuing focus of the Office of Fusion Energy's theory and computation program. Near-term HPCC work concentrates on developing a predictive numerical description of the core plasma transport in tokamaks driven by low-frequency collective fluctuations. This work addresses one of the greatest intellectual challenges to our understanding of the physics of tokamak performance and needs the most advanced computational resources to progress. We are conducting detailed comparisons of kinetic and fluid numerical models of tokamak turbulence. These comparisons are stimulating the improvement of each and the development of hybrid models that embody aspects of both. The combination of emerging massively parallel processing hardware and algorithmic improvements will result in a performance increase estimated at 10^2 to 10^6. Development of information processing and visualization tools is accelerating our comparison of computational models to one another, to experimental data, and to analytical theory, providing a bootstrap effect in our understanding of the target physics. The measure of success is the degree to which the experimentally observed scaling of fluctuation-driven transport may be predicted numerically. The NTP is advancing the HPCC Initiative through its state-of-the-art computational work. We are pushing the capability of high performance computing through our efforts, which are strongly leveraged by OFE support.

  16. Advanced 0.3-NA EUV lithography capabilities at the ALS

    SciTech Connect

    Naulleau, Patrick; Anderson, Erik; Dean, Kim; Denham, Paul; Goldberg, Kenneth A.; Hoef, Brian; Jackson, Keith

    2005-07-07

    For volume nanoelectronics production using Extreme ultraviolet (EUV) lithography [1] to become a reality around the year 2011, advanced EUV research tools are required today. Microfield exposure tools have played a vital role in the early development of EUV lithography [2-4] concentrating on numerical apertures (NA) of 0.2 and smaller. Expected to enter production at the 32-nm node with NAs of 0.25, EUV can no longer rely on these early research tools to provide relevant learning. To overcome this problem, a new generation of microfield exposure tools, operating at an NA of 0.3 have been developed [5-8]. Like their predecessors, these tools trade off field size and speed for greatly reduced complexity. One of these tools is implemented at Lawrence Berkeley National Laboratory's Advanced Light Source synchrotron radiation facility. This tool gets around the problem of the intrinsically high coherence of the synchrotron source [9,10] by using an active illuminator scheme [11]. Here we describe recent printing results obtained from the Berkeley EUV exposure tool. Limited by the availability of ultra-high resolution chemically amplified resists, present resolution limits are approximately 32 nm for equal lines and spaces and 27 nm for semi-isolated lines.

  17. Advanced extravehicular mobility unit study

    NASA Technical Reports Server (NTRS)

    Elkins, W.

    1982-01-01

    Components of the advanced extravehicular mobility unit (suit) are described. Design considerations for radiation protection, extravehicular operational pressure, mobility effects, tool/glove/effector, anthropometric definition, lighting, and equipment turnaround are addressed.

  18. Advanced Heart Failure

    MedlinePlus

  19. Apes produce tools for future use.

    PubMed

    Bräuer, Juliane; Call, Josep

    2015-03-01

    There is now growing evidence that some animal species are able to plan for the future. For example, great apes save and exchange tools for future use. Here we raise the question of whether chimpanzees, orangutans, and bonobos would produce tools for future use. Subjects had access to a baited apparatus for only a limited duration and therefore had to use the time preceding this access to create the appropriate tools in order to obtain the rewards. The apes were tested in three conditions depending on the need for pre-prepared tools: either eight tools, one tool, or no tools were needed to retrieve the reward. The apes prepared tools in advance for future use, and they produced them mainly in the conditions in which they were actually needed. The fact that the apes were able to solve this new task indicates that their planning skills are flexible. In the condition in which eight tools were needed, however, the apes produced fewer than two tools per trial in advance. Nevertheless, they used the chance to produce additional tools during the tool-use phase, thus often obtaining most of the reward from the apparatus. Increased pressure to prepare more tools in advance did not have an effect on their performance.

  1. Robust Neighboring Optimal Guidance for the Advanced Launch System

    NASA Technical Reports Server (NTRS)

    Hull, David G.

    1993-01-01

    In recent years, optimization has become an engineering tool through the availability of numerous successful nonlinear programming codes. Optimal control problems are converted into parameter optimization (nonlinear programming) problems by assuming the control to be piecewise linear, making the unknowns the nodes, or junction points, of the linear control segments. Once the optimal piecewise-linear (suboptimal) control is known, a guidance law for operating near the suboptimal path is the neighboring optimal piecewise-linear control (neighboring suboptimal control). Research conducted under this grant has been directed toward the investigation of neighboring suboptimal control as a guidance scheme for an advanced launch system.
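
    The transcription described above can be illustrated with a small, hypothetical example in which the control of a scalar system dx/dt = u, x(0) = 1 is parameterized as a piecewise-linear function of time and the node values become the unknowns of a nonlinear program handed to a general-purpose optimizer. This is only a sketch of the idea, not the guidance scheme developed under the grant.

      import numpy as np
      from scipy.optimize import minimize

      T, n_nodes, n_steps = 2.0, 6, 200
      t_nodes = np.linspace(0.0, T, n_nodes)        # junction points of the control segments
      t_grid = np.linspace(0.0, T, n_steps + 1)     # integration grid

      def cost(u_nodes):
          """Simulate dx/dt = u with x(0) = 1 and return the integral of x^2 + u^2."""
          u = np.interp(t_grid, t_nodes, u_nodes)   # piecewise-linear control in time
          dt = T / n_steps
          x, J = 1.0, 0.0
          for k in range(n_steps):
              J += (x**2 + u[k]**2) * dt            # rectangle-rule cost accumulation
              x += u[k] * dt                        # explicit Euler state update
          return J

      result = minimize(cost, x0=np.zeros(n_nodes), method="Nelder-Mead")
      print("suboptimal control at the nodes:", np.round(result.x, 3))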

  2. ATST telescope mount: telescope or machine tool

    NASA Astrophysics Data System (ADS)

    Jeffers, Paul; Stolz, Günter; Bonomi, Giovanni; Dreyer, Oliver; Kärcher, Hans

    2012-09-01

    The Advanced Technology Solar Telescope (ATST) will be the largest solar telescope in the world, and will be able to provide the sharpest views ever taken of the solar surface. The telescope has a 4m aperture primary mirror, however due to the off axis nature of the optical layout, the telescope mount has proportions similar to an 8 meter class telescope. The technology normally used in this class of telescope is well understood in the telescope community and has been successfully implemented in numerous projects. The world of large machine tools has developed in a separate realm with similar levels of performance requirement but different boundary conditions. In addition the competitive nature of private industry has encouraged development and usage of more cost effective solutions both in initial capital cost and thru-life operating cost. Telescope mounts move relatively slowly with requirements for high stability under external environmental influences such as wind buffeting. Large machine tools operate under high speed requirements coupled with high application of force through the machine but with little or no external environmental influences. The benefits of these parallel development paths and the ATST system requirements are being combined in the ATST Telescope Mount Assembly (TMA). The process of balancing the system requirements with new technologies is based on the experience of the ATST project team, Ingersoll Machine Tools who are the main contractor for the TMA and MT Mechatronics who are their design subcontractors. This paper highlights a number of these proven technologies from the commercially driven machine tool world that are being introduced to the TMA design. Also the challenges of integrating and ensuring that the differences in application requirements are accounted for in the design are discussed.

  3. Management Tools

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool, originating from the space shuttle program, that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  4. Advanced electron microscopy for advanced materials.

    PubMed

    Van Tendeloo, Gustaaf; Bals, Sara; Van Aert, Sandra; Verbeeck, Jo; Van Dyck, Dirk

    2012-11-01

    The idea of this Review is to introduce newly developed possibilities of advanced electron microscopy to the materials science community. Over the last decade, electron microscopy has evolved into a full analytical tool, able to provide atomic-scale information on the position, nature, and even the valency of atoms. This information is classically obtained in two dimensions (2D), but can now also be obtained in 3D. We show examples of applications in the field of nanoparticles and interfaces.

  5. Technology Tools

    ERIC Educational Resources Information Center

    Kennedy, Mike

    2005-01-01

    Personal computers (PCs) have transformed the way teachers teach, students learn, and school operations are conducted. However, the addition of PCs is not the only technological advancement that can help education institutions run more productively. The progress that has made computers smaller, faster and cheaper also has led to the availability…

  6. Fluid sampling tool

    DOEpatents

    Garcia, Anthony R.; Johnston, Roger G.; Martinez, Ronald K.

    1999-05-25

    A fluid sampling tool for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall.

  7. Fluid sampling tool

    DOEpatents

    Garcia, A.R.; Johnston, R.G.; Martinez, R.K.

    1999-05-25

    A fluid sampling tool is described for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall. 6 figs.

  8. Technology Tools to Support Reading in the Digital Age

    ERIC Educational Resources Information Center

    Biancarosa, Gina; Griffiths, Gina G.

    2012-01-01

    Advances in digital technologies are dramatically altering the texts and tools available to teachers and students. These technological advances have created excitement among many for their potential to be used as instructional tools for literacy education. Yet with the promise of these advances come issues that can exacerbate the literacy…

  9. The quiet revolution of numerical weather prediction.

    PubMed

    Bauer, Peter; Thorpe, Alan; Brunet, Gilbert

    2015-09-01

    Advances in numerical weather prediction represent a quiet revolution because they have resulted from a steady accumulation of scientific knowledge and technological advances over many years that, with only a few exceptions, have not been associated with the aura of fundamental physics breakthroughs. Nonetheless, the impact of numerical weather prediction is among the greatest of any area of physical science. As a computational problem, global weather prediction is comparable to the simulation of the human brain and of the evolution of the early Universe, and it is performed every day at major operational centres across the world.

  10. Numerical Simulation of Carbon Dioxide Injection in the Western Section of the Farnsworth Unit

    SciTech Connect

    White, Mark D.; McPherson, Brian J.; Grigg, Reid B.; Ampomah, William; Appold, Martin S.

    2014-05-05

    Numerical simulation is an invaluable analytical tool for scientists and engineers in making predictions about the fate of carbon dioxide injected into deep geologic formations for long-term storage. Current numerical simulators for assessing storage in deep saline formations have capabilities for modeling strongly coupled processes involving multifluid flow, heat transfer, chemistry, and rock mechanics in geologic media. Except for moderate pressure conditions, numerical simulators for deep saline formations only require the tracking of two immiscible phases and a limited number of phase components, beyond those comprising the geochemical reactive system. The requirements for numerically simulating the utilization and storage of carbon dioxide in partially depleted petroleum reservoirs are more numerous than those for deep saline formations. The minimum number of immiscible phases increases to three, the number of phase components may easily increase fourfold, and the coupled processes of heat transfer, geochemistry, and geomechanics remain. Public and scientific confidence in numerical simulators used for carbon dioxide sequestration in deep saline formations has advanced via a natural progression of the simulators being proven against benchmark problems, code comparisons, laboratory-scale experiments, pilot-scale injections, and commercial-scale injections. This paper describes a new numerical simulator for the scientific investigation of carbon dioxide utilization and storage in partially depleted petroleum reservoirs, with an emphasis on its unique features for scientific investigations, and documents the numerical simulation of carbon dioxide utilization for enhanced oil recovery in the western section of the Farnsworth Unit, which represents an early stage in the progression of numerical simulators for carbon utilization and storage in depleted oil reservoirs.

  12. Climate Data Analysis Tools

    SciTech Connect

    2009-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces, including command-line interaction, stand-alone scripts (applications), and graphical user interfaces (GUIs). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management System or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS).
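
    A minimal sketch of the typical CDAT idiom is shown below. It assumes a NetCDF file named sample.nc containing a variable tas; the exact calls may differ between CDAT releases.

      import cdms2   # Climate Data Management System: metadata-aware gridded data access
      import vcs     # Visualization and Control System: plotting

      f = cdms2.open("sample.nc")   # hypothetical input file
      tas = f("tas")                # read the variable as a masked, metadata-carrying array
      canvas = vcs.init()           # create a VCS canvas
      canvas.plot(tas)              # quick-look plot of the field
      f.close()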

  13. Descendants and advance directives.

    PubMed

    Buford, Christopher

    2014-01-01

    Some of the concerns that have been raised in connection to the use of advance directives are of the epistemic variety. Such concerns highlight the possibility that adhering to an advance directive may conflict with what the author of the directive actually wants (or would want) at the time of treatment. However, at least one objection to the employment of advance directives is metaphysical in nature. The objection to be discussed here, first formulated by Rebecca Dresser and labeled by Allen Buchanan as the slavery argument and David DeGrazia the someone else problem, aims to undermine the legitimacy of certain uses of advance directives by concluding that such uses rest upon an incorrect assumption about the identity over time of those ostensibly governed by the directives. There have been numerous attempts to respond to this objection. This paper aims to assess two strategies that have been pursued to cope with the problem.

  15. Disruptive Innovation in Numerical Hydrodynamics

    SciTech Connect

    Waltz, Jacob I.

    2012-09-06

    We propose the research and development of a high-fidelity hydrodynamic algorithm for tetrahedral meshes that will lead to a disruptive innovation in the numerical modeling of Laboratory problems. Our proposed innovation has the potential to reduce turnaround time by orders of magnitude relative to Advanced Simulation and Computing (ASC) codes; reduce simulation setup costs by millions of dollars per year; and effectively leverage Graphics Processing Unit (GPU) and future Exascale computing hardware. If successful, this work will lead to a dramatic leap forward in the Laboratory's quest for a predictive simulation capability.

  16. Fluid blade disablement tool

    SciTech Connect

    Jakaboski, Juan-Carlos; Hughs, Chance G.; Todd, Steven N.

    2012-01-10

    A fluid blade disablement (FBD) tool that forms both a focused fluid projectile that resembles a blade, which can provide precision penetration of a barrier wall, and a broad fluid projectile that functions substantially like a hammer, which can produce general disruption of structures behind the barrier wall. Embodiments of the FBD tool comprise a container capable of holding fluid, an explosive assembly which is positioned within the container and which comprises an explosive holder and explosive, and a means for detonating. The container has a concavity on the side adjacent to the exposed surface of the explosive. The position of the concavity relative to the explosive and its construction of materials with thicknesses that facilitate inversion and/or rupture of the concavity wall enable the formation of a sharp and coherent blade of fluid advancing ahead of the detonation gases.

  17. Waste glass melter numerical and physical modeling

    SciTech Connect

    Eyler, L.L.; Peters, R.D.; Lessor, D.L.; Lowery, P.S.; Elliott, M.L.

    1991-10-01

    Results of physical and numerical simulation modeling of high-level liquid waste vitrification melters are presented. Physical modeling uses simulant fluids in laboratory testing. Visualization results provide insight into convective melt flow patterns from which information is derived to support performance estimation of operating melters and data to support numerical simulation. Numerical simulation results of several melter configurations are presented. These are in support of programs to evaluate melter operation characteristics and performance. Included are investigations into power skewing and alternating current electric field phase angle in a dual electrode pair reference design and bi-modal convective stability in an advanced design. 9 refs., 9 figs., 1 tab.

  18. Verification and Validation Strategy for LWRS Tools

    SciTech Connect

    Carl M. Stoots; Richard R. Schultz; Hans D. Gougar; Thomas K Larson; Michael Corradini; Laura Swiler; David Pointer; Jess Gehin

    2012-09-01

    One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of nuclear power plant safety margins. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203, the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high-fidelity models / codes, and scaling of aging effects.

  19. Downhole tool

    SciTech Connect

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double-shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of the makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprise an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  20. Cordless Tool

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Apollo-era technology spurred the development of cordless products that we take for granted every day. In the 1960s, NASA asked Black & Decker to develop a special drill that would be powerful enough to cut through hard layers of the lunar surface yet be lightweight, compact, and able to operate under its own power source, allowing Apollo astronauts to collect lunar samples farther away from the Lunar Excursion Module. In response, Black & Decker developed a computer program that analyzed and optimized drill motor operations. From this analysis, engineers were able to design a motor that was powerful yet required minimal battery power to operate. Since those first days of cordless products, Black & Decker has continued to refine this technology, and the company now sells its rechargeable products worldwide (e.g., the Dustbuster, cordless tools for home and industrial use, and medical tools).

  1. Hopper File Management Tool

    SciTech Connect

    Long, J W; O'Neill, N J; Smith, N G; Springmeyer, R R; Remmele, S; Richards, D A; Southon, J

    2004-11-15

    Hopper is a powerful interactive tool that allows users to transfer and manipulate files and directories by means of a graphical user interface. Users can connect to and manage resources using the major file transfer protocols. Implemented in Java, Hopper can be run almost anywhere: from an individual's desktop machine to large production machines. In a high-performance computing environment, managing files can become a difficult and time-consuming task that distracts from scientific work. Users must deal with multiple file transfer protocols, transferring enormous amounts of files between computer platforms, repeated authentication, organizing massive amounts of data, and other detailed but necessary tasks. This is often accomplished with a set of several different tools, each with its own interface and idiosyncrasies. Our goal is to develop tools for a more automated approach to file management that substantially improves users' ability to transfer, organize, search, and operate on collections of files. This paper describes the Hopper tool for advanced file management, including the software architecture, the functionality, and the user interface.

  2. Numerical Boundary Condition Procedures

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Topics include numerical procedures for treating inflow and outflow boundaries, steady and unsteady discontinuous surfaces, far field boundaries, and multiblock grids. In addition, the effects of numerical boundary approximations on stability, accuracy, and convergence rate of the numerical solution are discussed.

  3. Advanced Tools Webinar Series Presents: Regulatory Issues and Case Studies of Advanced Tools

    EPA Science Inventory

    U.S. EPA has released A Guide for Assessing Biodegradation and Source Identification of Organic Ground Water Contaminants using Compound Specific Isotope Analysis (CSIA) [EPA 600/R-08/148 | December 2008 | www.epa.gov/ada]. The Guide provides recommendations for sample collecti...

  4. Numerical simulation of shrouded propellers

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.

    1991-01-01

    A numerical model was developed for the evaluation of the performance characteristics of a shrouded propeller. Using this model, a computational study was carried out to investigate the feasibility of improving the aerodynamic performance of a propeller by encasing it in a shroud. The propeller blade was modeled by a segmented bound vortex positioned along the span of the blade at its quarter-chord line. The shroud was modeled by a number of discrete vortex rings. Because of the mutual dependence of the shroud and propeller vortex strengths and the propeller vortex wake, an iterative scheme was employed. Three shroud configurations were considered: a cylindrical shroud and two conical shrouds. The computed performance of the shrouded propeller was compared with that of a free propeller of identical propeller geometry. The numerical results indicated that the cylindrical shroud outperformed the conical shroud configurations for the cases considered. Furthermore, the cylindrical shroud showed a considerable performance enhancement over the free propeller. However, the improvements were found to decrease with increasing advance ratio and to virtually vanish at advance ratios of about 2.5.
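
    To make the shroud model concrete, the sketch below evaluates the axial velocity induced on the axis by a single circular vortex ring (from the Biot-Savart law) and superposes a set of rings representing a cylindrical shroud. The circulation values, radius, and ring spacing are illustrative placeholders, and the report's iterative coupling with the propeller bound vortex and wake is not reproduced here.

      import numpy as np

      def ring_axial_velocity(gamma, R, z):
          """On-axis axial velocity a distance z from a ring of radius R and circulation gamma."""
          return gamma * R**2 / (2.0 * (R**2 + z**2) ** 1.5)

      def shroud_axial_velocity(gammas, R, z_rings, z_eval):
          """Superpose the contributions of discrete rings located at axial stations z_rings."""
          return sum(ring_axial_velocity(g, R, z_eval - zr)
                     for g, zr in zip(gammas, z_rings))

      # e.g. ten equal-strength rings spanning a short cylindrical shroud of radius 1 m
      z_rings = np.linspace(-0.25, 0.25, 10)
      w_axis = shroud_axial_velocity([0.5] * 10, 1.0, z_rings, z_eval=0.0)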

  5. New efficient optimizing techniques for Kalman filters and numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis

    2016-06-01

    The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years because of the great number of applications that are directly or indirectly affected: renewable energy resource assessment, early warning systems for natural hazards, and questions related to global warming and climate change can be listed among them. Within this framework, the use of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work comes from the solid mathematical background adopted, making use of information geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
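
    As an indication of how such filters are typically used for forecast post-processing, the sketch below runs a scalar Kalman filter on the slowly varying bias of a numerical forecast, modeled as a random walk, and subtracts the estimated bias from subsequent predictions. The noise variances are illustrative placeholders, and the scheme is deliberately simpler than the methods studied in the paper.

      import numpy as np

      def kalman_bias_correction(forecasts, observations, q=0.01, r=1.0):
          """Estimate the forecast bias b_t (random walk b_t = b_{t-1} + w_t) and remove it."""
          b, p = 0.0, 1.0                      # initial bias estimate and its variance
          corrected = []
          for f, y in zip(forecasts, observations):
              p = p + q                        # predict: random-walk variance growth
              corrected.append(f - b)          # debias with the bias known before this observation
              k = p / (p + r)                  # Kalman gain
              b = b + k * ((f - y) - b)        # update with the newly observed forecast error
              p = (1.0 - k) * p
          return np.array(corrected)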

  6. Mathematical simulation of soil vapor extraction systems: Model development and numerical examples

    NASA Astrophysics Data System (ADS)

    Rathfelder, Klaus; Yeh, William W.-G.; Mackay, Douglas

    1991-12-01

    This paper describes the development of a numerical model for the prediction of soil vapor extraction processes. The major emphasis is placed on field-scale predictions, with the objective of advancing the development of planning tools for the design and operation of venting systems. The numerical model solves two-dimensional flow and transport equations for general n-component contaminant mixtures. Flow is limited to the gas phase, and local equilibrium partitioning is assumed in tracking contaminants in the immiscible fluid, water, gas, and solid phases. Model predictions compared favorably with analytical solutions and multicomponent column venting experiments. Sensitivity analysis indicates that equilibrium phase partitioning is a good assumption in modeling the organic liquid volatilization occurring in field venting operations. Mass transfer rates in volatilization from the water phase and in contaminant desorption are potentially rate limiting. Simulations of hypothetical field-scale problems show that the efficiency of venting operations is most sensitive to vapor pressure and to the magnitude and distribution of soil permeability.
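
    The local-equilibrium assumption mentioned above reduces phase partitioning to a linear algebraic relation among the phase concentrations. The hypothetical helper below distributes a bulk (per-bulk-volume) contaminant concentration over the gas, water, and sorbed phases using a dimensionless Henry's constant and a linear sorption coefficient; all parameter values are placeholders, and the organic-liquid (NAPL) phase of the full model is omitted.

      def equilibrium_partition(c_total, theta_a, theta_w, rho_b, H, Kd):
          """Split a bulk concentration [mass per bulk volume] among gas, water, and solid.

          Assumes local equilibrium: C_w = C_g / H (Henry's law, dimensionless H)
          and C_s = Kd * C_w (linear sorption), with air-filled porosity theta_a,
          volumetric water content theta_w, and soil bulk density rho_b.
          """
          c_gas = c_total / (theta_a + theta_w / H + rho_b * Kd / H)
          c_water = c_gas / H
          c_sorbed = Kd * c_water
          return c_gas, c_water, c_sorbed

      # e.g. a sandy soil: theta_a = 0.3, theta_w = 0.1, rho_b = 1.6 kg/L,
      # H = 0.2, Kd = 0.5 L/kg (illustrative values only)
      print(equilibrium_partition(1.0, 0.3, 0.1, 1.6, 0.2, 0.5))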

  7. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  8. CFD Multiphysics Tool

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.

    2005-01-01

    The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, and solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts addressing both the governing equations and practical solution methods for, e.g., electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof-of-concept and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles. This report describes results of collaboration between ACAO and Embry-Riddle Aeronautical University (ERAU) to begin building a set of

  9. Advances in attosecond science

    NASA Astrophysics Data System (ADS)

    Calegari, Francesca; Sansone, Giuseppe; Stagira, Salvatore; Vozzi, Caterina; Nisoli, Mauro

    2016-03-01

    Attosecond science offers formidable tools for the investigation of electronic processes at the heart of important physical processes in atomic, molecular and solid-state physics. In the last 15 years impressive advances have been obtained from both the experimental and theoretical points of view. Attosecond pulses, in the form of isolated pulses or of trains of pulses, are now routinely available in various laboratories. In this review recent advances in attosecond science are reported and important applications are discussed. After a brief presentation of various techniques that can be employed for the generation and diagnosis of sub-femtosecond pulses, various applications are reported in atomic, molecular and condensed-matter physics.

  10. Advanced Solar Power Systems

    NASA Technical Reports Server (NTRS)

    Atkinson, J. H.; Hobgood, J. M.

    1984-01-01

    The Advanced Solar Power System (ASPS) concentrator uses a technically sophisticated design and extensive tooling to produce very efficient (80 to 90%) and versatile energy supply equipment which is inexpensive to manufacture and requires little maintenance. The advanced optical design has two 10th-order, generalized aspheric surfaces in a Cassegrainian configuration, which gives outstanding performance and is relatively insensitive to temperature changes and wind loading. The required manufacturing tolerances have also been achieved. The key to the ASPS is the direct absorption of concentrated sunlight in the working fluid by radiative transfer in a blackbody cavity. The basic ASPS design concepts, efficiency, optical system, and tracking and focusing controls are described.

  11. Advances in surgery.

    PubMed

    Weder, W

    2012-09-01

    In the last decade, technological advances, new staging tools, a better understanding of the role of surgery within multimodal treatment concepts in advanced stages, and progress in the functional assessment of surgical candidates have improved the quality of surgery in the management of patients with lung cancer. Lung resection with video-assisted thoracoscopic access has gained wide acceptance, the indication for lobectomy or sublobar resection in early stages is applied on the basis of new data, and selection for multimodal treatment in stage III is better understood. Treatment in specialized high-volume centers has a major impact on the outcome of patients with lung cancer.

  12. Risk Management Implementation Tool

    NASA Technical Reports Server (NTRS)

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making, in order to assess continually what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. These steps, and the various methods and tools that go along with them, make identifying and dealing with risk clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertising and advocating the use of RMIT at each NASA center.

  13. Advanced Triangulation Displacement Sensors

    NASA Technical Reports Server (NTRS)

    Poteet, Wade M.; Cauthen, Harold K.

    1996-01-01

    Advanced optoelectronic triangulation displacement sensors undergoing development. Highly miniaturized, more stable, more accurate, and relatively easy to use. Incorporate wideband electronic circuits suitable for real-time monitoring and control of displacements. Measurements expected to be accurate to within nanometers. In principle, sensors mass-produced at relatively low unit cost. Potential applications numerous. Possible industrial application in measuring runout of rotating shaft or other moving part during fabrication in "zero-defect" manufacturing system, in which measured runout automatically corrected.
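
    As a rough illustration of the underlying geometry (not the flight design), the sketch below assumes the simplest triangulation arrangement, in which the laser beam is parallel to the camera optical axis and offset by a baseline; the numbers are illustrative only.

      def target_range(baseline_m, focal_length_m, spot_offset_m):
          """Range for the simple geometry where the laser beam is parallel to
          the optical axis: the spot images at offset x = f*b/z, so z = f*b/x."""
          return focal_length_m * baseline_m / spot_offset_m

      def range_change_per_spot_shift(baseline_m, focal_length_m, z_m, shift_m):
          """Range change for a small spot shift, from differentiating z = f*b/x:
          dz = z**2 / (f*b) * dx."""
          return z_m ** 2 / (focal_length_m * baseline_m) * shift_m

      # 50 mm lens, 100 mm baseline, target at 0.5 m, 1 um spot-centroid shift.
      print(target_range(0.10, 0.050, 0.010))                       # 0.5 m
      print(range_change_per_spot_shift(0.10, 0.050, 0.5, 1.0e-6))  # 5e-05 m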

  14. Lower Paleolithic bone tools from the 'Spear Horizon' at Schöningen (Germany).

    PubMed

    Van Kolfschoten, Thijs; Parfitt, Simon A; Serangeli, Jordi; Bello, Silvia M

    2015-12-01

    The Lower Paleolithic locality of Schöningen 13 II-4 is famous for the discovery of wooden spears found amongst the butchered remains of numerous horses and other large herbivores. Although the spears have attracted the most interest, other aspects of the associated artifact assemblage have received less attention. Here we describe an extraordinary assemblage of 88 bone tools from the 'Spear Horizon.' This sample includes numerous long-bone shaft fragments (mostly of horse), three ribs used as 'retouchers' to resharpen flint tools, and a complete horse innominate that was used as an anvil in bipolar knapping. Most of the retouchers were prepared by scraping the diaphysis of fresh and dry long-bones. Technological analysis of the associated lithic assemblage demonstrates exhaustive resharpening to maintain functional cutting edges. Whereas the flint tools were brought to the site, curated, and maintained, the retouchers had a shorter use-history and were either discarded after a limited period or broken to extract marrow. Horse and bison metapodials with flaked and rounded epiphyses are interpreted as hammers used to break marrow bones. Several of the 'metapodial hammers' were additionally used as knapping percussors. These constitute the earliest evidence of multi-purpose bone tools in the archeological record. Our results highlight the advanced knowledge in the use of bones as tools during the Lower Paleolithic, with major implications for understanding aspects of non-lithic technology and planning depth in early hominins. PMID:26653208

  16. Advanced Virtual Reality Simulations in Aerospace Education and Research

    NASA Astrophysics Data System (ADS)

    Plotnikova, L.; Trivailo, P.

    2002-01-01

    Recent research developments at Aerospace Engineering, RMIT University have demonstrated great potential for using Virtual Reality simulations as a very effective tool in advanced structures and dynamics applications. They have also been extremely successful in the teaching of various undergraduate and postgraduate courses, presenting complex concepts in structural and dynamics design. Characteristic examples are related to classical orbital mechanics, spacecraft attitude dynamics, and structural dynamics. Advanced simulations, reflecting current research by the authors, are mainly related to the implementation of various non-linear dynamic techniques, including using Kane's equations to study the dynamics of space tethered satellite systems and the co-rotational finite element method to study reconfigurable robotic systems undergoing large rotations and large translations. The current article describes the numerical implementation of the modern methods of dynamics and concentrates on the post-processing stage of the dynamic simulations. Numerous examples of Virtual Reality stand-alone animations built by the authors are discussed in detail. The striking feature of the developed technology is the use of standard mathematical packages, like MATLAB, as a post-processing tool to generate Virtual Reality Modelling Language files with brilliant interactive graphics and audio effects. These stand-alone demonstration files can be run under Netscape or Microsoft Explorer and do not require MATLAB. Use of this technology enables scientists to easily share their results with colleagues over the Internet, contributing to flexible learning development at schools and universities.

  17. Tool Gear: Infrastructure for Parallel Tools

    SciTech Connect

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  18. HIP-assisted CTE mismatch tooling

    SciTech Connect

    Zick, D.H.

    1996-12-31

    A novel tooling technique is described which allows diffusion bonding of components with excellent dimensional control. The technique makes use of the difference in coefficients of thermal expansion (CTE) between the tooling and the bonded components. Unlike traditional CTE mismatch tooling, the new technique allows low tensile strength, low cost materials such as graphite or ceramics to be used as the major tooling structure. Hot isostatic pressing (HIP) is employed to clamp together the tooling through a surrounding metallic capsule. An example will be presented of how the technique was used to bond numerous patterned stainless steel plates into a block containing intricate interconnected passages.
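
    The clamping action comes from the differential thermal expansion between the low-CTE tooling and the metallic components it surrounds; a one-line estimate of the interference is sketched below with illustrative (not source) property values.

      # Differential thermal expansion of a stainless steel stack inside a
      # graphite tooling cavity.  All values are approximate and illustrative.
      alpha_steel = 17.3e-6     # 1/K, austenitic stainless steel
      alpha_graphite = 4.0e-6   # 1/K, typical molded graphite
      delta_T = 900.0           # K, rise from room temperature to bonding temperature
      length = 0.200            # m, stack dimension across the bond planes

      # The steel tries to grow more than the graphite cavity allows; the
      # mismatch becomes clamping interference (and hence bonding pressure).
      interference = (alpha_steel - alpha_graphite) * delta_T * length
      print(f"interference over 200 mm: {interference * 1e3:.2f} mm")  # ~2.4 mm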

  19. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  20. WGS Analysis and Interpretation in Clinical and Public Health Microbiology Laboratories: What Are the Requirements and How Do Existing Tools Compare?

    PubMed Central

    Wyres, Kelly L.; Conway, Thomas C.; Garg, Saurabh; Queiroz, Carlos; Reumann, Matthias; Holt, Kathryn; Rusu, Laura I.

    2014-01-01

    Recent advances in DNA sequencing technologies have the potential to transform the field of clinical and public health microbiology, and in the last few years numerous case studies have demonstrated successful applications in this context. Among other considerations, a lack of user-friendly data analysis and interpretation tools has been frequently cited as a major barrier to routine use of these techniques. Here we consider the requirements of microbiology laboratories for the analysis, clinical interpretation and management of bacterial whole-genome sequence (WGS) data. Then we discuss relevant, existing WGS analysis tools. We highlight many essential and useful features that are represented among existing tools, but find that no single tool fulfils all of the necessary requirements. We conclude that to fully realise the potential of WGS analyses for clinical and public health microbiology laboratories of all scales, we will need to develop tools specifically with the needs of these laboratories in mind. PMID:25437808

  1. Recent advances in dermoscopy

    PubMed Central

    Russo, Teresa; Piccolo, Vincenzo; Lallas, Aimilios; Argenziano, Giuseppe

    2016-01-01

    The use of dermoscopy has offered a new morphological dimension of skin lesions and has provided an effective diagnostic tool to differentiate melanoma from other benign or malignant skin tumors but also to support the clinical diagnosis in general dermatology. The aim of this article is to provide an overview of the most recent and important advances in the rising world of dermoscopy. PMID:26949523

  2. General model for boring tool optimization

    NASA Astrophysics Data System (ADS)

    Moraru, G. M.; rbes, M. V. Ze; Popescu, L. G.

    2016-08-01

    Optimizing a tool (and therefore a boring tool) consists in improving its performance by maximizing the objective functions chosen by the designer and/or by the user. Numerous features and performance requirements demanded by tool users contribute to the definition and implementation of the proposed objective functions. The incorporation of new features makes the cutting tool competitive in the market and able to meet user requirements.

  3. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  4. Numerical simulation of "an American haboob"

    NASA Astrophysics Data System (ADS)

    Vukovic, A.; Vujadinovic, M.; Pejanovic, G.; Andric, J.; Kumjian, M. R.; Djurdjevic, V.; Dacic, M.; Prasad, A. K.; El-Askary, H. M.; Paris, B. C.; Petkovic, S.; Nickovic, S.; Sprigg, W. A.

    2014-04-01

    A dust storm of fearful proportions hit Phoenix in the early evening hours of 5 July 2011. This storm, an American haboob, was predicted hours in advance because numerical land-atmosphere modeling, computing power, and remote sensing of dust events have improved greatly over the past decade. High-resolution numerical models are required for accurate simulation of the small scales of the haboob process, with high-velocity surface winds produced by strong convection and severe downbursts. Dust-productive areas in this region consist mainly of agricultural fields, with soil surfaces disturbed by plowing, and tracts of land in the high Sonoran Desert laid barren by ongoing drought. Model simulation of the 5 July 2011 dust storm uses the coupled atmospheric-dust model NMME-DREAM (Non-hydrostatic Mesoscale Model on E grid, Janjic et al., 2001; Dust REgional Atmospheric Model, Nickovic et al., 2001; Pérez et al., 2006) with 4 km horizontal resolution. A mask of the potentially dust-productive regions is obtained from land cover and normalized difference vegetation index (NDVI) data from the Moderate Resolution Imaging Spectroradiometer (MODIS). The scope of this paper is validation of the dust model performance, not use of the model as a tool to investigate mechanisms related to the storm. Results demonstrate the potential technical capacity and availability of the relevant data to build an operational system for dust storm forecasting as a part of a warning system. Model results are compared with radar and other satellite-based images and surface meteorological and PM10 observations. The atmospheric model successfully hindcasted the position of the front in space and time, with about 1 h late arrival in Phoenix. The dust model predicted the rapid uptake of dust and high values of dust concentration in the ensuing storm. South of Phoenix, over the closest source regions (~25 km), the model PM10 surface dust concentration reached ~2500 μg m-3, but

  5. Sasquatch Footprint Tool

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin

    2013-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted or released from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish an aircraft release point that will ensure that the article and all items released from it will land in safe locations. A new footprint predictor tool, called Sasquatch, was created in MATLAB. This tool takes in a simulated trajectory for the test article, information about all released objects, and atmospheric wind data (simulated or actual) to calculate the trajectories of the released objects. Dispersions are applied to the landing locations of those objects, taking into account the variability of winds, aircraft release point, and object descent rate. Sasquatch establishes a payload release point (e.g., where the payload will be extracted from the carrier aircraft) that will ensure that the payload and all objects released from it will land in a specified cleared area. The landing locations (the final points in the trajectories) are plotted on a map of the test range. Sasquatch was originally designed for CPAS drop tests and includes extensive information about both the CPAS hardware and the primary test range used for CPAS testing. However, it can easily be adapted for more complex CPAS drop tests, other NASA projects, and commercial partners. CPAS has developed the Sasquatch footprint tool to ensure range safety during parachute drop tests. Sasquatch is well correlated to test data and continues to ensure the safety of test personnel as well as the safe recovery of all equipment. The tool will continue to be modified based on new test data, improving predictions and providing added capability to meet the requirements of more complex testing.
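
    A greatly simplified sketch of the wind-drift bookkeeping such a footprint tool performs is shown below: a constant-descent-rate fall through layered winds plus a Monte Carlo dispersion about the nominal landing point. The wind profile, descent rate, and dispersion magnitudes are invented placeholders, not CPAS values.

      import numpy as np

      def landing_offset(release_alt_m, descent_rate_ms, wind_layers):
          """Horizontal drift of an object descending at a constant rate through
          stacked wind layers: (layer_top_m, layer_bottom_m, wind_east, wind_north)."""
          drift = np.zeros(2)
          for top, bottom, w_e, w_n in wind_layers:
              top = min(top, release_alt_m)
              if top <= bottom:
                  continue
              dt = (top - bottom) / descent_rate_ms     # time spent in this layer
              drift += dt * np.array([w_e, w_n])
          return drift

      layers = [(8000, 5000, 12.0, 3.0), (5000, 2000, 8.0, -2.0), (2000, 0, 4.0, 0.0)]
      nominal = landing_offset(7500.0, 7.0, layers)

      # Monte Carlo dispersion on descent rate and winds to build a footprint.
      rng = np.random.default_rng(2)
      samples = []
      for _ in range(2000):
          rate = 7.0 * (1.0 + 0.1 * rng.standard_normal())
          perturbed = [(t, b, e + rng.normal(0.0, 1.5), n + rng.normal(0.0, 1.5))
                       for t, b, e, n in layers]
          samples.append(landing_offset(7500.0, rate, perturbed))
      samples = np.array(samples)
      print("nominal drift (m):", nominal)
      print("99th-percentile miss distance (m):",
            np.percentile(np.linalg.norm(samples - nominal, axis=1), 99))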

  6. Advanced transmission studies

    NASA Technical Reports Server (NTRS)

    Coy, John J.; Bill, Robert C.

    1988-01-01

    The NASA Lewis Research Center and the U.S. Army Aviation Systems Command share an interest in advancing the technology for helicopter propulsion systems. In particular, this paper presents highlights from that portion of the program in drive train technology and the related mechanical components. The major goals of the program are to increase the life, reliability, and maintainability; reduce the weight, noise, and vibration; and maintain the relatively high mechanical efficiency of the gear train. The current activity emphasizes noise reduction technology and analytical code development followed by experimental verification. Selected significant advances in technology for transmissions are reviewed, including advanced configurations and new analytical tools. Finally, the plan for future transmission research is presented.

  7. Advance care directives

    MedlinePlus

    Alternative names indexed under this topic include: do-not-resuscitate advance directive; durable power of attorney for advance care; POA; health care agent; and health care proxy.

  8. OpenSMOKE++: An object-oriented framework for the numerical modeling of reactive systems with detailed kinetic mechanisms

    NASA Astrophysics Data System (ADS)

    Cuoci, A.; Frassoldati, A.; Faravelli, T.; Ranzi, E.

    2015-07-01

    OpenSMOKE++ is a general framework for numerical simulations of reacting systems with detailed kinetic mechanisms, including thousands of chemical species and reactions. The framework is entirely written in object-oriented C++ and can be easily extended and customized by the user for specific systems, without having to modify the core functionality of the program. The OpenSMOKE++ framework can handle simulations of ideal chemical reactors (plug-flow, batch, and jet stirred reactors), shock-tubes, rapid compression machines, and can be easily incorporated into multi-dimensional CFD codes for the modeling of reacting flows. OpenSMOKE++ provides useful numerical tools such as the sensitivity and rate of production analyses, needed to recognize the main chemical paths and to interpret the numerical results from a kinetic point of view. Since simulations involving large kinetic mechanisms are very time consuming, OpenSMOKE++ adopts advanced numerical techniques able to reduce the computational cost, without sacrificing the accuracy and the robustness of the calculations. In the present paper we give a detailed description of the framework features, the numerical models available, and the implementation of the code. The possibility of coupling the OpenSMOKE++ functionality with existing numerical codes is discussed. The computational performances of the framework are presented, and the capabilities of OpenSMOKE++ in terms of integration of stiff ODE systems are discussed and analyzed with special emphasis. Some examples demonstrating the ability of the OpenSMOKE++ framework to successfully manage large kinetic mechanisms are eventually presented.
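
    The stiff-kinetics integration problem that motivates these advanced numerical techniques can be illustrated with the classic Robertson test system solved by an implicit (BDF) integrator; this is a generic SciPy sketch and has no connection to the OpenSMOKE++ API.

      import numpy as np
      from scipy.integrate import solve_ivp

      def robertson(t, y):
          """Classic stiff 3-species kinetics problem whose rate constants span
          many orders of magnitude, as in detailed combustion mechanisms."""
          y1, y2, y3 = y
          return [-0.04 * y1 + 1.0e4 * y2 * y3,
                   0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2,
                   3.0e7 * y2 ** 2]

      sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                      method="BDF", rtol=1e-8, atol=1e-12,
                      t_eval=np.logspace(-5, 5, 11))
      print(sol.y[:, -1])   # composition approaches [0, 0, 1] at long times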

  9. Advanced powder processing

    SciTech Connect

    Janney, M.A.

    1997-04-01

    Gelcasting is an advanced powder forming process. It is most commonly used to form ceramic or metal powders into complex, near-net shapes. Turbine rotors, gears, nozzles, and crucibles have been successfully gelcast in silicon nitride, alumina, nickel-based superalloy, and several steels. Gelcasting can also be used to make blanks that can be green machined to near-net shape and then high fired. Green machining has been successfully applied to both ceramic and metal gelcast blanks. Recently, the authors have used gelcasting to make tooling for metal casting applications. Most of the work has centered on H13 tool steel. They have demonstrated an ability to gelcast and sinter H13 to near net shape for metal casting tooling. Also, blanks of H13 have been cast, green machined into complex shape, and fired. Issues associated with forming, binder burnout, and sintering are addressed.

  10. Tools Automate Spacecraft Testing, Operation

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."

  11. MWD tools improve drilling performance

    SciTech Connect

    Moore, S.D.

    1986-02-01

    Downhole measurement-while-drilling (MWD) technology is changing the way many wells are drilled. The capability to understand what is occurring at the drill bit as it actually happens is improving drilling performance, safety, and ultimately cost effectiveness. MWD evolved because of the need to acquire real-time data at the well site. The technology was not developed by vendors as simply an "add-on" tool - something an operator didn't realize he needed. MWD, with state-of-the-art, rugged, electronic downhole tools, is the closest thing the petroleum industry has to aerospace engineering. The constraints placed on MWD tools are greater than for any other downhole tool - including wireline electric logs - because they are in the hole for long durations, operating under severe hole conditions. MWD tools were first used to monitor directional drilling operations on a real-time basis. More recently, vendors have developed formation evaluation capabilities for MWD. Tools capable of measuring other drilling parameters such as weight on bit and downhole torque and pressure are also available. MWD technology continues to advance rapidly as the second and third generations of tools and equipment are introduced. Improvements are coming in many areas, but the biggest change will be in the development of new surface equipment to analyze retrieved data. For several years, MWD has been providing a reliable and accurate stream of real-time data from downhole. New software packages for surface equipment will allow the data to be analyzed in new ways to improve drilling efficiencies.

  12. Reactor2D: A tool for simulation of shock deformation

    NASA Astrophysics Data System (ADS)

    Kraus, Eugeny I.; Shabalin, Ivan I.

    2016-10-01

    The basic steps for creating a numerical tool to simulate the deformation and failure processes of complex technical objects (CTO) are presented. Calculations of shock loading of CTO at both low and high speeds are carried out, showing the efficiency of the numerical tools created.

  13. Numerical Modeling of Turbulent Combustion

    NASA Technical Reports Server (NTRS)

    Ghoneim, A. F.; Chorin, A. J.; Oppenheim, A. K.

    1983-01-01

    The work in numerical modeling is focused on the use of the random vortex method to treat turbulent flow fields associated with combustion while flame fronts are considered as interfaces between reactants and products, propagating with the flow and at the same time advancing in the direction normal to themselves at a prescribed burning speed. The latter is associated with the generation of specific volume (the flame front acting, in effect, as the locus of volumetric sources) to account for the expansion of the flow field due to the exothermicity of the combustion process. The model was applied to the flow in a channel equipped with a rearward facing step. The results obtained revealed the mechanism of the formation of large scale turbulent structure in the wake of the step, while it showed the flame to stabilize on the outer edges of these eddies.

  14. Advanced Beamline Design for Fermilab's Advanced Superconducting Test Accelerator

    SciTech Connect

    Prokop, Christopher

    2014-01-01

    The Advanced Superconducting Test Accelerator (ASTA) at Fermilab is a new electron accelerator currently in the commissioning stage. In addition to testing superconducting accelerating cavities for future accelerators, it is foreseen to support a variety of Advanced Accelerator R&D (AARD) experiments. Producing the required electron bunches with the expected flexibility is challenging. The goal of this dissertation is to explore via numerical simulations new accelerator beamlines that can enable the advanced manipulation of electron bunches. The work especially includes the design of a low-energy bunch compressor and a study of transverse-to-longitudinal phase space exchangers.

  15. Advanced beamline design for Fermilab's Advanced Superconducting Test Accelerator

    NASA Astrophysics Data System (ADS)

    Prokop, Christopher R.

    The Advanced Superconducting Test Accelerator (ASTA) at Fermilab is a new electron accelerator currently in the commissioning stage. In addition to testing superconducting accelerating cavities for future accelerators, it is foreseen to support a variety of Advanced Accelerator R&D (AARD) experiments. Producing the required electron bunches with the expected flexibility is challenging. The goal of this dissertation is to explore via numerical simulations new accelerator beamlines that can enable the advanced manipulation of electron bunches. The work especially includes the design of a low-energy bunch compressor and a study of transverse-to-longitudinal phase space exchangers.

  16. Numerical predictions in acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1992-01-01

    Computational Aeroacoustics (CAA) involves the calculation of the sound produced by a flow as well as the underlying flowfield itself from first principles. This paper describes the numerical challenges of CAA and recent research efforts to overcome these challenges. In addition, it includes the benefits of CAA in removing restrictions of linearity, single frequency, constant parameters, low Mach numbers, etc. found in standard acoustic analyses as well as means for evaluating the validity of these numerical approaches. Finally, numerous applications of CAA to both classical as well as modern problems of concern to the aerospace industry are presented.

  17. Numerical predictions in acoustics

    NASA Astrophysics Data System (ADS)

    Hardin, Jay C.

    Computational Aeroacoustics (CAA) involves the calculation of the sound produced by a flow as well as the underlying flowfield itself from first principles. This paper describes the numerical challenges of CAA and recent research efforts to overcome these challenges. In addition, it includes the benefits of CAA in removing restrictions of linearity, single frequency, constant parameters, low Mach numbers, etc. found in standard acoustic analyses as well as means for evaluating the validity of these numerical approaches. Finally, numerous applications of CAA to both classical as well as modern problems of concern to the aerospace industry are presented.

  18. Influence of the Numerical Dispersion Effects in the Modelling of Ultrasonic Measurements

    NASA Astrophysics Data System (ADS)

    Prikšaitis, J.; Mažeika, L.; Barauskas, R.; Žukauskas, E.; Kriščiūnas, A.

    Modern structures in the aerospace, transport, wind energy, and other industries contain components manufactured from composite materials. One of the advanced techniques used for inspection and monitoring of such structures is based on the application of guided ultrasonic waves. Numerical simulation is one of the most efficient tools enabling adequate interpretation of the signals. However, the necessity to relate the sampling steps in the time and space domains to the frequency and wavelength of the ultrasonic waves propagating within the object under investigation often leads to unacceptably long simulation times. Employment of coarser meshes tends to create additional problems caused by numerical dispersion effects, which distort the shape of the simulated signal and create mismatches against the experimental results. The objective of this work was to investigate the numerically caused distortions in simulated ultrasonic waves and to develop a technique to minimize their influence. The analysis was carried out by investigating the propagation of wideband ultrasonic waves in materials with known elastic properties using a finite element model. The calculated signals were used to estimate the propagation velocity, which was compared against the corresponding wave velocities obtained from analytical formulae and against signal velocities measured experimentally. As a result of the investigation, rules for determining a well-balanced set of modelling parameters were developed. It was demonstrated that models based on this set of parameters reduce the numerical dispersion errors as well as the simulation time.
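
    The "well-balanced set of modelling parameters" referred to above typically combines an elements-per-wavelength rule for the mesh with a CFL limit for the time step; the helper below is a generic sketch of that reasoning, using common rule-of-thumb factors (20 elements per wavelength, CFL of 0.7) rather than the values derived in the paper.

      def mesh_and_time_step(freq_hz, slowest_wave_speed, fastest_wave_speed,
                             elements_per_wavelength=20, cfl=0.7):
          """Element size from the shortest wavelength present and a stable
          explicit time step from the CFL condition."""
          shortest_wavelength = slowest_wave_speed / freq_hz
          h = shortest_wavelength / elements_per_wavelength   # element size
          dt = cfl * h / fastest_wave_speed                   # stable time step
          return h, dt

      # Example: 200 kHz guided waves, ~1500 m/s slowest mode, ~6000 m/s fastest
      # bulk wave (illustrative values only).
      h, dt = mesh_and_time_step(200.0e3, 1500.0, 6000.0)
      print(f"element size ~{h * 1e3:.3f} mm, time step ~{dt * 1e9:.1f} ns")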

  19. Recent Advances in Neural Recording Microsystems

    PubMed Central

    Gosselin, Benoit

    2011-01-01

    The accelerating pace of research in neuroscience has created a considerable demand for neural interfacing microsystems capable of monitoring the activity of large groups of neurons. These emerging tools have revealed a tremendous potential for the advancement of knowledge in brain research and for the development of useful clinical applications. They can extract the relevant control signals directly from the brain enabling individuals with severe disabilities to communicate their intentions to other devices, like computers or various prostheses. Such microsystems are self-contained devices composed of a neural probe attached with an integrated circuit for extracting neural signals from multiple channels, and transferring the data outside the body. The greatest challenge facing development of such emerging devices into viable clinical systems involves addressing their small form factor and low-power consumption constraints, while providing superior resolution. In this paper, we survey the recent progress in the design and the implementation of multi-channel neural recording Microsystems, with particular emphasis on the design of recording and telemetry electronics. An overview of the numerous neural signal modalities is given and the existing microsystem topologies are covered. We present energy-efficient sensory circuits to retrieve weak signals from neural probes and we compare them. We cover data management and smart power scheduling approaches, and we review advances in low-power telemetry. Finally, we conclude by summarizing the remaining challenges and by highlighting the emerging trends in the field. PMID:22163863

  20. The FEYNMAN tools for quantum information processing: Design and implementation

    NASA Astrophysics Data System (ADS)

    Fritzsche, S.

    2014-06-01

    The FEYNMAN tools have been re-designed with the goal of establishing and implementing a high-level (computer) language that is capable of dealing with the physics of finite, n-qubit systems, from frequently required computations to mathematically advanced tasks in quantum information processing. In particular, emphasis has been placed on introducing a small but powerful set of keystring-driven commands in order to support both symbolic and numerical computations. Though the current design is again implemented within the framework of MAPLE, it is general and flexible enough to be utilized and combined with other languages and computational environments. The present implementation facilitates a large number of computational tasks, including the definition, manipulation and parametrization of quantum states, the evaluation of quantum measures and quantum operations, the evolution of quantum noise in discrete models, quantum measurements and state estimation, and several others. The design is based on a few high-level commands, with a syntax close to the mathematical notation and its use in the literature, and which can be generalized quite readily in order to solve computational tasks of even higher complexity. In this work, I present and discuss the (re-)design of the FEYNMAN tools and make major parts of the code available for public use. Moreover, a few selected examples are shown that demonstrate possible applications of this toolbox. The FEYNMAN tools are provided as a MAPLE library and can hence be used on all platforms on which this computer-algebra system is accessible.
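
    The basic n-qubit bookkeeping that such a toolbox automates can be illustrated numerically; the NumPy sketch below (a generic illustration, not the MAPLE-based FEYNMAN commands) applies a single-qubit gate to one qubit of an n-qubit state vector and builds a Bell state.

      import numpy as np

      def apply_single_qubit_gate(state, gate, target, n_qubits):
          """Apply a 2x2 gate to qubit `target` (0 = most significant) of an
          n-qubit state vector via I (x) ... (x) gate (x) ... (x) I."""
          op = np.array([[1.0]])
          for q in range(n_qubits):
              op = np.kron(op, gate if q == target else np.eye(2))
          return op @ state

      hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
      cnot_01 = np.array([[1, 0, 0, 0],    # control = qubit 0, target = qubit 1
                          [0, 1, 0, 0],
                          [0, 0, 0, 1],
                          [0, 0, 1, 0]], dtype=float)

      # Build the Bell state (|00> + |11>)/sqrt(2) starting from |00>.
      state = np.zeros(4); state[0] = 1.0
      state = apply_single_qubit_gate(state, hadamard, target=0, n_qubits=2)
      state = cnot_01 @ state
      print(state)   # [0.707..., 0, 0, 0.707...]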

  1. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  2. Numerical simulations of plasmas

    SciTech Connect

    Dnestrovskii, Y.N.; Kostomarov, D.P.

    1986-01-01

    This book presents a modern, consistent, and systematic development of numerical computer simulation of plasmas in controlled thermonuclear fusion. The authors focus on recent Soviet research in mathematical modeling of Tokamak plasmas and present kinetic, hydrodynamic, and transport models.

  3. Rocket engine numerical simulator

    NASA Technical Reports Server (NTRS)

    Davidian, Ken

    1993-01-01

    The topics are presented in viewgraph form and include the following: a rocket engine numerical simulator (RENS) definition; objectives; justification; approach; potential applications; potential users; RENS work flowchart; RENS prototype; and conclusion.

  4. Rocket engine numerical simulation

    NASA Technical Reports Server (NTRS)

    Davidian, Ken

    1993-01-01

    The topics are presented in view graph form and include the following: a definition of the rocket engine numerical simulator (RENS); objectives; justification; approach; potential applications; potential users; RENS work flowchart; RENS prototype; and conclusions.

  5. Numerical Techniques in Acoustics

    NASA Technical Reports Server (NTRS)

    Baumeister, K. J. (Compiler)

    1985-01-01

    This is the compilation of abstracts of the Numerical Techniques in Acoustics Forum held at the ASME's Winter Annual Meeting. This forum was for informal presentation and information exchange of ongoing acoustic work in finite elements, finite difference, boundary elements and other numerical approaches. As part of this forum, it was intended to allow the participants time to raise questions on unresolved problems and to generate discussions on possible approaches and methods of solution.

  6. Performance and Weight Estimates for an Advanced Open Rotor Engine

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Tong, Michael T.

    2012-01-01

    NASA's Environmentally Responsible Aviation Project and Subsonic Fixed Wing Project are focused on developing concepts and technologies which may enable dramatic reductions in the environmental impact of future-generation subsonic aircraft. The open rotor concept (also historically referred to as an unducted fan or advanced turboprop) may allow for the achievement of this objective by reducing engine fuel consumption. To evaluate the potential impact of open rotor engines, cycle modeling and engine weight estimation capabilities have been developed. The initial development of the cycle modeling capabilities in the Numerical Propulsion System Simulation (NPSS) tool was presented in a previous paper. Following that initial development, further advancements have been made to the cycle modeling and weight estimation capabilities for open rotor engines and are presented in this paper. The developed modeling capabilities are used to predict the performance of an advanced open rotor concept using modern counter-rotating propeller designs. Finally, performance and weight estimates for this engine are presented and compared to results from a previous NASA study of advanced geared and direct-drive turbofans.

  7. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  8. Frontiers in Numerical Relativity

    NASA Astrophysics Data System (ADS)

    Evans, Charles R.; Finn, Lee S.; Hobill, David W.

    2011-06-01

    Preface; Participants; Introduction; 1. Supercomputing and numerical relativity: a look at the past, present and future David W. Hobill and Larry L. Smarr; 2. Computational relativity in two and three dimensions Stuart L. Shapiro and Saul A. Teukolsky; 3. Slowly moving maximally charged black holes Robert C. Ferrell and Douglas M. Eardley; 4. Kepler's third law in general relativity Steven Detweiler; 5. Black hole spacetimes: testing numerical relativity David H. Bernstein, David W. Hobill and Larry L. Smarr; 6. Three dimensional initial data of numerical relativity Ken-ichi Oohara and Takashi Nakamura; 7. Initial data for collisions of black holes and other gravitational miscellany James W. York, Jr.; 8. Analytic-numerical matching for gravitational waveform extraction Andrew M. Abrahams; 9. Supernovae, gravitational radiation and the quadrupole formula L. S. Finn; 10. Gravitational radiation from perturbations of stellar core collapse models Edward Seidel and Thomas Moore; 11. General relativistic implicit radiation hydrodynamics in polar sliced space-time Paul J. Schinder; 12. General relativistic radiation hydrodynamics in spherically symmetric spacetimes A. Mezzacappa and R. A. Matzner; 13. Constraint preserving transport for magnetohydrodynamics John F. Hawley and Charles R. Evans; 14. Enforcing the momentum constraints during axisymmetric spacelike simulations Charles R. Evans; 15. Experiences with an adaptive mesh refinement algorithm in numerical relativity Matthew W. Choptuik; 16. The multigrid technique Gregory B. Cook; 17. Finite element methods in numerical relativity P. J. Mann; 18. Pseudo-spectral methods applied to gravitational collapse Silvano Bonazzola and Jean-Alain Marck; 19. Methods in 3D numerical relativity Takashi Nakamura and Ken-ichi Oohara; 20. Nonaxisymmetric rotating gravitational collapse and gravitational radiation Richard F. Stark; 21. Nonaxisymmetric neutron star collisions: initial results using smooth particle hydrodynamics

  9. Advanced Microsensors

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This video looks at a spinoff application of the technology from advanced microsensors -- those that monitor and determine conditions of spacecraft like the Space Shuttle. The application featured is concerned with the monitoring of the health of premature babies.

  10. Advanced Composition

    ERIC Educational Resources Information Center

    Sarantos, R. L.

    1974-01-01

    This is an excerpt from a course for advanced students, designed to teach proficiency in English composition by providing activities specifically geared to the elimination of native language interference. (LG)

  11. Microfluidic Tools for Protein Crystallography

    NASA Astrophysics Data System (ADS)

    Abdallah, Bahige G.

    . Additionally, a passive mixer was created to generate unique solution concentrations within isolated nanowells to crystallize phycocyanin and lysozyme. Crystal imaging with brightfield microscopy, UV fluorescence, and SONICC coupled with numerical modeling allowed quantification of crystal growth conditions for efficient phase diagram development. The developed microfluidic tools demonstrated the capability of improving samples for protein crystallography, offering a foundation for continued development of platforms to aid protein structure determination.

  12. MFL tool hardware for pipeline inspection

    SciTech Connect

    Tandon, K.K.

    1997-02-01

    The intelligent pig based on the magnetic flux leakage (MFL) is frequently used for inline inspection of gas and liquid transportation pipelines. The tool is capable of reliably detecting and characterizing several commonly occurring pipeline defects including metal loss due to corrosion and gouges, dents, and buckles, which tend to threaten the structural integrity of the pipeline. The defect detection and characterization capabilities of the tool are directly dependent upon the type of critical hardware components and systems selected for the tool assembly. This article discusses the key components of an advanced or high resolution MFL tool.

  13. Foundational Tools for Petascale Computing

    SciTech Connect

    Miller, Barton

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  14. Clean Cities Tools: Tools to Help You Drive Smarter, Use Less Petroleum, and Reduce Emissions (Brochure)

    SciTech Connect

    Not Available

    2011-06-01

    Clean Cities' Alternative Fuels and Advanced Vehicles Data Center (AFDC) features a wide range of Web-based tools to help vehicle fleets and individual consumers reduce their petroleum use. This brochure lists and describes Clean Cities online tools related to vehicles, alternative fueling stations, electric vehicle charging stations, fuel conservation, emissions reduction, fuel economy, and more.

  15. Clean Cities Tools: Tools to Help You Save Money, Use Less Petroleum, and Reduce Emissions (Brochure)

    SciTech Connect

    Not Available

    2012-01-01

    Clean Cities Alternative Fuels and Advanced Vehicles Data Center (AFDC) features a wide range of Web-based tools to help vehicle fleets and individual consumers reduce their petroleum use. This brochure lists and describes Clean Cities online tools related to vehicles, alternative fueling stations, electric vehicle charging stations, fuel conservation, emissions reduction, fuel economy, and more.

  16. A numerical classical flutter analysis of advanced propellers

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Reddy, T. S. R.; Mehmed, O.

    1992-01-01

    A three-dimensional Euler solver is coupled with a three-dimensional structural dynamics model to investigate flutter of propfans. An implicit-explicit hybrid scheme is used to reduce computational time for the solution of Euler equations. The aeroelastic equations are formulated in normal modes and are solved for flutter in frequency domain. The required generalized forces are obtained using a pulse response method. Computations show that the instability is dominated by the second mode frequency as was observed in experiment.
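
    A toy frequency-domain analogue of such a modal flutter search is sketched below: two structural modes are coupled by a nonsymmetric aerodynamic stiffness that grows with dynamic pressure, and flutter onset is flagged when the modal frequencies coalesce. The matrices and numbers are invented for illustration and are unrelated to the propfan model in the paper.

      import numpy as np

      # Toy two-mode system M q'' + (K - q_dyn * A) q = 0.  The nonsymmetric
      # coupling A lets the two modal frequencies coalesce; complex eigenvalues
      # of M^-1 (K - q_dyn * A) (interpreted as omega^2) then signal flutter.
      M = np.eye(2)
      K = np.diag([(2 * np.pi * 40.0) ** 2, (2 * np.pi * 55.0) ** 2])  # 40 and 55 Hz modes
      A = np.array([[0.0, 8.0e3],
                    [-8.0e3, 0.0]])            # circulatory (nonsymmetric) coupling

      for q_dyn in np.linspace(0.0, 50.0, 501):
          lam = np.linalg.eigvals(np.linalg.solve(M, K - q_dyn * A))
          if np.any(np.abs(lam.imag) > 1e-6):  # omega^2 turned complex -> flutter
              freq = np.sqrt(lam.real.max()) / (2 * np.pi)
              print(f"flutter onset near q_dyn = {q_dyn:.1f}, coalesced mode ~{freq:.1f} Hz")
              break
      else:
          print("no flutter in the scanned range")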

  17. Advances in Estuarine Physics

    NASA Astrophysics Data System (ADS)

    Maccready, Parker; Geyer, W. Rockwell

    2010-01-01

    Recent advances in our understanding of estuarine circulation and salinity structure are reviewed. We focus on well- and partially mixed systems that are long relative to the tidal excursion. Dynamics of the coupled system of width- and tidally averaged momentum and salt equations are now better understood owing to the development of simple numerical solution techniques. These have led to a greater appreciation of the key role played by the time dependency of the length of the salt intrusion. Improved realism in simplified tidally averaged physics has been driven by simultaneous advances in our understanding of the detailed dynamics within the tidal cycle and across irregular channel cross-sections. The complex interactions of turbulence, stratification, and advection are now understood well enough to motivate a new generation of physically plausible mixing parameterizations for the tidally averaged equations.

  18. FAST Modular Wind Turbine CAE Tool: Nonmatching Spatial and Temporal Meshes: Preprint

    SciTech Connect

    Sprague, M. A.; Jonkman, J. M.; Jonkman, B. J.

    2014-01-01

    In this paper we propose and examine numerical algorithms for coupling time-dependent multi-physics modules relevant to computer-aided engineering (CAE) of wind turbines. In particular, we examine algorithms for coupling modules where spatial grids are nonmatching at interfaces and module solutions are time advanced with different time increments and different time integrators. Sharing of data between modules is accomplished with a predictor-corrector approach, which allows for either implicit or explicit time integration within each module. Algorithms are presented in a general framework, but are applied to simple problems that are representative of the systems found in a whole-turbine analysis. Numerical experiments are used to explore the stability, accuracy, and efficiency of the proposed algorithms. This work is motivated by an in-progress major revision of FAST, the National Renewable Energy Laboratory's (NREL's) premier aero-elastic CAE simulation tool. The algorithms described here will greatly increase the flexibility and efficiency of FAST.
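
    The predictor-corrector hand-off with non-matching time steps can be sketched with two scalar ODE "modules", the second sub-cycling at a finer step than the first; the coefficients and step sizes below are made up for illustration and do not come from FAST.

      # Two coupled scalar "modules":  x' = -x + y,   y' = -2*y + x.
      # Module X advances with a coarse step DT; module Y sub-cycles at DT/5.
      DT, SUBSTEPS, T_END = 0.05, 5, 2.0

      def step_x(x, y_interface, dt):          # explicit Euler inside module X
          return x + dt * (-x + y_interface)

      def step_y(y, x_interface, dt):          # explicit Euler inside module Y
          return y + dt * (-2.0 * y + x_interface)

      x, y, t = 1.0, 0.0, 0.0
      while t < T_END - 1e-12:
          # Predictor: advance X over the window using the last known y.
          x_pred = step_x(x, y, DT)
          # Advance Y over the same window with its finer step, interpolating
          # the predicted interface value of x across the window.
          y_new = y
          for k in range(SUBSTEPS):
              frac = (k + 0.5) / SUBSTEPS
              x_interp = (1.0 - frac) * x + frac * x_pred
              y_new = step_y(y_new, x_interp, DT / SUBSTEPS)
          # Corrector: re-advance X using the freshly computed interface data.
          x = step_x(x, 0.5 * (y + y_new), DT)
          y = y_new
          t += DT
      print(x, y)   # both decay toward the (0, 0) equilibrium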

  19. Numerical simulation of in situ bioremediation

    SciTech Connect

    Travis, B.J.

    1998-12-31

    Models that couple subsurface flow and transport with microbial processes are an important tool for assessing the effectiveness of bioremediation in field applications. A numerical algorithm is described that differs from previous in situ bioremediation models in that it includes: both vadose and groundwater zones, unsteady air and water flow, limited nutrients and airborne nutrients, toxicity, cometabolic kinetics, kinetic sorption, subgridscale averaging, pore clogging and protozoan grazing.
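
    The microbial process term that such a model couples to the flow and transport equations is often of Monod type; the sketch below integrates a single substrate/biomass pair with illustrative parameter values (not taken from the cited work).

      from scipy.integrate import solve_ivp

      def monod(t, state, mu_max=4.0, K_s=10.0, Y=0.5, b=0.1):
          """Monod kinetics: S = substrate (mg/L), X = biomass (mg/L).
          Growth follows mu_max*S/(K_s+S); substrate is consumed at growth/Y;
          b is endogenous decay.  All rates per day, purely illustrative."""
          S, X = state
          mu = mu_max * S / (K_s + S)
          return [-mu * X / Y, (mu - b) * X]

      sol = solve_ivp(monod, (0.0, 30.0), [100.0, 1.0], max_step=0.1)
      print(f"substrate remaining after 30 days: {sol.y[0, -1]:.3f} mg/L")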

  20. Toward Scientific Numerical Modeling

    NASA Technical Reports Server (NTRS)

    Kleb, Bil

    2007-01-01

    Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
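
    Proposal (1), an unobtrusive in-situ record of input parameter uncertainties, might look like the small dataclass sketch below; the mechanism, names, and numbers are hypothetical illustrations, not the author's implementation.

      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Uncertain:
          """An input parameter that carries its uncertainty and provenance
          alongside its nominal value, right where it is defined."""
          value: float
          uncertainty: float     # one-sigma, in the same units as value
          source: str            # where the number and its spread came from

          def __float__(self):   # models keep using it as a plain number
              return self.value

      # Inputs documented in situ (placeholder names and numbers).
      catalytic_efficiency = Uncertain(0.05, 0.03, "assumed; needs experiment")
      freestream_density = Uncertain(3.1e-4, 1.5e-5, "trajectory reconstruction")

      # Toy relation for illustration only, not a real heating correlation.
      heating_rate = 1.0e4 * float(catalytic_efficiency) * float(freestream_density) ** 0.5
      print(heating_rate)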