Science.gov

Sample records for advanced numerical tools

  1. Numerical tools for atomistic simulations.

    SciTech Connect

    Fang, H.; Gullett, Philip Michael; Slepoy, Alexander; Horstemeyer, Mark F.; Baskes, Michael I.; Wagner, Gregory John; Li, Mo

    2004-01-01

    The final report for a Laboratory Directed Research and Development project entitled 'Parallel Atomistic Computing for Failure Analysis of Micromachines' is presented. In this project, atomistic algorithms for parallel computers were developed to assist in quantification of microstructure-property relations related to weapon micro-components. With these and other serial computing tools, we are performing atomistic simulations of various sizes, geometries, materials, and boundary conditions. These tools provide the capability to handle the different size-scale effects required to predict failure. Nonlocal continuum models have been proposed to address this problem; however, they are phenomenological in nature and are difficult to validate for micro-scale components. Our goal is to separately quantify damage nucleation, growth, and coalescence mechanisms to provide a basis for macro-scale continuum models that will be used for micromachine design. Because micro-component experiments are difficult, a systematic computational study that employs Monte Carlo methods, molecular statics, and molecular dynamics (EAM and MEAM) simulations to compute continuum quantities will provide mechanism-property relations associated with the following parameters: specimen size, number of grains, crystal orientation, strain rates, temperature, defect nearest neighbor distance, void/crack size, chemical state, and stress state. This study will quantify size-scale effects from nanometers to microns in terms of damage progression and thus potentially allow for optimized micro-machine designs that are more reliable and have higher fidelity in terms of strength. In order to accomplish this task, several atomistic methods needed to be developed and evaluated to cover the range of defects, strain rates, temperatures, and sizes that a material may see in micro-machines. Therefore, we are providing a complete set of tools for large scale atomistic simulations that include pre-processing of

  2. Numerically Controlled Machine Tools and Worker Skills.

    ERIC Educational Resources Information Center

    Keefe, Jeffrey H.

    1991-01-01

    Analysis of data from "Industry Wage Surveys of Machinery Manufacturers" on the skill levels of 57 machining jobs found that introduction of numerically controlled machine tools has resulted in a very small reduction in skill levels or no significant change, supporting neither the deskilling argument nor the argument that skill levels increase with…

  3. Numerical modeling tools for chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Jasinski, Thomas J.; Childs, Edward P.

    1992-01-01

    Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring in CVD reactors: fluid flow patterns, temperature and chemical species distribution, and gas-phase and surface deposition. The available physical models are documented and examples of CVD simulation capabilities are provided.

  4. Modeling Tool Advances Rotorcraft Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Continuum Dynamics Inc. (CDI), founded in 1979, specializes in advanced engineering services, including fluid dynamic modeling and analysis for aeronautics research. The company has completed a number of SBIR research projects with NASA, including early rotorcraft work done through Langley Research Center, but more recently, out of Ames Research Center. NASA Small Business Innovation Research (SBIR) grants on helicopter wake modeling resulted in the Comprehensive Hierarchical Aeromechanics Rotorcraft Model (CHARM), a tool for studying helicopter and tiltrotor unsteady free wake modeling, including distributed and integrated loads, and performance prediction. Application of the software code in a blade redesign program for Carson Helicopters, of Perkasie, Pennsylvania, increased the payload and cruise speeds of its S-61 helicopter. Follow-on development resulted in a $24 million revenue increase for Sikorsky Aircraft Corporation, of Stratford, Connecticut, as part of the company's rotor design efforts. Now under continuous development for more than 25 years, CHARM models the complete aerodynamics and dynamics of rotorcraft in general flight conditions. CHARM has been used to model a broad spectrum of rotorcraft attributes, including performance, blade loading, blade-vortex interaction noise, air flow fields, and hub loads. The highly accurate software is currently in use by all major rotorcraft manufacturers, NASA, the U.S. Army, and the U.S. Navy.

  5. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    SciTech Connect

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, minimal air pollutant emissions, and the ability to utilize a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of ash in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interactions of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably with measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms used to perform calculations of mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity by incorporating grey-scale binning of the SEM image. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity is
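
    The spreadsheet enhancements above rely on Newton-Raphson iteration for the balance equations. Below is a minimal sketch of that idea, assuming a hypothetical adiabatic-flame-temperature calculation in which an enthalpy balance is solved for temperature; the function names and the polynomial property fit are illustrative stand-ins, not the EERC implementation.

    ```python
    # Minimal sketch of Newton-Raphson iteration for an energy balance.
    # The enthalpy polynomial below is a hypothetical stand-in, not EERC data.

    def enthalpy_products(T):
        """Illustrative sensible enthalpy of products (kJ/kmol) vs temperature (K)."""
        return 30.0 * (T - 298.15) + 0.005 * (T - 298.15) ** 2

    def d_enthalpy_dT(T):
        """Analytical derivative of the enthalpy fit (the 1-D Jacobian)."""
        return 30.0 + 0.01 * (T - 298.15)

    def adiabatic_flame_temperature(heat_release, T_guess=1500.0, tol=1e-6):
        """Solve enthalpy_products(T) = heat_release for T by Newton-Raphson."""
        T = T_guess
        for _ in range(50):
            residual = enthalpy_products(T) - heat_release
            T_new = T - residual / d_enthalpy_dT(T)
            if abs(T_new - T) < tol:
                return T_new
            T = T_new
        raise RuntimeError("Newton-Raphson did not converge")

    print(adiabatic_flame_temperature(60000.0))  # example heat release, kJ/kmol
    ```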

  6. Brush seal numerical simulation: Concepts and advances

    NASA Technical Reports Server (NTRS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-01-01

    The development of the brush seal is considered the most promising among the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, in the process of limiting the 'unwanted' leakage flows between stages or various engine cavities. This sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and the brush's compliance with rotor motions. In the design of modern aerospace turbomachinery, leakage flows between the stages must be minimal, thus contributing to the higher efficiency of the engine. Use of the brush seal instead of the labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals have also been found to enhance dynamic performance, cost less, and weigh less than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow investigation of the effects of brush morphology (bristle arrangement) or brush arrangement (number of brushes, spacing between them) on the pressure drops and flow leakage. An increase in brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and can most likely be solved only by means of a numerically distributed model.

  7. Brush seal numerical simulation: Concepts and advances

    NASA Astrophysics Data System (ADS)

    Braun, M. J.; Kudriavtsev, V. V.

    1994-07-01

    The development of the brush seal is considered the most promising among the advanced seal types presently in use in high-speed turbomachinery. The brush is usually mounted on the stationary portions of the engine and has direct contact with the rotating element, in the process of limiting the 'unwanted' leakage flows between stages or various engine cavities. This sealing technology provides high pressure drops (in comparison with conventional seals), due mainly to the high packing density (around 100 bristles/sq mm) and the brush's compliance with rotor motions. In the design of modern aerospace turbomachinery, leakage flows between the stages must be minimal, thus contributing to the higher efficiency of the engine. Use of the brush seal instead of the labyrinth seal reduces the leakage flow by one order of magnitude. Brush seals have also been found to enhance dynamic performance, cost less, and weigh less than labyrinth seals. Even though industrial brush seals have been successfully developed through extensive experimentation, there is no comprehensive numerical methodology for the design or prediction of their performance. The existing analytical/numerical approaches are based on bulk flow models and do not allow investigation of the effects of brush morphology (bristle arrangement) or brush arrangement (number of brushes, spacing between them) on the pressure drops and flow leakage. An increase in brush seal efficiency is clearly a complex problem that is closely related to the brush geometry and arrangement, and can most likely be solved only by means of a numerically distributed model.

  8. Advanced genetic tools for plant biotechnology

    SciTech Connect

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  9. Exposure tool control for advanced semiconductor lithography

    NASA Astrophysics Data System (ADS)

    Matsuyama, Tomoyuki

    2015-08-01

    This is a review paper showing how we control exposure tool parameters in order to satisfy patterning performance and productivity requirements for advanced semiconductor lithography. In this paper, we discuss how we control the illumination source shape to satisfy required imaging performance; heat-induced lens aberration during exposure to minimize its impact on imaging; dose and focus to realize uniform patterning performance across the wafer; and the patterning position of circuit patterns on different layers. The contents mainly concern current Nikon immersion exposure tools.

  10. Interpolator for numerically controlled machine tools

    DOEpatents

    Bowers, Gary L.; Davenport, Clyde M.; Stephens, Albert E.

    1976-01-01

    A digital differential analyzer circuit is provided that, depending on the embodiment chosen, can carry out linear, parabolic, circular or cubic interpolation. In the embodiment for parabolic interpolations, the circuit provides pulse trains for the X and Y slide motors of a two-axis machine to effect tool motion along a parabolic path. The pulse trains are generated by the circuit in such a way that parabolic tool motion is obtained from information contained in only one block of binary input data. A part contour may be approximated by one or more parabolic arcs. Acceleration and initial velocity values from a data block are set in fixed bit size registers for each axis separately but simultaneously, and the values are integrated to obtain the movement along the respective axis as a function of time. Integration is performed by continual addition, at a specified rate, of an integrand value stored in one register to the remainder temporarily stored in another identical size register. Overflows from the addition process are indicative of the integral. The overflow output pulses from the second integration may be applied to motors which position the respective machine slides according to a parabolic motion in time to produce a parabolic machine tool motion in space. An additional register for each axis is provided in the circuit to allow "floating" of the radix points of the integrand registers and the velocity increment to improve position accuracy and to reduce errors encountered when the acceleration integrand magnitudes are small when compared to the velocity integrands. A divider circuit is provided in the output of the circuit to smooth the output pulse spacing and prevent motor stall, because the overflow pulses produced in the binary addition process are spaced unevenly in time. The divider has the effect of passing only every nth motor drive pulse, with n being specifiable. The circuit inputs (integrands, rates, etc.) are scaled to give exactly n times the
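
    The integration-by-repeated-addition scheme described in the abstract can be illustrated with a short software analogue. The sketch below models one axis only, with a hypothetical register width and hypothetical integrand values, and counts an overflow pulse each time the accumulator wraps; it illustrates the digital differential analyzer principle, not the patented circuit.

    ```python
    # Software analogue of a one-axis digital differential analyzer (DDA).
    # Register width and integrand values are hypothetical.

    def dda_axis(velocity0, acceleration, iterations, register_bits=16):
        """Double integration by repeated addition; overflow pulses approximate motor steps."""
        modulus = 1 << register_bits
        velocity = velocity0          # velocity integrand register
        remainder = 0                 # remainder register of the position integrator
        pulses = 0                    # overflow pulses sent to the slide motor
        for _ in range(iterations):
            velocity += acceleration          # first integration: v <- v + a per tick
            remainder += velocity             # second integration: add integrand to remainder
            pulses += remainder // modulus    # an overflow is one increment of position
            remainder %= modulus
        return pulses

    print(dda_axis(velocity0=2000, acceleration=3, iterations=5000))
    ```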

  11. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
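
    The interpolation question above (how many shield thicknesses are needed) lends itself to a simple convergence test. The sketch below uses an assumed analytic dose-depth curve purely as a stand-in for a transport-code result and compares interpolated doses on successively finer thickness grids until the change falls below a tolerance.

    ```python
    # Convergence test for interpolation over a dose-vs-depth curve.
    # The exponential dose model is an assumed stand-in, not a transport-code result.
    import numpy as np

    def dose_vs_depth(t):
        """Hypothetical dose (cGy) behind a shield of areal density t (g/cm^2)."""
        return 50.0 * np.exp(-0.15 * t) + 5.0

    def interpolated_dose(n_thicknesses, query_depths):
        """Tabulate the curve at n thicknesses and linearly interpolate at the query depths."""
        grid = np.linspace(0.0, 30.0, n_thicknesses)
        table = dose_vs_depth(grid)
        return np.interp(query_depths, grid, table)

    query = np.linspace(0.0, 30.0, 200)
    previous = interpolated_dose(3, query)
    for n in (5, 9, 17, 33, 65):
        current = interpolated_dose(n, query)
        max_change = np.max(np.abs(current - previous))
        print(f"{n:3d} thicknesses: max change {max_change:.4f} cGy")
        if max_change < 0.01:          # tolerance on interpolation uncertainty
            break
        previous = current
    ```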

  12. Self-advancing step-tap tool

    NASA Technical Reports Server (NTRS)

    Pettit, Donald R. (Inventor); Penner, Ronald K. (Inventor); Franklin, Larry D. (Inventor); Camarda, Charles J. (Inventor)

    2008-01-01

    Methods and tool for simultaneously forming a bore in a work piece and forming a series of threads in said bore. In an embodiment, the tool has a predetermined axial length, a proximal end, and a distal end, said tool comprising: a shank located at said proximal end; a pilot drill portion located at said distal end; and a mill portion intermediately disposed between said shank and said pilot drill portion. The mill portion is comprised of at least two drill-tap sections of predetermined axial lengths and at least one transition section of predetermined axial length, wherein each of said at least one transition section is sandwiched between a distinct set of two of said at least two drill-tap sections. The at least two drill-tap sections are formed of one or more drill-tap cutting teeth spirally increasing along said at least two drill-tap sections, wherein said tool is self-advanced in said work piece along said formed threads, and wherein said tool simultaneously forms said bore and said series of threads along a substantially similar longitudinal axis.

  13. Program Helps Specify Paths For Numerically Controlled Tools

    NASA Technical Reports Server (NTRS)

    Premack, Timothy; Poland, James, Jr.

    1996-01-01

    ESDAPT computer program provides graphical programming environment for developing APT (Automatically Programmed Tool) programs for controlling numerically controlled machine tools. Establishes graphical user interface providing user with APT syntax-sensitive text-editing subprogram and windows for displaying geometry and tool paths. APT geometry statements also created by use of menus and screen picks. Written in C language, yacc, lex, and XView for use on Sun4-series computers running SunOS.

  14. Visualization tool for advanced laser system development

    NASA Astrophysics Data System (ADS)

    Crockett, Gregg A.; Brunson, Richard L.

    2002-06-01

    Simulation development for Laser Weapon Systems design and system trade analyses has progressed to new levels with the advent of object-oriented software development tools and PC processor capabilities. These tools allow rapid visualization of upcoming laser weapon system architectures and the ability to rapidly respond to what-if scenario questions from potential user commands. These simulations can solve very intensive problems in short time periods to investigate the parameter space of a newly emerging weapon system concept, or can address user mission performance for many different scenario engagements. Equally important to the rapid solution of complex numerical problems is the ability to rapidly visualize the results of the simulation, and to effectively interact with visualized output to glean new insights into the complex interactions of a scenario. Boeing has applied these ideas to develop a tool called the Satellite Visualization and Signature Tool (SVST). This Windows application is based upon a series of C++ coded modules that have evolved from several programs at Boeing-SVS. The SVST structure, extensibility, and some recent results of applying the simulation to weapon system concepts and designs will be discussed in this paper.

  15. Advances in numerical and applied mathematics

    NASA Technical Reports Server (NTRS)

    South, J. C., Jr. (Editor); Hussaini, M. Y. (Editor)

    1986-01-01

    This collection of papers covers some recent developments in numerical analysis and computational fluid dynamics. Some of these studies are of a fundamental nature. They address basic issues such as intermediate boundary conditions for approximate factorization schemes, existence and uniqueness of steady states for time dependent problems, and pitfalls of implicit time stepping. The other studies deal with modern numerical methods such as total variation diminishing schemes, higher order variants of vortex and particle methods, spectral multidomain techniques, and front tracking techniques. There is also a paper on adaptive grids. The fluid dynamics papers treat the classical problems of incompressible flows in helically coiled pipes, vortex breakdown, and transonic flows.

  16. Development of advanced composite ceramic tool material

    SciTech Connect

    Huang Chuanzhen; Ai Xing

    1996-08-01

    An advanced ceramic cutting tool material has been developed by means of silicon carbide whisker (SiCw) reinforcement and silicon carbide particle (SiCp) dispersion. The material has the advantage of high bending strength and fracture toughness. A comparison of the mechanical properties of Al{sub 2}O{sub 3}/SiCp(AP), Al{sub 2}O{sub 3}/SiCw(JX-1), and Al{sub 2}O{sub 3}/SiCp/SiCw(JX-2-I) confirms that the JX-2-I composite exhibits the additive effects of both reinforcing and toughening. The reinforcing and toughening mechanisms of the JX-2-I composite were studied based on the analysis of thermal expansion mismatch and the observation of microstructure. The cutting performance of the JX-2-I composite was investigated preliminarily.

  17. Advanced Numerical Model for Irradiated Concrete

    SciTech Connect

    Giorla, Alain B.

    2015-03-01

    In this report, we establish a numerical model for concrete exposed to irradiation to address these three critical points. The model accounts for creep in the cement paste and its coupling with damage, temperature and relative humidity. The shift in failure mode with the loading rate is also properly represented. The numerical model for creep has been validated and calibrated against different experiments in the literature [Wittmann, 1970, Le Roy, 1995]. Results from a simplified model are shown to showcase the ability of numerical homogenization to simulate irradiation effects in concrete. In future work, the complete model will be applied to the analysis of the irradiation experiments of Elleuch et al. [1972] and Kelly et al. [1969]. This requires a careful examination of the experimental environmental conditions, as in both cases certain critical information is missing, including the relative humidity history. A sensitivity analysis will be conducted to provide lower and upper bounds of the concrete expansion under irradiation, and to check whether the scatter in the simulated results matches that found in experiments. The numerical and experimental results will be compared in terms of expansion and loss of mechanical stiffness and strength. Both effects should be captured accordingly by the model to validate it. Once the model has been validated on these two experiments, it can be applied to simulate concrete from nuclear power plants. To do so, the materials used in these concretes must be as well characterized as possible. The main parameters required are the mechanical properties of each constituent in the concrete (aggregates, cement paste), namely the elastic modulus, the creep properties, the tensile and compressive strength, the thermal expansion coefficient, and the drying shrinkage. These can be either measured experimentally, estimated from the initial composition in the case of cement paste, or back-calculated from mechanical tests on concrete. If some

  18. Advances in turbulence physics and modeling by direct numerical simulations

    NASA Technical Reports Server (NTRS)

    Reynolds, W. C.

    1987-01-01

    The advent of direct numerical simulations of turbulence has opened avenues for research on turbulence physics and turbulence modeling. Direct numerical simulation provides values for anything that the scientist or modeler would like to know about the flow. An overview of some recent advances in the physical understanding of turbulence and in turbulence modeling obtained through such simulations is presented.

  19. Development of Advanced Tools for Cryogenic Integration

    NASA Astrophysics Data System (ADS)

    Bugby, D. C.; Marland, B. C.; Stouffer, C. J.; Kroliczek, E. J.

    2004-06-01

    This paper describes four advanced devices (or tools) that were developed to help solve problems in cryogenic integration. The four devices are: (1) an across-gimbal nitrogen cryogenic loop heat pipe (CLHP); (2) a miniaturized neon CLHP; (3) a differential thermal expansion (DTE) cryogenic thermal switch (CTSW); and (4) a dual-volume nitrogen cryogenic thermal storage unit (CTSU). The across-gimbal CLHP provides a low torque, high conductance solution for gimbaled cryogenic systems wishing to position their cryocoolers off-gimbal. The miniaturized CLHP combines thermal transport, flexibility, and thermal switching (at 35 K) into one device that can be directly mounted to both the cooler cold head and the cooled component. The DTE-CTSW, designed and successfully tested in a previous program using a stainless steel tube and beryllium (Be) end-pieces, was redesigned with a polymer rod and high-purity aluminum (Al) end-pieces to improve performance and manufacturability while still providing a miniaturized design. Lastly, the CTSU was designed with a 6063 Al heat exchanger and integrally welded, segmented, high purity Al thermal straps for direct attachment to both a cooler cold head and a Be component whose peak heat load exceeds its average load by 2.5 times. For each device, the paper will describe its development objective, operating principles, heritage, requirements, design, test data and lessons learned.

  20. Numerical Forming Simulations and Optimisation in Advanced Materials

    NASA Astrophysics Data System (ADS)

    Huétink, J.; van den Boogaard, A. H.; Geijselears, H. J. M.; Meinders, T.

    2007-05-01

    With the introduction of new materials such as high strength steels, metastable steels and fibre reinforced composites, the need for advanced, physically valid constitutive models arises. In finite deformation problems, constitutive relations are commonly formulated in terms of the Cauchy stress as a function of the elastic Finger tensor and an objective rate of the Cauchy stress as a function of the rate of deformation tensor. For isotropic material models this is rather straightforward, but for anisotropic material models, including elastic anisotropy as well as plastic anisotropy, this may lead to confusing formulations. It will be shown that it is more convenient to define the constitutive relations in terms of invariant tensors referred to the deformed metric. Experimental results are presented that show new combinations of strain rate and strain path sensitivity. An adaptive through-thickness integration scheme for plate elements is developed, which improves the accuracy of springback prediction at minimal cost. A procedure is described to automatically compensate the CAD tool shape numerically to obtain the desired product shape. Forming processes need to be optimized for cost saving and product improvement. Until recently, this optimization was primarily done by trial and error in the factory. An optimisation strategy is proposed that assists an engineer in modelling an optimisation problem that suits his needs, including an efficient algorithm for solving the problem.
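
    As a reading aid, the two conventional formulations contrasted in the abstract can be written schematically as follows (standard continuum-mechanics notation; a generic textbook statement, not the specific constitutive model of the paper):

    ```latex
    % Hyperelastic form: Cauchy stress as a function of the elastic Finger tensor b^e
    \sigma = \sigma\left(b^{e}\right), \qquad b^{e} = F^{e}\,(F^{e})^{\mathsf{T}} ,
    % Hypoelastic form: an objective rate of the Cauchy stress driven by the
    % rate-of-deformation tensor d through a tangent modulus \mathbb{C}
    \overset{\triangledown}{\sigma} = \mathbb{C} : d .
    ```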

  1. Advanced cryogenics for cutting tools. Final report

    SciTech Connect

    Lazarus, L.J.

    1996-10-01

    The purpose of the investigation was to determine if cryogenic treatment improved the life and cost effectiveness of perishable cutting tools over other treatments or coatings. Test results showed that in five of the seven perishable cutting tools tested there was no improvement in tool life. The other two tools showed a small gain in tool life, but not as much as when switching manufacturers of the cutting tool. The following conclusions were drawn from this study: (1) titanium nitride coatings are more effective than cryogenic treatment in increasing the life of perishable cutting tools made from all cutting tool materials, (2) cryogenic treatment may increase tool life if the cutting tool is improperly heat treated during its origination, and (3) cryogenic treatment was only effective on those tools made from less sophisticated high speed tool steels. As part of a recent detailed investigation, four cutting tool manufacturers and two cutting tool laboratories were queried, and none could supply any data to substantiate cryogenic treatment of perishable cutting tools.

  2. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    SciTech Connect

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-07

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
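
    As an illustration of how a wear-land width VB(t) can be obtained as a function of cutting time, the sketch below integrates a generic wear-rate law over time. The Usui-type rate form and all constants are assumptions chosen only to give plausible magnitudes, not the calibrated model of the paper.

    ```python
    # Illustrative integration of a generic wear-rate law to obtain wear-land width VB(t).
    # The Usui-type rate law and all constants below are assumptions, not the paper's model.
    import math

    A, B = 1.2e-14, 3500.0       # assumed wear constants
    SIGMA_N = 800.0e6            # assumed normal stress on the flank, Pa
    V_S = 2.5                    # assumed sliding velocity, m/s
    T_INTERFACE = 1100.0         # assumed tool-chip interface temperature, K

    def wear_rate(vb):
        """Assumed Usui-type wear rate dVB/dt (m/s); independent of VB for simplicity."""
        return A * SIGMA_N * V_S * math.exp(-B / T_INTERFACE)

    def wear_land_width(cutting_time, dt=0.1):
        """Integrate the wear rate with forward Euler to estimate VB after cutting_time seconds."""
        vb, t = 0.0, 0.0
        while t < cutting_time:
            vb += wear_rate(vb) * dt
            t += dt
        return vb

    print(f"VB after 300 s of cutting: {wear_land_width(300.0) * 1e3:.3f} mm")
    ```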

  3. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computation algorithms as well as high quality numerical boundary treatments. This paper focuses on the recent developments of numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much needed research in numerical boundary conditions for CAA.
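
    For orientation, a widely used example of the external boundary treatments reviewed here is the far-field radiation condition of the Tam-Webb type. A schematic two-dimensional form for a uniform mean flow (u-bar, v-bar) with mean sound speed a-bar is sketched below; this is a schematic reconstruction, and readers should consult the original references for the precise statement.

    ```latex
    % Schematic Tam-Webb type radiation boundary condition (2-D, uniform mean flow):
    \left( \frac{1}{V(\theta)}\,\frac{\partial}{\partial t}
         + \frac{\partial}{\partial r}
         + \frac{1}{2r} \right)
    \begin{pmatrix} \rho \\ u \\ v \\ p \end{pmatrix} = 0 ,
    \qquad
    V(\theta) = \bar{u}\cos\theta + \bar{v}\sin\theta
              + \sqrt{\bar{a}^{2} - \left(\bar{u}\sin\theta - \bar{v}\cos\theta\right)^{2}} .
    ```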

  4. Numerical Relativity as a tool for studying the Early Universe

    NASA Astrophysics Data System (ADS)

    Garrison, David

    2013-04-01

    Numerical simulations are becoming a more effective tool for conducting detailed investigations into the evolution of our universe. In this presentation, I show how the framework of numerical relativity can be used for studying cosmological models. We are working to develop a large-scale simulation of the dynamical processes in the early universe. These take into account interactions of dark matter, scalar perturbations, gravitational waves, magnetic fields and a turbulent plasma. The code described in this report is a GRMHD code based on the Cactus framework and is structured to utilize one of several different differencing methods chosen at run-time. It is being developed and tested on the Texas Learning and Computation Center's Xanadu cluster.

  5. Numerical Relativity: A critical new tool for astrophysics

    NASA Astrophysics Data System (ADS)

    Zlochower, Yosef

    2009-05-01

    The past few years have seen a renaissance in Numerical Relativity that has transformed the field into a critical tool for studying astrophysical systems. Researchers around the world have made many important new discoveries in the evolution of black-hole systems. In this talk I will describe many of the results that a few years ago seemed impossible to obtain, including unexpectedly large recoil kicks, modeling of the remnant masses and spins, post-Newtonian / NR comparisons, highly-accurate long-term evolutions, explorations of mathematical structure of remnant spacetimes, and N-black-hole merger scenarios.

  6. Alternative Fuel and Advanced Vehicle Tools (AFAVT), AFDC (Fact Sheet)

    SciTech Connect

    Not Available

    2010-01-01

    The Alternative Fuels and Advanced Vehicles Web site offers a collection of calculators, interactive maps, and informational tools to assist fleets, fuel providers, and others looking to reduce petroleum consumption in the transportation sector.

  7. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties as compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  8. Some recent advances in the numerical solution of differential equations

    NASA Astrophysics Data System (ADS)

    D'Ambrosio, Raffaele

    2016-06-01

    The purpose of the talk is the presentation of some recent advances in the numerical solution of differential equations, with special emphasis on reaction-diffusion problems, Hamiltonian problems and ordinary differential equations with discontinuous right-hand side. As a special case, in this short paper we focus on the solution of reaction-diffusion problems by means of special purpose numerical methods particularly adapted to the problem: indeed, following a problem-oriented approach, we propose a modified method of lines based on the use of finite differences shaped on the qualitative behavior of the solutions. Constructive issues and a brief analysis are presented, together with some numerical experiments showing the effectiveness of the approach and a comparison with existing solvers.
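
    The method of lines mentioned above can be illustrated briefly: the sketch below semi-discretizes a one-dimensional reaction-diffusion equation u_t = D u_xx + r u(1 - u) with standard central differences and hands the resulting ODE system to a stiff integrator. This is a generic illustration with assumed parameters, not the problem-adapted discretization proposed in the talk.

    ```python
    # Method-of-lines sketch for u_t = D*u_xx + r*u*(1-u) on [0,1] with zero-flux ends.
    # Parameters and the central-difference stencil are generic, not the adapted scheme.
    import numpy as np
    from scipy.integrate import solve_ivp

    D, r = 1.0e-3, 2.0          # assumed diffusion coefficient and reaction rate
    N = 101
    x = np.linspace(0.0, 1.0, N)
    dx = x[1] - x[0]

    def rhs(t, u):
        """Semi-discrete right-hand side: central differences in space, reaction pointwise."""
        lap = np.empty_like(u)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        lap[0] = 2.0 * (u[1] - u[0]) / dx**2        # zero-flux (Neumann) boundaries
        lap[-1] = 2.0 * (u[-2] - u[-1]) / dx**2
        return D * lap + r * u * (1.0 - u)

    u0 = np.where(x < 0.2, 1.0, 0.0)                # initial front
    sol = solve_ivp(rhs, (0.0, 5.0), u0, method="BDF", rtol=1e-6, atol=1e-8)
    print("front position at t=5:", x[np.argmin(np.abs(sol.y[:, -1] - 0.5))])
    ```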

  9. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  10. Terahertz Tools Advance Imaging for Security, Industry

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Picometrix, a wholly owned subsidiary of Advanced Photonix Inc. (API), of Ann Arbor, Michigan, invented the world's first commercial terahertz system. The company improved the portability and capabilities of their systems through Small Business Innovation Research (SBIR) agreements with Langley Research Center to provide terahertz imaging capabilities for inspecting the space shuttle external tanks and orbiters. Now API's systems make use of the unique imaging capacity of terahertz radiation on manufacturing floors, for thickness measurements of coatings, pharmaceutical tablet production, and even art conservation.

  11. [Advance directives, a tool to humanize care].

    PubMed

    Olmari-Ebbing, M; Zumbach, C N; Forest, M I; Rapin, C H

    2000-07-01

    The relationship between the patient and the medical caregiver is complex, especially from the human, juridical and practical points of view. It depends on legal and deontological considerations, but also on professional habits. Today, we are confronted with a fundamental modification of this relationship. Professional guidelines exist, but are rarely applied and rarely taught in universities. However, patients are eager to move from a paternalistic relationship to a true partnership, more harmonious and more respectful of individual values ("value based medicine"). Advance directives give us an opportunity to improve our practices and to provide care consistent with the needs and wishes of each patient. PMID:10967645

  12. DRIVER TO SUPPORT USE OF NUMERICAL SIMULATION TOOLS

    2001-02-13

    UNIPACK is a computer interface that simplifies and enhances the use of numerical simulation tools to design a primary geometry and/or a forming die for a powder compact and/or to design the pressing process used to shape a powder by compaction. More particularly, it is an interface that utilizes predefined generic geometric configurations to simplify the use of finite element method modeling software to simply and more efficiently design: (1) the shape and size of a powder compact; (2) a forming die to shape a powder compact; and/or (3) the pressing process used to form a powder compact. UNIPACK is a user interface for a predictive model for powder compaction that incorporates unprecedented flexibility to design powder press tooling and powder pressing processes. UNIPACK works with the Sandia National Laboratories (SNL) Engineering Analysis Code Access System (SEACAS) to generate a finite element (FE) mesh and automatically perform a FE analysis of powder compaction. UNIPACK was developed to allow a non-expert with minimal training to quickly and easily design/construct a variable dimension component or die in real time on a desktop or laptop personal computer.

  13. Advanced machine tools, loading systems viewed

    NASA Astrophysics Data System (ADS)

    Kharkov, V. I.

    1986-03-01

    The machine-tooling complex, built from a revolving lathe and a two-armed robot and designed to machine short revolving bodies, including parts with curvilinear and threaded surfaces, from piece blanks in either small-series or series multi-item production, is described. The complex consists of: (1) a model 1V340F30 revolving lathe with a vertical axis of rotation, an 8-position revolving head on a cross carriage and an Elektronika NTs-31 on-line control system; (2) a gantry-style two-armed M20-Ts robot with a 20-kilogram (20 x 2) load capacity; and (3) an 8-position indexable blank table, one of whose positions is for initial unloading of finished parts. Subsequently, machined parts are set onto the position into which all of the blanks are unloaded. The complex's enclosure allows adjustment and process correction during maintenance and convenient observation of the machining process.

  14. Advances in nanocrystallography as a proteomic tool.

    PubMed

    Pechkova, Eugenia; Bragazzi, Nicola Luigi; Nicolini, Claudio

    2014-01-01

    In order to overcome the difficulties and hurdles too often encountered in crystallizing a protein with conventional techniques, our group has introduced the innovative Langmuir-Blodgett (LB)-based crystallization as a major advance in the field of both structural and functional proteomics, thus pioneering the emerging field of so-called nanocrystallography or nanobiocrystallography. This approach uniquely combines protein crystallography and nanotechnologies within an integrated, coherent framework that allows one to obtain highly stable protein crystals and to fully characterize them at the nano- and subnanoscale. A variety of experimental techniques and theoretical/semi-theoretical approaches, ranging from atomic force microscopy, circular dichroism, Raman spectroscopy and other spectroscopic methods, and microbeam grazing-incidence small-angle X-ray scattering to in silico simulations, bioinformatics, and molecular dynamics, has been exploited in order to study the LB films and to investigate the kinetics and the main features of LB-grown crystals. When compared to classical hanging-drop crystallization, the LB technique appears strikingly superior and yields results comparable with crystallization in microgravity environments. Therefore, the achievement of LB-based crystallography can have a tremendous impact in the field of industrial and clinical/therapeutic applications, opening new perspectives for personalized medicine. These implications are envisaged and discussed in the present contribution. PMID:24985772

  15. Advanced tool kits for EPR security.

    PubMed

    Blobel, B

    2000-11-01

    Responding to the challenge for efficient and high quality health care, the shared care paradigm must be established in health. In that context, information systems such as electronic patient records (EPR) have to meet this paradigm supporting communication and interoperation between the health care establishments (HCE) and health professionals (HP) involved. Due to the sensitivity of personal medical information, this co-operation must be provided in a trustworthy way. To enable different views of HCE and HP ranging from management, doctors, nurses up to systems administrators and IT professionals, a set of models for analysis, design and implementation of secure distributed EPR has been developed and introduced. The approach is based on the popular UML methodology and the component paradigm for open, interoperable systems. Easy to use tool kits deal with both application security services and communication security services but also with the security infrastructure needed. Regarding the requirements for distributed multi-user EPRs, modelling and implementation of policy agreements, authorisation and access control are especially considered. Current developments for a security infrastructure in health care based on cryptographic algorithms as health professional cards (HPC), security services employing digital signatures, and health-related TTP services are discussed. CEN and ISO initiatives for health informatics standards in the context of secure and communicable EPR are especially mentioned. PMID:11154968

  16. Advanced CAN (Controller Area Network) Tool

    SciTech Connect

    Terry, D.J.

    2000-03-17

    The CAN interface cards that are currently in use are PCMCIA based and use a microprocessor and CAN chip that are no longer in production. The long-term support of the SGT CAN interface is of concern due to this issue along with performance inadequacies and technical support. The CAN bus is at the heart of the SGT trailer. If the CAN bus in the SGT trailer cannot be maintained adequately, then the trailer itself cannot be maintained adequately. These concerns led to the need for a CRADA to help develop a new product that would be called the ''Gryphon'' CAN tool. FM and T provided manufacturing expertise along with design criteria to ensure SGT compatibility and long-term support. FM and T also provided resources for software support. Dearborn provided software and hardware design expertise to implement the necessary requirements. Both partners worked around heavy internal workloads to support completion of the project. This CRADA establishes a US source for an item that is very critical to support the SGT project. The Dearborn Group had the same goal to provide a US alternative to German suppliers. The Dearborn Group was also interested in developing a CAN product that has performance characteristics that place the Gryphon in a class by itself. This enhanced product not only meets and exceeds SGT requirements; it has opened up options that were not even considered before the project began. The cost of the product is also less than the European options.

  17. Advanced numerical methods in mesh generation and mesh adaptation

    SciTech Connect

    Lipnikov, Konstantine; Danilov, A; Vassilevski, Y; Agonzal, A

    2010-01-01

    Numerical solution of partial differential equations requires appropriate meshes, efficient solvers and robust and reliable error estimates. Generation of high-quality meshes for complex engineering models is a non-trivial task. This task is made more difficult when the mesh has to be adapted to a problem solution. This article is focused on a synergistic approach to mesh generation and mesh adaptation, where the best properties of various mesh generation methods are combined to efficiently build simplicial meshes. First, the advancing front technique (AFT) is combined with the incremental Delaunay triangulation (DT) to build an initial mesh. Second, the metric-based mesh adaptation (MBA) method is employed to improve the quality of the generated mesh and/or to adapt it to a problem solution. We demonstrate with numerical experiments that the combination of all three methods is required for robust meshing of complex engineering models. The key to successful mesh generation is the high quality of the triangles in the initial front. We use a black-box technique to improve surface meshes exported from an unattainable CAD system. The initial surface mesh is refined into a shape-regular triangulation which approximates the boundary with the same accuracy as the CAD mesh. The DT method adds robustness to the AFT. The resulting mesh is topologically correct but may contain a few slivers. The MBA uses seven local operations to modify the mesh topology. It significantly improves the mesh quality. The MBA method is also used to adapt the mesh to a problem solution to minimize the computational resources required for solving the problem. The MBA has a solid theoretical background. In the first two experiments, we consider the convection-diffusion and elasticity problems. We demonstrate the optimal reduction rate of the discretization error on a sequence of adaptive strongly anisotropic meshes. The key element of the MBA method is construction of a tensor metric from hierarchical edge
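
    The central operation in metric-based adaptation, measuring edge lengths in a tensor metric and deciding on local operations, can be sketched briefly. The code below computes the metric length of an edge and a common split/collapse decision; the thresholds follow the usual "unit mesh" convention and are illustrative, not the authors' implementation.

    ```python
    # Metric-based adaptation sketch: edge length measured in a tensor metric M.
    # Thresholds follow the common "unit mesh" convention; constants are illustrative.
    import numpy as np

    def metric_edge_length(p0, p1, M):
        """Length of edge p0-p1 in the metric M: sqrt(e^T M e)."""
        e = np.asarray(p1, float) - np.asarray(p0, float)
        return float(np.sqrt(e @ M @ e))

    def adaptation_action(p0, p1, M, split_above=np.sqrt(2.0), collapse_below=1 / np.sqrt(2.0)):
        """Decide the local operation for one edge of the mesh."""
        ell = metric_edge_length(p0, p1, M)
        if ell > split_above:
            return "split"
        if ell < collapse_below:
            return "collapse"
        return "keep"

    # Anisotropic metric asking for spacing 0.1 along x and 1.0 along y.
    M = np.diag([1.0 / 0.1**2, 1.0 / 1.0**2])
    print(adaptation_action((0.0, 0.0), (0.5, 0.0), M))   # long in the metric -> "split"
    ```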

  18. Machine Tool Advanced Skills Technology Program (MAST). Overview and Methodology.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology Program (MAST) is a geographical partnership of six of the nation's best two-year colleges located in the six states that have about one-third of the density of metals-related industries in the United States. The purpose of the MAST grant is to develop and implement a national training model to overcome…

  19. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  20. Advanced numerics for multi-dimensional fluid flow calculations

    NASA Technical Reports Server (NTRS)

    Vanka, S. P.

    1984-01-01

    In recent years, there has been a growing interest in the development and use of mathematical models for the simulation of fluid flow, heat transfer and combustion processes in engineering equipment. The equations representing the multi-dimensional transport of mass, momenta and species are numerically solved by finite-difference or finite-element techniques. However, despite the multitude of differencing schemes and solution algorithms, and the advancement of computing power, the calculation of multi-dimensional flows, especially three-dimensional flows, remains a mammoth task. The following discussion is concerned with the author's recent work on the construction of accurate discretization schemes for the partial derivatives, and the efficient solution of the set of nonlinear algebraic equations resulting after discretization. The present work has been jointly supported by the Ramjet Engine Division of the Wright Patterson Air Force Base, Ohio, and the NASA Lewis Research Center.

  1. Microfield exposure tool enables advances in EUV lithography development

    SciTech Connect

    Naulleau, Patrick

    2009-09-07

    With demonstrated resist resolution of 20 nm half pitch, the SEMATECH Berkeley EUV microfield exposure tool continues to push crucial advances in the areas of EUV resists and masks. The ever progressing shrink in computer chip feature sizes has been fueled over the years by a continual reduction in the wavelength of light used to pattern the chips. Recently, this trend has been threatened by the unavailability of lens materials suitable for wavelengths shorter than 193 nm. To circumvent this roadblock, a reflective technology utilizing a significantly shorter extreme ultraviolet (EUV) wavelength (13.5 nm) has been under development for the past decade. The dramatic wavelength shrink was required to compensate for optical design limitations intrinsic in mirror-based systems compared to refractive lens systems. With this significant reduction in wavelength comes a variety of new challenges, including developing sources of adequate power; photoresists with suitable resolution, sensitivity, and line-edge roughness characteristics; and the fabrication of reflection masks with zero defects. While source development can proceed in the absence of available exposure tools, in order for progress to be made in the areas of resists and masks it is crucial to have access to advanced exposure tools with resolutions equal to or better than that expected from initial production tools. These advanced development tools, however, need not be full field tools. Also, implementing such tools at synchrotron facilities allows them to be developed independent of the availability of reliable stand-alone EUV sources. One such tool is the SEMATECH Berkeley microfield exposure tool (MET). The most unique attribute of the SEMATECH Berkeley MET is its use of a custom-coherence illuminator made possible by its implementation on a synchrotron beamline. With only conventional illumination and conventional binary masks, the resolution limit of the 0.3-NA optic is approximately 25 nm, however

  2. NUMERICAL CONTROL OF MACHINE TOOLS, AN INSTRUCTOR'S GUIDE.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento. Bureau of Industrial Education.

    IN A SUMMER WORKSHOP, JUNIOR COLLEGE INSTRUCTORS AND INDUSTRIAL SUPERVISORS DEVELOPED THIS GUIDE FOR TEACHER USE IN A 3-SEMESTER-HOUR COURSE AT THE JUNIOR COLLEGE LEVEL. THE COURSE OBJECTIVES ARE TO (1) UPGRADE JOURNEYMEN IN MACHINE TOOL OPERATION, MAINTENANCE, AND TOOLING, AND (2) ACQUAINT MANUFACTURING, SUPERVISORY, PLANNING, AND MAINTENANCE…

  3. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  4. Anvil Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe, III; Bauman, William, III; Keen, Jeremy

    2007-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. In order for the Anvil Tool to remain available to the meteorologists, the AMU was tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). This report describes the work done by the AMU to develop the Anvil Tool for AWIPS to create a graphical overlay depicting the threat from thunderstorm anvil clouds. The AWIPS Anvil Tool is based on the previously deployed AMU MIDDS Anvil Tool. SMG and 45 WS forecasters have used the MIDDS Anvil Tool during launch and landing operations. SMG's primary weather analysis and display system is now AWIPS and the 45 WS has plans to replace MIDDS with AWIPS. The Anvil Tool creates a graphic that users can overlay on satellite or radar imagery to depict the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on an average of the upper-level observed or forecasted winds. The graphic includes 10 and 20 nm standoff circles centered at the location of interest, in addition to one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30 degree sector width based on a previous AMU study which determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level (300- to 150-mb) wind direction. This report briefly describes the history of the MIDDS Anvil Tool and then explains how the initial development of the AWIPS Anvil Tool was carried out. After testing was
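
    The arc geometry described above is straightforward to reproduce. The sketch below computes the one-, two-, and three-hour upwind arc radii and the 30-degree sector edges from an average upper-level wind; it is a hypothetical stand-alone calculation for illustration, not the AWIPS or MIDDS plug-in code.

    ```python
    # Sketch of the anvil-threat geometry: standoff circles plus 1-, 2-, 3-hour upwind arcs
    # spanning a 30-degree sector about the upwind direction. Not the AMU plug-in code.
    from pprint import pprint

    def anvil_geometry(wind_from_deg, wind_speed_kt, hours=(1, 2, 3), half_sector_deg=15.0):
        """Return standoff circles and upwind arcs for an anvil-threat overlay.

        wind_from_deg -- average upper-level wind direction (degrees FROM, meteorological)
        wind_speed_kt -- average upper-level wind speed in knots (n mi per hour)
        """
        arcs = []
        for h in hours:
            arcs.append({
                "hours": h,
                "radius_nm": wind_speed_kt * h,                              # distance anvil material travels
                "sector_start_deg": (wind_from_deg - half_sector_deg) % 360.0,
                "sector_end_deg": (wind_from_deg + half_sector_deg) % 360.0,
            })
        return {"standoff_circles_nm": (10, 20), "arcs": arcs}

    # Example: 250-degree wind at 35 kt -> arcs at 35, 70 and 105 n mi toward 235-265 degrees.
    pprint(anvil_geometry(250.0, 35.0))
    ```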

  5. An Advanced Manipulator For Poisson Series With Numerical Coefficients

    NASA Astrophysics Data System (ADS)

    Biscani, Francesco; Casotto, S.

    2006-06-01

    The availability of an efficient and featureful manipulator for Poisson series with numerical coefficients is a standard need for celestial mechanicians and has arisen during our work on the analytical development of the Tide-Generating Potential (TGP). In the harmonic expansion of the TGP, the Poisson series appearing in the theories of motion of the celestial bodies are subjected to a wide set of mathematical operations, ranging from simple additions and multiplications to more sophisticated operations on Legendre polynomials and spherical harmonics with Poisson series as arguments. To perform these operations we have developed an algebraic manipulator, called Piranha, structured as an object-oriented multi-platform C++ library. Piranha handles series with real and complex coefficients, and operates with an arbitrary degree of precision. It supports advanced features such as trigonometric operations and the generation of special functions from Poisson series. Piranha is provided with a proof-of-concept, multi-platform GUI, which serves as a testbed and benchmark for the library. We describe Piranha's architecture and characteristics, what it accomplishes currently and how it will be extended in the future (e.g., to handle series with symbolic coefficients in a manner consistent with its current design).
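
    To make the kind of object such a manipulator handles concrete, the sketch below represents a cosine-only Poisson series with numerical coefficients as a dictionary keyed by integer multi-indices of the angles, and multiplies two such series with the product-to-sum trigonometric identity. This is a toy illustration in Python, unrelated to Piranha's actual C++ data structures.

    ```python
    # Toy Poisson (cosine) series: {multi-index k: coefficient} meaning sum_k c_k * cos(k . theta).
    # Multiplication uses cos(a)*cos(b) = (cos(a-b) + cos(a+b)) / 2. Not Piranha's representation.
    from collections import defaultdict

    def multiply(series_a, series_b):
        """Product of two cosine Poisson series over the same angle variables."""
        result = defaultdict(float)
        for ka, ca in series_a.items():
            for kb, cb in series_b.items():
                plus = tuple(i + j for i, j in zip(ka, kb))
                minus = tuple(i - j for i, j in zip(ka, kb))
                result[plus] += 0.5 * ca * cb
                result[minus] += 0.5 * ca * cb
        # Canonicalize cos(-k.theta) = cos(k.theta) by flipping keys whose leading index is negative.
        canonical = defaultdict(float)
        for k, c in result.items():
            first = next((x for x in k if x != 0), 0)
            canonical[tuple(-i for i in k) if first < 0 else k] += c
        return dict(canonical)

    # (2 + cos(l)) * cos(l - 2*lp) over the angles (l, lp)
    a = {(0, 0): 2.0, (1, 0): 1.0}
    b = {(1, -2): 1.0}
    print(multiply(a, b))   # {(1, -2): 2.0, (2, -2): 0.5, (0, 2): 0.5}
    ```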

  6. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations and techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  7. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools for application to advanced fault tolerant aerospace systems were evaluated. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault tolerant systems. It was further concluded that, subject to some limitations (the difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault tolerant systems for air transport.

  8. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30-degree sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15 degrees of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
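
    The arc construction described in this record reduces to simple geometry. The sketch below, written as a minimal illustration rather than the AMU tool's actual code, places points along one upwind threat arc given a mean upper-level wind ("from" direction and speed in knots); the function name, the flat-earth conversion, and the example coordinates are illustrative assumptions.

    import math

    def anvil_arc(lat_deg, lon_deg, wind_from_deg, wind_speed_kt, hours,
                  half_width_deg=15.0, n_points=7):
        # Points along the upwind threat arc for one lead time: the arc lies at a
        # distance of wind_speed * hours from the location of interest and spans
        # the wind direction +/- 15 degrees.  Flat-earth approximation, adequate
        # only for the short distances (tens of nautical miles) involved here.
        radius_nm = wind_speed_kt * hours
        points = []
        for k in range(n_points):
            bearing = wind_from_deg - half_width_deg + 2.0 * half_width_deg * k / (n_points - 1)
            b = math.radians(bearing)
            dlat = radius_nm * math.cos(b) / 60.0                        # 60 nm per degree of latitude
            dlon = radius_nm * math.sin(b) / (60.0 * math.cos(math.radians(lat_deg)))
            points.append((lat_deg + dlat, lon_deg + dlon))
        return points

    # Example: the three-hour arc for a 30 kt upper-level wind from the west
    print(anvil_arc(28.5, -80.6, wind_from_deg=270.0, wind_speed_kt=30.0, hours=3))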

  9. Advanced tools, multiple missions, flexible organizations, and education

    NASA Astrophysics Data System (ADS)

    Lucas, Ray A.; Koratkar, Anuradha

    2000-07-01

    In this new era of modern astronomy, observations across multiple wavelengths are often required. This implies understanding many different costly and complex observatories. Yet, the process for translating ideas into proposals is very similar for all of these observatories. If we had a new generation of uniform, common tools, writing proposals for the various observatories would be simpler for the observer because the learning curve would not be as steep. As observatory staffs struggle to meet the demands for higher scientific productivity with fewer resources, it is important to remember that another benefit of having such universal tools is that they enable much greater flexibility within an organization. The shifting manpower needs of multiple-instrument support or multiple-mission operations may be more readily met since the expertise is built into the tools. The flexibility of an organization is critical to its ability to change, to plan ahead and respond to various new opportunities and operating conditions on shorter time scales, and to achieve the goal of maximizing scientific returns. In this paper we will discuss the role of a new generation of tools with relation to multiple missions and observatories. We will also discuss some of the impact of how uniform, consistently familiar software tools can enhance the individual's expertise and the organization's flexibility. Finally, we will discuss the relevance of advanced tools to higher education.

  10. Numerical optimization design of advanced transonic wing configurations

    NASA Technical Reports Server (NTRS)

    Cosentino, G. B.; Holst, T. L.

    1984-01-01

    A computationally efficient and versatile technique for use in the design of advanced transonic wing configurations has been developed. A reliable and fast transonic wing flow-field analysis program, TWING, has been coupled with a modified quasi-Newton method, unconstrained optimization algorithm, QNMDIF, to create a new design tool. Fully three-dimensional wing designs utilizing both specified wing pressure distributions and drag-to-lift ratio minimization as design objectives are demonstrated. Because of the high computational efficiency of each of the components of the design code, in particular the vectorization of TWING and the high speed of the Cray X-MP vector computer, the computer time required for a typical wing design is reduced by approximately an order of magnitude over previous methods. In the results presented here, the computed wave drag has been used as the quantity to be optimized (minimized) with great success, yielding wing designs with nearly shock-free (zero wave drag) pressure distributions and very reasonable wing section shapes.
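
    The coupling described here - a flow analysis driven by a quasi-Newton optimizer - can be sketched in a few lines. The example below uses SciPy's BFGS minimizer as a stand-in for QNMDIF and a smooth surrogate function as a stand-in for the TWING wave-drag evaluation; the surrogate, the six design variables, and the tolerances are illustrative assumptions, not the design code itself.

    import numpy as np
    from scipy.optimize import minimize

    def wave_drag(design_vars):
        # Placeholder objective: in the actual design code this call would run the
        # transonic analysis on the wing defined by design_vars and return the
        # computed wave drag.  A smooth surrogate is used here so the example runs.
        x = np.asarray(design_vars)
        return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sum(x ** 4))

    x0 = np.zeros(6)  # e.g. section-shape perturbation coefficients (illustrative)
    result = minimize(wave_drag, x0, method="BFGS", options={"gtol": 1e-8})
    print("optimized design variables:", result.x)
    print("minimized surrogate wave drag:", result.fun)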

  11. Selected aspects of microelectronics technology and applications: Numerically controlled machine tools. Technology trends series no. 2

    NASA Astrophysics Data System (ADS)

    Sigurdson, J.; Tagerud, J.

    1986-05-01

    A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from databases. Diagrams, statistics, and a bibliography are included.

  12. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Astrophysics Data System (ADS)

    Doyle, Monica M.; O'Neil, Daniel A.; Christensen, Carissa B.

    2005-02-01

    Forecasting technology capabilities requires a tool and a process for capturing state-of-the-art technology metrics and estimates for future metrics. A decision support tool, known as the Advanced Technology Lifecycle Analysis System (ATLAS), contains a Technology Tool Box (TTB) database designed to accomplish this goal. Sections of this database correspond to a Work Breakdown Structure (WBS) developed by NASA's Exploration Systems Research and Technology (ESRT) Program. These sections cover the waterfront of technologies required for human and robotic space exploration. Records in each section include technology performance, operations, and programmatic metrics. Timeframes in the database provide metric values for the state of the art (Timeframe 0) and forecasts for timeframes that correspond to spiral development milestones in NASA's Exploration Systems Mission Directorate (ESMD) development strategy. Collecting and vetting data for the TTB will involve technologists from across the agency, the aerospace industry and academia. Technologists will have opportunities to submit technology metrics and forecasts to the TTB development team. Semi-annual forums will facilitate discussions about the basis of forecast estimates. As the tool and process mature, the TTB will serve as a powerful communication and decision support tool for the ESRT program.

  13. Interoperable mesh and geometry tools for advanced petascale simulations

    SciTech Connect

    Diachin, L; Bauer, A; Fix, B; Kraftcheck, J; Jansen, K; Luo, X; Miller, M; Ollivier-Gooch, C; Shephard, M; Tautges, T; Trease, H

    2007-07-04

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. The Center for Interoperable Technologies for Advanced Petascale Simulations (ITAPS) will deliver interoperable and interchangeable mesh, geometry, and field manipulation services that are of direct use to SciDAC applications. The premise of our technology development goal is to provide such services as libraries that can be used with minimal intrusion into application codes. To develop these technologies, we focus on defining a common data model and data-structure-neutral interfaces that unify a number of different services such as mesh generation and improvement, front tracking, adaptive mesh refinement, shape optimization, and solution transfer operations. We highlight the use of several ITAPS services in SciDAC applications.

  14. Advanced Electric Submersible Pump Design Tool for Geothermal Applications

    SciTech Connect

    Xuele Qi; Norman Turnquist; Farshad Ghasripoor

    2012-05-31

    Electrical Submersible Pumps (ESPs) present higher efficiency, larger production rate, and can be operated in deeper wells than the other geothermal artificial lifting systems. Enhanced Geothermal Systems (EGS) applications recommend lifting 300 °C geothermal water at an 80 kg/s flow rate in a maximum 10-5/8-inch diameter wellbore to improve the cost-effectiveness. In this paper, an advanced ESP design tool comprising a 1D theoretical model and a 3D CFD analysis has been developed to design ESPs for geothermal applications. Design of Experiments was also performed to optimize the geometry and performance. The designed mixed-flow type centrifugal impeller and diffuser exhibit high efficiency and head rise under simulated EGS conditions. The design tool has been validated by comparing the prediction to experimental data of an existing ESP product.
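
    A 1D design model of this kind is typically built on standard turbomachinery relations. As a minimal sketch - not the paper's model - the function below evaluates the ideal Euler head of one centrifugal stage under the usual no-inlet-swirl assumption, neglecting slip and losses; the stage dimensions and operating point in the example are illustrative placeholders.

    import math

    def euler_head(rpm, d2_m, b2_m, flow_m3s, beta2_deg, g=9.81):
        # Ideal (Euler) head rise of one centrifugal impeller stage, assuming no
        # inlet swirl and neglecting slip and hydraulic losses.
        u2 = math.pi * d2_m * rpm / 60.0                     # blade tip speed [m/s]
        cm2 = flow_m3s / (math.pi * d2_m * b2_m)             # meridional velocity at exit [m/s]
        cu2 = u2 - cm2 / math.tan(math.radians(beta2_deg))   # tangential (swirl) velocity [m/s]
        return u2 * cu2 / g                                  # ideal head per stage [m]

    # Illustrative stage only (values are not from the paper): 3600 rpm, 90 mm impeller,
    # 12 mm exit width, 0.02 m^3/s, 25-degree backswept exit blade angle
    print(euler_head(rpm=3600, d2_m=0.09, b2_m=0.012, flow_m3s=0.02, beta2_deg=25))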

  15. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  16. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2010-12-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates) are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre and SiC fibre. The matrixes are resin matrix, metal matrix and ceramic matrix. Advanced composite materials have higher specific strength and higher specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, anticorrosion, heat insulation, sound insulation, shock absorption and high and low temperature resistance. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards and tails, etc. Their hardness is up to 62-65 HRC. When drilling unidirectional laminates, the holes are greatly affected by the fibre laminate direction of carbon fibre reinforced composite material due to its anisotropy. There are burrs and splits at the exit because of stress concentration. In addition, delamination occurs and the hole tends to be undersized. Burrs are caused by poor sharpness of the cutting edge; delamination, tearing and splitting are caused by the high stress resulting from high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and higher drilling force at the same time. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc. and thrust force. At the same time, the number of holes and the difficulty of making holes in composites have also increased. This requires high-performance drills that do not introduce defects and have a long tool life. It has become a trend to develop super hard material tools and tools with special geometry for drilling

  17. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2011-05-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates) are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre and SiC fibre. The matrixes are resin matrix, metal matrix and ceramic matrix. Advanced composite materials have higher specific strength and higher specific modulus than the first-generation glass fibre reinforced resin composites. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, anticorrosion, heat insulation, sound insulation, shock absorption and high and low temperature resistance. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards and tails, etc. Their hardness is up to 62-65 HRC. When drilling unidirectional laminates, the holes are greatly affected by the fibre laminate direction of carbon fibre reinforced composite material due to its anisotropy. There are burrs and splits at the exit because of stress concentration. In addition, delamination occurs and the hole tends to be undersized. Burrs are caused by poor sharpness of the cutting edge; delamination, tearing and splitting are caused by the high stress resulting from high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and higher drilling force at the same time. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc. and thrust force. At the same time, the number of holes and the difficulty of making holes in composites have also increased. This requires high-performance drills that do not introduce defects and have a long tool life. It has become a trend to develop super hard material tools and tools with special geometry for drilling

  18. Tools for advance directives. American Health Information Management Association.

    PubMed

    Schraffenberger, L A

    1992-02-01

    This issue of the Journal of AHIMA contains a Position Statement on advance directives. Here we have included several "tools" or helpful documents to support your organization's ongoing education regarding advance directives. First, we offer a "Sample Policy and Procedure" addressing the administrative process of advance directives. This sample policy was adapted from a policy shared by Jean Clark, RRA, operations director with Roper Hospital in Charleston, SC, and a director on the AHIMA Board of Directors. Do not automatically accept this policy and procedure for your organization. Instead, the health information management professional could use this sample to write your organization's own, specific policy and procedures that are consistent with your state's law and legal counsel's advice. The second article, "Advance Directives and the New Joint Commission Requirements," compares 1992 Joint Commission standards for Patient Rights and The Patient Self-Determination Act requirements. Selected sections from the Joint Commission chapter on Patient Rights are highlighted and comments added that contrast it with the act. "Common Questions and Answers Related to Advance Directives" is the third tool we offer. These questions and answers may be used for a patient education brochure or staff inservice education program outline. Again, information specific to your own state needs to be added. The fourth tool we offer is miniature "Sample Slides" or overhead transparency copy that can be enlarged and used for a presentation on the basics of advance directives for a community group for staff education. We thank Dee McLane, RRA, director, Medical Information Services at Self Memorial Hospital in Greenwood, SC, who developed these slides for presentations conducted at her hospital. We also thank Jeri Whitworth, RRA, who produced the graphics on these slides. Whitworth is a first year director on the AHIMA Board of Directors this year. Again you can use as is or consider

  19. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  20. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  1. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  2. ADVISOR: a systems analysis tool for advanced vehicle modeling

    NASA Astrophysics Data System (ADS)

    Markel, T.; Brooker, A.; Hendricks, T.; Johnson, V.; Kelly, K.; Kramer, B.; O'Keefe, M.; Sprik, S.; Wipke, K.

    This paper provides an overview of the Advanced Vehicle Simulator (ADVISOR), the US Department of Energy's (DOE's) vehicle systems analysis tool, written in the MATLAB/Simulink environment and developed by the National Renewable Energy Laboratory. ADVISOR provides the vehicle engineering community with an easy-to-use, flexible, yet robust and supported analysis package for advanced vehicle modeling. It is primarily used to quantify the fuel economy, the performance, and the emissions of vehicles that use alternative technologies including fuel cells, batteries, electric motors, and internal combustion engines in hybrid (i.e. multiple power sources) configurations. It excels at quantifying the relative change that can be expected due to the implementation of technology compared to a baseline scenario. ADVISOR's capabilities and limitations are presented and the power source models that are included in ADVISOR are discussed. Finally, several applications of the tool are presented to highlight ADVISOR's functionality. The content of this paper is based on a presentation made at the 'Development of Advanced Battery Engineering Models' workshop held in Crystal City, Virginia in August 2001.

  3. An Advanced Tool for Control System Design and Maintenance

    SciTech Connect

    Storm, Joachim; Lohmann, Heinz

    2006-07-01

    The detailed engineering for control systems is usually supported by CAD tools creating the relevant logic diagrams, including software parameters and signal cross references. However, at this stage of the design an early V and V process for checking the functional correctness of the design is not available. The article describes the scope and capabilities of an advanced control system design tool which has the embedded capability of a stand-alone simulation of complex logic structures. The tool provides the following features for constructing logic diagrams for control systems: - Drag and drop construction of logic diagrams using predefined symbol sets; - Cross reference facility; - Data extraction facility; - Stand-alone simulation for logic diagrams featuring on-the-fly changes, signal line animation, value boxes, mini trends, etc.; - Creation and on-line animation of Compound Objects (Handler); - Code Generation Facility for Simulation; - Code Generation Facility for several control systems. The results of the integrated simulation-based V and V process can be used further for initial control system configuration and life cycle management, as well as for Engineering Test Bed applications and finally in full-scope replica simulators for Operator Training. (authors)

  4. A new clinical tool for assessing numerical abilities in neurological diseases: numerical activities of daily living

    PubMed Central

    Semenza, Carlo; Meneghello, Francesca; Arcara, Giorgio; Burgio, Francesca; Gnoato, Francesca; Facchini, Silvia; Benavides-Varela, Silvia; Clementi, Maurizio; Butterworth, Brian

    2014-01-01

    The aim of this study was to build an instrument, the numerical activities of daily living (NADL), designed to identify the specific impairments in numerical functions that may cause problems in everyday life. These impairments go beyond what can be inferred from the available scales evaluating activities of daily living in general, and are not adequately captured by measures of the general deterioration of cognitive functions as assessed by standard clinical instruments like the MMSE and MoCA. We assessed a control group (n = 148) and a patient group affected by a wide variety of neurological conditions (n = 175), with NADL along with IADL, MMSE, and MoCA. The NADL battery was found to have satisfactory construct validity and reliability, across a wide age range. This enabled us to calculate appropriate criteria for impairment that took into account age and education. It was found that neurological patients tended to overestimate their abilities as compared to the judgment made by their caregivers, assessed with objective tests of numerical abilities. PMID:25126077

  5. A new clinical tool for assessing numerical abilities in neurological diseases: numerical activities of daily living.

    PubMed

    Semenza, Carlo; Meneghello, Francesca; Arcara, Giorgio; Burgio, Francesca; Gnoato, Francesca; Facchini, Silvia; Benavides-Varela, Silvia; Clementi, Maurizio; Butterworth, Brian

    2014-01-01

    The aim of this study was to build an instrument, the numerical activities of daily living (NADL), designed to identify the specific impairments in numerical functions that may cause problems in everyday life. These impairments go beyond what can be inferred from the available scales evaluating activities of daily living in general, and are not adequately captured by measures of the general deterioration of cognitive functions as assessed by standard clinical instruments like the MMSE and MoCA. We assessed a control group (n = 148) and a patient group affected by a wide variety of neurological conditions (n = 175), with NADL along with IADL, MMSE, and MoCA. The NADL battery was found to have satisfactory construct validity and reliability, across a wide age range. This enabled us to calculate appropriate criteria for impairment that took into account age and education. It was found that neurological patients tended to overestimate their abilities as compared to the judgment made by their caregivers, assessed with objective tests of numerical abilities. PMID:25126077

  6. Plans and resources required for a computer numerically controlled machine tool tester

    SciTech Connect

    Newton, L.E.; Burleson, R.R.; McCue, H.K.; Pomernacki, C.L.; Mansfield, A.R.; Childs, J.J.

    1982-07-19

    Precision computer numerically controlled (CNC) machine tools present unique and especially difficult problems in the areas of qualification and fault isolation. In this report, we examine and classify these problems, discuss methods to resolve them effectively, and present estimates of the resources needed to design and build a CNC/machine tool tester.

  7. Recent advances in two-phase flow numerics

    SciTech Connect

    Mahaffy, J.H.; Macian, R.

    1997-07-01

    The authors review three topics in the broad field of numerical methods that may be of interest to individuals modeling two-phase flow in nuclear power plants. The first topic is the iterative solution of linear equations created during the solution of finite volume equations. The second is numerical tracking of macroscopic liquid interfaces. The final area surveyed is the use of higher-order spatial differencing techniques.
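
    The first topic, iterative solution of the sparse linear systems produced by finite volume discretizations, can be illustrated with the simplest member of that family. The Jacobi sketch below is a stand-in for the Krylov and preconditioned methods a production two-phase flow code would actually use; the test matrix is an illustrative diagonally dominant example.

    import numpy as np

    def jacobi(A, b, tol=1e-10, max_iter=500):
        # Minimal Jacobi iteration for A x = b; the structure (split the matrix,
        # sweep until the update is small) is shared by the more sophisticated
        # iterative solvers used in practice.
        D = np.diag(A)
        R = A - np.diagflat(D)
        x = np.zeros_like(b, dtype=float)
        for _ in range(max_iter):
            x_new = (b - R @ x) / D
            if np.linalg.norm(x_new - x, ord=np.inf) < tol:
                return x_new
            x = x_new
        return x

    # Diagonally dominant test system (Jacobi converges for such matrices)
    A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
    b = np.array([1.0, 2.0, 3.0])
    print(jacobi(A, b))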

  8. An Advanced Decision Support Tool for Electricity Infrastructure Operations

    SciTech Connect

    Chen, Yousu; Huang, Zhenyu; Wong, Pak C.; Mackey, Patrick S.; Allwardt, Craig H.; Ma, Jian; Greitzer, Frank L.

    2010-01-31

    Electricity infrastructure, as one of the most critical infrastructures in the U.S., plays an important role in modern societies. Its failure would lead to significant disruption of people's lives, industry and commercial activities, and result in massive economic losses. Reliable operation of electricity infrastructure is an extremely challenging task because human operators need to consider thousands of possible configurations in near real-time to choose the best option and operate the network effectively. In today's practice, electricity infrastructure operation is largely based on operators' experience with very limited real-time decision support, resulting in inadequate management of complex predictions and the inability to anticipate, recognize, and respond to situations caused by human errors, natural disasters, or cyber attacks. Therefore, a systematic approach is needed to manage the complex operational paradigms and choose the best option in a near-real-time manner. This paper proposes an advanced decision support tool for electricity infrastructure operations. The tool turns large amounts of data into actionable information to help operators monitor power grid status in real time; performs trend analysis to identify system trends at the regional or system level, helping the operator foresee and discern emergencies; performs clustering analysis to assist operators in identifying the relationships between system configurations and affected assets; and interactively evaluates alternative remedial actions to aid operators in making effective and timely decisions. This tool can provide significant decision support on electricity infrastructure operations and lead to better reliability in power grids. This paper presents examples with actual electricity infrastructure data to demonstrate the capability of this tool.
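
    As one illustration of the clustering analysis mentioned above, the toy k-means routine below groups operating snapshots described by a few numeric features. The feature set, the cluster count, and the choice of plain k-means are assumptions made for illustration; they are not taken from the tool described in the paper.

    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        # Plain k-means on feature vectors describing operating snapshots
        # (e.g. line flows, bus voltages).
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = dists.argmin(axis=1)
            for j in range(k):
                if np.any(labels == j):              # keep the old center if a cluster empties
                    centers[j] = X[labels == j].mean(axis=0)
        return labels, centers

    snapshots = np.random.default_rng(1).normal(size=(300, 8))  # 8 illustrative grid features
    labels, centers = kmeans(snapshots, k=4)
    print(np.bincount(labels))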

  9. Bioinformatics Methods and Tools to Advance Clinical Care

    PubMed Central

    Lecroq, T.

    2015-01-01

    Summary Objectives To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. Method We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As last year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor evaluated separately the set of 1,594 articles and the evaluation results were merged for retaining 15 articles for peer-review. Results The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support. The authors present data mining methods applied to large-scale datasets of past transplants. The objective is to identify chances of survival. Conclusions The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care. Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their

  10. Clinical holistic health: advanced tools for holistic medicine.

    PubMed

    Ventegodt, Søren; Clausen, Birgitte; Nielsen, May Lyck; Merrick, Joav

    2006-01-01

    According to holistic medical theory, the patient will heal when old painful moments, the traumatic events of life that are often called "gestalts", are integrated in the present "now". The advanced holistic physician's expanded toolbox has many different tools to induce this healing, some that are more dangerous and potentially traumatic than others. The more intense the therapeutic technique, the more emotional energy will be released and contained in the session, but the higher also is the risk for the therapist to lose control of the session and lose the patient to his or her own dark side. To avoid harming the patient must be the highest priority in holistic existential therapy, making sufficient education and training an issue of highest importance. The concept of "stepping up" the therapy by using more and more "dramatic" methods to get access to repressed emotions and events has led us to a "therapeutic staircase" with ten steps: (1) establishing the relationship; (2) establishing intimacy, trust, and confidentiality; (3) giving support and holding; (4) taking the patient into the process of physical, emotional, and mental healing; (5) social healing of being in the family; (6) spiritual healing--returning to the abstract wholeness of the soul; (7) healing the informational layer of the body; (8) healing the three fundamental dimensions of existence: love, power, and sexuality in a direct way using, among other techniques, "controlled violence" and "acupressure through the vagina"; (9) mind-expanding and consciousness-transformative techniques like psychotropic drugs; and (10) techniques transgressing the patient's borders and, therefore, often traumatizing (for instance, the use of force against the will of the patient). We believe that the systematic use of the staircase will greatly improve the power and efficiency of holistic medicine for the patient and we invite a broad cooperation in scientifically testing the efficiency of the advanced holistic

  11. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  12. Sandia Advanced MEMS Design Tools, Version 2.0

    2002-06-13

    Sandia Advanced MEMS Design Tools is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Provide enabling educational information (including pictures, videos, technical information) c) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library) d) Facilitate the process of having MEMS fabricated at SNL e) Facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software AutoCAD, these files are not intended for use independent of the CD. NOTE: THE CUSTOMER MUST PURCHASE HIS/HER OWN COPY OF AutoCAD TO USE WITH THESE FILES.

  13. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that operate either dynamically or in a steady-state manner. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for the ease of future maintenance and upgrades.
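
    The Equivalent System Mass figure of merit used in such trade studies is commonly computed by converting volume, power, cooling, and crew-time requirements into mass through mission-dependent equivalency factors. The sketch below shows that general form; the factor values and the example inputs are placeholders for illustration, not ALSSAT's data.

    def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                               crewtime_hr_per_yr, duration_yr,
                               v_eq=66.7, p_eq=237.0, c_eq=60.0, ct_eq=1.0):
        # ESM = hardware mass plus volume, power, cooling, and crew-time penalties
        # expressed as mass via equivalency factors (illustrative values above).
        return (mass_kg
                + volume_m3 * v_eq                           # kg per m^3 of pressurized volume
                + power_kw * p_eq                            # kg per kW of electrical power
                + cooling_kw * c_eq                          # kg per kW of heat rejection
                + crewtime_hr_per_yr * duration_yr * ct_eq)  # kg per crew-hour of maintenance

    print(equivalent_system_mass(mass_kg=500, volume_m3=4.0, power_kw=2.5,
                                 cooling_kw=2.5, crewtime_hr_per_yr=20, duration_yr=2))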

  14. Advanced REACH Tool: a Bayesian model for occupational exposure assessment.

    PubMed

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-06-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper is able to utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110
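
    The full ART model is far richer than any short example, but the mechanism of refining a prior exposure estimate with context-specific measurements can be illustrated with a conjugate normal update on the log scale. Everything in the sketch below - the known within-scenario variability, the prior values, and the percentile arithmetic - is a simplified assumption for illustration, not the ART methodology itself.

    import numpy as np

    def update_log_exposure(prior_mu, prior_sd, log_measurements, sigma_within):
        # Conjugate normal update of the mean log exposure, treating the
        # within-scenario log-scale standard deviation as known.
        y = np.asarray(log_measurements, dtype=float)
        post_var = 1.0 / (1.0 / prior_sd ** 2 + y.size / sigma_within ** 2)
        post_mu = post_var * (prior_mu / prior_sd ** 2 + y.sum() / sigma_within ** 2)
        return post_mu, np.sqrt(post_var)

    # Prior from a (hypothetical) mechanistic estimate; three measurements in mg/m^3
    sigma_w = 0.6
    mu, sd = update_log_exposure(prior_mu=np.log(0.5), prior_sd=0.8,
                                 log_measurements=np.log([0.7, 0.4, 0.9]),
                                 sigma_within=sigma_w)
    # Central estimate and 95% credible interval for the 90th percentile of exposure
    z90 = 1.2816
    print(np.exp(mu + z90 * sigma_w),
          np.exp([mu - 1.96 * sd + z90 * sigma_w, mu + 1.96 * sd + z90 * sigma_w]))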

  15. Tools for the advancement of undergraduate statistics education

    NASA Astrophysics Data System (ADS)

    Schaffner, Andrew Alan

    To keep pace with advances in applied statistics and to maintain literate consumers of quantitative analyses, statistics educators stress the need for change in the classroom (Cobb, 1992; Garfield, 1993, 1995; Moore, 1991a; Snee, 1993; Steinhorst and Keeler, 1995). These authors stress a more concept oriented undergraduate introductory statistics course which emphasizes true understanding over mechanical skills. Drawing on recent educational research, this dissertation attempts to realize this vision by developing tools and pedagogy to assist statistics instructors. This dissertation describes statistical facets, pieces of statistical understanding that are building blocks of knowledge, and discusses DIANA, a World-Wide Web tool for diagnosing facets. Further, I show how facets may be incorporated into course design through the development of benchmark lessons based on the principles of collaborative learning (diSessa and Minstrell, 1995; Cohen, 1994; Reynolds et al., 1995; Bruer, 1993; von Glasersfeld, 1991) and activity based courses (Jones, 1991; Yackel, Cobb and Wood, 1991). To support benchmark lessons and collaborative learning in large classes I describe Virtual Benchmark Instruction, benchmark lessons which take place on a structured hypertext bulletin board using the technology of the World-Wide Web. Finally, I present randomized experiments which suggest that these educational developments are effective in a university introductory statistics course.

  16. Advanced REACH Tool: A Bayesian Model for Occupational Exposure Assessment

    PubMed Central

    McNally, Kevin; Warren, Nicholas; Fransman, Wouter; Entink, Rinke Klein; Schinkel, Jody; van Tongeren, Martie; Cherrie, John W.; Kromhout, Hans; Schneider, Thomas; Tielemans, Erik

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sources of information within a Bayesian statistical framework. The information is obtained from expert knowledge expressed in a calibrated mechanistic model of exposure assessment, data on inter- and intra-individual variability in exposures from the literature, and context-specific exposure measurements. The ART provides central estimates and credible intervals for different percentiles of the exposure distribution, for full-shift and long-term average exposures. The ART can produce exposure estimates in the absence of measurements, but the precision of the estimates improves as more data become available. The methodology presented in this paper is able to utilize partially analogous data, a novel approach designed to make efficient use of a sparsely populated measurement database although some additional research is still required before practical implementation. The methodology is demonstrated using two worked examples: an exposure to copper pyrithione in the spraying of antifouling paints and an exposure to ethyl acetate in shoe repair. PMID:24665110

  17. Sandia Advanced MEMS Design Tools, Version 2.2.5

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: A check has been added to catch invalid block names. DRC features: Added username/password validation, added a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: The documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  18. Sandia Advanced MEMS Design Tools, Version 2.2.5

    SciTech Connect

    Yarberry, Victor; Allen, James; Lantz, Jeffery; Priddy, Brian; & Westling, Belinda

    2010-01-19

    The Sandia National Laboratories Advanced MEMS Design Tools, Version 2.2.5, is a collection of menus, prototype drawings, and executables that provide significant productivity enhancements when using AutoCAD to design MEMS components. This release is designed for AutoCAD 2000i, 2002, or 2004 and is supported under Windows NT 4.0, Windows 2000, or XP. SUMMiT V (Sandia Ultra-planar Multi-level MEMS Technology) is a 5-level surface micromachine fabrication technology, which customers internal and external to Sandia can access to fabricate prototype MEMS devices. This CD contains an integrated set of electronic files that: a) Describe the SUMMiT V fabrication process b) Facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library). New features in this version: AutoCAD 2004 support has been added. SafeExplode - a new feature that explodes blocks without affecting polylines (avoids exploding polylines into objects that are ignored by the DRC and Visualization tools). Layer control menu - a pull-down menu for selecting layers to isolate, freeze, or thaw. Updated tools: A check has been added to catch invalid block names. DRC features: Added username/password validation, added a method to update the user's password. SNL_DRC_WIDTH - a value to control the width of the DRC error lines. SNL_BIAS_VALUE - a value used to offset selected geometry. SNL_PROCESS_NAME - a value to specify the process name. Documentation changes: The documentation has been updated to include the new features. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.

  19. The Role of Numerical Simulation in Advancing Plasma Propulsion

    NASA Astrophysics Data System (ADS)

    Turchi, P. J.; Mikellides, P. G.; Mikellides, I. G.

    1999-11-01

    Plasma thrusters often involve a complex set of interactions among several distinct physical processes. While each process can yield to separate mathematical representation, their combination generally requires numerical simulation. We have extended and used the MACH2 code successfully to simulate both self-field and applied-field magnetoplasmadynamic thrusters and, more recently, ablation-fed pulsed plasma microthrusters. MACH2 provides a framework in which to compute 2-1/2 dimensional, unsteady, MHD flows in two-temperature LTE. It couples to several options for electrical circuitry and allows access to both analytic formulas and tabular values for material properties and transport coefficients, including phenomenological models for anomalous transport. Even with all these capabilities, however, successful modeling demands comparison with experiment and with analytic solutions in idealized limits, and careful combination of MACH2 results with separate physical reasoning. Although well understood elsewhere in plasma physics, the strengths and limitations of numerical simulation for plasma propulsion need further discussion.

  20. Numerical Simulations and Optimisation in Forming of Advanced Materials

    NASA Astrophysics Data System (ADS)

    Huétink, J.

    2007-04-01

    With the introduction of new materials such as high strength steels, metastable steels and fiber-reinforced composites, the need for advanced physically valid constitutive models arises. Biaxial test equipment is developed and applied for the determination of material data as well as for validation of material models. An adaptive through-thickness integration scheme for plate elements is developed, which improves the accuracy of springback prediction at minimal costs. An optimization strategy is proposed that assists an engineer in modeling an optimization problem.

  1. Numerical tools for the characterization of microelectromechanical systems by digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Pagliarulo, Vito; Russo, Tiziana; Miccio, Lisa; Ferraro, Pietro

    2015-10-01

    Digital holography (DH) in microscopy became an important interferometric tool in optical metrology when camera sensors reached higher pixel counts with smaller pixel size and high-speed computers became able to process the acquired images. This allowed the investigation of engineered surfaces on the microscale, such as microelectromechanical systems (MEMS). In DH, numerical tools perform the reconstruction of the wave field. This offers the possibility of retrieving not only the intensity of the acquired wavefield, but also the phase distribution. This review describes the principles of DH and shows the most important numerical tools developed and applied to date in the field of MEMS. Both the static and the dynamic regimes can be analyzed by means of DH. Whereas the first is mostly related to characterization after the fabrication process, the second is a useful tool to characterize the actuation of the MEMS.
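
    The central numerical step in DH is propagating the recorded complex field to the object plane, from which amplitude and phase maps follow. The sketch below uses the angular spectrum method with illustrative sampling parameters; it is one standard reconstruction approach, not necessarily the one used in any particular instrument discussed in the review.

    import numpy as np

    def angular_spectrum_propagate(field, wavelength, dx, z):
        # Propagate a square, sampled complex field a distance z (all lengths in
        # metres) with the angular spectrum transfer function; evanescent
        # components are suppressed.
        n = field.shape[0]
        fx = np.fft.fftfreq(n, d=dx)
        FX, FY = np.meshgrid(fx, fx)
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        kz = 2.0 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
        H = np.exp(1j * kz * z) * (arg > 0)
        return np.fft.ifft2(np.fft.fft2(field) * H)

    # Toy hologram-plane field: a plane wave through a small circular aperture
    n, dx, lam = 256, 3.45e-6, 532e-9
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    field = ((x ** 2 + y ** 2) < 40 ** 2).astype(complex)
    rec = angular_spectrum_propagate(field, lam, dx, z=5e-3)
    amplitude, phase = np.abs(rec), np.angle(rec)   # the quantities used for MEMS metrology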

  2. Analysis of the thermo-mechanical deformations in a hot forging tool by numerical simulation

    NASA Astrophysics Data System (ADS)

    L-Cancelos, R.; Varas, F.; Martín, E.; Viéitez, I.

    2016-03-01

    Although programs have been developed for the design of tools for hot forging, tool design is still largely based on the experience of the tool maker. This makes it necessary to build test matrices and correct their errors in order to minimize distortions in the forged piece. This phase prior to mass production consumes time and material resources, which makes the final product more expensive. The forging tools usually consist of various parts made of different grades of steel, which in turn have different mechanical properties and therefore suffer different degrees of strain. Furthermore, the tools used in hot forging are exposed to a thermal field that also induces strain or stress, depending on the degree of confinement of the piece. Therefore, the mechanical behaviour of the assembly is determined by the contact between the different pieces. Numerical simulation makes it possible to analyse different configurations and anticipate possible defects before tool making, thus reducing the costs of this preliminary phase. In order to improve the dimensional quality of the manufactured parts, the work presented here focuses on the application of a numerical model to a hot forging manufacturing process in order to predict the areas of the forging die subjected to large deformations. The thermo-mechanical model, developed and implemented with free software (Code-Aster), includes the strains of thermal origin, strains during forge impact, and contact effects. The numerical results are validated with experimental measurements on a tooling set that produces forged crankshafts for the automotive industry. The numerical results show good agreement with the experimental tests. The result is a very useful tool for the design of tooling sets for hot forging.
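
    The strains of thermal origin mentioned above follow the usual linear thermoelastic relation. The one-dimensional sketch below shows how a fully constrained die region heated by a given temperature rise develops stress; the material constants are typical tool-steel figures chosen for illustration, not values from the paper.

    def thermal_stress_1d(e_pa, alpha_per_k, delta_t_k, total_strain=0.0):
        # 1D linear thermoelasticity: stress is driven by the difference between
        # the total (mechanical) strain and the free thermal expansion alpha*dT,
        # so a fully constrained region (total_strain = 0) goes into compression
        # when heated.
        return e_pa * (total_strain - alpha_per_k * delta_t_k)

    # Typical tool-steel values, for illustration only: E = 210 GPa,
    # alpha = 1.2e-5 1/K, local temperature rise of 300 K
    print(thermal_stress_1d(e_pa=210e9, alpha_per_k=1.2e-5, delta_t_k=300.0) / 1e6, "MPa")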

  3. Basic and Advanced Numerical Performances Relate to Mathematical Expertise but Are Fully Mediated by Visuospatial Skills

    PubMed Central

    2016-01-01

    Recent studies have highlighted the potential role of basic numerical processing in the acquisition of numerical and mathematical competences. However, it is debated whether high-level numerical skills and mathematics depend specifically on basic numerical representations. In this study mathematicians and nonmathematicians performed a basic number line task, which required mapping positive and negative numbers on a physical horizontal line, and has been shown to correlate with more advanced numerical abilities and mathematical achievement. We found that mathematicians were more accurate compared with nonmathematicians when mapping positive, but not negative numbers, which are considered numerical primitives and cultural artifacts, respectively. Moreover, performance on positive number mapping could predict whether one is a mathematician or not, and was mediated by more advanced mathematical skills. This finding might suggest a link between basic and advanced mathematical skills. However, when we included visuospatial skills, as measured by the block design subtest, the mediation analysis revealed that the relation between the performance in the number line task and the group membership was explained by non-numerical visuospatial skills. These results demonstrate that the relation between basic, even specific, numerical skills and advanced mathematical achievement can be artifactual and explained by visuospatial processing. PMID:26913930
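
    The mediation logic reported in this abstract can be illustrated with a toy regression example: the association between a basic-task score and an advanced-skill measure disappears once the mediator is controlled for. The synthetic data, the continuous outcome standing in for group membership, and the simple two-regression comparison below are illustrative assumptions, not the authors' analysis.

    import numpy as np

    def ols_coefs(X, y):
        # Ordinary least-squares coefficients, with an intercept prepended.
        X1 = np.column_stack([np.ones(len(y)), X])
        return np.linalg.lstsq(X1, y, rcond=None)[0]

    rng = np.random.default_rng(0)
    n = 300
    visuospatial = rng.normal(size=n)                                   # mediator
    number_line = 0.8 * visuospatial + rng.normal(scale=0.8, size=n)    # basic-task accuracy
    math_skill = 0.9 * visuospatial + rng.normal(scale=0.8, size=n)     # advanced-skill proxy

    # Total effect of the basic task on the advanced skill, and the direct effect
    # once the candidate mediator is controlled for
    total = ols_coefs(number_line.reshape(-1, 1), math_skill)[1]
    direct = ols_coefs(np.column_stack([number_line, visuospatial]), math_skill)[1]
    print(f"total effect {total:.2f} vs direct effect {direct:.2f} "
          "(full mediation appears as the direct effect shrinking toward zero)")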

  4. Numerical modeling of spray combustion with an advanced VOF method

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Shang, Huan-Min; Shih, Ming-Hsin; Liaw, Paul

    1995-01-01

    This paper summarizes the technical development and validation of a multiphase computational fluid dynamics (CFD) numerical method using the volume-of-fluid (VOF) model and a Lagrangian tracking model which can be employed to analyze general multiphase flow problems with free surface mechanism. The gas-liquid interface mass, momentum and energy conservation relationships are modeled by continuum surface mechanisms. A new solution method is developed such that the present VOF model can be applied for all-speed flow regimes. The objectives of the present study are to develop and verify the fractional volume-of-fluid cell partitioning approach into a predictor-corrector algorithm and to demonstrate the effectiveness of the present approach by simulating benchmark problems including laminar impinging jets, shear coaxial jet atomization and shear coaxial spray combustion flows.
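
    The paper's fractional cell-partitioning predictor-corrector belongs to a full multiphase CFD solver, but the basic bookkeeping a VOF method performs can be shown in one dimension: a volume fraction field is advected by conservative fluxes and kept within [0, 1]. The first-order upwind flux, uniform velocity, and simple inflow boundary below are deliberate simplifications for illustration only.

    import numpy as np

    def vof_advect_1d(alpha, u, dx, dt, steps):
        # Advect a liquid volume fraction field (values in [0, 1]) with a uniform,
        # positive velocity u using first-order upwind fluxes.  Real VOF schemes
        # reconstruct the interface geometrically to limit numerical smearing.
        a = alpha.astype(float).copy()
        cfl = u * dt / dx
        assert 0.0 < cfl <= 1.0, "time step violates the CFL condition"
        for _ in range(steps):
            flux = u * a                               # upwind flux leaving each cell
            a[1:] -= dt / dx * (flux[1:] - flux[:-1])  # inflow cell a[0] is held fixed
        return np.clip(a, 0.0, 1.0)

    # A liquid slug between x = 0.2 and x = 0.4 advected to the right
    x = np.linspace(0.0, 1.0, 101)
    alpha0 = ((x > 0.2) & (x < 0.4)).astype(float)
    alpha = vof_advect_1d(alpha0, u=1.0, dx=x[1] - x[0], dt=0.005, steps=60)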

  5. Functional toxicology: tools to advance the future of toxicity testing.

    PubMed

    Gaytán, Brandon D; Vulpe, Chris D

    2014-01-01

    The increased presence of chemical contaminants in the environment is an undeniable concern to human health and ecosystems. Historically, by relying heavily upon costly and laborious animal-based toxicity assays, the field of toxicology has often neglected examinations of the cellular and molecular mechanisms of toxicity for the majority of compounds-information that, if available, would strengthen risk assessment analyses. Functional toxicology, where cells or organisms with gene deletions or depleted proteins are used to assess genetic requirements for chemical tolerance, can advance the field of toxicity testing by contributing data regarding chemical mechanisms of toxicity. Functional toxicology can be accomplished using available genetic tools in yeasts, other fungi and bacteria, and eukaryotes of increased complexity, including zebrafish, fruit flies, rodents, and human cell lines. Underscored is the value of using less complex systems such as yeasts to direct further studies in more complex systems such as human cell lines. Functional techniques can yield (1) novel insights into chemical toxicity; (2) pathways and mechanisms deserving of further study; and (3) candidate human toxicant susceptibility or resistance genes. PMID:24847352

  6. Functional toxicology: tools to advance the future of toxicity testing

    PubMed Central

    Gaytán, Brandon D.; Vulpe, Chris D.

    2014-01-01

    The increased presence of chemical contaminants in the environment is an undeniable concern to human health and ecosystems. Historically, by relying heavily upon costly and laborious animal-based toxicity assays, the field of toxicology has often neglected examinations of the cellular and molecular mechanisms of toxicity for the majority of compounds—information that, if available, would strengthen risk assessment analyses. Functional toxicology, where cells or organisms with gene deletions or depleted proteins are used to assess genetic requirements for chemical tolerance, can advance the field of toxicity testing by contributing data regarding chemical mechanisms of toxicity. Functional toxicology can be accomplished using available genetic tools in yeasts, other fungi and bacteria, and eukaryotes of increased complexity, including zebrafish, fruit flies, rodents, and human cell lines. Underscored is the value of using less complex systems such as yeasts to direct further studies in more complex systems such as human cell lines. Functional techniques can yield (1) novel insights into chemical toxicity; (2) pathways and mechanisms deserving of further study; and (3) candidate human toxicant susceptibility or resistance genes. PMID:24847352

  7. Advanced numerical methods and software approaches for semiconductor device simulation

    SciTech Connect

    CAREY,GRAHAM F.; PARDHANANI,A.L.; BOVA,STEVEN W.

    2000-03-23

    In this article the authors concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of upwind and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), entropy variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. The authors include numerical examples from their recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and they emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, they briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the SANDIA National Laboratory framework SIERRA.
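
    Of the schemes listed, the Scharfetter-Gummel discretization has a compact closed form for the flux between adjacent grid nodes. The sketch below writes out that standard textbook expression (electron current density divided by q, in one common sign convention); the parameter values are illustrative and are not taken from the article.

        # Scharfetter-Gummel flux between nodes i and i+1 of a 1D drift-diffusion
        # grid (standard textbook form; parameter values are illustrative only).
        import numpy as np

        def bernoulli(x):
            # B(x) = x / (exp(x) - 1), with the removable singularity at x = 0 handled.
            x = np.asarray(x, dtype=float)
            small = np.abs(x) < 1e-12
            safe = np.where(small, 1.0, x)          # avoid 0/0 before selecting the branch
            return np.where(small, 1.0, safe / np.expm1(safe))

        def sg_flux(n_i, n_ip1, psi_i, psi_ip1, D, h, Vt=0.0259):
            # Electron current density / q across the edge; psi in volts, Vt = kT/q.
            dpsi = (psi_ip1 - psi_i) / Vt
            return (D / h) * (bernoulli(dpsi) * n_ip1 - bernoulli(-dpsi) * n_i)

        print(sg_flux(n_i=1e16, n_ip1=5e15, psi_i=0.0, psi_ip1=0.05, D=36.0, h=1e-5))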

  8. Advanced Numerical Methods and Software Approaches for Semiconductor Device Simulation

    DOE PAGES Beta

    Carey, Graham F.; Pardhanani, A. L.; Bova, S. W.

    2000-01-01

    In this article we concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of “upwind” and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), “entropy” variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. We have included numerical examples from our recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and we emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, we briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the SANDIA National Laboratory framework SIERRA.

  9. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a
    numerical study was conducted using both the finite-difference, time-domain method and a frequency- wavenumber method. When the propagation velocity in the borehole was greater than th...

  10. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    ERIC Educational Resources Information Center

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  11. STRING 3: An Advanced Groundwater Flow Visualization Tool

    NASA Astrophysics Data System (ADS)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] solely focused on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution, we discuss how to extend this approach to a full 3D tool and its challenges, in continuation of Michel et al. [2]. This elevates STRING from a post-production to an exploration tool for experts. In STRING, moving pathlets provide an intuition of velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D provides many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard, we were able to develop an efficient approach for combining the rendering through raytracing of the volume and regular OpenGL geometries. This is achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain. Hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this the silhouette based on the angle of
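
    The pathlets described here are, in essence, short trailing segments of pathlines advected in the Lagrangian view of the flow. The sketch below integrates two seed points through an analytic 2D velocity field with RK4 and keeps a short trailing history per particle; it is only a generic illustration, not STRING's FPM-based seeding or its rendering pipeline, and the field and parameters are invented.

        # Minimal sketch of the Lagrangian idea behind pathlets: advect seed points
        # through a steady, analytic 2D velocity field with RK4 and keep a short
        # trailing history per particle.
        import numpy as np

        def velocity(p):                      # simple swirling field as a stand-in
            x, y = p
            return np.array([-y, x]) * 0.5

        def rk4(p, dt):
            k1 = velocity(p); k2 = velocity(p + 0.5 * dt * k1)
            k3 = velocity(p + 0.5 * dt * k2); k4 = velocity(p + dt * k3)
            return p + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        dt, tail = 0.05, 20
        seeds = [np.array([1.0, 0.0]), np.array([0.5, 0.5])]
        pathlets = [[s.copy()] for s in seeds]
        for _ in range(200):
            for traj in pathlets:
                traj.append(rk4(traj[-1], dt))
                del traj[:-tail]              # keep only the trailing "pathlet" segment

        print("pathlet 0 head position:", np.round(pathlets[0][-1], 3))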

  12. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    SciTech Connect

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from
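
    One of the methodologies named above, first-of-a-kind to nth-of-a-kind learning, is commonly expressed as a power-law learning curve. The sketch below shows that generic form with assumed numbers; it is not AFCI cost data and not the VISION.ECON or G4-ECONS implementations.

        # Generic power-law learning curve from first-of-a-kind (FOAK) to
        # nth-of-a-kind (NOAK) unit cost; all numbers are assumptions.
        import math

        def unit_cost(n, foak_cost, learning_rate=0.90):
            # Each doubling of cumulative units multiplies unit cost by learning_rate.
            b = math.log(learning_rate, 2)
            return foak_cost * n ** b

        foak = 4.0e9   # assumed FOAK facility cost [$]
        for n in (1, 2, 4, 8):
            print(f"unit {n}: ${unit_cost(n, foak) / 1e9:.2f}B")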

  13. Numerical Propulsion System Simulation (NPSS): An Award Winning Propulsion System Simulation Tool

    NASA Technical Reports Server (NTRS)

    Stauber, Laurel J.; Naiman, Cynthia G.

    2002-01-01

    The Numerical Propulsion System Simulation (NPSS) is a full propulsion system simulation tool used by aerospace engineers to predict and analyze the aerothermodynamic behavior of commercial jet aircraft, military applications, and space transportation. The NPSS framework was developed to support aerospace, but other applications are already leveraging the initial capabilities, such as aviation safety, ground-based power, and alternative energy conversion devices such as fuel cells. By using the framework and developing the necessary components, future applications that NPSS could support include nuclear power, water treatment, biomedicine, chemical processing, and marine propulsion. NPSS will dramatically reduce the time, effort, and expense necessary to design and test jet engines. It accomplishes that by generating sophisticated computer simulations of an aerospace object or system, thus enabling engineers to "test" various design options without having to conduct costly, time-consuming real-life tests. The ultimate goal of NPSS is to create a numerical "test cell" that enables engineers to create complete engine simulations overnight on cost-effective computing platforms. Using NPSS, engine designers will be able to analyze different parts of the engine simultaneously, perform different types of analysis simultaneously (e.g., aerodynamic and structural), and perform analysis in a more efficient and less costly manner. NPSS will cut the development time of a new engine in half, from 10 years to 5 years. And NPSS will have a similar effect on the cost of development: new jet engines will cost about a billion dollars to develop rather than two billion. NPSS is also being applied to the development of space transportation technologies, and it is expected that similar efficiencies and cost savings will result. Advancements of NPSS in fiscal year 2001 included enhancing the NPSS Developer's Kit to easily integrate external components of varying fidelities, providing

  14. Comparing Simple and Advanced Video Tools as Supports for Complex Collaborative Design Processes

    ERIC Educational Resources Information Center

    Zahn, Carmen; Pea, Roy; Hesse, Friedrich W.; Rosen, Joe

    2010-01-01

    Working with digital video technologies, particularly advanced video tools with editing capabilities, offers new prospects for meaningful learning through design. However, it is also possible that the additional complexity of such tools does "not" advance learning. We compared in an experiment the design processes and learning outcomes of 24…

  15. Isogeometric analysis: a powerful numerical tool for the elastic analysis of historical masonry arches

    NASA Astrophysics Data System (ADS)

    Cazzani, Antonio; Malagù, Marcello; Turco, Emilio

    2016-03-01

    We illustrate a numerical tool for analyzing plane arches such as those frequently used in historical masonry heritage. It is based on a refined elastic mechanical model derived from the isogeometric approach. In particular, geometry and displacements are modeled by means of non-uniform rational B-splines. After a brief introduction, outlining the basic assumptions of this approach and the corresponding modeling choices, several numerical applications to arches, which are typical of masonry structures, show the performance of this novel technique. These are discussed in detail to emphasize the advantage and potential developments of isogeometric analysis in the field of structural analysis of historical masonry buildings with complex geometries.
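
    The geometry and displacement fields in this approach are built from B-spline/NURBS basis functions. As a small, self-contained illustration (not the authors' arch solver), the sketch below evaluates quadratic B-spline basis functions with the Cox-de Boor recursion on an open knot vector; the knot vector and evaluation point are arbitrary examples.

        # Cox-de Boor recursion for B-spline basis functions, the building block of
        # the NURBS description used in isogeometric analysis (generic sketch).
        def bspline_basis(i, p, u, knots):
            if p == 0:
                return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
            left = right = 0.0
            if knots[i + p] != knots[i]:
                left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
            if knots[i + p + 1] != knots[i + 1]:
                right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, u, knots)
            return left + right

        knots = [0, 0, 0, 0.5, 1, 1, 1]          # open knot vector, quadratic (p = 2)
        print([round(bspline_basis(i, 2, 0.25, knots), 3) for i in range(4)])  # sums to 1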

  16. New advanced radio diagnostics tools for Space Weather Program

    NASA Astrophysics Data System (ADS)

    Krankowski, A.; Rothkaehl, H.; Atamaniuk, B.; Morawski, M.; Zakharenkova, I.; Cherniak, I.; Otmianowska-Mazur, K.

    2013-12-01

    To give a more detailed and complete understanding of the physical plasma processes that govern solar-terrestrial space, and to develop qualitative and quantitative models of the magnetosphere-ionosphere-thermosphere coupling, it is necessary to design and build the next generation of instruments for space diagnostics and monitoring. Novel ground-based wide-area sensor networks, such as the LOFAR (Low Frequency Array) radar facility, comprising wideband, vector-sensing radio receivers and multi-spacecraft plasma diagnostics, should help solve outstanding problems of space physics and describe long-term environmental changes. The LOw Frequency ARray - LOFAR - is a new, fully digital radio telescope designed for frequencies between 30 MHz and 240 MHz, located in Europe. Three new LOFAR stations will be installed in Poland by summer 2015. The LOFAR facilities in Poland will be distributed among three sites: Lazy (east of Krakow), Borowiec near Poznan and Baldy near Olsztyn. They will all be connected to Poznan via dedicated PIONIER links. Each site will host one LOFAR station (96 high-band + 96 low-band antennas). Most of the time they will work as part of the European network; however, when less heavily loaded, they can operate as a national network. The new digital radio frequency analyzer (RFA) on board the low-orbiting RELEC satellite was designed to monitor and investigate the ionospheric plasma properties. This two-point, ground-based and topside ionosphere-located space plasma diagnostic can be a useful new tool for monitoring and diagnosing turbulent plasma properties. The RFA on board the RELEC satellite is the first in a series of experiments which are planned to be launched into the near-Earth environment. In order to improve and validate the description of large-scale and small-scale ionospheric structures, we will use GPS observations collected at the IGS/EPN network to reconstruct diurnal variations of TEC using all satellite passes over individual GPS stations and the

  17. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-01-01

    Laser vision: lidar as a transformative tool to advance critical zone science. Observation and quantification of the Earth surface is undergoing a revolutionary change due to the increased spatial resolution and extent afforded by light detection and ranging (lidar) technology. As a consequence, lidar-derived information has led to fundamental discoveries within the individual disciplines of geomorphology, hydrology, and ecology. These disciplines form the cornerstones of Critical Zone (CZ) science, where researchers study how interactions among the geosphere, hydrosphere, and ecosphere shape and maintain the "zone of life", extending from the groundwater to the vegetation canopy. Lidar holds promise as a transdisciplinary CZ research tool by simultaneously allowing for quantification of topographic, vegetative, and hydrological data. Researchers are just beginning to utilize lidar datasets to answer synergistic questions in CZ science, such as how landforms and soils develop in space and time as a function of the local climate, biota, hydrologic properties, and lithology. This review's objective is to demonstrate the transformative potential of lidar by critically assessing both challenges and opportunities for transdisciplinary lidar applications. A review of 147 peer-reviewed studies utilizing lidar showed that 38 % of the studies were focused in geomorphology, 18 % in hydrology, 32 % in ecology, and the remaining 12 % have an interdisciplinary focus. We find that using lidar to its full potential will require numerous advances across CZ applications, including new and more powerful open-source processing tools, exploiting new lidar acquisition technologies, and improved integration with physically-based models and complementary in situ and remote-sensing observations. We provide a five-year vision to utilize and advocate for the expanded use of lidar datasets to benefit CZ science applications.

  18. Advances in numerical solutions to integral equations in liquid state theory

    NASA Astrophysics Data System (ADS)

    Howard, Jesse J.

    Solvent effects play a vital role in the accurate description of the free energy profile for solution phase chemical and structural processes. The inclusion of solvent effects in any meaningful theoretical model, however, has proven to be a formidable task. Generally, methods involving Poisson-Boltzmann (PB) theory and molecular dynamics (MD) simulations are used, but they either fail to accurately describe the solvent effects or require an exhaustive computational effort to overcome sampling problems. An alternative to these methods is the integral equations (IEs) of liquid state theory, which have become more widely applicable due to recent advancements in the theory of interaction site fluids and the numerical methods to solve the equations. In this work, a new numerical method is developed based on a Newton-type scheme coupled with Picard/MDIIS routines. To extend the range of these numerical methods to large-scale data systems, the size of the Jacobian is reduced using basis functions, and the Newton steps are calculated using a GMRes solver. The method is then applied to calculate solutions to the 3D reference interaction site model (RISM) IEs of statistical mechanics, which are derived from first principles, for a solute model of a pair of parallel graphene plates at various separations in pure water. The 3D IEs are then extended to electrostatic models using an exact treatment of the long-range Coulomb interactions for negatively charged walls and DNA duplexes in aqueous electrolyte solutions to calculate the density profiles and solution thermodynamics. It is found that the 3D-IEs provide a qualitative description of the density distributions of the solvent species when compared to MD results, but at a much reduced computational effort in comparison to MD simulations. The thermodynamics of the solvated systems are also qualitatively reproduced by the IE results. The findings of this work show the IEs to be a valuable tool for the study and prediction of
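
    The solver strategy described (Newton steps with a GMRes/Krylov linear solve, accelerated by Picard/MDIIS) can be illustrated in miniature with SciPy's Jacobian-free Newton-Krylov routine applied to a toy fixed-point problem. The residual below is a made-up smooth relation standing in for the RISM equations; it is not the author's 3D-RISM implementation.

        # Toy illustration of a Newton-Krylov (GMRES) solve of a nonlinear residual,
        # the class of solver described, applied to a made-up fixed-point problem.
        import numpy as np
        from scipy.optimize import newton_krylov

        x_grid = np.linspace(0.0, 10.0, 200)

        def residual(g):
            # Smooth, contraction-like relation standing in for a closure equation.
            smoothed = np.convolve(g, np.ones(5) / 5.0, mode="same")
            return g - np.exp(-x_grid) * (1.0 + 0.3 * np.tanh(smoothed))

        g0 = np.zeros_like(x_grid)
        g_sol = newton_krylov(residual, g0, method="gmres", f_tol=1e-10)
        print("max residual after convergence:", np.abs(residual(g_sol)).max())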

  19. Numerical Stability and Accuracy of Temporally Coupled Multi-Physics Modules in Wind-Turbine CAE Tools

    SciTech Connect

    Gasmi, A.; Sprague, M. A.; Jonkman, J. M.; Jones, W. B.

    2013-02-01

    In this paper we examine the stability and accuracy of numerical algorithms for coupling time-dependent multi-physics modules relevant to computer-aided engineering (CAE) of wind turbines. This work is motivated by an in-progress major revision of FAST, the National Renewable Energy Laboratory's (NREL's) premier aero-elastic CAE simulation tool. We employ two simple examples as test systems, while algorithm descriptions are kept general. Coupled-system governing equations are framed in monolithic and partitioned representations as differential-algebraic equations. Explicit and implicit loose partition coupling is examined. In explicit coupling, partitions are advanced in time from known information. In implicit coupling, there is dependence on other-partition data at the next time step; coupling is accomplished through a predictor-corrector (PC) approach. Numerical time integration of coupled ordinary-differential equations (ODEs) is accomplished with one of three, fourth-order fixed-time-increment methods: Runge-Kutta (RK), Adams-Bashforth (AB), and Adams-Bashforth-Moulton (ABM). Through numerical experiments it is shown that explicit coupling can be dramatically less stable and less accurate than simulations performed with the monolithic system. However, PC implicit coupling restored stability and fourth-order accuracy for ABM; only second-order accuracy was achieved with RK integration. For systems without constraints, explicit time integration with AB and explicit loose coupling exhibited desired accuracy and stability.
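
    A toy version of the monolithic-versus-partitioned comparison can be written in a few lines: two coupled oscillators advanced either as one state vector or as two partitions that exchange data only at step boundaries (explicit loose coupling). RK4 stepping is used as in the paper, but the system, parameters, and coupling are invented for illustration and are not the FAST test problems.

        # Toy two-mass system integrated (a) monolithically and (b) with explicit
        # loose coupling, where each partition holds the other's state frozen at the
        # value from the beginning of the step. Generic sketch, parameters assumed.
        import numpy as np

        k_c, dt, nsteps = 5.0, 0.01, 1000

        def rk4_step(f, y, dt):
            k1 = f(y); k2 = f(y + 0.5 * dt * k1)
            k3 = f(y + 0.5 * dt * k2); k4 = f(y + dt * k3)
            return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        # (a) Monolithic: the full state [x1, v1, x2, v2] is advanced together.
        def full_rhs(q):
            x1, v1, x2, v2 = q
            return np.array([v1, -x1 + k_c * (x2 - x1), v2, -x2 + k_c * (x1 - x2)])

        q = np.array([1.0, 0.0, 0.0, 0.0])
        for _ in range(nsteps):
            q = rk4_step(full_rhs, q, dt)

        # (b) Explicit loose coupling: each partition sees only lagged data.
        p1, p2 = np.array([1.0, 0.0]), np.array([0.0, 0.0])
        for _ in range(nsteps):
            x1_lag, x2_lag = p1[0], p2[0]
            p1 = rk4_step(lambda y: np.array([y[1], -y[0] + k_c * (x2_lag - y[0])]), p1, dt)
            p2 = rk4_step(lambda y: np.array([y[1], -y[0] + k_c * (x1_lag - y[0])]), p2, dt)

        print("x1 monolithic    :", q[0])
        print("x1 loosely coupled:", p1[0])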

  20. A Straightforward Method for Advance Estimation of User Charges for Information in Numeric Databases.

    ERIC Educational Resources Information Center

    Jarvelin, Kalervo

    1986-01-01

    Describes a method for advance estimation of user charges for queries in relational data model-based numeric databases when charges are based on data retrieved. Use of this approach is demonstrated by sample queries to an imaginary marketing database. The principles and methods of this approach and its relevance are discussed. (MBR)

  1. Advanced PANIC quick-look tool using Python

    NASA Astrophysics Data System (ADS)

    Ibáñez, José-Miguel; García Segura, Antonio J.; Storz, Clemens; Fried, Josef W.; Fernández, Matilde; Rodríguez Gómez, Julio F.; Terrón, V.; Cárdenas, M. C.

    2012-09-01

    PANIC, the Panoramic Near Infrared Camera, is an instrument for the Calar Alto Observatory currently being integrated in the laboratory, with first light foreseen for the end of 2012 or early 2013. We present here how the PANIC Quick-Look tool (PQL) and pipeline (PAPI) are being implemented, using existing rapid-development Python technologies and packages, together with well-known astronomical software suites (Astromatic, IRAF) and parallel processing techniques. We will briefly describe the structure of the PQL tool, whose main characteristics are the use of the SQLite database and PyQt, a Python binding of the GUI toolkit Qt.
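
    The abstract notes that the PQL tool is built around an SQLite database (with a PyQt front end). A minimal sketch of that kind of bookkeeping with Python's standard sqlite3 module is shown below; the file name, table layout and column names are hypothetical and are not the actual PQL schema.

        # Minimal sketch of SQLite bookkeeping for incoming frames in a quick-look
        # tool; the schema below is hypothetical, not the actual PQL database.
        import sqlite3

        conn = sqlite3.connect("pql_frames.db")
        conn.execute("""CREATE TABLE IF NOT EXISTS frames (
                            path TEXT PRIMARY KEY,
                            object TEXT,
                            band TEXT,
                            exptime REAL,
                            date_obs TEXT)""")
        conn.execute("INSERT OR REPLACE INTO frames VALUES (?, ?, ?, ?, ?)",
                     ("/data/panic_0001.fits", "M31", "Ks", 10.0, "2012-09-01T22:14:03"))
        conn.commit()

        for row in conn.execute("SELECT path, band, exptime FROM frames"):
            print(row)
        conn.close()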

  2. Laser Hardening Prediction Tool Based On a Solid State Transformations Numerical Model

    SciTech Connect

    Martinez, S.; Ukar, E.; Lamikiz, A.

    2011-01-17

    This paper presents a tool to predict the hardened layer in selective laser hardening processes, where the laser beam heats the part locally while the bulk acts as a heat sink. The core of the tool is a numerical model that accurately predicts the temperature field in the workpiece by combining a three-dimensional transient solution for heating with the possibility of introducing different laser sources. The solid-state transformations were modeled using a kinetic model based on the Johnson-Mehl-Avrami equation. Starting from this equation, an experimental adjustment of the transformation parameters was carried out to obtain the continuous heating transformation (CHT) diagrams. With the temperature field and the CHT diagrams, the model predicts the percentage of base material converted into austenite. These two quantities are used as a first step to estimate the depth of the hardened layer in the part. The model has been adjusted and validated with experimental data for DIN 1.2379, a cold-work tool steel typically used in the mold and die making industry. This steel presents solid-state diffusive transformations at relatively low temperature, which must be considered in order to obtain good accuracy of the temperature field prediction during the heating phase. For model validation, the surface temperature measured by pyrometry, the thermal field, and the hardened layer obtained from a metallographic study were compared with the model data, showing good agreement.
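
    The Johnson-Mehl-Avrami kinetics mentioned above have a compact isothermal form for the transformed fraction. The sketch below evaluates it with an Arrhenius rate constant; the exponent, prefactor and activation energy are placeholders, not the parameters fitted for DIN 1.2379 in the paper.

        # Isothermal Johnson-Mehl-Avrami(-Kolmogorov) transformed fraction with an
        # Arrhenius rate constant; all coefficients below are placeholders.
        import numpy as np

        def jmak_fraction(t, T, n=2.5, k0=1.6e9, Q=2.0e5, R=8.314):
            # X(t) = 1 - exp(-(k t)^n), with k = k0 * exp(-Q / (R T)), T in kelvin.
            k = k0 * np.exp(-Q / (R * T))
            return 1.0 - np.exp(-(k * t) ** n)

        t = np.linspace(0.0, 5.0, 6)                    # seconds
        print(np.round(jmak_fraction(t, T=1100.0), 3))  # illustrative transformed fractions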

  3. XML based tools for assessing potential impact of advanced technology space validation

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Weisbin, Charles

    2004-01-01

    A hierarchical XML database and related analysis tools are being developed by the New Millennium Program to provide guidance on the relative impact, to future NASA missions, of advanced technologies under consideration for developmental funding.
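
    Since the abstract describes a hierarchical XML database with analysis tools, a tiny sketch of reading such a record with Python's standard library is shown below; the element and attribute names are hypothetical and do not reflect the New Millennium Program schema.

        # Sketch of querying a small, made-up technology-impact XML record; the
        # structure and names are hypothetical, not the actual NMP database.
        import xml.etree.ElementTree as ET

        doc = """<technologies>
          <technology name="autonomous-nav" trl="5">
            <mission name="outer-planet-orbiter" impact="0.8"/>
            <mission name="small-body-lander" impact="0.6"/>
          </technology>
        </technologies>"""

        root = ET.fromstring(doc)
        for tech in root.findall("technology"):
            impacts = [float(m.get("impact")) for m in tech.findall("mission")]
            print(tech.get("name"), "TRL", tech.get("trl"),
                  "mean mission impact", sum(impacts) / len(impacts))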

  4. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  5. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.

  6. Advanced Numerical Imaging Procedure Accounting for Non-Ideal Effects in GPR Scenarios

    NASA Astrophysics Data System (ADS)

    Comite, Davide; Galli, Alessandro; Catapano, Ilaria; Soldovieri, Francesco

    2015-04-01

    The capability to provide fast and reliable imaging of targets and interfaces in non-accessible probed scenarios is a topic of great scientific interest, and many investigations have shown that Ground Penetrating Radar (GPR) can provide an efficient technique to conduct this kind of analysis in various geophysical and civil engineering applications. In these cases, the development of an efficient and accurate imaging procedure is strongly dependent on the capability of accounting for the incident field that activates the scattering phenomenon. In this frame, based on a suitable implementation of an electromagnetic (EM) CAD tool (CST Microwave Studio), it has been possible to accurately and efficiently model the radiation pattern of real antennas in environments typically considered in GPR surveys [1]. A typical scenario of interest consists of targets hidden in a ground medium, described by certain EM parameters and probed by a movable GPR using interfacial antennas [2]. The transmitting and receiving antennas considered here are Vivaldi ones, but a wide variety of other antennas can be modeled and designed, similar to those available in commercial GPR systems. Hence, an advanced version of a well-known microwave tomography approach (MTA) [3] has been implemented, both in the canonical 2D scalar case and in the more realistic 3D vectorial one. Such an approach is able to account for the real distribution of the radiated and scattered EM fields. Comparisons of results obtained by means of a 'conventional' implementation of the MTA, where the antennas are modeled as ideal line sources, and by means of our 'advanced' approach, which instead takes into account the radiation features of the chosen antenna type, have been carried out and discussed. Since the antenna radiation patterns are modified by the probed environment, whose EM features and the possible stratified structure usually are not exactly known, the imaging capabilities of the MTA

  7. A numerical tool for reproducing driver behaviour: experiments and predictive simulations.

    PubMed

    Casucci, M; Marchitto, M; Cacciabue, P C

    2010-03-01

    This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a Virtual Reality full scale simulator for validating the simulation. Then the predictive potentiality of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions. PMID:19249745

  8. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  9. Left Ventricular Flow Analysis: Recent Advances in Numerical Methods and Applications in Cardiac Ultrasound

    PubMed Central

    Borazjani, Iman; Westerdale, John; McMahon, Eileen M.; Rajaraman, Prathish K.; Heys, Jeffrey J.

    2013-01-01

    The left ventricle (LV) pumps oxygenated blood from the lungs to the rest of the body through systemic circulation. The efficiency of such a pumping function is dependent on blood flow within the LV chamber. It is therefore crucial to accurately characterize LV hemodynamics. Improved understanding of LV hemodynamics is expected to provide important clinical diagnostic and prognostic information. We review the recent advances in numerical and experimental methods for characterizing LV flows and focus on analysis of intraventricular flow fields by echocardiographic particle image velocimetry (echo-PIV), due to its potential for broad and practical utility. Future research directions to advance patient-specific LV simulations include development of methods capable of resolving heart valves, higher temporal resolution, automated generation of three-dimensional (3D) geometry, and incorporating actual flow measurements into the numerical solution of the 3D cardiovascular fluid dynamics. PMID:23690874

  10. An advanced image analysis tool for the quantification and characterization of breast cancer in microscopy images.

    PubMed

    Goudas, Theodosios; Maglogiannis, Ilias

    2015-03-01

    The paper presents an advanced image analysis tool for the accurate and fast characterization and quantification of cancer and apoptotic cells in microscopy images. The proposed tool utilizes adaptive thresholding and a Support Vector Machines classifier. The segmentation results are enhanced through a Majority Voting and a Watershed technique, while an object labeling algorithm has been developed for the fast and accurate validation of the recognized cells. Expert pathologists evaluated the tool and the reported results are satisfying and reproducible. PMID:25681102
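
    The processing stages named here (adaptive thresholding followed by a Support Vector Machines classifier) can be sketched with standard libraries. The example below runs on synthetic data with made-up per-object features; it is not the published tool, its training data, or its Majority Voting/Watershed refinement.

        # Toy sketch of adaptive (local) thresholding plus an SVM classifier on
        # synthetic data; features, labels and thresholds are invented.
        import numpy as np
        from skimage.filters import threshold_local
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        image = rng.normal(0.2, 0.05, (128, 128))
        image[40:60, 40:60] += 0.5                            # one bright synthetic "cell"

        mask = image > threshold_local(image, block_size=35)  # adaptive threshold map

        # Hypothetical per-object features (area, mean intensity) and labels.
        features = np.array([[400, 0.7], [30, 0.25], [350, 0.65], [25, 0.3]])
        labels = np.array([1, 0, 1, 0])                       # 1 = cell of interest, 0 = debris
        clf = SVC(kernel="rbf").fit(features, labels)
        print("segmented pixels:", int(mask.sum()),
              "prediction for a new object:", clf.predict([[380, 0.68]]))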

  11. Laser vision: lidar as a transformative tool to advance critical zone science

    NASA Astrophysics Data System (ADS)

    Harpold, A. A.; Marshall, J. A.; Lyon, S. W.; Barnhart, T. B.; Fisher, B. A.; Donovan, M.; Brubaker, K. M.; Crosby, C. J.; Glenn, N. F.; Glennie, C. L.; Kirchner, P. B.; Lam, N.; Mankoff, K. D.; McCreight, J. L.; Molotch, N. P.; Musselman, K. N.; Pelletier, J.; Russo, T.; Sangireddy, H.; Sjöberg, Y.; Swetnam, T.; West, N.

    2015-06-01

    We propose that using lidar to its full potential will require numerous advances, including new and more powerful open-source processing tools, exploiting new lidar acquisition technologies, and improved integration with physically based models and complementary in situ and remote-sensing observations. We provide a 5-year vision that advocates for the expanded use of lidar data sets and highlights subsequent potential to advance the state of CZ science.

  12. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    substances, helping in the management of the crisis, in the distribution of response resources, or prioritizing specific areas. They can also be used for detection of pollution sources. However, the resources involved, and the scientific and technological levels needed in the manipulation of numerical models, have both limited the interoperability between operational models, monitoring tools and decision-support software tools. The increasing predictive capacity for metocean conditions and for the fate and behaviour of pollutants spilt at sea or in coastal zones, and the presence of monitoring tools like vessel traffic control systems, can both provide safer support for decision-making in emergency or planning issues associated with pollution risk management, especially if used in an integrated way. Following this approach, and taking advantage of an integrated framework developed in the ARCOPOL (www.arcopol.eu) and EASYCO (www.project-easy.info) projects, three innovative model-supported software tools were developed and applied in the Atlantic Area and/or on the Portuguese coast. Two of these tools are used for spill model simulations - a web-based interface (EASYCO web bidirectional tool) and an advanced desktop application (MOHID Desktop Spill Simulator) - both of them allowing the end user to have control over the model simulations. Parameters such as date and time of the event, location and oil spill volume are provided by the users; these interactive tools also integrate the best available metocean forecasts (waves, meteorological, hydrodynamics) from different institutions in the Atlantic Area. Metocean data are continuously gathered from remote THREDDS data servers (using OPENDAP) or ftp sites, and then automatically interpolated and pre-processed to be available for the simulators. These simulation tools can also import initial data and export results from/to remote servers, using OGC WFS services. Simulations are provided to the end user in a matter of seconds, and thus can be very
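
    The abstract states that metocean forecasts are gathered from remote THREDDS servers via OPeNDAP. A minimal sketch of that kind of remote read with the netCDF4 package is shown below; the URL and variable names are placeholders, not the actual ARCOPOL/EASYCO endpoints.

        # Minimal sketch of pulling a forecast field over OPeNDAP with netCDF4; the
        # URL and variable names are placeholders, not real project endpoints.
        from netCDF4 import Dataset

        url = "https://example.org/thredds/dodsC/forecast/currents.nc"   # placeholder URL
        with Dataset(url) as ds:                          # opens the remote dataset lazily
            u = ds.variables["eastward_current"][0, :, :]    # first forecast time step
            v = ds.variables["northward_current"][0, :, :]
            print("grid shape:", u.shape,
                  "max speed:", float(((u ** 2 + v ** 2) ** 0.5).max()))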

  14. Advanced Epi Tools for Gallium Nitride Light Emitting Diode Devices

    SciTech Connect

    Patibandla, Nag; Agrawal, Vivek

    2012-12-01

    Over the course of this program, Applied Materials, Inc., with generous support from the United States Department of Energy, developed a world-class three-chamber III-Nitride epi cluster tool for low-cost, high volume GaN growth for the solid state lighting industry. One of the major achievements of the program was to design, build, and demonstrate the world’s largest wafer capacity HVPE chamber suitable for repeatable high volume III-Nitride template and device manufacturing. Applied Materials’ experience in developing deposition chambers for the silicon chip industry over many decades resulted in many orders of magnitude reductions in the price of transistors. That experience and understanding was used in developing this GaN epi deposition tool. The multi-chamber approach, which continues to be unique in the ability of each chamber to deposit a section of the full device structure, unlike other cluster tools, allows for extreme flexibility in the manufacturing process. This robust architecture is suitable for not just the LED industry, but GaN power devices as well, both horizontal and vertical designs. The new HVPE technology developed allows GaN to be grown at a rate unheard of with MOCVD, up to 20x the typical MOCVD rates of 3 µm per hour, with bulk crystal quality better than the highest-quality commercial GaN films grown by MOCVD at a much cheaper overall cost. This is a unique development as the HVPE process has been known for decades, but never successfully commercially developed for high volume manufacturing. This research shows the potential of the first commercial-grade HVPE chamber, an elusive goal for III-V researchers and those wanting to capitalize on the promise of HVPE. Additionally, in the course of this program, Applied Materials built two MOCVD chambers, in addition to the HVPE chamber, and a robot that moves wafers between them. The MOCVD chambers demonstrated industry-leading wavelength yield for GaN based LED wafers and industry

  15. CUAHSI's Hydrologic Measurement Facility: Putting Advanced Tools in Scientists' Hands

    NASA Astrophysics Data System (ADS)

    Hooper, R. P.; Robinson, D.; Selker, J.; Duncan, J.

    2006-05-01

    Like related environmental sciences, the hydrologic sciences community has been defining environmental observatories and the support components necessary for their successful implementation, such as informatics (cyberinfrastructure) and instrumentation. Unlike programs, such as NEON and OOI, that have been pursuing large-scale capital funding through the Major Research Equipment program of the National Science Foundation, CUAHSI has been pursuing incremental development of observatories that has allowed us to pilot different parts of these support functions, namely Hydrologic Information Systems and a Hydrologic Measurement Facility (HMF), the subject of this paper. The approach has allowed us to gain greater specificity of the requirements for these facilities and their operational challenges. The HMF is developing the foundation to support innovative research across the breadth of the Hydrologic Community, including classic PI-driven projects as well as over 20 grass-roots observatories that have been developing over the past 2 years. HMF is organized around three basic areas: water cycle instrumentation, biogeochemistry and geophysics. Committees have been meeting to determine the most effective manner to deliver instrumentation, whether by special instrumentation packages proposed by host institutions, collaborative agreements with federal agencies, or contributions from industrial partners. These efforts are guided by the results of a community-wide survey conducted in Nov-Dec 2005, and a series of ongoing workshops. The survey helped identify the types of equipment that will advance hydrological sciences and are often beyond the capabilities of individual PIs. Respondents to the survey indicated they were keen for HMF to focus on providing supported equipment such as atmospheric profilers like LIDAR, geophysical instrumentation ranging from airborne sensors to ground-penetrating radar, and field-deployed mass spectrophotometers. A recently signed agreement

  16. From bacterial genomics to metagenomics: concept, tools and recent advances.

    PubMed

    Sharma, Pooja; Kumari, Hansi; Kumar, Mukesh; Verma, Mansi; Kumari, Kirti; Malhotra, Shweta; Khurana, Jitendra; Lal, Rup

    2008-06-01

    In the last 20 years, the applications of genomics tools have completely transformed the field of microbial research. This has primarily happened due to the revolution in sequencing technologies that have become available today. This review, therefore, first describes the discoveries, upgrading and automation of sequencing techniques in chronological order, followed by a brief discussion on microbial genomics. Some of the recently sequenced bacterial genomes are described to explain how complete genome data is now being used to derive interesting findings. Apart from the genomics of individual microbes, the study of unculturable microbiota from different environments is increasingly gaining importance. The second section is thus dedicated to the concept of metagenomics, describing environmental DNA isolation, metagenomic library construction and screening methods to look for novel and potentially important genes, enzymes and biomolecules. It also deals with the pioneering studies in the area of metagenomics that are offering new insights into the previously unappreciated microbial world. PMID:23100712

  17. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    SciTech Connect

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings are collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  18. Advanced Flow Control as a Management Tool in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Wugalter, S.

    1974-01-01

    Advanced Flow Control is closely related to Air Traffic Control. Air Traffic Control is the business of the Federal Aviation Administration. To formulate an understanding of advanced flow control and its use as a management tool in the National Airspace System, it becomes necessary to speak somewhat of air traffic control, the role of the FAA, and their relationship to advanced flow control. Also, this should dispel forever any notion that advanced flow control is the inspirational master valve scheme to be used on the Alaskan Oil Pipeline.

  19. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems.

    PubMed

    Mitin, D; Grobis, M; Albrecht, M

    2016-02-01

    An advanced scanning magnetoresistive microscopy (SMRM) - a robust magnetic imaging and probing technique - will be presented, which utilizes state-of-the-art recording heads of a hard disk drive as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is nowadays comparable to the more commonly used magnetic force microscopes. Important advantages of SMRM are the ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, negligible sensor stray fields, and the ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated by presenting various examples, including a temperature dependent recording study on hard magnetic L1(0) FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses. PMID:26931856

  20. Scanning magnetoresistive microscopy: An advanced characterization tool for magnetic nanosystems

    NASA Astrophysics Data System (ADS)

    Mitin, D.; Grobis, M.; Albrecht, M.

    2016-02-01

    An advanced scanning magnetoresistive microscopy (SMRM) — a robust magnetic imaging and probing technique — will be presented, which utilizes state-of-the-art recording heads of a hard disk drive as sensors. The spatial resolution of modern tunneling magnetoresistive sensors is nowadays comparable to the more commonly used magnetic force microscopes. Important advantages of SMRM are the ability to detect pure magnetic signals directly proportional to the out-of-plane magnetic stray field, negligible sensor stray fields, and the ability to apply local bipolar magnetic field pulses up to 10 kOe with bandwidths from DC up to 1 GHz. Moreover, the SMRM can be further equipped with a heating stage and external magnetic field units. The performance of this method and corresponding best practices are demonstrated by presenting various examples, including a temperature dependent recording study on hard magnetic L10 FeCuPt thin films, imaging of magnetic vortex states in an in-plane magnetic field, and their controlled manipulation by applying local field pulses.

  1. Teaching Advanced Data Analysis Tools to High School Astronomy Students

    NASA Astrophysics Data System (ADS)

    Black, David V.; Herring, Julie; Hintz, Eric G.

    2015-01-01

    A major barrier to becoming an astronomer is learning how to analyze astronomical data, such as using photometry to compare the brightness of stars. Most fledgling astronomers learn observation, data reduction, and analysis skills through an upper division college class. If the same skills could be taught in an introductory high school astronomy class, then more students would have an opportunity to do authentic science earlier, with implications for how many choose to become astronomers. Several software tools have been developed that can analyze astronomical data ranging from fairly straightforward (AstroImageJ and DS9) to very complex (IRAF and DAOphot). During the summer of 2014, a study was undertaken at Brigham Young University through a Research Experience for Teachers (RET) program to evaluate the effectiveness and ease-of-use of these four software packages. Standard tasks tested included creating a false-color IR image using WISE data in DS9, Adobe Photoshop, and The Gimp; a multi-aperture analysis of variable stars over time using AstroImageJ; creating Spectral Energy Distributions (SEDs) of stars using photometry at multiple wavelengths in AstroImageJ and DS9; and color-magnitude and hydrogen alpha index diagrams for open star clusters using IRAF and DAOphot. Tutorials were then written and combined with screen captures to teach high school astronomy students at Walden School of Liberal Arts in Provo, UT how to perform these same tasks. They analyzed image data using the four software packages, imported it into Microsoft Excel, and created charts using images from BYU's 36-inch telescope at their West Mountain Observatory. The students' attempts to complete these tasks were observed, mentoring was provided, and the students then reported on their experience through a self-reflection essay and concept test. Results indicate that high school astronomy students can successfully complete professional-level astronomy data analyses when given detailed
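
    As a flavour of the photometry tasks described (without reproducing the AstroImageJ, DS9, or IRAF/DAOphot workflows), the sketch below performs simple aperture photometry on a synthetic star image with plain NumPy; all image, aperture and zero-point values are invented.

        # Simple aperture photometry on a synthetic star image; a generic teaching
        # sketch, not the software packages evaluated in the study.
        import numpy as np

        rng = np.random.default_rng(42)
        img = rng.normal(100.0, 3.0, (64, 64))                 # sky background + noise
        yy, xx = np.mgrid[:64, :64]
        img += 500.0 * np.exp(-((xx - 32) ** 2 + (yy - 30) ** 2) / (2 * 2.0 ** 2))  # synthetic star

        r = np.hypot(xx - 32, yy - 30)
        aperture = r < 6                       # star aperture
        annulus = (r > 10) & (r < 15)          # sky annulus

        sky_per_pixel = np.median(img[annulus])
        flux = img[aperture].sum() - sky_per_pixel * aperture.sum()
        mag = -2.5 * np.log10(flux) + 25.0     # instrumental magnitude, arbitrary zero point
        print(f"net flux = {flux:.1f} counts, instrumental mag = {mag:.2f}")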

  2. Benchmark of numerical tools simulating beam propagation and secondary particles in ITER NBI

    NASA Astrophysics Data System (ADS)

    Sartori, E.; Veltri, P.; Dlougach, E.; Hemsworth, R.; Serianni, G.; Singh, M.

    2015-04-01

    Injection of high energy beams of neutral particles is a method for plasma heating in fusion devices. The ITER injector, and its prototype MITICA (Megavolt ITER Injector and Concept Advancement), are large extrapolations from existing devices: therefore numerical modeling is needed to set thermo-mechanical requirements for all beam-facing components. As the power and charge deposition originates from several sources (primary beam, co-accelerated electrons, and secondary production by beam-gas, beam-surface, and electron-surface interaction), the beam propagation along the beam line is simulated by comprehensive 3D models. This paper presents a comparative study between two codes: BTR has been used for several years in the design of the ITER HNB/DNB components; SAMANTHA code was independently developed and includes additional phenomena, such as secondary particles generated by collision of beam particles with the background gas. The code comparison is valuable in the perspective of the upcoming experimental operations, in order to prepare a reliable numerical support to the interpretation of experimental measurements in the beam test facilities. The power density map calculated on the Electrostatic Residual Ion Dump (ERID) is the chosen benchmark, as it depends on the electric and magnetic fields as well as on the evolution of the beam species via interaction with the gas. Finally the paper shows additional results provided by SAMANTHA, like the secondary electrons produced by volume processes accelerated by the ERID fringe-field towards the Cryopumps.

  3. Benchmark of numerical tools simulating beam propagation and secondary particles in ITER NBI

    SciTech Connect

    Sartori, E.; Veltri, P.; Serianni, G.; Dlougach, E.; Hemsworth, R.; Singh, M.

    2015-04-08

    Injection of high energy beams of neutral particles is a method for plasma heating in fusion devices. The ITER injector, and its prototype MITICA (Megavolt ITER Injector and Concept Advancement), are large extrapolations from existing devices: therefore numerical modeling is needed to set thermo-mechanical requirements for all beam-facing components. As the power and charge deposition originates from several sources (primary beam, co-accelerated electrons, and secondary production by beam-gas, beam-surface, and electron-surface interaction), the beam propagation along the beam line is simulated by comprehensive 3D models. This paper presents a comparative study between two codes: BTR has been used for several years in the design of the ITER HNB/DNB components; SAMANTHA code was independently developed and includes additional phenomena, such as secondary particles generated by collision of beam particles with the background gas. The code comparison is valuable in the perspective of the upcoming experimental operations, in order to prepare a reliable numerical support to the interpretation of experimental measurements in the beam test facilities. The power density map calculated on the Electrostatic Residual Ion Dump (ERID) is the chosen benchmark, as it depends on the electric and magnetic fields as well as on the evolution of the beam species via interaction with the gas. Finally the paper shows additional results provided by SAMANTHA, like the secondary electrons produced by volume processes accelerated by the ERID fringe-field towards the Cryopumps.

  4. Exploring the nonequilibrium dynamics of ultracold quantum gases by using numerical tools

    NASA Astrophysics Data System (ADS)

    Heidrich-Meisner, Fabian

    Numerical tools such as exact diagonalization or the density matrix renormalization group method have been vital for the study of the nonequilibrium dynamics of strongly correlated many-body systems. Moreover, they provided unique insight into the interpretation of quantum gas experiments, whenever a direct comparison with theory is possible. By considering the example of the experiment by Ronzheimer et al., in which both an interaction quench and the release of bosons from a trap into an empty optical lattice (sudden expansion) were realized, I discuss several nonequilibrium effects of strongly interacting quantum gases. These include the thermalization of a closed quantum system and its connection to the eigenstate thermalization hypothesis, nonequilibrium mass transport, dynamical fermionization, and transient phenomena such as quantum distillation or dynamical quasicondensation. I highlight the role of integrability in giving rise to ballistic transport in strongly interacting 1D systems and in determining the asymptotic state after a quantum quench. The talk concludes with a perspective on open questions concerning 2D systems and the numerical simulation of their nonequilibrium dynamics. Supported by Deutsche Forschungsgemeinschaft (DFG) via FOR 801.

  5. Advances in Coupling of Kinetics and Molecular Scale Tools to Shed Light on Soil Biogeochemical Processes

    SciTech Connect

    Sparks, Donald

    2014-09-02

    Biogeochemical processes in soils such as sorption, precipitation, and redox play critical roles in the cycling and fate of nutrients, metal(loid)s and organic chemicals in soil and water environments. Advanced analytical tools enable soil scientists to track these processes in real-time and at the molecular scale. Our review focuses on recent research that has employed state-of-the-art molecular scale spectroscopy, coupled with kinetics, to elucidate the mechanisms of nutrient and metal(loid) reactivity and speciation in soils. We found that by coupling kinetics with advanced molecular and nano-scale tools major advances have been made in elucidating important soil chemical processes including sorption, precipitation, dissolution, and redox of metal(loids) and nutrients. Such advances will aid in better predicting the fate and mobility of nutrients and contaminants in soils and water and enhance environmental and agricultural sustainability.

  6. Numerical study of electromagnetic waves generated by a prototype dielectric logging tool

    USGS Publications Warehouse

    Ellefsen, K.J.; Abraham, J.D.; Wright, D.L.; Mazzella, A.T.

    2004-01-01

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than that in the formation (e.g., an air-filled borehole in the unsaturated zone), only a guided wave propagated along the borehole. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave radiated electromagnetic energy into the formation, causing its amplitude to decrease. When the propagation velocity in the borehole was less than that in the formation (e.g., a water-filled borehole in the saturated zone), both a refracted wave and a guided wave propagated along the borehole. The velocity of the refracted wave equaled the phase velocity of a plane wave in the formation, and the refracted wave preceded the guided wave. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave did not radiate electromagnetic energy into the formation. To analyze traces recorded by the prototype tool during laboratory tests, they were compared to traces calculated with the finite-difference method. The first parts of both the recorded and the calculated traces were similar, indicating that guided and refracted waves indeed propagated along the prototype tool. © 2004 Society of Exploration Geophysicists. All rights reserved.

  7. Experimental and numerical approaches for application of density and thermal neutron tools in slim borehole

    NASA Astrophysics Data System (ADS)

    Hwang, Seho; Shin, Jehyun; Won, Byeongho; Kim, Jongman

    2015-04-01

    For groundwater investigations, geological surveys, and geotechnical investigations, a borehole of about 3 inches in diameter is generally drilled, and PVC or steel casing with a 50 mm inner diameter is installed to prevent borehole collapse in shallow unconsolidated formations or fractured zones. In this situation, well logging for formation evaluation has many limitations, and radioactive tools with large diameters are especially difficult to apply. The radioactive logs that can be run inside the casing are the natural gamma ray, density, and neutron logs. The natural gamma ray log is used to estimate shale volume and to support stratigraphic and facies classification (e.g., shale versus sandstone), and most borehole environments can be corrected for using the manufacturers' charts. For small-diameter boreholes, such as a 50 mm cased borehole, small-diameter radioactive logging tools must be used. However, the measured data are generally counts per second, so the measured count rate must be converted to meaningful physical properties such as density or neutron porosity according to the source strength, the source-detector spacing, the mud and casing type, and so on. In this study, experimental and numerical methods are used to convert the measured count rate to density and neutron porosity for single-detector density and neutron logging tools. 1 Ci Am-Be single-detector neutron logs were compared with 3 Ci Am-Be dual-detector neutron logs in the same boreholes, and an empirical relationship between the single and dual neutron logs was derived. The target boreholes are 3 inches in diameter, with lithologies including granite, sandstone, and mud. The response characteristics of the density log with a very small diameter and a non-oriented (4 pi omni-directional) radioactive source were analyzed using MCNP. Numerical modeling was performed while varying the distance between the radioactive source and the detector
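
    As an illustration of the count-rate-to-property conversion described above, the sketch below fits a generic gamma-gamma density calibration of the form C = C0*exp(-k*rho) and inverts it for a measured count rate. The calibration points, constants, and the single-detector assumption are hypothetical and are not taken from this study.

```python
import numpy as np

# Hypothetical calibration blocks of known bulk density (g/cm^3) and the count
# rates (counts per second) a single-detector density tool might record in them.
calib_density = np.array([1.80, 2.20, 2.45, 2.65, 2.90])
calib_counts = np.array([5200.0, 3100.0, 2300.0, 1750.0, 1250.0])

# Gamma-gamma count rates fall roughly exponentially with bulk density,
# so fit the linearized model ln(C) = a*rho + b (with a < 0).
a, b = np.polyfit(calib_density, np.log(calib_counts), 1)

def counts_to_density(cps):
    """Invert the fitted calibration to obtain an apparent bulk density."""
    return (np.log(cps) - b) / a

print(counts_to_density(2000.0))  # apparent density for a measured count rate
```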

  8. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    SciTech Connect

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.

  9. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    DOE PAGESBeta

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.

  10. Numerical study of Alfvén eigenmodes in the Experimental Advanced Superconducting Tokamak

    SciTech Connect

    Hu, Youjun; Li, Guoqiang; Yang, Wenjun; Zhou, Deng; Ren, Qilong; Gorelenkov, N. N.; Cai, Huishan

    2014-05-15

    Alfvén eigenmodes in up-down asymmetric tokamak equilibria are studied by a new magnetohydrodynamic eigenvalue code. The code is verified with the NOVA code for the Solovév equilibrium and then is used to study Alfvén eigenmodes in an up-down asymmetric equilibrium of the Experimental Advanced Superconducting Tokamak. The frequency and mode structure of toroidicity-induced Alfvén eigenmodes are calculated. It is demonstrated numerically that up-down asymmetry induces phase variation in the eigenfunction across the major radius on the midplane.

  11. Simulation studies of the impact of advanced observing systems on numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Atlas, R.; Kalnay, E.; Susskind, J.; Reuter, D.; Baker, W. E.; Halem, M.

    1984-01-01

    To study the potential impact of advanced passive sounders and lidar temperature, pressure, humidity, and wind observing systems on large-scale numerical weather prediction, a series of realistic simulation studies between the European Center for Medium-Range Weather Forecasts, the National Meteorological Center, and the Goddard Laboratory for Atmospheric Sciences is conducted. The project attempts to avoid the unrealistic character of earlier simulation studies. The previous simulation studies and real-data impact tests are reviewed and the design of the current simulation system is described. Consideration is given to the simulation of observations of space-based sounding systems.

  12. Numerical Weather Prediction Models on Linux Boxes as tools in meteorological education in Hungary

    NASA Astrophysics Data System (ADS)

    Gyongyosi, A. Z.; Andre, K.; Salavec, P.; Horanyi, A.; Szepszo, G.; Mille, M.; Tasnadi, P.; Weidiger, T.

    2012-04-01

    Numerical modeling has become a common tool in the daily practice of weather forecasters due to i) increasing user demands for weather data, ii) the growth in computer resources, iii) numerical weather prediction systems available for integration on affordable, off-the-shelf computers, and iv) available input data (from ECMWF or NCEP) for model integrations. Beside learning the theoretical basis, since last year students working on their MSc or BSc theses or on student research projects have had the opportunity to run numerical models and to analyze the outputs for different purposes, including wind energy estimation, simulation of the dynamics of polar lows and subtropical cyclones, analysis of the isentropic potential vorticity field, examination of coupled atmospheric dispersion models, etc. A special course in the application of numerical modeling has been held (and is being announced for the upcoming semester) for our students in order to improve their skills in this field. Several numerical model systems (NRIPR, ETA, and WRF) have been adapted at the University and tested for the geographical region of the Carpathian Basin. Recently ALADIN/CHAPEAU, the academic version of the ARPEGE/ALADIN cy33t1 meso-scale numerical weather prediction model system, has been installed at our Institute. ALADIN is the operational forecasting model of the Hungarian Meteorological Service and is developed in the framework of the international ALADIN co-operation. Our main objectives are i) the analysis of different typical weather situations, ii) fine tuning of parameterization schemes, and iii) comparison of the ALADIN/CHAPEAU and WRF model outputs based on case studies. The necessary hardware and software innovations have been done. In the presentation the

  13. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  14. A review on recent advances in the numerical simulation for coalbed-methane-recovery process

    SciTech Connect

    Wei, X.R.; Wang, G.X.; Massarotto, P.; Golding, S.D.; Rudolph, V.

    2007-12-15

    The recent advances in numerical simulation for primary coalbed methane (CBM) recovery and enhanced coalbed-methane recovery (ECBMR) processes are reviewed, primarily focusing on the progress that has occurred since the late 1980s. Two major issues regarding the numerical modeling will be discussed in this review: first, multicomponent gas transport in in-situ bulk coal and, second, changes of coal properties during methane (CH4) production. For the former issue, a detailed review of more recent advances in modeling gas and water transport within a coal matrix is presented. Further, various factors influencing gas diffusion through the coal matrix will be highlighted as well, such as pore structure, concentration and pressure, and water effects. An ongoing bottleneck for evaluating total mass transport rate is developing a reasonable representation of multiscale pore space that considers coal type and rank. Moreover, few efforts have been concerned with modeling water-flow behavior in the coal matrix and its effects on CH4 production and on the exchange of carbon dioxide (CO2) and CH4. As for the second issue, theoretical coupled fluid-flow and geomechanical models have been proposed to describe the evolution of pore structure during CH4 production, instead of traditional empirical equations. However, there is currently no effective coupled model for engineering applications. Finally, perspectives on developing suitable simulation models for CBM production and for predicting CO2-sequestration ECBMR are suggested.

  15. An Eight Week Seminar in an Introduction to Numerical Control on Two- and Three-Axis Machine Tools for Vocational and Technical Machine Tool Instructors. Final Report.

    ERIC Educational Resources Information Center

    Boldt, Milton; Pokorny, Harry

    Thirty-three machine shop instructors from 17 states participated in an 8-week seminar to develop the skills and knowledge essential for teaching the operation of numerically controlled machine tools. The seminar was given from June 20 to August 12, 1966, with college credit available through Stout State University. The participants completed an…

  16. Noodles: a tool for visualization of numerical weather model ensemble uncertainty.

    PubMed

    Sanyal, Jibonananda; Zhang, Song; Dyer, Jamie; Mercer, Andrew; Amburn, Philip; Moorhead, Robert J

    2010-01-01

    Numerical weather prediction ensembles are routinely used for operational weather forecasting. The members of these ensembles are individual simulations with either slightly perturbed initial conditions or different model parameterizations, or occasionally both. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists are interested in understanding the uncertainties associated with numerical weather prediction; specifically, the variability among the ensemble members. Currently, visualization of ensemble members is mostly accomplished through spaghetti plots of a single mid-troposphere pressure surface height contour. In order to explore new uncertainty visualization methods, the Weather Research and Forecasting (WRF) model was used to create a 48-hour, 18-member parameterization ensemble of the 13 March 1993 "Superstorm". A tool was designed to interactively explore the ensemble uncertainty of three important weather variables: water-vapor mixing ratio, perturbation potential temperature, and perturbation pressure. Uncertainty was quantified using individual ensemble member standard deviation, inter-quartile range, and the width of the 95% confidence interval. Bootstrapping was employed to overcome the dependence on normality in the uncertainty metrics. A coordinated view of ribbon and glyph-based uncertainty visualization, spaghetti plots, iso-pressure colormaps, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers in the ensemble run and therefore avoiding the WRF parameterizations that lead to these outliers. Additionally, the meteorologists could identify spatial regions where the uncertainty was significantly high, allowing for identification of poorly simulated storm environments and physical interpretation of these model issues. PMID:20975183
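
    A minimal sketch of the three uncertainty metrics named above (member standard deviation, inter-quartile range, and bootstrapped 95% confidence-interval width), computed per grid point for one variable of a synthetic 18-member ensemble, is given below; the array shapes and data are illustrative and do not come from the Noodles tool itself.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for one WRF field: 18 ensemble members on a 30 x 30 grid.
ensemble = rng.normal(loc=300.0, scale=2.0, size=(18, 30, 30))

std_dev = ensemble.std(axis=0)  # per-grid-point member standard deviation
iqr = np.percentile(ensemble, 75, axis=0) - np.percentile(ensemble, 25, axis=0)

# Bootstrap the width of the 95% confidence interval of the ensemble mean,
# resampling members with replacement to avoid assuming normality.
n_boot = 500
idx = rng.integers(0, ensemble.shape[0], size=(n_boot, ensemble.shape[0]))
boot_means = ensemble[idx].mean(axis=1)  # shape (n_boot, 30, 30)
ci_width = (np.percentile(boot_means, 97.5, axis=0)
            - np.percentile(boot_means, 2.5, axis=0))

print(std_dev.mean(), iqr.mean(), ci_width.mean())
```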

  17. Numerical simulation of ISFET structures for biosensing devices with TCAD tools

    PubMed Central

    2015-01-01

    Background Ion Sensitive Field Effect Transistors (ISFETs) are one of the primitive structures for the fabrication of biosensors (BioFETs). Aiming at the optimization of the design and fabrication processes of BioFETs, the correlation between technological parameters and device electrical response can be obtained by means of an electrical device-level simulation. In this work we present a numerical simulation approach to the study of ISFET structures for bio-sensing devices (BioFET) using Synopsys Sentaurus Technology Computer-Aided Design (TCAD) tools. Methods The properties of a custom-defined material were modified in order to reproduce the electrolyte behavior. In particular, the parameters of an intrinsic semiconductor material have been set in order to reproduce an electrolyte solution. By replacing the electrolyte solution with an intrinsic semiconductor, the electrostatic solution of the electrolyte region can therefore be calculated by solving the semiconductor equation within this region. Results The electrostatic behaviour (transfer characteristics) of a general BioFET structure has been simulated when the captured target number increases from 1 to 10. The ID current as a function of the VDS voltage for different positions of a single charged block and for different values of the reference electrode have been calculated. The electrical potential distribution along the electrolyte-insulator-semiconductor structure has been evaluated for different molar concentrations of the electrolyte solution. Conclusions We presented a numerical simulation approach to the study of Ion-Sensitive Field Effect Transistor (ISFET) structures for biosensing devices (BioFETs) using the Synopsys Sentaurus Technology Computer-Aided Design (TCAD) tools. A powerful framework for the design and optimization of biosensor has been devised, thus helping in reducing technology development time and cost. The main finding of the analysis of a general reference BioFET shows that there is
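
    The device-level TCAD simulation described above cannot be reproduced in a few lines, but the qualitative effect it reports (the transfer characteristic shifting as the number of captured charged targets grows from 1 to 10) can be illustrated with a deliberately simple lumped model, in which each captured target shifts the threshold voltage through its charge over an effective gate capacitance and the channel follows a square-law FET. All parameter values below are assumptions for illustration only; this is not the Sentaurus model.

```python
# Toy lumped BioFET model: each captured target shifts the effective threshold
# voltage by z*q/C_ox (sign depends on target charge and device polarity);
# the channel is then described by a textbook square-law MOSFET.
q = 1.602e-19          # elementary charge (C)
z = 10                 # assumed charges carried per captured target molecule
C_ox = 1.0e-14         # assumed effective gate/insulator capacitance (F)
k_n = 2.0e-4           # assumed transconductance parameter (A/V^2)
V_T0 = 0.5             # assumed intrinsic threshold voltage (V)
V_GS, V_DS = 1.2, 0.1  # reference-electrode bias and drain bias (V)

def drain_current(n_targets):
    V_T = V_T0 + n_targets * z * q / C_ox  # threshold shift from captured charge
    V_ov = V_GS - V_T
    if V_ov <= 0.0:
        return 0.0                                    # below threshold
    if V_DS < V_ov:
        return k_n * (V_ov * V_DS - 0.5 * V_DS ** 2)  # linear (triode) region
    return 0.5 * k_n * V_ov ** 2                      # saturation region

for n in range(1, 11):
    print(n, drain_current(n))
```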

  18. Development of Advanced Light-Duty Powertrain and Hybrid Analysis Tool (SAE 2013-01-0808)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by the Environmental Protection Agency to evaluate greenhouse gas emissions and fuel efficiency from light-duty vehicles. It is a physics-based, forward-looking, full vehicle computer simulator, which is cap...

  19. Earthquake information products and tools from the Advanced National Seismic System (ANSS)

    USGS Publications Warehouse

    Wald, Lisa

    2006-01-01

    This Fact Sheet provides a brief description of postearthquake tools and products provided by the Advanced National Seismic System (ANSS) through the U.S. Geological Survey Earthquake Hazards Program. The focus is on products specifically aimed at providing situational awareness in the period immediately following significant earthquake events.

  20. An advanced numerical model for phase change problems in complicated geometries

    NASA Astrophysics Data System (ADS)

    Khashan, Saud Abdel-Aziz

    1998-11-01

    An advanced fixed-grid, enthalpy-formulation-based finite volume numerical method is developed to solve phase change problems in complicated geometries. The numerical method is based on a general non-orthogonal grid structure and a colocated arrangement of variables. Second-order discretizations and interpolations are used. The convergence rate is considerably accelerated by switching off the velocity in the solidified region in an implicit way. This switching-off technique is highly compatible with SIMPLE-like methods. For all test cases conducted in this study, the rate of convergence using the new treatment exceeds that of other enthalpy-formulation-based methods, with fewer numerical stability constraints, when used in convection-diffusion phase change problems. To run better on vector computers, the incomplete LU decomposition (ILU) matrix solver is partially vectorized, raising the Mflops (million floating point operations per second) figure from 60 to over 300. Water freezing in orthogonal and non-orthogonal geometries is studied under the effect of density inversion. All thermo-physical properties of the water are treated as temperature-dependent (no Boussinesq approximation). The results show a profound effect of density inversion on the flow/energy field and on the local as well as the overall freezing rate.
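
    The implicit velocity switch-off mentioned above is commonly realized in SIMPLE-like fixed-grid solvers by adding a very large coefficient to the discretized momentum equation of fully solidified cells; the fragment below sketches that idea for a single control volume. It is a generic illustration of the technique, not the author's exact formulation.

```python
# One control volume of a SIMPLE-like discretized u-momentum equation:
#     a_P * u_P = sum(a_nb * u_nb) + b
# The implicit "switch-off" adds a huge coefficient S_BIG to a_P in fully
# solid cells, driving u_P to zero without altering the equations elsewhere.
S_BIG = 1.0e30

def solve_u(a_nb, u_nb, b, a_P, liquid_fraction):
    a_P_eff = a_P if liquid_fraction > 0.0 else a_P + S_BIG
    return (sum(a * u for a, u in zip(a_nb, u_nb)) + b) / a_P_eff

# A liquid cell keeps its convective velocity; a solid cell is forced to ~0.
print(solve_u([1.0, 1.0], [0.2, 0.3], 0.05, 2.5, liquid_fraction=1.0))
print(solve_u([1.0, 1.0], [0.2, 0.3], 0.05, 2.5, liquid_fraction=0.0))
```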

  1. Numerical-experimental identification of the most effective dynamic operation mode of a vibration drilling tool for improved cutting performance

    NASA Astrophysics Data System (ADS)

    Ostasevicius, V.; Ubartas, M.; Gaidys, R.; Jurenas, V.; Samper, S.; Dauksevicius, R.

    2012-11-01

    This study is concerned with the application of a numerical-experimental approach for characterizing the dynamic behavior of the developed piezoelectrically excited vibration drilling tool, with the aim of identifying the most effective conditions of tool vibration mode control for improved cutting efficiency. A 3D finite element model of the tool was created on the basis of an elastically fixed pre-twisted cantilever (standard twist drill). The model was experimentally verified and used together with tool vibration measurements in order to reveal the rich dynamic behavior of the pre-twisted structure, representing a case of parametric vibrations with axial, torsional and transverse natural vibrations accompanied by additional dynamic effects arising from the coupling of axial and torsional deflections ((un)twisting). Numerical results combined with extensive data from interferometric, accelerometric, dynamometric and surface roughness measurements allowed the determination of critical excitation frequencies and the corresponding vibration modes, which have the largest influence on the performance metrics of the vibration drilling process. The most favorable tool excitation conditions were established: inducing the axial mode of the vibration tool itself by tailoring the driving frequency minimizes the magnitudes of surface roughness, cutting force and torque. The research results confirm the importance of tool mode control in enhancing the effectiveness of vibration cutting tools from the viewpoint of structural dynamics.

  2. Synthetic biology and molecular genetics in non-conventional yeasts: Current tools and future advances.

    PubMed

    Wagner, James M; Alper, Hal S

    2016-04-01

    Coupling the tools of synthetic biology with traditional molecular genetic techniques can enable the rapid prototyping and optimization of yeast strains. While the era of yeast synthetic biology began in the well-characterized model organism Saccharomyces cerevisiae, it is swiftly expanding to include non-conventional yeast production systems such as Hansenula polymorpha, Kluyveromyces lactis, Pichia pastoris, and Yarrowia lipolytica. These yeasts already have roles in the manufacture of vaccines, therapeutic proteins, food additives, and biorenewable chemicals, but recent synthetic biology advances have the potential to greatly expand and diversify their impact on biotechnology. In this review, we summarize the development of synthetic biological tools (including promoters and terminators) and enabling molecular genetics approaches that have been applied in these four promising alternative biomanufacturing platforms. An emphasis is placed on synthetic parts and genome editing tools. Finally, we discuss examples of synthetic tools developed in other organisms that can be adapted or optimized for these hosts in the near future. PMID:26701310

  3. Numerical approach for the voloxidation process of an advanced spent fuel conditioning process (ACP)

    SciTech Connect

    Park, Byung Heung; Jeong, Sang Mun; Seo, Chung-Seok

    2007-07-01

    A voloxidation process is adopted as the first step of an advanced spent fuel conditioning process in order to prepare the SF oxide to be reduced in the subsequent electrolytic reduction process. A semi-batch type voloxidizer was devised to transform an SF pellet into powder. In this work, a simple reactor model was developed as a numerical approach for correlating the gas-phase flow rate with the operation time. Under the assumption that the solid and gas phases are homogeneous in the reactor, an oxidation reaction rate was introduced into a mass balance equation. The developed equation can describe the change in the outlet oxygen concentration, including cases where the gas flow is not sufficient to sustain the reaction at its maximum rate. (authors)
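
    A hedged sketch of the kind of well-mixed, semi-batch oxygen balance described above is shown below: the oxidation rate is capped by the oxygen the gas stream can actually deliver, so the same equation also covers the flow-limited case. The kinetics, flow rate, and volume are invented for illustration and are not the authors' values.

```python
from scipy.integrate import solve_ivp

Q = 1.0e-3      # assumed gas flow rate (m^3/s)
V = 5.0e-3      # assumed reactor gas volume (m^3)
c_in = 8.6      # inlet O2 concentration (mol/m^3), roughly that of air
r_max = 2.0e-3  # assumed maximum O2 consumption rate of the oxidation (mol/s)

def o2_balance(t, y):
    c = y[0]                           # well-mixed (= outlet) O2 concentration
    supply = Q * c                     # O2 the gas stream can deliver (mol/s)
    r = min(r_max, supply)             # the reaction cannot outrun the O2 supply
    return [(Q * (c_in - c) - r) / V]  # gas-phase O2 mass balance

sol = solve_ivp(o2_balance, (0.0, 600.0), [c_in], max_step=1.0)
print(sol.y[0, -1])  # outlet O2 concentration approaching steady state
```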

  4. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    NASA Astrophysics Data System (ADS)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  5. The role of numerical simulation for the development of an advanced HIFU system

    NASA Astrophysics Data System (ADS)

    Okita, Kohei; Narumi, Ryuta; Azuma, Takashi; Takagi, Shu; Matumoto, Yoichiro

    2014-10-01

    High-intensity focused ultrasound (HIFU) has been used clinically and is under clinical trials to treat various diseases. An advanced HIFU system employs ultrasound techniques for guidance during HIFU treatment instead of magnetic resonance imaging in current HIFU systems. A HIFU beam imaging for monitoring the HIFU beam and a localized motion imaging for treatment validation of tissue are introduced briefly as the real-time ultrasound monitoring techniques. Numerical simulations have a great impact on the development of real-time ultrasound monitoring as well as the improvement of the safety and efficacy of treatment in advanced HIFU systems. A HIFU simulator was developed to reproduce ultrasound propagation through the body in consideration of the elasticity of tissue, and was validated by comparison with in vitro experiments in which the ultrasound emitted from the phased-array transducer propagates through the acrylic plate acting as a bone phantom. As the result, the defocus and distortion of the ultrasound propagating through the acrylic plate in the simulation quantitatively agree with that in the experimental results. Therefore, the HIFU simulator accurately reproduces the ultrasound propagation through the medium whose shape and physical properties are well known. In addition, it is experimentally confirmed that simulation-assisted focus control of the phased-array transducer enables efficient assignment of the focus to the target. Simulation-assisted focus control can contribute to design of transducers and treatment planning.

  6. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings for exploring the correlation between the underlying models of Advanced Risk Reduction Tool (ARRT) relative to how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  7. BOOK REVIEW: Advanced Topics in Computational Partial Differential Equations: Numerical Methods and Diffpack Programming

    NASA Astrophysics Data System (ADS)

    Katsaounis, T. D.

    2005-02-01

    The scope of this book is to present well known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required by the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, a prior knowledge or usage of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models and the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well known numerical methods for solving the basic types of PDEs. Further, programming techniques on the serial as well as on the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. The first chapter is an introduction to parallel processing. It covers fundamentals of parallel processing in a simple and concrete way and no prior knowledge of the subject is required. Examples of parallel implementation of basic linear algebra operations are presented using the Message Passing Interface (MPI) programming environment. Here, some knowledge of MPI routines is required by the reader. Examples solving in parallel simple PDEs using

  8. Numerical Evaluation of Fluid Mixing Phenomena in Boiling Water Reactor Using Advanced Interface Tracking Method

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Takase, Kazuyuki

    Thermal-hydraulic design of current boiling water reactors (BWRs) is performed with subchannel analysis codes that incorporate correlations based on empirical results, including actual-size tests. For the Innovative Water Reactor for Flexible Fuel Cycle (FLWR) core, an actual-size test of an embodiment of its design would likewise be required to confirm or modify such correlations. Development of a method that enables the thermal-hydraulic design of nuclear reactors without these actual-size tests is therefore desired, because such tests take a long time and entail great cost. For this reason, we developed an advanced thermal-hydraulic design method for FLWRs using innovative two-phase flow simulation technology. In this study, a detailed two-phase flow simulation code using an advanced interface tracking method, TPFIT, was developed to calculate detailed information on the two-phase flow. In this paper, we first attempt to verify the TPFIT code by comparing it with existing two-channel air-water mixing experimental results. Second, the TPFIT code is applied to the simulation of steam-water two-phase flow in a model of two subchannels of current BWR and FLWR rod bundles. Fluid mixing was observed at the gap between the subchannels. The existing two-phase flow correlation for fluid mixing is evaluated using the detailed numerical simulation data. These data indicate that the pressure difference between the fluid channels is responsible for the fluid mixing, and thus the effects of the time-averaged pressure difference and its fluctuations must be incorporated in the two-phase flow correlation for fluid mixing. When the inlet quality ratio of the subchannels is relatively large, the evaluation precision of the existing two-phase flow correlations for fluid mixing is relatively low.

  9. Advanced gradient-index lens design tools to maximize system performance and reduce SWaP

    NASA Astrophysics Data System (ADS)

    Campbell, Sawyer D.; Nagar, Jogender; Brocker, Donovan E.; Easum, John A.; Turpin, Jeremiah P.; Werner, Douglas H.

    2016-05-01

    GRadient-INdex (GRIN) lenses have long been of interest due to their potential for providing levels of performance unachievable with traditional homogeneous lenses. While historically limited by a lack of suitable materials, rapid advancements in manufacturing techniques, including 3D printing, have recently kindled a renewed interest in GRIN optics. Further increasing the desire for GRIN devices has been the advent of Transformation Optics (TO), which provides the mathematical framework for representing the behavior of electromagnetic radiation in a given geometry by "transforming" it to an alternative, usually more desirable, geometry through an appropriate mapping of the constituent material parameters. Using TO, aspherical lenses can be transformed to simpler spherical and flat geometries or even rotationally-asymmetric shapes which result in true 3D GRIN profiles. Meanwhile, there is a critical lack of suitable design tools which can effectively evaluate the optical wave propagation through 3D GRIN profiles produced by TO. Current modeling software packages for optical lens systems also lack advanced multi-objective global optimization capability which allows the user to explicitly view the trade-offs between all design objectives such as focus quality, FOV, Δn, and focal drift due to chromatic aberrations. When coupled with advanced design methodologies such as TO, wavefront matching (WFM), and analytical achromatic GRIN theory, these tools provide a powerful framework for maximizing SWaP (Size, Weight and Power) reduction in GRIN-enabled optical systems. We provide an overview of our advanced GRIN design tools and examples which minimize the presence of mono- and polychromatic aberrations in the context of reducing SWaP.

  10. A hybrid numerical technique for predicting the aerodynamic and acoustic fields of advanced turboprops

    NASA Technical Reports Server (NTRS)

    Homicz, G. F.; Moselle, J. R.

    1985-01-01

    A hybrid numerical procedure is presented for the prediction of the aerodynamic and acoustic performance of advanced turboprops. A hybrid scheme is proposed which in principle leads to a consistent simultaneous prediction of both fields. In the inner flow a finite difference method, the Approximate-Factorization Alternating-Direction-Implicit (ADI) scheme, is used to solve the nonlinear Euler equations. In the outer flow the linearized acoustic equations are solved via a Boundary-Integral Equation (BIE) method. The two solutions are iteratively matched across a fictitious interface in the flow so as to maintain continuity. At convergence the resulting aerodynamic load prediction will automatically satisfy the appropriate free-field boundary conditions at the edge of the finite difference grid, while the acoustic predictions will reflect the back-reaction of the radiated field on the magnitude of the loading source terms, as well as refractive effects in the inner flow. The equations and logic needed to match the two solutions are developed and the computer program implementing the procedure is described. Unfortunately, no converged solutions were obtained, due to unexpectedly large running times. The reasons for this are discussed and several means to alleviate the situation are suggested.
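
    The essential structure of the hybrid scheme, an inner solve and an outer solve exchanging interface data until they agree, can be sketched as the fixed-point loop below. The two solver functions are placeholders standing in for the ADI Euler solve and the boundary-integral acoustic solve; they are not the report's actual algorithms.

```python
import numpy as np

def solve_inner_euler(outer_boundary_values):
    """Placeholder for the ADI finite-difference Euler solve on the inner grid;
    returns the flow state on the fictitious interface."""
    return 0.8 * outer_boundary_values + 0.1  # stand-in contraction mapping

def solve_outer_bie(interface_state):
    """Placeholder for the linearized boundary-integral acoustic solve;
    returns updated boundary values for the inner problem."""
    return 0.5 * interface_state + 0.2        # stand-in contraction mapping

interface = np.zeros(64)  # initial guess on the fictitious interface
for iteration in range(100):
    inner_state = solve_inner_euler(interface)
    new_interface = solve_outer_bie(inner_state)
    residual = float(np.max(np.abs(new_interface - interface)))
    interface = new_interface
    if residual < 1e-10:  # continuity maintained across the interface
        break
print(iteration, residual)
```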

  11. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS, Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.

  12. Five levels of PACS modularity: integrating 3D and other advanced visualization tools.

    PubMed

    Wang, Kenneth C; Filice, Ross W; Philbin, James F; Siegel, Eliot L; Nagy, Paul G

    2011-12-01

    The current array of PACS products and 3D visualization tools presents a wide range of options for applying advanced visualization methods in clinical radiology. The emergence of server-based rendering techniques creates new opportunities for raising the level of clinical image review. However, best-of-breed implementations of core PACS technology, volumetric image navigation, and application-specific 3D packages will, in general, be supplied by different vendors. Integration issues should be carefully considered before deploying such systems. This work presents a classification scheme describing five tiers of PACS modularity and integration with advanced visualization tools, with the goals of characterizing current options for such integration, providing an approach for evaluating such systems, and discussing possible future architectures. These five levels of increasing PACS modularity begin with what was until recently the dominant model for integrating advanced visualization into the clinical radiologist's workflow, consisting of a dedicated stand-alone post-processing workstation in the reading room. Introduction of context-sharing, thin clients using server-based rendering, archive integration, and user-level application hosting at successive levels of the hierarchy lead to a modularized imaging architecture, which promotes user interface integration, resource efficiency, system performance, supportability, and flexibility. These technical factors and system metrics are discussed in the context of the proposed five-level classification scheme. PMID:21301923

  13. SUSPNDRS: a numerical simulation tool for the nonlinear transient analysis of cable support bridge structures, part 1: theoretical development

    SciTech Connect

    McCallen, D.; Astaneh-Asl, A.

    1997-06-01

    The work reported on herein was aimed at developing methodologies and tools for efficient and accurate numerical simulation of the seismic response of suspension and cable-stayed structures. A special purpose finite element program has been constructed and the underlying theory and demonstration example problems are presented. A companion report [Ref 1] discusses the application of this technology to a major suspension bridge structure.

  14. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performative energy codes, resulting in better, more efficient designs for our future built environment.

  15. Numerical modeling of the gamma-gamma density tool responses in horizontal wells with an axial asymmetry.

    PubMed

    Dworak, Dominik; Woźnicka, Urszula; Zorski, Tomasz; Wiącek, Urszula

    2011-01-01

    A signal of a spectrometric gamma-gamma density tool in specific borehole conditions has been numerically calculated. Transport of gamma rays, from a point (137)Cs gamma source situated in a borehole tool, through rock media to detectors, has been simulated using a Monte Carlo code. The influence of heterogeneity of the rock medium surrounding the borehole on the signal of the detectors has been examined. This heterogeneity results from the presence of an interface between two different geological layers, parallel to the borehole wall. The above conditions may occur in horizontal logging, when the borehole is drilled along the boundary of geological layers. It is possible to assess the distance from the boundary on the basis of the responses of the gamma-gamma density tool, using the classic interpretation "spine & ribs" procedure. The effect of different densities of the bordered layers on the tool response has been analyzed. The presented calculations show the wide possibilities of numerical modeling of the complex borehole geometry and solving difficult interpretation problems in nuclear well logging. PMID:20850331
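
    For orientation, the "spine & ribs" interpretation mentioned above combines the apparent densities seen by a short-spaced and a long-spaced detector into a compensated bulk density. The sketch below is a simplified, first-order version of that combination; the calibration constants, the linear rib correction, and the count rates are invented for illustration and do not come from the paper.

```python
import math

# Each detector's count rate maps to an apparent density through its own
# (invented) exponential calibration: cps = C0 * exp(-k * rho).
def apparent_density(cps, c0, k):
    return math.log(c0 / cps) / k

C0_SS, K_SS = 9.0e4, 1.1  # short-spaced detector (shallow depth of investigation)
C0_LS, K_LS = 3.0e4, 1.6  # long-spaced detector (deeper depth of investigation)

def compensated_density(cps_ss, cps_ls, rib_slope=0.7):
    """Add to the long-spaced reading a first-order 'rib' correction that grows
    with the separation between the two apparent densities; the sign and slope
    of the correction depend on the tool's calibration charts."""
    rho_ss = apparent_density(cps_ss, C0_SS, K_SS)
    rho_ls = apparent_density(cps_ls, C0_LS, K_LS)
    delta_rho = rib_slope * (rho_ls - rho_ss)
    return rho_ls + delta_rho, delta_rho

rho_b, drho = compensated_density(cps_ss=5.0e3, cps_ls=450.0)
print(rho_b, drho)
```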

  16. A Manually Operated, Advance Off-Stylet Insertion Tool for Minimally Invasive Cochlear Implantation Surgery

    PubMed Central

    Kratchman, Louis B.; Schurzig, Daniel; McRackan, Theodore R.; Balachandran, Ramya; Noble, Jack H.; Webster, Robert J.; Labadie, Robert F.

    2014-01-01

    The current technique for cochlear implantation (CI) surgery requires a mastoidectomy to gain access to the cochlea for electrode array insertion. It has been shown that microstereotactic frames can enable an image-guided, minimally invasive approach to CI surgery called percutaneous cochlear implantation (PCI) that uses a single drill hole for electrode array insertion, avoiding a more invasive mastoidectomy. Current clinical methods for electrode array insertion are not compatible with PCI surgery because they require a mastoidectomy to access the cochlea; thus, we have developed a manually operated electrode array insertion tool that can be deployed through a PCI drill hole. The tool can be adjusted using a preoperative CT scan for accurate execution of the advance off-stylet (AOS) insertion technique and requires less skill to operate than is currently required to implant electrode arrays. We performed three cadaver insertion experiments using the AOS technique and determined that all insertions were successful using CT and microdissection. PMID:22851233

  17. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  18. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  19. NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Craig, D. A.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The objective of this Technical Interchange Meeting was to increase the quantity and quality of technical, cost, and programmatic data used to model the impact of investing in different technologies. The focus of this meeting was the Technology Tool Box (TTB), a database of performance, operations, and programmatic parameters provided by technologists and used by systems engineers. The TTB is the data repository used by a system of models known as the Advanced Technology Lifecycle Analysis System (ATLAS). This report describes the result of the November meeting, and also provides background information on ATLAS and the TTB.

  20. Theoretical base and numerical tools for modeling transitions between continuous and disperse multiphase motions

    NASA Astrophysics Data System (ADS)

    Zhang, Duan; Ma, Xia; Giguere, Paul

    2009-11-01

    Transitions between continuous and disperse multiphase motions happen commonly in nature and in our daily life. The phenomena include dissolving sugar cubes in a cup, the formation of rain and hail, and the shattering of a piece of glass. The capability of numerically simulating these phenomena is important both to industrial applications and to the understanding of nature. Relative to other aspects of this topic, the theory for disperse multiphase flow is better developed, despite many important issues still to be resolved. The theory for continuous multiphase flow is still in its infancy. The study of transitions between continuous and disperse multiphase motion is at an even earlier stage of development. In this talk, we describe a possible theoretical framework based on probability and statistical theory and a useful numerical method for simulating these phenomena. Deficiencies in the theory and in the numerical method are also discussed.

  1. Recent advances in theoretical and numerical studies of wire array Z-pinch in the IAPCM

    SciTech Connect

    Ding, Ning; Zhang, Yang; Xiao, Delong; Wu, Jiming; Huang, Jun; Yin, Li; Sun, Shunkai; Xue, Chuang; Dai, Zihuan; Ning, Cheng; Shu, Xiaojian; Wang, Jianguo; Li, Hua

    2014-12-15

    Fast Z-pinch has produced the most powerful X-ray radiation source in laboratory and also shows the possibility to drive inertial confinement fusion (ICF). Recent advances in wire-array Z-pinch researches at the Institute of Applied Physics and Computational Mathematics are presented in this paper. A typical wire array Z-pinch process has three phases: wire plasma formation and ablation, implosion and the MRT instability development, stagnation and radiation. A mass injection model with azimuthal modulation coefficient is used to describe the wire initiation, and the dynamics of ablated plasmas of wire-array Z-pinches in (r, θ) geometry is numerically studied. In the implosion phase, a two-dimensional(r, z) three temperature radiation MHD code MARED has been developed to investigate the development of the Magneto-Rayleigh-Taylor(MRT) instability. We also analyze the implosion modes of nested wire-array and find that the inner wire-array is hardly affected before the impaction of the outer wire-array. While the plasma accelerated to high speed in the implosion stage stagnates on the axis, abundant x-ray radiation is produced. The energy spectrum of the radiation and the production mechanism are investigated. The computational x-ray pulse shows a reasonable agreement with the experimental result. We also suggest that using alloyed wire-arrays can increase multi-keV K-shell yield by decreasing the opacity of K-shell lines. In addition, we use a detailed circuit model to study the energy coupling between the generator and the Z-pinch implosion. Recently, we are concentrating on the problems of Z-pinch driven ICF, such as dynamic hohlraum and capsule implosions. Our numerical investigations on the interaction of wire-array Z-pinches on foam convertors show qualitative agreements with experimental results on the “Qiangguang I” facility. An integrated two-dimensional simulation of dynamic hohlraum driven capsule implosion provides us the physical insights of wire

  2. Recent advances in theoretical and numerical studies of wire array Z-pinch in the IAPCM

    NASA Astrophysics Data System (ADS)

    Ding, Ning; Zhang, Yang; Xiao, Delong; Wu, Jiming; Huang, Jun; Yin, Li; Sun, Shunkai; Xue, Chuang; Dai, Zihuan; Ning, Cheng; Shu, Xiaojian; Wang, Jianguo; Li, Hua

    2014-12-01

    Fast Z-pinch has produced the most powerful X-ray radiation source in laboratory and also shows the possibility to drive inertial confinement fusion (ICF). Recent advances in wire-array Z-pinch researches at the Institute of Applied Physics and Computational Mathematics are presented in this paper. A typical wire array Z-pinch process has three phases: wire plasma formation and ablation, implosion and the MRT instability development, stagnation and radiation. A mass injection model with azimuthal modulation coefficient is used to describe the wire initiation, and the dynamics of ablated plasmas of wire-array Z-pinches in (r, θ) geometry is numerically studied. In the implosion phase, a two-dimensional(r, z) three temperature radiation MHD code MARED has been developed to investigate the development of the Magneto-Rayleigh-Taylor(MRT) instability. We also analyze the implosion modes of nested wire-array and find that the inner wire-array is hardly affected before the impaction of the outer wire-array. While the plasma accelerated to high speed in the implosion stage stagnates on the axis, abundant x-ray radiation is produced. The energy spectrum of the radiation and the production mechanism are investigated. The computational x-ray pulse shows a reasonable agreement with the experimental result. We also suggest that using alloyed wire-arrays can increase multi-keV K-shell yield by decreasing the opacity of K-shell lines. In addition, we use a detailed circuit model to study the energy coupling between the generator and the Z-pinch implosion. Recently, we are concentrating on the problems of Z-pinch driven ICF, such as dynamic hohlraum and capsule implosions. Our numerical investigations on the interaction of wire-array Z-pinches on foam convertors show qualitative agreements with experimental results on the "Qiangguang I" facility. An integrated two-dimensional simulation of dynamic hohlraum driven capsule implosion provides us the physical insights of wire

  3. Folder: A numerical tool to simulate the development of structures in layered media

    NASA Astrophysics Data System (ADS)

    Adamuszek, Marta; Dabrowski, Marcin; Schmid, Daniel W.

    2016-03-01

    We present Folder, a numerical toolbox for modelling deformation in layered media subject to layer-parallel shortening or extension in two dimensions. The toolbox includes a range of features that ensure maximum flexibility to configure the model geometry, define material parameters, specify numerical parameters, and choose the plotting options. Folder builds on an efficient finite element method model and implements state-of-the-art iterative and time integration schemes. We describe the basic Folder features and present several case studies of single- and multilayer stacks subject to layer-parallel shortening and extension. Folder additionally comprises an application that illustrates various analytical solutions of growth rates calculated for the cases of layer-parallel shortening and extension of a single layer with interfaces perturbed with a single sinusoidal waveform. We further derive two novel analytical expressions for the growth rate in the cases of layer-parallel shortening and extension of a linear viscous layer embedded in a linear viscous medium of finite thickness. These solutions help to understand mechanical instabilities in layered rocks and provide a unique opportunity for benchmarking numerical codes. We demonstrate how Folder can be used for benchmarking of numerical codes. We test the accuracy of single-layer folding simulations using various 1) spatial and temporal resolutions, 2) iterative algorithms for non-linear materials, and 3) time integration schemes. The accuracy of the numerical results is quantified by: 1) comparing them to analytical solutions, if available, or 2) running convergence tests. As a result, we provide a map of the most optimal choice of grid size, time step, and number of iterations to keep the results of the numerical simulations below a given error for a given time integration scheme. Folder is an open source MATLAB application and comes with a user-friendly graphical interface. Folder is suitable for both educational

  4. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database.

    SciTech Connect

    Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division; Purdue Univ.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  5. Science-Based Approach for Advancing Marine and Hydrokinetic Energy: Integrating Numerical Simulations with Experiments

    NASA Astrophysics Data System (ADS)

    Sotiropoulos, F.; Kang, S.; Chamorro, L. P.; Hill, C.

    2011-12-01

    experimentally in the St. Anthony Falls Laboratory Main Channel. The experiments and simulations are compared with each other and shown to be in very good agreement both in terms of the mean flow and the turbulence statistics. The results are analyzed to study the structure of turbulence in the wake of the turbine and also identify the effects of turbulent fluctuations in the approach flow on the power produced by the turbine. Overall our results make a strong case that high-resolution numerical modeling, validated with detailed laboratory measurements, is a viable tool for assessing and optimizing the performance of MHK devices.

  6. An Analysis of Energy Savings Possible Through Advances in Automotive Tooling Technology

    SciTech Connect

    Rick Schmoyer, RLS

    2004-12-03

    The use of lightweight and highly formable advanced materials in automobile and truck manufacturing has the potential to save fuel. Advances in tooling technology would promote the use of these materials. This report describes an energy savings analysis performed to approximate the potential fuel savings and consequential carbon-emission reductions that would be possible because of advances in tooling in the manufacturing of, in particular, non-powertrain components of passenger cars and heavy trucks. Separate energy analyses are performed for cars and heavy trucks. Heavy trucks are considered to be Class 7 and 8 trucks (trucks rated over 26,000 lbs gross vehicle weight). A critical input to the analysis is a set of estimates of the percentage reductions in weight and drag that could be achieved by the implementation of advanced materials, as a consequence of improved tooling technology, which were obtained by surveying tooling industry experts who attended a DOE Workshop, Tooling Technology for Low-Volume Vehicle Production, held in Seattle and Detroit in October and November 2003. The analysis is also based on 2001 fuel consumption totals and on energy-audit component proportions of fuel use due to drag, rolling resistance, and braking. The consumption proportions are assumed constant over time, but an allowance is made for fleet growth. The savings for a particular component are then the product of total fuel consumption, the percentage reduction of the component, and the energy-audit component proportion. Fuel savings estimates for trucks also account for weight-limited versus volume-limited operations. Energy savings are assumed to be of two types: (1) direct energy savings incurred through reduced forces that must be overcome to move the vehicle or to slow it down in braking, and (2) indirect energy savings through reductions in the required engine power, the production and transmission of which incur thermodynamic losses, internal friction, and other
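
    The savings arithmetic described above can be illustrated with a short hedged sketch; the function simply multiplies the three factors named in the report, and the numbers below are made-up placeholders rather than figures from the analysis.

        # savings = total fuel consumption x fractional component reduction
        #           x energy-audit proportion of fuel use attributable to that component.
        # All numerical values below are placeholders, not values from the report.
        def component_fuel_savings(total_fuel_gal, fractional_reduction, audit_proportion):
            """Estimated fuel saved (gallons) for one vehicle-energy component."""
            return total_fuel_gal * fractional_reduction * audit_proportion

        total_fuel = 25.0e9       # gal/yr fleet consumption (placeholder)
        drag_reduction = 0.05     # 5% drag reduction enabled by advanced tooling (placeholder)
        drag_share = 0.20         # audit share of fuel use due to drag (placeholder)
        print(f"{component_fuel_savings(total_fuel, drag_reduction, drag_share):.3e} gal/yr")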

  7. New Tsunami Forecast Tools for the French Polynesia Tsunami Warning System Part II: Numerical Modelling and Tsunami Height Estimation

    NASA Astrophysics Data System (ADS)

    Jamelot, Anthony; Reymond, Dominique

    2015-03-01

    Tsunami warning is classically based on two fundamental tools: the first concerns the estimation of the source parameters, and the second is the tsunami amplitude forecast. We presented in the first companion paper how the seismic source parameters are evaluated, and this second article describes the operational aspect and accuracy of the estimation of tsunami height using tsunami numerical modelling on a dedicated supercomputer (2.5 T-flops). The French Polynesian tsunami warning centre has developed two new tsunami forecast tools for a tsunami warning context, based on our tsunami propagation numerical model named Taitoko. The first tool, named MERIT, is very rapid and provides a preliminary forecast distribution of the tsunami amplitude for 30 sites located in French Polynesia in less than 5 min. In this case, the coastal tsunami height distribution is calculated from the numerical simulation of the tsunami amplitude in the deep ocean using an empirical transfer function inspired by Green's law. This method, which does not take into account resonance effects of bays and harbours, is suitable for a rapid first estimate of the tsunami danger. The second method, named COASTER, which uses 21 nested grids of increasing resolution, gives more information about the coastal tsunami effects, such as the flow velocities, the arrival time of the maximum amplitude, and the maximum run-up height, for five representative sites in 45 min. The historical tsunamis recorded over the last 22 years in French Polynesia have been simulated with these new tools to evaluate the accuracy of these methods. The results of the 23 historical tsunami simulations have been compared to the tide-gauge records of three sites in French Polynesia. The results, which are quite encouraging, show standard errors of generally less than a factor of 2: the maximum standard error is 0.38 m for Tahauku Bay on Hiva-Oa (Marquesas Islands).
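
    The empirical transfer function mentioned above is inspired by Green's law, which scales wave amplitude with the fourth root of the depth ratio. A minimal sketch of such a depth-ratio scaling is given below; the depths, amplitude, and optional site factor are illustrative assumptions, not the calibrated MERIT transfer function.

        # Sketch of a Green's-law-style amplitude transfer (assumption: the classical
        # shoaling relation A_coast = A_deep * (h_deep / h_coast)**0.25; depths and the
        # site_factor are placeholders, not parameters of the MERIT tool).
        def green_law_amplitude(a_deep_m, h_deep_m, h_coast_m, site_factor=1.0):
            """Estimate a coastal tsunami amplitude from a deep-ocean amplitude."""
            return site_factor * a_deep_m * (h_deep_m / h_coast_m) ** 0.25

        a_deep = 0.15  # m, simulated offshore amplitude (placeholder)
        print(f"Coastal estimate: {green_law_amplitude(a_deep, 4000.0, 10.0):.2f} m")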

  8. Development of Numerical Tools for the Investigation of Plasma Detachment from Magnetic Nozzles

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2007-01-01

    A multidimensional numerical simulation framework aimed at investigating the process of plasma detachment from a magnetic nozzle is introduced. An existing numerical code based on a magnetohydrodynamic formulation of the plasma flow equations that accounts for various dispersive and dissipative processes in plasmas was significantly enhanced to allow for the modeling of axisymmetric domains containing three-dimensional momentum and magnetic flux vectors. A separate magnetostatic solver was used to simulate the applied magnetic field topologies found in various nozzle experiments. Numerical results from a magnetic diffusion test problem in which all three components of the magnetic field were present exhibit excellent quantitative agreement with the analytical solution, and the lack of numerical instabilities due to fluctuations in the value of ∇·B indicates that the conservative MHD framework with dissipative effects is well-suited for multi-dimensional analysis of magnetic nozzles. Further studies will focus on modeling literature experiments both for the purpose of code validation and to extract physical insight regarding the mechanisms driving detachment.

  9. Simultaneous heat and mass transfer in a horizontal tube absorber: Numerical tools for present and future absorber designs

    NASA Astrophysics Data System (ADS)

    Wassenaar, Reinder Hette

    1994-11-01

    Absorption cycles like the absorption heat pump or the absorption heat transformer can contribute to saving one of the earth's resources: energy. Because of the high initial expenses, application of a conventional absorption apparatus is only economical in the MW range at today's prices. In this project, numerical tools were developed for the design of heat and mass exchangers with a better price-performance ratio, which open up a wider field of application for absorption cycles. The tools are mathematical formulations of the conservation of energy and mass for absorption in a falling film flowing along a cooled wall. The resulting set of partial differential equations with appropriate boundary conditions is made dimensionless to make it applicable to any mixture or geometry and to summarize the model parameters in a few dimensionless groups. The numerical tools developed give, in contrast to the existing finite difference descriptions, accurate results at very low computational cost and also apply to flow fields of arbitrary geometry.

  10. Numerical Simulations and Tracer Studies as a Tool to Support Water Circulation Modeling in Breeding Reservoirs

    NASA Astrophysics Data System (ADS)

    Zima, Piotr

    2014-12-01

    The article presents a proposal of a method for computer-aided design and analysis of breeding reservoirs in zoos and aquariums. The method involves the use of computer simulations of water circulation in breeding pools. A mathematical model of a pool was developed, and a tracer study was carried out. A simplified model of two-dimensional flow in the form of a biharmonic equation for the stream function (converted into components of the velocity vector) was adopted to describe the flow field. This equation, supplemented by appropriate boundary conditions, was solved numerically by the finite difference method. Next, a tracer migration equation was solved, which was a two-dimensional advection-dispersion equation describing the unsteady transport of a non-active, permanent solute. In order to obtain a proper solution, a tracer study (with rhodamine WT as a tracer) was conducted in situ. The results of these measurements were compared with the numerical solutions obtained. The results of the numerical simulations made it possible to reconstruct the water circulation in the breeding pool and to identify still-water zones, where water circulation was impeded.
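
    As a minimal illustration of the second step described above, the sketch below advances a two-dimensional advection-dispersion equation one explicit finite-difference step at a time (upwind advection, central diffusion); the grid, velocity field, and dispersion coefficient are placeholders rather than the pool geometry and calibrated parameters of the study.

        import numpy as np

        # One explicit step of  dc/dt + u dc/dx + v dc/dy = D * (d2c/dx2 + d2c/dy2)
        # with first-order upwind advection (assumes u, v >= 0) and central diffusion.
        def advect_disperse_step(c, u, v, D, dx, dy, dt):
            cn = c.copy()
            dcdx = (c[1:-1, 1:-1] - c[1:-1, :-2]) / dx
            dcdy = (c[1:-1, 1:-1] - c[:-2, 1:-1]) / dy
            lap = ((c[1:-1, 2:] - 2 * c[1:-1, 1:-1] + c[1:-1, :-2]) / dx**2 +
                   (c[2:, 1:-1] - 2 * c[1:-1, 1:-1] + c[:-2, 1:-1]) / dy**2)
            cn[1:-1, 1:-1] += dt * (-u * dcdx - v * dcdy + D * lap)
            return cn

        c = np.zeros((50, 80))
        c[25, 10] = 1.0  # initial tracer slug (placeholder)
        for _ in range(200):
            c = advect_disperse_step(c, u=0.02, v=0.0, D=1e-3, dx=0.1, dy=0.1, dt=0.1)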

  11. Development of Experimental and Computational Aeroacoustic Tools for Advanced Liner Evaluation

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.; Watson, Willie R.; Nark, Douglas N.; Parrott, Tony L.; Gerhold, Carl H.; Brown, Martha C.

    2006-01-01

    Acoustic liners in aircraft engine nacelles suppress radiated noise. Therefore, as air travel increases, increasingly sophisticated tools are needed to maximize noise suppression. During the last 30 years, NASA has invested significant effort in development of experimental and computational acoustic liner evaluation tools. The Curved Duct Test Rig is a 152-mm by 381-mm curved duct that supports liner evaluation at Mach numbers up to 0.3 and source SPLs up to 140 dB, in the presence of user-selected modes. The Grazing Flow Impedance Tube is a 51-mm by 63-mm duct currently being fabricated to operate at Mach numbers up to 0.6 with source SPLs up to at least 140 dB, and will replace the existing 51-mm by 51-mm duct. Together, these test rigs allow evaluation of advanced acoustic liners over a range of conditions representative of those observed in aircraft engine nacelles. Data acquired with these test ducts are processed using three aeroacoustic propagation codes. Two are based on finite element solutions to convected Helmholtz and linearized Euler equations. The third is based on a parabolic approximation to the convected Helmholtz equation. The current status of these computational tools and their associated usage with the Langley test rigs is provided.

  12. Advances in Analytical and Numerical Dispersion Modeling of Pollutants Releasing from an Area-source

    NASA Astrophysics Data System (ADS)

    Nimmatoori, Praneeth

    The air quality near agricultural activities such as tilling, plowing, harvesting, and manure application is of particular concern because these activities release fine particulate matter into the atmosphere. Such releases are modeled as area sources in air quality modeling research. None of the currently available dispersion models incorporates the particle physical characteristics and meteorological conditions needed for modeling the dispersion and deposition of particulates emitted from such area sources. This knowledge gap was addressed by developing advanced analytical and numerical methods for modeling the dispersion of particulate matter. The development, application, and evaluation of the new dispersion modeling methods are discussed in detail in this dissertation. In the analytical modeling, a ground-level area-source analytical dispersion model known as particulate matter deposition (PMD) was developed for predicting the concentrations of different particle sizes. Both the particle dynamics (particle physical characteristics) and the meteorological conditions, which have a significant effect on the dispersion of particulates, were incorporated in the PMD model using formulations of particle gravitational settling and dry deposition velocities. The modeled particle size concentrations of the PMD model were evaluated statistically after applying it to particulates released from a biosolid-applied agricultural field. The evaluation of the PMD model using the statistical criteria confirmed the effective and successful inclusion of dry deposition theory for modeling particulate matter concentrations. A comprehensive review of analytical area-source dispersion models that do not account for dry deposition and treat pollutants as gases was conducted, and three models were identified -- the Shear, the Parker, and the Smith models. A statistical evaluation of these dispersion models was conducted after applying them to two different field data sets, and the statistical results concluded that
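
    To illustrate how particle physical characteristics typically enter such a formulation, the sketch below combines a Stokes-law gravitational settling velocity with a textbook point-source Gaussian plume whose centerline is tilted by settling. This is a generic construction under stated assumptions, not the PMD area-source model itself, and all inputs are placeholders.

        import numpy as np

        def stokes_settling_velocity(d_p, rho_p, rho_a=1.2, mu=1.8e-5, g=9.81):
            """Terminal settling velocity (m/s) of a small sphere in the Stokes regime."""
            return (rho_p - rho_a) * g * d_p**2 / (18.0 * mu)

        def tilted_plume_conc(Q, u, x, y, z, H, sigma_y, sigma_z, v_s):
            """Concentration for a point source whose plume axis is lowered by settling."""
            h_eff = H - v_s * x / u
            return (Q / (2 * np.pi * u * sigma_y * sigma_z)
                    * np.exp(-y**2 / (2 * sigma_y**2))
                    * np.exp(-(z - h_eff)**2 / (2 * sigma_z**2)))

        v_s = stokes_settling_velocity(d_p=10e-6, rho_p=1500.0)   # 10-micron particle
        c = tilted_plume_conc(Q=1e-3, u=3.0, x=200.0, y=0.0, z=1.5,
                              H=2.0, sigma_y=20.0, sigma_z=10.0, v_s=v_s)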

  13. Numerical tools to validate stationary points of SO(8)-gauged N=8D=4 supergravity

    NASA Astrophysics Data System (ADS)

    Fischbacher, Thomas

    2012-03-01

    Until recently, the preferred strategy to identify stationary points in the scalar potential of SO(8)-gauged N=8 supergravity in D=4 has been to consider truncations of the potential to sub-manifolds of E7(7)/SU(8) that are invariant under some postulated residual gauge group G⊂SO(8). As powerful alternative strategies have been shown to exist that allow one to go far beyond what this method can achieve — and in particular have produced numerous solutions that break the SO(8) gauge group to no continuous residual symmetry — independent verification of results becomes a problem due to both the complexity of the scalar potential and the large number of new solutions. This article introduces a conceptually simple self-contained piece of computer code that allows independent numerical validation of claims on the locations of newly discovered stationary points. Program summary: Program title: e7-vacua. Catalogue identifier: AELB_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELB_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 4447. No. of bytes in distributed program, including test data, etc.: 281 689. Distribution format: tar.gz. Programming language: Python. Computer: Any. Operating system: Unix/Linux. RAM: 1 Gigabyte. Classification: 1.5, 11.1. External routines: Scientific Python (SciPy) (http://www.scipy.org/), NumPy (http://numpy.scipy.org). Nature of problem: This code allows numerical validation of claims about the existence of critical points in the scalar potential of four-dimensional SO(8)-gauged N=8 supergravity. Solution method: Tensor algebra. Running time: Full analysis of a solution (including scalar mass matrices): about 15 minutes. Otherwise, about 1-2 minutes.
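
    The essence of such a validation, checking numerically that the gradient of a scalar potential vanishes at a claimed point, can be sketched generically as follows. The quartic toy potential below is only a stand-in for the far more involved N=8 scalar potential and is not part of the e7-vacua code.

        import numpy as np

        # Generic sketch: validate a claimed stationary point by evaluating the
        # gradient of the potential with central differences and checking that its
        # norm is small. The toy potential is a placeholder, not the N=8 potential.
        def toy_potential(phi):
            return -6.0 + np.sum(phi**2) - 0.25 * np.sum(phi**2) ** 2

        def numerical_gradient(V, phi, eps=1e-6):
            grad = np.zeros_like(phi)
            for i in range(phi.size):
                e = np.zeros_like(phi)
                e[i] = eps
                grad[i] = (V(phi + e) - V(phi - e)) / (2 * eps)
            return grad

        claimed_point = np.array([1.0, 1.0])   # stationary where sum(phi**2) == 2
        g = numerical_gradient(toy_potential, claimed_point)
        print("V =", toy_potential(claimed_point), " |grad V| =", np.linalg.norm(g))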

  14. Numerical arc segmentation algorithm for a radio conference - A software tool for communication satellite systems planning

    NASA Technical Reports Server (NTRS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    A detailed description of the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software package for communication satellite systems planning is presented. This software provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC-88) on the use of the geostationary orbit (GEO) and the planning of space services utilizing it. The features of the NASARC software package are described, and detailed information is given about the function of each of the four NASARC program modules. The results of a sample world scenario are presented and discussed.

  15. Development of a carburizing and quenching simulation tool: Numerical simulations of rings and gears

    SciTech Connect

    Anderson, C.; Goldman, P.; Rangaswamy, P.

    1996-06-24

    This paper describes a calculational procedure using the ABAQUS finite element code that simulates a carburizing and quench heat treat cycle for automotive gears. The procedure features a numerically efficient 2-phase constitutive model to represent transformational plasticity effects for the austenite/martensite transformation together with refined finite element meshes to capture the steep gradients in stress and composition near the gear surfaces. The procedure is illustrated on carburizing and quenching of a thick ring, and comparison of model predictions for distortion, phase distribution, and residual stress with experiment is discussed. Sensitivity of predictions to mesh refinement is studied.

  16. Homogenized Tomographic Models: a Tool for Efficient Numerical Modeling of Seismic Wave Propagation

    NASA Astrophysics Data System (ADS)

    Landes, M.; Capdeville, Y.; Shapiro, N.; Guilbert, J.

    2013-12-01

    Full seismic waveforms are frequently used to characterize details of seismic sources and to discriminate their origin. Prediction of realistic waveforms requires developing algorithms for fast and reliable simulation of seismic wave propagation in 3D models of the Earth. Classical 3D seismic tomographic models often use a layered parameterization. However, computing exact wave propagation in layered models may result in mesh complexity and long computing times. These difficulties become crucial when considering regional scales with operational interests. The aim of our study is to develop a parameterization of seismic tomographic models adapted for efficient numerical modeling of wave propagation within a given frequency range. We use a 'homogenization' approach to construct models smoothed at the scale naturally imposed by the propagation characteristics at the target frequencies. We start by defining a basis of continuous and smooth functions with a Principal Component Analysis based on the statistics of the homogenized CUB2 tomographic model. Then, we invert the surface-wave phase and group velocities deduced from global tomographic maps with a Monte Carlo method to generate smooth depth profiles with a controlled number of unknowns at all grid points. The set of these profiles forms a smooth 3D seismic velocity model designed for numerical wave propagation.
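
    A minimal sketch of the first step, deriving a small, smooth basis for velocity-depth profiles by principal component analysis, is given below; the synthetic profiles are placeholders for the statistics of the homogenized CUB2 model.

        import numpy as np

        # PCA of mean-removed velocity-depth profiles via SVD; the leading right
        # singular vectors serve as a smooth basis with a controlled number of
        # unknowns. Synthetic profiles stand in for the homogenized CUB2 statistics.
        rng = np.random.default_rng(0)
        depths = np.linspace(0.0, 300.0, 60)                      # km (placeholder grid)
        profiles = (4.0 + 0.004 * depths
                    + 0.2 * rng.standard_normal((500, 1)) * np.exp(-depths / 100.0))

        mean_profile = profiles.mean(axis=0)
        U, s, Vt = np.linalg.svd(profiles - mean_profile, full_matrices=False)
        n_modes = 3                                               # number of unknowns kept
        basis = Vt[:n_modes]                                      # smooth basis functions of depth

        # any candidate profile is then parameterised by a few coefficients a:
        a = np.array([0.5, -0.2, 0.1])
        candidate = mean_profile + a @ basis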

  17. FOLDER: A numerical tool to simulate the development of structures in layered media

    NASA Astrophysics Data System (ADS)

    Adamuszek, Marta; Dabrowski, Marcin; Schmid, Daniel W.

    2015-04-01

    FOLDER is a numerical toolbox for modelling deformation in layered media during layer-parallel shortening or extension in two dimensions. FOLDER builds on MILAMIN [1], a finite element method based mechanical solver, with a range of utilities included from the MUTILS package [2]. The numerical mesh is generated using the Triangle software [3]. The toolbox includes features that allow for: 1) designing complex structures such as multi-layer stacks, 2) accurately simulating large-strain deformation of linear and non-linear viscous materials, 3) post-processing of various physical fields such as velocity (total and perturbing), rate of deformation, finite strain, stress, deviatoric stress, pressure, and apparent viscosity. FOLDER is designed to ensure maximum flexibility to configure the model geometry, define material parameters, specify the range of numerical parameters in simulations, and choose the plotting options. FOLDER is an open source MATLAB application and comes with a user-friendly graphical interface. The toolbox additionally comprises an educational application that illustrates various analytical solutions of growth rates calculated for the cases of folding and necking of a single layer with interfaces perturbed with a single sinusoidal waveform. We further derive two novel analytical expressions for the growth rate in the cases of folding and necking of a linear viscous layer embedded in a linear viscous medium of finite thickness. We use FOLDER to test the accuracy of single-layer folding simulations using various 1) spatial and temporal resolutions, 2) time integration schemes, and 3) iterative algorithms for non-linear materials. The accuracy of the numerical results is quantified by: 1) comparing them to analytical solutions, if available, or 2) running convergence tests. As a result, we provide a map of the most optimal choice of grid size, time step, and number of iterations to keep the results of the numerical simulations below a given error for a given time

  18. Advanced material modelling in numerical simulation of primary acetabular press-fit cup stability.

    PubMed

    Souffrant, R; Zietz, C; Fritsche, A; Kluess, D; Mittelmeier, W; Bader, R

    2012-01-01

    Primary stability of artificial acetabular cups, used for total hip arthroplasty, is required for subsequent osteointegration and good long-term clinical results of the implant. Although closed-cell polymer foams represent an adequate bone substitute in experimental studies investigating primary stability, correct numerical modelling of this material depends on the parameter selection. Material parameters necessary for the crushable foam plasticity behaviour were derived from numerical simulations matched with experimental tests of the polymethacrylimide raw material. Experimental primary stability tests of acetabular press-fit cups, consisting of static shell assembly followed by pull-out and lever-out testing, were subsequently simulated using finite element analysis. The identified and optimised parameters allowed the accurate numerical reproduction of the raw material tests. The correlation between the experimental tests and the numerical simulation of primary implant stability depended on the value of the interference fit. However, the validated material model provides the opportunity for subsequent parametric numerical studies. PMID:22817471

  19. Numerical Modeling Tools for the Prediction of Solution Migration Applicable to Mining Site

    SciTech Connect

    Martell, M.; Vaughn, P.

    1999-01-06

    Mining has always had an important influence on the cultures and traditions of communities around the globe and throughout history. Today, because mining legislation places heavy emphasis on environmental protection, there is great interest in having a comprehensive understanding of ancient mining and mining sites. Multi-disciplinary approaches (i.e., Pb isotopes as tracers) are being used to explore the distribution of metals in natural environments. Another successful approach is to model solution migration numerically. A proven method for simulating solution migration in natural rock salt has been applied to project, over 10,000 years, the system performance and solution concentrations surrounding a proposed nuclear waste repository. This capability is readily adaptable to simulating solution migration around mining sites.

  20. Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2015-01-01

    Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. Examined are cost drivers and whether technology investments can dramatically affect the life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is forwarded, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.

  1. A decision support tool for synchronizing technology advances with strategic mission objectives

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.

    1992-01-01

    Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a decision-support data-derived tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize the technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for manned Mars missions, but can be adapted to support other high-technology long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives for making the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.

  2. Community-based participatory research as a tool to advance environmental health sciences.

    PubMed Central

    O'Fallon, Liam R; Dearry, Allen

    2002-01-01

    The past two decades have witnessed a rapid proliferation of community-based participatory research (CBPR) projects. CBPR methodology presents an alternative to traditional population-based biomedical research practices by encouraging active and equal partnerships between community members and academic investigators. The National Institute of Environmental Health Sciences (NIEHS), the premier biomedical research facility for environmental health, is a leader in promoting the use of CBPR in instances where community-university partnerships serve to advance our understanding of environmentally related disease. In this article, the authors highlight six key principles of CBPR and describe how these principles are met within specific NIEHS-supported research investigations. These projects demonstrate that community-based participatory research can be an effective tool to enhance our knowledge of the causes and mechanisms of disorders having an environmental etiology, reduce adverse health outcomes through innovative intervention strategies and policy change, and address the environmental health concerns of community residents. PMID:11929724

  3. Integrated performance and dependability analysis using the advanced design environment prototype tool ADEPT

    SciTech Connect

    Rao, R.; Rahman, A.; Johnson, B.W.

    1995-09-01

    The Advanced Design Environment Prototype Tool (ADEPT) is an evolving integrated design environment which supports both performance and dependability analysis. ADEPT models are constructed using a collection of predefined library elements, called ADEPT modules. Each ADEPT module has an unambiguous mathematical definition in the form of a Colored Petri Net (CPN) and a corresponding Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) description. As a result, both simulation-based and analytical approaches for analysis can be employed. The focus of this paper is on dependability modeling and analysis using ADEPT. We present the simulation based approach to dependability analysis using ADEPT and an approach to integrating ADEPT and the Reliability Estimation System Testbed (REST) engine developed at NASA. We also present analytical techniques to extract the dependability characteristics of a system from the CPN definitions of the modules, in order to generate alternate models such as Markov models and fault trees.

  4. Numerical Model of Flame Spread Over Solids in Microgravity: A Supplementary Tool for Designing a Space Experiment

    NASA Technical Reports Server (NTRS)

    Shih, Hsin-Yi; Tien, James S.; Ferkul, Paul (Technical Monitor)

    2001-01-01

    The recently developed numerical model of concurrent-flow flame spread over thin solids has been used as a simulation tool to help design a space experiment. The two-dimensional and three-dimensional steady forms of the compressible Navier-Stokes equations with chemical reactions are solved. With a coupled multi-dimensional radiative heat transfer solver, the model is capable of answering a number of questions regarding the experiment concept and the hardware design. In this paper, the capabilities of the numerical model are demonstrated by providing guidance on several experimental design issues. The test matrix and operating conditions of the experiment are estimated from the modeling results. Three-dimensional calculations are made to simulate the flame-spreading experiment with a realistic hardware configuration. The computed detailed flame structures provide insight for the data collection. In addition, the heating load and the requirements for product exhaust cleanup in the flow tunnel are estimated with the model. We anticipate that using this simulation tool will enable a more efficient and successful space experiment to be conducted.

  5. Free Radical Addition Polymerization Kinetics without Steady-State Approximations: A Numerical Analysis for the Polymer, Physical, or Advanced Organic Chemistry Course

    ERIC Educational Resources Information Center

    Iler, H. Darrell; Brown, Amber; Landis, Amanda; Schimke, Greg; Peters, George

    2014-01-01

    A numerical analysis of the free radical addition polymerization system is described that provides those teaching polymer, physical, or advanced organic chemistry courses the opportunity to introduce students to numerical methods in the context of a simple but mathematically stiff chemical kinetic system. Numerical analysis can lead students to an…
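
    A hedged sketch of such a stiff kinetic integration, solved without invoking the steady-state approximation, is given below for a standard initiation/propagation/termination scheme using a stiff (BDF) solver; the rate constants and initial concentrations are order-of-magnitude placeholders, not values from the article.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Free-radical addition polymerization without the steady-state approximation:
        # initiator I, monomer M, total radical concentration R. Rate constants are
        # representative order-of-magnitude placeholders.
        kd, f = 1e-5, 0.5      # initiator decomposition rate (1/s), initiator efficiency
        kp, kt = 1e3, 1e7      # propagation, termination rate constants (L/mol/s)

        def rhs(t, y):
            I, M, R = y
            dI = -kd * I
            dM = -kp * M * R
            dR = 2 * f * kd * I - 2 * kt * R**2
            return [dI, dM, dR]

        sol = solve_ivp(rhs, (0.0, 3600.0), [0.01, 1.0, 0.0], method="BDF",
                        rtol=1e-8, atol=1e-14)
        conversion = 1.0 - sol.y[1, -1]   # initial monomer concentration is 1.0 mol/L
        print(f"Monomer conversion after 1 h: {conversion:.3f}")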

  6. GenSAA: A tool for advancing satellite monitoring with graphical expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Luczak, Edward C.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.

  7. Proposing "the burns suite" as a novel simulation tool for advancing the delivery of burns education.

    PubMed

    Sadideen, Hazim; Wilson, David; Moiemen, Naiem; Kneebone, Roger

    2014-01-01

    Educational theory highlights the importance of contextualized simulation for effective learning. We explored this concept in a burns scenario in a novel, low-cost, high-fidelity, portable, immersive simulation environment (referred to as distributed simulation). This contextualized simulation/distributed simulation combination was named "The Burns Suite" (TBS). A pediatric burn resuscitation scenario was selected after high trainee demand. It was designed on Advanced Trauma and Life Support and Emergency Management of Severe Burns principles and refined using expert opinion through cognitive task analysis. TBS contained "realism" props, briefed nurses, and a simulated patient. Novices and experts were recruited. Five-point Likert-type questionnaires were developed for face and content validity. Cronbach's α was calculated for scale reliability. Semistructured interviews captured responses for qualitative thematic analysis allowing for data triangulation. Twelve participants completed TBS scenario. Mean face and content validity ratings were high (4.6 and 4.5, respectively; range, 4-5). The internal consistency of questions was high. Qualitative data analysis revealed that participants felt 1) the experience was "real" and they were "able to behave as if in a real resuscitation environment," and 2) TBS "addressed what Advanced Trauma and Life Support and Emergency Management of Severe Burns didn't" (including the efficacy of incorporating nontechnical skills). TBS provides a novel, effective simulation tool to significantly advance the delivery of burns education. Recreating clinical challenge is crucial to optimize simulation training. This low-cost approach also has major implications for surgical education, particularly during increasing financial austerity. Alternative scenarios and/or procedures can be recreated within TBS, providing a diverse educational immersive simulation experience. PMID:23877145

  8. Advanced numerical methods for three dimensional two-phase flow calculations

    SciTech Connect

    Toumi, I.; Caruge, D.

    1997-07-01

    This paper is devoted to new numerical methods developed for both one- and three-dimensional two-phase flow calculations. These methods are finite volume numerical methods based on the use of approximate Riemann solver concepts to define convective fluxes in terms of mean cell quantities. The first part of the paper presents the numerical method for a one-dimensional hyperbolic two-fluid model including differential terms such as added mass and interface pressure. This numerical solution scheme makes use of the Riemann problem solution to define backward and forward differencing to approximate spatial derivatives. The construction of this approximate Riemann solver uses an extension of Roe's method that has been successfully used to solve the gas dynamic equations. Since the two-fluid model is hyperbolic, this numerical method seems very efficient for the numerical solution of two-phase flow problems. The scheme was applied both to shock tube problems and to standard tests for two-fluid computer codes. The second part describes the numerical method in the three-dimensional case. The authors also discuss some improvements performed to obtain a fully implicit solution method that provides fast-running steady-state calculations. Such a scheme is now implemented in a thermal-hydraulic computer code devoted to 3-D steady-state and transient computations. Some results obtained for Pressurised Water Reactors concerning upper plenum calculations and a steady-state flow in the core with rod bow effect evaluation are presented. In practice, these new numerical methods have proved to be stable on non-staggered grids and capable of generating accurate non-oscillating solutions for two-phase flow calculations.
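
    The flux definition described above can be illustrated in one dimension for a scalar conservation law. The Roe-averaged wave speed and upwinded interface flux below form a generic sketch for Burgers' equation, not the two-fluid scheme of the paper; the grid and initial data are placeholders.

        import numpy as np

        # Roe-type approximate Riemann flux for Burgers' equation u_t + (u^2/2)_x = 0:
        # central flux plus an upwinding term scaled by the Roe-averaged wave speed.
        def roe_flux(uL, uR):
            f = lambda u: 0.5 * u**2
            a = 0.5 * (uL + uR)          # Roe average of the wave speed for Burgers
            return 0.5 * (f(uL) + f(uR)) - 0.5 * np.abs(a) * (uR - uL)

        # explicit finite-volume update on a periodic grid (placeholder setup)
        nx = 200
        dx, dt = 1.0 / nx, 0.0015
        x = (np.arange(nx) + 0.5) * dx
        u = 1.5 + np.sin(2 * np.pi * x)
        for _ in range(100):
            F = roe_flux(u, np.roll(u, -1))           # flux at the right face of each cell
            u = u - dt / dx * (F - np.roll(F, 1))     # conservative update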

  9. Common Analysis Tool Being Developed for Aeropropulsion: The National Cycle Program Within the Numerical Propulsion System Simulation Environment

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia G.

    1999-01-01

    The NASA Lewis Research Center is developing an environment for analyzing and designing aircraft engines: the Numerical Propulsion System Simulation (NPSS). NPSS will integrate multiple disciplines, such as aerodynamics, structure, and heat transfer, and will make use of numerical "zooming" on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS uses the latest computing and communication technologies to capture complex physical processes in a timely, cost-effective manner. The vision of NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Through the NASA/Industry Cooperative Effort agreement, NASA Lewis and industry partners are developing a new engine simulation called the National Cycle Program (NCP). NCP, which is the first step toward NPSS and is its initial framework, supports the aerothermodynamic system simulation process for the full life cycle of an engine. U.S. aircraft and airframe companies recognize NCP as the future industry standard common analysis tool for aeropropulsion system modeling. The estimated potential payoff for NCP is a $50 million/yr savings to industry through improved engineering productivity.

  10. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    PubMed

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model. PMID:20371912

  11. Development of a carburizing and quenching simulation tool: numerical simulations of rings and gears

    SciTech Connect

    Anderson, C.; Goldman, P.; Rangaswamy, P.

    1996-10-01

    The ability to accurately calculate temperatures, stresses, and metallurgical transformations in a single calculation or in a sequence of calculations is the key to prediction of distortion, residual stress, and phase distribution in quench-hardened automotive parts. Successful predictions in turn rely on the adequacy of the input data to the calculational procedure. These data include mechanical and thermal properties of the alloy phases over the range of temperatures and strain rates experienced during the heat treat process, the mathematical description of the transformation kinetics, and the accuracy of the heat transfer boundary conditions. In this presentation we describe a calculational procedure using the ABAQUS finite element code that simulates a carburizing and quench heat treat cycle for automotive gears. The calculational procedure features a numerically efficient 2-phase constitutive model, developed as part of the NCMS Heat Treatment Distortion Prediction program, to represent transformational plasticity effects for the austenite/martensite transformation, together with refined finite element meshes to capture the steep gradients in stress and composition near the gear surfaces. The calculational procedure is illustrated on carburizing and quenching of a thick ring, and comparison of model predictions for distortion, phase distribution, and residual stress with experimental measurements is discussed. Included in this model study is an investigation of the sensitivity of the predictions to mesh refinement.

  12. Numerical arc segmentation algorithm for a radio conference: A software tool for communication satellite systems planning

    NASA Technical Reports Server (NTRS)

    Whyte, W. A.; Heyward, A. O.; Ponchak, D. S.; Spence, R. L.; Zuzek, J. E.

    1988-01-01

    The Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) provides a method of generating predetermined arc segments for use in the development of an allotment planning procedure to be carried out at the 1988 World Administrative Radio Conference (WARC) on the Use of the Geostationary Satellite Orbit and the Planning of Space Services Utilizing It. Through careful selection of the predetermined arc (PDA) for each administration, flexibility can be increased in terms of choice of system technical characteristics and specific orbit location while reducing the need for coordination among administrations. The NASARC software determines pairwise compatibility between all possible service areas at discrete arc locations. NASARC then exhaustively enumerates groups of administrations whose satellites can be closely located in orbit, and finds the arc segment over which each such compatible group exists. From the set of all possible compatible groupings, groups and their associated arc segments are selected using a heuristic procedure such that a PDA is identified for each administration. Various aspects of the NASARC concept and how the software accomplishes specific features of allotment planning are discussed.

  13. CRPropa: A numerical tool for the propagation of UHE cosmic rays, γ-rays and neutrinos

    NASA Astrophysics Data System (ADS)

    Armengaud, Eric; Sigl, Günter; Beau, Tristan; Miniati, Francesco

    2007-12-01

    To understand the origin of ultra-high energy cosmic rays (UHECRs, defined to be above 10^18 eV), their propagation in the Universe must be modeled in a realistic way. UHECRs can interact with low energy radio, microwave, infrared and optical photons to produce electron/positron pairs or pions. The latter decay and give rise to neutrinos and electromagnetic cascades extending down to MeV energies. In addition, deflections in cosmic magnetic fields can influence the spectrum and sky distribution of primary cosmic rays and, due to the increased propagation path length, the secondary neutrino and γ-ray fluxes. Neutrino, γ-ray and cosmic ray physics and extra-galactic magnetic fields are, therefore, strongly linked subjects and should be considered together in order to extract maximal information from existing and future data, like that expected from the Auger Observatory. For that purpose, we have developed CRPropa, a publicly available numerical package which takes into account interactions and deflections of primary UHECRs as well as propagation of secondary electromagnetic cascades and neutrinos. CRPropa allows one to compute the observable properties of UHECRs and their secondaries in a variety of models for the sources and propagation of these particles. Here we present the physical processes taken into account as well as benchmark examples; a detailed documentation of the code can be found on our web site.
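
    One ingredient of the magnetic deflections mentioned above, the Larmor radius of a UHECR in an extragalactic field, can be estimated in a few lines. This is a back-of-the-envelope sketch; the energy, charge, and field strength below are illustrative values, not CRPropa inputs.

        # Larmor radius r_L = E / (Z * e * B * c) for an ultra-relativistic particle.
        E_eV = 1.0e19            # proton energy in eV (placeholder)
        Z = 1                    # charge number
        B_gauss = 1.0e-9         # extragalactic field of 1 nG (placeholder)

        e = 1.602176634e-19      # elementary charge, C
        c = 2.99792458e8         # speed of light, m/s
        Mpc = 3.0857e22          # metres per megaparsec

        E_joule = E_eV * e
        B_tesla = B_gauss * 1e-4
        r_L = E_joule / (Z * e * B_tesla * c)
        print(f"Larmor radius: {r_L / Mpc:.1f} Mpc")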

  14. Numerical simulation as an important tool in developing novel hypersonic technologies

    NASA Astrophysics Data System (ADS)

    Bocharov, A. N.; Balakirev, B. A.; Bityurin, V. A.; Gryaznov, V. K.; Golovin, N. N.; Iosilevskiy, I. L.; Evstigneev, N. M.; Medin, S. A.; Naumov, N. D.; Petrovskiy, V. P.; Ryabkov, O. I.; Solomonov, Yu S.; Tatarinov, A. V.; Teplyakov, I. O.; Tikhonov, A. A.; Fortov, V. E.

    2015-11-01

    Development of novel hypersonic technologies necessarily requires the development of methods for analyzing the motion of hypervelocity vehicles. This paper can be considered the initial stage in the development of a complex computational model for studying flows around hypervelocity vehicles of arbitrary shape. An essential part of the model is the solution of the three-dimensional transport equations for mass, momentum, and energy for the medium in both LTE (local thermodynamic equilibrium) and non-LTE states. One of the primary requirements for the developed model is that it run on modern heterogeneous computer systems including both CPUs and GPUs. The paper presents the first results of numerical simulation of hypersonic flow. The first problem considered is three-dimensional flow around a curved body at an angle of attack. The performance of a heterogeneous 4-GPU computer system is tested. The second problem highlights the capabilities of the developed model for studying heat and mass transfer problems. Namely, an interior heating problem is considered which takes into account ablation of the thermal protection system and variation of the surface shape of the vehicle.

  15. TOPLHA: an accurate and efficient numerical tool for analysis and design of LH antennas

    NASA Astrophysics Data System (ADS)

    Milanesio, D.; Lancellotti, V.; Meneghini, O.; Maggiora, R.; Vecchi, G.; Bilato, R.

    2007-09-01

    Auxiliary ICRF heating systems in tokamaks often involve large complex antennas, made up of several conducting straps hosted in distinct cavities that open towards the plasma. The same holds especially true in the LH regime, wherein the antennas consist of arrays of many phased waveguides. Upon observing that the various cavities or waveguides couple to each other only through the EM fields existing over the plasma-facing apertures, we self-consistently formulated the EM problem by a convenient set of multiple coupled integral equations. Subsequent application of the Method of Moments yields a highly sparse algebraic system; therefore formal inversion of the system matrix happens to be not so memory demanding, even though the number of unknowns may be quite large (typically 10^5 or so). The overall strategy has been implemented in an enhanced version of TOPICA (Torino Polytechnic Ion Cyclotron Antenna) and in a newly developed code named TOPLHA (Torino Polytechnic Lower Hybrid Antenna). Both are simulation and prediction tools for plasma-facing antennas that incorporate commercial-grade 3D graphic interfaces along with an accurate description of the plasma. In this work we present the newly proposed formulation along with examples of application to real-life large LH antenna systems.

  16. Groundwater numerical modeling as a complementary tool for designing hydraulic structures

    NASA Astrophysics Data System (ADS)

    Lanubile, R.; Zanini, A.

    2013-12-01

    The city of Parma (Italy) is characterized by the junction of two small rivers: the Parma and the Baganza. Since 2004, the city has been served by a flood control reservoir on the Parma River, with the aim of mitigating flood risk in urban areas. Recently, in order to increase the safety of the city, a new flood control reservoir on the Baganza River has been planned. The first study carried out defined the reservoir location and geometry, the storage area (1200 m x 700 m) and volume, and the maximum acceptable head stage inside the basin. The reservoir consists of a main structure that limits the flow rate downstream, 1700 m of levees, and three check dams upstream. These allow the river bed and the storage area to be lowered, with the aim of increasing the storage volume while limiting the elevation of the levees. Moreover, in order to avoid piping, grout walls below the main structure and the levees have been planned. During the last year, the aquifer beneath and surrounding the study area has been investigated by means of 18 boreholes, 14 monitoring wells, geoelectrical and geophysical surveys, and several pumping tests. The head levels inside the wells have been monitored in order to evaluate the seasonal fluctuations and the influence of the river on groundwater. The local stratigraphy can be simplified as: 0-28 m gravel-sand with a succession of thin clay lenses, 28-35 m clay, and 35-50 m gravel-sand. The monitoring wells have allowed two different water tables to be identified, demonstrating the existence of two aquifers: a phreatic one (0-28 m) connected to the river stage and a confined one (35-50 m). The phreatic aquifer extends over a wide region that covers not only the reservoir location but also a residential and an agricultural area; for this reason, great attention has been paid to the wells used for human activities and especially for irrigation. A numerical model of the aquifer has been developed by means of MODFLOW 2000. All available data
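
    A strongly simplified analogue of what such a groundwater model computes, the steady-state head distribution in a homogeneous aquifer, can be sketched by iterating Laplace's equation on a grid. This toy Jacobi solver is not MODFLOW, and the boundary heads (a river stage on one side, a regional head on the other) are placeholders.

        import numpy as np

        # Steady-state hydraulic head for a homogeneous confined aquifer (Laplace's
        # equation) solved by Jacobi iteration; boundary values are placeholders.
        ny, nx = 60, 120
        h = np.full((ny, nx), 55.0)
        h[:, 0] = 57.0      # river stage on the left boundary (m)
        h[:, -1] = 54.0     # regional head on the right boundary (m)

        for _ in range(5000):
            h[1:-1, 1:-1] = 0.25 * (h[1:-1, 2:] + h[1:-1, :-2] +
                                    h[2:, 1:-1] + h[:-2, 1:-1])
            h[0, 1:-1] = h[1, 1:-1]      # no-flow boundary at the top
            h[-1, 1:-1] = h[-2, 1:-1]    # no-flow boundary at the bottom

        print("Head range (m):", float(h.min()), float(h.max()))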

  17. Analysis of the Source Physics Experiment SPE4 Prime Using State-of-the-Art Parallel Numerical Tools.

    NASA Astrophysics Data System (ADS)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2015-12-01

    This work describes a methodology used for large-scale modeling of wave propagation from underground chemical explosions conducted in fractured granitic rock at the Nevada National Security Site (NNSS). We show that the discrete nature of rock masses as well as the spatial variability of the fabric of rock properties are very important for understanding ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface, we integrated the geological, geomechanical, and geophysical characterizations conducted during recent tests at the NNSS as well as historical data from the characterization performed during the underground nuclear tests conducted at the NNSS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key geologic features specific to fractured media, mainly the joints characterized at the NNSS. We have also explored features common to both geological environments, such as saturation and topography, and assessed which characteristics most affect the ground motion in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented in LLNL's Geodyn-L hydrocode. Simulations were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges for the recently executed SPE4 prime experiment. We have also conducted a comparative study between SPE4 prime and the previous experiments SPE1 and SPE3 to assess similarities and differences and draw conclusions for designing SPE5.

  18. Recent advances in numerical simulation and control of asymmetric flows around slender bodies

    NASA Technical Reports Server (NTRS)

    Kandil, Osama A.; Wong, Tin-Chee; Sharaf, Hazem H.; Liu, C. H.

    1992-01-01

    The problems of asymmetric flow around slender bodies and its control are formulated using the unsteady, compressible, thin-layer or full Navier-Stokes equations, which are solved using an implicit, flux-difference splitting, finite-volume scheme. The problem is numerically simulated for both locally-conical and three-dimensional flows. The numerical applications include studies of the effects of relative incidence, Mach number, and Reynolds number on the flow asymmetry. For the control of flow asymmetry, the numerical simulations cover passive and active control methods. For passive control, the effectiveness of vertical fins placed in the leeward plane of geometric symmetry and of side strakes with different orientations is studied. For active control, the effectiveness of normal and tangential flow injection and surface heating, and a combination of these methods, is studied.

  19. Springback Simulation: Impact of Some Advanced Constitutive Models and Numerical Parameters

    NASA Astrophysics Data System (ADS)

    Haddag, Badis; Balan, Tudor; Abed-Meraim, Farid

    2005-08-01

    The impact of material models on the numerical simulation of springback is investigated. The study is focused on the strain-path sensitivity of two hardening models. While both models predict the Bauschinger effect, their responses in the transient zone after a strain-path change are fairly different. Their respective predictions are compared in terms of sequential test response and of strip-drawing springback. For this purpose, an accurate and general time integration algorithm has been developed and implemented in the Abaqus code. The impact of several numerical parameters is also studied in order to assess the overall accuracy of the finite element prediction. For some test geometries, both material and numerical parameters are shown to clearly influence the springback behavior to a large extent. Moreover, a general trend cannot always be extracted, thus justifying the need for finite element simulation of the stamping process.
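
    The Bauschinger effect that both hardening models capture can be illustrated with a one-dimensional elastic-plastic update using linear kinematic hardening and an elastic-predictor/return-mapping step. This generic sketch is not either of the constitutive models compared in the paper, and the material parameters are placeholders.

        import numpy as np

        # 1-D return mapping with linear kinematic hardening: after a strain reversal,
        # re-yielding starts at a lower stress magnitude (Bauschinger-type response).
        E, H, sig_y = 200e3, 10e3, 250.0   # MPa: elastic modulus, hardening, yield

        def update(stress, alpha, d_eps):
            trial = stress + E * d_eps      # elastic predictor
            xi = trial - alpha              # relative (shifted) stress
            f = abs(xi) - sig_y             # yield function
            if f <= 0.0:
                return trial, alpha         # elastic step
            dgamma = f / (E + H)            # plastic corrector
            n = np.sign(xi)
            return trial - E * dgamma * n, alpha + H * dgamma * n

        stress, alpha, history = 0.0, 0.0, []
        path = np.concatenate([np.full(100, 2e-4), np.full(150, -2e-4)])  # load, then reverse
        for d_eps in path:
            stress, alpha = update(stress, alpha, d_eps)
            history.append(stress)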

  20. A review of recent advances in numerical simulations of microscale fuel processor for hydrogen production

    NASA Astrophysics Data System (ADS)

    Holladay, J. D.; Wang, Y.

    2015-05-01

    Microscale (<5 W) reformers for hydrogen production have been investigated for over a decade. These devices are intended to provide hydrogen for small fuel cells. Due to the reformers' small size, numerical simulations are critical to understand the heat and mass transfer phenomena occurring in the systems and to help guide further improvements. This paper reviews the development of the numerical codes and details the reaction equations used. The majority of the devices utilized methanol as the fuel due to methanol's low reforming temperature and high conversion, although there are several methane-fueled systems. Increased computational power and more complex codes have led to improved accuracy of numerical simulations. Initial models focused on the reformer, while more recently the simulations have begun to include other unit operations such as vaporizers, inlet manifolds, and combustors. These codes are critical for developing the next generation of systems. The systems reviewed include plate reactors, microchannel reactors, and annulus reactors for both wash-coated and packed-bed systems.

  1. TOPICA: an accurate and efficient numerical tool for analysis and design of ICRF antennas

    NASA Astrophysics Data System (ADS)

    Lancellotti, V.; Milanesio, D.; Maggiora, R.; Vecchi, G.; Kyrytsya, V.

    2006-07-01

    The demand for a predictive tool to help in designing ion-cyclotron radio frequency (ICRF) antenna systems for today's fusion experiments has driven the development of codes such as ICANT, RANT3D, and the early development of TOPICA (TOrino Polytechnic Ion Cyclotron Antenna) code. This paper describes the substantive evolution of TOPICA formulation and implementation that presently allow it to handle the actual geometry of ICRF antennas (with curved, solid straps, a general-shape housing, Faraday screen, etc) as well as an accurate plasma description, accounting for density and temperature profiles and finite Larmor radius effects. The antenna is assumed to be housed in a recess-like enclosure. Both goals have been attained by formally separating the problem into two parts: the vacuum region around the antenna and the plasma region inside the toroidal chamber. Field continuity and boundary conditions allow the formulation of a set of two coupled integral equations for the unknown equivalent (current) sources; then the equations are reduced to a linear system by a method of moments solution scheme employing 2D finite elements defined over a 3D non-planar surface triangular-cell mesh. In the vacuum region calculations are done in the spatial (configuration) domain, whereas in the plasma region a spectral (wavenumber) representation of fields and currents is adopted, thus permitting a description of the plasma by a surface impedance matrix. Owing to this approach, any plasma model can be used in principle, and at present the FELICE code has been employed. The natural outcomes of TOPICA are the induced currents on the conductors (antenna, housing, etc) and the electric field in front of the plasma, whence the antenna circuit parameters (impedance/scattering matrices), the radiated power and the fields (at locations other than the chamber aperture) are then obtained. An accurate model of the feeding coaxial lines is also included. The theoretical model and its TOPICA

  2. Sea Surface Salinity spectra: a validation tool for satellite, numerical simulations and in-situ data

    NASA Astrophysics Data System (ADS)

    Hoareau, Nina; Portabella, Marcos; García Ladona, Emilio; Turiel, Antonio; Ballabrera, Joaquim

    2014-05-01

    Satellite remote sensing measurements have been used in oceanography since the mid-1970s. Thanks to satellite imagery, the research community has been able to better interpret surface structures, such as meandering fronts or eddies, which became apparent in instantaneous views of the ocean. Moreover, satellite altimeter and sea surface temperature (SST) observations evidenced the high percentage of ocean energy accumulated at the intermediate scales (tens to hundreds of km, days-weeks), i.e., the oceanic mesoscale. Today, thanks to the launch of the Soil Moisture and Ocean Salinity (SMOS) mission (2009) and the Aquarius mission (2011), we have more than four years of satellite-derived Sea Surface Salinity (SSS) observations with the objectives of improving seasonal and interannual climate prediction, ocean rainfall estimates and hydrologic budgets, and monitoring large-scale salinity events and thermohaline convection (Lagerloef, 2001). A study from Reynolds and Chelton (2010) compared six different SST products using spatial power density spectra in three regions of the ocean at different periods (January and July 2007-2008). The results showed that the spatial spectra vary geographically and temporally, and from one product to the next. Here, a similar study is presented for the first time with SSS data to help understand the spatial signature of the SSS variability and validate the different data sources. Thanks to the increased maturity of remote sensing estimations of SSS, the spatial spectra of the SSS fields provided by numerical models can now be compared with observations. In this work, we focus on the region of the North Atlantic Ocean for the months of January and July of 2011 and 2012. The data used in this work come from Satellites (AQUARIUS and/or SMOS Level 2), outputs of an ocean model (NEMO-OPA, configuration DRAKKAR-NATL025), in-situ observations collected during the Barcelona World Race (BWR 2010), and the climatology of Levitus (WOA09). The results show that
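
    As an illustration of the spectral diagnostic described above, the following sketch computes a spatial power density spectrum from a one-dimensional salinity transect with Welch's method; the transect, sampling interval and noise level are synthetic placeholders, not SMOS, Aquarius or NEMO data.

      import numpy as np
      from scipy.signal import welch

      # Hypothetical 1D sea-surface-salinity transect sampled every 25 km
      dx_km = 25.0
      x = np.arange(0.0, 5000.0, dx_km)                      # ~5000 km track
      rng = np.random.default_rng(0)
      sss = 35.0 + 0.3 * np.sin(2.0 * np.pi * x / 1000.0)    # large-scale signal
      sss += 0.05 * rng.standard_normal(x.size)              # small-scale variability

      # Spatial power spectral density (variance per cycle/km), Welch-averaged
      f_cpkm, psd = welch(sss - sss.mean(), fs=1.0 / dx_km, nperseg=64)

      for f, p in zip(f_cpkm[1:6], psd[1:6]):
          print(f"wavelength {1.0 / f:7.0f} km   PSD {p:.3e} psu^2 km")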

  3. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    ERIC Educational Resources Information Center

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  4. A Simple Tool to Predict ESRD Within 1 Year in Elderly Patients with Advanced CKD

    PubMed Central

    Drawz, Paul E.; Goswami, Puja; Azem, Reem; Babineau, Denise C.; Rahman, Mahboob

    2013-01-01

    BACKGROUND/OBJECTIVES: Chronic kidney disease (CKD) is common in older patients; currently, no tools are available to predict the risk of end-stage renal disease (ESRD) within 1 year. The goal of this study was to develop and validate a model to predict the 1-year risk for ESRD in elderly subjects with advanced CKD. DESIGN: Retrospective study. SETTING: Veterans Affairs Medical Center. PARTICIPANTS: Patients over 65 years of age with CKD and an estimated glomerular filtration rate (eGFR) less than 30 mL/min/1.73 m². MEASUREMENTS: The outcome was ESRD within 1 year of the index eGFR. Cox regression was used to develop a predictive model (VA risk score) which was validated in a separate cohort. RESULTS: Of the 1,866 patients in the developmental cohort, 77 developed ESRD. Risk factors for ESRD in the final model were age, congestive heart failure, systolic blood pressure, eGFR, potassium, and albumin. In the validation cohort, the C-index for the VA risk score was 0.823. The risk for developing ESRD at 1 year from lowest to highest tertile was 0.08%, 2.7%, and 11.3% (P<0.001). The C-index for the recently published Tangri model in the validation cohort was 0.780. CONCLUSION: A new model using commonly available clinical measures shows excellent ability to predict the onset of ESRD within the next year in elderly subjects. Additionally, the Tangri model had very good predictive ability. Patients and physicians can use these risk models to inform decisions regarding preparation for renal replacement therapy in patients with advanced CKD. PMID:23617782
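
    The concordance statistic (C-index) quoted above can be illustrated with a short sketch; the risk score, coefficients and outcomes below are synthetic and are not the published VA risk score.

      import numpy as np

      # Synthetic 1-year ESRD outcomes and a hypothetical risk score built from two of
      # the predictors named above (eGFR, albumin); coefficients and data are invented.
      rng = np.random.default_rng(1)
      n = 500
      egfr = rng.uniform(8.0, 30.0, n)            # mL/min/1.73 m^2
      albumin = rng.uniform(2.5, 4.5, n)          # g/dL
      risk_score = 0.15 * (30.0 - egfr) + 0.8 * (4.0 - albumin)
      p_event = 1.0 / (1.0 + np.exp(-(risk_score - 3.0)))
      esrd = rng.random(n) < p_event              # True = ESRD within 1 year

      def c_index(score, event):
          """For every (event, non-event) pair, count how often the event case scored higher."""
          pos, neg = score[event], score[~event]
          diff = pos[:, None] - neg[None, :]
          return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

      print(f"C-index = {c_index(risk_score, esrd):.3f}")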

  5. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative has been conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio from the period of August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year by year basis. All publications resulting from the project are listed at the end of this report.

  6. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  7. Advanced REACH Tool: development and application of the substance emission potential modifying factor.

    PubMed

    van Tongeren, Martie; Fransman, Wouter; Spankie, Sally; Tischer, Martin; Brouwer, Derk; Schinkel, Jody; Cherrie, John W; Tielemans, Erik

    2011-11-01

    The Advanced REACH Tool (ART) is an exposure assessment tool that combines mechanistically modelled inhalation exposure estimates with available exposure data using a Bayesian approach. The mechanistic model is based on nine independent principal modifying factors (MF). One of these MF is the substance emission potential, which addresses the intrinsic substance properties as determinants of the emission from a source. This paper describes the current knowledge and evidence on intrinsic characteristics of solids and liquids that determine the potential for their release into workplace air. The principal factor determining the release of aerosols from handling or processing powdered, granular, or pelletized materials is the dustiness of the material, as well as the weight fraction of the substance of interest in the powder and the moisture content. The partial vapour pressure is the main intrinsic factor determining the substance emission potential for emission of vapours. For generation of mist, the substance emission potential is determined by the viscosity of the liquid as well as the weight fraction of the substance of interest in the liquid. Within ART release of vapours is considered for substances with a partial vapour pressure at the process temperature of 10 Pa or more, while mist formation is considered for substances with a vapour pressure ≤ 10 Pa. Relative multipliers are assigned for most of the intrinsic factors, with the exception of the weight fraction and the vapour pressure, which is applied as a continuous variable in the estimation of the substance emission potential. Currently, estimation of substance emission potential is not available for fumes, fibres, and gases. The substance emission potential takes account of the latest thinking on emissions of dusts, mists, and vapours and in our view provides a good balance between theory and pragmatism. Expanding the knowledge base on substance emission potential will improve the predictive power of
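
    A toy sketch of the branching logic described above (vapour versus mist selected on the 10 Pa vapour-pressure threshold, with the weight fraction as a continuous factor) is given below; the multiplier values are invented for illustration and are not the calibrated ART values.

      def liquid_emission_potential(vapour_pressure_pa, weight_fraction, viscosity_class="medium"):
          """Toy substance-emission-potential score for liquids.  Follows the logic in the
          abstract (vapour branch above 10 Pa partial vapour pressure at process temperature,
          mist branch otherwise); the numeric multipliers are hypothetical, not ART values."""
          if vapour_pressure_pa > 10.0:
              # vapour branch: vapour pressure enters as a continuous variable
              base = min(vapour_pressure_pa / 30000.0, 1.0)   # normalised to a very volatile liquid
          else:
              # mist branch: driven mainly by the liquid's viscosity, not its volatility
              base = {"low": 1.0, "medium": 0.3, "high": 0.1}[viscosity_class]
          return base * weight_fraction                        # weight fraction is continuous

      print(liquid_emission_potential(2300.0, 0.5))            # volatile solvent, 50% of the mixture
      print(liquid_emission_potential(1.0, 0.8, "high"))       # viscous, low-volatility liquid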

  8. Monitoring of seismic time-series with advanced parallel computational tools and complex networks

    NASA Astrophysics Data System (ADS)

    Kechaidou, M.; Sirakoulis, G. Ch.; Scordilis, E. M.

    2012-04-01

    Earthquakes have been a focus of human and research interest for several centuries because of their catastrophic effects on everyday life; they occur almost all over the world and exhibit behaviour that is unpredictable and hard to model. At the same time, their monitoring with increasingly advanced instruments has been nearly continuous, and several mathematical models have been proposed to describe possible connections and patterns found in the resulting seismological time-series. In Greece in particular, one of the most seismically active territories on Earth, detailed instrumental seismological data have been available since the beginning of the past century, providing researchers with valuable knowledge about seismicity levels all over the country. With powerful parallel computational tools such as Cellular Automata, these data can be further analysed and, most importantly, modelled to reveal possible connections between different parameters of the seismic time-series under study. More specifically, Cellular Automata have proven very effective for modelling nonlinear complex systems, and several corresponding models have been advanced as possible analogues of earthquake fault dynamics. In this work, preliminary results of modelling the seismic time-series with the help of Cellular Automata, so as to compose and develop the corresponding complex networks, are presented. The proposed methodology is intended to reveal hidden relations in the examined time-series and to distinguish their intrinsic characteristics, transforming the time-series into complex networks and graphically representing their evolution in time and space. Consequently, based on the presented results, the proposed model will eventually serve as a possible efficient flexible computational tool to provide a generic

  9. A numerical technique for calculation of the noise of high-speed propellers with advanced blade geometry

    NASA Technical Reports Server (NTRS)

    Nystrom, P. A.; Farassat, F.

    1980-01-01

    A numerical technique and computer program were developed for the prediction of the noise of propellers with advanced geometry. The blade upper and lower surfaces are described by a curvilinear coordinate system, which was also used to divide the blade surfaces into panels. Two different acoustic formulations in the time domain were used to improve the speed and efficiency of the noise calculations: an acoustic formulation with the Doppler factor singularity for panels moving at subsonic speeds and the collapsing sphere formulation for panels moving at transonic or supersonic speeds. This second formulation involves a sphere which is centered at the observer position and whose radius decreases at the speed of sound. The acoustic equation consisted of integrals over the curve of intersection for both the sphere and the panels on the blade. Algorithms used in some parts of the computer program are discussed. Comparisons with measured acoustic data for two model high speed propellers with advanced geometry are also presented.

  10. Development and Experimental Validation of a Numerical Tool for Structural Health and Usage Monitoring Systems Based on Chirped Grating Sensors

    PubMed Central

    Bettini, Paolo; Guerreschi, Erika; Sala, Giuseppe

    2015-01-01

    The interest of the aerospace industries in structural health and usage monitoring systems is continuously increasing. Among the techniques available in the literature, those based on Fibre Bragg Grating sensors are particularly promising thanks to their peculiarities. Different Chirped Bragg Grating sensor configurations have been investigated in this paper. Starting from a numerical model capable of simulating the spectral response of a grating subjected to a generic strain profile (direct problem), a new code has been developed that allows strain reconstruction (inverse problem); experimental validation of the program was carried out through different loading cases applied to a chirped grating. The wavelength of the reflection spectrum for a chirped FBG has a one-to-one correspondence to the position along the gauge section, thus allowing strain reconstruction over the entire sensor length. Tests conducted on chirped FBGs also evidenced their potential for SHM applications, if coupled with appropriate numerical strain reconstruction tools. Finally, a new class of sensors—Draw Tower Grating arrays—has been studied. These sensors are applicable to distributed sensing and load reconstruction over large structures, thanks to their greater length. Three configurations have been evaluated, having different spatial and spectral characteristics, in order to explore possible applications of such sensors to SHM systems. PMID:25587979
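
    The one-to-one wavelength-to-position property can be sketched as follows: for a linearly chirped grating, the shift of each reflected wavelength gives the local strain through the usual photo-elastic relation Δλ/λ = (1 − p_e)·ε. The chirp range, photo-elastic coefficient and shift profile below are hypothetical.

      import numpy as np

      # Hypothetical linearly chirped FBG: the design (unstrained) Bragg wavelength varies
      # linearly along the 50 mm gauge length, so wavelength <-> position is one-to-one.
      L = 0.050                                          # gauge length [m]
      z = np.linspace(0.0, L, 101)                       # positions along the grating
      lam0 = 1540e-9 + (1560e-9 - 1540e-9) * z / L       # design Bragg wavelength [m]
      p_e = 0.22                                         # effective photo-elastic coefficient

      # Hypothetical measured local Bragg-wavelength shift (e.g. from a bending load)
      dlam = 0.4e-9 * np.sin(np.pi * z / L)

      # Local strain reconstruction: dlam / lam0 = (1 - p_e) * strain
      strain = dlam / (lam0 * (1.0 - p_e))
      print(f"peak reconstructed strain: {strain.max() * 1e6:.0f} microstrain")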

  11. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Creech, Dennis M.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2012-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent go-to group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  12. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Threet, Grady E., Jr.; Phillips, Alan

    2013-01-01

    The Earth-to-Orbit Team (ETO) of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the pre-eminent "go-to" group for pre-phase A and phase A concept definition. Over the past several years the ETO team has evaluated thousands of launch vehicle concept variations for a significant number of studies including agency-wide efforts such as the Exploration Systems Architecture Study (ESAS), Constellation, Heavy Lift Launch Vehicle (HLLV), Augustine Report, Heavy Lift Propulsion Technology (HLPT), Human Exploration Framework Team (HEFT), and Space Launch System (SLS). The ACO ETO Team is called upon to address many needs in NASA's design community; some of these are defining extremely large trade-spaces, evaluating advanced technology concepts which have not been addressed by a large majority of the aerospace community, and the rapid turn-around of highly time critical actions. It is the time critical actions, those often limited by schedule or little advanced warning, that have forced the five member ETO team to develop a design process robust enough to handle their current output level in order to meet their customer's needs. Based on the number of vehicle concepts evaluated over the past year this output level averages to four completed vehicle concepts per day. Each of these completed vehicle concepts includes a full mass breakdown of the vehicle to a tertiary level of subsystem components and a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta v capability. A structural analysis of the vehicle to determine flight loads based on the trajectory output, material properties, and geometry of the concept is also performed. Due to working in this fast-paced and sometimes rapidly changing environment, the ETO Team has developed a finely tuned process to maximize their delivery capabilities. The objective of this paper is to describe the interfaces

  13. Numerical assessment of radiation binary targeted therapy for HER-2 positive breast cancers: advanced calculations and radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Sztejnberg Gonçalves-Carralves, Manuel L.; Jevremovic, Tatjana

    2007-07-01

    In our previous publication (Mundy et al 2006 Phys. Med. Biol. 51 1377) we described the theoretical assessment of our novel approach in radiation binary targeted therapy for HER-2 positive breast cancers and summarized the future directions in this area of research. In this paper we advance the numerical analysis to show the detailed radiation dose distribution for various neutron sources in combination with the required boron concentration and allowed radiation skin doses. We once again proved the feasibility of the concept and will use these data and conclusions to begin the experimental verifications.

  14. Illuminating a black box - determination of rates of reactive transport by combining numerical tools with optimized experiments

    NASA Astrophysics Data System (ADS)

    Wehrer, Dr; Totsche, Dr

    2009-04-01

    Only the combination of physical models and experiments can elucidate the processes of reactive transport in porous media. Column scale experiments offer a great opportunity to identify and quantify processes of reactive transport. In contrast to batch experiments, approximately natural flow dynamics can be realized. However, due to the complexity of interactions and the wide range of parameters, the experiment can be insensitive to the process of interest and misinterpretation of the results is likely. In the proposed talk we want to give examples of how numerical tools can be applied for thorough planning and evaluation of experiments. In a first phase, we performed systematic numerical experiments to optimize the experimental conditions, which allow the quantification of (de-)sorption kinetics under percolation conditions. For short-term column experiments we found that flow interruptions along with two different flow velocities can be applied to avoid uniqueness problems with respect to identification of partitioning coefficient and mass transfer rate. By a sensitivity analysis the parameter space was divided into regions where physically reasonable parameter estimates can be expected and where equifinal solutions are likely. In a second phase we conducted column experiments to test this optimized experimental design for its suitability for the identification and quantification of rate-limited contaminant release. We used materials polluted with organic and inorganic contaminants originating from different soils, sites and materials (coke oven sites, abandoned industrial sites, destruction debris, municipal waste incineration ash). Repacked soil columns were percolated under saturated and unsaturated conditions and were subjected to multiple flow interruptions and different flow velocities. The third phase consisted of data evaluation and process quantification applying numerical inversion of a physical transport model. The parameter sets were evaluated

  15. A review of recent advances of numerical simulations of microscale fuel processors for hydrogen production

    SciTech Connect

    Holladay, Jamelyn D.; Wang, Yong

    2015-05-01

    Microscale (<5W) reformers for hydrogen production have been investigated for over a decade. These devices are intended to provide hydrogen for small fuel cells. Due to the reformer’s small size, numerical simulations are critical to understand heat and mass transfer phenomena occurring in the systems. This paper reviews the development of the numerical codes and details the reaction equations used. The majority of the devices utilized methanol as the fuel due to methanol’s low reforming temperature and high conversion, although there are several methane-fueled systems. As computational power has decreased in cost and increased in availability, the codes have increased in complexity and accuracy. Initial models focused on the reformer, while more recently, the simulations began including other unit operations such as vaporizers, inlet manifolds, and combustors. These codes are critical for developing the next generation systems. The systems reviewed included plate reactors, microchannel reactors, and annulus reactors, for both wash-coated and packed bed systems.

  16. Numerical evaluation of longitudinal motions of Wigley hulls advancing in waves by using Bessho form translating-pulsating source Green's function

    NASA Astrophysics Data System (ADS)

    Xiao, Wenbin; Dong, Wencai

    2016-06-01

    In the framework of 3D potential flow theory, the Bessho form translating-pulsating source Green's function in the frequency domain is chosen as the integral kernel in this study, and a hybrid source-and-dipole distribution model of the boundary element method is applied to directly solve the velocity potential for an advancing ship in regular waves. Numerical characteristics of the Green's function show that the contribution of local-flow components to velocity potential is concentrated at the nearby source point area and the wave component dominates the magnitude of velocity potential in the far field. Two kinds of mathematical models, with or without local-flow components taken into account, are adopted to numerically calculate the longitudinal motions of Wigley hulls, which demonstrates the applicability of the translating-pulsating source Green's function method for various ship forms. In addition, the mesh analysis of the discrete surface is carried out from the perspective of ship-form characteristics. The study shows that the longitudinal motion results from the simplified model are somewhat greater than the experimental data in the resonant zone, and the model can be used as an effective tool to predict ship seakeeping properties. However, the translating-pulsating source Green's function method is only appropriate for the qualitative analysis of motion response in waves if the ship geometrical shape fails to satisfy the slender-body assumption.

  17. Advances in the genetic dissection of plant cell walls: tools and resources available in Miscanthus

    PubMed Central

    Slavov, Gancho; Allison, Gordon; Bosch, Maurice

    2013-01-01

    Tropical C4 grasses from the genus Miscanthus are believed to have great potential as biomass crops. However, Miscanthus species are essentially undomesticated, and genetic, molecular and bioinformatics tools are in very early stages of development. Furthermore, similar to other crops targeted as lignocellulosic feedstocks, the efficient utilization of biomass is hampered by our limited knowledge of the structural organization of the plant cell wall and the underlying genetic components that control this organization. The Institute of Biological, Environmental and Rural Sciences (IBERS) has assembled an extensive collection of germplasm for several species of Miscanthus. In addition, an integrated, multidisciplinary research programme at IBERS aims to inform accelerated breeding for biomass productivity and composition, while also generating fundamental knowledge. Here we review recent advances with respect to the genetic characterization of the cell wall in Miscanthus. First, we present a summary of recent and on-going biochemical studies, including prospects and limitations for the development of powerful phenotyping approaches. Second, we review current knowledge about genetic variation for cell wall characteristics of Miscanthus and illustrate how phenotypic data, combined with high-density arrays of single-nucleotide polymorphisms, are being used in genome-wide association studies to generate testable hypotheses and guide biological discovery. Finally, we provide an overview of the current knowledge about the molecular biology of cell wall biosynthesis in Miscanthus and closely related grasses, discuss the key conceptual and technological bottlenecks, and outline the short-term prospects for progress in this field. PMID:23847628

  18. Ares First Stage "Systemology" - Combining Advanced Systems Engineering and Planning Tools to Assure Mission Success

    NASA Technical Reports Server (NTRS)

    Seiler, James; Brasfield, Fred; Cannon, Scott

    2008-01-01

    Ares is an integral part of NASA's Constellation architecture that will provide crew and cargo access to the International Space Station as well as low earth orbit support for lunar missions. Ares replaces the Space Shuttle in the post 2010 time frame. Ares I is an in-line, two-stage rocket topped by the Orion Crew Exploration Vehicle, its service module, and a launch abort system. The Ares I first stage is a single, five-segment reusable solid rocket booster derived from the Space Shuttle Program's reusable solid rocket motor. The Ares second or upper stage is propelled by a J-2X main engine fueled with liquid oxygen and liquid hydrogen. This paper describes the advanced systems engineering and planning tools being utilized for the design, test, and qualification of the Ares I first stage element. Included are descriptions of the current first stage design, the milestone schedule requirements, and the marriage of systems engineering, detailed planning efforts, and roadmapping employed to achieve these goals.

  19. Bioassays as a tool for evaluating advanced oxidation processes in water and wastewater treatment.

    PubMed

    Rizzo, Luigi

    2011-10-01

    Advanced oxidation processes (AOPs) have been widely used in water and wastewater treatment for the removal of organic and inorganic contaminants as well as to improve biodegradability of industrial wastewater. Unfortunately, the partial oxidation of organic contaminants may result in the formation of intermediates more toxic than parent compounds. In order to avoid this drawback, AOPs are expected to be carefully operated and monitored, and toxicity tests have been used to evaluate whether effluent detoxification takes place. In the present work, the effect of AOPs on the toxicity of aqueous solutions of different classes of contaminants as well as actual aqueous matrices is critically reviewed. The dualism toxicity-biodegradability when AOPs are used as a pre-treatment step to improve industrial wastewater biodegradability is also discussed. The main conclusions/remarks include the following: (i) bioassays are a really useful tool to evaluate the dangerousness of AOPs as well as to set up the proper operative conditions, (ii) target organisms for bioassays should be chosen according to the final use of the treated water matrix, (iii) acute toxicity tests may not be suitable to evaluate toxicity in the presence of low/realistic concentrations of target contaminants, so studies on chronic effects should be further developed, (iv) some toxicity tests may not be useful to evaluate biodegradability potential; in this case, more suitable tests should be applied (e.g., activated sludge bioassays, respirometry). PMID:21722938

  20. How Project Management Tools Aid in Association to Advance Collegiate Schools of Business (AACSB) International Maintenance of Accreditation

    ERIC Educational Resources Information Center

    Cann, Cynthia W.; Brumagim, Alan L.

    2008-01-01

    The authors present the case of one business college's use of project management techniques as tools for accomplishing Association to Advance Collegiate Schools of Business (AACSB) International maintenance of accreditation. Using these techniques provides an efficient and effective method of organizing maintenance efforts. In addition, using…

  1. Recent advances in numerical simulation of space-plasma-physics problems

    NASA Technical Reports Server (NTRS)

    Birmingham, T. J.

    1983-01-01

    Computer simulations have become an increasingly popular, important and insightful tool for studying space plasmas. This review describes MHD and particle simulations, both of which treat the plasma and the electromagnetic field in which it moves in a self-consistent fashion but on drastically different spatial and temporal scales. The complementary roles of simulation, observations and theory are stressed. Several examples of simulations being carried out in the area of magnetospheric plasma physics are described to illustrate the power, potential and limitations of the approach.
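
    On the particle-simulation side mentioned above, the standard integrator is the Boris push; a minimal sketch for a single charged particle in uniform E and B fields is shown below, with arbitrary field values and time step.

      import numpy as np

      def boris_push(x, v, E, B, q_over_m, dt, nsteps):
          """Standard Boris integrator for one charged particle in given E and B fields."""
          traj = [x.copy()]
          for _ in range(nsteps):
              v_minus = v + 0.5 * q_over_m * E * dt          # first half electric kick
              t = 0.5 * q_over_m * B * dt                    # rotation vector
              s = 2.0 * t / (1.0 + np.dot(t, t))
              v_prime = v_minus + np.cross(v_minus, t)
              v = v_minus + np.cross(v_prime, s)             # magnetic rotation
              v = v + 0.5 * q_over_m * E * dt                # second half electric kick
              x = x + v * dt
              traj.append(x.copy())
          return np.array(traj)

      # Proton-like particle gyrating in a uniform B field with a weak E x B drift
      traj = boris_push(x=np.zeros(3), v=np.array([1.0e5, 0.0, 0.0]),
                        E=np.array([0.0, 1.0e-3, 0.0]), B=np.array([0.0, 0.0, 1.0e-7]),
                        q_over_m=9.58e7, dt=1.0e-3, nsteps=2000)
      print("final position [m]:", np.round(traj[-1], 1))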

  2. Advanced Techniques for Seismic Protection of Historical Buildings: Experimental and Numerical Approach

    SciTech Connect

    Mazzolani, Federico M.

    2008-07-08

    The seismic protection of historical and monumental buildings, dating from the ancient age up to the 20th century, is attracting growing interest, above all in the Euro-Mediterranean area, whose cultural heritage is strongly susceptible to severe damage or even collapse due to earthquakes. The cultural importance of historical and monumental constructions limits, in many cases, the possibility to upgrade them from the seismic point of view, due to the fear of using intervention techniques which could have detrimental effects on their cultural value. Consequently, great interest is growing in the development of sustainable methodologies for the use of Reversible Mixed Technologies (RMTs) in the seismic protection of the existing constructions. RMTs, in fact, are conceived for exploiting the peculiarities of innovative materials and special devices, and they allow ease of removal when necessary. This paper deals with the experimental and numerical studies, framed within the EC PROHITECH research project, on the application of RMTs to the historical and monumental constructions mainly belonging to the cultural heritage of the Euro-Mediterranean area. The experimental tests and the numerical analyses are carried out at five different levels, namely full scale models, large scale models, sub-systems, devices, materials and elements.

  3. Numerical Study on Crossflow Printed Circuit Heat Exchanger for Advanced Small Modular Reactors

    SciTech Connect

    Yoon, Su-Jong; Sabharwall, Piyush; Kim, Eung-Soo

    2014-03-01

    Various fluids such as water, gases (helium), molten salts (FLiNaK, FLiBe) and liquid metal (sodium) are used as coolants in advanced small modular reactors (SMRs). The printed circuit heat exchanger (PCHE) has been adopted as the intermediate and/or secondary heat exchanger of SMR systems because this heat exchanger is compact and effective. The size and cost of the PCHE depend on the coolant type of each SMR. In this study, a crossflow PCHE analysis code for advanced small modular reactors has been developed for the thermal design and cost estimation of the heat exchanger. The analytical solution of the single-pass crossflow heat exchanger model with both fluids unmixed was employed to calculate a two-dimensional temperature profile of a crossflow PCHE. The analytical solution of the crossflow heat exchanger was implemented using built-in functions of the MATLAB program. The effect of fluid property uncertainty on the calculation results was evaluated. In addition, the effect of heat transfer correlations on the calculated temperature profile was analyzed by taking into account possible combinations of primary and secondary coolants in the SMR systems. The size and cost of the heat exchanger were evaluated for the given temperature requirement of each SMR.
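
    For orientation, the single-pass crossflow configuration with both fluids unmixed is often summarized by the approximate effectiveness-NTU correlation sketched below (the paper's code evaluates the full two-dimensional temperature field analytically; the operating point here is hypothetical).

      import numpy as np

      def crossflow_effectiveness(ntu, cr):
          """Approximate effectiveness of a single-pass crossflow exchanger, both fluids unmixed."""
          return 1.0 - np.exp((ntu**0.22 / cr) * (np.exp(-cr * ntu**0.78) - 1.0))

      # Hypothetical PCHE-like operating point
      UA = 5.0e4                           # overall conductance [W/K]
      C_hot, C_cold = 2.0e4, 2.5e4         # capacity rates (mass flow x cp) [W/K]
      T_hot_in, T_cold_in = 550.0, 300.0   # inlet temperatures [K]

      C_min, C_max = min(C_hot, C_cold), max(C_hot, C_cold)
      ntu, cr = UA / C_min, C_min / C_max
      eff = crossflow_effectiveness(ntu, cr)
      duty = eff * C_min * (T_hot_in - T_cold_in)
      print(f"NTU = {ntu:.2f}, Cr = {cr:.2f}, effectiveness = {eff:.3f}, duty = {duty / 1e3:.0f} kW")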

  4. Predictive Modeling of Estrogen Receptor Binding Agents Using Advanced Cheminformatics Tools and Massive Public Data

    PubMed Central

    Ribay, Kathryn; Kim, Marlene T.; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao

    2016-01-01

    Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Using computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing the duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models using chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which were available to us after the models were developed. The cross-validation results of training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity of the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each two compounds. The nearest neighbors for each compound within the set were then identified and its ERα binding potential was predicted by its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR
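
    Two ingredients of the hybrid approach described above, nearest-neighbour prediction from a biosimilarity matrix and the correct classification rate (CCR), can be sketched as follows; the bioprofiles and activity labels are synthetic, and the similarity metric is a simple Jaccard index chosen for illustration.

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic "bioprofiles": rows = compounds, columns = prioritized bioassays (hit / no hit)
      profiles = rng.integers(0, 2, size=(60, 25)).astype(float)
      labels = rng.integers(0, 2, size=60).astype(bool)        # binder yes/no (synthetic)

      def jaccard_similarity(a, b):
          union = np.sum((a > 0) | (b > 0))
          return np.sum((a > 0) & (b > 0)) / union if union else 0.0

      def knn_predict(i, k=3):
          """Leave-one-out prediction of compound i from its k most biosimilar neighbours."""
          sims = np.array([jaccard_similarity(profiles[i], profiles[j]) if j != i else -1.0
                           for j in range(len(profiles))])
          neighbours = np.argsort(sims)[-k:]
          return labels[neighbours].mean() >= 0.5

      pred = np.array([knn_predict(i) for i in range(len(profiles))])
      sensitivity = np.mean(pred[labels])                      # fraction of binders found
      specificity = np.mean(~pred[~labels])                    # fraction of non-binders rejected
      print(f"CCR = {(sensitivity + specificity) / 2:.2f}")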

  5. Numerical modelling of the groundwater inflow to an advancing open pit mine: Kolahdarvazeh pit, Central Iran.

    PubMed

    Bahrami, Saeed; Doulati Ardejani, Faramarz; Aslani, Soheyla; Baafi, Ernest

    2014-12-01

    The groundwater inflow into a mine during its life and after ceasing operations is one of the most important concerns of the mining industry. This paper presents a hydrogeological assessment of the Irankuh Zn-Pb mine, 20 km south of Esfahan and 1 km northeast of Abnil in west-Central Iran. During mine excavation, the upper impervious bed of a confined aquifer was broken and water at high pressure flowed into an open pit mine associated with the Kolahdarvazeh deposit. The maximum and minimum inflow rates were 6.7 and 1.4 m³/s, respectively. Permeability, storage coefficient, thickness and initial head of the fully saturated confined aquifer were 3.5 × 10⁻⁴ m/s, 0.2, 30 m and 60 m, respectively. The hydraulic heads as a function of time were monitored at four observation wells in the vicinity of the pit over 19 weeks and at an observation well near a test well over 21 h. In addition, by measuring the rate of pumping out from the pit sump, at a constant head (usually equal to the height of the pit floor), the real inflow rates to the pit were monitored. The main innovations of this work were to compare numerical modelling, using the finite element software SEEP/W, with actual inflow data and to extend the applicability of the numerical model. This model was further used to estimate the hydraulic heads at the observation wells around the pit over 19 weeks during mining operations. Data from a pump-out test and observation wells were used for model calibration and verification. In order to evaluate the model efficiency, the modelling results of inflow quantity and hydraulic heads were compared to those from analytical solutions, as well as the field data. The mean percent error in relation to field data for the inflow quantity was 0.108. It varied between 1.16 and 1.46 for hydraulic head predictions, which are much lower values than the mean percent errors resulting from the analytical solutions (from 1.8 to 5

  6. Advanced numerical modeling and hybridization techniques for third-generation infrared detector pixel arrays

    NASA Astrophysics Data System (ADS)

    Schuster, Jonathan

    Infrared (IR) detectors are well established as a vital sensor technology for military, defense and commercial applications. Due to the expense and effort required to fabricate pixel arrays, it is imperative to develop numerical simulation models to perform predictive device simulations which assess device characteristics and design considerations. Towards this end, we have developed a robust three-dimensional (3D) numerical simulation model for IR detector pixel arrays. We used the finite-difference time-domain technique to compute the optical characteristics including the reflectance and the carrier generation rate in the device. Subsequently, we employ the finite element method to solve the drift-diffusion equations to compute the electrical characteristics including the I(V) characteristics, quantum efficiency, crosstalk and modulation transfer function. We use our 3D numerical model to study a new class of detector based on the nBn-architecture. This detector is a unipolar unity-gain barrier device consisting of a narrow-gap absorber layer, a wide-gap barrier layer, and a narrow-gap collector layer. We use our model to study the underlying physics of these devices and to explain the anomalously long lateral collection lengths for photocarriers measured experimentally. Next, we investigate the crosstalk in HgCdTe photovoltaic pixel arrays employing a photon-trapping (PT) structure realized with a periodic array of pillars intended to provide broadband operation. The PT region drastically reduces the crosstalk; making the use of the PT structures not only useful to obtain broadband operation, but also desirable for reducing crosstalk, especially in small pitch detector arrays. Then, the power and flexibility of the nBn architecture is coupled with a PT structure to engineer spectrally filtering detectors. Last, we developed a technique to reduce the cost of large-format, high performance HgCdTe detectors by nondestructively screen-testing detector arrays prior

  7. Evaluation of Temperature Gradient in Advanced Automated Directional Solidification Furnace (AADSF) by Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Bune, Andris V.; Gillies, Donald C.; Lehoczky, Sandor L.

    1996-01-01

    A numerical model of heat transfer using combined conduction, radiation and convection in AADSF was used to evaluate temperature gradients in the vicinity of the crystal/melt interface for a variety of hot and cold zone set point temperatures, specifically for the growth of mercury cadmium telluride (MCT). Reverse usage of hot and cold zones was simulated to aid the choice of the proper orientation of the crystal/melt interface with respect to the residual acceleration vector without actually changing the furnace location on board the orbiter. It appears that an additional booster heater will be extremely helpful to ensure the desired temperature gradient when hot and cold zones are reversed. Further efforts are required to investigate advantages/disadvantages of a symmetrical furnace design (i.e. with similar length of hot and cold zones).

  8. Numerical simulation of the reactive flow in advanced (HSR) combustors using KIVA-2

    NASA Technical Reports Server (NTRS)

    Winowich, Nicholas S.

    1991-01-01

    Recent work has been done with the goal of establishing ultralow emission aircraft gas turbine combustors. A significant portion of the effort is the development of three dimensional computational combustor models. The KIVA-II computer code, which is based on the Implicit Continuous Eulerian Difference mesh Arbitrary Lagrangian Eulerian (ICED-ALE) numerical scheme, is one of the codes selected by NASA to achieve these goals. This report involves a simulation of jet injection through slanted slots within the Rich burn/Quick quench/Lean burn (RQL) baseline experimental rig. The RQL combustor distinguishes three regions of combustion. This work specifically focuses on modeling the quick quench mixer region in which secondary injection air is introduced radially through 12 equally spaced slots around the mixer circumference. Steady state solutions are achieved with modifications to the KIVA-II program. Work currently underway will evaluate thermal mixing as a function of injection air velocity and angle of inclination of the slots.

  9. Numerical simulation of fine blanking process using fully coupled advanced constitutive equations with ductile damage

    NASA Astrophysics Data System (ADS)

    Labergere, C.; Saanouni, K.; Benafia, S.; Galmiche, J.; Sulaiman, H.

    2013-05-01

    This paper presents the modelling and adaptive numerical simulation of the fine blanking process. Thermodynamically-consistent constitutive equations, strongly coupled with ductile damage, together with specific boundary conditions (a particular command of forces on the blank holder and counterpunch), are presented. This model is implemented into ABAQUS/EXPLICIT using the VUMAT user subroutine and connected with an adaptive 2D remeshing procedure. The different material parameters are identified for the steel S600MC using experimental tensile tests conducted until final fracture. A parametric study is carried out to examine the sensitivity of the punch force and of the fracture surface topology (convex zone, sheared zone, fracture zone and burr) to the process parameters (die radius, die/punch clearance).

  10. Recent advances in methods for numerical solution of O.D.E. initial value problems

    NASA Technical Reports Server (NTRS)

    Bui, T. D.; Oppenheim, A. K.; Pratt, D. T.

    1984-01-01

    In the mathematical modeling of physical systems, it is often necessary to solve an initial value problem (IVP), consisting of a system of ordinary differential equations (ODE). A typical program produces approximate solutions at certain mesh points. Almost all existing codes try to control the local truncation error, while the user is really interested in controlling the true or global error. The present investigation provides a review of recent advances regarding the solution of the IVP, giving particular attention to stiff systems. Stiff phenomena are customarily defined in terms of the eigenvalues of the Jacobian. There are, however, some difficulties connected with this approach. It is pointed out that an estimate of the Lipschitz constant proves to be a very practical way to determine the stiffness of a problem.
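
    The two stiffness diagnostics mentioned above can be contrasted on a classic linear test problem: the spread of the Jacobian eigenvalues versus a Lipschitz-constant estimate (here simply a matrix norm of the Jacobian). The test matrix is a standard textbook example, not one taken from the report.

      import numpy as np

      # Classic linear stiff test system y' = A y with widely separated time scales
      A = np.array([[  -1.0,     0.0],
                    [ 998.0, -1000.0]])

      eigvals = np.linalg.eigvals(A)
      stiffness_ratio = np.max(np.abs(eigvals.real)) / np.min(np.abs(eigvals.real))
      lipschitz_estimate = np.linalg.norm(A, 2)     # a norm of the Jacobian bounds the Lipschitz constant

      print("eigenvalues:", np.sort(eigvals.real))
      print(f"stiffness ratio  max|Re(lambda)| / min|Re(lambda)| = {stiffness_ratio:.0f}")
      print(f"Lipschitz-constant estimate ||J||_2 = {lipschitz_estimate:.0f}")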

  11. State of the art: diagnostic tools and innovative therapies for treatment of advanced thymoma and thymic carcinoma.

    PubMed

    Ried, Michael; Marx, Alexander; Götz, Andrea; Hamer, Okka; Schalke, Berthold; Hofmann, Hans-Stefan

    2016-06-01

    In this review article, state-of-the-art diagnostic tools and innovative treatments of thymoma and thymic carcinoma (TC) are described with special respect to advanced tumour stages. Complete surgical resection (R0) remains the standard therapeutic approach for almost all a priori resectable mediastinal tumours as defined by preoperative standard computed tomography (CT). If lymphoma or germ-cell tumours are differential diagnostic considerations, biopsy may be indicated. Resection status is the most important prognostic factor in thymoma and TC, followed by tumour stage. Advanced (Masaoka-Koga stage III and IVa) tumours require interdisciplinary therapy decisions based on distinctive findings of preoperative CT scan and ancillary investigations [magnetic resonance imaging (MRI)] to select cases for primary surgery or neoadjuvant strategies with optional secondary resection. In neoadjuvant settings, octreotide scans and histological evaluation of pretherapeutic needle biopsies may help to choose between somatostatin agonist/prednisolone regimens and neoadjuvant chemotherapy as first-line treatment. Finally, a multimodality treatment regime is recommended for advanced and unresectable thymic tumours. In conclusion, advanced stage thymoma and TC should preferably be treated in experienced centres in order to provide all modern diagnostic tools (imaging, histology) and innovative therapy techniques. Systemic and local (hyperthermic intrathoracic chemotherapy) medical treatments together with extended surgical resections have increased the therapeutic options in patients with advanced or recurrent thymoma and TC. PMID:26670806

  12. Theoretical and numerical methods used as design tool for an aircraft: Application on three real-world configurations

    NASA Astrophysics Data System (ADS)

    Anton, Nicoleta

    The mathematical models needed to represent the various dynamics phenomena have been conceived in many disciplines related to aerospace engineering. Major aerospace companies have developed their own codes to estimate aerodynamic characteristics and aircraft stability in the conceptual phase, in parallel with universities that have developed various codes for educational and research purposes. This paper presents a design tool that includes FDerivatives code, the new weight functions method and the continuity algorithm. FDerivatives code, developed at the LARCASE laboratory, is dedicated to the analytical and numerical calculations of the aerodynamic coefficients and their corresponding stability derivatives in the subsonic regime. It was developed as part of two research projects. The first project was initiated by CAE Inc. and the Consortium for Research and Innovation in Aerospace in Quebec (CRIAQ), and the second project was funded by NATO in the framework of the NATO RTO AVT-161 "Assessment of Stability and Control Prediction Methods for NATO Air and Sea Vehicles" program. Presagis gave the "Best Simulation Award" to the LARCASE laboratory for FDerivatives and data FLSIM applications. The new method, called the weight functions method, was used as an extension of the former project. Stability analysis of three different aircraft configurations was performed with the weight functions method and validated for longitudinal and lateral motions with the root locus method. The model, tested with the continuity algorithm, is the High Incidence Research Aircraft Model (HIRM) developed by the Swedish Defense Research Agency and implemented in the Aero-Data Model In Research Environment (ADMIRE).
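
    The root-locus-style check used for validation amounts to inspecting the eigenvalues of the linearized state matrix; a minimal longitudinal example is sketched below with a generic, hypothetical state matrix rather than one of the three aircraft studied.

      import numpy as np

      # Hypothetical linearized longitudinal state matrix, states [u, w, q, theta]
      A_long = np.array([[-0.02,  0.04,    0.0, -9.81],
                         [-0.10, -0.80,  180.0,  0.00],
                         [ 0.00, -0.02,   -1.2,  0.00],
                         [ 0.00,  0.00,    1.0,  0.00]])

      poles = np.linalg.eigvals(A_long)
      print("longitudinal poles:", np.round(poles, 3))
      print("dynamically stable (all Re < 0):", bool(np.all(poles.real < 0)))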

  13. Advancing predictive models for particulate formation in turbulent flames via massively parallel direct numerical simulations

    PubMed Central

    Bisetti, Fabrizio; Attili, Antonio; Pitsch, Heinz

    2014-01-01

    Combustion of fossil fuels is likely to continue for the near future due to the growing trends in energy consumption worldwide. The increase in efficiency and the reduction of pollutant emissions from combustion devices are pivotal to achieving meaningful levels of carbon abatement as part of the ongoing climate change efforts. Computational fluid dynamics featuring adequate combustion models will play an increasingly important role in the design of more efficient and cleaner industrial burners, internal combustion engines, and combustors for stationary power generation and aircraft propulsion. Today, turbulent combustion modelling is hindered severely by the lack of data that are accurate and sufficiently complete to assess and remedy model deficiencies effectively. In particular, the formation of pollutants is a complex, nonlinear and multi-scale process characterized by the interaction of molecular and turbulent mixing with a multitude of chemical reactions with disparate time scales. The use of direct numerical simulation (DNS) featuring a state of the art description of the underlying chemistry and physical processes has contributed greatly to combustion model development in recent years. In this paper, the analysis of the intricate evolution of soot formation in turbulent flames demonstrates how DNS databases are used to illuminate relevant physico-chemical mechanisms and to identify modelling needs. PMID:25024412

  14. Advancing predictive models for particulate formation in turbulent flames via massively parallel direct numerical simulations.

    PubMed

    Bisetti, Fabrizio; Attili, Antonio; Pitsch, Heinz

    2014-08-13

    Combustion of fossil fuels is likely to continue for the near future due to the growing trends in energy consumption worldwide. The increase in efficiency and the reduction of pollutant emissions from combustion devices are pivotal to achieving meaningful levels of carbon abatement as part of the ongoing climate change efforts. Computational fluid dynamics featuring adequate combustion models will play an increasingly important role in the design of more efficient and cleaner industrial burners, internal combustion engines, and combustors for stationary power generation and aircraft propulsion. Today, turbulent combustion modelling is hindered severely by the lack of data that are accurate and sufficiently complete to assess and remedy model deficiencies effectively. In particular, the formation of pollutants is a complex, nonlinear and multi-scale process characterized by the interaction of molecular and turbulent mixing with a multitude of chemical reactions with disparate time scales. The use of direct numerical simulation (DNS) featuring a state of the art description of the underlying chemistry and physical processes has contributed greatly to combustion model development in recent years. In this paper, the analysis of the intricate evolution of soot formation in turbulent flames demonstrates how DNS databases are used to illuminate relevant physico-chemical mechanisms and to identify modelling needs. PMID:25024412

  15. Recent Advancements In The Numerical Simulation Of Non-Equilibrium Flows With Application To Monatomic Gases

    NASA Astrophysics Data System (ADS)

    Kapper, M. G.; Cambier, J.-L.; Bultel, A.; Magin, T. E.

    2011-05-01

    This paper summarizes our current efforts in developing numerical methods for the study of non- equilibrium, high-enthalpy plasma. We describe the general approach used in the model development, some of the problems to be solved and benchmarks showing current capabilities. In particular, we review the recent development of a collisional-radiative model coupled with a single-fluid, two-temperature convection model for the transport of shock-heated argon along with extensions to krypton and xenon. The model is used in a systematic approach to examine the effects of the collision cross sections on the shock structure, including the relaxation layer and subsequent radiative-cooling regime. We review recent results obtained and comparisons with previous experimental results obtained at the University of Toronto’s Institute of Aerospace Studies (UTIAS) and the Australian National University (ANU), which serve as benchmarks to the model. We also show results when unsteady and multi-dimensional effects are included, highlighting the importance of coupling between convective transport and kinetic processes in nonequilibrium flows. We then look at extending the model to both nozzle and external flows to study expansion regimes.

  16. Numerical Simulations of Optical Turbulence Using an Advanced Atmospheric Prediction Model: Implications for Adaptive Optics Design

    NASA Astrophysics Data System (ADS)

    Alliss, R.

    2014-09-01

    Optical turbulence (OT) acts to distort light in the atmosphere, degrading imagery from astronomical telescopes and reducing the data quality of optical imaging and communication links. Some of the degradation due to turbulence can be corrected by adaptive optics. However, the severity of optical turbulence, and thus the amount of correction required, is largely dependent upon the turbulence at the location of interest. Therefore, it is vital to understand the climatology of optical turbulence at such locations. In many cases, it is impractical and expensive to set up instrumentation to characterize the climatology of OT, so numerical simulations become a less expensive and more convenient alternative. The strength of OT is characterized by the refractive index structure parameter Cn2, which in turn is used to calculate atmospheric seeing parameters. While attempts have been made to characterize Cn2 using empirical models, Cn2 can be calculated more directly from Numerical Weather Prediction (NWP) simulations using pressure, temperature, thermal stability, vertical wind shear, turbulent Prandtl number, and turbulence kinetic energy (TKE). In this work we use the Weather Research and Forecast (WRF) NWP model to generate Cn2 climatologies in the planetary boundary layer and free atmosphere, allowing for both point-to-point and ground-to-space seeing estimates of the Fried coherence length (r0) and other seeing parameters. Simulations are performed on a multi-node Linux cluster based on the Intel chip architecture. The WRF model is configured to run at 1 km horizontal resolution and centered on the Mauna Loa Observatory (MLO) of the Big Island. The vertical resolution varies from 25 meters in the boundary layer to 500 meters in the stratosphere. The model top is 20 km. The Mellor-Yamada-Janjic (MYJ) TKE scheme has been modified to diagnose the turbulent Prandtl number as a function of the Richardson number, following observations by Kondo and others. This modification
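
    Once a Cn2 profile has been diagnosed from model output, seeing parameters follow from standard path integrals; the sketch below evaluates the Fried coherence length r0 = [0.423 k² ∫ Cn²(z) dz]^(-3/5) for a hypothetical vertical profile (not WRF output).

      import numpy as np

      lam = 500e-9                          # observation wavelength [m]
      k = 2.0 * np.pi / lam                 # optical wavenumber

      # Hypothetical vertical Cn^2 profile [m^(-2/3)]: strong surface layer + weak free atmosphere
      z = np.linspace(25.0, 20000.0, 400)                               # heights above the site [m]
      cn2 = 1e-13 * np.exp(-z / 100.0) + 1e-17 * np.exp(-z / 5000.0)

      # Fried coherence length for a vertical path: r0 = [0.423 k^2 * integral(Cn^2 dz)]^(-3/5)
      integral = np.sum(0.5 * (cn2[1:] + cn2[:-1]) * np.diff(z))        # trapezoidal rule
      r0 = (0.423 * k**2 * integral) ** (-3.0 / 5.0)
      print(f"integrated Cn2 = {integral:.2e} m^(1/3),  r0 = {r0 * 100:.1f} cm")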

  17. Advancing Satellite-Based Flood Prediction in Complex Terrain Using High-Resolution Numerical Weather Prediction

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Anagnostou, E. N.; Nikolopoulos, E. I.; Bartsotas, N. S.

    2015-12-01

    Floods constitute one of the most significant and frequent natural hazards in mountainous regions. Satellite-based precipitation products offer in many cases the only available source of quantitative precipitation estimates (QPE). However, satellite-based QPE over complex terrain suffers from significant bias that limits its applicability for hydrologic modeling. In this work we investigate the potential of a new correction procedure, which uses high-resolution numerical weather prediction (NWP) model simulations to adjust satellite QPE. The adjustment is based on matching the probability density functions of the satellite and NWP (used as reference) precipitation distributions. The impact of the correction procedure on the simulated hydrologic response is examined for 15 storm events that generated floods over the mountainous Upper Adige region of Northern Italy. Atmospheric simulations were performed at 1-km resolution with a state-of-the-art atmospheric model (RAMS/ICLAMS). The proposed error correction procedure was then applied to the widely used TRMM 3B42 satellite precipitation product, and the correction was evaluated against independent in situ precipitation measurements from a dense rain gauge network (1 gauge per 70 km2) available in the study area. Satellite QPE, before and after correction, are used to simulate the flood response using ARFFS (Adige River Flood Forecasting System), a semi-distributed hydrologic model used for operational flood forecasting in the region. Results showed that the bias in satellite QPE before correction was significant and had a large impact on the simulated flood peak; however, the correction procedure was able to reduce the bias in QPE and therefore considerably improve the simulated flood hydrograph.
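
    The correction step described above is a distribution (pdf) matching between satellite and model precipitation. A minimal empirical quantile-mapping sketch is shown below; the input arrays are synthetic placeholders, not TRMM 3B42 or RAMS/ICLAMS fields, and the authors' exact implementation may differ.

    ```python
    import numpy as np

    def quantile_map(satellite, reference):
        """Adjust satellite precipitation so its distribution matches the reference.

        Empirical CDF (pdf) matching: each wet satellite value is replaced by the
        reference quantile at the same non-exceedance probability; zeros are kept.
        """
        wet = satellite > 0.0
        ref_wet = np.sort(reference[reference > 0.0])
        # Rank-based non-exceedance probability of each wet satellite value.
        probs = np.argsort(np.argsort(satellite[wet])) / max(wet.sum() - 1, 1)
        corrected = satellite.copy()
        corrected[wet] = np.quantile(ref_wet, probs)
        return corrected

    # Synthetic example values (mm/day); real inputs would be satellite and NWP fields.
    sat = np.array([0.0, 1.2, 3.5, 0.0, 8.0, 2.1])
    nwp = np.array([0.0, 2.0, 5.0, 0.5, 12.0, 3.0])
    print(quantile_map(sat, nwp))
    ```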

  18. Numerical Investigation of a Cascaded Longitudinal Space-Charge Amplifier at the Fermilab's Advanced Superconducting Test Accelerator

    SciTech Connect

    Halavanau, A.; Piot, P.

    2015-06-01

    In a cascaded longitudinal space-charge amplifier (LSCA), initial density noise in a relativistic e-beam is amplified via the interplay of longitudinal space-charge forces and properly located dispersive sections. This type of amplification process was shown to potentially result in large final density modulations [1] compatible with the production of broadband electromagnetic radiation. The technique was recently demonstrated in the optical domain [2]. In this paper we investigate, via numerical simulations, the performance of a cascaded LSCA beamline at Fermilab's Advanced Superconducting Test Accelerator (ASTA). We especially explore the properties of the produced broadband radiation. Our studies have been conducted with a grid-less three-dimensional space-charge algorithm.

  19. Development of 3D multimedia with advanced computer animation tools for outreach activities related to Meteor Science and Meteoritics

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.

    2012-09-01

    Documentaries related to Astronomy and Planetary Sciences are a common and very attractive way to promote public interest in these areas. These educational tools can benefit from new advanced computer animation software and 3D technologies, which can make such documentaries even more attractive. However, special care must be taken to guarantee that the information they contain is rigorous and objective. In this sense, additional value is gained when the footage is produced by the researchers themselves. With this aim, a new documentary produced and directed by Prof. Madiedo has been developed. The documentary, created entirely by means of advanced computer animation tools, is dedicated to several aspects of Meteor Science and Meteoritics. The main features of this outreach and education initiative are presented here.

  20. Advancing Efficient All-Electron Electronic Structure Methods Based on Numeric Atom-Centered Orbitals for Energy Related Materials

    NASA Astrophysics Data System (ADS)

    Blum, Volker

    This talk describes recent advances in a general, efficient, accurate all-electron electronic structure theory approach based on numeric atom-centered orbitals; emphasis is placed on developments related to materials for energy conversion and their discovery. For total energies and electron band structures, we show that the overall accuracy is on par with the best benchmark-quality codes for materials, but scalable to large system sizes (1,000s of atoms) and amenable to both periodic and non-periodic simulations. A recent localized resolution-of-identity approach for the Coulomb operator enables O(N) hybrid-functional-based descriptions of the electronic structure of non-periodic and periodic systems, shown for supercell sizes up to 1,000 atoms; the same approach yields accurate results for many-body perturbation theory as well. For molecular systems, we also show how many-body perturbation theory for charged and neutral quasiparticle excitation energies can be applied efficiently yet accurately using basis sets of computationally manageable size. Finally, the talk highlights applications to the electronic structure of hybrid organic-inorganic perovskite materials, as well as to graphene-based substrates for possible future transition-metal-compound-based electrocatalyst materials. All methods described here are part of the FHI-aims code. VB gratefully acknowledges contributions by numerous collaborators at Duke University, Fritz Haber Institute Berlin, TU Munich, USTC Hefei, Aalto University, and many others around the globe.

  1. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in its fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light-water-cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the innovative full-power small modular reactor design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  2. Numerical Viscous Flow Analysis of an Advanced Semispan Diamond-Wing Model at High-Lift Conditions

    NASA Technical Reports Server (NTRS)

    Ghaffari, F.; Biedron, R. T.; Luckring, J. M.

    2002-01-01

    Turbulent Navier-Stokes computational results are presented for an advanced diamond wing semispan model at low-speed, high-lift conditions. The numerical results are obtained in support of a wind-tunnel test that was conducted in the National Transonic Facility (NTF) at the NASA Langley Research Center. The model incorporated a generic fuselage and was mounted on the tunnel sidewall using a constant-width standoff. The analyses include: (1) the numerical simulation of the empty-tunnel flow characteristics of the NTF; (2) the semispan high-lift model with the standoff in the tunnel environment; (3) the semispan high-lift model with the standoff and viscous sidewall in free air; and (4) the semispan high-lift model without the standoff in free air. The computations were performed at conditions that correspond to a nominal approach and landing configuration. The wing surface pressure distributions computed for the model both in the tunnel and in free air agreed well with the corresponding experimental data, and both indicated small increments due to the wall interference effects. However, the wall interference effects were found to be more pronounced in the total measured and computed lift, drag, and pitching moment due to standard induced up-flow effects. Although the magnitudes of the computed forces and moment were slightly off compared to the measured data, the increments due to the wall interference effects were predicted well. Numerical predictions are also presented on the combined effects of the tunnel sidewall boundary layer and the standoff geometry on the fuselage forebody pressure distributions and the resulting impact on the overall configuration's longitudinal aerodynamic characteristics.

  3. Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    This viewgraph presentation contains information about: (1) a framework for multi-body dynamics, the Geometry Manipulation Protocol (GMP); (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2; (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, script library); and (4) future work.

  4. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently handled empirically for the Space Shuttle External Tank (ET) by simulated service testing of pre-cracked panels.

  5. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  6. Psychiatric symptoms and disorders associated with reproductive cyclicity in women: advances in screening tools.

    PubMed

    Hall, Elise; Steiner, Meir

    2015-06-01

    Female-specific psychiatric illness, including premenstrual dysphoria, perinatal depression, and psychopathology related to the perimenopausal period, is often underdiagnosed and undertreated. These conditions can negatively affect the quality of life for women and their families. The development of screening tools has helped guide our understanding of these conditions. There is wide disparity in the methods, definitions, and tools used in studies relevant to female-specific psychiatric illness. As a result, there is no consensus on one tool that is most appropriate for use in a research or clinical setting. In reviewing this topic, we hope to highlight the evolution of various tools as they have built on preexisting instruments and to identify the psychometric properties and clinical applicability of available tools. It would be valuable for researchers to reach a consensus on a core set of screening instruments specific to female psychopathology to gain consistency within and between clinical settings. PMID:26102476

  7. 3-D Numerical Modeling as a Tool for Managing Mineral Water Extraction from a Complex Groundwater Basin in Italy

    NASA Astrophysics Data System (ADS)

    Zanini, A.; Tanda, M.

    2007-12-01

    Groundwater plays an important role as drinking water in Italy, covering about 30% of the national demand (70% in Northern Italy). Mineral water distribution in Italy is an important business, with increasing demand from abroad. The mineral water companies have a strong interest in increasing water extraction, but because of the delicate and complex geology of the subsoil containing these very high quality waters, particular attention must be paid to avoid excessive lowering of the groundwater reservoirs or large changes in the groundwater flow directions. A large water company asked our University to set up a numerical model of the groundwater basin in order to obtain a useful tool for evaluating the strength of the aquifer and designing new extraction wells. The study area is located along the Apennine Mountains and covers a surface of about 18 km2; the topography ranges from 200 to 600 m a.s.l. In ancient times only a spring with naturally sparkling water was known in the area, but at present the mineral water is extracted from deep pumping wells. The area is characterized by a very complex geology: the subsoil structure is described by a sequence of layers of silt-clay, marl-clay, travertine, and alluvial deposits. Several groundwater layers are present, and the one with the best quality flows in the travertine layer; the natural flow rate does not appear to be subject to seasonal variations. Water age analysis revealed very old water, which means that the mineral aquifers are not directly connected to meteoric recharge. The company's geologists suggest that the water supply of the mineral aquifers comes from a carbonate unit located in the deep layers of the mountains bordering the spring area. The valley is crossed by a river that is not connected to the mineral aquifers. Inside the area there are about 30 pumping wells that extract water at different depths. We built a 3

  8. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxy...

  9. Advancing lighting and daylighting simulation: The transition from analysis to design aid tools

    SciTech Connect

    Hitchcock, R.J.

    1995-05-01

    This paper explores three significant software development requirements for making the transition from stand-alone lighting simulation/analysis tools to simulation-based design aid tools. These requirements include specialized lighting simulation engines, facilitated methods for creating detailed simulatable building descriptions, and automated techniques for providing lighting design guidance. Initial computer implementations meant to address each of these requirements are discussed to further elaborate these requirements and to illustrate work in progress.

  10. Advanced repair solution of clear defects on HTPSM by using nanomachining tool

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Kim, Munsik; Jung, Hoyong; Kim, Sangpyo; Yim, Donggyu

    2015-10-01

    As mask specifications become tighter for low-k1 lithography, more aggressive repair accuracy is required below the sub-20 nm technology node. To meet tight defect specifications, many mask shops select repair tools according to defect type. Typically, pattern defects are repaired with an e-beam repair tool and soft defects such as particles are repaired with a nanomachining tool. It is difficult for an e-beam repair tool to remove particle defects because it relies on a chemical reaction between gas and electrons, while a nanomachining tool, which uses physical interaction between a nano-tip and defects, cannot be applied to repairing clear defects. Generally, a film deposition process is widely used for repairing clear defects. However, the deposited film has weak cleaning durability, so it is easily removed by accumulated cleaning processes. Although the deposited film adheres strongly to the MoSiN (or Qz) film, the adhesive strength between the deposited Cr film and the MoSiN (or Qz) film weakens progressively with the energy accumulated when masks are exposed in a scanner, owing to the different coefficients of thermal expansion of the materials. Therefore, whenever a mask needs a re-pellicle process, every deposited repair point has to be checked for damage, and any damaged point has to be repaired again, which makes the process longer and more complex. In this paper, the basic theory and principle of recovering clear defects using a nanomachining tool are introduced, and the evaluated results are reviewed for dense line (L/S) patterns and contact hole (C/H) patterns. The results using nanomachining are also compared with those using an e-beam repair tool, including the cleaning durability evaluated by an accumulated cleaning process. In addition, we discuss the phase shift issue and a solution to the image placement error caused by phase error.

  11. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long-standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: (1) modeling structural and flow field nonlinearities; (2) integrated and modular approaches to nonlinear multidisciplinary analysis; (3) simulating flight dynamics of flexible vehicles; and (4) applications that support both aeronautics and space exploration.

  12. A numerical modelling tool for assessing the impact of climate change and management options on water supply systems

    NASA Astrophysics Data System (ADS)

    Romano, Emanuele; Guyennon, Nicolas; Mariani, Davide; Bruna Petrangeli, Anna; Portoghese, Ivan

    2014-05-01

    Conditions of scarcity in a water supply system occur when the available resources cannot satisfy the connected demands. Such conditions can arise from a decrease in the inflow to the exploited resources and/or from an increase in demand. They can be assessed by a water balance model able to simulate both the hydrological processes linking the meteorological forcing (precipitation) to the inflows to the exploited reservoir, and the intra- and inter-annual distribution of the connected demand together with the reservoir management policies. We present a numerical modelling tool, developed for the management of Lake Maggiore, that computes the water budget of the reservoir at daily scale, taking into account (1) the monthly precipitation over the watershed and the related inflow, (2) the seasonal demand for irrigation, and (3) the operative hydrometric level constraints on lake water withdrawal. The model represents precipitation over the basin through the spatial mean of standardized precipitation indices (SPI) computed at different aggregation scales using observed time series. The relationship between the precipitation regime and the inflow to the reservoir is obtained through a simple multilinear regression model, with the SPI computed at 1, 3, and 6 months as independent variables; this allows hydrological processes with different characteristic times to be taken into account and both the historic inflow regime and the conditions forecast by climate scenarios to be simulated. The regression model is validated on the precipitation and lake inflow observations for the period 1996-2013 using leave-one-out cross validation. The seasonal irrigation demand is assigned based on the extent of crops fed by the lake water, regardless of the climate conditions; the actual supply is limited by the operative hydrometric range of allowable water levels, which stops water distribution when the lake level
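
    As a sketch of the regression and validation strategy described above, the snippet below fits a multilinear model of inflow on SPI-1, SPI-3, and SPI-6 and scores it with leave-one-out cross validation using scikit-learn; the data are synthetic stand-ins for the 1996-2013 observations, and the coefficients are hypothetical.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    # Hypothetical monthly records: columns are SPI-1, SPI-3, SPI-6 (predictors)
    # and observed lake inflow (m^3/s); real inputs would come from the observed series.
    rng = np.random.default_rng(0)
    spi = rng.normal(size=(120, 3))
    inflow = (250 + 40 * spi[:, 0] + 25 * spi[:, 1] + 15 * spi[:, 2]
              + rng.normal(scale=10, size=120))

    model = LinearRegression()
    # Leave-one-out cross validation, as described for the observation period.
    predicted = cross_val_predict(model, spi, inflow, cv=LeaveOneOut())
    rmse = np.sqrt(np.mean((predicted - inflow) ** 2))
    print(f"LOOCV RMSE: {rmse:.1f} m^3/s")

    model.fit(spi, inflow)   # final regression used to simulate inflow scenarios
    print("coefficients:", model.coef_, "intercept:", model.intercept_)
    ```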

  13. Potential for MERLIN-Expo, an advanced tool for higher tier exposure assessment, within the EU chemical legislative frameworks.

    PubMed

    Suciu, Nicoleta; Tediosi, Alice; Ciffroy, Philippe; Altenpohl, Annette; Brochot, Céline; Verdonck, Frederik; Ferrari, Federico; Giubilato, Elisa; Capri, Ettore; Fait, Gabriella

    2016-08-15

    MERLIN-Expo merges and integrates advanced exposure assessment methodologies, allowing complex scenarios involving several pollution sources and targets to be built. The assessment of exposure and risks to human health from chemicals is a major concern for policy and ultimately benefits all citizens. The development and operational fusion of the advanced exposure assessment methodologies envisaged in the MERLIN-Expo tool will have a significant long-term impact on several policies dealing with chemical safety management. There are more than 30 agencies in Europe related to exposure and risk evaluation of chemicals, which play an important role in implementing EU policies, with tasks that are primarily technical, scientific, operational, and/or regulatory in nature. The main purpose of the present paper is to introduce MERLIN-Expo and to highlight its potential for effective integration within the group of tools available for assessing the risk and exposure of chemicals for EU policy. The main results show that the tool is highly suitable for use in site-specific or local impact assessment; with minor modifications it can also be used for Plant Protection Products (PPPs), biocides, and REACH, while major additions would be required for a comprehensive application in the field of consumer and worker exposure assessment. PMID:27107646

  14. Handbook of Research on Hybrid Learning Models: Advanced Tools, Technologies, and Applications

    ERIC Educational Resources Information Center

    Wang, Fu Lee, Ed.; Fong, Joseph, Ed.; Kwan, Reggie, Ed.

    2010-01-01

    Hybrid learning is the single greatest trend in education today, owing to the numerous educational advantages gained when traditional classroom learning and e-learning are implemented together. This handbook collects emerging research and pedagogies related to the convergence of teaching and learning methods. This significant "Handbook of…

  15. Numerical Investigation of Cross Flow Phenomena in a Tight-Lattice Rod Bundle Using Advanced Interface Tracking Method

    NASA Astrophysics Data System (ADS)

    Zhang, Weizhong; Yoshida, Hiroyuki; Ose, Yasuo; Ohnuki, Akira; Akimoto, Hajime; Hotta, Akitoshi; Fujimura, Ken

    In relation to the design of an innovative FLexible-fuel-cycle Water Reactor (FLWR), an investigation of thermal-hydraulic performance in the tight-lattice rod bundles of the FLWR is being carried out at the Japan Atomic Energy Agency (JAEA). The FLWR core adopts a tight triangular lattice arrangement with about 1 mm gap clearance between adjacent fuel rods. In view of the importance of accurately predicting cross flow between subchannels when evaluating boiling transition (BT) in the FLWR core, this study presents a statistical evaluation of numerical simulation results obtained with a detailed two-phase flow simulation code, TPFIT, which employs an advanced interface tracking method. In order to clarify the mechanisms of cross flow in such tight-lattice rod bundles, TPFIT is applied to simulate water-steam two-phase flow in two modeled subchannels. Attention is focused on the instantaneous fluctuation characteristics of cross flow. By calculating correlation coefficients between the differential pressure and the gas/liquid mixing coefficients, the time scales of cross flow are evaluated, and the effects of mixing section length, flow pattern, and gap spacing on the correlation coefficients are investigated. Differences in mechanism between gas and liquid cross flows are pointed out.
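
    The statistical treatment described above relies on correlation coefficients between fluctuating signals and on time scales derived from them. A minimal sketch of such a lagged cross-correlation and integral time scale calculation is given below; the signals are synthetic stand-ins for TPFIT outputs, and the definitions used in the study may differ in detail.

    ```python
    import numpy as np

    def cross_correlation(x, y, max_lag):
        """Normalized cross-correlation of two fluctuating signals for lags 0..max_lag."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        n = len(x)
        return np.array([np.mean(x[: n - k] * y[k:]) for k in range(max_lag + 1)])

    def integral_time_scale(x, dt, max_lag):
        """Integral time scale from the autocorrelation function (trapezoidal integral)."""
        rho = cross_correlation(x, x, max_lag)
        return np.trapz(rho, dx=dt)

    # Synthetic stand-ins for simulated differential pressure and a mixing coefficient.
    rng = np.random.default_rng(1)
    dt = 1e-4
    dp = np.convolve(rng.normal(size=5000), np.ones(50) / 50, mode="same")
    mix = np.roll(dp, 20) + 0.3 * rng.normal(size=5000)

    print("peak correlation:", cross_correlation(dp, mix, 100).max())
    print("integral time scale (s):", integral_time_scale(dp, dt, 200))
    ```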

  16. A Clinical Assessment Tool for Advanced Theory of Mind Performance in 5 to 12 Year Olds

    ERIC Educational Resources Information Center

    O'Hare, Anne E.; Bremner, Lynne; Nash, Marysia; Happe, Francesca; Pettigrew, Luisa M.

    2009-01-01

    One hundred forty typically developing 5- to 12-year-old children were assessed with a test of advanced theory of mind employing Happe's strange stories. There was no significant difference in performance between boys and girls. The stories discriminated performance across the different ages with the lowest performance being in the younger…

  17. Just-in-Time Teaching: A Tool for Enhancing Student Engagement in Advanced Foreign Language Learning

    ERIC Educational Resources Information Center

    Abreu, Laurel; Knouse, Stephanie

    2014-01-01

    Scholars have indicated a need for further research on effective pedagogical strategies designed for advanced foreign language courses in the postsecondary setting, especially in light of decreased enrollments at this level and the elimination of foreign language programs altogether in some institutions (Paesani & Allen, 2012). This article…

  18. Advanced Technologies as Educational Tools in Science: Concepts, Applications, and Issues. Monograph Series Number 8.

    ERIC Educational Resources Information Center

    Kumar, David D.; And Others

    Systems incorporating two advanced technologies, hypermedia systems and intelligent tutors, are examined with respect to their potential impact on science education. The conceptual framework underlying these systems is discussed first. Applications of systems are then presented with examples of each in operation within the context of science…

  19. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks, the main problem that arises is the high price and limited number of network devices that students can work with in the laboratory. Nowadays, this limitation can be overcome with virtualization. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  20. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    EPA Science Inventory

    The purpose of this poster is to present the application and assessment of advanced state-of-the-art technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs [azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, m...

  1. Recent advances in microbial production of fuels and chemicals using tools and strategies of systems metabolic engineering.

    PubMed

    Cho, Changhee; Choi, So Young; Luo, Zi Wei; Lee, Sang Yup

    2015-11-15

    The advent of various systems metabolic engineering tools and strategies has enabled more sophisticated engineering of microorganisms for the production of industrially useful fuels and chemicals. Advances in systems metabolic engineering have been made both in overproducing natural chemicals and in producing novel non-natural chemicals. In this paper, we review the tools and strategies of systems metabolic engineering employed for the development of microorganisms for the production of various industrially useful chemicals, including fuels, building-block chemicals, and specialty chemicals, focusing in particular on those reported in the last three years. The review aims to provide the current landscape of systems metabolic engineering and to suggest directions for addressing future challenges toward successfully establishing processes for the bio-based production of fuels and chemicals from renewable resources. PMID:25450194

  2. Continuous Symmetry and Chemistry Teachers: Learning Advanced Chemistry Content through Novel Visualization Tools

    ERIC Educational Resources Information Center

    Tuvi-Arad, Inbal; Blonder, Ron

    2010-01-01

    In this paper we describe the learning process of a group of experienced chemistry teachers in a specially designed workshop on molecular symmetry and continuous symmetry. The workshop was based on interactive visualization tools that allow molecules and their symmetry elements to be rotated in three dimensions. The topic of continuous symmetry is…

  3. Advanced Algorithms and Automation Tools for Discrete Ordinates Methods in Parallel Environments

    SciTech Connect

    Alireza Haghighat

    2003-05-07

    This final report discusses major accomplishments of a 3-year project under the DOE's NEER Program. The project has developed innovative and automated algorithms, codes, and tools for solving the discrete ordinates particle transport method efficiently in parallel environments. Using a number of benchmark and real-life problems, the performance and accuracy of the new algorithms have been measured and analyzed.

  4. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    PubMed

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, which can be achieved using experimental or numerical techniques. This article presents the laboratory-developed SESAME (Simulation of External Source Accident with MEdical images) tool, specific to the dosimetric reconstruction of radiological accidents through numerical simulations that combine voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described. PMID:19066489

  5. Advances in Omics and Bioinformatics Tools for Systems Analyses of Plant Functions

    PubMed Central

    Mochida, Keiichi; Shinozaki, Kazuo

    2011-01-01

    Omics and bioinformatics are essential to understanding the molecular systems that underlie various plant functions. Recent game-changing sequencing technologies have revitalized sequencing approaches in genomics and have produced opportunities for various emerging analytical applications. Driven by technological advances, several new omics layers such as the interactome, epigenome and hormonome have emerged. Furthermore, in several plant species, the development of omics resources has progressed to address particular biological properties of individual species. Integration of knowledge from omics-based research is an emerging issue as researchers seek to identify significance, gain biological insights and promote translational research. From these perspectives, we provide this review of the emerging aspects of plant systems research based on omics and bioinformatics analyses together with their associated resources and technological advances. PMID:22156726

  6. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been developing data analysis tools that let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks can occur in the system architecture when the databases run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how best to store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
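
    For the geospatial-search comparison mentioned above, one of the candidate backends (MongoDB) supports polygon footprints natively through a 2dsphere index. The sketch below shows the general pattern of storing a GeoJSON swath footprint and querying it with $geoIntersects via pymongo; the database, collection, and field names are illustrative, not the actual TCIS schema.

    ```python
    from pymongo import MongoClient, GEOSPHERE

    # Hypothetical layout: one document per Level 2 swath granule with a GeoJSON
    # polygon footprint; names below are illustrative only.
    client = MongoClient("mongodb://localhost:27017")
    granules = client["tcis_demo"]["granules"]
    granules.create_index([("footprint", GEOSPHERE)])   # 2dsphere index for geospatial search

    granules.insert_one({
        "granule_id": "example-0001",
        "footprint": {"type": "Polygon",
                      "coordinates": [[[-60, 10], [-50, 10], [-50, 20], [-60, 20], [-60, 10]]]},
    })

    # Find all granules whose footprint intersects a storm-centered search box.
    search_box = {"type": "Polygon",
                  "coordinates": [[[-58, 12], [-52, 12], [-52, 18], [-58, 18], [-58, 12]]]}
    for doc in granules.find({"footprint": {"$geoIntersects": {"$geometry": search_box}}}):
        print(doc["granule_id"])
    ```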

  7. Portfolio use as a tool to demonstrate professional development in advanced nursing practice.

    PubMed

    Hespenheide, Molly; Cottingham, Talisha; Mueller, Gail

    2011-01-01

    A concrete way of recognizing and rewarding clinical leadership, excellence in practice, and personal and professional development of the advanced practice registered nurse (APRN) is lacking in the literature and healthcare institutions in the United States. This article presents the process of developing and evaluating a professional development program designed to address this gap. The program uses APRN Professional Performance Standards, Relationship-Based Care, and the Magnet Forces as a guide and theoretical base. A key tenet of the program is the creation of a professional portfolio. Narrative reflections are included that illustrate the convergence of theories. A crosswalk supports this structure, guides portfolio development, and operationalizes the convergence of theories as they specifically relate to professional development in advanced practice. Implementation of the program has proven to be challenging and rewarding. Feedback from APRNs involved in the program supports program participation as a meaningful method to recognize excellence in advanced practice and a clear means to foster ongoing professional growth and development. PMID:22016019

  8. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  9. Anvil Forecast Tool in the Advanced Weather Interactive Processing System, Phase II

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III

    2008-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and the Spaceflight Meteorology Group have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input.

  10. ADVANCEMENT OF NUCLEIC ACID-BASED TOOLS FOR MONITORING IN SITU REDUCTIVE DECHLORINATION

    SciTech Connect

    Vangelas, K.; Edwards, Elizabeth; Loffler, Frank; Looney, Brian

    2006-11-17

    Regulatory protocols generally recognize that destructive processes are the most effective mechanisms that support natural attenuation of chlorinated solvents. In many cases, these destructive processes will be biological processes and, for chlorinated compounds, will often be reductive processes that occur under anaerobic conditions. The existing EPA guidance (EPA, 1998) provides a list of parameters that offer indirect evidence of reductive dechlorination processes. In an effort to gather direct evidence of these processes, scientists have identified key microorganisms and are currently developing tools to measure the abundance and activity of these organisms in subsurface systems. Drs. Edwards and Loffler are two recognized leaders in this field. The research described herein continues their development efforts to provide a suite of tools enabling direct measures of biological processes related to the reductive dechlorination of TCE and PCE. This study investigated the strengths and weaknesses of the 16S rRNA gene-based approach to characterizing natural attenuation capabilities in samples. The results suggested that an approach based solely on 16S rRNA may not provide sufficient information to document the natural attenuation capabilities of a system because it does not distinguish between strains of organisms that have different biodegradation capabilities. The results of the investigations provided evidence that tools focusing on enzymes relevant to the functionally desired characteristics may be useful adjuncts to the 16S rRNA methods.

  11. From beginners to trained users: an advanced tool to guide experimenters in basic applied fluorescence

    NASA Astrophysics Data System (ADS)

    Pingand, Philippe B.; Lerner, Dan A.

    1993-05-01

    UPY-F is a software package dedicated to resolving the various queries raised by end-users of spectrofluorimeters when they encounter a problem in the course of an experiment. The main goal is to provide a diagnosis of non-pertinent use of a spectrofluorimeter. Many artifacts may lead the operator astray, and except for experts, simple manipulation of the controls of a fluorimeter can produce effects that are not always fully appreciated. The solution adopted is an association between a powerful hypermedia tool and an expert system. A straight expert system offers a number of well-known advantages, but it is not well accepted by users because of the many moves between the spectrofluorimeter and the diagnostic tool. In our hypermedia tool, knowledge is displayed by means of visual concepts through which one can browse and navigate. The user still perceives the problem as a whole, which may not be the case with a straight expert system. We demonstrate typical situations in which an event triggers a chain of reasoning leading to the debugging of the problem. The system is not only meant to help beginners but can also adapt itself to guide a well-trained experimenter. We think that its functionality and user-friendly interface are very attractive and open new vistas in the way future users may be trained, whether they work in research labs or industrial settings, notably by cutting down the time spent on training.

  12. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and it displays performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for the removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
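
    The goodness-of-fit statistics quoted above can be reproduced from paired model/experiment values with a few lines of code. The sketch below implements the standard Willmott index of agreement and one common relative-error definition; the sample values are illustrative, and the paper's exact error formula may differ.

    ```python
    import numpy as np

    def willmott_d(predicted, observed):
        """Willmott index of agreement: d = 1 - sum((P-O)^2) / sum((|P-Obar| + |O-Obar|)^2)."""
        obar = observed.mean()
        num = np.sum((predicted - observed) ** 2)
        den = np.sum((np.abs(predicted - obar) + np.abs(observed - obar)) ** 2)
        return 1.0 - num / den

    def mean_relative_error(predicted, observed):
        """One common definition of relative error; the paper's exact formula may differ."""
        return np.mean(np.abs(predicted - observed) / np.abs(observed))

    # Illustrative model-vs-experiment values (e.g., permeate flux under varying pressure).
    observed = np.array([10.2, 12.5, 15.1, 17.8, 20.3])
    predicted = np.array([10.0, 12.9, 14.8, 18.2, 20.0])
    print("d =", round(willmott_d(predicted, observed), 3))
    print("RE =", round(mean_relative_error(predicted, observed), 3))
    ```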

  13. A Multi-layer, Data-driven Advanced Reasoning Tool for Intelligent Data Mining and Analysis for Smart Grids

    SciTech Connect

    Lu, Ning; Du, Pengwei; Greitzer, Frank L.; Guo, Xinxin; Hohimer, Ryan E.; Pomiak, Yekaterina G.

    2012-12-31

    This paper presents the multi-layer, data-driven advanced reasoning tool (M-DART), a proof-of-principle decision support tool for improved power system operation. M-DART will cross-correlate and examine different data sources to assess anomalies, infer root causes, and anneal data into actionable information. By performing higher-level reasoning “triage” of diverse data sources, M-DART focuses on early detection of emerging power system events and identifies highest priority actions for the human decision maker. M-DART represents a significant advancement over today’s grid monitoring technologies that apply offline analyses to derive model-based guidelines for online real-time operations and use isolated data processing mechanisms focusing on individual data domains. The development of the M-DART will bridge these gaps by reasoning about results obtained from multiple data sources that are enabled by the smart grid infrastructure. This hybrid approach integrates a knowledge base that is trained offline but tuned online to capture model-based relationships while revealing complex causal relationships among data from different domains.

  14. The Advanced Light Source: A new tool for research in atomic and molecular physics

    SciTech Connect

    Schlachter, F.; Robinson, A.

    1991-04-01

    The Advanced Light Source at the Lawrence Berkeley Laboratory will be the world's brightest synchrotron radiation source in the extreme ultraviolet and soft x-ray regions of the spectrum when it begins operation in 1993. It will be available as a national user facility to researchers in a broad range of disciplines, including materials science, atomic and molecular physics, chemistry, biology, imaging, and technology. The high brightness of the ALS will be particularly well suited to high-resolution studies of tenuous targets, such as excited atoms, ions, and clusters. 13 figs., 4 tabs.

  15. NASA Advanced Concepts Office, Earth-To-Orbit Team Design Process and Tools

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.; Garcia, Jessica; Beers, Benjamin; Philips, Alan; Holt, James B.; Threet, Grady E., Jr.

    2013-01-01

    The Earth to Orbit (ETO) Team of the Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center (MSFC) is considered the preeminent group to go to for pre-phase A and phase A concept definition. The ACO team has been at the forefront of a multitude of launch vehicle studies determining the future direction of the Agency as a whole due, in part, to its rapid turnaround time in analyzing concepts and its ability to cover broad trade spaces of vehicles in that limited timeframe. Each completed vehicle concept includes a full mass breakdown of the vehicle down to tertiary subsystem components, along with a vehicle trajectory analysis to determine optimized payload delivery to specified orbital parameters, flight environments, and delta-v capability. Additionally, a structural analysis of the vehicle based on material properties and geometries is performed, as well as an analysis to determine the flight loads based on the trajectory outputs. As mentioned, the ACO Earth to Orbit Team prides itself on its rapid turnaround time and often needs to fulfill customer requests within a limited schedule or on little advance notice. Working in this fast-paced environment, the ETO team has developed finely honed skills and methods to maximize its delivery capability and meet customer needs. This paper describes the interfaces between the three primary disciplines used in the design process (weights and sizing, trajectory, and structural analysis), as well as the approach each discipline employs to streamline its piece of the design process.

  16. Reducing the power consumption in LTE-Advanced wireless access networks by a capacity based deployment tool

    NASA Astrophysics Data System (ADS)

    Deruyck, Margot; Joseph, Wout; Tanghe, Emmeric; Martens, Luc

    2014-09-01

    As both the bit rate required by applications on mobile devices and the number of those mobile devices are steadily growing, wireless access networks need to be expanded. As wireless networks also consume a lot of energy, it is important to develop energy-efficient wireless access networks in the near future. In this study, a capacity-based deployment tool for the design of energy-efficient wireless access networks is proposed. Capacity-based means that the network responds to the instantaneous bit rate requirements of the users active in the selected area. To the best of our knowledge, such a deployment tool for energy-efficient wireless access networks has never been presented before. This deployment tool is applied to a realistic case in Ghent, Belgium, to investigate three main functionalities incorporated in LTE-Advanced: carrier aggregation, heterogeneous deployments, and Multiple-Input Multiple-Output (MIMO). The results show that it is recommended to introduce femtocell base stations, supporting both MIMO and carrier aggregation, into the network (heterogeneous deployment) to reduce the network's power consumption. For the selected area and the assumptions made, this results in a power consumption reduction up to 70%. Introducing femtocell base stations without MIMO and carrier aggregation can already result in a significant power consumption reduction of 38%.

  17. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  18. Neuron-Miner: An Advanced Tool for Morphological Search and Retrieval in Neuroscientific Image Databases.

    PubMed

    Conjeti, Sailesh; Mesbah, Sepideh; Negahdar, Mohammadreza; Rautenberg, Philipp L; Zhang, Shaoting; Navab, Nassir; Katouzian, Amin

    2016-10-01

    The steadily growing amount of digital neuroscientific data demands a reliable, systematic, and computationally effective retrieval algorithm. In this paper, we present Neuron-Miner, a tool for fast and accurate reference-based retrieval within neuron image databases. The proposed algorithm is built upon a hashing (search and retrieval) technique employing multiple unsupervised random trees, collectively called Hashing Forests (HF). The HF are trained to parse the neuromorphological space hierarchically and to preserve the inherent neuron neighborhoods while encoding them with compact binary codewords. We further introduce an inverse-coding formulation within HF to effectively mitigate pairwise neuron similarity comparisons, thus allowing scalability to massive databases with little additional time overhead. The proposed hashing tool provides a superior approximation of the true neuromorphological neighborhood, with better retrieval and ranking performance than existing generalized hashing methods. This is exhaustively validated by quantifying the results over 31,266 neuron reconstructions from the NeuroMorpho.org dataset, curated from 147 different archives. We envisage that finding and ranking similar neurons through reference-based querying via Neuron-Miner will assist neuroscientists in objectively understanding the relationship between neuronal structure and function for applications in comparative anatomy or diagnosis. PMID:27155864
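
    To illustrate the hashing-based retrieval idea described above, the sketch below encodes feature vectors into compact binary codewords with random hyperplanes and ranks database entries by Hamming distance. This is a simplified stand-in for the unsupervised Hashing Forests and inverse-coding formulation of the paper, and the morphometric features are hypothetical.

    ```python
    import numpy as np

    class RandomProjectionHasher:
        """Toy binary hashing by random hyperplanes; a simplified stand-in for the
        Hashing Forests described in the paper, not their actual algorithm."""

        def __init__(self, n_bits=32, n_features=16, seed=0):
            rng = np.random.default_rng(seed)
            self.planes = rng.normal(size=(n_bits, n_features))

        def encode(self, features):
            # One compact binary codeword per neuron feature vector.
            return (features @ self.planes.T > 0).astype(np.uint8)

        def rank(self, query_code, database_codes):
            # Rank database entries by Hamming distance to the query code.
            distances = np.count_nonzero(database_codes != query_code, axis=1)
            return np.argsort(distances), distances

    # Hypothetical morphometric feature vectors (e.g., branch counts, path lengths).
    rng = np.random.default_rng(1)
    database = rng.normal(size=(1000, 16))
    hasher = RandomProjectionHasher()
    codes = hasher.encode(database)
    order, dist = hasher.rank(hasher.encode(database[42]), codes)
    print("closest matches:", order[:5])
    ```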

  19. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software-based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems, which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software-based plant process control are widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze, and test software designs and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application to the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of its application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  20. Recent advances in developing molecular tools for targeted genome engineering of mammalian cells.

    PubMed

    Lim, Kwang-il

    2015-01-01

    Various biological molecules naturally existing in diverse species, including fungi, bacteria, and bacteriophages, have functionalities for DNA binding and processing. These biological molecules have recently been actively engineered for use in customized genome editing of mammalian cells, as the molecule-encoding DNA sequence information and the underlying mechanisms by which the molecules work are unveiled. Excitingly, multiple novel methods based on the newly constructed artificial molecular tools have enabled modifications of specific endogenous genetic elements in the genome context at efficiencies much higher than those of conventional homologous recombination based methods. This minireview introduces the most recently spotlighted molecular genome engineering tools, with their key features and ongoing modifications for better performance. Such ongoing efforts have mainly focused on removing the inherent rigidity of DNA sequence recognition from the original molecular platforms, adding newly tailored targeting functions to the engineered molecules, and enhancing their targeting specificity. Effective targeted genome engineering of mammalian cells will enable not only sophisticated genetic studies in the context of the genome, but also widely applicable universal therapeutics based on the pinpointing and correction of disease-causing genetic elements within the genome in the near future. PMID:25104401

  1. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  2. Using explanatory crop models to develop simple tools for Advanced Life Support system studies

    NASA Technical Reports Server (NTRS)

    Cavazzoni, J.

    2004-01-01

    System-level analyses for Advanced Life Support require mathematical models for various processes, such as for biomass production and waste management, which would ideally be integrated into overall system models. Explanatory models (also referred to as mechanistic or process models) would provide the basis for a more robust system model, as these would be based on an understanding of specific processes. However, implementing such models at the system level may not always be practicable because of their complexity. For the area of biomass production, explanatory models were used to generate parameters and multivariable polynomial equations for basic models that are suitable for estimating the direction and magnitude of daily changes in canopy gas-exchange, harvest index, and production scheduling for both nominal and off-nominal growing conditions.
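
    The polynomial surrogates described above can be reproduced in miniature. The sketch below (Python, with illustrative variable names, synthetic "explanatory model" output, and made-up coefficients, none taken from the paper) fits a second-order multivariable polynomial to simulated daily canopy gas-exchange data by least squares and evaluates it at a nominal condition.

      # Minimal sketch: least-squares fit of a second-order polynomial surrogate
      # to synthetic explanatory-model output (all numbers are illustrative).
      import numpy as np

      rng = np.random.default_rng(0)
      ppf = rng.uniform(300, 900, 50)     # photosynthetic photon flux, umol m-2 s-1
      co2 = rng.uniform(350, 1200, 50)    # CO2 concentration, ppm
      temp = rng.uniform(18, 30, 50)      # air temperature, deg C
      # Stand-in for daily canopy gas exchange computed by an explanatory model
      gx = 0.03 * ppf + 0.01 * co2 - 0.02 * (temp - 24.0) ** 2 + rng.normal(0, 0.5, 50)

      def design(p, c, t):
          # Design matrix with constant, linear, quadratic, and cross terms
          return np.column_stack([np.ones_like(p), p, c, t,
                                  p**2, c**2, t**2, p*c, p*t, c*t])

      coef, *_ = np.linalg.lstsq(design(ppf, co2, temp), gx, rcond=None)

      # Evaluate the fitted surrogate at a nominal growing condition
      print(design(np.array([650.0]), np.array([800.0]), np.array([25.0])) @ coef)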

  3. Scanning multispectral IR reflectography SMIRR: an advanced tool for art diagnostics.

    PubMed

    Daffara, Claudia; Pampaloni, Enrico; Pezzati, Luca; Barucci, Marco; Fontana, Raffaella

    2010-06-15

    Joint processing of multispectral planes, such as subtraction and ratio methods, false color representation, and statistical tools such as principal component analysis, is applied to the registered image dataset to extract additional information. Maintaining a visual approach in the data analysis allows this tool to be used by museum staff, the actual end-users. We also present some applications of the technique to the study of Italian masterpieces, discussing interesting preliminary results. The spectral sensitivity of the detection system, the quality of focusing and uniformity of the acquired images, and the possibility for selective imaging in NIR bands in a registered dataset make SMIRR an exceptional tool for nondestructive inspection of painting surfaces. The high quality and detail of SMIRR data underscore the potential for further development in this field. PMID:20230039
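
    The principal component analysis step mentioned above can be sketched compactly. The following Python fragment (with a random stand-in for the registered multispectral stack; the number of bands and image size are arbitrary) computes principal-component images from co-registered band images.

      # Minimal sketch: PCA over a registered multispectral image stack
      # (random stand-in data; band count and image size are arbitrary).
      import numpy as np

      bands, h, w = 8, 256, 256
      stack = np.random.rand(bands, h, w)        # registered band images (stand-in)

      X = stack.reshape(bands, -1).T             # pixels x bands
      X = X - X.mean(axis=0)                     # center each band
      eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
      order = np.argsort(eigvals)[::-1]          # sort components by explained variance
      scores = X @ eigvecs[:, order]

      pc1 = scores[:, 0].reshape(h, w)           # dominant reflectance component
      pc2 = scores[:, 1].reshape(h, w)           # later PCs can highlight subtle features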

  4. Advances in ion trap mass spectrometry: Photodissociation as a tool for structural elucidation

    SciTech Connect

    Stephenson, J.L. Jr.; Booth, M.M.; Eyler, J.R.; Yost, R.A.

    1995-12-01

    Photo-induced dissociation (PID) is the next most frequently used method (after collisional activation) for activation of polyatomic ions in tandem mass spectrometry. The range of internal energies present after the photon absorption process is much narrower than that obtained with collisional energy transfer. Therefore, the usefulness of PID for the study of ion structures is greatly enhanced. The long storage times and instrumental configuration of the ion trap mass spectrometer are ideally suited for photodissociation experiments. This presentation will focus on both the fundamental and analytical applications of CO2 lasers in conjunction with ion trap mass spectrometry. The first portion of this talk will examine the fundamental issues of wavelength dependence, chemical kinetics, photoabsorption cross section, and collisional effects on photodissociation efficiency. The second half of this presentation will look at novel instrumentation for electrospray/ion trap mass spectrometry, with the concurrent development of photodissociation as a tool for structural elucidation of organic compounds and antibiotics.

  5. Inspection, maintenance, and repair of large pumps and piping systems using advanced robotic tools

    SciTech Connect

    Lewis, R.K.; Radigan, T.M.

    1998-07-01

    Operating and maintaining large pumps and piping systems can be an expensive proposition. Proper inspections and monitoring can reduce costs. This was difficult in the past, since detailed pump inspections could only be performed by disassembly and many portions of piping systems are buried or covered with insulation. Once these components were disassembled, a majority of the cost was already incurred. At that point, expensive part replacement usually took place whether it was needed or not. With the completion of the Pipe Walker{trademark}/LIP System and the planned development of the Submersible Walker{trademark}, this situation is due to change. The specifications for these inspection and maintenance robots will ensure that. Their ability to traverse both horizontally and vertically, forward and backward, makes them unique tools. They will open the door for some innovative approaches to inspection and maintenance of large pumps and piping systems.

  6. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    NASA Astrophysics Data System (ADS)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

    Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time-consuming and costly. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications of using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.
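
    Whether the inspection data come from experiments or from simulation, the PoD curve itself is typically fitted with a simple statistical model. The sketch below (Python, hypothetical hit/miss data) uses a log-odds (logistic) model in log crack length, a common choice in PoD practice, and extracts the a90 detectable crack size; simulated signal amplitudes could be thresholded to produce the same kind of hit/miss input.

      # Minimal sketch: hit/miss PoD curve via a logistic model in log crack length
      # (hypothetical data; not from the study above).
      import numpy as np
      from scipy.optimize import minimize

      a = np.array([0.3, 0.5, 0.8, 1.0, 1.3, 1.6, 2.0, 2.5, 3.0, 4.0])  # crack length, mm
      hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])                    # 1 = detected

      def neg_log_likelihood(p):
          b0, b1 = p
          pod = 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(a))))
          eps = 1e-12
          return -np.sum(hit * np.log(pod + eps) + (1 - hit) * np.log(1 - pod + eps))

      b0, b1 = minimize(neg_log_likelihood, x0=[0.0, 1.0]).x

      # Crack size detected with 90 % probability under the fitted model
      a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)
      print("a90 =", round(a90, 2), "mm")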

  7. Development of advanced computational fluid dynamics tools and their application to simulation of internal turbulent flows

    NASA Astrophysics Data System (ADS)

    Emelyanov, V. N.; Karpenko, A. G.; Volkov, K. N.

    2015-06-01

    Modern graphics processing units (GPU) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities of the use of GPUs for the simulation of internal fluid flows are discussed. The finite volume method is applied to solve three-dimensional (3D) unsteady compressible Euler and Navier-Stokes equations on unstructured meshes. Compute Unified Device Architecture (CUDA) technology is used for programming implementation of parallel computational algorithms. Solution of some fluid dynamics problems on GPUs is presented, and approaches to optimization of the CFD code related to the use of different types of memory are discussed. Speedup of the solution on GPUs with respect to the solution on a central processing unit (CPU) is compared for different meshes and different methods of distributing input data into blocks. Performance measurements show that the numerical schemes developed achieve a 20 to 50 times speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
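
    To make the finite volume method referred to above concrete, the sketch below applies a first-order finite-volume update to 1D linear advection, a deliberately simplified stand-in for the 3D Euler/Navier-Stokes solvers discussed in the abstract (all parameters are illustrative). Each cell update depends only on neighbouring cells, which is the data-parallel structure a CUDA implementation would map onto one thread per cell.

      # Minimal sketch: first-order upwind finite-volume scheme for 1D linear
      # advection with periodic boundaries (toy stand-in for the GPU CFD solver).
      import numpy as np

      nx, L, c = 200, 1.0, 1.0
      dx = L / nx
      dt = 0.5 * dx / c                       # CFL number of 0.5
      x = (np.arange(nx) + 0.5) * dx
      u = np.exp(-200.0 * (x - 0.3) ** 2)     # initial Gaussian pulse

      for _ in range(200):
          f_left = c * np.roll(u, 1)          # upwind flux at each cell's left face
          f_right = np.roll(f_left, -1)       # flux at the right face
          u = u - dt / dx * (f_right - f_left)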

  8. Propulsion Simulations Using Advanced Turbulence Models with the Unstructured Grid CFD Tool, TetrUSS

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Frink, Neal T.; Deere, Karen A.; Pandya, Mohangna J.

    2004-01-01

    A computational investigation has been completed to assess the capability of TetrUSS for exhaust nozzle flows. Three configurations were chosen for this study: (1) an axisymmetric supersonic jet, (2) a transonic axisymmetric boattail with a solid sting operated at different Reynolds numbers and Mach numbers, and (3) an isolated non-axisymmetric nacelle with a supersonic cruise nozzle. These configurations were chosen because existing experimental data provided a means for measuring the ability of TetrUSS to simulate complex nozzle flows. The main objective of this paper is to validate the implementation of advanced two-equation turbulence models in the unstructured-grid CFD code USM3D for propulsion flow cases. USM3D is the flow solver of the TetrUSS system. Three different turbulence models, namely Menter Shear Stress Transport (SST), basic k-epsilon, and Spalart-Allmaras (SA), are used in the present study. The results are generally in agreement with other implementations of these models in structured-grid CFD codes. Results indicate that USM3D provides accurate simulations for complex aerodynamic configurations with propulsion integration.

  9. Terahertz pulsed imaging as an advanced characterisation tool for film coatings--a review.

    PubMed

    Haaser, Miriam; Gordon, Keith C; Strachan, Clare J; Rades, Thomas

    2013-12-01

    Solid dosage forms are the pharmaceutical drug delivery systems of choice for oral drug delivery. These solid dosage forms are often coated to modify the physico-chemical properties of the active pharmaceutical ingredients (APIs), in particular to alter release kinetics. Since the product performance of coated dosage forms is a function of their critical coating attributes, including coating thickness, uniformity, and density, more advanced quality control techniques than weight gain are required. A recently introduced non-destructive method to quantitatively characterise coating quality is terahertz pulsed imaging (TPI). The ability of terahertz radiation to penetrate many pharmaceutical materials enables structural features of coated solid dosage forms to be probed at depth, which is not readily achievable with other established imaging techniques, e.g. near-infrared (NIR) and Raman spectroscopy. In this review TPI is introduced and various applications of the technique in pharmaceutical coating analysis are discussed. These include evaluation of coating thickness, uniformity, surface morphology, density, defects and buried structures as well as correlation between TPI measurements and drug release performance, coating process monitoring and scale up. Furthermore, challenges and limitations of the technique are discussed. PMID:23570960

  10. Advanced information management tools for investigation and case management support in a networked heterogeneous computing environment

    NASA Astrophysics Data System (ADS)

    Clifton, T. E., III; Lehrer, Nancy; Klopfenstein, Mark; Hoshstrasser, Belinda; Campbell, Rachel

    1997-02-01

    The right information, at the right time and place, is key to successful law enforcement. The information exists; the challenge is in getting the information to the law enforcement professionals in a usable form, when they need it. Over the last year, the authors have applied advanced information management technologies towards addressing this challenge, in concert with a complementary research effort in secure wireless network technology by SRI International. The goal of the combined efforts is to provide law enforcement professionals the ability to access a wide range of heterogeneous and legacy data sources (structured, as well as free text); process information into digital multimedia case folders; and create World Wide Web-based multimedia products, accessible by selected field investigators via Fortezza-enhanced secure web browsers over encrypted wireless communications. We discuss the results of our knowledge acquisition activities at federal, regional, and local law enforcement organizations; our technical solution; results of the one year development and demonstration effort; and plans for future research.

  11. Advanced semi-active engine and transmission mounts: tools for modelling, analysis, design, and tuning

    NASA Astrophysics Data System (ADS)

    Farjoud, Alireza; Taylor, Russell; Schumann, Eric; Schlangen, Timothy

    2014-02-01

    This paper is focused on modelling, design, and testing of semi-active magneto-rheological (MR) engine and transmission mounts used in the automotive industry. The purpose is to develop a complete analysis, synthesis, design, and tuning tool that reduces the need for expensive and time-consuming laboratory and field tests. A detailed mathematical model of such devices is developed using multi-physics modelling techniques for physical systems with various energy domains. The model includes all major features of an MR mount including fluid dynamics, fluid track, elastic components, decoupler, rate-dip, gas-charged chamber, MR fluid rheology, magnetic circuit, electronic driver, and control algorithm. Conventional passive hydraulic mounts can also be studied using the same mathematical model. The model is validated using standard experimental procedures. It is used for design and parametric study of mounts; effects of various geometric and material parameters on dynamic response of mounts can be studied. Additionally, this model can be used to test various control strategies to obtain best vibration isolation performance by tuning control parameters. Another benefit of this work is that nonlinear interactions between sub-components of the mount can be observed and investigated. This is not possible by using simplified linear models currently available.
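
    By contrast with the detailed multi-physics model described above, a minimal semi-active isolator can be sketched in a few lines. The fragment below (Python; illustrative parameters, not from the paper) simulates a single-degree-of-freedom isolator with an on/off skyhook damping law, one of the simplest control strategies that a tuning study of this kind might exercise.

      # Minimal sketch: single-DOF semi-active isolator with an on/off skyhook
      # damping law (illustrative parameters; far simpler than the MR mount model).
      import numpy as np
      from scipy.integrate import solve_ivp

      m, k = 50.0, 2.0e5                # isolated mass (kg), mount stiffness (N/m)
      c_low, c_high = 200.0, 2000.0     # semi-active damping bounds (N s/m)
      freq, amp = 12.0, 1e-3            # base excitation: 12 Hz, 1 mm amplitude

      def rhs(t, y):
          x, xdot = y
          xb = amp * np.sin(2 * np.pi * freq * t)
          xbdot = amp * 2 * np.pi * freq * np.cos(2 * np.pi * freq * t)
          rel_vel = xdot - xbdot
          # Skyhook logic: high damping only when it opposes the absolute velocity
          c = c_high if xdot * rel_vel > 0 else c_low
          return [xdot, (-k * (x - xb) - c * rel_vel) / m]

      sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0], max_step=1e-4)
      print("peak mass displacement (m):", np.max(np.abs(sol.y[0])))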

  12. STED-FLCS: An Advanced Tool to Reveal Spatiotemporal Heterogeneity of Molecular Membrane Dynamics

    PubMed Central

    2015-01-01

    Heterogeneous diffusion dynamics of molecules play an important role in many cellular signaling events, such as of lipids in plasma membrane bioactivity. However, these dynamics can often only be visualized by single-molecule and super-resolution optical microscopy techniques. Using fluorescence lifetime correlation spectroscopy (FLCS, an extension of fluorescence correlation spectroscopy, FCS) on a super-resolution stimulated emission depletion (STED) microscope, we here extend previous observations of nanoscale lipid dynamics in the plasma membrane of living mammalian cells. STED-FLCS allows an improved determination of spatiotemporal heterogeneity in molecular diffusion and interaction dynamics via a novel gated detection scheme, as demonstrated by a comparison between STED-FLCS and previous conventional STED-FCS recordings on fluorescent phosphoglycerolipid and sphingolipid analogues in the plasma membrane of live mammalian cells. The STED-FLCS data indicate that biophysical and biochemical parameters such as the affinity for molecular complexes strongly change over space and time within a few seconds. Drug treatment for cholesterol depletion or actin cytoskeleton depolymerization not only results in the already previously observed decreased affinity for molecular interactions but also in a slight reduction of the spatiotemporal heterogeneity. STED-FLCS specifically demonstrates a significant improvement over previous gated STED-FCS experiments and with its improved spatial and temporal resolution is a novel tool for investigating how heterogeneities of the cellular plasma membrane may regulate biofunctionality. PMID:26235350

  13. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    PubMed

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
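
    The discrete-time idea behind such predictive models is easy to illustrate. In the sketch below (Python; hypothetical data, and a plain logistic regression standing in for the ANN that OSA actually uses), each subject's follow-up is expanded into one row per time interval with an event indicator, a per-interval hazard is modelled, and the survival curve is the cumulative product of (1 - hazard).

      # Minimal sketch of discrete-time survival modelling (logistic stand-in for
      # an ANN; data are hypothetical).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      x = np.array([0.1, 1.5, 0.7, 2.2, 0.3, 1.9])   # one covariate per subject
      time = np.array([3, 1, 4, 2, 5, 1])            # follow-up in yearly intervals
      event = np.array([0, 1, 1, 1, 0, 1])           # 1 = event observed at `time`

      rows, labels = [], []
      for xi, ti, ei in zip(x, time, event):
          for j in range(1, ti + 1):                 # one row per person-interval
              rows.append([xi, j])
              labels.append(1 if (ei == 1 and j == ti) else 0)

      model = LogisticRegression().fit(np.array(rows), np.array(labels))

      # Predicted survival curve for a new subject: S(t) = prod_{j<=t} (1 - hazard_j)
      hazard = model.predict_proba([[1.0, j] for j in range(1, 6)])[:, 1]
      print(np.round(np.cumprod(1.0 - hazard), 3))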

  14. STED-FLCS: An Advanced Tool to Reveal Spatiotemporal Heterogeneity of Molecular Membrane Dynamics.

    PubMed

    Vicidomini, Giuseppe; Ta, Haisen; Honigmann, Alf; Mueller, Veronika; Clausen, Mathias P; Waithe, Dominic; Galiani, Silvia; Sezgin, Erdinc; Diaspro, Alberto; Hell, Stefan W; Eggeling, Christian

    2015-09-01

    Heterogeneous diffusion dynamics of molecules play an important role in many cellular signaling events, such as of lipids in plasma membrane bioactivity. However, these dynamics can often only be visualized by single-molecule and super-resolution optical microscopy techniques. Using fluorescence lifetime correlation spectroscopy (FLCS, an extension of fluorescence correlation spectroscopy, FCS) on a super-resolution stimulated emission depletion (STED) microscope, we here extend previous observations of nanoscale lipid dynamics in the plasma membrane of living mammalian cells. STED-FLCS allows an improved determination of spatiotemporal heterogeneity in molecular diffusion and interaction dynamics via a novel gated detection scheme, as demonstrated by a comparison between STED-FLCS and previous conventional STED-FCS recordings on fluorescent phosphoglycerolipid and sphingolipid analogues in the plasma membrane of live mammalian cells. The STED-FLCS data indicate that biophysical and biochemical parameters such as the affinity for molecular complexes strongly change over space and time within a few seconds. Drug treatment for cholesterol depletion or actin cytoskeleton depolymerization not only results in the already previously observed decreased affinity for molecular interactions but also in a slight reduction of the spatiotemporal heterogeneity. STED-FLCS specifically demonstrates a significant improvement over previous gated STED-FCS experiments and with its improved spatial and temporal resolution is a novel tool for investigating how heterogeneities of the cellular plasma membrane may regulate biofunctionality. PMID:26235350

  15. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science

    PubMed Central

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883

  16. Genetic tool development underpins recent advances in thermophilic whole‐cell biocatalysts

    PubMed Central

    Taylor, M. P.; van Zyl, L.; Tuffin, I. M.; Leak, D. J.; Cowan, D. A.

    2011-01-01

    Summary The environmental value of sustainably producing bioproducts from biomass is now widely appreciated, with a primary target being the economic production of fuels such as bioethanol from lignocellulose. The application of thermophilic prokaryotes is a rapidly developing niche in this field, driven by their known catabolic versatility with lignocellulose‐derived carbohydrates. Fundamental to the success of this work has been the development of reliable genetic and molecular systems. These technical tools are now available to assist in the development of other (hyper)thermophilic strains with diverse phenotypes such as hemicellulolytic and cellulolytic properties, branched chain alcohol production and other ‘valuable bioproduct’ synthetic capabilities. Here we present an insight into the historical limitations, recent developments and current status of a number of genetic systems for thermophiles. We also highlight the value of reliable genetic methods for increasing our knowledge of thermophile physiology. We argue that the development of robust genetic systems is paramount in the evolution of future thermophilic based bioprocesses and make suggestions for future approaches and genetic targets that will facilitate this process. PMID:21310009

  17. The STREON Recirculation Chamber: An Advanced Tool to Quantify Stream Ecosystem Metabolism in the Benthic Zone

    NASA Astrophysics Data System (ADS)

    Brock, J. T.; Utz, R.; McLaughlin, B.

    2013-12-01

    The STReam Experimental Observatory Network (STREON) is a large-scale experimental effort that will investigate the effects of eutrophication and loss of large consumers in stream ecosystems. STREON represents the first experimental effort undertaken and supported by the National Ecological Observatory Network (NEON). Two treatments will be applied at 10 NEON sites and maintained for 10 years in the STREON program: the addition of nitrate and phosphate to enrich concentrations by five times ambient levels, and electrical fields that exclude top consumers (i.e., fish or invertebrates) of the food web from the surface of buried sediment baskets. Following a 3-5 week period, the sediment baskets will be extracted and incubated in closed, recirculating metabolic chambers to measure rates of respiration, photosynthesis, and nutrient uptake. All STREON-generated data will be open access and available on the NEON web portal. The recirculation chamber represents a critical infrastructural component of STREON. Although researchers have applied such chambers for metabolic and nutrient uptake measurements in the past, the scope of STREON demands a novel design that addresses multiple processes often neglected by earlier models. The STREON recirculation chamber must be capable of: 1) incorporating hyporheic exchange into the flow field to ensure measurements of respiration include the activity of subsurface biota, 2) operating consistently with heterogeneous sediments from sand to cobble, 3) minimizing heat exchange from the motor and external environment, 4) delivering a reproducible uniform flow field over the surface of the sediment basket, and 5) allowing efficient assembly/disassembly with minimal use of tools. The chamber also required a means of accommodating an optical dissolved oxygen probe and a means to inject/extract water. A prototype STREON chamber has been designed and thoroughly tested. The flow field within the chamber has been mapped using particle imaging velocimetry (PIV).

  18. Tool for the Integrated Dynamic Numerical Propulsion System Simulation (NPSS)/Turbine Engine Closed-Loop Transient Analysis (TTECTrA) User's Guide

    NASA Technical Reports Server (NTRS)

    Chin, Jeffrey C.; Csank, Jeffrey T.

    2016-01-01

    The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on the transient engine performance earlier in the design cycle.

  19. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    SciTech Connect

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-15

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  20. Picosecond laser fabrication of micro cutting tool geometries on polycrystalline diamond composites using a high-numerical aperture micro scanning system

    NASA Astrophysics Data System (ADS)

    Eberle, Gregory; Dold, Claus; Wegener, Konrad

    2015-03-01

    The generation of microsized components found in LEDs, watches, and molds, as well as other types of micromechanics and microelectronics, requires a corresponding micro cutting tool in order to be manufactured, typically by milling or turning. Micro cutting tools are made of cemented tungsten carbide and are conventionally fabricated either by electrical discharge machining (EDM) or by grinding. An alternative method is proposed through a laser-based solution operating in the picosecond pulse duration regime, whereby the beam is deflected using a modified galvanometer-driven micro scanning system exhibiting a high numerical aperture. A micro cutting tool material that cannot be easily processed using conventional methods is investigated: a fine-grain polycrystalline diamond composite (PCD). The generation of various micro cutting tool relevant geometries, such as chip breakers and cutting edges, is demonstrated. The generated geometries are subsequently evaluated using scanning electron microscopy (SEM), and quality is measured in terms of surface roughness and cutting edge sharpness. Additionally, two processing strategies, in which the laser beam processes tangentially and orthogonally, are compared in terms of quality.

  1. Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia

    2006-01-01

    The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground-based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer, with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. NPSS will provide improved tools to develop custom components, along with capabilities for zooming to higher-fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.

  2. Earth remote sensing as an effective tool for the development of advanced innovative educational technologies

    NASA Astrophysics Data System (ADS)

    Mayorova, Vera; Mayorov, Kirill

    2009-11-01

    The current educational system faces a contradiction between the fundamentality of engineering education and the need to extend applied learning, which requires new methods of training that combine academic and practical knowledge in balance. As a result, a number of innovations are being developed and implemented in the educational process, aimed at optimizing the quality of the entire educational system. Among a wide range of innovative educational technologies, there is an especially important subset that involves learning through hands-on scientific and technical projects. The purpose of this paper is to describe the implementation of educational technologies based on small satellite development, as well as the use of Earth remote sensing data acquired from these satellites. The increase in public attention to education through Earth remote sensing is based on the concern that, although there has been great progress in the development of new methods of Earth imagery and remote sensing data acquisition, a big question remains open regarding practical applications of this kind of data. It is important to develop a new way of thinking for the new generation of people, so they understand that they are the masters of their own planet and are responsible for its state. They should desire, and should be able, to use a powerful set of tools based on modern and promising Earth remote sensing. For example, NASA sponsors the "Classroom of the Future" project. The Universities Space Research Association in the United States provides a mechanism through which US universities can cooperate effectively with one another, with the government, and with other organizations to further space science and technology, and to promote education in these areas. It also aims at understanding the Earth as a system and promoting the role of humankind in the destiny of their own planet. The Association has founded a Journal of Earth System

  3. A comprehensive tool for image-based generation of fetus and pregnant women mesh models for numerical dosimetry studies

    NASA Astrophysics Data System (ADS)

    Dahdouh, S.; Varsier, N.; Serrurier, A.; De la Plata, J.-P.; Anquez, J.; Angelini, E. D.; Wiart, J.; Bloch, I.

    2014-08-01

    Fetal dosimetry studies require the development of accurate numerical 3D models of the pregnant woman and the fetus. This paper proposes a 3D articulated fetal growth model covering the main phases of pregnancy and a pregnant woman model combining the utero-fetal structures and a deformable non-pregnant woman body envelope. The structures of interest were automatically or semi-automatically (depending on the stage of pregnancy) segmented from a database of images and surface meshes were generated. By interpolating linearly between fetal structures, each one can be generated at any age and in any position. A method is also described to insert the utero-fetal structures in the maternal body. A validation of the fetal models is proposed, comparing a set of biometric measurements to medical reference charts. The usability of the pregnant woman model in dosimetry studies is also investigated, with respect to the influence of the abdominal fat layer.
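
    The linear interpolation of fetal structures described above amounts to blending vertex coordinates between two corresponding meshes. The sketch below (Python; random stand-in arrays, and the assumption that the two meshes share connectivity with vertices in one-to-one correspondence) generates vertices for an intermediate gestational age.

      # Minimal sketch: linear interpolation between two corresponding fetal
      # surface meshes (stand-in data; meshes assumed to share connectivity).
      import numpy as np

      age_a, age_b = 26.0, 32.0                      # gestational ages (weeks)
      verts_a = np.random.rand(5000, 3) * 100.0      # vertex coordinates at age_a (mm)
      verts_b = verts_a * 1.25                       # stand-in for the age_b model
      faces = np.random.randint(0, 5000, (9000, 3))  # triangle indices shared by both

      def interpolate(age):
          w = (age - age_a) / (age_b - age_a)        # 0 at age_a, 1 at age_b
          return (1.0 - w) * verts_a + w * verts_b

      verts_29 = interpolate(29.0)                   # vertices at 29 weeks, same faces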

  4. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    NASA Astrophysics Data System (ADS)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  5. Study of wave runup using numerical models and low-altitude aerial photogrammetry: A tool for coastal management

    NASA Astrophysics Data System (ADS)

    Casella, Elisa; Rovere, Alessio; Pedroncini, Andrea; Mucerino, Luigi; Casella, Marco; Cusati, Luis Alberto; Vacchi, Matteo; Ferrari, Marco; Firpo, Marco

    2014-08-01

    Monitoring the impact of sea storms on coastal areas is fundamental to study beach evolution and the vulnerability of low-lying coasts to erosion and flooding. Modelling wave runup on a beach is possible, but it requires accurate topographic data and model tuning, that can be done comparing observed and modeled runup. In this study we collected aerial photos using an Unmanned Aerial Vehicle after two different swells on the same study area. We merged the point cloud obtained with photogrammetry with multibeam data, in order to obtain a complete beach topography. Then, on each set of rectified and georeferenced UAV orthophotos, we identified the maximum wave runup for both events recognizing the wet area left by the waves. We then used our topography and numerical models to simulate the wave runup and compare the model results to observed values during the two events. Our results highlight the potential of the methodology presented, which integrates UAV platforms, photogrammetry and Geographic Information Systems to provide faster and cheaper information on beach topography and geomorphology compared with traditional techniques without losing in accuracy. We use the results obtained from this technique as a topographic base for a model that calculates runup for the two swells. The observed and modeled runups are consistent, and open new directions for future research.

  6. Assessment of HTGR Helium Compressor Analysis Tool Based on Newton-Raphson Numerical Application to Through-flow Analysis

    SciTech Connect

    Ji Hwan Kim; Hyeun Min Kim; Hee Cheon NO

    2006-07-01

    This study describes the development of a computer program for analyzing the off-design performance of axial flow helium compressors, which is one of the major concerns for the power conversion system of a high temperature gas-cooled reactor (HTGR). The compressor performance has been predicted by the aerodynamic analysis of meridional flow with allowances for losses. The governing equations have been derived from the Euler turbomachine equation and the streamline curvature method, and then merged into linearized equations based on the Newton-Raphson numerical method. The effect of viscosity is considered by empirical correlations to introduce entropy rises caused by primary loss sources. Use of the method has been illustrated by applying it to a 20-stage helium compressor of the GTHTR300 plant. As a result, the flow throughout the stages of the compressor has been predicted, and the compressor characteristics have also been investigated according to the design specification. The program results show much better stability and good convergence with respect to other through-flow methods, and good agreement with the compressor performance map provided by JAEA. (authors)
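
    The Newton-Raphson step used to drive such a through-flow solver can be sketched generically. The fragment below (Python) applies a multivariate Newton-Raphson iteration with a finite-difference Jacobian to a small illustrative nonlinear system; the residual function is a placeholder, not the actual streamline-curvature equations.

      # Minimal sketch: multivariate Newton-Raphson with a finite-difference
      # Jacobian (illustrative residual; not the actual through-flow equations).
      import numpy as np

      def residual(x):
          return np.array([x[0]**2 + x[1]**2 - 4.0,
                           np.exp(x[0]) + x[1] - 1.0])

      def newton_raphson(x0, tol=1e-10, max_iter=50, eps=1e-7):
          x = np.array(x0, dtype=float)
          for _ in range(max_iter):
              r = residual(x)
              if np.linalg.norm(r) < tol:
                  break
              J = np.empty((len(r), len(x)))        # J[i, j] = dR_i / dx_j
              for j in range(len(x)):
                  xp = x.copy()
                  xp[j] += eps
                  J[:, j] = (residual(xp) - r) / eps
              x = x + np.linalg.solve(J, -r)        # solve J dx = -R, then update x
          return x

      print(newton_raphson([1.0, 1.0]))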

  7. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 11: Computer-Aided Manufacturing & Advanced CNC, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  8. Numerical Relativity

    NASA Technical Reports Server (NTRS)

    Baker, John G.

    2009-01-01

    Recent advances in numerical relativity have fueled an explosion of progress in understanding the predictions of Einstein's theory of gravity, General Relativity, for the strong field dynamics, the gravitational radiation wave forms, and consequently the state of the remnant produced from the merger of compact binary objects. I will review recent results from the field, focusing on mergers of two black holes.

  9. Analysis of line-and-space resist patterns with sub-20 nm half-pitch fabricated using high-numerical-aperture exposure tool of extreme ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro

    2016-09-01

    The resolution of resist processes for extreme ultraviolet (EUV) lithography has been steadily improved and has reached the sub-20 nm half-pitch region. Currently, the resist materials capable of resolving 11 nm half-pitch line-and-space patterns are being developed in industrial fields. In this study, the line-and-space resist patterns with sub-20 nm half-pitches were fabricated using a high-numerical-aperture (NA) EUV exposure tool and analyzed by the Monte Carlo simulation. The scanning electron microscopy (SEM) images of resist patterns after their development were compared with the latent images calculated on the basis of the sensitization and reaction mechanisms of chemically amplified EUV resists. The approximate relationship between resist patterns and latent images was clarified for the sub-20 nm half-pitch region. For the realization of 11 nm half-pitch fabrication, the suppression of the stochastic effects in the development process is an important consideration.

  10. Numerical Modeling for Hole-Edge Cracking of Advanced High-Strength Steels (AHSS) Components in the Static Bend Test

    NASA Astrophysics Data System (ADS)

    Kim, Hyunok; Mohr, William; Yang, Yu-Ping; Zelenak, Paul; Kimchi, Menachem

    2011-08-01

    Numerical modeling of local formability, such as hole-edge cracking and shear fracture in bending of AHSS, is one of the challenging issues for simulation engineers in predicting and evaluating the stamping and crash performance of materials. This is because continuum-mechanics-based finite element method (FEM) modeling requires additional input data, "failure criteria," to predict the local formability limit of materials, in addition to the material flow stress data input for simulation. This paper presents a numerical modeling approach for predicting hole-edge failures during static bend tests of AHSS structures. A local-strain-based failure criterion and a stress-triaxiality-based failure criterion were developed and implemented in the LS-DYNA simulation code to predict hole-edge failures in component bend tests. The holes were prepared using two different methods: mechanical punching and water-jet cutting. In the component bend tests, the water-jet-trimmed hole showed delayed fracture at the hole edges, while the mechanically punched hole showed early fracture as the bending angle increased. In comparing the numerical modeling and test results, the load-displacement curve, the displacement at the onset of cracking, and the final crack shape/length were used. Both failure criteria also enable the numerical model to differentiate between the local formability limits of mechanically punched and water-jet-trimmed holes. The failure criteria and static bend test developed here are useful to evaluate the local formability limit at a structural component level for automotive crash tests.
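
    The stress-triaxiality quantity underlying one of these criteria is straightforward to compute from an element's stress tensor. The sketch below (Python) evaluates triaxiality as mean stress over von Mises equivalent stress and performs an illustrative failure check; the failure-strain curve is hypothetical, whereas in practice it would be calibrated from coupon tests.

      # Minimal sketch: stress triaxiality from a Cauchy stress tensor, with an
      # illustrative (hypothetical) failure-strain check.
      import numpy as np

      def triaxiality(sigma):
          mean_stress = np.trace(sigma) / 3.0
          dev = sigma - mean_stress * np.eye(3)            # deviatoric stress
          von_mises = np.sqrt(1.5 * np.tensordot(dev, dev))
          return mean_stress / von_mises

      sigma = np.diag([500.0, 0.0, 0.0])       # uniaxial tension, MPa
      eta = triaxiality(sigma)
      print(round(eta, 3))                     # 0.333 for uniaxial tension

      eps_plastic = 0.12                       # accumulated equivalent plastic strain
      eps_fail = 0.25 * np.exp(-1.5 * eta)     # hypothetical failure-strain curve
      print("failure predicted:", eps_plastic >= eps_fail)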

  11. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 9: Tool and Die, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  12. Creation of an ensemble of simulated cardiac cases and a human observer study: tools for the development of numerical observers for SPECT myocardial perfusion imaging

    NASA Astrophysics Data System (ADS)

    O'Connor, J. Michael; Pretorius, P. Hendrik; Gifford, Howard C.; Licho, Robert; Joffe, Samuel; McGuiness, Matthew; Mehurg, Shannon; Zacharias, Michael; Brankov, Jovan G.

    2012-02-01

    Our previous Single Photon Emission Computed Tomography (SPECT) myocardial perfusion imaging (MPI) research explored the utility of numerical observers. We recently created two hundred and eighty simulated SPECT cardiac cases using Dynamic MCAT (DMCAT) and SIMIND Monte Carlo tools. All simulated cases were then processed with two reconstruction methods: iterative ordered subset expectation maximization (OSEM) and filtered back-projection (FBP). Observer study sets were assembled for both OSEM and FBP methods. Five physicians performed an observer study on one hundred and seventy-nine images from the simulated cases. The observer task was to indicate detection of any myocardial perfusion defect using the American Society of Nuclear Cardiology (ASNC) 17-segment cardiac model and the ASNC five-scale rating guidelines. Human observer Receiver Operating Characteristic (ROC) studies established the guidelines for the subsequent evaluation of numerical model observer (NO) performance. Several NOs were formulated and their performance was compared with the human observer performance. One type of NO was based on evaluation of a cardiac polar map that had been pre-processed using a gradient-magnitude watershed segmentation algorithm. The second type of NO was also based on analysis of a cardiac polar map but with the use of an a priori calculated average image derived from an ensemble of normal cases.
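
    The gradient-magnitude watershed pre-processing mentioned above can be illustrated on a synthetic polar map. The sketch below (Python, using scipy and scikit-image on random smoothed data standing in for a reconstructed perfusion polar map; the marker thresholds are arbitrary) builds markers from low- and high-count regions and runs a watershed on the gradient image.

      # Minimal sketch: gradient-magnitude watershed segmentation of a synthetic
      # 2D "polar map" (stand-in data; marker thresholds are arbitrary).
      import numpy as np
      from scipy import ndimage as ndi
      from skimage.filters import sobel
      from skimage.segmentation import watershed

      rng = np.random.default_rng(1)
      polar_map = ndi.gaussian_filter(rng.random((128, 128)), sigma=6)

      gradient = sobel(polar_map)      # gradient magnitude as the watershed landscape

      markers = np.zeros(polar_map.shape, dtype=int)
      markers[polar_map < np.percentile(polar_map, 10)] = 1   # candidate low-uptake region
      markers[polar_map > np.percentile(polar_map, 90)] = 2   # normal-uptake region

      labels = watershed(gradient, markers)
      print("fraction flagged as low uptake:", round(float(np.mean(labels == 1)), 3))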

  13. What Is Numerical Control?

    ERIC Educational Resources Information Center

    Goold, Vernell C.

    1977-01-01

    Numerical control (a technique involving coded, numerical instructions for the automatic control and performance of a machine tool) does not replace fundamental machine tool training. It should be added to the training program to give the student an additional tool to accomplish production rates and accuracy that were not possible before. (HD)

  14. The R.E.D. tools: advances in RESP and ESP charge derivation and force field library building.

    PubMed

    Dupradeau, François-Yves; Pigache, Adrien; Zaffran, Thomas; Savineau, Corentin; Lelong, Rodolphe; Grivel, Nicolas; Lelong, Dimitri; Rosanski, Wilfried; Cieplak, Piotr

    2010-07-28

    Deriving atomic charges and building a force field library for a new molecule are key steps when developing a force field required for conducting structural and energy-based analysis using molecular mechanics. Derivation of popular RESP charges for a set of residues is a complex and error-prone procedure because it depends on numerous input parameters. To overcome these problems, the R.E.D. Tools (RESP and ESP charge Derive, http://q4md-forcefieldtools.org/RED/) have been developed to perform charge derivation in an automatic and straightforward way. The R.E.D. program handles chemical elements up to bromine in the periodic table. It interfaces different quantum mechanical programs employed for geometry optimization and computing molecular electrostatic potential(s), and performs charge fitting using the RESP program. By defining tight optimization criteria and by controlling the molecular orientation of each optimized geometry, charge values are reproduced at any computer platform with an accuracy of 0.0001 e. The charges can be fitted using multiple conformations, making them suitable for molecular dynamics simulations. R.E.D. also allows defining charge constraints during multiple molecule charge fitting, which are used to derive charges for molecular fragments. Finally, R.E.D. incorporates charges into a force field library, readily usable in molecular dynamics computer packages. For complex cases, such as a set of homologous molecules belonging to a common family, an entire force field topology database is generated. Currently, the atomic charges and force field libraries have been developed for more than fifty model systems and stored in the RESP ESP charge DDataBase. Selected results related to non-polarizable charge models are presented and discussed. PMID:20574571

  15. The R.E.D. Tools: Advances in RESP and ESP charge derivation and force field library building

    PubMed Central

    Dupradeau, François-Yves; Pigache, Adrien; Zaffran, Thomas; Savineau, Corentin; Lelong, Rodolphe; Grivel, Nicolas; Lelong, Dimitri; Rosanski, Wilfried; Cieplak, Piotr

    2010-01-01

    Deriving atomic charges and building a force field library for a new molecule are key steps when developing a force field required for conducting structural and energy-based analysis using molecular mechanics. Derivation of popular RESP charges for a set of residues is a complex and error prone procedure, because it depends on numerous input parameters. To overcome these problems, the R.E.D. Tools (RESP and ESP charge Derive, http://q4md-forcefieldtools.org/RED/) have been developed to perform charge derivation in an automatic and straightforward way. The R.E.D. program handles chemical elements up to bromine in the periodic table. It interfaces different quantum mechanical programs employed for geometry optimization and computing molecular electrostatic potential(s), and performs charge fitting using the RESP program. By defining tight optimization criteria and by controlling the molecular orientation of each optimized geometry, charge values are reproduced at any computer platform with an accuracy of 0.0001 e. The charges can be fitted using multiple conformations, making them suitable for molecular dynamics simulations. R.E.D. allows also for defining charge constraints during multiple molecule charge fitting, which are used to derive charges for molecular fragments. Finally, R.E.D. incorporates charges into a force field library, readily usable in molecular dynamics computer packages. For complex cases, such as a set of homologous molecules belonging to a common family, an entire force field topology database is generated. Currently, the atomic charges and force field libraries have been developed for more than fifty model systems and stored in the RESP ESP charge DDataBase. Selected results related to non-polarizable charge models are presented and discussed. PMID:20574571

  16. Image Navigation and Registration Performance Assessment Tool Set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    NASA Technical Reports Server (NTRS)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.
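
    The large number of sub-image registrations at the core of IPATS can be illustrated with a single chip-to-chip registration. The sketch below (Python) estimates the shift between a truth chip and an observed chip by phase correlation, one common sub-image registration approach; it is not necessarily the algorithm IPATS uses, and the data are synthetic.

      # Minimal sketch: chip-to-chip registration by phase correlation
      # (synthetic data; not necessarily the IPATS algorithm).
      import numpy as np

      def phase_correlation_shift(reference, target):
          # Normalized cross-power spectrum; its inverse FFT peaks at the (row, col)
          # offset of `target` relative to `reference`.
          R = np.fft.fft2(target) * np.conj(np.fft.fft2(reference))
          R /= np.abs(R) + 1e-12
          corr = np.abs(np.fft.ifft2(R))
          peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
          shape = np.array(corr.shape)
          peak[peak > shape / 2] -= shape[peak > shape / 2]   # wrap to signed shifts
          return peak

      rng = np.random.default_rng(0)
      truth_chip = rng.random((64, 64))                      # stand-in truth image chip
      observed_chip = np.roll(truth_chip, (3, -5), (0, 1))   # simulated mis-registration
      print(phase_correlation_shift(truth_chip, observed_chip))   # expect [ 3. -5.]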

  17. Image navigation and registration performance assessment tool set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    NASA Astrophysics Data System (ADS)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-05-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.

  18. Advancing the argument for validity of the Alberta Context Tool with healthcare aides in residential long-term care

    PubMed Central

    2011-01-01

    Background Organizational context has the potential to influence the use of new knowledge. However, despite advances in understanding the theoretical base of organizational context, its measurement has not been adequately addressed, limiting our ability to quantify and assess context in healthcare settings and thus, advance development of contextual interventions to improve patient care. We developed the Alberta Context Tool (the ACT) to address this concern. It consists of 58 items representing 10 modifiable contextual concepts. We reported the initial validation of the ACT in 2009. This paper presents the second stage of the psychometric validation of the ACT. Methods We used the Standards for Educational and Psychological Testing to frame our validity assessment. Data from 645 English speaking healthcare aides from 25 urban residential long-term care facilities (nursing homes) in the three Canadian Prairie Provinces were used for this stage of validation. In this stage we focused on: (1) advanced aspects of internal structure (e.g., confirmatory factor analysis) and (2) relations with other variables validity evidence. To assess reliability and validity of scores obtained using the ACT we conducted: Cronbach's alpha, confirmatory factor analysis, analysis of variance, and tests of association. We also assessed the performance of the ACT when individual responses were aggregated to the care unit level, because the instrument was developed to obtain unit-level scores of context. Results Item-total correlations exceeded acceptable standards (> 0.3) for the majority of items (51 of 58). We ran three confirmatory factor models. Model 1 (all ACT items) displayed unacceptable fit overall and for five specific items (1 item on adequate space for resident care in the Organizational Slack-Space ACT concept and 4 items on use of electronic resources in the Structural and Electronic Resources ACT concept). This prompted specification of two additional models. Model 2 used

  19. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    SciTech Connect

    Liu, H; Liang, X; Kalbasi, A; Lin, A; Ahn, P; Both, S

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of the V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.
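
    The head-to-head plan comparison described above reduces to paired Student's t-tests on per-scan DVH indicators. The snippet below is a generic sketch of that comparison with invented numbers, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-scan values of one DVH indicator (e.g., an OAR mean dose in Gy)
# extracted from the same verification scans for two evaluation methods (V vs. CP).
v_plan = np.array([24.1, 26.3, 22.8, 25.0, 23.5, 27.2])
cp_plan = np.array([24.4, 26.0, 23.1, 24.7, 23.9, 27.0])

t_stat, p_value = stats.ttest_rel(v_plan, cp_plan)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
# p > 0.05 would be consistent with the reported finding of no difference.
```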

  20. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    SciTech Connect

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi; Zhang, Dingkang

    2013-11-29

    This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady- state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  1. Advanced numerical technique for analysis of surface and bulk acoustic waves in resonators using periodic metal gratings

    NASA Astrophysics Data System (ADS)

    Naumenko, Natalya F.

    2014-09-01

    A numerical technique is described that provides a unified approach to the analysis of the different types of acoustic waves utilized in resonators in which a periodic metal grating is used for excitation and reflection of such waves. The combination of Finite Element Method analysis of the electrode domain with Spectral Domain Analysis (SDA) applied to the adjacent upper and lower semi-infinite regions, which may be multilayered and include air as a special case of a dielectric material, enables rigorous simulation of the admittance in resonators using surface acoustic waves, Love waves, plate modes including Lamb waves, Stoneley waves and other waves propagating along the interface between two media, and waves with transient structure between the mentioned types. The matrix formalism with improved convergence incorporated into SDA provides fast and robust simulation for multilayered structures with arbitrary thickness of each layer. The described technique is illustrated by a few examples of its application to various combinations of LiNbO3, isotropic silicon dioxide and silicon with a periodic array of Cu electrodes. The wave characteristics extracted from the admittance functions change continuously with the variation of the film and plate thicknesses over wide ranges, even when the wave nature changes. The transformation of the wave nature with the variation of the layer thicknesses is illustrated by diagrams and contour plots of the displacements calculated at resonant frequencies.

  2. Numerical Modeling for Springback Predictions by Considering the Variations of Elastic Modulus in Stamping Advanced High-Strength Steels (AHSS)

    NASA Astrophysics Data System (ADS)

    Kim, Hyunok; Kimchi, Menachem

    2011-08-01

    This paper presents a numerical modeling approach for predicting springback by considering the variation of elastic modulus in stamping AHSS. Various stamping tests and finite-element method (FEM) simulation codes were used in this study. Cyclic loading-unloading tensile tests were conducted to determine the variation of elastic modulus for dual-phase (DP) 780 sheet steel. The biaxial bulge test was used to obtain plastic flow stress data. The non-linear reduction of elastic modulus with increasing plastic strain was formulated using the Yoshida model, which was implemented in FEM simulations of springback. To understand the effects of material properties on springback, experiments were conducted with a simple geometry, U-shape bending, and more complex geometries, curved flanging and S-rail stamping. Different measurement methods were used to confirm the final part geometry. Two different commercial FEM codes, LS-DYNA and DEFORM, were used for comparison with the experiments. The variable elastic modulus improved springback predictions in the U-shape bending and curved flanging tests compared to FEM with a constant elastic modulus. However, in the S-rail stamping tests, both FEM models with the isotropic hardening model showed limitations in predicting the sidewall curl of the S-rail part after springback. To consider the kinematic hardening and Bauschinger effects that result from material bending-unbending in S-rail stamping, the Yoshida model was used for FEM simulation of S-rail stamping and springback. The FEM predictions showed markedly better correlation with the experiments.
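
    The variable-modulus behaviour referred to above is commonly expressed, in the Yoshida model, as a chord modulus that decays exponentially with accumulated plastic strain, E(eps_p) = E0 - (E0 - Ea)*(1 - exp(-xi*eps_p)). The sketch below evaluates that relation; the DP780-like parameter values are illustrative assumptions, not the fitted constants from this study.

```python
import numpy as np

def chord_modulus(eps_p, E0=207e3, Ea=160e3, xi=60.0):
    """Yoshida-type chord modulus [MPa] as a function of equivalent plastic strain.

    E0: virgin elastic modulus, Ea: saturated modulus at large pre-strain,
    xi: decay rate. The values given here are illustrative, not fitted data.
    """
    return E0 - (E0 - Ea) * (1.0 - np.exp(-xi * eps_p))

for ep in (0.00, 0.02, 0.05, 0.10):
    print(f"eps_p = {ep:.2f}:  E = {chord_modulus(ep) / 1e3:.1f} GPa")
```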

  3. A New Method For Advanced Virtual Design Of Stamping Tools For Automotive Industry: Application To Nodular Cast Iron EN-GJS-600-3

    NASA Astrophysics Data System (ADS)

    Ben-Slima, Khalil; Penazzi, Luc; Mabru, Catherine; Ronde-Oustau, François; Rezaï-Aria, Farhad

    2011-05-01

    This contribution presents an approach combining numerical stamping process simulations and structural analysis in order to improve the design and optimize the tool fatigue life. The method consists of simulating the stamping process via AutoForm® (or any FEM code) while treating the tool as a perfectly rigid body. The estimated contact pressure is then used as the boundary condition for an FEM structural loading analysis. The result of this analysis is used for life prediction of the tool using an S-N fatigue curve. If the prescribed tool life requirements are not satisfied, the critical region of the tool is redesigned and the whole simulation procedure is repeated. This optimization method is applied to the cast iron EN-GJS-600-3 as a candidate stamping tool material. The room-temperature fatigue S-N curves of this alloy were established in the laboratory under uniaxial push/pull cyclic experiments on cylindrical specimens at a load ratio of R (σmin/σmax) = -2.
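
    The final step above, estimating tool life from an S-N curve, can be sketched with a Basquin-type relation sigma_a = sigma_f' * (2*Nf)^b. The parameters and the stress amplitude below are placeholders, not the measured EN-GJS-600-3 data (which were obtained at a load ratio of R = -2).

```python
def basquin_life(stress_amplitude_mpa, sigma_f=700.0, b=-0.09):
    """Cycles to failure from the Basquin relation sigma_a = sigma_f * (2*Nf)**b.

    sigma_f (fatigue strength coefficient, MPa) and b (fatigue strength exponent)
    are illustrative values only, not the measured EN-GJS-600-3 constants.
    """
    return 0.5 * (stress_amplitude_mpa / sigma_f) ** (1.0 / b)

# FEM-estimated stress amplitude at the critical tool region (hypothetical value)
print(f"estimated life: {basquin_life(280.0):.3g} cycles")
```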

  4. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 1: Executive Summary, of a 15-Volume Set of Skills Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    The Machine Tool Advanced Skills Technology (MAST) consortium was formed to address the shortage of skilled workers for the machine tools and metals-related industries. Featuring six of the nation's leading advanced technology centers, the MAST consortium developed, tested, and disseminated industry-specific skill standards and model curricula for…

  5. JUST in time health emergency interventions: an innovative approach to training the citizen for emergency situations using virtual reality techniques and advanced IT tools (the VR Tool).

    PubMed

    Manganas, A; Tsiknakis, M; Leisch, E; Ponder, M; Molet, T; Herbelin, B; Magnetat-Thalmann, N; Thalmann, D; Fato, M; Schenone, A

    2004-01-01

    This paper reports the results of the second of the two systems developed by JUST, a collaborative project supported by the European Union under the Information Society Technologies (IST) Programme. The most innovative content of the project has been the design and development of a complementary training course for non-professional health emergency operators, which supports the traditional learning phase, and which purports to improve the retention capability of the trainees. This was achieved with the use of advanced information technology techniques, which provide adequate support and can help to overcome the present weaknesses of the existing training mechanisms. PMID:15747937

  6. Towards Direct Numerical Simulation of mass and energy fluxes at the soil-atmospheric interface with advanced Lattice Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Krafczyk, Manfred; Geier, Martin; Schönherr, Martin

    2014-05-01

    The quantification of soil evaporation and of soil water content dynamics near the soil surface are critical in the physics of land-surface processes on many scales and are dominated by multi-component and multi-phase mass and energy fluxes between the ground and the atmosphere. Although it is widely recognized that both liquid and gaseous water movement are fundamental factors in the quantification of soil heat flux and surface evaporation, their computation has only started to be taken into account using simplified macroscopic models. As the flow field over the soil can be safely considered as turbulent, it would be natural to study the detailed transient flow dynamics by means of Large Eddy Simulation (LES [1]) where the three-dimensional flow field is resolved down to the laminar sub-layer. Yet this requires very fine resolved meshes allowing a grid resolution of at least one order of magnitude below the typical grain diameter of the soil under consideration. In order to gain reliable turbulence statistics, up to several hundred eddy turnover times have to be simulated which adds up to several seconds of real time. Yet, the time scale of the receding saturated water front dynamics in the soil is on the order of hours. Thus we are faced with the task of solving a transient turbulent flow problem including the advection-diffusion of water vapour over the soil-atmospheric interface represented by a realistic tomographic reconstruction of a real porous medium taken from laboratory probes. Our flow solver is based on the Lattice Boltzmann method (LBM) [2] which has been extended by a Cumulant approach similar to the one described in [3,4] to minimize the spurious coupling between the degrees of freedom in previous LBM approaches and can be used as an implicit LES turbulence model due to its low numerical dissipation and increased stability at high Reynolds numbers. The kernel has been integrated into the research code Virtualfluids [5] and delivers up to 30% of the
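
    As a much simplified stand-in for the cumulant-based kernel described above, the sketch below shows a standard single-relaxation-time (BGK) D2Q9 lattice Boltzmann collision-and-streaming step on a periodic grid. It illustrates the general LBM structure only; the grid size, relaxation time and initial condition are arbitrary, and none of this is code from Virtualfluids.

```python
import numpy as np

# D2Q9 lattice velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    """D2Q9 equilibrium distributions (lattice units, cs^2 = 1/3)."""
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux ** 2 + uy ** 2)
    return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu ** 2 - usq)

def lbm_step(f, tau):
    """One BGK collision plus streaming step on a fully periodic grid."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau            # collision
    for i, (cx, cy) in enumerate(c):                        # streaming
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f

# Tiny periodic demo: quiescent fluid with a small density perturbation
nx, ny, tau = 64, 64, 0.8
rho0 = np.ones((nx, ny))
rho0[nx // 2, ny // 2] += 0.01
f = equilibrium(rho0, np.zeros((nx, ny)), np.zeros((nx, ny)))
for _ in range(100):
    f = lbm_step(f, tau)
print("mass conserved:", np.isclose(f.sum(), rho0.sum()))
```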

  7. Use of advanced earth observation tools for the analyses of recent surface changes in Kalahari pans and Namibian coastal lagoons

    NASA Astrophysics Data System (ADS)

    Behling, Robert; Milewski, Robert; Chabrillat, Sabine; Völkel, Jörg

    2016-04-01

    The remote sensing analyses in the BMBF-SPACES collaborative project Geoarchives - Signals of Climate and Landscape Change preserved in Southern African Geoarchives - focus on the use of recent and upcoming Earth Observation tools for the study of climate and land use changes and their impact on the ecosystem. The work aims at demonstrating the potential of recently available advanced optical remote sensing imagery, with its extended spectral coverage and temporal resolution, for the identification and mapping of sediment features associated with paleo-environmental archives as well as their recent dynamics. In this study we focus on the analysis of two ecosystems of major interest, the Kalahari salt pans and the lagoons on Namibia's west coast, which show high dynamics caused by combined hydrological and surface processes linked to climatic events. Multitemporal remote sensing techniques allow us to derive the recent surface dynamics of the salt pans and also provide opportunities to gain a detailed understanding of the spatiotemporal development of the coastal lagoons. Furthermore, spaceborne hyperspectral analysis can give insight into the current surface mineralogy of the salt pans on a physical basis and provide the intra-pan distribution of evaporites. The soils and sediments of Kalahari salt pans such as the Omongwa pan are a potentially significant store of global carbon and also function as an important terrestrial climate archive. Thus far the surface distribution of evaporites has been assessed only mono-temporally and at a coarse regional scale, and the dynamics of the salt pans, especially the formation of evaporites, are still uncertain and poorly understood. For the salt pan analyses a change detection is applied using the Iteratively Reweighted Multivariate Alteration Detection (IR-MAD) method to identify and investigate surface changes based on a Landsat time series covering the period 1984-2015. Furthermore the current spatial distribution of

  8. Friction Stir Spot Welding of Advanced High Strength Steels

    SciTech Connect

    Santella, Michael L; Hovanski, Yuri; Grant, Glenn J; Frederick, D Alan; Dahl, Michael E

    2009-02-01

    Friction stir spot welding was used to join two advanced high-strength steels using polycrystalline cubic boron nitride tooling. Numerous tool designs were employed to study the influence of tool geometry on weld joints produced in both DP780 and a hot-stamp boron steel. Tool designs included conventional, concave shouldered pin tools with several pin configurations; a number of shoulderless designs; and a convex, scrolled shoulder tool. Weld quality was assessed based on lap shear strength, microstructure, microhardness, and bonded area. Mechanical properties were functionally related to bonded area and joint microstructure, demonstrating the necessity to characterize processing windows based on tool geometry.

  9. Friction Stir Spot Welding of Advanced High Strength Steels

    SciTech Connect

    Hovanski, Yuri; Santella, M. L.; Grant, Glenn J.

    2009-12-28

    Friction stir spot welding was used to join two advanced high-strength steels using polycrystalline cubic boron nitride tooling. Numerous tool designs were employed to study the influence of tool geometry on weld joints produced in both DP780 and a hot-stamp boron steel. Tool designs included conventional, concave shouldered pin tools with several pin configurations; a number of shoulderless designs; and a convex, scrolled shoulder tool. Weld quality was assessed based on lap shear strength, microstructure, microhardness, and bonded area. Mechanical properties were functionally related to bonded area and joint microstructure, demonstrating the necessity to characterize processing windows based on tool geometry.

  10. Implementation of an advanced hybrid MPC-PID control system using PAT tools into a direct compaction continuous pharmaceutical tablet manufacturing pilot plant.

    PubMed

    Singh, Ravendra; Sahay, Abhishek; Karry, Krizia M; Muzzio, Fernando; Ierapetritou, Marianthi; Ramachandran, Rohit

    2014-10-01

    It is desirable for a pharmaceutical final dosage form to be manufactured through a quality by design (QbD)-based approach rather than a quality by testing (QbT) approach. An automatic feedback control system coupled with PAT tools, which is part of the QbD paradigm shift, has the potential to ensure that the pre-defined end-product quality attributes are met in a time- and cost-efficient manner. In this work, an advanced hybrid MPC-PID control architecture coupled with real-time inline/online monitoring tools and an additional supervisory control layer based on principal component analysis (PCA) has been proposed for a continuous direct compaction tablet manufacturing process. The advantages of both MPC and PID are utilized in a hybrid scheme. The control hardware and software integration and the implementation of the control system have been demonstrated using the feeder and blending unit operations of a continuous tablet manufacturing pilot plant and an NIR-based PAT tool. The advanced hybrid MPC-PID control scheme leads to enhanced control-loop performance for the critical quality attributes in comparison to a regulatory (e.g. PID) control scheme, indicating its potential to improve pharmaceutical product quality. PMID:24974987
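
    The hybrid idea above can be caricatured in a few lines: an outer model-predictive layer chooses a setpoint trajectory from a crude process model, and an inner PID loop tracks it. The first-order plant model, gains and horizon below are invented for illustration and bear no relation to the actual pilot-plant configuration.

```python
import numpy as np

class PID:
    """Discrete PID controller (inner regulatory layer)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def mpc_setpoint(y, target, horizon=10, a=0.9, b=0.5):
    """Outer supervisory layer: brute-force search for the constant input that
    minimises predicted tracking error for a first-order model y[k+1] = a*y[k] + b*u[k],
    returning the one-step-ahead prediction as the PID setpoint."""
    best_u, best_cost = 0.0, float("inf")
    for u in np.linspace(0.0, 1.0, 21):
        yp, cost = y, 0.0
        for _ in range(horizon):
            yp = a * yp + b * u
            cost += (target - yp) ** 2
        if cost < best_cost:
            best_u, best_cost = u, cost
    return a * y + b * best_u

# Simulated closed loop on a first-order plant (all parameters hypothetical)
dt, y, target = 1.0, 0.0, 2.0
pid = PID(kp=0.8, ki=0.2, kd=0.05, dt=dt)
for _ in range(50):
    sp = mpc_setpoint(y, target)    # supervisory (MPC) layer
    u = pid.update(sp, y)           # regulatory (PID) layer
    y = 0.9 * y + 0.5 * u           # plant response
print(f"final output: {y:.3f} (target {target})")
```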

  11. When does a protein become an allergen? Searching for a dynamic definition based on most advanced technology tools

    PubMed Central

    Mari, A

    2008-01-01

    Since the early beginning of allergology as a science considerable efforts have been made by clinicians and researchers to identify and characterize allergic triggers as raw allergenic materials, allergenic sources and tissues, and more recently basic allergenic structures defined as molecules. The last 15–20 years have witnessed many centres focusing on the identification and characterization of allergenic molecules leading to an expanding wealth of knowledge. The need to organize this information leads to the most important question ‘when does a protein become an allergen?’ In this article, I try to address this question by reviewing a few basic concepts of the immunology of IgE-mediated diseases, reporting on the current diagnostic and epidemiological tools used for allergic disease studies and discussing the usefulness of novel biotechnology tools (i.e. proteomics and molecular biology approaches), information technology tools (i.e. Internet-based resources) and microtechnology tools (i.e. proteomic microarray for IgE testing on molecular allergens). A step-wise staging of the identification and characterization process, including bench, clinical and epidemiological aspects, is proposed, in order to classify allergenic molecules dynamically. This proposal reflects the application and use of all the new tools available from current technologies. PMID:18477011

  12. Using Advanced Monitoring Tools to Evaluate PM2.5 in San Joaquin Valley

    EPA Science Inventory

    One of the primary data deficiencies that prevent the advance of policy relevant research on particulate matter, ozone, and associated precursors is the lack of measurement data and knowledge on the true vertical profile and synoptic-scale spatial distributions of the pollutants....

  13. Final Progress Report: Collaborative Research: Decadal-to-Centennial Climate & Climate Change Studies with Enhanced Variable and Uniform Resolution GCMs Using Advanced Numerical Techniques

    SciTech Connect

    Fox-Rabinovitz, M; Cote, J

    2009-06-05

    The joint U.S.-Canadian project has been devoted to: (a) decadal climate studies using the developed state-of-the-art GCMs (General Circulation Models) with enhanced variable and uniform resolution; (b) development and implementation of advanced numerical techniques; (c) research in parallel computing and associated numerical methods; (d) atmospheric chemistry experiments related to climate issues; (e) validation of regional climate modeling strategies for nested- and stretched-grid models. The variable-resolution stretched-grid (SG) GCMs produce accurate and cost-efficient regional climate simulations with mesoscale resolution. The advantage of the stretched-grid approach is that it allows us to preserve the high quality of both global and regional circulations while providing consistent interactions between global and regional scales and phenomena. The major accomplishment of the project has been the successful international SGMIP-1 and SGMIP-2 (Stretched-Grid Model Intercomparison Project, phase-1 and phase-2) based on these research developments and activities. The SGMIP provides unique high-resolution regional and global multi-model ensembles beneficial for regional climate modeling and the broader modeling community. The U.S. SGMIP simulations have been produced using SciDAC ORNL supercomputers. Collaborations with other international participants, M. Deque (Meteo-France) and J. McGregor (CSIRO, Australia), and their centers and groups have been beneficial for the strong joint effort, especially for the SGMIP activities. The WMO/WCRP/WGNE endorsed the SGMIP activities in 2004-2008. This project reflects a trend in the modeling and broader communities to move towards regional and sub-regional assessments and applications important for the U.S. and Canadian public, business and policy decision makers, as well as for international collaborations on regional, and especially climate-related, issues.

  14. Impact of gastrointestinal parasitic nematodes of sheep, and the role of advanced molecular tools for exploring epidemiology and drug resistance - an Australian perspective

    PubMed Central

    2013-01-01

    Parasitic nematodes (roundworms) of small ruminants and other livestock have major economic impacts worldwide. Despite the impact of the diseases caused by these nematodes and the discovery of new therapeutic agents (anthelmintics), there has been relatively limited progress in the development of practical molecular tools to study the epidemiology of these nematodes. Specific diagnosis underpins parasite control, and the detection and monitoring of anthelmintic resistance in livestock parasites, presently a major concern around the world. The purpose of the present article is to provide a concise account of the biology and knowledge of the epidemiology of the gastrointestinal nematodes (order Strongylida), from an Australian perspective, and to emphasize the importance of utilizing advanced molecular tools for the specific diagnosis of nematode infections for refined investigations of parasite epidemiology and drug resistance detection in combination with conventional methods. It also gives a perspective on the possibility of harnessing genetic, genomic and bioinformatic technologies to better understand parasites and control parasitic diseases. PMID:23711194

  15. Neutron interaction tool, PyNIC, for advanced applications in nuclear power, nuclear medicine, and nuclear security

    NASA Astrophysics Data System (ADS)

    Moffitt, Gregory Bruce

    A neutron interaction simulation tool, PyNIC, was developed for the calculation of neutron activation products and prompt gamma-ray emission from neutron capture, neutron inelastic scattering, and fission interactions. The tool was developed in Python with a graphical user interface to facilitate its application. The tool was validated for neutron activation analysis of a number of samples irradiated in the University of Utah TRIGA Reactor. These samples included nickel wire and the NIST standard for coal fly ash. The experimentally determined isotopes for coal fly ash were 56Mn, 40K, and 139Ba. The samples were irradiated at reactor power levels from 1 kW to 90 kW, and the average percent difference between PyNIC-estimated and laboratory-measured values was 4%, 24%, 38%, and 22% for 64Ni, 56Mn, 40K, and 139Ba, respectively. These differences are mainly attributed to the calibration of the high-purity germanium detector and count times that were too short. The PyNIC tool is applicable to neutron activation analysis but can also find applications in nuclear power, nuclear medicine, and homeland security, such as predicting the contents of explosives and special nuclear materials in samples of complex and unknown origin.
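
    The basic quantity behind such activation calculations is the standard production-decay relation A = N*sigma*phi*(1 - exp(-lambda*t_irr))*exp(-lambda*t_decay). The sketch below is a generic, thin-sample illustration of that formula, not PyNIC code, and the numbers are placeholders.

```python
import numpy as np

def activation_activity(N_target, sigma_cm2, flux, half_life_s, t_irr_s, t_decay_s):
    """Induced activity [Bq] after irradiation and subsequent decay.

    N_target:    number of target atoms
    sigma_cm2:   capture cross section [cm^2]
    flux:        neutron flux [n/cm^2/s]
    half_life_s: product half-life [s]
    """
    lam = np.log(2.0) / half_life_s
    production = N_target * sigma_cm2 * flux   # reactions/s (thin-sample approximation)
    return production * (1.0 - np.exp(-lam * t_irr_s)) * np.exp(-lam * t_decay_s)

# Illustrative example: thermal capture on a small nickel sample (placeholder values)
A = activation_activity(N_target=1e20, sigma_cm2=1.5e-24, flux=1e12,
                        half_life_s=2.52 * 3600, t_irr_s=600, t_decay_s=1800)
print(f"activity ~ {A:.3e} Bq")
```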

  16. FACILITATING ADVANCED URBAN METEOROLOGY AND AIR QUALITY MODELING CAPABILITIES WITH HIGH RESOLUTION URBAN DATABASE AND ACCESS PORTAL TOOLS

    EPA Science Inventory

    Information of urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called National Urban Database with Access Portal Tool, (NUDAPT) that addresses this nee...

  17. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and displays the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and running in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater. PMID:23982824

  18. Laser Nano-Neurosurgery from Gentle Manipulation to Nano-Incision of Neuronal Cells and Scaffolds: An Advanced Neurotechnology Tool.

    PubMed

    Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco

    2016-01-01

    Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single-molecules is becoming a well-established methodology, thus founding the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery. PMID:27013962

  19. Laser Nano-Neurosurgery from Gentle Manipulation to Nano-Incision of Neuronal Cells and Scaffolds: An Advanced Neurotechnology Tool

    PubMed Central

    Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco

    2016-01-01

    Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single-molecules is becoming a well-established methodology, thus founding the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery. PMID:27013962

  20. Development of a numerical tool to study the mixing phenomenon occurring during mode one operation of a multi-mode ejector-augmented pulsed detonation rocket engine

    NASA Astrophysics Data System (ADS)

    Dawson, Joshua

    simple and, as a result of the rapid combustion process, the engine cycle is more efficient compared to its combined-cycle counterparts. The flow path geometry consists of an inlet system, followed just downstream by a mixing chamber where an ejector structure is placed within the flow path. Downstream of the ejector structure is a duct leading to a convergent-divergent nozzle. During mode one operation and within the ejector, products from the detonation of a stoichiometric hydrogen/air mixture are exhausted directly into the surrounding secondary air stream. Mixing then occurs between the primary and secondary flow streams, at which point the air mass containing the high-pressure, high-temperature reaction products is convected downstream towards the nozzle. The engine cycle is engineered to a specific number of detonations per second, creating the pulsating characteristic of the primary flow. The pulsing nature of the primary flow serves as a momentum augmentation, enhancing the thrust and specific impulse at low speeds. Consequently, it is necessary to understand the transient mixing process between the primary and secondary flow streams occurring during mode one operation. Using OpenFOAM®, an analytic tool is developed to simulate the dynamics of the turbulent detonation process along with detailed chemistry in order to understand the physics involved in the stream interactions. The computational code has been developed within the framework of OpenFOAM®, an open-source alternative to commercial CFD software. A conservative formulation of the Favre-averaged Navier-Stokes equations is implemented to facilitate programming and numerical stability. Time discretization is accomplished using the Crank-Nicolson method, achieving second-order convergence in time. Species mass fraction transport equations are implemented, and a Seulex ODE solver is used to resolve the system of ordinary differential equations describing the hydrogen-air reaction mechanism detailed
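
    The Crank-Nicolson time discretization mentioned above can be illustrated on a much simpler problem, 1-D diffusion with fixed boundaries. The sketch below is an independent toy example, not code from the solver described, and the diffusivity, grid and time step are arbitrary.

```python
import numpy as np

def crank_nicolson_diffusion(u0, D, dx, dt, steps):
    """Crank-Nicolson integration of u_t = D * u_xx with fixed (Dirichlet) boundaries."""
    n = len(u0)
    r = D * dt / (2.0 * dx ** 2)
    # Build (I - r*L) and (I + r*L) for the second-difference operator L
    A = np.eye(n) * (1 + 2 * r)
    B = np.eye(n) * (1 - 2 * r)
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -r
        B[i, i + 1] = B[i + 1, i] = r
    u = u0.copy()
    for _ in range(steps):
        u = np.linalg.solve(A, B @ u)
        u[0], u[-1] = u0[0], u0[-1]      # re-impose the boundary values
    return u

# Demo: a Gaussian hot spot diffusing on a 1-D rod
x = np.linspace(0.0, 1.0, 101)
u0 = np.exp(-200 * (x - 0.5) ** 2)
u = crank_nicolson_diffusion(u0, D=1e-3, dx=x[1] - x[0], dt=0.1, steps=50)
print(f"peak value after diffusion: {u.max():.3f}")
```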

  1. The Advanced Course in Professional Selling

    ERIC Educational Resources Information Center

    Loe, Terry; Inks, Scott

    2014-01-01

    More universities are incorporating sales content into their curriculums, and although the introductory courses in professional sales have much common ground and guidance from numerous professional selling texts, instructors teaching the advanced selling course lack the guidance provided by common academic tools and materials. The resulting…

  2. Investigation of Alien Wavelength Quality in Live Multi-Domain, Multi-Vendor Link Using Advanced Simulation Tool

    NASA Astrophysics Data System (ADS)

    Nordal Petersen, Martin; Nuijts, Roeland; Lange Bjørn, Lars

    2014-05-01

    This article presents an advanced optical model for simulation of alien wavelengths in multi-domain and multi-vendor dense wavelength-division multiplexing networks. The model aids optical network planners with a better understanding of the non-linear effects present in dense wavelength-division multiplexing systems and better utilization of alien wavelengths in future applications. The limiting physical effects for alien wavelengths are investigated in relation to power levels, channel spacing, and other factors. The simulation results are verified through experimental setup in live multi-domain dense wavelength-division multiplexing systems between two national research networks: SURFnet in Holland and NORDUnet in Denmark.

  3. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  4. Single cell analysis applied to antibody fragment production with Bacillus megaterium: development of advanced physiology and bioprocess state estimation tools

    PubMed Central

    2011-01-01

    Background Single cell analysis for bioprocess monitoring is an important tool to gain deeper insights into particular cell behavior and population dynamics of production processes and can be very useful for discrimination of the real bottleneck between product biosynthesis and secretion, respectively. Results Here different dyes for viability estimation considering membrane potential (DiOC2(3), DiBAC4(3), DiOC6(3)) and cell integrity (DiBAC4(3)/PI, Syto9/PI) were successfully evaluated for Bacillus megaterium cell characterization. It was possible to establish an appropriate assay to measure the production intensities of single cells revealing certain product secretion dynamics. Methods were tested regarding their sensitivity by evaluating fluorescence surface density and fluorescent specific concentration in relation to the electronic cell volume. The assays established were applied at different stages of a bioprocess where the antibody fragment D1.3 scFv production and secretion by B. megaterium was studied. Conclusions It was possible to distinguish between live, metabolic active, depolarized, dormant, and dead cells and to discriminate between high and low productive cells. The methods were shown to be suitable tools for process monitoring at single cell level allowing a better process understanding, increasing robustness and forming a firm basis for physiology-based analysis and optimization with the general application for bioprocess development. PMID:21496219

  5. Recent advances in elementary flux modes and yield space analysis as useful tools in metabolic network studies.

    PubMed

    Horvat, Predrag; Koller, Martin; Braunegg, Gerhart

    2015-09-01

    A review of the use of elementary flux modes (EFMs) and their applications in metabolic engineering, together with yield space analysis (YSA), is presented. EFMs are an invaluable tool in the mathematical modeling of biochemical processes. They are described from their inception in 1994, followed by various improvements in their computation in later years. YSA constitutes another valuable tool for metabolic network modeling and is presented in detail along with EFMs in this article. The application of these techniques is discussed for several case studies of metabolic network modeling provided in the respective original articles. The article concludes with some case studies in which the application of EFMs and YSA proved most useful, such as the analysis of intracellular polyhydroxyalkanoate (PHA) formation and consumption in Cupriavidus necator, including the constraint-based description of the steady-state flux cone of the strain's metabolic network, the profound analysis of a continuous five-stage bioreactor cascade for PHA production by C. necator using EFMs and, finally, the study of metabolic fluxes in the metabolic network of C. necator cultivated on glycerol. PMID:26066363

  6. Fiber Optic Fourier Transform Infrared Spectroscopic Techniques for Advanced On-Line Chemical Analysis in Semiconductor Fabrication Tools

    NASA Astrophysics Data System (ADS)

    Kester, Michael; Trygstad, Marc; Chabot, Paul

    2003-09-01

    A unique analytical methodology has recently been developed to perform real-time, on-line chemical analysis of bath solutions in semiconductor fabrication tools. A novel, patented fiber optic sensor is used to transmit infrared light directly through the tube walls of the circulating bath solutions within the fabrication tool in a completely non-invasive, non-extractive way. The sensor simply "clips" onto the tubing, thus permitting immediate analysis of the bath composition by Fourier Transform infrared (FTIR) spectroscopy. The infrared spectrometer is capable of multiplexing up to eight "Clippir™" sensor heads to a single interferometer using fiber optic cables. The instrument can analyze almost any bath solution utilized today. The analysis is performed using the near-infrared (NIR) portion of the electromagnetic spectrum, where absorption bands related to molecular vibrations can be found. The Fourier Transform infrared spectrometer gives access to absorption bands over a wide range of frequencies (or wavelengths), and the absorptions are correlated to concentrations using a chemometric approach employing a partial least-squares algorithm. Models are generated from this approach for each chemistry to be analyzed. This paper will review the analytical technology necessary to make such measurements, and discuss the instrument performance criteria required to achieve accurate and precise measurements of bath chemistries. The ability to measure non-infrared absorbing compounds will be discussed, as will the nature of the influence of sample temperature on measurement. Issues critical to the development of robust models and their direct implementation on multiple channels and even different instruments will be considered.
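
    The chemometric step described above, calibrating a partial least-squares model that maps NIR absorbance spectra to bath concentrations, can be sketched with scikit-learn. The synthetic spectra, band position and noise level below are assumptions for illustration; the real models are built from measured calibration standards.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic calibration set: 40 NIR spectra (500 wavelength points) whose
# absorbance scales linearly with an analyte concentration, plus noise.
wavelengths = np.linspace(1100, 2100, 500)
pure_band = np.exp(-0.5 * ((wavelengths - 1450) / 30.0) ** 2)
conc = rng.uniform(0.5, 5.0, size=40)                        # e.g., percent by weight
spectra = conc[:, None] * pure_band + 0.01 * rng.normal(size=(40, 500))

# Fit a PLS model and predict the concentration of a new "unknown" spectrum
pls = PLSRegression(n_components=3)
pls.fit(spectra, conc)
unknown = 2.7 * pure_band + 0.01 * rng.normal(size=500)
print(f"predicted concentration: {pls.predict(unknown[None, :]).ravel()[0]:.2f}")
```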

  7. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    SciTech Connect

    Naqvi, S

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as
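
    The second code described above integrates electron trajectories under an arbitrary space/time-dependent Lorentz force with relativistic kinematics. The sketch below is a minimal stand-in (a second-order Runge-Kutta momentum update rather than the authors' integrator), with an illustrative uniform magnetic field.

```python
import numpy as np

Q = -1.602176634e-19    # electron charge [C]
M = 9.1093837015e-31    # electron rest mass [kg]
C = 299792458.0         # speed of light [m/s]

def lorentz_trajectory(r0, p0, E_field, B_field, dt, steps):
    """Integrate dr/dt = p/(gamma*m), dp/dt = q(E + v x B) with a midpoint (RK2) scheme.

    E_field(r, t) and B_field(r, t) return 3-vectors; gamma is recomputed from the
    relativistic momentum at every evaluation.
    """
    def deriv(r, p, t):
        gamma = np.sqrt(1.0 + np.dot(p, p) / (M * C) ** 2)
        v = p / (gamma * M)
        return v, Q * (E_field(r, t) + np.cross(v, B_field(r, t)))

    r, p, traj = np.array(r0, float), np.array(p0, float), []
    for k in range(steps):
        t = k * dt
        dr1, dp1 = deriv(r, p, t)
        dr2, dp2 = deriv(r + 0.5 * dt * dr1, p + 0.5 * dt * dp1, t + 0.5 * dt)
        r, p = r + dt * dr2, p + dt * dp2
        traj.append(r.copy())
    return np.array(traj)

# Example: electron gyrating in a uniform 0.01 T field (gamma ~ 1.5, values illustrative)
B0 = np.array([0.0, 0.0, 0.01])
traj = lorentz_trajectory(np.zeros(3), np.array([3.0e-22, 0.0, 0.0]),
                          lambda r, t: np.zeros(3), lambda r, t: B0,
                          dt=1e-12, steps=6000)
print("gyro-orbit extent in x ~", np.ptp(traj[:, 0]), "m")
```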

  8. Process variation monitoring (PVM) by wafer inspection tool as a complementary method to CD-SEM for mapping field CDU on advanced production devices

    NASA Astrophysics Data System (ADS)

    Kim, Dae Jong; Yoo, Hyung Won; Kim, Chul Hong; Lee, Hak Kwon; Kim, Sung Su; Bae, Koon Ho; Spielberg, Hedvi; Lee, Yun Ho; Levi, Shimon; Bustan, Yariv; Rozentsvige, Moshe

    2010-03-01

    As design rules shrink, Critical Dimension Uniformity (CDU) and Line Edge Roughness (LER) have a dramatic effect on printed final lines, and hence the need to control these parameters increases. Sources of CDU and LER variations include scanner auto-focus accuracy and stability, layer stack thickness, composition variations, and exposure variations. In advanced VLSI production designs, specifically in memory devices, process variations attributed to CDU and LER affect cell-to-cell parametric variations. These variations significantly impact device performance and die yield. Traditionally, measurements of LER are performed by CD-SEM or OCD metrology tools. Typically, these measurements require a relatively long time to set up and cover only selected points of the wafer area. In this paper we present the results of collaborative work by the Process Diagnostic & Control Business Unit of Applied Materials and Hynix Semiconductor Inc. on the implementation of a method complementary to the CD-SEM and OCD tools, to monitor defect density and post-litho-develop CDU and LER on production wafers. The method, referred to as Process Variation Monitoring (PVM), is based on measuring variations in the light scattered from periodic structures. The application is demonstrated using an Applied Materials DUV bright field (BF) wafer inspection tool under optimized illumination and collection conditions. The UVision™ tool has already passed a successful feasibility study on DRAM products with 66nm and 54nm design rules. The tool has shown high sensitivity to variations across an FEM wafer in both the exposure and focus axes. In this article we show how PVM can help detect field-to-field variations on DRAM wafers with a 44nm design rule during normal production runs. The complex die layout and the shrink in cell dimensions require high sensitivity to local variations within dies or fields. During normal scans of production wafers, local process variations are translated into GL (Grey Level) values

  9. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 4: Manufacturing Engineering Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  10. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 6: Welding, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  11. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 10: Computer-Aided Drafting & Design, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  12. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 14: Automated Equipment Technician (CIM), of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  13. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 2: Career Development, General Education and Remediation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  14. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 15: Administrative Information, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This volume developed by the Machine Tool Advanced Skill Technology (MAST) program contains key administrative documents and provides additional sources for machine tool and precision manufacturing information and important points of contact in the industry. The document contains the following sections: a foreword; grant award letter; timeline for…

  15. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational speciality areas within the U.S. machine tool and metals-related…

  16. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 3: Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  17. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 7: Industrial Maintenance Technology, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  18. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 13: Laser Machining, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  19. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 8: Sheet Metal & Composites, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 12: Instrumentation, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational specialty areas within the U.S. machine tool and metals-related…

  1. Advanced image processing methods as a tool to map and quantify different types of biological soil crust

    NASA Astrophysics Data System (ADS)

    Rodríguez-Caballero, Emilio; Escribano, Paula; Cantón, Yolanda

    2014-04-01

    Biological soil crusts (BSCs) modify numerous soil surface properties and affect many key ecosystem processes. As BSCs are considered one of the most important components of semiarid ecosystems, accurate characterisation of their spatial distribution is increasingly in demand. This paper describes a novel methodology for identifying the areas dominated by different types of BSCs and quantifying their relative cover at subpixel scale in a semiarid ecosystem of SE Spain. The approach consists of two consecutive steps: (i) Support Vector Machine (SVM) classification to identify the main ground units, dominated by homogeneous surface cover (bare soil, cyanobacteria BSC, lichen BSC, green and dry vegetation), which are of strong ecological relevance. (ii) Spectral mixture analysis (SMA) of the ground units to quantify the proportion of each type of surface cover within each pixel, to correctly characterize the complex spatial heterogeneity inherent to semiarid ecosystems. SVM classification showed very good results, with a Kappa coefficient of 0.93, discriminating among areas dominated by bare soil, cyanobacteria BSC, lichen BSC, green and dry vegetation. Subpixel relative abundance images achieved relatively high accuracy for both types of BSCs (about 80%), whereas a general overestimation of vegetation was observed. Our results open the possibility of introducing the effect of the presence and relative cover of BSCs into spatially distributed hydrological and ecological models, and into assessment and monitoring aimed at reducing degradation in these areas.
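
    The SMA step above solves, pixel by pixel, a linear mixing problem constrained to non-negative abundances. The sketch below uses non-negative least squares with invented endmember spectra for the five ground units; all numbers are purely illustrative.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember spectra (rows: bands; columns: bare soil, cyanobacteria
# BSC, lichen BSC, green vegetation, dry vegetation) -- illustrative values only.
endmembers = np.array([
    [0.10, 0.12, 0.15, 0.04, 0.20],
    [0.18, 0.16, 0.20, 0.08, 0.28],
    [0.25, 0.20, 0.24, 0.06, 0.33],
    [0.30, 0.24, 0.27, 0.40, 0.38],
    [0.34, 0.27, 0.30, 0.35, 0.42],
    [0.38, 0.29, 0.31, 0.30, 0.45],
])

def unmix(pixel_spectrum, E=endmembers):
    """Non-negative least-squares unmixing; abundances renormalized to sum to one."""
    abundances, _ = nnls(E, pixel_spectrum)
    return abundances / abundances.sum()

# A synthetic pixel mixing 60% cyanobacteria BSC with 40% bare soil, plus noise
rng = np.random.default_rng(2)
pixel = 0.6 * endmembers[:, 1] + 0.4 * endmembers[:, 0] + 0.002 * rng.normal(size=6)
print(np.round(unmix(pixel), 2))
```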

  2. Comparison of Analytic and Numerical Models With Commercially Available Simulation Tools for the Prediction of Semiconductor Freeze-Out and Exhaustion

    NASA Astrophysics Data System (ADS)

    Reeves, Derek E.

    2002-09-01

    This thesis reports on three procedures, and the associated numerical results, for obtaining semiconductor majority carrier concentrations under a temperature sweep. The capability of predicting the exhaustion regime boundaries of a semiconductor is critical to understanding and exploiting the full potential of the modern integrated circuit, and an efficient and reliable method is needed to accomplish this task. Silvaco International's semiconductor simulation software was used to predict the temperature-dependent majority carrier concentration for a semiconductor cell, and comparisons with analytical and numerical MATLAB-based schemes were made for both silicon and GaAs materials. The simulation conditions demonstrated the effect known as bandgap narrowing.
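
    The freeze-out and exhaustion behaviour discussed above follows from the charge-neutrality condition for the donor level. The sketch below solves that condition numerically for an uncompensated n-type semiconductor with roughly silicon-like, illustrative parameters; it neglects the high-temperature intrinsic regime and bandgap narrowing, so it is not a substitute for the Silvaco or MATLAB calculations reported.

```python
import numpy as np
from scipy.optimize import brentq

KB = 8.617333262e-5        # Boltzmann constant [eV/K]

def electron_concentration(T, Nd=1e16, Ed=0.045, g=2.0, Nc300=2.8e19):
    """Majority-carrier concentration [cm^-3] for an uncompensated n-type donor level.

    Solves n = Nd+ with Nd+ = Nd / (1 + g*(n/Nc)*exp(Ed/kT)), where Ed is the donor
    ionization energy and Nc scales as T**1.5. All parameter values are illustrative.
    """
    Nc = Nc300 * (T / 300.0) ** 1.5
    f = lambda n: n - Nd / (1.0 + g * (n / Nc) * np.exp(Ed / (KB * T)))
    return brentq(f, 1e-3, Nd)

for T in (50, 100, 200, 300, 500):
    print(f"T = {T:3d} K:  n = {electron_concentration(T):.2e} cm^-3")
```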

  3. Interferometric correction system for a numerically controlled machine

    DOEpatents

    Burleson, Robert R.

    1978-01-01

    An interferometric correction system for a numerically controlled machine is provided to improve the positioning accuracy of a machine tool, for example, for a high-precision numerically controlled machine. A laser interferometer feedback system is used to monitor the positioning of the machine tool which is being moved by command pulses to a positioning system to position the tool. The correction system compares the commanded position as indicated by a command pulse train applied to the positioning system with the actual position of the tool as monitored by the laser interferometer. If the tool position lags the commanded position by a preselected error, additional pulses are added to the pulse train applied to the positioning system to advance the tool closer to the commanded position, thereby reducing the lag error. If the actual tool position is leading in comparison to the commanded position, pulses are deleted from the pulse train where the advance error exceeds the preselected error magnitude to correct the position error of the tool relative to the commanded position.
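
    The sketch below illustrates the lag/lead correction idea in a few lines: the commanded position implied by the pulse train is compared with the interferometer reading, and pulses are added to or deleted from the train once the error exceeds a preselected threshold. The step size and deadband values are assumptions for illustration, not parameters from the patent.

      # Minimal sketch of the lag/lead pulse-correction idea (illustrative parameters).
      STEP = 0.5          # tool travel per command pulse, micrometres (assumed)
      DEADBAND = 2.0      # preselected error threshold, micrometres (assumed)

      def correct(commanded_pulses, measured_position_um):
          """Return extra pulses to add (positive) or delete (negative) from the pulse train."""
          commanded_position = commanded_pulses * STEP
          error = commanded_position - measured_position_um   # >0 means the tool lags the command
          if error > DEADBAND:
              return int(error / STEP)      # add pulses to advance the lagging tool
          if error < -DEADBAND:
              return int(error / STEP)      # negative: delete pulses when the tool leads
          return 0

      print(correct(1000, 489.0))   # tool lags by 11 um  -> add pulses
      print(correct(1000, 503.5))   # tool leads by 3.5 um -> delete pulses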

  4. Development and implementation of a portable grating interferometer system as a standard tool for testing optics at the Advanced Photon Source beamline 1-BM.

    PubMed

    Assoufid, Lahsen; Shi, Xianbo; Marathe, Shashidhara; Benda, Erika; Wojcik, Michael J; Lang, Keenan; Xu, Ruqing; Liu, Wenjun; Macrander, Albert T; Tischler, Jon Z

    2016-05-01

    We developed a portable X-ray grating interferometer setup as a standard tool for testing optics at the Advanced Photon Source (APS) beamline 1-BM. The interferometer can be operated in phase-stepping, Moiré, or single-grating harmonic imaging mode with 1-D or 2-D gratings. All of the interferometer motions are motorized; hence, it is much easier and quicker to switch between the different modes of operation. A novel aspect of this new instrument is its designed portability. While the setup is designed to be primarily used as a standard tool for testing optics at 1-BM, it could be potentially deployed at other APS beamlines for beam coherence and wavefront characterization or imaging. The design of the interferometer system is described in detail and coherence measurements obtained at the APS 34-ID-E beamline are presented. The coherence was probed in two directions using a 2-D checkerboard, a linear, and a circular grating at X-ray energies of 8 keV, 11 keV, and 18 keV. PMID:27250384
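
    As an illustration of the phase-stepping mode, the sketch below recovers fringe phase and visibility from N equally spaced phase steps using the standard synchronous-detection formula; the synthetic single-pixel data and all parameter values are invented and do not represent beamline measurements.

      # Minimal phase-stepping retrieval sketch (synthetic fringes, not beamline data).
      import numpy as np

      def phase_stepping(intensities):
          """Recover fringe phase and visibility from N equally spaced phase steps."""
          I = np.asarray(intensities, dtype=float)          # shape (N, ...) stack of frames
          N = I.shape[0]
          k = np.arange(N).reshape((N,) + (1,) * (I.ndim - 1))
          C = np.sum(I * np.cos(2 * np.pi * k / N), axis=0)
          S = np.sum(I * np.sin(2 * np.pi * k / N), axis=0)
          phase = np.arctan2(-S, C)                         # for I_k = a + b*cos(phase + 2*pi*k/N)
          visibility = 2 * np.hypot(S, C) / I.sum(axis=0)   # equals b / a
          return phase, visibility

      # Synthetic check: one pixel with a = 100, b = 40, phase = 0.7 rad, 8 steps.
      steps = np.arange(8)
      frames = 100 + 40 * np.cos(0.7 + 2 * np.pi * steps / 8)
      phi, vis = phase_stepping(frames.reshape(8, 1))
      print(phi[0], vis[0])    # ~0.7 rad and ~0.4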

  5. From Numerical Problem Solving to Model-Based Experimentation Incorporating Computer-Based Tools of Various Scales into the ChE Curriculum

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Cutlip, Michael B.; Brauner, Neima

    2009-01-01

    A continuing challenge to the undergraduate chemical engineering curriculum is the time-effective incorporation and use of computer-based tools throughout the educational program. Computing skills in academia and industry require some proficiency in programming and effective use of software packages for solving 1) single-model, single-algorithm…

  6. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine if the study program of the career of industrial processes Technological University of Chihuahua, 1 year after that it was certified by CACEI, continues achieving the established indicators and ISO 9001: 2008, implementing quality tools, monitoring of essential indicators are determined, flow charts are…

  7. Evaluation of contaminant removal of reverse osmosis and advanced oxidation in full-scale operation by combining passive sampling with chemical analysis and bioanalytical tools.

    PubMed

    Escher, Beate I; Lawrence, Michael; Macova, Miroslava; Mueller, Jochen F; Poussade, Yvan; Robillot, Cedric; Roux, Annalie; Gernjak, Wolfgang

    2011-06-15

    Advanced water treatment of secondary treated effluent requires stringent quality control to achieve a water quality suitable for augmenting drinking water supplies. The removal of micropollutants such as pesticides, industrial chemicals, endocrine disrupting chemicals (EDC), pharmaceuticals, and personal care products (PPCP) is paramount. As the concentrations of individual contaminants are typically low, frequent analytical screening is both laborious and costly. We propose and validate an approach for continuous monitoring by applying passive sampling with Empore disks in vessels that were designed to slow down the water flow, and thus uptake kinetics, and ensure that the uptake is only marginally dependent on the chemicals' physicochemical properties over a relatively narrow molecular size range. This design not only assured integrative sampling over 27 days for a broad range of chemicals but also permitted the use of a suite of bioanalytical tools as sum parameters, representative of mixtures of chemicals with a common mode of toxic action. Bioassays proved to be more sensitive than chemical analysis to assess the removal of organic micropollutants by reverse osmosis, followed by UV/H₂O₂ treatment, as many individual compounds fell below the quantification limit of chemical analysis, yet still contributed to the observed mixture toxicity. Nonetheless in several cases, the responses in the bioassays were also below their quantification limits and therefore only three bioassays were evaluated here, representing nonspecific toxicity and two specific end points for estrogenicity and photosynthesis inhibition. Chemical analytical techniques were able to quantify 32 pesticides, 62 PPCPs, and 12 EDCs in reverse osmosis concentrate. However, these chemicals could explain only 1% of the nonspecific toxicity in the Microtox assay in the reverse osmosis concentrate and 0.0025% in the treated water. Likewise only 1% of the estrogenic effect in the E-SCREEN could be

  8. Numerical simulations of epitaxial growth process in MOVPE reactor as a tool for design of modern semiconductors for high power electronics

    SciTech Connect

    Skibinski, Jakub; Wejrzanowski, Tomasz; Caban, Piotr; Kurzydlowski, Krzysztof J.

    2014-10-06

    In the present study, numerical simulation of the epitaxial growth of gallium nitride in the Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S is addressed. Epitaxial growth is crystal growth that progresses while inheriting the laminar structure and orientation of the substrate crystals. One of the technological problems is obtaining a homogeneous growth rate over the main deposition area. Since many factors influence the reaction at the crystal surface, such as temperature, pressure, gas flow and reactor geometry, it is difficult to design an optimal process. Because it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during crystal growth, modeling is the only way to understand the process precisely. Numerical simulations make it possible to understand the epitaxial process through calculation of the heat and mass transfer distribution during gallium nitride growth. Including chemical reactions in the numerical model allows the substrate growth rate to be calculated and the optimal process conditions for obtaining the most homogeneous product to be estimated.

  9. Dynamic drag force based on iterative density mapping: A new numerical tool for three-dimensional analysis of particle trajectories in a dielectrophoretic system.

    PubMed

    Knoerzer, Markus; Szydzik, Crispin; Tovar-Lopez, Francisco Javier; Tang, Xinke; Mitchell, Arnan; Khoshmanesh, Khashayar

    2016-02-01

    Dielectrophoresis is a widely used means of manipulating suspended particles within microfluidic systems. In order to efficiently design such systems for a desired application, various numerical methods exist that enable particle trajectory plotting in two or three dimensions based on the interplay of hydrodynamic and dielectrophoretic forces. While various models are described in the literature, few are capable of modeling interactions between particles as well as their surrounding environment, as these interactions are complex, multifaceted, and computationally expensive to the point of being prohibitive when considering a large number of particles. In this paper, we present a numerical model designed to enable spatial analysis of the physical effects exerted upon particles within microfluidic systems employing dielectrophoresis. The model presents a means of approximating the effects of the presence of large numbers of particles through dynamically adjusting hydrodynamic drag force based on particle density, thereby introducing a measure of emulated particle-particle and particle-liquid interactions. This model is referred to as "dynamic drag force based on iterative density mapping." The resultant numerical model is used to simulate and predict particle trajectory and velocity profiles within a microfluidic system incorporating curved dielectrophoretic microelectrodes. The simulated data compare favorably with experimental data gathered using microparticle image velocimetry, and are contrasted against simulated data generated using the traditional "effective moment Stokes-drag method," showing more accurate particle velocity profiles in areas of high particle density. PMID:26643028
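
    A schematic sketch of the underlying idea is given below: particles are advanced with an overdamped update in which the Stokes drag coefficient is rescaled by a crowding factor computed from the local particle density at each step. The flow field, DEP force, neighbourhood radius and density-to-drag mapping are all illustrative assumptions and do not reproduce the authors' hindrance model.

      # Schematic 2-D sketch: Stokes drag rescaled by a local particle-density map each step.
      import numpy as np

      mu, radius, dt = 1e-3, 5e-6, 1e-4           # viscosity (Pa s), particle radius (m), step (s)
      gamma0 = 6 * np.pi * mu * radius            # baseline Stokes drag coefficient

      def fluid_velocity(p):                      # simple channel flow along x (assumed)
          return np.array([100e-6, 0.0])

      def dep_force(p):                           # toy DEP force pushing particles toward y = 0 (assumed)
          return np.array([0.0, -1e-12 * np.sign(p[1])])

      def density_factor(positions, i, h=50e-6):
          """Crowding factor from the number of neighbours within radius h (remapped each step)."""
          d = np.linalg.norm(positions - positions[i], axis=1)
          neighbours = np.count_nonzero(d < h) - 1
          return 1.0 + 0.1 * neighbours           # denser regions -> larger effective drag

      positions = np.random.default_rng(0).uniform(0.0, 200e-6, size=(200, 2))
      for _ in range(200):                        # overdamped update: v = u_fluid + F_DEP / (gamma0 * factor)
          new = positions.copy()
          for i, p in enumerate(positions):
              gamma = gamma0 * density_factor(positions, i)
              new[i] = p + dt * (fluid_velocity(p) + dep_force(p) / gamma)
          positions = new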

  10. Numerical simulations of epitaxial growth process in MOVPE reactor as a tool for design of modern semiconductors for high power electronics

    NASA Astrophysics Data System (ADS)

    Skibinski, Jakub; Caban, Piotr; Wejrzanowski, Tomasz; Kurzydlowski, Krzysztof J.

    2014-10-01

    In the present study numerical simulations of epitaxial growth of gallium nitride in Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S is addressed. Epitaxial growth means crystal growth that progresses while inheriting the laminar structure and the orientation of substrate crystals. One of the technological problems is to obtain homogeneous growth rate over the main deposit area. Since there are many agents influencing reaction on crystal area such as temperature, pressure, gas flow or reactor geometry, it is difficult to design optimal process. According to the fact that it's impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during crystal growth, modeling is the only solution to understand the process precisely. Numerical simulations allow to understand the epitaxial process by calculation of heat and mass transfer distribution during growth of gallium nitride. Including chemical reactions in numerical model allows to calculate the growth rate of the substrate and estimate the optimal process conditions for obtaining the most homogeneous product.

  11. A seamless flash-flood early warning tool based on IDF-curves and coupling of weather-radar with numerical weather predictions

    NASA Astrophysics Data System (ADS)

    Liechti, Kaethi; Knechtl, Valentin; Andres, Norina; Sideris, Ioannis; Zappa, Massimiliano

    2014-05-01

    A flash flood is a flood that develops rapidly after a heavy precipitation event. Flash-flood forecasting is an important field of research because flash floods cause many fatalities and extensive damage. A flash-flood early warning tool is developed based on precipitation statistics. Our target areas are small ungauged areas of southern Switzerland; a total of 759 sub-catchments was considered. In a first step, intensity-duration-frequency (IDF) curves for each catchment were calculated based on: A) gridded precipitation products for the period 1961 to 2012, and B) gridded reforecasts of the COSMO-LEPS NWP for the period 1971-2000. These catchment-level IDF curves, in combination with precipitation forecasts, are the basis for the flash-flood early warning tool. The forecast models used are COSMO-2 (deterministic, updated every three hours, with a lead time of 24 hours) and COSMO-LEPS (probabilistic, 16 members, with a lead time of five days). In operational mode COSMO-2 is nudged to real-time weather-radar precipitation obtained by blending the radar QPE with information from a national network of precipitation gauges; this product is called COMBIPRECIP. The flash-flood early warning tool has been evaluated against observed events, which are either discharge peaks in gauged sub-areas or reports of damage caused by flash floods. The hypothesis that it is possible to detect hydrological events with the flash-flood early warning tool can be partly confirmed. The highest skill is obtained if the return period of the weather-radar QPE is assessed at the hourly time scale; with this it was possible to confirm most of the damage events that occurred in 2010 and 2011. The prototype tool is affected by several false alarms because the initial conditions of the soils are not considered. Further steps will therefore focus on the addition of real-time hydrological information as obtained from the application of high resolution distributed
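
    The core of the warning logic can be sketched as a comparison of a precipitation accumulation (radar QPE or NWP forecast) over a given duration against the catchment's IDF curve; the power-law IDF coefficients, the class of return periods and the example inputs below are purely illustrative and are not the Swiss catchment values.

      # Minimal sketch of the IDF-based warning idea: flag the return periods whose
      # IDF intensity is exceeded by the observed/forecast accumulation.
      # Illustrative IDF model: intensity (mm/h) = a * duration_h**(-b) per return period.
      IDF = {2: (25.0, 0.55), 10: (40.0, 0.55), 100: (60.0, 0.55)}   # return period (yr): (a, b)

      def warning_level(precip_mm, duration_h):
          """Return the highest exceeded return period, or None if no threshold is exceeded."""
          intensity = precip_mm / duration_h
          exceeded = [T for T, (a, b) in IDF.items() if intensity > a * duration_h ** (-b)]
          return max(exceeded) if exceeded else None

      print(warning_level(precip_mm=35.0, duration_h=1.0))   # hourly accumulation from radar QPE
      print(warning_level(precip_mm=80.0, duration_h=3.0))   # three-hour forecast accumulation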

  12. Development of a numerical scheme to predict geomagnetic storms after intense solar events and geomagnetic activity 27 days in advance. Final report, 6 Aug 86-16 Nov 90

    SciTech Connect

    Akasofu, S.I.; Lee, L.H.

    1991-02-01

    The modern geomagnetic storm prediction scheme should be based on a numerical simulation method, rather than on a statistical result. Furthermore, the scheme should be able to predict the geomagnetic storm indices, such as the Dst and AE indices, as a function of time. By recognizing that geomagnetic storms are powered by the solar wind-magnetosphere generator and that its power is given in terms of the solar wind speed, the interplanetary magnetic field (IMF) magnitude and polar angle, the authors have made a major advance in predicting both flare-induced storms and recurrent storms. Furthermore, it is demonstrated that the prediction scheme can be calibrated using interplanetary scintillation (IPS) observations when the solar disturbance has advanced about half-way to the earth. It is shown, however, that we are still far from a reliable prediction scheme. The prediction of the IMF polar angle requires further advances in understanding the characteristics of magnetic clouds.
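
    The generator power referred to here is commonly expressed through the Akasofu epsilon parameter, ε = (4π/μ₀) v B² sin⁴(θ/2) l₀², with v the solar wind speed, B the IMF magnitude, θ the IMF clock (polar) angle and l₀ ≈ 7 Earth radii an empirical scale length. The short sketch below evaluates it for illustrative quiet-time and storm-time inputs; the numbers are not taken from the report.

      # Worked example of the solar wind-magnetosphere coupling (Akasofu epsilon) parameter.
      import numpy as np

      MU0 = 4e-7 * np.pi
      L0  = 7 * 6.371e6            # empirical scale length, ~7 Earth radii (m)

      def epsilon_watts(v_kms, B_nT, clock_angle_deg):
          v = v_kms * 1e3
          B = B_nT * 1e-9
          theta = np.radians(clock_angle_deg)
          return (4 * np.pi / MU0) * v * B ** 2 * np.sin(theta / 2) ** 4 * L0 ** 2

      # Quiet vs storm-time style inputs (southward IMF -> clock angle near 180 deg).
      print(f"{epsilon_watts(400, 5, 90):.2e} W")
      print(f"{epsilon_watts(700, 15, 180):.2e} W")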

  13. The Application of the NASA Advanced Concepts Office, Launch Vehicle Team Design Process and Tools for Modeling Small Responsive Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Threet, Grady E.; Waters, Eric D.; Creech, Dennis M.

    2012-01-01

    The Advanced Concepts Office (ACO) Launch Vehicle Team at the NASA Marshall Space Flight Center (MSFC) is recognized throughout NASA for launch vehicle conceptual definition and pre-phase A concept design evaluation. The Launch Vehicle Team has been instrumental in defining the vehicle trade space for many of NASA's high level launch system studies from the Exploration Systems Architecture Study (ESAS) through the Augustine Report, Constellation, and now Space Launch System (SLS). The Launch Vehicle Team's approach to rapid turn-around and comparative analysis of multiple launch vehicle architectures has played a large role in narrowing the design options for future vehicle development. Recently the Launch Vehicle Team has been developing versions of their vetted tools used on large launch vehicles and repackaged the process and capability to apply to smaller, more responsive launch vehicles. Along this development path the LV Team has evaluated trajectory tools and assumptions against sounding rocket trajectories and air launch systems, begun altering subsystem mass estimating relationships to handle smaller vehicle components, and as an additional development driver, have begun an in-house small launch vehicle study. With the recent interest in small responsive launch systems and the known capability and response time of the ACO LV Team, ACO's launch vehicle assessment capability can be utilized to rapidly evaluate the vast and opportune trade space that small launch vehicles currently encompass. This would provide a great benefit to the customer in order to reduce that large trade space to a select few alternatives that should best fit the customer's payload needs.

  14. Robot Tools

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Mecanotron, now a division of Robotics and Automation Corporation, developed a quick-change welding method called the Automatic Robotics Tool-change System (ARTS) under Marshall Space Flight Center and Rockwell International contracts. The ARTS system has six tool positions ranging from coarse sanding disks and abrasive wheels to cloth polishing wheels, with motors of various horsepower. The system is used by fabricators of plastic body parts for the auto industry, by Texas Instruments for making radar domes, and for advanced composites at Aerospatiale in France.

  15. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: each next-generation station measures all parameters needed for flux computations; a field microcomputer calculates final fully corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; final fluxes, radiation, weather and soil data are merged into a single quality-controlled file; multiple flux stations are linked into an automated time-synchronized network; the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; the PI can assign rights and allow or restrict access to stations and data, so that selected stations can be shared via rights-managed access internally or with external institutions; and researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from

  16. Robotic-locomotor training as a tool to reduce neuromuscular abnormality in spinal cord injury: the application of system identification and advanced longitudinal modeling.

    PubMed

    Mirbagheri, Mehdi M; Kindig, Matthew; Niu, Xun; Varoqui, Deborah; Conaway, Petra

    2013-06-01

    In this study, the effect of the LOKOMAT, a robotic-assisted locomotor training system, on the reduction of neuromuscular abnormalities associated with spasticity was examined, for the first time in the spinal cord injury (SCI) population. Twenty-three individuals with chronic incomplete SCI received 1-hour training sessions in the LOKOMAT three times per week, with up to 45 minutes of training per session; a matched control group received no intervention. The neuromuscular properties of the spastic ankle were then evaluated prior to training and after 1, 2, and 4 weeks of training. A parallel-cascade system identification technique was used to determine the reflex and intrinsic stiffness of the ankle joint as a function of ankle position at each time point. The slope of the stiffness vs. joint angle curve, i.e., the modulation of stiffness with joint position, was then calculated and tracked over the four-week period. Growth Mixture Modeling (GMM), an advanced statistical method, was then used to classify subjects into subgroups based on similar trends in the recovery pattern of slope over time, and Random Coefficient Regression (RCR) was used to model the recovery patterns within each subgroup. All groups showed significant reductions in both reflex and intrinsic slope over time, but subjects in classes with higher baseline values of the slope showed larger improvements over the four weeks of training. These findings suggest that LOKOMAT training may also be useful for reducing the abnormal modulation of neuromuscular properties that arises as a secondary effect after SCI. This can advise clinicians as to which patients can benefit the most from LOKOMAT training prior to beginning the training. Further, this study shows that system identification and GMM/RCR can serve as powerful tools to quantify and track spasticity over time in the SCI population. PMID:24187312

  17. Numerical modeling of late Glacial Laurentide advance of ice across Hudson Strait: Insights into terrestrial and marine geology, mass balance, and calving flux

    USGS Publications Warehouse

    Pfeffer, W.T.; Dyurgerov, M.; Kaplan, M.; Dwyer, J.; Sassolas, C.; Jennings, A.; Raup, B.; Manley, W.

    1997-01-01

    A time-dependent finite element model was used to reconstruct the advance of ice from a late Glacial dome on northern Quebec/Labrador across Hudson Strait to Meta Incognita Peninsula (Baffin Island) and subsequently to the 9.9-9.6 ka 14C Gold Cove position on Hall Peninsula. Terrestrial geological and geophysical information from Quebec and Labrador was used to constrain initial and boundary conditions, and the model results are compared with terrestrial geological information from Baffin Island and considered in the context of the marine event DC-0 and the Younger Dryas cooling. We conclude that advance across Hudson Strait from Ungava Bay to Baffin Island is possible using realistic glacier physics under a variety of reasonable boundary conditions. Production of ice flux from a dome centered on northeastern Quebec and Labrador sufficient to deliver geologically inferred ice thickness at Gold Cove (Hall Peninsula) appears to require extensive penetration of sliding south from Ungava Bay. The discharge of ice into the ocean associated with advance and retreat across Hudson Strait does not peak at a time coincident with the start of the Younger Dryas and is less than minimum values proposed to influence North Atlantic thermohaline circulation; nevertheless, a significant fraction of freshwater input to the North Atlantic may have been provided abruptly and at a critical time by this event.

  18. Advanced Tsunami Numerical Simulations and Energy Considerations by use of 3D-2D Coupled Models: The October 11, 1918, Mona Passage Tsunami

    NASA Astrophysics Data System (ADS)

    López-Venegas, Alberto M.; Horrillo, Juan; Pampell-Manis, Alyssa; Huérfano, Victor; Mercado, Aurelio

    2015-06-01

    The most recent tsunami observed along the coast of the island of Puerto Rico occurred on October 11, 1918, after a magnitude 7.2 earthquake in the Mona Passage. The earthquake was responsible for initiating a tsunami that mostly affected the northwestern coast of the island. Runup values from a post-tsunami survey indicated the waves reached up to 6 m. A controversy regarding the source of the tsunami has resulted in several numerical simulations involving either fault rupture or a submarine landslide as the most probable cause of the tsunami. Here we follow up on previous simulations of the tsunami from a submarine landslide source off the western coast of Puerto Rico as initiated by the earthquake. Improvements on our previous study include: (1) higher-resolution bathymetry; (2) a 3D-2D coupled numerical model specifically developed for the tsunami; (3) use of the non-hydrostatic numerical model NEOWAVE (non-hydrostatic evolution of ocean WAVE) featuring two-way nesting capabilities; and (4) comprehensive energy analysis to determine the time of full tsunami wave development. The three-dimensional Navier-Stokes model tsunami solution using the Navier-Stokes algorithm with multiple interfaces for two fluids (water and landslide) was used to determine the initial wave characteristic generated by the submarine landslide. Use of NEOWAVE enabled us to solve for coastal inundation, wave propagation, and detailed runup. Our results were in agreement with previous work in which a submarine landslide is favored as the most probable source of the tsunami, and improvement in the resolution of the bathymetry yielded inundation of the coastal areas that compare well with values from a post-tsunami survey. Our unique energy analysis indicates that most of the wave energy is isolated in the wave generation region, particularly at depths near the landslide, and once the initial wave propagates from the generation region its energy begins to stabilize.

  19. Development of Design Technology on Thermal-Hydraulic Performance in Tight-Lattice Rod Bundles: III - Numerical Evaluation of Fluid Mixing Phenomena using Advanced Interface-Tracking Method -

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Nagayoshi, Takuji; Takase, Kazuyuki; Akimoto, Hajime

    Thermal-hydraulic design of the current boiling water reactor (BWR) is performed with correlations based on empirical results from actual-size tests. However, for the Innovative Water Reactor for Flexible Fuel Cycle (FLWR) core, an actual-size test of an embodiment of its design would be required to confirm or modify such correlations. Development of a method that enables the thermal-hydraulic design of nuclear reactors without these actual-size tests is desired, because such tests take a long time and entail great cost. For this reason we developed an advanced thermal-hydraulic design method for FLWRs using innovative two-phase flow simulation technology. In this study, a detailed two-phase flow simulation code using an advanced interface tracking method, TPFIT, is developed to calculate detailed information about the two-phase flow. We verified the TPFIT code by comparing it with 2-channel air-water and steam-water mixing experimental results. The predicted results agree well with the observations: bubble dynamics through the gap and cross-flow behavior could be effectively predicted by the TPFIT code, and the pressure difference between fluid channels is responsible for the fluid mixing.

  20. Next-Generation Ion Thruster Design Tool

    NASA Technical Reports Server (NTRS)

    Stolz, Peter

    2015-01-01

    Computational tools that accurately predict the performance of electric propulsion devices are highly desirable and beneficial to NASA and the broader electric propulsion community. The current state of the art in electric propulsion modeling relies heavily on empirical data and numerous computational "knobs." In Phase I of this project, Tech-X Corporation developed the most detailed ion engine discharge chamber model that currently exists. This kinetic model simulates all particles in the discharge chamber along with a physically correct simulation of the electric fields. In addition, kinetic erosion models are included for modeling the ion-impingement effects on thruster component erosion. In Phase II, Tech-X developed a user-friendly computer program for NASA and other governmental and industry customers. Tech-X has implemented a number of advanced numerical routines to bring the computational time down to a commercially acceptable level. NASA now has a highly sophisticated, user-friendly ion engine discharge chamber modeling tool.

  1. Tool setting device

    DOEpatents

    Brown, Raymond J.

    1977-01-01

    The present invention relates to a tool setting device for use with numerically controlled machine tools, such as lathes and milling machines. A reference position of the machine tool relative to the workpiece along both the X and Y axes is utilized by the control circuit for driving the tool through its program. This reference position is determined for both axes by displacing a single linear variable displacement transducer (LVDT) with the machine tool through a T-shaped pivotal bar. The use of the T-shaped bar allows the cutting tool to be moved sequentially in the X or Y direction for indicating the actual position of the machine tool relative to the predetermined desired position in the numerical control circuit by using a single LVDT.

  2. A Meta-Analysis Method to Advance Design of Technology-Based Learning Tool: Combining Qualitative and Quantitative Research to Understand Learning in Relation to Different Technology Features

    ERIC Educational Resources Information Center

    Zhang, Lin

    2014-01-01

    Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tool. This paper discusses the emerging issues, such as how can learning effectiveness be understood in relation to…

  3. 2D SEDFLUX 1.0C:. an advanced process-response numerical model for the fill of marine sedimentary basins

    NASA Astrophysics Data System (ADS)

    Syvitski, James P. M.; Hutton, Eric W. H.

    2001-07-01

    Numerical simulators of the dynamics of strata formation on continental margins fuse information from the atmosphere, ocean and regional geology. Such models can provide information for areas and times for which actual measurements are not available, or when purely statistical estimates are not adequate by themselves. SEDFLUX is such a basin-fill model, written in ANSI-standard C, able to simulate the delivery of sediment and its accumulation over time scales of tens of thousands of years. SEDFLUX includes the effects of sea-level fluctuations, river floods, ocean storms, and other relevant environmental factors (climate trends, random catastrophic events), at a time step (daily to yearly) that is sensitive to short-term variations of the seafloor. SEDFLUX combines individual process-response models into one fully interactive model, delivering a multi-sized sediment load onto and across a continental margin, including sediment redistribution by (1) river mouth dynamics, (2) buoyant surface plumes, (3) hyperpycnal flows, (4) ocean storms, (5) slope instabilities, (6) turbidity currents, and (7) debris flows. The model allows the deposit to compact, to undergo tectonic processes (faults, uplift) and to experience isostatic subsidence from the sediment load. The modeled architecture has a typical vertical resolution of 1-25 cm, and a typical horizontal resolution of between 1 and 100 m.

  4. Numerical Development

    ERIC Educational Resources Information Center

    Siegler, Robert S.; Braithwaite, David W.

    2016-01-01

    In this review, we attempt to integrate two crucial aspects of numerical development: learning the magnitudes of individual numbers and learning arithmetic. Numerical magnitude development involves gaining increasingly precise knowledge of increasing ranges and types of numbers: from non-symbolic to small symbolic numbers, from smaller to larger…

  5. Self-imposed evaluation of the Helmholtz Research School MICMoR as a tool for quality assurance and advancement of a structured graduate programme

    NASA Astrophysics Data System (ADS)

    Elija Bleher, Bärbel; Schmid, Hans Peter; Scholz, Beate

    2015-04-01

    The Helmholtz Research School MICMoR (Mechanisms and Interactions of Climate Change in Mountain Regions) offers a structured graduate programme for doctoral students in the field of climate change research. It is hosted by the Institute of Meteorology and Climate Research (KIT/IMK-IFU) in Garmisch-Partenkirchen, in collaboration with 7 Bavarian partner universities and research institutions. Hence, MICMoR brings together a considerable network of currently 20 doctoral students and 55 scientists. MICMoR offers scientific and professional skills training, provides a state-of-the-art supervision concept, and fosters international exchange and interdisciplinary collaboration. In order to develop and advance its programme, MICMoR has committed itself to a self-imposed mid-term review in its third year, to monitor to what extent its original objectives have been reached, and to explore and identify where MICMoR has room for improvement. The evaluation especially focused on recruitment, supervision, training, networking and cooperation. Carried out by an external expert (Beate Scholz from scholz ctc), the evaluation was based on a mixed-methods approach, i.e., combining a quantitative survey involving all doctoral candidates as well as their supervisors with focus groups involving different MICMoR stakeholders. The evaluation has brought forward some highly interesting results, pinpointing challenges and opportunities of setting up a structured doctoral programme. Overall, the evaluation proved to be a useful tool for evidence-based programme and policy planning, and demonstrated a high level of satisfaction of supervisors and fellows. Supervision, with facets ranging from disciplinary feedback to career advice, is demanding and requires strong commitment and adequate human resources development by all parties involved. Thus, MICMoR plans to offer mentor coaching and calls on supervisors and mentors to form a community of learners with their doctoral students. To

  6. NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL (NUDAPT): FACILITATING ADVANCEMENTS IN URBAN METEOROLOGY AND CLIMATE MODELING WITH COMMUNITY-BASED URBAN DATABASES

    EPA Science Inventory

    We discuss the initial design and application of the National Urban Database and Access Portal Tool (NUDAPT). This new project is sponsored by the USEPA and involves collaborations and contributions from many groups from federal and state agencies, and from private and academic i...

  7. LensTools: Weak Lensing computing tools

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-02-01

    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.
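
    The sketch below shows, in plain numpy, the kind of azimuthally averaged convergence power-spectrum measurement the package automates; it deliberately does not use the LensTools API, and the normalisation, binning and white-noise stand-in map are schematic assumptions for illustration only.

      # Generic numpy sketch of an azimuthally averaged convergence power spectrum.
      import numpy as np

      def convergence_power_spectrum(kappa, field_deg, n_bins=20):
          """Azimuthally averaged power spectrum of a square convergence map (schematic normalisation)."""
          n = kappa.shape[0]
          side = np.radians(field_deg)                       # map side length in radians
          power2d = np.abs(np.fft.fft2(kappa)) ** 2 * side ** 2 / n ** 4
          freq = np.fft.fftfreq(n, d=side / n)               # spatial frequencies, cycles per radian
          ell = 2 * np.pi * np.hypot(*np.meshgrid(freq, freq))
          bins = np.linspace(0.0, ell.max(), n_bins + 1)
          idx = np.digitize(ell.ravel(), bins)
          cl = np.array([power2d.ravel()[idx == i].mean() for i in range(1, n_bins + 1)])
          return 0.5 * (bins[1:] + bins[:-1]), cl

      rng = np.random.default_rng(1)
      kappa = rng.normal(0.0, 0.02, size=(256, 256))         # white-noise stand-in for a real map
      ell, cl = convergence_power_spectrum(kappa, field_deg=3.5)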

  8. Research on ARM Numerical Control System

    NASA Astrophysics Data System (ADS)

    Wei, Xu; JiHong, Chen

    Computerized Numerical Control (CNC) machine tools are the foundation of modern manufacturing systems, and their advanced digital technology is key to the sustainable development of the machine tool manufacturing industry. This paper presents the design of a CNC system embedded on ARM, covering both the hardware design and the supporting software. On the hardware side, the core of the motion control unit is the MCX314AL DSP motion-control chip developed by NOVA Electronics Co., Ltd. of Japan; its excellent performance, simple interface and easy programming make machine control convenient. On the software side, the open-source embedded operating system uC/OS-2 is selected, and the CNC system is broken down into modules whose priorities are assigned according to their actual requirements. The communication mechanisms between modules and the interrupt responses are designed to guarantee the real-time behavior and reliability of the numerical control system. The resulting system not only meets current precision-machining requirements, but also offers a good man-machine interface and network support to suit a variety of users.

  9. Numerical Integration

    ERIC Educational Resources Information Center

    Sozio, Gerry

    2009-01-01

    Senior secondary students cover numerical integration techniques in their mathematics courses. In particular, students would be familiar with the "midpoint rule," the elementary "trapezoidal rule" and "Simpson's rule." This article derives these techniques by methods which secondary students may not be familiar with and an approach that…
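
    For reference, minimal implementations of the three rules mentioned are sketched below and checked against an integral with a known value (the integral of sin x from 0 to pi equals 2).

      # Midpoint, trapezoidal and composite Simpson's rules on a known integral.
      import numpy as np

      def midpoint(f, a, b, n):
          x = a + (np.arange(n) + 0.5) * (b - a) / n
          return (b - a) / n * f(x).sum()

      def trapezoid(f, a, b, n):
          x = np.linspace(a, b, n + 1)
          y = f(x)
          return (b - a) / n * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

      def simpson(f, a, b, n):            # n must be even
          x = np.linspace(a, b, n + 1)
          y = f(x)
          return (b - a) / (3 * n) * (y[0] + y[-1] + 4 * y[1:-1:2].sum() + 2 * y[2:-1:2].sum())

      for rule in (midpoint, trapezoid, simpson):
          print(rule.__name__, rule(np.sin, 0.0, np.pi, 10), "exact = 2")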

  10. Advanced Numerical Modeling of Turbulent Atmospheric Flows

    NASA Astrophysics Data System (ADS)

    Kühnlein, Christian; Dörnbrack, Andreas; Gerz, Thomas

    The present chapter introduces the method of computational simulation to predict and study turbulent atmospheric flows. This includes a description of the fundamental approach to computational simulation and the practical implementation using the technique of large-eddy simulation. In addition, selected contributions from IPA scientists to computational model development and various examples for applications are given. These examples include homogeneous turbulence, convective boundary layers, heated forest canopy, buoyant thermals, and large-scale flows with baroclinic wave instability.

  11. JUST in time health emergency interventions: an innovative approach to training the citizen for emergency situations using virtual reality techniques and advanced IT tools (the Web-CD).

    PubMed

    Manganas, A; Tsiknakis, M; Leisch, E; Karefilaki, L; Monsieurs, K; Bossaert, L L; Giorgini, F

    2004-01-01

    This paper reports the results of the first of the two systems developed by JUST, a collaborative project supported by the European Union under the Information Society Technologies (IST) Programme. The most innovative content of the project has been the design and development of a complementary training course for non-professional health emergency operators, which supports the traditional learning phase, and which purports to improve the retention capability of the trainees. This was achieved with the use of advanced information technology techniques, which provide adequate support and can help to overcome the present weaknesses of the existing training mechanisms. PMID:15747936

  12. Numerical simulations of glass impacts using smooth particle hydrodynamics

    SciTech Connect

    Mandell, D.A.; Wingate, C.A.

    1996-05-01

    As part of a program to develop advanced hydrocode design tools, we have implemented a brittle fracture model for glass into the SPHINX smooth particle hydrodynamics code. We have evaluated this model and the code by predicting data from one-dimensional flyer plate impacts into glass. Since fractured glass properties, which are needed in the model, are not available, we did sensitivity studies of these properties, as well as sensitivity studies to determine the number of particles needed in the calculations. The numerical results are in good agreement with the data.

  13. Numerical simulations of glass impacts using smooth particle hydrodynamics

    SciTech Connect

    Mandell, D.A.; Wingate, C.A.

    1995-07-01

    As part of a program to develop advanced hydrocode design tools, we have implemented a brittle fracture model for glass into the SPHINX smooth particle hydrodynamics code. We have evaluated this model and the code by predicting data from one-dimensional flyer plate impacts into glass. Since fractured glass properties, which are needed in the model, are not available, we did sensitivity studies of these properties, as well as sensitivity studies to determine the number of particles needed in the calculations. The numerical results are in good agreement with the data.

  14. How we used a patient visit tracker tool to advance experiential learning in systems-based practice and quality improvement in a medical student clinic.

    PubMed

    Chen, Chen Amy; Park, Ryan J; Hegde, John V; Jun, Tomi; Christman, Mitalee P; Yoo, Sun M; Yamasaki, Alisa; Berhanu, Aaron; Vohra-Khullar, Pamela; Remus, Kristin; Schwartzstein, Richard M; Weinstein, Amy R

    2016-01-01

    Poorly designed healthcare systems increase costs and preventable medical errors. To address these issues, systems-based practice (SBP) education provides future physicians with the tools to identify systemic errors and implement quality improvement (QI) initiatives to enhance the delivery of cost-effective, safe and multi-disciplinary care. Although SBP education is being implemented in residency programs and is mandated by the Accreditation Council for Graduate Medical Education (ACGME) as one of its core competencies, it has largely not been integrated into undergraduate medical education. We propose that Medical Student-Faculty Collaborative Clinics (MSFCCs) may be the ideal environment in which to train medical students in SBPs and QI initiatives, as they allow students to play pivotal roles in project development, administration, and management. Here we describe a process of experiential learning that was developed within a newly established MSFCC, which challenged students to identify inefficiencies, implement interventions, and track the results. After identifying bottlenecks in clinic operations, our students designed a patient visit tracker tool to monitor clinic flow and implemented solutions to decrease patient visit times. Our model allowed students to drive their own active learning in a practical clinical setting, providing early and unique training in crucial QI skills. PMID:25401409

  15. Lithographic measurement of EUV flare in the 0.3-NA Micro ExposureTool optic at the Advanced Light Source

    SciTech Connect

    Cain, Jason P.; Naulleau, Patrick; Spanos, Costas J.

    2005-01-01

    The level of flare present in a 0.3-NA EUV optic (the MET optic) at the Advanced Light Source at Lawrence Berkeley National Laboratory is measured using a lithographic method. Photoresist behavior at high exposure doses makes analysis difficult. Flare measurement analysis under scanning electron microscopy (SEM) and optical microscopy is compared, and optical microscopy is found to be a more reliable technique. In addition, the measured results are compared with predictions based on surface roughness measurement of the MET optical elements. When the fields in the exposure matrix are spaced far enough apart to avoid influence from surrounding fields and the data is corrected for imperfect mask contrast and aerial image proximity effects, the results match predicted values quite well. The amount of flare present in this optic ranges from 4.7% for 2 µm features to 6.8% for 500 nm features.

  16. Kinematic Analysis of the Upper Limb Motor Strategies in Stroke Patients as a Tool towards Advanced Neurorehabilitation Strategies: A Preliminary Study

    PubMed Central

    Simbolotti, Chiara

    2014-01-01

    Advanced rehabilitation strategies of the upper limb in stroke patients focus on the recovery of the most important daily activities. In this study we analyzed quantitatively and qualitatively the motor strategies employed by stroke patients when reaching and drinking from a glass. We enrolled 6 hemiparetic poststroke patients and 6 healthy subjects. Motion analysis of the task proposed (reaching for the glass, bringing it to the mouth, and putting it back on the table) with the affected limb was performed. Clinical assessment using the Fugl-Meyer Assessment for Upper Extremity was also included. During the reaching for the glass the patients showed a reduced arm elongation and trunk axial rotation due to motor deficit. For this reason, as observed, they carried out compensatory strategies which included trunk forward displacement and head movements. These preliminary data should be considered to address rehabilitation treatment. Moreover, the kinematic analysis protocol developed might represent an outcome measure of upper limb rehabilitation processes. PMID:24868536

  17. Regional Arctic System Model (RASM): A Tool to Address the U.S. Priorities and Advance Capabilities for Arctic Climate Modeling and Prediction

    NASA Astrophysics Data System (ADS)

    Maslowski, W.; Roberts, A.; Cassano, J. J.; Gutowski, W. J., Jr.; Nijssen, B.; Osinski, R.; Zeng, X.; Brunke, M.; Duvivier, A.; Hamman, J.; Hossainzadeh, S.; Hughes, M.; Seefeldt, M. W.

    2015-12-01

    The Arctic is undergoing some of the most coordinated rapid climatic changes currently occurring anywhere on Earth, including the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with and feedbacks between atmosphere, ocean and land. While historical reconstructions from Earth System Models (ESMs) are in broad agreement with these changes, the rate of change in ESMs generally remains outpaced by observations. Reasons for that relate to a combination of coarse resolution, inadequate parameterizations, under-represented processes and a limited knowledge of physical interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) in addressing some of the ESM limitations in simulating observed variability and trends in arctic surface climate. RASM is a high resolution, pan-Arctic coupled climate model with the sea ice and ocean model components configured at an eddy-permitting resolution of 1/12o and the atmosphere and land hydrology model components at 50 km resolution, which are all coupled at 20-minute intervals. RASM is an example of limited-area, process-resolving, fully coupled ESM, which due to the constraints from boundary conditions facilitates detailed comparisons with observational statistics that are not possible with ESMs. The overall goal of RASM is to address key requirements published in the Navy Arctic Roadmap: 2014-2030 and in the Implementation Plan for the National Strategy for the Arctic Region, regarding the need for advanced modeling capabilities for operational forecasting and strategic climate predictions through 2030. The main science objectives of RASM are to advance understanding and model representation of critical physical processes and feedbacks of importance to sea ice thickness and area distribution. RASM results are presented to quantify relative contributions by (i) resolved processes and feedbacks as well as (ii) sensitivity to space dependent sub-grid parameterizations to better

  18. V-Lab™: Virtual Laboratories -- The analysis tool for structural analysis of composite components

    SciTech Connect

    1999-07-01

    V-Lab™, an acronym for Virtual Laboratories, is a design and analysis tool for fiber-reinforced composite components. This program allows the user to perform analysis, numerical experimentation, and design prototyping using advanced composite stress and failure analysis tools. The software was designed to be intuitive and easy to use, even by designers who are not experts in composite materials or structural analysis. V-Lab™ is the software tool every specialist in design engineering, structural analysis, research and development and repair needs to perform accurate, fast and economical analysis of composite components.

  19. Final Progress Report submitted via the DOE Energy Link (E-Link) in June 2009 [Collaborative Research: Decadal-to-Centennial Climate & Climate Change Studies with Enhanced Variable and Uniform Resolution GCMs Using Advanced Numerical Techniques

    SciTech Connect

    Fox-Rabinovitz, M; Cote, J

    2009-10-09

    The joint U.S-Canadian project has been devoted to: (a) decadal climate studies using developed state-of-the-art GCMs (General Circulation Models) with enhanced variable and uniform resolution; (b) development and implementation of advanced numerical techniques; (c) research in parallel computing and associated numerical methods; (d) atmospheric chemistry experiments related to climate issues; (e) validation of regional climate modeling strategies for nested- and stretched-grid models. The variable-resolution stretched-grid (SG) GCMs produce accurate and cost-efficient regional climate simulations with mesoscale resolution. The advantage of the stretched grid approach is that it allows us to preserve the high quality of both global and regional circulations while providing consistent interactions between global and regional scales and phenomena. The major accomplishment for the project has been the successful international SGMIP-1 and SGMIP-2 (Stretched-Grid Model Intercomparison Project, phase-1 and phase-2) based on this research developments and activities. The SGMIP provides unique high-resolution regional and global multi-model ensembles beneficial for regional climate modeling and broader modeling community. The U.S SGMIP simulations have been produced using SciDAC ORNL supercomputers. The results of the successful SGMIP multi-model ensemble simulations of the U.S. climate are available at the SGMIP web site (http://essic.umd.edu/~foxrab/sgmip.html) and through the link to the WMO/WCRP/WGNE web site: http://collaboration.cmc.ec.gc.ca/science/wgne. Collaborations with other international participants M. Deque (Meteo-France) and J. McGregor (CSIRO, Australia) and their centers and groups have been beneficial for the strong joint effort, especially for the SGMIP activities. The WMO/WCRP/WGNE endorsed the SGMIP activities in 2004-2008. This project reflects a trend in the modeling and broader communities to move towards regional and sub-regional assessments and

  20. The effects of using screencasting as a multimedia pre-training tool to manage the intrinsic cognitive load of chemical equilibrium instruction for advanced high school chemistry students

    NASA Astrophysics Data System (ADS)

    Musallam, Ramsey

    Chemistry is a complex knowledge domain. Specifically, research notes that chemical equilibrium presents greater cognitive challenges than other topics in chemistry. Cognitive Load Theory describes the impact a subject, and the learning environment, have on working memory. Intrinsic load is the facet of Cognitive Load Theory that explains the complexity innate to complex subjects. The purpose of this study was to build on the limited research into intrinsic cognitive load by examining the effects of using multimedia screencasts as a pre-training technique to manage the intrinsic cognitive load of chemical equilibrium instruction for advanced high school chemistry students. A convenience sample of 62 fourth-year high school students enrolled in an advanced chemistry course at a co-ed high school in urban San Francisco was given a chemical equilibrium concept pre-test. Upon conclusion of the pre-test, students were randomly assigned to two groups: pre-training and no pre-training. The pre-training group received a 10 minute and 52 second pre-training screencast that provided definitions, concepts and an overview of chemical equilibrium. After pre-training, both groups received the same 50-minute instructional lecture. After instruction, all students were given a chemical equilibrium concept post-test. Independent-sample t-tests were conducted to examine differences in performance and intrinsic load. No significant differences in performance or intrinsic load, as measured by ratings of mental effort, were observed on the pre-test. Significant differences in performance, t(60)=3.70, p=.0005, and intrinsic load, t(60)=5.34, p=.0001, were observed on the post-test. A significant correlation between total performance scores and total mental effort ratings was also observed, r(60)=-0.44, p=.0003. Because no significant differences in prior knowledge were observed, it can be concluded that pre-training was successful at reducing intrinsic load. Moreover, a significant
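
    The reported analyses (independent-sample t-tests on post-test performance and mental-effort ratings, plus a performance-effort correlation) can be reproduced in outline as below; the group sizes match the study's 62 students split into two groups, but the score and rating values are synthetic and invented purely for illustration.

      # Sketch of the reported statistical analyses on synthetic data (not the study's data).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      pretrained    = rng.normal(78, 10, 31)   # post-test scores, pre-training group (synthetic)
      no_pretrained = rng.normal(68, 10, 31)   # post-test scores, control group (synthetic)

      t, p = stats.ttest_ind(pretrained, no_pretrained)
      print(f"performance: t({len(pretrained) + len(no_pretrained) - 2}) = {t:.2f}, p = {p:.4f}")

      effort_pre  = rng.normal(4.2, 1.0, 31)   # mental-effort ratings (e.g., 9-point scale)
      effort_none = rng.normal(5.6, 1.0, 31)
      t, p = stats.ttest_ind(effort_pre, effort_none)
      print(f"intrinsic load: t = {t:.2f}, p = {p:.4f}")

      scores = np.concatenate([pretrained, no_pretrained])
      effort = np.concatenate([effort_pre, effort_none])
      r, p = stats.pearsonr(scores, effort)
      print(f"score vs. effort: r = {r:.2f}, p = {p:.4f}")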

  1. Magnet costs for the Advanced Light Source

    SciTech Connect

    Tanabe, J.; Krupnick, J.; Hoyer, E.; Paterson, A.

    1993-05-01

    The Advanced Light Source (ALS) accelerator is now completed. The numerous conventional magnets required for the booster ring, the storage ring and the low and high energy transfer lines were installed during the last two years. This paper summarizes the various costs associated with the quantity fabrication of selected magnet families. These costs include the costs of prototypes, tooling, coil and core fabrication, assembly and magnetic measurements. Brief descriptions of the magnets and specialized requirements for magnetic measurements are included in order to associate the costs with the relative complexities of the various magnet systems.

  2. Magnetospheric ULF wave studies in the frame of Swarm mission: new advanced tools for automated detection of pulsations in magnetic and electric field observations

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Georgiou, Marina; Giamini, Sigiava A.; Sandberg, Ingmar; Haagmans, Roger

    2014-05-01

    The rekindling of the interest in space science in the last 15 years has led to many successful satellite missions in the Earth's magnetosphere and topside ionosphere, which were able to provide the scientific community with high-quality data on the magnetic and electric fields surrounding our planet. This data pool will be further enriched by the measurements of ESA's Swarm mission, a constellation of three satellites in different polar orbits, flying at altitudes from 400 to 550 km, which was launched on the 22nd of November 2013. Aiming at the best scientific exploitation of this corpus of accumulated data, we have developed a set of analysis tools that can cope with measurements of various spacecraft, at various regions of the magnetosphere and in the topside ionosphere. Our algorithms are based on a combination of wavelet spectral methods and artificial neural network techniques and are suited for the detection of waves and wave-like disturbances as well as the extraction of several physical parameters. Our recent work demonstrates the applicability of our developed analysis tools, both for individual case studies and statistical analysis of ultra low frequency (ULF) waves. We provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz), Pc4 (7-22 mHz) and Pc5 (1-7 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA, GIMA and IMAGE magnetometer networks. Our study shows that the same wave event, characterized by increased activity in the high end of the Pc3 band, was simultaneously observed by all three satellite missions and by certain stations of ground networks. This observation provides a strong argument in favour of the
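
    A minimal sketch of the wavelet part of such a detection pipeline is given below: a Morlet wavelet transform of a 1 Hz magnetometer series is averaged over the Pc3 band (22-100 mHz) and thresholded to flag enhanced activity. The synthetic signal, normalisation and threshold are illustrative assumptions and do not represent the project's combined wavelet/neural-network algorithm.

      # Minimal Morlet-wavelet sketch for flagging enhanced Pc3 activity in a 1 Hz series.
      import numpy as np

      def morlet_power(signal, fs, freqs, w0=6.0):
          """Wavelet power |W(f, t)|^2 via direct convolution with a Morlet wavelet (schematic normalisation)."""
          power = np.empty((len(freqs), len(signal)))
          for i, f in enumerate(freqs):
              s = w0 / (2 * np.pi * f)                             # scale for centre frequency f
              tau = np.arange(-4 * s, 4 * s, 1 / fs)
              psi = np.pi ** -0.25 * np.exp(1j * w0 * tau / s) * np.exp(-(tau / s) ** 2 / 2) / np.sqrt(s)
              power[i] = np.abs(np.convolve(signal, np.conj(psi), mode="same")) ** 2 / fs
          return power

      fs = 1.0                                                      # 1 Hz sampling (e.g., fluxgate data)
      t = np.arange(0, 3600.0, 1 / fs)
      b = np.random.default_rng(0).normal(0, 0.2, t.size)           # background noise, nT
      b[1200:1800] += 1.0 * np.sin(2 * np.pi * 0.04 * t[1200:1800]) # 40 mHz (Pc3) wave packet

      pc3_freqs = np.linspace(0.022, 0.1, 20)
      band_power = morlet_power(b, fs, pc3_freqs).mean(axis=0)
      event = band_power > 5 * np.median(band_power)                # crude detection threshold
      print("Pc3 activity flagged in", event.sum(), "of", event.size, "samples")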

  3. New advanced tools for combined ULF wave analysis of multipoint space-borne and ground observations: application to single event and statistical studies

    NASA Astrophysics Data System (ADS)

    Balasis, G.; Papadimitriou, C.; Daglis, I. A.; Georgiou, M.; Giamini, S. A.

    2013-12-01

    In the past decade, a critical mass of high-quality scientific data on the electric and magnetic fields in the Earth's magnetosphere and topside ionosphere has been progressively collected. This data pool will be further enriched by the measurements of the upcoming ESA/Swarm mission, a constellation of three satellites in three different polar orbits between 400 and 550 km altitude, which is expected to be launched in November 2013. New analysis tools that can cope with measurements of various spacecraft at various regions of the magnetosphere and in the topside ionosphere as well as ground stations will effectively enhance the scientific exploitation of the accumulated data. Here, we report on a new suite of algorithms based on a combination of wavelet spectral methods and artificial neural network techniques and demonstrate the applicability of our recently developed analysis tools both for individual case studies and statistical studies of ultra-low frequency (ULF) waves. First, we provide evidence for a rare simultaneous observation of a ULF wave event in the Earth's magnetosphere, topside ionosphere and surface: we have found a specific time interval during the Halloween 2003 magnetic storm, when the Cluster and CHAMP spacecraft were in good local time (LT) conjunction, and have examined the ULF wave activity in the Pc3 (22-100 mHz) and Pc4-5 (1-22 mHz) bands using data from the Geotail, Cluster and CHAMP missions, as well as the CARISMA and GIMA magnetometer networks. Then, we perform a statistical study of Pc3 wave events observed by CHAMP for the full decade (2001-2010) of the satellite vector magnetic data: the creation of a database of such events enabled us to derive valuable statistics for many important physical properties relating to the spatio-temporal location of these waves, the wave power and frequency, as well as other parameters and their correlation with solar wind conditions, magnetospheric indices, electron density data, ring current decay

  4. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application

  5. Dynamics of Numerics and CFD

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Rai, Man Mohan (Technical Monitor)

    1994-01-01

    This lecture attempts to illustrate the basic ideas of how recent advances in nonlinear dynamical systems theory (dynamics) can provide new insights into the understanding of numerical algorithms used in solving nonlinear differential equations (DEs). Examples will be given of the use of dynamics to explain unusual phenomena that occur in numerics. The inadequacy of linearized analysis for understanding the long-time behavior of nonlinear problems will be illustrated, and the role of dynamics in studying the nonlinear stability, accuracy, convergence properties and efficiency of time-dependent approaches to obtaining steady-state numerical solutions in computational fluid dynamics (CFD) will be briefly explained.
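
    The lecture's theme, that linearized thinking is not enough to understand the long-time behavior of a nonlinear scheme, can be reproduced with a textbook example (not taken from the lecture itself): explicit Euler applied to the logistic ODE du/dt = u(1 - u) is exactly a logistic map, so for moderately large time steps the iteration stays bounded yet settles into spurious period-2 or chaotic states that the continuous problem does not possess. The sketch below, with invented parameters, shows this.

```python
import numpy as np

def euler_logistic(u0, h, n_steps):
    """Explicit Euler for du/dt = u(1 - u); returns the last few iterates."""
    u = u0
    history = []
    for _ in range(n_steps):
        u = u + h * u * (1.0 - u)
        history.append(u)
    return history[-4:]

# The true ODE converges to u = 1 for any u0 > 0.  The discrete map does so
# only for small enough h; larger h yields bounded but spurious period-2
# orbits and eventually chaos, none of which exist in the continuous problem.
for h in (0.5, 2.2, 2.7):
    tail = euler_logistic(0.1, h, 2000)
    print(f"h = {h}: last iterates = {np.round(tail, 4)}")
```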

  6. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertors (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
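
    The phasor analysis mentioned in this record rests on projecting nearly sinusoidal signals onto the operating frequency to obtain an amplitude and a phase. A generic way to do this for simulated or measured time series is sketched below with invented piston-position and alternator-voltage signals; it is not tied to the Sage model or MATLAB GUI described in the record.

```python
import numpy as np

def phasor(signal, t, freq):
    """Complex phasor of `signal` at `freq` (amplitude and phase of the cosine
    component), assuming an integer number of periods is covered by `t`."""
    w = 2 * np.pi * freq
    return 2.0 * np.mean(signal * np.exp(-1j * w * t))

# Invented 80 Hz operating point: piston position and alternator voltage
# with a phase lag, sampled over exactly 10 periods.
f0 = 80.0
t = np.linspace(0.0, 10 / f0, 4000, endpoint=False)
x = 5e-3 * np.cos(2 * np.pi * f0 * t)                 # piston amplitude 5 mm
v = 120.0 * np.cos(2 * np.pi * f0 * t - np.pi / 3)    # voltage lags by 60 deg

X, V = phasor(x, t, f0), phasor(v, t, f0)
print(f"|X| = {abs(X) * 1e3:.2f} mm, angle = {np.degrees(np.angle(X)):.1f} deg")
print(f"|V| = {abs(V):.1f} V,  angle = {np.degrees(np.angle(V)):.1f} deg")
print(f"voltage lags position by {np.degrees(np.angle(X) - np.angle(V)):.1f} deg")
```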

  7. Conceptual Inventory of Natural Selection as a Tool for Measuring Greek University Students' Evolution Knowledge: Differences between novice and advanced students

    NASA Astrophysics Data System (ADS)

    Athanasiou, Kyriacos; Mavrikaki, Evangelia

    2014-05-01

    The primary objective of this research was to compare various groups of Greek university students on their level of knowledge of evolution by means of natural selection (ENS). For the purpose of the study, we used a well-known questionnaire, the Conceptual Inventory of Natural Selection (CINS), in which 352 biology majors and non-majors from the University of Athens took part. A principal components analysis revealed problems with the items designed to assess the concepts of population stability, differential survival and heritable variation, so these items need to be reconsidered. Nonetheless, the results of the CINS for each Greek sub-group showed that the higher the involvement in evolution education, the higher the students' performance on the CINS test. This correlation, together with other evidence, supports the CINS authors' claims about the usefulness of the CINS as an assessment of instruction. Greek university students nevertheless gave teleological and proximate answers to many of the CINS items. Comparisons between the least and most evolution-educated university students revealed that the latter gave more evolutionary answers. Oddly, advanced biology majors did not show an improvement on all 20 items of the CINS (only on 14 of the 20) compared with novice biology students, and they even gave more teleological answers than novice biology majors to the concept that natural resources are limited. Finally, Greek university students' level of knowledge of ENS appears closer to that of Canadian than of US students.

  8. Advancing representation of hydrologic processes in the Soil and Water Assessment Tool (SWAT) through integration of the TOPographic MODEL (TOPMODEL) features

    USGS Publications Warehouse

    Chen, J.; Wu, Y.

    2012-01-01

    This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled by using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study evidences that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
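
    The TOPMODEL features referred to here revolve around the topographic wetness index ln(a/tanβ), which links the local water-table position to upslope contributing area and slope, and an exponential transmissivity decay that gives baseflow of the form Qb = Q0·exp(-S/m). The sketch below evaluates the index and that baseflow relation for invented grid-cell values; it is a generic TOPMODEL illustration, not the SWAT-TOP implementation of the paper.

```python
import numpy as np

# Invented per-cell values for a tiny catchment: upslope contributing area
# per unit contour length a [m] and local slope tan(beta) [-].
a = np.array([50.0, 200.0, 1200.0, 8000.0])
tan_beta = np.array([0.30, 0.12, 0.05, 0.01])

twi = np.log(a / tan_beta)                  # topographic wetness index ln(a/tanB)
print("wetness index per cell:", np.round(twi, 2))
print("catchment mean index  :", round(twi.mean(), 2))

def topmodel_baseflow(q0, storage_deficit, m):
    """Exponential-transmissivity baseflow: Qb = Q0 * exp(-S/m)."""
    return q0 * np.exp(-storage_deficit / m)

# Example: deficit S = 40 mm, recession parameter m = 60 mm, Q0 = 2 mm/day.
print("baseflow [mm/day]:", round(topmodel_baseflow(2.0, 40.0, 60.0), 3))
```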

  9. A Flexible Visualization Tool for Rapid Access to EFIT Results

    NASA Astrophysics Data System (ADS)

    Zhang, Ruirui; Xiao, Bingjia; Luo, Zhengping

    2014-04-01

    This paper introduces the design and implementation of an interactive tool, the EASTViewer, for the visualization of plasma equilibrium reconstruction results for EAST (the Experimental Advanced Superconducting Tokamak). To keep the tool independent of the operating system, Python combined with the PyGTK toolkit is used as the programming language. Using a modular design, the EASTViewer provides a unified interface with great flexibility: it can access numerous data sources, either local data files or an MDSplus tree, and with pre-defined configuration files it can be extended to other tokamaks. The EASTViewer has been used as the major tool to visualize equilibrium data since the second EAST campaign in 2008; experience has verified that it features a user-friendly interface, gives easy access to numerous data sources, and runs across platforms.

  10. Interface between a printed circuit board computer aided design tool (Tektronix 4051 based) and a numerical paper tape controlled drill press (Slo-Syn 530: 100 w/ Dumore Automatic Head Number 8391)

    SciTech Connect

    Heckman, B.K.; Chinn, V.K.

    1981-01-01

    The development and use of computer programs written to produce the paper tape needed for the automation, or numeric control, of drill presses employed to fabricate computer-designed printed circuit boards are described. (LCL)

  11. A storm modeling system as an advanced tool in prediction of well organized slowly moving convective cloud system and early warning of severe weather risk

    NASA Astrophysics Data System (ADS)

    Spiridonov, Vlado; Curic, Mladjen

    2015-02-01

    Short-range prediction of precipitation is a critical input to flood prediction and hence to the accuracy of flood warnings. Since most of the intense processes originate in convective clouds, the primary aim is to forecast these small-scale atmospheric processes. One characteristic pattern of an organized group of convective clouds is a line of deep convection that results in the repeated passage of heavy-rain-producing convective cells along the line; such a slowly moving convective system passed over the NW part of Macedonia and produced extreme local rainfall and hailfall in the urban area of Skopje. A 3-D cloud model is used to simulate the main storm characteristics (e.g., structure, intensity, evolution) and the main physical processes responsible for the initiation of heavy rainfall and hailfall. The model produced significantly more realistic and spatially accurate forecasts of the convective rainfall event than is possible with the current operational system. The output gives a good starting point for developing tools such as flooding indices and potential-risk maps for interpreting and presenting the predictions, so that they enhance the operational flood prediction capabilities and severe-weather warnings of weather services. The convective-scale model, even for the single case used here, showed significant benefits in several respects (initiation of convection, storm structure and evolution, and precipitation). The storm-scale model (1 km grid spacing) is capable of producing significantly more realistic and spatially accurate forecasts of convective rainfall events than is possible with current operational systems based on a model with 15 km grid spacing.

  12. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    Non-Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance, especially in terms of minimizing PB, compared to the single-objective model calibration. The third study builds upon the first two studies by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration within the Upper Neuse watershed. In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally-demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina. The third study leveraged the

  13. Eclipse Parallel Tools Platform

    SciTech Connect

    Watson, Gregory; DeBardeleben, Nathan; Rasmussen, Craig

    2005-02-18

    Designing and developing parallel programs is an inherently complex task. Developers must choose from the many parallel architectures and programming paradigms that are available, and face a plethora of tools that are required to execute, debug, and analyze parallel programs in these environments. Few, if any, of these tools provide any degree of integration, or indeed any commonality in their user interfaces at all. This further complicates the parallel developer's task, hampering software engineering practices, and ultimately reducing productivity. One consequence of this complexity is that best practice in parallel application development has not advanced to the same degree as more traditional programming methodologies. The result is that there is currently no open-source, industry-strength platform that provides a highly integrated environment specifically designed for parallel application development. Eclipse is a universal tool-hosting platform that is designed to provide a robust, full-featured, commercial-quality, industry platform for the development of highly integrated tools. It provides a wide range of core services for tool integration that allow tool producers to concentrate on their tool technology rather than on platform-specific issues. The Eclipse Integrated Development Environment is an open-source project that is supported by over 70 organizations, including IBM, Intel and HP. The Eclipse Parallel Tools Platform (PTP) plug-in extends the Eclipse framework by providing support for a rich set of parallel programming languages and paradigms, and a core infrastructure for the integration of a wide variety of parallel tools. The first version of the PTP is a prototype that only provides minimal functionality for parallel tool integration, support for a small number of parallel architectures, and basis

  14. Double diameter boring tool

    DOEpatents

    Ashbaugh, F.A.; Murry, K.R.

    1986-02-10

    A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting flutes formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first flute tip to the axis of rotation plus the distance from the second flute tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second flute tip to the axis of rotation minus one-half the distance from the first flute tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.
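
    The geometric relations stated in this abstract (tool body diameter equal to the sum of the two flute-tip radii, and the rotation axis offset from the tool centerline by half of their difference) can be checked with a few lines of arithmetic; the radii below are invented for illustration only.

```python
# Distances from the axis of rotation to the two flute tips (invented values).
r1 = 10.0   # mm, first (smaller-hole) flute tip
r2 = 14.0   # mm, second (larger-hole) flute tip

body_diameter = r1 + r2            # elongated tool body diameter
axis_offset = 0.5 * r2 - 0.5 * r1  # rotation axis to tool centerline

print(f"bored hole diameters : {2 * r1:.1f} mm and {2 * r2:.1f} mm")
print(f"tool body diameter   : {body_diameter:.1f} mm")
print(f"centerline offset    : {axis_offset:.1f} mm")
```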

  15. Double diameter boring tool

    DOEpatents

    Ashbaugh, Fred N.; Murry, Kenneth R.

    1988-12-27

    A boring tool and a method of operation are provided for boring two concentric holes of precision diameters and depths in a single operation. The boring tool includes an elongated tool body, a shank for attachment to a standard adjustable boring head which is used on a manual or numerical control milling machine and first and second diametrically opposed cutting edges formed for cutting in opposite directions. The diameter of the elongated tool body is substantially equal to the distance from the first cutting edge tip to the axis of rotation plus the distance from the second cutting edge tip to the axis of rotation. The axis of rotation of the tool is spaced from the tool centerline a distance substantially equal to one-half the distance from the second cutting edge tip to the axis of rotation minus one-half the distance from the first cutting edge tip to the axis of rotation. The method includes the step of inserting the boring tool into the boring head, adjusting the distance between the tool centerline and the tool axis of rotation as described above and boring the two concentric holes.

  16. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  17. A survey of parallel programming tools

    NASA Technical Reports Server (NTRS)

    Cheng, Doreen Y.

    1991-01-01

    This survey examines 39 parallel programming tools. Focus is placed on those tool capabilities needed for parallel scientific programming rather than for general computer science. The tools are classified with the current and future needs of the Numerical Aerodynamic Simulator (NAS) in mind: existing and anticipated NAS supercomputers and workstations; operating systems; programming languages; and applications. They are divided into four categories: suggested acquisitions; tools already brought in; tools worth tracking; and tools eliminated from further consideration at this time.

  18. Space Station robotics planning tools

    NASA Technical Reports Server (NTRS)

    Testa, Bridget Mintz

    1992-01-01

    The concepts are described for the set of advanced Space Station Freedom (SSF) robotics planning tools for use in the Space Station Control Center (SSCC). It is also shown how planning for SSF robotics operations is an international process, and baseline concepts are indicated for that process. Current SRMS methods provide the backdrop for this SSF theater of multiple robots, long operating time-space, advanced tools, and international cooperation.

  19. Shifting tools

    SciTech Connect

    Fisher, E.P.; Welch, W.R.

    1984-03-13

    An improved shifting tool connectable in a well tool string and useful to engage and position a slidable sleeve in a sliding sleeve device in a well flow conductor. The selectively profiled shifting tool keys provide better fit with and more contact area between keys and slidable sleeves. When the engaged slidable sleeve cannot be moved up and the shifting tool is not automatically disengaged, emergency disengagement means may be utilized by applying upward force to the shifting tool sufficient to shear pins and cause all keys to be cammed inwardly at both ends to completely disengage for removal of the shifting tool from the sliding sleeve device.

  20. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for the detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual-energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing these advances, cardiac CT has moved beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of molecular processes that are involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  1. Welding mechanics for advanced component safety assessment

    NASA Astrophysics Data System (ADS)

    Siegele, Dieter

    2011-06-01

    Numerical methods are nowadays a useful tool for the calculation of distortion and residual stresses resulting from the welding process. Modern finite element codes not only allow for the calculation of deformations and stresses due to the welding process but also take into account the change of microstructure due to different heating and cooling rates. As an extension of the pure welding simulation, the field of welding mechanics combines the mechanics and the material behaviour from the welding process with the assessment of the service behaviour of welded components. In the paper, new results of experimental and numerical work in the field of welding mechanics are described. Through examples from automotive, nuclear and pipeline applications it is demonstrated that a balanced treatment and a close interaction of "process", "properties" and "defects" are necessary to arrive at an advanced fitness-for-service assessment of welded components.

  2. Advances in HIV Prevention for Serodiscordant Couples

    PubMed Central

    Muessig, Kathryn E.; Cohen, Myron S.

    2014-01-01

    Serodiscordant couples play an important role in maintaining the global HIV epidemic. This review summarizes biobehavioral and biomedical HIV prevention options for serodiscordant couples focusing on advances in 2013 and 2014, including World Health Organization guidelines and best-evidence for couples counseling, couples-based interventions, and the use of antiviral agents for prevention. In the past few years marked advances have been made in HIV prevention for serodiscordant couples and numerous ongoing studies are continuously expanding HIV prevention tools, especially in the area of pre-exposure prophylaxis. Uptake and adherence to antiviral therapy remains a key challenge. Additional research is needed to develop evidence-based interventions for couples, and especially for male-male couples. Randomized trials have demonstrated the prevention benefits of antiretroviral-based approaches among serodiscordant couples; however, residual transmission observed in recognized serodiscordant couples represents an important and resolvable challenge in HIV prevention. PMID:25145645

  3. FORTRAN tools

    NASA Technical Reports Server (NTRS)

    Presser, L.

    1978-01-01

    An integrated set of FORTRAN tools that are commercially available is described. The basic purpose of various tools is summarized and their economic impact highlighted. The areas addressed by these tools include: code auditing, error detection, program portability, program instrumentation, documentation, clerical aids, and quality assurance.

  4. Percussion tool

    SciTech Connect

    Reed, Teddy R.

    2006-11-28

    A percussion tool is described which includes a housing mounting a tool bit; a reciprocally moveable hammer borne by the housing and which is operable to repeatedly strike the tool bit; and a reciprocally moveable piston enclosed within the hammer and which imparts reciprocal movement to the reciprocally moveable hammer.

  5. Numerical Simulation of Multicomponent Chromatography Using Spreadsheets.

    ERIC Educational Resources Information Center

    Frey, Douglas D.

    1990-01-01

    Illustrated is the use of spreadsheet programs for implementing finite difference numerical simulations of chromatography as an instructional tool in a separations course. Discussed are differential equations, discretization and integration, spreadsheet development, computer requirements, and typical simulation results. (CW)
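
    The kind of spreadsheet finite-difference calculation described in this record can be mimicked with an explicit upwind march of solute concentration along the column. The sketch below treats a single component with a linear isotherm, so adsorption simply retards the advection front; the column parameters are arbitrary and the scheme is the simplest possible, chosen for clarity rather than fidelity to the article.

```python
import numpy as np

# Column discretization (arbitrary values for illustration).
n_cells, dz = 100, 0.01        # 1 m column split into 1 cm cells
v = 1.0e-3                     # interstitial velocity [m/s]
K, eps = 2.0, 0.4              # linear isotherm q = K*c, bed porosity
R = 1.0 + (1.0 - eps) / eps * K   # retardation factor
dt = 0.8 * dz * R / v          # time step satisfying the CFL condition

c = np.zeros(n_cells)          # initial concentration in the column
c_feed = 1.0                   # step injection at the inlet

# Explicit upwind update, exactly what each spreadsheet row/column would hold:
# c_i(new) = c_i - (v*dt)/(R*dz) * (c_i - c_{i-1})
for _ in range(100):
    upstream = np.concatenate(([c_feed], c[:-1]))
    c = c - (v * dt) / (R * dz) * (c - upstream)

print("breakthrough profile (every 10th cell):")
print(np.round(c[::10], 3))
```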

  6. Failure Environment Analysis Tool (FEAT)

    NASA Technical Reports Server (NTRS)

    Lawler, D. G.

    1991-01-01

    Information is given in viewgraph form on the Failure Environment Analysis Tool (FEAT), a tool designed to demonstrate advanced modeling and analysis techniques to better understand and capture the flow of failures within and between elements of the Space Station Freedom (SSF) and other large complex systems. Topics covered include objectives, development background, the technical approach, SSF baseline integration, and FEAT growth and evolution.

  7. PV Hourly Simulation Tool

    SciTech Connect

    Dean, Jesse; Metzger, Ian

    2010-12-31

    This software requires inputs of simple general building characteristics and usage information to calculate the energy and cost benefits of solar PV. The tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. The tool includes the option for advanced system design inputs if they are known. It calculates energy savings, demand reduction, cost savings, incentives and building life cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
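
    The life-cycle-cost outputs listed above (simple payback, discounted payback, net present value, savings-to-investment ratio) follow standard definitions that can be sketched in a few lines. The cash flows and discount rate below are invented, and this is a generic illustration rather than the tool's actual calculation method.

```python
def lifecycle_metrics(install_cost, annual_savings, years, rate):
    """Simple/discounted payback, NPV and SIR for a level annual savings stream."""
    simple_payback = install_cost / annual_savings
    discounted, cumulative, disc_payback = [], 0.0, None
    for y in range(1, years + 1):
        d = annual_savings / (1.0 + rate) ** y
        discounted.append(d)
        cumulative += d
        if disc_payback is None and cumulative >= install_cost:
            disc_payback = y
    pv_savings = sum(discounted)
    npv = pv_savings - install_cost
    sir = pv_savings / install_cost
    return simple_payback, disc_payback, npv, sir

# Invented example: $20,000 system, $1,800/yr savings, 25-yr life, 4% discount rate.
sp, dp, npv, sir = lifecycle_metrics(20000.0, 1800.0, 25, 0.04)
print(f"simple payback     : {sp:.1f} yr")
print(f"discounted payback : {dp} yr")
print(f"net present value  : ${npv:,.0f}")
print(f"savings/investment : {sir:.2f}")
```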

  8. PV Hourly Simulation Tool

    2010-12-31

    This software requires inputs of simple general building characteristics and usage information to calculate the energy and cost benefits of solar PV. The tool conducts a complex hourly simulation of solar PV based primarily on the area available on the rooftop. It uses a simplified efficiency calculation method and real panel characteristics. It includes a detailed rate structure to account for time-of-use rates, on-peak and off-peak pricing, and multiple rate seasons. The tool includes the option for advanced system design inputs if they are known. It calculates energy savings, demand reduction, cost savings, incentives and building life cycle costs, including simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.

  9. The representation of numerical magnitude

    PubMed Central

    Brannon, Elizabeth M

    2006-01-01

    The combined efforts of many fields are advancing our understanding of how number is represented. Researchers studying numerical reasoning in adult humans, developing humans and non-human animals are using a suite of behavioral and neurobiological methods to uncover similarities and differences in how each population enumerates and compares quantities to identify the neural substrates of numerical cognition. An important picture emerging from this research is that adult humans share with non-human animals a system for representing number as language-independent mental magnitudes and that this system emerges early in development. PMID:16546373

  10. Hydroforming Of Patchwork Blanks — Numerical Modeling And Experimental Validation

    NASA Astrophysics Data System (ADS)

    Lamprecht, Klaus; Merklein, Marion; Geiger, Manfred

    2005-08-01

    In comparison to the commonly applied technology of tailored blanks, the concept of patchwork blanks offers a number of additional advantages. Potential application areas for patchwork blanks in the automotive industry are, e.g., local reinforcements of automotive closures, structural reinforcements of rails and pillars, as well as shock towers. However, even though there is significant application potential for patchwork blanks in automobile production, industrial realization of this innovative technique has been slowed by a lack of knowledge regarding the forming behavior and the numerical modeling of patchwork blanks. Especially for the numerical simulation of hydroforming processes, where one part of the forming tool is replaced by a fluid under pressure, advanced modeling techniques are required to ensure an accurate prediction of the blanks' forming behavior. The objective of this contribution is to provide an appropriate model for the numerical simulation of patchwork blank forming processes. Therefore, different finite element modeling techniques for patchwork blanks are presented. In addition to basic shell element models, a combined finite element model consisting of shell and solid elements is defined. Special emphasis is placed on the modeling of the weld seam. For this purpose, the local mechanical properties of the weld metal, which have been determined by means of Martens-hardness measurements and uniaxial tensile tests, are integrated in the finite element models. The results obtained from the numerical simulations are compared to experimental data from a hydraulic bulge test. In this context, the focus is on laser- and spot-welded patchwork blanks.

  11. New Instrumental Tools for Advanced Astrochemical Applications

    NASA Astrophysics Data System (ADS)

    Steber, Amanda; Zinn, Sabrina; Schnell, Melanie; Rijs, Anouk

    2015-06-01

    Astrochemistry has been a growing field over the past several years. As the data from the Atacama Large Millimeter Array (ALMA) becomes publicly available, new and fast techniques for the analysis of the data will need to be developed, as well as fast, sensitive laboratory techniques. This lab is in the process of building up instrumentation that will be dedicated to the measurement of astrochemically relevant species, both in the microwave and the millimeter wave regimes. Discharge experiments, laser ablation experiments, as well as time of flight measurements will be possible with this instrumentation. Coupled with these instrumentation capabilities will be new software aimed at speeding up the analysis. The laboratory data will be used to search for new molecular signatures in the interstellar medium (ISM) and to help elucidate molecular reaction pathways occurring in the ISM.

  12. Astronomer's Proposal Tool

    NASA Technical Reports Server (NTRS)

    Krueger, Tony

    2005-01-01

    Astronomer's Proposal Tool (APT) is a computer program that assists astronomers in preparing their Phase 1 and Phase 2 Hubble Space Telescope science programs. APT is a successor to the Remote Proposal Submission System 2 (RPS2) program, which has been rendered obsolete by more recent advances in computer software and hardware. APT exploits advances associated with widespread use of the Internet, multiplatform visual development software tools, and overall increases in the power of desktop computer hardware, all in such a way as to make the preparation and submission of proposals more intuitive and make observatory operations less cumbersome. APT provides documentation and help that are friendly, up to date, and easily accessible to users of varying levels of expertise, while defining an extensible framework that is responsive to changes in both technology and observatory operations. APT consists of two major components: (1) a set of software tools that are intuitive, visual, and responsive and (2) an integrated software environment that unifies all the tools and makes them interoperable. The APT tools include the Visual Target Tuner, Proposal Editor, Exposure Planner, Bright Object Checker, and Visit Planner.

  13. Numerical Modelling of Gelating Aerosols

    SciTech Connect

    Babovsky, Hans

    2008-09-01

    The numerical simulation of the gel phase transition of an aerosol system is an interesting and demanding task. Here, we follow an approach first discussed in [6, 8], which turns out to be a useful numerical tool. We investigate several improvements and generalizations. The focus is on coagulation-diffusion systems, where the aerosol dynamics is supplemented with diffusive spreading in physical space. This leads to a variety of scenarios (depending on the coagulation kernel and the diffusion model) for the spatial evolution of the gelation area.
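
    The coagulation part of such a model is the discrete Smoluchowski equation. A minimal explicit time-march for a constant kernel is sketched below, tracking the number densities of k-mers and monitoring mass; loss of mass from the tracked sizes is the usual numerical signature of approaching gelation when growth-favoring kernels are used. Kernel, truncation size and initial condition are arbitrary, and no spatial diffusion is included.

```python
import numpy as np

K_MAX = 60          # largest tracked cluster size
kernel = 1.0        # constant coagulation kernel K(i, j) = 1
dt, n_steps = 0.01, 200

n = np.zeros(K_MAX + 1)      # n[k] = number density of k-mers (index 0 unused)
n[1] = 1.0                   # initially monomers only

for _ in range(n_steps):
    gain = np.zeros_like(n)
    loss = np.zeros_like(n)
    for i in range(1, K_MAX + 1):
        for j in range(1, K_MAX + 1):
            rate = kernel * n[i] * n[j]
            loss[i] += rate                  # i-mer consumed by any collision
            if i + j <= K_MAX:
                gain[i + j] += 0.5 * rate    # i + j -> (i+j)-mer, counted once
    n += dt * (gain - loss)

total_number = n[1:].sum()
total_mass = (np.arange(1, K_MAX + 1) * n[1:]).sum()
print(f"number density: {total_number:.4f} (decreases as clusters merge)")
print(f"tracked mass  : {total_mass:.4f} (close to 1 unless mass leaks past K_MAX)")
```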

  14. Handling geophysical flows: Numerical modelling using Graphical Processing Units

    NASA Astrophysics Data System (ADS)

    Garcia-Navarro, Pilar; Lacasta, Asier; Juez, Carmelo; Morales-Hernandez, Mario

    2016-04-01

    Computational tools may help engineers in the assessment of sediment transport during decision-making processes. The main requirements are that the numerical results have to be accurate and the simulation models must be fast. The present work is based on the 2D shallow water equations in combination with the 2D Exner equation [1]. The accuracy of the resulting numerical model was already discussed in previous work. Regarding the speed of the computation, the Exner equation slows down the already costly 2D shallow water model, as the number of variables to solve is increased and the numerical stability is more restrictive. On the other hand, the movement of poorly sorted material over steep areas constitutes a hazardous environmental problem, and computational tools help in the prediction of such landslides [2]. In order to overcome this problem, this work proposes the use of Graphical Processing Units (GPUs) for significantly decreasing the simulation time [3, 4]. The numerical scheme implemented on the GPU is a finite volume scheme. The mathematical model and the numerical implementation are compared against experimental and field data. In addition, the computational times obtained with the graphical hardware technology are compared against single-core (sequential) and multi-core (parallel) CPU implementations. References: [Juez et al. (2014)] Juez, C., Murillo, J., & García-Navarro, P. (2014). A 2D weakly-coupled and efficient numerical model for transient shallow flow and movable bed. Advances in Water Resources, 71, 93-109. [Juez et al. (2013)] Juez, C., Murillo, J., & García-Navarro, P. (2013). 2D simulation of granular flow over irregular steep slopes using global and local coordinates. Journal of Computational Physics, 225, 166-204. [Lacasta et al. (2014)] Lacasta, A., Morales-Hernández, M., Murillo, J., & García-Navarro, P. (2014). An optimized GPU implementation of a 2D free surface simulation model on unstructured meshes. Advances in Engineering Software, 78, 1-15. [Lacasta
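
    The per-cell flux updates of a finite volume scheme are what map naturally onto GPU threads. A deliberately minimal 1D shallow-water analogue (first-order Rusanov fluxes, flat bed, no Exner coupling) is sketched below in plain numpy; the model in the record is 2D, includes sediment transport and runs on graphics hardware, so this only shows the shape of the data-parallel kernel.

```python
import numpy as np

g = 9.81
nx, dx = 200, 1.0
h = np.where(np.arange(nx) < nx // 2, 2.0, 1.0)   # dam-break depth [m]
hu = np.zeros(nx)                                  # discharge h*u [m^2/s]

def physical_flux(h, hu):
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def step(h, hu, dt):
    """One first-order Rusanov (local Lax-Friedrichs) finite volume update."""
    q = np.array([h, hu])
    f = physical_flux(h, hu)
    c = np.abs(hu / h) + np.sqrt(g * h)
    # Interface fluxes between cell i and i+1 (vectorized over all interfaces).
    a = np.maximum(c[:-1], c[1:])
    flux = 0.5 * (f[:, :-1] + f[:, 1:]) - 0.5 * a * (q[:, 1:] - q[:, :-1])
    qn = q.copy()
    qn[:, 1:-1] -= dt / dx * (flux[:, 1:] - flux[:, :-1])
    return qn[0], qn[1]

t, t_end = 0.0, 10.0
while t < t_end:
    dt = 0.4 * dx / np.max(np.abs(hu / h) + np.sqrt(g * h))  # CFL condition
    h, hu = step(h, hu, dt)
    t += dt

print("depth every 20th cell after the dam break:", np.round(h[::20], 3))
```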

  15. Tool Using

    PubMed Central

    Kahrs, Björn A.; Lockman, Jeffrey J.

    2014-01-01

    Research on the development of tool use in children has often emphasized the cognitive bases of this achievement, focusing on the choice of an artifact, but has largely neglected its motor foundations. However, research across diverse fields, from evolutionary anthropology to cognitive neuroscience, converges on the idea that the actions that embody tool use are also critical for understanding its ontogenesis and phylogenesis. In this article, we highlight findings across these fields to show how a deeper examination of the act of tool using can inform developmental accounts and illuminate what makes human tool use unique. PMID:25400691

  16. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  17. GRIPPING TOOL

    DOEpatents

    Sandrock, R.J.

    1961-12-12

    A self-actuated gripping tool is described for transferring fuel elements and the like into reactors and other inaccessible locations. The tool will grasp or release the load only when properly positioned for this purpose. In addition, the load cannot be released except when unsupported by the tool, so that jarring or contact will not bring about accidental release of the load. The gripping members or jaws of the device are cam-actuated by an axially slidable shaft which has two lockable positions. A spring urges the shaft into one position and a solenoid is provided to overcome the spring and move it into the other position. The weight of the tool operates a sleeve to lock the shaft in its existing position. Only when the cable supporting the tool is slack is the device capable of being actuated either to grasp or release its load. (AEC)

  18. Omics Tools

    SciTech Connect

    Schaumberg, Andrew

    2012-12-21

    The Omics Tools package provides several small trivial tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package; Omics Tools does not contain Infernal, which may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, though cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop; Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command shows the currently available tools, as shown below:
      schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
      Known commands are:
        cmgbk           : compare cmsearch and GenBank Infernal hits
        cmgff           : compare hits among two GFF (version 3) files
        cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
        cmsearch.local  : find Infernal hits in a genome, on your workstation
        fastats         : FASTA stats, e.g. # bases, GC content
        pal             : stem-loop motif detection by palindromic sequence search (code stub)
        randgrp         : random subsample without replacement, of groups
        randgrpr        : random subsample with replacement, of groups (fast)
        randsub         : random subsample without replacement, of file lines
      For more help regarding a particular command, use: java -jar omics.jar command help
      Usage: java -jar omics.jar command args

  19. Omics Tools

    2012-12-21

    The Omics Tools package provides several small trivial tools for work in genomics. This single portable package, the “omics.jar” file, is a toolbox that works in any Java-based environment, including PCs, Macs, and supercomputers. The number of tools is expected to grow. One tool (called cmsearch.hadoop or cmsearch.local) calls the external cmsearch program to predict non-coding RNA in a genome. The cmsearch program is part of the third-party Infernal package; Omics Tools does not contain Infernal, which may be installed separately. The cmsearch.hadoop subtool requires Apache Hadoop and runs on a supercomputer, though cmsearch.local does not and runs on a server. Omics Tools does not contain Hadoop; Hadoop may be installed separately. The other tools (cmgbk, cmgff, fastats, pal, randgrp, randgrpr, randsub) do not interface with third-party tools. Omics Tools is written in the Java and Scala programming languages. Invoking the “help” command shows the currently available tools, as shown below:
      schaumbe@gpint06:~/proj/omics$ java -jar omics.jar help
      Known commands are:
        cmgbk           : compare cmsearch and GenBank Infernal hits
        cmgff           : compare hits among two GFF (version 3) files
        cmsearch.hadoop : find Infernal hits in a genome, on your supercomputer
        cmsearch.local  : find Infernal hits in a genome, on your workstation
        fastats         : FASTA stats, e.g. # bases, GC content
        pal             : stem-loop motif detection by palindromic sequence search (code stub)
        randgrp         : random subsample without replacement, of groups
        randgrpr        : random subsample with replacement, of groups (fast)
        randsub         : random subsample without replacement, of file lines
      For more help regarding a particular command, use: java -jar omics.jar command help
      Usage: java -jar omics.jar command args

  20. Evaluation of dietary assessment tools used to assess the diet of adults participating in the Communities Advancing the Studies of Tribal Nations Across the Lifespan (CoASTAL) cohort

    PubMed Central

    Fialkowski, Marie K.; McCrory, Megan A.; Roberts, Sparkle M.; Tracy, J. Kathleen; Grattan, Lynn M.

    2011-01-01

    Background Accurate assessment of dietary intake is essential for researchers and public health practitioners to make advancements in health. This is especially important in Native Americans who display disease prevalence rates that are dramatically higher than the general U.S. population. Objective The objective of this study was to evaluate three dietary assessment tools: 1) dietary records, 2) a food frequency questionnaire (FFQ), and 3) a shellfish assessment survey (SAS) among Native American adults from the Communities Advancing Studies of Tribal Nations Across the Lifespan (CoASTAL) cohort. Design CoASTAL was comprised of randomly selected individuals from three tribal registries of Pacific Northwest Tribal Nations. This cross-sectional study used data from the baseline of CoASTAL and was restricted to the non-pregnant adults (18+ yr) who completed the SAS (n=500), a FFQ (n=518), dietary records (n=444), weight measures (n=493), and height measures (n=496). Paired t-tests, Pearson correlation coefficients, and percent agreement were used to evaluate the dietary records and the FFQ with and without accounting for plausibility of reported energy intake (rEI). Sensitivity and specificity as well as Spearman correlation coefficients were used to evaluate the SAS and the FFQ compared to dietary records. Results Statistically significant correlations between the FFQ and dietary records for selected nutrients were not the same by gender. Accounting for plausibility of rEI for the dietary records and the FFQ improved the strength of the correlations for percent energy from protein, energy from carbohydrate, and calcium for both men and women. In addition, significant associations between rEI (dietary records and FFQ) and weight were more apparent when using only rEI considered plausible. The SAS was found to similarly assess shellfish consumption in comparison to the FFQ. Conclusion These results support the benefit of multiple measures of diet, including regional

  1. Numerical Modeling in Geodynamics: Success, Failure and Perspective

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2005-12-01

    A real success in numerical modeling of the dynamics of the Earth can be achieved only by multidisciplinary research teams of experts in geodynamics, applied and pure mathematics, and computer science. The success in numerical modeling is based on the following basic, but simple, rules. (i) People need simplicity most, but they understand intricacies best (B. Pasternak, writer). Start from a simple numerical model, which describes basic physical laws by a set of mathematical equations, and only then move to a complex model. Never start from a complex model, because you cannot understand the contribution of each term of the equations to the modeled geophysical phenomenon. (ii) Study the numerical methods behind your computer code. Otherwise it becomes difficult to distinguish true and erroneous solutions to the geodynamic problem, especially when your problem is complex enough. (iii) Test your model against analytical and asymptotic solutions and simple 2D and 3D model examples. Develop benchmark analyses of different numerical codes and compare numerical results with laboratory experiments. Remember that the numerical tool you employ is not perfect, and there are small bugs in every computer code. Therefore testing is the most important part of your numerical modeling. (iv) Prove (if possible) or learn relevant statements concerning the existence, uniqueness and stability of the solution to the mathematical and discrete problems. Otherwise you can solve an improperly posed problem, and the results of the modeling will be far from the true solution of your model problem. (v) Try to analyze numerical models of a geological phenomenon using as few tuning model variables as possible. Two tuning variables already give enough freedom to constrain your model reasonably well with respect to observations. Data fitting is sometimes quite attractive but can take you far from the principal aim of your numerical modeling: to understand geophysical phenomena. (vi) If the number of
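
    Rule (iii), testing a code against analytical solutions, can be as simple as checking that a finite-difference heat-equation solver reproduces the known exponential decay of a sine mode. The sketch below is a generic example of such a benchmark, not code from the paper.

```python
import numpy as np

# Heat equation u_t = kappa * u_xx on [0, 1] with u(0) = u(1) = 0.
# The initial condition sin(pi x) decays analytically as exp(-kappa * pi^2 * t).
kappa, nx = 1.0, 101
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.4 * dx**2 / kappa          # explicit stability limit is 0.5*dx^2/kappa
u = np.sin(np.pi * x)

t_end, t = 0.05, 0.0
while t < t_end:
    u[1:-1] += dt * kappa * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    t += dt

exact = np.sin(np.pi * x) * np.exp(-kappa * np.pi**2 * t)
print(f"max error vs analytical solution: {np.max(np.abs(u - exact)):.2e}")
```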

  2. Eclipse Parallel Tools Platform

    2005-02-18

    Designing and developing parallel programs is an inherently complex task. Developers must choose from the many parallel architectures and programming paradigms that are available, and face a plethora of tools that are required to execute, debug, and analyze parallel programs in these environments. Few, if any, of these tools provide any degree of integration, or indeed any commonality in their user interfaces at all. This further complicates the parallel developer's task, hampering software engineering practices, and ultimately reducing productivity. One consequence of this complexity is that best practice in parallel application development has not advanced to the same degree as more traditional programming methodologies. The result is that there is currently no open-source, industry-strength platform that provides a highly integrated environment specifically designed for parallel application development. Eclipse is a universal tool-hosting platform that is designed to provide a robust, full-featured, commercial-quality, industry platform for the development of highly integrated tools. It provides a wide range of core services for tool integration that allow tool producers to concentrate on their tool technology rather than on platform-specific issues. The Eclipse Integrated Development Environment is an open-source project that is supported by over 70 organizations, including IBM, Intel and HP. The Eclipse Parallel Tools Platform (PTP) plug-in extends the Eclipse framework by providing support for a rich set of parallel programming languages and paradigms, and a core infrastructure for the integration of a wide variety of parallel tools. The first version of the PTP is a prototype that only provides minimal functionality for parallel tool integration, support for a small number of parallel architectures

  3. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focused on extending the CTAS software and computer-human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for Dallas-Fort Worth, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and to climb to cruise altitude along the most efficient routes.

  4. Drilling tool

    SciTech Connect

    Baumann, O.; Dohse, H.P.; Reibetanz, W.; Wanner, K.

    1983-09-27

    A drilling tool is disclosed which has a drilling shaft member, a crown drilling member with an annular wall provided with a plurality of cutting edges and detachably mounted on the shaft member, a center drilling member detachably mounted on the shaft member inside the crown drilling member and having a further cutting edge, and elements for limiting a drilling depth of the tool when the center drilling member is mounted on the shaft member. Thereby, the operator of the drilling tool, after drilling a guiding groove in a rock, is forced to remove the center drilling member from the drilling tool and drill further without the center drilling member, which increases the drilling efficiency.

  5. Modern industrial simulation tools: Kernel-level integration of high performance parallel processing, object-oriented numerics, and adaptive finite element analysis. Final report, July 16, 1993--September 30, 1997

    SciTech Connect

    Deb, M.K.; Kennon, S.R.

    1998-04-01

    A cooperative R&D effort between industry and the US government, this project, under the HPPP (High Performance Parallel Processing) initiative of the Dept. of Energy, started the investigations into parallel object-oriented (OO) numerics. The basic goal was to research and utilize the emerging technologies to create a physics-independent computational kernel for applications using the adaptive finite element method. The industrial team included Computational Mechanics Co., Inc. (COMCO) of Austin, TX (as the primary contractor), Scientific Computing Associates, Inc. (SCA) of New Haven, CT, Texaco and CONVEX. Sandia National Laboratory (Albq., NM) was the technology partner from the government side. COMCO was responsible for the main kernel design and development, SCA had the lead in parallel solver technology, and Sandia's main contribution to this venture was guidance on OO technologies. CONVEX and Texaco supported the partnership with hardware resources and application knowledge, respectively. As such, a minimum of fifty-percent cost-sharing was provided by the industry partnership during this project. This report describes the R&D activities and provides some details about the prototype kernel and example applications.

  6. Advanced extravehicular mobility unit study

    NASA Technical Reports Server (NTRS)

    Elkins, W.

    1982-01-01

    Components of the advanced extravehicular mobility unit (suit) are described. Design considerations for radiation protection, extravehicular operational pressure, mobility effects, tool/glove/effector, anthropometric definition, lighting, and equipment turnaround are addressed.

  7. Advanced 0.3-NA EUV lithography capabilities at the ALS

    SciTech Connect

    Naulleau, Patrick; Anderson, Erik; Dean, Kim; Denham, Paul; Goldberg, Kenneth A.; Hoef, Brian; Jackson, Keith

    2005-07-07

    For volume nanoelectronics production using Extreme ultraviolet (EUV) lithography [1] to become a reality around the year 2011, advanced EUV research tools are required today. Microfield exposure tools have played a vital role in the early development of EUV lithography [2-4] concentrating on numerical apertures (NA) of 0.2 and smaller. Expected to enter production at the 32-nm node with NAs of 0.25, EUV can no longer rely on these early research tools to provide relevant learning. To overcome this problem, a new generation of microfield exposure tools, operating at an NA of 0.3 have been developed [5-8]. Like their predecessors, these tools trade off field size and speed for greatly reduced complexity. One of these tools is implemented at Lawrence Berkeley National Laboratory's Advanced Light Source synchrotron radiation facility. This tool gets around the problem of the intrinsically high coherence of the synchrotron source [9,10] by using an active illuminator scheme [11]. Here we describe recent printing results obtained from the Berkeley EUV exposure tool. Limited by the availability of ultra-high resolution chemically amplified resists, present resolution limits are approximately 32 nm for equal lines and spaces and 27 nm for semi-isolated lines.

  8. Advanced Heart Failure

    MedlinePlus

    Updated: Oct 8, 2015. When heart failure (HF) ... content was last reviewed on 04/06/2015.

  9. The Numerical Tokamak Project (NTP) simulation of turbulent transport in the core plasma: A grand challenge in plasma physics

    SciTech Connect

    Not Available

    1993-12-01

    The long-range goal of the Numerical Tokamak Project (NTP) is the reliable prediction of tokamak performance using physics-based numerical tools describing tokamak physics. The NTP is accomplishing the development of the most advanced particle and extended fluid models in massively parallel processing (MPP) environments as part of a multi-institutional, multi-disciplinary numerical study of tokamak core fluctuations. The NTP is a continuing focus of the Office of Fusion Energy's theory and computation program. Near-term HPCC work concentrates on developing a predictive numerical description of the core plasma transport in tokamaks driven by low-frequency collective fluctuations. This work addresses one of the greatest intellectual challenges to our understanding of the physics of tokamak performance and needs the most advanced computational resources to progress. We are conducting detailed comparisons of kinetic and fluid numerical models of tokamak turbulence. These comparisons are stimulating the improvement of each and the development of hybrid models which embody aspects of both. The combination of emerging massively parallel processing hardware and algorithmic improvements will result in an estimated 10^2-10^6 performance increase. Development of information processing and visualization tools is accelerating our comparison of computational models to one another, to experimental data, and to analytical theory, providing a bootstrap effect in our understanding of the target physics. The measure of success is the degree to which the experimentally observed scaling of fluctuation-driven transport may be predicted numerically. The NTP is advancing the HPCC Initiative through its state-of-the-art computational work. We are pushing the capability of high performance computing through our efforts, which are strongly leveraged by OFE support.

  10. Robust Neighboring Optimal Guidance for the Advanced Launch System

    NASA Technical Reports Server (NTRS)

    Hull, David G.

    1993-01-01

    In recent years, optimization has become an engineering tool through the availability of numerous successful nonlinear programming codes. Optimal control problems are converted into parameter optimization (nonlinear programming) problems by assuming the control to be piecewise linear, making the unknowns the nodes or junction points of the linear control segments. Once the optimal piecewise linear (suboptimal) control is known, a guidance law for operating near the suboptimal path is the neighboring optimal piecewise linear control (neighboring suboptimal control). Research conducted under this grant has been directed toward the investigation of neighboring suboptimal control as a guidance scheme for an advanced launch system.
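
    A minimal sketch of the piecewise-linear control parameterization described above is given below, using an assumed toy double-integrator problem (illustrative values only, not the grant's launch-vehicle formulation):

```python
# Minimal sketch: converting an optimal control problem into a parameter
# optimization by assuming piecewise-linear control. Toy double-integrator
# problem with assumed values; not the launch-vehicle formulation of the grant.
import numpy as np
from scipy.optimize import minimize

T, N, STEPS = 1.0, 5, 200          # horizon, control nodes, integration steps
t_nodes = np.linspace(0.0, T, N)   # junction points of the linear segments

def simulate(u_nodes):
    """Integrate a double integrator x'' = u with piecewise-linear u(t)."""
    dt = T / STEPS
    x, v = 0.0, 0.0
    for k in range(STEPS):
        u = np.interp(k * dt, t_nodes, u_nodes)   # piecewise-linear control
        v += u * dt
        x += v * dt
    return x, v

def cost(u_nodes):
    # Control effort (trapezoidal rule) plus a penalty for missing x=1, v=0.
    x, v = simulate(u_nodes)
    effort = np.sum((u_nodes[:-1]**2 + u_nodes[1:]**2) / 2) * (T / (N - 1))
    return effort + 100.0 * ((x - 1.0)**2 + v**2)

result = minimize(cost, x0=np.zeros(N), method="BFGS")
print("optimal control node values:", np.round(result.x, 3))
```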

  11. Apes produce tools for future use.

    PubMed

    Bräuer, Juliane; Call, Josep

    2015-03-01

    There is now growing evidence that some animal species are able to plan for the future. For example, great apes save and exchange tools for future use. Here we raise the question of whether chimpanzees, orangutans, and bonobos would produce tools for future use. Subjects only had access to a baited apparatus for a limited duration and therefore should use the time preceding this access to create the appropriate tools in order to get the rewards. The apes were tested in three conditions depending on the need for pre-prepared tools: either eight tools, one tool, or no tools were needed to retrieve the reward. The apes prepared tools in advance for future use, and they produced them mainly in conditions when they were really needed. The fact that apes were able to solve this new task indicates that their planning skills are flexible. However, in the condition in which eight tools were needed, apes produced fewer than two tools per trial in advance; they nevertheless used the opportunity to produce additional tools in the tool-use phase, thus often obtaining most of the reward from the apparatus. Increased pressure to prepare more tools in advance did not have an effect on their performance. PMID:25236323

  12. Authoring Tools

    NASA Astrophysics Data System (ADS)

    Treviranus, Jutta

    Authoring tools that are accessible and that enable authors to produce accessible Web content play a critical role in web accessibility. Widespread use of authoring tools that comply with the W3C Authoring Tool Accessibility Guidelines (ATAG) would ensure that even authors who are neither knowledgeable about nor particularly motivated to produce accessible content do so by default. The principles and techniques of ATAG are discussed. Some examples of accessible authoring tools are described, including authoring tool content management components such as TinyMCE. Considerations for creating an accessible collaborative environment are also covered. As part of providing accessible content, the debate between system-based personal optimization and one universally accessible site configuration is presented. The issues and potential solutions to address the accessibility crisis presented by the advent of rich internet applications are outlined. This challenge must be met to ensure that a large segment of the population is able to participate in the move toward the web as a two-way communication mechanism.

  13. Numerical Aspects of Solving Differential Equations: Laboratory Approach for Students.

    ERIC Educational Resources Information Center

    Witt, Ana

    1997-01-01

    Describes three labs designed to help students in a first course on ordinary differential equations with three of the most common numerical difficulties they might encounter when solving initial value problems with a numerical software package. The goal of these labs is to help students advance to independent work on common numerical anomalies.…
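
    As an illustration of the kind of numerical anomaly such labs target (an assumed example, not one of the three labs described), a fixed-step explicit Euler method can diverge on a mildly stiff initial value problem at a step size that looks reasonable, while an adaptive implicit solver stays accurate:

```python
# Illustrative only: a common numerical anomaly students meet with IVP solvers.
# Explicit Euler on the stiff problem y' = -50*(y - cos(t)) diverges for a step
# size that "looks" reasonable, while SciPy's adaptive BDF solver stays accurate.
import numpy as np
from scipy.integrate import solve_ivp

def f(t, y):
    return -50.0 * (y - np.cos(t))

# Explicit Euler with h = 0.05 is unstable here, since |1 + h*(-50)| = 1.5 > 1.
h, y, t = 0.05, np.array([0.0]), 0.0
for _ in range(40):
    y = y + h * f(t, y)
    t += h
print("Euler after t=2:", y[0])           # grows rapidly (unstable)

# Adaptive implicit solver for comparison.
sol = solve_ivp(f, (0.0, 2.0), [0.0], method="BDF")
print("BDF   after t=2:", sol.y[0, -1])   # close to the smooth solution
```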

  14. ATST telescope mount: telescope or machine tool

    NASA Astrophysics Data System (ADS)

    Jeffers, Paul; Stolz, Günter; Bonomi, Giovanni; Dreyer, Oliver; Kärcher, Hans

    2012-09-01

    The Advanced Technology Solar Telescope (ATST) will be the largest solar telescope in the world and will be able to provide the sharpest views ever taken of the solar surface. The telescope has a 4 m aperture primary mirror; however, due to the off-axis nature of the optical layout, the telescope mount has proportions similar to an 8-meter-class telescope. The technology normally used in this class of telescope is well understood in the telescope community and has been successfully implemented in numerous projects. The world of large machine tools has developed in a separate realm with similar levels of performance requirement but different boundary conditions. In addition, the competitive nature of private industry has encouraged the development and use of more cost-effective solutions, both in initial capital cost and through-life operating cost. Telescope mounts move relatively slowly, with requirements for high stability under external environmental influences such as wind buffeting. Large machine tools operate under high-speed requirements coupled with high application of force through the machine, but with little or no external environmental influences. The benefits of these parallel development paths and the ATST system requirements are being combined in the ATST Telescope Mount Assembly (TMA). The process of balancing the system requirements with new technologies is based on the experience of the ATST project team, Ingersoll Machine Tools, who are the main contractor for the TMA, and MT Mechatronics, who are their design subcontractors. This paper highlights a number of these proven technologies from the commercially driven machine tool world that are being introduced to the TMA design. The challenges of integrating them, and of ensuring that the differences in application requirements are accounted for in the design, are also discussed.

  15. Technology Tools to Support Reading in the Digital Age

    ERIC Educational Resources Information Center

    Biancarosa, Gina; Griffiths, Gina G.

    2012-01-01

    Advances in digital technologies are dramatically altering the texts and tools available to teachers and students. These technological advances have created excitement among many for their potential to be used as instructional tools for literacy education. Yet with the promise of these advances come issues that can exacerbate the literacy…

  16. Fluid sampling tool

    DOEpatents

    Garcia, A.R.; Johnston, R.G.; Martinez, R.K.

    1999-05-25

    A fluid sampling tool is described for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall. 6 figs.

  17. Fluid sampling tool

    DOEpatents

    Garcia, Anthony R.; Johnston, Roger G.; Martinez, Ronald K.

    1999-05-25

    A fluid sampling tool for sampling fluid from a container. The tool has a fluid collecting portion which is drilled into the container wall, thereby affixing it to the wall. The tool may have a fluid extracting section which withdraws fluid collected by the fluid collecting section. The fluid collecting section has a fluted shank with an end configured to drill a hole into a container wall. The shank has a threaded portion for tapping the borehole. The shank is threadably engaged to a cylindrical housing having an inner axial passageway sealed at one end by a septum. A flexible member having a cylindrical portion and a bulbous portion is provided. The housing can be slid into an inner axial passageway in the cylindrical portion and sealed to the flexible member. The bulbous portion has an outer lip defining an opening. The housing is clamped into the chuck of a drill, the lip of the bulbous section is pressed against a container wall until the shank touches the wall, and the user operates the drill. Wall shavings (kerf) are confined in a chamber formed in the bulbous section as it folds when the shank advances inside the container. After sufficient advancement of the shank, an o-ring makes a seal with the container wall.

  18. Management Tools

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM-compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle that can be used in such applications as scheduling, resource allocation, project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that could be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage, and professional services.

  19. The quiet revolution of numerical weather prediction

    NASA Astrophysics Data System (ADS)

    Bauer, Peter; Thorpe, Alan; Brunet, Gilbert

    2015-09-01

    Advances in numerical weather prediction represent a quiet revolution because they have resulted from a steady accumulation of scientific knowledge and technological advances over many years that, with only a few exceptions, have not been associated with the aura of fundamental physics breakthroughs. Nonetheless, the impact of numerical weather prediction is among the greatest of any area of physical science. As a computational problem, global weather prediction is comparable to the simulation of the human brain and of the evolution of the early Universe, and it is performed every day at major operational centres across the world.

  20. Descendants and advance directives.

    PubMed

    Buford, Christopher

    2014-01-01

    Some of the concerns that have been raised in connection with the use of advance directives are of the epistemic variety. Such concerns highlight the possibility that adhering to an advance directive may conflict with what the author of the directive actually wants (or would want) at the time of treatment. However, at least one objection to the employment of advance directives is metaphysical in nature. The objection to be discussed here, first formulated by Rebecca Dresser and labeled the slavery argument by Allen Buchanan and the someone else problem by David DeGrazia, aims to undermine the legitimacy of certain uses of advance directives by concluding that such uses rest upon an incorrect assumption about the identity over time of those ostensibly governed by the directives. There have been numerous attempts to respond to this objection. This paper aims to assess two strategies that have been pursued to cope with the problem. PMID:25743056

  1. Numerical Simulation of Carbon Dioxide Injection in the Western Section of the Farnsworth Unit

    SciTech Connect

    White, Mark D.; McPherson, Brian J.; Grigg, Reid B.; Ampomah, William; Appold, Martin S.

    2014-05-05

    Numerical simulation is an invaluable analytical tool for scientists and engineers in making predictions about the fate of carbon dioxide injected into deep geologic formations for long-term storage. Current numerical simulators for assessing storage in deep saline formations have capabilities for modeling strongly coupled processes involving multifluid flow, heat transfer, chemistry, and rock mechanics in geologic media. Except for moderate pressure conditions, numerical simulators for deep saline formations only require the tracking of two immiscible phases and a limited number of phase components, beyond those comprising the geochemical reactive system. The requirements for numerically simulating the utilization and storage of carbon dioxide in partially depleted petroleum reservoirs are more numerous than those for deep saline formations: the minimum number of immiscible phases increases to three, the number of phase components may easily increase fourfold, and the coupled processes of heat transfer, geochemistry, and geomechanics remain. Public and scientific confidence in the ability of numerical simulators used for carbon dioxide sequestration in deep saline formations has advanced via a natural progression of the simulators being proven against benchmark problems, code comparisons, laboratory-scale experiments, pilot-scale injections, and commercial-scale injections. This paper describes a new numerical simulator for the scientific investigation of carbon dioxide utilization and storage in partially depleted petroleum reservoirs, with an emphasis on its unique features for scientific investigations; it documents the numerical simulation of carbon dioxide utilization for enhanced oil recovery in the western section of the Farnsworth Unit, which represents an early stage in the progression of numerical simulators for carbon utilization and storage in depleted oil reservoirs.

  2. Image Tool

    SciTech Connect

    Baker, S.A.; Gardner, S.D.; Rogers, M.L.; Sanders, F.; Tunnell, T.W.

    2001-01-01

    ImageTool is a software package developed at Bechtel Nevada, Los Alamos Operations. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data. Performance measures are used to identify capabilities and limitations of a camera system, while establishing a means for comparing systems. The camera evaluations are designed to provide system performance, camera comparison and system modeling information. This program is used to evaluate digital camera images. ImageTool provides basic image restoration and analysis features along with a special set of camera evaluation tools which are used to standardize camera system characterizations. This process is started with the acquisition of a well-defined set of calibration images. Image processing algorithms provide a consistent means of evaluating the camera calibration data. Performance measures in the areas of sensitivity, noise, and resolution are used as a basis for comparing camera systems and evaluating experimental system performance. Camera systems begin with a charge-coupled device (CCD) camera and optical relay system and may incorporate image intensifiers, electro-static image tubes, or electron bombarded charge-coupled devices (EBCCDs). Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera types evaluated include gated intensified cameras and multi-frame cameras used in applications ranging from X-ray radiography to visible and infrared imaging. It is valuable to evaluate the performance of a camera system in order to determine if a particular system meets experimental requirements. In this paper we highlight the processing features of ImageTool.

  3. Climate Data Analysis Tools

    2009-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications), and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management Systems or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS).

  4. Climate Data Analysis Tools

    SciTech Connect

    2009-12-01

    Climate Data Analysis Tools (CDAT) is a software infrastructure that uses an object-oriented scripting language to link together separate software subsystems and packages, thus forming an integrated environment for solving model diagnosis problems. The power of the system comes from Python and its ability to seamlessly interconnect software. Python provides a general-purpose and full-featured scripting language with a variety of user interfaces including command-line interaction, stand-alone scripts (applications), and graphical user interfaces (GUI). The CDAT subsystems, implemented as modules, provide access to and management of gridded data (Climate Data Management Systems or CDMS); large-array numerical operations (Numerical Python); and visualization (Visualization and Control System or VCS).

  5. Numerical analysis of InSb parameters and InSb 2D infrared focal plane arrays

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaolei; Zhang, Hongfei; Sun, Weiguo; Zhang, Lei; Meng, Chao; Lu, Zhengxiong

    2012-10-01

    Accurate and reliable numerical simulation tools are necessary for the development of advanced semiconductor devices. MATLAB and TCAD simulation tools are used to calculate the InSb bulk bandstructure and the blackbody's radiant emittance, and to simultaneously solve the Poisson, continuity, and transport equations for 2D detector structures. In this work the material complexities of InSb, such as non-parabolicity, degeneracy, mobility, and Auger recombination/generation, are explained, and physics-based models are developed. The Empirical Tight Binding Method (ETBM) was used in MATLAB to calculate the bandstructure of InSb at 77 K. We describe a set of systematic experiments performed in order to calibrate the simulation to semiconductor devices: backside-illuminated InSb focal plane arrays realized with planar technology. The spectral photoresponse and crosstalk characteristics of mid-wavelength InSb infrared focal plane arrays have been numerically studied.
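
    For orientation only, the snippet below shows the generic idea of a tight-binding dispersion on a one-dimensional chain; the actual ETBM for InSb involves a much larger multi-orbital Hamiltonian with fitted parameters, so this is an assumed toy example rather than the calculation described above:

```python
# Generic one-dimensional nearest-neighbour tight-binding dispersion,
# E(k) = E0 - 2*t*cos(k*a), illustrating the kind of bandstructure calculation
# mentioned in the record. Parameters are arbitrary, not InSb ETBM values.
import numpy as np

E0, t_hop, a = 0.0, 1.0, 1.0               # on-site energy, hopping, lattice constant
k = np.linspace(-np.pi / a, np.pi / a, 11) # sample the first Brillouin zone
E = E0 - 2.0 * t_hop * np.cos(k * a)

for ki, Ei in zip(k, E):
    print(f"k = {ki:+.2f}  E = {Ei:+.3f}")
```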

  6. Disruptive Innovation in Numerical Hydrodynamics

    SciTech Connect

    Waltz, Jacob I.

    2012-09-06

    We propose the research and development of a high-fidelity hydrodynamic algorithm for tetrahedral meshes that will lead to a disruptive innovation in the numerical modeling of Laboratory problems. Our proposed innovation has the potential to reduce turnaround time by orders of magnitude relative to Advanced Simulation and Computing (ASC) codes; reduce simulation setup costs by millions of dollars per year; and effectively leverage Graphics Processing Unit (GPU) and future Exascale computing hardware. If successful, this work will lead to a dramatic leap forward in the Laboratory's quest for a predictive simulation capability.

  7. Fluid blade disablement tool

    DOEpatents

    Jakaboski, Juan-Carlos; Hughs, Chance G.; Todd, Steven N.

    2012-01-10

    A fluid blade disablement (FBD) tool that forms both a focused fluid projectile that resembles a blade, which can provide precision penetration of a barrier wall, and a broad fluid projectile that functions substantially like a hammer, which can produce general disruption of structures behind the barrier wall. Embodiments of the FBD tool comprise a container capable of holding fluid, an explosive assembly which is positioned within the container and which comprises an explosive holder and explosive, and a means for detonating. The container has a concavity on the side adjacent to the exposed surface of the explosive. The position of the concavity relative to the explosive and its construction of materials with thicknesses that facilitate inversion and/or rupture of the concavity wall enable the formation of a sharp and coherent blade of fluid advancing ahead of the detonation gases.

  8. Hopper File Management Tool

    SciTech Connect

    Long, J W; O'Neill, N J; Smith, N G; Springmeyer, R R; Remmele, S; Richards, D A; Southon, J

    2004-11-15

    Hopper is a powerful interactive tool that allows users to transfer and manipulate files and directories by means of a graphical user interface. Users can connect to and manage resources using the major file transfer protocols. Implemented in Java, Hopper can be run almost anywhere: from an individual's desktop machine to large production machines. In a high-performance computing environment, managing files can become a difficult and time-consuming task that distracts from scientific work. Users must deal with multiple file transfer protocols, transferring enormous amounts of files between computer platforms, repeated authentication, organizing massive amounts of data, and other detailed but necessary tasks. This is often accomplished with a set of several different tools, each with its own interface and idiosyncrasies. Our goal is to develop tools for a more automated approach to file management that substantially improves users' ability to transfer, organize, search, and operate on collections of files. This paper describes the Hopper tool for advanced file management, including the software architecture, the functionality, and the user interface.

  9. Verification and Validation Strategy for LWRS Tools

    SciTech Connect

    Carl M. Stoots; Richard R. Schultz; Hans D. Gougar; Thomas K Larson; Michael Corradini; Laura Swiler; David Pointer; Jess Gehin

    2012-09-01

    One intention of the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to create advanced computational tools for safety assessment that enable more accurate representation of a nuclear power plant safety margin. These tools are to be used to study the unique issues posed by lifetime extension and relicensing of the existing operating fleet of nuclear power plants well beyond their first license extension period. The extent to which new computational models / codes such as RELAP-7 can be used for reactor licensing / relicensing activities depends mainly upon the thoroughness with which they have been verified and validated (V&V). This document outlines the LWRS program strategy by which RELAP-7 code V&V planning is to be accomplished. From the perspective of developing and applying thermal-hydraulic and reactivity-specific models to reactor systems, the US Nuclear Regulatory Commission (NRC) Regulatory Guide 1.203 gives key guidance to numeric model developers and those tasked with the validation of numeric models. By creating Regulatory Guide 1.203, the NRC defined a framework for development, assessment, and approval of transient and accident analysis methods. As a result, this methodology is very relevant and is recommended as the path forward for RELAP-7 V&V. However, the unique issues posed by lifetime extension will require considerations in addition to those addressed in Regulatory Guide 1.203. Some of these include prioritization of which plants / designs should be studied first, coupling modern supporting experiments to the stringent needs of new high fidelity models / codes, and scaling of aging effects.

  10. Downhole tool

    DOEpatents

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprise an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  11. Waste glass melter numerical and physical modeling

    SciTech Connect

    Eyler, L.L.; Peters, R.D.; Lessor, D.L.; Lowery, P.S.; Elliott, M.L.

    1991-10-01

    Results of physical and numerical simulation modeling of high-level liquid waste vitrification melters are presented. Physical modeling uses simulant fluids in laboratory testing. Visualization results provide insight into convective melt flow patterns from which information is derived to support performance estimation of operating melters and data to support numerical simulation. Numerical simulation results of several melter configurations are presented. These are in support of programs to evaluate melter operation characteristics and performance. Included are investigations into power skewing and alternating current electric field phase angle in a dual electrode pair reference design and bi-modal convective stability in an advanced design. 9 refs., 9 figs., 1 tab.

  12. Advanced Tools Webinar Series Presents: Regulatory Issues and Case Studies of Advanced Tools

    EPA Science Inventory

    U.S. EPA has released A Guide for Assessing Biodegradation and Source Identification of Organic Ground Water Contaminants using Compound Specific Isotope Analysis (CSIA) [EPA 600/R-08/148 | December 2008 | www.epa.gov/ada]. The Guide provides recommendations for sample collecti...

  13. Numerical Aerodynamic Simulation

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An overview of historical and current numerical aerodynamic simulation (NAS) is given. The capabilities and goals of the Numerical Aerodynamic Simulation Facility are outlined. Emphasis is given to numerical flow visualization and its applications to structural analysis of aircraft and spacecraft bodies. The uses of NAS in computational chemistry, engine design, and galactic evolution are mentioned.

  14. Numerical Boundary Condition Procedures

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Topics include numerical procedures for treating inflow and outflow boundaries, steady and unsteady discontinuous surfaces, far field boundaries, and multiblock grids. In addition, the effects of numerical boundary approximations on stability, accuracy, and convergence rate of the numerical solution are discussed.

  15. Advanced Solar Power Systems

    NASA Technical Reports Server (NTRS)

    Atkinson, J. H.; Hobgood, J. M.

    1984-01-01

    The Advanced Solar Power System (ASPS) concentrator uses a technically sophisticated design and extensive tooling to produce very efficient (80 to 90%) and versatile energy supply equipment which is inexpensive to manufacture and requires little maintenance. The advanced optical design has two 10th order, generalized aspheric surfaces in a Cassegrainian configuration which gives outstanding performance and is relatively insensitive to temperature changes and wind loading. Manufacturing tolerances also have been achieved. The key to the ASPS is the direct absorption of concentrated sunlight in the working fluid by radiative transfers in a black body cavity. The basic ASPS design concepts, efficiency, optical system, and tracking and focusing controls are described.

  16. Advances in attosecond science

    NASA Astrophysics Data System (ADS)

    Calegari, Francesca; Sansone, Giuseppe; Stagira, Salvatore; Vozzi, Caterina; Nisoli, Mauro

    2016-03-01

    Attosecond science offers formidable tools for the investigation of electronic processes at the heart of important physical processes in atomic, molecular and solid-state physics. In the last 15 years impressive advances have been obtained from both the experimental and theoretical points of view. Attosecond pulses, in the form of isolated pulses or of trains of pulses, are now routinely available in various laboratories. In this review recent advances in attosecond science are reported and important applications are discussed. After a brief presentation of various techniques that can be employed for the generation and diagnosis of sub-femtosecond pulses, various applications are reported in atomic, molecular and condensed-matter physics.

  17. New efficient optimizing techniques for Kalman filters and numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis

    2016-06-01

    The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, global warming, and questions on climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of the model bias and the reduction of the error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work is due to the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
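
    As a minimal sketch of the general idea of Kalman-filter bias correction of forecast output (a standard scalar formulation with assumed noise variances, not the specific Information-Geometry-based filters studied in the paper):

```python
# Minimal sketch of Kalman-filter bias correction for a forecast time series:
# the systematic error (bias) is treated as a slowly varying state estimated
# recursively from past forecast-observation pairs. Values are illustrative.
import numpy as np

def kalman_bias_correction(forecasts, observations, q=0.01, r=0.25):
    """Return bias-corrected forecasts; q, r are process/measurement variances."""
    bias, p = 0.0, 1.0                    # initial bias estimate and its variance
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - bias)        # apply the current bias estimate
        p += q                            # prediction step for the bias state
        k = p / (p + r)                   # Kalman gain
        bias += k * ((f - o) - bias)      # update with the observed error
        p *= (1.0 - k)
    return np.array(corrected)

# Toy example: forecasts with a constant +1.5 bias plus noise.
rng = np.random.default_rng(0)
obs = rng.normal(10.0, 1.0, 200)
fc = obs + 1.5 + rng.normal(0.0, 0.5, 200)
print("raw MAE:      ", np.mean(np.abs(fc - obs)))
print("corrected MAE:", np.mean(np.abs(kalman_bias_correction(fc, obs) - obs)))
```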

  18. CFD Multiphysics Tool

    NASA Technical Reports Server (NTRS)

    Perrell, Eric R.

    2005-01-01

    The recent bold initiatives to expand the human presence in space require innovative approaches to the design of propulsion systems whose underlying technology is not yet mature. The space propulsion community has identified a number of candidate concepts. A short list includes solar sails, high-energy-density chemical propellants, electric and electromagnetic accelerators, solar-thermal and nuclear-thermal expanders. For each of these, the underlying physics are relatively well understood. One could easily cite authoritative texts addressing both the governing equations and practical solution methods for, e.g., electromagnetic fields, heat transfer, radiation, thermophysics, structural dynamics, particulate kinematics, nuclear energy, power conversion, and fluid dynamics. One could also easily cite scholarly works in which complete equation sets for any one of these physical processes have been accurately solved relative to complex engineered systems. The Advanced Concepts and Analysis Office (ACAO), Space Transportation Directorate, NASA Marshall Space Flight Center, has recently released the first alpha version of a set of computer utilities for performing the applicable physical analyses relative to candidate deep-space propulsion systems such as those listed above. PARSEC, Preliminary Analysis of Revolutionary in-Space Engineering Concepts, enables rapid iterative calculations using several physics tools developed in-house. A complete cycle of the entire tool set takes about twenty minutes. PARSEC is a level-zero/level-one design tool. For PARSEC's proof of concept and preliminary design decision-making, assumptions that significantly simplify the governing equation sets are necessary. To proceed to level-two, one wishes to retain modeling of the underlying physics as close as practical to known applicable first principles. This report describes results of collaboration between ACAO, and Embry-Riddle Aeronautical University (ERAU), to begin building a set of

  19. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1995-04-01

    Advanced mathematical techniques and computer simulation play a major role in providing enhanced understanding of conventional and advanced materials processing operations. Development and application of mathematical models and computer simulation techniques can provide a quantitative understanding of materials processes and will minimize the need for expensive and time consuming trial- and error-based product development. As computer simulations and materials databases grow in complexity, high performance computing and simulation are expected to play a key role in supporting the improvements required in advanced material syntheses and processing by lessening the dependence on expensive prototyping and re-tooling. Many of these numerical models are highly compute-intensive. It is not unusual for an analysis to require several hours of computational time on current supercomputers despite the simplicity of the models being studied. For example, to accurately simulate the heat transfer in a 1-m³ block using a simple computational method requires 10`2 arithmetic operations per second of simulated time. For a computer to do the simulation in real time would require a sustained computation rate 1000 times faster than that achievable by current supercomputers. Massively parallel computer systems, which combine several thousand processors able to operate concurrently on a problem are expected to provide orders of magnitude increase in performance. This paper briefly describes advanced computational research in materials processing at ORNL. Continued development of computational techniques and algorithms utilizing the massively parallel computers will allow the simulation of conventional and advanced materials processes in sufficient generality.

  20. Risk Management Implementation Tool

    NASA Technical Reports Server (NTRS)

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making, in order to continually assess what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. With these steps, and the various methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertise and advocate the use of RMIT at each NASA center.

  1. Lower Paleolithic bone tools from the 'Spear Horizon' at Schöningen (Germany).

    PubMed

    Van Kolfschoten, Thijs; Parfitt, Simon A; Serangeli, Jordi; Bello, Silvia M

    2015-12-01

    The Lower Paleolithic locality of Schöningen 13 II-4 is famous for the discovery of wooden spears found amongst the butchered remains of numerous horses and other large herbivores. Although the spears have attracted the most interest, other aspects of the associated artifact assemblage have received less attention. Here we describe an extraordinary assemblage of 88 bone tools from the 'Spear Horizon.' This sample includes numerous long-bone shaft fragments (mostly of horse), three ribs used as 'retouchers' to resharpen flint tools, and a complete horse innominate that was used as an anvil in bipolar knapping. Most of the retouchers were prepared by scraping the diaphysis of fresh and dry long-bones. Technological analysis of the associated lithic assemblage demonstrates exhaustive resharpening to maintain functional cutting edges. Whereas the flint tools were brought to the site, curated, and maintained, the retouchers had a shorter use-history and were either discarded after a limited period or broken to extract marrow. Horse and bison metapodials with flaked and rounded epiphyses are interpreted as hammers used to break marrow bones. Several of the 'metapodial hammers' were additionally used as knapping percussors. These constitute the earliest evidence of multi-purpose bone tools in the archeological record. Our results highlight the advanced knowledge in the use of bones as tools during the Lower Paleolithic, with major implications for understanding aspects of non-lithic technology and planning depth in early hominins. PMID:26653208

  2. Tool Gear: Infrastructure for Parallel Tools

    SciTech Connect

    May, J; Gyllenhaal, J

    2003-04-17

    Tool Gear is a software infrastructure for developing performance analysis and other tools. Unlike existing integrated toolkits, which focus on providing a suite of capabilities, Tool Gear is designed to help tool developers create new tools quickly. It combines dynamic instrumentation capabilities with an efficient database and a sophisticated and extensible graphical user interface. This paper describes the design of Tool Gear and presents examples of tools that have been built with it.

  3. Recent advances in dermoscopy

    PubMed Central

    Russo, Teresa; Piccolo, Vincenzo; Lallas, Aimilios; Argenziano, Giuseppe

    2016-01-01

    The use of dermoscopy has offered a new morphological dimension of skin lesions and has provided an effective diagnostic tool to differentiate melanoma from other benign or malignant skin tumors but also to support the clinical diagnosis in general dermatology. The aim of this article is to provide an overview of the most recent and important advances in the rising world of dermoscopy. PMID:26949523

  4. Numerical methods for turbulent flow

    NASA Astrophysics Data System (ADS)

    Turner, James C., Jr.

    1988-09-01

    It has generally become accepted that the Navier-Stokes equations predict the dynamic behavior of turbulent as well as laminar flows of a fluid at a point in space away from a discontinuity such as a shock wave. Turbulence is also closely related to the phenomenon of non-uniqueness of solutions of the Navier-Stokes equations. These second-order, nonlinear partial differential equations can be solved analytically for only a few simple flows. Turbulent flow fields are much too complex to lend themselves to these few analytical methods. Numerical methods, therefore, offer the only possibility of achieving a solution of the turbulent flow equations. In spite of recent advances in computer technology, the direct solution, by discrete methods, of the Navier-Stokes equations for turbulent flow fields is today, and in the foreseeable future, impossible. Thus the only economically feasible way to solve practical turbulent flow problems numerically is to use statistically averaged equations governing mean-flow quantities. The objective is to study some recent developments relating to the use of numerical methods to study turbulent flow.

  5. Plasmon spectroscopy: Theoretical and numerical calculations, and optimization techniques

    NASA Astrophysics Data System (ADS)

    Rodríguez-Oliveros, Rogelio; Paniagua-Domínguez, Ramón; Sánchez-Gil, José A.; Macías, Demetrio

    2016-02-01

    We present an overview of recent advances in plasmonics, mainly concerning theoretical and numerical tools required for the rigorous determination of the spectral properties of complex-shape nanoparticles exhibiting strong localized surface plasmon resonances (LSPRs). Both quasistatic approaches and full electrodynamic methods are described, providing a thorough comparison of their numerical implementations. Special attention is paid to surface integral equation formulations, giving examples of their performance in complicated nanoparticle shapes of interest for their LSPR spectra. In this regard, complex (single) nanoparticle configurations (nanocrosses and nanorods) yield a hierarchy of multiple-order LSPRs with evidence of rich symmetric or asymmetric (Fano-like) LSPR line shapes. In addition, means to address the design of complex geometries to retrieve LSPR spectra are commented on, with special interest in biologically inspired algorithms. The wealth of LSPR-based applications is discussed in two choice examples, single-nanoparticle surface-enhanced Raman scattering (SERS) and optical heating, and multifrequency nanoantennas for fluorescence and nonlinear optics.

  6. Green tools

    NASA Astrophysics Data System (ADS)

    With an eye toward forging tools that the nonscientist can use to make environmentally prudent policy, the National Science Foundation has provided the seed funding to establish a new National Center for Environmental Decision-Making Research. NSF has awarded $5 million over the next five years to the Joint Institute for Energy and the Environment at the University of Tennessee for creation of the center. The organizing principle of the effort, according to NSF, is to "make scientific environmental research more relevant and useful to decision makers." Interdisciplinary teams of sociologists, economists, geologists, ecologists, computer scientists, psychologists, urban planners, and others will be asked to interpret existing research and to conduct new studies of environmental problems and how they were resolved.

  7. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  8. WGS Analysis and Interpretation in Clinical and Public Health Microbiology Laboratories: What Are the Requirements and How Do Existing Tools Compare?

    PubMed Central

    Wyres, Kelly L.; Conway, Thomas C.; Garg, Saurabh; Queiroz, Carlos; Reumann, Matthias; Holt, Kathryn; Rusu, Laura I.

    2014-01-01

    Recent advances in DNA sequencing technologies have the potential to transform the field of clinical and public health microbiology, and in the last few years numerous case studies have demonstrated successful applications in this context. Among other considerations, a lack of user-friendly data analysis and interpretation tools has been frequently cited as a major barrier to routine use of these techniques. Here we consider the requirements of microbiology laboratories for the analysis, clinical interpretation and management of bacterial whole-genome sequence (WGS) data. Then we discuss relevant, existing WGS analysis tools. We highlight many essential and useful features that are represented among existing tools, but find that no single tool fulfils all of the necessary requirements. We conclude that to fully realise the potential of WGS analyses for clinical and public health microbiology laboratories of all scales, we will need to develop tools specifically with the needs of these laboratories in mind. PMID:25437808

  9. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  10. Advanced transmission studies

    NASA Technical Reports Server (NTRS)

    Coy, John J.; Bill, Robert C.

    1988-01-01

    The NASA Lewis Research Center and the U.S. Army Aviation Systems Command share an interest in advancing the technology for helicopter propulsion systems. In particular, this paper presents highlights from that portion of the program in drive train technology and the related mechanical components. The major goals of the program are to increase the life, reliability, and maintainability; reduce the weight, noise, and vibration; and maintain the relatively high mechanical efficiency of the gear train. The current activity emphasizes noise reduction technology and analytical code development followed by experimental verification. Selected significant advances in technology for transmissions are reviewed, including advanced configurations and new analytical tools. Finally, the plan for future transmission research is presented.

  11. Overview of Virtual Observatory Tools

    NASA Astrophysics Data System (ADS)

    Allen, M. G.

    2009-07-01

    I provide a brief introduction and tour of selected Virtual Observatory tools to highlight some of the core functions provided by the VO, and the way that astronomers may use the tools and services for doing science. VO tools provide advanced functions for searching and using images, catalogues and spectra that have been made available in the VO. The tools may work together by providing efficient and innovative browsing and analysis of data, and I also describe how many VO services may be accessed by a scripting or command line environment. Early science usage of the VO provides important feedback on the development of the system, and I show how VO portals try to address early user comments about the navigation and use of the VO.

  12. BASEMENT - a freeware simulation tool for hydro- and morphodynamic modelling

    NASA Astrophysics Data System (ADS)

    Vetsch, David; Rousselot, Patric; Volz, Christian; Vonwiller, Lukas; Siviglia, Annunziato; Peter, Samuel; Ehrbar, Daniel; Facchini, Matteo; Boes, Robert

    2014-05-01

    The application of numerical modelling tools to river engineering problems is a well-established methodology. In the present contribution, numerical software for the simulation of hydro- and morphodynamics is presented that is available free of charge, also for commercial use. The main motivation for the development of the software is to provide a powerful, user-friendly tool that facilitates basic applications for practitioners as well as advanced model configuration for research. The underlying one- and two-dimensional models are based on the Saint-Venant equations for hydrodynamics, the Exner-Hirano equations for bed load, and an advection-diffusion approach with source terms for suspended sediment transport. Notable special features of the software are the arbitrary combination of 1-D and 2-D model domains, a PID controller for various monitoring values, and the use of an unstructured dual-mesh to improve topographic accuracy. Besides the presentation of some appealing examples of use, the possibility of embedding the software into an open-source pre- and post-processing environment is highlighted.
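
    To illustrate just the suspended-sediment transport ingredient named above, the sketch below solves a 1-D advection-diffusion equation with a source term using a simple explicit upwind scheme (assumed toy parameters; this is not BASEMENT's actual discretisation or mesh handling):

```python
# Minimal 1-D advection-diffusion sketch with a source term,
# dc/dt + u*dc/dx = D*d2c/dx2 + S, solved with an explicit upwind scheme.
# Illustrates only the transport ingredient named in the record.
import numpy as np

nx, L = 200, 100.0                        # grid cells, domain length [m]
dx = L / nx
u, D = 0.5, 0.05                          # velocity [m/s], diffusivity [m2/s]
dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # stable step for the explicit scheme

c = np.zeros(nx)
source = np.zeros(nx)
source[10] = 1.0                          # continuous injection near the inflow

for _ in range(2000):
    adv = -u * (c - np.roll(c, 1)) / dx                       # first-order upwind (u > 0)
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    c = c + dt * (adv + dif + source)
    c[0] = 0.0                                                # simple inflow boundary

print("peak concentration:", round(float(c.max()), 3),
      "at x =", round(float(c.argmax()) * dx, 1), "m")
```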

  13. Advanced powder processing

    SciTech Connect

    Janney, M.A.

    1997-04-01

    Gelcasting is an advanced powder forming process. It is most commonly used to form ceramic or metal powders into complex, near-net shapes. Turbine rotors, gears, nozzles, and crucibles have been successfully gelcast in silicon nitride, alumina, nickel-based superalloy, and several steels. Gelcasting can also be used to make blanks that can be green machined to near-net shape and then high fired. Green machining has been successfully applied to both ceramic and metal gelcast blanks. Recently, the authors have used gelcasting to make tooling for metal casting applications. Most of the work has centered on H13 tool steel. They have demonstrated an ability to gelcast and sinter H13 to near net shape for metal casting tooling. Also, blanks of H13 have been cast, green machined into complex shape, and fired. Issues associated with forming, binder burnout, and sintering are addressed.

  14. Numerical simulation of "an American haboob"

    NASA Astrophysics Data System (ADS)

    Vukovic, A.; Vujadinovic, M.; Pejanovic, G.; Andric, J.; Kumjian, M. R.; Djurdjevic, V.; Dacic, M.; Prasad, A. K.; El-Askary, H. M.; Paris, B. C.; Petkovic, S.; Nickovic, S.; Sprigg, W. A.

    2014-04-01

    A dust storm of fearful proportions hit Phoenix in the early evening hours of 5 July 2011. This storm, an American haboob, was predicted hours in advance because numerical land-atmosphere modeling, computing power, and remote sensing of dust events have improved greatly over the past decade. High-resolution numerical models are required for accurate simulation of the small scales of the haboob process, with high-velocity surface winds produced by strong convection and severe downbursts. Dust-productive areas in this region consist mainly of agricultural fields, with soil surfaces disturbed by plowing, and tracts of land in the high Sonoran Desert laid barren by ongoing drought. Model simulation of the 5 July 2011 dust storm uses the coupled atmospheric-dust model NMME-DREAM (Non-hydrostatic Mesoscale Model on E grid, Janjic et al., 2001; Dust REgional Atmospheric Model, Nickovic et al., 2001; Pérez et al., 2006) with 4 km horizontal resolution. A mask of the potentially dust-productive regions is obtained from the land cover and the normalized difference vegetation index (NDVI) data from the Moderate Resolution Imaging Spectroradiometer (MODIS). The scope of this paper is validation of the dust model performance, not use of the model as a tool to investigate mechanisms related to the storm. Results demonstrate the potential technical capacity and availability of the relevant data to build an operational system for dust storm forecasting as part of a warning system. Model results are compared with radar and other satellite-based images and surface meteorological and PM10 observations. The atmospheric model successfully hindcasted the position of the front in space and time, with an arrival in Phoenix about 1 h late. The dust model predicted the rapid uptake of dust and high values of dust concentration in the ensuing storm. South of Phoenix, over the closest source regions (~25 km), the model PM10 surface dust concentration reached ~2500 μg m-3, but

  15. Sasquatch Footprint Tool

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin

    2013-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted or released from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish an aircraft release point that will ensure that the article and all items released from it will land in safe locations. A new footprint predictor tool, called Sasquatch, was created in MATLAB. This tool takes in a simulated trajectory for the test article, information about all released objects, and atmospheric wind data (simulated or actual) to calculate the trajectories of the released objects. Dispersions are applied to the landing locations of those objects, taking into account the variability of winds, aircraft release point, and object descent rate. Sasquatch establishes a payload release point (e.g., where the payload will be extracted from the carrier aircraft) that will ensure that the payload and all objects released from it will land in a specified cleared area. The landing locations (the final points in the trajectories) are plotted on a map of the test range. Sasquatch was originally designed for CPAS drop tests and includes extensive information about both the CPAS hardware and the primary test range used for CPAS testing. However, it can easily be adapted for more complex CPAS drop tests, other NASA projects, and commercial partners. CPAS has developed the Sasquatch footprint tool to ensure range safety during parachute drop tests. Sasquatch is well correlated to test data and continues to ensure the safety of test personnel as well as the safe recovery of all equipment. The tool will continue to be modified based on new test data, improving predictions and providing added capability to meet the requirements of more complex testing.
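
    The basic footprint idea can be sketched with a toy Monte-Carlo drift calculation (constant descent rate, a single wind layer, and wholly assumed dispersion values; this is not the CPAS tool's physics or data):

```python
# Minimal Monte-Carlo footprint sketch: a released object descending at a
# roughly constant rate drifts with the wind; dispersions on release point,
# descent rate, and wind give a cloud of landing points. Purely illustrative;
# all parameter values are assumptions, not CPAS data.
import numpy as np

rng = np.random.default_rng(1)
n = 5000                                    # Monte-Carlo samples

release_alt = 7600.0                        # release altitude [m]
descent_rate = rng.normal(7.0, 0.7, n)      # descent rate [m/s] with dispersion
wind_east = rng.normal(6.0, 2.0, n)         # mean wind and variability [m/s]
wind_north = rng.normal(-2.0, 2.0, n)
release_x = rng.normal(0.0, 150.0, n)       # release-point uncertainty [m]
release_y = rng.normal(0.0, 150.0, n)

t_fall = release_alt / descent_rate         # time under canopy [s]
land_x = release_x + wind_east * t_fall     # east landing coordinate [m]
land_y = release_y + wind_north * t_fall    # north landing coordinate [m]

r99 = np.percentile(np.hypot(land_x - land_x.mean(),
                             land_y - land_y.mean()), 99)
print("mean landing point [m]:", round(land_x.mean()), round(land_y.mean()))
print("99th-percentile dispersion radius [m]:", round(r99))
```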

  16. Advance directives

    PubMed Central

    O’Sullivan, Rory; Mailo, Kevin; Angeles, Ricardo; Agarwal, Gina

    2015-01-01

    Abstract Objective To establish the prevalence of patients with advance directives in a family practice, and to describe patients’ perspectives on a family doctor’s role in initiating discussions about advance directives. Design A self-administered patient questionnaire. Setting A busy urban family medicine teaching clinic in Hamilton, Ont. Participants A convenience sample of adult patients attending the clinic over the course of a typical business week. Main outcome measures The prevalence of advance directives in the patient population was determined, and the patients’ expectations regarding the role of their family doctors were elucidated. Results The survey population consisted of 800 participants (a response rate of 72.5%) well distributed across age groups; 19.7% had written advance directives and 43.8% had previously discussed the topic of advance directives, but only 4.3% of these discussions had occurred with family doctors. In 5.7% of cases, a family physician had raised the issue; 72.3% of respondents believed patients should initiate the discussion. Patients who considered advance directives extremely important were significantly more likely to want their family doctors to start the conversation (odds ratio 3.98; P < .05). Conclusion Advance directives were not routinely addressed in the family practice. Most patients preferred to initiate the discussion of advance directives. However, patients who considered the subject extremely important wanted their family doctors to initiate the discussion. PMID:25873704

  17. Optimising GPR modelling: A practical, multi-threaded approach to 3D FDTD numerical modelling

    NASA Astrophysics Data System (ADS)

    Millington, T. M.; Cassidy, N. J.

    2010-09-01

    The demand for advanced interpretational tools has led to the development of highly sophisticated, computationally demanding 3D GPR processing and modelling techniques. Many of these methods solve very large problems with stepwise methods that utilise numerically similar functions within iterative computational loops. Problems of this nature are readily parallelised by splitting the computational domain into smaller, independent chunks for direct use on cluster-style, multi-processor supercomputers. Unfortunately, the implications of running such facilities, as well as the time investment needed to develop the parallel codes, mean that for most researchers the use of these advanced methods is too impractical. In this paper, we propose an alternative method of parallelisation which exploits the capabilities of modern multi-core processors (upon which today's desktop PCs are built) by multi-threading the calculation of a problem's individual sub-solutions. To illustrate the approach, we have applied it to an advanced, 3D, finite-difference time-domain (FDTD) GPR modelling tool in which the calculation of the individual vector field components is multi-threaded. To be of practical use, the FDTD scheme must be able to deliver accurate results with short execution times and we, therefore, show that the performance benefits of our approach can deliver runtimes less than half those of the more conventional, serial programming techniques. We evaluate implementations of the technique using different programming languages (e.g., Matlab, Java, C++), which will facilitate the construction of a flexible modelling tool for use in future GPR research. The implementations are compared on a variety of typical hardware platforms, having between one and eight processing cores available, and also a modern Graphical Processing Unit (GPU)-based computer. Our results show that a multi-threaded xyz modelling approach is easy to implement and delivers excellent results when implemented
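
    The per-component multi-threading idea can be sketched on a 2-D (TMz) FDTD grid, where the two magnetic-field updates are mutually independent and can run in parallel threads; this is an assumed NumPy illustration (relying on NumPy releasing the GIL for large array operations), not the authors' 3-D GPR implementation:

```python
# Conceptual sketch of per-component multi-threading on a 2-D TMz FDTD grid:
# the Hx and Hy updates depend only on Ez and not on each other, so they can
# be computed concurrently. Illustrative only; not the paper's 3-D GPR code.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

nx = ny = 400
c = 0.5                               # Courant number (normalized units)
Ez = np.zeros((nx, ny))
Hx = np.zeros((nx, ny - 1))
Hy = np.zeros((nx - 1, ny))

def update_hx():
    Hx[:, :] -= c * (Ez[:, 1:] - Ez[:, :-1])

def update_hy():
    Hy[:, :] += c * (Ez[1:, :] - Ez[:-1, :])

with ThreadPoolExecutor(max_workers=2) as pool:
    for step in range(200):
        # Magnetic-field components updated in parallel threads.
        futures = [pool.submit(update_hx), pool.submit(update_hy)]
        for f in futures:
            f.result()
        # Electric-field update (depends on both H components).
        Ez[1:-1, 1:-1] += c * ((Hy[1:, 1:-1] - Hy[:-1, 1:-1])
                               - (Hx[1:-1, 1:] - Hx[1:-1, :-1]))
        Ez[nx // 2, ny // 2] += np.sin(0.1 * step)   # soft point source

print("field energy proxy:", float(np.sum(Ez**2)))
```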

  18. Advanced Beamline Design for Fermilab's Advanced Superconducting Test Accelerator

    SciTech Connect

    Prokop, Christopher

    2014-01-01

    The Advanced Superconducting Test Accelerator (ASTA) at Fermilab is a new electron accelerator currently in the commissioning stage. In addition to testing superconducting accelerating cavities for future accelerators, it is expected to support a variety of Advanced Accelerator R&D (AARD) experiments. Producing the required electron bunches with the expected flexibility is challenging. The goal of this dissertation is to explore, via numerical simulations, new accelerator beamlines that can enable the advanced manipulation of electron bunches. The work includes, in particular, the design of a low-energy bunch compressor and a study of transverse-to-longitudinal phase space exchangers.
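
    As general accelerator-physics background (not a result taken from the dissertation, and with sign conventions that vary between laboratories), the first-order picture of a chicane bunch compressor is that an RF-induced energy chirp h combined with the chicane's longitudinal dispersion R56 rescales the bunch length:

    ```latex
    % First-order longitudinal dynamics of a chicane bunch compressor
    % (illustrative background only, not the ASTA design values).
    \[
      z_f = z_i + R_{56}\,\delta_i, \qquad \delta_i \approx h\,z_i
      \;\Longrightarrow\; z_f \approx \left(1 + h R_{56}\right) z_i,
    \]
    \[
      C \equiv \frac{\sigma_{z,i}}{\sigma_{z,f}} \approx \frac{1}{\left|1 + h R_{56}\right|}.
    \]
    ```

    Choosing h and R56 so that 1 + hR56 is small therefore shortens the bunch, which is the basic design knob behind the low-energy compressor mentioned above.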

  19. OpenSMOKE++: An object-oriented framework for the numerical modeling of reactive systems with detailed kinetic mechanisms

    NASA Astrophysics Data System (ADS)

    Cuoci, A.; Frassoldati, A.; Faravelli, T.; Ranzi, E.

    2015-07-01

    OpenSMOKE++ is a general framework for numerical simulations of reacting systems with detailed kinetic mechanisms, including thousands of chemical species and reactions. The framework is entirely written in object-oriented C++ and can be easily extended and customized by the user for specific systems, without having to modify the core functionality of the program. The OpenSMOKE++ framework can handle simulations of ideal chemical reactors (plug-flow, batch, and jet stirred reactors), shock-tubes, and rapid compression machines, and can be easily incorporated into multi-dimensional CFD codes for the modeling of reacting flows. OpenSMOKE++ provides useful numerical tools such as sensitivity and rate-of-production analyses, needed to recognize the main chemical paths and to interpret the numerical results from a kinetic point of view. Since simulations involving large kinetic mechanisms are very time consuming, OpenSMOKE++ adopts advanced numerical techniques that reduce the computational cost without sacrificing the accuracy and robustness of the calculations. In the present paper we give a detailed description of the framework features, the numerical models available, and the implementation of the code. The possibility of coupling the OpenSMOKE++ functionality with existing numerical codes is discussed. The computational performance of the framework is presented, with particular emphasis on its capabilities for integrating stiff ODE systems. Finally, examples demonstrating the ability of the OpenSMOKE++ framework to manage large kinetic mechanisms are presented.
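
    To make the object-oriented idea concrete, the sketch below is a generic illustration only and is not the actual OpenSMOKE++ interface: the class names, the single A -> B reaction, the rate constant and the explicit Euler stepper are all invented for the example. It shows how a reactor model can be hidden behind an abstract ODE right-hand-side class so that reactors and integrators can be swapped independently.

    ```cpp
    // Generic object-oriented sketch of a reactor-as-ODE-system interface.
    // NOT the OpenSMOKE++ API; all names and numbers are illustrative.
    #include <cstdio>
    #include <vector>

    // Abstract right-hand side dy/dt = f(t, y) that any reactor model implements.
    struct OdeSystem {
        virtual ~OdeSystem() = default;
        virtual int size() const = 0;
        virtual void rhs(double t, const std::vector<double>& y,
                         std::vector<double>& dydt) const = 0;
    };

    // Toy isothermal batch reactor with a single reaction A -> B (mass fractions).
    struct BatchReactorAB : OdeSystem {
        double k = 2.0;   // 1/s, hypothetical first-order rate constant
        int size() const override { return 2; }
        void rhs(double, const std::vector<double>& y,
                 std::vector<double>& dydt) const override {
            const double r = k * y[0];   // first-order consumption of A
            dydt[0] = -r;
            dydt[1] = +r;
        }
    };

    // Simple explicit Euler stepper; a production code for stiff mechanisms
    // would use an implicit method (e.g. a BDF scheme) instead.
    void integrate(const OdeSystem& sys, std::vector<double>& y,
                   double t0, double t1, double dt) {
        std::vector<double> dydt(sys.size());
        for (double t = t0; t < t1; t += dt) {
            sys.rhs(t, y, dydt);
            for (int i = 0; i < sys.size(); ++i) y[i] += dt * dydt[i];
        }
    }

    int main() {
        BatchReactorAB reactor;
        std::vector<double> y = {1.0, 0.0};       // start with pure A
        integrate(reactor, y, 0.0, 1.0, 1e-4);
        std::printf("After 1 s: A = %.4f, B = %.4f\n", y[0], y[1]);
        return 0;
    }
    ```

    In this pattern, adding a new reactor type or a new solver means implementing one interface rather than modifying the core of the framework, which is the extensibility argument made in the abstract.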

  20. Tools Automate Spacecraft Testing, Operation

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."