Science.gov

Sample records for advanced computational techniques

  1. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  2. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

    Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state of the art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  3. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper presents advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. The cost of the measurement system is extremely high; therefore, a simulation tool was designed. The simulation makes it possible to exercise algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. The Axis Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
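
    The abstract stops short of implementation detail; purely as a hedged illustration of the AABB idea it names, the sketch below tests a simulated LRF ray against an axis-aligned bounding box with the standard slab method. All geometry and names are invented, and the authors' CUDA variant is not shown.

    ```python
    import numpy as np

    def ray_aabb_intersect(origin, direction, box_min, box_max):
        """Slab-method test of one LRF ray against an axis-aligned bounding box.

        Returns the hit distance along the ray, or None on a miss.
        Assumes `direction` has no exactly-zero components.
        """
        inv_d = 1.0 / direction
        t_lo = (box_min - origin) * inv_d
        t_hi = (box_max - origin) * inv_d
        t_near = np.max(np.minimum(t_lo, t_hi))  # latest entry over the three slabs
        t_far = np.min(np.maximum(t_lo, t_hi))   # earliest exit over the three slabs
        if t_near <= t_far and t_far >= 0.0:
            return max(t_near, 0.0)
        return None

    # One simulated beam against a hypothetical obstacle box.
    origin = np.array([0.0, 0.0, 0.0])
    direction = np.array([1.0, 0.1, 0.05])
    direction /= np.linalg.norm(direction)
    print(ray_aabb_intersect(origin, direction,
                             np.array([4.0, -1.0, -1.0]), np.array([6.0, 1.0, 1.0])))
    ```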

  4. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production continues to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  5. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels that are possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  6. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  7. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation imaging. If the 2D source images are poor, no amount of 3D manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and in 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  8. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  9. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work in making sure the contributions could be published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program

  10. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  11. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line, non-invasive measurement technique based on gamma-ray densitometry (i.e., nuclear gauge densitometry) that can be installed and used for coater process monitoring, to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma-ray computed tomography (CT) (for measurements of the cross-sectional distribution and radial profiles of solids/voidage holdup along the bed height, the spout diameter, and the fountain height) and radioactive particle tracking (RPT) (for measurements of the 3D solids flow field, velocity, turbulence parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamics (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  12. Integrative Utilization of Microenvironments, Biomaterials and Computational Techniques for Advanced Tissue Engineering.

    PubMed

    Shamloo, Amir; Mohammadaliha, Negar; Mohseni, Mina

    2015-10-20

    This review proposes the integrative implementation of microfluidic devices, biomaterials, and computational methods, which can lead to significant progress in tissue engineering and regenerative medicine research. Simultaneous implementation of multiple techniques can be very helpful in addressing biological processes. Providing controllable biochemical and biomechanical cues within an artificial extracellular matrix, similar to in vivo conditions, is crucial in tissue engineering and regenerative medicine research. Microfluidic devices provide precise spatial and temporal control over the cell microenvironment. Moreover, generation of accurate and controllable spatial and temporal gradients of biochemical factors is attainable inside microdevices. Since biomaterials with tunable properties are a worthwhile option for constructing an artificial extracellular matrix, in vitro platforms that simultaneously utilize natural, synthetic, or engineered biomaterials inside microfluidic devices are phenomenally advantageous to experimental studies in the field of tissue engineering. Additionally, collaboration between experimental and computational methods is a useful way to predict and understand the mechanisms responsible for complex biological phenomena. Computational results can be verified by using experimental platforms, and computational methods can broaden the understanding of the mechanisms behind the biological phenomena observed during experiments. Furthermore, computational methods are powerful tools for optimizing the fabrication of microfluidic devices and biomaterials with specific features. Here we present a succinct review of the benefits of microfluidic devices, biomaterials, and computational methods in tissue engineering and regenerative medicine, together with some breakthroughs in understanding biological phenomena, including neuronal axon development, cancerous cell migration, and blood vessel formation via angiogenesis, achieved by virtue of the aforementioned approaches.

  13. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac-generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.
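
    The abstract gives no algorithmic detail; purely as a hedged illustration of how such bioelectric inverse problems are commonly posed, the sketch below solves a toy Tikhonov-regularized least-squares version with a random stand-in for the heart-to-torso transfer matrix. Nothing here is Sandia's actual formulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for the forward (lead-field) matrix mapping cardiac sources
    # to body-surface potentials; the real matrix comes from a high-fidelity
    # geometry model, not random numbers.
    n_surface, n_sources = 64, 200
    A = rng.standard_normal((n_surface, n_sources))

    x_true = np.zeros(n_sources)
    x_true[40] = 1.0                   # a single "abnormal pathway" source
    b = A @ x_true + 0.01 * rng.standard_normal(n_surface)

    # Tikhonov-regularized least squares: min ||Ax - b||^2 + lam * ||x||^2,
    # solved via the normal equations (A^T A + lam I) x = A^T b.
    lam = 1e-2
    x_est = np.linalg.solve(A.T @ A + lam * np.eye(n_sources), A.T @ b)
    print("estimated source location:", int(np.argmax(np.abs(x_est))))
    ```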

  14. Advanced retrieval method in satellite remote sensing atmosphere: the technique of computed tomography

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Xun, Yulong

    1998-08-01

    Computed tomography (CT) is a modern medical diagnostic technique in which x-ray transmission measurements at numerous angles through the human body are processed by computer to produce cross-sectional pictures of the body. The technique has also found applications in such diverse fields as materials testing, astronomy, microscopy, image processing and oceanography. In this paper, a modification of this technique, using emitted IR or microwave radiation instead of transmitted x-ray radiation, is applied to satellite radiance measurements taken along the orbital track at various angles. The channels of the IR sensors for the CT retrieval are selected from the HITRAN database and analyzed by eigenvalue analysis. We discuss in detail the effect of the projection angle on the CT retrieval results. Finally, using balloon sounding data, the CT results are compared with those of the conventional method. The advantage over conventional remote sensing methods is the additional information acquired by viewing a given point in the atmosphere at several angles as well as at several frequencies. The results show that the temperature profiles from the CT retrieval are better than those of the conventional method.
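
    As a minimal sketch of the tomographic machinery described here, the following implements the classic algebraic reconstruction technique (ART, a Kaczmarz sweep) on a synthetic linear system; the paper's actual retrieval, with IR weighting functions selected from HITRAN, is not reproduced, and the "viewing geometry" below is fabricated.

    ```python
    import numpy as np

    def art_reconstruct(A, y, n_sweeps=200, relax=1.0):
        """Kaczmarz/ART: cycle through rows, projecting the estimate onto
        the hyperplane defined by each measurement equation."""
        x = np.zeros(A.shape[1])
        for _ in range(n_sweeps):
            for a_i, y_i in zip(A, y):
                x += relax * (y_i - a_i @ x) / (a_i @ a_i) * a_i
        return x

    rng = np.random.default_rng(1)
    # Synthetic geometry: each row mimics one slant-path measurement
    # through a 20-layer atmosphere (weights summed along the path).
    A = rng.random((30, 20))
    profile_true = np.linspace(290.0, 220.0, 20)   # temperature profile, K
    y = A @ profile_true
    print(np.round(art_reconstruct(A, y), 1))
    ```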

  15. Application of advanced grid generation techniques for flow field computations about complex configurations

    NASA Technical Reports Server (NTRS)

    Kathong, Monchai; Tiwari, Surendra N.

    1988-01-01

    In the computation of flowfields about complex configurations, it is very difficult to construct a boundary-fitted coordinate system. An alternative approach is to use several grids at once, each of which is generated independently. This procedure is called the multiple grids or zonal grids approach; its applications are investigated. The method is conservative, providing conservation of fluxes at grid interfaces. The Euler equations are solved numerically on such grids for various configurations. The numerical scheme used is the finite-volume technique with a three-stage Runge-Kutta time integration. The code is vectorized and programmed to run on the CDC VPS-32 computer. Steady state solutions of the Euler equations are presented and discussed. The solutions include: low speed flow over a sphere, high speed flow over a slender body, supersonic flow through a duct, and supersonic internal/external flow interaction for an aircraft configuration at various angles of attack. The results demonstrate that the multiple grids approach along with the conservative interfacing is capable of computing the flows about complex configurations where the use of a single grid system is not possible.
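
    The abstract names a finite-volume scheme with three-stage Runge-Kutta time integration; the sketch below shows that integrator on a deliberately simple 1D linear advection problem (first-order upwind fluxes, periodic boundaries), assuming a Jameson-style stage sequence. It is not the authors' Euler solver.

    ```python
    import numpy as np

    def residual(u, dx, a=1.0):
        """Finite-volume residual for 1D linear advection u_t + a u_x = 0
        with first-order upwind fluxes and periodic boundaries (a > 0)."""
        flux = a * u
        return -(flux - np.roll(flux, 1)) / dx

    def rk3_step(u, dt, dx):
        """Jameson-style three-stage Runge-Kutta:
        u_k = u^n + alpha_k * dt * R(u_{k-1}), alpha = 1/3, 1/2, 1."""
        u_new = u.copy()
        for alpha in (1.0 / 3.0, 1.0 / 2.0, 1.0):
            u_new = u + alpha * dt * residual(u_new, dx)
        return u_new

    n = 100
    dx = 1.0 / n
    x = (np.arange(n) + 0.5) * dx
    u = np.exp(-200.0 * (x - 0.3) ** 2)   # initial pulse at x = 0.3
    dt = 0.4 * dx                         # CFL-limited time step
    for _ in range(125):                  # advect to t = 0.5
        u = rk3_step(u, dt, dx)
    print("pulse peak now near x =", x[np.argmax(u)])
    ```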

  16. Advanced computational techniques for incompressible/compressible fluid-structure interactions

    NASA Astrophysics Data System (ADS)

    Kumar, Vinod

    2005-07-01

    Fluid-structure interaction (FSI) problems are of great importance to many fields of engineering and pose tremendous challenges to numerical analysts. This thesis addresses some of the hurdles faced in both 2D and 3D real-life time-dependent FSI problems, with particular emphasis on parachute systems. The techniques developed here would help improve the design of parachutes and are of direct relevance to several other FSI problems. The fluid system is solved using the Deforming-Spatial-Domain/Stabilized Space-Time (DSD/SST) finite element formulation for the Navier-Stokes equations of incompressible and compressible flows. The structural dynamics solver is based on a total Lagrangian finite element formulation. The Newton-Raphson method is employed to linearize the otherwise nonlinear system resulting from the fluid and structure formulations. The fluid and structural systems are solved in a decoupled fashion at each nonlinear iteration. While rigorous coupling methods are desirable for FSI simulations, the decoupled solution techniques provide sufficient convergence in the time-dependent problems considered here. In this thesis, common problems in FSI simulations of parachutes are discussed and possible remedies for a few of them are presented. Further, the effects of the porosity model on the aerodynamic forces of round parachutes are analyzed. Techniques for solving compressible FSI problems are also discussed. Subsequently, a better stabilization technique is proposed to efficiently capture and accurately predict the shocks in supersonic flows. The numerical examples simulated here require high-performance computing; therefore, numerical tools for distributed-memory supercomputers using message passing interface (MPI) libraries were developed.
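
    The DSD/SST and total Lagrangian formulations are far beyond a snippet, but the decoupled (block-iterative) coupling pattern the thesis describes can be sketched on a toy model: a spring-supported panel under a displacement-dependent pressure load. All constants are invented for illustration.

    ```python
    # Schematic block-iterative (decoupled) FSI coupling at one time step:
    # alternately solve a toy "fluid" load and a toy "structure" response
    # until the interface displacement stops changing.

    def fluid_load(displacement, q_inf=2.0):
        """Toy aerodynamic load: pressure relief as the panel deflects."""
        return q_inf * (1.0 - 0.5 * displacement)

    def structure_response(load, stiffness=4.0):
        """Toy structural solve: linear spring, u = f / k."""
        return load / stiffness

    u = 0.0                                  # interface displacement
    for it in range(50):
        f = fluid_load(u)                    # fluid solve, structure frozen
        u_new = structure_response(f)        # structure solve, load frozen
        converged = abs(u_new - u) < 1e-10   # interface convergence check
        u = u_new
        if converged:
            break
    print(f"converged in {it} iterations: u = {u:.6f}")
    ```

    The fixed point here is u = 0.4, reached in a handful of passes; rigorous (monolithic) coupling replaces this alternation with a single Newton solve over both fields.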

  17. Research in advanced formal theorem-proving techniques. [design and implementation of computer languages

    NASA Technical Reports Server (NTRS)

    Raphael, B.; Fikes, R.; Waldinger, R.

    1973-01-01

    The results of a project aimed at the design and implementation of computer languages to aid in expressing problem-solving procedures in several areas of artificial intelligence, including automatic programming, theorem proving, and robot planning, are summarised. The principal results of the project were the design and implementation of two complete systems, QA4 and QLISP, and their preliminary experimental use. The various applications of both QA4 and QLISP are given.

  18. Application of advanced computing techniques to the analysis and display of space science measurements

    NASA Technical Reports Server (NTRS)

    Klumpar, D. M.; Lapolla, M. V.; Horblit, B.

    1995-01-01

    A prototype system has been developed to aid the experimental space scientist in the display and analysis of spaceborne data acquired from direct-measurement sensors in orbit. We explored the implementation of a rule-based environment for semi-automatic generation of visualizations that assist the domain scientist in exploring their data. The goal has been to enable rapid generation of visualizations which enhance the scientist's ability to thoroughly mine the data. Transferring the task of visualization generation from the human programmer to the computer produced a rapid-prototyping environment for visualizations. The visualization and analysis environment was tested against a set of data obtained from the Hot Plasma Composition Experiment on the AMPTE/CCE satellite, creating new visualizations which provided new insight into the data.

  19. Interfaces for Advanced Computing.

    ERIC Educational Resources Information Center

    Foley, James D.

    1987-01-01

    Discusses the coming generation of supercomputers that will have the power to make elaborate "artificial realities" that facilitate user-computer communication. Illustrates these technological advancements with examples of the use of head-mounted monitors which are connected to position and orientation sensors, and gloves that track finger and…

  20. Advanced Communication Processing Techniques

    NASA Astrophysics Data System (ADS)

    Scholtz, Robert A.

    This document contains the proceedings of the workshop Advanced Communication Processing Techniques, held May 14 to 17, 1989, near Ruidoso, New Mexico. Sponsored by the Army Research Office (under Contract DAAL03-89-G-0016) and organized by the Communication Sciences Institute of the University of Southern California, the workshop had as its objective to determine those applications of intelligent/adaptive communication signal processing that have been realized and to define areas of future research. We at the Communication Sciences Institute believe that there are two emerging areas which deserve considerably more study in the near future: (1) Modulation characterization, i.e., the automation of modulation format recognition so that a receiver can reliably demodulate a signal without using a priori information concerning the signal's structure, and (2) the incorporation of adaptive coding into communication links and networks. (Encoders and decoders which can operate with a wide variety of codes exist, but the way to utilize and control them in links and networks is an issue). To support these two new interest areas, one must have both a knowledge of (3) the kinds of channels and environments in which the systems must operate, and of (4) the latest adaptive equalization techniques which might be employed in these efforts.

  1. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and discusses in detail the findings of a gap analysis of the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparative analysis of the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
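
    The dissertation's Monte Carlo simulation optimization is only named in the abstract; as a hedged sketch of the generic idea, the snippet below propagates assumed subprocess distributions through a simple process-performance model to obtain an outcome distribution. Every distribution and number is fabricated for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_trials = 100_000

    # Fabricated subprocess-performance distributions: defect injection
    # (defects/KLOC) and peer-review effectiveness (fraction of injected
    # defects caught before test).
    injected = rng.lognormal(np.log(20.0), 0.3, n_trials)
    effectiveness = rng.beta(8.0, 2.0, n_trials)

    # Process-performance model: defects escaping into test.
    escaped = injected * (1.0 - effectiveness)
    print("median escaped defects/KLOC:", round(float(np.median(escaped)), 2))
    print("90th percentile:", round(float(np.percentile(escaped, 90)), 2))
    ```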

  2. Advanced Computing for Medicine.

    ERIC Educational Resources Information Center

    Rennels, Glenn D.; Shortliffe, Edward H.

    1987-01-01

    Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)

  3. Advanced prosthetic techniques for below knee amputations.

    PubMed

    Staats, T B

    1985-02-01

    Recent advances in the evaluation of the amputation stump, the materials that are available for prosthetic application, techniques of improving socket fit, and prosthetic finishings promise to dramatically improve amputee function. Precision casting techniques for providing optimal fit of the amputation stump using materials such as alginate are described. The advantages of transparent check sockets for fitting the complicated amputation stump are described. Advances in research that promise to provide more functional prosthetic feet and faster and more reliable socket molding are the use of CAD-CAM (computer aided design-computer aided manufacturing) and the use of gait analysis techniques to aid in the alignment of the prosthesis after socket fitting. Finishing techniques to provide a more natural appearing prosthesis are described. These advances will gradually spread to the entire prosthetic profession.

  4. Advanced Coating Removal Techniques

    NASA Technical Reports Server (NTRS)

    Seibert, Jon

    2006-01-01

    An important step in the repair and protection against corrosion damage is the safe removal of the oxidation and protective coatings without further damaging the integrity of the substrate. Two such methods that are proving to be safe and effective in this task are liquid nitrogen and laser removal operations. Laser technology used for the removal of protective coatings is currently being researched and implemented in various areas of the aerospace industry. Delivering thousands of focused energy pulses, the laser ablates the coating surface by heating and dissolving the material applied to the substrate. The metal substrate will reflect the laser and redirect the energy to any remaining protective coating, thus preventing any collateral damage the substrate may suffer throughout the process. Liquid nitrogen jets are comparable to blasting with an ultra-high-pressure water jet, but without the residual liquid that requires collection and removal. As the liquid nitrogen reaches the surface it is transformed into gaseous nitrogen and reenters the atmosphere without any contamination to surrounding hardware. These innovative technologies simplify corrosion repair by eliminating hazardous chemicals and repetitive manual labor from the coating removal process. One very significant advantage is the reduction of particulate contamination exposure to personnel. With the removal of coatings adjacent to sensitive flight hardware, a benefit of each technique for the space program is that no contamination such as beads, water, or sanding residue is left behind when the job is finished. One primary concern is the safe removal of coatings from thin aluminum honeycomb face sheet. NASA recently conducted thermal testing on liquid nitrogen systems and found that no damage occurred on 1/6" aluminum substrates. Wright Patterson Air Force Base, in conjunction with Boeing and NASA, is currently testing the laser removal technique for process qualification. Other applications of liquid

  5. Advanced Wavefront Control Techniques

    SciTech Connect

    Olivier, S S; Brase, J M; Avicola, K; Thompson, C A; Kartz, M W; Winters, S; Hartley, R; Wihelmsen, J; Dowla, F V; Carrano, C J; Bauman, B J; Pennington, D M; Lande, D; Sawvel, R M; Silva, D A; Cooke, J B; Brown, C G

    2001-02-21

    In this project, work was performed in four areas: (1) advanced modeling tools for deformable mirrors, (2) low-order wavefront correctors with Alvarez lenses, (3) a direct phase-measuring heterodyne wavefront sensor, and (4) high-spatial-frequency wavefront control using spatial light modulators.

  6. Advanced qualification techniques

    SciTech Connect

    Winokur, P.S; Shaneyfelt, M.R.; Meisenheimer, T.L.; Fleetwood, D.M.

    1993-12-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML "builds in" the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish "process capability" is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883D, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SSC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  7. Advanced qualification techniques

    NASA Astrophysics Data System (ADS)

    Winokur, P. S.; Shaneyfelt, M. R.; Meisenheimer, T. L.; Fleetwood, D. M.

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML 'builds in' the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish 'process capability' is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883D, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SSC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  8. Advanced qualification techniques

    SciTech Connect

    Winokur, P.S.; Shaneyfelt, M.R.; Meisenheimer, T.L.; Fleetwood, D.M. )

    1994-06-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML "builds in" the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish "process capability" is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SCC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  9. Advanced qualification techniques

    NASA Astrophysics Data System (ADS)

    Winokur, P. S.; Shaneyfelt, M. R.; Meisenheimer, T. L.; Fleetwood, D. M.

    1994-06-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML 'builds in' the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-to-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish 'process capability' is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SCC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  10. Advanced Computing for Science.

    ERIC Educational Resources Information Center

    Hut, Piet; Sussman, Gerald Jay

    1987-01-01

    Discusses some of the contributions that high-speed computing is making to the study of science. Emphasizes the use of computers in exploring complicated systems without the simplification required in traditional methods of observation and experimentation. Provides examples of computer assisted investigations in astronomy and physics. (TW)

  11. Techniques in Advanced Language Teaching.

    ERIC Educational Resources Information Center

    Ager, D. E.

    1967-01-01

    For ease of presentation, advanced grammar teaching techniques are briefly considered under the headings of structuralism (belief in the effectiveness of presenting grammar rules) and contextualism (belief in the maximum use by students of what they know in the target language). The structuralist's problem of establishing a syllabus is discussed…

  12. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  13. Advances in computational solvation thermodynamics

    NASA Astrophysics Data System (ADS)

    Wyczalkowski, Matthew A.

    The aim of this thesis is to develop improved methods for calculating the free energy, entropy and enthalpy of solvation from molecular simulations. Solvation thermodynamics of model compounds provides quantitative measurements used to analyze the stability of protein conformations in aqueous milieus. Solvation free energies govern the favorability of the solvation process, while entropy and enthalpy decompositions give insight into the molecular mechanisms by which the process occurs. Computationally, a coupling parameter lambda modulates solute-solvent interactions to simulate an insertion process, and multiple lengthy simulations at a fixed lambda value are typically required for free energy calculations to converge; entropy and enthalpy decompositions generally take 10-100 times longer. This thesis presents three advances which accelerate the convergence of such calculations: (1) Development of entropy and enthalpy estimators which combine data from multiple simulations; (2) Optimization of lambda schedules, or the set of parameter values associated with each simulation; (3) Validation of Hamiltonian replica exchange, a technique which swaps lambda values between two otherwise independent simulations. Taken together, these techniques promise to increase the accuracy and precision of free energy, entropy and enthalpy calculations. Improved estimates, in turn, can be used to investigate the validity and limits of existing solvation models and refine force field parameters, with the goal of understanding better the collapse transition and aggregation behavior of polypeptides.
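
    The thesis's improved estimators are not given in the abstract; as a baseline point of reference, the standard thermodynamic-integration estimate of the solvation free energy, Delta F = integral from 0 to 1 of <dU/dlambda> dlambda, can be sketched as below. The per-lambda averages here are fabricated stand-ins for values that would come from one simulation per lambda value.

    ```python
    import numpy as np

    # A lambda schedule: one simulation is run at each coupling-parameter value.
    lam = np.array([0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0])

    # Fabricated stand-ins for the ensemble averages <dU/dlambda>; in practice
    # each entry is estimated from a long simulation at that lambda.
    dU_dlam = -40.0 * lam * (1.0 - 0.5 * lam)   # kJ/mol

    # Thermodynamic integration by the trapezoid rule:
    # Delta F = integral_0^1 <dU/dlambda> dlambda.
    delta_F = float(np.sum(0.5 * (dU_dlam[1:] + dU_dlam[:-1]) * np.diff(lam)))
    print(f"estimated solvation free energy: {delta_F:.2f} kJ/mol")
    ```

    Spacing the lambda points more densely where <dU/dlambda> varies fastest is one simple form of the schedule optimization the thesis pursues.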

  14. Advanced computer languages

    SciTech Connect

    Bryce, H.

    1984-05-03

    If software is to become an equal partner in the so-called fifth generation of computers (which of course it must), programming languages and the human interface will need to clear some high hurdles. Again, the solutions being sought turn to cerebral emulation: here, the way that human beings understand language. The result would be natural or English-like languages that would allow a person to communicate with a computer much as he or she does with another person. In the discussion the authors look at fourth-level and fifth-level languages used in meeting the goals of AI. The higher-level languages aim to be non-procedural. Applications of LISP and Forth to natural language interfaces are described, as well as programs such as the Natural Link technology package, written in C.

  15. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    NASA Technical Reports Server (NTRS)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  16. Neutron analysis of spent fuel storage installation using parallel computing and advance discrete ordinates and Monte Carlo techniques.

    PubMed

    Shedlock, Daniel; Haghighat, Alireza

    2005-01-01

    In the United States, the Nuclear Waste Policy Act of 1982 mandated centralised storage of spent nuclear fuel by 1988. However, the Yucca Mountain project is currently scheduled to start accepting spent nuclear fuel in 2010. Since many nuclear power plants were only designed for ~10 y of spent fuel pool storage, >35 plants have been forced into alternate means of spent fuel storage. In order to continue operation and make room in spent fuel pools, nuclear generators are turning towards independent spent fuel storage installations (ISFSIs). Typical vertical concrete ISFSIs are ~6.1 m high and 3.3 m in diameter. The inherently large system, and the presence of thick concrete shields, result in difficulties for both Monte Carlo (MC) and discrete ordinates (SN) calculations. MC calculations require significant variance reduction and multiple runs to obtain a detailed dose distribution. SN models need a large number of spatial meshes to accurately model the geometry and high quadrature orders to reduce ray effects, therefore requiring significant amounts of computer memory and time. The use of various differencing schemes is needed to account for radial heterogeneity in material cross sections and densities. Two P3, S12 discrete ordinates PENTRAN (parallel environment neutral-particle TRANsport) models were analysed and different MC models compared. A multigroup MCNP model was developed for direct comparison to the SN models. The biased A3MCNP (automated adjoint accelerated MCNP) and unbiased (MCNP) continuous-energy MC models were developed to assess the adequacy of the CASK multigroup (22 neutron, 18 gamma) cross sections. The PENTRAN SN results are in close agreement (5%) with the multigroup MC results; however, they differ by ~20-30% from the continuous-energy MC predictions. This large difference can be attributed to the expected difference between multigroup and continuous-energy cross sections, and the fact that the CASK library is based on the old ENDF

  17. Advances in wound debridement techniques.

    PubMed

    Nazarko, Linda

    2015-06-01

    Dead and devitalised tissue interferes with the process of wound healing. Debridement is a natural process that occurs in all wounds and is crucial to healing; it reduces the bacterial burden in a wound and promotes effective inflammatory responses that encourage the formation of healthy granulation tissue (Wolcott et al, 2009). Wound care should be part of holistic patient care. Recent advances in debridement techniques include: biosurgery, hydrosurgery, mechanical debridement, and ultrasound. Biosurgery and mechanical debridement can be practiced by nonspecialist nurses and can be provided in a patient's home, thus increasing the patient's access to debridement therapy and accelerating wound healing.

  18. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  19. Recent advances in computational aerodynamics

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh K.; Desse, Jerry E.

    1991-04-01

    The current state of the art in computational aerodynamics is described. Recent advances in the discretization of surface geometry, grid generation, and flow simulation algorithms have led to flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics is emerging as a crucial enabling technology for the development and design of flight vehicles. Examples illustrating the current capability for the prediction of aircraft, launch vehicle and helicopter flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  1. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

    This report documents a special study to define a 32-bit, radiation-hardened, SEU-tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building-block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  2. Advanced algorithm for orbit computation

    NASA Technical Reports Server (NTRS)

    Szebehely, V.

    1983-01-01

    Computational and analytical techniques which simplify the solution of complex problems in orbit mechanics, astrodynamics and celestial mechanics were developed. The major tool of the simplification is the substitution of transformations in place of numerical or analytical integrations. In this way the rather complicated equations of orbit mechanics can sometimes be reduced to linear equations representing harmonic oscillators with constant coefficients.
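
    The harmonic-oscillator claim is the classical regularization result; one standard instance, Levi-Civita's planar transformation (offered here as a textbook example, not necessarily the transformation used in this work), reads:

    ```latex
    % Planar Kepler problem, x treated as a complex coordinate:
    %   \ddot{x} = -\mu x / r^{3}, \qquad r = |x|.
    \begin{align*}
      dt &= r\,ds \quad \text{(Sundman time transformation)}, &
      x &= u^{2} \quad \text{(Levi-Civita map)},\\
      u'' &- \frac{E}{2}\,u = 0, &
      E &= \frac{1}{2}\lvert\dot{x}\rvert^{2} - \frac{\mu}{r},
    \end{align*}
    % i.e., a linear equation with constant coefficients; for bound orbits
    % (E < 0) it is a harmonic oscillator of frequency \sqrt{-E/2}.
    ```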

  3. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  4. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
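
    The report only names the generalized framework; as a concrete, minimal reference point, here is the original Swendsen-Wang cluster update for the 2D Ising model (a standard textbook algorithm, not the authors' generalization).

    ```python
    import numpy as np

    def swendsen_wang_sweep(spins, beta, rng):
        """One Swendsen-Wang cluster update for the 2D Ising model
        (J = 1, periodic boundaries)."""
        L = spins.shape[0]
        parent = np.arange(L * L)          # union-find forest over sites

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]   # path halving
                i = parent[i]
            return i

        p_bond = 1.0 - np.exp(-2.0 * beta)
        for x in range(L):
            for y in range(L):
                for nx, ny in ((x + 1) % L, y), (x, (y + 1) % L):
                    # Activate a bond only between aligned neighbours.
                    if spins[x, y] == spins[nx, ny] and rng.random() < p_bond:
                        parent[find(x * L + y)] = find(nx * L + ny)

        # Flip every cluster to an independent random orientation.
        new_spin = {}
        for i in range(L * L):
            root = find(i)
            if root not in new_spin:
                new_spin[root] = rng.choice((-1, 1))
            spins[i // L, i % L] = new_spin[root]

    rng = np.random.default_rng(0)
    L, beta = 32, 0.44                     # near the critical coupling ~0.4407
    spins = rng.choice((-1, 1), size=(L, L))
    for _ in range(200):
        swendsen_wang_sweep(spins, beta, rng)
    print("magnetisation per site:", spins.mean())
    ```

    Near the critical point such cluster moves decorrelate the lattice far faster than single-spin flips, which is why generalizations of this idea were worth pursuing.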

  5. Advanced decision aiding techniques applicable to space

    NASA Technical Reports Server (NTRS)

    Kruchten, Robert J.

    1987-01-01

    RADC has had an intensive program to show the feasibility of applying advanced technology to Air Force decision-aiding situations. Some aspects of the program, such as Satellite Autonomy, are directly applicable to space systems. For example, RADC has shown the feasibility of decision aids that combine the advantages of laser disks and computer-generated graphics; decision aids that interface object-oriented programs with expert systems; decision aids that solve path optimization problems; etc. Some of the key techniques that could be used in space applications are reviewed. Current applications are reviewed along with their advantages and disadvantages, and examples are given of possible space applications. The emphasis is on sharing RADC's experience in decision-aiding techniques.

  6. Computational intelligence techniques in bioinformatics.

    PubMed

    Hassanien, Aboul Ella; Al-Shammari, Eiman Tamah; Ghali, Neveen I

    2013-12-01

    Computational intelligence (CI) is a well-established paradigm, with current systems having many of the characteristics of biological computers and capable of performing a variety of tasks that are difficult to do using conventional techniques. It is a methodology involving adaptive mechanisms and/or an ability to learn that facilitates intelligent behavior in complex and changing environments, such that the system is perceived to possess one or more attributes of reason, such as generalization, discovery, association and abstraction. The objective of this article is to present to the CI and bioinformatics research communities some of the state of the art in CI applications to bioinformatics and to motivate research in new trend-setting directions. In this article, we present an overview of CI techniques in bioinformatics. We show how CI techniques, including neural networks, restricted Boltzmann machines, deep belief networks, fuzzy logic, rough sets, evolutionary algorithms (EA), genetic algorithms (GA), swarm intelligence, artificial immune systems and support vector machines, can be successfully employed to tackle various problems such as gene expression clustering and classification, protein sequence classification, gene selection, DNA fragment assembly, multiple sequence alignment, and protein function and structure prediction. We discuss some representative methods to provide inspiring examples to illustrate how CI can be utilized to address these problems and how bioinformatics data can be characterized by CI. Challenges to be addressed and future directions of research are also presented, and an extensive bibliography is included. PMID:23891719
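
    As a toy illustration of just one entry in this list, the sketch below trains a support vector machine on k-mer (character 3-gram) counts for a two-class protein-sequence problem using scikit-learn. The six sequences are fabricated and far too few for a real study.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    # Fabricated toy data: "class A" sequences rich in hydrophobic residues,
    # "class B" rich in charged residues; real work would use curated datasets.
    seqs = ["MLLAVLVIAL", "MVVALLIVLA", "MKRKDEEKRD", "MDKEERRKDE",
            "MALLVVIALL", "MKDREEKKRD"]
    labels = ["A", "A", "B", "B", "A", "B"]

    # k-mer (character 3-gram) counts as features, fed to a linear-kernel SVM.
    model = make_pipeline(
        CountVectorizer(analyzer="char", ngram_range=(3, 3), lowercase=False),
        SVC(kernel="linear"),
    )
    model.fit(seqs, labels)
    print(model.predict(["MLVALLVIAL", "MKRDEEKKRD"]))
    ```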

  7. Advanced techniques of laser telemetry

    NASA Astrophysics Data System (ADS)

    Donati, S.; Gilardini, A.

    The relationships governing a laser telemeter, its noise sources, and measurement accuracy with pulsed and sinusoidal intensity-modulation techniques are discussed. Developments in telemetry instrumentation and optical detection are considered. Meteorological interferometers, geodimeters, and military telemeters are described. Propagation attenuation and signal-to-noise ratios are treated. It is shown that accuracy depends on the product of measurement time and received power. The frequency-scanning technique of CW and long-pulse telemetry, multifrequency techniques, pulse compression, and the vernier technique are outlined.
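
    The stated time-power tradeoff can be made quantitative. For a pulsed telemeter with shot-noise-limited detection, a standard estimate of the range error (our formulation of the relation the abstract states, not a formula quoted from it) is

      \sigma_R \approx \frac{c\,\tau_r}{2\sqrt{\mathrm{SNR}}}, \qquad \mathrm{SNR} \propto \frac{P_r\,T}{h\nu\,B},

    where \tau_r is the pulse rise time, B the receiver bandwidth, P_r the received power, and T the measurement time. Hence \sigma_R \propto (P_r T)^{-1/2}: halving the range error requires four times the received power or four times the measurement time.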

  8. Advances in computed tomography imaging technology.

    PubMed

    Ginat, Daniel Thomas; Gupta, Rajiv

    2014-07-11

    Computed tomography (CT) is an essential tool in diagnostic imaging for evaluating many clinical conditions. In recent years, there have been several notable advances in CT technology that already have had or are expected to have a significant clinical impact, including extreme multidetector CT, iterative reconstruction algorithms, dual-energy CT, cone-beam CT, portable CT, and phase-contrast CT. These techniques and their clinical applications are reviewed and illustrated in this article. In addition, emerging technologies that address deficiencies in these modalities are discussed.

  9. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains.

  10. Splitting advancement genioplasty: a new genioplasty technique.

    PubMed

    Celik, M; Tuncer, S; Büyükçayir, I

    1999-08-01

    A new genioplasty technique has been described and performed on 16 patients since 1995. The technique was developed to avoid some undesired results of current osseous genioplasty techniques and to achieve a more natural appearance in advancement genioplasty. In the authors' technique, a rectangular part of the outer table of the mentum is split away from the mandible, advanced, and fixated to the mandible. This technique can be used for advancement cases but not for reduction genioplasty. It was performed on 16 patients with only minor complications, including one case of wound dehiscence, one hematoma, and one case of osteomyelitis, which was managed with systemic antibiotic therapy. Aesthetic results were found to be satisfactory according to an evaluation by the authors. When the results were evaluated using pre- and postoperative photos, lip position and projection of the mentum were found to be natural in shape and appearance. During the late postoperative period, new bone formation between the advanced segment and the mandible was demonstrated radiographically. Advantages of the technique include more contact surface for bony healing, a natural position of the lower lip, a more natural projection of the mentum, tridimensional movement of the mentum, and improvement in the soft tissue of the neck. The disadvantages of the technique are the potential risk of infection due to dead space from the advancement, manipulation problems during surgery, and possible mental nerve injury. Splitting advancement genioplasty was found to be a useful technique for advancement genioplasty, and it is a more physiological osteotomy technique than most osseous genioplasty techniques. PMID:10454320

  11. Stitching Techniques Advance Optics Manufacturing

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Because NASA depends on the fabrication and testing of large, high-quality aspheric (nonspherical) optics for applications like the James Webb Space Telescope, it sought an improved method for measuring large aspheres. Through Small Business Innovation Research (SBIR) awards from Goddard Space Flight Center, QED Technologies, of Rochester, New York, upgraded and enhanced its stitching technology for aspheres. QED developed the SSI-A, which earned the company an R&D 100 award, and also developed a breakthrough machine tool called the aspheric stitching interferometer. The equipment is applied to advanced optics in telescopes, microscopes, cameras, medical scopes, binoculars, and photolithography.

  12. Advanced Spectroscopy Technique for Biomedicine

    NASA Astrophysics Data System (ADS)

    Zhao, Jianhua; Zeng, Haishan

    This chapter presents an overview of the applications of optical spectroscopy in biomedicine. We focus on the optical design aspects of advanced biomedical spectroscopy systems, Raman spectroscopy system in particular. Detailed components and system integration are provided. As examples, two real-time in vivo Raman spectroscopy systems, one for skin cancer detection and the other for endoscopic lung cancer detection, and an in vivo confocal Raman spectroscopy system for skin assessment are presented. The applications of Raman spectroscopy in cancer diagnosis of the skin, lung, colon, oral cavity, gastrointestinal tract, breast, and cervix are summarized.

  13. Advanced crew procedures development techniques

    NASA Technical Reports Server (NTRS)

    Arbet, J. D.; Benbow, R. L.; Mangiaracina, A. A.; Mcgavern, J. L.; Spangler, M. C.; Tatum, I. C.

    1975-01-01

    The development of an operational computer program, the Procedures and Performance Program (PPP), is reported which provides a procedures recording and crew/vehicle performance monitoring capability. The PPP provides real time CRT displays and postrun hardcopy of procedures, difference procedures, performance, performance evaluation, and training script/training status data. During post-run, the program is designed to support evaluation through the reconstruction of displays to any point in time. A permanent record of the simulation exercise can be obtained via hardcopy output of the display data, and via magnetic tape transfer to the Generalized Documentation Processor (GDP). Reference procedures data may be transferred from the GDP to the PPP.

  14. Advanced techniques in current signature analysis

    SciTech Connect

    Smith, S.F.; Castleberry, K.N.

    1992-03-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (>1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable "smart" CSA instrumentation in the next several years.
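
    To make the modulation-sideband mechanism concrete, the sketch below synthesizes a 60 Hz motor current carrying a weak 20 Hz amplitude modulation (standing in for running-speed vibration) and locates the resulting sidebands in the spectrum. All parameters are illustrative choices of ours:

      import numpy as np

      fs, T = 5000.0, 10.0                      # sample rate (Hz), record length (s)
      t = np.arange(0, T, 1.0 / fs)
      line, mod = 60.0, 20.0                    # line frequency, vibration frequency

      # Motor current: 60 Hz carrier with 1% amplitude modulation at 20 Hz.
      i_t = (1.0 + 0.01 * np.cos(2 * np.pi * mod * t)) * np.cos(2 * np.pi * line * t)
      i_t += 0.001 * np.random.default_rng(0).normal(size=t.size)   # sensor noise

      # Windowed FFT; AM appears as sidebands at line +/- mod (40 and 80 Hz).
      spec = np.abs(np.fft.rfft(i_t * np.hanning(t.size)))
      freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

      for f0 in (line - mod, line, line + mod):
          k = np.argmin(np.abs(freqs - f0))
          print(f"{f0:5.1f} Hz: relative level {spec[k] / spec.max():.4f}")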

  15. Advanced techniques in abdominal surgery.

    PubMed Central

    Monson, J R

    1993-01-01

    Almost every abdominal organ is now amenable to laparoscopic surgery. Laparoscopic appendicectomy is a routine procedure which also permits identification of other conditions initially confused with an inflamed appendix. However, assessment of appendiceal inflammation is more difficult. Almost all colonic procedures can be performed laparoscopically, at least partly, though resection for colonic cancer is still controversial. For simple patch repair of perforated duodenal ulcers laparoscopy is ideal, and inguinal groin hernia can be repaired satisfactorily with a patch of synthetic mesh. Many upper abdominal procedures, however, still take more time than the open operations. These techniques reduce postoperative pain and the incidence of wound infections and allow a much earlier return to normal activity compared with open surgery. They have also brought new disciplines: surgeons must learn different hand-eye coordination, meticulous haemostasis is needed to maintain picture quality, and delivery of specimens may be problematic. The widespread introduction of laparoscopic techniques has emphasised the need for adequate training (operations that were straightforward open procedures may require considerable laparoscopic expertise) and has raised questions about trainee surgeons acquiring adequate experience of open procedures. PMID:8257893

  16. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  17. Advanced sialoendoscopy techniques, rare findings, and complications.

    PubMed

    Nahlieli, Oded

    2009-12-01

    This article presents and discusses advanced minimally invasive sialoendoscopy and combined methods: endoscopy, endoscopic-assisted techniques, and combined external-lithotripsy procedures. It also presents rare situations and complications encountered during sialoendoscopic procedures. Sialoendoscopy is a relatively novel technique that adds significant new dimensions to the surgeon's armamentarium for the management of inflammatory salivary gland diseases. Because of the rapid development of minimally invasive surgical techniques, surgeons can now more readily treat complicated inflammatory and obstructive conditions of the salivary glands.

  18. Advancing manufacturing through computational chemistry

    SciTech Connect

    Noid, D.W.; Sumpter, B.G.; Tuzun, R.E.

    1995-12-31

    The capabilities of nanotechnology and computational chemistry are reaching a point of convergence. New computer hardware and novel computational methods have created opportunities to test proposed nanometer-scale devices, investigate molecular manufacturing, and model and predict properties of new materials. Experimental methods are also beginning to provide new capabilities that make the possibility of manufacturing various devices with atomic precision tangible. In this paper, we will discuss some of the novel computational methods we have used in molecular dynamics simulations of polymer processes, neural network predictions of new materials, and simulations of proposed nano-bearings and fluid dynamics in nano-sized devices.
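
    The molecular dynamics simulations mentioned ultimately rest on numerical integration of Newton's equations of motion. A minimal velocity-Verlet integrator for a one-dimensional chain of particles joined by harmonic bonds (a toy stand-in for a polymer backbone; all parameters are ours) illustrates the kind of kernel involved:

      import numpy as np

      n, dt, k, m = 32, 0.005, 1.0, 1.0       # particles, step, spring const, mass
      x = np.arange(n, dtype=float)           # equilibrium spacing of 1.0
      x[n // 2] += 0.3                        # perturb one particle
      v = np.zeros(n)

      def forces(x):
          """Harmonic nearest-neighbor bonds with rest length 1 (free ends)."""
          f = np.zeros_like(x)
          ext = x[1:] - x[:-1] - 1.0          # bond extensions
          f[:-1] += k * ext                   # stretched bond pulls left particle right
          f[1:] -= k * ext                    # ... and right particle left
          return f

      f = forces(x)
      for step in range(10000):
          # Velocity-Verlet: update positions, recompute forces, then velocities.
          x += v * dt + 0.5 * (f / m) * dt**2
          f_new = forces(x)
          v += 0.5 * (f + f_new) / m * dt
          f = f_new

      energy = 0.5 * m * (v**2).sum() + 0.5 * k * ((x[1:] - x[:-1] - 1.0)**2).sum()
      # Should stay near the initial value 0.5*k*(0.3^2 + 0.3^2) = 0.09:
      print(f"total energy after 10000 steps: {energy:.6f}")

    The printed total energy illustrates why velocity-Verlet is a common choice in such simulations: it conserves energy well over long runs.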

  19. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  20. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  1. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  2. Hybrid mesh generation using advancing reduction technique

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study presents an extension of the application of the advancing reduction technique to the hybrid mesh generation. The proposed algorithm is based on a pre-generated rectangle mesh (RM) with a certain orientation. The intersection points between the two sets of perpendicular mesh lines in RM an...

  3. 77 FR 62231 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-12

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  4. 76 FR 31945 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... the Advanced Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  5. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick, Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to engage in collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  6. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  7. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality against the effort expended; as rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality, techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc., and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. The aim of this paper is to provide researchers with insight into the workings of AC techniques and to inspire further efforts in this area, so as to make AC a mainstream computing approach in future systems.
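
    One of the simplest techniques in this family is loop perforation: skip a fraction of loop iterations and accept a bounded quality loss in exchange for a proportional reduction in work. A minimal sketch (our example of the general technique, not code from the survey):

      import math

      def mean_brightness(pixels, perforation=1):
          """Estimate the mean of `pixels`, visiting every `perforation`-th
          element. perforation=1 is exact; larger values trade accuracy for
          a proportional reduction in work (loop perforation)."""
          total, count = 0.0, 0
          for i in range(0, len(pixels), perforation):
              total += pixels[i]
              count += 1
          return total / count

      pixels = [128 + 100 * math.sin(i / 7.0) for i in range(1_000_000)]
      exact = mean_brightness(pixels)                    # visits 1,000,000 elements
      approx = mean_brightness(pixels, perforation=8)    # visits 125,000 elements
      print(f"exact={exact:.3f}  approx={approx:.3f}  "
            f"error={abs(exact - approx) / exact:.2%}")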

  8. Computer graphics techniques and computer-generated movies

    NASA Astrophysics Data System (ADS)

    Holzman, Robert E.; Blinn, James F.

    1988-04-01

    The JPL Computer Graphics Laboratory (CGL) has been using advanced computer graphics for more than ten years to simulate space missions and related activities. Applications have ranged from basic computer graphics used interactively to allow engineers to study problems, to sophisticated color graphics used to simulate missions and produce realistic animations and stills for use by NASA and the scientific press. In addition, the CGL did the computer animation for "Cosmos", a series of general science programs done for Public Television in the United States by Carl Sagan and shown world-wide. The CGL recently completed the computer animation for "The Mechanical Universe", a series of fifty-two half-hour elementary physics lectures, led by Professor David Goodstein of the California Institute of Technology, and now being shown on Public Television in the US. For this series, the CGL produced more than seven hours of computer animation, averaging approximately eight minutes and thirty seconds of computer animation per half-hour program. Our aim at the CGL is the realistic depiction of physical phenomena; that is, we deal primarily in "science education" rather than in scientific research. Of course, our attempts to render physical events realistically often require the development of new capabilities through research or technology advances, but those advances are not our primary goal.

  9. Recent advancement of turbulent flow measurement techniques

    NASA Technical Reports Server (NTRS)

    Battle, T.; Wang, P.; Cheng, D. Y.

    1974-01-01

    Advancements of the fluctuating density gradient cross-beam laser Schlieren technique, the fluctuating line-reversal temperature measurement, and the development of the two-dimensional drag-sensing probe into a three-dimensional drag-sensing probe are discussed. The three-dimensionality of the instantaneous momentum vector can shed some light on the nature of turbulence, especially in swirling flow. All three measured fluctuating quantities (density, temperature, and momentum) can provide valuable information for theoreticians.

  10. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing and Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study is conducted to assess the role of supercomputers in computational aeroelasticity of aerospace vehicles. The study is mostly based on the responses to a web-based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  11. Advances and trends in computational structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Venneri, S. L.

    1990-01-01

    The major goals of computational structures technology (CST) are outlined, and recent advances in CST are examined. These include computational material modeling, stochastic-based modeling, computational methods for articulated structural dynamics, strategies and numerical algorithms for new computing systems, and multidisciplinary analysis and optimization. The role of CST in the future development of structures technology and the multidisciplinary design of future flight vehicles is addressed, and the future directions of CST research in the prediction of failures of structural components, the solution of large-scale structural problems, and quality assessment and control of numerical simulations are discussed.

  12. America's most computer advanced healthcare facilities.

    PubMed

    1993-02-01

    Healthcare Informatics polled industry experts for nominations for this listing of America's Most Computer-Advanced Healthcare Facilities. Nominations were reviewed for extent of departmental automation, leading-edge applications, advanced point-of-care technologies, and networking communications capabilities. Additional consideration was given to smaller facilities automated beyond "normal expectations." Facility representatives who believe their organizations should be included in our next listing, please contact Healthcare Informatics for a nomination form.

  13. Computational Techniques of Electromagnetic Dosimetry for Humans

    NASA Astrophysics Data System (ADS)

    Hirata, Akimasa; Fujiwara, Osamu

    There has been increasing public concern about the adverse health effects of human exposure to electromagnetic fields. This paper reviews the rationale of the international safety guidelines for human protection against electromagnetic fields and presents computational techniques for conducting dosimetry in anatomically based human body models. Computational examples and remaining problems are also described briefly.
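
    The workhorse numerical technique for this kind of dosimetry is typically the finite-difference time-domain (FDTD) method, from which the specific absorption rate (SAR) is computed once the internal field is known. A one-dimensional sketch of the idea, with a lossy tissue slab in free space (grid, source, and material values are illustrative assumptions of ours, not taken from the paper):

      import numpy as np

      c0, eps0, mu0 = 3.0e8, 8.854e-12, 4e-7 * np.pi
      nx, dx = 400, 1e-3                          # 0.4 m of 1 mm cells
      dt = dx / (2 * c0)                          # satisfies the Courant limit
      steps, f_src = 4000, 1e9                    # run length, 1 GHz source

      # Tissue slab in cells 200-299 with muscle-like values at ~1 GHz.
      eps_r = np.ones(nx);  eps_r[200:300] = 50.0
      sigma = np.zeros(nx); sigma[200:300] = 1.0  # conductivity, S/m
      rho = 1000.0                                # tissue density, kg/m^3

      eps = eps_r * eps0
      ca = (1 - sigma * dt / (2 * eps)) / (1 + sigma * dt / (2 * eps))
      cb = (dt / (eps * dx)) / (1 + sigma * dt / (2 * eps))

      E, H = np.zeros(nx), np.zeros(nx - 1)
      e_sq = np.zeros(nx)                         # accumulates E^2 for SAR

      for n in range(steps):
          H += dt / (mu0 * dx) * (E[1:] - E[:-1])
          E[1:-1] = ca[1:-1] * E[1:-1] + cb[1:-1] * (H[1:] - H[:-1])
          E[50] += np.sin(2 * np.pi * f_src * n * dt)   # soft source
          e_sq += E**2     # (fixed E=0 walls; real codes use absorbing boundaries)

      sar = sigma * (e_sq / steps) / rho          # SAR = sigma*<E^2>/rho
      print(f"peak SAR in slab (arbitrary source units): {sar.max():.3e} W/kg")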

  14. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  15. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  16. Advanced networks and computing in healthcare

    PubMed Central

    Ackerman, Michael

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  17. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short of providing real-time, predictive information for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  18. Recent advances in computer image generation simulation.

    PubMed

    Geltmacher, H E

    1988-11-01

    An explosion in flight simulator technology over the past 10 years is revolutionizing U.S. Air Force (USAF) operational training. The single, most important development has been in computer image generation. However, other significant advances are being made in simulator handling qualities, real-time computation systems, and electro-optical displays. These developments hold great promise for achieving high fidelity combat mission simulation. This article reviews the progress to date and predicts its impact, along with that of new computer science advances such as very high speed integrated circuits (VHSIC), on future USAF aircrew simulator training. Some exciting possibilities are multiship, full-mission simulators at replacement training units, miniaturized unit level mission rehearsal training simulators, onboard embedded training capability, and national scale simulator networking.

  19. Advanced AE Techniques in Composite Materials Research

    NASA Technical Reports Server (NTRS)

    Prosser, William H.

    1996-01-01

    Advanced, waveform-based acoustic emission (AE) techniques have been successfully used to evaluate damage mechanisms in laboratory testing of composite coupons. An example is presented in which the initiation of transverse matrix cracking was monitored. In these tests, broadband, high-fidelity acoustic sensors were used to detect signals, which were then digitized and stored for analysis. Analysis techniques were based on plate-mode wave propagation characteristics. This approach, more recently referred to as Modal AE, provides an enhanced capability to discriminate and eliminate noise signals from those generated by damage mechanisms. This technique also allows much more precise source location than conventional, threshold-crossing arrival-time determination techniques. To apply Modal AE concepts to the interpretation of AE on larger composite specimens or structures, the effects of modal wave propagation over larger distances and through structural complexities must be well characterized and understood. To demonstrate these effects, measurements of the far-field, peak-amplitude attenuation of the extensional and flexural plate-mode components of broadband simulated AE signals in large composite panels are discussed. These measurements demonstrated that the flexural-mode attenuation is dominated by dispersion effects; thus, it is significantly affected by the thickness of the composite plate. Furthermore, the flexural-mode attenuation can be significantly larger than that of the extensional mode even though its peak amplitude consists of much lower frequency components.
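
    The thickness dependence noted above follows directly from classical thin-plate (Kirchhoff) theory, in which the lowest flexural mode satisfies (standard result, our notation)

      D k^4 = \rho h\,\omega^2, \qquad D = \frac{E h^3}{12(1-\nu^2)},

    giving a phase velocity

      c_p = \frac{\omega}{k} = \omega^{1/2}\left(\frac{D}{\rho h}\right)^{1/4} \propto \sqrt{\omega h}.

    Because c_p grows with frequency, a broadband flexural pulse spreads as it propagates and its peak amplitude falls, and the rate of spreading depends on the plate thickness h. The extensional mode, whose low-frequency plate velocity c_e = \sqrt{E/[\rho(1-\nu^2)]} is independent of both frequency and thickness, does not suffer this dispersive loss.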

  20. Advanced flow MRI: emerging techniques and applications.

    PubMed

    Markl, M; Schnell, S; Wu, C; Bollache, E; Jarvis, K; Barker, A J; Robinson, J D; Rigsby, C K

    2016-08-01

    Magnetic resonance imaging (MRI) techniques provide non-invasive and non-ionising methods for the highly accurate anatomical depiction of the heart and vessels throughout the cardiac cycle. In addition, the intrinsic sensitivity of MRI to motion offers the unique ability to acquire spatially registered blood flow simultaneously with the morphological data, within a single measurement. In clinical routine, flow MRI is typically accomplished using methods that resolve two spatial dimensions in individual planes and encode the time-resolved velocity in one principal direction, typically oriented perpendicular to the two-dimensional (2D) section. This review describes recently developed advanced MRI flow techniques, which allow for more comprehensive evaluation of blood flow characteristics, such as real-time flow imaging, 2D multiple-venc phase contrast MRI, four-dimensional (4D) flow MRI, quantification of complex haemodynamic properties, and highly accelerated flow imaging. Emerging techniques and novel applications are explored. In addition, applications of these new techniques for the improved evaluation of cardiovascular (aorta, pulmonary arteries, congenital heart disease, atrial fibrillation, coronary arteries) as well as cerebrovascular disease (intra-cranial arteries and veins) are presented. PMID:26944696

  1. Advanced Bode Plot Techniques for Ultrasonic Transducers

    NASA Astrophysics Data System (ADS)

    DeAngelis, D. A.; Schulze, G. W.

    The Bode plot, displayed as either impedance or admittance versus frequency, is the most basic test used by ultrasonic transducer designers. With simplicity and ease of use, Bode plots are ideal for baseline comparisons such as spacing of parasitic modes or impedance, but quite often the subtleties that manifest as poor process control are hard to interpret or are nonexistent. In-process testing of transducers is time consuming for quantifying statistical aberrations, and assessments made indirectly via the workpiece are difficult. This research investigates the use of advanced Bode plot techniques to compare ultrasonic transducers with known "good" and known "bad" process performance, with the goal of a priori process assessment. These advanced techniques expand from the basic constant-voltage frequency sweep to include constant-current and constant-velocity sweeps interrogated locally on the transducer or tool; they also include up and down directional frequency sweeps to quantify hysteresis effects such as jumping and dropping phenomena. The investigation focuses solely on the common PZT8 piezoelectric material used in welding transducers for semiconductor wire bonding. Several metrics are investigated, such as impedance, displacement/current gain, velocity/current gain, displacement/voltage gain and velocity/voltage gain. The experimental and theoretical research methods include Bode plots, admittance loops, laser vibrometry and coupled-field finite element analysis.
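
    A useful baseline for interpreting such Bode plots is the Butterworth-Van Dyke equivalent circuit: a motional R-L-C branch (the mechanical resonance) in parallel with the clamped capacitance C0. The sketch below computes |Z| versus frequency and the resulting series/parallel resonance pair; the element values are illustrative choices of ours, sized for a nominal 20 kHz welding transducer, not measured PZT8 data:

      import numpy as np

      # Butterworth-Van Dyke model: motional branch (R1, L1, C1) in parallel
      # with the clamped capacitance C0.
      R1, L1, C1, C0 = 20.0, 0.1, 0.633e-9, 10e-9

      f = np.linspace(15e3, 25e3, 20000)
      w = 2 * np.pi * f
      z_mot = R1 + 1j * w * L1 + 1.0 / (1j * w * C1)    # motional impedance
      z_c0 = 1.0 / (1j * w * C0)
      z = z_mot * z_c0 / (z_mot + z_c0)                 # parallel combination

      fs = 1.0 / (2 * np.pi * np.sqrt(L1 * C1))         # series resonance
      fp = fs * np.sqrt(1.0 + C1 / C0)                  # parallel resonance
      k_min, k_max = np.argmin(np.abs(z)), np.argmax(np.abs(z))
      print(f"theory: fs = {fs/1e3:.2f} kHz, fp = {fp/1e3:.2f} kHz")
      print(f"sweep : |Z| min at {f[k_min]/1e3:.2f} kHz, max at {f[k_max]/1e3:.2f} kHz")

    The fs/fp pair from this baseline curve is the anchor against which the constant-current, constant-velocity, and directional-sweep variants described above can be compared.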

  2. New coding technique for computer generated holograms.

    NASA Technical Reports Server (NTRS)

    Haskell, R. E.; Culver, B. C.

    1972-01-01

    A coding technique is developed for recording computer-generated holograms on a computer-controlled CRT in which each resolution cell contains two beam spots of equal size and equal intensity. This provides a binary hologram in which only the positions of the two dots vary from cell to cell. The amplitude associated with each resolution cell is controlled by selectively diffracting unwanted light into a higher diffraction order. The recording of the holograms is fast and simple.
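
    The encoding can be pictured as a two-dot detour-phase scheme: two equal unit-amplitude dots placed at detour phases phi+delta and phi-delta sum, in the first diffraction order, to 2*cos(delta)*exp(i*phi), so the dot separation sets the cell's amplitude while the pair's mean position sets its phase. A toy one-dimensional encoder in this spirit (our reconstruction of the principle, not the authors' exact CRT recording scheme):

      import numpy as np

      def encode_cell(a, phi, cell=64):
          """Place two equal dots in a 1D cell of `cell` positions so the first
          diffraction order carries amplitude `a` (0..1) and phase `phi`.
          Two unit phasors at detour phases phi +/- delta sum to
          2*cos(delta)*exp(1j*phi)."""
          delta = np.arccos(np.clip(a, 0.0, 1.0))     # separation sets amplitude
          row = np.zeros(cell)
          for ph in (phi + delta, phi - delta):
              pos = int(round((ph % (2 * np.pi)) / (2 * np.pi) * cell)) % cell
              row[pos] = 1.0                          # binary: dot present or not
          return row

      # Check: recover the encoded complex value from the dot positions.
      a, phi = 0.7, 1.2
      row = encode_cell(a, phi)
      k = np.arange(row.size)
      first_order = (row * np.exp(1j * 2 * np.pi * k / row.size)).sum() / 2.0
      print(f"target    a={a:.3f}, phi={phi:.3f}")
      print(f"recovered a={abs(first_order):.3f}, phi={np.angle(first_order):.3f}")

    The small residual error in the recovered values comes from quantizing the dot positions to the cell grid; a finer cell reduces it.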

  3. Compression Techniques for Improved Algorithm Computational Performance

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Howell, Patricia A.; Winfree, William P.

    2005-01-01

    Analysis of thermal data requires the processing of large amounts of temporal image data. Processing the data for quantitative information can be time intensive, especially in the field, where large inspection areas yield numerous data sets. By applying a temporal compression technique, improved algorithm performance can be obtained. In this study, analysis techniques are applied to compressed and non-compressed thermal data, and a comparison is made based on computational speed and defect signal-to-noise ratio.
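
    A simple instance of temporal compression is to keep only the leading Fourier coefficients of each pixel's cooling curve and reconstruct before analysis. The sketch below compresses a synthetic flash-thermography sequence about 16:1 this way; the data and the choice of transform are ours, not the authors' specific technique:

      import numpy as np

      rng = np.random.default_rng(1)
      nt, ny, nx = 512, 64, 64                 # frames, image height, width

      # Synthetic data: decaying pixel temperatures plus noise, with a
      # slower-cooling "defect" patch.
      t = np.arange(1, nt + 1)[:, None, None]
      tau = np.full((ny, nx), 40.0); tau[20:30, 20:30] = 80.0
      data = 100.0 * np.exp(-t / tau) + rng.normal(0, 0.2, (nt, ny, nx))

      # Temporal compression: keep the first K real-FFT coefficients per pixel.
      K = 16
      coeffs = np.fft.rfft(data, axis=0)[:K]   # (K, ny, nx)
      recon = np.fft.irfft(np.concatenate(
          [coeffs, np.zeros((nt // 2 + 1 - K, ny, nx), complex)]), n=nt, axis=0)

      err = np.abs(recon - data).mean()
      ratio = data.size / (2 * coeffs.size)    # complex coefficients count double
      print(f"compression ~{ratio:.0f}:1, mean reconstruction error {err:.3f} C")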

  4. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  5. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.
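
    The triplex-to-duplex-to-simplex reconfiguration amounts to majority voting with progressive exclusion of miscomparing channels. A schematic voter in that spirit (our illustration of the concept, not the ARCS implementation):

      def vote(channels, active):
          """Cross-check the outputs of the active computer channels.

          channels: dict of channel name -> output value
          active:   list of channel names still trusted
          Returns (voted_value_or_None, miscomparing_channels)."""
          outputs = [channels[c] for c in active]
          if len(active) == 3:
              voted = max(set(outputs), key=outputs.count)   # majority of three
              bad = [c for c in active if channels[c] != voted]
              return voted, bad
          if len(active) == 2:
              if outputs[0] == outputs[1]:
                  return outputs[0], []
              return None, list(active)   # miscompare: detectable, not correctable
          return outputs[0], []           # simplex: no cross-check remains

      # Triplex operation with one faulted channel: reconfigure to duplex.
      channels = {"A": 42, "B": 42, "C": 7}
      active = ["A", "B", "C"]
      value, bad = vote(channels, active)
      print(f"voted={value}, miscomparing={bad}")
      for c in bad:
          active.remove(c)                # drop the faulted channel
      print(f"now operating {len(active)}-channel:", active)

    A recovery path of the kind the study describes would re-admit a dropped channel once its fault is confirmed transient.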

  6. Advances in nanodiagnostic techniques for microbial agents.

    PubMed

    Syed, Muhammad Ali

    2014-01-15

    Infectious diseases account for millions of illnesses and deaths in both developing and developed countries, with substantial economic loss. The massive increase in world population and international travel has facilitated their spread from one part of the world to other areas, making them one of the most significant global health risks. Furthermore, detection of bioterrorism agents in water, food and environmental samples, as well as in travelers' baggage, is a great challenge of the time for security purposes. Prevention strategies against infectious agents demand rapid and accurate detection and identification of the causative agents with the highest sensitivity, and this capability should be equally available in different parts of the globe. Similarly, rapid and early diagnosis of infectious diseases has always been indispensable for their prompt cure and management, which has stimulated scientists to develop highly sophisticated techniques over the centuries, and the efforts continue unabated. Conventional diagnostic techniques are time consuming, tedious, expensive, less sensitive, and unsuitable for field situations. Nanodiagnostic assays have been promising for early, sensitive, point-of-care and cost-effective detection of microbial agents. There has been explosive research in this area of science in the last two decades, yielding highly fascinating results. This review highlights some of the advancements made in the field of nanotechnology-based assays for microbial detection since 2005, along with providing the basic understanding. PMID:24012709

  7. Inverse lithography technique for advanced CMOS nodes

    NASA Astrophysics Data System (ADS)

    Villaret, Alexandre; Tritchkov, Alexander; Entradas, Jorge; Yesilada, Emek

    2013-04-01

    Resolution Enhancement Techniques (RET) have continuously improved over the last decade, driven by the ever-growing constraints of the lithography process. Despite the large number of RETs applied, some hotspot configurations remain challenging for advanced nodes due to aggressive design rules. Inverse Lithography Technique (ILT) is evaluated here as a substitute for the dense OPC baseline. Indeed, ILT has been known for several years for its near-to-ideal mask quality, while also being potentially more time consuming in terms of OPC run and mask processing. We chose to evaluate Mentor Graphics' ILT engine "pxOPC" on both line and via hotspot configurations. These hotspots were extracted from real 28nm test cases where the dense OPC solution is not satisfactory. For both layer types, the reference OPC consists of a dense OPC engine coupled to a rule-based and/or model-based assist generation method. The same CM1 model is used for the reference and the ILT OPC. ILT quality improvement is presented through Optical Rule Check (ORC) results with various adequate detectors. Several mask manufacturing rule constraints (MRC) are considered for the ILT solution, and their impact on processability is checked after mask processing. A hybrid OPC approach allowing localized ILT usage is presented in order to optimize both quality and runtime. A real mask is prepared and fabricated with this method. Finally, results analyzed on silicon are presented to compare localized ILT to the reference dense OPC.
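
    At its core, ILT poses mask synthesis as an optimization problem: find the mask that minimizes the difference between the simulated wafer image (optical blur followed by a resist threshold) and the target layout, rather than perturbing polygon edges as in conventional OPC. The toy gradient-descent formulation below uses a Gaussian optical model and a sigmoid resist model; it is entirely schematic and is not the pxOPC algorithm:

      import numpy as np

      def gauss_kernel(size=21, sigma=3.0):
          ax = np.arange(size) - size // 2
          g = np.exp(-(ax[:, None]**2 + ax[None, :]**2) / (2 * sigma**2))
          return g / g.sum()

      def blur(img, kern):
          """Circular convolution via FFT (stand-in for the optical model)."""
          pad = np.zeros_like(img)
          k = kern.shape[0]
          pad[:k, :k] = kern
          pad = np.roll(pad, (-(k // 2), -(k // 2)), axis=(0, 1))
          return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))

      sig = lambda x: 1.0 / (1.0 + np.exp(-x))

      target = np.zeros((64, 64)); target[28:36, 10:54] = 1.0   # a single line
      kern = gauss_kernel()
      m = target.copy()                        # start from the design itself
      alpha, thresh, steep = 2.0, 0.5, 25.0

      for it in range(200):
          aerial = blur(m, kern)
          resist = sig(steep * (aerial - thresh))    # thresholded wafer image
          diff = resist - target
          # Chain rule: dL/dm = conv(2*diff*sigmoid_derivative, kernel);
          # the Gaussian kernel is symmetric, so convolution is its own adjoint.
          grad = blur(2 * diff * steep * resist * (1 - resist), kern)
          m = np.clip(m - alpha * grad, 0.0, 1.0)    # keep transmission in [0, 1]

      final = (blur(m, kern) > thresh).astype(float)
      print(f"final image error: {np.abs(final - target).sum():.0f} pixels")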

  8. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  9. Recent advances in DNA sequencing techniques

    NASA Astrophysics Data System (ADS)

    Singh, Rama Shankar

    2013-06-01

    Successful mapping of the draft human genome in 2001 and the more recent mapping of the human microbiome genome in 2012 have relied heavily on the parallel processing of second-generation/Next Generation Sequencing (NGS) DNA machines, at a cost of several million dollars and long computer processing times. These have been mainly biochemical approaches. Here a system analysis approach is used to review these techniques by identifying the requirements, specifications, test methods, error estimates, repeatability, reliability and trends in cost reduction. The first-generation, NGS and third-generation Single Molecule Real Time (SMART) detection sequencing methods are reviewed. Based on the National Human Genome Research Institute (NHGRI) data, the achieved cost reductions of 1.5 times per year from Sep. 2001 to July 2007, 7 times per year from Oct. 2007 to Apr. 2010, and 2.5 times per year from July 2010 to Jan. 2012 are discussed.

  10. Ambient temperature modelling with soft computing techniques

    SciTech Connect

    Bertini, Ilaria; Ceravolo, Francesco; Citterio, Marco; Di Pietra, Biagio; Margiotta, Francesca; Pizzuti, Stefano; Puglisi, Giovanni; De Felice, Matteo

    2010-07-15

    This paper proposes a hybrid approach based on soft computing techniques in order to estimate monthly and daily ambient temperature. Indeed, we combine the back-propagation (BP) algorithm and the simple Genetic Algorithm (GA) in order to effectively train artificial neural networks (ANN) in such a way that the BP algorithm initialises a few individuals of the GA's population. Experiments concerned monthly temperature estimation of unknown places and daily temperature estimation for thermal load computation. Results have shown remarkable improvements in accuracy compared to traditional methods. (author)
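
    The hybrid can be sketched compactly if a linear least-squares model stands in for the full ANN: gradient descent plays the role of BP, and a few GA individuals are pretrained with it before evolution begins, mirroring the seeding strategy described. The data, model, and parameters below are illustrative choices of ours:

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy task: fit y = 3*x1 - 2*x2 + 1 from noisy samples.
      X = np.c_[rng.uniform(-1, 1, (200, 2)), np.ones(200)]   # bias column
      y = X @ np.array([3.0, -2.0, 1.0]) + rng.normal(0, 0.1, 200)

      mse = lambda w: np.mean((X @ w - y) ** 2)

      def backprop_descent(w, steps=50, lr=0.1):
          """Gradient-descent training (the role BP plays for the ANN)."""
          for _ in range(steps):
              w = w - lr * 2 * X.T @ (X @ w - y) / len(y)
          return w

      # GA population of weight vectors; a few individuals are BP-pretrained,
      # as in the paper's hybrid, instead of being purely random.
      pop = rng.normal(0, 1, (30, 3))
      for i in range(3):
          pop[i] = backprop_descent(pop[i])

      for gen in range(40):
          fit = np.array([mse(w) for w in pop])
          parents = pop[np.argsort(fit)[:10]]            # keep the 10 best
          children = parents[rng.integers(0, 10, 20)] + rng.normal(0, 0.05, (20, 3))
          pop = np.vstack([parents, children])           # mutation-only offspring

      best = pop[np.argmin([mse(w) for w in pop])]
      print("best weights:", np.round(best, 2), " mse:", round(mse(best), 4))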

  11. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  12. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  13. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  14. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  15. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    .../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department...

  16. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  17. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  18. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  19. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  20. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  1. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  2. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  3. An introduction to NASA's advanced computing program: Integrated computing systems in advanced multichip modules

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi; Alkalai, Leon

    1996-01-01

    Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.

  4. Advances in procedural techniques--antegrade.

    PubMed

    Wilson, William; Spratt, James C

    2014-05-01

    There have been many technological advances in antegrade CTO PCI, but perhaps the most important has been the evolution of the "hybrid" approach, in which there ideally exists a seamless interplay of antegrade wiring, antegrade dissection re-entry and retrograde approaches as dictated by procedural factors. Antegrade wire escalation with intimal tracking remains the preferred initial strategy in short CTOs without proximal cap ambiguity. More complex CTOs, however, usually require either a retrograde or an antegrade dissection re-entry approach, or both. Antegrade dissection re-entry is well suited to long occlusions where there is a healthy distal vessel and limited "interventional" collaterals. Early use of a dissection re-entry strategy will increase success rates, reduce complications, and minimise radiation exposure, contrast use and procedural times. Antegrade dissection can be achieved with a knuckle wire technique or the CrossBoss catheter, whilst re-entry will be achieved in the most reproducible and reliable fashion by the Stingray balloon/wire. It should be avoided where there is potential for loss of large side branches. It remains to be seen whether use of newer dissection re-entry strategies will be associated with lower restenosis rates compared with the more uncontrolled subintimal tracking strategies such as STAR, and whether stent insertion in the subintimal space is associated with higher rates of late stent malapposition and stent thrombosis. It is to be hoped that the algorithms which have been developed to guide CTO operators allow for a better transfer of knowledge and skills to increase uptake and acceptance of CTO PCI as a whole. PMID:24694104

  5. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-01-01

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches. PMID:27401684
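
    Among the simplest strategies in this family is making an analysis script deterministic and self-documenting: fix every random seed used and record the exact software environment alongside the result. A minimal pattern (our illustration of the category of tooling the article surveys, not code from it):

      import json
      import platform
      import random
      import sys

      import numpy as np

      SEED = 20160711
      random.seed(SEED)                 # seed every source of randomness used
      np.random.seed(SEED)

      # ... the actual analysis would run here ...
      result = float(np.mean(np.random.normal(size=1000)))

      # Record the environment needed to retrace this run.
      provenance = {
          "seed": SEED,
          "python": sys.version,
          "platform": platform.platform(),
          "numpy": np.__version__,
          "result": result,
      }
      with open("run_provenance.json", "w") as fh:
          json.dump(provenance, fh, indent=2)
      print(json.dumps(provenance, indent=2))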

  6. Computer Vision Techniques for Transcatheter Intervention

    PubMed Central

    Zhao, Feng; Roach, Matthew

    2015-01-01

    Minimally invasive transcatheter technologies have demonstrated substantial promise for the diagnosis and the treatment of cardiovascular diseases. For example, transcatheter aortic valve implantation is an alternative to aortic valve replacement for the treatment of severe aortic stenosis, and transcatheter atrial fibrillation ablation is widely used for the treatment and the cure of atrial fibrillation. In addition, catheter-based intravascular ultrasound and optical coherence tomography imaging of coronary arteries provides important information about the coronary lumen, wall, and plaque characteristics. Qualitative and quantitative analysis of these cross-sectional image data will be beneficial to the evaluation and the treatment of coronary artery diseases such as atherosclerosis. In all the phases (preoperative, intraoperative, and postoperative) of the transcatheter intervention procedure, computer vision techniques (e.g., image segmentation and motion tracking) have been widely applied in the field to accomplish tasks like annulus measurement, valve selection, catheter placement control, and vessel centerline extraction. This provides beneficial guidance for clinicians in surgical planning, disease diagnosis, and treatment assessment. In this paper, we present a systematic review of these state-of-the-art methods. We aim to give a comprehensive overview for researchers in the area of computer vision on the subject of transcatheter intervention. Research in medical computing is multi-disciplinary by nature, and hence it is important to understand the application domain, clinical background, and imaging modality, so that methods and quantitative measurements derived from analyzing the imaging data are appropriate and meaningful. We thus provide an overview of the background information on transcatheter intervention procedures, as well as a review of the computer vision techniques and methodologies applied in this area. PMID:27170893

  8. Advanced Scientific Computing Research Network Requirements

    SciTech Connect

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  9. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties as compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  10. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  11. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  12. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  13. 78 FR 64931 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-30

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Computing Advisory Committee (ASCAC). This meeting replaces the cancelled ASCAC meeting that was to be held... Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of Energy;...

  14. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  15. 78 FR 50404 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ] ACTION... Scientific Computing Advisory Committee (ASCAC). The Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  16. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  17. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    SciTech Connect

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  18. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as a result of the grid evolution meeting an information revolution. Due to the penetration of smart grid technologies, the grid is evolving at an unprecedented speed and the information infrastructure is fundamentally improved with a large number of smart meters and sensors that produce amounts of data several orders of magnitude larger than before. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies in developing the algorithms and tools for the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.
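
    A minimal sketch of the pull-data-in/analyze pattern motivated above, assuming Python's multiprocessing and NumPy with synthetic meter data; production smart-grid analytics would run on HPC platforms rather than a single node:

    ```python
    # Minimal parallel meter analytics: per-feeder summary statistics
    # computed across worker processes. A stand-in for the HPC-scale
    # pipelines the chapter motivates, not a production design.
    from multiprocessing import Pool

    import numpy as np

    def summarize(feeder):
        """Return simple situational-awareness stats for one meter stream."""
        name, readings = feeder
        r = np.asarray(readings)
        return name, float(r.mean()), float(r.max())

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        feeders = [(f"feeder-{i}", rng.normal(240, 3, 10_000)) for i in range(8)]
        with Pool(4) as pool:
            for name, mean_v, max_v in pool.map(summarize, feeders):
                print(f"{name}: mean={mean_v:.1f} V, max={max_v:.1f} V")
    ```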

  19. Important advances in technology and unique applications to cardiovascular computed tomography.

    PubMed

    Chaikriangkrai, Kongkiat; Choi, Su Yeon; Nabi, Faisal; Chang, Su Min

    2014-01-01

    For the past decade, multidetector cardiac computed tomography and its main application, coronary computed tomography angiography, have been established as a noninvasive technique for anatomical assessment of coronary arteries. This new era of coronary artery evaluation by coronary computed tomography angiography has arisen from the rapid advancement in computed tomography technology, which has led to massive diagnostic and prognostic clinical studies in various patient populations. This article gives a brief overview of current multidetector cardiac computed tomography systems, developing cardiac computed tomography technologies in both hardware and software fields, innovative radiation exposure reduction measures, multidetector cardiac computed tomography functional studies, and their newer clinical applications beyond coronary computed tomography angiography. PMID:25574342

  20. Hybrid computer techniques for solving partial differential equations

    NASA Technical Reports Server (NTRS)

    Hammond, J. L., Jr.; Odowd, W. M.

    1971-01-01

    The techniques overcome equipment limitations that restrict other computer techniques to solving trivial cases. The use of curve fitting by quadratic interpolation greatly reduces the required digital storage space.
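
    The storage-saving idea translates directly to modern tools: keep three quadratic coefficients per segment instead of every sample, and reconstruct values on demand. A rough sketch, assuming Python with NumPy (the record does not describe the original hybrid-computer implementation):

    ```python
    # Curve compression by piecewise quadratic fits: store 3 coefficients
    # per segment instead of every sample, then evaluate on demand.
    import numpy as np

    x = np.linspace(0.0, 1.0, 1000)
    y = np.sin(2 * np.pi * x)          # stand-in for a stored solution curve

    SEG = 50                           # samples per segment
    coeffs = [np.polyfit(x[i:i + SEG], y[i:i + SEG], 2)
              for i in range(0, len(x), SEG)]  # 3 numbers replace 50 samples

    def evaluate(xq):
        """Reconstruct the curve value at xq from the segment coefficients."""
        i = min(int(xq * len(x)) // SEG, len(coeffs) - 1)
        return np.polyval(coeffs[i], xq)

    print("max abs error:", max(abs(evaluate(v) - np.sin(2 * np.pi * v))
                                for v in np.linspace(0, 1, 200)))
    ```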

  1. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    PubMed Central

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    Recent years have seen massive growth in knowledge of unknown proteins with the advancement of high-throughput microarray technologies. Protein function prediction is the most challenging problem in bioinformatics. Homology-based approaches were traditionally used to predict protein function, but they fail when a new protein is dissimilar to previously characterized proteins. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, across wide areas of application such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers in solving these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and the integration of multiple heterogeneous data are useful for protein function prediction. PMID:25574395
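
    The observation that ensemble classifiers help can be sketched with scikit-learn (assumed here; the synthetic features stand in for sequence-, structure-, or network-derived descriptors, and this is not code from the review):

    ```python
    # Ensemble classification sketch: combine heterogeneous base learners,
    # echoing the review's finding that ensembles aid function prediction.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=500, n_features=40, random_state=0)
    ensemble = VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("rf", RandomForestClassifier(random_state=0)),
                    ("svm", SVC(probability=True, random_state=0))],
        voting="soft",  # average predicted class probabilities
    )
    print("CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
    ```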

  2. Computational intelligence techniques for tactile sensing systems.

    PubMed

    Gastaldo, Paolo; Pinna, Luigi; Seminara, Lucia; Valle, Maurizio; Zunino, Rodolfo

    2014-01-01

    Tactile sensing helps robots interact with humans and objects effectively in real environments. Piezoelectric polymer sensors provide the functional building blocks of the robotic electronic skin, mainly thanks to their flexibility and suitability for detecting dynamic contact events and for recognizing the touch modality. The paper focuses on the ability of tactile sensing systems to support the challenging recognition of certain qualities/modalities of touch. The research applies novel computational intelligence techniques and a tensor-based approach for the classification of touch modalities; its main results consist of a procedure to enhance system generalization ability and an architecture for multi-class recognition applications. An experimental campaign involving 70 participants using three different modalities in touching the upper surface of the sensor array was conducted and confirmed the validity of the approach.

  3. Advances in computational studies of energy materials.

    PubMed

    Catlow, C R A; Guo, Z X; Miskufova, M; Shevlin, S A; Smith, A G H; Sokol, A A; Walsh, A; Wilson, D J; Woodley, S M

    2010-07-28

    We review recent developments and applications of computational modelling techniques in the field of materials for energy technologies including hydrogen production and storage, energy storage and conversion, and light absorption and emission. In addition, we present new work on an Sn2TiO4 photocatalyst containing an Sn(II) lone pair, new interatomic potential models for SrTiO3 and GaN, an exploration of defects in the kesterite/stannite-structured solar cell absorber Cu2ZnSnS4, and report details of the incorporation of hydrogen into Ag2O and Cu2O. Special attention is paid to the modelling of nanostructured systems, including ceria (CeO2, mixed Ce(x)O(y) and Ce2O3) and group 13 sesquioxides. We consider applications based on both interatomic potential and electronic structure methodologies; and we illustrate the increasingly quantitative and predictive nature of modelling in this field. PMID:20566517

  4. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management, in general, tries to organize and make available important know-how, whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software developed at NASA-Ames Research Center (ARC).

  5. Multidirectional mobilities: Advanced measurement techniques and applications

    NASA Astrophysics Data System (ADS)

    Ivarsson, Lars Holger

    Today high noise-and-vibration comfort has become a quality sign of products in sectors such as the automotive industry, aircraft, components, households and manufacturing. Consequently, already in the design phase of products, tools are required to predict the final vibration and noise levels. These tools have to be applicable over a wide frequency range with sufficient accuracy. During recent decades a variety of tools have been developed such as transfer path analysis (TPA), input force estimation, substructuring, coupling by frequency response functions (FRF) and hybrid modelling. While these methods have a well-developed theoretical basis, their application combined with experimental data often suffers from a lack of information concerning rotational DOFs. In order to measure response in all 6 DOFs (including rotation), a sensor has been developed, whose special features are discussed in the thesis. This transducer simplifies the response measurements, although in practice the excitation of moments appears to be more difficult. Several excitation techniques have been developed to enable measurement of multidirectional mobilities. For rapid and simple measurement of the loaded mobility matrix, a MIMO (Multiple Input Multiple Output) technique is used. The technique has been tested and validated on several structures of different complexity. A second technique for measuring the loaded 6-by-6 mobility matrix has been developed. This technique employs a model of the excitation set-up, and with this model the mobility matrix is determined from sequential measurements. Measurements on "real" structures show that both techniques give results of similar quality, and both are recommended for practical use. As a further step, a technique for measuring the unloaded mobilities is presented. It employs the measured loaded mobility matrix in order to calculate compensation forces and moments, which are later applied in order to compensate for the loading of the

  6. Advances in laparoscopic urologic surgery techniques

    PubMed Central

    Abdul-Muhsin, Haidar M.; Humphreys, Mitchell R.

    2016-01-01

    The last two decades witnessed the inception and exponential implementation of key technological advancements in laparoscopic urology. While some of these technologies thrived and became part of daily practice, others are still hindered by major challenges. This review was conducted through a comprehensive literature search in order to highlight some of the most promising technologies in laparoscopic visualization, augmented reality, and insufflation. Additionally, this review will provide an update regarding the current status of single-site and natural orifice surgery in urology. PMID:27134743

  7. Advances in laparoscopic urologic surgery techniques.

    PubMed

    Abdul-Muhsin, Haidar M; Humphreys, Mitchell R

    2016-01-01

    The last two decades witnessed the inception and exponential implementation of key technological advancements in laparoscopic urology. While some of these technologies thrived and became part of daily practice, others are still hindered by major challenges. This review was conducted through a comprehensive literature search in order to highlight some of the most promising technologies in laparoscopic visualization, augmented reality, and insufflation. Additionally, this review will provide an update regarding the current status of single-site and natural orifice surgery in urology. PMID:27134743

  8. Parallel computing techniques for rotorcraft aerodynamics

    NASA Astrophysics Data System (ADS)

    Ekici, Kivanc

    The modification of unsteady three-dimensional Navier-Stokes codes for application on massively parallel and distributed computing environments is investigated. The Euler/Navier-Stokes code TURNS (Transonic Unsteady Rotor Navier-Stokes) was chosen as a test bed because of its wide use by universities and industry. For the efficient implementation of TURNS on parallel computing systems, two algorithmic changes are developed. First, modifications to the implicit operator, Lower-Upper Symmetric Gauss-Seidel (LU-SGS), originally used in TURNS, are performed. Second, application of an inexact Newton method, coupled with a Krylov subspace iterative method (Newton-Krylov method), is carried out. Both techniques have been tried previously for the Euler equations mode of the code. In this work, we have extended the methods to the Navier-Stokes mode. Several new implicit operators were tried because of convergence problems of traditional operators with the high cell aspect ratio (CAR) grids needed for viscous calculations on structured grids. Promising results for both Euler and Navier-Stokes cases are presented for these operators. For the efficient implementation of Newton-Krylov methods in the Navier-Stokes mode of TURNS, efficient preconditioners must be used. The parallel implicit operators used in the previous step are employed as preconditioners and the results are compared. The Message Passing Interface (MPI) protocol has been used because of its portability to various parallel architectures. It should be noted that the proposed methodology is general and can be applied to several other CFD codes (e.g. OVERFLOW).
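
    The Newton-Krylov pattern itself, an inexact Newton outer iteration with a Krylov linear solver inside, can be illustrated on a small nonlinear system. The sketch below uses SciPy's newton_krylov purely as a stand-in; it bears no relation to the TURNS implementation:

    ```python
    # Newton-Krylov illustration on a tiny nonlinear boundary-value problem:
    # an inexact Newton outer iteration with a Krylov linear solver inside.
    import numpy as np
    from scipy.optimize import newton_krylov

    N = 50
    h = 1.0 / (N + 1)

    def residual(u):
        """Discrete -u'' + u**3 = 1 with u(0) = u(1) = 0."""
        upad = np.concatenate(([0.0], u, [0.0]))
        lap = (upad[:-2] - 2 * upad[1:-1] + upad[2:]) / h**2
        return -lap + u**3 - 1.0

    u0 = np.zeros(N)
    u = newton_krylov(residual, u0, f_tol=1e-10)
    print("max residual:", np.abs(residual(u)).max())
    ```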

  9. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between commercial ground computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  10. Major advances in genetic evaluation techniques.

    PubMed

    Powell, R L; Norman, H D

    2006-04-01

    The past quarter-century in genetic evaluation of dairy cattle has been marked by evolution in methodology and computer capacity, expansion in the array of evaluated traits, and globalization. Animal models replaced sire and sire-maternal grandsire models and, more recently, application of Bayesian theory has become standard. Individual test-day observations have been used more effectively in estimation of lactation yield or directly as input to evaluation models. Computer speed and storage are less limiting in choosing procedures. The increased capabilities have supported evaluation of additional traits that affect the net profitability of dairy cows. The importance of traits other than yield has increased, in a few cases due to an antagonistic relationship with yield. National evaluations combined internationally provide evaluations for bulls from all participating countries on each of the national scales, facilitating choices from among many more bulls. Selection within countries has increased inbreeding and the use of similar genetics across countries reduces the previously available outcross population. Concern about inbreeding has prompted changes in evaluation methodology and mating practices, and has promoted interest in crossbreeding. In just the past decade, distribution of genetic evaluations has gone from mailed paper or computer tapes for a limited audience to publicly accessible, request-driven distribution via the Internet. Among the distributed information is a choice of economic indices that combine an increasing array of traits into numbers reflecting breeding goals under different milk-pricing conditions. Considerable progress in genomics and the mapping of the bovine genome have identified markers for some deleterious recessive genes, but broader benefits of marker-assisted selection are still in the future. A possible exception is the proprietary use of DNA testing by semen producers to select among potential progeny test bulls. The collection

  11. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory/university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for the technology development of these concepts and their safety systems.

  12. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  13. Visualization Techniques for Computer Network Defense

    SciTech Connect

    Beaver, Justin M; Steed, Chad A; Patton, Robert M; Cui, Xiaohui; Schultz, Matthew A

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  14. [Advanced online search techniques and dedicated search engines for physicians].

    PubMed

    Nahum, Yoav

    2008-02-01

    In recent years search engines have become an essential tool in the work of physicians. This article will review advanced search techniques from the world of information specialists, as well as some advanced search engine operators that may help physicians improve their online search capabilities, and maximize the yield of their searches. This article also reviews popular dedicated scientific and biomedical literature search engines.

  15. Advanced optical imaging techniques for neurodevelopment.

    PubMed

    Wu, Yicong; Christensen, Ryan; Colón-Ramos, Daniel; Shroff, Hari

    2013-12-01

    Over the past decade, developmental neuroscience has been transformed by the widespread application of confocal and two-photon fluorescence microscopy. Even greater progress is imminent, as recent innovations in microscopy now enable imaging with increased depth, speed, and spatial resolution; reduced phototoxicity; and in some cases without external fluorescent probes. We discuss these new techniques and emphasize their dramatic impact on neurobiology, including the ability to image neurons at depths exceeding 1mm, to observe neurodevelopment noninvasively throughout embryogenesis, and to visualize neuronal processes or structures that were previously too small or too difficult to target with conventional microscopy.

  16. Advanced Optical Imaging Techniques for Neurodevelopment

    PubMed Central

    Wu, Yicong; Christensen, Ryan; Colón-Ramos, Daniel; Shroff, Hari

    2013-01-01

    Over the past decade, developmental neuroscience has been transformed by the widespread application of confocal and two-photon fluorescence microscopy. Even greater progress is imminent, as recent innovations in microscopy now enable imaging with increased depth, speed, and spatial resolution; reduced phototoxicity; and in some cases without external fluorescent probes. We discuss these new techniques and emphasize their dramatic impact on neurobiology, including the ability to image neurons at depths exceeding 1 mm, to observe neurodevelopment noninvasively throughout embryogenesis, and to visualize neuronal processes or structures that were previously too small or too difficult to target with conventional microscopy. PMID:23831260

  17. Advanced ultrasonic techniques for local tumor hyperthermia.

    PubMed

    Lele, P P

    1989-05-01

    Scanned, intensity-modulated, focused ultrasound (SIMFU) presently is the modality of choice for localized, controlled heating of deep as well as superficial tumors noninvasively. With the present SIMFU system, it was possible to heat 88 per cent of deep tumors up to 12 cm in depth and 15 cm in diameter, to 43 degrees C in 3 to 4 minutes. The infiltrative tumor margins could be heated to the desired therapeutic temperature. The temperature outside the treatment field fell off sharply. Excellent objective responses were obtained without local or systemic toxicity. Multiinstitutional clinical trials of local hyperthermia by this promising technique are clearly warranted.

  18. Air pollution monitoring by advanced spectroscopic techniques.

    PubMed

    Hodgeson, J A; McClenny, W A; Hanst, P L

    1973-10-19

    The monitoring requirements related to air pollution are many and varied. The molecules of concern differ greatly in their chemical and physical properties, in the nature of their environment, and in their concentration ranges. Furthermore, the application may have specific requirements such as rapid response time, ultrasensitivity, multipollutant capability, or capability for remote measurements. For these reasons, no single spectroscopic technique appears to offer a panacea for all monitoring needs. Instead we have attempted to demonstrate in the above discussion that, regardless of the difficulty and complexity of the monitoring problems, spectroscopy offers many tools by which such problems may be solved.

  19. Advanced enhancement techniques for digitized images

    NASA Astrophysics Data System (ADS)

    Tom, V. T.; Merenyi, R. C.; Carlotto, M. J.; Heller, W. G.

    Computer image enhancement of digitized X-ray and conventional photographs has been employed to reveal anomalies in aerospace hardware. Signal processing of these images included use of specially-developed filters to sharpen detail without sacrificing radiographic information, application of local contrast stretch and histogram equalization algorithms to display structure in low-contrast areas, and employment of other unique digital processing methods. Edge detection, normally complicated by poor spatial resolution, limited contrast and recording media noise, was performed as a post-processing operation via a difference-of-Gaussians method and a least-squares fitting procedure. In this manner, multi-image signal processing allowed for the precise measurement (to within 0.02 inches, rms) of the Inertial Upper Stage nozzle nosecap motion during a static test firing as well as identification of potential problems in the Solid Rocket Booster parachute deployment.
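
    The difference-of-Gaussians edge detector mentioned above is straightforward to reproduce. A minimal sketch, assuming Python with SciPy; the image and parameters are illustrative rather than the paper's:

    ```python
    # Difference-of-Gaussians edge detection: subtract two blurred copies
    # so that band-pass structure (edges) survives. Parameters illustrative.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    img = np.zeros((64, 64))
    img[16:48, 16:48] = 1.0                  # a bright square as a stand-in image
    img += rng.normal(0.0, 0.05, img.shape)  # recording-media noise

    dog = ndimage.gaussian_filter(img, 1.0) - ndimage.gaussian_filter(img, 2.0)
    edges = np.abs(dog) > 0.1                # threshold the band-pass response
    print("edge pixels:", int(edges.sum()))
    ```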

  20. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication (f4) and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  1. Advanced automated char image analysis techniques

    SciTech Connect

    Tao Wu; Edward Lester; Michael Cloke

    2006-05-15

    Char morphology is an important characteristic when attempting to understand coal behavior and coal burnout. In this study, an augmented algorithm has been proposed to identify char types using image analysis. On the basis of a series of image processing steps, a char image is singled out from the whole image, which then allows the important major features of the char particle to be measured, including size, porosity, and wall thickness. The techniques for automated char image analysis have been tested against char images taken from the ICCP Char Atlas as well as actual char particles derived from pyrolyzed char samples. Thirty different chars were prepared in a drop tube furnace operating at 1300°C, 1% oxygen, and 100 ms from 15 different world coals sieved into two size fractions (53-75 and 106-125 μm). The results from this automated technique are comparable with those from manual analysis, and the additional detail from the automated system has potential use in applications such as combustion modeling systems. Obtaining highly detailed char information with automated methods has traditionally been hampered by the difficulty of automatic recognition of individual char particles. 20 refs., 10 figs., 3 tabs.

  2. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems) embracing the techniques discussed has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.

  3. Advanced Techniques for Simulating the Behavior of Sand

    NASA Astrophysics Data System (ADS)

    Clothier, M.; Bailey, M.

    2009-12-01

    research is to simulate the look and behavior of sand, this work will go beyond simple particle collision. In particular, we can continue to use our parallel algorithms not only on single particles but on particle “clumps” that consist of multiple combined particles. Since sand is typically not spherical in nature, these particle “clumps” help to simulate the coarse nature of sand. In a simulation environment, multiple combined particles could be used to simulate the polygonal and granular nature of sand grains. Thus, a diversity of sand particles can be generated. The interaction between these particles can then be parallelized using GPU hardware. As such, this research will investigate different graphics and physics techniques and determine the tradeoffs in performance and visual quality for sand simulation. An enhanced sand model through the use of high performance computing and GPUs has great potential to impact research for both earth and space scientists. Interaction with JPL has provided an opportunity for us to refine our simulation techniques that can ultimately be used for their vehicle simulator. As an added benefit of this work, advancements in simulating sand can also benefit scientists here on earth, especially in regard to understanding landslides and debris flows.

  4. Computer Organizational Techniques Used by Office Personnel.

    ERIC Educational Resources Information Center

    Alexander, Melody

    1995-01-01

    According to survey responses from 404 of 532 office personnel, 81.7% enjoy working with computers; the majority save files on their hard drives, use disk labels and storage files, do not use subdirectories or compress data, and do not make backups of floppy disks. Those with higher degrees, more computer experience, and more daily computer use…

  5. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  6. Application of computational fluid dynamics techniques to blood pumps.

    PubMed

    Sukumar, R; Athavale, M M; Makhijani, V B; Przekwas, A J

    1996-06-01

    Present-day computational fluid dynamics (CFD) techniques can be used to analyze the behavior of fluid flow in a variety of pumps. CFD can be a powerful tool during the design stage for rapid virtual prototyping of different designs, analyzing performance parameters, and making design improvements. Computational flow solutions provide information such as the location and size of stagnation zones and the local shear rate. These parameters can be correlated to the extent of hemolysis and thrombus formation and are critical to the success of a blood pump. CFD-ACE, an advanced commercial CFD code developed by CFD Research Corporation, has been applied to fluid flows in rotary machines, such as axial flow pumps and inducers. Preprocessing and postprocessing tools for efficient grid generation and advanced graphical flow visualization are integrated seamlessly with CFD-ACE. The code has structured multiblock grid capability, non-Newtonian fluid treatment, a variety of turbulence models, and an Eulerian-Lagrangian particle tracking model. CFD-ACE has been used successfully to study the flow characteristics in an axial flow blood pump. An unstructured flow solver that greatly automates the process of grid generation and speeds up the flow simulation is under development. PMID:8817950

  7. Continuation of advanced crew procedures development techniques

    NASA Technical Reports Server (NTRS)

    Arbet, J. D.; Benbow, R. L.; Evans, M. E.; Mangiaracina, A. A.; Mcgavern, J. L.; Spangler, M. C.; Tatum, I. C.

    1976-01-01

    An operational computer program, the Procedures and Performance Program (PPP), which operates in conjunction with the Phase I Shuttle Procedures Simulator to provide a procedures recording and crew/vehicle performance monitoring capability, was developed. A technical synopsis of each task resulting in the development of the Procedures and Performance Program is provided. Conclusions and recommendations for actions leading to improvements in the production of crew procedures and in crew training support are included. The PPP provides real-time CRT displays and post-run hardcopy output of procedures, difference procedures, performance data, parametric analysis data, and training script/training status data. During post-run, the program is designed to support evaluation through the reconstruction of displays to any point in time. A permanent record of the simulation exercise can be obtained via hardcopy output of the display data and via transfer to the Generalized Documentation Processor (GDP). Reference procedures data may be transferred from the GDP to the PPP. An interface is provided with the all-digital trajectory program, the Space Vehicle Dynamics Simulator (SVDS), to support initial procedures timeline development.

  8. Advanced Techniques for Root Cause Analysis

    2000-09-19

    Five items make up this package; they can also be used individually. The Chronological Safety Management Template utilizes a linear adaptation of the Integrated Safety Management System laid out in the form of a template that greatly enhances the ability of the analyst to perform the first step of any investigation, which is to gather all pertinent facts and identify causal factors. The Problem Analysis Tree is a simple three (3) level problem analysis tree which is easier for organizations outside of WSRC to use. Another part is the Systemic Root Cause Tree. One of the most basic and unique features of Expanded Root Cause Analysis is the Systemic Root Cause portion of the Expanded Root Cause Pyramid. The Systemic Root Causes are even more basic than the Programmatic Root Causes and represent Root Causes that cut across multiple (if not all) programs in an organization. The Systemic Root Cause portion contains 51 causes embedded at the bottom level of a three level Systemic Root Cause Tree that is divided into logical, organizationally based categories to assist the analyst. The Computer Aided Root Cause Analysis allows the analyst at each level of the Pyramid to a) obtain a brief description of the cause that is being considered, b) record a decision that the item is applicable, c) proceed to the next level of the Pyramid to see only those items at the next level of the tree that are relevant to the particular cause that has been chosen, and d) at the end of the process automatically print out a summary report of the incident, the causal factors as they relate to the safety management system, and the probable causes, apparent causes, Programmatic Root Causes, and Systemic Root Causes for each causal factor, along with the associated corrective actions.

  9. Laparoscopic ureteral reimplantation: a simplified dome advancement technique.

    PubMed

    Lima, Guilherme C; Rais-Bahrami, Soroush; Link, Richard E; Kavoussi, Louis R

    2005-12-01

    Laparoscopic Boari flap reimplantation has been used to treat long distal ureteral strictures. That technique requires extensive bladder mobilization and complex intracorporeal suturing. This report demonstrates a novel laparoscopic bladder dome advancement approach for ureteral reimplantation, a technique that obviates the need for bladder pedicle dissection and simplifies the required suturing.

  10. Evaluation of Advanced Retrieval Techniques in an Experimental Online Catalog.

    ERIC Educational Resources Information Center

    Larson, Ray R.

    1992-01-01

    Discusses subject searching problems in online library catalogs; explains advanced information retrieval (IR) techniques; and describes experiments conducted on a test collection database, CHESHIRE (California Hybrid Extended SMART for Hypertext and Information Retrieval Experimentation), which was created to evaluate IR techniques in online…

  11. New Information Dispersal Techniques for Trustworthy Computing

    ERIC Educational Resources Information Center

    Parakh, Abhishek

    2011-01-01

    Information dispersal algorithms (IDA) are used for distributed data storage because they simultaneously provide security, reliability and space efficiency, constituting a trustworthy computing framework for many critical applications, such as cloud computing, in the information society. In the most general sense, this is achieved by dividing data…

  12. Cloud Computing Techniques for Space Mission Design

    NASA Technical Reports Server (NTRS)

    Arrieta, Juan; Senent, Juan

    2014-01-01

    The overarching objective of space mission design is to tackle complex problems, producing better results faster. In developing the methods and tools to fulfill this objective, the user interacts with the different layers of a computing system.

  13. Advances in gamma titanium aluminides and their manufacturing techniques

    NASA Astrophysics Data System (ADS)

    Kothari, Kunal; Radhakrishnan, Ramachandran; Wereley, Norman M.

    2012-11-01

    Gamma titanium aluminides display attractive properties for high temperature applications. For over a decade beginning in the 1990s, the attractive properties of titanium aluminides were outweighed by difficulties encountered in processing and machining at room temperature. But advances in manufacturing technologies, a deeper understanding of titanium aluminide microstructure and deformation mechanisms, and advances in micro-alloying have led to the production of gamma titanium aluminide sheets. An in-depth review of key advances in gamma titanium aluminides is presented, including microstructure, deformation mechanisms, and alloy development. Traditional manufacturing techniques such as ingot metallurgy and investment casting are reviewed and advances via powder metallurgy based manufacturing techniques are discussed. Finally, manufacturing challenges facing gamma titanium aluminides, as well as avenues to overcome them, are discussed.

  14. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to the equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  15. 75 FR 44015 - Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... COMMISSION Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing... importation of certain semiconductor products made by advanced lithography techniques and products containing... certain semiconductor products made by advanced lithography techniques or products containing same...

  16. Advanced liner-cooling techniques for gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Norgren, C. T.; Riddlebaugh, S. M.

    1985-01-01

    Component research for advanced small gas turbine engines is currently underway at the NASA Lewis Research Center. As part of this program, a basic reverse-flow combustor geometry was maintained while different advanced liner wall cooling techniques were investigated. Performance and liner cooling effectiveness of the experimental combustor configuration featuring counter-flow film-cooled panels are presented and compared with those of two previously reported combustors featuring splash film-cooled liner walls and transpiration-cooled liner walls (Lamilloy).

  17. Advanced regenerative-cooling techniques for future space transportation systems

    NASA Technical Reports Server (NTRS)

    Wagner, W. R.; Shoji, J. M.

    1975-01-01

    A review of regenerative-cooling techniques applicable to advanced planned engine designs for space booster and orbit transportation systems has established the status of the key elements of this cooling mode. This work is presented in terms of gas-side, coolant-side, and wall-conduction heat transfer, and chamber life fatigue margin considerations. Preliminary heat transfer and trade analyses are described, performed using developed techniques combining channel wall construction with advanced, high-strength, high-thermal-conductivity materials (NARloy-Z or Zr-Cu alloys) in high heat flux regions, combined with lightweight steel tubular nozzle wall construction. Advanced cooling techniques such as oxygen cooling and dual-mode hydrocarbon/hydrogen fuel operation, and their limitations, are indicated for the regenerative cooling approach.

  18. Advances in Monte Carlo computer simulation

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2011-03-01

    Since the invention of the Metropolis method in 1953, Monte Carlo methods have been shown to provide an efficient, practical approach to the calculation of physical properties in a wide variety of systems. In this talk, I will discuss some of the advances in the MC simulation of thermodynamic systems, with an emphasis on optimization to obtain a maximum of useful information.
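
    For orientation, the Metropolis method reduces, for a small two-dimensional Ising model, to a few lines. A minimal sketch, assuming Python with NumPy; the lattice size and temperature are illustrative:

    ```python
    # Metropolis Monte Carlo for a small 2D Ising model: propose single
    # spin flips and accept with probability min(1, exp(-dE / kT)).
    import numpy as np

    rng = np.random.default_rng(3)
    L, T = 16, 2.0                    # lattice size, temperature (J = kB = 1)
    spins = rng.choice([-1, 1], size=(L, L))

    for _ in range(200_000):
        i, j = rng.integers(0, L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb   # energy change of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    print("magnetization per spin:", spins.mean())
    ```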

  19. Bi-maxillary advancement surgery: Technique, indications and results.

    PubMed

    Olivi, Pierre; Garcia, Claude

    2014-06-01

    Esthetic analysis of the face in some patients presenting a dental Class II can reveal the need for maxillo-mandibular advancement surgery. In these cases, mandibular advancement alone would provide a result which was satisfactory from the occlusal viewpoint but esthetically displeasing. Using bi-maxillary advancement, the impact of nasal volume is reduced and the nasolabial relationship is corrected. The sub-mandibular length is increased, thus creating a better-defined cervico-mental angle. This treatment technique involving a prior mandibular procedure has the advantage of restoring patients' dental occlusion while optimizing their facial esthetics.

  20. Techniques for Managing a Computer Classroom.

    ERIC Educational Resources Information Center

    Sedran, Mary Ann

    1985-01-01

    Some techniques for managing the classroom and teaching programing that have worked well are described. Hardware placement and use, classroom management, instructional recommendations, and programing ideas are each discussed. (MNS)

  1. Techniques of networking in the computer world.

    PubMed

    Armstrong, M L

    1985-09-01

    Networks can play an important role for nurses in user-to-user communication because they can be used both within and outside the health care delivery system. The choices include an information exchange, which can be an effective strategy for sharing personal concerns, problems, and achievements about the computer; commercial data bases with their vast sources of information and research data; or local area networks, effective in an office or campus setting. All of these networks can put worlds of information and services just a few words or keyboard strokes away, because they offer, outside of your own computer, a whole new dimension of retrieval, storage, reference, and communication capabilities. These networks can significantly enhance computing potential by providing an overall expansion of information.

  2. Measuring Speed Using a Computer--Several Techniques.

    ERIC Educational Resources Information Center

    Pearce, Jon M.

    1988-01-01

    Introduces three different techniques to facilitate the measurement of speed and the associated kinematics and dynamics using a computer. Discusses sensing techniques using optical or ultrasonic sensors, interfacing with a computer, software routines for the interfaces, and other applications. Provides circuit diagrams, pictures, and a program to…
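
    Whatever the sensor, the underlying computation is the same: speed as known distance over measured time. A trivial sketch in Python; the gate spacing and timestamps are hypothetical:

    ```python
    # Two-gate speed measurement: an object breaks two sensor beams a known
    # distance apart; speed is the spacing divided by the time difference.
    GATE_SPACING_M = 0.25          # hypothetical distance between sensors (m)

    def speed_from_gates(t_first, t_second):
        """Return speed in m/s given the two beam-break timestamps (s)."""
        dt = t_second - t_first
        if dt <= 0:
            raise ValueError("second gate must trigger after the first")
        return GATE_SPACING_M / dt

    print(f"{speed_from_gates(0.000, 0.125):.2f} m/s")  # -> 2.00 m/s
    ```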

  3. Space data systems: Advanced flight computers

    NASA Technical Reports Server (NTRS)

    Benz, Harry F.

    1991-01-01

    The technical objectives are to develop high-performance, space-qualifiable, onboard computing, storage, and networking technologies. The topics are presented in viewgraph form and include the following: technology challenges; state-of-the-art assessment; program description; relationship to external programs; and cooperation and coordination effort.

  4. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  5. Computer Science Techniques Applied to Parallel Atomistic Simulation

    NASA Astrophysics Data System (ADS)

    Nakano, Aiichiro

    1998-03-01

    Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: (i) space-time multiresolution schemes; (ii) a fuzzy-clustering approach to hierarchical dynamics; (iii) wavelet-based adaptive curvilinear-coordinate load balancing; (iv) a multilevel preconditioned conjugate gradient method; and (v) space-filling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.
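
    For orientation, the innermost kernel that such machinery accelerates is a time-integration step. The toy velocity-Verlet sketch below integrates a single Lennard-Jones pair, assuming Python with NumPy; it captures none of the parallel multiresolution techniques that are the paper's contribution:

    ```python
    # Toy molecular dynamics: velocity-Verlet integration of one
    # Lennard-Jones pair (eps = sig = mass = 1), purely illustrative.
    import numpy as np

    def lj_force(r_vec):
        """Lennard-Jones force on particle 0 from particle 1."""
        r2 = float(np.dot(r_vec, r_vec))
        inv6 = 1.0 / r2**3
        return 24.0 * inv6 * (2.0 * inv6 - 1.0) / r2 * r_vec

    dt = 1e-3
    pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])
    vel = np.zeros_like(pos)
    f = lj_force(pos[0] - pos[1])

    for _ in range(1000):                 # velocity-Verlet loop
        vel[0] += 0.5 * dt * f; vel[1] -= 0.5 * dt * f
        pos += dt * vel
        f = lj_force(pos[0] - pos[1])
        vel[0] += 0.5 * dt * f; vel[1] -= 0.5 * dt * f
    print("separation after run:", np.linalg.norm(pos[0] - pos[1]))
    ```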

  6. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  7. Computational Techniques in Radio Neutrino Event Reconstruction

    NASA Astrophysics Data System (ADS)

    Beydler, M.; ARA Collaboration

    2016-03-01

    The Askaryan Radio Array (ARA) is a high-energy cosmic neutrino detector constructed with stations of radio antennas buried in the ice at the South Pole. Event reconstruction relies on the analysis of the arrival times of the transient radio signals generated by neutrinos interacting within a few kilometers of the detector. Because of its depth dependence, the index of refraction in the ice complicates the interferometric directional reconstruction of possible neutrino events. Currently, there is an ongoing endeavor to enhance the programs used for the time-consuming computations of the curved paths of the transient wave signals in the ice as well as the interferometric beamforming. We have implemented a fast, multi-dimensional spline table lookup of the wave arrival times in order to enable raytrace-based directional reconstructions. Additionally, we have applied parallel computing across multiple Graphics Processing Units (GPUs) in order to perform the beamforming calculations quickly.
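
    As an illustration of the table-lookup idea (an editorial sketch; the collaboration's actual tables and interpolation order are not reproduced here), a fast arrival-time lookup over a hypothetical (distance, depth) grid can be built with SciPy:

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Placeholder grid of raytraced arrival times; a real table would be
        # filled by tracing curved rays through the depth-dependent ice.
        r = np.linspace(0.0, 5000.0, 200)    # horizontal distance to antenna (m)
        z = np.linspace(-2800.0, 0.0, 150)   # source depth (m)
        table = np.zeros((200, 150))         # arrival times in seconds

        lookup = RegularGridInterpolator((r, z), table)
        t_arrival = lookup([[1250.0, -900.0]])[0]  # interpolated arrival time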

  8. Advances in Computationally Modeling Human Oral Bioavailability

    PubMed Central

    Wang, Junmei; Hou, Tingjun

    2015-01-01

    Although significant progress has been made in experimental high throughput screening (HTS) of ADME (absorption, distribution, metabolism, excretion) and pharmacokinetic properties, ADME and Toxicity (ADME-Tox) in silico modeling is still indispensable in drug discovery, as it can guide us to wisely select drug candidates prior to expensive ADME screenings and clinical trials. Compared to other ADME-Tox properties, human oral bioavailability (HOBA) is particularly important but extremely difficult to predict. In this paper, the advances in human oral bioavailability modeling will be reviewed. Moreover, our insights into how to construct more accurate and reliable HOBA QSAR and classification models will also be discussed. PMID:25582307

  9. Hybrid inverse lithography techniques for advanced hierarchical memories

    NASA Astrophysics Data System (ADS)

    Xiao, Guangming; Hooker, Kevin; Irby, Dave; Zhang, Yunqiang; Ward, Brian; Cecil, Tom; Hall, Brett; Lee, Mindy; Kim, Dave; Lucas, Kevin

    2014-03-01

    Traditional segment-based model-based OPC methods have been the mainstream mask layout optimization techniques in volume production for memory and embedded memory devices for many device generations. These techniques have been continually optimized over time to meet the ever increasing difficulties of memory and memory periphery patterning. There are a range of difficult issues for patterning embedded memories successfully. These difficulties include the need for a very high level of symmetry and consistency (both within memory cells themselves and between cells) due to circuit effects such as noise margin requirements in SRAMs. Memory cells and access structures consume a large percentage of area in embedded devices so there is a very high return from shrinking the cell area as much as possible. This aggressive scaling leads to very difficult resolution, 2D CD control and process window requirements. Additionally, the range of interactions between mask synthesis corrections of neighboring areas can extend well beyond the size of the memory cell, making it difficult to fully take advantage of the inherent designed cell hierarchy in mask pattern optimization. This is especially true for non-traditional (i.e., less dependent on geometric rule) OPC/RET methods such as inverse lithography techniques (ILT) which inherently have more model-based decisions in their optimizations. New inverse methods such as model-based SRAF placement and ILT are, however, well known to have considerable benefits in finding flexible mask pattern solutions to improve process window, improve 2D CD control, and improve resolution in ultra-dense memory patterns. They also are known to reduce recipe complexity and provide native MRC compliant mask pattern solutions. Unfortunately, ILT is also known to be several times slower than traditional OPC methods due to the increased computational lithographic optimizations it performs. In this paper, we describe and present results for a methodology to

  10. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
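
    As an illustration of the HU-window bookkeeping described above (the window boundaries below are assumed for illustration and are not taken from the studies), muscle composition can be reported as the fraction of segmented voxels falling in each window:

        import numpy as np

        # Illustrative HU windows only; the studies' exact ranges may differ.
        HU_WINDOWS = {
            "fat": (-200, -10),
            "connective_or_atrophic": (-9, 40),
            "normal_muscle": (41, 200),
        }

        def composition(hu_volume, muscle_mask):
            # Fraction of the segmented muscle volume in each HU window.
            vals = hu_volume[muscle_mask]
            return {name: float(((vals >= lo) & (vals <= hi)).sum()) / vals.size
                    for name, (lo, hi) in HU_WINDOWS.items()}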

  11. Advanced Marketing Core Curriculum. Test Items and Assessment Techniques.

    ERIC Educational Resources Information Center

    Smith, Clifton L.; And Others

    This document contains duties and tasks, multiple-choice test items, and other assessment techniques for Missouri's advanced marketing core curriculum. The core curriculum begins with a list of 13 suggested textbook resources. Next, nine duties with their associated tasks are given. Under each task appears one or more citations to appropriate…

  12. Innovative modelling techniques in computer vision

    NASA Astrophysics Data System (ADS)

    Ardizzone, Edoardo; Chella, Antonio

    The paper is concerned with two of the main research activities currently carried out at the Computer Science and Artificial Intelligence lab of DIE. The first part deals with hybrid artificial vision models, intended to provide object recognition and classification capabilities to an autonomous intelligent system. In this framework, a system recovering 3-D shape information from grey-level images of a scene, building a geometric representation of the scene in terms of superquadrics at the geometric level, and reasoning about the scene at the symbolic level is described. In the second part, attention is focused on automatic indexing of image databases. JACOB, a prototype system allowing for the automatic extraction from images of salient features like colour and texture, and for content-based browsing and querying in image and video databases, is briefly described.

  13. Comparison of interfinger connection matrix computation techniques.

    PubMed

    Martin, Joel R; Terekhov, Alexander V; Latash, Mark L; Zatsiorsky, Vladimir M

    2013-10-01

    A hypothesis was proposed that the central nervous system controls force production by the fingers through hypothetical neural commands. The neural commands are scaled between 0 and 1, indicating no intentional force production or maximal voluntary contraction (MVC) force production, respectively. A matrix of interfinger connections transforms neural commands into finger forces. Two methods have been proposed to compute the interfinger connection matrix. The first method uses only single-finger MVC trials and multiplies the interfinger connection matrix by a gain factor. The second method uses a neural network model based on experimental data. The performance of the two methods was compared on the MVC data and on a data set of submaximal forces, collected over a range of total forces and moments of force. The methods were compared in terms of (1) ability to predict finger forces, (2) accuracy of neural command reconstruction, and (3) preserved planarity of force data for the submaximal force production task. Both methods did a reasonable job of predicting the total force in multifinger MVC trials; however, the neural network model performed better with regard to all other criteria. Overall, the results indicate that the neural network method is preferable for modeling multifinger interaction. PMID:23183029
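
    As an illustration of the underlying linear model (the matrix entries and gain below are invented for illustration), finger forces follow from the interfinger connection matrix, scaled by a gain factor, applied to the neural command vector:

        import numpy as np

        # Hypothetical 4x4 interfinger connection matrix; off-diagonal entries
        # model enslaving, i.e. unintended force in non-commanded fingers.
        W = np.array([
            [1.00, 0.15, 0.05, 0.02],
            [0.15, 1.00, 0.20, 0.05],
            [0.05, 0.20, 1.00, 0.25],
            [0.02, 0.05, 0.25, 1.00],
        ])
        gain = 30.0                         # assumed scaling to MVC-level force (N)
        c = np.array([1.0, 0.0, 0.0, 0.0])  # commands in [0, 1]: index finger only
        finger_forces = gain * (W @ c)      # predicted forces of all four fingers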

  14. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  15. A new technique for fast dynamic focusing law computing

    NASA Astrophysics Data System (ADS)

    Fritsch, C.; Cruza, J. F.; Brizuela, J.; Camacho, J.; Moreno, J. M.

    2012-05-01

    Dynamic focusing requires computing the individual delays for every element and every focus in the image. This is an easy and relatively fast task if the inspected medium is homogeneous. Nevertheless, difficulties arise in the presence of interfaces (e.g., wedges, immersion): refraction effects require solving Snell's law for every focus and element to find the fastest ray entry point in the interface. The process is easy but takes a long time. This work presents a new technique to compute the focusing delays for an equivalent virtual array that operates in the second medium only, thus avoiding any interface. It is nearly as fast as computing the focal laws in the homogeneous case and an order of magnitude faster than methods based on Snell's law or Fermat's principle. Furthermore, the technique is completely general and can be applied to any equipment having dynamic focusing capabilities. In fact, the technique is especially well suited for real-time focal law computing hardware.
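
    As an illustration of the slow baseline the paper improves on (an editorial sketch under an assumed flat interface at y = 0, not the authors' virtual-array method), the fastest-ray delay can be found by a brute-force Fermat search over interface entry points:

        import numpy as np

        def interface_travel_time(elem, focus, c1, c2, n=2000):
            # elem = (x, y) with y > 0 in medium 1 (speed c1);
            # focus = (x, y) with y < 0 in medium 2 (speed c2).
            x = np.linspace(min(elem[0], focus[0]) - 50.0,
                            max(elem[0], focus[0]) + 50.0, n)
            t = (np.hypot(x - elem[0], elem[1]) / c1 +   # element -> interface
                 np.hypot(x - focus[0], focus[1]) / c2)  # interface -> focus
            return t.min()   # Fermat: the ray takes the fastest path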

  16. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    NASA Technical Reports Server (NTRS)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI), require increasingly dense high power electronics. Enabling these higher power densities while maintaining or even improving hardware reliability requires integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  17. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and "smart" wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore

  18. An Advanced Time Averaging Modelling Technique for Power Electronic Circuits

    NASA Astrophysics Data System (ADS)

    Jankuloski, Goce

    For stable and efficient performance of power converters, a good mathematical model is needed. This thesis presents a new modelling technique for DC/DC and DC/AC Pulse Width Modulated (PWM) converters. The new model is more accurate than the existing modelling techniques such as State Space Averaging (SSA) and Discrete Time Modelling. Unlike the SSA model, the new modelling technique, the Advanced Time Averaging Model (ATAM) includes the averaging dynamics of the converter's output. In addition to offering enhanced model accuracy, application of linearization techniques to the ATAM enables the use of conventional linear control design tools. A controller design application demonstrates that a controller designed based on the ATAM outperforms one designed using the ubiquitous SSA model. Unlike the SSA model, ATAM for DC/AC augments the system's dynamics with the dynamics needed for subcycle fundamental contribution (SFC) calculation. This allows for controller design that is based on an exact model.
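
    The record does not reproduce the ATAM equations. For orientation only, the classical state space averaging baseline that the thesis improves on replaces the two switched circuit states (A1, B1) and (A2, B2) with a duty-ratio-weighted average, where d is the duty ratio:

        $$\dot{\bar{x}}(t) = \left[\, d\,A_1 + (1-d)\,A_2 \,\right]\bar{x}(t) + \left[\, d\,B_1 + (1-d)\,B_2 \,\right]u(t)$$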

  19. Technology development of fabrication techniques for advanced solar dynamic concentrators

    NASA Technical Reports Server (NTRS)

    Richter, Scott W.

    1991-01-01

    The objective of the advanced concentrator program is to develop the technology that will lead to lightweight, highly reflective, accurate, scaleable, and long lived space solar dynamic concentrators. The advanced concentrator program encompasses new and innovative concepts, fabrication techniques, materials selection, and simulated space environmental testing. Fabrication techniques include methods of fabricating the substrates and coating substrate surfaces to produce high-quality optical surfaces, acceptable for further coating with vapor deposited optical films. The selected materials to obtain a high quality optical surface include microsheet glass and Eccocoat EP-3 epoxy, with DC-93-500 selected as a candidate silicone adhesive and levelizing layer. The following procedures are defined: cutting, cleaning, forming, and bonding microsheet glass. Procedures are also defined for surface cleaning, and EP-3 epoxy application. The results and analyses from atomic oxygen and thermal cycling tests are used to determine the effects of orbital conditions in a space environment.

  1. Advanced Morphological and Functional Magnetic Resonance Techniques in Glaucoma

    PubMed Central

    Mastropasqua, Rodolfo; Agnifili, Luca; Mattei, Peter A.; Caulo, Massimo; Fasanella, Vincenzo; Navarra, Riccardo; Mastropasqua, Leonardo; Marchini, Giorgio

    2015-01-01

    Glaucoma is a multifactorial disease that is the leading cause of irreversible blindness. Recent data documented that glaucoma is not limited to the retinal ganglion cells but that it also extends to the posterior visual pathway. The diagnosis is based on the presence of signs of glaucomatous optic neuropathy and consistent functional visual field alterations. Unfortunately, these functional alterations often become evident when a significant amount of the nerve fibers that compose the optic nerve has been irreversibly lost. Advanced morphological and functional magnetic resonance (MR) techniques (morphometry, diffusion tensor imaging, arterial spin labeling, and functional connectivity) may provide a means for observing modifications induced by this fiber loss, within the optic nerve and the visual cortex, at an earlier stage. The aim of this systematic review was to determine if the use of these advanced MR techniques could offer the possibility of diagnosing glaucoma at an earlier stage than that currently possible. PMID:26167474

  2. Three-dimensional hybrid grid generation using advancing front techniques

    NASA Technical Reports Server (NTRS)

    Steinbrenner, John P.; Noack, Ralph W.

    1995-01-01

    A new 3-dimensional hybrid grid generation technique has been developed, based on ideas of advancing fronts for both structured and unstructured grids. In this approach, structured grids are first generated independently around individual components of the geometry. Fronts are initialized on these structured grids and advanced outward so that new cells are extracted directly from the structured grids. Employing typical advancing front techniques, cells are rejected if they intersect the existing front or fail other criteria. When no more viable structured cells exist, further cells are advanced in an unstructured manner to close off the overall domain, resulting in a grid of 'hybrid' form. There are two primary advantages to the hybrid formulation. First, generating blocks with limited regard to topology eliminates the bottleneck encountered when a multiple-block system is used to fully encapsulate a domain. Individual blocks may be generated free of external constraints, which will significantly reduce the generation time. Secondly, grid points near the body (presumably with high aspect ratio) will still maintain a structured (non-triangular or tetrahedral) character, thereby maximizing grid quality and solution accuracy near the surface.

  3. Use of advanced computers for aerodynamic flow simulation

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F.

    1980-01-01

    The current and projected use of advanced computers for large-scale aerodynamic flow simulation applied to engineering design and research is discussed. The design use of mature codes run on conventional, serial computers is compared with the fluid research use of new codes run on parallel and vector computers. The role of flow simulations in design is illustrated by the application of a three dimensional, inviscid, transonic code to the Sabreliner 60 wing redesign. Research computations that include a more complete description of the fluid physics by use of Reynolds averaged Navier-Stokes and large-eddy simulation formulations are also presented. Results of studies for a numerical aerodynamic simulation facility are used to project the feasibility of design applications employing these more advanced three dimensional viscous flow simulations.

  4. Applying Empirical and Computer Technique in Teaching Undergraduate Sociology

    ERIC Educational Resources Information Center

    O'Kane, James M.

    1976-01-01

    A 2-semester undergraduate sociology course in empirical techniques and computer analysis is described which permits the student maximum freedom in his choice of a research problem while encouraging him to use both a statistical design and a computer analysis to test his hypotheses. (JT)

  5. Chaste: using agile programming techniques to develop computational biology software.

    PubMed

    Pitt-Francis, Joe; Bernabeu, Miguel O; Cooper, Jonathan; Garny, Alan; Momtahan, Lee; Osborne, James; Pathmanathan, Pras; Rodriguez, Blanca; Whiteley, Jonathan P; Gavaghan, David J

    2008-09-13

    Cardiac modelling is the area of physiome modelling where the available simulation software is perhaps most mature, and it therefore provides an excellent starting point for considering the software requirements for the wider physiome community. In this paper, we will begin by introducing some of the most advanced existing software packages for simulating cardiac electrical activity. We consider the software development methods used in producing codes of this type, and discuss their use of numerical algorithms, relative computational efficiency, usability, robustness and extensibility. We then go on to describe a class of software development methodologies known as test-driven agile methods and argue that such methods are more suitable for scientific software development than the traditional academic approaches. A case study is a project of our own, Cancer, Heart and Soft Tissue Environment, which is a library of computational biology software that began as an experiment in the use of agile programming methods. We present our experiences with a review of our progress thus far, focusing on the advantages and disadvantages of this new approach compared with the development methods used in some existing packages. We conclude by considering whether the likely wider needs of the cardiac modelling community are currently being met and suggest that, in order to respond effectively to changing requirements, it is essential that these codes should be more malleable. Such codes will allow for reliable extensions to include both detailed mathematical models--of the heart and other organs--and more efficient numerical techniques that are currently being developed by many research groups worldwide. PMID:18565813

  6. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  7. Computation Techniques for the Volume of a Tetrahedron

    ERIC Educational Resources Information Center

    Srinivasan, V. K.

    2010-01-01

    The purpose of this article is to discuss specific techniques for the computation of the volume of a tetrahedron. A few of them are taught in undergraduate multivariable calculus courses. A few are found in textbooks on coordinate geometry and synthetic solid geometry. This article gathers many of these techniques so as to constitute a…
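
    One such technique, reproduced here as a short illustration (the article's full list is not reproduced), computes the volume from the scalar triple product of the edge vectors at one vertex:

        import numpy as np

        def tetra_volume(a, b, c, d):
            # V = |det[b - a, c - a, d - a]| / 6
            m = np.array([b - a, c - a, d - a], dtype=float)
            return abs(np.linalg.det(m)) / 6.0

        # Example: the unit right tetrahedron has volume 1/6.
        e = np.eye(3)
        print(tetra_volume(np.zeros(3), e[0], e[1], e[2]))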

  8. Full Endoscopic Spinal Surgery Techniques: Advancements, Indications, and Outcomes

    PubMed Central

    Yue, James J.; Long, William

    2015-01-01

    Advancements in both surgical instrumentation and full endoscopic spine techniques have resulted in positive clinical outcomes in the treatment of cervical, thoracic, and lumbar spine pathologies. Endoscopic techniques impart minimal approach related disruption of non-pathologic spinal anatomy and function while concurrently maximizing functional visualization and correction of pathological tissues. An advanced understanding of the applicable functional neuroanatomy, in particular the neuroforamen, is essential for successful outcomes. Additionally, an understanding of the varying types of disc prolapse pathology in relation to the neuroforamen will result in more optimal surgical outcomes. Indications for lumbar endoscopic spine surgery include disc herniations, spinal stenosis, infections, medial branch rhizotomy, and interbody fusion. Limitations are based on both non spine and spine related findings. A high riding iliac wing, a more posteriorly located retroperitoneal cavity, an overly distal or proximally migrated herniated disc are all relative contra-indications to lumbar endoscopic spinal surgery techniques. Modifications in scope size and visual field of view angulation have enabled both anterior and posterior cervical decompression. Endoscopic burrs, electrocautery, and focused laser technology allow for the least invasive spinal surgical techniques in all age groups and across varying body habitus. Complications include among others, dural tears, dysesthsia, nerve injury, and infection. PMID:26114086

  9. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy, and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the Advanced Computing Research Facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are being organized.

  10. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  11. Advanced Placement Computer Science with Pascal. Volume 2. Experimental Edition.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents 100 lessons for an advanced placement course on programming in Pascal. Some of the topics covered include arrays, sorting, strings, sets, records, computers in society, files, stacks, queues, linked lists, binary trees, searching, hashing, and chaining. Performance objectives, vocabulary, motivation, aim,…

  12. The origins of bioethics: advances in resuscitation techniques.

    PubMed

    Niebroj, L

    2008-12-01

    In recent years there has been increasing interest in meta-bioethical issues. This turn in research focus is regarded as a sign of the maturation of bioethics as a distinct area of academic inquiry. The role of historico-philosophical reflection is often emphasized. It should be noted that there is rather common agreement that the future of bioethics lies in critical reflection on its past, in particular on the very origins of the discipline. Sharing Caplan's opinion, advances in medical technology, especially the introduction of respirators and artificial heart machines, are considered among the main issues that started bioethics. Using methods of historical as well as meta-ethical research, this article aims at describing the role of advances in resuscitation techniques in the emergence of bioethics and at exploring how bioethical reflection has been shaped by technological developments. A brief historical analysis permits one to say that there is a close bond between the emergence of bioethics and the introduction of sophisticated resuscitation technologies into medical practice. The meta-ethical reflection reveals that advances in resuscitation techniques not only initiated bioethics in the second half of the 20th century but influenced its evolution by (i) posing a question of justice in health care, (ii) altering commonly accepted ontological notions of human corporeality, and (iii) reconsidering the very purpose of medicine.

  13. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..; Easter, Richard C; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  14. Indications and general techniques for lasers in advanced operative laparoscopy.

    PubMed

    Dorsey, J H

    1991-09-01

    Lasers are but one of the several energy delivery systems used by the operative laparoscopist in the performance of advanced operative laparoscopy. Safety is a key factor in the selection of a laser because the tissue damage produced by this instrument is absolutely predictable. The surgeon must be totally familiar with the chosen wavelength and its tissue reaction if this safety factor is to be realized. Other instruments complement the use of lasers in advanced operative laparoscopy, and without thorough knowledge of all available techniques and instruments, the operative laparoscopist will not achieve the full potential of this specialty. It is beyond the scope of this issue on gynecologic laser surgery to present all of the useful nonlaser techniques. Suffice it to say that we often use laser, loop ligature, sutures, hemoclips, bipolar electricity, hydrodissection, and endocoagulation during the course of a day in the operating room and sometimes during one case. As enthusiasm for advanced operative laparoscopy grows and endoscopic capability increases, more complicated and prolonged surgical feats are reported. Radical hysterectomy and lymphadenectomy have been performed by the laparoscopic route, and endoscopic management of ovarian tumors also has been reported. At this moment, these must be viewed as "show and tell" procedures unsupported by statistics to demonstrate any advantage (or disadvantage) when compared with conventional surgical methods. The time required of advanced operative laparoscopy for any given procedure is certainly an important factor. Prolonged operative and anesthesia time certainly can negate the supposed benefit of small incisions and minimally invasive surgery. What goes on inside the abdomen is certainly the most important part of advanced operative laparoscopy. Good surgeons must recognize their own limitations and the limitations of available technology. The operative laparoscopist must know when to quit and institute a

  15. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computation algorithms as well as high-quality numerical boundary treatments. This paper focuses on the recent developments of numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much needed research in numerical boundary conditions for CAA.

  16. The investigation of advanced remote sensing techniques for the measurement of aerosol characteristics

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Becher, J.

    1979-01-01

    Advanced remote sensing techniques and inversion methods for the measurement of characteristics of aerosol and gaseous species in the atmosphere were investigated. Of particular interest were the physical and chemical properties of aerosols, such as their size distribution, number concentration, and complex refractive index, and the vertical distribution of these properties on a local as well as global scale. Remote sensing techniques for monitoring of tropospheric aerosols were developed as well as satellite monitoring of upper tropospheric and stratospheric aerosols. Computer programs were developed for solving multiple scattering and radiative transfer problems, as well as inversion/retrieval problems. A necessary aspect of these efforts was to develop models of aerosol properties.

  17. Advanced aeroservoelastic stabilization techniques for hypersonic flight vehicles

    NASA Technical Reports Server (NTRS)

    Chan, Samuel Y.; Cheng, Peter Y.; Myers, Thomas T.; Klyde, David H.; Magdaleno, Raymond E.; Mcruer, Duane T.

    1992-01-01

    Advanced high-performance vehicles that are statically unstable, including Single-Stage-To-Orbit (SSTO) hypersonic flight vehicles, require higher-bandwidth flight control systems to compensate for the instability, resulting in interactions between the flight control system, the engine/propulsion dynamics, and the low frequency structural modes. Military specifications, such as MIL-F-9490D and MIL-F-87242, tend to limit treatment of structural modes to conventional gain stabilization techniques. The conventional gain stabilization techniques, however, introduce low frequency effective time delays which can be troublesome from a flying qualities standpoint. These time delays can be alleviated by appropriate blending of gain and phase stabilization techniques (referred to as Hybrid Phase Stabilization or HPS) for the low frequency structural modes. The potential of using HPS for compensating structural mode interaction was previously explored. It was shown that effective time delay was significantly reduced with the use of HPS; however, the HPS design was seen to have greater residual response than a conventional gain-stabilized design. Additional work performed to advance and refine the HPS design procedure, to further develop residual response metrics as a basis for alternative structural stability specifications, and to develop strategies for validating HPS design and specification concepts in manned simulation is presented. Stabilization design sensitivity to structural uncertainties and aircraft-centered requirements are also assessed.

  18. Testing aspects of advanced coherent electron cooling technique

    SciTech Connect

    Litvinenko, V.; Jing, Y.; Pinayev, I.; Wang, G.; Samulyak, R.; Ratner, D.

    2015-05-03

    An advanced version of Coherent-electron Cooling (CeC), based on the micro-bunching instability, was proposed. This approach promises a significant increase in the bandwidth of the CeC system and, therefore, a significant shortening of the cooling time in high-energy hadron colliders. In this paper we present our plans for simulating and testing the key aspects of this proposed technique using the set-up of the coherent-electron-cooling proof-of-principle experiment at BNL.

  1. A Survey of Architectural Techniques for Near-Threshold Computing

    DOE PAGES

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle in scaling the performance of all classes of computing systems. Low-voltage computing and, specifically, near-threshold voltage computing (NTC), which involves operating the transistor very close to and yet above its threshold voltage, holds the promise of providing many-fold improvement in energy efficiency. However, use of NTC also presents several challenges, such as increased parametric variation, failure rate and performance loss. Our paper surveys several recent techniques which aim to offset these challenges for fully leveraging the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.

  2. Recent advances in UHV techniques for particle accelerators

    SciTech Connect

    M. G. Rao

    1995-01-01

    The ultrahigh vacuum (UHV) requirements for storage rings and accelerators, and the development of the science and technology of UHV for particle accelerators and magnetic fusion devices, have recently been reviewed by N.B. Mistry and H.F. Dylla, respectively. In this paper, the latest developments in the advancement of UHV techniques for the vacuum integrity of the Continuous Electron Beam Accelerator Facility (CEBAF), and for successfully dealing with the synchrotron-radiation-related beam line vacuum problem encountered in the design of the SSC, are reviewed. The review includes developments in an extreme-sensitivity He leak detection technique based on the dynamic adsorption and desorption of He, operation of ionization gauges at LHe temperatures, metal sponges for the effective cryopumping of H2 and He to pressures better than 10^-14 torr, and low-cost, high-He-sensitivity RGAs. The details of a new extreme-sensitivity He leak detector system are also discussed here.

  3. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module developed jointly by JPL and TRW under their respective research programs. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program. Further development of the Mass Memory and the programmable I/O MCM modules will follow. The three building block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved for the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  4. Surveying co-located space geodesy techniques for ITRF computation

    NASA Astrophysics Data System (ADS)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodetic triangulation and trilateration, that allows the determination of the reference points of the five space geodesy techniques used in ITRF computation (DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not accessible or directly measurable. Likewise, no mechanically determined ex-center with respect to an external, measurable point is usually given. In these cases, it is not possible to measure the sought reference points directly, and it is even less straightforward to obtain the statistical information relating these points for different techniques. We outline the most general practical surveying methodology, which permits recovery of the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g., non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results, and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and of the ex-centre vector linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements; these will be presented in a separate presentation.
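
    As an illustration of one common fitting step in such surveys (an editorial sketch, not the authors' full procedure), a target mounted on a rotating antenna traces a circle about the rotation axis, and a least-squares sphere fit to the surveyed target positions recovers a point on that axis:

        import numpy as np

        def fit_sphere_center(pts):
            # Writing |p - c|^2 = r^2 as the linear system
            # |p|^2 = 2 p.c + k, with k = r^2 - |c|^2, in the unknowns (c, k).
            A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
            b = (pts ** 2).sum(axis=1)
            sol, *_ = np.linalg.lstsq(A, b, rcond=None)
            return sol[:3]   # estimated centre on the rotation axis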

  5. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
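
    As an illustration of the kind of Markov model these tools generate and solve (a toy example, not RMG/ASSURE output), a small continuous-time Markov chain for a duplex system can be evaluated with a matrix exponential:

        import numpy as np
        from scipy.linalg import expm

        # States: 0 = both units up, 1 = one unit up, 2 = system failed
        # (absorbing). Failure/repair rates per hour are assumed values.
        lam, mu = 1.0e-4, 1.0e-2
        Q = np.array([[-2 * lam, 2 * lam, 0.0],
                      [mu, -(mu + lam), lam],
                      [0.0, 0.0, 0.0]])
        p0 = np.array([1.0, 0.0, 0.0])
        p10 = p0 @ expm(Q * 10.0)        # state probabilities at t = 10 h
        print("unreliability at 10 h:", p10[2])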

  6. Recent Advances in Techniques for Hyperspectral Image Processing

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony; Marconcini, Mattia; Tilton, James C.; Trianni, Giovanna

    2009-01-01

    Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data, and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.
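
    As an illustration of one standard way to handle the high-dimensional nature of such data (an editorial sketch, not one of the paper's parallel implementations), a principal-component projection reduces hundreds of spectral bands to a few components:

        import numpy as np

        # Rows are pixels, columns are spectral bands (placeholder data).
        rng = np.random.default_rng(0)
        cube = rng.random((10000, 200))
        centered = cube - cube.mean(axis=0)

        # SVD of the centered matrix gives the principal spectral directions.
        U, s, Vt = np.linalg.svd(centered, full_matrices=False)
        scores = centered @ Vt[:10].T   # each pixel reduced to 10 components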

  7. Advanced bronchoscopic techniques in diagnosis and staging of lung cancer.

    PubMed

    Zaric, Bojan; Stojsic, Vladimir; Sarcev, Tatjana; Stojanovic, Goran; Carapic, Vladimir; Perin, Branislav; Zarogoulidis, Paul; Darwiche, Kaid; Tsakiridis, Kosmas; Karapantzos, Ilias; Kesisis, Georgios; Kougioumtzi, Ioanna; Katsikogiannis, Nikolaos; Machairiotis, Nikolaos; Stylianaki, Aikaterini; Foroulis, Christophoros N; Zarogoulidis, Konstantinos

    2013-09-01

    The role of advanced bronchoscopic diagnostic techniques in the detection and staging of lung cancer has increased steeply in recent years. Bronchoscopic imaging techniques have become widely available and easy to use. Technical improvements have led to a merging of technologies, so that autofluorescence and narrow band imaging can be incorporated into one bronchoscope. New tools, such as autofluorescence imaging (AFI), narrow band imaging (NBI) and Fuji intelligent chromo endoscopy (FICE), have found their place in respiratory endoscopy suites. The development of endobronchial ultrasound (EBUS) improved minimally invasive mediastinal staging and the diagnosis of peripheral lung lesions. Linear EBUS has proven to be complementary to mediastinoscopy. This technique is now available in almost all high-volume centers performing bronchoscopy. Radial EBUS with mini-probes and guiding sheaths provides accurate diagnosis of peripheral pulmonary lesions. Combining EBUS-guided procedures with rapid on-site cytology (ROSE) increases diagnostic yield even more. Electromagnetic navigation technology (EMN) is also widely used for the diagnosis of peripheral lesions. Future development will certainly lead to new improvements in technology and the creation of new sophisticated tools for research in respiratory endoscopy. Broncho-microscopy, alveoloscopy, and optical coherence tomography are some of the new research techniques emerging from rapid technological development.

  8. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  9. Recent advances in data assimilation in computational geodynamic models

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2010-05-01

    To restore the dynamics of mantle structures in the geological past, data assimilation can be used to constrain the initial conditions for mantle temperature and velocity from their present observations and estimations. The initial conditions so obtained can then be used to run forward models of mantle dynamics to restore the evolution of mantle structures. If heat diffusion is neglected, the present mantle temperature and flow can be assimilated into the past using backward advection (BAD). Two- and three-dimensional numerical approaches to the solution of the inverse problem of the Rayleigh-Taylor instability were developed for a dynamic restoration of diapiric structures to their earlier stages (e.g., Ismail-Zadeh et al., 1998, 2001, 2004; Kaus and Podladchikov, 2001). The mantle flow was modelled backwards in time from present-day mantle density heterogeneities inferred from seismic observations (e.g., Steinberger and O'Connell, 1998; Conrad and Gurnis, 2003). Variational (VAR, also called adjoint) data assimilation was pioneered by meteorologists and is widely used in oceanography and in hydrological studies. The use of VAR data assimilation in models of geodynamics was put forward by Bunge et al. (2003) and Ismail-Zadeh et al. (2003). The VAR data assimilation algorithm was employed to numerically restore models of mantle plumes (Ismail-Zadeh et al., 2004, 2006; Hier-Majumder et al., 2005; Liu and Gurnis, 2008; Liu et al., 2008). The quasi-reversibility (QRV) technique (more robust computationally) introduces into the backward heat equation an additional term involving the product of a small regularization parameter and a higher-order temperature derivative (the resulting regularized heat equation is based on the Riemann law of heat conduction). Data assimilation in this case is based on a search for the best fit between the forecast model state and the observations by minimizing the regularization parameter
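
    The record describes the QRV regularization only in words. For orientation, one classical quasi-reversibility form (due to Lattès and Lions; the authors' Riemann-law-based equation differs in detail) appends a small higher-order term, with regularization parameter beta, that makes backward integration well-posed:

        $$\frac{\partial T}{\partial t} = \kappa\,\nabla^{2} T - \beta\,\nabla^{4} T$$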

  11. Advanced computer architecture specification for automated weld systems

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.

  12. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  13. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y.; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.
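
    As a minimal illustration of the fuzzy-set side of soft computing (an illustrative sketch only; the variable names and membership breakpoints are invented, not taken from the NASA Lewis data): a triangular membership function grades how strongly a processing variable, here a sintering temperature, belongs to the fuzzy set "optimal", instead of applying a hard pass/fail threshold.

        def triangular(x, lo, peak, hi):
            """Triangular fuzzy membership: 0 at lo and hi, 1 at peak."""
            if x <= lo or x >= hi:
                return 0.0
            if x <= peak:
                return (x - lo) / (peak - lo)
            return (hi - x) / (hi - peak)

        # Degree to which a 1520 C sintering temperature is "optimal"
        # for a hypothetical ceramic process window of 1400-1700 C.
        mu = triangular(1520.0, 1400.0, 1550.0, 1700.0)
        print(f"membership in 'optimal': {mu:.2f}")   # prints 0.80

    Rules defined over such memberships can then rank which processing variables contribute most to the desired material properties.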

  14. Audio Utilization Conventions and Techniques for Computer Assisted Instruction.

    ERIC Educational Resources Information Center

    Army Signal Center and School, Fort Monmouth, NJ.

    A set of guidelines has been developed for the implementation of the audio mode in computer assisted instruction (CAI). The manual contains a collection of conventions and techniques synthesized from recent publications in areas pertinent to multi-media audiovisual presentation. These areas include audio message placement, positioning, frequency,…

  15. Advances in reduction techniques for tire contact problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1995-01-01

    Some recent developments in reduction techniques, as applied to predicting the tire contact response and evaluating the sensitivity coefficients of the different response quantities, are reviewed. The sensitivity coefficients measure the sensitivity of the contact response to variations in the geometric and material parameters of the tire. The tire is modeled using a two-dimensional laminated anisotropic shell theory with the effects of variation in geometric and material parameters, transverse shear deformation, and geometric nonlinearities included. The contact conditions are incorporated into the formulation by using a perturbed Lagrangian approach with the fundamental unknowns consisting of the stress resultants, the generalized displacements, and the Lagrange multipliers associated with the contact conditions. The elemental arrays are obtained by using a modified two-field, mixed variational principle. For the application of reduction techniques, the tire finite element model is partitioned into two regions. The first region consists of the nodes that are likely to come in contact with the pavement, and the second region includes all the remaining nodes. The reduction technique is used to significantly reduce the degrees of freedom in the second region. The effectiveness of the computational procedure is demonstrated by a numerical example of the frictionless contact response of the space shuttle nose-gear tire, inflated and pressed against a rigid flat surface. Also, the research topics which have high potential for enhancing the effectiveness of reduction techniques are outlined.
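
    The partitioning idea can be pictured as a static condensation (Schur complement) of the non-contact degrees of freedom. The following NumPy sketch assumes a plain linear stiffness system K u = f; it illustrates only the reduction of region-two unknowns, not the paper's mixed variational formulation:

        import numpy as np

        def condense(K, f, contact_idx):
            """Eliminate all DOFs not in contact_idx from K u = f.

            Returns the reduced system (K_red, f_red) acting only on the
            contact-region DOFs (a Schur-complement static condensation).
            """
            n = K.shape[0]
            other_idx = np.setdiff1d(np.arange(n), contact_idx)
            A = K[np.ix_(contact_idx, contact_idx)]
            B = K[np.ix_(contact_idx, other_idx)]
            C = K[np.ix_(other_idx, other_idx)]
            # Factor the (large) non-contact block once; the reduced
            # operator is A - B C^-1 B^T acting on the contact DOFs.
            K_red = A - B @ np.linalg.solve(C, B.T)
            f_red = f[contact_idx] - B @ np.linalg.solve(C, f[other_idx])
            return K_red, f_red

    For a linear problem, solving the reduced system and back-substituting reproduces the full solution exactly, which is why the savings concentrate where the contact nonlinearity lives.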

  16. Development of processing techniques for advanced thermal protection materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna S.

    1994-01-01

    The effort, which was focused on the research and development of advanced materials for use in Thermal Protection Systems (TPS), has involved chemical and physical testing of refractory ceramic tiles, fabrics, threads and fibers. This testing has included determination of the optical properties, thermal shock resistance, high temperature dimensional stability, and tolerance to environmental stresses. Materials have also been tested in the Arc Jet 2 x 9 Turbulent Duct Facility (TDF), the 1 atmosphere Radiant Heat Cycler, and the Mini-Wind Tunnel Facility (MWTF). A significant part of the effort hitherto has gone towards modifying and upgrading the test facilities so that meaningful tests can be carried out. Another important effort during this period has been the creation of a materials database. Computer systems administration and support have also been provided. These are described in greater detail below.

  17. A survey of CPU-GPU heterogeneous computing techniques

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs are employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable for achieving high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, that enable the use of both CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe this paper will provide researchers insights into the workings and scope of application of HCTs and motivate them to further harness the computational powers of CPUs and GPUs to achieve the goal of exascale performance.

  18. A survey of CPU-GPU heterogeneous computing techniques

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs are employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable for achieving high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, that enable the use of both CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). We believe this paper will provide researchers insights into the workings and scope of application of HCTs and motivate them to further harness the computational powers of CPUs and GPUs to achieve the goal of exascale performance.
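
    Workload partitioning, the first family of HCTs named above, can be sketched in a few lines: split a data-parallel job between the two PUs in proportion to their measured throughputs, so that both finish at roughly the same time (the rates and kernel functions below are hypothetical placeholders, not from the survey):

        from concurrent.futures import ThreadPoolExecutor

        def partition(items, cpu_rate, gpu_rate):
            # Give each PU a share proportional to its throughput so the
            # two halves of the job complete at about the same moment.
            split = int(len(items) * cpu_rate / (cpu_rate + gpu_rate))
            return items[:split], items[split:]

        def run_heterogeneous(items, cpu_kernel, gpu_kernel, cpu_rate, gpu_rate):
            cpu_part, gpu_part = partition(items, cpu_rate, gpu_rate)
            with ThreadPoolExecutor(max_workers=2) as pool:
                cpu_fut = pool.submit(lambda: [cpu_kernel(x) for x in cpu_part])
                gpu_fut = pool.submit(lambda: [gpu_kernel(x) for x in gpu_part])
                return cpu_fut.result() + gpu_fut.result()

    Real HCTs refine the split online from observed execution times; the principle stays the same.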

  19. Advanced Techniques for Removal of Retrievable Inferior Vena Cava Filters

    SciTech Connect

    Iliescu, Bogdan; Haskal, Ziv J.

    2012-08-15

    Inferior vena cava (IVC) filters have proven valuable for the prevention of primary or recurrent pulmonary embolism in selected patients with, or at high risk for, venous thromboembolic disease. Their use has become commonplace, and the numbers implanted increase annually. During the last 3 years in the United States, the percentage of annually placed optional filters, i.e., filters that can remain as permanent filters or potentially be retrieved, has consistently exceeded that of permanent filters. In parallel, the complications of long- or short-term filtration have become increasingly evident to physicians, regulatory agencies, and the public. Most filter removals are uneventful, with a high degree of success. When routine filter-retrieval techniques prove unsuccessful, progressively more advanced tools and skill sets must be used to enhance filter-retrieval success. These techniques should be used with caution to avoid damage to the filter or the vena cava during retrieval. This review describes the complex techniques for filter retrieval, including the use of additional snares, guidewires, angioplasty balloons, and mechanical and thermal approaches, and illustrates their specific applications.

  20. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  1. Application of object oriented programming techniques in front end computers

    SciTech Connect

    Skelly, J.F.

    1997-11-01

    The Standard Model for accelerator control systems describes two levels of computers, often called Console Level Computers (CLCs) and Front End Computers (FECs), joined by a network. The Front End Computer (FEC) environment imposes special demands on software, beyond real time performance and robustness. FEC software must manage a diverse inventory of devices with individualistic timing requirements and hardware interfaces. It must implement network services which export device access to the control system at large, interpreting a uniform network communications protocol into the specific control requirements of the individual devices. Object oriented languages provide programming techniques which neatly address these challenges, and also offer benefits in terms of maintainability and flexibility. Applications are discussed which exhibit the use of inheritance, multiple inheritance and inheritance trees, and polymorphism to address the needs of FEC software.
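
    A minimal sketch of the pattern described (the device class and its methods are hypothetical, not drawn from the paper): a common Device interface lets one network protocol interpreter dispatch uniform requests to individualistic hardware through polymorphism.

        from abc import ABC, abstractmethod

        class Device(ABC):
            """Uniform interface the network protocol layer talks to."""
            @abstractmethod
            def read_setting(self): ...
            @abstractmethod
            def write_setting(self, value): ...

        class PowerSupply(Device):
            # Hardware-specific access hidden behind the uniform interface.
            def read_setting(self):
                return self._read_dac()
            def write_setting(self, value):
                self._write_dac(value)
            def _read_dac(self): ...
            def _write_dac(self, value): ...

        def handle_request(device: Device, command, value=None):
            # One protocol interpreter; polymorphism selects the right
            # hardware code for whichever device the request names.
            if command == "GET":
                return device.read_setting()
            device.write_setting(value)

    Adding a new device type is then a new subclass rather than a change to the protocol layer, which is the maintainability benefit the paper points to.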

  2. Advanced Manufacturing Techniques Demonstrated for Fabricating Developmental Hardware

    NASA Technical Reports Server (NTRS)

    Redding, Chip

    2004-01-01

    NASA Glenn Research Center's Engineering Development Division has been working in support of innovative gas turbine engine systems under development by Glenn's Combustion Branch. These one-of-a-kind components require operation under extreme conditions. High-temperature ceramics were chosen for fabrication because of the hostile operating environment. During the design process, it became apparent that traditional machining techniques would not be adequate to produce the small, intricate features of the conceptual design, which was to be produced by stacking over a dozen thin layers with many small features that would then be aligned and bonded together into a one-piece unit. Instead of using traditional machining, we produced computer models in Pro/ENGINEER (Parametric Technology Corporation (PTC), Needham, MA) to the specifications of the research engineer. The computer models were exported in stereolithography standard (STL) format and used to produce full-size rapid prototype polymer models. These semi-opaque plastic models were used for visualization and design verification. The computer models also were exported in International Graphics Exchange Specification (IGES) format and sent to Glenn's Thermal/Fluids Design & Analysis Branch and Applied Structural Mechanics Branch for profiling heat transfer and mechanical strength analysis.

  3. Coal and Coal Constituent Studies by Advanced EMR Techniques

    SciTech Connect

    Alex I. Smirnov; Mark J. Nilges; R. Linn Belford; Robert B. Clarkson

    1998-03-31

    Advanced electron magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. We have achieved substantial progress on upgrading the high field (HF) EMR (W-band, 95 GHz) spectrometers that are especially advantageous for such studies. In particular, we have built a new second W-band instrument (Mark II) in addition to our Mark I. Briefly, Mark II features: (i) an Oxford custom-built 7 T superconducting magnet, scannable from 0 to 7 T at up to 0.5 T/min; (ii) a water-cooled coaxial solenoid providing scans of up to ±550 G under digital (15-bit resolution) computer control; (iii) a custom-engineered precision feedback circuit driving this solenoid, based on an Ultrastab 860R sensor with linearity better than 5 ppm and resolution of 0.05 ppm; (iv) an Oxford CF 1200 cryostat for variable-temperature studies from 1.8 to 340 K. During this grant period we have completed several key upgrades of both Mark I and Mark II, particularly the microwave bridges, W-band probeheads, and computer interfaces. We utilize these improved instruments for HF EMR studies of spin-spin interaction and of the existence of different paramagnetic species in carbonaceous solids.

  4. Techniques for developing approximate optimal advanced launch system guidance

    NASA Technical Reports Server (NTRS)

    Feeley, Timothy S.; Speyer, Jason L.

    1991-01-01

    An extension to the authors' previous technique used to develop a real-time guidance scheme for the Advanced Launch System is presented. The approach is to construct an optimal guidance law based upon an asymptotic expansion associated with small physical parameters, epsilon. The trajectory of a rocket modeled as a point mass is considered with the flight restricted to an equatorial plane while reaching an orbital altitude at orbital injection speeds. The dynamics of this problem can be separated into primary effects due to thrust and gravitational forces, and perturbation effects which include the aerodynamic forces and the remaining inertial forces. An analytic solution to the reduced-order problem represented by the primary dynamics is possible. The Hamilton-Jacobi-Bellman or dynamic programming equation is expanded in an asymptotic series where the zeroth-order term (epsilon = 0) can be obtained in closed form.
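
    In outline (a sketch of the standard regular expansion, with generic notation chosen here rather than taken from the paper), the dynamics split into primary terms f_0 (thrust and gravity) and perturbations \varepsilon f_1 (aerodynamic and remaining inertial forces); the optimal return function V is expanded in \varepsilon and the Hamilton-Jacobi-Bellman equation is collected order by order:

        V = V_0 + \varepsilon V_1 + \varepsilon^2 V_2 + \cdots,
        \qquad
        0 = \min_u \Big[ L + \frac{\partial V}{\partial x}\,\big( f_0(x,u) + \varepsilon f_1(x,u) \big) \Big],

        O(1):\quad 0 = \min_u \Big[ L + \frac{\partial V_0}{\partial x}\, f_0 \Big],
        \qquad
        O(\varepsilon):\quad 0 = \frac{\partial V_1}{\partial x}\, f_0^{*} + \frac{\partial V_0}{\partial x}\, f_1^{*},

    where starred quantities are evaluated along the zeroth-order (epsilon = 0) optimal solution. The zeroth-order problem is the closed-form solvable one mentioned above, and each correction is computed along that solution rather than by re-solving the full dynamics.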

  5. Neurocysticercosis: evaluation with advanced magnetic resonance techniques and atypical forms.

    PubMed

    do Amaral, Lázaro Luís Faria; Ferreira, Rafael Martins; da Rocha, Antônio José; Ferreira, Nelson Paes Diniz Fortes

    2005-04-01

    Neurocysticercosis (NCC) is the most common helminthic infection of the central nervous system, but its diagnosis remains difficult. The purpose of this article is to perform a critical analysis of the literature and to present our experience in the evaluation of NCC. We discuss applications of advanced MR techniques such as diffusion- and perfusion-weighted imaging, spectroscopy, cisternography with FLAIR and supplemental O2, and 3D-CISS. The typical manifestations of NCC are described, with emphasis on the unusual presentations. The atypical forms of neurocysticercosis were divided into intraventricular, subarachnoid, spinal, orbital, and intraparenchymatous. Special attention was also given to reactivation of previously calcified lesions and to neurocysticercosis associated with mesial temporal sclerosis.

  6. COAL AND CHAR STUDIES BY ADVANCED EMR TECHNIQUES

    SciTech Connect

    R. Linn Belford; Robert B. Clarkson; Mark J. Nilges; Boris M. Odintsov; Alex I. Smirnov

    2001-04-30

    Advanced electron magnetic resonance (EMR) as well as nuclear magnetic resonance (NMR) methods have been used to examine properties of coals, chars, and molecular species related to constituents of coal. During the span of this grant, progress was made on the construction and application to coals and chars of two high-frequency EMR systems particularly appropriate for such studies (48 GHz and 95 GHz electron magnetic resonance spectrometers), on new low-frequency dynamic nuclear polarization (DNP) experiments to examine the interaction between water and the surfaces of suspended char particulates in slurries, and on a variety of proton nuclear magnetic resonance (NMR) techniques to measure characteristics of the water directly in contact with the surfaces and pore spaces of carbonaceous particulates.

  7. Advanced Fibre Bragg Grating and Microfibre Bragg Grating Fabrication Techniques

    NASA Astrophysics Data System (ADS)

    Chung, Kit Man

    Fibre Bragg gratings (FBGs) have become a very important technology for communication systems and fibre optic sensing. Typically, FBGs are less than 10 mm long and are fabricated using fused silica uniform phase masks, which become more expensive for longer lengths or non-uniform pitches. Generally, interfering UV laser beams are employed to make long or complex FBGs, and this technique introduces critical precision and control issues. In this work, we demonstrate an advanced FBG fabrication system that enables the writing of long and complex gratings in optical fibres with virtually any apodisation profile, local phase and Bragg wavelength, using a novel optical design in which the incident angles of two UV beams onto an optical fibre can be adjusted simultaneously by moving just one optical component, instead of the two optics employed in earlier configurations, to vary the grating pitch. The key advantage of the grating fabrication system is that complex gratings can be fabricated by controlling the linear movements of two translation stages. In addition to the study of advanced grating fabrication techniques, we also focus on the inscription of FBGs written in optical fibres with a cladding diameter of several tens of microns. Fabrication of microfibres was investigated using a sophisticated tapering method. We also propose a simple but practical technique to filter out the higher-order modes reflected from the FBG written in microfibres via a linear taper region, while the fundamental mode re-couples to the core. By using this technique, reflection from the microfibre Bragg grating (MFBG) can be effectively single mode, simplifying the demultiplexing and demodulation processes. The MFBG exhibits high sensitivity to contact force, and an MFBG-based force sensor was also constructed and tested to investigate its suitability for use as an invasive surgery device. Performance of the contact force sensor packaged in a conforming elastomer material compares favourably to one

  8. Multiple advanced surgical techniques to treat acquired seminal duct obstruction

    PubMed Central

    Jiang, Hong-Tao; Yuan, Qian; Liu, Yu; Liu, Zeng-Qin; Zhou, Zhen-Yu; Xiao, Ke-Feng; Yang, Jiang-Gen

    2014-01-01

    The aim of this study was to evaluate the outcomes of multiple advanced surgical treatments (i.e., microsurgery, laparoscopic surgery and endoscopic surgery) for acquired obstructive azoospermia. We analyzed the surgical outcomes of 51 patients with suspected acquired obstructive azoospermia consecutively enrolled at our center between January 2009 and May 2013. Modified vasoepididymostomy, laparoscopically assisted vasovasostomy and transurethral incision of the ejaculatory duct with holmium laser were chosen and performed based on the different obstruction sites. The mean postoperative follow-up time was 22 months (range: 9 months to 52 months). Semen analyses were initiated at four postoperative weeks, followed by trimonthly (months 3, 6, 9 and 12) semen analyses, until no sperm was found at 12 months or until pregnancy was achieved. Patency was defined as >10,000 sperm per ml of semen. The obstruction sites, postoperative patency and natural pregnancy rate were recorded. Of 51 patients, 47 underwent bilateral or unilateral surgical reconstruction; the other four patients could not be treated with surgical reconstruction because of pelvic vas or intratesticular tubule obstruction. The reconstruction rate was 92.2% (47/51), and the patency rate and natural pregnancy rate were 89.4% (42/47) and 38.1% (16/42), respectively. No severe complications were observed. Using multiple advanced surgical techniques, a more extensive range of seminal duct obstruction was accessible and correctable; thus, favorable patency and pregnancy rates can be achieved. PMID:25337841

  9. Computer assisted audit techniques for UNIX (UNIX-CAATS)

    SciTech Connect

    Polk, W.T.

    1991-01-01

    Federal and DOE regulations impose specific requirements for internal controls of computer systems. These controls include adequate separation of duties and sufficient controls for access of system and data. The DOE Inspector General's Office has the responsibility to examine internal controls, as well as efficient use of computer system resources. As a result, DOE supported NIST development of computer assisted audit techniques to examine BSD UNIX computers (UNIX-CAATS). These systems were selected due to the increasing number of UNIX workstations in use within DOE. This paper describes the design and development of these techniques, as well as the results of testing at NIST and the first audit at a DOE site. UNIX-CAATS consists of tools which examine security of passwords, file systems, and network access. In addition, a tool was developed to examine efficiency of disk utilization. Test results at NIST indicated inadequate password management, as well as weak network resource controls. File system security was considered adequate. Audit results at a DOE site indicated weak password management and inefficient disk utilization. During the audit, we also found improvements to UNIX-CAATS were needed when applied to large systems. NIST plans to enhance the techniques developed for DOE/IG in future work. This future work would leverage currently available tools, along with needed enhancements. These enhancements would enable DOE/IG to audit large systems, such as supercomputers.

  10. Computer assisted audit techniques for UNIX (UNIX-CAATS)

    SciTech Connect

    Polk, W.T.

    1991-12-31

    Federal and DOE regulations impose specific requirements for internal controls of computer systems. These controls include adequate separation of duties and sufficient controls for access of system and data. The DOE Inspector General's Office has the responsibility to examine internal controls, as well as efficient use of computer system resources. As a result, DOE supported NIST development of computer assisted audit techniques to examine BSD UNIX computers (UNIX-CAATS). These systems were selected due to the increasing number of UNIX workstations in use within DOE. This paper describes the design and development of these techniques, as well as the results of testing at NIST and the first audit at a DOE site. UNIX-CAATS consists of tools which examine security of passwords, file systems, and network access. In addition, a tool was developed to examine efficiency of disk utilization. Test results at NIST indicated inadequate password management, as well as weak network resource controls. File system security was considered adequate. Audit results at a DOE site indicated weak password management and inefficient disk utilization. During the audit, we also found improvements to UNIX-CAATS were needed when applied to large systems. NIST plans to enhance the techniques developed for DOE/IG in future work. This future work would leverage currently available tools, along with needed enhancements. These enhancements would enable DOE/IG to audit large systems, such as supercomputers.
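
    The flavor of such file-system checks is easy to sketch (a minimal illustration, not the actual UNIX-CAATS tools): walk a directory tree and flag world-writable regular files, one of the classic access-control weaknesses an auditor looks for.

        import os
        import stat

        def world_writable(root):
            """Yield regular files under root that any user may modify."""
            for dirpath, dirnames, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        mode = os.lstat(path).st_mode
                    except OSError:
                        continue               # unreadable entry; skip it
                    if stat.S_ISREG(mode) and (mode & stat.S_IWOTH):
                        yield path

        for path in world_writable("/etc"):
            print("world-writable:", path)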

  11. Coupled explosive/structure computational techniques at Sandia National Laboratories

    SciTech Connect

    Preece, D.S.; Attaway, S.W.; Swegle, J.W.

    1997-06-01

    Simulation of the effects of explosives on structures is a challenge because the explosive response can best be simulated using Eulerian computational techniques and structural behavior is best modeled using Lagrangian methods. Due to the different methodology of the two computational techniques and code architecture requirements, they are usually implemented in different computer programs. Explosive and structure modeling in two different codes make it difficult or next to impossible to do coupled explosive/structure interaction simulations. Sandia National Laboratories has developed two techniques for solving this problem. The first is called Smoothed Particle Hydrodynamics (SPH), a relatively new gridless method comparable to Eulerian, that is especially suited for treating liquids and gases such as those produced by an explosive. The SPH capability has been fully implemented into the transient dynamics finite element (Lagrangian) codes PRONTO-2D and -3D. A PRONTO-3D/SPH simulation of the effect of a blast on a protective-wall barrier is presented in this paper. The second technique employed at Sandia uses a new code called Zapotec that combines the 3-D Eulerian code CTH and the Lagrangian code PRONTO-3D with minimal changes to either code. CTH and PRONTO-3D are currently executing on the Sandia Terraflops machine (9000 Pentium Pro processors). Eulerian simulations with 100 million cells have been completed on the current configuration of the machine (4500 Pentium Pro processors). The CTH and PRONTO-3D combination will soon be executing in a coupled fashion on this machine.
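
    The core of SPH is easy to illustrate (a one-dimensional sketch with a Gaussian smoothing kernel; production codes such as the SPH in PRONTO use different kernels and 3-D neighbor searches): each particle's density is a kernel-weighted sum of the masses of nearby particles, so no grid is needed.

        import numpy as np

        def sph_density(x, m, h):
            """1-D SPH density estimate at each particle position.

            x: particle positions, m: particle masses, h: smoothing length.
            Uses the normalized Gaussian kernel W(r) = exp(-r^2/h^2)/(h*sqrt(pi)),
            which integrates to 1 over the line.
            """
            r = x[:, None] - x[None, :]                   # pairwise separations
            W = np.exp(-(r / h) ** 2) / (h * np.sqrt(np.pi))
            return W @ m                                  # rho_i = sum_j m_j W_ij

        # Uniformly spaced unit-density particles recover rho ~ 1 in the interior.
        x = np.linspace(0.0, 1.0, 101)
        rho = sph_density(x, m=np.full(101, 0.01), h=0.05)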

  12. Computer-vision-based registration techniques for augmented reality

    NASA Astrophysics Data System (ADS)

    Hoff, William A.; Nguyen, Khoi; Lyon, Torsten

    1996-10-01

    Augmented reality is a term used to describe systems in which computer-generated information is superimposed on top of the real world, for example, through the use of a see-through head-mounted display. A human user of such a system could still see and interact with the real world, but have valuable additional information, such as descriptions of important features or instructions for performing physical tasks, superimposed on the world. For example, the computer could identify important objects and overlay them with graphic outlines, labels, and schematics. The graphics are registered to the real-world objects and appear to be 'painted' onto those objects. Augmented reality systems can be used to make productivity aids for tasks such as inspection, manufacturing, and navigation. One of the most critical requirements for augmented reality is to recognize and locate real-world objects with respect to the person's head; accurate registration is necessary in order to overlay graphics accurately on top of the real-world objects. At the Colorado School of Mines, we have developed a prototype augmented reality system that uses head-mounted cameras and computer vision techniques to accurately register the head to the scene. The current system locates and tracks a set of pre-placed passive fiducial targets placed on the real-world objects. The system computes the pose of the objects and displays graphics overlays using a see-through head-mounted display. This paper describes the architecture of the system and outlines the computer vision techniques used.
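
    With calibrated cameras, the registration step reduces to a pose (exterior orientation) computation from the fiducial correspondences. A minimal sketch with OpenCV follows; the fiducial coordinates, detected image points, and camera matrix are placeholder assumptions, and the cited system predates OpenCV:

        import numpy as np
        import cv2

        # 3-D fiducial positions on the object, in the object frame (meters).
        object_points = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                                  [0.1, 0.1, 0.0], [0.0, 0.1, 0.0]], dtype=np.float64)
        # Corresponding fiducial centroids detected in the image (pixels).
        image_points = np.array([[320.0, 240.0], [420.0, 238.0],
                                 [425.0, 335.0], [318.0, 332.0]], dtype=np.float64)
        # Intrinsics from a prior camera calibration (placeholder values).
        K = np.array([[800.0, 0.0, 320.0],
                      [0.0, 800.0, 240.0],
                      [0.0, 0.0, 1.0]])

        ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
        # rvec/tvec give the object pose in the camera frame; graphics drawn
        # with this pose appear "painted" onto the real object.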

  13. Training Software in Artificial-Intelligence Computing Techniques

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna; Rogstad, Eric; Chalfant, Eugene

    2005-01-01

    The Artificial Intelligence (AI) Toolkit is a computer program for training scientists, engineers, and university students in three soft-computing techniques (fuzzy logic, neural networks, and genetic algorithms) used in artificial-intelligence applications. The program provides an easily understandable tutorial interface, including an interactive graphical component through which the user can gain hands-on experience in soft-computing techniques applied to realistic example problems. The tutorial provides step-by-step instructions on the workings of soft-computing technology, whereas the hands-on examples allow interaction and reinforcement of the techniques explained throughout the tutorial. In the fuzzy-logic example, a user can interact with a robot and an obstacle course to verify how fuzzy logic is used to command a rover traverse from an arbitrary start to the goal location. For the genetic-algorithm example, the problem is to determine the minimum-length path for visiting a user-chosen set of planets in the solar system. For the neural-network example, the problem is to decide, on the basis of input data on physical characteristics, whether a person is a man, woman, or child. The AI Toolkit is compatible with the Windows 95, 98, ME, NT 4.0, 2000, and XP operating systems. A computer having a processor speed of at least 300 MHz and random-access memory of at least 56 MB is recommended for optimal performance. The program can be run on a slower computer having less memory, but some functions may not be executed properly.

  14. Advances in the Rising Bubble Technique for discharge measurement

    NASA Astrophysics Data System (ADS)

    Hilgersom, Koen; Luxemburg, Willem; Willemsen, Geert; Bussmann, Luuk

    2014-05-01

    Already in the 19th century, d'Auria described a discharge measurement technique that applies floats to find the depth-integrated velocity (d'Auria, 1882). The basis of this technique was that the horizontal distance that the float travels on its way to the surface is the image of the velocity profile integrated over depth. Viol and Semenov (1964) improved this method by using air bubbles as floats, but distances were still measured manually until Sargent (1981) introduced a technique that could derive the distances from two photographs taken simultaneously from each side of the river bank. Recently, modern image processing techniques proved to further improve the applicability of the method (Hilgersom and Luxemburg, 2012). In the 2012 article, controlling and determining the rising velocity of an air bubble still appeared a major challenge for the application of this method. Since then, laboratory experiments with different nozzle and tube sizes have led to advances in our self-made equipment, enabling us to produce individual air bubbles with a more constant rising velocity. We also introduced an underwater camera to determine the rising velocity on site, since it depends on the water temperature and contamination and is therefore site-specific. Camera measurements of the rising velocity proved successful in laboratory and field settings, although some improvements to the setup are necessary to capture the air bubbles at depths where little daylight penetrates. References: D'Auria, L.: Velocity of streams; A new method to determine correctly the mean velocity of any perpendicular in rivers and canals, (The) American Engineers, 3, 1882. Hilgersom, K.P. and Luxemburg, W.M.J.: Technical Note: How image processing facilitates the rising bubble technique for discharge measurement, Hydrology and Earth System Sciences, 16(2), 345-356, 2012. Sargent, D.: Development of a viable method of stream flow measurement using the integrating float technique, Proceedings of
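
    The computation behind the technique is compact: because a bubble rising at velocity v_r takes time D/v_r to surface from depth D, its horizontal drift is L = q/v_r, so the depth-integrated velocity (specific discharge) at each vertical is q = v_r * L, and total discharge follows by integrating q across the channel width. A small sketch with invented measurement values:

        import numpy as np

        v_r = 0.22                                  # bubble rise velocity [m/s], calibrated on site
        y = np.array([0.5, 1.5, 2.5, 3.5, 4.5])    # verticals across the width [m]
        L = np.array([0.8, 1.6, 2.1, 1.7, 0.9])    # horizontal bubble drift per vertical [m]

        q = v_r * L                                 # depth-integrated velocity [m^2/s]
        Q = np.trapz(q, y)                          # total discharge [m^3/s]
        print(f"Q = {Q:.2f} m^3/s")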

  15. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.

  16. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  17. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801
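
    The kind of data parallelism the review surveys can be sketched in a few lines of Python (a toy example: per-read GC content computed across worker processes; real pipelines distribute far heavier tools over clusters or clouds):

        from multiprocessing import Pool

        def gc_content(seq):
            # Fraction of G/C bases; a stand-in for a heavier per-read analysis.
            return (seq.count("G") + seq.count("C")) / len(seq)

        if __name__ == "__main__":
            reads = ["ACGTGGCA", "TTTTACGA", "GGGCCCAT"] * 100000
            with Pool() as pool:
                # Independent reads are embarrassingly parallel, so map them
                # across all cores in large chunks to amortize dispatch cost.
                results = pool.map(gc_content, reads, chunksize=10000)
            print(sum(results) / len(results))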

  18. Applications of Advanced Nondestructive Measurement Techniques to Address Safety of Flight Issues on NASA Spacecraft

    NASA Technical Reports Server (NTRS)

    Prosser, Bill

    2016-01-01

    Advanced nondestructive measurement techniques are critical for ensuring the reliability and safety of NASA spacecraft. Techniques such as infrared thermography, THz imaging, X-ray computed tomography and backscatter X-ray are used to detect indications of damage in spacecraft components and structures. Additionally, sensor and measurement systems are integrated into spacecraft to provide structural health monitoring to detect damaging events that occur during flight, such as debris impacts during launch and ascent or from micrometeoroids and orbital debris, or excessive loading due to anomalous flight conditions. A number of examples will be provided of how these nondestructive measurement techniques have been applied to resolve safety critical inspection concerns for the Space Shuttle, International Space Station (ISS), and a variety of launch vehicles and unmanned spacecraft.

  19. Unenhanced CT in the evaluation of urinary calculi: application of advanced computer methods.

    PubMed

    Olcott, E W; Sommer, F G

    1999-04-01

    Recent advances in computer hardware and software technology enable radiologists to examine tissues and structures using three-dimensional figures constructed from the multiple planar images acquired during a spiral CT examination. Three-dimensional CT techniques permit the linear dimensions of renal calculi to be determined along all three coordinate axes with a high degree of accuracy and enable direct volumetric analysis of calculi, yielding information that is not available from any other diagnostic modality. Additionally, three-dimensional techniques can help to identify and localize calculi in patients with suspected urinary colic.

  20. Measurement and calibration techniques used in computer partial pressure analysis

    SciTech Connect

    Mitchell, D.J.

    1985-05-01

    The uses of residual gas analyzers (RGA's) in computer controlled analytical studies and process monitoring applications are discussed in this paper. The relative merits are compared for the two most commonly used RGA's, which are the magnetic sector and the quadrupole mass analyzer. Methods of installing RGA's in vacuum systems and computer interfacing techniques are described. Measurement and calibration methods are outlined for applications where it is desirable to characterize either partial pressures or gas evolution rates. Interpretation of RGA spectra and limitations imposed by analytical errors are also discussed.

  1. Computer-assisted surgical techniques: can they really improve laser surgery?

    NASA Astrophysics Data System (ADS)

    Reinisch, Lou; Arango, Pablo; Howard, John G.; Mendenhall, Marcus H.; Ossoff, Robert H.

    1995-05-01

    As part of our Computer-Assisted Surgical Techniques (CAST) program, we use computers to guide surgical lasers, create minimal incision widths, regulate the rate of tissue ablation, monitor the types of tissue being ablated with photo-acoustic feedback, and track and compensate for patient motions due to respiration and heart beat. The union of the computer, robotics and lasers can assist the surgeon and permit several new applications. Although these advances in laser surgery appear to have obvious benefits, it is important to evaluate and quantify the clinical advantages. We have compared the CAST system to manually controlled laser surgery and studied the wound healing after laser incision. We have found definite advantages to the CAST system. However, the computer, alone, cannot compensate for the thermal damage lateral to the incision site. The results suggest the need for motion tracking and compensation to be a part of the CAST system.

  2. Coral surface area quantification-evaluation of established techniques by comparison with computer tomography

    NASA Astrophysics Data System (ADS)

    Naumann, M. S.; Niggl, W.; Laforsch, C.; Glaser, C.; Wild, C.

    2009-03-01

    The surface area of scleractinian corals represents an important reference parameter required for various aspects of coral reef science. However, with advancements in detection accuracy and novel approaches for coral surface area quantification, the evaluation of established techniques against state-of-the-art technology gains importance for coral researchers. This study presents an evaluation of methodological accuracy for established techniques in comparison to a novel approach composed of computer tomography (CT) and 3-dimensional surface reconstruction. The skeleton surface area of reef corals from six genera representing the most common morphological growth forms was acquired by CT and subsequently measured by computer-aided 3-dimensional surface reconstruction. Surface area estimates for the same corals were also obtained by application of four established techniques: Simple and Advanced Geometry, Wax Coating, and Planar Projection Photography. Comparison of the resulting area values revealed significant differences between the majority (82%) of established techniques and the CT reference. Genus-specific analysis assigned the highest accuracy to geometric approximations (Simple or Advanced Geometry) for the majority of assessed coral genera (maximum accuracy: 104%; Simple Geometry with Montipora sp.). The commonly used and invasive Wax Coating technique reached intermediate accuracy (47-74%) for the majority of genera, but performed outstandingly in the measurement of branching Acropora spp. corals (maximum accuracy: 101%), while Planar Projection Photography delivered low accuracy across genera (12-36%). Comparison of area values derived from established techniques and CT additionally yielded approximation factors (AFs) that can be applied to mathematically correct surface area estimates from established techniques toward CT reference accuracy.

  3. Whole-genome CNV analysis: advances in computational approaches

    PubMed Central

    Pirooznia, Mehdi; Goes, Fernando S.; Zandi, Peter P.

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development. PMID:25918519

  4. Fast and Computationally Efficient Boundary Detection Technique for Medical Images

    NASA Astrophysics Data System (ADS)

    Das, Arpita; Goswami, Partha; Sen, Susanta

    2011-03-01

    Detection of edges is a fundamental procedure of image processing. Many edge detection algorithms have been developed based on computation of the intensity gradient. In medical images, object boundaries are vague because intensities change gradually, so a computationally efficient and accurate edge detection approach is needed. We have presented such an algorithm using a modified global threshold technique. In our work, the boundaries are highlighted against the background by selecting a threshold (T) that separates object and background. Where a transition from object to background (or vice versa) occurs in the image, the pixel intensity either rises to T or above (background-to-object transition) or falls below T (object-to-background transition). We mark these transition regions as object boundary and enhance the corresponding intensity. The value of T may be specified heuristically or by a specific algorithm. The conventional global threshold algorithm computes the value of T automatically, but this approach is not computationally efficient and requires a large amount of memory. In this study, we have proposed a parameter for which the computation of T is very easy and fast. We have also shown that a fixed-size memory [256 × 4 bytes] is enough to compute this algorithm.
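
    A minimal version of the described boundary marking (an illustrative sketch, not the authors' implementation): flag a pixel as boundary wherever the thresholded image changes value between horizontal or vertical neighbors.

        import numpy as np

        def threshold_boundary(img, T):
            """Mark object/background transitions of a grayscale image.

            A pixel is a boundary pixel if the binarized image (img >= T)
            differs from its right or lower neighbor, i.e. the intensity
            crosses the threshold T between the two pixels.
            """
            binary = img >= T
            boundary = np.zeros_like(binary)
            boundary[:, :-1] |= binary[:, :-1] != binary[:, 1:]   # horizontal
            boundary[:-1, :] |= binary[:-1, :] != binary[1:, :]   # vertical
            return boundary

    The comparison against a fixed T costs a single pass over the image and needs no gradient computation, consistent with the efficiency claim above.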

  5. Computer-Assisted Technique for Surgical Tooth Extraction

    PubMed Central

    Hamza, Hosamuddin

    2016-01-01

    Introduction. Surgical tooth extraction is a common procedure in dentistry. However, numerous extraction cases show a high level of difficulty in practice. This difficulty is usually related to inadequate visualization, improper instrumentation, or other factors related to the targeted tooth (e.g., ankyloses or presence of bony undercut). Methods. In this work, the author presents a new technique for surgical tooth extraction based on 3D imaging, computer planning, and a new concept of computer-assisted manufacturing. Results. The outcome of this work is a surgical guide made by 3D printing of plastics and CNC of metals (hybrid outcome). In addition, the conventional surgical cutting tools (surgical burs) are modified with a number of stoppers adjusted to avoid any excessive drilling that could harm bone or other vital structures. Conclusion. The present outcome could provide a minimally invasive technique to overcome the routine complications facing dental surgeons in surgical extraction procedures. PMID:27127510

  6. Computer-Assisted Technique for Surgical Tooth Extraction.

    PubMed

    Hamza, Hosamuddin

    2016-01-01

    Introduction. Surgical tooth extraction is a common procedure in dentistry. However, numerous extraction cases show a high level of difficulty in practice. This difficulty is usually related to inadequate visualization, improper instrumentation, or other factors related to the targeted tooth (e.g., ankyloses or presence of bony undercut). Methods. In this work, the author presents a new technique for surgical tooth extraction based on 3D imaging, computer planning, and a new concept of computer-assisted manufacturing. Results. The outcome of this work is a surgical guide made by 3D printing of plastics and CNC of metals (hybrid outcome). In addition, the conventional surgical cutting tools (surgical burs) are modified with a number of stoppers adjusted to avoid any excessive drilling that could harm bone or other vital structures. Conclusion. The present outcome could provide a minimally invasive technique to overcome the routine complications facing dental surgeons in surgical extraction procedures. PMID:27127510

  7. Discriminating coastal rangeland production and improvements with computer aided techniques

    NASA Technical Reports Server (NTRS)

    Reeves, C. A.; Faulkner, D. P.

    1975-01-01

    The feasibility and utility of using satellite data and computer-aided remote sensing analysis techniques to conduct range inventories were tested. This pilot study focused on a 250,000-acre site in Galveston and Brazoria Counties along the Texas Gulf Coast. Rectified, enlarged aircraft color infrared photographs of this site were used as the ground truth base. The different land categories were identified, delineated, and measured. Multispectral scanner (MSS) bulk data from LANDSAT-1 were received and analyzed with the Image 100 pattern recognition system. Features of interest were delineated on the image console, giving the number of picture elements classified; the picture elements were converted to acreages, and the accuracy of the technique was evaluated by comparison with data base results for three test sites. The accuracies for computer-aided classification of coastal marshes ranged from 89% to 96%.

  8. Advanced techniques for constrained internal coordinate molecular dynamics.

    PubMed

    Wagner, Jeffrey R; Balaraman, Gouthaman S; Niesen, Michiel J M; Larsen, Adrien B; Jain, Abhinandan; Vaidehi, Nagarajan

    2013-04-30

    Internal coordinate molecular dynamics (ICMD) methods provide a more natural description of a protein by using bond, angle, and torsional coordinates instead of a Cartesian coordinate representation. Freezing high-frequency bonds and angles in the ICMD model gives rise to constrained ICMD (CICMD) models. There are several theoretical aspects that need to be developed to make the CICMD method robust and widely usable. In this article, we have designed a new framework for (1) initializing velocities for nonindependent CICMD coordinates, (2) efficient computation of center of mass velocity during CICMD simulations, (3) using advanced integrators such as Runge-Kutta, Lobatto, and adaptive CVODE for CICMD simulations, and (4) cancelling out the "flying ice cube effect" that sometimes arises in Nosé-Hoover dynamics. The Generalized Newton-Euler Inverse Mass Operator (GNEIMO) method is an implementation of a CICMD method that we have developed to study protein dynamics. GNEIMO allows for a hierarchy of coarse-grained simulation models based on the ability to rigidly constrain any group of atoms. In this article, we perform tests on the Lobatto and Runge-Kutta integrators to determine optimal simulation parameters. We also implement an adaptive coarse-graining tool using the GNEIMO Python interface. This tool enables the secondary structure-guided "freezing and thawing" of degrees of freedom in the molecule on the fly during molecular dynamics simulations and is shown to fold four proteins to their native topologies. With these advancements, we envision the use of the GNEIMO method in protein structure prediction, structure refinement, and in studying domain motion.

  9. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  10. Efficient Computational Techniques for Electromagnetic Propagation and Scattering.

    NASA Astrophysics Data System (ADS)

    Wagner, Robert Louis

    Electromagnetic propagation and scattering problems are important in many application areas such as communications, high-speed circuitry, medical imaging, geophysical remote sensing, nondestructive testing, and radar. This thesis develops several new techniques for the efficient computer solution of such problems. Most of this thesis deals with the efficient solution of electromagnetic scattering problems formulated as surface integral equations. A standard method of moments (MOM) formulation is used to reduce the problem to the solution of a dense, N times N matrix equation, where N is the number of surface current unknowns. An iterative solution technique is used, requiring the computation of many matrix-vector multiplications. Techniques developed for this problem include the ray-propagation fast multipole algorithm (RPFMA), which is a simple, non-nested, physically intuitive technique based on the fast multipole method (FMM). The RPFMA is implemented for two-dimensional surface integral equations, and reduces the cost of a matrix-vector multiplication from O(N^2) to O(N^ {4/3}). The use of wavelets is also studied for the solution of two-dimensional surface integral equations. It is shown that the use of wavelets as basis functions produces a MOM matrix with substantial sparsity. However, unlike the RPFMA, the use of a wavelet basis does not reduce the computational complexity of the problem. In other words, the sparse MOM matrix in the wavelet basis still has O(N ^2) significant entries. The fast multipole method-fast Fourier transform (FMM-FFT) method is developed to compute the scattering of an electromagnetic wave from a two-dimensional rough surface. The resulting algorithm computes a matrix-vector multiply in O(N log N) operations. This algorithm is shown to be more efficient than another O(N log N) algorithm, the multi-level fast multipole algorithm (MLFMA), for surfaces of small height. For surfaces with larger roughness, the MLFMA is found to be more

  11. Techniques for animation of CFD results. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Horowitz, Jay; Hanson, Jeffery C.

    1992-01-01

    Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.

  12. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  13. Software for the ACP (Advanced Computer Program) multiprocessor system

    SciTech Connect

    Biel, J.; Areti, H.; Atac, R.; Cook, A.; Fischler, M.; Gaines, I.; Kaliher, C.; Hance, R.; Husby, D.; Nash, T.

    1987-02-02

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system.

  14. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances from the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, the main results of the finished EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  15. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

    This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object-oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448
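
    As a rough illustration of the equivalent mass approach, the sketch below folds volume, power, cooling, and crew time into a single mass-equivalent figure using cost factors. All factor values and subsystem numbers are made-up placeholders, not the Volosin cost factor equivalencies used in the paper.

```python
# Illustrative equivalent-system-mass (ESM) calculation: non-mass resources
# are converted to mass-equivalent units via cost factors (kg per m^3, per kW,
# per crew-hour). The numeric values below are placeholders only.
def esm(mass_kg, volume_m3, power_kw, cooling_kw, crewtime_h,
        veq=66.7, peq=237.0, ceq=60.0, cteq=1.0):
    return (mass_kg + volume_m3 * veq + power_kw * peq
            + cooling_kw * ceq + crewtime_h * cteq)

# Compare a compact physico-chemical subsystem with a larger biological one.
print(esm(500, 2.0, 1.5, 1.5, 200))      # option A
print(esm(1200, 20.0, 10.0, 10.0, 800))  # option B
```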

  16. Computational analysis of semi-span model test techniques

    NASA Technical Reports Server (NTRS)

    Milholen, William E., II; Chokani, Ndaona

    1996-01-01

    A computational investigation was conducted to support the development of a semi-span model test capability in the NASA LaRC's National Transonic Facility. This capability is required for the testing of high-lift systems at flight Reynolds numbers. A three-dimensional Navier-Stokes solver was used to compute the low-speed flow over both a full-span configuration and a semi-span configuration. The computational results were found to be in good agreement with the experimental data. The computational results indicate that the stand-off height has a strong influence on the flow over a semi-span model. The semi-span model adequately replicates the aerodynamic characteristics of the full-span configuration when a small stand-off height, approximately twice the tunnel empty sidewall boundary layer displacement thickness, is used. Several active sidewall boundary layer control techniques were examined including: upstream blowing, local jet blowing, and sidewall suction. Both upstream tangential blowing, and sidewall suction were found to minimize the separation of the sidewall boundary layer ahead of the semi-span model. The required mass flow rates are found to be practicable for testing in the NTF. For the configuration examined, the active sidewall boundary layer control techniques were found to be necessary only near the maximum lift conditions.

  17. Advances in parallel computer technology for desktop atmospheric dispersion models

    SciTech Connect

    Bian, X.; Ionescu-Niscov, S.; Fast, J.D.; Allwine, K.J.

    1996-12-31

    Desktop models are those models used by analysts with varied backgrounds, for performing, for example, air quality assessment and emergency response activities. These models must be robust, well documented, have minimal and well controlled user inputs, and have clear outputs. Existing coarse-grained parallel computers can provide significant increases in computation speed in desktop atmospheric dispersion modeling without considerable increases in hardware cost. This increased speed will allow for significant improvements to be made in the scientific foundations of these applied models, in the form of more advanced diffusion schemes and better representation of the wind and turbulence fields. This is especially attractive for emergency response applications where speed and accuracy are of utmost importance. This paper describes one particular application of coarse-grained parallel computer technology to a desktop complex terrain atmospheric dispersion modeling system. By comparing performance characteristics of the coarse-grained parallel version of the model with the single-processor version, we will demonstrate that applying coarse-grained parallel computer technology to desktop atmospheric dispersion modeling systems will allow us to address critical issues facing future requirements of this class of dispersion models.

  18. Assessment technique for computer-aided manufactured sockets.

    PubMed

    Sanders, Joan E; Severance, Michael R

    2011-01-01

    This article presents an assessment technique for testing the quality of prosthetic socket fabrication processes at computer-aided manufacturing facilities. The assessment technique is potentially useful to both facilities making sockets and companies marketing manufacturing equipment seeking to assess and improve product quality. To execute the assessment technique, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation, then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file (e-file) shapes. To characterize carving performance, model shapes are compared with e-file shapes. To characterize forming performance, socket shapes are compared with model shapes. The mean radial error (MRE), which is the average difference in radii between the two compared shapes, provides insight into sizing quality. Interquartile range (IQR), the range of radial error for the best-matched half of the points on the compared socket surfaces, provides insight into regional shape quality. The source(s) of socket shape error may be pinpointed by separately determining MRE and IQR for carving and forming. The developed assessment technique may provide a useful tool to the prosthetics community and industry to help identify problems and limitations in computer-aided manufacturing and give insight into appropriate modifications to overcome them. PMID:21938663
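
    The two metrics are simple to state in code. A minimal sketch, assuming both shapes have been sampled at matching surface points so radii can be compared point-by-point:

```python
# MRE and IQR as described above (radii in consistent units, e.g. mm).
import numpy as np

def mean_radial_error(r_test, r_ref):
    # Signed average difference in radii: insight into overall sizing.
    return float(np.mean(r_test - r_ref))

def interquartile_range(r_test, r_ref):
    # Spread of radial error over the best-matched half of the points:
    # insight into regional shape quality.
    err = r_test - r_ref
    q25, q75 = np.percentile(err, [25, 75])
    return float(q75 - q25)

r_ref = np.full(500, 40.0)                      # reference (e-file) radii
r_test = r_ref + np.random.default_rng(1).normal(0.5, 0.3, 500)
print(mean_radial_error(r_test, r_ref), interquartile_range(r_test, r_ref))
```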

  20. Brain perfusion: computed tomography and magnetic resonance techniques.

    PubMed

    Copen, William A; Lev, Michael H; Rapalino, Otto

    2016-01-01

    Cerebral perfusion imaging provides assessment of regional microvascular hemodynamics in the living brain, enabling in vivo measurement of a variety of different hemodynamic parameters. Perfusion imaging techniques that are used in the clinical setting usually rely upon X-ray computed tomography (CT) or magnetic resonance imaging (MRI). This chapter reviews CT- and MRI-based perfusion imaging techniques, with attention to image acquisition, clinically relevant aspects of image postprocessing, and fundamental differences between CT- and MRI-based techniques. Correlations with cerebrovascular physiology and potential clinical applications of perfusion imaging are reviewed, focusing upon the two major classes of neurologic disease in which perfusion imaging is most often performed: primary perfusion disorders (including ischemic stroke, transient ischemic attack, and reperfusion syndrome), and brain tumors.

  1. Determining flexor-tendon repair techniques via soft computing

    NASA Technical Reports Server (NTRS)

    Johnson, M.; Firoozbakhsh, K.; Moniem, M.; Jamshidi, M.

    2001-01-01

    An SC-based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and under variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method results in the need to make ad hoc decisions when the outcomes for individual objectives contradict a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous, straightforward computational process in which changing preferences and the importance of differing objectives are easily accommodated. Also, adding more objectives is straightforward and easily accomplished. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.
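
    A minimal sketch of the kind of fuzzy multi-objective aggregation described here (not the authors' exact formulation): membership grades score each repair technique against each objective, and adjustable weights encode the preferences of a particular physician or circumstance.

```python
# Fuzzy weighted aggregation sketch; techniques, objectives, grades, and
# weights are all illustrative placeholders, not data from the study.
import numpy as np

techniques = ["2-strand", "4-strand", "6-strand"]
# Rows: technique; columns: objectives (strength, glide resistance, ease).
membership = np.array([[0.55, 0.90, 0.95],
                       [0.80, 0.70, 0.70],
                       [0.95, 0.45, 0.40]])
weights = np.array([0.5, 0.3, 0.2])   # change as preferences change

scores = membership @ weights
print(techniques[int(np.argmax(scores))])  # highest aggregate score wins
```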

  2. A review of hemorheology: Measuring techniques and recent advances

    NASA Astrophysics Data System (ADS)

    Sousa, Patrícia C.; Pinho, Fernando T.; Alves, Manuel A.; Oliveira, Mónica S. N.

    2016-02-01

    Significant progress has been made over the years on the topic of hemorheology, not only in terms of the development of more accurate and sophisticated techniques, but also in terms of understanding the phenomena associated with blood components, their interactions and impact upon blood properties. The rheological properties of blood are strongly dependent on the interactions and mechanical properties of red blood cells, and a variation of these properties can bring further insight into the human health state and can be an important parameter in clinical diagnosis. In this article, we provide both a reference for hemorheological research and a resource regarding the fundamental concepts in hemorheology. This review is aimed at those starting in the field of hemodynamics, where blood rheology plays a significant role, but also at those in search of the most up-to-date findings (both qualitative and quantitative) in hemorheological measurements and novel techniques used in this context, including technical advances under more extreme conditions such as in large amplitude oscillatory shear flow or under extensional flow, which impose large deformations comparable to those found in the microcirculatory system and in diseased vessels. Given the impressive rate of increase in the available knowledge on blood flow, this review is also intended to identify areas where current knowledge is still incomplete, and which have the potential for new, exciting and useful research. We also discuss the most important parameters that can lead to an alteration of blood rheology, and which as a consequence can have a significant impact on the normal physiological behavior of blood.

  3. Advances in Poly(4-aminodiphenylaniline) Nanofibers Preparation by Electrospinning Technique.

    PubMed

    Della Pina, C; Busacca, C; Frontera, P; Antonucci, P L; Scarpino, L A; Sironi, A; Falletta, E

    2016-05-01

    Polyaniline (PANI) nanofibers are drawing a great deal of interest from academia and industry due to their multiple applications, especially in the biomedical field. PANI nanofibers were successfully electrospun for the first time by MacDiarmid and co-workers at the beginning of the millennium, and since then much effort has been devoted to improving their quality. However, traditional PANI prepared from aniline monomer shows some drawbacks, such as the presence of toxic (e.g., benzidine) and inorganic (salts and metals) co-products that complicate polymer post-treatment, and low solubility in common organic solvents, which makes it hard to process by the electrospinning technique. Some industrial sectors, such as medical and biomedical, need to employ materials free from toxic and polluting species. In this regard, the oxidative polymerization of N-(4-aminophenyl)aniline, the aniline dimer, to produce poly(4-aminodiphenylaniline), P4ADA, a kind of PANI, represents an innovative alternative to the traditional synthesis because the obtained polymer is free from carcinogenic and/or polluting co-products and, moreover, is more soluble than traditional PANI. This latter feature can be exploited to obtain P4ADA nanofibers by the electrospinning technique. In this paper we report the advances obtained in P4ADA nanofiber electrospinning. A comparison among polyethylene oxide (PEO), polymethyl methacrylate (PMMA) and polystyrene (PS) as the second polymer to facilitate the electrospinning process is shown. In order to increase the conductivity of P4ADA nanofibers, two strategies were adopted and compared: selective removal of the insulating binder from electrospun nanofibers by a rinsing treatment, and optimization of the minimum amount of binder necessary for the electrospinning process. Moreover, the effect of the PEO/P4ADA weight ratio on fiber morphology and conductivity was highlighted. PMID:27483933

  5. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.

  6. Advanced Techniques for Power System Identification from Measured Data

    SciTech Connect

    Pierre, John W.; Wies, Richard; Trudnowski, Daniel

    2008-11-25

    Time-synchronized measurements provide rich information for estimating a power-system's electromechanical modal properties via advanced signal processing. This information is becoming critical for the improved operational reliability of interconnected grids. A given mode's properties are described by its frequency, damping, and shape. Modal frequencies and damping are useful indicators of power-system stress, usually declining with increased load or reduced grid capacity. Mode shape provides critical information for operational control actions. This project investigated many advanced techniques for power system identification from measured data, focusing on mode frequency and damping ratio estimation. Investigators from the three universities coordinated their effort with Pacific Northwest National Laboratory (PNNL). Significant progress was made on developing appropriate techniques for system identification with confidence intervals and on testing those techniques on field-measured data and through simulation. Experimental data from the western area power system was provided by PNNL and Bonneville Power Administration (BPA) for both ambient conditions and for signal injection tests. Three large-scale tests were conducted for the western area in 2005 and 2006. Measured field PMU (Phasor Measurement Unit) data was provided to the three universities. A 19-machine simulation model was enhanced for testing the system identification algorithms. Extensive simulations were run with this model to test the performance of the algorithms. University of Wyoming researchers participated in four primary activities: (1) block and adaptive processing techniques for mode estimation from ambient signals and probing signals, (2) confidence interval estimation, (3) probing signal design and injection method analysis, and (4) performance assessment and validation from simulated and field-measured data. Subspace-based methods have been used to improve previous results from block processing.
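
    One standard block-processing approach to mode estimation fits an autoregressive model to measured data and maps its discrete-time poles to modal frequency and damping ratio. The sketch below is a bare-bones version of that idea, not the project's algorithms, and omits the confidence-interval machinery.

```python
# Least-squares AR fit of a measured signal; poles give frequency and damping.
import numpy as np

def ar_modes(y, order, dt):
    # Fit y[n] = a1*y[n-1] + ... + ap*y[n-p] by least squares.
    Y = np.column_stack([y[order - k - 1:len(y) - k - 1] for k in range(order)])
    a = np.linalg.lstsq(Y, y[order:], rcond=None)[0]
    poles = np.roots(np.concatenate(([1.0], -a)))   # discrete-time poles
    s = np.log(poles) / dt                          # continuous-time poles
    return np.abs(s.imag) / (2 * np.pi), -s.real / np.abs(s)

# Synthetic 0.25 Hz inter-area ringdown with 5% damping, sampled at 10 Hz.
dt = 0.1
t = np.arange(0, 60, dt)
wn, zeta = 2 * np.pi * 0.25, 0.05
y = np.exp(-zeta * wn * t) * np.cos(wn * np.sqrt(1 - zeta**2) * t)
print(ar_modes(y, order=2, dt=dt))   # ~0.25 Hz, ~0.05 damping ratio
```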

  7. Unification of color postprocessing techniques for 3-dimensional computational mechanics

    NASA Technical Reports Server (NTRS)

    Bailey, Bruce Charles

    1985-01-01

    To facilitate the understanding of complex three-dimensional numerical models, advanced interactive color postprocessing techniques are introduced. These techniques are sufficiently flexible so that postprocessing difficulties arising from model size, geometric complexity, response variation, and analysis type can be adequately overcome. Finite element, finite difference, and boundary element models may be evaluated with the prototype postprocessor. Elements may be removed from parent models to be studied as independent subobjects. Discontinuous responses may be contoured including responses which become singular, and nonlinear color scales may be input by the user for the enhancement of the contouring operation. Hit testing can be performed to extract precise geometric, response, mesh, or material information from the database. In addition, stress intensity factors may be contoured along the crack front of a fracture model. Stepwise analyses can be studied, and the user can recontour responses repeatedly, as if he were paging through the response sets. As a system, these tools allow effective interpretation of complex analysis results.

  8. Nanocrystalline materials: recent advances in crystallographic characterization techniques.

    PubMed

    Ringe, Emilie

    2014-11-01

    Most properties of nanocrystalline materials are shape-dependent, providing their exquisite tunability in optical, mechanical, electronic and catalytic properties. An example of the former is localized surface plasmon resonance (LSPR), the coherent oscillation of conduction electrons in metals that can be excited by the electric field of light; this resonance frequency is highly dependent on both the size and shape of a nanocrystal. An example of the latter is the marked difference in catalytic activity observed for different Pd nanoparticles. Such examples highlight the importance of particle shape in nanocrystalline materials and their practical applications. However, one may ask 'how are nanoshapes created?', 'how does the shape relate to the atomic packing and crystallography of the material?', 'how can we control and characterize the external shape and crystal structure of such small nanocrystals?'. This feature article aims to give the reader an overview of important techniques, concepts and recent advances related to these questions. Nucleation, growth and how seed crystallography influences the final synthesis product are discussed, followed by shape prediction models based on seed crystallography and thermodynamic or kinetic parameters. The crystallographic implications of epitaxy and orientation in multilayered, core-shell nanoparticles are overviewed, and, finally, the development and implications of novel, spatially resolved analysis tools are discussed.

  9. Achieving miniature sensor systems via advanced packaging techniques

    NASA Astrophysics Data System (ADS)

    Hartup, David C.; Bobier, Kevin; Demmin, Jeffrey

    2005-05-01

    Demands for miniaturized networked sensors that can be deployed in large quantities dictate that the packages be small and cost effective. In order to accomplish these objectives, system developers generally apply advanced packaging techniques to proven systems. A partnership of Nova Engineering and Tessera begins with a baseline of Nova's Unattended Ground Sensors (UGS) technology and utilizes Tessera's three-dimensional (3D) Chip-Scale Packaging (CSP), Multi-Chip Packaging (MCP), and System-in-Package (SIP) innovations to enable novel methods for fabricating compact, vertically integrated sensors utilizing digital, RF, and micro-electromechanical systems (MEMS) devices. These technologies, applied to a variety of sensors and integrated radio architectures, enable diverse multi-modal sensing networks with wireless communication capabilities. Sensors including imaging, accelerometers, acoustical, inertial measurement units, and gas and pressure sensors can be utilized. The greatest challenge to high density, multi-modal sensor networks is the ability to test each component prior to integration, commonly called Known Good Die (KGD) testing. In addition, the mix of multi-sourcing and high technology magnifies the challenge of testing at the die level. Utilizing Tessera proprietary CSP, MCP, and SIP interconnection methods enables fully testable, low profile stacking to create multi-modal sensor radios with high yield.

  10. Removing baseline flame's spectrum by using advanced recovering spectrum techniques.

    PubMed

    Arias, Luis; Sbarbaro, Daniel; Torres, Sergio

    2012-09-01

    In this paper, a novel automated algorithm to estimate and remove the continuous baseline from measured flame spectra is proposed. The algorithm estimates the continuous background based on prior information obtained from a learning database of continuous flame spectra. Then, the discontinuous flame emission is calculated by subtracting the estimated continuous baseline from the measured spectrum. The key observation underlying the learning database is that continuous flame emissions are predominant in the sooty regions, in the absence of discontinuous radiation. The proposed algorithm was tested using natural gas and bio-oil flame spectra at different combustion conditions, and the goodness-of-fit coefficient (GFC) quality metric was used to quantify the performance of the estimation process. Additionally, the commonly used first derivative method (FDM) for baseline removal was applied to the same testing spectra in order to compare and evaluate the proposed technique. The achieved results show that the proposed method is a very attractive tool for designing advanced combustion monitoring strategies for discontinuous emissions. PMID:22945158
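
    The estimate-and-subtract idea can be sketched as a least-squares fit of the measured spectrum to a library of learned continuous spectra, with the GFC metric used to score the fit. This is a simplified illustration, and the random library below is a placeholder for real measured sooty-region spectra.

```python
# Baseline estimation from a learned library, plus the GFC quality metric.
import numpy as np

def remove_baseline(measured, library):
    # library: (n_wavelengths, n_spectra) matrix of continuous emissions.
    coef, *_ = np.linalg.lstsq(library, measured, rcond=None)
    baseline = library @ coef
    return measured - baseline, baseline

def gfc(a, b):
    # Goodness-of-fit coefficient: |<a,b>| / (||a||*||b||); 1.0 is perfect.
    return abs(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

rng = np.random.default_rng(0)
library = np.abs(rng.standard_normal((200, 12)))    # placeholder library
measured = library @ rng.random(12) \
    + np.exp(-((np.arange(200) - 80.0) ** 2) / 20.0)  # discontinuous peak
discontinuous, baseline = remove_baseline(measured, library)
print(gfc(baseline, measured))
```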

  11. Development of advanced strain diagnostic techniques for reactor environments.

    SciTech Connect

    Fleming, Darryn D.; Holschuh, Thomas Vernon,; Miller, Timothy J.; Hall, Aaron Christopher; Urrea, David Anthony,; Parma, Edward J.,

    2013-02-01

    The following research was conducted as a Laboratory Directed Research and Development (LDRD) initiative at Sandia National Laboratories. The long-term goals of the program include sophisticated diagnostics of advanced fuels testing for nuclear reactors for the Department of Energy (DOE) Gen IV program, with the future capability to provide real-time measurement of strain in fuel rod cladding during operation in situ at any research or power reactor in the United States. By quantifying the stress and strain in fuel rods, it is possible to significantly improve fuel rod design, and consequently, to improve the performance and lifetime of the cladding. During the past year of this program, two sets of experiments were performed: small-scale tests to ensure reliability of the gages, and reactor pulse experiments involving the most viable samples in the Annular Core Research Reactor (ACRR), located onsite at Sandia. Strain measurement techniques that can provide useful data in the extreme environment of a nuclear reactor core are needed to characterize nuclear fuel rods. This report documents the progression of solutions to this issue that were explored for feasibility in FY12 at Sandia National Laboratories, Albuquerque, NM.

  13. Advanced Computational and Experimental Techniques for Nacelle Liner Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Gerhold, Carl H.; Jones, Michael G.; Brown, Martha C.; Nark, Douglas

    2009-01-01

    The Curved Duct Test Rig (CDTR) has been developed to investigate sound propagation through a duct of size comparable to the aft bypass duct of typical aircraft engines. The axial dimension of the bypass duct is often curved and this geometric characteristic is captured in the CDTR. The semiannular bypass duct is simulated by a rectangular test section in which the height corresponds to the circumferential dimension and the width corresponds to the radial dimension. The liner samples are perforate over honeycomb core and are installed on the side walls of the test section. The top and bottom surfaces of the test section are acoustically rigid to simulate a hard wall bifurcation or pylon. A unique feature of the CDTR is the control system that generates sound incident on the liner test section in specific modes. Uniform air flow, at ambient temperature and flow speed Mach 0.275, is introduced through the duct. Experiments to investigate configuration effects such as curvature along the flow path on the acoustic performance of a sample liner are performed in the CDTR and reported in this paper. Combinations of treated and acoustically rigid side walls are investigated. The scattering of modes of the incident wave, both by the curvature and by the asymmetry of wall treatment, is demonstrated in the experimental results. The effect that mode scattering has on total acoustic effectiveness of the liner treatment is also shown. Comparisons of measured liner attenuation with numerical results predicted by an analytic model based on the parabolic approximation to the convected Helmholtz equation are reported. The spectra of attenuation produced by the analytic model are similar to experimental results for both walls treated, straight and curved flow path, with plane wave and higher order modes incident. The numerical model is used to define the optimized resistance and reactance of a liner that significantly improves liner attenuation in the frequency range 1900-2400 Hz. A liner impedance descriptor is used to determine the liner parameters that achieve the optimum impedance.

  14. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  15. Satellite communication performance evaluation: Computational techniques based on moments

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1980-01-01

    Computational techniques that efficiently compute bit error probabilities when only moments of the various interference random variables are available are presented. The approach taken is a generalization of the well-known Gauss-Quadrature rules used for numerically evaluating single or multiple integrals. In what follows, basic algorithms are developed, some of their properties and generalizations are shown, and their many potential applications are described. Some typical interference scenarios for which the results are particularly applicable include: intentional jamming; adjacent and cochannel interference; radar pulses (RFI); multipath; and intersymbol interference. While the examples presented stress evaluation of bit error probabilities in uncoded digital communication systems, the moment techniques can also be applied to the evaluation of other parameters, such as computational cutoff rate under both normal and mismatched receiver cases in coded systems. Another important application is the determination of the probability distributions of the output of a discrete time dynamical system. This type of model occurs widely in control systems, queueing systems, and synchronization systems (e.g., discrete phase locked loops).
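
    The core construction, building a Gauss quadrature rule directly from moments (the Golub-Welsch procedure) and using it to average a bit-error expression, can be sketched as follows. This illustrates the general idea, not the paper's specific algorithms.

```python
# n-point Gauss quadrature from the first 2n+1 moments of an interference
# variable, used to estimate a bit-error probability E[Q((s + v)/sigma)].
import numpy as np
from math import erfc, sqrt

def gauss_from_moments(m):
    # m: moments m_0..m_{2n}; returns n nodes and weights (Golub-Welsch).
    n = (len(m) - 1) // 2
    H = np.array([[m[i + j] for j in range(n + 1)] for i in range(n + 1)])
    R = np.linalg.cholesky(H).T                    # upper-triangular factor
    alpha = np.array([R[j, j + 1] / R[j, j]
                      - (R[j - 1, j] / R[j - 1, j - 1] if j else 0.0)
                      for j in range(n)])
    beta = np.array([R[j, j] / R[j - 1, j - 1] for j in range(1, n)])
    J = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    nodes, vecs = np.linalg.eigh(J)                # Jacobi matrix eigensystem
    weights = m[0] * vecs[0, :] ** 2
    return nodes, weights

Q = lambda x: 0.5 * erfc(x / sqrt(2.0))

# Standard-normal interference: moments 1, 0, 1, 0, 3 give a 2-point rule.
nodes, weights = gauss_from_moments([1.0, 0.0, 1.0, 0.0, 3.0])
signal, sigma = 1.0, 0.5
pe = sum(w * Q((signal + x) / sigma) for x, w in zip(nodes, weights))
print(nodes, weights, pe)
```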

  16. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  17. Computational intelligence techniques for biological data mining: An overview

    NASA Astrophysics Data System (ADS)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of multifaceted, raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective in overcoming the limitations of traditional in-vitro experiments on constantly increasing sequence data. The most critical problems that have caught the attention of researchers include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, analysis of microarray gene expression data, etc. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision tree and HMM-based algorithms. Major difficulties in applying the above algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy, and decreasing the running time overheads of the learning algorithms. This research would be potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.

  18. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    NASA Astrophysics Data System (ADS)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data, or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a developed package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique, and its application in a study area. The studied area comprises the Federal District of Brazil, covering ~6000 km² of wavy relief with heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the studied area show the local geoid model computed by the GRAVTool package using 1377 terrestrial gravity data, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ±0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ±0.073 m) of 21 randomly spaced points where the geoid was computed by the geometric leveling technique supported by GNSS positioning. The results were also better than those achieved by the Brazilian official regional geoid model (σ = ±0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
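
    The remove-compute-restore arithmetic itself is compact. The sketch below shows the three steps with the classical Stokes kernel and a crude discrete integration; it is a schematic of the technique under simplifying assumptions (spherical approximation, single computation point), not GRAVTool code.

```python
# Schematic RCR: long wavelengths from a global model (e.g. EIGEN-6C4),
# short wavelengths from the DTM, Stokes integration of the residual only.
import numpy as np

def stokes_kernel(psi):
    # Classical Stokes function S(psi); psi = spherical distance in radians.
    s = np.sin(psi / 2.0)
    return (1.0 / s - 6.0 * s + 1.0 - 5.0 * np.cos(psi)
            - 3.0 * np.cos(psi) * np.log(s + s * s))

def rcr_geoid_at_point(dg_obs, dg_ggm, dg_rtm, N_ggm, N_rtm, psi, cell_area,
                       R=6371e3, gamma=9.81):
    # Remove: strip global-model and terrain signals (anomalies in mGal).
    dg_res = dg_obs - dg_ggm - dg_rtm
    # Compute: discrete Stokes integration of residual anomalies (to m/s^2).
    N_res = (R / (4 * np.pi * gamma)) * np.sum(
        dg_res * 1e-5 * stokes_kernel(psi) * cell_area)
    # Restore: add back global-model and terrain geoid contributions.
    return N_ggm + N_res + N_rtm
```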

  19. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models, excessive computational time for accurate simulations and difficulties with the interpretation of the results. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performance-based energy codes, resulting in better, more efficient designs for our future built environment.

  20. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently-developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  1. An overview of energy efficiency techniques in cluster computing systems

    SciTech Connect

    Valentini, Giorgio Luigi; Lassonde, Walter; Khan, Samee Ullah; Min-Allah, Nasro; Madani, Sajjad A.; Li, Juan; Zhang, Limin; Wang, Lizhe; Ghani, Nasir; Kolodziej, Joanna; Li, Hongxiang; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal

    2011-09-10

    Two major constraints demand more consideration for energy efficiency in cluster computing: (a) operational costs, and (b) system reliability. Increasing energy efficiency in cluster systems will reduce energy consumption and excess heat, lower operational costs, and improve system reliability. Based on the energy-power relationship, and the fact that energy consumption can be reduced with strategic power management, this survey focuses on the characteristics of two main power management technologies: (a) static power management (SPM) systems that utilize low-power components to save energy, and (b) dynamic power management (DPM) systems that utilize software and power-scalable components to optimize energy consumption. We present the current state of the art in both SPM and DPM techniques, citing representative examples. The survey concludes with a brief discussion of possible future directions that could be explored to improve energy efficiency in cluster computing.
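
    The energy-power trade-off behind DPM can be illustrated with a toy dynamic-power model in which power scales roughly as C·V²·f and voltage scales with frequency. The capacitance, voltage, and idle-power numbers below are made up for illustration.

```python
# Toy DVFS comparison: halving the clock can cut total job energy even
# though the job runs longer, because dynamic power falls with V^2 * f.
def job_energy(work_cycles, f_hz, c_farads, v_volts, p_idle_w, deadline_s):
    t_busy = work_cycles / f_hz              # time spent computing
    p_dyn = c_farads * v_volts**2 * f_hz     # dynamic power while busy
    return p_dyn * t_busy + p_idle_w * (deadline_s - t_busy)

work, deadline = 2e9, 4.0
print(job_energy(work, 2e9, 1e-9, 1.2, 5.0, deadline))  # race-to-idle, 2 GHz
print(job_energy(work, 1e9, 1e-9, 0.9, 5.0, deadline))  # scaled down, 1 GHz
```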

  2. A computer graphics display and data compression technique

    NASA Technical Reports Server (NTRS)

    Teague, M. J.; Meyer, H. G.; Levenson, L. (Editor)

    1974-01-01

    The computer program discussed is intended for the graphical presentation of a general dependent variable X that is a function of two independent variables, U and V. The required input to the program is the variation of the dependent variable with one of the independent variables for various fixed values of the other. The computer program is named CRP, and the output is provided by the SD 4060 plotter. Program CRP is extremely flexible and offers the user a wide variety of options. The dependent variable may be presented in either a linear or a logarithmic manner. Automatic centering of the plot is provided in the ordinate direction, and the abscissa is scaled automatically for a logarithmic plot. A description of the carpet plot technique is given along with the coordinate system used in the program. Various aspects of the program logic are discussed and detailed documentation of the data card format is presented.
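
    A minimal sketch of the carpet-plot presentation style (matplotlib standing in for the SD 4060 plotter, with a made-up function for X): curves of X versus U for fixed values of V are staggered horizontally so the family of curves reads as a surface.

```python
# Carpet plot sketch: one staggered curve per fixed value of V.
import numpy as np
import matplotlib.pyplot as plt

U = np.linspace(0.0, 1.0, 30)
V_values = [1.0, 2.0, 3.0, 4.0]
offset = 0.4                                  # abscissa shift per V curve

for k, V in enumerate(V_values):
    X = V * np.sin(np.pi * U)                 # stand-in dependent variable
    plt.plot(U + k * offset, X, "k-")
    plt.annotate(f"V={V}", (U[-1] + k * offset, X[-1]))
plt.ylabel("X")
plt.xticks([])                                # carpet abscissa is not metric
plt.show()
```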

  3. SLDR: a computational technique to identify novel genetic regulatory relationships

    PubMed Central

    2014-01-01

    We developed a new computational technique called Step-Level Differential Response (SLDR) to identify genetic regulatory relationships. Our technique takes advantage of functional genomics data for the same species under different perturbation conditions, and is therefore complementary to current popular computational techniques. It can particularly identify "rare" activation/inhibition relationship events that can be difficult to find in experimental results. In SLDR, we model each candidate target gene as being controlled by N binary-state regulators that lead to ≤2^N observable states ("step-levels") for the target. We applied SLDR to the study of the GEO microarray data set GSE25644, which consists of 158 different mutant S. cerevisiae gene expression profiles. For each target gene t, we first clustered ordered samples into various clusters, each approximating an observable step-level of t, to screen out the "de-centric" targets. Then, we took each gene x as a candidate regulator and aligned t to x, examining the step-level correlations between the low expression set of x (Ro) and the high expression set of x (Rh) from the regulator x to t, by finding max f(t, x): |Ro-Rh| over all candidate regulators x in the genome for each t. We therefore obtained activation and inhibition events from different combinations of Ro and Rh. Furthermore, we developed criteria for filtering out less-confident regulators, estimated the number of regulators for each target t, and evaluated the identified top-ranking regulator-target relationships. Our results can be cross-validated with the Yeast Fitness database. SLDR is also computationally efficient, with O(N^2) complexity. In summary, we believe SLDR can be applied to the mining of functional genomics big data for future network biology and network medicine applications. PMID:25350940
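
    Stripped of the clustering and filtering machinery, the scoring step can be sketched as below: partition the samples by a candidate regulator's low/high expression and score the target's differential response. This is a simplification of SLDR for illustration, not the published pipeline.

```python
# Simplified SLDR-style scoring: |Ro - Rh| over a low/high split of the
# candidate regulator's expression (median split stands in for step-level
# clustering).
import numpy as np

def sldr_score(target_expr, regulator_expr):
    lo = target_expr[regulator_expr <= np.median(regulator_expr)]   # Ro
    hi = target_expr[regulator_expr > np.median(regulator_expr)]    # Rh
    return abs(lo.mean() - hi.mean())                               # |Ro - Rh|

def rank_regulators(expr, target_idx):
    # expr: (genes, samples) matrix; returns candidate regulators, best first.
    t = expr[target_idx]
    scores = [sldr_score(t, expr[x]) if x != target_idx else -np.inf
              for x in range(expr.shape[0])]
    return np.argsort(scores)[::-1]
```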

  4. Advanced pattern-matching techniques for autonomous acquisition

    NASA Astrophysics Data System (ADS)

    Narendra, P. M.; Westover, B. L.

    1981-01-01

    The key objective of this effort is the development of pattern-matching algorithms which can impart autonomous acquisition capability to precision-guided munitions such as Copperhead and Hellfire. Autonomous acquisition through pattern matching holds the promise of eliminating laser designation and enhancing fire power by multiple target prioritization. The pattern-matching approach being developed under this program is based on a symbolic pattern-matching framework, which is suited for the autonomous acquisition scenario. It is based on matching a symbolic representation derived from the two images, and it can accommodate the stringent pattern-matching criteria established by the scenario: enormous differences in the scene perspective, aspect and range between the two sensors, differences in sensor characteristics and illumination, and scene changes such as target motion and obscuration from one viewpoint to the other. This report contains a description of an efficient branch-and-bound technique for symbolic pattern matching. Also presented are the results of applying a simulation of the algorithm to pairs of FLIR images of military vehicles in cluttered environments as well as pairs of images from different sensors (FLIR and silicon TV). The computational requirements are analyzed toward real-time implementation, and avenues of future work are recommended.
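
    A compact illustration of branch-and-bound matching over symbol correspondences: partial assignments are pruned whenever an optimistic bound on their final score cannot beat the best complete match found so far. The similarity matrix below is a stand-in for real inter-feature scores, not the report's algorithm.

```python
# Branch-and-bound assignment of symbolic features between two images.
import numpy as np

def branch_and_bound_match(sim):
    n = sim.shape[0]
    best = {"score": -np.inf, "assign": None}

    def search(i, used, score, assign):
        # Optimistic bound: current score plus the best remaining row maxima.
        bound = score + sum(sim[j].max() for j in range(i, n))
        if bound <= best["score"]:
            return                               # prune this branch
        if i == n:
            best["score"], best["assign"] = score, assign[:]
            return
        for j in range(sim.shape[1]):
            if j not in used:
                assign.append(j)
                search(i + 1, used | {j}, score + sim[i, j], assign)
                assign.pop()

    search(0, frozenset(), 0.0, [])
    return best["assign"], best["score"]

sim = np.array([[0.9, 0.2, 0.1], [0.3, 0.8, 0.2], [0.1, 0.4, 0.7]])
print(branch_and_bound_match(sim))
```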

  5. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.
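
    The two subjective-enhancement operations named here reduce to a few lines of array code. A sketch with assumed percentile and window parameters:

```python
# Contrast stretching and high-pass filtering in numpy form.
import numpy as np

def contrast_stretch(img, lo_pct=2, hi_pct=98):
    # Linear stretch between chosen percentiles, clipped to [0, 1].
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

def high_pass(img, box=15):
    # Subtract a local-mean (box-blurred) image to keep fine detail.
    pad = box // 2
    padded = np.pad(img, pad, mode="edge")
    # Box blur via a cumulative-sum (summed-area) table.
    c = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    local_sum = (c[box:, box:] - c[:-box, box:]
                 - c[box:, :-box] + c[:-box, :-box])
    return img - local_sum / (box * box)
```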

  6. [Computation techniques in the conformational analysis of carbohydrates].

    PubMed

    Gebst, A G; Grachev, A A; Shashkov, A S; Nifant'ev, N E

    2007-01-01

    A growing number of modern studies of carbohydrates are devoted to the spatial mechanisms of their participation in cell recognition processes and to the directed design of inhibitors of these processes. Any progress in this field is impossible without the development of theoretical conformational analysis of carbohydrates. In this review, we generalize literature data on the potential of different molecular-mechanics force fields, quantum mechanics methods, and molecular dynamics for studying the conformation of the glycosidic bond. The possibility of analyzing the reactivity of carbohydrates with computational techniques is also briefly discussed.

  7. Application of object oriented programming techniques in front end computers.

    SciTech Connect

    Skelly, J.F.

    1997-11-03

    The Front End Computer (FEC) environment imposes special demands on software, beyond real time performance and robustness. FEC software must manage a diverse inventory of devices with individualistic timing requirements and hardware interfaces. It must implement network services which export device access to the control system at large, interpreting a uniform network communications protocol into the specific control requirements of the individual devices. Object oriented languages provide programming techniques which neatly address these challenges, and also offer benefits in terms of maintainability and flexibility. Applications are discussed which exhibit the use of inheritance, multiple inheritance and inheritance trees, and polymorphism to address the needs of FEC software.
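
    A minimal sketch of the pattern described here (class and method names are illustrative, not from the actual FEC software): a common device interface exports uniform access, and polymorphism dispatches each protocol request to device-specific code.

```python
# Inheritance and polymorphism for FEC-style device access.
from abc import ABC, abstractmethod

class Device(ABC):
    def __init__(self, name: str):
        self.name = name

    @abstractmethod
    def read_setting(self) -> float: ...

    @abstractmethod
    def write_setting(self, value: float) -> None: ...

class PowerSupply(Device):
    def read_setting(self) -> float:
        return 12.0                      # would poll the hardware interface

    def write_setting(self, value: float) -> None:
        print(f"{self.name}: DAC <- {value}")

class BeamPositionMonitor(Device):
    def read_setting(self) -> float:
        return 0.03                      # would read a digitizer

    def write_setting(self, value: float) -> None:
        raise PermissionError("read-only device")

def handle_request(device: Device, verb: str, value: float = 0.0):
    # One uniform network protocol, many device-specific implementations.
    return device.read_setting() if verb == "get" else device.write_setting(value)

for dev in [PowerSupply("ps1"), BeamPositionMonitor("bpm3")]:
    print(dev.name, handle_request(dev, "get"))
```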

  9. Weldability and joining techniques for advanced fossil energy system alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Liu, W.; Yang, D.; Zhou, G.; Morrison, M.

    1998-05-01

    These efforts address the basic understanding of the weldability and fabricability of the advanced high-temperature alloys necessary to effect increases in the efficiency of the next generation of Fossil Energy power plants. The effort was divided into three tasks: the first dealt with the welding and fabrication behavior of 310HCbN (HR3C); the second detailed studies aimed at understanding the weldability of a newly developed 310TaN high-temperature stainless steel (a modification of 310 stainless); and the third addressed the cladding of austenitic tubing with iron aluminide using the GTAW process. Task 1 consisted of microstructural studies on 310HCbN and the development of a Tube Weldability test, which has applications to production welding techniques as well as laboratory weldability assessments. In addition, an evaluation of ex-service 310HCbN, which showed fireside erosion and cracking at the attachment weld locations, was conducted. Task 2 addressed the behavior of the newly developed 310TaN modification of standard 310 stainless steel and showed that the weldability was excellent and that the sensitization potential was minimal for normal welding and fabrication conditions. The microstructural evolution during elevated-temperature testing was characterized, and the second-phase particles evolved upon aging were identified. Task 3 details the investigation undertaken to clad 310HCbN tubing with iron aluminide and the welding conditions developed to provide a crack-free cladding. The work showed that both preheat and post-heat were necessary for crack-free deposits, and the effect of a third element on the cracking potential was defined together with the effect of the aluminum level for optimum weldability.

  10. Bispectrum-based feature extraction technique for devising a practical brain-computer interface

    NASA Astrophysics Data System (ADS)

    Shahid, Shahjahan; Prasad, Girijesh

    2011-04-01

    The extraction of distinctly separable features from the electroencephalogram (EEG) is one of the main challenges in designing a brain-computer interface (BCI). Existing feature extraction techniques for a BCI are mostly developed on the basis of traditional signal processing techniques, assuming that the signal is Gaussian and has linear characteristics. But motor imagery (MI)-related EEG signals are highly non-Gaussian and non-stationary and have nonlinear dynamic characteristics. This paper proposes an advanced, robust yet simple feature extraction technique for an MI-related BCI. The technique uses one of the higher-order statistics methods, the bispectrum, and extracts features of the nonlinear interactions over several frequency components in MI-related EEG signals. Along with a linear discriminant analysis classifier, the proposed technique has been used to design an MI-based BCI. Three performance measures, classification accuracy, mutual information, and Cohen's kappa, were evaluated and compared with a BCI using a contemporary power spectral density-based feature extraction technique. It is observed that the proposed technique extracts nearly recording-session-independent distinct features, resulting in significantly higher and more consistent MI task detection accuracy and Cohen's kappa. It is therefore concluded that bispectrum-based feature extraction is a promising technique for detecting different brain states.
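
    For context, the bispectrum referenced above is the third-order spectrum B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)]; it vanishes for linear Gaussian signals, so non-zero structure exposes exactly the nonlinear frequency interactions the technique exploits. Below is a minimal direct (segment-averaged) NumPy estimate; the two scalar features returned are an illustrative choice, not the paper's exact feature set, and such features could then feed a linear discriminant analysis classifier (e.g., scikit-learn's LinearDiscriminantAnalysis) as in the study.

        import numpy as np

        def bispectrum_features(signal, seg_len=128):
            """Direct (segment-averaged) bispectrum estimate of a 1-D EEG signal.

            Estimates B(f1, f2) = E[X(f1) X(f2) conj(X(f1 + f2))], whose magnitude
            captures quadratic phase coupling between frequency components.
            """
            segs = [signal[i:i + seg_len]
                    for i in range(0, len(signal) - seg_len + 1, seg_len)]
            nf = seg_len // 2
            B = np.zeros((nf, nf), dtype=complex)
            for s in segs:
                X = np.fft.fft(s * np.hanning(seg_len))
                for f1 in range(nf):
                    for f2 in range(nf - f1):
                        B[f1, f2] += X[f1] * X[f2] * np.conj(X[f1 + f2])
            B /= max(len(segs), 1)
            mag = np.abs(B)
            # Two illustrative scalar features summarizing the bispectral content
            return np.array([mag.sum(), np.log1p(mag).sum()])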

  11. SciDAC Advances and Applications in Computational Beam Dynamics

    SciTech Connect

    Ryne, R.; Abell, D.; Adelmann, A.; Amundson, J.; Bohn, C.; Cary, J.; Colella, P.; Dechow, D.; Decyk, V.; Dragt, A.; Gerber, R.; Habib, S.; Higdon, D.; Katsouleas, T.; Ma, K.-L.; McCorquodale, P.; Mihalcea, D.; Mitchell, C.; Mori, W.; Mottershead, C.T.; Neri, F.; Pogorelov, I.; Qiang, J.; Samulyak, R.; Serafini, D.; Shalf, J.; Siegerist, C.; Spentzouris, P.; Stoltz, P.; Terzic, B.; Venturini, M.; Walstrom, P.

    2005-06-26

    SciDAC has had a major impact on computational beam dynamics and the design of particle accelerators. Particle accelerators--which account for half of the facilities in the DOE Office of Science Facilities for the Future of Science 20 Year Outlook--are crucial for US scientific, industrial, and economic competitiveness. Thanks to SciDAC, accelerator design calculations that were once thought impossible are now carried out routinely, and new challenging and important calculations are within reach. SciDAC accelerator modeling codes are being used to get the most science out of existing facilities, to produce optimal designs for future facilities, and to explore advanced accelerator concepts that may hold the key to qualitatively new ways of accelerating charged particle beams. In this poster we present highlights from the SciDAC Accelerator Science and Technology (AST) project Beam Dynamics focus area in regard to algorithm development, software development, and applications.

  12. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  13. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials relies increasingly on sophisticated experimental methods that permit the acquisition of large amounts of data while reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of laboratory or in situ tests by sophisticated and usually expensive non-linear analyses, while in some situations reliable and accurate results are required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers, with the aim of fostering further interaction between the engineering and mathematical communities.
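
    As an illustration of the reduced models mentioned above, the sketch below follows the standard proper orthogonal decomposition route: compress a matrix of simulation snapshots with a truncated SVD and work with a handful of modal coefficients. The snapshot matrix here is random placeholder data standing in for full-field results; this is a sketch of the general technique, not the author's specific procedure.

        import numpy as np

        # Columns would be full-field results from 50 expensive nonlinear simulations;
        # random data is used here only as a placeholder.
        snapshots = np.random.rand(10000, 50)

        U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
        k = 5                              # keep only the dominant modes
        basis = U[:, :k]                   # reduced basis (proper orthogonal modes)
        coeffs = basis.T @ snapshots       # low-dimensional representation (k x 50)
        approx = basis @ coeffs            # near-real-time surrogate of the full fields
        error = np.linalg.norm(snapshots - approx) / np.linalg.norm(snapshots)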

  14. Experimental and computing strategies in advanced material characterization problems

    NASA Astrophysics Data System (ADS)

    Bolzon, G.

    2015-10-01

    The mechanical characterization of materials relies increasingly on sophisticated experimental methods that permit the acquisition of large amounts of data while reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of laboratory or in situ tests by sophisticated and usually expensive non-linear analyses, while in some situations reliable and accurate results are required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers, with the aim of fostering further interaction between the engineering and mathematical communities.

  15. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Computation of advance payments and... Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the rate... others, when applicable, shall be made before advance payments or evacuation payments are made....

  16. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Computation of advance payments and... Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the rate... others, when applicable, shall be made before advance payments or evacuation payments are made....

  17. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20] = 0.83-0.95 and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830
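
    For reference, the KR-20 index reported above measures internal consistency of binary items: KR-20 = k/(k-1) * (1 - sum(p_i * q_i) / var(total)), where p_i is the proportion answering item i positively, q_i = 1 - p_i, and var(total) is the variance of total scores. A minimal NumPy version follows; it is illustrative, not the study's analysis code.

        import numpy as np

        def kr20(responses):
            """Kuder-Richardson formula 20 for a matrix of binary (0/1) item responses.

            responses: (n_subjects, n_items) array; returns the KR-20 coefficient.
            """
            responses = np.asarray(responses, dtype=float)
            k = responses.shape[1]
            p = responses.mean(axis=0)                      # proportion endorsing each item
            q = 1.0 - p
            total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
            return (k / (k - 1.0)) * (1.0 - (p * q).sum() / total_var)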

  18. Reliability of an Interactive Computer Program for Advance Care Planning

    PubMed Central

    Levi, Benjamin H.; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-01-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life or death decisions, helps them articulate their values/goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values/preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson formula 20 [KR-20] = 0.83–0.95 and 0.86–0.89). Test-retest reliability was perfect for General Wishes (κ=1), high for QoL (Pearson's correlation coefficient=0.83), but lower for Specific Wishes (Pearson's correlation coefficient=0.57). MYWK generates an AD where General Wishes and QoL (but not Specific Wishes) statements remain consistent over time. PMID:22512830

  19. Optical design and characterization of an advanced computational imaging system

    NASA Astrophysics Data System (ADS)

    Shepard, R. Hamilton; Fernandez-Cull, Christy; Raskar, Ramesh; Shi, Boxin; Barsi, Christopher; Zhao, Hang

    2014-09-01

    We describe an advanced computational imaging system with an optical architecture that enables simultaneous and dynamic pupil-plane and image-plane coding accommodating several task-specific applications. We assess the optical requirement trades associated with custom and commercial-off-the-shelf (COTS) optics and converge on the development of two low-cost and robust COTS testbeds. The first is a coded-aperture programmable pixel imager employing a digital micromirror device (DMD) for image plane per-pixel oversampling and spatial super-resolution experiments. The second is a simultaneous pupil-encoded and time-encoded imager employing a DMD for pupil apodization or a deformable mirror for wavefront coding experiments. These two testbeds are built to leverage two MIT Lincoln Laboratory focal plane arrays - an orthogonal transfer CCD with non-uniform pixel sampling and on-chip dithering and a digital readout integrated circuit (DROIC) with advanced on-chip per-pixel processing capabilities. This paper discusses the derivation of optical component requirements, optical design metrics, and performance analyses for the two testbeds built.

  20. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus here is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  1. A computational technique for turbulent flow of wastewater sludge.

    PubMed

    Bechtel, Tom B

    2005-01-01

    A computational fluid dynamics (CFD) technique applied to the turbulent flow of wastewater sludge in horizontal, smooth-wall, circular pipes is presented. The technique uses the Crank-Nicolson finite difference method in conjunction with the variable secant method, an algorithm for determining the pressure gradient of the flow. A simple algebraic turbulence model is used. A Bingham-plastic rheological model is used to describe the shear stress/shear rate relationship for the wastewater sludge. The method computes the velocity gradient and head loss, given a fixed volumetric flow, pipe size, and solids concentration. Solids concentrations ranging from 3 to 10% (by weight) and nominal pipe sizes from 0.15 m (6 in.) to 0.36 m (14 in.) are studied. Comparison of the CFD results for water against established values serves to validate the numerical method. The head loss results are presented in terms of a head loss ratio, R(hl), which is the ratio of sludge head loss to water head loss. An empirical equation relating R(hl) to pipe velocity and solids concentration, derived from the results of the CFD calculations, is presented. The results are compared with published values of R(hl) for solids concentrations of 3 and 6%. A new expression for the Fanning friction factor for wastewater sludge flow is also presented.
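
    The Bingham-plastic law used above describes a material that resists flow until a yield stress tau_y is exceeded, after which shear stress grows linearly with shear rate through the plastic viscosity mu_p. A minimal NumPy encoding of this constitutive relation follows; the parameter values are hypothetical, since sludge properties vary strongly with solids concentration.

        import numpy as np

        def bingham_shear_rate(tau, tau_y, mu_p):
            """Shear rate of a Bingham plastic: (|tau| - tau_y)/mu_p beyond yield,
            zero inside the unyielded plug where |tau| <= tau_y."""
            tau = np.asarray(tau, dtype=float)
            rate = np.where(np.abs(tau) > tau_y, (np.abs(tau) - tau_y) / mu_p, 0.0)
            return rate * np.sign(tau)

        # Hypothetical values: 50 Pa yield stress, 0.05 Pa.s plastic viscosity
        print(bingham_shear_rate([10.0, 60.0, 120.0], tau_y=50.0, mu_p=0.05))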

  2. Investigation of joining techniques for advanced austenitic alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Kikuchi, Y.; Shi, C.; Gill, T.P.S.

    1991-05-01

    Modified Alloys 316 and 800H, designed for high temperature service, have been developed at Oak Ridge National Laboratory. Assessment of the weldability of the advanced austenitic alloys has been conducted at the University of Tennessee. Four aspects of weldability of the advanced austenitic alloys were included in the investigation.

  3. Computational techniques for flows with finite-rate condensation

    NASA Technical Reports Server (NTRS)

    Candler, Graham V.

    1993-01-01

    A computational method to simulate the inviscid two-dimensional flow of a two-phase fluid was developed. This computational technique treats the gas phase and each of a prescribed number of particle sizes as separate fluids which are allowed to interact with one another. Thus, each particle-size class is allowed to move through the fluid at its own velocity at each point in the flow field. Mass, momentum, and energy are exchanged between each particle class and the gas phase. It is assumed that the particles do not collide with one another, so that there is no inter-particle exchange of momentum and energy. However, the particles are allowed to grow, and therefore, they may change from one size class to another. Appropriate rates of mass, momentum, and energy exchange between the gas and particle phases and between the different particle classes were developed. A numerical method was developed for use with this equation set. Several test cases were computed and show qualitative agreement with previous calculations.
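
    The per-class exchange terms described above can be illustrated with a toy explicit-Euler update of momentum exchange between the gas and N particle size classes. A Stokes-drag form is assumed for the exchange rate, which is not necessarily the paper's specific model; the update conserves total momentum by construction.

        import numpy as np

        def momentum_exchange(v_gas, v_p, m_gas, m_p, tau_p, dt):
            """One explicit-Euler momentum-exchange step (requires dt << min(tau_p)).

            v_p, m_p, tau_p: (N,) velocities, masses, and drag response times of the
            particle size classes; v_gas, m_gas: gas velocity and mass in the cell.
            """
            dv = (v_gas - v_p) / tau_p                    # drag acceleration on each class
            v_p_new = v_p + dt * dv
            # Newton's third law: the gas loses the momentum the particles gain
            v_gas_new = v_gas - dt * np.sum(m_p * dv) / m_gas
            return v_gas_new, v_p_new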

  4. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to

  5. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. Custom compiled applications are
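
    To give a flavor of the high-level weak-form language that FEniCS provides, and on which TerraFERMA builds, here is a minimal legacy-dolfin Poisson solve written in near-mathematical notation. This is plain FEniCS Python under an assumed source term and boundary condition, not a TerraFERMA options file.

        # Minimal legacy FEniCS (dolfin) sketch: a Poisson problem stated as weak forms
        from dolfin import (Constant, DirichletBC, Expression, Function, FunctionSpace,
                            TestFunction, TrialFunction, UnitSquareMesh, dot, dx, grad,
                            solve)

        mesh = UnitSquareMesh(32, 32)
        V = FunctionSpace(mesh, "P", 1)
        u, v = TrialFunction(V), TestFunction(V)
        f = Expression("10*exp(-(pow(x[0]-0.5, 2) + pow(x[1]-0.5, 2))/0.02)", degree=2)
        a = dot(grad(u), grad(v)) * dx      # bilinear form, written as on paper
        L = f * v * dx                      # linear form
        bc = DirichletBC(V, Constant(0.0), "on_boundary")
        u_h = Function(V)
        solve(a == L, u_h, bc)              # PETSc linear solvers run under the hood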

  6. Advances in computer technology: impact on the practice of medicine.

    PubMed

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities, and associated agencies, offering the physician the prospect of a paperless office. At the other end of the spectrum, the development of 3D workstations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer-assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency, and cataractogenesis. The major component of the high-molecular-weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system, accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis, and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as in neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Cardiac ischemia is also associated with increased alpha B-crystallin synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins (hsp) in maintaining cell stability under the stress of disease.

  7. Recent advances in biosensor techniques for environmental monitoring.

    PubMed

    Rogers, K R

    2006-05-24

    Biosensors for environmental applications continue to show advances and improvements in areas such as sensitivity, selectivity and simplicity. In addition to detecting and measuring specific compounds or compound classes such as pesticides, hazardous industrial chemicals, toxic metals, and pathogenic bacteria, biosensors and bioanalytical assays have been designed to measure biological effects such as cytotoxicity, genotoxicity, biological oxygen demand, pathogenic bacteria, and endocrine disruption effects. This article is intended to discuss recent advances in the area of biosensors for environmental applications.

  8. Advanced techniques for array processing. Final report, 1 Mar 89-30 Apr 91

    SciTech Connect

    Friedlander, B.

    1991-05-30

    Array processing technology is expected to be a key element in communication systems designed for the crowded and hostile environment of the future battlefield. While advanced array processing techniques have been under development for some time, their practical use has been very limited. This project addressed some of the issues which need to be resolved for a successful transition of these promising techniques from theory into practice. The main problem which was studied was that of finding the directions of multiple co-channel transmitters from measurements collected by an antenna array. Two key issues related to high-resolution direction finding were addressed: effects of system calibration errors, and effects of correlation between the received signals due to multipath propagation. A number of useful theoretical performance analysis results were derived, and computationally efficient direction estimation algorithms were developed. These results include: self-calibration techniques for antenna arrays, sensitivity analysis for high-resolution direction finding, extensions of the root-MUSIC algorithm to arbitrary arrays and to arrays with polarization diversity, and new techniques for direction finding in the presence of multipath based on array interpolation. (Author)
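
    For context on the direction-finding algorithms named above: MUSIC-type methods split the eigendecomposition of the array covariance into signal and noise subspaces and scan a steering vector against the latter. The sketch below is the standard spectral-MUSIC pseudospectrum for a uniform linear array; it is not the report's root-MUSIC extension, self-calibration, or array-interpolation procedures.

        import numpy as np

        def music_spectrum(R, n_sources, angles_deg, d=0.5):
            """Spectral MUSIC for a uniform linear array (spacing d in wavelengths).

            R: (M, M) sample covariance of array snapshots; returns the pseudospectrum
            evaluated at the candidate angles (peaks indicate source directions).
            """
            M = R.shape[0]
            eigvals, eigvecs = np.linalg.eigh(R)       # eigenvalues in ascending order
            En = eigvecs[:, :M - n_sources]            # noise subspace
            spectrum = []
            for theta in np.deg2rad(angles_deg):
                a = np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))
                spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
            return np.asarray(spectrum)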

  9. Computer vision techniques for rotorcraft low-altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Cheng, Victor H. L.

    1988-01-01

    A description is given of research that applies techniques from computer vision to automation of rotorcraft navigation. The effort emphasizes the development of a methodology for detecting the ranges to obstacles in the region of interest based on the maximum utilization of passive sensors. The range map derived from the obstacle detection approach can be used as obstacle data for obstacle avoidance in an automatic guidance system and as an advisory display to the pilot. The lack of suitable flight imagery data, however, presents a problem in the verification of concepts for obstacle detection. This problem is being addressed by the development of an adequate flight database and by preprocessing of currently available flight imagery. Some comments are made on future work and how research in this area relates to the guidance of other autonomous vehicles.

  10. Computer-aided classification of lung nodules on computed tomography images via deep learning technique.

    PubMed

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scans is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning the classification performance of a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain.
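
    As a minimal illustration of the convolutional approach mentioned above, the following PyTorch sketch defines a small CNN for 2-D nodule patches. The architecture, patch size, and two-class output are illustrative assumptions; the abstract does not specify the network configuration used in the study.

        import torch
        import torch.nn as nn

        class NoduleCNN(nn.Module):
            """Tiny CNN for classifying 64x64 single-channel CT nodule patches."""

            def __init__(self, n_classes=2):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(32 * 13 * 13, n_classes)  # 64 -> 13 after convs/pools

            def forward(self, x):                  # x: (batch, 1, 64, 64)
                return self.classifier(self.features(x).flatten(1))

        logits = NoduleCNN()(torch.randn(4, 1, 64, 64))  # -> (4, 2) class scores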

  11. Computer-aided classification of lung nodules on computed tomography images via deep learning technique

    PubMed Central

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scans is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning the classification performance of a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. PMID:26346558

  12. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    Project objective: The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density, and lifetime. These targets were laid out in the DOE's R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically, this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly, and integrated-system durability and validation testing. This work has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. Project tasks: The work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: to engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, and refined them to miniaturize and integrate their functionality and thereby increase the system power density and energy density. The benefits of UNF's novel passive water-recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  13. Visual computation of egomotion using an image interpolation technique.

    PubMed

    Chahl, J S; Srinivasan, M V

    1996-05-01

    A novel technique is presented for the computation of the parameters of egomotion of a mobile device, such as a robot or a mechanical arm, equipped with two visual sensors. Each sensor captures a panoramic view of the environment. We show that the parameters of egomotion can be computed by interpolating the position of the image captured by one of the sensors at the robot's present location with respect to the images captured by the two sensors at the robot's previous location. The algorithm delivers the distance travelled and the angle rotated, without explicit measurement or integration of velocity fields. The result is obtained in a single step, without any iteration or successive approximation. Tests of the algorithm on real and synthetic images reveal an accuracy to within 5% of the actual motion. Implementation of the algorithm on a mobile robot reveals that stepwise rotation and translation can be measured to within 10% accuracy in a three-dimensional world of unknown structure. The position and orientation of the robot at the end of a 30-step trajectory can be estimated with accuracies of 5% and 5 degrees, respectively.
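
    The interpolation idea can be reduced, in one dimension, to a closed-form least-squares fit: model the current image as a reference image plus a fraction of the difference between two references captured at known offsets of +/-d, then solve directly for the displacement x. The NumPy sketch below is a toy translational version under that assumption, not the full panoramic rotation-and-translation algorithm of the paper.

        import numpy as np

        def interp_displacement(I_now, I_0, I_left, I_right, d):
            """Closed-form displacement estimate via image interpolation.

            Model: I_now ~ I_0 + (x / (2*d)) * (I_left - I_right), where I_left and
            I_right are reference images taken at offsets +d and -d from I_0 (sign
            convention depends on the imaging geometry). Least squares gives x
            directly, with no iteration or successive approximation.
            """
            ref = (I_left - I_right).ravel()
            return 2.0 * d * np.dot((I_now - I_0).ravel(), ref) / np.dot(ref, ref)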

  14. Visual computation of egomotion using an image interpolation technique.

    PubMed

    Chahl, J S; Srinivasan, M V

    1996-05-01

    A novel technique is presented for the computation of the parameters of egomotion of a mobile device, such as a robot or a mechanical arm, equipped with two visual sensors. Each sensor captures a panoramic view of the environment. We show that the parameters of egomotion can be computed by interpolating the position of the image captured by one of the sensors at the robot's present location with respect to the images captured by the two sensors at the robot's previous location. The algorithm delivers the distance travelled and the angle rotated, without explicit measurement or integration of velocity fields. The result is obtained in a single step, without any iteration or successive approximation. Tests of the algorithm on real and synthetic images reveal an accuracy to within 5% of the actual motion. Implementation of the algorithm on a mobile robot reveals that stepwise rotation and translation can be measured to within 10% accuracy in a three-dimensional world of unknown structure. The position and orientation of the robot at the end of a 30-step trajectory can be estimated with accuracies of 5% and 5 degrees, respectively. PMID:8991456

  15. Advanced Millimeter-Wave Security Portal Imaging Techniques

    SciTech Connect

    Sheen, David M.; Bernacki, Bruce E.; McMakin, Douglas L.

    2012-04-01

    Millimeter-wave imaging is rapidly gaining acceptance for passenger screening at airports and other secured facilities. This paper details a number of techniques developed over the last several years, including novel image reconstruction and display techniques, polarimetric imaging techniques, array switching schemes, and high-frequency, high-bandwidth techniques. Implementation of some of these methods will increase the cost and complexity of mm-wave security portal imaging systems. RF photonic methods may provide new solutions to the design and development of the sequentially switched linear mm-wave arrays that are the key element in mm-wave portal imaging systems.

  16. Recent advances in computer camera methods for machine vision

    NASA Astrophysics Data System (ADS)

    Olson, Gaylord G.; Walker, Jo N.

    1998-10-01

    During the past year, several new computer camera methods (hardware and software) have been developed which have applications in machine vision. These are described below, along with some test results. The improvements are generally in the direction of higher speed and greater parallelism. A PCI interface card has been designed which is adaptable to multiple CCD types, both color and monochrome. A newly designed A/D converter allows for a choice of 8 or 10-bit conversion resolution and a choice of two different analog inputs. Thus, by using four of these converters feeding the 32-bit PCI data bus, up to 8 camera heads can be used with a single PCI card, and four camera heads can be operated in parallel. The card has been designed so that any of 8 different CCD types can be used with it (6 monochrome and 2 color CCDs) ranging in resolution from 192 by 165 pixels up to 1134 by 972 pixels. In the area of software, a method has been developed to better utilize the decision-making capability of the computer along with the sub-array scan capabilities of many CCDs. Specifically, it is shown below how to achieve a dual scan mode camera system wherein one scan mode is a low density, high speed scan of a complete image area, and a higher density sub-array scan is used in those areas where changes have been observed. The name given to this technique is adaptive sub-array scanning.
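
    The dual-scan-mode idea above can be paraphrased as: difference successive fast low-density frames, and schedule high-density sub-array reads only where blocks have changed. A small NumPy sketch of the block-flagging step follows; the block size and threshold are illustrative choices, not values from the camera system described.

        import numpy as np

        def changed_blocks(prev_frame, curr_frame, block=8, thresh=12.0):
            """Flag blocks of the low-density scan whose mean absolute change
            exceeds thresh; each hit would trigger a high-density sub-array read."""
            diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
            hits = []
            for r in range(0, diff.shape[0] - block + 1, block):
                for c in range(0, diff.shape[1] - block + 1, block):
                    if diff[r:r + block, c:c + block].mean() > thresh:
                        hits.append((r, c))
            return hits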

  17. Integrating advanced materials simulation techniques into an automated data analysis workflow at the Spallation Neutron Source

    SciTech Connect

    Borreguero Calvo, Jose M; Campbell, Stuart I; Delaire, Olivier A; Doucet, Mathieu; Goswami, Monojoy; Hagen, Mark E; Lynch, Vickie E; Proffen, Thomas E; Ren, Shelly; Savici, Andrei T; Sumpter, Bobby G

    2014-01-01

    This presentation will review developments on the integration of advanced modeling and simulation techniques into the analysis step of experimental data obtained at the Spallation Neutron Source. A workflow framework for the purpose of refining molecular mechanics force-fields against quasi-elastic neutron scattering data is presented. The workflow combines software components to submit model simulations to remote high performance computers, a message broker interface for communications between the optimizer engine and the simulation production step, and tools to convolve the simulated data with the experimental resolution. A test application shows the correction to a popular fixed-charge water model in order to account for polarization effects due to the presence of solvated ions. Future enhancements to the refinement workflow are discussed. This work is funded through the DOE Center for Accelerating Materials Modeling.
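
    One workflow component named above, convolving simulated data with the experimental resolution, is easy to sketch. The snippet below assumes a Gaussian resolution function of given FWHM for illustration; the actual workflow would use the instrument's measured resolution function.

        import numpy as np

        def convolve_with_resolution(omega, s_model, fwhm):
            """Smear a simulated spectrum S(omega) (on a uniform grid) with a
            normalized Gaussian resolution kernel of the given FWHM."""
            sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
            d_omega = omega[1] - omega[0]
            x = np.arange(-4.0 * sigma, 4.0 * sigma + d_omega, d_omega)
            kernel = np.exp(-0.5 * (x / sigma) ** 2)
            return np.convolve(s_model, kernel / kernel.sum(), mode="same")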

  18. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    SciTech Connect

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  19. Recent advances in microscopic techniques for visualizing leukocytes in vivo

    PubMed Central

    Jain, Rohit; Tikoo, Shweta; Weninger, Wolfgang

    2016-01-01

    Leukocytes are inherently motile and interactive cells. Recent advances in intravital microscopy approaches have enabled a new vista of their behavior within intact tissues in real time. This brief review summarizes the developments enabling the tracking of immune responses in vivo. PMID:27239292

  20. Bricklaying Curriculum: Advanced Bricklaying Techniques. Instructional Materials. Revised.

    ERIC Educational Resources Information Center

    Turcotte, Raymond J.; Hendrix, Laborn J.

    This curriculum guide is designed to assist bricklaying instructors in providing performance-based instruction in advanced bricklaying. Included in the first section of the guide are units on customized or architectural masonry units; glass block; sills, lintels, and copings; and control (expansion) joints. The next two units deal with cut,…

  1. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  2. Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes

    NASA Astrophysics Data System (ADS)

    Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd

    2016-04-01

    In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). Part of this project comprised 277 full-scale drop tests at three quarries in Austria, with key parameters of the rock fall trajectories recorded. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock of igneous, metamorphic, and volcanic types. In this paper the test results are used for the calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e., the difference between observed and simulated results) has a lognormal distribution. With two parameters selected for calibration, advanced techniques including the Markov chain Monte Carlo method, maximum likelihood, and root mean square error (RMSE) are utilized to minimize the error. Validation of the model based on cross-validation reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights, and velocities. The approximations are compared to the measured data in terms of median, 95%, and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.
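
    As an illustration of the Markov chain Monte Carlo calibration mentioned above, the sketch below runs a basic Metropolis sampler over two model parameters under a lognormal error model (log-residuals treated as Gaussian), consistent with the error distribution reported for the drop tests. Here `simulate` is a hypothetical stand-in for the stochastic rock fall model, and all tuning constants are illustrative.

        import numpy as np

        def calibrate_mcmc(simulate, observed, p0, n_iter=5000, step=0.05, sigma_log=0.3):
            """Metropolis sampling of two parameters; simulate(p) must return positive
            predictions (e.g. runout, bounce height) aligned with `observed`."""
            rng = np.random.default_rng(0)

            def log_like(p):
                r = np.log(observed) - np.log(simulate(p))   # lognormal error model
                return -0.5 * np.sum((r / sigma_log) ** 2)

            p = np.asarray(p0, dtype=float)
            ll = log_like(p)
            chain = [p.copy()]
            for _ in range(n_iter):
                q = p + rng.normal(scale=step, size=p.size)  # symmetric random walk
                ll_q = log_like(q)
                if np.log(rng.uniform()) < ll_q - ll:        # Metropolis acceptance
                    p, ll = q, ll_q
                chain.append(p.copy())
            return np.asarray(chain)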

  3. Backscattered Electron Microscopy as an Advanced Technique in Petrography.

    ERIC Educational Resources Information Center

    Krinsley, David Henry; Manley, Curtis Robert

    1989-01-01

    Three uses of this method with sandstone, desert varnish, and granite weathering are described. Background information on this technique is provided. Advantages of this type of microscopy are stressed. (CW)

  4. Rugoscopy: Human identification by computer-assisted photographic superimposition technique

    PubMed Central

    Mohammed, Rezwana Begum; Patil, Rajendra G.; Pammi, V. R.; Sandya, M. Pavana; Kalyan, Siva V.; Anitha, A.

    2013-01-01

    Background: Human identification has been studied since the fourteenth century and has gradually advanced for forensic purposes. Traditional methods such as dental, fingerprint, and DNA comparisons are probably the most common techniques used in this context, allowing fast and secure identification processes. However, in circumstances where identification of an individual by fingerprint or dental record comparison is difficult, palatal rugae may be considered an alternative source of material. Aim: The present study was done to evaluate the individualistic nature and use of palatal rugae patterns for personal identification, and to test the efficiency of computerized software for forensic identification by photographic superimposition of palatal photographs obtained from casts. Materials and Methods: Two sets of alginate impressions were made from the upper arches of 100 individuals (50 males and 50 females) with a one-month interval in between, and the casts were poured. All the teeth except the incisors were removed to ensure that only the palate could be used in the identification process. In one set of casts, the palatal rugae were highlighted with a graphite pencil. All 200 casts were randomly numbered and photographed with a 10.1-megapixel Kodak digital camera using a standardized method. Using computerized software, the digital photographs of the casts without highlighted palatal rugae were overlapped with the (transparent) images of the casts with highlighted palatal rugae in order to identify the pairs by superimposition. The incisors were retained and used as landmarks to determine the magnification required to bring the two sets of photographs to the same size, enabling perfect superimposition of the images. Results: Overlapping the digital photographs of highlighted palatal rugae onto the set of casts without highlighted rugae resulted in 100% positive identification. Conclusion
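
    A hedged sketch of the superimposition step is given below using Pillow. The file names, scale factor, and 50% blend are illustrative assumptions; the record does not name the software used in the study.

        from PIL import Image

        base = Image.open("cast_plain.jpg").convert("RGBA")           # no highlighted rugae
        overlay = Image.open("cast_highlighted.jpg").convert("RGBA")  # rugae traced in pencil

        # Scale so the inter-incisor landmark distances match (ratio measured beforehand)
        scale = 1.04
        overlay = overlay.resize((round(overlay.width * scale),
                                  round(overlay.height * scale)))
        overlay = overlay.crop((0, 0, base.width, base.height))  # align canvas sizes

        Image.blend(base, overlay, alpha=0.5).save("superimposed.png")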

  5. Electroextraction and electromembrane extraction: Advances in hyphenation to analytical techniques

    PubMed Central

    Oedit, Amar; Ramautar, Rawi; Hankemeier, Thomas

    2016-01-01

    Electroextraction (EE) and electromembrane extraction (EME) are sample preparation techniques that both require an electric field that is applied over a liquid-liquid system, which enables the migration of charged analytes. Furthermore, both techniques are often used to pre-concentrate analytes prior to analysis. In this review an overview is provided of the body of literature spanning April 2012–November 2015 concerning EE and EME, focused on hyphenation to analytical techniques. First, the theoretical aspects of concentration enhancement in EE and EME are discussed to explain extraction recovery and enrichment factor. Next, overviews are provided of the techniques based on their hyphenation to LC, GC, CE, and direct detection. These overviews cover the compounds and matrices, experimental aspects (i.e. donor volume, acceptor volume, extraction time, extraction voltage, and separation time) and the analytical aspects (i.e. limit of detection, enrichment factor, and extraction recovery). Techniques that were either hyphenated online to analytical techniques or show high potential with respect to online hyphenation are highlighted. Finally, the potential future directions of EE and EME are discussed. PMID:26864699

  6. Advanced millimeter-wave security portal imaging techniques

    NASA Astrophysics Data System (ADS)

    Sheen, David M.; Bernacki, Bruce E.; McMakin, Douglas L.

    2012-03-01

    Millimeter-wave (mm-wave) imaging is rapidly gaining acceptance as a security tool to augment conventional metal detectors and baggage x-ray systems for passenger screening at airports and other secured facilities. This acceptance indicates that the technology has matured; however, many potential improvements can yet be realized. The authors have developed a number of techniques over the last several years including novel image reconstruction and display techniques, polarimetric imaging techniques, array switching schemes, and high-frequency, high-bandwidth techniques. All of these may improve the performance of new systems; however, some of these techniques will increase the cost and complexity of the mm-wave security portal imaging systems. Reducing this cost may require the development of novel array designs. In particular, RF photonic methods may provide new solutions to the design and development of the sequentially switched linear mm-wave arrays that are the key element in the mm-wave portal imaging systems. High-frequency, high-bandwidth designs are difficult to achieve with conventional mm-wave electronic devices, and RF photonic devices may be a practical alternative. In this paper, the mm-wave imaging techniques developed at PNNL are reviewed and the potential for implementing RF photonic mm-wave array designs is explored.

  7. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  8. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  9. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  10. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  11. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  12. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  13. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  14. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  15. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  16. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column and made use of Lagrangian trajectory analysis for the bubble and particle motions. Bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for studying bubble motion in a constant shear flow field was also developed, and the flow conditions in this device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured, using the technique of phase-Doppler anemometry. An Eulerian volume-of-fluid (VOF) computational model for the flow conditions in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed; the model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied, and the simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion was developed; the balance laws were obtained and the constitutive laws established.

  17. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2001-10-01

    In the second year of the project, the Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column is further developed. The approach uses an Eulerian analysis of liquid flows in the bubble column and makes use of Lagrangian trajectory analysis for the bubble and particle motions. An experimental setup for studying a two-dimensional bubble column is also developed. The operation of the bubble column is being tested and a diagnostic methodology for quantitative measurements is being developed. An Eulerian computational model for the flow conditions in the two-dimensional bubble column is also being developed. The liquid and bubble motions are being analyzed and the results are being compared with those from the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were analyzed; the model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures are also being studied. Further progress was made in developing a thermodynamically consistent model for multiphase slurry flows, with and without chemical reaction, in a state of turbulent motion; the balance laws have been obtained and the constitutive laws are being developed. Progress was also made in measuring the concentration and velocity of particles of different sizes near a wall in a duct flow, using the technique of phase-Doppler anemometry. The general objective of this project is to provide the needed fundamental understanding of three-phase slurry reactors in Fischer-Tropsch (F-T) liquid fuel synthesis. The other main goal is to develop a computational capability for predicting the transport and processing of three-phase coal slurries. The specific objectives are: (1) to develop a thermodynamically consistent rate-dependent anisotropic model for multiphase slurry flows with and without chemical reaction for application to coal liquefaction. Also establish the

  18. Nondestructive Evaluation of Thick Concrete Using Advanced Signal Processing Techniques

    SciTech Connect

    Clayton, Dwight A; Barker, Alan M; Santos-Villalobos, Hector J; Albright, Austin P; Hoegh, Kyle; Khazanovich, Lev

    2015-09-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years [1]. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations.

  19. Continuous analog of multiplicative algebraic reconstruction technique for computed tomography

    NASA Astrophysics Data System (ADS)

    Tateishi, Kiyoko; Yamaguchi, Yusaku; Abou Al-Ola, Omar M.; Kojima, Takeshi; Yoshinaga, Tetsuya

    2016-03-01

    We propose a hybrid dynamical system as a continuous analog to the block-iterative multiplicative algebraic reconstruction technique (BI-MART), which is a well-known iterative image reconstruction algorithm for computed tomography. The hybrid system is described by a switched nonlinear system with a piecewise smooth vector field or differential equation, and, for consistent inverse problems, the convergence of non-negatively constrained solutions to a globally stable equilibrium is guaranteed by the Lyapunov theorem; namely, we can prove theoretically that a weighted Kullback-Leibler divergence measure is a common Lyapunov function for the switched system. We show that discretizing the differential equation using the first-order approximation (Euler's method) based on the geometric multiplicative calculus leads to the same iterative formula as the BI-MART, with the scaling parameter acting as the time step of the numerical discretization. The present paper is the first to reveal that this kind of iterative image reconstruction algorithm can be constructed by discretizing a continuous-time dynamical system for solving tomographic inverse problems. Iterative algorithms based not only on the Euler method but also on lower-order Runge-Kutta methods applied to discretize the continuous-time system can be used for image reconstruction. A numerical example showing the characteristics of the discretized iterative methods is presented.
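
    To make the connection above concrete, here is a minimal NumPy sketch of a simultaneous (one-block) multiplicative ART update, x <- x * exp(lam * A^T log(b / (A x))), which is the kind of geometric-calculus Euler step the abstract describes; the system matrix, data, and the choice of lam are illustrative and not taken from the paper.

        import numpy as np

        def smart_iteration(A, b, n_iter=500, lam=0.2):
            # Multiplicative update x <- x * exp(lam * A^T log(b / (A x))):
            # a geometric (multiplicative) Euler step of the continuous system
            # dx_j/dt = x_j * sum_i a_ij * log(b_i / (A x)_i), with the step
            # size lam playing the role of the scaling parameter.
            x = np.ones(A.shape[1])              # strictly positive start
            for _ in range(n_iter):
                ratio = b / (A @ x)              # assumes (A x)_i > 0 throughout
                x *= np.exp(lam * (A.T @ np.log(ratio)))
            return x

        # Toy consistent problem: four "rays" through three "pixels".
        A = np.array([[1.0, 1.0, 0.0],
                      [0.0, 1.0, 1.0],
                      [1.0, 0.0, 1.0],
                      [1.0, 1.0, 1.0]])
        x_true = np.array([1.0, 2.0, 3.0])
        b = A @ x_true
        print(smart_iteration(A, b))             # converges toward x_true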

  20. Advanced Sensing and Control Techniques to Facilitate Semi-Autonomous Decommissioning

    SciTech Connect

    Schalkoff, Robert J.

    1999-06-01

    This research is intended to advance the technology of semi-autonomous teleoperated robotics as applied to Decontamination and Decommissioning (D&D) tasks. Specifically, research leading to a prototype dual-manipulator mobile work cell is underway. This cell is supported and enhanced by computer vision, virtual reality and advanced robotics technology.

  1. Brain development in preterm infants assessed using advanced MRI techniques.

    PubMed

    Tusor, Nora; Arichi, Tomoki; Counsell, Serena J; Edwards, A David

    2014-03-01

    Infants who are born preterm have a high incidence of neurocognitive and neurobehavioral abnormalities, which may be associated with impaired brain development. Advanced magnetic resonance imaging (MRI) approaches, such as diffusion MRI (d-MRI) and functional MRI (fMRI), provide objective and reproducible measures of brain development. Indices derived from d-MRI can be used to provide quantitative measures of preterm brain injury. Although fMRI of the neonatal brain is currently a research tool, future studies combining d-MRI and fMRI have the potential to assess the structural and functional properties of the developing brain and its response to injury.

  2. Application of advanced coating techniques to rocket engine components

    NASA Technical Reports Server (NTRS)

    Verma, S. K.

    1988-01-01

    The materials problem in the space shuttle main engine (SSME) is reviewed. Potential coatings and the methods of their application for improved life of SSME components are discussed. A number of advanced coatings for turbine blade components and disks are being developed and tested in a multispecimen thermal fatigue fluidized bed facility at IIT Research Institute. This facility is capable of producing severe strains of the degree present in blade and disk components of the SSME. The potential coating systems and the efforts currently underway at IITRI for life extension of the SSME components are summarized.

  3. Transcranial Doppler: Techniques and advanced applications: Part 2

    PubMed Central

    Sharma, Arvind K.; Bathala, Lokesh; Batra, Amit; Mehndiratta, Man Mohan; Sharma, Vijay K.

    2016-01-01

    Transcranial Doppler (TCD) is the only diagnostic tool that can provide continuous information about cerebral hemodynamics in real time and over extended periods. In the previous paper (Part 1), we have already presented the basic ultrasound physics pertaining to TCD, insonation methods, and various flow patterns. This article describes various advanced applications of TCD such as detection of right-to-left shunt, emboli monitoring, vasomotor reactivity (VMR), monitoring of vasospasm in subarachnoid hemorrhage (SAH), monitoring of intracranial pressure, its role in stroke prevention in sickle cell disease, and as a supplementary test for confirmation of brain death. PMID:27011639

  4. In Situ Techniques for Monitoring Electrochromism: An Advanced Laboratory Experiment

    ERIC Educational Resources Information Center

    Saricayir, Hakan; Uce, Musa; Koca, Atif

    2010-01-01

    This experiment employs current technology to enhance and extend existing lab content. The basic principles of spectroscopic and electroanalytical techniques and their use in determining material properties are covered in some detail in many undergraduate chemistry programs. However, there are limited examples of laboratory experiments with in…

  5. Benefits of advanced software techniques for mission planning systems

    NASA Technical Reports Server (NTRS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-01-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for space mission planning and the benefits which can be expected from Artificial Intelligence techniques, through examples of applications developed by Matra Marconi Space.

  6. A fission matrix based validation protocol for computed power distributions in the advanced test reactor

    SciTech Connect

    Nielsen, J. W.; Nigg, D. W.; LaPorta, A. W.

    2013-07-01

    The Idaho National Laboratory (INL) has been engaged in a significant multi-year effort to modernize the computational reactor physics tools and validation procedures used to support operations of the Advanced Test Reactor (ATR) and its companion critical facility (ATRC). Several new protocols for validation of computed neutron flux distributions and spectra, as well as for validation of computed fission power distributions, based on new experiments and well-recognized least-squares statistical analysis techniques, have been under development. In the case of power distributions, estimates of the a priori ATR-specific fuel element-to-element fission power correlation and covariance matrices are required for validation analysis. A practical method for generating these matrices using the element-to-element fission matrix is presented, along with a high-order scheme for estimating the underlying fission matrix itself. The proposed methodology is illustrated using the MCNP5 neutron transport code for the required neutronics calculations. The general approach is readily adaptable for implementation using any multidimensional stochastic or deterministic transport code that offers the required level of spatial, angular, and energy resolution in the computed solution for the neutron flux and fission source. (authors)
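
    As a minimal illustration of the fission-matrix construct at the heart of this record (a sketch only, not the INL validation protocol), the following NumPy power iteration extracts the dominant eigenpair of a small, made-up fission matrix; the eigenvector is the element-wise fission source from which power distributions are formed, and the eigenvalue is the multiplication factor.

        import numpy as np

        def fission_source(F, tol=1e-12, max_iter=1000):
            # F[i, j]: expected fission neutrons produced in element i per
            # fission neutron born in element j. Power iteration converges to
            # the dominant eigenpair: k (multiplication factor) and the
            # element-wise fission source s (proportional to fission power).
            s = np.full(F.shape[0], 1.0 / F.shape[0])
            k = 0.0
            for _ in range(max_iter):
                s_new = F @ s
                k_new = s_new.sum()          # valid since s sums to one
                s_new /= k_new               # renormalize the source
                if abs(k_new - k) < tol:
                    return k_new, s_new
                k, s = k_new, s_new
            return k, s

        # Illustrative 3-element fission matrix (values invented for the sketch).
        F = np.array([[0.60, 0.20, 0.05],
                      [0.20, 0.50, 0.20],
                      [0.05, 0.20, 0.60]])
        k_eff, source = fission_source(F)
        print(k_eff, source)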

  7. Single Molecule Techniques for Advanced in situ Hybridization

    SciTech Connect

    Hollars, C W; Stubbs, L; Carlson, K; Lu, X; Wehri, E

    2003-02-03

    One of the most significant achievements of modern science is the human genome sequence, completed in the year 2000. Despite this monumental accomplishment, researchers have only begun to understand the relationships between this three-billion-nucleotide genetic code and the regulation and control of gene and protein expression within each of the millions of different types of highly specialized cells. Several methodologies have been developed for the analysis of gene and protein expression in situ, yet despite these advancements, the pace of such analyses is extremely limited. Because information regarding the precise timing and location of gene expression is a crucial component in the discovery of new pharmacological agents for the treatment of disease, there is an enormous incentive to develop technologies that accelerate the analytical process. Here we report on the use of plasmon resonant particles as advanced probes for in situ hybridization. These probes are used for the detection of low levels of gene-probe response and demonstrate a detection method that enables precise, simultaneous localization within a cell of the points of expression of multiple genes or proteins in a single sample.

  8. Current Advances in the Computational Simulation of the Formation of Low-Mass Stars

    SciTech Connect

    Klein, R I; Inutsuka, S; Padoan, P; Tomisaka, K

    2005-10-24

    Developing a theory of low-mass star formation (≈0.1 to 3 M⊙) remains one of the most elusive and important goals of theoretical astrophysics. The star-formation process is the outcome of the complex dynamics of interstellar gas involving non-linear interactions of turbulence, gravity, magnetic field and radiation. The evolution of protostellar condensations, from the moment they are assembled by turbulent flows to the time they reach stellar densities, spans an enormous range of scales, resulting in a major computational challenge for simulations. Since the previous Protostars and Planets conference, dramatic advances in the development of new numerical algorithmic techniques have been successfully implemented on large scale parallel supercomputers. Among such techniques, Adaptive Mesh Refinement and Smooth Particle Hydrodynamics have provided frameworks to simulate the process of low-mass star formation with a very large dynamic range. It is now feasible to explore the turbulent fragmentation of molecular clouds and the gravitational collapse of cores into stars self-consistently within the same calculation. The increased sophistication of these powerful methods comes with substantial caveats associated with the use of the techniques and the interpretation of the numerical results. In this review, we examine what has been accomplished in the field and present a critique of both numerical methods and scientific results. We stress that computational simulations should obey the available observational constraints and demonstrate numerical convergence. Failing this, results of large scale simulations do not advance our understanding of low-mass star formation.

  9. Developments and advances concerning the hyperpolarisation technique SABRE.

    PubMed

    Mewis, Ryan E

    2015-10-01

    To overcome the inherent sensitivity issue in NMR and MRI, hyperpolarisation techniques are used. Signal Amplification By Reversible Exchange (SABRE) is a hyperpolarisation technique that utilises parahydrogen, a molecule that possesses a nuclear singlet state, as the source of polarisation. A metal complex is required to break the singlet order of parahydrogen and, by doing so, facilitates polarisation transfer to analyte molecules ligated to the same complex through the J-coupled network that exists. The increased signal intensities that the analyte molecules possess as a result of this process have led to investigations whereby their potential as MRI contrast agents has been probed and to understand the fundamental processes underpinning the polarisation transfer mechanism. As well as discussing literature relevant to both of these areas, the chemical structure of the complex, the physical constraints of the polarisation transfer process and the successes of implementing SABRE at low and high magnetic fields are discussed. PMID:26264565

  10. A numerical technique for calculation of the noise of high-speed propellers with advanced blade geometry

    NASA Technical Reports Server (NTRS)

    Nystrom, P. A.; Farassat, F.

    1980-01-01

    A numerical technique and computer program were developed for the prediction of the noise of propellers with advanced geometry. The blade upper and lower surfaces are described by a curvilinear coordinate system, which was also used to divide the blade surfaces into panels. Two different acoustic formulations in the time domain were used to improve the speed and efficiency of the noise calculations: an acoustic formulation with the Doppler factor singularity for panels moving at subsonic speeds, and the collapsing sphere formulation for panels moving at transonic or supersonic speeds. The second formulation involves a sphere which is centered at the observer position and whose radius decreases at the speed of sound; the acoustic equation consists of integrals over the curves of intersection of this sphere with the panels on the blade. Algorithms used in some parts of the computer program are discussed. Comparisons with measured acoustic data for two model high-speed propellers with advanced geometry are also presented.

  11. Advanced techniques for characterization of ion beam modified materials

    SciTech Connect

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; Kluth, Patrick; Tuomisto, Filip

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  12. Advanced techniques for characterization of ion beam modified materials

    DOE PAGES

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; Kluth, Patrick; Tuomisto, Filip

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  13. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, one that is more powerful and potentially more dangerous than any other. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large: it may comprise tens of thousands, if not millions, of compromised computers, or it may be as small as a few thousand targeted systems. In all botnets, the members can surreptitiously communicate with each other and with their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
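
    The MSEIR compartment model named in this abstract (M = initially immune, S = susceptible, E = exposed, I = infectious, R = recovered) is a standard epidemiological system; the sketch below is a toy forward-Euler integration of its rate equations with the compartments read in botnet terms. All rates and population sizes are illustrative, and the jump-diffusion coupling the authors propose is omitted.

        import numpy as np

        def mseir(days=120.0, dt=0.1, N=100_000.0,
                  B=50.0,        # hosts provisioned per day ("births")
                  mu=5e-4,       # host retirement rate ("deaths")
                  delta=0.05,    # loss of initial protection, M -> S
                  beta=0.4,      # contact/infection rate, S -> E
                  eps=0.2,       # activation of dormant bots, E -> I
                  gamma=0.05):   # cleanup/patch rate, I -> R
            # M: newly provisioned (temporarily protected) hosts, S: susceptible,
            # E: compromised but dormant, I: active bots, R: cleaned hosts.
            M, S, E, I, R = 0.0, N - 10.0, 0.0, 10.0, 0.0
            history = []
            for step in range(int(days / dt)):
                n = M + S + E + I + R
                dM = B - (delta + mu) * M
                dS = delta * M - beta * S * I / n - mu * S
                dE = beta * S * I / n - (eps + mu) * E
                dI = eps * E - (gamma + mu) * I
                dR = gamma * I - mu * R
                M += dt * dM; S += dt * dS; E += dt * dE
                I += dt * dI; R += dt * dR
                history.append((step * dt, M, S, E, I, R))
            return np.array(history)

        traj = mseir()
        print("peak active bots:", traj[:, 4].max())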

  14. Advanced materials and techniques for fibre-optic sensing

    NASA Astrophysics Data System (ADS)

    Henderson, Philip J.

    2014-06-01

    Fibre-optic monitoring systems came of age in about 1999 upon the emergence of the world's first significant commercialising company - a spin-out from the UK's collaborative MAST project. By using embedded fibre-optic technology, the MAST project successfully measured transient strain within high-performance composite yacht masts. Since then, applications have extended from smart composites into civil engineering, energy, military, aerospace, medicine and other sectors. Fibre-optic sensors come in various forms, and may be subject to embedment, retrofitting, and remote interrogation. The unique challenges presented by each implementation require careful scrutiny before widespread adoption can take place. Accordingly, various aspects of design and reliability are discussed spanning a range of representative technologies that include resonant microsilicon structures, MEMS, Bragg gratings, advanced forms of spectroscopy, and modern trends in nanotechnology. Keywords: Fibre-optic sensors, fibre Bragg gratings, MEMS, MOEMS, nanotechnology, plasmon.

  15. Recent advances in bioprinting techniques: approaches, applications and future prospects.

    PubMed

    Li, Jipeng; Chen, Mingjiao; Fan, Xianqun; Zhou, Huifang

    2016-01-01

    Bioprinting technology shows potential in tissue engineering for the fabrication of scaffolds, cells, tissues and organs reproducibly and with high accuracy. Bioprinting technologies are mainly divided into three categories, inkjet-based bioprinting, pressure-assisted bioprinting and laser-assisted bioprinting, based on their underlying printing principles. These various printing technologies have their advantages and limitations. Bioprinting utilizes biomaterials, cells or cell factors as a "bioink" to fabricate prospective tissue structures. Biomaterial parameters such as biocompatibility, cell viability and the cellular microenvironment strongly influence the printed product. Various printing technologies have been investigated, and great progress has been made in printing various types of tissue, including vasculature, heart, bone, cartilage, skin and liver. This review introduces basic principles and key aspects of some frequently used printing technologies. We focus on recent advances in three-dimensional printing applications, current challenges and future directions. PMID:27645770

  16. Some Hail 'Computational Science' as Biggest Advance Since Newton, Galileo.

    ERIC Educational Resources Information Center

    Turner, Judith Axler

    1987-01-01

    Computational science is defined as science done on a computer. A computer can serve as a laboratory for researchers who cannot experiment with their subjects, and as a calculator for those who otherwise might need centuries to solve some problems mathematically. The National Science Foundation's support of supercomputers is discussed. (MLW)

  17. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  18. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication.

    PubMed

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. The initial keyword search returned 78 publications. The abstracts of these 78 articles were screened, and 52 publications were selected for detailed reading; their full texts were then obtained and examined. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of these papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow removable dentures to be digitally planned and manufactured from start to finish. According to the literature, CAD/CAM techniques and supporting maxillomandibular relationship transfer devices are developing rapidly. In the near future, fabricating removable dentures may become largely a matter of medical informatics rather than manual technical procedures. However, the methods still have several limitations.

  19. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication.

    PubMed

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. The initial keyword search returned 78 publications. The abstracts of these 78 articles were screened, and 52 publications were selected for detailed reading; their full texts were then obtained and examined. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of these papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow removable dentures to be digitally planned and manufactured from start to finish. According to the literature, CAD/CAM techniques and supporting maxillomandibular relationship transfer devices are developing rapidly. In the near future, fabricating removable dentures may become largely a matter of medical informatics rather than manual technical procedures. However, the methods still have several limitations. PMID:27095912

  20. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication

    PubMed Central

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. The initial keyword search returned 78 publications. The abstracts of these 78 articles were screened, and 52 publications were selected for detailed reading; their full texts were then obtained and examined. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of these papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow removable dentures to be digitally planned and manufactured from start to finish. According to the literature, CAD/CAM techniques and supporting maxillomandibular relationship transfer devices are developing rapidly. In the near future, fabricating removable dentures may become largely a matter of medical informatics rather than manual technical procedures. However, the methods still have several limitations. PMID:27095912

  1. Application of parallel computing techniques to a large-scale reservoir simulation

    SciTech Connect

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-02-01

    Even with the continual advances made in both computational algorithms and computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from intensive computational requirement for detailed modeling investigations of real-world reservoirs. This paper presents the application of a massive parallel-computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance.
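
    The computational kernel described above, millions of coupled linear equations solved at every Newton iteration, can be illustrated at toy scale; the sketch below applies Newton's method with a sparse Jacobian solve to a small nonlinear diffusion problem. The problem, sizes, and tolerances are illustrative, and the solve here is serial where the parallel TOUGH2 runs distribute it across processors.

        import numpy as np
        from scipy import sparse
        from scipy.sparse.linalg import spsolve

        # Toy nonlinear problem: -u'' + u^3 = 1 on (0, 1) with u(0) = u(1) = 0,
        # discretized on n interior points. Each Newton step solves one sparse
        # linear system, the step the parallel code distributes at full scale.
        n = 1000
        h = 1.0 / (n + 1)
        L = sparse.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2

        def residual(u):
            return L @ u + u**3 - 1.0

        def jacobian(u):
            # Analytic Jacobian: Laplacian plus the diagonal from d(u^3)/du.
            return (L + sparse.diags(3.0 * u**2)).tocsc()

        u = np.zeros(n)
        for it in range(20):
            r = residual(u)
            if np.linalg.norm(r) < 1e-8:
                break
            u += spsolve(jacobian(u), -r)    # one sparse solve per iteration
        print("Newton iterations:", it, " max u:", u.max())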

  2. Investigation to advance prediction techniques of the low-speed aerodynamics of V/STOL aircraft

    NASA Technical Reports Server (NTRS)

    Maskew, B.; Strash, D.; Nathman, J.; Dvorak, F. A.

    1985-01-01

    A computer program, VSAERO, has been applied to a number of V/STOL configurations with a view to advancing prediction techniques for their low-speed aerodynamic characteristics. The program couples a low-order panel method with surface streamline calculation and integral boundary layer procedures. The panel method, which uses piecewise-constant source and doublet panels, includes an iterative procedure for wake shape and models the boundary layer displacement effect using the source transpiration technique. Certain improvements to a basic vortex tube jet model were installed in the code prior to evaluation. Very promising results were obtained for surface pressures near a jet issuing at 90 deg from a flat plate. A solid core model was used in the initial part of the jet with a simple entrainment model. Preliminary representation of the downstream separation zone significantly improved the correlation. The program accurately predicted the pressure distribution inside the inlet on the Grumman 698-411 design over a range of flight conditions. Furthermore, coupled viscous/potential flow calculations gave very close correlation with experimentally determined operational boundaries dictated by the onset of separation inside the inlet. Experimentally observed degradation of these operational boundaries between nacelle-alone tests and tests on the full configuration was also indicated by the calculations. Application of the program to the General Dynamics STOL fighter design was equally encouraging. Very close agreement was observed between experiment and calculation for the effects of power on pressure distribution, lift, and lift curve slope.
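
    A heavily simplified sketch of the low-order panel-method machinery described above: singularities are placed on a discretized body, a no-penetration condition at control points yields a linear system for their strengths, and surface pressures follow. Here the constant-strength panel influence integrals are approximated by point sources at panel midpoints (with the standard flat-panel self-influence of 1/2), applied to a circular cylinder in a unit freestream; this is far cruder than VSAERO's source/doublet panels and is meant only to show the structure of the calculation.

        import numpy as np

        N = 120
        theta = np.linspace(0.0, 2.0 * np.pi, N + 1)
        xs, ys = np.cos(theta), np.sin(theta)              # panel endpoints (CCW)
        xm = 0.5 * (xs[:-1] + xs[1:])                      # control points
        ym = 0.5 * (ys[:-1] + ys[1:])
        S = np.hypot(np.diff(xs), np.diff(ys))             # panel lengths
        nx, ny = np.diff(ys) / S, -np.diff(xs) / S         # outward unit normals
        tx, ty = -ny, nx                                   # unit tangents

        A = np.zeros((N, N))                               # normal influence
        T = np.zeros((N, N))                               # tangential influence
        for i in range(N):
            dx, dy = xm[i] - xm, ym[i] - ym
            r2 = dx**2 + dy**2
            r2[i] = 1.0                                    # dummy; fixed just below
            A[i] = S * (dx * nx[i] + dy * ny[i]) / (2.0 * np.pi * r2)
            T[i] = S * (dx * tx[i] + dy * ty[i]) / (2.0 * np.pi * r2)
            A[i, i] = 0.5                                  # panel self-influence
            T[i, i] = 0.0

        lam = np.linalg.solve(A, -nx)                      # no-penetration condition
        Vt = tx + T @ lam                                  # surface tangential velocity
        Cp = 1.0 - Vt**2
        # Analytic cylinder result is Cp = 1 - 4*sin(theta)^2:
        print("max |Cp - (1 - 4 y^2)|:", np.abs(Cp - (1.0 - 4.0 * ym**2)).max())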

  3. Computer vision techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar

    1990-01-01

    Rotorcraft operating in high-threat environments fly close to the earth's surface to utilize surrounding terrain, vegetation, or manmade objects to minimize the risk of being detected by an enemy. Increasing levels of concealment are achieved by adopting different tactics during low-altitude flight. Rotorcraft employ three tactics during low-altitude flight: low-level, contour, and nap-of-the-earth (NOE). The key feature distinguishing the NOE mode from the other two modes is that the whole rotorcraft, including the main rotor, is below tree-top whenever possible. This leads to the use of lateral maneuvers for avoiding obstacles, which in fact constitutes the means for concealment. The piloting of the rotorcraft is at best a very demanding task, and the pilot will need help from onboard automation tools in order to devote more time to mission-related activities. The development of an automation tool which has the potential to detect obstacles in the rotorcraft flight path, warn the crew, and interact with the guidance system to avoid detected obstacles presents challenging problems. Research is described which applies techniques from computer vision to automation of rotorcraft navigation. The effort emphasizes the development of a methodology for detecting the ranges to obstacles in the region of interest based on the maximum utilization of passive sensors. The range map derived from the obstacle-detection approach can be used as obstacle data for obstacle avoidance in an automatic guidance system and as an advisory display to the pilot. The lack of suitable flight imagery data presents a problem in the verification of concepts for obstacle detection. This problem is being addressed by the development of an adequate flight database and by preprocessing of currently available flight imagery. The presentation concludes with some comments on future work and how research in this area relates to the guidance of other autonomous vehicles.
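
    As a pocket illustration of passive ranging (one standard motion-stereo approach, not necessarily the method of this paper), the sketch below recovers range from the disparity of tracked features between two camera positions a known baseline apart, Z = f * B / d; the focal length, baseline, and pixel coordinates are invented for the example.

        import numpy as np

        f_px = 800.0    # focal length in pixels (illustrative)
        B = 2.0         # metres travelled laterally between frames (from navigation data)
        u1 = np.array([412.3, 300.8, 255.1])   # feature columns in frame 1 (pixels)
        u2 = np.array([420.1, 310.5, 270.4])   # same features in frame 2 (pixels)

        disparity = u2 - u1                    # pixel shift induced by camera motion
        Z = f_px * B / disparity               # range to each feature (metres)
        print(Z)                               # roughly 205 m, 165 m, 105 m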

  4. Computational structural mechanics and fluid dynamics: Advances and trends; Proceedings of the Symposium, Washington, DC, Oct. 17-19, 1988

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Editor); Dwoyer, Douglas L. (Editor)

    1988-01-01

    Recent advances in computational structural and fluid dynamics are discussed in reviews and reports. Topics addressed include fluid-structure interaction and aeroelasticity, CFD techniques for reacting flows, micromechanics, stability and eigenproblems, probabilistic methods and chaotic dynamics, and perturbation and spectral methods. Consideration is given to finite-element, finite-volume, and boundary-element methods; adaptive methods; parallel processing machines and applications; and visualization, mesh generation, and AI interfaces.

  5. Advances in dental veneers: materials, applications, and techniques.

    PubMed

    Pini, Núbia Pavesi; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite; Lovadino, José Roberto; Terada, Raquel Sano Suga; Pascotto, Renata Corrêa

    2012-01-01

    Laminate veneers are a conservative treatment of unaesthetic anterior teeth. The continued development of dental ceramics offers clinicians many options for creating highly aesthetic and functional porcelain veneers. This evolution of materials, ceramics, and adhesive systems permits improvement of the aesthetic of the smile and the self-esteem of the patient. Clinicians should understand the latest ceramic materials in order to be able to recommend them and their applications and techniques, and to ensure the success of the clinical case. The current literature was reviewed to search for the most important parameters determining the long-term success, correct application, and clinical limitations of porcelain veneers.

  6. Advances in dental veneers: materials, applications, and techniques

    PubMed Central

    Pini, Núbia Pavesi; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite; Lovadino, José Roberto; Terada, Raquel Sano Suga; Pascotto, Renata Corrêa

    2012-01-01

    Laminate veneers are a conservative treatment of unaesthetic anterior teeth. The continued development of dental ceramics offers clinicians many options for creating highly aesthetic and functional porcelain veneers. This evolution of materials, ceramics, and adhesive systems permits improvement of the aesthetic of the smile and the self-esteem of the patient. Clinicians should understand the latest ceramic materials in order to be able to recommend them and their applications and techniques, and to ensure the success of the clinical case. The current literature was reviewed to search for the most important parameters determining the long-term success, correct application, and clinical limitations of porcelain veneers. PMID:23674920

  7. The emerging role of advanced neuroimaging techniques for brain metastases.

    PubMed

    Nowosielski, Martha; Radbruch, Alexander

    2015-06-01

    Brain metastases are an increasingly encountered and frightening manifestation of systemic cancer. More effective therapeutic strategies for the primary tumor are resulting in longer patient survival on the one hand, while on the other, better brain tumor detection has resulted from the increased availability and development of more precise brain imaging methods. This review focuses on the emerging role of functional neuroimaging techniques, magnetic resonance imaging (MRI) as well as positron emission tomography (PET), in establishing diagnosis, in monitoring treatment response, with an emphasis on new targeted as well as immunomodulatory therapies, and in predicting prognosis in patients with brain metastases.

  8. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root-solving algorithm; (2) a time-simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes are contrasted using the NASA Mini-Mast as the focus structure.
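
    A minimal sketch of time-domain parameter estimation in the spirit of this record, with the distributed-parameter PDEMOD models replaced by a single-degree-of-freedom oscillator and all values illustrative: stiffness and damping are recovered by nonlinear least squares from a noisy "measured" displacement history.

        import numpy as np
        from scipy.integrate import odeint
        from scipy.optimize import least_squares

        def response(params, t, x0=(1.0, 0.0), m=1.0):
            # Free response of m*x'' + c*x' + k*x = 0 from initial state x0.
            k, c = params
            def ode(state, _):
                x, v = state
                return [v, -(c * v + k * x) / m]
            return odeint(ode, x0, t)[:, 0]

        t = np.linspace(0.0, 10.0, 400)
        true_params = (4.0, 0.3)                 # true stiffness and damping
        rng = np.random.default_rng(0)
        measured = response(true_params, t) + 0.01 * rng.standard_normal(t.size)

        # Fit (k, c) by minimizing model-minus-measurement residuals, starting
        # from a nearby initial guess to avoid local minima in the fit.
        fit = least_squares(lambda p: response(p, t) - measured,
                            x0=[3.0, 0.5], bounds=([0.0, 0.0], [np.inf, np.inf]))
        print("estimated (k, c):", fit.x)        # close to (4.0, 0.3)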

  9. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described as well as users requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  10. Advances in measuring techniques for turbine cooling test rigs

    NASA Technical Reports Server (NTRS)

    Pollack, F. G.

    1972-01-01

    Surface temperature distribution measurements for turbine vanes and blades were obtained by measuring the infrared energy emitted by the airfoil. The IR distribution can be related to temperature distribution by suitable calibration methods and the data presented in the form of isotherm maps. Both IR photographic and real time electro-optical methods are being investigated. The methods can be adapted to rotating as well as stationary targets, and both methods can utilize computer processing. Pressure measurements on rotating components are made with a rotating system incorporating 10 miniature transducers. A mercury wetted slip ring assembly was used to supply excitation power and as a signal transfer device. The system was successfully tested up to speeds of 9000 rpm and is now being adapted to measure rotating blade airflow quantities in a spin rig and a research engine.

  11. Advanced terahertz techniques for quality control and counterfeit detection

    NASA Astrophysics Data System (ADS)

    Ahi, Kiarash; Anwar, Mehdi

    2016-04-01

    This paper reports our methods for the detection of counterfeit electronics; these versatile techniques are also useful in quality control applications. Terahertz pulsed laser systems can characterize materials and thus make it possible to distinguish between the materials used in authentic components and those in their counterfeit clones. Components with material defects can be identified in the same manner. In this work, different refractive indices and absorption coefficients were observed for counterfeit components compared to their authentic counterparts. The existence of unexpected ingredient materials was detected in counterfeit components by Fourier-transform analysis of the transmitted terahertz pulse. The thicknesses of different layers are obtainable by analyzing the reflected terahertz pulse, and the existence of unexpected layers is also detectable in this manner. Recycled, sanded, and blacktopped counterfeit electronic components were detected as a result of these analyses. Counterfeit ICs with die dislocations were detected by depicting the terahertz raster-scanning data in a coordinate plane, which yields terahertz images. In the same manner, raster scanning of the reflected pulse gives terahertz images of the surfaces of the components, which were used to investigate contaminant materials and sanded points on the surfaces. The results of the latter technique reveal the recycled counterfeit components.
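
    A hedged sketch of the Fourier-transform step this record describes (synthetic data, not the authors' measurements): the complex ratio of sample and reference terahertz pulse spectra yields a transmission amplitude, hence an absorption coefficient, and a phase, hence a refractive-index estimate, per frequency. The pulse shape, sample thickness, and band limits are all assumed for illustration.

        import numpy as np

        dt = 0.05e-12                                # 50 fs sampling interval
        t = np.arange(0.0, 40e-12, dt)

        def pulse(t0, amp=1.0, width=0.3e-12):
            # Gaussian-windowed 1 THz oscillation centered at time t0.
            return amp * np.exp(-((t - t0) / width) ** 2) * np.sin(2e12 * np.pi * (t - t0))

        reference = pulse(5e-12)                     # pulse through free space
        sample = pulse(5e-12 + 1.2e-12, amp=0.6)     # delayed and attenuated

        freq = np.fft.rfftfreq(t.size, dt)
        R, Smp = np.fft.rfft(reference), np.fft.rfft(sample)
        band = (freq > 0.2e12) & (freq < 2.5e12)     # usable part of the spectrum
        ratio = Smp[band] / R[band]

        d = 0.5e-3                                   # assumed sample thickness (m)
        c0 = 3.0e8                                   # speed of light (m/s)
        phase = np.unwrap(np.angle(ratio))
        n = 1.0 - c0 * phase / (2.0 * np.pi * freq[band] * d)   # refractive index
        alpha = -2.0 / d * np.log(np.abs(ratio))                # absorption (1/m)
        print("mean n:", n.mean(), " mean alpha (1/m):", alpha.mean())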

  12. Advanced microscopy techniques resolving complex precipitates in steels

    NASA Astrophysics Data System (ADS)

    Saikaly, W.; Soto, R.; Bano, X.; Issartel, C.; Rigaut, G.; Charaï, A.

    1999-06-01

    Scanning electron microscopy as well as analytical transmission electron microscopy techniques such as high resolution, electron diffraction, energy dispersive X-ray spectrometry (EDX), parallel electron energy loss spectroscopy (PEELS) and elemental mapping via a Gatan Imaging Filter (GIF) have been used to study complex precipitation in commercial dual phase steels microalloyed with titanium. Titanium nitrides, titanium carbosulfides, titanium carbonitrides and titanium carbides were characterized in this study. Both carbon extraction replicas and thin foils were used as sample preparation techniques. On both the microscopic and nanometric scales, it was found that a large amount of precipitation occurred heterogeneously on already existing inclusions/precipitates. CaS inclusions (1 to 2 μm), already present in liquid steel, acted as nucleation sites for TiN precipitating upon the steel's solidification. In addition, TiC nucleated on existing smaller TiN (around 30 to 50 nm). Despite the complexity of such alloys, the statistical analysis conducted on the non-equilibrium samples was found to be in rather good agreement with the theoretical equilibrium calculations. Heterogeneous precipitation must have played a role in bringing these results closer together.

  13. Comparison of three advanced chromatographic techniques for cannabis identification.

    PubMed

    Debruyne, D; Albessard, F; Bigot, M C; Moulin, M

    1994-01-01

    Advances in chromatography technology have made the identification of cannabis samples, which standard analytical laboratories are occasionally required to undertake in the effort to combat drug addiction, easier, quicker, and more reliable. These advances include the increasing availability of easier-to-use mass spectrometers combined with gas chromatography (GC), the use of diode-array or programmable variable-wavelength ultraviolet absorption detectors in conjunction with high-performance liquid chromatography (HPLC), and the availability of scanners capable of reading thin-layer chromatography (TLC) plates in the ultraviolet and visible regions. At laboratories that do not possess GC combined with mass spectrometry, which provides an irrefutable identification, the following procedure involving HPLC or TLC techniques may be used: identification of the chromatographic peaks corresponding to each of the three main cannabis constituents - cannabidiol (CBD), delta-9-tetrahydrocannabinol (delta-9-THC) and cannabinol (CBN) - by comparison with published data in conjunction with a specific absorption spectrum for each of those constituents obtained between 200 and 300 nm. The collection of the fractions corresponding to the three major cannabinoids at the HPLC system outlet, and the cross-checking of their identity by GC with flame ionization detection, can further corroborate the identification and minimize possible errors due to interference.

  14. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  15. Recent Advances in Spaceborne Precipitation Radar Measurement Techniques and Technology

    NASA Technical Reports Server (NTRS)

    Im, Eastwood; Durden, Stephen L.; Tanelli, Simone

    2006-01-01

    NASA is currently developing advanced instrument concepts and technologies for future spaceborne atmospheric radars, with an over-arching objective of making such instruments more capable in supporting future science needs and more cost effective. Two such examples are the Second-Generation Precipitation Radar (PR-2) and the Nexrad-In-Space (NIS). PR-2 is a 14/35-GHz dual-frequency rain radar with a deployable 5-meter, wide-swath scanned membrane antenna, a dual-polarized/dual-frequency receiver, and a realtime digital signal processor. It is intended for Low Earth Orbit (LEO) operations to provide greatly enhanced rainfall profile retrieval accuracy while consuming only a fraction of the mass of the current TRMM Precipitation Radar (PR). NIS is designed to be a 35-GHz Geostationary Earth Orbiting (GEO) radar for providing hourly monitoring of the life cycle of hurricanes and tropical storms. It uses a 35-m, spherical, lightweight membrane antenna and Doppler processing to acquire 3-dimensional information on the intensity and vertical motion of hurricane rainfall.

  16. Advanced Cell Culture Techniques for Cancer Drug Discovery

    PubMed Central

    Lovitt, Carrie J.; Shelper, Todd B.; Avery, Vicky M.

    2014-01-01

    Human cancer cell lines are an integral part of drug discovery practices. However, modeling the complexity of cancer utilizing these cell lines on standard plastic substrata does not accurately represent the tumor microenvironment. Research into developing advanced tumor cell culture models in a three-dimensional (3D) architecture that more precisely characterizes the disease state has been undertaken by a number of laboratories around the world. These 3D cell culture models are particularly beneficial for investigating mechanistic processes and drug resistance in tumor cells. In addition, a range of molecular mechanisms deconstructed by studying cancer cells in 3D models suggests that tumor cells cultured in two-dimensional monolayer conditions do not respond to cancer therapeutics/compounds in a similar manner. Recent studies have demonstrated the potential of utilizing 3D cell culture models in drug discovery programs; however, it is evident that further research is required for the development of more complex models that incorporate the majority of the cellular and physical properties of a tumor. PMID:24887773

  17. Advanced coding techniques for few mode transmission systems.

    PubMed

    Okonkwo, Chigo; van Uden, Roy; Chen, Haoshuo; de Waardt, Huug; Koonen, Ton

    2015-01-26

    We experimentally verify the advantage of employing advanced coding schemes such as space-time coding and four-dimensional modulation formats to enhance the transmission performance of a 3-mode transmission system. The performance gains of space-time block codes for extending the optical signal-to-noise ratio (OSNR) tolerance in multiple-input multiple-output optical coherent spatial-division-multiplexing transmission systems are evaluated with respect to single-mode transmission performance. By exploiting the spatial diversity that few-mode fibers offer, significant OSNR gains of 3.2, 4.1, 4.9, and 6.8 dB at the hard-decision forward-error-correction limit are demonstrated, relative to single-mode fiber back-to-back performance, for DP-QPSK, 8, 16, and 32 QAM, respectively. Furthermore, by employing 4D constellations, 6 x 28 Gbaud 128-ary set-partitioned quadrature amplitude modulation is shown to outperform conventional 8 QAM transmission performance, whilst carrying an additional 0.5 bit/symbol.
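
    The paper's specific space-time block codes are not given here; as a minimal sketch of the family, the code below implements the classic two-branch Alamouti scheme (encoding across two spatial branches and two time slots, with linear combining at the receiver). All parameter values are illustrative.

        import numpy as np

        def alamouti_encode(symbols):
            """Map pairs (s1, s2) onto two transmit branches over two time slots:
               slot 1: (s1, s2); slot 2: (-conj(s2), conj(s1))."""
            s = symbols.reshape(-1, 2)
            tx = np.empty((s.shape[0] * 2, 2), dtype=complex)
            tx[0::2, 0], tx[0::2, 1] = s[:, 0], s[:, 1]
            tx[1::2, 0], tx[1::2, 1] = -np.conj(s[:, 1]), np.conj(s[:, 0])
            return tx

        def alamouti_decode(r, h):
            """Combine two received slots r = (r1, r2) with channel h = (h1, h2)."""
            r1, r2 = r
            h1, h2 = h
            s1 = np.conj(h1) * r1 + h2 * np.conj(r2)
            s2 = np.conj(h2) * r1 - h1 * np.conj(r2)
            return s1, s2  # estimates scaled by |h1|^2 + |h2|^2

        # A QPSK pair through a static two-path channel:
        h = np.array([0.9 + 0.1j, 0.4 - 0.3j])
        s = np.array([1 + 1j, 1 - 1j]) / np.sqrt(2)
        r = alamouti_encode(s) @ h       # received samples for the two slots
        print(alamouti_decode(r, h))

    The combining step cancels the cross-terms exactly, which is why the scheme extracts full two-branch diversity with only linear processing.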

  18. Advances in turbulent mixing techniques to study microsecond protein folding reactions

    PubMed Central

    Kathuria, Sagar V.; Chan, Alexander; Graceffa, Rita; Nobrega, R. Paul; Matthews, C. Robert; Irving, Thomas C.; Perot, Blair; Bilsel, Osman

    2013-01-01

    Recent experimental and computational advances in the protein folding arena have shown that the readout of the one-dimensional sequence information into three-dimensional structure begins within the first few microseconds of folding. The initiation of refolding reactions has been achieved by several means, including temperature jumps, flash photolysis, pressure jumps and rapid mixing methods. One of the most commonly used means of initiating refolding of chemically denatured proteins is by turbulent flow mixing with refolding dilution buffer, where greater than 99% mixing efficiency has been achieved within tens of microseconds. Successful interfacing of turbulent flow mixers with complementary detection methods, including time-resolved Fluorescence Spectroscopy (trFL), Förster Resonance Energy Transfer (FRET), Circular Dichroism (CD), Small-Angle X-ray Scattering (SAXS), Hydrogen Exchange (HX) followed by Mass Spectrometry (MS) and Nuclear Magnetic Resonance Spectroscopy (NMR), Infrared Spectroscopy (IR), and Fourier Transform IR Spectroscopy (FTIR), has made this technique very attractive for monitoring various aspects of structure formation during folding. Although continuous-flow (CF) mixing devices interfaced with trFL detection have a dead time of only 30 µs, burst phases have been detected on this time scale during folding of peptides and of large proteins (e.g., CheY and TIM barrels). Furthermore, a major limitation of the CF mixing technique has been the requirement of large quantities of sample. In this brief communication, we discuss the recent flurry of activity in micromachining and microfluidics, guided by computational simulations, that is likely to lead to dramatic improvements in time resolution and sample consumption for CF mixers over the next few years. PMID:23868289

  19. Advanced fabrication techniques for hydrogen-cooled engine structures

    NASA Technical Reports Server (NTRS)

    Buchmann, O. A.; Arefian, V. V.; Warren, H. A.; Vuigner, A. A.; Pohlman, M. J.

    1985-01-01

    Described is a program for the development of coolant passage geometries, material systems, and joining processes that will produce long-life hydrogen-cooled structures for scramjet applications. Tests were performed to establish basic material properties, and samples were constructed and evaluated to substantiate fabrication processes and inspection techniques. Results of the study show that the basic goal of increasing the life of hydrogen-cooled structures two orders of magnitude relative to that of the Hypersonic Research Engine can be reached with available means. Estimated life is 19,000 cycles for the channels and 16,000 cycles for pin-fin coolant passage configurations using Nickel 201. Additional research is required to establish the fatigue characteristics of dissimilar-metal coolant passages (Nickel 201/Inconel 718) and to investigate the embrittling effects of the hydrogen coolant.

  20. Advanced Process Monitoring Techniques for Safeguarding Reprocessing Facilities

    SciTech Connect

    Orton, Christopher R.; Bryan, Samuel A.; Schwantes, Jon M.; Levitskaia, Tatiana G.; Fraga, Carlos G.; Peper, Shane M.

    2010-11-30

    The International Atomic Energy Agency (IAEA) has established international safeguards standards for fissionable material at spent fuel reprocessing plants to ensure that significant quantities of weapons-grade nuclear material are not diverted from these facilities. For large-throughput nuclear facilities, it is difficult to satisfy the IAEA safeguards accountancy goal for detection of abrupt diversion. Currently, methods to verify material control and accountancy (MC&A) at these facilities require time-consuming and resource-intensive destructive assay (DA). Leveraging new on-line nondestructive assay (NDA) process monitoring techniques in conjunction with the traditional and highly precise DA methods may provide an additional measure of nuclear material accountancy, which would potentially result in a more timely, cost-effective and resource-efficient means of safeguards verification at such facilities. By monitoring process control measurements (e.g., flow rates, temperatures, or concentrations of reagents, products or wastes), abnormal plant operations can be detected. Pacific Northwest National Laboratory (PNNL) is developing on-line NDA process monitoring technologies, including both the Multi-Isotope Process (MIP) Monitor and a spectroscopy-based monitoring system, to potentially reduce the time and resource burden associated with current techniques. The MIP Monitor uses gamma spectroscopy and multivariate analysis to identify off-normal conditions in process streams. The spectroscopic monitor continuously measures chemical compositions of the process streams, including actinide metal ions (U, Pu, Np), selected fission products, and major cold flowsheet chemicals, using UV-Vis, near-IR and Raman spectroscopy. This paper provides an overview of our methods and reports our ongoing efforts to develop and demonstrate the technologies.
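
    A minimal sketch of the kind of multivariate analysis such a monitor relies on, assuming a simple principal-component model of spectra recorded during normal operation; the synthetic data, component count, and residual measure are invented for illustration and are not PNNL's implementation.

        import numpy as np

        def fit_pca(spectra, n_components=3):
            """spectra: (n_measurements, n_channels) array from normal operations."""
            mean = spectra.mean(axis=0)
            _, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
            return mean, vt[:n_components]      # mean spectrum, principal axes

        def residual(spectrum, mean, components):
            """Reconstruction error; unusually large values suggest off-normal conditions."""
            centered = spectrum - mean
            projection = components.T @ (components @ centered)
            return np.linalg.norm(centered - projection)

        # Synthetic demonstration: 50 'normal' spectra, one perturbed spectrum.
        rng = np.random.default_rng(0)
        normal = rng.poisson(1000, size=(50, 256)).astype(float)
        mean, comps = fit_pca(normal)
        test = normal[0].copy()
        test[100:110] *= 1.5                    # an unexpected peak
        print(residual(normal[1], mean, comps), residual(test, mean, comps))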

  1. Computational Modeling and Neuroimaging Techniques for Targeting during Deep Brain Stimulation

    PubMed Central

    Sweet, Jennifer A.; Pace, Jonathan; Girgis, Fady; Miller, Jonathan P.

    2016-01-01

    Accurate surgical localization of the varied targets for deep brain stimulation (DBS) is a process undergoing constant evolution, with increasingly sophisticated techniques to allow for highly precise targeting. However, despite the fastidious placement of electrodes into specific structures within the brain, there is increasing evidence to suggest that the clinical effects of DBS are likely due to the activation of widespread neuronal networks directly and indirectly influenced by the stimulation of a given target. Selective activation of these complex and inter-connected pathways may further improve the outcomes of currently treated diseases by targeting specific fiber tracts responsible for a particular symptom in a patient-specific manner. Moreover, the delivery of such focused stimulation may aid in the discovery of new targets for electrical stimulation to treat additional neurological, psychiatric, and even cognitive disorders. As such, advancements in surgical targeting, computational modeling, engineering designs, and neuroimaging techniques play a critical role in this process. This article reviews the progress of these applications, discussing the importance of target localization for DBS, and the role of computational modeling and novel neuroimaging in improving our understanding of the pathophysiology of diseases, and thus paving the way for improved selective target localization using DBS. PMID:27445709

  2. Computational fluid dynamics in the design and analysis of thermal processes: a review of recent advances.

    PubMed

    Norton, Tomás; Tiwari, Brijesh; Sun, Da Wen

    2013-01-01

    The design of thermal processes in the food industry has undergone great developments in the last two decades due to the availability of cheap computer power alongside advanced modelling techniques such as computational fluid dynamics (CFD). CFD uses numerical algorithms to solve the non-linear partial differential equations of fluid mechanics and heat transfer so that the complex mechanisms that govern many food-processing systems can be resolved. In thermal processing applications, CFD can be used to build three-dimensional models that are both spatially and temporally representative of a physical system, producing solutions with high levels of physical realism without the heavy costs associated with experimental analyses. Therefore, CFD is playing an ever-growing role in the optimization of conventional thermal processes as well as the development of new ones in the food industry. This paper discusses the fundamental aspects involved in developing CFD solutions and forms a state-of-the-art review of various CFD applications in conventional as well as novel thermal processes. The challenges facing CFD modellers of thermal processes are also discussed. From this review it is evident that present-day CFD software, with its rich tapestries of mathematical physics, numerical methods and visualization techniques, is currently recognized as a formidable and pervasive technology which can permit comprehensive analyses of thermal processing.
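
    For a flavor of the numerical machinery involved, the sketch below solves a drastically simplified relative of the problems the review discusses: one-dimensional transient heat conduction in a food slab with fixed surface temperatures. The material properties and geometry are assumed values, not taken from the review.

        import numpy as np

        alpha = 1.4e-7        # thermal diffusivity, m^2/s (typical water-rich food)
        L, n = 0.02, 51       # slab thickness (m) and grid points
        dx = L / (n - 1)
        dt = 0.4 * dx**2 / alpha          # satisfies the explicit stability limit
        T = np.full(n, 20.0)              # initial product temperature, deg C
        T[0] = T[-1] = 90.0               # heating-medium surface temperature

        # Explicit finite-difference update of dT/dt = alpha * d2T/dx2:
        for _ in range(int(600 / dt)):    # simulate 10 minutes
            T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

        print(f"centre temperature after 10 min: {T[n // 2]:.1f} C")

    A real CFD model couples such conduction to fluid flow in three dimensions, which is where the heavy numerical algorithms the review describes come in.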

  3. Computational and experimental advances in drug repositioning for accelerated therapeutic stratification.

    PubMed

    Shameer, Khader; Readhead, Ben; Dudley, Joel T

    2015-01-01

    Drug repositioning is an important component of therapeutic stratification in the precision medicine paradigm. Molecular profiling and more sophisticated analysis of longitudinal clinical data are refining definitions of human diseases, creating needs and opportunities to re-target or reposition approved drugs for alternative indications. Drug repositioning studies have demonstrated success in complex diseases requiring improved therapeutic interventions as well as orphan diseases without any known treatments. An increasing collection of available computational and experimental methods that leverage molecular and clinical data enables diverse drug repositioning strategies. Integration of translational bioinformatics resources, statistical methods, chemoinformatics tools and experimental techniques (including medicinal chemistry techniques) can enable the rapid application of drug repositioning on an increasingly broad scale. Efficient tools are now available for systematic drug-repositioning methods using large repositories of compounds with biological activities. Medicinal chemists along with other translational researchers can play a key role in various aspects of drug repositioning. In this review article, we briefly summarize the history of drug repositioning, explain concepts behind drug repositioning methods, discuss recent computational and experimental advances, and highlight available open-access resources for effective drug repositioning investigations. We also discuss recent approaches to utilizing electronic health records for outcome assessment of drug repositioning, and future avenues of drug repositioning in light of targeting disease comorbidities, underserved patient communities, individualized medicine and socioeconomic impact.

  4. Development of a real-time aeroperformance analysis technique for the X-29A advanced technology demonstrator

    NASA Technical Reports Server (NTRS)

    Ray, R. J.; Hicks, J. W.; Alexander, R. I.

    1988-01-01

    The X-29A advanced technology demonstrator has shown the practicality and advantages of the capability to compute and display, in real time, aeroperformance flight results. This capability includes the calculation of the in-flight measured drag polar, lift curve, and aircraft specific excess power. From these elements many other types of aeroperformance measurements can be computed and analyzed. The technique can be used to give an immediate postmaneuver assessment of data quality and maneuver technique, thus increasing the productivity of a flight program. A key element of this new method was the concurrent development of a real-time in-flight net thrust algorithm, based on the simplified gross thrust method. This net thrust algorithm allows for the direct calculation of total aircraft drag.
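
    A hedged sketch of how such aeroperformance quantities can be assembled once net thrust is available in real time. The force-balance simplifications and all numerical values are assumptions for illustration, not the X-29A algorithm.

        import numpy as np

        def aeroperf_point(thrust, weight, nz, ax, gamma, V, rho, S, g=9.81):
            """thrust: net thrust (N); weight (N); nz: load factor; ax: acceleration
               along the flight path (m/s^2); gamma: flight-path angle (rad);
               V: true airspeed (m/s); rho: air density (kg/m^3); S: wing area (m^2)."""
            q = 0.5 * rho * V**2                    # dynamic pressure
            drag = thrust - (weight / g) * ax - weight * np.sin(gamma)
            lift = nz * weight * np.cos(gamma)      # simplified lift estimate
            Ps = V * (thrust - drag) / weight       # specific excess power (m/s)
            return lift / (q * S), drag / (q * S), Ps   # one drag-polar point + Ps

        # Illustrative level-acceleration data point:
        print(aeroperf_point(thrust=40e3, weight=80e3, nz=1.0, ax=1.2,
                             gamma=0.0, V=180.0, rho=0.9, S=17.5))

    Repeating this over a maneuver traces out the drag polar and lift curve, which is what makes an immediate post-maneuver quality check possible.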

  5. Learning-based computing techniques in geoid modeling for precise height transformation

    NASA Astrophysics Data System (ADS)

    Erol, B.; Erol, S.

    2013-03-01

    Precise determination of the local geoid is of particular importance for establishing height control in geodetic GNSS applications, since the classical leveling technique is too laborious. A geoid model can be accurately obtained from properly distributed benchmarks having both GNSS and leveling observations, using an appropriate computing algorithm. Besides the classical multivariable polynomial regression equations (MPRE), this study evaluates learning-based computing algorithms: artificial neural networks (ANNs), the adaptive network-based fuzzy inference system (ANFIS) and especially the wavelet neural network (WNN) approach to geoid surface approximation. These algorithms were developed in parallel with advances in computer technologies and have recently been used for solving complex nonlinear problems in many applications. However, they are rather new in dealing with the precise modeling of the Earth's gravity field. In the scope of the study, these methods were applied to Istanbul GPS Triangulation Network data. The performances of the methods were assessed considering the validation results of the geoid models at the observation points. In conclusion, the ANFIS and WNN revealed higher prediction accuracies compared to the ANN and MPRE methods. Besides their prediction capabilities, these methods were also compared and discussed from the practical point of view in the conclusions.
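
    A minimal sketch of the classical MPRE baseline mentioned above, assuming synthetic benchmark data: fit a quadratic polynomial surface to geoid undulations N at points with both GNSS and leveling observations, then use it to convert an ellipsoidal height h to an orthometric height via H = h - N.

        import numpy as np

        def design_matrix(x, y):
            # Second-order polynomial surface N(x, y)
            return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

        rng = np.random.default_rng(1)
        x, y = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)   # normalized coords
        N = 36.0 + 0.8 * x - 1.1 * y + 0.3 * x * y + rng.normal(0, 0.02, 200)

        coef, *_ = np.linalg.lstsq(design_matrix(x, y), N, rcond=None)

        # Predict the undulation at a new GNSS point for height transformation:
        N_new = design_matrix(np.array([0.2]), np.array([-0.4])) @ coef
        print(f"predicted undulation: {N_new[0]:.3f} m")

    The learning-based methods in the study replace the fixed polynomial basis with trained network approximators, which is what lets them capture more irregular geoid surfaces.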

  6. Silicon and germanium crystallization techniques for advanced device applications

    NASA Astrophysics Data System (ADS)

    Liu, Yaocheng

    Three-dimensional architectures are believed to be one of the possible approaches to reduce interconnect delay in integrated circuits. Metal-induced crystallization (MIC) can produce reasonably high-quality Si crystals with low-temperature processing, enabling the monolithic integration of multilevel devices and circuits. A two-step MIC process was developed to make single-crystal Si pillars on insulator by forming a single-grain NiSi2 template in the first step and crystallizing the amorphous Si by NiSi2-mediated solid-phase epitaxy (SPE) in the second step. A transmission electron microscopy study clearly showed the quality improvement over the traditional MIC process. Another crystallization technique developed is rapid melt growth (RMG) for the fabrication of Ge crystals and Ge-on-insulator (GeOI) substrates. Ge is an important semiconductor with high carrier mobility and excellent optoelectronic properties. GeOI substrates are particularly desired to achieve high device performance and to solve the process problems traditionally associated with bulk Ge wafers. High-quality Ge crystals and GeOI structures were grown on Si substrates using the novel rapid melt growth technique that integrates the key elements of Czochralski growth: seeding, melting, epitaxy and defect necking. Growth velocity and nucleation rate were calculated to determine the RMG process window. Self-aligned microcrucibles were created to hold the Ge liquid during the RMG annealing. Material characterization showed a very low defect density in the RMG GeOI structures. The Ge films are relaxed, with their orientations controlled by the Si substrates. P-channel MOSFETs and p-i-n photodetectors were fabricated with the GeOI substrates. The device properties are comparable to those obtained with bulk Ge wafers, indicating that the RMG GeOI substrates are well suited for device fabrication. A new theory, growth-induced barrier lowering (GIBL), is proposed to understand the defect generation in

  7. Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques

    NASA Technical Reports Server (NTRS)

    Ramers, Douglas L.

    2005-01-01

    The purpose of this project was to interpret the results of some tests that were performed earlier this year and to demonstrate a possible use of emergence in computing to solve IVHM problems. The test data used were collected with piezoelectric sensors to detect mechanical changes in structures. The project team included Dr. Doug Ramers and Dr. Abdul Jallob of the Summer Faculty Fellowship Program, Arnaldo Colon-Lopez, a student intern from the University of Puerto Rico of Turabo, and John Lassister and Bob Engberg of the Structural and Dynamics Test Group. The tests were performed by Bob Engberg to compare the performance of two types of piezoelectric (piezo) sensors, Pb(Zr(sub 1-x)Ti(sub x))O3, which we will label PZT, and Pb(Zn(sub 1/3)Nb(sub 2/3))O3-PbTiO3, which we will label SCP. The tests were conducted under varying temperature and pressure conditions. One set of tests was done by varying water pressure inside an aluminum liner covered with carbon-fiber composite layers (a cylindrical "bottle" with domed ends) and the other by varying temperatures down to cryogenic levels on some specially prepared composite panels. This report discusses the data from the pressure study; the study of the temperature results was not completed in time for this report. The particular sensing done with these piezo sensors is accomplished by the sensor generating a controlled vibration that is transmitted into the structure to which the sensor is attached, with the same sensor then responding to the induced vibration of the structure. There is a relationship between the mechanical impedance of the structure and the resulting electrical impedance produced in the piezo sensor. The impedance is also a function of the excitation frequency. Changes in the real part of the impedance signature relative to an original reference signature indicate a change in the coupled structure that could be the result of damage or strain. The water pressure tests were conducted by
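
    A hedged sketch of the impedance-signature comparison just described, using a common root-mean-square-deviation damage index over the real part of the signature; the synthetic signatures and the index definition are illustrative, not the project's analysis.

        import numpy as np

        def rmsd_damage_index(z_baseline, z_current):
            """RMSD of Re(Z) across the frequency sweep; larger values indicate a
               larger change in the coupled structure (possible damage or strain)."""
            re_b, re_c = np.real(z_baseline), np.real(z_current)
            return np.sqrt(np.sum((re_c - re_b) ** 2) / np.sum(re_b ** 2))

        # Synthetic signatures over an excitation-frequency sweep:
        f = np.linspace(30e3, 90e3, 400)
        baseline = 100 + 20 / (1 + ((f - 60e3) / 2e3) ** 2) + 0j   # resonance peak
        shifted = 100 + 20 / (1 + ((f - 61e3) / 2e3) ** 2) + 0j    # peak shifted
        print(f"damage index: {rmsd_damage_index(baseline, shifted):.4f}")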

  8. Simulation of an advanced techniques of ion propulsion Rocket system

    NASA Astrophysics Data System (ADS)

    Bakkiyaraj, R.

    2016-07-01

    The ion propulsion rocket (IPR) system is expected to become popular with the development of deuterium and argon gas propellants and hexagonal-shape magnetohydrodynamic (MHD) techniques, because the power is generated indirectly from the ionization chamber. The design thrust is 1.2 N with 40 kW of electric power at high efficiency. The proposed work is a study of MHD power generation through the ionization of deuterium gas and the combination of two gaseous ion species (deuterium ions plus argon ions) at the acceleration stage. The IPR consists of three parts: (1) a hexagonal-shape MHD-based power generator fed by the ionization chamber, (2) an ion accelerator, and (3) an exhaust nozzle. Initially, an energy of around 1312 kJ/mol is required to bring the deuterium gas to the ionization level. The ionized deuterium gas passes from the RF ionization chamber to the nozzle through the MHD generator with enhanced velocity; a voltage is then generated across the two pairs of electrodes in the MHD generator, and thrust is produced by mixing deuterium ions and argon ions at the acceleration stage. The simulation of the IPR system has been carried out in MATLAB. Comparison of the simulation results with theoretical and previous results indicates that the proposed method achieves the target thrust with 40 kW of power.

  9. Advances in Current Rating Techniques for Flexible Printed Circuits

    NASA Technical Reports Server (NTRS)

    Hayes, Ron

    2014-01-01

    Twist Capsule Assemblies are power transfer devices commonly used in spacecraft mechanisms that require electrical signals to be passed across a rotating interface. Flexible printed circuits (flex tapes, see Figure 2) are used to carry the electrical signals in these devices. Determining the current rating for a given trace (conductor) size can be challenging. Because of the thermal conditions present in this environment, the most appropriate approach is to assume that the only means by which heat is removed from the trace is through the conductor itself, so that when the flex tape is long the temperature rise in the trace can be extreme. While this technique represents a worst-case thermal situation that yields conservative current ratings, this conservatism may lead to overly cautious designs when not all traces are used at their full rated capacity. A better understanding of how individual traces behave when they are not all in use is the goal of this research. In the testing done in support of this paper, a representative flex tape used for a flight Solar Array Drive Assembly (SADA) application was tested by energizing individual traces (conductors in the tape) in a vacuum chamber and measuring the temperatures of the tape using both fine-gauge thermocouples and infrared thermographic imaging. We find that traditional derating schemes used for bundles of wires do not apply to the configuration tested. We also determine that single active traces located in the center of a flex tape operate at lower temperatures than those on the outside edges.
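
    A back-of-the-envelope sketch of the conduction-only worst case described above, with assumed copper properties and geometry: if all Joule heat leaves through the trace ends, held at the end temperature, the steady temperature profile is parabolic with a mid-span rise of I^2*rho*L^2/(8*k*A^2).

        # Hedged sketch; not the paper's analysis. rho: copper resistivity (ohm-m),
        # k: copper thermal conductivity (W/m-K), A: trace cross-section (m^2).
        def max_trace_rise(I, L, width, thick, rho=1.7e-8, k=400.0):
            """Peak temperature rise (K) at mid-span of a trace of length L (m)
               carrying current I (A), cooled only by conduction to its ends."""
            A = width * thick
            q_per_len = I**2 * rho / A             # Joule heating per unit length (W/m)
            return q_per_len * L**2 / (8 * k * A)  # parabolic profile, max at center

        # Example: 1 A through a 0.3 m trace, 0.5 mm wide, 35 um copper:
        print(f"dT_max = {max_trace_rise(1.0, 0.3, 0.5e-3, 35e-6):.0f} K")

    The very large rise this model predicts for long traces echoes the paper's point that the conduction-only assumption is extremely conservative.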

  10. Recent advances in techniques for tsetse-fly control*

    PubMed Central

    MacLennan, K. J. R.

    1967-01-01

    With the advent of modern persistent insecticides, it has become possible to utilize some of the knowledge that has accumulated on the ecology and bionomics of Glossina and to devise more effective techniques for the control and eventual extermination of these species. The present article, based on experience of the tsetse fly problem in Northern Nigeria, points out that the disadvantages of control techniques—heavy expenditure of money and manpower and undue damage to the biosystem—can now largely be overcome by basing the application of insecticides on knowledge of the habits of the particular species of Glossina in a particular environment. Two factors are essential to the success of a control project: the proper selection of sites for spraying (the concept of restricted application) and the degree of persistence of the insecticide used. Reinfestation from within or outside the project area must also be taken into account. These and other aspects are discussed in relation to experience gained from a successful extermination project carried out in the Sudan vegetation zone and from present control activities in the Northern Guinea vegetation zone. PMID:5301739

  11. Advanced signal processing technique for damage detection in steel tubes

    NASA Astrophysics Data System (ADS)

    Amjad, Umar; Yadav, Susheel Kumar; Dao, Cac Minh; Dao, Kiet; Kundu, Tribikram

    2016-04-01

    In recent years, ultrasonic guided waves have gained attention for reliable testing and characterization of metals and composites. Guided wave modes are excited and detected by PZT (lead zirconate titanate) transducers in either transmission or reflection mode. In this study, guided waves are excited and detected in the transmission mode, and the phase change of the propagating wave modes is recorded. In most other studies reported in the literature, the change in the received signal strength (amplitude) is investigated with varying degrees of damage, while in this study the change in phase is correlated with the extent of damage. Feature extraction techniques are used for extracting phase and time-frequency information. The main advantage of this approach is that the bonding condition between the transducer and the specimen does not affect the phase, while it can affect the strength of the recorded signal. Therefore, if the specimen is not damaged but the transducer-specimen bond has deteriorated, the received signal strength is altered but the phase remains the same, and thus false-positive predictions of damage can be avoided.
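
    As a hedged sketch of phase-based feature extraction (not the authors' code), the example below recovers the instantaneous phase through the analytic signal and compares a baseline wave packet with a slightly delayed copy; all signal parameters are invented.

        import numpy as np
        from scipy.signal import hilbert

        fs = 10e6                                   # sampling rate, Hz
        t = np.arange(0, 200e-6, 1 / fs)
        tone = np.sin(2 * np.pi * 200e3 * t)
        baseline = tone * np.exp(-((t - 100e-6) / 30e-6) ** 2)   # wave packet
        damaged = np.roll(baseline, 12)             # small arrival delay

        phase_b = np.unwrap(np.angle(hilbert(baseline)))
        phase_d = np.unwrap(np.angle(hilbert(damaged)))

        # Mean phase difference inside the packet as a damage-sensitive feature
        # that is insensitive to amplitude (and hence to bonding quality):
        mask = np.abs(baseline) > 0.1 * np.abs(baseline).max()
        print(f"mean phase shift: {np.mean(phase_d[mask] - phase_b[mask]):.3f} rad")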

  12. Advanced Scientific Computing Environment Team new scientific database management task

    SciTech Connect

    Church, J.P.; Roberts, J.C.; Sims, R.N.; Smetana, A.O.; Westmoreland, B.W.

    1991-06-01

    The mission of the ASCENT Team is to continually keep pace with, evaluate, and select emerging computing technologies to define and implement prototypic scientific environments that maximize the ability of scientists and engineers to manage scientific data. These environments are to be implemented in a manner consistent with the site computing architecture and standards and NRTSC/SCS strategic plans for scientific computing. The major trends in computing hardware and software technology clearly indicate that the future "computer" will be a network environment that comprises supercomputers, graphics boxes, mainframes, clusters, workstations, terminals, and microcomputers. This "network computer" will have an architecturally transparent operating system allowing the applications code to run on any box supplying the required computing resources. The environment will include a distributed database and database management system(s) that permit the use of relational, hierarchical, object-oriented, GIS, and other databases. To reach this goal requires a stepwise progression from the present assemblage of monolithic applications codes running on disparate hardware platforms and operating systems. The first steps include converting from the existing JOSHUA system to a new J80 system that complies with modern language standards; development of a new J90 prototype to provide JOSHUA capabilities on Unix platforms; development of portable graphics tools to greatly facilitate preparation of input and interpretation of output; and extension of "Jvv" concepts and capabilities to distributed and/or parallel computing environments.

  13. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  14. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  15. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss "small-group apprenticeships (SGAs)" as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments…

  16. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research.

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    Discusses small-group apprenticeships (SGAs) as a method for introducing cell culture techniques to high school participants. Teaches cell culture practices and introduces advance imaging techniques to solve various biomedical engineering problems. Clarifies and illuminates the value of small-group laboratory apprenticeships. (Author/KHR)

  17. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.; Stebbins, J. P.; Smith, A. W.; Pullen, K. E.

    1973-01-01

    A method for the prediction of propellant-material compatibility for periods of time up to ten years is presented. Advanced sensitive measurement techniques used in the prediction method are described. These include: neutron activation analysis, radioactive tracer technique, and atomic absorption spectroscopy with a graphite tube furnace sampler. The results of laboratory tests performed to verify the prediction method are presented.

  18. Computer modeling of a wideswath SAR concept employing multiple antenna beam formation techniques

    NASA Technical Reports Server (NTRS)

    Estes, J. M.

    1982-01-01

    A technique for wideswath synthetic aperture radar (SAR) coverage was implemented in the OSS (Orbital SAR Simulation) computer programs. The OSS modifications and the implementation and simulation of the concept are described. The wideswath technique uses multiple formed antenna beams.
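
    A minimal sketch of what forming multiple antenna beams means at the array level, assuming a uniform linear array with half-wavelength element spacing; each beam is simply a vector of per-element phase weights, and several beams can be formed simultaneously from the same elements.

        import numpy as np

        def steering_weights(n_elem, d_over_lambda, theta_rad):
            """Phase weights steering a uniform linear array toward theta."""
            n = np.arange(n_elem)
            return np.exp(-2j * np.pi * d_over_lambda * n * np.sin(theta_rad))

        def array_response(weights, d_over_lambda, theta_rad):
            """Normalized array gain of a weight vector toward angle theta."""
            n = np.arange(len(weights))
            v = np.exp(2j * np.pi * d_over_lambda * n * np.sin(theta_rad))
            return np.abs(weights @ v) / len(weights)

        # Three simultaneous beams covering adjacent sub-swaths:
        angles = (-20, 0, 20)
        beams = [steering_weights(16, 0.5, np.deg2rad(a)) for a in angles]
        for a in angles:   # each beam peaks at its own sub-swath angle
            print([round(array_response(w, 0.5, np.deg2rad(a)), 2) for w in beams])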

  19. Endoscopic therapy for early gastric cancer: Standard techniques and recent advances in ESD

    PubMed Central

    Kume, Keiichiro

    2014-01-01

    The technique of endoscopic submucosal dissection (ESD) is now a well-known endoscopic therapy for early gastric cancer. ESD was introduced to resect large specimens of early gastric cancer in a single piece. ESD can provide precision of histologic diagnosis and can also reduce the recurrence rate. However, the drawback of ESD is its technical difficulty, and, consequently, it is associated with a high rate of complications, the need for advanced endoscopic techniques, and a lengthy procedure time. Various advances in the devices and techniques used for ESD have contributed to overcoming these drawbacks. PMID:24914364

  20. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews of modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMMs). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
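
    As a minimal sketch of the core computation HMMER-style tools accelerate, the code below implements the HMM forward algorithm on a toy two-state model; the states, probabilities, and observations are invented and far simpler than a profile HMM, but the inner recurrence is the part that hardware and software accelerators target.

        import numpy as np

        def forward(obs, start_p, trans_p, emit_p):
            """obs: sequence of observation indices; returns P(obs | model)."""
            alpha = start_p * emit_p[:, obs[0]]
            for o in obs[1:]:
                # Propagate state probabilities, then weight by emission probs.
                alpha = (alpha @ trans_p) * emit_p[:, o]
            return alpha.sum()

        start = np.array([0.6, 0.4])
        trans = np.array([[0.7, 0.3],
                          [0.4, 0.6]])
        emit = np.array([[0.5, 0.4, 0.1],    # emission probabilities per state
                         [0.1, 0.3, 0.6]])
        print(f"sequence likelihood: {forward([0, 1, 2, 1], start, trans, emit):.6f}")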

  1. Advancements in sensing and perception using structured lighting techniques :an LDRD final report.

    SciTech Connect

    Novick, David Keith; Padilla, Denise D.; Davidson, Patrick A. Jr.; Carlson, Jeffrey J.

    2005-09-01

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled "Advancements in Sensing and Perception using Structured Lighting Techniques". There is an ever-increasing need for robust, autonomous ground vehicles for counterterrorism and defense missions. Although there have been nearly 30 years of government-sponsored research, it is undisputed that significant advancements in sensing and perception are still necessary. We developed an innovative, advanced sensing technology for national security missions serving the Department of Energy, the Department of Defense, and other government agencies. The principal goal of this project was to develop an eye-safe, robust, low-cost, lightweight, 3D structured lighting sensor for use in broad-daylight outdoor applications. The market for this technology is wide open due to the unavailability of such a sensor. Currently available laser scanners are slow, bulky, heavy, expensive, fragile, short-range, sensitive to vibration (highly problematic for moving platforms), and unreliable for outdoor use in bright sunlight conditions. Eye-safety issues are a primary concern for currently available laser-based sensors. Passive stereo-imaging sensors are available for 3D sensing but suffer from several limitations: they are computationally intensive, require a lighted environment (natural or man-made light source), and do not work for many scenes or regions lacking texture or with ambiguous texture. Our approach leveraged the advanced capabilities of modern CCD camera technology and Center 6600's expertise in 3D world modeling, mapping, and analysis using structured lighting. We have a diverse customer base for indoor mapping applications, and this research extends our current technology's lifecycle and opens a new market base for outdoor 3D mapping. Applications include precision mapping, autonomous navigation, dexterous manipulation, surveillance and

  2. [Our experience with the treatment of high perianal fistulas with the mucosal flap advancement technique].

    PubMed

    Marino, Giuseppe; Greco, Ettore; Gasparrini, Marcello; Romanzi, Aldo; Ottaviani, Maurizio; Nasi, Stefano; Pasquini, Giorgio

    2004-01-01

    The authors present their experience with the treatment of high transsphincteric anal fistulas with the mucosal flap advancement technique. This technique, though by no means easy to perform, allows fistulas to be treated in a single surgical session, in comparison to the technique in which a seton is used or to the less well-known transposition techniques, with the same long-term results in terms of continence and recurrence rate. After a brief overview of the problem, from the points of view of both aetiopathogenesis and classification, the principal surgical treatment techniques are described, presenting the results and complications observed in the authors' own case series. PMID:15038659

  3. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach to and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth-century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong-interaction dynamics, has transformed our approach to particle and nuclear science. The individual-investigator approach has evolved into teams of scientists from different disciplines working side by side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for computational science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  4. The Impact of Advance Organizers upon Students' Achievement in Computer-Assisted Video Instruction.

    ERIC Educational Resources Information Center

    Saidi, Houshmand

    1994-01-01

    Describes a study of undergraduates that was conducted to determine the impact of advance organizers on students' achievement in computer-assisted video instruction (CAVI). Treatments of the experimental and control groups are explained, and results indicate that advance organizers do not facilitate near-transfer of rule-learning in CAVI.…

  5. Some recent advances in computational aerodynamics for helicopter applications

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.; Baeder, J. D.

    1985-01-01

    The growing application of computational aerodynamics to nonlinear helicopter problems is outlined, with particular emphasis on several recent quasi-two-dimensional examples that used the thin-layer Navier-Stokes equations and an eddy-viscosity model to approximate turbulence. Rotor blade section characteristics can now be calculated accurately over a wide range of transonic flow conditions. However, a finite-difference simulation of the complete flow field about a helicopter in forward flight is not currently feasible, despite the impressive progress that is being made in both two and three dimensions. The principal limitations are today's computer speeds and memories, algorithm and solution methods, grid generation, vortex modeling, structural and aerodynamic coupling, and a shortage of engineers who are skilled in both computational fluid dynamics and helicopter aerodynamics and dynamics.

  6. A Computationally Based Approach to Homogenizing Advanced Alloys

    SciTech Connect

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We will discuss this approach as it is applied to Ni-based superalloys as well as the computationally more complex case of alloys that solidify with more than one matrix phase as a result of segregation, as is typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and the subsequent verification with real castings are presented.
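
    A hedged order-of-magnitude companion to the DICTRA modeling described above: the amplitude of a sinusoidal segregation profile with wavelength equal to the dendrite arm spacing lambda decays as exp(-t/tau), with tau = lambda^2 / (4 pi^2 D). The diffusivity coefficients below are assumed illustrative values, not results from the paper.

        import numpy as np

        def diffusivity(D0, Q, T, R=8.314):
            """Arrhenius diffusivity (m^2/s) at absolute temperature T (K)."""
            return D0 * np.exp(-Q / (R * T))

        def homogenization_time(spacing, D, residual=0.01):
            """Time for the segregation amplitude to fall to the given residual."""
            tau = spacing**2 / (4 * np.pi**2 * D)
            return -tau * np.log(residual)

        # Illustrative values for a slow-diffusing substitutional solute:
        D = diffusivity(D0=1e-4, Q=280e3, T=1473.0)     # anneal at 1200 C
        t = homogenization_time(spacing=100e-6, D=D)
        print(f"D = {D:.2e} m^2/s, anneal time ~ {t / 3600:.1f} h")

    Tools such as DICTRA refine this estimate by handling multicomponent diffusion, realistic profiles, and phase changes rather than a single sinusoid.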

  7. A Novel Microcharacterization Technique in the Measurement of Strain and Orientation Gradient in Advanced Materials

    NASA Technical Reports Server (NTRS)

    Garmestai, H.; Harris, K.; Lourenco, L.

    1997-01-01

    Representation of the morphology and evolution of the microstructure during processing, and their relation to properties, requires proper experimental techniques. Residual strains, lattice distortion, and texture (micro-texture) at the interface and in the matrix of a layered structure or a functionally gradient material, and their variation, are among the parameters important in materials characterization but hard to measure with present experimental techniques. Current techniques available to measure changes in internal material parameters (residual stress, micro-texture, microplasticity) produce results which are either qualitative or unreliable. This problem becomes even more complicated in the case of a temperature variation. These parameters affect many of the mechanical properties of advanced materials, including the stress-strain relation, ductility, creep, and fatigue. A review of some novel experimental techniques using recent advances in electron microscopy is presented here to measure internal stress, (micro)texture, interfacial strength and (sub)grain formation and realignment. Two of these techniques are combined in the chamber of an Environmental Scanning Electron Microscope to measure strain and orientation gradients in advanced materials. These techniques, which include Backscattered Kikuchi Diffractometry (BKD) and Microscopic Strain Field Analysis, are used to characterize metallic and intermetallic matrix composites and superplastic materials. They are compared with the more conventional X-ray diffraction and indentation techniques.

  8. Applications of NLP Techniques to Computer-Assisted Authoring of Test Items for Elementary Chinese

    ERIC Educational Resources Information Center

    Liu, Chao-Lin; Lin, Jen-Hsiang; Wang, Yu-Chun

    2010-01-01

    The authors report an implemented environment for computer-assisted authoring of test items and provide a brief discussion about the applications of NLP techniques for computer assisted language learning. Test items can serve as a tool for language learners to examine their competence in the target language. The authors apply techniques for…

  9. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  10. Improved Discontinuity-capturing Finite Element Techniques for Reaction Effects in Turbulence Computation

    NASA Astrophysics Data System (ADS)

    Corsini, A.; Rispoli, F.; Santoriello, A.; Tezduyar, T. E.

    2006-09-01

    Recent advances in turbulence modeling brought more and more sophisticated turbulence closures (e.g., k-ε, k-ε-v2-f, second-moment closures), where the governing equations for the model parameters involve advection, diffusion, and reaction terms. Numerical instabilities can be generated by the dominant advection or reaction terms. Classical stabilized formulations such as the Streamline-Upwind/Petrov-Galerkin (SUPG) formulation (Brooks and Hughes, Comput Methods Appl Mech Eng 32:199-255, 1982; Hughes and Tezduyar, Comput Methods Appl Mech Eng 45:217-284, 1984) are very well suited for preventing the numerical instabilities generated by the dominant advection terms. A different stabilization, however, is needed for instabilities due to the dominant reaction terms. An additional stabilization term, called the diffusion for reaction-dominated (DRD) term, was introduced by Tezduyar and Park (Comput Methods Appl Mech Eng 59:307-325, 1986) for that purpose and improves the SUPG performance. In recent years a new class of variational multiscale (VMS) stabilization (Hughes, Comput Methods Appl Mech Eng 127:387-401, 1995) has been introduced, and this approach, in principle, can deal with advection-diffusion-reaction equations. However, it was pointed out in Hanke (Comput Methods Appl Mech Eng 191:2925-2947) that this class of methods also needs some improvement in the presence of high reaction rates. In this work we show the benefits of using the DRD operator to enhance core stabilization techniques such as the SUPG and VMS formulations. We also propose a new operator called the DRDJ (DRD with the local variation jump) term, targeting the reduction of numerical oscillations in the presence of both high reaction rates and sharp solution gradients. The methods are evaluated in the context of two stabilized methods: the classical SUPG formulation and a recently developed VMS formulation called the V-SGS (Corsini et al., Comput Methods Appl Mech Eng 194:4797-4823, 2005).
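
    As a schematic illustration (our notation, not the paper's exact formulation), the stabilized weak form of a scalar advection-diffusion-reaction equation a.grad(u) - div(nu grad(u)) + s u = f adds element-level residual-based terms to the Galerkin form; the reaction-weighted DRD term shown is only indicative of how such a term enters alongside the SUPG term.

        \int_\Omega \big( w\,\mathbf{a}\cdot\nabla u^h
          + \nabla w\cdot\nu\nabla u^h + w\,s\,u^h - w\,f \big)\,d\Omega
        + \sum_e \int_{\Omega^e} \tau_{\mathrm{SUPG}}\,
            (\mathbf{a}\cdot\nabla w)\,R(u^h)\,d\Omega
        + \sum_e \int_{\Omega^e} \tau_{\mathrm{DRD}}\, s\, w\, R(u^h)\,d\Omega = 0,
        \quad
        R(u^h) = \mathbf{a}\cdot\nabla u^h - \nabla\cdot(\nu\nabla u^h) + s\,u^h - f.

    The SUPG term damps advection-driven oscillations, while the reaction-weighted term targets the reaction-dominated regime; the paper's DRDJ operator further weights such a term by local solution-gradient jumps.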

  11. A Non-Mathematical Technique for Teaching Binary Computer Concepts.

    ERIC Educational Resources Information Center

    Steele, Fred

    This document describes an aid invented by the author for teaching binary computer concepts in a data processing course for business students unfamiliar with mathematical concepts. It permits the instructor to simulate the inner, invisible operation of storing data electronically. The standard 8-bit "byte" is represented by a portable…

  12. Advances in high-resolution imaging – techniques for three-dimensional imaging of cellular structures

    PubMed Central

    Lidke, Diane S.; Lidke, Keith A.

    2012-01-01

    A fundamental goal in biology is to determine how cellular organization is coupled to function. To achieve this goal, a better understanding of organelle composition and structure is needed. Although visualization of cellular organelles using fluorescence or electron microscopy (EM) has become a common tool for the cell biologist, recent advances are providing a clearer picture of the cell than ever before. In particular, advanced light-microscopy techniques are achieving resolutions below the diffraction limit and EM tomography provides high-resolution three-dimensional (3D) images of cellular structures. The ability to perform both fluorescence and electron microscopy on the same sample (correlative light and electron microscopy, CLEM) makes it possible to identify where a fluorescently labeled protein is located with respect to organelle structures visualized by EM. Here, we review the current state of the art in 3D biological imaging techniques with a focus on recent advances in electron microscopy and fluorescence super-resolution techniques. PMID:22685332

  13. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pasccci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  14. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  15. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous, event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic location of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, testing, and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
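
    A hedged sketch of the result-voting idea described above: redundant copies of a critical task run on different computers, and the system accepts the majority result. The function structure is illustrative, not MAX's actual mechanism.

        from collections import Counter

        def vote(results):
            """Return the majority value among redundant task results, or raise if
               no value achieves a strict majority (a Byzantine-style disagreement)."""
            value, count = Counter(results).most_common(1)[0]
            if count * 2 <= len(results):
                raise RuntimeError("no majority - redundant results disagree")
            return value

        print(vote([42, 42, 42]))        # unanimous
        print(vote([42, 42, 17]))        # one faulty replica outvoted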

  16. Modulation/demodulation techniques for satellite communications. Part 2: Advanced techniques. The linear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory is presented for deducing and predicting the performance of transmitter/receivers for bandwidth-efficient modulations suitable for use on the linear satellite channel. The underlying principle used is the development of receiver structures based on the maximum-likelihood decision rule. The performance prediction tools, e.g., channel cutoff rate and bit error probability transfer function bounds, are then applied to these modulation/demodulation techniques.
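
    A minimal sketch of the maximum-likelihood decision rule on the linear channel with additive white Gaussian noise, where ML detection reduces to choosing the constellation point nearest the received sample; QPSK and the noise level are illustrative choices, not the report's examples.

        import numpy as np

        # QPSK constellation with unit average energy:
        constellation = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)

        def ml_decide(received):
            """Map each received complex sample to the nearest constellation point."""
            distances = np.abs(received[:, None] - constellation[None, :])
            return constellation[np.argmin(distances, axis=1)]

        rng = np.random.default_rng(2)
        sent = constellation[rng.integers(0, 4, 10)]
        noisy = sent + (rng.normal(0, 0.2, 10) + 1j * rng.normal(0, 0.2, 10))
        decided = ml_decide(noisy)
        print(f"symbol errors: {np.count_nonzero(decided != sent)}")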

  17. Study of the interface between numerical and symbolic computing techniques. Progress report

    SciTech Connect

    Not Available

    1983-01-01

    The basic emphasis of our research during the funding period December 1, 1980 to November 30, 1983 has been on the interface between numerical and symbolic computing techniques and on new and improved techniques in both numerical and symbolic computation. The specific research topics are: an analysis of basic iterative methods for matrix equations; the use of symbolic and numerical techniques on the conjectures of Whittaker (in function theory) and Bernstein (in approximation theory); and the integration of symbolic (exact) and numerical (approximate) computing techniques.

  18. Computer aided corrosion surveillance using electrochemical noise techniques

    SciTech Connect

    Reid, S.A.; Quirk, G.P.; Hadfield, M.

    1999-11-01

    Real-time mechanistic analysis of electrochemical noise data is essential for rapid identification of localized corrosion by plant operators. Recent developments towards this goal include: Intelligent Noise Data Reduction techniques to eliminate uninformative data; neural nets which learn how to categorize corrosion mechanisms from data patterns; multivariate analysis which allows the identification of combinations of plant process parameters that cause damage. These techniques can be combined to facilitate pro-active management of the corrosion problem, including consideration of corrosion mechanisms within the plant optimization process.

  19. Comparison of techniques for approximating ocean bottom topography in a wave-refraction computer model

    NASA Technical Reports Server (NTRS)

    Poole, L. R.

    1975-01-01

    A study of the effects of using different methods for approximating bottom topography in a wave-refraction computer model was conducted. Approximation techniques involving quadratic least squares, cubic least squares, and constrained bicubic polynomial interpolation were compared for computed wave patterns and parameters in the region of Saco Bay, Maine. Although substantial local differences can be attributed to use of the different approximation techniques, results indicated that overall computed wave patterns and parameter distributions were quite similar.
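
    As an illustration of the first of these techniques, the Python sketch below fits a quadratic surface to scattered depth soundings by least squares; the data and function names are hypothetical and are not taken from the wave-refraction model itself.

        import numpy as np

        def fit_quadratic_surface(x, y, z):
            """Least-squares fit of z ~ a + b*x + c*y + d*x^2 + e*x*y + f*y^2."""
            A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
            coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
            return coeffs

        def eval_quadratic_surface(coeffs, x, y):
            a, b, c, d, e, f = coeffs
            return a + b * x + c * y + d * x**2 + e * x * y + f * y**2

        # Hypothetical depth soundings at irregular locations.
        rng = np.random.default_rng(0)
        x, y = rng.uniform(0, 1, 50), rng.uniform(0, 1, 50)
        z = 5.0 + 2.0 * x - 1.0 * y + 0.5 * x**2 + rng.normal(0, 0.05, 50)
        coeffs = fit_quadratic_surface(x, y, z)
        print(eval_quadratic_surface(coeffs, 0.5, 0.5))  # interpolated depth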

  20. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  1. UNEDF: Advanced Scientific Computing Transforms the Low-Energy Nuclear Many-Body Problem

    SciTech Connect

    Stoitsov, Mario; Nam, Hai Ah; Nazarewicz, Witold; Bulgac, Aurel; Hagen, Gaute; Kortelainen, E. M.; Pei, Junchen; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S.

    2011-01-01

    The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper illustrates significant milestones accomplished by UNEDF through integration of the theoretical approaches, advanced numerical algorithms, and leadership class computational resources.

  2. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  3. Analytical techniques for reduction of computational effort in reflector antenna analysis

    NASA Astrophysics Data System (ADS)

    Franceschetti, G.

    Techniques used for computing the radiation integral in reflector antenna analysis are briefly reviewed. The techniques discussed include numerical approaches, such as Monte Carlo multidimensional integration and the Ludwig method (1968), asymptotic solutions, expansion techniques, and the sampling approach. It is pointed out that none of the techniques discussed provides optimum results in the full angular range 0-180 deg, and consequently different techniques are generally used in different angular sectors.

  4. POC-Scale Testing of an Advanced Fine Coal Dewatering Equipment/Technique

    SciTech Connect

    Parekh, B. K.; Tao, D.; Groppo, J. G.

    1998-08-28

    Froth flotation is an effective and efficient process for recovering ultra-fine (minus 74 μm) clean coal. Economical dewatering of an ultra-fine clean coal product to a 20% moisture level will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 45 months beginning September 30, 1994. This report discusses technical progress made during the quarter from January 1 - March 31, 1998.

  5. Blending Two Major Techniques in Order to Compute [Pi]

    ERIC Educational Resources Information Center

    Guasti, M. Fernandez

    2005-01-01

    Three major techniques are employed to calculate [pi]. Namely, (i) the perimeter of polygons inscribed or circumscribed in a circle, (ii) calculus based methods using integral representations of inverse trigonometric functions, and (iii) modular identities derived from the transformation theory of elliptic integrals. This note presents a…
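
    To make the first of these techniques concrete, the Python sketch below doubles the number of sides of an inscribed polygon in the manner of Archimedes; it is an illustrative reconstruction, not code from the note.

        import math

        def pi_by_inscribed_polygons(doublings=20):
            """Approximate pi from perimeters of inscribed regular polygons.

            Starts from a hexagon (side length 1 in a unit circle) and
            repeatedly doubles the number of sides. The algebraically
            equivalent form s / sqrt(2 + sqrt(4 - s^2)) is used instead of
            sqrt(2 - sqrt(4 - s^2)) to avoid cancellation error.
            """
            n, s = 6, 1.0
            for _ in range(doublings):
                s = s / math.sqrt(2.0 + math.sqrt(4.0 - s * s))
                n *= 2
            return n * s / 2.0  # semi-perimeter of the n-gon approximates pi

        print(pi_by_inscribed_polygons())  # -> 3.141592653589...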

  6. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    NASA Astrophysics Data System (ADS)

    Nam, H.; Stoitsov, M.; Nazarewicz, W.; Bulgac, A.; Hagen, G.; Kortelainen, M.; Maris, P.; Pei, J. C.; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2012-12-01

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. We illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  7. UNEDF: Advanced Scientific Computing Collaboration Transforms the Low-Energy Nuclear Many-Body Problem

    SciTech Connect

    Nam, H.; Stoitsov, M.; Nazarewicz, W.; Bulgac, A.; Hagen, G.; Kortelainen, M.; Maris, P.; Pei, J. C.; Roche, K. J.; Schunck, N.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2012-12-20

    The demands of cutting-edge science are driving the need for larger and faster computing resources. With the rapidly growing scale of computing systems and the prospect of technologically disruptive architectures to meet these needs, scientists face the challenge of effectively using complex computational resources to advance scientific discovery. Multi-disciplinary collaborating networks of researchers with diverse scientific backgrounds are needed to address these complex challenges. The UNEDF SciDAC collaboration of nuclear theorists, applied mathematicians, and computer scientists is developing a comprehensive description of nuclei and their reactions that delivers maximum predictive power with quantified uncertainties. This paper describes UNEDF and identifies attributes that classify it as a successful computational collaboration. Finally, we illustrate significant milestones accomplished by UNEDF through integrative solutions using the most reliable theoretical approaches, most advanced algorithms, and leadership-class computational resources.

  8. The ergonomics of computer aided design within advanced manufacturing technology.

    PubMed

    John, P A

    1988-03-01

    Many manufacturing companies have now awakened to the significance of computer aided design (CAD), although the majority of them have only been able to purchase computerised draughting systems of which only a subset produce direct manufacturing data. Such companies are moving steadily towards the concept of computer integrated manufacture (CIM), and this demands CAD to address more than draughting. CAD architects are thus having to rethink the basic specification of such systems, although they typically suffer from an insufficient understanding of the design task and have consequently been working with inadequate specifications. It is at this fundamental level that ergonomics has much to offer, making its contribution by encouraging user-centred design. The discussion considers the relationships between CAD and: the design task; the organisation and people; creativity; and artificial intelligence. It finishes with a summary of the contribution of ergonomics.

  9. Modulation/demodulation techniques for satellite communications. Part 3: Advanced techniques. The nonlinear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory for deducing and predicting the performance of transmitter/receivers for bandwidth-efficient modulations suitable for use on the nonlinear satellite channel is presented. The underlying principle used throughout is the development of receiver structures based on the maximum-likelihood decision rule and approximations to it. The bit error probability transfer function bounds developed in great detail in Part 4 are applied to these modulation/demodulation techniques. The effects of the various degrees of receiver mismatch are considered both theoretically and by numerous illustrative examples.

  10. Software Partitioning Schemes for Advanced Simulation Computer Systems. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    Conducted to design software partitioning techniques for use by the Air Force to partition a large flight simulator program for optimal execution on alternative configurations, this study resulted in a mathematical model which defines characteristics for an optimal partition, and a manually demonstrated partitioning algorithm design which…

  11. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  12. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  13. Modeling emergency department operations using advanced computer simulation systems.

    PubMed

    Saunders, C E; Makens, P K; Leblanc, L J

    1989-02-01

    We developed a computer simulation model of emergency department operations using simulation software. This model uses multiple levels of preemptive patient priority; assigns each patient to an individual nurse and physician; incorporates all standard tests, procedures, and consultations; and allows patient service processes to proceed simultaneously, sequentially, repetitively, or a combination of these. Selected input data, including the number of physicians, nurses, and treatment beds, and the blood test turnaround time, then were varied systematically to determine their simulated effect on patient throughput time, selected queue sizes, and rates of resource utilization. Patient throughput time varied directly with laboratory service times and inversely with the number of physician or nurse servers. Resource utilization rates varied inversely with resource availability, and patient waiting time and patient throughput time varied indirectly with the level of patient acuity. The simulation can be animated on a computer monitor, showing simulated patients, specimens, and staff members moving throughout the ED. Computer simulation is a potentially useful tool that can help predict the results of changes in the ED system without actually altering it and may have implications for planning, optimizing resources, and improving the efficiency and quality of care.
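
    Discrete-event queueing simulations of this kind share a common core; the minimal Python sketch below models a first-come-first-served queue with a fixed number of servers and is only a schematic analogue of the ED model described (arrival and service rates are hypothetical).

        import heapq
        import random

        def simulate_queue(n_servers=3, arrival_rate=10.0, service_rate=4.0,
                           n_patients=1000, seed=1):
            """Crude M/M/c simulation; returns mean throughput time (hours)."""
            random.seed(seed)
            t, arrivals = 0.0, []
            for _ in range(n_patients):
                t += random.expovariate(arrival_rate)  # Poisson arrivals
                arrivals.append(t)
            server_free = [0.0] * n_servers            # next-free times
            heapq.heapify(server_free)
            total_time = 0.0
            for arrive in arrivals:
                free = heapq.heappop(server_free)
                start = max(arrive, free)              # wait if all servers busy
                finish = start + random.expovariate(service_rate)
                heapq.heappush(server_free, finish)
                total_time += finish - arrive          # door-to-door time
            return total_time / n_patients

        print(simulate_queue())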

  14. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools to help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, first an innovative mathematical reformulation of the FTR problem is presented which dramatically improves the computational efficiency of the optimization problem. After having re-formulated the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and the performance of the NDS solver are benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable to, and in some cases to outperform, the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
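
    The core auction problem described here is, in its simplest form, a linear program; the toy Python sketch below shows that generic form with an off-the-shelf LP solver and invented shift factors and line limits. It illustrates the problem class only, not the authors' NDS solver.

        import numpy as np
        from scipy.optimize import linprog

        # Toy example: maximize bid-weighted FTR awards subject to DC
        # power-flow line limits. The PTDF matrix and ratings are invented
        # for illustration; they are not data from the paper.
        bids = np.array([30.0, 25.0])           # $/MW bid for two FTR paths
        ptdf = np.array([[ 0.6,  0.2],          # line-flow sensitivities
                         [-0.4,  0.5],
                         [ 0.3, -0.7]])
        limits = np.array([100.0, 80.0, 90.0])  # MW line ratings

        # linprog minimizes, so negate the bids; flows are constrained in
        # both directions, hence the stacked +/- PTDF rows.
        res = linprog(c=-bids,
                      A_ub=np.vstack([ptdf, -ptdf]),
                      b_ub=np.concatenate([limits, limits]),
                      bounds=[(0, None)] * len(bids))
        print(res.x)  # awarded FTR megawatts per path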

  15. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  16. Application of Advanced Magnetic Resonance Imaging Techniques in Evaluation of the Lower Extremity

    PubMed Central

    Braun, Hillary J.; Dragoo, Jason L.; Hargreaves, Brian A.; Levenston, Marc E.; Gold, Garry E.

    2012-01-01

    This article reviews current magnetic resonance imaging techniques for imaging the lower extremity, focusing on imaging of the knee, ankle, and hip joints. Recent advancements in MRI include imaging at 7 Tesla, using multiple receiver channels, T2* imaging, and metal suppression techniques, allowing more detailed visualization of complex anatomy, evaluation of morphological changes within articular cartilage, and imaging around orthopedic hardware. PMID:23622097

  17. Pre- and postprocessing techniques for determining goodness of computational meshes

    NASA Technical Reports Server (NTRS)

    Oden, J. Tinsley; Westermann, T.; Bass, J. M.

    1993-01-01

    Research in error estimation, mesh conditioning, and solution enhancement for finite element, finite difference, and finite volume methods has been incorporated into AUDITOR, a modern, user-friendly code, which operates on 2D and 3D unstructured neutral files to improve the accuracy and reliability of computational results. Residual error estimation capabilities provide local and global estimates of solution error in the energy norm. Higher order results for derived quantities may be extracted from initial solutions. Within the X-MOTIF graphical user interface, extensive visualization capabilities support critical evaluation of results in linear elasticity, steady state heat transfer, and both compressible and incompressible fluid dynamics.

  18. Computational Approaches to Enhance Nanosafety and Advance Nanomedicine

    NASA Astrophysics Data System (ADS)

    Mendoza, Eduardo R.

    With the increasing use of nanoparticles in food processing, filtration/purification and consumer products, as well as the huge potential of their use in nanomedicine, a quantitative understanding of the effects of nanoparticle uptake and transport is needed. We provide examples of novel methods for modeling complex bio-nano interactions which are based on stochastic process algebras. Since model construction presumes sufficient availability of experimental data, recent developments in "nanoinformatics", an emerging discipline analogous to bioinformatics, in building an accessible information infrastructure are subsequently discussed. Both computational areas offer opportunities for Filipinos to engage in collaborative, cutting-edge research in this impactful field.

  19. An integrated computer system for preliminary design of advanced aircraft.

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Sobieszczanski, J.; Landrum, E. J.

    1972-01-01

    A progress report is given on the first phase of a research project to develop a system of Integrated Programs for Aerospace-Vehicle Design (IPAD) which is intended to automate to the largest extent possible the preliminary and detailed design of advanced aircraft. The approach used is to build a pilot system and simultaneously to carry out two major contractual studies to define a practical IPAD system preparatory to programing. The paper summarizes the specifications and goals of the IPAD system, the progress to date, and any conclusion reached regarding its feasibility and scope. Sample calculations obtained with the pilot system are given for aircraft preliminary designs optimized with respect to discipline parameters, such as weight or L/D, and these results are compared with designs optimized with respect to overall performance parameters, such as range or payload.

  20. Clinical decision support systems for brain tumor characterization using advanced magnetic resonance imaging techniques.

    PubMed

    Tsolaki, Evangelia; Kousi, Evanthia; Svolos, Patricia; Kapsalaki, Efthychia; Theodorou, Kyriaki; Kappas, Constantine; Tsougos, Ioannis

    2014-04-28

    In recent years, advanced magnetic resonance imaging (MRI) techniques, such as magnetic resonance spectroscopy, diffusion weighted imaging, diffusion tensor imaging and perfusion weighted imaging have been used in order to resolve demanding diagnostic problems such as brain tumor characterization and grading, as these techniques offer a more detailed and non-invasive evaluation of the area under study. In the last decade a great effort has been made to import and utilize intelligent systems in the so-called clinical decision support systems (CDSS) for automatic processing, classification, evaluation and representation of MRI data in order for advanced MRI techniques to become a part of the clinical routine, since the amount of data from the aforementioned techniques has gradually increased. Hence, the purpose of the current review article is two-fold. The first is to review and evaluate the progress that has been made towards the utilization of CDSS based on data from advanced MRI techniques. The second is to analyze and propose the future work that has to be done, based on the existing problems and challenges, especially taking into account the new imaging techniques and parameters that can be introduced into intelligent systems to significantly improve their diagnostic specificity and clinical application.

  1. Inference on arthropod demographic parameters: computational advances using R.

    PubMed

    Maia, Aline De Holanda Nunes; Pazianotto, Ricardo Antonio De Almeida; Luiz, Alfredo José Barreto; Marinho-Prado, Jeanne Scardini; Pervez, Ahmad

    2014-02-01

    We developed a computer program for life table analysis using the open source, free software programming environment R. It is useful for quantifying chronic nonlethal effects of treatments on arthropod populations by summarizing information on their survival and fertility in key population parameters referred to as fertility life table parameters. Statistical inference on fertility life table parameters is not trivial because it requires the use of computationally intensive methods for variance estimation. Our code presents some advantages with respect to a previous program developed in the Statistical Analysis System: additional multiple comparison tests were incorporated for the analysis of qualitative factors; a module for regression analysis was implemented, thus allowing analysis of quantitative factors such as temperature or agrochemical doses; and availability is guaranteed for users, since it was developed in an open source, free software programming environment. To illustrate the descriptive and inferential analysis implemented in lifetable.R, we present and discuss two examples: 1) a study quantifying the influence of the proteinase inhibitor berenil on the eucalyptus defoliator Thyrinteina arnobia (Stoll) and 2) a study investigating the influence of temperature on demographic parameters of a predaceous ladybird, Hippodamia variegata (Goeze). PMID:24665730
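
    The fertility life table parameters at issue are simple functionals of age-specific survival and fecundity schedules; the Python sketch below computes the net reproductive rate, mean generation time and intrinsic rate of increase from hypothetical cohort data, as a language-neutral illustration of the quantities lifetable.R estimates (the variance-estimation machinery is omitted).

        import numpy as np

        def life_table_params(lx, mx, ages):
            """Net reproductive rate R0, mean generation time T, and the
            intrinsic rate of increase r from the Euler-Lotka equation,
            solved by bisection on the bracket [-1, 1]."""
            lx, mx, ages = map(np.asarray, (lx, mx, ages))
            R0 = np.sum(lx * mx)
            T = np.sum(ages * lx * mx) / R0
            lo, hi = -1.0, 1.0
            for _ in range(100):
                mid = 0.5 * (lo + hi)
                if np.sum(lx * mx * np.exp(-mid * ages)) > 1.0:
                    lo = mid          # sum > 1 means r is still too small
                else:
                    hi = mid
            return R0, T, 0.5 * (lo + hi)

        # Hypothetical cohort: daily survival proportions and eggs/female.
        ages = np.arange(1, 8)
        lx = np.array([1.0, 0.95, 0.9, 0.8, 0.6, 0.4, 0.2])
        mx = np.array([0.0, 2.0, 5.0, 6.0, 4.0, 2.0, 0.5])
        print(life_table_params(lx, mx, ages))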

  2. Softform for facial rejuvenation: historical review, operative techniques, and recent advances.

    PubMed

    Miller, P J; Levine, J; Ahn, M S; Maas, C S; Constantinides, M

    2000-01-01

    The deep nasolabial fold and other facial furrows and wrinkles have challenged the facial plastic surgeon. A variety of techniques have been used in the past to correct these troublesome defects. Advances in the last five years in new materials and design have created a subcutaneous implant that has excellent properties. This article reviews the development and use of Softform facial implant.

  3. Traditional Materials and Techniques Used as Instructional Devices in an Advanced Business Spanish Conversation Class.

    ERIC Educational Resources Information Center

    Valdivieso, Jorge

    Spanish language training at the Thunderbird Graduate School of International Management is discussed, focusing on the instructional materials and classroom techniques used in advanced Spanish conversation classes. While traditional materials (dialogues, dictation, literature, mass media, video- and audiotapes) and learning activities (recitation,…

  4. Recognizing and Managing Complexity: Teaching Advanced Programming Concepts and Techniques Using the Zebra Puzzle

    ERIC Educational Resources Information Center

    Crabtree, John; Zhang, Xihui

    2015-01-01

    Teaching advanced programming can be a challenge, especially when the students are pursuing different majors with diverse analytical and problem-solving capabilities. The purpose of this paper is to explore the efficacy of using a particular problem as a vehicle for imparting a broad set of programming concepts and problem-solving techniques. We…

  5. Real-time application of advanced three-dimensional graphic techniques for research aircraft simulation

    NASA Technical Reports Server (NTRS)

    Davis, Steven B.

    1990-01-01

    Visual aids are valuable assets to engineers for design, demonstration, and evaluation. Discussed here are a variety of advanced three-dimensional graphic techniques used to enhance the displays of test aircraft dynamics. The new software's capabilities are examined and possible future uses are considered.

  6. Fabrication of advanced electrochemical energy materials using sol-gel processing techniques

    NASA Technical Reports Server (NTRS)

    Chu, C. T.; Chu, Jay; Zheng, Haixing

    1995-01-01

    Advanced materials play an important role in electrochemical energy devices such as batteries, fuel cells, and electrochemical capacitors. They are being used as both electrodes and electrolytes. Sol-gel processing is a versatile solution technique used in fabrication of ceramic materials with tailored stoichiometry, microstructure, and properties. The application of sol-gel processing in the fabrication of advanced electrochemical energy materials will be presented. The potential of sol-gel derived materials for electrochemical energy applications will be discussed along with some examples of successful applications. Sol-gel derived metal oxide electrode materials such as V2O5 cathodes have been demonstrated in solid-state thin film batteries; solid electrolyte materials such as beta-alumina for advanced secondary batteries have long been prepared by the sol-gel technique; and high surface area transition metal compounds for capacitive energy storage applications can also be synthesized with this method.

  7. Research into display sharing techniques for distributed computing environments

    NASA Technical Reports Server (NTRS)

    Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.

    1990-01-01

    The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provided flexibility for Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can be the baseline for a production Display Sharing implementation. To facilitate the process the following discussions are presented: theory of operation; system architecture; using the prototype; software description; research tools; prototype evaluation; and outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that manage the user's access to Display Sharing on that machine.

  8. Microplate based biosensing with a computer screen aided technique.

    PubMed

    Filippini, Daniel; Andersson, Tony P M; Svensson, Samuel P S; Lundström, Ingemar

    2003-10-30

    Melanophores, dark pigment cells from the frog Xenopus laevis, have the ability to change light absorbance upon stimulation by different biological agents. Hormone exposure (e.g. melatonin or alpha-melanocyte stimulating hormone) has been used here as a reversible stimulus to test a new compact microplate reading platform. As an application, the detection of the asthma drug formoterol in blood plasma samples is demonstrated. The present system utilizes a computer screen as a (programmable) large area light source, and a standard web camera as recording media enabling even kinetic microplate reading with a versatile and broadly available platform, which suffices to evaluate numerous bioassays. Especially in the context of point of care testing or self testing applications these possibilities become advantageous compared with highly dedicated comparatively expensive commercial systems. PMID:14558996

  10. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Eiseman, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.

  11. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  12. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made 'reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  13. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made 'reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  14. Infrared imaging - A validation technique for computational fluid dynamics codes used in STOVL applications

    NASA Technical Reports Server (NTRS)

    Hardman, R. R.; Mahan, J. R.; Smith, M. H.; Gelhausen, P. A.; Van Dalsem, W. R.

    1991-01-01

    The need for a validation technique for computational fluid dynamics (CFD) codes in STOVL applications has led to research efforts to apply infrared thermal imaging techniques to visualize gaseous flow fields. Specifically, a heated, free-jet test facility was constructed. The gaseous flow field of the jet exhaust was characterized using an infrared imaging technique in the 2 to 5.6 micron wavelength band as well as conventional pitot tube and thermocouple methods. These infrared images are compared to computer-generated images using the equations of radiative exchange based on the temperature distribution in the jet exhaust measured with the thermocouple traverses. Temperature and velocity measurement techniques, infrared imaging, and the computer model of the infrared imaging technique are presented and discussed. From the study, it is concluded that infrared imaging techniques coupled with the radiative exchange equations applied to CFD models are a valid method to qualitatively verify CFD codes used in STOVL applications.

  15. Detection and Sizing of Fatigue Cracks in Steel Welds with Advanced Eddy Current Techniques

    NASA Astrophysics Data System (ADS)

    Todorov, E. I.; Mohr, W. C.; Lozev, M. G.

    2008-02-01

    Butt-welded specimens were fatigued to produce cracks in the weld heat-affected zone. Advanced eddy current (AEC) techniques were used to detect and size the cracks through a coating. AEC results were compared with magnetic particle and phased-array ultrasonic techniques. Validation through destructive crack measurements was also conducted. Factors such as geometry, surface treatment, and crack tightness interfered with depth sizing. AEC inspection techniques have the potential of providing more accurate and complete sizing flaw data for manufacturing and in-service inspections.

  16. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  17. Investigating a hybrid perturbation-Galerkin technique using computer algebra

    NASA Technical Reports Server (NTRS)

    Andersen, Carl M.; Geer, James F.

    1988-01-01

    A two-step hybrid perturbation-Galerkin method is presented for the solution of a variety of differential-equation-type problems which involve a scalar parameter. The resulting (approximate) solution has the form of a sum where each term consists of the product of two functions. The first function is a function of the independent field variable(s) x, and the second is a function of the parameter lambda. In step one the functions of x are determined by forming a perturbation expansion in lambda. In step two the functions of lambda are determined through the use of the classical Bubnov-Galerkin method. The resulting hybrid method has the potential of overcoming some of the drawbacks of the perturbation and Bubnov-Galerkin methods applied separately, while combining some of the good features of each. In particular, the results can be useful well beyond the radius of convergence associated with the perturbation expansion. The hybrid method is applied with the aid of computer algebra to a simple two-point boundary value problem where the radius of convergence is finite and to a quantum eigenvalue problem where the radius of convergence is zero. For both problems the hybrid method apparently converges for an infinite range of the parameter lambda. The results obtained from the hybrid method are compared with approximate solutions obtained by other methods, and the applicability of the hybrid method to broader problem areas is discussed.
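
    In symbols, the two-step construction takes the following generic form (a standard statement of the method, reconstructed here rather than quoted from the paper): the approximation is

        u(x, \lambda) \approx \sum_{i=0}^{N} \delta_i(\lambda)\, u_i(x),

    where step one generates the coordinate functions u_i(x) as coefficients of the perturbation expansion u \sim \sum_i \lambda^i u_i(x), and step two determines the amplitudes \delta_i(\lambda) from the Galerkin orthogonality conditions

        \Big\langle R\Big( \sum_{i=0}^{N} \delta_i(\lambda)\, u_i;\ \lambda \Big),\ u_j \Big\rangle = 0, \qquad j = 0, 1, \ldots, N,

    with R the residual of the governing equation and \langle \cdot, \cdot \rangle a suitable inner product.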

  18. Comparison of Inter-Finger Connection Matrix Computation Techniques

    PubMed Central

    Martin, Joel R.; Terekhov, Alexander V.; Latash, Mark L.; Zatsiorsky, Vladimir M.

    2014-01-01

    A hypothesis was proposed that the central nervous system controls force production by the fingers through hypothetical neural commands (NCs). The NCs are scaled between 0 and 1, indicating no intentional force production or maximal voluntary contraction (MVC) force production, respectively. A matrix of finger inter-connections, [IFC], transforms NCs into finger forces. Two methods have been proposed to compute the [IFC]. The first method uses only single-finger MVC trials and multiplies the [IFC] by a gain factor. The second method uses a neural network (NN) model based on experimental data. The performance of the two methods was compared on the MVC data and on a data set of sub-maximal forces, collected over a range of total forces and moments of force. The methods were compared in terms of: 1) ability to predict finger forces; 2) accuracy of NC reconstruction; and 3) preserved planarity of force data for the sub-maximal force production task. Both methods did a reasonable job of predicting the total force in multi-finger MVC trials; however, the NN model performed better with regard to all other criteria. Overall, the results indicate that for modeling multi-finger interaction the NN method is preferable. PMID:23183029
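
    In matrix form the model maps neural commands c to finger forces f via f = [IFC] c; the Python sketch below uses an invented 4x4 interconnection matrix to show the forward map and a least-squares reconstruction of the commands. It stands in for the general idea, not for either published method.

        import numpy as np

        # Hypothetical inter-finger connection matrix [IFC] (N per unit command):
        # diagonal terms are direct finger gains, off-diagonal terms enslaving.
        IFC = np.array([[28.0,  4.0,  2.0,  1.0],   # index
                        [ 5.0, 24.0,  5.0,  2.0],   # middle
                        [ 2.0,  5.0, 18.0,  4.0],   # ring
                        [ 1.0,  2.0,  4.0, 14.0]])  # little

        commands = np.array([0.9, 0.5, 0.2, 0.1])   # NCs in [0, 1]
        forces = IFC @ commands                     # predicted finger forces (N)

        # Reconstruct neural commands from measured forces (least squares),
        # clipped back into the admissible [0, 1] range.
        recovered = np.clip(np.linalg.lstsq(IFC, forces, rcond=None)[0], 0.0, 1.0)
        print(forces, recovered)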

  19. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    Imai, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study: In this study, based on scenarios of great earthquakes along the Nankai trough, we aim to estimate the run-up and the high-accuracy inundation process of tsunami in coastal areas, including rivers. Using a practical tsunami analytical model that takes into account detailed topography, land use, and climate change in realistic present and expected future environments, we examined the run-up and tsunami inundation process. Using these results we estimated the damage due to tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we provide contents of disaster risk information to be displayed in a tsunami hazard and risk map. 2. Creating a tsunami hazard and risk map: From the analytical and practical tsunami model (a long-wave approximated model) and high-resolution topography (5 m) including detailed data on shorelines, rivers, buildings and houses, we present an advanced analysis of tsunami inundation that considers land use. Based on the results of the tsunami inundation analysis, it is possible to draw a tsunami hazard and risk map with information on human casualties, building damage estimates, drift of vehicles, etc. 3. Contents of disaster prevention information: To improve the distribution of hazard, risk and evacuation information, three steps are necessary. (1) Provide basic information such as tsunami attack information, areas and routes for evacuation, and locations of tsunami evacuation facilities. (2) Provide as additional information the time when inundation starts, past inundation results, locations of facilities with hazardous materials, and the presence or absence of public facilities and underground areas that require evacuation. (3) Provide information to support disaster response, such as infrastructure and traffic network damage prediction

  20. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation) including their propagation over the image sequence and the evaluation via user-defined feature extractors, as well as methods to assess the object's movement conspicuity. In this fifth part in an annual series at the SPIE conference in Orlando, this paper presents the enhancements over the recent year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm is presented, based on template matching, to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.

  1. Advanced imaging techniques for assessment of structure, composition and function in biofilm systems.

    PubMed

    Neu, Thomas R; Manz, Bertram; Volke, Frank; Dynes, James J; Hitchcock, Adam P; Lawrence, John R

    2010-04-01

    Scientific imaging represents an important and accepted research tool for the analysis and understanding of complex natural systems. Apart from traditional microscopic techniques such as light and electron microscopy, new advanced techniques have been established, including laser scanning microscopy (LSM), magnetic resonance imaging (MRI) and scanning transmission X-ray microscopy (STXM). These new techniques allow in situ analysis of the structure, composition, processes and dynamics of microbial communities. The three techniques open up quantitative analytical imaging possibilities that were, until a few years ago, impossible. The microscopic techniques represent powerful tools for examination of mixed environmental microbial communities usually encountered in the form of aggregates and films. As a consequence, LSM, MRI and STXM are being used to study complex microbial biofilm systems. This mini-review provides a short outline of the more recent applications, with the intention of stimulating new research and imaging approaches in microbiology.

  2. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.

    1972-01-01

    The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques that were determined to meet, or could be made to meet, the requirements. Areas of refinement or change were recommended for the improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, the volatile metal chelate analysis. Rivaling neutron activation analysis in sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.

  3. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  4. Computational fluid dynamics research in three-dimensional zonal techniques

    NASA Technical Reports Server (NTRS)

    Walters, Robert W.

    1989-01-01

    Patched-grid algorithms for the analysis of complex configurations with an implicit, upwind-biased Navier-Stokes solver were investigated. Conservative and non-conservative approaches for performing zonal interpolations were implemented. The latter approach yields the most flexible technique in that it can handle both patched and overlaid grids. Results for a two-dimensional blunt body problem show that either approach yields accurate steady-state shock locations and jump conditions. In addition, calculations of the turbulent flow through a hypersonic inlet on a three-zone grid show that the numerical prediction is in good agreement with the experimental results. Through the use of a generalized coordinate transformation at the zonal interface between two or more blocks, the algorithm can be applied to highly stretched viscous grids and to arbitrarily-shaped zonal boundaries. Applications were made to the F-18 aircraft at subsonic, high-alpha conditions, in support of the NASA High-Alpha Research Program. The calculations were compared to ground-based and flight test experiments and were used as a guide to understanding the ground-based tests, which are laminar and transitional, and their relationship to flight. Calculations about a complete reconnaissance aircraft were also performed in order to further demonstrate the capability of the patched-grid algorithm.

  5. Suspended sediment modeling using genetic programming and soft computing techniques

    NASA Astrophysics Data System (ADS)

    Kisi, Ozgur; Dalir, Ali Hosseinzadeh; Cimen, Mesut; Shiri, Jalal

    2012-07-01

    Modeling suspended sediment load is an important factor in water resources engineering as it crucially affects the design and management of water resources structures. In this study the genetic programming (GP) technique was applied for estimating the daily suspended sediment load at two stations on the Cumberland River in the U.S. Daily flow and sediment data from 1972 to 1989 were used to train and test the applied genetic programming models. The effect of various GP operators on sediment load estimation was investigated. The optimal fitness function, operator functions, linking function and learning algorithm were obtained for modeling daily suspended sediment. The GP estimates were compared with those of the Adaptive Neuro-Fuzzy Inference System (ANFIS), Artificial Neural Networks (ANNs) and Support Vector Machine (SVM) results, in terms of coefficient of determination, mean absolute error, coefficient of residual mass and variance accounted for. The comparison results indicated that the GP is superior to the ANFIS, ANN and SVM models in estimating daily suspended sediment load.
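
    The comparison criteria named here are simple functionals of the observed and estimated series; the Python sketch below computes them for hypothetical data, following common hydrology definitions rather than formulas transcribed from the paper.

        import numpy as np

        def comparison_metrics(obs, est):
            """Common goodness-of-fit measures for model evaluation."""
            obs, est = np.asarray(obs, float), np.asarray(est, float)
            resid = obs - est
            r2 = np.corrcoef(obs, est)[0, 1] ** 2      # coeff. of determination
            mae = np.mean(np.abs(resid))               # mean absolute error
            crm = resid.sum() / obs.sum()              # coeff. of residual mass
            vaf = 1.0 - np.var(resid) / np.var(obs)    # variance accounted for
            return dict(R2=r2, MAE=mae, CRM=crm, VAF=vaf)

        obs = [120.0, 340.0, 95.0, 410.0, 230.0]       # observed loads (t/day)
        est = [130.0, 310.0, 110.0, 395.0, 250.0]      # model estimates
        print(comparison_metrics(obs, est))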

  6. Techniques to derive geometries for image-based Eulerian computations

    PubMed Central

    Dillard, Seth; Buchholz, James; Vigmostad, Sarah; Kim, Hyunggun; Udaykumar, H.S.

    2014-01-01

    Purpose: The performance of three frequently used level set-based segmentation methods is examined for the purpose of defining features and boundary conditions for image-based Eulerian fluid and solid mechanics models. The focus of the evaluation is to identify an approach that produces the best geometric representation from a computational fluid/solid modeling point of view. In particular, extraction of geometries from a wide variety of imaging modalities and noise intensities, to supply to an immersed boundary approach, is targeted. Design/methodology/approach: Two- and three-dimensional images, acquired from optical, X-ray CT, and ultrasound imaging modalities, are segmented with active contours, k-means, and adaptive clustering methods. Segmentation contours are converted to level sets and smoothed as necessary for use in fluid/solid simulations. Results produced by the three approaches are compared visually and with contrast ratio, signal-to-noise ratio, and contrast-to-noise ratio measures. Findings: While the active contours method possesses built-in smoothing and regularization and produces continuous contours, the clustering methods (k-means and adaptive clustering) produce discrete (pixelated) contours that require smoothing using speckle-reducing anisotropic diffusion (SRAD). Thus, for images with high contrast and low to moderate noise, active contours are generally preferable. However, adaptive clustering is found to be far superior to the other two methods for images possessing high levels of noise and global intensity variations, due to its more sophisticated use of local pixel/voxel intensity statistics. Originality/value: It is often difficult to know a priori which segmentation will perform best for a given image type, particularly when geometric modeling is the ultimate goal. This work offers insight to the algorithm selection process, as well as outlining a practical framework for generating useful geometric surfaces in an Eulerian setting. PMID
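
    Of the three methods compared, k-means on pixel intensities is the simplest to sketch; the Python example below performs a generic two-class intensity clustering to produce a binary mask. It is illustrative only (random test image, no SRAD smoothing), not the authors' pipeline.

        import numpy as np

        def kmeans_binary_mask(image, iters=50):
            """Two-class k-means on pixel intensities; returns a boolean mask."""
            pixels = np.asarray(image, float).ravel()
            centers = np.percentile(pixels, [25.0, 75.0])  # initial guesses
            for _ in range(iters):
                labels = np.abs(pixels[:, None] - centers[None, :]).argmin(axis=1)
                new_centers = centers.copy()
                for k in (0, 1):                           # update nonempty clusters
                    members = pixels[labels == k]
                    if members.size:
                        new_centers[k] = members.mean()
                if np.allclose(new_centers, centers):
                    break
                centers = new_centers
            return (labels == centers.argmax()).reshape(np.shape(image))

        # Hypothetical noisy image: a bright square on a dark background.
        img = np.random.default_rng(0).normal(0.2, 0.05, (64, 64))
        img[16:48, 16:48] += 0.6
        print(kmeans_binary_mask(img).sum())  # roughly 32*32 foreground pixels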

  7. Use Of The SYSCAP 2.5 Computer Analysis Program For Advanced Optical System Design And Analysis

    NASA Astrophysics Data System (ADS)

    Kleiner, C. T.

    1983-10-01

    The successful development of various electro-optical systems is highly dependent on precise electronic circuit design, which must account for possible parameter drift in the various piece parts. The utilization of a comprehensive computer analysis program (SYSCAP) provides the electro-optical system designer and electro-optical management organization with a well-structured tool for comprehensive system analysis. As a result, the techniques described in this paper can be readily used by the electro-optical design community. An improved version of the SYSCAP computer program (version 2.5) is presented which includes the following new advances: (1) the introduction of a standard macro library that permits call-up of proven mathematical models for system modeling and simulation, (2) the introduction of improved semiconductor models for bipolar junction transistors and p-n junctions, (3) multifunction modeling capability to link signals with very high speed electronic circuit models, (4) high resolution computer graphics (both interactive and batch process) for display and permanent records, and (5) compatibility and interface with advanced engineering work stations. This 2.5 version of the SYSCAP 2 computer analysis program will be available for use through the Control Data Corporation world-wide Cybernet system in 1983. This paper provides an overview of SYSCAP modeling and simulation capabilities.

  8. Computer Modeling of Microbiological Experiments in the Teaching Laboratory: Animation Techniques.

    ERIC Educational Resources Information Center

    Tritz, Gerald J.

    1987-01-01

    Discusses the use of computer assisted instruction in the medical education program of the Kirksville College of Osteopathic Medicine (Missouri). Describes the animation techniques used in a series of simulations for microbiology. (TW)

  9. A technique for computation of star magnitudes relative to an optical sensor

    NASA Technical Reports Server (NTRS)

    Rhoads, J. W.

    1972-01-01

    The theory and techniques used to compute star magnitudes relative to any optical detector (such as the Mariner Mars 1971 Canopus star tracker) are described. Results are given relative to various star detectors.
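
    The quantity being computed is a detector-weighted magnitude; a standard form (reconstructed here, not quoted from the report) is

        m_{\mathrm{det}} = -2.5 \log_{10}\!\left( \int_{0}^{\infty} F(\lambda)\, R(\lambda)\, d\lambda \right) + C,

    where F(\lambda) is the stellar spectral irradiance, R(\lambda) the sensor's spectral response, and C a zero-point constant fixed by a reference star such as Canopus.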

  10. Computing aerodynamic sound using advanced statistical turbulence theories

    NASA Technical Reports Server (NTRS)

    Hecht, A. M.; Teske, M. E.; Bilanin, A. J.

    1981-01-01

    It is noted that the calculation of turbulence-generated aerodynamic sound requires knowledge of the spatial and temporal variation of Q_ij(xi_k, tau), the two-point, two-time turbulent velocity correlations. A technique is presented to obtain an approximate form of these correlations based on closure of the Reynolds stress equations by modeling of higher-order terms. The governing equations for Q_ij are first developed for a general flow. The case of homogeneous, stationary turbulence in a unidirectional, constant-shear mean flow is then assumed. The required closure form for Q_ij is selected which is capable of qualitatively reproducing experimentally observed behavior. This form contains separation-time-dependent scale factors as parameters and depends explicitly on spatial separation. The approximate forms of Q_ij are used in the differential equations and integral moments are taken over the spatial domain. The velocity correlations are used in the Lighthill theory of aerodynamic sound by assuming a normal joint probability.
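
    For orientation, the standard textbook form of Lighthill's far-field solution (not necessarily the paper's exact notation) shows where statistics of the source term, and hence correlations such as Q_ij, enter:

        % Standard Lighthill-analogy far-field form (textbook notation): density
        % fluctuation driven by the stress tensor T_{ij}, whose statistics enter
        % through two-point, two-time correlations such as Q_{ij}.
        \rho'(\mathbf{x},t) \approx \frac{1}{4\pi c_0^4}\,
          \frac{x_i x_j}{|\mathbf{x}|^3}
          \int_V \frac{\partial^2 T_{ij}}{\partial t^2}
          \left(\mathbf{y},\, t - \frac{|\mathbf{x}-\mathbf{y}|}{c_0}\right) d^3\mathbf{y},
        \qquad T_{ij} \approx \rho_0 u_i u_j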

  11. Parallel-META 2.0: Enhanced Metagenomic Data Analysis with Functional Annotation, High Performance Computing and Advanced Visualization

    PubMed Central

    Song, Baoxing; Xu, Jian; Ning, Kang

    2014-01-01

    The metagenomic method directly sequences and analyses genome information from microbial communities. The main computational tasks for metagenomic analyses include taxonomical and functional structure analysis for all genomes in a microbial community (also referred to as a metagenomic sample). With the advancement of Next Generation Sequencing (NGS) techniques, the number of metagenomic samples and the data size for each sample are increasing rapidly. Current metagenomic analysis is both data- and computation-intensive, especially when there are many species in a metagenomic sample and each has a large number of sequences. As such, metagenomic analyses require extensive computational power, and the increasing analytical requirements further augment the challenges for computational analysis. In this work, we have proposed Parallel-META 2.0, a metagenomic analysis software package, to cope with such needs for efficient and fast analyses of taxonomical and functional structures for microbial communities. Parallel-META 2.0 is an extended and improved version of Parallel-META 1.0, which enhances the taxonomical analysis using multiple databases, improves computation efficiency by optimized parallel computing, and supports interactive visualization of results in multiple views. Furthermore, it enables functional analysis for metagenomic samples, including short-read assembly, gene prediction and functional annotation. Therefore, it can provide accurate taxonomical and functional analyses of metagenomic samples in a high-throughput manner and on a large scale. PMID:24595159
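
    The per-sample parallelism described here can be caricatured in a few lines of Python (this is emphatically not Parallel-META's code, which is a compiled package; classify_sample and the input file names are hypothetical): independent samples are simply mapped over a process pool.

        # Generic illustration of per-sample parallel analysis with a process pool.
        from multiprocessing import Pool

        def classify_sample(sample_path):
            """Placeholder for taxonomical/functional classification of one sample."""
            counts = {}
            with open(sample_path) as fh:
                for line in fh:
                    if not line.startswith(">"):        # skip FASTA headers
                        key = line[:8]                  # toy 8-mer "taxon" key
                        counts[key] = counts.get(key, 0) + 1
            return sample_path, counts

        if __name__ == "__main__":
            samples = ["sample_%02d.fasta" % i for i in range(16)]  # hypothetical inputs
            with Pool(processes=4) as pool:                         # 4 parallel workers
                for path, counts in pool.map(classify_sample, samples):
                    print(path, len(counts), "distinct keys")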

  12. Parallel-META 2.0: enhanced metagenomic data analysis with functional annotation, high performance computing and advanced visualization.

    PubMed

    Su, Xiaoquan; Pan, Weihua; Song, Baoxing; Xu, Jian; Ning, Kang

    2014-01-01

    The metagenomic method directly sequences and analyses genome information from microbial communities. The main computational tasks for metagenomic analyses include taxonomical and functional structure analysis for all genomes in a microbial community (also referred to as a metagenomic sample). With the advancement of Next Generation Sequencing (NGS) techniques, the number of metagenomic samples and the data size for each sample are increasing rapidly. Current metagenomic analysis is both data- and computation-intensive, especially when there are many species in a metagenomic sample and each has a large number of sequences. As such, metagenomic analyses require extensive computational power, and the increasing analytical requirements further augment the challenges for computational analysis. In this work, we have proposed Parallel-META 2.0, a metagenomic analysis software package, to cope with such needs for efficient and fast analyses of taxonomical and functional structures for microbial communities. Parallel-META 2.0 is an extended and improved version of Parallel-META 1.0, which enhances the taxonomical analysis using multiple databases, improves computation efficiency by optimized parallel computing, and supports interactive visualization of results in multiple views. Furthermore, it enables functional analysis for metagenomic samples, including short-read assembly, gene prediction and functional annotation. Therefore, it can provide accurate taxonomical and functional analyses of metagenomic samples in a high-throughput manner and on a large scale.

  13. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  14. Nondestructive Characterization by Advanced Synchrotron Light Techniques: Spectromicroscopy and Coherent Radiology

    PubMed Central

    Margaritondo, Giorgio; Hwu, Yeukuang; Je, Jung Ho

    2008-01-01

    The advanced characteristics of synchrotron light have led in recent years to the development of a series of new experimental techniques to investigate chemical and physical properties on a microscopic scale. Although originally developed for materials science and biomedical research, such techniques find increasing applications in other domains, and could be quite useful for the study and conservation of cultural heritage. Specifically, they can nondestructively provide detailed chemical composition information useful for the identification of specimens, for the discovery of historical links based on the sources of chemical raw materials and on chemical processes, for the analysis of damage, its causes and remedies, and for many other issues. Likewise, morphological and structural information on a microscopic scale is useful for the identification, study and preservation of many different cultural and historical specimens. We concentrate here on two classes of techniques. The first is photoemission spectromicroscopy, the result of the advanced evolution of photoemission techniques such as ESCA (Electron Spectroscopy for Chemical Analysis). By combining high lateral resolution with spectroscopy, photoemission spectromicroscopy can deliver fine chemical information on a microscopic scale in a nondestructive fashion. The second class of techniques exploits the high lateral coherence of modern synchrotron sources, a byproduct of the quest for high brightness or brilliance. We will see that such techniques now push radiology into the submicron scale and the submillisecond time domain. Furthermore, they can be implemented in a tomographic mode, increasing the information obtained and becoming potentially quite useful for the analysis of cultural heritage specimens.

  15. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.

  16. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1979-01-01

    System models that provide a basis for the formulation and evaluation of the performability of commercial aircraft computer systems are developed. Quantitative measures of system effectiveness are formulated, and analytic and simulation techniques for evaluating the effectiveness and performability of proposed or existing aircraft computers are studied.
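
    Performability evaluation of this kind is commonly cast as a Markov reward model. The sketch below (assumed rates and reward levels, not figures from the report) builds a small continuous-time Markov chain for a degradable computer and integrates the expected reward over a mission:

        # Sketch of Markov-reward performability: CTMC generator Q, per-state
        # reward rates, expected accumulated reward over a fixed mission time.
        import numpy as np
        from scipy.linalg import expm

        Q = np.array([[-2e-4,    2e-4,   0.0 ],   # state 0: fully up
                      [ 1e-2, -1.02e-2,  2e-4],   # state 1: degraded (repair + failure)
                      [ 0.0,     0.0,    0.0 ]])  # state 2: failed (absorbing)
        reward = np.array([1.0, 0.5, 0.0])        # performance level in each state
        p0 = np.array([1.0, 0.0, 0.0])            # start fully operational

        T, steps = 10.0, 1000                     # 10-hour mission, quadrature steps
        ts = np.linspace(0.0, T, steps)
        # expected instantaneous reward r(t) = p0 . expm(Q t) . reward, integrated in t
        r = [p0 @ expm(Q * t) @ reward for t in ts]
        print("expected accumulated performance:", np.trapz(r, ts))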

  17. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  18. Advance development of a technique for characterizing the thermomechanical properties of thermally stable polymers

    NASA Technical Reports Server (NTRS)

    Gillham, J. K.; Stadnicki, S. J.; Hazony, Y.

    1974-01-01

    The torsional braid experiment has been interfaced with a centralized hierarchical computing system for data acquisition and data processing. Such a system, when matched by the appropriate upgrading of the monitoring techniques, provides high resolution thermomechanical spectra of rigidity and damping, and their derivatives with respect to temperature.
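
    The data-processing step can be pictured with a toy reduction (our sketch, assuming an idealized damped-sinusoid trace): each free-oscillation record yields a relative rigidity proportional to the squared frequency and a damping index given by the logarithmic decrement of successive peaks.

        # Sketch: reduce a damped oscillation trace to relative rigidity and damping.
        import numpy as np

        def rigidity_and_damping(t, theta):
            """Relative rigidity and log decrement from a damped oscillation theta(t)."""
            peaks = [i for i in range(1, len(theta) - 1)
                     if theta[i] > theta[i - 1] and theta[i] > theta[i + 1]]
            periods = np.diff(t[peaks])
            freq = 1.0 / periods.mean()
            decrement = np.mean(np.log(theta[peaks][:-1] / theta[peaks][1:]))
            return freq ** 2, decrement        # rigidity ~ f^2 for a torsion pendulum

        t = np.linspace(0.0, 10.0, 5000)
        theta = np.exp(-0.2 * t) * np.cos(2.0 * np.pi * 1.5 * t)  # synthetic trace
        print(rigidity_and_damping(t, theta))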

  19. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large, specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should provide a tool for designing better aerospace vehicles while reducing development costs, both by performing computations with Navier-Stokes solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; the computer should therefore handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  20. Teaching Computer Ergonomic Techniques: Practices and Perceptions of Secondary and Postsecondary Business Educators.

    ERIC Educational Resources Information Center

    Alexander, Melody W.; Arp, Larry W.

    1997-01-01

    A survey of 260 secondary and 251 postsecondary business educators found the former more likely to think computer ergonomic techniques should be taught in elementary school and to address the hazards of improper use. Both groups stated that over half of the students they observe do not use good techniques and agreed that students need continual…

  1. Optimization of corrosion control for lead in drinking water using computational modeling techniques

    EPA Science Inventory

    Computational modeling techniques have been used to very good effect in the UK in the optimization of corrosion control for lead in drinking water. A “proof-of-concept” project with three US/CA case studies sought to demonstrate that such techniques could work equally well in the...

  2. Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.

    1977-01-01

    The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.

  3. Development of low-cost test techniques for advancing film cooling technology

    NASA Astrophysics Data System (ADS)

    Soechting, F. O.; Landis, K. K.; Dobrowolski, R.

    1987-06-01

    A program for studying advanced film hole geometries that will provide improved film effectiveness levels relative to those reported in the literature is described. A planar wind tunnel was used to conduct flow visualization studies on different film hole shapes, followed by film effectiveness measurements. The most promising geometries were then tested in a two-dimensional cascade to define the film effectiveness distributions, while duplicating a turbine airfoil curvature, Mach number, and acceleration characteristics. The test techniques are assessed and typical results are presented. It was shown that smoke flow visualization is an excellent low-cost technique for observing film coolant-to-mainstream characteristics and that reusable liquid crystal sheets provide an accurate low-cost technique for measuring near-hole film effectiveness contours. Cascade airfoils constructed using specially developed precision fabrication techniques provided high-quality film effectiveness data.

  4. Advances in the surface modification techniques of bone-related implants for last 10 years

    PubMed Central

    Qiu, Zhi-Ye; Chen, Cen; Wang, Xiu-Mei; Lee, In-Seop

    2014-01-01

    When bone-related implants are placed in the human body, a variety of biological responses to the material surface occur, governed by the surface chemistry and physical state. The commonly used biomaterials (e.g. titanium and its alloys, Co–Cr alloy, stainless steel, polyetheretherketone, ultra-high molecular weight polyethylene and various calcium phosphates) have drawbacks such as insufficient biocompatibility and unsuitable mechanical properties. As surface modification is a very promising technology for overcoming such problems, a variety of surface modification techniques have been investigated. This review paper covers recent advances in surface modification techniques for bone-related materials, including physicochemical coating, radiation grafting, plasma surface engineering, ion beam processing and surface patterning techniques. The contents are organized by technique type and applicable materials, and typical examples are also described. PMID:26816626

  5. Advanced semiconductor diagnosis by multidimensional electron-beam-induced current technique.

    PubMed

    Chen, J; Yuan, X; Sekiguchi, T

    2008-01-01

    We present advanced semiconductor diagnosis using the electron-beam-induced current (EBIC) technique. By varying parameters such as temperature, accelerating voltage (V(acc)), bias voltage, and stressing time, it is possible to extend EBIC from conventional defect characterization to advanced device diagnosis. Because an electron beam can excite a certain volume even beneath the surface passivation layer, EBIC can be effectively employed to diagnose complicated devices with hybrid structures. Three topics were selected to demonstrate EBIC applications. First, the recombination activities of grain boundaries and their interaction with Fe impurities in photovoltaic multicrystalline Si (mc-Si) are clarified by temperature-dependent EBIC. Second, the detection of dislocations between strained Si and the SiGe virtual substrate is shown to overcome the limitation of the depletion region. Third, the observation of leakage sites in high-k gate dielectrics is demonstrated for the characterization of advanced hybrid device structures.

  6. Recent advancements in nanoelectrodes and nanopipettes used in combined scanning electrochemical microscopy techniques.

    PubMed

    Kranz, Christine

    2014-01-21

    In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique, from high-resolution electrochemical imaging via nanoscale probes to large-scale mapping using arrays of microelectrodes. A major driving force in advancing SECM methodology is the development of more sophisticated probes beyond conventional micro-disc electrodes, which are usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes, particularly those enabling combinations of SECM with other analytical measurement techniques to provide information beyond electrochemical sample properties alone. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.

  7. POC-scale testing of an advanced fine coal dewatering equipment/technique

    SciTech Connect

    1998-09-01

    The froth flotation technique is an effective and efficient process for recovering ultra-fine (minus 74 μm) clean coal. Economical dewatering of an ultra-fine clean-coal product to a moisture level of 20% will be an important step in the successful implementation of advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 36 months, beginning September 30, 1994. This report discusses technical progress made during the quarter from July 1 - September 30, 1997.

  8. Advanced digital modulation: Communication techniques and monolithic GaAs technology

    NASA Technical Reports Server (NTRS)

    Wilson, S. G.; Oliver, J. D., Jr.; Kot, R. C.; Richards, C. R.

    1983-01-01

    Communications theory and practice are merged with state-of-the-art technology in IC fabrication, especially monolithic GaAs technology, to examine the general feasibility of a number of advanced technology digital transmission systems. Satellite-channel models with (1) superior throughput, perhaps 2 Gbps; (2) attractive weight and cost; and (3) high RF power and spectrum efficiency are discussed. Transmission techniques possessing reasonably simple architectures capable of monolithic fabrication at high speeds were surveyed. This included a review of amplitude/phase shift keying (APSK) techniques and the continuous-phase-modulation (CPM) methods, of which MSK represents the simplest case.
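
    As a reference point for the CPM family mentioned above, a minimal MSK waveform generator is sketched below (textbook construction, not the report's hardware; the sample counts and carrier rate are assumptions): the phase advances linearly by plus or minus pi/2 over each bit interval, giving a constant-envelope, continuous-phase signal.

        # Sketch: MSK as continuous-phase modulation with +/- pi/2 phase per bit.
        import numpy as np

        def msk_waveform(bits, samples_per_bit=50, fc_cycles_per_bit=2.0):
            """MSK: continuous phase, +/- pi/2 net phase change per bit interval."""
            step = np.repeat([1 if b else -1 for b in bits],
                             samples_per_bit) * (np.pi / 2) / samples_per_bit
            phase = np.cumsum(step)                    # continuous phase trajectory
            n = np.arange(len(phase))
            carrier = 2 * np.pi * fc_cycles_per_bit * n / samples_per_bit
            return np.cos(carrier + phase)             # constant-envelope signal

        print(msk_waveform([1, 0, 1, 1, 0]).shape)     # (250,)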

  9. Combined preputial advancement and phallopexy as a revision technique for treating paraphimosis in a dog.

    PubMed

    Wasik, S M; Wallace, A M

    2014-11-01

    A 7-year-old neutered male Jack Russell terrier-cross was presented for signs of recurrent paraphimosis, despite previous surgical enlargement of the preputial ostium. Revision surgery was performed using a combination of preputial advancement and phallopexy, which resulted in complete and permanent coverage of the glans penis by the prepuce, and at 1 year postoperatively, no recurrence of paraphimosis had been observed. The combined techniques allow preservation of the normal penile anatomy, are relatively simple to perform and provide a cosmetic result. We recommend this combination for the treatment of paraphimosis in the dog, particularly when other techniques have failed. PMID:25348145

  10. Development of advanced electron holographic techniques and application to industrial materials and devices.

    PubMed

    Yamamoto, Kazuo; Hirayama, Tsukasa; Tanji, Takayoshi

    2013-06-01

    The development of a transmission electron microscope equipped with a field emission gun paved the way for electron holography to be put to practical use in various fields. In this paper, we review three advanced electron holography techniques: on-line real-time electron holography, three-dimensional (3D) tomographic holography and phase-shifting electron holography, which are becoming important techniques for materials science and device engineering. We also describe some applications of electron holography to the analysis of industrial materials and devices: GaAs compound semiconductors, solid oxide fuel cells and all-solid-state lithium ion batteries.
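
    Phase-shifting holography reduces, at its core, to classical phase-stepping arithmetic. The sketch below shows the standard four-step algorithm (textbook form, not the authors' specific implementation): four holograms recorded with pi/2 phase offsets yield the object phase via an arctangent.

        # Standard four-step phase-shifting reconstruction on synthetic data.
        import numpy as np

        def four_step_phase(I0, I1, I2, I3):
            """Recover phase from I_k = A + B*cos(phi + k*pi/2), k = 0..3."""
            return np.arctan2(I3 - I1, I0 - I2)

        # synthetic check with an assumed phase map
        phi = np.linspace(0, np.pi / 3, 64)[None, :] * np.ones((64, 1))
        A, B = 1.0, 0.5
        frames = [A + B * np.cos(phi + k * np.pi / 2) for k in range(4)]
        print(np.allclose(four_step_phase(*frames), phi))   # True on this branch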

  11. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

    The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system, generated using structured techniques. The requirements definition starts with a mission analysis to identify the high-level control system requirements and functions necessary to fly the mission. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies and, in particular, design-for-validation philosophies.

  12. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improving the numerical linear algebra routines for use in new Navier-Stokes codes, specifically Tim Barth's unstructured grid code, with spin-offs to TRANAIR, is reported. A fast distance calculation routine for Navier-Stokes codes using the new one-equation turbulence models was written. The primary focus of this work was improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.
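
    The matrix-iterative pattern referred to here can be illustrated generically (this is not the Boeing code; the model matrix is an assumption standing in for a CFD Jacobian): a sparse system solved with incomplete-LU-preconditioned GMRES via SciPy.

        # Sketch: preconditioned Krylov iteration, the workhorse pattern for
        # implicit unstructured-grid CFD solvers.
        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import gmres, spilu, LinearOperator

        n = 500
        # assumed 1-D convection-diffusion matrix standing in for a CFD Jacobian
        A = sp.diags([-1.2, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        ilu = spilu(A)                                   # incomplete-LU preconditioner
        M = LinearOperator((n, n), matvec=ilu.solve)
        x, info = gmres(A, b, M=M)
        print("converged" if info == 0 else "failed", np.linalg.norm(A @ x - b))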

  13. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  14. Study of advanced techniques for determining the long term performance of components

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long-term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physical mechanisms of degradation.

  15. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... evacuation payments; time periods. 550.404 Section 550.404 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY ADMINISTRATION (GENERAL) Payments During Evacuation § 550.404 Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the...

  16. 5 CFR 550.404 - Computation of advance payments and evacuation payments; time periods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... evacuation payments; time periods. 550.404 Section 550.404 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY ADMINISTRATION (GENERAL) Payments During Evacuation § 550.404 Computation of advance payments and evacuation payments; time periods. (a) Payments shall be based on the...

  17. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety of ABAP…

  18. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  19. Advanced Telecommunications and Computer Technologies in Georgia Public Elementary School Library Media Centers.

    ERIC Educational Resources Information Center

    Rogers, Jackie L.

    The purpose of this study was to determine what recent progress had been made in Georgia public elementary school library media centers regarding access to advanced telecommunications and computer technologies as a result of special funding. A questionnaire addressed the following areas: automation and networking of the school library media center…

  20. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...