Science.gov

Sample records for advanced computational techniques

  1. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  2. Advanced Computational Techniques in Regional Wave Studies

    DTIC Science & Technology

    1990-01-03

    the new GERESS data. The dissertation work emphasized the development and use of advanced computational techniques for studying regional seismic...hand, the possibility of new data sources at regional distances permits using previously ignored signals. Unfortunately, these regional signals will...the Green's function G_nk(x,t;ξ,τ) around this new reference point contains the propagation effects, and V is the source volume where f_k
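
    The truncated equation (2) in this snippet appears to be the standard elastodynamic representation theorem; the following LaTeX reconstruction is offered as an assumption, since the original is cut off mid-formula:

      u_n(\mathbf{x},t) \;=\; \int_V G_{nk}(\mathbf{x},t;\boldsymbol{\xi},\tau) \ast f_k(\boldsymbol{\xi},\tau)\, dV(\boldsymbol{\xi}) \qquad (2)

    Here G_{nk} is the Green's function about the new reference point, carrying the propagation effects; f_k is the body-force (source) distribution over the source volume V; and \ast denotes convolution in time.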

  3. Advancing crime scene computer forensics techniques

    NASA Astrophysics Data System (ADS)

    Hosmer, Chet; Feldman, John; Giordano, Joe

    1999-02-01

    Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.

  4. Advanced Computational Techniques for Power Tube Design.

    DTIC Science & Technology

    1986-07-01

    fixturing applications, in addition to the existing computer-aided engineering capabilities. Helix TWT Manufacturing has implemented a tooling and fixturing...illustrates the major features of this computer network. The backbone of our system is a Sytek Broadband Network (LAN) which interconnects terminals and...automatic network analyzer (FANA) which electrically characterizes the slow-wave helices of traveling-wave tubes (TWTs) -- both for engineering design

  5. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  6. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels that are possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  7. XII Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, Thomas; Carminati, Federico; Werlen, Monique

    November 2008 will be a few months after the official start of LHC, when the highest quantum energy ever produced by mankind will be observed by the most complex piece of scientific equipment ever built. LHC will open a new era in physics research and push further the frontier of knowledge. This achievement has been made possible by new technological developments in many fields, but computing is certainly the technology that has made this whole enterprise possible. Accelerator and detector design, construction management, data acquisition, detector monitoring, data analysis, event simulation and theoretical interpretation are all computing-based HEP activities, but they also occur in many other research fields. Computing is everywhere and forms the common link between all involved scientists and engineers. The ACAT workshop series, created back in 1990 as AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), has been covering the tremendous evolution of computing in its most advanced topics, trying to set up bridges between computer science and experimental and theoretical physics. Conference web-site: http://acat2008.cern.ch/ Programme and presentations: http://indico.cern.ch/conferenceDisplay.py?confId=34666

  8. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  9. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhancement techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  10. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was 'bridging disciplines'. The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program

  11. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011), which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations, as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facilities Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities, which were, ultimately, the key factors in the

  12. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in Earth sciences, multivariate data analysis, automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source software, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  13. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e. Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spout diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains
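
    As context for the on-line densitometry objective above: gamma-ray densitometry infers a chord-averaged solids holdup from beam attenuation via the Beer-Lambert law. A minimal sketch of that inversion follows; the attenuation coefficients and readings are hypothetical placeholders, not values from this report.

      import math

      def solids_holdup(I, I0, mu_solid, mu_gas, path_len):
          # Beer-Lambert: I = I0 * exp(-mu_eff * L), where the effective linear
          # attenuation is a holdup-weighted mix of the two phases:
          #   mu_eff = eps_s * mu_solid + (1 - eps_s) * mu_gas
          mu_eff = -math.log(I / I0) / path_len
          return (mu_eff - mu_gas) / (mu_solid - mu_gas)

      # Hypothetical numbers: a 0.15 m chord through the coater
      print(solids_holdup(I=3.2e4, I0=5.0e4, mu_solid=8.0, mu_gas=0.05, path_len=0.15))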

  14. Advanced computer techniques for inverse modeling of electric current in cardiac tissue

    SciTech Connect

    Hutchinson, S.A.; Romero, L.A.; Diegert, C.F.

    1996-08-01

    For many years, ECGs and vector cardiograms have been the tools of choice for non-invasive diagnosis of cardiac conduction problems, such as found in reentrant tachycardia or Wolff-Parkinson-White (WPW) syndrome. Through skillful analysis of these skin-surface measurements of cardiac generated electric currents, a physician can deduce the general location of heart conduction irregularities. Using a combination of high-fidelity geometry modeling, advanced mathematical algorithms and massively parallel computing, Sandia's approach would provide much more accurate information and thus allow the physician to pinpoint the source of an arrhythmia or abnormal conduction pathway.
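
    The abstract does not spell out the inversion algorithm; as an illustration of the class of computation involved, the sketch below solves a Tikhonov-regularized least-squares inverse problem mapping simulated skin-surface potentials back to cardiac source strengths. The lead-field matrix and dimensions are invented for the example.

      import numpy as np

      def tikhonov_inverse(A, b, lam):
          # Minimize ||A x - b||^2 + lam^2 ||x||^2 via the normal equations
          n = A.shape[1]
          return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

      rng = np.random.default_rng(0)
      A = rng.standard_normal((64, 200))   # hypothetical lead-field: 64 electrodes, 200 sources
      x_true = np.zeros(200)
      x_true[42] = 1.0                     # one abnormal conduction site
      b = A @ x_true + 0.01 * rng.standard_normal(64)
      x_est = tikhonov_inverse(A, b, lam=0.1)
      print(int(np.argmax(np.abs(x_est)))) # should recover source index 42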

  15. 13th International Workshop on Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, T.; Boudjema, F.; Lauret, J.; Naumann, A.; Teodorescu, L.; Uwer, P.

    "Beyond the Cutting edge in Computing" Fundamental research is dealing, by definition, with the two extremes: the extremely small and the extremely large. The LHC and Astroparticle physics experiments will soon offer new glimpses beyond the current frontiers. And the computing infrastructure to support such physics research needs to look beyond the cutting edge. Once more it seems that we are on the edge of a computing revolution. But perhaps what we are seeing now is a even more epochal change where not only the pace of the revolution is changing, but also its very nature. Change is not any more an "event" meant to open new possibilities that have to be understood first and exploited then to prepare the ground for a new leap. Change is becoming the very essence of the computing reality, sustained by a continuous flow of technical and paradigmatic innovation. The hardware is definitely moving toward more massive parallelism, in a breathtaking synthesis of all the past techniques of concurrent computation. New many-core machines offer opportunities for all sorts of Single/Multiple Instructions, Single/Multiple Data and Vector computations that in the past required specialised hardware. At the same time, all levels of virtualisation imagined till now seem to be possible via Clouds, and possibly many more. Information Technology has been the working backbone of the Global Village, and now, in more than one sense, it is becoming itself the Global Village. Between these two, the gap between the need for adapting applications to exploit the new hardware possibilities and the push toward virtualisation of resources is widening, creating more challenges as technical and intellectual progress continues. ACAT 2010 proposes to explore and confront the different boundaries of the evolution of computing, and its possible consequences on our scientific activity. What do these new technologies entail for physics research? How will physics research benefit from this revolution in

  16. Application of advanced grid generation techniques for flow field computations about complex configurations

    NASA Technical Reports Server (NTRS)

    Kathong, Monchai; Tiwari, Surendra N.

    1988-01-01

    In the computation of flowfields about complex configurations, it is very difficult to construct a boundary-fitted coordinate system. An alternative approach is to use several grids at once, each of which is generated independently. This procedure is called the multiple grids or zonal grids approach; its applications are investigated. The method is conservative, providing conservation of fluxes at grid interfaces. The Euler equations are solved numerically on such grids for various configurations. The numerical scheme used is the finite-volume technique with a three-stage Runge-Kutta time integration. The code is vectorized and programmed to run on the CDC VPS-32 computer. Steady state solutions of the Euler equations are presented and discussed. The solutions include: low speed flow over a sphere, high speed flow over a slender body, supersonic flow through a duct, and supersonic internal/external flow interaction for an aircraft configuration at various angles of attack. The results demonstrate that the multiple grids approach along with the conservative interfacing is capable of computing the flows about the complex configurations where the use of a single grid system is not possible.
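
    For readers unfamiliar with the time integration named here, the sketch below applies a three-stage Runge-Kutta scheme of the multistage type common in finite-volume Euler codes to a 1-D scalar advection stand-in (first-order upwind fluxes on a periodic grid). The coefficient set and the model problem are illustrative assumptions, not the paper's actual scheme.

      import numpy as np

      def rhs(u, dx, a=1.0):
          # Upwind flux divergence for the model problem u_t + a u_x = 0 (a > 0)
          flux = a * u
          return -(flux - np.roll(flux, 1)) / dx

      def rk3_step(u, dt, dx):
          # Multistage scheme u^(k) = u^n + alpha_k dt R(u^(k-1)); the coefficient
          # set (1/3, 1/2, 1) is one common choice in finite-volume solvers.
          u0 = u.copy()
          for alpha in (1.0 / 3.0, 0.5, 1.0):
              u = u0 + alpha * dt * rhs(u, dx)
          return u

      nx = 200
      dx = 1.0 / nx
      x = (np.arange(nx) + 0.5) * dx
      u = np.exp(-200.0 * (x - 0.3) ** 2)   # initial Gaussian pulse
      for _ in range(100):
          u = rk3_step(u, dt=0.4 * dx, dx=dx)
      print(u.max())                        # pulse advected (and slightly diffused)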

  17. Advanced echocardiographic techniques

    PubMed Central

    Perry, Rebecca

    2015-01-01

    Echocardiography has advanced significantly since its first clinical use. The move towards more accurate imaging and quantification has driven this advancement. In this review, we will briefly focus on three distinct but important recent advances: three-dimensional (3D) echocardiography, contrast echocardiography and myocardial tissue imaging. The basic principles of these techniques will be discussed as well as current and future clinical applications. PMID:28191159

  18. Advanced Computer Typography.

    DTIC Science & Technology

    1981-12-01

    ADVANCED COMPUTER TYPOGRAPHY, by A. V. Hershey, Naval Postgraduate School, Monterey, California. Report NPS012-81-005, December 1981. Final report covering the period December 1979 - December 1981.

  19. Research in advanced formal theorem-proving techniques. [design and implementation of computer languages

    NASA Technical Reports Server (NTRS)

    Raphael, B.; Fikes, R.; Waldinger, R.

    1973-01-01

    This report summarises the results of a project aimed at the design and implementation of computer languages to aid in expressing problem-solving procedures in several areas of artificial intelligence, including automatic programming, theorem proving, and robot planning. The principal results of the project were the design and implementation of two complete systems, QA4 and QLISP, and their preliminary experimental use. The various applications of both QA4 and QLISP are given.

  20. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
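
    As a concrete instance of the quantitative techniques discussed (and of the Monte Carlo simulation mentioned at the end), the sketch below estimates a process-performance distribution for a hypothetical three-phase development process; all distributions and parameters are invented for illustration.

      import random

      def simulate_cycle_time(n_trials=100_000):
          # Monte Carlo process-performance model: total cycle time of three
          # phases, each with a triangular(low, high, mode) duration in days.
          totals = []
          for _ in range(n_trials):
              design = random.triangular(10, 20, 14)
              code = random.triangular(15, 40, 25)
              test = random.triangular(5, 25, 10)
              totals.append(design + code + test)
          totals.sort()
          return totals

      t = simulate_cycle_time()
      print("median:", round(t[len(t) // 2], 1), "days;",
            "90th percentile:", round(t[int(0.9 * len(t))], 1), "days")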

  1. Advanced Communication Techniques

    DTIC Science & Technology

    1988-07-01

    described in which joint decoding is accomplished by combining the syndrome information into one system of linear equations which is solved to give...objective of finding the measurement errors for Doppler and range, leading eventually to comparisons of the relative advantages of the systems. Since these...joins in fragmented database systems on both broadcast and nonbroadcast type computer networks is analyzed. Semantic information associated with

  2. Advancement on Visualization Techniques

    DTIC Science & Technology

    1980-10-01

    Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, MA 02139 USA. This AGARDograph was prepared at the request of the...the fields of science and technology relating to aerospace for the following purposes: - Exchanging of scientific and technical information...Techniques for providing the pilot visualization have grown rapidly. Technology has developed from mechanical gauges through electro-mechanical

  3. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Atluri, Satya N.

    1987-01-01

    The development status and range of application of techniques in computational structural mechanics (CSM) are evaluated with a view to advances in computational models for material behavior, discrete-element technology, quality assessment, the control of numerical simulations of structural response, hybrid analysis techniques, techniques for large-scale optimization, and the impact of new computing systems on CSM. Primary pacers of CSM development encompass prediction and analysis of novel materials for structural components, computational strategies for large-scale structural calculations, and the assessment of response prediction reliability together with its adaptive improvement.

  4. Advanced Coating Removal Techniques

    NASA Technical Reports Server (NTRS)

    Seibert, Jon

    2006-01-01

    An important step in the repair and protection against corrosion damage is the safe removal of the oxidation and protective coatings without further damaging the integrity of the substrate. Two such methods that are proving to be safe and effective in this task are liquid nitrogen and laser removal operations. Laser technology used for the removal of protective coatings is currently being researched and implemented in various areas of the aerospace industry. Delivering thousands of focused energy pulses, the laser ablates the coating surface by heating and dissolving the material applied to the substrate. The metal substrate will reflect the laser and redirect the energy to any remaining protective coating, thus preventing any collateral damage the substrate may suffer throughout the process. Liquid nitrogen jets are comparable to blasting with an ultra-high-pressure water jet but without the residual liquid that requires collection and removal. As the liquid nitrogen reaches the surface it is transformed into gaseous nitrogen and reenters the atmosphere without any contamination to surrounding hardware. These innovative technologies simplify corrosion repair by eliminating hazardous chemicals and repetitive manual labor from the coating removal process. One very significant advantage is the reduction of particulate contamination exposure to personnel. With the removal of coatings adjacent to sensitive flight hardware, a benefit of each technique for the space program is that no contamination such as beads, water, or sanding residue is left behind when the job is finished. One primary concern is the safe removal of coatings from thin aluminum honeycomb face sheet. NASA recently conducted thermal testing on liquid nitrogen systems and found that no damage occurred on 1/6" aluminum substrates. Wright-Patterson Air Force Base, in conjunction with Boeing and NASA, is currently testing the laser removal technique for process qualification. Other applications of liquid

  5. Advanced Wavefront Control Techniques

    SciTech Connect

    Olivier, S S; Brase, J M; Avicola, K; Thompson, C A; Kartz, M W; Winters, S; Hartley, R; Wihelmsen, J; Dowla, F V; Carrano, C J; Bauman, B J; Pennington, D M; Lande, D; Sawvel, R M; Silva, D A; Cooke, J B; Brown, C G

    2001-02-21

    In this project, work was performed in four areas: (1) advanced modeling tools for deformable mirrors, (2) low-order wavefront correctors with Alvarez lenses, (3) a direct phase measuring heterodyne wavefront sensor, and (4) high-spatial-frequency wavefront control using spatial light modulators.

  6. Advanced qualification techniques

    SciTech Connect

    Winokur, P.S; Shaneyfelt, M.R.; Meisenheimer, T.L.; Fleetwood, D.M.

    1993-12-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML "builds in" the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish "process capability" is illustrated and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883D, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SSC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.
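
    The SPC element above typically reduces to control charts on radiation-response parameters. A generic X-bar/R control-limit computation is sketched below; the subgroup data and the choice of monitored parameter (threshold-voltage shift per lot) are hypothetical, not taken from the paper.

      def xbar_r_limits(subgroups):
          # Shewhart X-bar chart limits from subgroup ranges; A2 = 0.577 is the
          # standard constant for subgroups of size 5.
          A2 = 0.577
          xbars = [sum(s) / len(s) for s in subgroups]
          grand_mean = sum(xbars) / len(xbars)
          mean_range = sum(max(s) - min(s) for s in subgroups) / len(subgroups)
          return (grand_mean - A2 * mean_range, grand_mean, grand_mean + A2 * mean_range)

      # Hypothetical delta-Vth readings (V) for five wafers per lot, four lots
      lots = [[0.21, 0.24, 0.22, 0.25, 0.23],
              [0.22, 0.20, 0.24, 0.23, 0.21],
              [0.25, 0.26, 0.24, 0.27, 0.25],
              [0.23, 0.22, 0.21, 0.24, 0.22]]
      lcl, center, ucl = xbar_r_limits(lots)
      print(round(lcl, 3), round(center, 3), round(ucl, 3))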

  7. Advances in Computational Astrophysics

    SciTech Connect

    Calder, Alan C.; Kouzes, Richard T.

    2009-03-01

    I was invited to be the guest editor for a special issue of Computing in Science and Engineering along with a colleague from Stony Brook. This is the guest editors' introduction to a special issue of Computing in Science and Engineering. Alan and I have written this introduction and have been the editors for the 4 papers to be published in this special edition.

  8. Techniques in Advanced Language Teaching.

    ERIC Educational Resources Information Center

    Ager, D. E.

    1967-01-01

    For ease of presentation, advanced grammar teaching techniques are briefly considered under the headings of structuralism (belief in the effectiveness of presenting grammar rules) and contextualism (belief in the maximum use by students of what they know in the target language). The structuralist's problem of establishing a syllabus is discussed…

  9. Advanced techniques for microwave reflectometry

    SciTech Connect

    Sanchez, J.; Branas, B.; Luna, E. de la; Estrada, T.; Zhuravlev, V. |; Hartfuss, H.J.; Hirsch, M.; Geist, T.; Segovia, J.; Oramas, J.L.

    1994-12-31

    Microwave reflectometry has been applied in recent years as a plasma diagnostic of increasing interest, mainly due to its simplicity, no need for large access ports and low radiation damage of exposed components. Those characteristics make reflectometry an attractive diagnostic for the next generation of devices. Systems used for either density profile or density fluctuation measurements have also shown great development, from the original single-channel heterodyne to multichannel homodyne receivers. In the present work we discuss three different advanced reflectometer systems developed by CIEMAT members in collaboration with different institutions. The first one is the broadband heterodyne reflectometer installed on W7AS for density fluctuation measurements. The decoupling of the phase and amplitude of the reflected beam allows for quantitative analysis of the fluctuations. Recent results showing the behavior of the density turbulence during the L-H transition on W7AS are presented. The second system shows how the effect of the turbulence can be used for density profile measurements by reflectometry in situations where the complicated geometry of the waveguides cannot avoid many parasitic reflections. Experiments from the TJ-I tokamak are shown. Finally, a reflectometer system based on the amplitude modulation (AM) technique for density profile measurements is discussed and experimental results from the TJ-I tokamak are shown. The AM system offers the advantage of being almost insensitive to the effect of fluctuations. It is able to take a direct measurement of the time delay of the microwave pulse which propagates to the reflecting layer and is reflected back. In order to achieve fast reconstruction for real-time monitoring of the density profile, the application of neural network algorithms will be presented; the method can reduce the computing times by about three orders of magnitude. 10 refs., 10 figs.
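
    The AM technique reduces to measuring the phase shift of the modulation envelope, from which the group delay follows as tau = delta_phi / (2*pi*f_mod). A minimal sketch with synthetic signals; all frequencies and the delay are assumed demo values, not TJ-I parameters.

      import numpy as np

      f_mod = 100e6                      # envelope modulation frequency (assumed, Hz)
      fs = 2e9                           # sampling rate (assumed, Hz)
      t = np.arange(4096) / fs
      tau_true = 3.2e-9                  # round-trip delay to the reflecting layer (s)
      ref = np.cos(2 * np.pi * f_mod * t)               # launched envelope
      sig = np.cos(2 * np.pi * f_mod * (t - tau_true))  # reflected envelope

      def phase_at(x, f, t):
          # Quadrature demodulation: phase of x at frequency f
          i = np.mean(x * np.cos(2 * np.pi * f * t))
          q = np.mean(x * np.sin(2 * np.pi * f * t))
          return np.arctan2(-q, i)

      dphi = (phase_at(ref, f_mod, t) - phase_at(sig, f_mod, t)) % (2 * np.pi)
      tau_est = dphi / (2 * np.pi * f_mod)
      print(tau_est)   # ~3.2e-9 s; unambiguous while tau < 1/f_mod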

  10. Individualized Instruction Using Computer Techniques

    ERIC Educational Resources Information Center

    Castleberry, S.; Lagowski, J. J.

    1970-01-01

    Explains how computer-based instructional techniques are being used to individualize general chemistry instruction. After describing the computer equipment and language used, author describes simulated experiments and computer programmed drill exercises. Outlines fifteen topics programmed for instruction. Comparisons were made between experimental…

  11. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    NASA Technical Reports Server (NTRS)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  12. Neutron analysis of spent fuel storage installation using parallel computing and advance discrete ordinates and Monte Carlo techniques.

    PubMed

    Shedlock, Daniel; Haghighat, Alireza

    2005-01-01

    In the United States, the Nuclear Waste Policy Act of 1982 mandated centralised storage of spent nuclear fuel by 1988. However, the Yucca Mountain project is currently scheduled to start accepting spent nuclear fuel in 2010. Since many nuclear power plants were only designed for ~10 y of spent fuel pool storage, >35 plants have been forced into alternate means of spent fuel storage. In order to continue operation and make room in spent fuel pools, nuclear generators are turning towards independent spent fuel storage installations (ISFSIs). Typical vertical concrete ISFSIs are ~6.1 m high and 3.3 m in diameter. The inherently large system, and the presence of thick concrete shields result in difficulties for both Monte Carlo (MC) and discrete ordinates (SN) calculations. MC calculations require significant variance reduction and multiple runs to obtain a detailed dose distribution. SN models need a large number of spatial meshes to accurately model the geometry and high quadrature orders to reduce ray effects, therefore requiring significant amounts of computer memory and time. The use of various differencing schemes is needed to account for radial heterogeneity in material cross sections and densities. Two P3, S12, discrete ordinate, PENTRAN (parallel environment neutral-particle TRANsport) models were analysed and different MC models compared. A multigroup MCNP model was developed for direct comparison to the SN models. The biased A3MCNP (automated adjoint accelerated MCNP) and unbiased (MCNP) continuous energy MC models were developed to assess the adequacy of the CASK multigroup (22 neutron, 18 gamma) cross sections. The PENTRAN SN results are in close agreement (5%) with the multigroup MC results; however, they differ by ~20-30% from the continuous-energy MC predictions. This large difference can be attributed to the expected difference between multigroup and continuous energy cross sections, and the fact that the CASK library is based on the old ENDF
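
    To make the MC-versus-SN trade-off above concrete, here is a deliberately naive monoenergetic, 1-D slab Monte Carlo transmission estimate. Cross sections and thickness are arbitrary demo values; note how deep-penetration problems (a thick ISFSI overpack) starve such an unbiased analog estimator of tally counts, which is exactly why variance reduction or an adjoint-biased scheme like A3MCNP is needed.

      import math, random

      def transmitted_fraction(sigma_t, sigma_s, thickness, histories=200_000):
          # Unbiased analog Monte Carlo: monoenergetic particles, isotropic
          # scattering, 1-D slab of the given thickness.
          transmitted = 0
          for _ in range(histories):
              x, mu = 0.0, 1.0                       # born on left face, moving right
              while True:
                  x += mu * (-math.log(random.random()) / sigma_t)
                  if x >= thickness:
                      transmitted += 1
                      break
                  if x < 0.0:                        # leaked back out the near face
                      break
                  if random.random() < sigma_s / sigma_t:
                      mu = random.uniform(-1.0, 1.0) # isotropic scatter
                  else:
                      break                          # absorbed
          return transmitted / histories

      print(transmitted_fraction(sigma_t=0.5, sigma_s=0.3, thickness=10.0))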

  13. LHC Olympics: Advanced Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Armour, Kyle; Larkoski, Andrew; Gray, Amanda; Ventura, Dan; Walsh, Jon; Schabinger, Rob

    2006-05-01

    The LHC Olympics is a series of workshops aimed at encouraging theorists and experimentalists to prepare for the soon-to-be-online Large Hadron Collider in Geneva, Switzerland. One aspect of the LHC Olympics program consists of the study of simulated data sets which represent various possible new physics signals as they would be seen in LHC detectors. Through this exercise, LHC Olympians learn the phenomenology of possible new physics models and gain experience in analyzing LHC data. Additionally, the LHC Olympics encourages discussion between theorists and experimentalists, and through this collaboration new techniques can be developed. The University of Washington LHC Olympics group consists of several first-year graduate and senior undergraduate students, in both theoretical and experimental particle physics. Presented here is a discussion of some of the more advanced techniques used and the recent results of one such LHC Olympics study.

  14. Advanced computations in plasma physics

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2002-05-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to

  15. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  16. Advanced Computation in Plasma Physics

    NASA Astrophysics Data System (ADS)

    Tang, William

    2001-10-01

    Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. This talk will review recent progress and future directions for advanced simulations in magnetically-confined plasmas with illustrative examples chosen from areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop MPP's to produce 3-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for tens of thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract

  17. Advanced flight computer. Special study

    NASA Technical Reports Server (NTRS)

    Coo, Dennis

    1995-01-01

    This report documents a special study to define a 32-bit radiation-hardened, SEU-tolerant flight computer architecture, and to investigate current or near-term technologies and development efforts that contribute to the Advanced Flight Computer (AFC) design and development. An AFC processing node architecture is defined. Each node may consist of a multi-chip processor as needed. The modular, building-block approach uses VLSI technology and packaging methods that demonstrate a feasible AFC module in 1998 that meets the AFC goals. The defined architecture and approach demonstrate a clear low-risk, low-cost path to the 1998 production goal, with intermediate prototypes in 1996.

  18. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  19. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
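
    The Swendsen-Wang cluster algorithm named above is compact enough to sketch directly. The following is a minimal 2-D Ising implementation (periodic lattice, union-find clusters); the lattice size and sweep count are arbitrary demo choices, not from the report.

      import math, random

      def sw_step(spins, L, beta, J=1.0):
          # One Swendsen-Wang update: freeze each parallel-spin bond with
          # probability 1 - exp(-2*beta*J), then flip every cluster with prob 1/2.
          n = L * L
          parent = list(range(n))
          def find(i):
              while parent[i] != i:
                  parent[i] = parent[parent[i]]  # path halving
                  i = parent[i]
              return i
          p = 1.0 - math.exp(-2.0 * beta * J)
          for i in range(n):
              x, y = i % L, i // L
              for j in (((x + 1) % L) + y * L, x + ((y + 1) % L) * L):  # right, down
                  if spins[i] == spins[j] and random.random() < p:
                      ri, rj = find(i), find(j)
                      if ri != rj:
                          parent[ri] = rj
          flip = {}
          for i in range(n):
              r = find(i)
              if r not in flip:
                  flip[r] = random.random() < 0.5
              if flip[r]:
                  spins[i] = -spins[i]
          return spins

      L_ = 32
      spins = [random.choice((-1, 1)) for _ in range(L_ * L_)]
      beta_c = 0.5 * math.log(1.0 + math.sqrt(2.0))   # 2-D Ising critical coupling
      for _ in range(200):
          sw_step(spins, L_, beta_c)
      print(abs(sum(spins)) / (L_ * L_))              # |magnetization| per site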

  20. Advances in Procedural Techniques - Antegrade

    PubMed Central

    Wilson, William; Spratt, James C.

    2014-01-01

    There have been many technological advances in antegrade CTO PCI, but perhaps the most important has been the evolution of the "hybrid" approach, where ideally there exists a seamless interplay of antegrade wiring, antegrade dissection re-entry and retrograde approaches as dictated by procedural factors. Antegrade wire escalation with intimal tracking remains the preferred initial strategy in short CTOs without proximal cap ambiguity. More complex CTOs, however, usually require either a retrograde or an antegrade dissection re-entry approach, or both. Antegrade dissection re-entry is well suited to long occlusions where there is a healthy distal vessel and limited "interventional" collaterals. Early use of a dissection re-entry strategy will increase success rates, reduce complications, and minimise radiation exposure, contrast use and procedural times. Antegrade dissection can be achieved with a knuckle wire technique or the CrossBoss catheter, whilst re-entry will be achieved in the most reproducible and reliable fashion by the Stingray balloon/wire. It should be avoided where there is potential for loss of large side branches. It remains to be seen whether use of newer dissection re-entry strategies will be associated with lower restenosis rates compared with the more uncontrolled subintimal tracking strategies such as STAR, and whether stent insertion in the subintimal space is associated with higher rates of late stent malapposition and stent thrombosis. It is to be hoped that the algorithms which have been developed to guide CTO operators allow for a better transfer of knowledge and skills to increase uptake and acceptance of CTO PCI as a whole. PMID:24694104

  1. Computational intelligence techniques in bioinformatics.

    PubMed

    Hassanien, Aboul Ella; Al-Shammari, Eiman Tamah; Ghali, Neveen I

    2013-12-01

    Computational intelligence (CI) is a well-established paradigm, with current systems having many of the characteristics of biological computers and capable of performing a variety of tasks that are difficult to do using conventional techniques. It is a methodology involving adaptive mechanisms and/or an ability to learn that facilitate intelligent behavior in complex and changing environments, such that the system is perceived to possess one or more attributes of reason, such as generalization, discovery, association and abstraction. The objective of this article is to present to the CI and bioinformatics research communities some of the state-of-the-art in CI applications to bioinformatics and to motivate research in new trend-setting directions. In this article, we present an overview of CI techniques in bioinformatics. We show how CI techniques, including neural networks, restricted Boltzmann machines, deep belief networks, fuzzy logic, rough sets, evolutionary algorithms (EA), genetic algorithms (GA), swarm intelligence, artificial immune systems and support vector machines, can be successfully employed to tackle various problems such as gene expression clustering and classification, protein sequence classification, gene selection, DNA fragment assembly, multiple sequence alignment, and protein function and structure prediction. We discuss some representative methods to provide inspiring examples to illustrate how CI can be utilized to address these problems and how bioinformatics data can be characterized by CI. Challenges to be addressed and future directions of research are also presented, and an extensive bibliography is included.
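
    Of the CI techniques listed, a genetic algorithm for gene (feature) selection is the easiest to sketch. Below is a minimal GA over bit-masks of selected genes; the fitness function is a toy stand-in (a real one would score a classifier on expression data), and all parameters are illustrative assumptions.

      import random

      def ga_feature_select(fitness, n_genes, pop=40, gens=60, p_mut=0.02):
          # Minimal genetic algorithm: truncation selection, one-point
          # crossover, per-bit mutation over gene-selection masks.
          popn = [[random.randint(0, 1) for _ in range(n_genes)] for _ in range(pop)]
          for _ in range(gens):
              popn = sorted(popn, key=fitness, reverse=True)[: pop // 2]
              elite = list(popn)
              while len(popn) < pop:
                  a, b = random.sample(elite, 2)
                  cut = random.randrange(1, n_genes)
                  child = a[:cut] + b[cut:]                     # one-point crossover
                  child = [g ^ (random.random() < p_mut) for g in child]  # mutation
                  popn.append(child)
          return max(popn, key=fitness)

      # Toy fitness: reward picking the first 5 "informative" genes, penalise mask size
      best = ga_feature_select(lambda m: sum(m[:5]) - 0.1 * sum(m), n_genes=50)
      print(best[:5], sum(best))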

  2. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  3. Advanced Spectroscopy Technique for Biomedicine

    NASA Astrophysics Data System (ADS)

    Zhao, Jianhua; Zeng, Haishan

    This chapter presents an overview of the applications of optical spectroscopy in biomedicine. We focus on the optical design aspects of advanced biomedical spectroscopy systems, Raman spectroscopy system in particular. Detailed components and system integration are provided. As examples, two real-time in vivo Raman spectroscopy systems, one for skin cancer detection and the other for endoscopic lung cancer detection, and an in vivo confocal Raman spectroscopy system for skin assessment are presented. The applications of Raman spectroscopy in cancer diagnosis of the skin, lung, colon, oral cavity, gastrointestinal tract, breast, and cervix are summarized.

  4. Stitching Techniques Advance Optics Manufacturing

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Because NASA depends on the fabrication and testing of large, high-quality aspheric (nonspherical) optics for applications like the James Webb Space Telescope, it sought an improved method for measuring large aspheres. Through Small Business Innovation Research (SBIR) awards from Goddard Space Flight Center, QED Technologies, of Rochester, New York, upgraded and enhanced its stitching technology for aspheres. QED developed the SSI-A, which earned the company an R&D 100 award, and also developed a breakthrough machine tool called the aspheric stitching interferometer. The equipment is applied to advanced optics in telescopes, microscopes, cameras, medical scopes, binoculars, and photolithography.

  5. Advanced Geophysical Environmental Simulation Techniques

    DTIC Science & Technology

    2007-11-02

    cloud property retrieval algorithms for processing of large multiple-satellite data sets; development and application of improved cloud-phase and...cloud optical property retrieval algorithms; investigation of techniques potentially applicable for retrieval of cloud spatial properties from very...Subject terms: cirrus, cloud retrieval, satellite meteorology, polar-orbiting, geostationary.

  6. Advanced crew procedures development techniques

    NASA Technical Reports Server (NTRS)

    Arbet, J. D.; Benbow, R. L.; Mangiaracina, A. A.; Mcgavern, J. L.; Spangler, M. C.; Tatum, I. C.

    1975-01-01

    The development of an operational computer program, the Procedures and Performance Program (PPP), is reported; the program provides a procedures recording and crew/vehicle performance monitoring capability. The PPP provides real-time CRT displays and post-run hardcopy of procedures, difference procedures, performance, performance evaluation, and training script/training status data. During post-run, the program supports evaluation through the reconstruction of displays to any point in time. A permanent record of the simulation exercise can be obtained via hardcopy output of the display data, and via magnetic tape transfer to the Generalized Documentation Processor (GDP). Reference procedures data may be transferred from the GDP to the PPP.

  7. Advanced techniques in current signature analysis

    NASA Astrophysics Data System (ADS)

    Smith, S. F.; Castleberry, K. N.

    1992-02-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (greater than 1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable 'smart' CSA instrumentation in the next several years.
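
    As an illustration of the sideband analysis described above, the following sketch computes the spectrum of a synthetic motor current and reads off the amplitude-modulation sidebands at f_line +/- f_mod. All values (60 Hz line, 20 Hz running speed, 2% modulation depth) are assumed for illustration and are not taken from the record.

        import numpy as np

        fs = 10_000                            # sample rate (Hz)
        t = np.arange(0, 2.0, 1 / fs)
        f_line, f_mod = 60.0, 20.0             # line frequency, running-speed modulation

        # Simulated stator current: 60 Hz carrier with a weak 20 Hz AM component.
        i_t = (1.0 + 0.02 * np.cos(2 * np.pi * f_mod * t)) * np.cos(2 * np.pi * f_line * t)
        i_t += 0.001 * np.random.randn(t.size)             # measurement noise

        # AM shows up as sidebands at f_line +/- f_mod in the current spectrum.
        spec = np.abs(np.fft.rfft(i_t * np.hanning(t.size)))
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        for f in (f_line - f_mod, f_line, f_line + f_mod):
            k = np.argmin(np.abs(freqs - f))
            print(f"{freqs[k]:6.1f} Hz : {20 * np.log10(spec[k] / spec.max()):7.1f} dB")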

  8. Septoplasty: Basic and Advanced Techniques.

    PubMed

    Most, Sam P; Rudy, Shannon F

    2017-05-01

    Nasal septal deviation is a prevalent problem that can have significant quality of life ramifications. Septoplasty is commonly performed to provide qualitative and quantitative benefit to those with nasal obstruction owing to septal deviation. Although a standard, basic technique is often adequate for individuals with mild to moderate mid to posterior septal deviation, unique challenges arise with caudal septal deviation. Herein, multiple strategies that attempt to address anterior septal deviation are discussed. Anterior septal reconstruction has been shown to be a safe and effective means by which to address severe caudal septal deviation and long-term reduction in preoperative symptoms.

  9. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  10. Aerodynamic Analyses Requiring Advanced Computers, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers given at the conference present the results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include two-dimensional configurations, three-dimensional configurations, transonic aircraft, and the space shuttle.

  11. Aerodynamic Analyses Requiring Advanced Computers, Part 1

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Papers are presented which deal with results of theoretical research on aerodynamic flow problems requiring the use of advanced computers. Topics discussed include: viscous flows, boundary layer equations, turbulence modeling and Navier-Stokes equations, and internal flows.

  12. Quantum chromodynamics with advanced computing

    SciTech Connect

    Kronfeld, Andreas S.; /Fermilab

    2008-07-01

    We survey results in lattice quantum chromodynamics from groups in the USQCD Collaboration. The main focus is on physics, but many aspects of the discussion are aimed at an audience of computational physicists.

  13. Hybrid mesh generation using advancing reduction technique

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study presents an extension of the application of the advancing reduction technique to the hybrid mesh generation. The proposed algorithm is based on a pre-generated rectangle mesh (RM) with a certain orientation. The intersection points between the two sets of perpendicular mesh lines in RM an...

  14. Advanced techniques for future observations from space

    NASA Technical Reports Server (NTRS)

    Hinkley, E. D.

    1980-01-01

    Advanced remote sensing techniques for the study of global meteorology and the chemistry of the atmosphere are considered. Remote sensing from Spacelab/Shuttle and free-flying satellites will provide the platforms for instrumentation based on advanced technology. Several laser systems are being developed for the measurement of tropospheric winds and pressure, and trace species in the troposphere and stratosphere. In addition, a high-resolution passive infrared sensor shows promise for measuring temperature from sea level up through the stratosphere. Advanced optical and microwave instruments are being developed for wind measurements in the stratosphere and mesosphere. Microwave techniques are also useful for the study of meteorological parameters at the air-sea interface.

  15. Advanced Biomedical Computing Center (ABCC) | DSITP

    Cancer.gov

    The Advanced Biomedical Computing Center (ABCC), located in Frederick Maryland (MD), provides HPC resources for both NIH/NCI intramural scientists and the extramural biomedical research community. Its mission is to provide HPC support, to provide collaborative research, and to conduct in-house research in various areas of computational biology and biomedical research.

  16. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computer and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  17. Computational advances in nanostructure determination

    NASA Astrophysics Data System (ADS)

    Farrow, Christopher Lyn

    The atomic pair distribution function (PDF) and extended x-ray absorption fine structure (EXAFS) techniques fill a hole in conventional crystallographic analysis, which resolves the average long-range structure of a material but inadequately determines deviations from the average. These techniques provide structural information on the sub-nanometer scale and are helping characterize modern materials. Despite their successes, PDF and EXAFS often fall short of adequately describing complex nanostructured materials. Parallel PDF and EXAFS refinement, or corefinement, is one attempt at extending the applicability of these techniques. Corefinement combines the best parts of PDF and EXAFS, the chemical-specific and short-range detail of EXAFS and the short and intermediate-range information from the PDF. New ab initio methods are also being employed to find structures from the PDF. These techniques use the bond length information encoded in the PDF to assemble structures without a model. On another front, new software has been developed to introduce the PDF method to a larger community. Broad awareness of the PDF technique will help drive its future development.
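
    The core of the PDF method can be stated compactly: histogram all interatomic distances in a model and compare the result with the experimentally derived distribution. The toy sketch below (a hypothetical perturbed cubic-lattice cluster; real PDF analysis also involves scattering factors, normalization, and termination corrections) illustrates the idea.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical nanoparticle: 64 atoms on a slightly perturbed cubic lattice.
        grid = np.stack(np.meshgrid(*[np.arange(4)] * 3), -1).reshape(-1, 3) * 2.5
        atoms = grid + 0.05 * rng.standard_normal(grid.shape)

        # All pairwise distances r_ij (i < j).
        d = np.linalg.norm(atoms[:, None, :] - atoms[None, :, :], axis=-1)
        r_ij = d[np.triu_indices(len(atoms), k=1)]

        # Crude radial distribution: histogram weighted by 1/(4*pi*r^2*dr*N).
        hist, edges = np.histogram(r_ij, bins=200, range=(0.5, 12.0))
        r = 0.5 * (edges[:-1] + edges[1:])
        g_r = hist / (4 * np.pi * r**2 * (edges[1] - edges[0]) * len(atoms))
        print(f"first-neighbour peak near r = {r[np.argmax(g_r)]:.2f} (lattice constant 2.5)")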

  18. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

    This dissertation merges the optical design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and attempts to expand the solution space available to the optical designer. This dissertation is divided into two parts: the first discusses a new active illumination depth sensing modality, while the second part discusses a passive illumination system called plenoptic, or lightfield, imaging. The new depth sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage this method permits is the ability for the illumination and imaging axes to be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full system raytraces of raw plenoptic images, Zernike compression techniques of raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.

  19. Opportunities in computational mechanics: Advances in parallel computing

    SciTech Connect

    Lesar, R.A.

    1999-02-01

    In this paper, the authors will discuss recent advances in computing power and the prospects for using these new capabilities for studying plasticity and failure. They will first review the new capabilities made available with parallel computing. They will discuss how these machines perform and how well their architecture might work on materials issues. Finally, they will give some estimates on the size of problems possible using these computers.

  20. Modular Programming Techniques for Distributed Computing Tasks

    DTIC Science & Technology

    2004-08-01

    Modular Programming Techniques for Distributed Computing Tasks. Anthony Cowley, Hwa-Chow Hsu, Camillo J. Taylor, GRASP Laboratory, University of... Keywords: network, distributed computing, software design. Introduction: As efforts to field sensor networks, or teams of mobile robots, become more...

  1. A computational hyperspectral imaging technique

    NASA Astrophysics Data System (ADS)

    Habibi, Nasim; Azari, Mohammad; Abolbashari, Mehrdad; Farahi, Faramarz

    2016-03-01

    A novel spectral imaging technique is introduced based on a highly dispersive imaging lens system. The chromatic aberration of the lens system is utilized to spread the spectral content of the object over a focal distance. Two three-dimensional surface reconstruction algorithms, depth from focus and depth from defocus, are applied to images captured by the dispersive lens system. Using these algorithms, the spectral imager is able to relate either the location of the focused image or the amount of defocus at the imaging detector to the spectral content of the object. A spectral imager with ~5 nm spectral resolution is designed based on this technique. The spectral and spatial resolutions of the introduced technique are independent and can be improved simultaneously. Simulation and experimental results are presented.
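
    A minimal depth-from-focus sketch is given below: for a stack of images recorded at different focal positions, the plane of maximum local sharpness is selected per pixel (in the dispersive-lens imager, that plane maps to wavelength). The stack here is synthetic, and the focus measure (smoothed squared Laplacian) is a common generic choice, not necessarily the one used in the paper.

        import numpy as np
        from scipy.ndimage import gaussian_filter, laplace, uniform_filter

        rng = np.random.default_rng(1)
        texture = rng.random((64, 64))       # stand-in scene texture
        k0 = 4                               # ground-truth best-focus plane
        # Focal stack: blur grows with distance from the best-focus plane.
        stack = [gaussian_filter(texture, sigma=0.5 * abs(k - k0)) for k in range(11)]

        def sharpness(img):
            """Local focus measure: smoothed squared Laplacian response."""
            return uniform_filter(laplace(img) ** 2, size=9)

        focus = np.stack([sharpness(f) for f in stack])
        best_plane = np.argmax(focus, axis=0)    # per-pixel index of sharpest plane
        print("median recovered plane:", int(np.median(best_plane)), "| true plane:", k0)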

  2. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  3. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing and Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study was conducted to assess the role of supercomputers in computational aeroelasticity of aerospace vehicles. The study is mostly based on the responses to a web-based questionnaire that was designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  4. A Survey of Techniques for Approximate Computing

    DOE PAGES

    Mittal, Sparsh

    2016-03-18

    Approximate computing trades off computation quality with the effort expended. As rising performance demands confront plateauing resource budgets, approximate computing has become not merely attractive but even imperative. Here, we present a survey of techniques for approximate computing (AC). We discuss strategies for finding approximable program portions and monitoring output quality; techniques for using AC in different processing units (e.g., CPU, GPU and FPGA), processor components, memory technologies, etc.; and programming frameworks for AC. Moreover, we classify these techniques based on several key characteristics to emphasize their similarities and differences. Finally, the aim of this paper is to provide insights to researchers into the working of AC techniques and inspire more efforts in this area to make AC the mainstream computing approach in future systems.
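
    As a concrete taste of one technique family such surveys cover, the sketch below applies loop perforation, skipping a random fraction of loop iterations, to a mean computation. The example and its parameters are illustrative only and are not drawn from the paper.

        import random

        def mean_exact(xs):
            return sum(xs) / len(xs)

        def mean_perforated(xs, keep=0.25, seed=42):
            """Perforated loop: evaluate only a `keep` fraction of iterations."""
            rng = random.Random(seed)
            sample = [x for x in xs if rng.random() < keep]
            return sum(sample) / max(len(sample), 1)

        data = [float(i % 97) for i in range(100_000)]
        exact, approx = mean_exact(data), mean_perforated(data)
        print(f"exact={exact:.3f}  approx={approx:.3f}  rel. err={(approx - exact) / exact:.2%}")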

  6. Advances and Challenges in Computational Plasma Science

    SciTech Connect

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  7. Techniques for Secure and Reliable Computational Outsourcing

    DTIC Science & Technology

    2013-04-01

    The techniques make cheating by the remote untrusted servers detectable; here cheating means "not carrying out the expected computational and storage duties"...The techniques are applicable to any type of finite automata (e.g., signature-based intrusion detection automata), but the optimizations were tailored to...

  8. Advances in computer imaging/applications in facial plastic surgery.

    PubMed

    Papel, I D; Jiannetto, D F

    1999-01-01

    Rapidly progressing computer technology, ever-increasing expectations of patients, and a confusing medicolegal environment require clarification of the role of computer imaging/applications. Advances in computer technology and its applications are reviewed. A brief historical discussion is included for perspective. Improvements in both hardware and software with the advent of digital imaging have allowed great increases in speed and accuracy in patient imaging. This facilitates doctor-patient communication and possibly realistic patient expectations. Patients seeking cosmetic surgery now often expect preoperative imaging. Although society in general has become more litigious, a literature search up to 1998 reveals no lawsuits directly involving computer imaging. It appears that conservative utilization of computer imaging by the facial plastic surgeon may actually reduce liability and promote communication. Recent advances have significantly enhanced the value of computer imaging in the practice of facial plastic surgery. These technological advances in computer imaging appear to contribute a useful technique for the practice of facial plastic surgery. Inclusion of computer imaging should be given serious consideration as an adjunct to clinical practice.

  9. Advanced Computing Architectures for Cognitive Processing

    DTIC Science & Technology

    2009-07-01

    Approved for publication in accordance with assigned distribution statement. For the Director: /s/ Lok Yan, Work Unit Manager, and /s/ Edward J. Jones, Deputy Chief, Advanced Computing Division. Author: Gregory D. Peterson.

  10. Computational Techniques of Electromagnetic Dosimetry for Humans

    NASA Astrophysics Data System (ADS)

    Hirata, Akimasa; Fujiwara, Osamu

    There has been increasing public concern about the adverse health effects of human exposure to electromagnetic fields. This paper reviews the rationale of international safety guidelines for human protection against electromagnetic fields, and then presents computational techniques for conducting dosimetry in anatomically based human body models. Computational examples and remaining problems are also described briefly.
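
    The basic dosimetric quantity behind such guidelines is the local specific absorption rate, SAR = sigma * |E_rms|^2 / rho. A minimal illustration follows; the tissue values are typical textbook numbers for muscle near 900 MHz, assumed here for illustration rather than taken from the paper.

        # Point-wise SAR from the local rms electric field in tissue.
        sigma = 0.94     # tissue conductivity (S/m), assumed value
        rho = 1050.0     # tissue mass density (kg/m^3), assumed value
        e_rms = 30.0     # local rms electric field in tissue (V/m), assumed value

        sar = sigma * e_rms ** 2 / rho   # W/kg
        print(f"local SAR = {sar:.3f} W/kg")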

  11. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information to be predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.
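
    The kind of transient simulation that must run faster than real time can be sketched in a few lines. The toy model below integrates the classical single-machine swing equation through a bolted fault and reports whether the rotor angle stays bounded for a given clearing time; the parameters are illustrative and are not the paper's algorithms or data.

        import numpy as np

        H, f0 = 4.0, 60.0                  # inertia constant (s), system frequency (Hz)
        M = 2 * H / (2 * np.pi * f0)       # per-unit inertia coefficient
        Pm, Pmax = 0.8, 1.8                # mechanical input, post-fault electrical capability

        def stable(t_clear, dt=1e-3, t_end=3.0):
            """Integrate the swing equation; fault (Pe = 0) cleared at t_clear."""
            delta, omega = np.arcsin(Pm / Pmax), 0.0   # pre-fault equilibrium
            for step in range(int(t_end / dt)):
                pe = 0.0 if step * dt < t_clear else Pmax * np.sin(delta)
                omega += (Pm - pe) / M * dt
                delta += omega * dt
                if delta > np.pi:                      # angle runaway: loss of synchronism
                    return False
            return True

        for tc in (0.10, 0.20, 0.30):
            print(f"clearing time {tc:.2f} s -> {'stable' if stable(tc) else 'unstable'}")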

  12. Advanced techniques in echocardiography in small animals.

    PubMed

    Chetboul, Valérie

    2010-07-01

    Transthoracic echocardiography has become a major imaging tool for the diagnosis and management of canine and feline cardiovascular diseases. During the last decade, more recent advances in ultrasound technology with the introduction of newer imaging modalities, such as tissue Doppler imaging, strain and strain rate imaging, and 2-dimensional speckle tracking echocardiography, have provided new parameters to assess myocardial performance, including regional myocardial velocities and deformation, ventricular twist, and mechanical synchrony. An outline of these 4 recent ultrasound techniques, their impact on the understanding of right and left ventricular function in small animals, and their application in research and clinical settings are given in this article.

  13. Basic concepts of advanced MRI techniques.

    PubMed

    Pagani, Elisabetta; Bizzi, Alberto; Di Salle, Francesco; De Stefano, Nicola; Filippi, Massimo

    2008-10-01

    An overview is given of magnetic resonance (MR) techniques sensitized to diffusion, flow, the magnetization transfer effect, and local field inhomogeneities induced by physiological changes, which can be viewed in clinical practice as advanced because of their challenging implementation and interpretation. These techniques are known as diffusion-weighted, perfusion, magnetization transfer, and functional MRI and MR spectroscopy. An important issue is that they can provide quantitative estimates of structural and functional characteristics that are below the voxel resolution. This review does not deal with the basic concepts of MR physics or the description of the available acquisition and postprocessing methods, but hopefully provides an adequate background to readers and hence facilitates the understanding of the following clinical contributions.

  14. Advanced flow MRI: emerging techniques and applications.

    PubMed

    Markl, M; Schnell, S; Wu, C; Bollache, E; Jarvis, K; Barker, A J; Robinson, J D; Rigsby, C K

    2016-08-01

    Magnetic resonance imaging (MRI) techniques provide non-invasive and non-ionising methods for the highly accurate anatomical depiction of the heart and vessels throughout the cardiac cycle. In addition, the intrinsic sensitivity of MRI to motion offers the unique ability to acquire spatially registered blood flow simultaneously with the morphological data, within a single measurement. In clinical routine, flow MRI is typically accomplished using methods that resolve two spatial dimensions in individual planes and encode the time-resolved velocity in one principal direction, typically oriented perpendicular to the two-dimensional (2D) section. This review describes recently developed advanced MRI flow techniques, which allow for more comprehensive evaluation of blood flow characteristics, such as real-time flow imaging, 2D multiple-venc phase contrast MRI, four-dimensional (4D) flow MRI, quantification of complex haemodynamic properties, and highly accelerated flow imaging. Emerging techniques and novel applications are explored. In addition, applications of these new techniques for the improved evaluation of cardiovascular (aorta, pulmonary arteries, congenital heart disease, atrial fibrillation, coronary arteries) as well as cerebrovascular disease (intra-cranial arteries and veins) are presented.

  15. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  16. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  17. Airborne Advanced Reconfigurable Computer System (ARCS)

    NASA Technical Reports Server (NTRS)

    Bjurman, B. E.; Jenkins, G. M.; Masreliez, C. J.; Mcclellan, K. L.; Templeman, J. E.

    1976-01-01

    A digital computer subsystem fault-tolerant concept was defined, and the potential benefits and costs of such a subsystem were assessed when used as the central element of a new transport's flight control system. The derived advanced reconfigurable computer system (ARCS) is a triple-redundant computer subsystem that automatically reconfigures, under multiple fault conditions, from triplex to duplex to simplex operation, with redundancy recovery if the fault condition is transient. The study included criteria development covering factors at the aircraft's operation level that would influence the design of a fault-tolerant system for commercial airline use. A new reliability analysis tool was developed for evaluating redundant, fault-tolerant system availability and survivability; and a stringent digital system software design methodology was used to achieve design/implementation visibility.

  18. [Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  19. Advanced Training Techniques Using Computer Generated Imagery.

    DTIC Science & Technology

    1983-02-28

    School instructors find either lacking or much too subtle in the ASPT simulator. Their theory as to why this is missing is that the simulator does not...same direction as the aircraft! The difference between the aircraft and ASPT simulator perceptions of the shearing effect may be due to several...elements compared to the late 1960's, when the ASPT was designed. In addition, present-day calligraphic visuals offer very high contrast and...

  20. Advanced Computational Techniques in Regional Wave Studies

    DTIC Science & Technology

    1988-06-17

    Contents: Seismic Velocity and Q Model for the Shallow Structure of the Arabian Shield from Short Period Rayleigh Waves; Introduction; Geology Along...The line begins in the Mesozoic cover rocks in the northeast, continues to the southwest across the shield, and terminates at the outer edge of the Farasan...upper crust in the different tectonic provinces of the shield. Geology Along the Profiles Used: The refraction profile traverses three major tectonic...

  1. Recent advances in DNA sequencing techniques

    NASA Astrophysics Data System (ADS)

    Singh, Rama Shankar

    2013-06-01

    Successful mapping of the draft human genome in 2001 and more recent mapping of the human microbiome genome in 2012 have relied heavily on the parallel processing of second-generation/Next Generation Sequencing (NGS) DNA machines, at a cost of several million dollars and long computer processing times. These have been mainly biochemical approaches. Here a system analysis approach is used to review these techniques by identifying the requirements, specifications, test methods, error estimates, repeatability, reliability and trends in the cost reduction. The first-generation, NGS, and third-generation Single Molecule Real Time (SMART) detection sequencing methods are reviewed. Based on National Human Genome Research Institute (NHGRI) data, the achieved cost reductions are discussed: a factor of 1.5 per year from Sep. 2001 to July 2007, 7 per year from Oct. 2007 to Apr. 2010, and 2.5 per year from July 2010 to Jan. 2012.
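
    Compounding the three NHGRI rates quoted above gives a rough sense of the overall cost reduction across the decade. The sketch below does that arithmetic; the period durations are approximated from the dates in the abstract.

        periods = [            # (cost-reduction factor per year, duration in years)
            (1.5, 5.8),        # Sep. 2001 - Jul. 2007
            (7.0, 2.5),        # Oct. 2007 - Apr. 2010
            (2.5, 1.5),        # Jul. 2010 - Jan. 2012
        ]

        total = 1.0
        for factor, years in periods:
            total *= factor ** years
        print(f"overall sequencing cost reduction: roughly {total:,.0f}x")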

  2. 75 FR 64720 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    .../Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department...

  3. 75 FR 9887 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  4. 78 FR 6087 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-29

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  5. 75 FR 43518 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-26

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, DOE. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing Advisory..., Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U. S. Department of...

  6. 76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... Advanced Scientific Computing Advisory Committee Charter Renewal AGENCY: Department of Energy, Office of... Administration, notice is hereby given that the Advanced Scientific Computing Advisory Committee will be renewed... concerning the Advanced Scientific Computing program in response only to charges from the Director of...

  7. 76 FR 9765 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing..., Office of Advanced Scientific Computing Research, SC-21/Germantown Building, U.S. Department of...

  8. 77 FR 45345 - DOE/Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    .../Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION: Notice of open meeting. SUMMARY: This notice announces a meeting of the Advanced Scientific Computing... Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building; U.S. Department...

  9. 78 FR 41046 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... hereby given that the Advanced Scientific Computing Advisory Committee will be renewed for a two-year... (DOE), on the Advanced Scientific Computing Research Program managed by the Office of...

  10. Computational Design of Advanced Nuclear Fuels

    SciTech Connect

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, their oxides, nitrides, carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed the issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  11. ATCA for Machines-- Advanced Telecommunications Computing Architecture

    SciTech Connect

    Larsen, R.S.; /SLAC

    2008-04-22

    The Advanced Telecommunications Computing Architecture is a new industry open standard for electronics instrument modules and shelves being evaluated for the International Linear Collider (ILC). It is the first industrial standard designed for High Availability (HA). ILC availability simulations have shown clearly that the capabilities of ATCA are needed in order to achieve acceptable integrated luminosity. The ATCA architecture looks attractive for beam instruments and detector applications as well. This paper provides an overview of ongoing R&D including application of HA principles to power electronics systems.

  12. Ambient temperature modelling with soft computing techniques

    SciTech Connect

    Bertini, Ilaria; Ceravolo, Francesco; Citterio, Marco; Di Pietra, Biagio; Margiotta, Francesca; Pizzuti, Stefano; Puglisi, Giovanni; De Felice, Matteo

    2010-07-15

    This paper proposes a hybrid approach based on soft computing techniques in order to estimate monthly and daily ambient temperature. Indeed, we combine the back-propagation (BP) algorithm and the simple Genetic Algorithm (GA) in order to effectively train artificial neural networks (ANN) in such a way that the BP algorithm initialises a few individuals of the GA's population. Experiments concerned monthly temperature estimation of unknown places and daily temperature estimation for thermal load computation. Results have shown remarkable improvements in accuracy compared to traditional methods. (author)
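
    A minimal sketch of this hybrid scheme is given below: a simple genetic algorithm evolves neural-network weight vectors, with part of the initial population seeded by gradient-trained individuals. To keep the sketch short, a finite-difference gradient stands in for true back-propagation, and the tiny network, toy data, and all hyperparameters are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, (200, 2))
        y = np.sin(X[:, 0]) + 0.5 * X[:, 1]          # toy regression target

        H = 8                                        # hidden units
        n_w = 2 * H + H + H + 1                      # W1 (2xH) + b1 + W2 (H) + b2

        def mse(w):
            W1 = w[:2 * H].reshape(2, H)
            b1, W2, b2 = w[2 * H:3 * H], w[3 * H:4 * H], w[-1]
            pred = np.tanh(X @ W1 + b1) @ W2 + b2
            return np.mean((pred - y) ** 2)

        def gradient_train(w, lr=0.05, steps=200):
            """Stands in for BP: gradient descent with a finite-difference gradient."""
            for _ in range(steps):
                g = np.zeros_like(w)
                for j in range(w.size):
                    e = np.zeros_like(w); e[j] = 1e-5
                    g[j] = (mse(w + e) - mse(w - e)) / 2e-5
                w = w - lr * g
            return w

        # Initial population: a few gradient-trained seeds plus random individuals.
        pop = [gradient_train(0.1 * rng.standard_normal(n_w)) for _ in range(3)]
        pop += [0.5 * rng.standard_normal(n_w) for _ in range(17)]

        for gen in range(30):                        # simple GA loop
            pop.sort(key=mse)
            parents, children = pop[:10], []
            for _ in range(10):
                a, b = rng.choice(10, size=2, replace=False)
                mask = rng.random(n_w) < 0.5                      # uniform crossover
                child = np.where(mask, parents[a], parents[b])
                child = child + 0.02 * rng.standard_normal(n_w)   # mutation
                children.append(child)
            pop = parents + children
        print(f"best MSE after GA: {mse(min(pop, key=mse)):.4f}")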

  13. Ground Motion Models and Computer Techniques

    DTIC Science & Technology

    1972-04-01

    tectonic stress-strain distributions induced by changing the pore water pressure. A general computer subroutine (TAMEOS) is described which...interactions, material phase changes, and dependence of strength parameters on the thermodynamic state. This report describes improved techniques...stretches, $\bar\lambda_i^e = \ln \lambda_i^e$ (3.11), $\bar\lambda_i^p = \ln \lambda_i^p$ (3.12), we find $\lambda_i = \lambda_i^e \lambda_i^p$ (3.13). It is shown in Ref. 24 that the rate of change of compression...

  14. Defense Science Board Report on Advanced Computing

    DTIC Science & Technology

    2009-03-01

    complex computational issues are pursued, and that several vendors remain at the leading edge of supercomputing capability in the U.S. In...pursuing the ASC program to help assure that HPC advances are available to the broad national security community. As in the past, many...apply HPC to technical problems related to weapons physics, but that are entirely unclassified. Examples include explosive astrophysical...

  15. Advanced high-performance computer system architectures

    NASA Astrophysics Data System (ADS)

    Vinogradov, V. I.

    2007-02-01

    The convergence of computer systems and communication technologies is moving toward switched high-performance modular system architectures on the basis of high-speed switched interconnections. Multi-core processors have become a promising route to high-performance systems, and traditional parallel bus system architectures (VME/VXI, cPCI/PXI) are moving to new, higher-speed serial switched interconnections. Fundamentals in system architecture development are a compact modular component strategy, low-power processors, new serial high-speed interface chips on the board, and high-speed switched fabrics for SAN architectures. An overview of advanced modular concepts and new international standards for developing high-performance embedded and compact modular systems for real-time applications is given.

  16. Tools and techniques for computational reproducibility.

    PubMed

    Piccolo, Stephen R; Frampton, Michael B

    2016-07-11

    When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed, and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.
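
    In the spirit of the strategies discussed (though not necessarily one of the paper's seven), the sketch below shows two low-effort habits: pinning every source of randomness and recording the software environment alongside the result, so a run can be retraced later.

        import json, platform, random, sys

        import numpy as np

        SEED = 20160711
        random.seed(SEED)                 # pin every source of randomness used
        np.random.seed(SEED)

        result = float(np.mean(np.random.standard_normal(1_000)))

        provenance = {                    # record enough context to retrace the run
            "seed": SEED,
            "python": sys.version.split()[0],
            "numpy": np.__version__,
            "platform": platform.platform(),
            "result": result,
        }
        print(json.dumps(provenance, indent=2))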

  17. Recent Advances in Computed Tomographic Technology: Cardiopulmonary Imaging Applications.

    PubMed

    Tabari, Azadeh; Lo Gullo, Roberto; Murugan, Venkatesh; Otrakji, Alexi; Digumarthy, Subba; Kalra, Mannudeep

    2017-03-01

    Cardiothoracic diseases result in substantial morbidity and mortality. Chest computed tomography (CT) has been an imaging modality of choice for assessing a host of chest diseases, and technologic advances have enabled the emergence of coronary CT angiography as a robust noninvasive test for cardiac imaging. Technologic developments in CT have also enabled the application of dual-energy CT scanning for assessing pulmonary vascular and neoplastic processes. Concerns over increasing radiation dose from CT scanning are being addressed with introduction of more dose-efficient wide-area detector arrays and iterative reconstruction techniques. This review article discusses the technologic innovations in CT and their effect on cardiothoracic applications.

  18. Computer Vision Techniques for Transcatheter Intervention

    PubMed Central

    Zhao, Feng; Roach, Matthew

    2015-01-01

    Minimally invasive transcatheter technologies have demonstrated substantial promise for the diagnosis and the treatment of cardiovascular diseases. For example, transcatheter aortic valve implantation is an alternative to aortic valve replacement for the treatment of severe aortic stenosis, and transcatheter atrial fibrillation ablation is widely used for the treatment and the cure of atrial fibrillation. In addition, catheter-based intravascular ultrasound and optical coherence tomography imaging of coronary arteries provides important information about the coronary lumen, wall, and plaque characteristics. Qualitative and quantitative analysis of these cross-sectional image data will be beneficial to the evaluation and the treatment of coronary artery diseases such as atherosclerosis. In all the phases (preoperative, intraoperative, and postoperative) during the transcatheter intervention procedure, computer vision techniques (e.g., image segmentation and motion tracking) have been largely applied in the field to accomplish tasks like annulus measurement, valve selection, catheter placement control, and vessel centerline extraction. This provides beneficial guidance for the clinicians in surgical planning, disease diagnosis, and treatment assessment. In this paper, we present a systematical review on these state-of-the-art methods. We aim to give a comprehensive overview for researchers in the area of computer vision on the subject of transcatheter intervention. Research in medical computing is multi-disciplinary due to its nature, and hence, it is important to understand the application domain, clinical background, and imaging modality, so that methods and quantitative measurements derived from analyzing the imaging data are appropriate and meaningful. We thus provide an overview on the background information of the transcatheter intervention procedures, as well as a review of the computer vision techniques and methodologies applied in this area. PMID:27170893

  19. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  20. 75 FR 57742 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Scientific Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770...: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown Building;...

  1. 76 FR 45786 - Advanced Scientific Computing Advisory Committee; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... Advanced Scientific Computing Advisory Committee; Meeting AGENCY: Office of Science, Department of Energy... Computing Advisory Committee (ASCAC). Federal Advisory Committee Act (Pub. L. 92-463, 86 Stat. 770) requires... INFORMATION CONTACT: Melea Baker, Office of Advanced Scientific Computing Research; SC-21/Germantown...

  2. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    SciTech Connect

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  3. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as a result of the grid evolution meeting an information revolution. Due to the penetration of smart grid technologies, the grid is evolving at an unprecedented speed and the information infrastructure is fundamentally improved by a large number of smart meters and sensors that produce several orders of magnitude larger amounts of data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies in developing the algorithms and tools for the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.

  4. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  5. OPENING REMARKS: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2006-01-01

    Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about 70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems - such

  6. High resolution computed tomography of advanced composite and ceramic materials

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Klima, S. J.

    1991-01-01

    Advanced composite and ceramic materials are being developed for use in many new defense and commercial applications. In order to achieve the desired mechanical properties of these materials, the structural elements must be carefully analyzed and engineered. A study was conducted to evaluate the use of high resolution computed tomography (CT) as a macrostructural analysis tool for advanced composite and ceramic materials. Several samples were scanned using a laboratory high resolution CT scanner. Samples were also destructively analyzed at the locations of the scans and the nondestructive and destructive results were compared. The study provides useful information outlining the strengths and limitations of this technique and the prospects for further research in this area.

  7. Application of advanced electronics to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Carney, P. C.

    1980-01-01

    Advancements in hardware and software technology are summarized with specific emphasis on spacecraft computer capabilities. Available state of the art technology is reviewed and candidate architectures are defined.

  8. Recent advances in computational methods for nuclear magnetic resonance data processing.

    PubMed

    Gao, Xin

    2013-02-01

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  9. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR awards Data-intensive...

  10. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  11. Hybrid computer techniques for solving partial differential equations

    NASA Technical Reports Server (NTRS)

    Hammond, J. L., Jr.; Odowd, W. M.

    1971-01-01

    Techniques overcome equipment limitations that restrict other computer techniques to solving trivial cases. The use of curve fitting by quadratic interpolation greatly reduces the required digital storage space.
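
    The storage saving comes from keeping only coarse samples of each stored function and rebuilding intermediate values on demand. A minimal sketch of the idea (three-point Lagrange interpolation is one standard way to realize quadratic interpolation; the sample function here is arbitrary):

        import numpy as np

        def quad_interp(xs, ys, x):
            """Quadratic (three-point Lagrange) interpolation through coarse samples."""
            i = int(np.clip(np.searchsorted(xs, x) - 1, 1, len(xs) - 2))
            x0, x1, x2 = xs[i - 1], xs[i], xs[i + 1]
            y0, y1, y2 = ys[i - 1], ys[i], ys[i + 1]
            return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                    + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                    + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

        # store only 16 samples of a boundary function instead of a dense table
        xs = np.linspace(0.0, 1.0, 16)
        ys = np.exp(-xs) * np.cos(6 * xs)
        print(quad_interp(xs, ys, 0.37))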

  12. A survey of computational intelligence techniques in protein function prediction.

    PubMed

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    With the advancement of high-throughput microarray technologies, recent years have seen massive growth in the number of proteins whose function is unknown. Protein function prediction is among the most challenging problems in bioinformatics. Traditionally, homology-based approaches were used to predict protein function, but they fail when a new protein is dissimilar to previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in recent years. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, across a wide range of applications such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. The paper also summarizes the results obtained by many researchers who have addressed these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and the integration of multiple heterogeneous data sources are useful for protein function prediction.

  13. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management in general tries to organize and make available important know-how, whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of the specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software package developed at NASA-Ames Research Center (ARC).

  14. Regularization techniques in realistic Laplacian computation.

    PubMed

    Bortel, Radoslav; Sovka, Pavel

    2007-11-01

    This paper explores regularization options for the ill-posed spline coefficient equations in the realistic Laplacian computation. We investigate the use of the Tikhonov regularization, truncated singular value decomposition, and the so-called lambda-correction with the regularization parameter chosen by the L-curve, generalized cross-validation, quasi-optimality, and the discrepancy principle criteria. The provided range of regularization techniques is much wider than in the previous works. The improvement of the realistic Laplacian is investigated by simulations on the three-shell spherical head model. The conclusion is that the best performance is provided by the combination of the Tikhonov regularization and the generalized cross-validation criterion-a combination that has never been suggested for this task before.
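
    A concrete sketch of the recommended combination, Tikhonov regularization with the parameter chosen by generalized cross-validation, written here for a generic ill-posed linear system A x = b rather than the actual spline coefficient equations:

        import numpy as np

        def tikhonov_gcv(A, b, lambdas):
            """Tikhonov solution of A x = b with lambda picked by minimizing the GCV score."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            beta = U.T @ b
            m = A.shape[0]
            best = None
            for lam in lambdas:
                f = s**2 / (s**2 + lam**2)              # Tikhonov filter factors
                resid2 = np.sum(((1 - f) * beta) ** 2) + (b @ b - beta @ beta)
                gcv = resid2 / (m - np.sum(f)) ** 2     # generalized cross-validation score
                if best is None or gcv < best[0]:
                    best = (gcv, lam, f)
            _, lam, f = best
            x = Vt.T @ ((f / s) * beta)                 # filtered SVD solution
            return x, lam

        # x, lam = tikhonov_gcv(A, b, np.logspace(-8, 2, 60))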

  15. Computational intelligence techniques for tactile sensing systems.

    PubMed

    Gastaldo, Paolo; Pinna, Luigi; Seminara, Lucia; Valle, Maurizio; Zunino, Rodolfo

    2014-06-19

    Tactile sensing helps robots interact with humans and objects effectively in real environments. Piezoelectric polymer sensors provide the functional building blocks of the robotic electronic skin, mainly thanks to their flexibility and suitability for detecting dynamic contact events and for recognizing the touch modality. The paper focuses on the ability of tactile sensing systems to support the challenging recognition of certain qualities/modalities of touch. The research applies novel computational intelligence techniques and a tensor-based approach to the classification of touch modalities; its main results consist in providing a procedure to enhance system generalization ability and an architecture for multi-class recognition applications. An experimental campaign involving 70 participants using three different modalities in touching the upper surface of the sensor array was conducted, and confirmed the validity of the approach.

  16. Advanced crystallization techniques of 'solar grade' silicon

    NASA Astrophysics Data System (ADS)

    Gasparini, M.; Calligarich, C.; Rava, P.; Sardi, L.; Alessandri, M.; Redaelli, F.; Pizzini, S.

    Microstructural, electrical and photovoltaic characteristics of polycrystalline silicon solar cells fabricated with silicon ingots containing 5, 100 and 500 ppmw iron are reported and discussed. All silicon ingots were grown by the directional solidification technique in graphite or special quartz molds and doped intentionally with iron, in order to evaluate the potential of the D.S. technique when employed with solar silicon feedstocks. Results indicate that structural breakdown limits the amount of the ingot which is usable for solar cell fabrication, but also that efficiencies in excess of 10 percent are obtained using the 'good' region of the ingot.

  17. Advances in laparoscopic urologic surgery techniques

    PubMed Central

    Abdul-Muhsin, Haidar M.; Humphreys, Mitchell R.

    2016-01-01

    The last two decades witnessed the inception and exponential implementation of key technological advancements in laparoscopic urology. While some of these technologies thrived and became part of daily practice, others are still hindered by major challenges. This review was conducted through a comprehensive literature search in order to highlight some of the most promising technologies in laparoscopic visualization, augmented reality, and insufflation. Additionally, this review will provide an update regarding the current status of single-site and natural orifice surgery in urology. PMID:27134743

  18. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between commercial ground computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  19. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory/university collaboration of coupled computational and experimental studies is the improvement of predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for the technology development of the concepts and their safety systems.

  20. [Advanced online search techniques and dedicated search engines for physicians].

    PubMed

    Nahum, Yoav

    2008-02-01

    In recent years search engines have become an essential tool in the work of physicians. This article will review advanced search techniques from the world of information specialists, as well as some advanced search engine operators that may help physicians improve their online search capabilities, and maximize the yield of their searches. This article also reviews popular dedicated scientific and biomedical literature search engines.

  1. Recent Advances in Seismic Wavefront Tracking Techniques and Their Applications

    NASA Astrophysics Data System (ADS)

    Sambridge, M.; Rawlinson, N.; Hauser, J.

    2007-12-01

    In observational seismology, wavefront tracking techniques are becoming increasingly popular as a means of predicting two point traveltimes and their associated paths. Possible applications include reflection migration, earthquake relocation and seismic tomography at a wide variety of scales. Compared with traditional ray based techniques such as shooting and bending, wavefront tracking has the advantage of locating traveltimes between the source and every point in the medium; in many cases, improved efficiency and robustness; and greater potential for tracking multiple arrivals. In this presentation, two wavefront tracking techniques will be considered: the so-called Fast Marching Method (FMM), and a wavefront construction (WFC) scheme. Over the last several years, FMM has become a mature technique in seismology, with a number of improvements to the underlying theory and the release of software tools that allow it to be used in a variety of applications. At its core, FMM is a grid based solver that implicitly tracks a propagating wavefront by seeking finite difference solutions to the eikonal equation along an evolving narrow band. Recent developments include the use of source grid refinement to improve accuracy, the introduction of a multi-stage scheme to allow reflections and refractions to be tracked in layered media, and extension to spherical coordinates. Implementation of these ideas has led to a number of different applications, including teleseismic tomography, wide-angle reflection and refraction tomography, earthquake relocation, and ambient noise imaging using surface waves. The WFC scheme represents the wavefront surface as a set of points in 6-D phase space; these points are advanced in time using local initial value ray tracing in order to form a sequence of wavefront surfaces that fill the model volume. Surface refinement and simplification techniques inspired by recent developments in computer graphics are used to maintain a fixed density of nodes
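
    A minimal first-order fast marching solver on a uniform 2D grid (a textbook sketch of the eikonal update at the heart of FMM, not the presenters' code) can be written as follows:

        import heapq
        import numpy as np

        def fast_marching(slowness, src, h=1.0):
            """First-order FMM solution of |grad T| = s from a point source."""
            ny, nx = slowness.shape
            T = np.full((ny, nx), np.inf)
            known = np.zeros((ny, nx), dtype=bool)
            T[src] = 0.0
            heap = [(0.0, src)]

            def accepted(i, j):  # traveltime of an already-finalized neighbour
                return T[i, j] if 0 <= i < ny and 0 <= j < nx and known[i, j] else np.inf

            while heap:
                t, (i, j) = heapq.heappop(heap)
                if known[i, j]:
                    continue
                known[i, j] = True  # narrow-band point with the smallest trial time
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < ny and 0 <= jj < nx) or known[ii, jj]:
                        continue
                    tx = min(accepted(ii, jj - 1), accepted(ii, jj + 1))
                    ty = min(accepted(ii - 1, jj), accepted(ii + 1, jj))
                    s = slowness[ii, jj] * h
                    if abs(tx - ty) < s:   # both axes contribute: quadratic update
                        tnew = 0.5 * (tx + ty + np.sqrt(2 * s * s - (tx - ty) ** 2))
                    else:                  # one-sided update along the faster axis
                        tnew = min(tx, ty) + s
                    if tnew < T[ii, jj]:
                        T[ii, jj] = tnew
                        heapq.heappush(heap, (tnew, (ii, jj)))
            return T

        # e.g., traveltimes from a corner source in a homogeneous medium:
        # T = fast_marching(np.ones((200, 300)), src=(0, 0), h=10.0)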

  2. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems) embracing the techniques discussed has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.

  3. Parallel computing techniques for rotorcraft aerodynamics

    NASA Astrophysics Data System (ADS)

    Ekici, Kivanc

    The modification of unsteady three-dimensional Navier-Stokes codes for application on massively parallel and distributed computing environments is investigated. The Euler/Navier-Stokes code TURNS (Transonic Unsteady Rotor Navier-Stokes) was chosen as a test bed because of its wide use by universities and industry. For the efficient implementation of TURNS on parallel computing systems, two algorithmic changes are developed. First, modifications to the implicit operator, the Lower-Upper Symmetric Gauss-Seidel (LU-SGS) scheme originally used in TURNS, are performed. Second, an inexact Newton method coupled with a Krylov subspace iterative method (a Newton-Krylov method) is applied. Both techniques had been tried previously for the Euler equations mode of the code. In this work, we have extended the methods to the Navier-Stokes mode. Several new implicit operators were tried because of convergence problems of traditional operators with the high cell aspect ratio (CAR) grids needed for viscous calculations on structured grids. Promising results for both Euler and Navier-Stokes cases are presented for these operators. For the efficient application of Newton-Krylov methods to the Navier-Stokes mode of TURNS, efficient preconditioners must be used. The parallel implicit operators used in the previous step are employed as preconditioners and the results are compared. The Message Passing Interface (MPI) protocol has been used because of its portability to various parallel architectures. It should be noted that the proposed methodology is general and can be applied to several other CFD codes (e.g. OVERFLOW).
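
    The core of such an inexact Newton-Krylov iteration (a generic Jacobian-free sketch, not the TURNS implementation; SciPy's GMRES stands in for the Krylov solver) can be written so that the Jacobian is never formed explicitly:

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def jfnk(F, u0, newton_iters=20, fd_eps=1e-7, tol=1e-8):
            """Jacobian-free Newton-Krylov solution of F(u) = 0."""
            u = u0.copy()
            for _ in range(newton_iters):
                Fu = F(u)
                if np.linalg.norm(Fu) < tol:
                    break
                def jv(v, u=u, Fu=Fu):
                    # directional finite difference approximating J(u) @ v
                    eps = fd_eps * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
                    return (F(u + eps * v) - Fu) / eps
                J = LinearOperator((u.size, u.size), matvec=jv, dtype=float)
                du, _ = gmres(J, -Fu)  # a preconditioner (e.g. LU-SGS) would be passed here
                u = u + du
            return u

    In the setting described above, the parallel implicit operators play the role of the GMRES preconditioner (its M argument), which is what makes the Krylov iterations affordable on high-aspect-ratio viscous grids.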

  4. Some Recent Advances in Computer Graphics.

    ERIC Educational Resources Information Center

    Whitted, Turner

    1982-01-01

    General principles of computer graphics are reviewed, including discussions of display hardware, geometric modeling, algorithms, and applications in science, computer-aided design, flight training, communications, business, art, and entertainment. (JN)

  5. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. Active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  6. Computing Advances in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Baskett, W. P.; Matthews, G. P.

    1984-01-01

    Discusses three trends in computer-oriented chemistry instruction: (1) availability of interfaces to integrate computers with experiments; (2) impact of the development of higher resolution graphics and greater memory capacity; and (3) role of videodisc technology on computer assisted instruction. Includes program listings for auto-titration and…

  7. Visualization Techniques for Computer Network Defense

    SciTech Connect

    Beaver, Justin M; Steed, Chad A; Patton, Robert M; Cui, Xiaohui; Schultz, Matthew A

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  8. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  9. Continuation of advanced crew procedures development techniques

    NASA Technical Reports Server (NTRS)

    Arbet, J. D.; Benbow, R. L.; Evans, M. E.; Mangiaracina, A. A.; Mcgavern, J. L.; Spangler, M. C.; Tatum, I. C.

    1976-01-01

    An operational computer program, the Procedures and Performance Program (PPP), was developed; it operates in conjunction with the Phase I Shuttle Procedures Simulator to provide a procedures recording and crew/vehicle performance monitoring capability. A technical synopsis of each task resulting in the development of the Procedures and Performance Program is provided. Conclusions and recommendations for action leading to improvements in the production of crew procedures development and crew training support are included. The PPP provides real-time CRT displays and post-run hardcopy output of procedures, difference procedures, performance data, parametric analysis data, and training script/training status data. During post-run, the program is designed to support evaluation through the reconstruction of displays to any point in time. A permanent record of the simulation exercise can be obtained via hardcopy output of the display data and via transfer to the Generalized Documentation Processor (GDP). Reference procedures data may be transferred from the GDP to the PPP. An interface is provided with the all-digital trajectory program, the Space Vehicle Dynamics Simulator (SVDS), to support initial procedures timeline development.

  10. Diagnostics of nonlocal plasmas: advanced techniques

    NASA Astrophysics Data System (ADS)

    Mustafaev, Alexander; Grabovskiy, Artiom; Strakhova, Anastasiya; Soukhomlinov, Vladimir

    2014-10-01

    This talk generalizes our recent results, obtained in different directions of plasma diagnostics. First, the method of the flat single-sided probe, based on expansion of the electron velocity distribution function (EVDF) in a series of Legendre polynomials. It will be demonstrated that a flat probe, oriented at different angles with respect to the discharge axis, allows the full EVDF to be determined in nonlocal plasmas. It is also shown that a cylindrical probe is unable to determine the full EVDF. We propose solving this problem by combining the kinetic Boltzmann equation with experimental probe data. Second, magnetic diagnostics. This method is implemented in a Knudsen diode with surface ionization of atoms (KDSI) and is based on measurements of the magnetic characteristics of the KDSI in the presence of a transverse magnetic field. Using magnetic diagnostics we can investigate a wide range of plasma processes: from scattering cross-sections of electrons to plasma-surface interactions. Third, a noncontact diagnostic method for direct measurement of the EVDF in remote plasma objects by a combination of the flat single-sided probe technique and the magnetic polarization Hanley method.
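
    To illustrate the Legendre-expansion step only (a simplified angular decomposition; the real method relates probe current derivatives to the EVDF through kinetic theory, and every name below is hypothetical):

        import numpy as np
        from numpy.polynomial import legendre

        def angular_coeffs(angles_rad, probe_signal, nmax=4):
            """Least-squares Legendre coefficients of a signal measured at
            several probe orientations relative to the discharge axis."""
            x = np.cos(angles_rad)
            V = legendre.legvander(x, nmax)   # columns P_0..P_nmax at each angle
            coeffs, *_ = np.linalg.lstsq(V, probe_signal, rcond=None)
            return coeffs

        # reconstruct the angular dependence on a fine grid:
        # f_theta = legendre.legval(np.cos(theta_grid), coeffs)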

  11. Computer Organizational Techniques Used by Office Personnel.

    ERIC Educational Resources Information Center

    Alexander, Melody

    1995-01-01

    According to survey responses from 404 of 532 office personnel, 81.7% enjoy working with computers; the majority save files on their hard drives, use disk labels and storage files, do not use subdirectories or compress data, and do not make backups of floppy disks. Those with higher degrees, more computer experience, and more daily computer use…

  12. Application of advanced computational technology to propulsion CFD

    NASA Astrophysics Data System (ADS)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  13. Computation of Viscous Flow about Advanced Projectiles.

    DTIC Science & Technology

    1983-09-09

    Domain". Journal of Comp. Physics, Vol. 8, 1971, pp. 392-408. 10. Thompson , J . F ., Thames, F. C., and Mastin, C. M., "Automatic Numerical Generation of...computations, USSR Comput. Math. Math. Phys., 12, 2 (1972), 182-195. I~~ll A - - 18. Thompson , J . F ., F. C. Thames, and C. M. Mastin, Automatic

  14. Computing Algorithms for Nuffield Advanced Physics.

    ERIC Educational Resources Information Center

    Summers, M. K.

    1978-01-01

    Defines all recurrence relations used in the Nuffield course, to solve first- and second-order differential equations, and describes a typical algorithm for computer generation of solutions. (Author/GA)
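
    As a flavour of such algorithms (a minimal sketch in their spirit, not one of the published Nuffield listings), the second-order equation of simple harmonic motion reduces to a pair of first-order recurrences that a short loop steps through:

        # x_{n+1} = x_n + v_n*dt,  v_{n+1} = v_n + a(x_n)*dt,  with a(x) = -(k/m) x
        k_over_m = 4.0
        dt = 0.01
        x, v = 1.0, 0.0
        for n in range(500):
            a = -k_over_m * x              # acceleration from the differential equation
            x, v = x + v * dt, v + a * dt
        print(x, v)                        # approximate oscillation state at t = 5 s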

  15. Aerodynamic optimization studies on advanced architecture computers

    NASA Technical Reports Server (NTRS)

    Chawla, Kalpana

    1995-01-01

    The approach to carrying out multi-discipline aerospace design studies in the future, especially in massively parallel computing environments, comprises choosing (1) suitable solvers to compute solutions to equations characterizing a discipline, and (2) efficient optimization methods. In addition, for aerodynamic optimization problems, (3) smart methodologies must be selected to modify the surface shape. In this research effort, a 'direct' optimization method is implemented on the Cray C-90 to improve aerodynamic design. It is coupled with an existing implicit Navier-Stokes solver, OVERFLOW, to compute flow solutions. The optimization method is chosen such that it can accommodate multi-discipline optimization in future computations. In this work, however, only single-discipline aerodynamic optimization is included.

  16. Advances in gamma titanium aluminides and their manufacturing techniques

    NASA Astrophysics Data System (ADS)

    Kothari, Kunal; Radhakrishnan, Ramachandran; Wereley, Norman M.

    2012-11-01

    Gamma titanium aluminides display attractive properties for high temperature applications. For over a decade in the 1990s, the attractive properties of titanium aluminides were outweighed by difficulties encountered in processing and machining at room temperature. But advances in manufacturing technologies, a deeper understanding of titanium aluminide microstructure and deformation mechanisms, and advances in micro-alloying have led to the production of gamma titanium aluminide sheets. An in-depth review of key advances in gamma titanium aluminides is presented, including microstructure, deformation mechanisms, and alloy development. Traditional manufacturing techniques such as ingot metallurgy and investment casting are reviewed and advances via powder metallurgy based manufacturing techniques are discussed. Finally, manufacturing challenges facing gamma titanium aluminides, as well as avenues to overcome them, are discussed.

  17. 75 FR 44015 - Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... COMMISSION Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing... importation of certain semiconductor products made by advanced lithography techniques and products containing... certain semiconductor products made by advanced lithography techniques or products containing same...

  18. Advanced Crew Personal Support Computer (CPSC) task

    NASA Technical Reports Server (NTRS)

    Muratore, Debra

    1991-01-01

    The topics are presented in view graph form and include: background; objectives of task; benefits to the Space Station Freedom (SSF) Program; technical approach; baseline integration; and growth and evolution options. The objective is to: (1) introduce new computer technology into the SSF Program; (2) augment core computer capabilities to meet additional mission requirements; (3) minimize risk in upgrading technology; and (4) provide a low cost way to enhance crew and ground operations support.

  19. Frontiers of research in advanced computations

    SciTech Connect

    1996-07-01

    The principal mission of the Institute for Scientific Computing Research is to foster interactions among LLNL researchers, universities, and industry on selected topics in scientific computing. In the area of computational physics, the Institute has developed a new algorithm, GaPH, to help scientists understand the chemistry of turbulent and driven plasmas or gases at far less cost than other methods. New low-frequency electromagnetic models better describe the plasma etching and deposition characteristics of a computer chip in the making. A new method for modeling realistic curved boundaries within an orthogonal mesh is resulting in a better understanding of the physics associated with such boundaries and much quicker solutions. All these capabilities are being developed for massively parallel implementation, which is an ongoing focus of Institute researchers. Other groups within the Institute are developing novel computational methods to address a range of other problems. Examples include feature detection and motion recognition by computer, improved monitoring of blood oxygen levels, and entirely new models of human joint mechanics and prosthetic devices.

  20. New Information Dispersal Techniques for Trustworthy Computing

    ERIC Educational Resources Information Center

    Parakh, Abhishek

    2011-01-01

    Information dispersal algorithms (IDA) are used for distributed data storage because they simultaneously provide security, reliability and space efficiency, constituting a trustworthy computing framework for many critical applications, such as cloud computing, in the information society. In the most general sense, this is achieved by dividing data…
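
    A toy Rabin-style dispersal sketch (entirely illustrative: the prime field and padding scheme below are assumptions, and production IDAs work over GF(2^8) with far more care) encodes data into n shares so that any m of them reconstruct it:

        import numpy as np

        P = 257  # a prime just above the byte range, for this toy only

        def _modinv_matrix(M, p):
            """Invert a square integer matrix over GF(p) by Gauss-Jordan elimination."""
            n = len(M)
            A = np.concatenate([np.asarray(M) % p, np.eye(n, dtype=np.int64)], axis=1)
            for col in range(n):
                piv = col + next(r for r in range(n - col) if A[col + r, col] % p)
                A[[col, piv]] = A[[piv, col]]
                A[col] = (A[col] * pow(int(A[col, col]), p - 2, p)) % p  # Fermat inverse
                for r in range(n):
                    if r != col and A[r, col] % p:
                        A[r] = (A[r] - A[r, col] * A[col]) % p
            return A[:, n:]

        def disperse(data, n, m):
            """Encode bytes into n shares; any m of them suffice to reconstruct."""
            pad = (-len(data)) % m
            block = np.frombuffer(data + b"\0" * pad, dtype=np.uint8)
            block = block.reshape(-1, m).astype(np.int64)          # rows of m bytes
            shares = []
            for x in range(1, n + 1):
                row = np.array([pow(x, k, P) for k in range(m)])   # Vandermonde row
                shares.append((x, (block @ row) % P))
            return shares, len(data)

        def reconstruct(shares, m, length):
            xs = [x for x, _ in shares[:m]]
            V = np.array([[pow(x, k, P) for k in range(m)] for x in xs], dtype=np.int64)
            codes = np.stack([c for _, c in shares[:m]])
            block = (_modinv_matrix(V, P) @ codes) % P             # undo the encoding
            return bytes(block.T.astype(np.uint8).reshape(-1)[:length])

        shares, length = disperse(b"trustworthy computing", n=5, m=3)
        print(reconstruct(shares[1:4], m=3, length=length))        # any 3 of 5 shares work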

  1. Digital computer technique for setup and checkout of an analog computer

    NASA Technical Reports Server (NTRS)

    Ambaruch, R.

    1968-01-01

    Computer program technique, called Analog Computer Check-Out Routine Digitally /ACCORD/, generates complete setup and checkout data for an analog computer. In addition, the correctness of the analog program implementation is validated.

  2. Advanced liner-cooling techniques for gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Norgren, C. T.; Riddlebaugh, S. M.

    1985-01-01

    Component research for advanced small gas turbine engines is currently underway at the NASA Lewis Research Center. As part of this program, a basic reverse-flow combustor geometry was maintained while different advanced liner wall cooling techniques were investigated. Performance and liner cooling effectiveness of the experimental combustor configuration featuring counter-flow film-cooled panels are presented and compared with two previously reported combustors featuring splash film-cooled liner walls and transpiration-cooled liner walls (Lamilloy).

  3. Cloud Computing Techniques for Space Mission Design

    NASA Technical Reports Server (NTRS)

    Arrieta, Juan; Senent, Juan

    2014-01-01

    The overarching objective of space mission design is to tackle complex problems, producing better results faster. In developing the methods and tools to fulfill this objective, the user interacts with the different layers of a computing system.

  4. Advances in Computer-Supported Learning

    ERIC Educational Resources Information Center

    Neto, Francisco; Brasileiro, Francisco

    2007-01-01

    The Internet and growth of computer networks have eliminated geographic barriers, creating an environment where education can be brought to a student no matter where that student may be. The success of distance learning programs and the availability of many Web-supported applications and multimedia resources have increased the effectiveness of…

  5. Advanced Computing Tools and Models for Accelerator Physics

    SciTech Connect

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  6. ASDA - Advanced Suit Design Analyzer computer program

    NASA Technical Reports Server (NTRS)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  7. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types with specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
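
    The compositional analysis reduces to labelling voxels by HU windows and aggregating over the muscle region. A schematic sketch (the cut-off values below are placeholders chosen for illustration, not the thresholds used in the cited studies):

        import numpy as np

        # Placeholder HU windows for the tissue classes named in the review;
        # actual cut-offs vary between studies and scanners.
        HU_CLASSES = {
            "fat": (-200, -10),
            "loose_connective_or_atrophic": (-9, 40),
            "normal_muscle": (41, 200),
        }

        def muscle_composition(ct_roi, voxel_volume_mm3):
            """Fraction and absolute volume of each tissue class in a muscle ROI."""
            report = {"mean_HU": float(ct_roi.mean())}
            for name, (lo, hi) in HU_CLASSES.items():
                mask = (ct_roi >= lo) & (ct_roi <= hi)
                report[name] = {
                    "fraction": float(mask.mean()),
                    "volume_mm3": float(mask.sum()) * voxel_volume_mm3,
                }
            return report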

  8. Wafer hot spot identification through advanced photomask characterization techniques

    NASA Astrophysics Data System (ADS)

    Choi, Yohan; Green, Michael; McMurran, Jeff; Ham, Young; Lin, Howard; Lan, Andy; Yang, Richer; Lung, Mike

    2016-10-01

    As device manufacturers progress through advanced technology nodes, limitations in standard 1-dimensional (1D) mask Critical Dimension (CD) metrics are becoming apparent. Historically, 1D metrics such as Mean to Target (MTT) and CD Uniformity (CDU) have been adequate for end users to evaluate and predict the mask impact on the wafer process. However, the wafer lithographer's process margin is shrinking at advanced nodes to a point that the classical mask CD metrics are no longer adequate to gauge the mask contribution to wafer process error. For example, wafer CDU error at advanced nodes is impacted by mask factors such as 3-dimensional (3D) effects and mask pattern fidelity on subresolution assist features (SRAFs) used in Optical Proximity Correction (OPC) models of ever-increasing complexity. These items are not quantifiable with the 1D metrology techniques of today. Likewise, the mask maker needs advanced characterization methods in order to optimize the mask process to meet the wafer lithographer's needs. These advanced characterization metrics are what is needed to harmonize mask and wafer processes for enhanced wafer hot spot analysis. In this paper, we study advanced mask pattern characterization techniques and their correlation with modeled wafer performance.

  9. [Cardiac computed tomography: new applications of an evolving technique].

    PubMed

    Martín, María; Corros, Cecilia; Calvo, Juan; Mesa, Alicia; García-Campos, Ana; Rodríguez, María Luisa; Barreiro, Manuel; Rozado, José; Colunga, Santiago; de la Hera, Jesús M; Morís, César; Luyando, Luis H

    2015-01-01

    During the last years we have witnessed an increasing development of imaging techniques applied in cardiology. Among them, cardiac computed tomography is an emerging and evolving technique. With the current possibility of very low radiation studies, its applications have expanded and now go beyond coronary angiography. In the present article we review the technical developments of cardiac computed tomography and its new applications.

  10. Measuring Speed Using a Computer--Several Techniques.

    ERIC Educational Resources Information Center

    Pearce, Jon M.

    1988-01-01

    Introduces three different techniques to facilitate the measurement of speed and the associated kinematics and dynamics using a computer. Discusses sensing techniques using optical or ultrasonic sensors, interfacing with a computer, software routines for the interfaces, and other applications. Provides circuit diagrams, pictures, and a program to…
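
    The sensing approaches share the same computation: two gate events a known distance apart give speed as distance over time difference. A sketch with hypothetical sensor callables (a real setup would poll an interface board or use interrupts):

        import time

        GATE_SEPARATION_M = 0.10   # assumed spacing between the two light gates

        def wait_for_gate(beam_is_broken):
            """Block until the (hypothetical) sensor reports an interruption."""
            while not beam_is_broken():
                pass
            return time.perf_counter()

        def measure_speed(gate_a, gate_b):
            t1 = wait_for_gate(gate_a)
            t2 = wait_for_gate(gate_b)
            return GATE_SEPARATION_M / (t2 - t1)   # metres per second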

  11. Bringing The Web Down to Size: Advanced Search Techniques.

    ERIC Educational Resources Information Center

    Huber, Joe; Miley, Donna

    1997-01-01

    Examines advanced Internet search techniques, focusing on six search engines. Includes a chart comparison of nine search features: "include two words,""exclude one of two words,""exclude mature audience content,""two adjacent words,""exact match,""contains first and neither of two following…

  12. Advanced Marketing Core Curriculum. Test Items and Assessment Techniques.

    ERIC Educational Resources Information Center

    Smith, Clifton L.; And Others

    This document contains duties and tasks, multiple-choice test items, and other assessment techniques for Missouri's advanced marketing core curriculum. The core curriculum begins with a list of 13 suggested textbook resources. Next, nine duties with their associated tasks are given. Under each task appears one or more citations to appropriate…

  13. Genioglossus muscle advancement: A modification of the conventional technique.

    PubMed

    García Vega, José Ramón; de la Plata, María Mancha; Galindo, Néstor; Navarro, Miriam; Díez, Daniel; Láncara, Fernando

    2014-04-01

    Obstructive sleep apnoea syndrome (OSAS) is a pathophysiologic condition associated with fragmented sleep and arousals caused by nocturnal mechanical obstruction of the upper airway. This results in behavioural derangements, such as excessive daytime sleepiness and fatigue, and pathophysiologic derangements that cause morbidities and mortality including hypertension, arrhythmias, myocardial infarction, stroke and sudden death. The genioglossus advancement is a proven technique for the treatment of mild to moderate obstructive sleep apnoea syndrome by relieving airway obstruction at the hypopharyngeal level. In this article, we report a modification of the conventional genioglossus advancement described by Riley and Powell. The modification we describe replaces the bone segment at the mandibular basal bone rather than at the mid area of the symphysis. This means a linear movement that allows a greater advancement and avoids the rotation of the genioglossus muscle. Through this article we will describe the advantages of the surgical technique such as greater effectiveness, stability, more pleasing aesthetic outcome and the reduction of potential complications.

  14. Computer-aided auscultation learning system for nursing technique instruction.

    PubMed

    Hou, Chun-Ju; Chen, Yen-Ting; Hu, Ling-Chen; Chuang, Chih-Chieh; Chiu, Yu-Hsien; Tsai, Ming-Shih

    2008-01-01

    Pulmonary auscultation is a physical assessment skill learned by nursing students for examining the respiratory system. Generally, a mannequin equipped with a sound simulator is used to teach auscultation techniques to groups via classroom demonstration. However, nursing students cannot readily duplicate this learning environment for self-study. The advancement of electronic and digital signal processing technologies facilitates simulating this learning environment. This study aims to develop a computer-aided auscultation learning system for assisting teachers and nursing students in auscultation teaching and learning. The system provides teachers with signal recording and processing of lung sounds and immediate playback of lung sounds for students. A graphical user interface allows teachers to control the measuring device, draw lung sound waveforms, highlight lung sound segments of interest, and include descriptive text. Effects on learning lung sound auscultation were evaluated to verify the feasibility of the system. Fifteen nursing students voluntarily participated in the repeated experiment. The results of a paired t test showed that the auscultative abilities of the students were significantly improved by using the computer-aided auscultation learning system.

  15. Computer Assisted Instruction Techniques for Screening Freshmen.

    ERIC Educational Resources Information Center

    Flower, K. W.; Craft, W. J.

    1981-01-01

    Describes the use of computer assisted instruction at North Carolina Agriculture and Technical State University in freshman and remedial mathematics to cut down high attrition rates and weed out quickly the students who can't adapt to the vigors of engineering course work. (Author/DS)

  16. Advanced techniques in safeguarding a conditioning facility for spent fuel

    SciTech Connect

    Rudolf, K.; Weh, R. )

    1992-01-01

    Although reprocessing continues to be the main factor in the waste management of nuclear reactors, the alternative of direct final disposal is currently being developed to the level of industrial applications, based on an agreement between the heads of the federal government and the federal states of Germany. Thus, the Konrad and Gorleben sites are being studied as potential final repositories as is the pilot conditioning facility (PKA) under construction. Discussions on the application of safeguards measures have led to the drafting of an approach that will cover the entire back end of the fuel cycle. The conditioning of fuel prior to direct final disposal represents one element in the overall approach. A modern facility equipped with advanced technology, PKA is a pilot plant with regard to conditioning techniques as well as to safeguards. Therefore, the PKA safeguards approach is expected to facilitate future industrial applications of the conditioning procedure. This cannot be satisfactorily implemented without advanced safeguards techniques. The level of development of the safeguards techniques varies. While advanced camera and seal systems are basically available, the other techniques and methods still require research and development. Feasibility studies and equipment development are geared to providing applicable safeguards techniques in time for commissioning of the PKA.

  17. Volumes to learn: advancing therapeutics with innovative computed tomography image data analysis.

    PubMed

    Maitland, Michael L

    2010-09-15

    Semi-automated methods for calculating tumor volumes from computed tomography images are a new tool for advancing the development of cancer therapeutics. Volumetric measurements, relying on already widely available standard clinical imaging techniques, could shorten the observation intervals needed to identify cohorts of patients sensitive or resistant to treatment.

  18. Computational Techniques in Radio Neutrino Event Reconstruction

    NASA Astrophysics Data System (ADS)

    Beydler, M.; ARA Collaboration

    2016-03-01

    The Askaryan Radio Array (ARA) is a high-energy cosmic neutrino detector constructed with stations of radio antennas buried in the ice at the South Pole. Event reconstruction relies on the analysis of the arrival times of the transient radio signals generated by neutrinos interacting within a few kilometers of the detector. Because of its depth dependence, the index of refraction in the ice complicates the interferometric directional reconstruction of possible neutrino events. Currently, there is an ongoing endeavor to enhance the programs used for the time-consuming computations of the curved paths of the transient wave signals in the ice as well as the interferometric beamforming. We have implemented a fast, multi-dimensional spline table lookup of the wave arrival times in order to enable raytrace-based directional reconstructions. Additionally, we have applied parallel computing across multiple Graphics Processing Units (GPUs) in order to perform the beamforming calculations quickly.
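
    The arrival-time speed-up amounts to trading a per-event ray trace for interpolation in a precomputed table. A schematic two-dimensional version using SciPy (the grid, and the random stand-in for the ray-traced table, are assumptions; the collaboration's actual lookup is multi-dimensional):

        import numpy as np
        from scipy.interpolate import RectBivariateSpline

        # hypothetical table of ray-traced arrival times t(r, z) from one antenna
        r_grid = np.linspace(0.0, 5000.0, 501)       # horizontal distance, m
        z_grid = np.linspace(-2800.0, 0.0, 281)      # source depth in the ice, m
        t_table = np.random.rand(r_grid.size, z_grid.size)  # stand-in for ray tracing

        spline = RectBivariateSpline(r_grid, z_grid, t_table)

        def arrival_time(r, z):
            """Fast spline lookup replacing a per-event ray-trace computation."""
            return spline(r, z, grid=False)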

  19. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This

  20. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging remains today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  1. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    NASA Technical Reports Server (NTRS)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI), require increasingly dense high power electronics. Enabling these higher power densities, while maintaining or even improving hardware reliability, requires advances in integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  2. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore

  3. Technology development of fabrication techniques for advanced solar dynamic concentrators

    NASA Technical Reports Server (NTRS)

    Richter, Scott W.

    1991-01-01

    The objective of the advanced concentrator program is to develop the technology that will lead to lightweight, highly reflective, accurate, scalable, and long-lived space solar dynamic concentrators. The advanced concentrator program encompasses new and innovative concepts, fabrication techniques, materials selection, and simulated space environmental testing. Fabrication techniques include methods of fabricating the substrates and coating substrate surfaces to produce high quality optical surfaces, acceptable for further coating with vapor-deposited optical films. The materials selected to obtain a high quality optical surface include microsheet glass and Eccocoat EP-3 epoxy, with DC-93-500 selected as a candidate silicone adhesive and levelizing layer. The following procedures are defined: cutting, cleaning, forming, and bonding microsheet glass. Procedures are also defined for surface cleaning and EP-3 epoxy application. The results and analyses from atomic oxygen and thermal cycling tests are used to determine the effects of orbital conditions in a space environment.

  4. Advanced thermal management techniques for space power electronics

    NASA Astrophysics Data System (ADS)

    Reyes, Angel Samuel

    1992-01-01

    Modern electronic systems used in space must be reliable and efficient with thermal management unaffected by outer space constraints. Current thermal management techniques are not sufficient for the increasing waste heat dissipation of novel electronic technologies. Many advanced thermal management techniques have been developed in recent years that have application in high power electronic systems. The benefits and limitations of emerging cooling technologies are discussed. These technologies include: liquid pumped devices, mechanically pumped two-phase cooling, capillary pumped evaporative cooling, and thermoelectric devices. Currently, liquid pumped devices offer the most promising alternative for electronics thermal control.

  5. Data Compression Techniques for Advanced Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Bradley, William G.

    1998-01-01

    Advanced space transportation systems, including vehicle state of health systems, will produce large amounts of data which must be stored on board the vehicle and or transmitted to the ground and stored. The cost of storage or transmission of the data could be reduced if the number of bits required to represent the data is reduced by the use of data compression techniques. Most of the work done in this study was rather generic and could apply to many data compression systems, but the first application area to be considered was launch vehicle state of health telemetry systems. Both lossless and lossy compression techniques were considered in this study.
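
    A minimal illustration of the lossless/lossy trade-off on a synthetic telemetry channel (zlib stands in for whatever scheme a flight system would actually qualify; the 8-bit quantization step is an arbitrary choice):

        import zlib
        import numpy as np

        telemetry = np.sin(np.linspace(0, 20, 4096)).astype(np.float32)  # synthetic channel
        raw = telemetry.tobytes()

        # Lossless: every bit of the original samples is recoverable.
        lossless = zlib.compress(raw, level=9)

        # Lossy: quantize to 8 bits first; smaller output, but each sample is only
        # recoverable to within half a quantization step.
        lo, hi = float(telemetry.min()), float(telemetry.max())
        q = np.round((telemetry - lo) / (hi - lo) * 255).astype(np.uint8)
        lossy = zlib.compress(q.tobytes(), level=9)

        print(len(raw), len(lossless), len(lossy))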

  6. Advance techniques for monitoring human tolerance to positive Gz accelerations

    NASA Technical Reports Server (NTRS)

    Pelligra, R.; Sandler, H.; Rositano, S.; Skrettingland, K.; Mancini, R.

    1973-01-01

    Tolerance to positive g accelerations was measured in ten normal male subjects using both standard and advanced techniques. In addition to routine electrocardiogram, heart rate, respiratory rate, and infrared television, monitoring techniques during acceleration exposure included measurement of peripheral vision loss, noninvasive temporal, brachial, and/or radial arterial blood flow, and automatic measurement of indirect systolic and diastolic blood pressure at 60-sec intervals. Although brachial and radial arterial flow measurements reflected significant cardiovascular changes during and after acceleration, they were inconsistent indices of the onset of grayout or blackout. Temporal arterial blood flow, however, showed a high correlation with subjective peripheral light loss.

  7. Three-dimensional hybrid grid generation using advancing front techniques

    NASA Technical Reports Server (NTRS)

    Steinbrenner, John P.; Noack, Ralph W.

    1995-01-01

    A new 3-dimensional hybrid grid generation technique has been developed, based on ideas of advancing fronts for both structured and unstructured grids. In this approach, structured grids are first generated independently around individual components of the geometry. Fronts are initialized on these structured grids and advanced outward so that new cells are extracted directly from the structured grids. Employing typical advancing front techniques, cells are rejected if they intersect the existing front or fail other criteria. When no more viable structured cells exist, further cells are advanced in an unstructured manner to close off the overall domain, resulting in a grid of 'hybrid' form. There are two primary advantages to the hybrid formulation. First, generating blocks with limited regard to topology eliminates the bottleneck encountered when a multiple block system is used to fully encapsulate a domain. Individual blocks may be generated free of external constraints, which will significantly reduce the generation time. Secondly, grid points near the body (presumably with high aspect ratio) will still maintain a structured (non-triangular or tetrahedral) character, thereby maximizing grid quality and solution accuracy near the surface.

  8. Strategies and advanced techniques for marine pollution studies

    SciTech Connect

    Giam, C.S.; Dou, H.J.M.

    1986-01-01

    Here is a review of strategies and techniques for evaluating marine pollution by hazardous organic compounds. Geochemical considerations, such as the relationship between the inputs (atmospheric and estuarine transport) and the outputs (sedimentation and degradation), guide the decision on appropriate approaches to pollution monitoring in the marine environment. The latest instrumental methods and standard protocols for analysis of organic compounds are presented, as well as advances in interpretation and correlation of data made possible by the accessibility of commercial databases.

  9. Advanced endoscopic ultrasound management techniques for preneoplastic pancreatic cystic lesions

    PubMed Central

    Arshad, Hafiz Muhammad Sharjeel; Bharmal, Sheila; Duman, Deniz Guney; Liangpunsakul, Suthat; Turner, Brian G

    2017-01-01

    Pancreatic cystic lesions can be benign, premalignant or malignant. The recent increase in detection and tremendous clinical variability of pancreatic cysts has presented a significant therapeutic challenge to physicians. Mucinous cystic neoplasms are of particular interest given their known malignant potential. This review article provides a brief but comprehensive review of premalignant pancreatic cystic lesions with advanced endoscopic ultrasound (EUS) management approaches. A comprehensive literature search was performed using the PubMed, Cochrane, OVID and EMBASE databases. Preneoplastic pancreatic cystic lesions include mucinous cystadenoma and intraductal papillary mucinous neoplasm. The 2012 International Sendai Guidelines guide physicians in their management of pancreatic cystic lesions. Some of the advanced EUS management techniques include ethanol ablation, chemotherapeutic (paclitaxel) ablation, radiofrequency ablation and cryotherapy. In the future, EUS-guided injection of drug-eluting beads and neodymium:yttrium aluminum garnet (Nd:YAG) laser ablation are predicted to be an integral part of EUS-guided management techniques. In summary, the International Sendai Consensus Guidelines should be used to make decisions regarding management of pancreatic cystic lesions. Advanced EUS techniques are proving extremely beneficial in management, especially in those patients who are at high surgical risk. PMID:27574295

  10. [Surgical reconstruction of maxillary defects using computer-assisted techniques].

    PubMed

    Zhang, W B; Yu, Y; Wang, Y; Liu, X J; Mao, C; Guo, C B; Yu, G Y; Peng, X

    2017-02-18

    The maxilla is the most important bony support of the mid-face skeleton and is critical for both esthetics and function. Maxillary defects resulting from tumor resection can cause severe functional and cosmetic deformities, and maxillary reconstruction presents a great challenge for oral and maxillofacial surgeons. Nowadays, vascularized composite bone flap transfer is widely used for functional maxillary reconstruction. In the last decade, we have performed comprehensive research on functional maxillary reconstruction with the free fibula flap and reported excellent functional and acceptable esthetic results. However, this experience-based clinical procedure still has problems with accuracy and efficiency. In recent years, computer-assisted techniques have become widely used in oral and maxillofacial surgery, and we have performed a series of studies on maxillary reconstruction with computer-assisted techniques. The computer-assisted techniques used for maxillary reconstruction mainly include: (1) three-dimensional (3D) reconstruction and tumor mapping, providing a 3D view of the maxillary tumor and adjacent structures and helping to make the diagnosis of maxillary tumor accurate and objective; (2) virtual planning, simulating tumor resection and maxillectomy as well as fibula reconstruction on the computer so as to produce an ideal surgical plan; (3) 3D printing, producing a 3D stereo model for prebending individualized titanium mesh and also providing templates or cutting guides for the surgery; (4) surgical navigation, the bridge between the virtual plan and the real surgery, confirming the virtual plan during the surgery and guaranteeing accuracy; (5) computer-assisted analysis and evaluation, making the assessment of the final result quantitative and objective. We also performed a series of studies to evaluate the application of computer-assisted techniques used for maxillary reconstruction, including: (1) 3D tumor mapping technique for accurate

  11. Activities and operations of the Advanced Computing Research Facility, July-October 1986

    SciTech Connect

    Pieper, G.W.

    1986-01-01

    Research activities and operations of the Advanced Computing Research Facility (ACRF) at Argonne National Laboratory are discussed for the period from July 1986 through October 1986. The facility is currently supported by the Department of Energy and is operated by the Mathematics and Computer Science Division at Argonne. Over the past four-month period, a new commercial multiprocessor, the Intel iPSC-VX/d4 hypercube, was installed. In addition, four other commercial multiprocessors continue to be available for research - an Encore Multimax, a Sequent Balance 21000, an Alliant FX/8, and an Intel iPSC/d5 - as well as a locally designed multiprocessor, the Lemur. These machines are being actively used by scientists at Argonne and throughout the nation in a wide variety of projects concerning computer systems with parallel and vector architectures. A variety of classes, workshops, and seminars have been sponsored to train researchers on computing techniques for the advanced computer systems at the facility. For example, courses were offered on writing programs for parallel computer systems, and the facility hosted the first annual Alliant users group meeting. A Sequent users group meeting and a two-day workshop on performance evaluation of parallel computers and programs are being organized.

  12. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  13. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... Advanced Scientific Computing Advisory Committee AGENCY: Department of Energy, Office of Science. ACTION... Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math... at: (301) 903-7486 or by email at: Melea.Baker@science.doe.gov . You must make your request for...

  14. 78 FR 56871 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Advanced Scientific Computing Advisory Committee AGENCY: Office of Science, Department of Energy. ACTION... Exascale technical approaches subcommittee Facilities update Report from Applied Math Committee of Visitors...: ( Melea.Baker@science.doe.gov ). You must make your request for an oral statement at least five...

  15. The Federal Government's Role in Advancing Computer Technology

    ERIC Educational Resources Information Center

    Information Hotline, 1978

    1978-01-01

    As part of the Federal Data Processing Reorganization Study submitted by the Science and Technology Team, the Federal Government's role in advancing and diffusing computer technology is discussed. Findings and conclusions assess the state-of-the-art in government and in industry, and five recommendations provide directions for government policy…

  16. Advanced computational research in materials processing for design and manufacturing

    SciTech Connect

    Zacharia, T.

    1994-12-31

    The computational requirements for the design and manufacture of automotive components have increased dramatically with the push to produce automobiles with three times the mileage. Automotive component design systems are becoming increasingly reliant on structural analysis, requiring larger and more complex analyses, more three-dimensional analyses, larger model sizes, and routine consideration of transient and nonlinear effects. Such analyses must be performed rapidly to minimize delays in the design and development process, which drives the need for parallel computing. This paper briefly describes advanced computational research in superplastic forming and automotive crashworthiness.

  17. GPU implementation of simultaneous iterative reconstruction techniques for computed tomography

    NASA Astrophysics Data System (ADS)

    Xin, Junjun; Bardel, Chuck; Udpa, Lalita; Udpa, Satish

    2013-01-01

    This paper presents implementations of simultaneous iterative reconstruction techniques (SIRT) for computed tomography on four different graphics processing unit (GPU) cards, using the CUDA parallel computing language and its intrinsic libraries. GPUs are highly parallel computing structures that enable acceleration of scientific and engineering computations. The GPU implementations offer significant improvements in reconstruction time. Initial results on the Shepp-Logan phantom for sizes ranging from 16×16 to 256×256 pixels are presented.
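
    The core of a SIRT-family solver is a simple damped back-projection update; below is a hedged NumPy sketch of that update (the paper's CUDA kernels would parallelize the same matrix-vector products, and the toy problem sizes are assumptions, not the paper's setup).

        import numpy as np

        # SIRT-style update x <- x + C A^T R (b - A x), where R and C hold
        # inverse row and column sums of the projection matrix A.
        rng = np.random.default_rng(0)
        A = rng.random((40, 25))           # toy projection matrix (rays x pixels)
        x_true = rng.random(25)
        b = A @ x_true                     # simulated projection data

        R = 1.0 / A.sum(axis=1)            # inverse row sums
        C = 1.0 / A.sum(axis=0)            # inverse column sums

        x = np.zeros(25)
        for _ in range(200):               # fixed iteration count for the sketch
            x += C * (A.T @ (R * (b - A @ x)))

        print(np.linalg.norm(x - x_true))  # residual shrinks as iterations proceed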

  18. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  19. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.; Easter, Richard C.; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  20. Computation Techniques for the Volume of a Tetrahedron

    ERIC Educational Resources Information Center

    Srinivasan, V. K.

    2010-01-01

    The purpose of this article is to discuss specific techniques for the computation of the volume of a tetrahedron. A few of them are taught in undergraduate multivariable calculus courses. A few of them are found in textbooks on coordinate geometry and synthetic solid geometry. This article gathers many of these techniques so as to constitute a…
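
    One standard technique (a fact of vector algebra, not necessarily one of the article's own examples) computes the volume from the scalar triple product of edge vectors, V = |det(b − a, c − a, d − a)| / 6:

        import numpy as np

        # Volume of a tetrahedron with vertices a, b, c, d via the scalar
        # triple product: V = |det([b-a, c-a, d-a])| / 6.
        def tetrahedron_volume(a, b, c, d):
            a, b, c, d = map(np.asarray, (a, b, c, d))
            return abs(np.linalg.det(np.column_stack((b - a, c - a, d - a)))) / 6.0

        # Unit right-corner tetrahedron: volume 1/6.
        print(tetrahedron_volume((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)))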

  1. Advances in Numerical Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    1997-01-01

    Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computation algorithms, as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries, on which boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries, on which boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.

  2. Noncompaction cardiomyopathy: The role of advanced multimodality imaging techniques in diagnosis and assessment.

    PubMed

    Chebrolu, Lakshmi H; Mehta, Anjlee M; Nanda, Navin C

    2017-02-01

    Noncompaction cardiomyopathy (NCCM) is a unique cardiomyopathy with a diverse array of genotypic and phenotypic manifestations. Its hallmark morphology consists of a bilayered myocardium with a compact epicardial layer and prominent trabeculations that comprise the noncompacted endocardial layer. The controversial diagnostic criteria for NCCM have been frequently discussed in the literature. This review touches on those diagnostic criteria, delves further into the evolving use of advanced imaging techniques within the major imaging modalities (echocardiography, computed tomography, and cardiac magnetic resonance imaging), and proposes an alternative algorithm incorporating these techniques for aiding with the diagnosis of NCCM.

  3. The investigation of advanced remote sensing techniques for the measurement of aerosol characteristics

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Becher, J.

    1979-01-01

    Advanced remote sensing techniques and inversion methods for the measurement of characteristics of aerosol and gaseous species in the atmosphere were investigated. Of particular interest were the physical and chemical properties of aerosols, such as their size distribution, number concentration, and complex refractive index, and the vertical distribution of these properties on a local as well as global scale. Remote sensing techniques for monitoring of tropospheric aerosols were developed as well as satellite monitoring of upper tropospheric and stratospheric aerosols. Computer programs were developed for solving multiple scattering and radiative transfer problems, as well as inversion/retrieval problems. A necessary aspect of these efforts was to develop models of aerosol properties.

  4. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques to gene prediction. First, the problem of gene prediction and its challenges are described, followed by a description of different soft computing techniques and their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided.

  5. Advanced aeroservoelastic stabilization techniques for hypersonic flight vehicles

    NASA Technical Reports Server (NTRS)

    Chan, Samuel Y.; Cheng, Peter Y.; Myers, Thomas T.; Klyde, David H.; Magdaleno, Raymond E.; Mcruer, Duane T.

    1992-01-01

    Advanced high-performance vehicles, including statically unstable Single-Stage-To-Orbit (SSTO) hypersonic flight vehicles, require higher-bandwidth flight control systems to compensate for the instability, resulting in interactions between the flight control system, the engine/propulsion dynamics, and the low-frequency structural modes. Military specifications, such as MIL-F-9490D and MIL-F-87242, tend to limit treatment of structural modes to conventional gain stabilization techniques. The conventional gain stabilization techniques, however, introduce low-frequency effective time delays which can be troublesome from a flying qualities standpoint. These time delays can be alleviated by appropriate blending of gain and phase stabilization techniques (referred to as Hybrid Phase Stabilization or HPS) for the low-frequency structural modes. The potential of using HPS for compensating structural mode interaction was previously explored. It was shown that effective time delay was significantly reduced with the use of HPS; however, the HPS design was seen to have greater residual response than a conventional gain-stabilized design. Additional work performed to advance and refine the HPS design procedure, to further develop residual response metrics as a basis for alternative structural stability specifications, and to develop strategies for validating HPS design and specification concepts in manned simulation is presented. Stabilization design sensitivity to structural uncertainties and aircraft-centered requirements are also assessed.

  6. Advanced sensor-computer technology for urban runoff monitoring

    NASA Astrophysics Data System (ADS)

    Yu, Byunggu; Behera, Pradeep K.; Ramirez Rochac, Juan F.

    2011-04-01

    The paper presents the project team's advanced sensor-computer sphere technology for real-time and continuous monitoring of wastewater runoff at the sewer discharge outfalls along the receiving water. This research significantly enhances and extends the previously proposed novel sensor-computer technology. This advanced technology offers new computation models for an innovative use of the sensor-computer sphere comprising accelerometer, programmable in-situ computer, solar power, and wireless communication for real-time and online monitoring of runoff quantity. This innovation can enable more effective planning and decision-making in civil infrastructure, natural environment protection, and water pollution related emergencies. The paper presents the following: (i) the sensor-computer sphere technology; (ii) a significant enhancement to the previously proposed discrete runoff quantity model of this technology; (iii) a new continuous runoff quantity model. Our comparative study on the two distinct models is presented. Based on this study, the paper further investigates the following: (1) energy-, memory-, and communication-efficient use of the technology for runoff monitoring; (2) possible sensor extensions for runoff quality monitoring.

  7. Testing aspects of advanced coherent electron cooling technique

    SciTech Connect

    Litvinenko, V.; Jing, Y.; Pinayev, I.; Wang, G.; Samulyak, R.; Ratner, D.

    2015-05-03

    An advanced version of Coherent-electron Cooling (CeC) based on the micro-bunching instability has been proposed. This approach promises a significant increase in the bandwidth of the CeC system and, therefore, a significant shortening of cooling time in high-energy hadron colliders. In this paper we present our plans for simulating and testing the key aspects of this proposed technique using the set-up of the coherent-electron-cooling proof-of-principle experiment at BNL.

  8. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module (MCM) developed jointly by JPL and TRW, in a collaborative fashion under their respective research programs. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program; development of the Mass Memory and programmable I/O MCM modules will follow. The three building-block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved by the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  9. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    SciTech Connect

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce; Wade, Doug; Hoang, Thuc

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, the study of advanced nuclear weapons design and manufacturing processes, the analysis of accident scenarios and weapons aging, and the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details) and for quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  10. Activities of the Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Oliger, Joseph

    1994-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under contract with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. Research at RIACS is currently being done in the following areas: (1) parallel computing; (2) advanced methods for scientific computing; (3) high performance networks; and (4) learning systems. RIACS technical reports are usually preprints of manuscripts that have been submitted to research journals or conference proceedings. A list of these reports for the period January 1, 1994 through December 31, 1994 is in the Reports and Abstracts section of this report.

  11. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  12. Recent advances in data assimilation in computational geodynamic models

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, Alik

    2010-05-01

    To restore the dynamics of mantle structures in the geological past, data assimilation can be used to constrain the initial conditions for mantle temperature and velocity from their present observations and estimations. The initial conditions so obtained can then be used to run forward models of mantle dynamics to restore the evolution of mantle structures. If heat diffusion is neglected, the present mantle temperature and flow can be assimilated into the past using backward advection (BAD). Two- and three-dimensional numerical approaches to the solution of the inverse problem of the Rayleigh-Taylor instability were developed for dynamic restoration of diapiric structures to their earlier stages (e.g., Ismail-Zadeh et al., 1998, 2001, 2004; Kaus and Podladchikov, 2001). The mantle flow was modelled backwards in time from present-day mantle density heterogeneities inferred from seismic observations (e.g., Steinberger and O'Connell, 1998; Conrad and Gurnis, 2003). Variational (VAR, also called adjoint) data assimilation was pioneered by meteorologists and is widely used in oceanography and in hydrological studies. The use of VAR data assimilation in models of geodynamics has been put forward by Bunge et al. (2003) and Ismail-Zadeh et al. (2003). The VAR data assimilation algorithm was employed to numerically restore models of mantle plumes (Ismail-Zadeh et al., 2004, 2006; Hier-Majumder et al., 2005; Liu and Gurnis, 2008; Liu et al., 2008). The quasi-reversibility (QRV) technique (computationally more robust) introduces into the backward heat equation an additional term involving the product of a small regularization parameter and a higher-order temperature derivative (the resulting regularized heat equation is based on the Riemann law of heat conduction). Data assimilation in this case is based on a search for the best fit between the forecast model state and the observations by minimizing the regularization parameter
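
    Schematically, and only as a hedged illustration consistent with the description above (not necessarily the exact operator of the cited papers), the QRV idea replaces the ill-posed backward heat problem with a regularized equation of the form

        \frac{\partial T_\beta}{\partial t} = \kappa\,\nabla^2 T_\beta - \beta\,\Lambda T_\beta,
        \qquad 0 < \beta \ll 1,

    where \Lambda is a higher-order spatial derivative operator and the assimilated state is chosen by minimizing the misfit to observations as the regularization parameter \beta is reduced.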

  13. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
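
    To make the Markov-modeling idea concrete, the following is a hedged toy model (not RMG, SURE, or ASSURE output): a duplex system whose units fail at rate lambda and are repaired at rate mu, with loss of both units as the absorbing failure state. Its unreliability at a mission time follows from the matrix exponential of the generator.

        import numpy as np
        from scipy.linalg import expm

        # Toy continuous-time Markov reliability model. States:
        # 0 = both units up, 1 = one unit up, 2 = system failed (absorbing).
        lam, mu = 1e-4, 1e-2          # assumed per-hour failure/repair rates

        Q = np.array([
            [-2 * lam,      2 * lam,  0.0],   # both up -> one up
            [      mu, -(mu + lam),   lam],   # one up  -> repaired or failed
            [     0.0,          0.0,  0.0],   # failed state is absorbing
        ])

        p0 = np.array([1.0, 0.0, 0.0])        # start with both units up
        t = 1000.0                            # mission time in hours
        p_t = p0 @ expm(Q * t)                # state probabilities at time t
        print(f"unreliability at {t:.0f} h: {p_t[2]:.3e}")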

  14. Soft computing in design and manufacturing of advanced materials

    NASA Technical Reports Server (NTRS)

    Cios, Krzysztof J.; Baaklini, George Y; Vary, Alex

    1993-01-01

    The potential of fuzzy sets and neural networks, often referred to as soft computing, for aiding in all aspects of manufacturing of advanced materials like ceramics is addressed. In design and manufacturing of advanced materials, it is desirable to find which of the many processing variables contribute most to the desired properties of the material. There is also interest in real time quality control of parameters that govern material properties during processing stages. The concepts of fuzzy sets and neural networks are briefly introduced and it is shown how they can be used in the design and manufacturing processes. These two computational methods are alternatives to other methods such as the Taguchi method. The two methods are demonstrated by using data collected at NASA Lewis Research Center. Future research directions are also discussed.

  15. Recent Advances in Techniques for Hyperspectral Image Processing

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony; Marconcini, Mattia; Tilton, James C.; Trianni, Giovanna

    2009-01-01

    Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.

  16. A Survey of Architectural Techniques for Near-Threshold Computing

    DOE PAGES

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle in scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to and yet above its threshold voltage, holds the promise of providing many-fold improvement in energy efficiency. However, use of NTC also presents several challenges, such as increased parametric variation, failure rate, and performance loss. Our paper surveys several recent techniques which aim to offset these challenges for fully leveraging the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.
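
    The "many-fold improvement" follows largely from the quadratic dependence of dynamic switching energy on supply voltage, E_dyn ≈ C·V². The sketch below uses assumed, illustrative voltages and capacitance to show the energy ratio between nominal and near-threshold operation.

        # Dynamic switching energy scales roughly as C * V^2, so lowering
        # the supply voltage toward threshold sharply cuts energy per op.
        # Capacitance and voltages below are illustrative assumptions.

        C = 1e-15                           # effective switched capacitance (F)

        def dynamic_energy(v_dd):
            """Energy per switching event in joules, E = C * V^2."""
            return C * v_dd ** 2

        nominal, near_threshold = 1.1, 0.5  # assumed supply voltages (V)
        ratio = dynamic_energy(nominal) / dynamic_energy(near_threshold)
        print(f"energy per op improves ~{ratio:.1f}x near threshold")
        # ~4.8x here; system-level gains also depend on frequency loss
        # and leakage, which NTC techniques must manage.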

  17. A Survey of Architectural Techniques for Near-Threshold Computing

    SciTech Connect

    Mittal, Sparsh

    2015-12-28

    Energy efficiency has now become the primary obstacle in scaling the performance of all classes of computing systems. Low-voltage computing, and specifically near-threshold voltage computing (NTC), which involves operating the transistor very close to and yet above its threshold voltage, holds the promise of providing many-fold improvement in energy efficiency. However, use of NTC also presents several challenges, such as increased parametric variation, failure rate, and performance loss. Our paper surveys several recent techniques which aim to offset these challenges for fully leveraging the potential of NTC. By classifying these techniques along several dimensions, we also highlight their similarities and differences. Ultimately, we hope that this paper will provide insights into state-of-the-art NTC techniques to researchers and system designers and inspire further research in this field.

  18. Advanced IMCW Lidar Techniques for ASCENDS CO2 Column Measurements

    NASA Astrophysics Data System (ADS)

    Campbell, Joel; Lin, Bing; Nehrir, Amin; Harrison, Fenton; Obland, Michael

    2015-04-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space that meet the ASCENDS measurement requirements. In recent numerical, laboratory, and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface, as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent autocorrelation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency modulation is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques to extend the resolution of the lidar beyond the limit implied by the bandwidth of the modulation.
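
    The ranging principle can be illustrated with a hedged sketch: intensity-modulate with a +/-1 pseudonoise (BPSK-like) code, delay it to mimic the surface return, and recover the delay from the peak of the cross-correlation. Code length, delay, and noise level are illustrative assumptions, not the instrument's parameters.

        import numpy as np

        rng = np.random.default_rng(1)
        code = rng.choice([-1.0, 1.0], size=4096)   # pseudonoise BPSK chips

        true_delay = 137                            # delay in chips (unknown)
        echo = np.roll(code, true_delay) + 0.5 * rng.standard_normal(code.size)

        # Circular cross-correlation via FFT; the peak index estimates delay.
        corr = np.fft.ifft(np.fft.fft(echo) * np.conj(np.fft.fft(code))).real
        print("estimated delay:", int(np.argmax(corr)))   # -> 137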

  19. Advances in Cross-Cutting Ideas for Computational Climate Science

    SciTech Connect

    Ng, Esmond; Evans, Katherine J.; Caldwell, Peter; Hoffman, Forrest M.; Jackson, Charles; Kerstin, Van Dam; Leung, Ruby; Martin, Daniel F.; Ostrouchov, George; Tuminaro, Raymond; Ullrich, Paul; Wild, S.; Williams, Samuel

    2017-01-01

    This report presents results from the DOE-sponsored workshop titled "Advancing X-Cutting Ideas for Computational Climate Science Workshop," known as AXICCS, held on September 12-13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from discussions at the workshop. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity yet also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling

  20. Aesthetic Lateral Canthoplasty Using Tarso-Conjunctival Advancement Technique.

    PubMed

    Lee, Eun Jung; Lew, Dae Hyun; Song, Seung Han; Lee, Myung Chul

    2017-01-01

    Reduced horizontal length of the palpebral fissure is a distinctive characteristic of Asian eyelids, and aesthetic lateral canthal lengthening techniques have been performed for refinement. The aim of this study is to describe a novel lateral canthoplasty using tarso-conjunctival advancement with a lid margin splitting procedure on the upper eyelids and to report the postoperative results. From December 2011 to June 2014, patients who underwent lateral canthoplasty using the tarso-conjunctival advancement procedure for aesthetic purposes were reviewed retrospectively. The predictor variables were grouped into demographic and operative categories. The primary outcome variables were the distances from the mid-pupillary line to the lateral canthus and the horizontal length of the palpebral aperture (distance from the medial to lateral canthus). Data analyses were performed using descriptive and univariate statistics. Patients who showed increases in the objective measurements were considered significant. Aesthetic appearance was also evaluated based on pre- and postoperative clinical photographs. A total of 45 patients were enrolled in this study. Both the distance from the mid-pupil to the lateral canthus (ΔDpupil-lateral = 2.78 ± 0.54 mm, P < 0.05) and the palpebral aperture horizontal length (ΔDmedial-lateral = 2.93 ± 0.81 mm, P < 0.05) increased significantly from the pre- to postoperative state. All the patients demonstrated satisfactory aesthetic results during the follow-up. The tarso-conjunctival advancement technique for lateral canthoplasty produced satisfactory aesthetic results with an enlarged palpebral aperture. Future research is required to fully delineate the risk of possible complications, including injury to the eyelashes and meibomian glands.

  1. Surveying co-located space geodesy techniques for ITRF computation

    NASA Astrophysics Data System (ADS)

    Sarti, P.; Sillard, P.; Vittuari, L.

    2003-04-01

    We present a comprehensive operational methodology, based on classical geodetic triangulation and trilateration, that allows determination of the reference points of the five space geodesy techniques used in ITRF computation (i.e., DORIS, GPS, LLR, SLR, VLBI). Most of the time, for a single technique, the reference point is not directly accessible and measurable. Likewise, no mechanically determined ex-centre with respect to an external and measurable point is usually given. In these cases, it is not possible to directly measure the sought reference points, and it is even less straightforward to obtain the statistical information relating these points for different techniques. We outline the most general practical surveying methodology that permits recovery of the reference points of the different techniques regardless of their physical materialization. We also give a detailed analytical approach for less straightforward cases (e.g., non-geodetic VLBI antennae and SLR/LLR systems). We stress the importance of surveying instrumentation and procedure in achieving the best possible results and outline the impact of the information retrieved with our method on ITRF computation. In particular, we give numerical examples of the computation of the reference point of VLBI antennae (Ny Aalesund and Medicina) and of the ex-centre vector linking the co-located VLBI and GPS techniques in Medicina (Italy). Special attention was paid to the rigorous derivation of statistical elements, which will be presented separately.

  2. Computer-assisted virtual planning and surgical template fabrication for frontoorbital advancement.

    PubMed

    Soleman, Jehuda; Thieringer, Florian; Beinemann, Joerg; Kunz, Christoph; Guzman, Raphael

    2015-05-01

    OBJECT The authors describe a novel technique using computer-assisted design (CAD) and computer-assisted manufacturing (CAM) for the fabrication of individualized 3D-printed surgical templates for frontoorbital advancement surgery. METHODS Two patients underwent frontoorbital advancement surgery for unilateral coronal synostosis. Virtual surgical planning (SurgiCase-CMF, version 5.0, Materialise) was done using virtual mirroring techniques and superposition of an age-matched normative 3D pediatric skull model. Based on these measurements, surgical templates were fabricated using a 3D printer. Bifrontal craniotomy and the osteotomies for the orbital bandeau were performed based on the sterilized 3D templates. The remodeling was then done by placing the bone plates within the negative 3D templates and fixing them using absorbable poly-DL-lactic acid plates and screws. RESULTS Both patients exhibited a satisfying head shape postoperatively and at follow-up. No surgery-related complications occurred. The cutting and positioning of the 3D surgical templates proved to be very accurate, easy to use, reproducible, and efficient. CONCLUSIONS Computer-assisted virtual planning and 3D template fabrication for frontoorbital advancement surgery leads to reconstructions based on standardized measurements, precludes subjective remodeling, and seems to be overall safe and feasible. A larger series of patients with long-term follow-up is needed for further evaluation of this novel technique.

  3. Development of processing techniques for advanced thermal protection materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna S.

    1994-01-01

    The effort, which was focused on the research and development of advanced materials for use in Thermal Protection Systems (TPS), has involved chemical and physical testing of refractory ceramic tiles, fabrics, threads and fibers. This testing has included determination of the optical properties, thermal shock resistance, high temperature dimensional stability, and tolerance to environmental stresses. Materials have also been tested in the Arc Jet 2 x 9 Turbulent Duct Facility (TDF), the 1 atmosphere Radiant Heat Cycler, and the Mini-Wind Tunnel Facility (MWTF). A significant part of the effort hitherto has gone towards modifying and upgrading the test facilities so that meaningful tests can be carried out. Another important effort during this period has been the creation of a materials database. Computer systems administration and support have also been provided. These are described in greater detail below.

  4. Advances in reduction techniques for tire contact problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1995-01-01

    Some recent developments in reduction techniques, as applied to predicting the tire contact response and evaluating the sensitivity coefficients of the different response quantities, are reviewed. The sensitivity coefficients measure the sensitivity of the contact response to variations in the geometric and material parameters of the tire. The tire is modeled using a two-dimensional laminated anisotropic shell theory with the effects of variation in geometric and material parameters, transverse shear deformation, and geometric nonlinearities included. The contact conditions are incorporated into the formulation by using a perturbed Lagrangian approach with the fundamental unknowns consisting of the stress resultants, the generalized displacements, and the Lagrange multipliers associated with the contact conditions. The elemental arrays are obtained by using a modified two-field, mixed variational principle. For the application of reduction techniques, the tire finite element model is partitioned into two regions. The first region consists of the nodes that are likely to come in contact with the pavement, and the second region includes all the remaining nodes. The reduction technique is used to significantly reduce the degrees of freedom in the second region. The effectiveness of the computational procedure is demonstrated by a numerical example of the frictionless contact response of the space shuttle nose-gear tire, inflated and pressed against a rigid flat surface. Also, the research topics which have high potential for enhancing the effectiveness of reduction techniques are outlined.
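
    As a hedged, generic illustration of the reduction mechanics (a plain Galerkin projection of a linear system, far simpler than the mixed nonlinear formulation described above), the degrees of freedom in a region are replaced by a handful of basis vectors:

        import numpy as np

        # Project a large linear system K u = f onto a small basis V,
        # solve the reduced system, and expand back. Sizes and the random
        # basis are assumptions; practical methods build V from dominant
        # response modes, which is what makes reduction accurate.
        rng = np.random.default_rng(2)
        n, m = 200, 10                        # full and reduced problem sizes
        A = rng.standard_normal((n, n))
        K = A @ A.T + n * np.eye(n)           # SPD stand-in for a stiffness matrix
        f = rng.standard_normal(n)

        V, _ = np.linalg.qr(rng.standard_normal((n, m)))  # orthonormal basis
        u_red = V @ np.linalg.solve(V.T @ K @ V, V.T @ f) # reduced solve (m x m)

        u_full = np.linalg.solve(K, f)                    # reference full solve
        err = np.linalg.norm(u_red - u_full) / np.linalg.norm(u_full)
        print(f"relative error with a random basis: {err:.2e}")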

  5. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Notably, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  6. Advanced Techniques for Removal of Retrievable Inferior Vena Cava Filters

    SciTech Connect

    Iliescu, Bogdan; Haskal, Ziv J.

    2012-08-15

    Inferior vena cava (IVC) filters have proven valuable for the prevention of primary or recurrent pulmonary embolism in selected patients with or at high risk for venous thromboembolic disease. Their use has become commonplace, and the numbers implanted increase annually. During the last 3 years, in the United States, the percentage of annually placed optional filters, i.e., filters than can remain as permanent filters or potentially be retrieved, has consistently exceeded that of permanent filters. In parallel, the complications of long- or short-term filtration have become increasingly evident to physicians, regulatory agencies, and the public. Most filter removals are uneventful, with a high degree of success. When routine filter-retrieval techniques prove unsuccessful, progressively more advanced tools and skill sets must be used to enhance filter-retrieval success. These techniques should be used with caution to avoid damage to the filter or cava during IVC retrieval. This review describes the complex techniques for filter retrieval, including use of additional snares, guidewires, angioplasty balloons, and mechanical and thermal approaches as well as illustrates their specific application.

  7. Advanced Synchrotron Techniques at High Pressure Collaborative Access Team (HPCAT)

    NASA Astrophysics Data System (ADS)

    Shen, G.; Sinogeikin, S. V.; Chow, P.; Kono, Y.; Meng, Y.; Park, C.; Popov, D.; Rod, E.; Smith, J.; Xiao, Y.; Mao, H.

    2012-12-01

    High Pressure Collaborative Access Team (HPCAT) is dedicated to advancing cutting-edge, multidisciplinary, high-pressure science and technology using synchrotron radiation at Sector 16 of the Advanced Photon Source (APS) of Argonne National Laboratory. At HPCAT, an array of novel x-ray diffraction and spectroscopic techniques has been integrated with high-pressure and extreme-temperature instrumentation for studies of structure and materials properties at extreme conditions. HPCAT consists of four active independent beamlines performing a large range of experiments at extreme conditions. The 16BM-B beamline is dedicated to energy-dispersive and white-beam Laue x-ray diffraction. The majority of experiments are performed with a Paris-Edinburgh large volume press (to 7 GPa and 2500 K) and include amorphous and liquid structure measurement, white-beam radiography, and elastic sound wave velocity measurement of amorphous solid materials, with viscosity and density measurement of liquids under development. 16BM-D is a monochromatic diffraction beamline for powder and single crystal diffraction at high pressure and high (resistive heating) / low (cryostat) temperature. Additional capabilities include high-resolution powder diffraction and x-ray absorption near edge structure (XANES) spectroscopy. The insertion device beamline of HPCAT has two undulators in canted mode (operating independently) and LN-cooled Si monochromators capable of providing a large range of energies. 16ID-B is a microdiffraction beamline mainly focusing on high-pressure powder and single crystal diffraction in the DAC at high temperature (double-sided laser heating and resistive heating) and low temperature (various cryostats). The modern instrumentation allows high-quality diffraction at megabar pressures from light elements, fast experiments with pulsed laser heating, fast dynamic experiments with a Pilatus detector, and so on. The 16ID-D beamline is dedicated to x-ray scattering and spectroscopy research

  8. RECENT ADVANCES IN COMPUTATIONAL MECHANICS FOR CIVIL ENGINEERING

    NASA Astrophysics Data System (ADS)

    Applied Mechanics Committee, Computational Mechanics Subcommittee

    In order to clarify mechanical phenomena in civil engineering, it is necessary to improve computational theory and techniques in consideration of the particular nature of the objects being analyzed, and to update computational mechanics with a focus on practical use. Beyond the analysis of infrastructure, damage prediction for natural disasters such as earthquakes, tsunamis, and floods must reflect the broad ranges in space and time inherent to civil engineering problems, as well as material properties, so it is important to develop new computational methods suited to the particularities of the field. In this context, this paper reviews noteworthy research trends in computational mechanics for resolving complex mechanics problems in civil engineering.

  9. Computer/PERT technique monitors actual versus allocated costs

    NASA Technical Reports Server (NTRS)

    Houry, E.; Walker, J. D.

    1967-01-01

    A computer method measures the user's performance in cost-type contracts utilizing the existing NASA Program Evaluation and Review Technique (PERT) without imposing any additional reporting requirements. Progress is measured by comparing actual costs with the value of work performed in a specific period.
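
    The comparison described is essentially an earned-value check; a minimal sketch, with hypothetical task names and numbers, might look like this:

        # Earned value = budgeted cost of work performed; comparing it
        # with actual cost gives the cost variance for the period.
        tasks = [
            # (task, budgeted cost, fraction complete, actual cost to date)
            ("structural test rig", 120_000, 0.75, 100_000),
            ("telemetry software",   80_000, 0.50,  55_000),
        ]

        for name, budget, done, actual in tasks:
            earned = budget * done          # value of work performed
            variance = earned - actual      # negative means over cost
            print(f"{name}: earned ${earned:,.0f}, actual ${actual:,.0f}, "
                  f"variance ${variance:,.0f}")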

  10. Coal and Coal Constituent Studies by Advanced EMR Techniques

    SciTech Connect

    Alex I. Smirnov; Mark J. Nilges; R. Linn Belford; Robert B. Clarkson

    1998-03-31

    Advanced electron magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. We have achieved substantial progress on upgrading the high-field (HF) EMR (W-band, 95 GHz) spectrometers that are especially advantageous for such studies. In particular, we have built a second W-band instrument (Mark II) in addition to our Mark I. Briefly, Mark II features: (i) an Oxford custom-built 7 T superconducting magnet scannable from 0 to 7 T at up to 0.5 T/min; (ii) a water-cooled coaxial solenoid with up to ±550 G scan under digital (15-bit resolution) computer control; (iii) a custom-engineered precision feedback circuit driving this solenoid, based on an Ultrastab 860R sensor with linearity better than 5 ppm and resolution of 0.05 ppm; (iv) an Oxford CF 1200 cryostat for variable-temperature studies from 1.8 to 340 K. During this grant period we completed several key upgrades of both Mark I and Mark II, particularly to the microwave bridge, the W-band probehead, and the computer interfaces. We utilize these improved instruments for HF EMR studies of spin-spin interactions and of the existence of different paramagnetic species in carbonaceous solids.

  11. Advanced Manufacturing Techniques Demonstrated for Fabricating Developmental Hardware

    NASA Technical Reports Server (NTRS)

    Redding, Chip

    2004-01-01

    NASA Glenn Research Center's Engineering Development Division has been working in support of innovative gas turbine engine systems under development by Glenn's Combustion Branch. These one-of-a-kind components require operation under extreme conditions; high-temperature ceramics were chosen for fabrication because of the hostile operating environment. During the design process, it became apparent that traditional machining techniques would not be adequate to produce the small, intricate features of the conceptual design, which was to be produced by stacking over a dozen thin layers with many small features that would then be aligned and bonded together into a one-piece unit. Instead of using traditional machining, we produced computer models in Pro/ENGINEER (Parametric Technology Corporation (PTC), Needham, MA) to the specifications of the research engineer. The computer models were exported in stereolithography standard (STL) format and used to produce full-size rapid-prototype polymer models. These semi-opaque plastic models were used for visualization and design verification. The computer models were also exported in Initial Graphics Exchange Specification (IGES) format and sent to Glenn's Thermal/Fluids Design & Analysis Branch and Applied Structural Mechanics Branch for profiling heat transfer and mechanical strength analysis.

  12. Advanced Fibre Bragg Grating and Microfibre Bragg Grating Fabrication Techniques

    NASA Astrophysics Data System (ADS)

    Chung, Kit Man

    Fibre Bragg gratings (FBGs) have become a very important technology for communication systems and fibre optic sensing. Typically, FBGs are less than 10-mm long and are fabricated using fused silica uniform phase masks, which become more expensive for longer lengths or non-uniform pitch. Generally, interfering UV laser beams are employed to make long or complex FBGs, and this technique introduces critical precision and control issues. In this work, we demonstrate an advanced FBG fabrication system that enables the writing of long and complex gratings in optical fibres with virtually any apodisation profile, local phase and Bragg wavelength, using a novel optical design in which the incident angles of two UV beams onto an optical fibre can be adjusted simultaneously by moving just one optical component, instead of the two optics employed in earlier configurations, to vary the grating pitch. The key advantage of the grating fabrication system is that complex gratings can be fabricated by controlling the linear movements of two translation stages. In addition to the study of this advanced grating fabrication technique, we also focus on the inscription of FBGs written in optical fibres with a cladding diameter of several tens of microns. Fabrication of microfibres was investigated using a sophisticated tapering method. We also propose a simple but practical technique to filter out the higher-order modes reflected from the FBG written in microfibres via a linear taper region, while the fundamental mode re-couples to the core. By using this technique, reflection from the microfibre Bragg grating (MFBG) can be effectively single mode, simplifying the demultiplexing and demodulation processes. The MFBG exhibits high sensitivity to contact force, and an MFBG-based force sensor was also constructed and tested to investigate its suitability for use as an invasive surgery device. Performance of the contact force sensor packaged in a conforming elastomer material compares favourably to one
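
    For orientation, the pitch written by two interfering UV beams and the resulting Bragg wavelength follow the standard relations Λ = λ_UV / (2 sin θ) and λ_B = 2 n_eff Λ. A minimal sketch evaluating them, with the wavelength, half-angle and effective index as illustrative assumptions rather than values from this thesis:

        import math

        # Standard two-beam interference and Bragg relations; the numbers
        # below are illustrative assumptions, not values from the thesis.
        lam_uv = 244e-9                    # UV writing wavelength (m), assumed
        half_angle = math.radians(13.5)    # half-angle between the UV beams, assumed
        n_eff = 1.447                      # effective index of the core mode, assumed

        pitch = lam_uv / (2 * math.sin(half_angle))   # grating pitch Lambda
        bragg = 2 * n_eff * pitch                     # reflected Bragg wavelength

        print(f"pitch = {pitch*1e9:.1f} nm, Bragg wavelength = {bragg*1e9:.1f} nm")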

  13. A survey of CPU-GPU heterogeneous computing techniques

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, that enable utilizing both CPUs and GPUs to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). Furthermore, we believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational powers of CPUs and GPUs to achieve the goal of exascale performance.
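
    A minimal sketch of the workload-partitioning idea surveyed here: one data-parallel job is split between a CPU path and a GPU path by a tunable ratio. The gpu_process stand-in is hypothetical, since the survey prescribes no particular GPU API:

        # Split one data-parallel job between a CPU path and a GPU path by a
        # tunable ratio. The GPU path is a hypothetical stand-in; a real system
        # would dispatch to a CUDA/OpenCL kernel instead.
        from concurrent.futures import ThreadPoolExecutor

        def cpu_process(chunk):
            return [x * x for x in chunk]          # CPU does its share

        def gpu_process(chunk):
            return [x * x for x in chunk]          # placeholder for a GPU kernel launch

        def heterogeneous_map(data, gpu_fraction=0.7):
            split = int(len(data) * gpu_fraction)  # tuned from profiling in practice
            gpu_part, cpu_part = data[:split], data[split:]
            with ThreadPoolExecutor(max_workers=2) as pool:
                gpu_future = pool.submit(gpu_process, gpu_part)
                cpu_future = pool.submit(cpu_process, cpu_part)
                return gpu_future.result() + cpu_future.result()

        print(heterogeneous_map(list(range(10))))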

  14. A survey of CPU-GPU heterogeneous computing techniques

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-07-04

    As both CPUs and GPUs become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have their unique features and strengths, and hence CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs), such as workload partitioning, that enable utilizing both CPUs and GPUs to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler and application levels. Further, we review both discrete and fused CPU-GPU systems and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). Furthermore, we believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational powers of CPUs and GPUs to achieve the goal of exascale performance.

  15. Computation of the tip vortex flowfield for advanced aircraft propellers

    NASA Technical Reports Server (NTRS)

    Tsai, Tommy M.; Dejong, Frederick J.; Levy, Ralph

    1988-01-01

    The tip vortex flowfield plays a significant role in the performance of advanced aircraft propellers. The flowfield in the tip region is complex, three-dimensional and viscous with large secondary velocities. An analysis is presented using an approximate set of equations which contains the physics required by the tip vortex flowfield, but which does not require the resources of the full Navier-Stokes equations. A computer code was developed to predict the tip vortex flowfield of advanced aircraft propellers. A grid generation package was developed to allow specification of a variety of advanced aircraft propeller shapes. Calculations of the tip vortex generation on an SR3 type blade at high Reynolds numbers were made using this code and a parametric study was performed to show the effect of tip thickness on tip vortex intensity. In addition, calculations of the tip vortex generation on a NACA 0012 type blade were made, including the flowfield downstream of the blade trailing edge. Comparison of flowfield calculations with experimental data from an F4 blade was made. A user's manual was also prepared for the computer code (NASA CR-182178).

  16. Temporomandibular joint computed tomography: development of a direct sagittal technique

    SciTech Connect

    van der Kuijl, B.; Vencken, L.M.; de Bont, L.G.; Boering, G.

    1990-12-01

    Radiology plays an important role in the diagnosis of temporomandibular disorders. Different techniques are used with computed tomography offering simultaneous imaging of bone and soft tissues. It is therefore suited for visualization of the articular disk and may be used in patients with suspected internal derangements and other disorders of the temporomandibular joint. Previous research suggests advantages to direct sagittal scanning, which requires special positioning of the patient and a sophisticated scanning technique. This study describes the development of a new technique of direct sagittal computed tomographic imaging of the temporomandibular joint using a specially designed patient table and internal light visor positioning. No structures other than the patient's head are involved in the imaging process, and misleading artifacts from the arm or the shoulder are eliminated. The use of the scanogram allows precise correction of the condylar axis and selection of exact slice level.

  17. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article brings a systematic review of literature that surveys the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801
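
    A minimal sketch of the workstation-level parallelism discussed, distributing a per-sequence task (here GC content, an illustrative assumption) across local cores with the Python standard library:

        # Data-parallel genomics sketch: per-sequence GC content computed across
        # cores. Task and data are illustrative assumptions; real pipelines
        # would stream FASTA files and far larger read sets.
        from multiprocessing import Pool

        def gc_content(seq):
            gc = sum(base in "GC" for base in seq)
            return gc / len(seq) if seq else 0.0

        if __name__ == "__main__":
            sequences = ["ATGCGC", "TTTTAA", "GGGCCC", "ATATAT"]  # stand-in reads
            with Pool() as pool:               # one worker per available core
                results = pool.map(gc_content, sequences)
            print(results)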

  18. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of Terabytes and Petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analyses of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article brings a systematic review of literature that surveys the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.

  19. Applications of Advanced Nondestructive Measurement Techniques to Address Safety of Flight Issues on NASA Spacecraft

    NASA Technical Reports Server (NTRS)

    Prosser, Bill

    2016-01-01

    Advanced nondestructive measurement techniques are critical for ensuring the reliability and safety of NASA spacecraft. Techniques such as infrared thermography, THz imaging, X-ray computed tomography and backscatter X-ray are used to detect indications of damage in spacecraft components and structures. Additionally, sensor and measurement systems are integrated into spacecraft to provide structural health monitoring to detect damaging events that occur during flight, such as debris impacts during launch and ascent, strikes from micrometeoroids and orbital debris, or excessive loading due to anomalous flight conditions. A number of examples will be provided of how these nondestructive measurement techniques have been applied to resolve safety-critical inspection concerns for the Space Shuttle, International Space Station (ISS), and a variety of launch vehicles and unmanned spacecraft.

  20. Bone tissue engineering scaffolding: computer-aided scaffolding techniques.

    PubMed

    Thavornyutikarn, Boonlom; Chantarapanich, Nattapon; Sitthiseripratip, Kriskrai; Thouas, George A; Chen, Qizhi

    Tissue engineering is essentially a technique for imitating nature. Natural tissues consist of three components: cells, signalling systems (e.g. growth factors) and extracellular matrix (ECM). The ECM forms a scaffold for its cells. Hence, the engineered tissue construct is an artificial scaffold populated with living cells and signalling molecules. A huge effort has been invested in bone tissue engineering, in which a highly porous scaffold plays a critical role in guiding bone and vascular tissue growth and regeneration in three dimensions. In the last two decades, numerous scaffolding techniques have been developed to fabricate highly interconnective, porous scaffolds for bone tissue engineering applications. This review provides an update on the progress of foaming technology of biomaterials, with special attention focused on computer-aided manufacturing (CAM) techniques (Andrade et al. 2002). This article starts with a brief introduction to tissue engineering (Bone tissue engineering and scaffolds) and scaffolding materials (Biomaterials used in bone tissue engineering). After a brief review of conventional scaffolding techniques (Conventional scaffolding techniques), a number of CAM techniques are reviewed in great detail. For each technique, the structure and mechanical integrity of the fabricated scaffolds are discussed in detail. Finally, the advantages and disadvantages of these techniques are compared (Comparison of scaffolding techniques) and summarised (Summary).

  1. Advanced Cytologic Techniques for the Detection of Malignant Pancreatobiliary Strictures

    PubMed Central

    Moreno Luna, Laura E.; Kipp, Benjamin; Halling, Kevin C.; Sebo, Thomas J.; Kremers, Walter K.; Roberts, Lewis R.; Barr Fritcher, Emily G.; Levy, Michael J.; Gores, Gregory J.

    2006-01-01

    Background & Aims Two advanced cytologic techniques for detecting aneuploidy, digital image analysis (DIA) and fluorescence in situ hybridization (FISH), have recently been developed to help identify malignant pancreatobiliary strictures. The aim of this study was to assess the clinical utility of cytology, DIA, and FISH for the identification of malignant pancreatobiliary strictures. Methods Brush cytologic specimens from 233 consecutive patients undergoing ERCP for pancreatobiliary strictures were examined by all three techniques. Strictures were stratified as proximal (n=33) or distal (n=114) based on whether they occurred above or below the cystic duct, respectively. Strictures in patients with PSC (n=86) were analyzed separately. Results Despite the stratification, the performances of the tests were similar. Routine cytology has a low sensitivity (5–20%) but 100% specificity. Because of the high specificity for cytology, we assessed the performance of the other tests when routine cytology was negative. In this clinical context, FISH had an increased sensitivity (35–60%) when assessing for chromosomal gains (polysomy) while preserving the specificity of cytology. The sensitivity and specificity of DIA were intermediate compared to routine cytology and FISH, but were additive to FISH values demonstrating only trisomy of chromosome 7 or chromosome 3. Conclusions These findings suggest that FISH and DIA increase the sensitivity for the diagnosis of malignant pancreatobiliary tract strictures over that obtained by conventional cytology while maintaining an acceptable specificity. PMID:17030177
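
    For readers outside diagnostics, the sensitivity and specificity figures above are simple ratios over the confusion matrix; a minimal sketch with made-up counts (not the study's data):

        # Sensitivity/specificity from confusion-matrix counts. The counts are
        # made up for illustration and are NOT the study's data.
        def sensitivity(tp, fn):
            return tp / (tp + fn)   # fraction of truly malignant strictures detected

        def specificity(tn, fp):
            return tn / (tn + fp)   # fraction of benign strictures correctly cleared

        tp, fn, tn, fp = 12, 8, 50, 0
        print(f"sensitivity = {sensitivity(tp, fn):.0%}, "
              f"specificity = {specificity(tn, fp):.0%}")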

  2. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect the mechanical vibratory response of the check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable whether these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  3. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  4. Advanced Techniques for Constrained Internal Coordinate Molecular Dynamics

    PubMed Central

    Wagner, Jeffrey R.; Balaraman, Gouthaman S.; Niesen, Michiel J. M.; Larsen, Adrien B.; Jain, Abhinandan; Vaidehi, Nagarajan

    2013-01-01

    Internal coordinate molecular dynamics (ICMD) methods provide a more natural description of a protein by using bond, angle and torsional coordinates instead of a Cartesian coordinate representation. Freezing high frequency bonds and angles in the ICMD model gives rise to constrained ICMD (CICMD) models. There are several theoretical aspects that need to be developed in order to make the CICMD method robust and widely usable. In this paper we have designed a new framework for 1) initializing velocities for non-independent CICMD coordinates, 2) efficient computation of center of mass velocity during CICMD simulations, 3) using advanced integrators such as Runge-Kutta, Lobatto and adaptive CVODE for CICMD simulations, and 4) cancelling out the “flying ice cube effect” that sometimes arises in Nosé-Hoover dynamics. The Generalized Newton-Euler Inverse Mass Operator (GNEIMO) method is an implementation of a CICMD method that we have developed to study protein dynamics. GNEIMO allows for a hierarchy of coarse-grained simulation models based on the ability to rigidly constrain any group of atoms. In this paper, we perform tests on the Lobatto and Runge-Kutta integrators to determine optimal simulation parameters. We also implement an adaptive coarse graining tool using the GNEIMO Python interface. This tool enables the secondary structure-guided “freezing and thawing” of degrees of freedom in the molecule on the fly during MD simulations, and is shown to fold four proteins to their native topologies. With these advancements we envision the use of the GNEIMO method in protein structure prediction, structure refinement, and in studying domain motion. PMID:23345138
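
    As one concrete reference for the integrators named above, a classical fourth-order Runge-Kutta step for dx/dt = f(t, x); this is the textbook scheme on a scalar test problem, not the GNEIMO implementation itself:

        # Textbook classical RK4 step for dx/dt = f(t, x); a reference point for
        # the integrators discussed, not the GNEIMO code itself.
        def rk4_step(f, t, x, h):
            k1 = f(t, x)
            k2 = f(t + h / 2, x + h / 2 * k1)
            k3 = f(t + h / 2, x + h / 2 * k2)
            k4 = f(t + h, x + h * k3)
            return x + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

        # Scalar test problem x' = -x, x(0) = 1; a real CICMD state would be a
        # vector of internal coordinates and their velocities.
        x, t, h = 1.0, 0.0, 0.01
        for _ in range(100):
            x = rk4_step(lambda t, x: -x, t, x, h)
            t += h
        print(x)   # ~ exp(-1) = 0.3679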

  5. Advanced techniques for constrained internal coordinate molecular dynamics.

    PubMed

    Wagner, Jeffrey R; Balaraman, Gouthaman S; Niesen, Michiel J M; Larsen, Adrien B; Jain, Abhinandan; Vaidehi, Nagarajan

    2013-04-30

    Internal coordinate molecular dynamics (ICMD) methods provide a more natural description of a protein by using bond, angle, and torsional coordinates instead of a Cartesian coordinate representation. Freezing high-frequency bonds and angles in the ICMD model gives rise to constrained ICMD (CICMD) models. There are several theoretical aspects that need to be developed to make the CICMD method robust and widely usable. In this article, we have designed a new framework for (1) initializing velocities for nonindependent CICMD coordinates, (2) efficient computation of center of mass velocity during CICMD simulations, (3) using advanced integrators such as Runge-Kutta, Lobatto, and adaptive CVODE for CICMD simulations, and (4) cancelling out the "flying ice cube effect" that sometimes arises in Nosé-Hoover dynamics. The Generalized Newton-Euler Inverse Mass Operator (GNEIMO) method is an implementation of a CICMD method that we have developed to study protein dynamics. GNEIMO allows for a hierarchy of coarse-grained simulation models based on the ability to rigidly constrain any group of atoms. In this article, we perform tests on the Lobatto and Runge-Kutta integrators to determine optimal simulation parameters. We also implement an adaptive coarse-graining tool using the GNEIMO Python interface. This tool enables the secondary structure-guided "freezing and thawing" of degrees of freedom in the molecule on the fly during molecular dynamics simulations and is shown to fold four proteins to their native topologies. With these advancements, we envision the use of the GNEIMO method in protein structure prediction, structure refinement, and in studying domain motion.

  6. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances in the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, main results of the finished EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  7. Advanced Computer Science on Internal Ballistics of Solid Rocket Motors

    NASA Astrophysics Data System (ADS)

    Shimada, Toru; Kato, Kazushige; Sekino, Nobuhiro; Tsuboi, Nobuyuki; Seike, Yoshio; Fukunaga, Mihoko; Daimon, Yu; Hasegawa, Hiroshi; Asakawa, Hiroya

    This paper describes the development of a numerical simulation system, which we call “Advanced Computer Science on SRM Internal Ballistics (ACSSIB)”, for the purpose of improving the performance and reliability of solid rocket motors (SRM). The ACSSIB system consists of a casting simulation code for solid propellant slurry, a correlation database relating the local burning rate of cured propellant to local slurry flow characteristics, and a numerical code for the internal ballistics of SRMs, as well as relevant hardware. This paper mainly describes the objectives, the contents of this R&D, and the output of fiscal year 2008.
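
    The burning-rate correlation at the heart of such a system is commonly expressed with the Saint-Robert (Vieille) power law r = a p^n; a minimal sketch with placeholder coefficients (assumed values, not ACSSIB's measured correlations):

        # Saint-Robert (Vieille) power-law burning rate, the standard empirical
        # form in internal-ballistics codes; a and n are assumed placeholders,
        # not ACSSIB's measured correlations.
        def burn_rate(p_mpa, a=5.0, n=0.35):
            """Local regression rate (mm/s) as a function of chamber pressure (MPa)."""
            return a * p_mpa ** n

        for p in (3.0, 5.0, 7.0):
            print(f"p = {p} MPa -> r = {burn_rate(p):.2f} mm/s")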

  8. Advances in Electromagnetic Modelling through High Performance Computing

    SciTech Connect

    Ko, K.; Folwell, N.; Ge, L.; Guetz, A.; Lee, L.; Li, Z.; Ng, C.; Prudencio, E.; Schussman, G.; Uplenchwar, R.; Xiao, L.; /SLAC

    2006-03-29

    Under the DOE SciDAC project on Accelerator Science and Technology, a suite of electromagnetic codes has been under development at SLAC that are based on unstructured grids for higher accuracy, and use parallel processing to enable large-scale simulation. The new modeling capability is supported by SciDAC collaborations on meshing, solvers, refinement, optimization and visualization. These advances in computational science are described and the application of the parallel eigensolver Omega3P to the cavity design for the International Linear Collider is discussed.

  9. Training Software in Artificial-Intelligence Computing Techniques

    NASA Technical Reports Server (NTRS)

    Howard, Ayanna; Rogstad, Eric; Chalfant, Eugene

    2005-01-01

    The Artificial Intelligence (AI) Toolkit is a computer program for training scientists, engineers, and university students in three soft-computing techniques (fuzzy logic, neural networks, and genetic algorithms) used in artificial-intelligence applications. The program provides an easily understandable tutorial interface, including an interactive graphical component through which the user can gain hands-on experience in soft-computing techniques applied to realistic example problems. The tutorial provides step-by-step instructions on the workings of soft-computing technology, whereas the hands-on examples allow interaction and reinforcement of the techniques explained throughout the tutorial. In the fuzzy-logic example, a user can interact with a robot and an obstacle course to verify how fuzzy logic is used to command a rover traverse from an arbitrary start to the goal location. For the genetic-algorithm example, the problem is to determine the minimum-length path for visiting a user-chosen set of planets in the solar system. For the neural-network example, the problem is to decide, on the basis of input data on physical characteristics, whether a person is a man, woman, or child. The AI Toolkit is compatible with the Windows 95, 98, ME, NT 4.0, 2000, and XP operating systems. A computer having a processor speed of at least 300 MHz and random-access memory of at least 56 MB is recommended for optimal performance. The program can be run on a slower computer having less memory, but some functions may not be executed properly.
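
    A minimal sketch in the spirit of the genetic-algorithm example (minimum-length tour of chosen planets); the positions are made up, crossover is omitted for brevity (selection plus swap mutation only), and this is not the Toolkit's code:

        import random

        random.seed(1)
        # Made-up 1-D positions standing in for planet distances (illustrative only)
        pos = {"Venus": 0.72, "Earth": 1.0, "Mars": 1.52, "Jupiter": 5.2, "Saturn": 9.5}
        planets = list(pos)

        def tour_length(tour):
            return sum(abs(pos[tour[i + 1]] - pos[tour[i]]) for i in range(len(tour) - 1))

        def mutate(tour):
            child = tour[:]
            i, j = random.sample(range(len(child)), 2)   # swap two visit slots
            child[i], child[j] = child[j], child[i]
            return child

        pop = [random.sample(planets, len(planets)) for _ in range(30)]
        for _ in range(200):
            pop.sort(key=tour_length)                    # rank by fitness
            pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
        best = min(pop, key=tour_length)
        print(best, round(tour_length(best), 2))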

  10. Computational methods of the Advanced Fluid Dynamics Model

    SciTech Connect

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.; Berthier, J.; Maudlin, P.J.; Schmuck, P.; Goutagny, L.; Ichikawa, S.; Ninokata, H.; Luck, L.B.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development.
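
    For orientation, a generic single-phase fractional-step update (a hedged simplification; AFDM itself advances three velocity fields with multicomponent heat and mass transfer) splits each time step into an explicit predictor followed by a pressure correction:

        \mathbf{u}^{*} = \mathbf{u}^{n} + \Delta t \left[ -(\mathbf{u}^{n} \cdot \nabla)\mathbf{u}^{n} + \nu \nabla^{2} \mathbf{u}^{n} \right]

        \nabla^{2} p^{n+1} = \frac{\rho}{\Delta t} \, \nabla \cdot \mathbf{u}^{*}, \qquad \mathbf{u}^{n+1} = \mathbf{u}^{*} - \frac{\Delta t}{\rho} \nabla p^{n+1}

    The Poisson step enforces \nabla \cdot \mathbf{u}^{n+1} = 0; AFDM's semi-implicit pressure iteration plays the role of this step for its coupled multi-field equations.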

  11. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.
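
    A minimal sketch of the system-identification step, assuming a generic least-squares ARX fit to simulated time-history data (the paper's exact estimator is not reproduced here):

        import numpy as np

        # Least-squares fit of a discrete ARX model
        #   y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1]
        # from time-history data; a generic identification sketch only.
        rng = np.random.default_rng(0)
        u = rng.standard_normal(500)                       # excitation history
        y = np.zeros(500)
        for k in range(2, 500):                            # "true" system to recover
            y[k] = 1.5 * y[k - 1] - 0.7 * y[k - 2] + 0.5 * u[k - 1]

        Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])  # regressors at each step
        theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
        print(theta)                                       # ~ [1.5, -0.7, 0.5]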

  12. Evaluation of computational radiometric and spectral sensor calibration techniques

    NASA Astrophysics Data System (ADS)

    Manakov, Alkhazur

    2016-04-01

    Radiometric and spectral calibration are essential for enabling the use of digital sensors for measurement purposes. Traditional optical calibration techniques require expensive equipment such as specialized light sources, monochromators, tunable filters, calibrated photo-diodes, etc. The trade-offs between computational and physics-based characterization schemes are, however, not well understood. In this paper we perform an analysis of existing computational calibration schemes and elucidate their weak points. We highlight the limitations by comparing against ground truth measurements performed in an optical characterization laboratory (EMVA 1288 standard). Based on our analysis, we present accurate and affordable methods for the radiometric and spectral calibration of a camera.

  13. Computational ocean acoustics: Advances in 3D ocean acoustic modeling

    NASA Astrophysics Data System (ADS)

    Schmidt, Henrik; Jensen, Finn B.

    2012-11-01

    The numerical models of ocean acoustic propagation developed in the 1980s are still in widespread use today, and the field of computational ocean acoustics is often considered a mature field. However, the explosive increase in computational power available to the community has created opportunities for modeling phenomena that were earlier beyond reach. Most notably, three-dimensional propagation and scattering problems have been computationally prohibitive, but are now addressed routinely using brute-force numerical approaches such as the Finite Element Method, in particular for target scattering problems, where they are being combined with the traditional wave-theory propagation models in hybrid modeling frameworks. Also, recent years have seen the development of hybrid approaches coupling oceanographic circulation models with acoustic propagation models, enabling the forecasting of sonar performance uncertainty in dynamic ocean environments. These and other advances made over the last couple of decades support the notion that the field of computational ocean acoustics is far from mature. [Work supported by the Office of Naval Research, Code 321OA.]

  14. Computer-Assisted Technique for Surgical Tooth Extraction

    PubMed Central

    Hamza, Hosamuddin

    2016-01-01

    Introduction. Surgical tooth extraction is a common procedure in dentistry. However, numerous extraction cases show a high level of difficulty in practice. This difficulty is usually related to inadequate visualization, improper instrumentation, or other factors related to the targeted tooth (e.g., ankyloses or presence of bony undercut). Methods. In this work, the author presents a new technique for surgical tooth extraction based on 3D imaging, computer planning, and a new concept of computer-assisted manufacturing. Results. The outcome of this work is a surgical guide made by 3D printing of plastics and CNC of metals (hybrid outcome). In addition, the conventional surgical cutting tools (surgical burs) are modified with a number of stoppers adjusted to avoid any excessive drilling that could harm bone or other vital structures. Conclusion. The present outcome could provide a minimally invasive technique to overcome the routine complications facing dental surgeons in surgical extraction procedures. PMID:27127510

  15. Computer-Assisted Technique for Surgical Tooth Extraction.

    PubMed

    Hamza, Hosamuddin

    2016-01-01

    Introduction. Surgical tooth extraction is a common procedure in dentistry. However, numerous extraction cases show a high level of difficulty in practice. This difficulty is usually related to inadequate visualization, improper instrumentation, or other factors related to the targeted tooth (e.g., ankyloses or presence of bony undercut). Methods. In this work, the author presents a new technique for surgical tooth extraction based on 3D imaging, computer planning, and a new concept of computer-assisted manufacturing. Results. The outcome of this work is a surgical guide made by 3D printing of plastics and CNC of metals (hybrid outcome). In addition, the conventional surgical cutting tools (surgical burs) are modified with a number of stoppers adjusted to avoid any excessive drilling that could harm bone or other vital structures. Conclusion. The present outcome could provide a minimally invasive technique to overcome the routine complications facing dental surgeons in surgical extraction procedures.

  16. Discriminating coastal rangeland production and improvements with computer aided techniques

    NASA Technical Reports Server (NTRS)

    Reeves, C. A.; Faulkner, D. P.

    1975-01-01

    The feasibility and utility of using satellite data and computer-aided remote sensing analysis techniques to conduct range inventories were tested. This pilot study was focused over a 250,000 acre site in Galveston and Brazoria Counties along the Texas Gulf Coast. Rectified enlarged aircraft color infrared photographs of this site were used as the ground truth base. The different land categories were identified, delineated, and measured. Multispectral scanner (MSS) bulk data from LANDSAT-1 was received and analyzed with the Image 100 pattern recognition system. Features of interest were delineated on the image console giving the number of picture elements classified; the picture elements were converted to acreages and the accuracy of the technique was evaluated by comparison with data base results for three test sites. The accuracies for computer aided classification of coastal marshes ranged from 89% to 96%.
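
    The conversion from classified picture elements to acreage is a simple scaling by the pixel footprint; the sketch below assumes the nominal LANDSAT-1 MSS pixel of roughly 57 m by 79 m, which should be treated as an assumption rather than the study's own calibration:

        # Convert classified pixel counts to acreage. The ~57 m x 79 m
        # LANDSAT-1 MSS pixel footprint is a nominal assumption; the study's
        # own calibration could differ.
        SQ_M_PER_ACRE = 4046.86
        pixel_area_m2 = 57 * 79                 # ~0.45 ha per picture element

        def pixels_to_acres(n_pixels):
            return n_pixels * pixel_area_m2 / SQ_M_PER_ACRE

        print(round(pixels_to_acres(1000), 1))  # classified marsh pixels -> acres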

  17. Computational cardiology: how computer simulations could be used to develop new therapies and advance existing ones

    PubMed Central

    Trayanova, Natalia A.; O'Hara, Thomas; Bayer, Jason D.; Boyle, Patrick M.; McDowell, Kathleen S.; Constantino, Jason; Arevalo, Hermenegild J.; Hu, Yuxuan; Vadakkumpadan, Fijoy

    2012-01-01

    This article reviews the latest developments in computational cardiology. It focuses on the contribution of cardiac modelling to the development of new therapies as well as the advancement of existing ones for cardiac arrhythmias and pump dysfunction. Reviewed are cardiac modelling efforts aimed at advancing and optimizing existent therapies for cardiac disease (defibrillation, ablation of ventricular tachycardia, and cardiac resynchronization therapy) and at suggesting novel treatments, including novel molecular targets, as well as efforts to use cardiac models in stratification of patients likely to benefit from a given therapy, and the use of models in diagnostic procedures. PMID:23104919

  18. Advances in Poly(4-aminodiphenylaniline) Nanofibers Preparation by Electrospinning Technique.

    PubMed

    Della Pina, C; Busacca, C; Frontera, P; Antonucci, P L; Scarpino, L A; Sironi, A; Falletta, E

    2016-05-01

    Polyaniline (PANI) nanofibers are drawing a great deal of interest from academia and industry due to their multiple applications, especially in the biomedical field. PANI nanofibers were successfully electrospun for the first time by MacDiarmid and co-workers at the beginning of the millennium, and since then many efforts have been devoted to improving their quality. However, traditional PANI prepared from aniline monomer shows some drawbacks, such as the presence of toxic (i.e., benzidine) and inorganic (salts and metals) co-products, which complicate polymer post-treatment, and low solubility in common organic solvents, which makes it hard to process by the electrospinning technique. Some industrial sectors, such as the medical and biomedical ones, need to employ materials free from toxic and polluting species. In this regard, the oxidative polymerization of N-(4-aminophenyl)aniline, the aniline dimer, to produce poly(4-aminodiphenylaniline), P4ADA, a kind of PANI, represents an innovative alternative to the traditional synthesis, because the obtained polymer is free from carcinogenic and/or polluting co-products and, moreover, more soluble than traditional PANI. This latter feature can be exploited to obtain P4ADA nanofibers by the electrospinning technique. In this paper we report the advances obtained in P4ADA nanofiber electrospinning. A comparison among polyethylene oxide (PEO), polymethyl methacrylate (PMMA) and polystyrene (PS) as the second polymer used to facilitate the electrospinning process is shown. To increase the conductivity of P4ADA nanofibers, two strategies were adopted and compared: selectively removing the insulating binder from electrospun nanofibers by a rinsing treatment, and optimizing the minimum amount of binder necessary for the electrospinning process. Moreover, the effect of the PEO/P4ADA weight ratio on fiber morphology and conductivity is highlighted.

  19. A review of hemorheology: Measuring techniques and recent advances

    NASA Astrophysics Data System (ADS)

    Sousa, Patrícia C.; Pinho, Fernando T.; Alves, Manuel A.; Oliveira, Mónica S. N.

    2016-02-01

    Significant progress has been made over the years on the topic of hemorheology, not only in terms of the development of more accurate and sophisticated techniques, but also in terms of understanding the phenomena associated with blood components, their interactions and impact upon blood properties. The rheological properties of blood are strongly dependent on the interactions and mechanical properties of red blood cells, and a variation of these properties can bring further insight into the human health state and can be an important parameter in clinical diagnosis. In this article, we provide both a reference for hemorheological research and a resource regarding the fundamental concepts in hemorheology. This review is aimed at those starting in the field of hemodynamics, where blood rheology plays a significant role, but also at those in search of the most up-to-date findings (both qualitative and quantitative) in hemorheological measurements and novel techniques used in this context, including technical advances under more extreme conditions such as in large amplitude oscillatory shear flow or under extensional flow, which impose large deformations comparable to those found in the microcirculatory system and in diseased vessels. Given the impressive rate of increase in the available knowledge on blood flow, this review is also intended to identify areas where current knowledge is still incomplete, and which have the potential for new, exciting and useful research. We also discuss the most important parameters that can lead to an alteration of blood rheology, and which as a consequence can have a significant impact on the normal physiological behavior of blood.

  20. Computational analysis of semi-span model test techniques

    NASA Technical Reports Server (NTRS)

    Milholen, William E., II; Chokani, Ndaona

    1996-01-01

    A computational investigation was conducted to support the development of a semi-span model test capability in the NASA LaRC's National Transonic Facility. This capability is required for the testing of high-lift systems at flight Reynolds numbers. A three-dimensional Navier-Stokes solver was used to compute the low-speed flow over both a full-span configuration and a semi-span configuration. The computational results were found to be in good agreement with the experimental data. The computational results indicate that the stand-off height has a strong influence on the flow over a semi-span model. The semi-span model adequately replicates the aerodynamic characteristics of the full-span configuration when a small stand-off height, approximately twice the tunnel empty sidewall boundary layer displacement thickness, is used. Several active sidewall boundary layer control techniques were examined including: upstream blowing, local jet blowing, and sidewall suction. Both upstream tangential blowing, and sidewall suction were found to minimize the separation of the sidewall boundary layer ahead of the semi-span model. The required mass flow rates are found to be practicable for testing in the NTF. For the configuration examined, the active sidewall boundary layer control techniques were found to be necessary only near the maximum lift conditions.

  1. Advanced condition monitoring techniques and plant life extension studies at EBR-2

    SciTech Connect

    Singer, R.M.; Gross, K.C.; Perry, W.H.; King, R.W.

    1991-01-01

    Numerous advanced techniques have been evaluated and tested at EBR-2 as part of a plant-life extension program for detection of degradation and other abnormalities in plant systems. Two techniques have been determined to be of considerable assistance in planning for the extended-life operation of EBR-2. The first, a computer-based pattern-recognition system (System State Analyzer, or SSA), is used for surveillance of the primary system instrumentation, primary sodium pumps and plant heat balances. This surveillance has indicated that the SSA can detect instrumentation degradation and system performance degradation over varying time intervals and can be used to provide derived signal values to replace signals from failed sensors. The second technique, also a computer-based pattern-recognition system (Sequential Probability Ratio Test, or SPRT), is used to validate signals and to detect incipient failures in sensors and components or systems. It is being used on the failed-fuel detection system and is experimentally used on the primary coolant pumps. Both techniques are described and experience with their operation is presented.
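
    A minimal sketch of the SPRT concept, the classic Wald test on the mean of Gaussian sensor residuals (a textbook illustration, not the EBR-2 implementation):

        import math
        import random

        # Wald SPRT on the mean of Gaussian residuals: H0 mean 0 vs H1 mean mu1.
        # Textbook sketch of the concept, not the EBR-2 implementation.
        alpha, beta = 0.01, 0.01                 # false-alarm / missed-alarm targets
        A = math.log((1 - beta) / alpha)         # accept-H1 (degraded) threshold
        B = math.log(beta / (1 - alpha))         # accept-H0 (normal) threshold
        mu1, sigma = 0.5, 1.0

        def sprt(samples):
            llr = 0.0
            for n, x in enumerate(samples, 1):
                # log-likelihood ratio increment for N(mu1, sigma) vs N(0, sigma)
                llr += (mu1 / sigma**2) * (x - mu1 / 2)
                if llr >= A:
                    return "degraded (H1)", n
                if llr <= B:
                    return "normal (H0)", n
            return "undecided", n

        random.seed(0)
        drifted = (random.gauss(0.5, 1.0) for _ in range(10_000))
        print(sprt(drifted))                     # decides after a handful of samples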

  2. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical practice, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine.

  3. Advanced Techniques for Power System Identification from Measured Data

    SciTech Connect

    Pierre, John W.; Wies, Richard; Trudnowski, Daniel

    2008-11-25

    Time-synchronized measurements provide rich information for estimating a power system's electromechanical modal properties via advanced signal processing. This information is becoming critical for the improved operational reliability of interconnected grids. A given mode's properties are described by its frequency, damping, and shape. Modal frequencies and damping are useful indicators of power-system stress, usually declining with increased load or reduced grid capacity. Mode shape provides critical information for operational control actions. This project investigated many advanced techniques for power system identification from measured data, focusing on mode frequency and damping ratio estimation. Investigators from the three universities coordinated their effort with Pacific Northwest National Laboratory (PNNL). Significant progress was made on developing appropriate techniques for system identification with confidence intervals and on testing those techniques on field-measured data and through simulation. Experimental data from the western area power system was provided by PNNL and the Bonneville Power Administration (BPA) for both ambient conditions and signal injection tests. Three large-scale tests were conducted for the western area in 2005 and 2006. Measured field PMU (Phasor Measurement Unit) data was provided to the three universities. A 19-machine simulation model was enhanced for testing the system identification algorithms. Extensive simulations were run with this model to test the performance of the algorithms. University of Wyoming researchers participated in four primary activities: (1) block and adaptive processing techniques for mode estimation from ambient signals and probing signals, (2) confidence interval estimation, (3) probing signal design and injection method analysis, and (4) performance assessment and validation from simulated and field-measured data. Subspace-based methods have been used to improve previous results from block processing
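
    A minimal sketch of the kind of block-processing estimator studied: fit a low-order linear-prediction model to ringdown data and read mode frequency and damping from its characteristic roots (a generic textbook approach, not the project's algorithms):

        import numpy as np

        # Linear-prediction (Prony-style) mode estimate from ringdown data:
        # fit y[k] = a1*y[k-1] + a2*y[k-2], then read frequency and damping
        # from the characteristic roots. Illustrative only.
        fs = 10.0                                     # sample rate (Hz), PMU-like assumption
        t = np.arange(0, 20, 1 / fs)
        f0, zeta = 0.4, 0.05                          # 0.4 Hz inter-area-style mode, assumed
        sigma = -zeta * 2 * np.pi * f0 / np.sqrt(1 - zeta**2)
        y = np.exp(sigma * t) * np.cos(2 * np.pi * f0 * t)

        Phi = np.column_stack([y[1:-1], y[:-2]])
        a, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
        roots = np.roots([1, -a[0], -a[1]])           # z-plane characteristic roots
        lam = np.log(roots[0]) * fs                   # continuous-time eigenvalue
        freq = abs(lam.imag) / (2 * np.pi)
        damping = -lam.real / abs(lam)
        print(f"frequency ~ {freq:.3f} Hz, damping ratio ~ {damping:.3f}")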

  4. Pediatric Cardiopulmonary Resuscitation: Advances in Science, Techniques, and Outcomes

    PubMed Central

    Topjian, Alexis A.; Berg, Robert A.; Nadkarni, Vinay M.

    2009-01-01

    More than 25% of children survive to hospital discharge after in-hospital cardiac arrests, and 5% to 10% survive after out-of-hospital cardiac arrests. This review of pediatric cardiopulmonary resuscitation addresses the epidemiology of pediatric cardiac arrests, mechanisms of coronary blood flow during cardiopulmonary resuscitation, the 4 phases of cardiac arrest resuscitation, appropriate interventions during each phase, special resuscitation circumstances, extracorporeal membrane oxygenation cardiopulmonary resuscitation, and quality of cardiopulmonary resuscitation. The key elements of pathophysiology that impact and match the timing, intensity, duration, and variability of the hypoxic-ischemic insult to evidence-based interventions are reviewed. Exciting discoveries in basic and applied-science laboratories are now relevant for specific subpopulations of pediatric cardiac arrest victims and circumstances (eg, ventricular fibrillation, neonates, congenital heart disease, extracorporeal cardiopulmonary resuscitation). Improving the quality of interventions is increasingly recognized as a key factor for improving outcomes. Evolving training strategies include simulation training, just-in-time and just-in-place training, and crisis-team training. The difficult issue of when to discontinue resuscitative efforts is addressed. Outcomes from pediatric cardiac arrests are improving. Advances in resuscitation science and state-of-the-art implementation techniques provide the opportunity for further improvement in outcomes among children after cardiac arrest. PMID:18977991

  5. Recommended advanced techniques for waterborne pathogen detection in developing countries.

    PubMed

    Alhamlan, Fatimah S; Al-Qahtani, Ahmed A; Al-Ahdal, Mohammed N

    2015-02-19

    The effect of human activities on water resources has expanded dramatically during the past few decades, leading to the spread of waterborne microbial pathogens. The total global health impact of human infectious diseases associated with pathogenic microorganisms from land-based wastewater pollution was estimated to be approximately three million disability-adjusted life years (DALY), with an estimated economic loss of nearly 12 billion US dollars per year. Although clean water is essential for healthy living, it is not equally granted to all humans. Indeed, people who live in developing countries are challenged every day by an inadequate supply of clean water. Polluted water can lead to health crises that in turn spread waterborne pathogens. Taking measures to assess the water quality can prevent these potential risks. Thus, a pressing need has emerged in developing countries for comprehensive and accurate assessments of water quality. This review presents current and emerging advanced techniques for assessing water quality that can be adopted by authorities in developing countries.

  6. Dissecting cell adhesion architecture using advanced imaging techniques

    PubMed Central

    Morton, Penny E

    2011-01-01

    Cell adhesion to extracellular matrix proteins or to other cells is essential for the control of embryonic development, tissue integrity, immune function and wound healing. Adhesions are tightly spatially regulated structures containing over one hundred different proteins that coordinate both dynamics and signaling events at these sites. Extensive biochemical and morphological analysis of adhesion types over the past three decades has greatly improved understanding of individual protein contributions to adhesion signaling and, in some cases, dynamics. However, it is becoming increasingly clear that these diverse macromolecular complexes contain a variety of protein sub-networks, as well as distinct sub-domains that likely play important roles in regulating adhesion behavior. Until recently, resolving these structures, which are often less than a micron in size, was hampered by the limitations of conventional light microscopy. However, recent advances in optical techniques and imaging methods have revealed exciting insight into the intricate control of adhesion structure and assembly. Here we provide an overview of the recent data arising from such studies of cell:matrix and cell:cell contact and an overview of the imaging strategies that have been applied to study the intricacies and hierarchy of proteins within adhesions. PMID:21785274

  7. Nanocrystalline materials: recent advances in crystallographic characterization techniques

    PubMed Central

    Ringe, Emilie

    2014-01-01

    Most properties of nanocrystalline materials are shape-dependent, providing their exquisite tunability in optical, mechanical, electronic and catalytic properties. An example of the former is localized surface plasmon resonance (LSPR), the coherent oscillation of conduction electrons in metals that can be excited by the electric field of light; this resonance frequency is highly dependent on both the size and shape of a nanocrystal. An example of the latter is the marked difference in catalytic activity observed for different Pd nanoparticles. Such examples highlight the importance of particle shape in nanocrystalline materials and their practical applications. However, one may ask ‘how are nanoshapes created?’, ‘how does the shape relate to the atomic packing and crystallography of the material?’, ‘how can we control and characterize the external shape and crystal structure of such small nanocrystals?’. This feature article aims to give the reader an overview of important techniques, concepts and recent advances related to these questions. Nucleation, growth and how seed crystallography influences the final synthesis product are discussed, followed by shape prediction models based on seed crystallography and thermodynamic or kinetic parameters. The crystallographic implications of epitaxy and orientation in multilayered, core-shell nanoparticles are overviewed, and, finally, the development and implications of novel, spatially resolved analysis tools are discussed. PMID:25485133

  8. REVIEW ARTICLE: Emission measurement techniques for advanced powertrains

    NASA Astrophysics Data System (ADS)

    Adachi, Masayuki

    2000-10-01

    Recent developments in high-efficiency low-emission powertrains require the emission measurement technologies to be able to detect regulated and unregulated compounds with very high sensitivity and a fast response. For example, levels of a variety of nitrogen compounds and sulphur compounds should be analysed in real time in order to develop aftertreatment systems to decrease emission of NOx for the lean burning powertrains. Also, real-time information on the emission of particulate matter for the transient operation of diesel engines and direct injection gasoline engines is invaluable. The present paper reviews newly introduced instrumentation for such emission measurement that is demanded for the developments in advanced powertrain systems. They include Fourier transform infrared spectroscopy, mass spectrometry and fast response flame ionization detection. In addition, demands and applications of the fuel reformer developments for fuel cell electric vehicles are discussed. Besides the detection methodologies, sample handling techniques for the measurement of concentrations emitted from low emission vehicles for which the concentrations of the pollutants are significantly lower than the concentrations present in ambient air, are also described.

  9. Development of advanced strain diagnostic techniques for reactor environments.

    SciTech Connect

    Fleming, Darryn D.; Holschuh, Thomas Vernon,; Miller, Timothy J.; Hall, Aaron Christopher; Urrea, David Anthony,; Parma, Edward J.,

    2013-02-01

    The following research was conducted as a Laboratory Directed Research and Development (LDRD) initiative at Sandia National Laboratories. The long-term goals of the program include sophisticated diagnostics of advanced fuels testing for nuclear reactors for the Department of Energy (DOE) Gen IV program, with the future capability to provide real-time measurement of strain in fuel rod cladding during operation in situ at any research or power reactor in the United States. By quantifying the stress and strain in fuel rods, it is possible to significantly improve fuel rod design, and consequently, to improve the performance and lifetime of the cladding. During the past year of this program, two sets of experiments were performed: small-scale tests to ensure reliability of the gages, and reactor pulse experiments involving the most viable samples in the Annular Core Research Reactor (ACRR), located onsite at Sandia. Strain measurement techniques that can provide useful data in the extreme environment of a nuclear reactor core are needed to characterize nuclear fuel rods. This report documents the progression of solutions to this issue that were explored for feasibility in FY12 at Sandia National Laboratories, Albuquerque, NM.

  10. Nanocrystalline materials: recent advances in crystallographic characterization techniques.

    PubMed

    Ringe, Emilie

    2014-11-01

    Most properties of nanocrystalline materials are shape-dependent, providing their exquisite tunability in optical, mechanical, electronic and catalytic properties. An example of the former is localized surface plasmon resonance (LSPR), the coherent oscillation of conduction electrons in metals that can be excited by the electric field of light; this resonance frequency is highly dependent on both the size and shape of a nanocrystal. An example of the latter is the marked difference in catalytic activity observed for different Pd nanoparticles. Such examples highlight the importance of particle shape in nanocrystalline materials and their practical applications. However, one may ask 'how are nanoshapes created?', 'how does the shape relate to the atomic packing and crystallography of the material?', 'how can we control and characterize the external shape and crystal structure of such small nanocrystals?'. This feature article aims to give the reader an overview of important techniques, concepts and recent advances related to these questions. Nucleation, growth and how seed crystallography influences the final synthesis product are discussed, followed by shape prediction models based on seed crystallography and thermodynamic or kinetic parameters. The crystallographic implications of epitaxy and orientation in multilayered, core-shell nanoparticles are overviewed, and, finally, the development and implications of novel, spatially resolved analysis tools are discussed.

  11. Techniques for increasing the update rate of real-time dynamic computer graphic displays

    NASA Technical Reports Server (NTRS)

    Kahlbaum, W. M., Jr.

    1986-01-01

    This paper describes several techniques which may be used to increase the animation update rate of real-time computer raster graphic displays. The techniques were developed on the ADAGE RDS 3000 graphic system in support of the Advanced Concepts Simulator at the NASA Langley Research Center. The first technique involves pre-processing of the next animation frame while the previous one is being erased from the screen memory. The second technique involves the use of a parallel processor, the AGG4, for high-speed character generation. The description of the AGG4 includes the Barrel Shifter, which is part of the hardware and is the key to the high-speed character rendition. The final result of this total effort was a fourfold increase in the update rate of an existing primary flight display, from 4 to 16 frames per second.
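
    The first technique is a pipeline overlap: build frame n+1 while frame n is being displayed and erased. A minimal sketch with rendering and display stubbed out (the stubs are placeholders; no ADAGE-specific calls are implied):

        # Overlap frame preparation with display: a worker thread builds frame
        # n+1 while frame n is shown. Rendering/display are stubbed placeholders.
        from concurrent.futures import ThreadPoolExecutor

        def build_frame(n):
            return f"frame-{n}"          # stand-in for display-list preprocessing

        def show(frame):
            print("displaying", frame)   # stand-in for writing to screen memory

        with ThreadPoolExecutor(max_workers=1) as pool:
            pending = pool.submit(build_frame, 0)
            for n in range(1, 5):
                nxt = pool.submit(build_frame, n)   # prepare the next frame...
                show(pending.result())              # ...while the current one is shown
                pending = nxt
            show(pending.result())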

  12. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  13. Recent advances in computational mechanics of the human knee joint.

    PubMed

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  14. Recent Advances in Computational Mechanics of the Human Knee Joint

    PubMed Central

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  15. Determining flexor-tendon repair techniques via soft computing

    NASA Technical Reports Server (NTRS)

    Johnson, M.; Firoozbakhsh, K.; Moniem, M.; Jamshidi, M.

    2001-01-01

    A soft computing (SC) based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and under variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method requires ad hoc decisions when the outcomes for individual objectives contradict a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous, straightforward computational process in which changing preferences and the importance of differing objectives are easily accommodated. Adding more objectives is likewise straightforward. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.
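
    A minimal sketch of the weighted-aggregation step at the heart of such an SC-based decision: fuzzy scores for each technique against each objective are combined with preference weights, and changing the physician's or hospital's priorities means changing only the weight vector. The technique names, objectives, scores and weights below are illustrative assumptions, not data from the study:

        import numpy as np

        # Hypothetical fuzzy scores in [0, 1] for each repair technique against
        # each objective (e.g., tensile strength, gap resistance, ease of use).
        techniques = ["modified Kessler", "Savage", "cruciate"]
        scores = np.array([[0.8, 0.6, 0.9],
                           [0.9, 0.7, 0.5],
                           [0.7, 0.9, 0.6]])   # rows: techniques, cols: objectives

        # Situation-dependent importance weights; changing preferences means
        # simply changing this vector.
        weights = np.array([0.5, 0.3, 0.2])

        aggregate = scores @ weights            # weighted fuzzy aggregation
        best = techniques[int(np.argmax(aggregate))]
        print(dict(zip(techniques, aggregate.round(3))), "->", best)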

  16. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of resources and computation on a pay-as-you-go basis (together termed "cloud computing") has recently emerged. The collective resources of the datacenter, including both hardware and software, can be made publicly available, in which case they are termed a public cloud, the resources being provided in a virtual mode to clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection, are discussed with reference to traditional workflows.

  17. Recovery Act: Advanced Interaction, Computation, and Visualization Tools for Sustainable Building Design

    SciTech Connect

    Greenberg, Donald P.; Hencey, Brandon M.

    2013-08-20

    Current building energy simulation technology requires excessive labor, time and expertise to create building energy models and excessive computational time for accurate simulations, and its results are difficult to interpret. These deficiencies can be ameliorated using modern graphical user interfaces and algorithms which take advantage of modern computer architectures and display capabilities. To prove this hypothesis, we developed an experimental test bed for building energy simulation. This novel test bed environment offers an easy-to-use interactive graphical interface, provides access to innovative simulation modules that run at accelerated computational speeds, and presents new graphics visualization methods to interpret simulation results. Our system offers the promise of dramatic ease of use in comparison with currently available building energy simulation tools. Its modular structure makes it suitable for early-stage building design, as a research platform for the investigation of new simulation methods, and as a tool for teaching concepts of sustainable design. Improvements in the accuracy and execution speed of many of the simulation modules are based on the modification of advanced computer graphics rendering algorithms. Significant performance improvements are demonstrated in several computationally expensive energy simulation modules. The incorporation of these modern graphical techniques should advance the state of the art in the domain of whole building energy analysis and building performance simulation, particularly at the conceptual design stage when decisions have the greatest impact. More importantly, these better simulation tools will enable the transition from prescriptive to performance-based energy codes, resulting in better, more efficient designs for our future built environment.

  18. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics

    DTIC Science & Technology

    1981-08-01

  19. Advanced Computational and Experimental Techniques for Nacelle Liner Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Gerhold, Carl H.; Jones, Michael G.; Brown, Martha C.; Nark, Douglas

    2009-01-01

    The Curved Duct Test Rig (CDTR) has been developed to investigate sound propagation through a duct of size comparable to the aft bypass duct of typical aircraft engines. The bypass duct is often curved along its axial dimension, and this geometric characteristic is captured in the CDTR. The semiannular bypass duct is simulated by a rectangular test section in which the height corresponds to the circumferential dimension and the width corresponds to the radial dimension. The liner samples are perforate over honeycomb core and are installed on the side walls of the test section. The top and bottom surfaces of the test section are acoustically rigid to simulate a hard-wall bifurcation or pylon. A unique feature of the CDTR is the control system that generates sound incident on the liner test section in specific modes. Uniform air flow, at ambient temperature and a flow speed of Mach 0.275, is introduced through the duct. Experiments to investigate configuration effects, such as curvature along the flow path, on the acoustic performance of a sample liner are performed in the CDTR and reported in this paper. Combinations of treated and acoustically rigid side walls are investigated. The scattering of modes of the incident wave, both by the curvature and by the asymmetry of wall treatment, is demonstrated in the experimental results. The effect that mode scattering has on the total acoustic effectiveness of the liner treatment is also shown. Comparisons of measured liner attenuation with numerical results predicted by an analytic model based on the parabolic approximation to the convected Helmholtz equation are reported. The spectra of attenuation produced by the analytic model are similar to experimental results for both walls treated, straight and curved flow paths, with plane wave and higher-order modes incident. The numerical model is used to define the optimized resistance and reactance of a liner that significantly improves liner attenuation in the frequency range 1900-2400 Hz. A liner impedance descriptor is used to determine the liner parameters that achieve the optimum impedance.
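
    For orientation, the governing model can be stated compactly. For uniform mean flow at Mach number M along the duct axis x and time dependence e^{i omega t}, a standard form of the convected Helmholtz equation for the acoustic pressure p is (the notation here is generic; the paper's exact formulation may differ):

        \left( ik + M \frac{\partial}{\partial x} \right)^{2} p = \nabla^{2} p,
        \qquad k = \frac{\omega}{c}

    The parabolic approximation retains components propagating predominantly along x, converting the elliptic boundary-value problem into a one-way marching problem that is far cheaper to solve in a lined duct.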

  20. NASA Trapezoidal Wing Computations Including Transition and Advanced Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Rumsey, C. L.; Lee-Rausch, E. M.

    2012-01-01

    Flow about the NASA Trapezoidal Wing is computed with several turbulence models by using grids from the first High Lift Prediction Workshop in an effort to advance understanding of computational fluid dynamics modeling for this type of flowfield. Transition is accounted for in many of the computations. In particular, a recently developed 4-equation transition model is utilized and works well overall. Accounting for transition tends to increase lift and decrease moment, which improves the agreement with experiment. Upper surface flap separation is reduced, and agreement with experimental surface pressures and velocity profiles is improved. The predicted shape of wakes from upstream elements is strongly influenced by grid resolution in regions above the main and flap elements. Turbulence model enhancements to account for rotation and curvature have the general effect of increasing lift and improving the resolution of the wing tip vortex as it convects downstream. However, none of the models improve the prediction of surface pressures near the wing tip, where more grid resolution is needed.

  1. Advanced Computational Methods for Thermal Radiative Heat Transfer

    SciTech Connect

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.

    2016-10-01

    Participating media radiation (PMR) calculations in weapon safety analyses for abnormal thermal environments are too costly to perform routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.
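
    A minimal sketch of one widely used ROM recipe, proper orthogonal decomposition (POD) with Galerkin projection, on a stand-in full-order model; this illustrates the offline/online cost split rather than the authors' PMR formulation:

        import numpy as np

        # Full-order stand-in: a steady heat-type system K u = f(p) solved for
        # many parameter values p (illustrative; not the Sandia PMR model).
        n = 400
        K = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1))
        x = np.linspace(0.0, 1.0, n)
        f = lambda p: np.exp(-((x - p) ** 2) / 0.01)

        # Offline: collect snapshots and keep the dominant SVD modes as a basis.
        snaps = np.column_stack([np.linalg.solve(K, f(p))
                                 for p in np.linspace(0.2, 0.8, 20)])
        U, s, _ = np.linalg.svd(snaps, full_matrices=False)
        Phi = U[:, :8]                      # reduced basis (8 modes)

        # Online: Galerkin projection yields an 8x8 solve instead of 400x400.
        Kr = Phi.T @ K @ Phi
        p_new = 0.55
        u_rom = Phi @ np.linalg.solve(Kr, Phi.T @ f(p_new))
        u_full = np.linalg.solve(K, f(p_new))
        print("relative ROM error:",
              np.linalg.norm(u_rom - u_full) / np.linalg.norm(u_full))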

  2. Satellite communication performance evaluation: Computational techniques based on moments

    NASA Astrophysics Data System (ADS)

    Omura, J. K.; Simon, M. K.

    1980-09-01

    Computational techniques that efficiently compute bit error probabilities when only moments of the various interference random variables are available are presented. The approach taken is a generalization of the well-known Gauss-Quadrature rules used for numerically evaluating single or multiple integrals. In what follows, basic algorithms are developed, some of their properties and generalizations are shown, and their many potential applications are described. Some typical interference scenarios for which the results are particularly applicable include: intentional jamming; adjacent and cochannel interference; radar pulses (RFI); multipath; and intersymbol interference. While the examples presented stress evaluation of bit error probabilities in uncoded digital communication systems, the moment techniques can also be applied to the evaluation of other parameters, such as the computational cutoff rate under both normal and mismatched receiver cases in coded systems. Another important application is the determination of the probability distributions of the output of a discrete-time dynamical system. This type of model occurs widely in control systems, queueing systems, and synchronization systems (e.g., discrete phase-locked loops).
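
    The core construction can be sketched compactly: from the moments of the interference distribution, build the Hankel moment matrix, recover the three-term recurrence coefficients via a Cholesky factorization, and take the eigenvalues and eigenvectors of the resulting Jacobi matrix (the Golub-Welsch procedure) to get quadrature nodes and weights. A hedged numpy sketch, with Gaussian interference chosen purely so that the moments are easy to write down; the signal amplitude is an illustrative assumption:

        import numpy as np
        from scipy.special import erfc

        def gauss_rule_from_moments(mu):
            """n-point Gauss rule from the moments mu[0..2n] of a measure."""
            n = (len(mu) - 1) // 2
            H = np.array([[mu[i + j] for j in range(n + 1)] for i in range(n + 1)])
            R = np.linalg.cholesky(H).T          # H = R^T R, R upper triangular
            d = np.diag(R)
            alpha = np.empty(n)
            alpha[0] = R[0, 1] / d[0]
            for k in range(1, n):
                alpha[k] = R[k, k + 1] / d[k] - R[k - 1, k] / d[k - 1]
            beta = d[1:n] / d[:n - 1]            # Jacobi off-diagonal entries
            J = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
            nodes, V = np.linalg.eigh(J)
            weights = mu[0] * V[0, :] ** 2       # Golub-Welsch weight formula
            return nodes, weights

        Q = lambda x: 0.5 * erfc(x / np.sqrt(2.0))   # Gaussian tail probability

        # Illustrative: interference X ~ N(0, 0.1); only its moments are used.
        s2 = 0.1
        mu = [1, 0, s2, 0, 3 * s2**2, 0, 15 * s2**3]   # E[X^k], k = 0..6
        x, w = gauss_rule_from_moments(mu)
        snr = 3.0                                       # hypothetical amplitude
        print("P(error) ~", np.sum(w * Q(snr + x)))

    The error-probability estimate then costs only a handful of Q-function evaluations, however complicated the interference distribution, provided its moments are known.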

  3. Satellite communication performance evaluation: Computational techniques based on moments

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1980-01-01

    Computational techniques that efficiently compute bit error probabilities when only moments of the various interference random variables are available are presented. The approach taken is a generalization of the well-known Gauss-Quadrature rules used for numerically evaluating single or multiple integrals. In what follows, basic algorithms are developed, some of their properties and generalizations are shown, and their many potential applications are described. Some typical interference scenarios for which the results are particularly applicable include: intentional jamming; adjacent and cochannel interference; radar pulses (RFI); multipath; and intersymbol interference. While the examples presented stress evaluation of bit error probabilities in uncoded digital communication systems, the moment techniques can also be applied to the evaluation of other parameters, such as the computational cutoff rate under both normal and mismatched receiver cases in coded systems. Another important application is the determination of the probability distributions of the output of a discrete-time dynamical system. This type of model occurs widely in control systems, queueing systems, and synchronization systems (e.g., discrete phase-locked loops).

  4. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    NASA Astrophysics Data System (ADS)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, Remove-Compute-Restore (RCR) has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a package called GRAVTool, based on MATLAB, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil (~6,000 km²), with wavy relief and heights varying from 600 m to 1,340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The numerical example shows the local geoid model computed by the GRAVTool package using 1,377 terrestrial gravity observations, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ±0.071 m, RMS = 0.069 m, maximum = 0.178 m, minimum = -0.123 m) matches the uncertainty (σ = ±0.073 m) of 21 randomly spaced points where the geoid was determined by geometric levelling supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ±0.099 m, RMS = 0.208 m, maximum = 0.419 m, minimum = -0.040 m).
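
    In generic notation (GGM for the global geopotential model contribution, RTM for the residual terrain effect computed from the DTM), the RCR steps implemented by a package like GRAVTool can be summarized as:

        \Delta g_{\mathrm{res}} = \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{RTM}} \quad \text{(remove)}

        N_{\mathrm{res}} = \frac{R}{4\pi\gamma} \iint_{\sigma} \Delta g_{\mathrm{res}}\, S(\psi)\, \mathrm{d}\sigma \quad \text{(compute)}

        N = N_{\mathrm{GGM}} + N_{\mathrm{res}} + N_{\mathrm{RTM}} \quad \text{(restore)}

    where S(ψ) is the Stokes kernel, γ is normal gravity and R a mean Earth radius; the exact kernels and reductions used by GRAVTool may differ in detail.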

  5. Computational intelligence techniques for biological data mining: An overview

    NASA Astrophysics Data System (ADS)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted, raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective at overcoming the limitations of traditional in-vitro experiments on the constantly increasing sequence data. Critical problems that have caught researchers' attention include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, and analysis of microarray gene expression data. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature, including neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision trees and HMM-based algorithms. Major difficulties in applying these algorithms include the limitations of existing feature encoding and selection methods in extracting the best features, the need to increase classification accuracy, and the need to decrease the running-time overheads of the learning algorithms. This research is potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.
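
    To make the workflow concrete, here is a toy sketch of one of the listed techniques (an SVM) applied to protein classification via a simple amino-acid composition encoding; the sequences and class bias are synthetic stand-ins, not real genome data:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        AA = "ACDEFGHIKLMNPQRSTVWY"

        def composition(seq):
            """Encode a protein sequence as its 20-dim amino-acid composition."""
            return np.array([seq.count(a) for a in AA]) / max(len(seq), 1)

        # Synthetic stand-in data: random sequences with a crude class bias.
        rng = np.random.default_rng(0)
        def fake_seq(bias):
            probs = np.full(20, 1.0)
            probs[AA.index(bias)] = 5.0
            probs /= probs.sum()
            return "".join(rng.choice(list(AA), size=80, p=probs))

        X = np.array([composition(fake_seq("K" if y else "D")) for y in (0, 1) * 50])
        y = np.array([0, 1] * 50)
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf").fit(Xtr, ytr)
        print("held-out accuracy:", clf.score(Xte, yte))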

  6. Weldability and joining techniques for advanced fossil energy system alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Liu, W.; Yang, D.; Zhou, G.; Morrison, M.

    1998-05-01

    The efforts represent work toward a basic understanding of the weldability and fabricability of the advanced high-temperature alloys needed to increase the efficiency of the next generation of fossil energy power plants. The effort was divided into three tasks: the first dealt with the welding and fabrication behavior of 310HCbN (HR3C); the second detailed studies aimed at understanding the weldability of a newly developed 310TaN high-temperature stainless steel (a modification of 310 stainless); and the third addressed the cladding of austenitic tubing with iron aluminide using the GTAW process. Task 1 consisted of microstructural studies on 310HCbN and the development of a tube weldability test which has applications to production welding techniques as well as laboratory weldability assessments. In addition, ex-service 310HCbN, which showed fireside erosion and cracking at the attachment weld locations, was evaluated. Task 2 addressed the behavior of the newly developed 310TaN modification of standard 310 stainless steel and showed that the weldability was excellent and that the sensitization potential was minimal for normal welding and fabrication conditions. The microstructural evolution during elevated-temperature testing was characterized, and the second-phase particles that evolved upon aging were identified. Task 3 details the investigation undertaken to clad 310HCbN tubing with iron aluminide and developed the welding conditions necessary to provide a crack-free cladding. The work showed that both a preheat and a post-heat were necessary for crack-free deposits, and the effect of a third element on the cracking potential was defined together with the effect of the aluminum level for optimum weldability.

  7. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    SciTech Connect

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction. 10 refs., 7 figs.

  8. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layer model in the style of the International Organization for Standardization (ISO) reference model. The ICCS functional requirements, functional design, and detailed specifications, as well as each layer of the ICCS, are also described. A summary of results and suggestions for future work are presented.

  9. Experimental and computing strategies in advanced material characterization problems

    SciTech Connect

    Bolzon, G.

    2015-10-28

    The mechanical characterization of materials relies more and more often on sophisticated experimental methods that permit the acquisition of large amounts of data while reducing the invasiveness of the tests. This evolution accompanies the growing demand for non-destructive diagnostic tools that assess the safety level of components in use in structures and infrastructures, for instance in the strategic energy sector. Advanced material systems and properties that are not amenable to traditional techniques, for instance thin layered structures and their adhesion to the relevant substrates, can also be characterized by means of combined experimental-numerical tools elaborating data acquired by full-field measurement techniques. In this context, parameter identification procedures involve the repeated simulation of the laboratory or in situ tests by sophisticated and usually expensive non-linear analyses, while in some situations reliable and accurate results would be required in real time. The effectiveness and the filtering capabilities of reduced models based on decomposition and interpolation techniques can be profitably used to meet these conflicting requirements. This communication summarizes some results recently achieved in this field by the author and her co-workers. The aim is to foster further interaction between the engineering and mathematical communities.

  10. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity has been seen recently in the research and development community, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
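
    The quantity at stake is the failure probability P_f = P[g(X) <= 0] for a limit-state function g. A brute-force Monte Carlo baseline makes the contrast with safety factors concrete; the limit state and distributions below are illustrative assumptions, and the advanced methods surveyed (e.g., FORM/SORM) exist precisely to approximate this integral at a small fraction of the sampling cost:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 1_000_000

        # Illustrative limit state g = R - S: failure when load effect S exceeds
        # resistance R. Distributions and parameters are assumed, not from the paper.
        R = rng.lognormal(mean=np.log(300.0), sigma=0.08, size=n)   # resistance (MPa)
        S = rng.normal(loc=200.0, scale=25.0, size=n)               # load effect (MPa)

        g = R - S
        pf = np.mean(g <= 0.0)                  # probability of failure
        se = np.sqrt(pf * (1 - pf) / n)         # Monte Carlo standard error
        print(f"P_f ~ {pf:.2e} +/- {se:.1e}")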

  11. Reliability of an interactive computer program for advance care planning.

    PubMed

    Schubart, Jane R; Levi, Benjamin H; Camacho, Fabian; Whitehead, Megan; Farace, Elana; Green, Michael J

    2012-06-01

    Despite widespread efforts to promote advance directives (ADs), completion rates remain low. Making Your Wishes Known: Planning Your Medical Future (MYWK) is an interactive computer program that guides individuals through the process of advance care planning, explains health conditions and interventions that commonly involve life or death decisions, helps them articulate their values and goals, and translates users' preferences into a detailed AD document. The purpose of this study was to demonstrate that (in the absence of major life changes) the AD generated by MYWK reliably reflects an individual's values and preferences. English speakers ≥30 years old completed MYWK twice, 4 to 6 weeks apart. Reliability indices were assessed for three AD components: General Wishes; Specific Wishes for treatment; and Quality-of-Life values (QoL). Twenty-four participants completed the study. Both the Specific Wishes and QoL scales had high internal consistency in both time periods (Kuder-Richardson Formula 20 [KR-20] = 0.83-0.95 and 0.86-0.89). Test-retest reliability was perfect for General Wishes (κ = 1), high for QoL (Pearson's correlation coefficient = 0.83), but lower for Specific Wishes (Pearson's correlation coefficient = 0.57). MYWK generates an AD whose General Wishes and QoL (but not Specific Wishes) statements remain consistent over time.
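
    For reference, the KR-20 statistic behind the internal-consistency figures is straightforward to compute from a binary response matrix; the toy responses below are invented purely for illustration:

        import numpy as np

        def kr20(X):
            """Kuder-Richardson Formula 20 for dichotomous (0/1) item responses.
            X is a (respondents x items) binary matrix."""
            k = X.shape[1]
            p = X.mean(axis=0)                       # proportion scoring 1 per item
            q = 1.0 - p
            var_total = X.sum(axis=1).var(ddof=1)    # sample variance of total scores
            return (k / (k - 1)) * (1.0 - (p * q).sum() / var_total)

        # Toy responses: 6 respondents x 5 items (illustrative only).
        X = np.array([[1, 1, 1, 0, 1],
                      [1, 0, 1, 1, 1],
                      [0, 0, 1, 0, 0],
                      [1, 1, 1, 1, 1],
                      [0, 1, 0, 0, 1],
                      [1, 1, 1, 0, 1]])
        print("KR-20 =", round(kr20(X), 3))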

  12. Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing

    PubMed Central

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of resources and computation on a pay-as-you-go basis (together termed “cloud computing”) has recently emerged. The collective resources of the datacenter, including both hardware and software, can be made publicly available, in which case they are termed a public cloud, the resources being provided in a virtual mode to clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection, are discussed with reference to traditional workflows. PMID:23248640

  13. A computer graphics display and data compression technique

    NASA Technical Reports Server (NTRS)

    Teague, M. J.; Meyer, H. G.; Levenson, L. (Editor)

    1974-01-01

    The computer program discussed is intended for the graphical presentation of a general dependent variable X that is a function of two independent variables, U and V. The required input to the program is the variation of the dependent variable with one of the independent variables for various fixed values of the other. The computer program is named CRP, and the output is provided by the SD 4060 plotter. Program CRP is an extremely flexible program that offers the user a wide variety of options. The dependent variable may be presented in either a linear or a logarithmic manner. Automatic centering of the plot is provided in the ordinate direction, and the abscissa is scaled automatically for a logarithmic plot. A description of the carpet plot technique is given along with the coordinate system used in the program. Various aspects of the program logic are discussed and detailed documentation of the data card format is presented.
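
    The defining trick of a carpet plot is that the abscissa carries no direct meaning: each constant-V curve is shifted horizontally by a fixed offset, so the lattice of constant-U and constant-V lines displays X against two independent variables at once. A matplotlib sketch under assumed sample data (not the CRP/SD 4060 implementation):

        import numpy as np
        import matplotlib.pyplot as plt

        # Illustrative data: X as a function of U for several fixed values of V.
        U = np.linspace(0.0, 1.0, 6)
        V = np.array([1.0, 2.0, 3.0, 4.0])
        X = np.array([[u**2 + 0.5 * v for u in U] for v in V])

        k = 1.5   # abscissa stagger per V-curve; the carpet's hidden axis offset
        for j, v in enumerate(V):
            plt.plot(U + k * j, X[j], "b-")                      # constant-V curves
        for i, u in enumerate(U):
            plt.plot(u + k * np.arange(len(V)), X[:, i], "g-")   # constant-U curves
        plt.gca().xaxis.set_visible(False)   # the abscissa has no direct meaning
        plt.ylabel("X")
        plt.title("Carpet plot of X(U, V)")
        plt.show()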

  14. Experimental and Computational Techniques in Soft Condensed Matter Physics

    NASA Astrophysics Data System (ADS)

    Olafsen, Jeffrey

    2010-09-01

    Contents:
    1. Microscopy of soft materials (Eric R. Weeks)
    2. Computational methods to study jammed systems (Carl F. Schrek and Corey S. O'Hern)
    3. Soft random solids: particulate gels, compressed emulsions and hybrid materials (Anthony D. Dinsmore)
    4. Langmuir monolayers (Michael Dennin)
    5. Computer modeling of granular rheology (Leonardo E. Silbert)
    6. Rheological and microrheological measurements of soft condensed matter (John R. de Bruyn and Felix K. Oppong)
    7. Particle-based measurement techniques for soft matter (Nicholas T. Ouellette)
    8. Cellular automata models of granular flow (G. William Baxter)
    9. Photoelastic materials (Brian Utter)
    10. Image acquisition and analysis in soft condensed matter (Jeffrey S. Olafsen)
    11. Structure and patterns in bacterial colonies (Nicholas C. Darnton)

  15. Traffic simulations on parallel computers using domain decomposition techniques

    SciTech Connect

    Hanebutte, U.R.; Tentner, A.M.

    1995-12-31

    Large-scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. While this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study is presented that utilizes a scalable test network consisting of square grids, and that addresses the performance penalty introduced by the additional iteration loop.
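
    The need for the outer iteration loop is easy to see in a toy setting: when a problem is split into subdomains solved independently, each subdomain sees stale boundary values from its neighbors, so the global solve must sweep repeatedly until the interface values stop changing. A deliberately simple 1-D analogue (two overlapping subdomains; not TRAF-NETSIM):

        import numpy as np

        # Toy analogue of domain decomposition with an outer iteration loop:
        # u'' = 0 on [0, 1], u(0)=0, u(1)=1, split into two overlapping
        # subdomains that exchange interface values once per outer sweep.
        n = 21
        u = np.zeros(n)
        u[-1] = 1.0
        mid = n // 2

        def solve_subdomain(u, lo, hi, sweeps=100):
            """Relax one subdomain with its boundary values u[lo], u[hi] frozen."""
            for _ in range(sweeps):
                u[lo + 1:hi] = 0.5 * (u[lo:hi - 1] + u[lo + 2:hi + 1])

        for outer in range(500):
            prev = u.copy()
            solve_subdomain(u, 0, mid + 1)        # left subdomain
            solve_subdomain(u, mid - 1, n - 1)    # right subdomain
            if np.max(np.abs(u - prev)) < 1e-9:   # global convergence test
                break

        exact = np.linspace(0.0, 1.0, n)
        print("outer iterations:", outer + 1,
              "max error:", np.max(np.abs(u - exact)))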

  16. NDE of advanced turbine engine components and materials by computed tomography

    NASA Technical Reports Server (NTRS)

    Yancey, R. N.; Baaklini, George Y.; Klima, Stanley J.

    1991-01-01

    Computed tomography (CT) is an X-ray technique that provides quantitative 3D density information of materials and components and can accurately detail spatial distributions of cracks, voids, and density variations. CT scans of ceramic materials, composites, and engine components were taken and the resulting images are discussed. Scans were taken with two CT systems with different spatial resolution capabilities. The scans showed internal damage, density variations, and the geometrical arrangement of various features in the materials and components. It was concluded that CT can play an important role in the characterization of advanced turbine engine materials and components. Future applications of this technology are outlined.

  17. Investigation of joining techniques for advanced austenitic alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Kikuchi, Y.; Shi, C.; Gill, T.P.S.

    1991-05-01

    Modified Alloys 316 and 800H, designed for high temperature service, have been developed at Oak Ridge National Laboratory. Assessment of the weldability of the advanced austenitic alloys has been conducted at the University of Tennessee. Four aspects of weldability of the advanced austenitic alloys were included in the investigation.

  18. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.
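
    Both enhancement operations mentioned above have compact modern equivalents. A hedged numpy/scipy sketch of a linear contrast stretch and an unsharp-mask style high-pass filter, applied to a synthetic low-contrast image (the percentile and kernel choices are illustrative, not the Viking processing parameters):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def contrast_stretch(img, lo_pct=1.0, hi_pct=99.0):
            """Linear stretch: map the chosen histogram percentiles to [0, 1]."""
            lo, hi = np.percentile(img, [lo_pct, hi_pct])
            return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

        def high_pass(img, size=15, weight=0.7):
            """Boost detail by subtracting a fraction of the local mean."""
            background = uniform_filter(img, size=size)
            return img - weight * background

        # Illustrative use on a synthetic low-contrast "image":
        rng = np.random.default_rng(0)
        img = 0.4 + 0.05 * rng.random((64, 64))
        enhanced = contrast_stretch(high_pass(img))
        print(enhanced.min(), enhanced.max())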

  19. Using advanced computer vision algorithms on small mobile robots

    NASA Astrophysics Data System (ADS)

    Kogut, G.; Birchmore, F.; Biagtan Pacis, E.; Everett, H. R.

    2006-05-01

    The Technology Transfer project employs a spiral development process to enhance the functionality and autonomy of mobile robot systems in the Joint Robotics Program (JRP) Robotic Systems Pool by converging existing component technologies onto a transition platform for optimization. An example of this approach is the implementation of advanced computer vision algorithms on small mobile robots. We demonstrate the implementation and testing of the following two algorithms useful on mobile robots: 1) object classification using a boosted cascade of classifiers trained with the AdaBoost algorithm, and 2) human presence detection from a moving platform. Object classification is performed with an AdaBoost training system developed at the University of California, San Diego (UCSD) Computer Vision Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real time. While working towards a solution to increase the robustness of this system to perform generic object recognition, this paper demonstrates an extension of this application by detecting soda cans in a cluttered indoor environment. The human presence detection system uses a data fusion algorithm which combines results from a scanning laser and a thermal imager, and is able to detect humans while both the humans and the robot are moving simultaneously. In both systems, the two algorithms were implemented on embedded hardware and optimized for use in real time. Test results are shown for a variety of environments.
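
    The boosted-cascade detector family is now packaged in mainstream libraries, which makes the kind of embedded deployment described above easy to prototype. A sketch using OpenCV's cascade API with a stock face cascade as a stand-in (the UCSD soda-can classifier is not publicly distributed, and "scene.jpg" is a hypothetical input frame):

        import cv2

        # Load a pre-trained boosted cascade (OpenCV ships several Haar cascades);
        # a cascade trained on soda cans would be loaded the same way.
        cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
        detector = cv2.CascadeClassifier(cascade_path)

        img = cv2.imread("scene.jpg")                     # hypothetical input frame
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

        # Each cascade stage quickly rejects non-object windows, which is what
        # makes boosted cascades fast enough for real-time use on small robots.
        boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in boxes:
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("detections.jpg", img)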

  20. Advances in the computational study of language acquisition.

    PubMed

    Brent, M R

    1996-01-01

    This paper provides a tutorial introduction to computational studies of how children learn their native languages. Its aim is to make recent advances accessible to the broader research community, and to place them in the context of current theoretical issues. The first section locates computational studies and behavioral studies within a common theoretical framework. The next two sections review two papers that appear in this volume: one on learning the meanings of words and one on learning the sounds of words. The following section highlights an idea which emerges independently in these two papers and which I have dubbed autonomous bootstrapping. Classical bootstrapping hypotheses propose that children begin to get a toe-hold in a particular linguistic domain, such as syntax, by exploiting information from another domain, such as semantics. Autonomous bootstrapping complements the cross-domain acquisition strategies of classical bootstrapping with strategies that apply within a single domain. Autonomous bootstrapping strategies work by representing partial and/or uncertain linguistic knowledge and using it to analyze the input. The next two sections review two more contributions to this special issue: one on learning word meanings via selectional preferences and one on algorithms for setting grammatical parameters. The final section suggests directions for future research.

  1. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple-year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to problems of importance to NASA's missions.

  2. Soft Computing Techniques for the Protein Folding Problem on High Performance Computing Architectures.

    PubMed

    Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M

    2016-01-01

    The protein-folding problem has been extensively studied during the last fifty years. Understanding the dynamics of the global shape of a protein and its influence on biological function can help us to discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed by different researchers in order to foresee the three-dimensional arrangement of the atoms of proteins from their sequences. However, the computational complexity of this problem makes the search for new models, novel algorithmic strategies and hardware platforms that provide solutions in a reasonable time frame mandatory. In this review we present past and recent trends in protein folding simulation from both perspectives: hardware and software. Of particular interest are the use of inexact solutions to this computationally hard problem and the hardware platforms that have been used for running this kind of soft computing technique.
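
    Simulated annealing is a representative example of the inexact, soft computing strategies referred to above: it trades guaranteed optimality for a tractable stochastic search over conformations. A toy sketch on a stand-in one-dimensional energy landscape (a real study would use a lattice or force-field energy):

        import math
        import random

        random.seed(42)

        def energy(x):
            """Stand-in for a rugged protein conformational energy landscape."""
            return 0.1 * x * x + math.sin(5.0 * x)

        def simulated_annealing(x0, T0=2.0, cooling=0.999, steps=20000):
            x, e = x0, energy(x0)
            best_x, best_e = x, e
            T = T0
            for _ in range(steps):
                cand = x + random.gauss(0.0, 0.5)   # propose a conformation change
                de = energy(cand) - e
                # Metropolis criterion: always accept downhill, sometimes uphill.
                if de < 0 or random.random() < math.exp(-de / T):
                    x, e = cand, e + de
                if e < best_e:
                    best_x, best_e = x, e
                T *= cooling                         # cool the system slowly
            return best_x, best_e

        print(simulated_annealing(x0=4.0))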

  3. Computational technique to overcome the limit of a photomask writer

    NASA Astrophysics Data System (ADS)

    Choi, Jin; Shin, In Kyun; Jeon, Chan-Uk

    2014-04-01

    We present the limits of a conventional photomask writer and new possibilities for meeting the tight specifications of sub-10 nm devices. The issues of a variable shaped beam (VSB) writer, and how its limits can be overcome by computational techniques, are discussed. Because VSB writing can use only one rectangular or triangular beam per shot, the complex designs produced by computational lithography increase the shot count needed to implement rounded or angled patterns. Based on model-based fracturing, we have confirmed that an ideal curvilinear pattern can be optimized by using overlapping shots, and that they have the same patterning performance in mask and wafer. A multibeam writer, on the other hand, can in principle write any shape, even curvilinear designs, because a design is composed from a combination of small spots. In practice, each spot of a multibeam writer is defined on a fixed mesh and each beam has a discrete dose level, so fidelity errors arise if the pixel size is too large or the number of dose levels is too small. Here, we propose a "Buddhist cross" design as the evaluation pattern for digitization error in a multibeam writer. Keeping the fidelity error below 0.5 nm requires a 5 nm pixel size, and a minimum of 7 dose levels is required to hold the error at a single edge below 0.05 nm. To bring this technology to mass production, a new data flow, model-based pattern verification, and the required computing power are presented.

  4. Computational techniques for flows with finite-rate condensation

    NASA Technical Reports Server (NTRS)

    Candler, Graham V.

    1993-01-01

    A computational method to simulate the inviscid two-dimensional flow of a two-phase fluid was developed. This computational technique treats the gas phase and each of a prescribed number of particle sizes as separate fluids which are allowed to interact with one another. Thus, each particle-size class is allowed to move through the fluid at its own velocity at each point in the flow field. Mass, momentum, and energy are exchanged between each particle class and the gas phase. It is assumed that the particles do not collide with one another, so that there is no inter-particle exchange of momentum and energy. However, the particles are allowed to grow, and therefore, they may change from one size class to another. Appropriate rates of mass, momentum, and energy exchange between the gas and particle phases and between the different particle classes were developed. A numerical method was developed for use with this equation set. Several test cases were computed and show qualitative agreement with previous calculations.

  5. Advanced techniques for array processing. Final report, 1 Mar 89-30 Apr 91

    SciTech Connect

    Friedlander, B.

    1991-05-30

    Array processing technology is expected to be a key element in communication systems designed for the crowded and hostile environment of the future battlefield. While advanced array processing techniques have been under development for some time, their practical use has been very limited. This project addressed some of the issues which need to be resolved for a successful transition of these promising techniques from theory into practice. The main problem which was studied was that of finding the directions of multiple co-channel transmitters from measurements collected by an antenna array. Two key issues related to high-resolution direction finding were addressed: effects of system calibration errors, and effects of correlation between the received signals due to multipath propagation. A number of useful theoretical performance analysis results were derived, and computationally efficient direction estimation algorithms were developed. These results include: self-calibration techniques for antenna arrays, sensitivity analysis for high-resolution direction finding, extensions of the root-MUSIC algorithm to arbitrary arrays and to arrays with polarization diversity, and new techniques for direction finding in the presence of multipath based on array interpolation. (Author)
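
    The flavor of these high-resolution direction finders is captured by the basic MUSIC estimator: eigendecompose the sample covariance of the array snapshots, then locate directions whose steering vectors are most nearly orthogonal to the noise subspace. A numpy sketch for a uniform linear array with two assumed emitters (the array geometry, SNR and angles are illustrative choices, not values from the report):

        import numpy as np

        rng = np.random.default_rng(0)

        # Uniform linear array: M sensors at half-wavelength spacing.
        M, d = 8, 0.5
        def steering(theta_deg):
            theta = np.deg2rad(theta_deg)
            return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

        # Two co-channel emitters at assumed directions; N noisy snapshots.
        angles_true, N, snr = [-12.0, 23.0], 400, 10.0
        A = np.column_stack([steering(a) for a in angles_true])
        S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
        noise = 10 ** (-snr / 20) * (rng.standard_normal((M, N))
                                     + 1j * rng.standard_normal((M, N)))
        X = A @ S + noise

        R = X @ X.conj().T / N                 # sample covariance
        w, V = np.linalg.eigh(R)               # eigenvalues in ascending order
        En = V[:, : M - 2]                     # noise subspace (2 sources assumed)

        grid = np.linspace(-90.0, 90.0, 1801)
        p = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(a)) ** 2
                      for a in grid])          # MUSIC pseudospectrum
        peaks = np.where((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:]))[0] + 1
        top = peaks[np.argsort(p[peaks])[-2:]]
        print("estimated DOAs (deg):", np.sort(grid[top]))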

  6. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: To engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  7. Advances in x-ray computed microtomography at the NSLS

    SciTech Connect

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility, which has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 x 929 pixel² slice on an R10,000 CPU, a more than 8x reduction compared with the filtered back-projection method.
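
    For orientation, the conventional filtered back-projection baseline that the FFBT is timed against can be reproduced in a few lines with scikit-image (a stand-in for the beamline software; the phantom and geometry are illustrative):

        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, rescale

        # Baseline filtered back-projection, the method the FFBT is benchmarked
        # against (filter_name is the keyword in recent scikit-image versions).
        image = rescale(shepp_logan_phantom(), 0.5)      # 200 x 200 test slice
        theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
        sinogram = radon(image, theta=theta)             # simulated projections
        recon = iradon(sinogram, theta=theta, filter_name="ramp")
        print("RMS error:", np.sqrt(np.mean((recon - image) ** 2)))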

  8. ADVANCES IN X-RAY COMPUTED MICROTOMOGRAPHY AT THE NSLS.

    SciTech Connect

    DOWD,B.A.

    1998-08-07

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines, from industrial materials processing to environmental science. The most recent applications are presented here, as well as a description of the facility, which has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented that is based on a refinement of the "gridding" algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 s for a 929 x 929 pixel² slice on an R10,000 CPU, a more than 8x reduction compared with the filtered back-projection method.

  9. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    SciTech Connect

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  10. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    SciTech Connect

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Environmental management problems typically require the analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigrees. These big data sets require on-the-fly integration into a series of models of differing complexity for various types of model analyses, where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and models are associated with uncertainties. The uncertainties are probabilistic (e.g., measurement errors) and non-probabilistic (unknowns, e.g., alternative conceptual models characterizing site conditions). To address all of these issues, we have developed ZEM, an integrated framework for real-time data and model analyses for environmental decision-making. The framework allows for seamless, on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision-analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced optimization methods capable of dealing with a large number of unknown model parameters, and surrogate (reduced order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
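
    The surrogate-modeling idea is independent of ZEM's Julia implementation: an expensive simulator is sampled offline and replaced by a cheap regression online. A Python sketch using support vector regression on a stand-in model (all function names and parameters here are illustrative, not part of ZEM):

        import numpy as np
        from sklearn.svm import SVR

        def expensive_model(x):
            """Stand-in for a costly contaminant-transport simulation."""
            return np.sin(3.0 * x) * np.exp(-0.3 * x)

        rng = np.random.default_rng(0)
        X_train = rng.uniform(0.0, 5.0, size=(40, 1))
        y_train = expensive_model(X_train).ravel()

        # Offline fit; online predictions then cost microseconds per query.
        surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_train, y_train)

        X_test = np.linspace(0.0, 5.0, 200).reshape(-1, 1)
        err = np.max(np.abs(surrogate.predict(X_test)
                            - expensive_model(X_test).ravel()))
        print("max surrogate error:", round(err, 4))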

  11. Recent advances in sample preparation techniques for effective bioanalytical methods.

    PubMed

    Kole, Prashant Laxman; Venkatesh, Gantala; Kotecha, Jignesh; Sheshala, Ravi

    2011-01-01

    This paper reviews the recent developments in bioanalysis sample preparation techniques and gives an update on basic principles, theory, applications and possibilities for automation, together with a comparative discussion of the advantages and limitations of each technique. Conventional liquid-liquid extraction (LLE), protein precipitation (PP) and solid-phase extraction (SPE) techniques are now considered methods of the past. The last decade has witnessed a rapid development of novel sample preparation techniques in bioanalysis. Developments in SPE techniques such as selective sorbents and in the overall approach to SPE, such as hybrid SPE and molecularly imprinted polymer SPE, have been addressed. Considerable literature has been published in the area of solid-phase micro-extraction and its different versions, e.g. stir bar sorptive extraction, and their application in the development of selective and sensitive bioanalytical methods. Techniques such as dispersive solid-phase extraction, disposable pipette extraction and micro-extraction by packed sorbent offer a variety of extraction phases and provide unique advantages to bioanalytical methods. On-line SPE utilizing column-switching techniques is rapidly gaining acceptance in bioanalytical applications. PP sample preparation techniques such as PP filter plates/tubes offer many advantages, such as removal of phospholipids and proteins in plasma/serum. Newer approaches to conventional LLE techniques (salting-out LLE) are also covered in this review article.

  12. Block sparse Cholesky algorithms on advanced uniprocessor computers

    SciTech Connect

    Ng, E.G.; Peyton, B.W.

    1991-12-01

    As with many other linear algebra algorithms, devising a portable implementation of sparse Cholesky factorization that performs well on the broad range of computer architectures currently available is a formidable challenge. Even after limiting our attention to machines with only one processor, as we have done in this report, there are still several interesting issues to consider. For dense matrices, it is well known that block factorization algorithms are the best means of achieving this goal. We take this approach for sparse factorization as well. This paper has two primary goals. First, we examine two sparse Cholesky factorization algorithms, the multifrontal method and a blocked left-looking sparse Cholesky method, in a systematic and consistent fashion, both to illustrate the strengths of the blocking techniques in general and to obtain a fair evaluation of the two approaches. Second, we assess the impact of various implementation techniques on time and storage efficiency, paying particularly close attention to the work-storage requirements of the two methods and their variants.
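
    As a point of reference for the blocking idea, the dense right-looking variant below (a Python/NumPy sketch, not the paper's sparse multifrontal or left-looking codes) shows how the trailing update becomes a matrix-matrix operation, which is where blocked factorizations gain their efficiency:

        import numpy as np

        def blocked_cholesky(A, nb=64):
            """Right-looking blocked Cholesky; returns the lower factor L."""
            A = A.copy()
            n = A.shape[0]
            for k in range(0, n, nb):
                e = min(k + nb, n)
                A[k:e, k:e] = np.linalg.cholesky(A[k:e, k:e])   # factor diagonal block
                if e < n:
                    # Panel solve: rows below become A[e:, k:e] @ inv(L_kk).T.
                    A[e:, k:e] = np.linalg.solve(A[k:e, k:e], A[e:, k:e].T).T
                    # Rank-nb trailing update (the matrix-matrix, BLAS-3 step).
                    A[e:, e:] -= A[e:, k:e] @ A[e:, k:e].T
            return np.tril(A)

        # Quick check against NumPy's unblocked factorization.
        M = np.random.default_rng(0).standard_normal((300, 300))
        A = M @ M.T + 300.0 * np.eye(300)                        # make it SPD
        assert np.allclose(blocked_cholesky(A), np.linalg.cholesky(A))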

  13. Advanced rehabilitation techniques for the multi-limb amputee.

    PubMed

    Harvey, Zach T; Loomis, Gregory A; Mitsch, Sarah; Murphy, Ian C; Griffin, Sarah C; Potter, Benjamin K; Pasquina, Paul

    2012-01-01

    Advances in combat casualty care have contributed to unprecedented survival rates of battlefield injuries, challenging the field of rehabilitation to help injured service members achieve maximal functional recovery and independence. Nowhere is this better illustrated than in the care of the multiple-limb amputee. Specialized medical, surgical, and rehabilitative interventions are needed to optimize the care of this unique patient population. This article describes lessons learned at Walter Reed National Military Medical Center Bethesda in providing advanced therapy and prosthetics for combat casualties, but provides guidelines for all providers involved in the care of individuals with amputation.

  14. Advanced froth flotation techniques for fine coal cleaning

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1994-12-31

    Advanced column flotation cells offer many potential advantages for the treatment of fine coal. The most important of these is the ability to achieve high separation efficiencies using only a single stage of processing. Unfortunately, industrial flotation columns often suffer from poor recovery, low throughput and high maintenance requirements as compared to mechanically-agitated conventional cells. These problems can usually be attributed to poorly-designed air sparging systems. This article examines the problems of air sparging in greater detail and offers useful guidelines for designing bubble generators for industrial flotation columns. The application of these principles in the design of a successful advanced fine coal flotation circuit is also presented.

  15. Computer-aided classification of lung nodules on computed tomography images via deep learning technique

    PubMed Central

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and tuning of performance in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. PMID:26346558
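
    The paper's exact architectures are not given in this record; a minimal sketch of the convolutional approach, with illustrative layer sizes and synthetic data, might look like the following in Python/PyTorch:

        import torch
        import torch.nn as nn

        class NoduleCNN(nn.Module):
            """Tiny CNN over 64x64 CT patches; two-class (benign/malignant) output."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(32 * 13 * 13, 2)

            def forward(self, x):               # x: (batch, 1, 64, 64)
                return self.classifier(self.features(x).flatten(1))

        model = NoduleCNN()
        patches = torch.randn(8, 1, 64, 64)     # synthetic stand-ins for CT patches
        logits = model(patches)                 # (8, 2) class scores
        loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 2, (8,)))
        loss.backward()                         # features and classifier tune end to end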

  17. Analytic Syntax: A Technique for Advanced Level Reading

    ERIC Educational Resources Information Center

    Berman, Ruth

    1975-01-01

    The technique explained here can increase a foreign student's awareness of English grammatical and rhetorical structures. Structural paraphrase is a syntactic reformulation of difficult phrases with minimal vocabulary changes. The technique is illustrated and suggestions are given for class presentation. (CHK)

  18. Computer vision techniques for rotorcraft low-altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Cheng, Victor H. L.

    1988-01-01

    A description is given of research that applies techniques from computer vision to automation of rotorcraft navigation. The effort emphasizes the development of a methodology for detecting the ranges to obstacles in the region of interest based on the maximum utilization of passive sensors. The range map derived from the obstacle detection approach can be used as obstacle data for obstacle avoidance in an automatic guidance system and as an advisory display to the pilot. The lack of suitable flight imagery data, however, presents a problem in the verification of concepts for obstacle detection. This problem is being addressed by the development of an adequate flight database and by preprocessing of currently available flight imagery. Some comments are made on future work and how research in this area relates to the guidance of other autonomous vehicles.

  19. System Design Techniques for Reducing the Power Requirements of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Finn, Cory; Levri, Julie; Pawlowski, Chris; Crawford, Sekou; Luna, Bernadette (Technical Monitor)

    2000-01-01

    The high power requirement associated with overall operation of regenerative life support systems is a critical technological challenge. Optimization of individual processors alone will not be sufficient to produce an optimized system. System studies must be used in order to improve the overall efficiency of life support systems. Current research efforts at NASA Ames Research Center are aimed at developing approaches for reducing system power and energy usage in advanced life support systems. System energy integration and energy reuse techniques are being applied to advanced life support, in addition to advanced control methods for efficient distribution of power and thermal resources. An overview of current results of this work will be presented. The development of integrated system designs that reuse waste heat from sources such as crop lighting and solid waste processing systems will reduce overall power and cooling requirements. Using an energy integration technique known as Pinch analysis, system heat exchange designs are being developed that match hot and cold streams according to specific design principles. For various designs, the potential savings for power, heating and cooling are being identified and quantified. The use of state-of-the-art control methods for distribution of resources, such as system cooling water or electrical power, will also reduce overall power and cooling requirements. Control algorithms are being developed which dynamically adjust the use of system resources by the various subsystems and components in order to achieve an overall goal, such as smoothing of power usage and/or heat rejection profiles, while maintaining adequate reserves of food, water, oxygen, and other consumables, and preventing excessive build-up of waste materials. Reductions in the peak loading of the power and thermal systems will lead to lower overall requirements. Computer simulation models are being used to test various control system designs.
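
    The Pinch analysis mentioned above follows the standard problem-table cascade; the Python sketch below illustrates it with hypothetical stream data and ΔTmin, and is textbook Pinch analysis rather than the NASA tool itself:

        from dataclasses import dataclass

        @dataclass
        class Stream:
            t_supply: float   # supply temperature, degC
            t_target: float   # target temperature, degC
            cp: float         # heat capacity flowrate, kW/K

        def pinch(streams, dt_min=10.0):
            """Problem-table cascade; returns (hot utility, cold utility, pinch T)."""
            # Shift hot streams down and cold streams up by dt_min / 2.
            shifted = [(s.t_supply - dt_min / 2, s.t_target - dt_min / 2, s.cp)
                       if s.t_supply > s.t_target else
                       (s.t_supply + dt_min / 2, s.t_target + dt_min / 2, s.cp)
                       for s in streams]
            temps = sorted({t for s in shifted for t in s[:2]}, reverse=True)
            cascade = [0.0]
            for hi, lo in zip(temps, temps[1:]):
                net = 0.0
                for t1, t2, cp in shifted:
                    if min(t1, t2) <= lo and max(t1, t2) >= hi:
                        # Hot streams add surplus heat; cold streams consume it.
                        net += cp * (hi - lo) if t1 > t2 else -cp * (hi - lo)
                cascade.append(cascade[-1] + net)
            hot_utility = max(0.0, -min(cascade))
            cold_utility = hot_utility + cascade[-1]
            pinch_t = temps[cascade.index(min(cascade))]   # in shifted temperature units
            return hot_utility, cold_utility, pinch_t

        # Hypothetical stream data: two hot streams, two cold streams.
        streams = [Stream(170, 60, 3.0), Stream(150, 30, 1.5),
                   Stream(20, 135, 2.0), Stream(80, 140, 4.0)]
        print(pinch(streams))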

  20. Recent advances in microscopic techniques for visualizing leukocytes in vivo

    PubMed Central

    Jain, Rohit; Tikoo, Shweta; Weninger, Wolfgang

    2016-01-01

    Leukocytes are inherently motile and interactive cells. Recent advances in intravital microscopy approaches have enabled a new vista of their behavior within intact tissues in real time. This brief review summarizes the developments enabling the tracking of immune responses in vivo. PMID:27239292

  1. Bricklaying Curriculum: Advanced Bricklaying Techniques. Instructional Materials. Revised.

    ERIC Educational Resources Information Center

    Turcotte, Raymond J.; Hendrix, Laborn J.

    This curriculum guide is designed to assist bricklaying instructors in providing performance-based instruction in advanced bricklaying. Included in the first section of the guide are units on customized or architectural masonry units; glass block; sills, lintels, and copings; and control (expansion) joints. The next two units deal with cut,…

  2. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  3. A dual-frequency applied potential tomography technique: computer simulations.

    PubMed

    Griffiths, H; Ahmed, A

    1987-01-01

    Applied potential tomography has been discussed in relation to both static and dynamic imaging. We have investigated the feasibility of obtaining static images by measuring profiles at two frequencies of drive current to exploit the differing gradients of electrical conductivity with frequency for different tissues. This method has the advantages that no profile for the homogeneous medium is then needed, and the electrodes can be coupled directly to the skin. To demonstrate the principle, computer simulations have been carried out using published electrical parameters for mammalian tissues at frequencies of 100 and 150 kHz. The distribution of complex electric potentials was calculated by the successive over-relaxation method in two dimensions for an abdominal cross-section with 16 electrodes equally spaced around the surface. From the computed electrode potentials, images were reconstructed using a back-projection method (neglecting phase information). Liver and kidney appeared most distinctly on the image because of their comparatively large conductivity gradients. The perturbations in the electrode potential differences between the two frequencies had a mean value of 5%, requiring accurate measurement in a practical system, compared with 150% when the 100 kHz values were related to a simulation of homogeneous saline equal in conductivity to muscle. The perturbations could be increased by widening the separation of the frequencies. Static imaging using a dual-frequency technique appears to be feasible, but a more detailed consideration of the electrical properties of tissues is needed to determine the optimum choice of frequencies.
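
    The core of such a simulation is the relaxation solve; the sketch below (Python, real-valued for brevity where the paper solves complex potentials, with an illustrative conductance weighting) shows a successive over-relaxation update on a 2-D grid:

        import numpy as np

        def sor_potential(sigma, v_fixed, mask, omega=1.8, iters=2000):
            """Successive over-relaxation for a 2-D potential distribution.

            sigma: cell conductivities; v_fixed: electrode potentials (NaN where
            unconstrained); mask: True inside the body.
            """
            v = np.zeros_like(sigma, dtype=float)
            fixed = ~np.isnan(v_fixed)
            v[fixed] = v_fixed[fixed]
            for _ in range(iters):
                for i in range(1, v.shape[0] - 1):
                    for j in range(1, v.shape[1] - 1):
                        if not mask[i, j] or fixed[i, j]:
                            continue
                        # Conductance-weighted average of the four neighbours.
                        w = np.array([sigma[i-1, j], sigma[i+1, j],
                                      sigma[i, j-1], sigma[i, j+1]])
                        nbr = np.array([v[i-1, j], v[i+1, j], v[i, j-1], v[i, j+1]])
                        gs = (w @ nbr) / w.sum()
                        v[i, j] += omega * (gs - v[i, j])   # over-relaxed update
            return v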

  4. Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes

    NASA Astrophysics Data System (ADS)

    Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd

    2016-04-01

    In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). A part of this project included 277 full-scale drop tests at three different quarries in Austria and recording of key parameters of the rock fall trajectories. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock belonging to igneous, metamorphic and volcanic types. In this paper the results of the tests are used for calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. Selecting two parameters, advanced calibration techniques, including the Markov chain Monte Carlo technique, maximum likelihood and root mean square error (RMSE) minimization, are utilized to minimize the error. Validation of the model based on the cross-validation technique reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95% and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.
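
    The record does not give the calibration code; a minimal random-walk Metropolis sketch in Python, assuming the reported lognormal error model and using placeholder parameter bounds and error spread, might look like:

        import numpy as np

        def calibrate(simulate, observed, n_steps=20000, step=0.05, log_sigma=0.25):
            """Random-walk Metropolis over two bounded model parameters.

            The model error is reported as lognormal, so the likelihood here is
            Gaussian in log(observed / simulated); simulate() and the parameter
            meanings (e.g. restitution coefficients) are placeholders.
            """
            rng = np.random.default_rng(1)

            def log_post(theta):
                if not ((theta > 0.0) & (theta < 1.0)).all():
                    return -np.inf                  # flat prior on (0, 1)^2
                r = np.log(observed / simulate(theta))
                return -0.5 * np.sum((r / log_sigma) ** 2)

            theta = np.array([0.5, 0.5])
            lp = log_post(theta)
            samples = []
            for _ in range(n_steps):
                prop = theta + rng.normal(0.0, step, size=2)
                lp_prop = log_post(prop)
                if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept step
                    theta, lp = prop, lp_prop
                samples.append(theta.copy())
            return np.array(samples)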

  5. An experimental modal testing/identification technique for personal computers

    NASA Technical Reports Server (NTRS)

    Roemer, Michael J.; Schlonski, Steven T.; Mook, D. Joseph

    1990-01-01

    A PC-based system for mode shape identification is evaluated. A time-domain modal identification procedure is utilized to identify the mode shapes of a beam apparatus from discrete time-domain measurements. The apparatus includes a cantilevered aluminum beam, four accelerometers, four low-pass filters, and the computer. The method combines an identification algorithm, the Eigensystem Realization Algorithm (ERA), with an estimation algorithm called Minimum Model Error (MME). The identification ability of this algorithm is compared with ERA alone, a frequency-response-function technique, and an Euler-Bernoulli beam model. Detection of modal parameters and mode shapes by the PC-based time-domain system is shown to be accurate in an application with an aluminum beam, while mode shapes identified by the frequency-domain technique are not as accurate as predicted. The new method is shown to be significantly less sensitive to noise and poorly excited modes than other leading methods. The results support the use of time-domain identification systems for mode shape prediction.
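
    The ERA stage is compact enough to sketch; the Python/NumPy version below is textbook ERA from impulse-response (Markov parameter) samples, without the MME noise-mitigation stage the paper adds:

        import numpy as np

        def era(markov_params, order):
            """Eigensystem Realization Algorithm from Markov parameters.

            markov_params: sequence Y[k] of (outputs x inputs) arrays, k = 1..2s.
            Returns a discrete-time realization (A, B, C) of the given order.
            """
            s = len(markov_params) // 2
            m, r = markov_params[0].shape
            # Block Hankel matrices H(0) and H(1).
            H0 = np.block([[markov_params[i + j] for j in range(s)] for i in range(s)])
            H1 = np.block([[markov_params[i + j + 1] for j in range(s)] for i in range(s)])
            U, sv, Vt = np.linalg.svd(H0, full_matrices=False)
            U, sv, Vt = U[:, :order], sv[:order], Vt[:order]
            sq = np.sqrt(sv)
            A = (U / sq).T @ H1 @ (Vt.T / sq)       # balanced realization of the dynamics
            B = (sq[:, None] * Vt)[:, :r]
            C = (U * sq)[:m, :]
            # Modal frequencies/damping follow from np.log(np.linalg.eigvals(A)) / dt.
            return A, B, C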

  6. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  7. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    SciTech Connect

    McCoy, M; Kusnezov, D; Bickel, T; Hopson, J

    2007-04-25

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  8. Advanced Simulation and Computing FY07-08 Implementation Plan Volume 2

    SciTech Connect

    Kusnezov, D; Hale, A; McCoy, M; Hopson, J

    2006-06-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  9. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  10. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  11. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  12. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  13. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  14. Backscattered Electron Microscopy as an Advanced Technique in Petrography.

    ERIC Educational Resources Information Center

    Krinsley, David Henry; Manley, Curtis Robert

    1989-01-01

    Three uses of this method with sandstone, desert varnish, and granite weathering are described. Background information on this technique is provided. Advantages of this type of microscopy are stressed. (CW)

  15. Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR

    DTIC Science & Technology

    2014-07-12

    Statistical signal processing techniques were derived for ground penetrating radars for the detection of subsurface objects that are low in metal content and hard to detect. Related publication: T. Glenn, J. Wilson, D. Ho, "A Multimodal Matching Pursuits Dissimilarity Measure Applied to Landmine/Clutter Discrimination."

  16. Electroextraction and electromembrane extraction: Advances in hyphenation to analytical techniques

    PubMed Central

    Oedit, Amar; Ramautar, Rawi; Hankemeier, Thomas

    2016-01-01

    Electroextraction (EE) and electromembrane extraction (EME) are sample preparation techniques that both require an electric field that is applied over a liquid‐liquid system, which enables the migration of charged analytes. Furthermore, both techniques are often used to pre‐concentrate analytes prior to analysis. In this review an overview is provided of the body of literature spanning April 2012–November 2015 concerning EE and EME, focused on hyphenation to analytical techniques. First, the theoretical aspects of concentration enhancement in EE and EME are discussed to explain extraction recovery and enrichment factor. Next, overviews are provided of the techniques based on their hyphenation to LC, GC, CE, and direct detection. These overviews cover the compounds and matrices, experimental aspects (i.e. donor volume, acceptor volume, extraction time, extraction voltage, and separation time) and the analytical aspects (i.e. limit of detection, enrichment factor, and extraction recovery). Techniques that were either hyphenated online to analytical techniques or show high potential with respect to online hyphenation are highlighted. Finally, the potential future directions of EE and EME are discussed. PMID:26864699

  17. Advanced millimeter-wave security portal imaging techniques

    NASA Astrophysics Data System (ADS)

    Sheen, David M.; Bernacki, Bruce E.; McMakin, Douglas L.

    2012-03-01

    Millimeter-wave (mm-wave) imaging is rapidly gaining acceptance as a security tool to augment conventional metal detectors and baggage x-ray systems for passenger screening at airports and other secured facilities. This acceptance indicates that the technology has matured; however, many potential improvements can yet be realized. The authors have developed a number of techniques over the last several years including novel image reconstruction and display techniques, polarimetric imaging techniques, array switching schemes, and high-frequency high-bandwidth techniques. All of these may improve the performance of new systems; however, some of these techniques will increase the cost and complexity of the mm-wave security portal imaging systems. Reducing this cost may require the development of novel array designs. In particular, RF photonic methods may provide new solutions to the design and development of the sequentially switched linear mm-wave arrays that are the key element in the mm-wave portal imaging systems. High-frequency, high-bandwidth designs are difficult to achieve with conventional mm-wave electronic devices, and RF photonic devices may be a practical alternative. In this paper, the mm-wave imaging techniques developed at PNNL are reviewed and the potential for implementing RF photonic mm-wave array designs is explored.

  18. Coal and char studies by advanced EMR techniques

    SciTech Connect

    Belford, R.L.; Clarkson, R.B.; Odintsov, B.M.

    1999-03-31

    Advanced magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. During this grant period, further progress was made on proton NMR and low-frequency dynamic nuclear polarization (DNP) to examine the interaction between fluids such as water and the surface of suspended char particles. Effects of char particle size and type on water nuclear spin relaxation, T2, were measured and modeled.

  19. Coal and char studies by advanced EMR techniques

    SciTech Connect

    Belford, R.L.; Clarkson, R.B.; Odintsov, B.M.

    1998-09-30

    Advanced magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. During this grant period, further progress was made on proton NMR and low-frequency dynamic nuclear polarization (DNP) to examine the interaction between fluids such as water and the surface of suspended char particles. Effects of char particle size on water nuclear spin relaxation, T2, were measured.

  20. ADVANCED COMPUTATIONAL MODEL FOR THREE-PHASE SLURRY REACTORS

    SciTech Connect

    Goodarz Ahmadi

    2004-10-01

    In this project, an Eulerian-Lagrangian formulation for analyzing three-phase slurry flows in a bubble column was developed. The approach used an Eulerian analysis of liquid flows in the bubble column, and made use of the Lagrangian trajectory analysis for the bubble and particle motions. The bubble-bubble and particle-particle collisions are included in the model. The model predictions were compared with the experimental data and good agreement was found. An experimental setup for studying two-dimensional bubble columns was developed. The multiphase flow conditions in the bubble column were measured using optical image processing and Particle Image Velocimetry (PIV) techniques. A simple shear flow device for bubble motion in a constant shear flow field was also developed. The flow conditions in the simple shear flow device were studied using the PIV method. Concentration and velocity of particles of different sizes near a wall in a duct flow were also measured. The technique of phase-Doppler anemometry was used in these studies. An Eulerian volume of fluid (VOF) computational model for the flow condition in the two-dimensional bubble column was also developed. The liquid and bubble motions were analyzed and the results were compared with observed flow patterns in the experimental setup. Solid-fluid mixture flows in ducts and passages at different angles of orientation were also analyzed. The model predictions were compared with the experimental data and good agreement was found. Gravity chute flows of solid-liquid mixtures were also studied. The simulation results were compared with the experimental data and discussed. A thermodynamically consistent model for multiphase slurry flows with and without chemical reaction in a state of turbulent motion was developed. The balance laws were obtained and the constitutive laws established.
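
    As a small illustration of the Lagrangian trajectory side of such an Eulerian-Lagrangian model, the Python sketch below integrates a single bubble under Schiller-Naumann drag and buoyancy with an added-mass correction; the parameter values and liquid velocity field are hypothetical, and collision handling is omitted:

        import numpy as np

        def bubble_trajectory(u_liquid, d=1e-3, rho_l=1000.0, rho_g=1.2,
                              mu=1e-3, dt=1e-4, steps=5000):
            """One bubble: drag + buoyancy, with added mass for stability."""
            g = np.array([0.0, -9.81])
            vol = np.pi * d ** 3 / 6.0
            m_eff = (rho_g + 0.5 * rho_l) * vol      # gas mass plus added mass
            x, v = np.zeros(2), np.zeros(2)
            for _ in range(steps):
                rel = u_liquid(x) - v                # slip velocity
                re = rho_l * np.linalg.norm(rel) * d / mu + 1e-12
                cd = 24.0 / re * (1.0 + 0.15 * re ** 0.687)   # Schiller-Naumann drag law
                f_drag = 0.125 * rho_l * cd * np.pi * d ** 2 * np.linalg.norm(rel) * rel
                f_buoy = (rho_g - rho_l) * vol * g   # net of weight and buoyancy
                v += (f_drag + f_buoy) / m_eff * dt
                x += v * dt
            return x, v

        # Quiescent column: the bubble rises toward its terminal velocity.
        print(bubble_trajectory(lambda x: np.zeros(2)))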

  1. Advanced Sensing and Control Techniques to Facilitate Semi-Autonomous Decommissioning

    SciTech Connect

    Schalkoff, Robert J.

    1999-06-01

    This research is intended to advance the technology of semi-autonomous teleoperated robotics as applied to Decontamination and Decommissioning (D&D) tasks. Specifically, research leading to a prototype dual-manipulator mobile work cell is underway. This cell is supported and enhanced by computer vision, virtual reality and advanced robotics technology.

  2. Application of Active Learning Techniques to an Advanced Course

    NASA Astrophysics Data System (ADS)

    Knop, R. A.

    2004-05-01

    The New Faculty Workshop provided a wealth of techniques as well as an overriding philosophy for the teaching of undergraduate Physics and Astronomy courses. The focus of the workshop was active learning, summarized in "Learner-Centered Astronomy Teaching" by Slater & Adams: it's not what you do in class that matters, it's what the students do. Much of the specific focus of the New Faculty Workshop is on teaching the large, introductory Physics classes that many of the faculty present are sure to teach, both algebra-based and calculus-based. Many of these techniques apply directly and with little modification to introductory Astronomy courses. However, little direct attention is given to upper-division undergraduate, or even graduate, courses. In this presentation, I will share my experience in attempting to apply some of the techniques discussed at the New Faculty Workshop to an upper-division course in Galactic Astrophysics at Vanderbilt University during the Spring semester of 2004.

  3. The bumper technique for advancing a large profile microcatheter.

    PubMed

    Kellner, Christopher P; Chartrain, Alexander G; Schwegel, Claire; Oxley, Thomas J; Shoirah, Hazem; Mocco, J

    2017-03-09

    Operators commonly encounter difficulty maneuvering a microcatheter beyond the distal lip of wide neck aneurysms and aneurysms in challenging locations. Few techniques have been described to guide operators in these particular situations. In this case report of a 56-year-old woman with a 16 mm ophthalmic artery aneurysm, the microcatheter continually snagged the distal aneurysm lip, preventing delivery of a flow diverter into the distal parent vessel. In troubleshooting this obstacle, a second microguidewire was introduced alongside the microcatheter and was used to cover the distal lip of the aneurysm to prevent further snagging. The second guidewire successfully deflected the microcatheter into the distal vessel, a technique that we have aptly dubbed the 'bumper technique'.

  4. Nondestructive Evaluation of Thick Concrete Using Advanced Signal Processing Techniques

    SciTech Connect

    Clayton, Dwight A; Barker, Alan M; Santos-Villalobos, Hector J; Albright, Austin P; Hoegh, Kyle; Khazanovich, Lev

    2015-09-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years [1]. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations.

  5. Advances in computed radiography systems and their physical imaging characteristics.

    PubMed

    Cowen, A R; Davies, A G; Kengyelics, S M

    2007-12-01

    Radiological imaging is progressing towards an all-digital future, across the spectrum of medical imaging techniques. Computed radiography (CR) has provided a ready pathway from screen film to digital radiography and a convenient entry point to PACS. This review briefly revisits the principles of modern CR systems and their physical imaging characteristics. Wide dynamic range and digital image enhancement are well-established benefits of CR, which lend themselves to improved image presentation and reduced rates of repeat exposures. However, in its original form CR offered limited scope for reducing the radiation dose per radiographic exposure, compared with screen film. Recent innovations in CR, including the use of dual-sided image readout and channelled storage phosphor have eased these concerns. For example, introduction of these technologies has improved detective quantum efficiency (DQE) by approximately 50 and 100%, respectively, compared with standard CR. As a result CR currently affords greater scope for reducing patient dose, and provides a more substantive challenge to the new solid-state, flat-panel, digital radiography detectors.

  6. A fission matrix based validation protocol for computed power distributions in the advanced test reactor

    SciTech Connect

    Nielsen, J. W.; Nigg, D. W.; LaPorta, A. W.

    2013-07-01

    The Idaho National Laboratory (INL) has been engaged in a significant multi-year effort to modernize the computational reactor physics tools and validation procedures used to support operations of the Advanced Test Reactor (ATR) and its companion critical facility (ATRC). Several new protocols for validation of computed neutron flux distributions and spectra, as well as for validation of computed fission power distributions, based on new experiments and well-recognized least-squares statistical analysis techniques, have been under development. In the case of power distributions, estimates of the a priori ATR-specific fuel element-to-element fission power correlation and covariance matrices are required for validation analysis. A practical method for generating these matrices using the element-to-element fission matrix is presented, along with a high-order scheme for estimating the underlying fission matrix itself. The proposed methodology is illustrated using the MCNP5 neutron transport code for the required neutronics calculations. The general approach is readily adaptable for implementation using any multidimensional stochastic or deterministic transport code that offers the required level of spatial, angular, and energy resolution in the computed solution for the neutron flux and fission source. (authors)
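
    Once a fission matrix is in hand, the element-wise fission source shape is its dominant eigenvector; a power-iteration sketch in Python/NumPy (the matrix itself would come from MCNP5 tallies, which are not shown here) could be:

        import numpy as np

        def dominant_fission_mode(F, tol=1e-12, max_iter=10000):
            """Power iteration on a fission matrix F.

            F[i, j] ~ fission neutrons produced in element i per fission
            neutron born in element j.  Returns (k_effective, source shape).
            """
            s = np.full(F.shape[0], 1.0 / F.shape[0])
            k_old = 0.0
            for _ in range(max_iter):
                s_new = F @ s
                k = s_new.sum()          # s is kept normalized to unit sum
                s = s_new / k
                if abs(k - k_old) < tol:
                    break
                k_old = k
            return k, s

        # k, shape = dominant_fission_mode(F_from_tallies)   # hypothetical input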

  7. Transcranial Doppler: Techniques and advanced applications: Part 2

    PubMed Central

    Sharma, Arvind K.; Bathala, Lokesh; Batra, Amit; Mehndiratta, Man Mohan; Sharma, Vijay K.

    2016-01-01

    Transcranial Doppler (TCD) is the only diagnostic tool that can provide continuous information about cerebral hemodynamics in real time and over extended periods. In the previous paper (Part 1), we have already presented the basic ultrasound physics pertaining to TCD, insonation methods, and various flow patterns. This article describes various advanced applications of TCD such as detection of right-to-left shunt, emboli monitoring, vasomotor reactivity (VMR), monitoring of vasospasm in subarachnoid hemorrhage (SAH), monitoring of intracranial pressure, its role in stoke prevention in sickle cell disease, and as a supplementary test for confirmation of brain death. PMID:27011639

  8. Brain development in preterm infants assessed using advanced MRI techniques.

    PubMed

    Tusor, Nora; Arichi, Tomoki; Counsell, Serena J; Edwards, A David

    2014-03-01

    Infants who are born preterm have a high incidence of neurocognitive and neurobehavioral abnormalities, which may be associated with impaired brain development. Advanced magnetic resonance imaging (MRI) approaches, such as diffusion MRI (d-MRI) and functional MRI (fMRI), provide objective and reproducible measures of brain development. Indices derived from d-MRI can be used to provide quantitative measures of preterm brain injury. Although fMRI of the neonatal brain is currently a research tool, future studies combining d-MRI and fMRI have the potential to assess the structural and functional properties of the developing brain and its response to injury.

  9. Application of advanced coating techniques to rocket engine components

    NASA Technical Reports Server (NTRS)

    Verma, S. K.

    1988-01-01

    The materials problem in the space shuttle main engine (SSME) is reviewed. Potential coatings and the method of their application for improved life of SSME components are discussed. A number of advanced coatings for turbine blade components and disks are being developed and tested in a multispecimen thermal fatigue fluidized bed facility at IIT Research Institute. This facility is capable of producing severe strains of the degree present in blades and disk components of the SSME. The potential coating systems and current efforts at IITRI being taken for life extension of the SSME components are summarized.

  10. Comparison of TCP automatic tuning techniques for distributed computing

    SciTech Connect

    Weigle, E. H.; Feng, W. C.

    2002-01-01

    Rather than painful, manual, static, per-connection optimization of TCP buffer sizes simply to achieve acceptable performance for distributed applications, many researchers have proposed techniques to perform this tuning automatically. This paper first discusses the relative merits of the various approaches in theory, and then provides substantial experimental data concerning two competing implementations - the buffer autotuning already present in Linux 2.4.x and 'Dynamic Right-Sizing.' This paper reveals heretofore unknown aspects of the problem and current solutions, provides insight into the proper approach for various circumstances, and points toward ways to further improve performance. TCP, for good or ill, is the only protocol widely available for reliable end-to-end congestion-controlled network communication, and thus it is the one used for almost all distributed computing. Unfortunately, TCP was not designed with high-performance computing in mind - its original design decisions focused on long-term fairness first, with performance a distant second. Thus users must often perform tortuous manual optimizations simply to achieve acceptable behavior. The most important and often most difficult task is determining and setting appropriate buffer sizes. Because of this, at least six ways of automatically setting these sizes have been proposed. In this paper, we compare and contrast these tuning methods. First we explain each method, followed by an in-depth discussion of their features. Next we discuss the experiments to fully characterize two particularly interesting methods (Linux 2.4 autotuning and Dynamic Right-Sizing). We conclude with results and possible improvements.
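
    For contrast, the manual, static tuning the paper argues against amounts to sizing socket buffers to the bandwidth-delay product by hand, roughly as in this Python sketch with assumed link speed and round-trip time:

        import socket

        # Bandwidth-delay product: the buffer needed to keep a "fat pipe" full.
        bandwidth_bps = 1_000_000_000           # assumed 1 Gb/s path
        rtt_s = 0.07                            # assumed 70 ms round-trip time
        bdp_bytes = int(bandwidth_bps / 8 * rtt_s)

        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, bdp_bytes)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, bdp_bytes)
        # The kernel may clamp (or, on Linux, double) the requested size.
        print(sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))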

  11. Current Advances in the Computational Simulation of the Formation of Low-Mass Stars

    SciTech Connect

    Klein, R I; Inutsuka, S; Padoan, P; Tomisaka, K

    2005-10-24

    Developing a theory of low-mass star formation (approximately 0.1 to 3 solar masses) remains one of the most elusive and important goals of theoretical astrophysics. The star-formation process is the outcome of the complex dynamics of interstellar gas involving non-linear interactions of turbulence, gravity, magnetic field and radiation. The evolution of protostellar condensations, from the moment they are assembled by turbulent flows to the time they reach stellar densities, spans an enormous range of scales, resulting in a major computational challenge for simulations. Since the previous Protostars and Planets conference, dramatic advances in the development of new numerical algorithmic techniques have been successfully implemented on large scale parallel supercomputers. Among such techniques, Adaptive Mesh Refinement and Smooth Particle Hydrodynamics have provided frameworks to simulate the process of low-mass star formation with a very large dynamic range. It is now feasible to explore the turbulent fragmentation of molecular clouds and the gravitational collapse of cores into stars self-consistently within the same calculation. The increased sophistication of these powerful methods comes with substantial caveats associated with the use of the techniques and the interpretation of the numerical results. In this review, we examine what has been accomplished in the field and present a critique of both numerical methods and scientific results. We stress that computational simulations should obey the available observational constraints and demonstrate numerical convergence. Failing this, results of large scale simulations do not advance our understanding of low-mass star formation.

  12. Benefits of advanced software techniques for mission planning systems

    NASA Technical Reports Server (NTRS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-01-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.

  13. In Situ Techniques for Monitoring Electrochromism: An Advanced Laboratory Experiment

    ERIC Educational Resources Information Center

    Saricayir, Hakan; Uce, Musa; Koca, Atif

    2010-01-01

    This experiment employs current technology to enhance and extend existing lab content. The basic principles of spectroscopic and electroanalytical techniques and their use in determining material properties are covered in some detail in many undergraduate chemistry programs. However, there are limited examples of laboratory experiments with in…

  14. Advances in High-Fidelity Multi-Physics Simulation Techniques

    DTIC Science & Technology

    2008-01-01

    Coupling fluid dynamics with other disciplines such as electromagnetics yields a large and typically stiff equation set whose numerical solution mandates the development of specialized techniques. The report covers the governing equations and the numerical technique, in which a discrete equivalent of the governing equations is formed and the values of the solution vector are localized in a pointwise sense at each node of the mesh.

  15. Computer Vision Techniques Applied to Space Object Detect, Track, ID, Characterize

    NASA Astrophysics Data System (ADS)

    Flewelling, B.

    2014-09-01

    Space-based object detection and tracking represents a fundamental step necessary for detailed analysis of space objects. Initial observations of a resident space object (RSO) may result from careful sensor tasking to observe an object with well understood dynamics, or measurements-of-opportunity on an object with poorly understood dynamics. Dim and eccentric objects present a particular challenge which requires more dynamic use of imaging systems. As a result of more stressing data acquisition strategies, advanced techniques for the accurate processing of both point and streaking sources are needed. This paper will focus on two key methods in computer vision used to determine interest points within imagery. The Harris Corner method and the method of Phase Congruency can be used to effectively extract static and streaking point sources and to indicate when apparent motion is present within an observation. The geometric inferences which can be made from the resulting detections will be discussed, including a method to evaluate the localization uncertainty of the extracted detections which is based on the computation of the Hessian of the detector response. Finally a technique which exploits the additional information found in detected streak endpoints to provide a better centroid in the presence of curved streaks is explained and additional applications for the presented techniques are discussed.
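
    A minimal sketch of the first of the two interest-point methods named above, using OpenCV's Harris corner response; the image path, detector parameters, and threshold are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

# Load a grayscale frame (placeholder path).
img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

# Harris response: blockSize = neighborhood, ksize = Sobel aperture,
# k = Harris sensitivity constant (0.04-0.06 is typical).
response = cv2.cornerHarris(img, blockSize=3, ksize=3, k=0.04)

# Keep strong maxima as candidate point/streak detections; the threshold
# would be tuned per sensor in practice.
points = np.argwhere(response > 0.01 * response.max())
print(len(points), "candidate interest points")
```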

  16. Single Molecule Techniques for Advanced in situ Hybridization

    SciTech Connect

    Hollars, C W; Stubbs, L; Carlson, K; Lu, X; Wehri, E

    2003-02-03

    One of the most significant achievements of modern science is the sequencing of the human genome, completed in the year 2000. Despite this monumental accomplishment, researchers have only begun to understand the relationships between this three-billion-nucleotide genetic code and the regulation and control of gene and protein expression within each of the millions of different types of highly specialized cells. Several methodologies have been developed for the analysis of gene and protein expression in situ, yet despite these advancements, the pace of such analyses is extremely limited. Because information regarding the precise timing and location of gene expression is a crucial component in the discovery of new pharmacological agents for the treatment of disease, there is an enormous incentive to develop technologies that accelerate the analytical process. Here we report on the use of plasmon resonant particles as advanced probes for in situ hybridization. These probes are used for the detection of low levels of gene-probe response and demonstrate a detection method that enables precise, simultaneous localization within a cell of the points of expression of multiple genes or proteins in a single sample.

  17. The Design and Transfer of Advanced Command and Control (C2) Computer-Based Systems

    DTIC Science & Technology

    1980-03-31

    Quarterly technical report 80-02 on the design and transfer of advanced command and control (C2) computer-based systems. The tasks and objectives of the overall ARPA project concern the design, development, demonstration, and transfer of advanced C2 computer-based systems; this report covers work in the computer-based design and transfer areas only.

  18. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  19. Advanced optical techniques for monitoring dosimetric parameters in photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Li, Buhong; Qiu, Zhihai; Huang, Zheng

    2012-12-01

    Photodynamic therapy (PDT) is based on the generation of highly reactive singlet oxygen through interactions of photosensitizer, light and molecular oxygen. PDT has become a clinically approved, minimally invasive therapeutic modality for a wide variety of malignant and nonmalignant diseases. The main dosimetric parameters for predicting the PDT efficacy include the delivered light dose, the quantification and photobleaching of the administered photosensitizer, the tissue oxygen concentration, the amount of singlet oxygen generation and the resulting biological responses. This review article presents the emerging optical techniques that are in use or under development for monitoring dosimetric parameters during PDT treatment. Moreover, the main challenges in developing real-time and noninvasive optical techniques for monitoring dosimetric parameters in PDT will be described.

  20. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from −125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to −170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
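
    The property combination described above reduces to k = α·ρ·c_p. A small numeric sketch follows; the property values are generic placeholders for a PTFE-like polymer, not the measured data from this study.

```python
# Thermal conductivity from laser-flash diffusivity, density, and
# specific heat: k = alpha * rho * cp. Values are assumptions.
alpha = 1.1e-7   # thermal diffusivity, m^2/s
rho = 2150.0     # density, kg/m^3
cp = 1050.0      # specific heat, J/(kg K)

k = alpha * rho * cp   # W/(m K)
print(f"thermal conductivity ~ {k:.2f} W/(m K)")
```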

  1. Continuous analog of multiplicative algebraic reconstruction technique for computed tomography

    NASA Astrophysics Data System (ADS)

    Tateishi, Kiyoko; Yamaguchi, Yusaku; Abou Al-Ola, Omar M.; Kojima, Takeshi; Yoshinaga, Tetsuya

    2016-03-01

    We propose a hybrid dynamical system as a continuous analog to the block-iterative multiplicative algebraic reconstruction technique (BI-MART), which is a well-known iterative image reconstruction algorithm for computed tomography. The hybrid system is described by a switched nonlinear system with a piecewise smooth vector field or differential equation and, for consistent inverse problems, the convergence of non-negatively constrained solutions to a globally stable equilibrium is guaranteed by the Lyapunov theorem. Namely, we can prove theoretically that a weighted Kullback-Leibler divergence measure can be a common Lyapunov function for the switched system. We show that discretizing the differential equation by using the first-order approximation (Euler's method) based on the geometric multiplicative calculus leads to the same iterative formula as the BI-MART, with the scaling parameter as a time-step of numerical discretization. The present paper is the first to reveal that this kind of iterative image reconstruction algorithm can be constructed by discretizing a continuous-time dynamical system for solving tomographic inverse problems. Iterative algorithms based not only on the Euler method but also on lower-order Runge-Kutta methods applied to the continuous-time system can be used for image reconstruction. A numerical example showing the characteristics of the discretized iterative methods is presented.
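
    For orientation, the discrete multiplicative update that the paper recovers from its continuous-time system has the classical MART form sketched below, with the relaxation parameter λ playing the role of the time step. This is a generic sketch, not the authors' code.

```python
import numpy as np

def mart_step(x, A, y, lam=1.0, eps=1e-12):
    """One multiplicative ART update for the system y = A x:
    x_j <- x_j * prod_i (y_i / (A x)_i)^(lam * a_ij)."""
    log_ratio = np.log((y + eps) / (A @ x + eps))
    return x * np.exp(lam * (A.T @ log_ratio))

# Tiny consistent test problem: the iterates should approach x_true.
A = np.array([[1.0, 1.0], [1.0, 2.0]])
x_true = np.array([0.5, 1.5])
y = A @ x_true
x = np.ones(2)
for _ in range(500):
    x = mart_step(x, A, y, lam=0.5)
print(x)  # close to [0.5, 1.5] for this consistent problem
```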

  2. Image processing techniques in computer-assisted patch clamping

    NASA Astrophysics Data System (ADS)

    Azizian, Mahdi; Patel, Rajni; Gavrilovici, Cezar; Poulter, Michael O.

    2010-02-01

    Patch clamping is used in electrophysiology to study single or multiple ion channels in cells. Multiple micropipettes are used as electrodes to collect data from several cells. Placement of these electrodes is a time-consuming and complicated task due to the lack of depth perception, the limited view through the microscope lens, and the possibility of collisions between micropipettes. To aid in this process, a computer-assisted approach is developed using image processing techniques applied to images obtained through the microscope. Image processing algorithms are applied to perform autofocusing, relative depth estimation, distance estimation and tracking of the micropipettes in the images without making any major changes in the existing patch clamp equipment. An autofocusing algorithm with micrometer precision is developed, and relative depth estimation is performed based on autofocusing. A micropipette tip detection algorithm is developed which can be used to initialize or reset the tracking algorithm and to calibrate the system by registering the relative image and micromanipulator coordinates. An image-based tracking algorithm is also developed to track a micropipette tip in real time. The real-time tracking data is then used for visual servoing of the micropipette tips and updating the calibration information.
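
    As an illustration of one common autofocusing building block (the paper's exact focus metric is not given), the sketch below scores image sharpness with the variance of the Laplacian and picks the sharpest focal position; capture_at is a hypothetical stand-in for the microscope's acquisition call.

```python
import cv2

def focus_measure(gray_img):
    # Variance of the Laplacian: higher means sharper edges.
    return cv2.Laplacian(gray_img, cv2.CV_64F).var()

def autofocus(z_positions, capture_at):
    # capture_at(z) is a hypothetical hardware call returning a
    # grayscale frame at focus position z.
    return max(z_positions, key=lambda z: focus_measure(capture_at(z)))
```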

  3. Advance techniques for monitoring human tolerance to +Gz accelerations.

    NASA Technical Reports Server (NTRS)

    Pelligra, R.; Sandler, H.; Rositano, S.; Skrettingland, K.; Mancini, R.

    1972-01-01

    Standard techniques for monitoring the acceleration-stressed human subject have been augmented by measuring (1) temporal, brachial and/or radial arterial blood flow, and (2) indirect systolic and diastolic blood pressure at 60-sec intervals. Results show that the response of blood pressure to positive accelerations is complex and dependent on an interplay of hydrostatic forces, diminishing venous return, redistribution of blood, and other poorly defined compensatory reflexes.

  4. Advanced techniques for characterization of ion beam modified materials

    DOE PAGES

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; ...

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  5. Advanced techniques for characterization of ion beam modified materials

    SciTech Connect

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; Kluth, Patrick; Tuomisto, Filip

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  6. Recent advances in bioprinting techniques: approaches, applications and future prospects.

    PubMed

    Li, Jipeng; Chen, Mingjiao; Fan, Xianqun; Zhou, Huifang

    2016-09-20

    Bioprinting technology shows potential in tissue engineering for the fabrication of scaffolds, cells, tissues and organs reproducibly and with high accuracy. Bioprinting technologies are mainly divided into three categories, inkjet-based bioprinting, pressure-assisted bioprinting and laser-assisted bioprinting, based on their underlying printing principles. These various printing technologies have their advantages and limitations. Bioprinting utilizes biomaterials, cells or cell factors as a "bioink" to fabricate prospective tissue structures. Biomaterial parameters such as biocompatibility, cell viability and the cellular microenvironment strongly influence the printed product. Various printing technologies have been investigated, and great progress has been made in printing various types of tissue, including vasculature, heart, bone, cartilage, skin and liver. This review introduces basic principles and key aspects of some frequently used printing technologies. We focus on recent advances in three-dimensional printing applications, current challenges and future directions.

  7. Advanced materials and techniques for fibre-optic sensing

    NASA Astrophysics Data System (ADS)

    Henderson, Philip J.

    2014-06-01

    Fibre-optic monitoring systems came of age in about 1999 upon the emergence of the world's first significant commercialising company - a spin-out from the UK's collaborative MAST project. By using embedded fibre-optic technology, the MAST project successfully measured transient strain within high-performance composite yacht masts. Since then, applications have extended from smart composites into civil engineering, energy, military, aerospace, medicine and other sectors. Fibre-optic sensors come in various forms, and may be subject to embedment, retrofitting, and remote interrogation. The unique challenges presented by each implementation require careful scrutiny before widespread adoption can take place. Accordingly, various aspects of design and reliability are discussed spanning a range of representative technologies that include resonant microsilicon structures, MEMS, Bragg gratings, advanced forms of spectroscopy, and modern trends in nanotechnology. Keywords: Fibre-optic sensors, fibre Bragg gratings, MEMS, MOEMS, nanotechnology, plasmon.
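
    For the fibre Bragg gratings mentioned above, the basic sensing relation is λ_B = 2·n_eff·Λ, with a strain-induced shift of roughly (1 − p_e)·ε·λ_B. The sketch below evaluates it with typical silica-fibre values, which are assumptions rather than figures from this paper.

```python
# Bragg wavelength and strain response of an FBG (typical values assumed).
n_eff = 1.447      # effective refractive index of the fibre core
pitch = 535e-9     # grating period, m
p_e = 0.22         # effective photo-elastic coefficient of silica

lam_bragg = 2 * n_eff * pitch            # ~1548 nm
shift = lam_bragg * (1 - p_e) * 1e-3     # shift for 1000 microstrain
print(f"Bragg wavelength {lam_bragg*1e9:.1f} nm, "
      f"shift {shift*1e12:.0f} pm per 1000 microstrain")
```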

  8. Advanced liquefaction using coal swelling and catalyst dispersion techniques

    SciTech Connect

    Curtis, C.W. ); Gutterman, C. ); Chander, S. )

    1991-01-01

    Research in this project centers upon developing a new approach to the direct liquefaction of coal to produce an all-distillate product slate at a sizable cost reduction over current technology. The approach integrates all aspects of the coal liquefaction process including coal selection, pretreatment, coal swelling with catalyst impregnation, coal liquefaction experimentation, product recovery with characterization, alternate bottoms processing, and a technical assessment including an economic evaluation. Work has centered upon obtaining bulk samples of feedstocks for the project, updating the background literature, and preparing and testing a computer program to perform material balance calculations for the continuous flow liquefaction unit.

  9. Investigation to advance prediction techniques of the low-speed aerodynamics of V/STOL aircraft

    NASA Technical Reports Server (NTRS)

    Maskew, B.; Strash, D.; Nathman, J.; Dvorak, F. A.

    1985-01-01

    A computer program, VSAERO, has been applied to a number of V/STOL configurations with a view to advancing prediction techniques for the low-speed aerodynamic characteristics. The program couples a low-order panel method with surface streamline calculation and integral boundary layer procedures. The panel method, which uses piecewise constant source and doublet panels, includes an iterative procedure for wake shape and models the boundary layer displacement effect using the source transpiration technique. Certain improvements to a basic vortex tube jet model were installed in the code prior to evaluation. Very promising results were obtained for surface pressures near a jet issuing at 90 deg from a flat plate. A solid core model was used in the initial part of the jet with a simple entrainment model. Preliminary representation of the downstream separation zone significantly improved the correlation. The program accurately predicted the pressure distribution inside the inlet on the Grumman 698-411 design at a range of flight conditions. Furthermore, coupled viscous/potential flow calculations gave very close correlation with experimentally determined operational boundaries dictated by the onset of separation inside the inlet. Experimentally observed degradation of these operational boundaries between nacelle-alone tests and tests on the full configuration was also indicated by the calculation. Application of the program to the General Dynamics STOL fighter design was equally encouraging. Very close agreement was observed between experiment and calculation for the effects of power on pressure distribution, lift and lift curve slope.

  10. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are also described, as well as user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  11. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes are contrasted using the NASA Mini-Mast as the focus structure.
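
    A generic flavor of the estimation problem described above is fitting model parameters to 'measured' time-domain data by nonlinear least squares. The sketch below fits a damped sinusoid with SciPy; it illustrates the idea only and is unrelated to the PDEMOD code.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic 'measured' response of a lightly damped mode.
t = np.linspace(0.0, 5.0, 200)
a0, f0, z0 = 2.0, 1.5, 0.1            # true amplitude, frequency, damping
rng = np.random.default_rng(0)
y_meas = a0 * np.exp(-z0 * t) * np.sin(2*np.pi*f0*t)
y_meas += 0.02 * rng.normal(size=t.size)

def residuals(p):
    a, f, zeta = p
    return a * np.exp(-zeta * t) * np.sin(2*np.pi*f*t) - y_meas

fit = least_squares(residuals, x0=(1.0, 1.0, 0.05))
print("estimated (amplitude, frequency, damping):", fit.x)
```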

  12. Advances in dental veneers: materials, applications, and techniques

    PubMed Central

    Pini, Núbia Pavesi; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite; Lovadino, José Roberto; Terada, Raquel Sano Suga; Pascotto, Renata Corrêa

    2012-01-01

    Laminate veneers are a conservative treatment for unaesthetic anterior teeth. The continued development of dental ceramics offers clinicians many options for creating highly aesthetic and functional porcelain veneers. This evolution of materials, ceramics, and adhesive systems permits improvement of the aesthetics of the smile and the self-esteem of the patient. Clinicians should understand the latest ceramic materials in order to be able to recommend them and their applications and techniques, and to ensure the success of the clinical case. The current literature was reviewed to search for the most important parameters determining the long-term success, correct application, and clinical limitations of porcelain veneers. PMID:23674920

  13. Coincident Pulse Techniques for Hybrid Electronic Optical Computer Systems

    DTIC Science & Technology

    1992-08-31

    Research interests represented in this effort include parallel computer architectures, parallel algorithms, and VLSI, along with design tools and methodology for software and hardware, systems algorithm design, computer-aided design for VLSI, and the application of large computational arrays to scientific problems. Related publications include "Interactive Toolset for Characterizing Complex Neural Systems" by D.N. Krieger, T.W. Berger, S.P. Levitan, and R.J. Sclabassi, in Computers and Mathematics.

  14. Advanced terahertz techniques for quality control and counterfeit detection

    NASA Astrophysics Data System (ADS)

    Ahi, Kiarash; Anwar, Mehdi

    2016-04-01

    This paper reports our methods for the detection of counterfeit electronics. These versatile techniques are also useful in quality control applications. Terahertz pulsed laser systems can measure material characteristics and thus make it possible to distinguish between the materials used in authentic components and their counterfeit clones. Components with material defects can also be identified in this manner. In this work, different refractive indices and absorption coefficients were observed for counterfeit components compared to their authentic counterparts. The existence of unexpected ingredient materials was detected in counterfeit components by Fourier transform analysis of the transmitted terahertz pulse. Thicknesses of different layers are obtainable by analyzing the reflected terahertz pulse, and unexpected layers are detectable in the same manner. Recycled, sanded, and blacktopped counterfeit electronic components were detected as a result of these analyses. Counterfeit ICs with die dislocations were detected by depicting the terahertz raster-scanning data in a coordinate plane, which gives terahertz images. In the same manner, raster scanning of the reflected pulse gives terahertz images of the surfaces of the components, which were used to investigate contaminant materials and sanded points on the surfaces. The results of the latter technique reveal the recycled counterfeit components.
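
    The Fourier-transform step described above amounts to comparing the spectrum of the transmitted pulse against an authentic reference. A schematic sketch follows; the waveform and sampling interval are synthetic placeholders, not measured terahertz data.

```python
import numpy as np

dt = 0.05e-12                 # 0.05 ps sampling interval (assumed)
t = np.arange(0.0, 20e-12, dt)

# Stand-in transmitted pulse (a Gaussian); a real trace would come
# from the time-domain spectrometer.
pulse = np.exp(-((t - 5e-12) / 0.5e-12) ** 2)

freqs = np.fft.rfftfreq(t.size, dt)      # extends into the THz range
spectrum = np.abs(np.fft.rfft(pulse))

# Unexpected ingredient materials show up as absorption dips in the
# component spectrum relative to the authentic reference spectrum.
```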

  15. Comparison of three advanced chromatographic techniques for cannabis identification.

    PubMed

    Debruyne, D; Albessard, F; Bigot, M C; Moulin, M

    1994-01-01

    Advances in chromatography technology - the increasing availability of easier-to-use mass spectrometers combined with gas chromatography (GC), the use of diode-array or programmable variable-wavelength ultraviolet absorption detectors in conjunction with high-performance liquid chromatography (HPLC), and the availability of scanners capable of reading thin-layer chromatography (TLC) plates in the ultraviolet and visible regions - have made for easier, quicker and more positive identification of the cannabis samples that standard analytical laboratories are occasionally required to analyze in the effort to combat drug addiction. At laboratories that do not possess GC combined with mass spectrometry, which provides an irrefutable identification, the following procedure involving HPLC or TLC techniques may be used: identification of the chromatographic peaks corresponding to each of the three main cannabis constituents - cannabidiol (CBD), delta-9-tetrahydrocannabinol (delta-9-THC) and cannabinol (CBN) - by comparison with published data in conjunction with a specific absorption spectrum for each of those constituents obtained between 200 and 300 nm. The collection of the fractions corresponding to the three major cannabinoids at the HPLC system outlet and the cross-checking of their identity by GC with flame ionization detection can further corroborate the identification and minimize possible errors due to interference.

  16. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  17. Computational fluid dynamics in the design and analysis of thermal processes: a review of recent advances.

    PubMed

    Norton, Tomás; Tiwari, Brijesh; Sun, Da Wen

    2013-01-01

    The design of thermal processes in the food industry has undergone great developments in the last two decades due to the availability of cheap computer power alongside advanced modelling techniques such as computational fluid dynamics (CFD). CFD uses numerical algorithms to solve the non-linear partial differential equations of fluid mechanics and heat transfer so that the complex mechanisms that govern many food-processing systems can be resolved. In thermal processing applications, CFD can be used to build three-dimensional models that are both spatially and temporally representative of a physical system to produce solutions with high levels of physical realism without the heavy costs associated with experimental analyses. Therefore, CFD is playing an ever-growing role in the optimization of conventional thermal processes as well as the development of new ones in the food industry. This paper discusses the fundamental aspects involved in developing CFD solutions and forms a state-of-the-art review on various CFD applications in conventional as well as novel thermal processes. The challenges facing CFD modellers of thermal processes are also discussed. From this review it is evident that present-day CFD software, with its rich tapestries of mathematical physics, numerical methods and visualization techniques, is currently recognized as a formidable and pervasive technology which can permit comprehensive analyses of thermal processing.
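
    At its core, the kind of model described above discretizes the governing heat-transfer equations. The minimal sketch below solves 1-D transient conduction in a heated food slab by explicit finite differences; geometry, properties, and boundary temperatures are illustrative assumptions.

```python
import numpy as np

L, n = 0.02, 51                  # 2 cm slab, 51 nodes
alpha = 1.4e-7                   # thermal diffusivity, m^2/s (assumed)
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha         # within the explicit stability limit

T = np.full(n, 20.0)             # initial product temperature, deg C
T[0] = T[-1] = 90.0              # heated surfaces held at 90 deg C
for _ in range(2000):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0*T[1:-1] + T[:-2])

print(f"centre temperature after {2000*dt:.0f} s: {T[n//2]:.1f} C")
```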

  18. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication

    PubMed Central

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. The initial search identified 78 publications. The abstracts of these 78 articles were screened, and 52 publications were selected for full-text review. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of these papers are summarized in a table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow the digital planning and manufacture of removable dentures from start to finish. According to the literature, CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are developing rapidly. In the near future, removable denture fabrication may become a largely digital workflow rather than one requiring extensive technical staff and manual procedures. However, the methods still have several limitations for now. PMID:27095912

  19. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication.

    PubMed

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. The initial search identified 78 publications. The abstracts of these 78 articles were screened, and 52 publications were selected for full-text review. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of these papers are summarized in a table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow the digital planning and manufacture of removable dentures from start to finish. According to the literature, CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are developing rapidly. In the near future, removable denture fabrication may become a largely digital workflow rather than one requiring extensive technical staff and manual procedures. However, the methods still have several limitations for now.

  20. Development of a real-time aeroperformance analysis technique for the X-29A advanced technology demonstrator

    NASA Technical Reports Server (NTRS)

    Ray, R. J.; Hicks, J. W.; Alexander, R. I.

    1988-01-01

    The X-29A advanced technology demonstrator has shown the practicality and advantages of the capability to compute and display, in real time, aeroperformance flight results. This capability includes the calculation of the in-flight measured drag polar, lift curve, and aircraft specific excess power. From these elements many other types of aeroperformance measurements can be computed and analyzed. The technique can be used to give an immediate postmaneuver assessment of data quality and maneuver technique, thus increasing the productivity of a flight program. A key element of this new method was the concurrent development of a real-time in-flight net thrust algorithm, based on the simplified gross thrust method. This net thrust algorithm allows for the direct calculation of total aircraft drag.
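
    The specific excess power element mentioned above follows directly from net thrust, drag, speed, and weight: Ps = (T − D)·V/W. A numeric sketch is given below with illustrative values, not X-29A flight data.

```python
# Specific excess power from in-flight quantities (illustrative values).
thrust_n = 70e3       # real-time net thrust estimate, N
drag_n = 52e3         # total aircraft drag, N
velocity_ms = 250.0   # true airspeed, m/s
weight_n = 160e3      # aircraft weight, N

p_s = (thrust_n - drag_n) * velocity_ms / weight_n   # m/s
print(f"specific excess power: {p_s:.1f} m/s")
```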

  1. Development of a real-time aeroperformance analysis technique for the X-29A advanced technology demonstrator

    NASA Technical Reports Server (NTRS)

    Ray, R. J.; Hicks, J. W.; Alexander, R. I.

    1988-01-01

    The X-29A advanced technology demonstrator has shown the practicality and advantages of the capability to compute and display, in real time, aeroperformance flight results. This capability includes the calculation of the in-flight measured drag polar, lift curve, and aircraft specific excess power. From these elements, many other types of aeroperformance measurements can be computed and analyzed. The technique can be used to give an immediate postmaneuver assessment of data quality and maneuver technique, thus increasing the productivity of a flight program. A key element of this new method was the concurrent development of a real-time in-flight net thrust algorithm, based on the simplified gross thrust method. This net thrust algorithm allows for the direct calculation of total aircraft drag.

  2. Bioactive glass thin films synthesized by advanced pulsed laser techniques

    NASA Astrophysics Data System (ADS)

    Mihailescu, N.; Stan, George E.; Ristoscu, C.; Sopronyi, M.; Mihailescu, Ion N.

    2016-10-01

    Bioactive materials play an increasingly important role in the biomaterials industry, and are extensively used in a range of applications, including biodegradable metallic implants. We report on the deposition of Bioactive Glass (BG) films onto biodegradable substrates by pulsed laser techniques. The BG coatings were obtained using a KrF* excimer laser source (λ = 248 nm, τFWHM ≤ 25 ns). Their thickness has been determined by profilometry measurements, whilst their morphology has been analysed by Scanning Electron Microscopy (SEM). The obtained coatings fairly preserved the targets' composition and structure, as revealed by Energy Dispersive X-Ray Spectroscopy, Grazing Incidence X-Ray Diffraction, and Fourier Transform Infra-Red Spectroscopy analyses.

  3. Advanced Techniques in Musculoskeletal Oncology: Perfusion, Diffusion, and Spectroscopy.

    PubMed

    Teixeira, Pedro A Gondim; Beaumont, Marine; Gabriela, Hossu; Bailiang, Chen; Verhaeghe, Jean-luc; Sirveaux, François; Blum, Alain

    2015-12-01

    The imaging characterization of musculoskeletal tumors can be challenging, and a significant number of lesions remain indeterminate when conventional imaging protocols are used. In recent years, clinical availability of functional imaging methods has increased. Functional imaging has the potential to improve tumor detection, characterization, and follow-up. The most frequently used functional methods are perfusion imaging, diffusion-weighted imaging (DWI), and MR proton spectroscopy (MRS). Each of these techniques has specific protocol requirements and diagnostic pitfalls that need to be acknowledged to avoid misdiagnoses. Additionally, the application of functional methods in the MSK system has various technical issues that need to be addressed to ensure data quality and comparability. In this article, the application of contrast-enhanced perfusion imaging, DWI, and MRS for the evaluation of bone and soft tissue tumors is discussed, with emphasis on acquisition protocols, technical difficulties, and current clinical indications.

  4. Advanced fabrication techniques for hydrogen-cooled engine structures

    NASA Technical Reports Server (NTRS)

    Buchmann, O. A.; Arefian, V. V.; Warren, H. A.; Vuigner, A. A.; Pohlman, M. J.

    1985-01-01

    Described is a program for development of coolant passage geometries, material systems, and joining processes that will produce long-life hydrogen-cooled structures for scramjet applications. Tests were performed to establish basic material properties, and samples were constructed and evaluated to substantiate fabrication processes and inspection techniques. Results of the study show that the basic goal of increasing the life of hydrogen-cooled structures two orders of magnitude relative to that of the Hypersonic Research Engine can be reached with available means. Estimated life is 19000 cycles for the channels and 16000 cycles for pin-fin coolant passage configurations using Nickel 201. Additional research is required to establish the fatigue characteristics of dissimilar-metal coolant passages (Nickel 201/Inconel 718) and to investigate the embrittling effects of the hydrogen coolant.

  5. Advances in techniques for assessment of microalgal lipids.

    PubMed

    Challagulla, Vineela; Nayar, Sasi; Walsh, Kerry; Fabbro, Larelle

    2016-07-15

    Microalgae are a varied group of organisms with considerable commercial potential as sources of various biochemicals, storage molecules and metabolites such as lipids, sugars, amino acids, pigments and toxins. Algal lipids can be processed to bio-oils and biodiesel. The conventional method to estimate algal lipids is based on extraction using solvents and quantification by gravimetry or chromatography. Such methods are time consuming, use hazardous chemicals and are labor intensive. For rapid screening of prospective algae or for management decisions (e.g. decision on timing of harvest), a rapid, high throughput, reliable, accurate, cost effective and preferably nondestructive analytical technique is desirable. This manuscript reviews the application of fluorescent lipid soluble dyes (Nile Red and BODIPY 505/515), nuclear magnetic resonance (NMR), Raman, Fourier transform infrared (FTIR) and near infrared (NIR) spectroscopy for the assessment of lipids in microalgae.

  6. Computer vision techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar

    1990-01-01

    Rotorcraft operating in high-threat environments fly close to the earth's surface to utilize surrounding terrain, vegetation, or manmade objects to minimize the risk of being detected by an enemy. Increasing levels of concealment are achieved by adopting different tactics during low-altitude flight. Rotorcraft employ three tactics during low-altitude flight: low-level, contour, and nap-of-the-earth (NOE). The key feature distinguishing the NOE mode from the other two modes is that the whole rotorcraft, including the main rotor, is below tree-top whenever possible. This leads to the use of lateral maneuvers for avoiding obstacles, which in fact constitutes the means for concealment. The piloting of the rotorcraft is at best a very demanding task, and the pilot will need help from onboard automation tools in order to devote more time to mission-related activities. The development of an automation tool which has the potential to detect obstacles in the rotorcraft flight path, warn the crew, and interact with the guidance system to avoid detected obstacles presents challenging problems. Research is described which applies techniques from computer vision to the automation of rotorcraft navigation. The effort emphasizes the development of a methodology for detecting the ranges to obstacles in the region of interest based on the maximum utilization of passive sensors. The range map derived from the obstacle-detection approach can be used as obstacle data for obstacle avoidance in an automatic guidance system and as an advisory display to the pilot. The lack of suitable flight imagery data presents a problem in the verification of concepts for obstacle detection. This problem is being addressed by the development of an adequate flight database and by preprocessing of currently available flight imagery. The presentation concludes with some comments on future work and how research in this area relates to the guidance of other autonomous vehicles.

  7. The Advance of Computing from the Ground to the Cloud

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    A trend toward the abstraction of computing platforms that has been developing in the broader IT arena over the last few years is just beginning to make inroads into the library technology scene. Cloud computing offers for libraries many interesting possibilities that may help reduce technology costs and increase capacity, reliability, and…

  8. Computational Enzyme Design: Advances, hurdles and possible ways forward

    PubMed Central

    Linder, Mats

    2012-01-01

    This mini review addresses recent developments in computational enzyme design. Successful protocols as well as known issues and limitations are discussed from an energetic perspective. It will be argued that improved results can be obtained by including a dynamic treatment in the design protocol. Finally, a molecular dynamics-based approach for evaluating and refining computational designs is presented. PMID:24688650

  9. Advanced Infusion Techniques with 3-D Printed Tooling

    SciTech Connect

    Nuttall, David; Elliott, Amy; Post, Brian K.; Love, Lonnie J.

    2016-05-10

    The manufacturing of tooling for large, contoured surfaces for fiber-layup applications requires significant effort to understand the geometry and then to subtractively manufacture the tool. Traditional methods for the auto industry use clay that is hand sculpted. In the marine pleasure craft industry, the exterior of the model is formed from a foam lay-up that is either hand cut or machined to create smooth lines. Engineers and researchers at Oak Ridge National Laboratory's Manufacturing Demonstration Facility (ORNL MDF) collaborated with Magnum Venus Products (MVP) in the development of a process for reproducing legacy whitewater adventure craft via digital scanning and large scale 3-D printed layup molds. The process entailed 3-D scanning a legacy canoe form, converting that form to a CAD model, additively manufacturing (3-D printing) the mold tool, and subtractively finishing the mold's transfer surfaces. Future work will include applying a gelcoat to the mold transfer surface and infusing using vacuum-assisted resin transfer molding (VARTM) principles to create a watertight vessel. The outlined steps were performed on a specific canoe geometry found by MVP's principal participant. The intent of utilizing this geometry is to develop an energy-efficient and marketable process for replicating complex shapes, specifically focusing on this particular watercraft, and provide a finished product for demonstration to the composites industry. The culminating part produced through this agreement has been slated for public presentation and potential demonstration at the 2016 CAMX (Composites and Advanced Materials eXpo) exposition in Anaheim, CA. Phase I of this collaborative research and development agreement (MDF-15-68) was conducted under CRADA NFE-15-05575 and was initiated on May 7, 2015, with an introduction to the MVP product line, and concluded in March of 2016 with the printing and processing of a canoe mold.

  10. Computational Modeling and Neuroimaging Techniques for Targeting during Deep Brain Stimulation

    PubMed Central

    Sweet, Jennifer A.; Pace, Jonathan; Girgis, Fady; Miller, Jonathan P.

    2016-01-01

    Accurate surgical localization of the varied targets for deep brain stimulation (DBS) is a process undergoing constant evolution, with increasingly sophisticated techniques to allow for highly precise targeting. However, despite the fastidious placement of electrodes into specific structures within the brain, there is increasing evidence to suggest that the clinical effects of DBS are likely due to the activation of widespread neuronal networks directly and indirectly influenced by the stimulation of a given target. Selective activation of these complex and inter-connected pathways may further improve the outcomes of currently treated diseases by targeting specific fiber tracts responsible for a particular symptom in a patient-specific manner. Moreover, the delivery of such focused stimulation may aid in the discovery of new targets for electrical stimulation to treat additional neurological, psychiatric, and even cognitive disorders. As such, advancements in surgical targeting, computational modeling, engineering designs, and neuroimaging techniques play a critical role in this process. This article reviews the progress of these applications, discussing the importance of target localization for DBS, and the role of computational modeling and novel neuroimaging in improving our understanding of the pathophysiology of diseases, and thus paving the way for improved selective target localization using DBS. PMID:27445709

  11. Recent advances in techniques for tsetse-fly control*

    PubMed Central

    MacLennan, K. J. R.

    1967-01-01

    With the advent of modern persistent insecticides, it has become possible to utilize some of the knowledge that has accumulated on the ecology and bionomics of Glossina and to devise more effective techniques for the control and eventual extermination of these species. The present article, based on experience of the tsetse fly problem in Northern Nigeria, points out that the disadvantages of control techniques—heavy expenditure of money and manpower and undue damage to the biosystem—can now largely be overcome by basing the application of insecticides on knowledge of the habits of the particular species of Glossina in a particular environment. Two factors are essential to the success of a control project: the proper selection of sites for spraying (the concept of restricted application) and the degree of persistence of the insecticide used. Reinfestation from within or outside the project area must also be taken into account. These and other aspects are discussed in relation to experience gained from a successful extermination project carried out in the Sudan vegetation zone and from present control activities in the Northern Guinea vegetation zone. PMID:5301739

  12. Advances in Current Rating Techniques for Flexible Printed Circuits

    NASA Technical Reports Server (NTRS)

    Hayes, Ron

    2014-01-01

    Twist Capsule Assemblies are power transfer devices commonly used in spacecraft mechanisms that require electrical signals to be passed across a rotating interface. Flexible printed circuits (flex tapes, see Figure 2) are used to carry the electrical signals in these devices. Determining the current rating for a given trace (conductor) size can be challenging. Because of the thermal conditions present in this environment, the most appropriate approach is to assume that the only means by which heat is removed from the trace is through the conductor itself, so that when the flex tape is long the temperature rise in the trace can be extreme. While this technique represents a worst-case thermal situation that yields conservative current ratings, this conservatism may lead to overly cautious designs when not all traces are used at their full rated capacity. A better understanding of how individual traces behave when they are not all in use is the goal of this research. In the testing done in support of this paper, a representative flex tape used for a flight Solar Array Drive Assembly (SADA) application was tested by energizing individual traces (conductors in the tape) in a vacuum chamber and measuring the tape temperatures using both fine-gauge thermocouples and infrared thermographic imaging. We find that traditional derating schemes used for bundles of wires do not apply for the configuration tested. We also determine that single active traces located in the center of a flex tape operate at lower temperatures than those on the outside edges.
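
    Under the worst-case assumption above (Joule heat leaves the trace only by conduction along the conductor to sinks at both ends), the peak mid-span rise is ΔT = I²ρL²/(8kA²). The sketch below evaluates it with assumed dimensions to show how quickly long, thin traces run hot; none of the numbers are from the tested SADA tape.

```python
# Worst-case mid-span temperature rise of a copper trace heated by
# Joule dissipation, cooled only by conduction to its two ends.
rho = 1.7e-8          # copper resistivity, ohm*m
k = 390.0             # copper thermal conductivity, W/(m K)
I = 0.5               # trace current, A (assumed)
L = 0.1               # trace length between heat sinks, m (assumed)
A = 35e-6 * 0.3e-3    # cross-section: 35 um foil x 0.3 mm width, m^2

dT = I**2 * rho * L**2 / (8 * k * A**2)
print(f"mid-span temperature rise ~ {dT:.0f} K")   # ~124 K here
```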

  13. Simulation of an advanced techniques of ion propulsion Rocket system

    NASA Astrophysics Data System (ADS)

    Bakkiyaraj, R.

    2016-07-01

    The ion propulsion rocket system is expected to become popular with the development of deuterium and argon propellant techniques and a hexagonal magnetohydrodynamic (MHD) generator, because power is generated indirectly from the ionization chamber; the design thrust is 1.2 N at 40 kW of electric power with high efficiency. The proposed work studies MHD power generation through the ionization of deuterium gas and the combination of two gaseous ion species (deuterium ions and argon ions) at the acceleration stage. The system consists of three parts: (1) a hexagonal MHD power generator fed by the ionization chamber, (2) an ion accelerator, and (3) an exhaust nozzle. An energy of about 1312 kJ/mol is required to ionize the deuterium gas. The ionized deuterium flows from the RF ionization chamber through the MHD generator to the nozzle with enhanced velocity, generating a voltage across the two pairs of MHD electrodes; thrust is then produced by mixing deuterium and argon ions at the acceleration stage. The system has been simulated in MATLAB. Comparison of the simulation results with theoretical and previous results indicates that the proposed method achieves the design thrust at 40 kW.

  14. Advances in low energy neutral atom imaging techniques

    SciTech Connect

    Scime, E.E.; Funsten, H.O.; McComas, D.J.; Moore, K.R. ); Gruntman, M. . Space Sciences Center)

    1993-01-01

    Recently proposed low energy neutral atom (LENA) imaging techniques use a collisional process to convert the low energy neutrals into ions before detection. At low energies, collisional processes limit the angular resolution and conversion efficiencies of these devices. However, if the intense ultraviolet light background can be suppressed, direct LENA detection is possible. We present results from a series of experiments designed to develop a novel filtering structure based on free-standing transmission gratings. If the grating period is sufficiently small, free-standing transmission gratings can be employed to substantially polarize ultraviolet (UV) light in the wavelength range 300 Å to 1500 Å. If a second grating is placed behind the first grating with its axis of polarization oriented at a right angle to the first's, a substantial attenuation of UV radiation is achievable. The neutrals will pass through the remaining open area of the two gratings and be detected without UV background complications. We have obtained nominal 2000 Å period (1000 Å bars with 1000 Å slits) free-standing, gold transmission gratings and measured their UV and atomic transmission characteristics. The geometric factor of a LENA imager based on this technology is comparable to that of other proposed LENA imagers. In addition, this type of imager does not distort the neutral trajectories, allowing for high angular resolution.

  15. Advanced signal processing technique for damage detection in steel tubes

    NASA Astrophysics Data System (ADS)

    Amjad, Umar; Yadav, Susheel Kumar; Dao, Cac Minh; Dao, Kiet; Kundu, Tribikram

    2016-04-01

    In recent years, ultrasonic guided waves have gained attention for reliable testing and characterization of metals and composites. Guided wave modes are excited and detected by PZT (Lead Zirconate Titanate) transducers either in transmission or reflection mode. In this study, guided waves are excited and detected in the transmission mode, and the phase change of the propagating wave modes is recorded. In most of the other studies reported in the literature, the change in the received signal strength (amplitude) is investigated with varying degrees of damage, while in this study the change in phase is correlated with the extent of damage. Feature extraction techniques are used for extracting phase and time-frequency information. The main advantage of this approach is that the bonding condition between the transducer and the specimen does not affect the phase while it can affect the strength of the recorded signal. Therefore, if the specimen is not damaged but the transducer-specimen bonding is deteriorated, then the received signal strength is altered but the phase remains the same, and thus false positive predictions for damage can be avoided.
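
    One standard way to extract the phase feature discussed above is through the analytic signal given by a Hilbert transform; the sketch below uses a synthetic waveform, since the authors' exact feature-extraction chain is not given.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10e6                                  # 10 MHz sampling (assumed)
t = np.arange(0.0, 200e-6, 1.0/fs)
recorded = np.sin(2*np.pi*200e3*t + 0.3)   # stand-in received mode

analytic = hilbert(recorded)
phase = np.unwrap(np.angle(analytic))
# A damage-induced phase shift appears as an offset between baseline
# and current phase curves, largely independent of amplitude changes
# caused by transducer-specimen bonding.
```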

  16. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.; Stebbins, J. P.; Smith, A. W.; Pullen, K. E.

    1973-01-01

    A method for the prediction of propellant-material compatibility for periods of time up to ten years is presented. Advanced sensitive measurement techniques used in the prediction method are described. These include: neutron activation analysis, radioactive tracer technique, and atomic absorption spectroscopy with a graphite tube furnace sampler. The results of laboratory tests performed to verify the prediction method are presented.

  17. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss "small-group apprenticeships (SGAs)" as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments…

  18. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research.

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    Discusses small-group apprenticeships (SGAs) as a method for introducing cell culture techniques to high school participants. Teaches cell culture practices and introduces advance imaging techniques to solve various biomedical engineering problems. Clarifies and illuminates the value of small-group laboratory apprenticeships. (Author/KHR)

  19. Incorporation of Monte-Carlo Computer Techniques into Science and Mathematics Education.

    ERIC Educational Resources Information Center

    Danesh, Iraj

    1987-01-01

    Described is a Monte-Carlo method for modeling physical systems with a computer. Also discussed are ways to incorporate Monte-Carlo simulation techniques for introductory science and mathematics teaching and also for enriching computer and simulation courses. (RH)
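
    A classroom-style example of the kind of Monte-Carlo exercise the article advocates is estimating π by random sampling; a minimal sketch follows.

```python
import random

def estimate_pi(n=100_000, seed=1):
    # Fraction of random points in the unit square that fall inside
    # the quarter circle approximates pi/4.
    rng = random.Random(seed)
    inside = sum(rng.random()**2 + rng.random()**2 <= 1.0
                 for _ in range(n))
    return 4.0 * inside / n

print(estimate_pi())   # ~3.14 for large n
```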

  20. Advancements in sensing and perception using structured lighting techniques :an LDRD final report.

    SciTech Connect

    Novick, David Keith; Padilla, Denise D.; Davidson, Patrick A. Jr.; Carlson, Jeffrey J.

    2005-09-01

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Advancements in Sensing and Perception using Structured Lighting Techniques''. There is an ever-increasing need for robust, autonomous ground vehicles for counterterrorism and defense missions. Although there has been nearly 30 years of government-sponsored research, it is undisputed that significant advancements in sensing and perception are necessary. We developed an innovative, advanced sensing technology for national security missions serving the Department of Energy, the Department of Defense, and other government agencies. The principal goal of this project was to develop an eye-safe, robust, low-cost, lightweight, 3D structured lighting sensor for use in broad daylight outdoor applications. The market for this technology is wide open due to the unavailability of such a sensor. Currently available laser scanners are slow, bulky and heavy, expensive, fragile, short-range, sensitive to vibration (highly problematic for moving platforms), and unreliable for outdoor use in bright sunlight conditions. Eye-safety issues are a primary concern for currently available laser-based sensors. Passive, stereo-imaging sensors are available for 3D sensing but suffer from several limitations: they are computationally intensive, require a lighted environment (natural or man-made light source), and don't work for many scenes or regions lacking texture or with ambiguous texture. Our approach leveraged the advanced capabilities of modern CCD camera technology and Center 6600's expertise in 3D world modeling, mapping, and analysis, using structured lighting. We have a diverse customer base for indoor mapping applications and this research extends our current technology's lifecycle and opens a new market base for outdoor 3D mapping. Applications include precision mapping, autonomous navigation, dexterous manipulation, surveillance and

  1. A Study of Computer Techniques for Music Research. Final Report.

    ERIC Educational Resources Information Center

    Lincoln, Harry B.

    Work in three areas comprised this study of computer use in thematic indexing for music research: (1) acquisition, encoding, and keypunching of data--themes of which now number about 50,000 (primarily 16th Century Italian vocal music) and serve as a test base for program development; (2) development of computer programs to process this data; and…

  2. A Study into Advanced Guidance Laws Using Computational Methods

    DTIC Science & Technology

    2011-12-01

    computing aerodynamic forces % and moments. Except where noted, all dimensions in % MKS system. % Inputs... [9] R. L. Shaw, Fighter Combat: Tactics and Maneuvering. Annapolis, MD: Naval Institute Press, 1988. [10] U. S. Shukla and P. R. Mahapatra

  3. Autonomous management of distributed information systems using evolutionary computation techniques

    NASA Astrophysics Data System (ADS)

    Oates, Martin J.

    1999-03-01

    can provide reliable and consistent performance. This paper investigates evolutionary computation techniques, comparing results from genetic algorithms, simulated annealing and hillclimbing. Significant differences in algorithm performance are found across different fitness criteria. Preliminary conclusions are that a genetic algorithm approach appears superior to hillclimbing or simulated annealing when more realistic (from a quality-of-service viewpoint) objective functions are used. Further, the genetic algorithm approach displays regions of adequate robustness to parameter variation, which is also critical from a maintained quality-of-service viewpoint.
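
    A compact sketch of the kind of comparison described, under assumed details (a toy load-balancing fitness function standing in for the paper's quality-of-service objectives): a hillclimber and a simple genetic algorithm both minimize load imbalance across servers.

      import random

      N_TASKS, N_SERVERS = 30, 5

      def fitness(assign):
          # Lower is better: load imbalance across servers.
          loads = [assign.count(s) for s in range(N_SERVERS)]
          return max(loads) - min(loads)

      def hillclimb(steps=2000):
          a = [random.randrange(N_SERVERS) for _ in range(N_TASKS)]
          for _ in range(steps):
              b = a[:]
              b[random.randrange(N_TASKS)] = random.randrange(N_SERVERS)
              if fitness(b) <= fitness(a):    # accept non-worsening moves
                  a = b
          return fitness(a)

      def ga(pop_size=40, gens=100):
          pop = [[random.randrange(N_SERVERS) for _ in range(N_TASKS)]
                 for _ in range(pop_size)]
          for _ in range(gens):
              pop.sort(key=fitness)
              parents = pop[:pop_size // 2]   # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  p1, p2 = random.sample(parents, 2)
                  cut = random.randrange(1, N_TASKS)
                  child = p1[:cut] + p2[cut:]           # one-point crossover
                  if random.random() < 0.2:             # mutation
                      child[random.randrange(N_TASKS)] = random.randrange(N_SERVERS)
                  children.append(child)
              pop = parents + children
          return fitness(min(pop, key=fitness))

      print("hillclimbing best:", hillclimb())
      print("GA best          :", ga())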

  4. Advances in Domain Mapping of Massively Parallel Scientific Computations

    SciTech Connect

    Leland, Robert W.; Hendrickson, Bruce A.

    2015-10-01

    One of the most important concerns in parallel computing is the proper distribution of workload across processors. For most scientific applications on massively parallel machines, the best approach to this distribution is to employ data parallelism; that is, to break the data structures supporting a computation into pieces and then to assign those pieces to different processors. Collectively, these partitioning and assignment tasks comprise the domain mapping problem.
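
    A small illustration of one classical domain-mapping heuristic, recursive coordinate bisection (one of many methods in this literature; the report's specific algorithms are not reproduced here): points are split along the widest coordinate so each processor receives a proportional share.

      import numpy as np

      def rcb(points, n_parts):
          # Recursive coordinate bisection: cut along the widest dimension,
          # giving each side a share proportional to its part count.
          if n_parts == 1:
              return [points]
          left_parts = n_parts // 2
          dim = int(np.argmax(points.max(axis=0) - points.min(axis=0)))
          order = np.argsort(points[:, dim])
          cut = len(points) * left_parts // n_parts
          return (rcb(points[order[:cut]], left_parts) +
                  rcb(points[order[cut:]], n_parts - left_parts))

      pts = np.random.rand(1000, 2)          # toy 2-D mesh points (assumed)
      print([len(p) for p in rcb(pts, 6)])   # sizes of the 6 subdomains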

  5. Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques

    NASA Technical Reports Server (NTRS)

    Ramers, Douglas L.

    2005-01-01

    The purpose of this project was to interpret the results of some tests that were performed earlier this year and to demonstrate a possible use of emergence in computing to solve IVHM problems. The test data used was collected with piezoelectric sensors to detect mechanical changes in structures. The project team included Dr. Doug Ramers and Dr. Abdul Jallob of the Summer Faculty Fellowship Program, Arnaldo Colon-Lopez - a student intern from the University of Puerto Rico of Turabo, and John Lassister and Bob Engberg of the Structural and Dynamics Test Group. The tests were performed by Bob Engberg to compare the performance of two types of piezoelectric (piezo) sensors, Pb(Zr(sub 1-x)Ti(sub x))O3, which we will label PZT, and Pb(Zn(sub 1/3)Nb(sub 2/3))O3-PbTiO3, which we will label SCP. The tests were conducted under varying temperature and pressure conditions. One set of tests was done by varying water pressure inside an aluminum liner covered with carbon-fiber composite layers (a cylindrical "bottle" with domed ends) and the other by varying temperatures down to cryogenic levels on some specially prepared composite panels. This report discusses the data from the pressure study. The study of the temperature results was not completed in time for this report. The particular sensing done with these piezo sensors is accomplished by the sensor generating a controlled vibration that is transmitted into the structure to which the sensor is attached, with the same sensor then responding to the induced vibration of the structure. There is a relationship between the mechanical impedance of the structure and the resulting electrical impedance produced in the piezo sensor. The impedance is also a function of the excitation frequency. Changes in the real part of the impedance signature relative to an original reference signature indicate a change in the coupled structure that could be the result of damage or strain. The water pressure tests were conducted by
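
    As a hedged illustration of the impedance-signature idea (the report's own analysis used emergent-computing methods not shown here), a common scalar damage metric compares the real part of a measured impedance sweep against a reference signature:

      import numpy as np

      def rmsd_metric(z_ref, z_meas):
          # Root-mean-square deviation of the real impedance signature;
          # larger values indicate a change in the coupled structure.
          re_r, re_m = np.real(z_ref), np.real(z_meas)
          return float(np.sqrt(np.sum((re_m - re_r) ** 2) / np.sum(re_r ** 2)))

      freqs = np.linspace(30e3, 90e3, 400)             # sweep band (assumed)
      z_ref = 100 + 20 * np.sin(freqs / 4e3)           # synthetic reference
      z_dmg = 100 + 20 * np.sin(freqs / 4e3 + 0.15)    # shifted resonances
      print(rmsd_metric(z_ref, z_dmg))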

  6. Computer modeling of a wideswath SAR concept employing multiple antenna beam formation techniques

    NASA Technical Reports Server (NTRS)

    Estes, J. M.

    1982-01-01

    A technique for wideswath synthetic aperture radar coverage was implemented in the OSS (Orbital SAR Simulation) computer programs. The OSS modifications and the implementation and simulation of the concept are described. The wideswath technique uses multiple formed antenna beams.

  7. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show the detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of the sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize by using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
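
    As background for readers new to HMMs, a minimal Viterbi decoder in Python (the paper benchmarks optimized HMMER implementations, not this naive version): it recovers the most likely hidden-state path, working in log space for numerical stability.

      import numpy as np

      def viterbi(obs, start_p, trans_p, emit_p):
          # Naive O(T * S^2) Viterbi in the log domain.
          T, S = len(obs), len(start_p)
          logp = np.full((T, S), -np.inf)
          back = np.zeros((T, S), dtype=int)
          logp[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
          for t in range(1, T):
              for s in range(S):
                  cand = logp[t - 1] + np.log(trans_p[:, s])
                  back[t, s] = int(np.argmax(cand))
                  logp[t, s] = cand[back[t, s]] + np.log(emit_p[s, obs[t]])
          path = [int(np.argmax(logp[-1]))]
          for t in range(T - 1, 0, -1):
              path.append(int(back[t, path[-1]]))
          return path[::-1]

      # Toy 2-state, 3-symbol model (parameters assumed purely for illustration)
      start = np.array([0.6, 0.4])
      trans = np.array([[0.7, 0.3], [0.4, 0.6]])
      emit  = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
      print(viterbi([0, 1, 2, 2], start, trans, emit))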

  8. OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2005-01-01

    Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new cornerstone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations

  9. A Novel Microcharacterization Technique in the Measurement of Strain and Orientation Gradient in Advanced Materials

    NASA Technical Reports Server (NTRS)

    Garmestai, H.; Harris, K.; Lourenco, L.

    1997-01-01

    Representation of morphology and evolution of the microstructure during processing and their relation to properties requires proper experimental techniques. Residual strains, lattice distortion, and texture (micro-texture) at the interface and the matrix of a layered structure or a functionally gradient material, and their variation, are among parameters important in materials characterization but hard to measure with present experimental techniques. Current techniques available to measure changes in internal material parameters (residual stress, micro-texture, microplasticity) produce results which are either qualitative or unreliable. This problem becomes even more complicated in the case of a temperature variation. These parameters affect many of the mechanical properties of advanced materials including the stress-strain relation, ductility, creep, and fatigue. A review of some novel experimental techniques using recent advances in electron microscopy is presented here to measure internal stress, (micro)texture, interfacial strength and (sub)grain formation and realignment. Two of these techniques are combined in the chamber of an Environmental Scanning Electron Microscope to measure strain and orientation gradients in advanced materials. These techniques, which include Backscattered Kikuchi Diffractometry (BKD) and Microscopic Strain Field Analysis, are used to characterize metallic and intermetallic matrix composites and superplastic materials. These techniques are compared with the more conventional x-ray diffraction and indentation techniques.

  10. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  11. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pasccci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  12. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic allocation of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.

  13. Advanced Computational Aeroacoustics Methods for Fan Noise Prediction

    NASA Technical Reports Server (NTRS)

    Envia, Edmane (Technical Monitor); Tam, Christopher

    2003-01-01

    Direct computation of fan noise is presently not possible. One of the major difficulties is the geometrical complexity of the problem. In the case of fan noise, the blade geometry is critical to the loading on the blade and hence the intensity of the radiated noise. The precise geometry must be incorporated into the computation. In computational fluid dynamics (CFD), there are two general ways to handle problems with complex geometry. One way is to use unstructured grids. The other is to use body-fitted overset grids. In the overset grid method, accurate data transfer is of utmost importance. For acoustic computation, it is not clear that the currently used data transfer methods are sufficiently accurate as not to contaminate the very small amplitude acoustic disturbances. In CFD, low order schemes are, invariably, used in conjunction with unstructured grids. However, low order schemes are known to be numerically dispersive and dissipative. Dissipative errors are extremely undesirable for acoustic wave problems. The objective of this project is to develop a high order unstructured grid Dispersion-Relation-Preserving (DRP) scheme that would minimize numerical dispersion and dissipation errors. This report contains the results of the funded portion of the project. A DRP scheme on an unstructured grid has been developed. The scheme is constructed in the wave number space. The characteristics of the scheme can be improved by the inclusion of additional constraints. Stability of the scheme has been investigated. Stability can be improved by adopting an upwinding strategy.

  14. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Computers.

    ERIC Educational Resources Information Center

    Ellis, Brenda

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 3-hour introduction to computers. The purpose is to develop the following competencies: (1) orientation to data processing; (2) use of data entry devices; (3) use of computer menus; and (4) entry of data with accuracy and…

  15. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  16. Modulation/demodulation techniques for satellite communications. Part 2: Advanced techniques. The linear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory is presented for deducing and predicting the performance of transmitter/receivers for bandwidth efficient modulations suitable for use on the linear satellite channel. The underlying principle used is the development of receiver structures based on the maximum-likelihood decision rule. The performance prediction tools, e.g., channel cutoff rate and bit error probability transfer function bounds, are applied to these modulation/demodulation techniques.

  17. Advanced combustion techniques for controlling NO sub x emissions of high altitude cruise aircraft

    NASA Technical Reports Server (NTRS)

    Rudey, R. A.; Reck, G. M.

    1976-01-01

    An array of experiments designed to explore the potential of advanced combustion techniques for controlling the emissions of aircraft into the upper atmosphere was discussed. Of particular concern are the oxides of nitrogen (NOx) emissions into the stratosphere. The experiments utilize a wide variety of approaches varying from advanced combustor concepts to fundamental flame tube experiments. Results are presented which indicate that substantial reductions in cruise NOx emissions should be achievable in future aircraft engines. A major NASA program is described which focuses the many fundamental experiments into a planned evolution and demonstration of the prevaporized-premixed combustion technique in a full-scale engine.

  18. POC-Scale Testing of an Advanced Fine Coal Dewatering Equipment/Technique

    SciTech Connect

    Karekh, B K; Tao, D; Groppo, J G

    1998-08-28

    The froth flotation technique is an effective and efficient process for recovering ultra-fine (minus 74 micron) clean coal. Economical dewatering of an ultra-fine clean coal product to a moisture level of 20% will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 45 months beginning September 30, 1994. This report discusses technical progress made during the quarter from January 1 - March 31, 1998.

  19. Analysis of computational modeling techniques for complete rotorcraft configurations

    NASA Astrophysics Data System (ADS)

    O'Brien, David M., Jr.

    Computational fluid dynamics (CFD) provides the helicopter designer with a powerful tool for identifying problematic aerodynamics. Through the use of CFD, design concepts can be analyzed in a virtual wind tunnel long before a physical model is ever created. Traditional CFD analysis tends to be a time-consuming process, where much of the effort is spent generating a high quality computational grid. Recent increases in computing power and memory have created renewed interest in alternative grid schemes such as unstructured grids, which facilitate rapid grid generation by relaxing restrictions on grid structure. Three rotor models have been incorporated into a popular fixed-wing unstructured CFD solver to increase its capability and facilitate availability to the rotorcraft community. The benefit of unstructured grid methods is demonstrated through rapid generation of high fidelity configuration models. The simplest rotor model is the steady state actuator disk approximation. By transforming the unsteady rotor problem into a steady state one, the actuator disk can provide rapid predictions of performance parameters such as lift and drag. The actuator blade and overset blade models provide a depiction of the unsteady rotor wake, but incur a larger computational cost than the actuator disk. The actuator blade model is convenient when the unsteady aerodynamic behavior needs to be investigated, but the computational cost of the overset approach is too large. The overset or chimera method allows the blade loads to be computed from first principles and therefore provides the most accurate prediction of the rotor wake for the models investigated. The physics of the flow fields generated by these models for rotor/fuselage interactions are explored, along with efficiencies and limitations of each method.
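
    To make the simplest of the three models concrete, here is a momentum-theory actuator-disk estimate in Python (a textbook relation, not the thesis's CFD implementation; the rotor numbers are assumed):

      import math

      def actuator_disk_hover(thrust, radius, rho=1.225):
          # Momentum theory: the disk accelerates air uniformly, giving
          # induced velocity v_i = sqrt(T / (2*rho*A)) and ideal power T*v_i.
          area = math.pi * radius ** 2
          v_i = math.sqrt(thrust / (2.0 * rho * area))
          return v_i, thrust * v_i

      v_i, power = actuator_disk_hover(thrust=53000.0, radius=8.2)  # assumed rotor
      print(f"induced velocity {v_i:.1f} m/s, ideal power {power / 1e3:.0f} kW")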

  20. Applications of NLP Techniques to Computer-Assisted Authoring of Test Items for Elementary Chinese

    ERIC Educational Resources Information Center

    Liu, Chao-Lin; Lin, Jen-Hsiang; Wang, Yu-Chun

    2010-01-01

    The authors report an implemented environment for computer-assisted authoring of test items and provide a brief discussion about the applications of NLP techniques for computer assisted language learning. Test items can serve as a tool for language learners to examine their competence in the target language. The authors apply techniques for…

  1. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  2. Advances on modelling of ITER scenarios: physics and computational challenges

    NASA Astrophysics Data System (ADS)

    Giruzzi, G.; Garcia, J.; Artaud, J. F.; Basiuk, V.; Decker, J.; Imbeaux, F.; Peysson, Y.; Schneider, M.

    2011-12-01

    Methods and tools for design and modelling of tokamak operation scenarios are discussed with particular application to ITER advanced scenarios. Simulations of hybrid and steady-state scenarios performed with the integrated tokamak modelling suite of codes CRONOS are presented. The advantages of a possible steady-state scenario based on cyclic operations, alternating phases of positive and negative loop voltage, with no magnetic flux consumption on average, are discussed. For regimes in which current alignment is an issue, a general method for scenario design is presented, based on the characteristics of the poloidal current density profile.

  3. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  4. Advanced imaging findings and computer-assisted surgery of suspected synovial chondromatosis in the temporomandibular joint.

    PubMed

    Hohlweg-Majert, Bettina; Metzger, Marc C; Böhm, Joachim; Muecke, Thomas; Schulze, Dirk

    2008-11-01

    Synovial chondromatosis of the joint occurs mainly in teenagers and young adults. Only 3% of these neoplasms are located in the head and neck region. Synovial chondromatosis of the temporomandibular joint is therefore a very rare disorder, and histological confirmation is required for differential diagnosis. In this case series, the outcomes of histological investigation and imaging techniques are compared. Based on clinical symptoms, five cases of suspected synovial chondromatosis of the temporomandibular joint are presented. In each of the subjects, the diagnosis was confirmed by histology. Specific imaging features for each case are described. The tomography images were compared with the histological findings. All patients demonstrated preauricular swelling, dental midline deviation, and limited mouth opening. Computer-assisted surgery was performed. Histology disclosed synovial chondromatosis of the temporomandibular joint in four cases. The other case was found to be a developmental disorder of the tympanic bone. The diagnosis of synovial chondromatosis of the temporomandibular joint can only be based on histology. Clinical symptoms are too general, and the available imaging techniques only show nonspecific tumorous destruction, infiltration, and/or residual calcified bodies, and only in advanced cases. A rare developmental disorder of the tympanic bone--persistence of the foramen of Huschke--has to be differentiated.

  5. Using Advanced Computer Vision Algorithms on Small Mobile Robots

    DTIC Science & Technology

    2006-04-20

    Lab. This classification algorithm has been used to successfully detect the license plates of automobiles in motion in real-time. While working...use in real-time. Test results are shown for a variety of environments. KEYWORDS: robotics, computer vision, car/license plate detection, SIFT...when detecting the make and model of automobiles, SIFT can be used to achieve very high detection rates at the expense of a hefty performance cost when

  6. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  7. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools to help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, first an innovative mathematical reformulation of the FTR problem is presented which dramatically improves the computational efficiency of the optimization problem. After having re-formulated the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and performance of the NDS solver is benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable and in some cases is shown to outperform the widely used CPLEX algorithms. The proposed formulation and NDS based solver is also easily parallelizable enabling further computational improvement.
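
    A toy version of the underlying allocation problem, posed as a linear program with SciPy (a stand-in for the paper's reformulation and NDS solver; the PTDF matrix, line limits, and bids are invented for illustration): maximize bid-weighted FTR awards subject to line-flow limits in both directions.

      import numpy as np
      from scipy.optimize import linprog

      # Toy 3-line, 2-bid DC network: PTDF maps FTR awards (MW) to line flows.
      ptdf = np.array([[0.33, -0.33],
                       [0.67,  0.33],
                       [0.33,  0.67]])              # assumed values
      line_limit = np.array([100.0, 150.0, 120.0])  # MW thermal limits (assumed)
      bids = np.array([30.0, 25.0])                 # $/MW bid prices (assumed)
      bid_cap = np.array([200.0, 180.0])            # requested MW (assumed)

      # Maximize welfare = bids @ x  ->  minimize -bids @ x,
      # subject to |ptdf @ x| <= line_limit and 0 <= x <= bid_cap.
      res = linprog(c=-bids,
                    A_ub=np.vstack([ptdf, -ptdf]),
                    b_ub=np.concatenate([line_limit, line_limit]),
                    bounds=[(0.0, cap) for cap in bid_cap])
      print("awards (MW):", res.x.round(1), " welfare ($):", round(-res.fun, 1))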

  8. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect

    Moore, Kevin L. Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  9. Novel Techniques for Secure Use of Public Cloud Computing Resources

    DTIC Science & Technology

    2015-09-17

    SIGCOMM Computer Communication Review, volume 43, 513–514. ACM, 2013. [61] Jeong, Ik Rae and Jeong Ok Kwon. “Analysis of some keyword search schemes in...Government’s Information Infrastructure”, 1993. URL http://govinfo.library.unt.edu/npr/library/reports/it09.html. [92] Rhee, Hyun Sook, Ik Rae Jeong

  10. Modulation/demodulation techniques for satellite communications. Part 3: Advanced techniques. The nonlinear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory for deducing and predicting the performance of transmitter/receivers for bandwidth efficient modulations suitable for use on the nonlinear satellite channel is presented. The underlying principle used throughout is the development of receiver structures based on the maximum likelihood decision rule and approximations to it. The bit error probability transfer function bounds developed in great detail in Part 4 are applied to these modulation/demodulation techniques. The effects of the various degrees of receiver mismatch are considered both theoretically and by numerous illustrative examples.

  11. Computation of the optical trapping force using an FDTD based technique.

    PubMed

    Gauthier, Robert

    2005-05-16

    The computation details related to computing the optical radiation pressure force on various objects using a 2-D grid FDTD algorithm are presented. The technique is based on propagating the electric and magnetic fields through the grid and determining the changes in the optical energy flow with and without the trap object(s) in the system. The traces displayed indicate that the optical forces and FDTD-predicted object behavior are in agreement with published experiments and with results determined through other computational techniques. We show computation results for a high- and a low-dielectric disc and a thin-walled shell. The FDTD technique for computing the light-particle force interaction may be employed in all regimes relating particle dimensions to source wavelength. The algorithm presented here can be easily extended to 3-D and include torque computation algorithms, thus providing a highly flexible and universally useable computation engine.
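
    The field-propagation core that such force computations build on can be illustrated with a minimal 1-D FDTD update in Python (normalized units; the paper's 2-D grid, trap geometry, and energy-flow bookkeeping are omitted):

      import numpy as np

      # Minimal 1-D FDTD (Yee) update for Ez/Hy: the field-propagation loop
      # that force computations via energy-flow changes build on.
      nx, nt = 400, 800
      ez = np.zeros(nx)
      hy = np.zeros(nx - 1)
      eps = np.ones(nx)
      eps[220:260] = 2.25            # a dielectric 'object' (assumed n = 1.5)
      c = 0.5                        # Courant number (stable for c <= 1 in 1-D)

      for n in range(nt):
          hy += c * (ez[1:] - ez[:-1])                   # update H from curl E
          ez[1:-1] += (c / eps[1:-1]) * (hy[1:] - hy[:-1])  # update E from curl H
          ez[50] += np.exp(-((n - 60) / 20.0) ** 2)      # soft Gaussian source

      print("peak field after propagation:", ez.max())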

  12. Comparison of techniques for approximating ocean bottom topography in a wave-refraction computer model

    NASA Technical Reports Server (NTRS)

    Poole, L. R.

    1975-01-01

    A study of the effects of using different methods for approximating bottom topography in a wave-refraction computer model was conducted. Approximation techniques involving quadratic least squares, cubic least squares, and constrained bicubic polynomial interpolation were compared for computed wave patterns and parameters in the region of Saco Bay, Maine. Although substantial local differences can be attributed to use of the different approximation techniques, results indicated that overall computed wave patterns and parameter distributions were quite similar.
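
    For concreteness, a quadratic least-squares surface fit of the kind compared in the study, in Python with assumed synthetic depth soundings (not Saco Bay data):

      import numpy as np

      def fit_quadratic_surface(x, y, z):
          # Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2.
          A = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
          coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
          return coeffs

      rng = np.random.default_rng(0)
      x, y = rng.uniform(0, 10, 200), rng.uniform(0, 10, 200)
      z = 5 + 0.4 * x - 0.1 * y + 0.02 * x * y + rng.normal(0, 0.05, 200)
      print(fit_quadratic_surface(x, y, z).round(3))   # recovers the coefficients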

  13. Application of Advanced Magnetic Resonance Imaging Techniques in Evaluation of the Lower Extremity

    PubMed Central

    Braun, Hillary J.; Dragoo, Jason L.; Hargreaves, Brian A.; Levenston, Marc E.; Gold, Garry E.

    2012-01-01

    Synopsis This article reviews current magnetic resonance imaging techniques for imaging the lower extremity, focusing on imaging of the knee, ankle, and hip joints. Recent advancements in MRI include imaging at 7 Tesla, using multiple receiver channels, T2* imaging, and metal suppression techniques, allowing more detailed visualization of complex anatomy, evaluation of morphological changes within articular cartilage, and imaging around orthopedic hardware. PMID:23622097

  14. Advances in neutron radiographic techniques and applications: a method for nondestructive testing.

    PubMed

    Berger, Harold

    2004-10-01

    A brief history of neutron radiography is presented to set the stage for a discussion of significant neutron radiographic developments and an assessment of future directions for neutron radiography. Specific advances are seen in the use of modern, high dynamic range imaging methods (image plates and flat panels) and for high contrast techniques such as phase contrast, and phase-sensitive imaging. Competition for neutron radiographic inspection may develop as these techniques offer application prospects for X-ray methods.

  15. Computational Approaches to Enhance Nanosafety and Advance Nanomedicine

    NASA Astrophysics Data System (ADS)

    Mendoza, Eduardo R.

    With the increasing use of nanoparticles in food processing, filtration/purification and consumer products, as well as the huge potential of their use in nanomedicine, a quantitative understanding of the effects of nanoparticle uptake and transport is needed. We provide examples of novel methods for modeling complex bio-nano interactions which are based on stochastic process algebras. Since model construction presumes sufficient availability of experimental data, recent developments in "nanoinformatics", an emerging discipline analogous to bioinformatics, in building an accessible information infrastructure are subsequently discussed. Both computational areas offer opportunities for Filipinos to engage in collaborative, cutting edge research in this impactful field.

  16. First Responders Guide to Computer Forensics: Advanced Topics

    DTIC Science & Technology

    2005-09-01

    server of the sender, the mail server of the receiver, and the computer that receives the email. Assume that Alice wants to send an email to her friend...pleased to meet you MAIL FROM: alice.price@alphanet.com 250 alice.price@alphanet.com... Sender ok RCPT TO: bob.doe@betanet.com 250 bob.doe...betanet.com... Sender ok DATA 354 Please start mail input From: alice.price@alphanet.com To: bob.doe@betanet.com Subject: Lunch Bob, It was good

  17. Blending Two Major Techniques in Order to Compute [Pi

    ERIC Educational Resources Information Center

    Guasti, M. Fernandez

    2005-01-01

    Three major techniques are employed to calculate [pi]. Namely, (i) the perimeter of polygons inscribed or circumscribed in a circle, (ii) calculus based methods using integral representations of inverse trigonometric functions, and (iii) modular identities derived from the transformation theory of elliptic integrals. This note presents a…
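
    As an example of technique (i), a numerically careful Archimedes-style iteration in Python (the note's own blended method is not reproduced): inscribed-polygon side lengths are doubled starting from a hexagon, using the rationalized half-angle identity to avoid cancellation.

      import math

      def archimedes_pi(doublings=25):
          # Inscribed regular polygon in a unit circle, starting from a hexagon
          # (side length 1). The rationalized form of s' = sqrt(2 - sqrt(4 - s^2))
          # avoids catastrophic cancellation as s -> 0.
          n, s = 6, 1.0
          for _ in range(doublings):
              s = s / math.sqrt(2.0 + math.sqrt(4.0 - s * s))
              n *= 2
          return n * s / 2.0      # perimeter / diameter

      print(archimedes_pi(), math.pi)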

  18. An investigation of optimization techniques for drawing computer graphics displays

    NASA Technical Reports Server (NTRS)

    Stocker, F. R.

    1979-01-01

    Techniques for reducing vector data plotting time are studied. The choice of tolerances in optimization and the application of optimization to plots produced on real-time interactive display devices are discussed. All results are developed relative to plotting packages and support hardware so that results are useful in real-world situations.

  19. Multiplexing technique for computer communications via satellite channels

    NASA Technical Reports Server (NTRS)

    Binder, R.

    1975-01-01

    Multiplexing scheme combines technique of dynamic allocation with conventional time-division multiplexing. Scheme is designed to expedite short-duration interactive or priority traffic and to delay large data transfers; as a result, each node has effective capacity of almost total channel capacity when other nodes have light traffic loads.

  20. A Computer Program for the Johnson-Neyman Technique.

    ERIC Educational Resources Information Center

    Ceurvorst, Robert W.

    1979-01-01

    The Johnson-Neyman technique is briefly reviewed, and a program for carrying out an analysis using the procedure is described. The program accommodates one independent and one dependent variable, up to 20 groups of observations, and an unlimited number of cases. (Author)
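
    A brief numeric sketch of the Johnson-Neyman idea for two regression groups (a grid-scan approximation, not the described program; the data and alpha level are assumed): find the covariate values at which the group difference becomes statistically significant.

      import numpy as np
      from scipy import stats

      def jn_region(x1, y1, x2, y2, alpha=0.05):
          # Fit a separate line to each group, then scan covariate values for
          # where |difference| / SE(difference) exceeds the critical t value.
          def fit(x, y):
              b, a = np.polyfit(x, y, 1)
              resid = y - (a + b * x)
              return a, b, resid, x.mean(), np.sum((x - x.mean()) ** 2)
          a1, b1, r1, m1, ss1 = fit(x1, y1)
          a2, b2, r2, m2, ss2 = fit(x2, y2)
          df = len(x1) + len(x2) - 4
          s2 = (np.sum(r1 ** 2) + np.sum(r2 ** 2)) / df     # pooled residual MS
          grid = np.linspace(min(x1.min(), x2.min()),
                             max(x1.max(), x2.max()), 400)
          diff = (a1 - a2) + (b1 - b2) * grid
          se = np.sqrt(s2 * (1 / len(x1) + 1 / len(x2)
                             + (grid - m1) ** 2 / ss1 + (grid - m2) ** 2 / ss2))
          tcrit = stats.t.ppf(1 - alpha / 2, df)
          return grid[np.abs(diff / se) > tcrit]            # significant covariate values

      rng = np.random.default_rng(0)
      x1 = rng.uniform(0, 10, 30); y1 = 1.0 + 0.8 * x1 + rng.normal(0, 1, 30)
      x2 = rng.uniform(0, 10, 30); y2 = 2.5 + 0.2 * x2 + rng.normal(0, 1, 30)
      sig = jn_region(x1, y1, x2, y2)
      print(f"{sig.size} of 400 grid points show a significant group difference")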

  1. Computational Efforts in Support of Advanced Coal Research

    SciTech Connect

    Suljo Linic

    2006-08-17

    The focus in this project was to employ first principles computational methods to study the underlying molecular elementary processes that govern hydrogen diffusion through Pd membranes as well as the elementary processes that govern the CO- and S-poisoning of these membranes. Our computational methodology integrated a multiscale hierarchical modeling approach, wherein a molecular understanding of the interactions between various species is gained from ab-initio quantum chemical Density Functional Theory (DFT) calculations, while a mesoscopic statistical mechanical model like Kinetic Monte Carlo is employed to predict the key macroscopic membrane properties such as permeability. The key developments are: (1) We have coupled systematically the ab initio calculations with Kinetic Monte Carlo (KMC) simulations to model hydrogen diffusion through the Pd-based membranes. The predicted tracer diffusivity of hydrogen atoms through the bulk of the Pd lattice from KMC simulations is in excellent agreement with experiments. (2) The KMC simulations of dissociative adsorption of H{sub 2} over the Pd(111) surface indicate that for thin membranes (less than 10 {micro}m thick), the diffusion of hydrogen from the surface to the first subsurface layer is rate limiting. (3) Sulfur poisons the Pd surface by altering the electronic structure of the Pd atoms in the vicinity of the S atom. The KMC simulations indicate that increasing sulfur coverage drastically reduces the hydrogen coverage on the Pd surface and hence the driving force for diffusion through the membrane.
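
    The flavor of the KMC diffusion step can be seen in a toy lattice model (square lattice, a single hop process with one rate; none of the project's DFT-derived barriers are used): the tracer diffusivity follows from the mean-square displacement, D = MSD / (4t) in two dimensions.

      import numpy as np

      def kmc_diffusivity(n_walkers=5000, n_hops=400, a=1.0, rate=1.0):
          # Each KMC event is one hop to a random nearest neighbor; with total
          # hop rate 'rate', n_hops events span an average time n_hops / rate.
          rng = np.random.default_rng(42)
          moves = np.array([[a, 0.0], [-a, 0.0], [0.0, a], [0.0, -a]])
          hops = moves[rng.integers(0, 4, size=(n_walkers, n_hops))]
          disp = hops.sum(axis=1)                       # net displacement per walker
          msd = np.mean(np.sum(disp ** 2, axis=1))
          return msd / (4.0 * n_hops / rate)

      print(kmc_diffusivity())   # expect ~ a^2 * rate / 4 = 0.25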

  2. Computer Based Instructional Techniques in Undergraduate Introductory Organic Chemistry: Rationale, Developmental Techniques, Programming Strategies and Evaluation.

    ERIC Educational Resources Information Center

    Culp, G. H.; And Others

    Over 100 interactive computer programs for use in general and organic chemistry at the University of Texas at Austin have been prepared. The rationale for the programs is based upon the belief that computer-assisted instruction (CAI) can improve education by, among other things, freeing teachers from routine tasks, measuring entry skills,…

  3. Advanced techniques for high resolution spectroscopic observations of cosmic gamma-ray sources

    NASA Technical Reports Server (NTRS)

    Matteson, J. L.; Pelling, M. R.; Peterson, L. E.; Lin, R. P.; Anderson, K. A.; Pehl, R. H.; Hurley, K. C.; Vedrenne, G.; Sniel, M.; Durouchoux, P.

    1985-01-01

    An advanced gamma-ray spectrometer that is currently in development is described. It will obtain a sensitivity of 0.0001 ph/sq cm./sec in a 6 hour balloon observation and uses innovative techniques for background reduction and source imaging.

  4. Recognizing and Managing Complexity: Teaching Advanced Programming Concepts and Techniques Using the Zebra Puzzle

    ERIC Educational Resources Information Center

    Crabtree, John; Zhang, Xihui

    2015-01-01

    Teaching advanced programming can be a challenge, especially when the students are pursuing different majors with diverse analytical and problem-solving capabilities. The purpose of this paper is to explore the efficacy of using a particular problem as a vehicle for imparting a broad set of programming concepts and problem-solving techniques. We…

  5. Real-time application of advanced three-dimensional graphic techniques for research aircraft simulation

    NASA Technical Reports Server (NTRS)

    Davis, Steven B.

    1990-01-01

    Visual aids are valuable assets to engineers for design, demonstration, and evaluation. Discussed here are a variety of advanced three-dimensional graphic techniques used to enhance the displays of test aircraft dynamics. The new software's capabilities are examined and possible future uses are considered.

  6. Fabrication of advanced electrochemical energy materials using sol-gel processing techniques

    NASA Technical Reports Server (NTRS)

    Chu, C. T.; Chu, Jay; Zheng, Haixing

    1995-01-01

    Advanced materials play an important role in electrochemical energy devices such as batteries, fuel cells, and electrochemical capacitors. They are being used as both electrodes and electrolytes. Sol-gel processing is a versatile solution technique used in the fabrication of ceramic materials with tailored stoichiometry, microstructure, and properties. The application of sol-gel processing in the fabrication of advanced electrochemical energy materials will be presented. The potential of sol-gel derived materials for electrochemical energy applications will be discussed along with some examples of successful applications. Sol-gel derived metal oxide electrode materials such as V2O5 cathodes have been demonstrated in solid-state thin film batteries; solid electrolyte materials such as beta-alumina for advanced secondary batteries were prepared by the sol-gel technique long ago; and high-surface-area transition metal compounds for capacitive energy storage applications can also be synthesized with this method.

  7. Retrospective indexing (RI) - A computer-aided indexing technique

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1990-01-01

    An account is given of a database-updating method designated 'computer-aided indexing' (CAI), which has been implemented very efficiently at NASA's Scientific and Technical Information Facility by means of retrospective indexing. Novel terms added to the NASA Thesaurus will therefore proceed directly into both the NASA-RECON aerospace information system and its portion of the ESA Information Retrieval Service, giving users full access to material thus indexed. If a given term appears in the title of a record, it is given special weight. An illustrative graphic representation of the CAI search strategy is presented.

  8. Pre- and postprocessing techniques for determining goodness of computational meshes

    NASA Technical Reports Server (NTRS)

    Oden, J. Tinsley; Westermann, T.; Bass, J. M.

    1993-01-01

    Research in error estimation, mesh conditioning, and solution enhancement for finite element, finite difference, and finite volume methods has been incorporated into AUDITOR, a modern, user-friendly code, which operates on 2D and 3D unstructured neutral files to improve the accuracy and reliability of computational results. Residual error estimation capabilities provide local and global estimates of solution error in the energy norm. Higher order results for derived quantities may be extracted from initial solutions. Within the X-MOTIF graphical user interface, extensive visualization capabilities support critical evaluation of results in linear elasticity, steady state heat transfer, and both compressible and incompressible fluid dynamics.

  9. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  10. Computer graphics techniques for aircraft EMC analysis and design

    NASA Astrophysics Data System (ADS)

    Kubina, S. J.; Bhartia, P.

    1983-10-01

    A comprehensive computer-aided system for the prediction of the potential interaction between avionics systems, with special emphasis on antenna-to-antenna coupling, is described. The methodology is applicable throughout the life cycle of an avionic/weapon system, including system upgrades and retrofits. As soon as aircraft geometry and preliminary systems information becomes available, the computer codes can be used to selectively display proposed antenna locations, emitter/receptor response characteristics, electromagnetic interference (EMI) margins and the actual ray-optical paths of maximum antenna-antenna coupling for each potential interacting antenna set. Antennas can be interactively relocated by track-ball (or joystick) and the analysis repeated at will for optimization or installation design study purposes. The codes can significantly simplify the task of the designer/analyst in effectively identifying critical interactions among an overwhelming large set of potential ones. In addition, it is an excellent design, development and analysis tool which simultaneously identifies both numerically and pictorially the EMI interdependencies among subsystems.

  11. An assessment technique for computer-socket manufacturing

    PubMed Central

    Sanders, Joan; Severance, Michael

    2015-01-01

    An assessment strategy is presented for testing the quality of carving and forming of individual computer aided manufacturing facilities. The strategy is potentially useful to facilities making sockets and companies marketing manufacturing equipment. To execute the strategy, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation, and then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file shapes. Then model shapes are compared with electronic file shapes to characterize carving performance. Socket shapes are compared with model shapes to characterize forming performance. The mean radial error (MRE), which is the average difference in radii between the two shapes being compared, provides insight into sizing quality. Inter-quartile range (IQR), the range of radial error for the best matched half of the points on the surfaces being compared, provides insight into shape quality. By determining MRE and IQR for carving and forming separately, the source(s) of socket shape error may be pinpointed. The developed strategy may provide a useful tool to the prosthetics community and industry to help identify problems and limitations in computer aided manufacturing and insight into appropriate modifications to overcome them. PMID:21938663
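
    The two quality metrics reduce to a few lines given matched radial samples of the two shapes (synthetic radii assumed; the paper's IQR is described as the radial-error range over the best-matched half of points, approximated here by the interquartile range):

      import numpy as np

      def mre_iqr(r_ref, r_test):
          # Mean radial error: average radius difference (sizing quality).
          # IQR: spread of radial error over the central half of points
          # (shape quality), approximating the "best-matched half".
          err = r_test - r_ref
          q1, q3 = np.percentile(err, [25, 75])
          return float(err.mean()), float(q3 - q1)

      rng = np.random.default_rng(3)
      r_file = 45.0 + rng.normal(0, 0.5, 5000)            # electronic-file radii (mm)
      r_sock = r_file + 0.8 + rng.normal(0, 0.3, 5000)    # oversized socket
      print("MRE %.2f mm, IQR %.2f mm" % mre_iqr(r_file, r_sock))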

  12. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study: In this study, based on scenarios of great earthquakes along the Nankai trough, we aim at estimating the run-up and the high-accuracy inundation process of tsunami in coastal areas including rivers. Using a practical tsunami analytical model, and taking into account the characteristics of detailed topography, land use, and climate change in a realistic present and expected future environment, we examined the run-up and tsunami inundation process. Using these results we estimated the damage due to tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we provide, in order to mitigate casualties, contents of disaster risk information displayed in a tsunami hazard and risk map. 2. Creating a tsunami hazard and risk map: From the analytical and practical tsunami model (a long-wave approximated model) and the high-resolution topography (5 m), including detailed data of shoreline, rivers, buildings and houses, we present an advanced analysis of tsunami inundation considering the land use. Based on the results of tsunami inundation and its analysis, it is possible to draw a tsunami hazard and risk map with information on human casualty, building damage estimation, drift of vehicles, etc. 3. Contents of disaster prevention information: To improve the hazard, risk and evacuation information distribution, it is necessary to follow three steps. (1) Provide basic information such as tsunami attack information, areas and routes for evacuation, and locations of tsunami evacuation facilities. (2) Provide as additional information the time when inundation starts, the actual results of inundation, locations of facilities with hazardous materials, and the presence or absence of public facilities and underground areas that require evacuation. (3) Provide information to support disaster response such as infrastructure and traffic network damage prediction

  13. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  14. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built up for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation) including their propagation over the image sequence and the evaluation via user-defined feature extractors as well as methods to assess the object's movement conspicuity. In this fifth part in an annual series at the SPIE conference in Orlando, this paper presents the enhancements over the recent year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods fathom the correlations between image processing and camouflage assessment. A novel algorithm is presented based on template matching to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement to a camouflage effect in different environments. As the results show, the presented methods contribute to a significant benefit in the field of camouflage assessment.
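
    One plausible building block for such a structure-based measure is normalized cross-correlation of an object template against the scene (a generic sketch, not the CART algorithm itself): a low peak correlation away from the object suggests its structure stands out, a high one that it blends in.

      import numpy as np

      def ncc_map(image, template):
          # Brute-force normalized cross-correlation; out[i, j] near 1 means
          # the local patch closely matches the template's structure.
          th, tw = template.shape
          t = (template - template.mean()) / (template.std() + 1e-12)
          rows = image.shape[0] - th + 1
          cols = image.shape[1] - tw + 1
          out = np.empty((rows, cols))
          for i in range(rows):
              for j in range(cols):
                  p = image[i:i + th, j:j + tw]
                  p = (p - p.mean()) / (p.std() + 1e-12)
                  out[i, j] = float(np.mean(p * t))
          return out

      rng = np.random.default_rng(7)
      scene = rng.normal(0, 1, (64, 64))
      target = scene[20:28, 30:38].copy()     # template cut from the scene
      print(ncc_map(scene, target).max())     # ~1.0 at the true location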

  15. Computer-Aided Techniques for Providing Operator Performance Measures

    DTIC Science & Technology

    1974-12-01

    the researcher’s ingenuity and the time he has available for the study, and (3) the research process and all associated manual effort must be repeated...ever need to see during the entire mission, or (2) allow various CRT "pages" to be manually selected. The first technique is objectionable due to the...update the regression. Four candidate update functions are generated for each maneuver sector.

  16. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.

  17. Research into display sharing techniques for distributed computing environments

    NASA Technical Reports Server (NTRS)

    Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.

    1990-01-01

    The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provided flexibility for the Mission Control Center Upgrade (MCCU) operational consideration, the prototype implementation can be the baseline for a production Display Sharing implementation. To facilitate the process the following discussions are presented: Theory of operation; System of architecture; Using the prototype; Software description; Research tools; Prototype evaluation; and Outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs to facilitate the user's access to Display Sharing as host machine.

  18. Microplate based biosensing with a computer screen aided technique.

    PubMed

    Filippini, Daniel; Andersson, Tony P M; Svensson, Samuel P S; Lundström, Ingemar

    2003-10-30

    Melanophores, dark pigment cells from the frog Xenopus laevis, have the ability to change light absorbance upon stimulation by different biological agents. Hormone exposure (e.g. melatonin or alpha-melanocyte stimulating hormone) has been used here as a reversible stimulus to test a new compact microplate reading platform. As an application, the detection of the asthma drug formoterol in blood plasma samples is demonstrated. The present system utilizes a computer screen as a (programmable) large-area light source and a standard web camera as the recording medium, enabling even kinetic microplate reading with a versatile and broadly available platform that suffices to evaluate numerous bioassays. These possibilities are especially advantageous in point-of-care testing or self-testing applications, compared with highly dedicated and comparatively expensive commercial systems.

  19. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.

  20. Viscous, resistive magnetohydrodynamic stability computed by spectral techniques

    PubMed Central

    Dahlburg, R. B.; Zang, T. A.; Montgomery, D.; Hussaini, M. Y.

    1983-01-01

    Expansions in Chebyshev polynomials are used to study the linear stability of one-dimensional magnetohydrodynamic quasiequilibria, in the presence of finite resistivity and viscosity. The method is modeled on the one used by Orszag in accurate computation of solutions of the Orr-Sommerfeld equation. Two Reynolds-like numbers involving Alfvén speeds, length scales, kinematic viscosity, and magnetic diffusivity govern the stability boundaries, which are determined by the geometric mean of the two Reynolds-like numbers. Marginal stability curves, growth rates versus Reynolds-like numbers, and growth rates versus parallel wave numbers are exhibited. A numerical result that appears general is that instability has been found to be associated with inflection points in the current profile, though no general analytical proof has emerged. It is possible that nonlinear subcritical three-dimensional instabilities may exist, similar to those in Poiseuille and Couette flow. PMID:16593375
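
    The Chebyshev collocation machinery behind such computations is compact. Below is a hedged sketch: the standard differentiation-matrix construction (after Trefethen's well-known recipe) applied to a simple model eigenproblem, u'' = λu with u(±1) = 0, rather than to the full viscous, resistive MHD system, whose coupled velocity and magnetic perturbations use the same discretization.

        import numpy as np

        def cheb(N):
            """Chebyshev differentiation matrix D and grid x on [-1, 1]."""
            if N == 0:
                return np.zeros((1, 1)), np.array([1.0])
            x = np.cos(np.pi * np.arange(N + 1) / N)
            c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
            X = np.tile(x, (N + 1, 1)).T
            D = np.outer(c, 1.0 / c) / (X - X.T + np.eye(N + 1))
            D -= np.diag(D.sum(axis=1))       # negative-sum trick for the diagonal
            return D, x

        D, x = cheb(32)
        D2 = (D @ D)[1:-1, 1:-1]              # Dirichlet BCs: drop boundary rows/cols
        lam = np.sort(np.linalg.eigvals(D2).real)
        print(lam[-4:])                       # approximates -(k*pi/2)**2 for k = 4..1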

  1. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  2. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    SciTech Connect

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  3. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.

    1972-01-01

    The search for advanced measurement techniques for determining the long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. From the existing techniques, selections were made of those that could meet, or could be modified to meet, the requirements; for others, refinements or changes were recommended. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of the new technique, volatile metal chelate analysis, which was fully demonstrated and rivals neutron activation analysis in sensitivity and specificity.

  4. Computing aerodynamic sound using advanced statistical turbulence theories

    NASA Technical Reports Server (NTRS)

    Hecht, A. M.; Teske, M. E.; Bilanin, A. J.

    1981-01-01

    It is noted that the calculation of turbulence-generated aerodynamic sound requires knowledge of the spatial and temporal variation of $Q_{ij}(\xi_k, \tau)$, the two-point, two-time turbulent velocity correlations. A technique is presented to obtain an approximate form of these correlations based on closure of the Reynolds stress equations by modeling of higher order terms. The governing equations for $Q_{ij}$ are first developed for a general flow. The case of homogeneous, stationary turbulence in a unidirectional constant shear mean flow is then assumed. The required closure form for $Q_{ij}$ is selected which is capable of qualitatively reproducing experimentally observed behavior. This form contains separation-time-dependent scale factors as parameters and depends explicitly on spatial separation. The approximate forms of $Q_{ij}$ are used in the differential equations and integral moments are taken over the spatial domain. The velocity correlations are used in the Lighthill theory of aerodynamic sound by assuming normal joint probability.
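
    The correlation itself is simple to state numerically. As a hedged, purely illustrative aid (the paper's treatment is analytical), the sketch below estimates it for one velocity component of a synthetic homogeneous, stationary field by averaging over space and time.

        import numpy as np

        # Estimate Q(xi, tau) = <u(x, t) u(x + xi, t + tau)> for a 1-D field
        # sampled as u[t, x]; white noise is used here, so Q should vanish
        # except at zero separation.
        rng = np.random.default_rng(0)
        u = rng.standard_normal((2000, 256))

        def Q(u, xi, tau):
            a = u[:u.shape[0] - tau, :u.shape[1] - xi]
            b = u[tau:, xi:]
            return (a * b).mean()

        print(Q(u, 0, 0))   # ~1.0: the variance
        print(Q(u, 5, 3))   # ~0.0: white noise decorrelates immediately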

  5. Computer simulation of the thermal environment of large-scale integrated circuits - Computer time-saving techniques.

    NASA Technical Reports Server (NTRS)

    Thompson, R. R.; Blum, H. A.

    1971-01-01

    This paper is concerned with the computer costs of both the steady-state and transient thermal responses of large-scale integrated circuits (LSI) when metal is present within the substrate. For the more cost-sensitive transient case, the computer time saved by an extrapolation technique is weighed against the accompanying loss of accuracy. This approach could be useful for design-cost planning.
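
    The abstract does not state which extrapolation scheme is used; as a hedged illustration, Richardson extrapolation in the time step is one way such a time-versus-accuracy trade can work: two cheap low-resolution transient runs are combined into a higher-accuracy estimate.

        import numpy as np

        def transient_temp(dt, t_end, T0=25.0, tau=5.0, T_inf=85.0):
            """Explicit Euler for a lumped thermal node: dT/dt = (T_inf - T)/tau."""
            T, t = T0, 0.0
            while t < t_end - 1e-12:
                T += dt * (T_inf - T) / tau
                t += dt
            return T

        coarse = transient_temp(dt=0.5, t_end=10.0)
        fine = transient_temp(dt=0.25, t_end=10.0)
        richardson = 2.0 * fine - coarse        # cancels the O(dt) error term
        exact = 85.0 + (25.0 - 85.0) * np.exp(-10.0 / 5.0)
        print(coarse, fine, richardson, exact)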

  6. Advances and perspectives in lung cancer imaging using multidetector row computed tomography.

    PubMed

    Coche, Emmanuel

    2012-10-01

    The introduction of multidetector row computed tomography (CT) into clinical practice has revolutionized many aspects of the clinical work-up. Lung cancer imaging has benefited from various breakthroughs in computing technology, with advances in the field of lung cancer detection, tissue characterization, lung cancer staging and response to therapy. Our paper discusses the problems of radiation, image visualization and CT examination comparison. It also reviews the most significant advances in lung cancer imaging and highlights the emerging clinical applications that use state of the art CT technology in the field of lung cancer diagnosis and follow-up.

  7. Parallel-META 2.0: enhanced metagenomic data analysis with functional annotation, high performance computing and advanced visualization.

    PubMed

    Su, Xiaoquan; Pan, Weihua; Song, Baoxing; Xu, Jian; Ning, Kang

    2014-01-01

    The metagenomic method directly sequences and analyses genome information from microbial communities. The main computational tasks for metagenomic analyses include taxonomical and functional structure analysis for all genomes in a microbial community (also referred to as a metagenomic sample). With the advancement of Next Generation Sequencing (NGS) techniques, the number of metagenomic samples and the data size for each sample are increasing rapidly. Current metagenomic analysis is both data- and computation-intensive, especially when there are many species in a metagenomic sample, each with a large number of sequences. As such, metagenomic analyses require extensive computational power, and the increasing analytical requirements further augment the challenges for computational analysis. In this work, we have proposed Parallel-META 2.0, a metagenomic analysis software package, to cope with such needs for efficient and fast analyses of taxonomical and functional structures for microbial communities. Parallel-META 2.0 is an extended and improved version of Parallel-META 1.0, which enhances the taxonomical analysis using multiple databases, improves computation efficiency by optimized parallel computing, and supports interactive visualization of results in multiple views. Furthermore, it enables functional analysis for metagenomic samples including short-read assembly, gene prediction and functional annotation. Therefore, it could provide accurate taxonomical and functional analyses of metagenomic samples in a high-throughput manner and on a large scale.

  8. Advances in atmospheric light scattering theory and remote-sensing techniques

    NASA Astrophysics Data System (ADS)

    Videen, Gorden; Sun, Wenbo; Gong, Wei

    2017-02-01

    This issue focuses especially on characterizing particles in the Earth-atmosphere system. The significant role of aerosol particles in this system was recognized in the mid-1970s [1]. Since that time, our appreciation for the role they play has only increased. It has been and continues to be one of the greatest unknown factors in the Earth-atmosphere system as evidenced by the most recent Intergovernmental Panel on Climate Change (IPCC) assessments [2]. With increased computational capabilities, in terms of both advanced algorithms and in brute-force computational power, more researchers have the tools available to address different aspects of the role of aerosols in the atmosphere. In this issue, we focus on recent advances in this topical area, especially the role of light scattering and remote sensing. This issue follows on the heels of four previous topical issues on this subject matter that have graced the pages of this journal [3-6].

  9. Geophysical outlook. Part IV. New vector super computers promote seismic advancements

    SciTech Connect

    Nelson, H.R. Jr.

    1982-01-01

    Some major oil companies are beginning to test the use of vector computers to process the huge volumes of seismic data acquired by modern prospecting techniques. To take advantage of the parallel-processing techniques offered by the vector mode of analysis, users must completely restructure the seismic-data processing packages. The most important application of vector computers, to date, has been in numerical reservoir modeling.

  10. Advanced entry guidance algorithm with landing footprint computation

    NASA Astrophysics Data System (ADS)

    Leavitt, James Aaron

    -determined angle of attack profile. The method is also capable of producing orbital footprints using an automatically-generated set of angle of attack profiles of varying range, with the lowest profile designed for near-maximum range in the absence of an active heat load constraint. The accuracy of the footprint method is demonstrated by direct comparison with footprints computed independently by an optimization program.

  11. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  12. Recent advances in computational fluid dynamics relevant to the modelling of pesticide flow on leaf surfaces.

    PubMed

    Glass, C Richard; Walters, Keith F A; Gaskell, Philip H; Lee, Yeaw C; Thompson, Harvey M; Emerson, David R; Gu, Xiao-Jun

    2010-01-01

    Increasing societal and governmental concern about the worldwide use of chemical pesticides is now providing strong drivers towards maximising the efficiency of pesticide utilisation and the development of alternative control techniques. There is growing recognition that the ultimate goal of achieving efficient and sustainable pesticide usage will require greater understanding of the fluid mechanical mechanisms governing the delivery to, and spreading of, pesticide droplets on target surfaces such as leaves. This has led to increasing use of computational fluid dynamics (CFD) as an important component of efficient process design with regard to pesticide delivery to the leaf surface. This perspective highlights recent advances in CFD methods for droplet spreading and film flows, which have the potential to provide accurate, predictive models for pesticide flow on leaf surfaces, and which can take account of each of the key influences of surface topography and chemistry, initial spray deposition conditions, evaporation and multiple droplet spreading interactions. The mathematical framework of these CFD methods is described briefly, and a series of new flow simulation results relevant to pesticide flows over foliage is provided. The potential benefits of employing CFD for practical process design are also discussed briefly.

  13. Advances in automated deception detection in text-based computer-mediated communication

    NASA Astrophysics Data System (ADS)

    Adkins, Mark; Twitchell, Douglas P.; Burgoon, Judee K.; Nunamaker, Jay F., Jr.

    2004-08-01

    The Internet has provided criminals, terrorists, spies, and other threats to national security a means of communication. At the same time it also provides for the possibility of detecting and tracking their deceptive communication. Recent advances in natural language processing, machine learning and deception research have created an environment where automated and semi-automated deception detection of text-based computer-mediated communication (CMC, e.g. email, chat, instant messaging) is a reachable goal. This paper reviews two methods for discriminating between deceptive and non-deceptive messages in CMC. First, Document Feature Mining uses document features or cues in CMC messages combined with machine learning techniques to classify messages according to their deceptive potential. The method, which is most useful in asynchronous applications, also allows for the visualization of potential deception cues in CMC messages. Second, Speech Act Profiling, a method for quantifying and visualizing synchronous CMC, has shown promise in aiding deception detection. The methods may be combined and are intended to be a part of a suite of tools for automating deception detection.
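
    A minimal sketch of the Document Feature Mining idea: extract lexical features from messages and train a classifier. Here scikit-learn and a four-message toy corpus stand in for the authors' tooling and data (assumptions, not the paper's method); the cue set described in the paper is richer than raw n-grams.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy corpus; labels 1 = deceptive, 0 = truthful (illustrative only).
        messages = ["i was definitely at home all night, honestly",
                    "we met at noon and reviewed the budget",
                    "to be honest i never saw that file, i swear",
                    "the report was sent on tuesday morning"]
        labels = [1, 0, 1, 0]

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                            LogisticRegression())
        clf.fit(messages, labels)
        print(clf.predict(["honestly i was never there"]))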

  14. Investigating a hybrid perturbation-Galerkin technique using computer algebra

    NASA Technical Reports Server (NTRS)

    Andersen, Carl M.; Geer, James F.

    1988-01-01

    A two-step hybrid perturbation-Galerkin method is presented for the solution of a variety of differential equations type problems which involve a scalar parameter. The resulting (approximate) solution has the form of a sum where each term consists of the product of two functions. The first function is a function of the independent field variable(s) x, and the second is a function of the parameter lambda. In step one the functions of x are determined by forming a perturbation expansion in lambda. In step two the functions of lambda are determined through the use of the classical Bubnov-Galerkin method. The resulting hybrid method has the potential of overcoming some of the drawbacks of the perturbation and Bubnov-Galerkin methods applied separately, while combining some of the good features of each. In particular, the results can be useful well beyond the radius of convergence associated with the perturbation expansion. The hybrid method is applied with the aid of computer algebra to a simple two-point boundary value problem where the radius of convergence is finite and to a quantum eigenvalue problem where the radius of convergence is zero. For both problems the hybrid method apparently converges for an infinite range of the parameter lambda. The results obtained from the hybrid method are compared with approximate solutions obtained by other methods, and the applicability of the hybrid method to broader problem areas is discussed.
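
    In the notation suggested by the abstract (the symbols below are assumed for illustration, not taken from the paper), the two-step construction can be written compactly:

        \[
        \text{Step 1 (perturbation):}\qquad u(x;\lambda) \approx \sum_{i=0}^{N} \lambda^{i}\, u_i(x),
        \]
        \[
        \text{Step 2 (Galerkin):}\qquad \tilde u(x;\lambda) = \sum_{i=0}^{N} \delta_i(\lambda)\, u_i(x),
        \qquad \big\langle R(\tilde u;\lambda),\, u_j \big\rangle = 0, \quad j = 0,\dots,N,
        \]

    where $R$ is the residual of the governing equation. Replacing the fixed coefficients $\lambda^{i}$ with computed amplitudes $\delta_i(\lambda)$ is what allows the approximation to remain useful beyond the radius of convergence of the perturbation series.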

  15. Computational Modeling and High Performance Computing in Advanced Materials Processing, Synthesis, and Design

    DTIC Science & Technology

    2014-12-07

    Research efforts in this project focused on the synergistic coupling of computational materials science and mechanics of hybrid and lightweight polymeric composite structures, including atomistic modeling of polymer nanocomposite systems.

  16. A technique for computation of star magnitudes relative to an optical sensor

    NASA Technical Reports Server (NTRS)

    Rhoads, J. W.

    1972-01-01

    The theory and techniques used to compute star magnitudes relative to any optical detector (such as the Mariner Mars 1971 Canopus star tracker) are described. Results are given relative to various star detectors.
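
    As a hedged sketch of the underlying computation (the flux and response curves below are invented stand-ins for real sensor and stellar data), an instrumental magnitude follows from integrating the stellar flux against the detector's spectral response and referencing a zero-point source:

        import numpy as np

        # m = -2.5 log10( integral F(lam) R(lam) dlam / integral F_ref(lam) R(lam) dlam )
        lam = np.linspace(400e-9, 900e-9, 500)              # wavelength grid [m]
        R = np.exp(-0.5 * ((lam - 650e-9) / 80e-9) ** 2)    # assumed detector response
        F_star = 1.2e-12 * (lam / 650e-9) ** -1.5           # assumed stellar flux
        F_ref = 1.0e-12 * np.ones_like(lam)                 # assumed zero-point flux

        signal = np.trapz(F_star * R, lam)
        zero_point = np.trapz(F_ref * R, lam)
        m = -2.5 * np.log10(signal / zero_point)
        print(f"instrumental magnitude: {m:.3f}")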

  17. Computer Modeling of Microbiological Experiments in the Teaching Laboratory: Animation Techniques.

    ERIC Educational Resources Information Center

    Tritz, Gerald J.

    1987-01-01

    Discusses the use of computer assisted instruction in the medical education program of the Kirksville College of Osteopathic Medicine (Missouri). Describes the animation techniques used in a series of simulations for microbiology. (TW)

  18. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should be used to provide a tool for designing better aerospace vehicles while at the same time reducing development costs, by performing computations using Navier-Stokes equation solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  19. Techniques to derive geometries for image-based Eulerian computations

    PubMed Central

    Dillard, Seth; Buchholz, James; Vigmostad, Sarah; Kim, Hyunggun; Udaykumar, H.S.

    2014-01-01

    Purpose The performance of three frequently used level set-based segmentation methods is examined for the purpose of defining features and boundary conditions for image-based Eulerian fluid and solid mechanics models. The focus of the evaluation is to identify an approach that produces the best geometric representation from a computational fluid/solid modeling point of view. In particular, extraction of geometries from a wide variety of imaging modalities and noise intensities, to supply to an immersed boundary approach, is targeted. Design/methodology/approach Two- and three-dimensional images, acquired from optical, X-ray CT, and ultrasound imaging modalities, are segmented with active contours, k-means, and adaptive clustering methods. Segmentation contours are converted to level sets and smoothed as necessary for use in fluid/solid simulations. Results produced by the three approaches are compared visually and with contrast ratio, signal-to-noise ratio, and contrast-to-noise ratio measures. Findings While the active contours method possesses built-in smoothing and regularization and produces continuous contours, the clustering methods (k-means and adaptive clustering) produce discrete (pixelated) contours that require smoothing using speckle-reducing anisotropic diffusion (SRAD). Thus, for images with high contrast and low to moderate noise, active contours are generally preferable. However, adaptive clustering is found to be far superior to the other two methods for images possessing high levels of noise and global intensity variations, due to its more sophisticated use of local pixel/voxel intensity statistics. Originality/value It is often difficult to know a priori which segmentation will perform best for a given image type, particularly when geometric modeling is the ultimate goal. This work offers insight to the algorithm selection process, as well as outlining a practical framework for generating useful geometric surfaces in an Eulerian setting. PMID
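
    Of the three methods compared, k-means intensity clustering is the most compact to illustrate. The sketch below (plain NumPy on a synthetic phantom; real use would follow with smoothing such as SRAD, as the paper notes for the clustering methods) shows the discrete labeling such a method produces.

        import numpy as np

        def kmeans_segment(img, k=2, iters=20, seed=0):
            """1-D k-means on pixel intensities; returns a label image and centers."""
            rng = np.random.default_rng(seed)
            vals = img.ravel()
            centers = rng.choice(vals, size=k, replace=False).astype(float)
            for _ in range(iters):
                labels = np.argmin(np.abs(vals[:, None] - centers[None, :]), axis=1)
                for c in range(k):
                    if np.any(labels == c):
                        centers[c] = vals[labels == c].mean()
            return labels.reshape(img.shape), centers

        rng = np.random.default_rng(1)
        img = np.zeros((64, 64))
        img[20:44, 20:44] = 1.0                       # bright square "object"
        img += 0.2 * rng.standard_normal(img.shape)   # additive noise
        labels, centers = kmeans_segment(img, k=2)
        print(np.sort(centers))                       # ~[0.0, 1.0]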

  20. Suspended sediment modeling using genetic programming and soft computing techniques

    NASA Astrophysics Data System (ADS)

    Kisi, Ozgur; Dalir, Ali Hosseinzadeh; Cimen, Mesut; Shiri, Jalal

    2012-07-01

    Modeling suspended sediment load is an important factor in water resources engineering, as it crucially affects the design and management of water resources structures. In this study the genetic programming (GP) technique was applied to estimating the daily suspended sediment load at two stations on the Cumberland River in the U.S. Daily flow and sediment data from 1972 to 1989 were used to train and test the applied genetic programming models. The effect of various GP operators on sediment load estimation was investigated, and the optimal fitness function, operator functions, linking function and learning algorithm were obtained for modeling daily suspended sediment. The GP estimates were compared with those of the Adaptive Neuro-Fuzzy Inference System (ANFIS), Artificial Neural Networks (ANNs) and Support Vector Machine (SVM), in terms of coefficient of determination, mean absolute error, coefficient of residual mass and variance accounted for. The comparison results indicated that GP is superior to the ANFIS, ANN and SVM models in estimating daily suspended sediment load.
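
    As a hedged sketch of the GP approach (the gplearn package and the synthetic rating-curve data below are stand-ins; the study's tool, operators, and Cumberland River data are not reproduced here), symbolic regression evolves an expression mapping flow to sediment load:

        import numpy as np
        from gplearn.genetic import SymbolicRegressor   # assumed GP library

        rng = np.random.default_rng(0)
        Q = rng.uniform(50.0, 500.0, 300)                # synthetic daily flow
        S = 0.02 * Q ** 1.6 + rng.normal(0.0, 5.0, 300)  # synthetic sediment load

        gp = SymbolicRegressor(population_size=500, generations=20,
                               function_set=('add', 'sub', 'mul', 'div'),
                               random_state=0)
        gp.fit(Q.reshape(-1, 1), S)
        print(gp._program)                               # best evolved expression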

  1. Unveiling Prolyl Oligopeptidase Ligand Migration by Comprehensive Computational Techniques

    PubMed Central

    Kotev, Martin; Lecina, Daniel; Tarragó, Teresa; Giralt, Ernest; Guallar, Víctor

    2015-01-01

    Prolyl oligopeptidase (POP) is a large 80 kDa protease, which cleaves oligopeptides at the C-terminal side of proline residues and constitutes an important pharmaceutical target. Despite the existence of several crystallographic structures, there is an open debate about migration (entrance and exit) pathways for ligands, and their coupling with protein dynamics. Recent studies have shown the capabilities of molecular dynamics and classical force fields in describing spontaneous binding events and nonbiased ligand migration pathways. Due to POP’s size and to the buried nature of its active site, an exhaustive sampling by means of conventional long enough molecular dynamics trajectories is still a nearly impossible task. Such a level of sampling, however, is possible with the breakthrough protein energy landscape exploration technique. Here, we present an exhaustive sampling of POP with a known inhibitor, Z-pro-prolinal. In >3000 trajectories Z-pro-prolinal explores all the accessible surface area, showing multiple entrance events into the large internal cavity through the pore in the β-propeller domain. Moreover, we modeled a natural substrate binding and product release by predicting the entrance of an undecapeptide substrate, followed by manual active site cleavage and nonbiased exit of one of the products (a dipeptide). The product exit shows preference from a flexible 18-amino acid residues loop, pointing to an overall mechanism where entrance and exit occur in different sites. PMID:25564858

  2. Minimally invasive (endoscopic-computer assisted) surgery: Technique and review

    PubMed Central

    Kumar, Anand; Yadav, Nirma; Singh, Shipra; Chauhan, Neha

    2016-01-01

    Endoscopic or minimally invasive surgery, popularly known as keyhole surgery, is a medical procedure in which an endoscope (a camera) is used; it has gained broad acceptance and popularity in several surgical specialties and has raised the standard of care. Oral and maxillofacial surgery is a modern discipline in the field of dentistry in which endoscopy has developed, is widely used in surgeries, and is rapidly gaining importance. The use of different visual and standard instruments, such as laparoscopic and endoscopic instruments and high-powered magnification devices, has allowed physicians to decrease the morbidity of many surgical procedures by eliminating the need for a large surgical incision. Minimally invasive techniques have evolved through the development of surgical microscopes equipped with a camera to obtain visual images for maxillofacial surgeries, endodontic procedures, and periodontal surgical procedures. Nevertheless, current experience and a review of the literature suggest that the use of endoscopes, as in other minimally invasive methods, may permit complicated surgeries with fewer complications, for example in the reconstruction of facial fractures through smaller incisions with less extensive exposure. PMID:28299251

  3. [Surgical reconstruction of maxillary defects using computer-assisted techniques].

    PubMed

    Zhang, W B; Yu, Y; Wang, Y; Liu, X J; Mao, C; Guo, C B; Yu, G Y; Peng, X

    2017-02-18

    The maxilla is the most important bony support of the mid-face skeleton and is critical for both esthetics and function. Maxillary defects resulting from tumor resection can cause severe functional and cosmetic deformities, and maxillary reconstruction presents a great challenge for oral and maxillofacial surgeons. Nowadays, vascularized composite bone flap transfer is widely used for functional maxillary reconstruction. In the last decade, we have performed comprehensive research on functional maxillary reconstruction with the free fibula flap and reported excellent functional and acceptable esthetic results. However, this experience-based clinical procedure still presents some problems in accuracy and efficiency. In recent years, computer-assisted techniques have become widely used in oral and maxillofacial surgery, and we have performed a series of studies on maxillary reconstruction with computer-assisted techniques. The computer-assisted techniques used for maxillary reconstruction mainly include: (1) three-dimensional (3D) reconstruction and tumor mapping, providing a 3D view of the maxillary tumor and adjacent structures and helping to make the diagnosis of the maxillary tumor accurate and objective; (2) virtual planning, simulating tumor resection and maxillectomy as well as fibula reconstruction on the computer in order to make an ideal surgical plan; (3) 3D printing, producing a stereo model for pre-bending an individualized titanium mesh and providing templates or cutting guides for the surgery; (4) surgical navigation, the bridge between the virtual plan and the real surgery, confirming the virtual plan during the surgery and guaranteeing accuracy; (5) computer-assisted analysis and evaluation, making a quantitative and objective assessment of the final result and evaluating the outcome. We also performed a series of studies to evaluate the application of computer-assisted techniques for maxillary reconstruction, including: (1) 3D tumor mapping technique for accurate

  4. Advance development of a technique for characterizing the thermomechanical properties of thermally stable polymers

    NASA Technical Reports Server (NTRS)

    Gillham, J. K.; Stadnicki, S. J.; Hazony, Y.

    1974-01-01

    The torsional braid experiment has been interfaced with a centralized hierarchical computing system for data acquisition and data processing. Such a system, when matched by the appropriate upgrading of the monitoring techniques, provides high resolution thermomechanical spectra of rigidity and damping, and their derivatives with respect to temperature.

  5. Recent Advances in Techniques for Starch Esters and the Applications: A Review

    PubMed Central

    Hong, Jing; Zeng, Xin-An; Brennan, Charles S.; Brennan, Margaret; Han, Zhong

    2016-01-01

    Esterification is one of the most important methods to alter the structure of starch granules and improve its applications. Conventionally, starch esters are prepared by conventional or dual modification techniques, which have the disadvantages of being expensive, requiring reagent overdoses, and being time-consuming. In addition, the degree of substitution (DS) is often considered the primary factor in view of its contribution to estimating the substituted groups of starch esters. In order to improve the detection accuracy and production efficiency, different detection techniques, including titration, nuclear magnetic resonance (NMR), Fourier transform infrared spectroscopy (FT-IR), thermal gravimetric analysis/infrared spectroscopy (TGA/IR) and headspace gas chromatography (HS-GC), have been developed for DS. This paper gives a comprehensive overview of the recent advances in DS analysis and starch esterification techniques. Additionally, the advantages, limitations, some perspectives on future trends of these techniques and the applications of their derivatives in the food industry are also presented. PMID:28231145

  6. Advances in the surface modification techniques of bone-related implants for last 10 years

    PubMed Central

    Qiu, Zhi-Ye; Chen, Cen; Wang, Xiu-Mei; Lee, In-Seop

    2014-01-01

    At the time of implanting bone-related implants into the human body, a variety of biological responses to the material surface occur with respect to surface chemistry and physical state. The commonly used biomaterials (e.g. titanium and its alloys, Co–Cr alloy, stainless steel, polyetheretherketone, ultra-high molecular weight polyethylene and various calcium phosphates) have many drawbacks, such as lack of biocompatibility and improper mechanical properties. As surface modification is a very promising technology to overcome such problems, a variety of surface modification techniques have been investigated. This review paper covers recent advances in surface modification techniques for bone-related materials, including physicochemical coating, radiation grafting, plasma surface engineering, ion beam processing and surface patterning techniques. The contents are organized by technique type with respect to applicable materials, and typical examples are also described. PMID:26816626

  7. Recent Advances in Techniques for Starch Esters and the Applications: A Review.

    PubMed

    Hong, Jing; Zeng, Xin-An; Brennan, Charles S; Brennan, Margaret; Han, Zhong

    2016-07-09

    Esterification is one of the most important methods to alter the structure of starch granules and improve its applications. Conventionally, starch esters are prepared by conventional or dual modification techniques, which have the disadvantages of being expensive, requiring reagent overdoses, and being time-consuming. In addition, the degree of substitution (DS) is often considered the primary factor in view of its contribution to estimating the substituted groups of starch esters. In order to improve the detection accuracy and production efficiency, different detection techniques, including titration, nuclear magnetic resonance (NMR), Fourier transform infrared spectroscopy (FT-IR), thermal gravimetric analysis/infrared spectroscopy (TGA/IR) and headspace gas chromatography (HS-GC), have been developed for DS. This paper gives a comprehensive overview of the recent advances in DS analysis and starch esterification techniques. Additionally, the advantages, limitations, some perspectives on future trends of these techniques and the applications of their derivatives in the food industry are also presented.

  8. Unified Instrumentation: Examining the Simultaneous Application of Advanced Measurement Techniques for Increased Wind Tunnel Testing Capability

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A. (Editor); Bartram, Scott M.; Humphreys, William M., Jr.; Jenkins, Luther N.; Jordan, Jeffrey D.; Lee, Joseph W.; Leighty, Bradley D.; Meyers, James F.; South, Bruce W.; Cavone, Angelo A.; Ingram, JoAnne L.

    2002-01-01

    A Unified Instrumentation Test examining the combined application of Pressure Sensitive Paint, Projection Moire Interferometry, Digital Particle Image Velocimetry, Doppler Global Velocimetry, and Acoustic Microphone Array has been conducted at the NASA Langley Research Center. The fundamental purposes of conducting the test were to: (a) identify and solve compatibility issues among the techniques that would inhibit their simultaneous application in a wind tunnel, and (b) demonstrate that simultaneous use of advanced instrumentation techniques is feasible for increasing tunnel efficiency and identifying control surface actuation / aerodynamic reaction phenomena. This paper provides summary descriptions of each measurement technique used during the Unified Instrumentation Test, their implementation for testing in a unified fashion, and example results identifying areas of instrument compatibility and incompatibility. Conclusions are drawn regarding the conditions under which the measurement techniques can be operated simultaneously on a non-interference basis. Finally, areas requiring improvement for successfully applying unified instrumentation in future wind tunnel tests are addressed.

  9. Assessment of Techniques for Evaluating Computer Systems for Federal Agency Procurements. Final Report.

    ERIC Educational Resources Information Center

    Letmanyi, Helen

    Developed to identify and qualitatively assess computer system evaluation techniques for use during acquisition of general purpose computer systems, this document presents several criteria for comparison and selection. An introduction discusses the automatic data processing (ADP) acquisition process and the need to plan for uncertainty through…

  10. Gaming via Computer Simulation Techniques for Junior College Economics Education. Final Report.

    ERIC Educational Resources Information Center

    Thompson, Fred A.

    A study designed to answer the need for more attractive and effective economics education involved the teaching of one junior college economics class by the conventional (lecture) method and an experimental class by computer simulation techniques. Econometric models approximating the "real world" were computer programed to enable the experimental…

  11. Advanced combustion techniques for controlling NOx emissions of high altitude cruise aircraft

    NASA Technical Reports Server (NTRS)

    Rudey, R. A.; Reck, G. M.

    1976-01-01

    A wide array of experiments has been, and continues to be, sponsored and conducted by NASA to explore the potential of advanced combustion techniques for controlling the emissions of aircraft into the upper atmosphere. Of particular concern are the oxides of nitrogen (NOx) emitted into the stratosphere. The experiments utilize a wide variety of approaches, varying from advanced combustor concepts to fundamental flame tube experiments. Results are presented which indicate that substantial reductions in cruise NOx emissions should be achievable in future aircraft engines. A major NASA program is described which focuses the many fundamental experiments into a planned evolution and demonstration of the prevaporized-premixed combustion technique in a full-scale engine.

  12. POC-scale testing of an advanced fine coal dewatering equipment/technique

    SciTech Connect

    1998-09-01

    Froth flotation is an effective and efficient process for recovering ultra-fine (minus 74 μm) clean coal. Economical dewatering of an ultra-fine clean-coal product to a 20% moisture level will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal could be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 36 months beginning September 30, 1994. This report discusses technical progress made during the quarter from July 1 - September 30, 1997.

  13. Imaging of skull base pathologies: Role of advanced magnetic resonance imaging techniques

    PubMed Central

    Mathur, Ankit; Kesavadas, C; Thomas, Bejoy; Kapilamoorthy, TR

    2015-01-01

    Imaging plays a vital role in evaluation of skull base pathologies as this region is not directly accessible for clinical evaluation. Computerized tomography (CT) and magnetic resonance imaging (MRI) have played complementary roles in the diagnosis of the various neoplastic and non-neoplastic lesions of the skull base. However, CT and conventional MRI may at times be insufficient to correctly pinpoint the accurate diagnosis. Advanced MRI techniques, though difficult to apply in the skull base region, in conjunction with CT and conventional MRI can however help in improving the diagnostic accuracy. This article aims to highlight the importance of advanced MRI techniques like diffusion-weighted imaging, susceptibility-weighted imaging, perfusion-weighted imaging, and MR spectroscopy in differentiation of various lesions involving the skull base. PMID:26427895

  14. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  15. Advanced digital modulation: Communication techniques and monolithic GaAs technology

    NASA Technical Reports Server (NTRS)

    Wilson, S. G.; Oliver, J. D., Jr.; Kot, R. C.; Richards, C. R.

    1983-01-01

    Communications theory and practice are merged with state-of-the-art technology in IC fabrication, especially monolithic GaAs technology, to examine the general feasibility of a number of advanced technology digital transmission systems. Satellite-channel models with (1) superior throughput, perhaps 2 Gbps; (2) attractive weight and cost; and (3) high RF power and spectrum efficiency are discussed. Transmission techniques possessing reasonably simple architectures capable of monolithic fabrication at high speeds were surveyed. This included a review of amplitude/phase shift keying (APSK) techniques and the continuous-phase-modulation (CPM) methods, of which MSK represents the simplest case.
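
    MSK, cited above as the simplest CPM case, is easy to sketch: it is continuous-phase FSK with modulation index 1/2, so the phase advances by ±π/2 over each bit. The following hedged illustration (parameters invented) generates a complex-baseband MSK waveform:

        import numpy as np

        def msk_waveform(bits, sps=16):
            """Complex-baseband MSK: phase ramps +/- pi/2 per bit, sps samples/bit."""
            symbols = 2 * np.asarray(bits) - 1                   # {0,1} -> {-1,+1}
            dphi = np.repeat(symbols, sps) * np.pi / (2 * sps)   # per-sample increment
            phase = np.cumsum(dphi)
            return np.exp(1j * phase)

        x = msk_waveform([1, 0, 1, 1, 0])
        print(np.allclose(np.abs(x), 1.0))   # True: constant envelope, continuous phase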

  16. Development of advanced electron holographic techniques and application to industrial materials and devices.

    PubMed

    Yamamoto, Kazuo; Hirayama, Tsukasa; Tanji, Takayoshi

    2013-06-01

    The development of a transmission electron microscope equipped with a field emission gun paved the way for electron holography to be put to practical use in various fields. In this paper, we review three advanced electron holography techniques: on-line real-time electron holography, three-dimensional (3D) tomographic holography and phase-shifting electron holography, which are becoming important techniques for materials science and device engineering. We also describe some applications of electron holography to the analysis of industrial materials and devices: GaAs compound semiconductors, solid oxide fuel cells and all-solid-state lithium ion batteries.
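
    Phase-shifting holography recovers the object phase from several holograms recorded at known reference-phase offsets. A hedged numerical sketch of the core step follows (the synthetic fringes stand in for real electron holograms):

        import numpy as np

        def phase_shift_reconstruct(frames, thetas):
            """phi = atan2(-sum I_n sin(theta_n), sum I_n cos(theta_n))."""
            frames = np.asarray(frames, dtype=float)
            thetas = np.asarray(thetas)[:, None, None]
            num = -(frames * np.sin(thetas)).sum(axis=0)
            den = (frames * np.cos(thetas)).sum(axis=0)
            return np.arctan2(num, den)

        # Synthetic test: recover a known phase ramp from 4 shifted interferograms.
        y, x = np.mgrid[0:64, 0:64]
        phi_true = 0.05 * x
        thetas = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
        frames = [1.0 + 0.8 * np.cos(phi_true + t) for t in thetas]
        phi = phase_shift_reconstruct(frames, thetas)
        print(np.allclose(np.angle(np.exp(1j * (phi - phi_true))), 0.0, atol=1e-6))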

  17. Combined preputial advancement and phallopexy as a revision technique for treating paraphimosis in a dog.

    PubMed

    Wasik, S M; Wallace, A M

    2014-11-01

    A 7-year-old neutered male Jack Russell terrier-cross was presented for signs of recurrent paraphimosis, despite previous surgical enlargement of the preputial ostium. Revision surgery was performed using a combination of preputial advancement and phallopexy, which resulted in complete and permanent coverage of the glans penis by the prepuce, and at 1 year postoperatively, no recurrence of paraphimosis had been observed. The combined techniques allow preservation of the normal penile anatomy, are relatively simple to perform and provide a cosmetic result. We recommend this combination for the treatment of paraphimosis in the dog, particularly when other techniques have failed.

  18. [Principles and advanced techniques for better internetpresentations in obstetrics and gynecology].

    PubMed

    Seufert, R; Molitor, N; Pollow, K; Woernle, F; Hawighorst-Knapstein, S

    2001-08-01

    Internet presentations are common tools for better medical communication and better scientific work. Meanwhile, a great number of gynecological and obstetrical institutions present data via the world wide web, with a wide range of quality and performance. Specific HTML editors offer quick and easy presentations, but only advanced internet techniques enable interesting multimedia presentations. N-tier applications are the future standard, and we must integrate them into general informatics systems. New concepts, current tools and general problems are discussed, and new principles similar to current e-commerce techniques can address our specific medical demands.

  19. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

    The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system, generated using structured techniques. The requirements definition starts with a mission analysis to identify the high-level control system requirements and functions necessary to satisfy the mission flight profile. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies, in particular design-for-validation philosophies.

  20. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report)

    EPA Science Inventory

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource intensive, and more robust that can help ...

  1. Response to House Joint Resolution No. 118 [To Advance Computer-Assisted Instruction].

    ERIC Educational Resources Information Center

    Virginia State General Assembly, Richmond.

    This response by the Virginia Department of Education to House Joint Resolution No. 118 of the General Assembly of Virginia, which requested the Department of Education to study initiatives to advance computer-assisted instruction, is based on input from state and national task forces and on a 1986 survey of 80 Viriginia school divisions. The…

  2. PARTNERING WITH DOE TO APPLY ADVANCED BIOLOGICAL, ENVIRONMENTAL, AND COMPUTATIONAL SCIENCE TO ENVIRONMENTAL ISSUES

    EPA Science Inventory

    On February 18, 2004, the U.S. Environmental Protection Agency and Department of Energy signed a Memorandum of Understanding to expand the research collaboration of both agencies to advance biological, environmental, and computational sciences for protecting human health and the ...

  3. COMPUTATIONAL TOXICOLOGY ADVANCES: EMERGING CAPABILITIES FOR DATA EXPLORATION AND SAR MODEL DEVELOPMENT

    EPA Science Inventory

    Computational Toxicology Advances: Emerging capabilities for data exploration and SAR model development
    Ann M. Richard and ClarLynda R. Williams, National Health & Environmental Effects Research Laboratory, US EPA, Research Triangle Park, NC, USA; email: richard.ann@epa.gov

  4. Computers-for-edu: An Advanced Business Application Programming (ABAP) Teaching Case

    ERIC Educational Resources Information Center

    Boyle, Todd A.

    2007-01-01

    The "Computers-for-edu" case is designed to provide students with hands-on exposure to creating Advanced Business Application Programming (ABAP) reports and dialogue programs, as well as navigating various mySAP Enterprise Resource Planning (ERP) transactions needed by ABAP developers. The case requires students to apply a wide variety…

  5. Research in Computational Aeroscience Applications Implemented on Advanced Parallel Computing Systems

    NASA Technical Reports Server (NTRS)

    Wigton, Larry

    1996-01-01

    Improvements to the numerical linear algebra routines used in new Navier-Stokes codes, specifically Tim Barth's unstructured-grid code, with spin-offs to TRANAIR, are reported. A fast distance-calculation routine was written for Navier-Stokes codes using the new one-equation turbulence models. The primary focus of this work was devoted to improving matrix-iterative methods. New algorithms have been developed which activate the full potential of classical Cray-class computers as well as distributed-memory parallel computers.

  6. Impact of computer advances on future finite elements computations. [for aircraft and spacecraft design

    NASA Technical Reports Server (NTRS)

    Fulton, Robert E.

    1985-01-01

    Research performed over the past 10 years in engineering data base management and parallel computing is discussed, and certain opportunities for research toward the next generation of structural analysis capability are proposed. Particular attention is given to data base management associated with the IPAD project and parallel processing associated with the Finite Element Machine project, both sponsored by NASA, and a near term strategy for a distributed structural analysis capability based on relational data base management software and parallel computers for a future structural analysis system.

  7. Study of advanced techniques for determining the long term performance of components

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed, with emphasis on the long-term performance of metal-to-metal valve seats in a liquid propellant fuel system. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physical mechanisms of degradation.

  8. A first attempt to bring computational biology into advanced high school biology classrooms.

    PubMed

    Gallagher, Suzanne Renick; Coon, William; Donley, Kristin; Scott, Abby; Goldberg, Debra S

    2011-10-01

    Computer science has become ubiquitous in many areas of biological research, yet most high school and even college students are unaware of this. As a result, many college biology majors graduate without adequate computational skills for contemporary fields of biology. The absence of a computational element in secondary school biology classrooms is of growing concern to the computational biology community and to biology teachers who would like to acquaint their students with updated approaches in the discipline. We present a first attempt to correct this absence by introducing into advanced biology classes at two local high schools a computational biology element that teaches genetic evolution. Our primary goal was to show students how computation is used in biology and why a basic understanding of computation is necessary for research in many fields of biology. This curriculum is intended to be taught by a computational biologist who has worked with a high school advanced biology teacher to adapt the unit for his/her classroom, but a motivated high school teacher comfortable with mathematics and computing may be able to teach it alone. In this paper, we present our curriculum, which takes into consideration the constraints of the required curriculum, and discuss our experiences teaching it. We describe the successes and challenges we encountered while bringing this unit to high school students, discuss how we addressed these challenges, and make suggestions for future versions of this curriculum. We believe that our curriculum can be a valuable seed for further development of computational activities aimed at high school biology students. Further, our experiences may be of value to others teaching computational biology at this level. Our curriculum can be obtained at http://ecsite.cs.colorado.edu/?page_id=149#biology or by contacting the authors.
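
    As one concrete example of the kind of exercise such a unit might use (an assumption for illustration; the paper's own materials are at the URL above), a Wright-Fisher simulation shows genetic drift in a few lines of code:

        import numpy as np

        def wright_fisher(pop_size=100, p0=0.5, generations=200, seed=0):
            """Allele frequency under pure drift in a diploid population."""
            rng = np.random.default_rng(seed)
            p, freqs = p0, [p0]
            for _ in range(generations):
                k = rng.binomial(2 * pop_size, p)   # sample 2N gene copies
                p = k / (2 * pop_size)
                freqs.append(p)
            return freqs

        print(wright_fisher()[-1])   # drift often carries the allele to 0 or 1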

  9. Building an advanced climate model: Program plan for the CHAMMP (Computer Hardware, Advanced Mathematics, and Model Physics) Climate Modeling Program

    SciTech Connect

    Not Available

    1990-12-01

    The issue of global warming and related climatic changes from increasing concentrations of greenhouse gases in the atmosphere has received prominent attention during the past few years. The Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP) Climate Modeling Program is designed to contribute directly to this rapid improvement. The goal of the CHAMMP Climate Modeling Program is to develop, verify, and apply a new generation of climate models within a coordinated framework that incorporates the best available scientific and numerical approaches to represent physical, biogeochemical, and ecological processes, that fully utilizes the hardware and software capabilities of new computer architectures, that probes the limits of climate predictability, and finally that can be used to address the challenging problem of understanding the greenhouse climate issue through the ability of the models to simulate time-dependent climatic changes over extended times and with regional resolution.

  10. Integrating Organic Matter Structure with Ecosystem Function using Advanced Analytical Chemistry Techniques

    NASA Astrophysics Data System (ADS)

    Boot, C. M.

    2012-12-01

    Microorganisms are the primary transformers of organic matter in terrestrial and aquatic ecosystems. The structure of organic matter controls its bioavailability, and researchers have long sought to link the chemical characteristics of the organic matter pool to its lability. To date this effort has primarily relied on low-resolution descriptive characteristics (e.g. organic matter content, carbon-to-nitrogen ratio, aromaticity, etc.). Recently, however, progress in linking these two important ecosystem components has been made using advanced high-resolution tools (e.g. nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS)-based techniques). A series of experiments will be presented that highlight the application of high-resolution techniques in a variety of terrestrial and aquatic ecosystems, with the focus on how these data explicitly provide the foundation for integrating organic matter structure into our concept of ecosystem function. The talk will highlight results from a series of experiments including: an MS-based metabolomics and fluorescence excitation-emission matrix approach evaluating seasonal and vegetation-based changes in dissolved organic matter (DOM) composition from arctic soils; Fourier transform ion cyclotron resonance (FTICR) MS and MS metabolomics analysis of DOM from three lakes in an alpine watershed; and the transformation of 13C-labeled glucose tracked with NMR during a rewetting experiment on Colorado grassland soils. These data will be synthesized to illustrate how the application of advanced analytical techniques provides novel insight into our understanding of organic matter processing in a wide range of ecosystems.

  11. Potential of advanced MR imaging techniques in the differential diagnosis of parkinsonism.

    PubMed

    Hotter, Anna; Esterhammer, Regina; Schocke, Michael F H; Seppi, Klaus

    2009-01-01

    The clinical differentiation of parkinsonian syndromes remains challenging not only for neurologists but also for movement disorder specialists. Conventional magnetic resonance imaging (cMRI) with the visual assessment of T2- and T1-weighted imaging as well as different advanced MRI techniques offer objective measures, which may be a useful tool in the diagnostic work-up of Parkinson's disease and atypical parkinsonian disorders (APDs). In clinical practice, cMRI is a well-established method for the exclusion of symptomatic parkinsonism due to other pathologies. Over the past two decades, abnormalities in the basal ganglia and infratentorial structures have been shown especially in APDs not only by cMRI but also by different advanced MRI techniques, including methods to assess regional cerebral atrophy quantitatively such as magnetic resonance volumetry, proton magnetic resonance spectroscopy, diffusion-weighted imaging, and magnetization transfer imaging. This article aims to review recent research findings on the role of advanced MRI techniques in the differential diagnosis of neurodegenerative parkinsonian disorders.

  12. Advanced in situ spectroscopic techniques and their applications in environmental biogeochemistry: introduction to the special section.

    PubMed

    Lombi, Enzo; Hettiarachchi, Ganga M; Scheckel, Kirk G

    2011-01-01

    Understanding the molecular-scale complexities and interplay of chemical and biological processes of contaminants at solid, liquid, and gas interfaces is a fundamental and crucial element to enhance our understanding of anthropogenic environmental impacts. The ability to describe the complexity of environmental biogeochemical reaction mechanisms relies on our analytical ability through the application and development of advanced spectroscopic techniques. Accompanying this introductory article are nine papers that either review advanced in situ spectroscopic methods or present original research utilizing these techniques. This collection of articles summarizes the challenges facing environmental biogeochemistry, highlights the recent advances and scientific gaps, and provides an outlook into future research that may benefit from the use of in situ spectroscopic approaches. The use of synchrotron-based techniques and other methods is discussed in detail, as is the importance of integrating multiple analytical approaches to confirm results of complementary procedures or to fill data gaps. We also argue that future research directions will be driven not only by recent analytical developments but also by emerging factors such as the need for risk assessment of new materials (i.e., nanotechnologies) and the realization that biogeochemical processes need to be investigated in situ under environmentally relevant conditions.
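
    One routine synchrotron-based analysis of the kind discussed here is linear combination fitting of a sample's XANES spectrum against reference standards. The hedged sketch below uses synthetic placeholder spectra; real work would use measured, energy-aligned standards.

        # Hedged sketch: linear combination fitting (LCF) of a normalized XANES
        # spectrum to quantify contaminant speciation, via non-negative least squares.
        import numpy as np
        from scipy.optimize import nnls

        energy = np.linspace(0, 1, 200)                      # normalized energy axis
        ref_a = 1.0 / (1.0 + np.exp(-40 * (energy - 0.4)))   # mock standard A (step-like edge)
        ref_b = 1.0 / (1.0 + np.exp(-40 * (energy - 0.6)))   # mock standard B (shifted edge)

        # 'Measured' spectrum: 70% species A + 30% species B plus noise.
        sample = 0.7 * ref_a + 0.3 * ref_b + np.random.default_rng(1).normal(0, 0.01, 200)

        A = np.column_stack([ref_a, ref_b])
        fractions, residual = nnls(A, sample)   # constrain weights to be non-negative
        fractions /= fractions.sum()            # report as fractions summing to 1
        print(f"Species A: {fractions[0]:.2f}, Species B: {fractions[1]:.2f}")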

  13. POC-scale testing of an advanced fine coal dewatering equipment/technique

    SciTech Connect

    Groppo, J.G.; Parekh, B.K.; Rawls, P.

    1995-11-01

    Froth flotation is an effective and efficient process for recovering ultra-fine (minus 74 μm) clean coal. Economical dewatering of an ultra-fine clean coal product to a moisture level of 20 percent will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20 percent or lower moisture using either conventional or advanced dewatering techniques. As the contract title suggests, the main focus of the program is proof-of-concept testing of a dewatering technique for a fine clean coal product. The coal industry has been reluctant to adopt advanced fine coal recovery technology because no economical dewatering process is available; in fact, in a recent survey conducted by the U.S. DOE and Battelle, dewatering of fine clean coal was identified as the number one priority for the coal industry. This project will attempt to demonstrate an efficient and economical process for dewatering fine clean coal slurry.
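
    For reference, the 20 percent target refers to moisture on a wet basis; a minimal illustration of the bookkeeping (with hypothetical masses) follows.

        # Minimal sketch of the moisture calculation behind the 20 percent target
        # (product masses are hypothetical).
        def moisture_pct(wet_mass_g, dry_mass_g):
            """Moisture content on a wet basis, in percent."""
            return 100.0 * (wet_mass_g - dry_mass_g) / wet_mass_g

        cake = moisture_pct(wet_mass_g=125.0, dry_mass_g=95.0)   # -> 24.0
        print(f"Cake moisture: {cake:.1f}% (target: <= 20%)")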

  14. Computer architecture for efficient algorithmic executions in real-time systems: New technology for avionics systems and advanced space vehicles

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Youngblood, John N.; Saha, Aindam

    1987-01-01

    Improvements and advances in computer architecture now provide innovative technology for recasting traditional sequential solutions into high-performance, low-cost parallel systems. Research conducted in the development of a specialized computer architecture for the real-time algorithmic execution of an avionics guidance and control problem is described. A comprehensive treatment of both the hardware and software structures of a customized computer that performs real-time computation of guidance commands with updated estimates of target motion and time-to-go is presented. An optimal real-time allocation algorithm was developed that maps the algorithmic tasks onto the processing elements; this allocation is based on critical path analysis. The final stage is the design and development of hardware structures suitable for efficient execution of the allocated task graph. The processing element is designed for rapid execution of the allocated tasks. Fault tolerance is a key feature of the overall architecture. Parallel numerical integration techniques, task definitions, and allocation algorithms are discussed. The parallel implementation is analytically verified, and experimental results are presented. The design of the data-driven computer architecture, customized for the execution of the particular algorithm, is discussed.
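
    The paper's specific allocation algorithm is not reproduced here, but the general idea behind critical-path-based allocation can be sketched as follows: rank each task by the longest path from it to the exit node, then assign tasks in rank order to whichever processing element frees up first. The task graph below is hypothetical, and a full scheduler would also respect predecessor finish times.

        # Hedged sketch of critical-path-based task allocation onto processing elements.
        from functools import lru_cache

        # Hypothetical task graph: task -> (execution cost, successors)
        TASKS = {
            "A": (2, ["B", "C"]),
            "B": (3, ["D"]),
            "C": (1, ["D"]),
            "D": (2, []),
        }

        @lru_cache(maxsize=None)
        def bottom_level(task):
            """Length of the longest (critical) path from task to an exit node."""
            cost, succs = TASKS[task]
            return cost + max((bottom_level(s) for s in succs), default=0)

        def schedule(num_pes=2):
            ready_times = [0.0] * num_pes
            order = sorted(TASKS, key=bottom_level, reverse=True)   # critical tasks first
            for task in order:
                pe = ready_times.index(min(ready_times))            # earliest-free PE
                ready_times[pe] += TASKS[task][0]
                print(f"{task} -> PE{pe} (finishes at t={ready_times[pe]})")

        schedule()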

  15. Teaching Computer Ergonomic Techniques: Practices and Perceptions of Secondary and Postsecondary Business Educators.

    ERIC Educational Resources Information Center

    Alexander, Melody W.; Arp, Larry W.

    1997-01-01

    A survey of 260 secondary and 251 postsecondary business educators found the former more likely to think computer ergonomic techniques should be taught in elementary school and to address the hazards of improper computer use. Both groups stated that over half of the students they observe do not use good techniques, and both agreed that students need continual…

  16. Optimization of corrosion control for lead in drinking water using computational modeling techniques

    EPA Science Inventory

    Computational modeling techniques have been used to very good effect in the UK in the optimization of corrosion control for lead in drinking water. A “proof-of-concept” project with three US/CA case studies sought to demonstrate that such techniques could work equally well in the...

  17. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  18. Advancement in Understanding Volcanic Processes by 4D Synchrotron X-ray Computed Microtomography Imaging of Rock Textures

    NASA Astrophysics Data System (ADS)

    Polacci, M.; Arzilli, F.; La Spina, G.

    2015-12-01

    X-ray computed microtomography (μCT) is the only high-resolution, non-destructive technique that allows geomaterials to be visualized and processed directly in three dimensions. This capability, together with the development of increasingly sophisticated imaging techniques, has generated over the last ten years a widespread application of the methodology across the Earth Sciences, from structural geology to palaeontology, igneous petrology, and volcanology. Here, I will describe how X-ray μCT has contributed to advancing our knowledge of volcanic processes and eruption dynamics, and illustrate the first preliminary results from 4D (space + time) X-ray microtomographic experiments on magma kinetics in basaltic systems.
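
    A basic example of the 3D texture processing such data support (not the authors' pipeline) is vesicularity quantification by grey-level thresholding of the reconstructed volume; the volume below is a synthetic stand-in for a real scan.

        # Hedged sketch: porosity (vesicularity) from a reconstructed uCT volume.
        import numpy as np

        rng = np.random.default_rng(2)
        volume = rng.normal(loc=120.0, scale=20.0, size=(64, 64, 64))  # mock grey values
        volume[20:30, 20:30, 20:30] = 30.0          # synthetic low-density vesicle

        threshold = 60.0                            # grey value separating void from solid
        voids = volume < threshold
        porosity = voids.mean()                     # fraction of voxels classified as void
        print(f"Vesicularity: {100 * porosity:.2f}%")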

  19. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation; in this procedure, though, the computer does not exist but is simulated by the experimenters. This allows simulated problem solving early in the design effort and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  20. Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.

    1977-01-01

    The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.
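
    The physical scale such flow calculations resolve can be anchored by a simple pressure-balance estimate: the subsolar boundary of a magnetic planet's obstacle sits roughly where solar wind ram pressure equals the dipole's magnetic pressure. The sketch below implements this estimate; order-unity factors from field compression by boundary currents vary between treatments and are omitted.

        # Hedged sketch: subsolar standoff distance from pressure balance,
        # rho * v^2 = B0^2 * (R/r)^6 / (2 * mu0).
        import math

        MU0 = 4e-7 * math.pi    # vacuum permeability, H/m
        M_PROTON = 1.673e-27    # proton mass, kg

        def standoff_distance(B0_t, n_cm3, v_kms):
            """Standoff distance in planetary radii for surface field B0_t (T),
            proton density n_cm3 (cm^-3), and wind speed v_kms (km/s)."""
            rho = n_cm3 * 1e6 * M_PROTON              # mass density, kg/m^3
            ram = rho * (v_kms * 1e3) ** 2            # dynamic pressure, Pa
            mag = B0_t**2 / (2.0 * MU0)               # surface magnetic pressure, Pa
            return (mag / ram) ** (1.0 / 6.0)

        # Earth-like inputs: 31,000 nT surface field, 5 protons/cm^3 at 400 km/s.
        print(f"r_standoff ~ {standoff_distance(3.1e-5, 5.0, 400.0):.1f} planetary radii")
        # -> ~8 R_E; including field compression raises this toward the observed ~10 R_E.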