Diffraction scattering computed tomography: a window into the structures of complex nanomaterials
Birkbak, M. E.; Leemreize, H.; Frølich, S.; Stock, S. R.
2015-01-01
Modern functional nanomaterials and devices are increasingly composed of multiple phases arranged in three dimensions over several length scales. Therefore there is a pressing demand for improved methods for structural characterization of such complex materials. An excellent emerging technique that addresses this problem is diffraction/scattering computed tomography (DSCT). DSCT combines the merits of diffraction and/or small-angle scattering with computed tomography to allow imaging of the interior of materials based on their diffraction or small-angle scattering signals. This allows one, for example, to distinguish the distributions of polymorphs in complex mixtures. Here we review this technique and give examples of how it can shed light on modern nanoscale materials. PMID:26505175
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews of modern computing architectures and of software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMMs). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
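For orientation, the core computation these tools accelerate is the HMM forward recursion. Below is a minimal sketch of that recursion in plain Python/NumPy; the two-state model, emission alphabet, and observation sequence are hypothetical toy values, not HMMER's actual profile-HMM machinery.

```python
import numpy as np

# Toy HMM: A[i, j] = P(state j at t+1 | state i at t); E[i, k] = P(symbol k | state i).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
E = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
pi = np.array([0.5, 0.5])   # initial state distribution
obs = [0, 2, 1, 2]          # observed symbol indices (hypothetical)

def forward(A, E, pi, obs):
    """Forward algorithm: returns P(obs | model)."""
    alpha = pi * E[:, obs[0]]          # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * E[:, o]  # induction: sum over previous states
    return alpha.sum()                 # termination

print(forward(A, E, pi, obs))
```

Hardware-accelerated implementations parallelize exactly this kind of dynamic-programming recurrence across states and sequences.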
Shaikhouni, Ammar; Elder, J Bradley
2012-11-01
At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Nagasinghe, Iranga
2010-01-01
This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS are two highly successful applications of modern linear algebra in computer science and engineering. They constitute the essential technologies that account for the immense growth and…
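As a concrete reminder of the linear algebra involved, here is a minimal PageRank power-iteration sketch on a hypothetical four-page link graph; the damping factor and tolerance are conventional illustrative choices, and the thesis's actual acceleration techniques are not reproduced here.

```python
import numpy as np

# Hypothetical link graph: page i -> list of pages it links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n, d = 4, 0.85   # number of pages, damping factor

# Column-stochastic transition matrix: M[j, i] = 1/outdeg(i) if i links to j.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

r = np.full(n, 1.0 / n)                    # uniform initial rank vector
for _ in range(100):
    r_new = (1 - d) / n + d * (M @ r)      # power-iteration step
    if np.abs(r_new - r).sum() < 1e-10:    # L1 convergence check
        break
    r = r_new
print(r / r.sum())
```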
Breast tumor malignancy modelling using evolutionary neural logic networks.
Tsakonas, Athanasios; Dounias, Georgios; Panagi, Georgia; Panourgias, Evangelia
2006-01-01
The present work proposes a computer-assisted methodology for effective modelling of the diagnostic decision for breast tumor malignancy. The suggested approach is based on innovative hybrid computational intelligence algorithms applied to related cytological data contained in past medical records. The experimental data used in this study were gathered in the early 1990s at the University of Wisconsin, based on post-diagnostic cytological observations performed by expert medical staff. Data were encoded in a computer database and various alternative modelling techniques were then applied to them in an attempt to form diagnostic models. Previous methods included standard optimisation techniques as well as artificial intelligence approaches, and a variety of related publications exists in the modern literature on the subject. In this report, a hybrid computational intelligence approach is suggested, which combines modern mathematical logic principles, neural computation and genetic programming in an effective manner. The approach proves promising both in terms of diagnostic accuracy and generalization capability and in terms of comprehensibility and practical importance for the related medical staff.
Improvements in the efficiency of turboexpanders in cryogenic applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agahi, R.R.; Lin, M.C.; Ershaghi, B.
1996-12-31
Process designers have utilized turboexpanders in cryogenic processes because of their higher thermal efficiencies when compared with conventional refrigeration cycles. Process design and equipment performance have improved substantially through the utilization of modern technologies. Turboexpander manufacturers have also adopted computational fluid dynamics software, computer numerical control technology, and holography techniques to further improve an already impressive turboexpander efficiency performance. In this paper, the authors explain the design process of the turboexpander utilizing modern technology. Two cases of turboexpanders processing helium (4.35 K) and hydrogen (56 K) will be presented.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
All biology is computational biology.
Markowetz, Florian
2017-03-01
Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.
NASA Technical Reports Server (NTRS)
Yeomans, D. K. (Editor); West, R. M. (Editor); Harrington, R. S. (Editor); Marsden, B. G. (Editor)
1984-01-01
Modern techniques for making cometary astrometric observations, reducing these observations, using accurate reference star catalogs, and computing precise orbits and ephemerides are discussed in detail and recommendations and suggestions are given in each area.
NASA Astrophysics Data System (ADS)
Marrugo, Andrés G.; Millán, María S.; Cristóbal, Gabriel; Gabarda, Salvador; Sorel, Michal; Sroubek, Filip
2012-06-01
Medical digital imaging has become a key element of modern health care procedures. It provides visual documentation and a permanent record for the patient and, most importantly, the ability to extract information about many diseases. Modern ophthalmology thrives and develops on advances in digital imaging and computing power. In this work we present an overview of recent image processing techniques proposed by the authors in the area of digital eye fundus photography. Our applications range from retinal image quality assessment to image restoration via blind deconvolution and visualization of structural changes over time between patient visits. All are proposed within a framework for improving and assisting medical practice and the forthcoming scenario of the information chain in telemedicine.
Imaging in anatomy: a comparison of imaging techniques in embalmed human cadavers
2013-01-01
Background A large variety of imaging techniques is an integral part of modern medicine. Introducing radiological imaging techniques into the dissection course serves as a basis for improved learning of anatomy and multidisciplinary learning in pre-clinical medical education. Methods Four different imaging techniques (ultrasound, radiography, computed tomography, and magnetic resonance imaging) were performed in embalmed human body donors to analyse the possibilities and limitations of the respective techniques in this particular setting. Results The quality of ultrasound and radiography images was poor, whereas images from computed tomography and magnetic resonance imaging were of good quality. Conclusion Computed tomography and magnetic resonance imaging have superior image quality in comparison to ultrasound and radiography and offer suitable methods for imaging embalmed human cadavers as a valuable addition to the dissection course. PMID:24156510
Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph
ERIC Educational Resources Information Center
Attwood, Paul V.
1997-01-01
Describes a self-instructional assignment approach to the teaching of advanced enzymology. Presents an assignment that offers a means of teaching enzymology to students that exposes them to modern computer-based techniques of analyzing protein structure and relates structure to enzyme function. (JRH)
NASA Astrophysics Data System (ADS)
Kubina, Stanley J.
1989-09-01
The review of the status of computational electromagnetics by Miller and the exposition by Burke of developments in one of the more important computer codes applying the electric field integral equation method, the Numerical Electromagnetics Code (NEC), coupled with Molinet's summary of progress in techniques based on the Geometrical Theory of Diffraction (GTD), provide a clear perspective on the maturity of the modern discipline of computational electromagnetics and its potential. Audone's exposition of the computation of radar scattering cross-section (RCS) indicates the breadth of practical applications, and his exploitation of modern near-field measurement techniques reminds one of progress in the measurement discipline, which is essential to the validation or calibration of computational modeling methodology when applied to complex structures such as aircraft and ships. The latter monograph also presents some comparisons with computational models. Some of the results presented for scale-model and flight measurements show serious disagreements in the lobe structure that would require detailed examination. This also applies to the radiation patterns obtained by flight measurement compared with those obtained using wire-grid models and integral equation modeling methods. In the examples which follow, an attempt is made to match measurement results completely over the entire 2 to 30 MHz HF range for antennas on a large patrol aircraft. The problems of validating computer models of HF antennas on a helicopter and of using computer models to generate radiation pattern information that cannot be obtained by measurement are discussed. Also discussed is the use of NEC computer models to analyze top-side ship configurations where measurement results are not available and only self-validation measures, or at best comparisons with an alternate GTD computer modeling technique, are possible.
Bridging the Gap between Basic and Clinical Sciences: A Description of a Radiological Anatomy Course
ERIC Educational Resources Information Center
Torres, Anna; Staskiewicz, Grzegorz J.; Lisiecka, Justyna; Pietrzyk, Lukasz; Czekajlo, Michael; Arancibia, Carlos U.; Maciejewski, Ryszard; Torres, Kamil
2016-01-01
A wide variety of medical imaging techniques pervade modern medicine, and the changing portability and performance of tools like ultrasound imaging have brought these medical imaging techniques into the everyday practice of many specialties outside of radiology. However, proper interpretation of ultrasonographic and computed tomographic images…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevrekidis, Ioannis G.
The work explored the linking of modern machine learning techniques (manifold learning, in particular diffusion maps) with traditional PDE modeling/discretization/scientific computation techniques via the equation-free methodology developed by the PI. The result (in addition to several PhD degrees, two of them by CSGF Fellows) was a sequence of strong developments: in part on the algorithmic side, linking data mining with scientific computing, and in part on applications, ranging from PDE discretizations to molecular dynamics and complex network dynamics.
[Advancements of computer chemistry in separation of Chinese medicine].
Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei
2011-12-01
The separation techniques of Chinese medicine are not only a key technology in the research and development of Chinese medicine, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and find regularities in the Chinese medicine system, which is full of complicated data. This paper analyzed the applicability, key technology, basic mode, and common algorithms of computer chemistry applied to the separation of Chinese medicine, introduced the mathematical model and parameter-setting methods of extraction kinetics, investigated several problems based on traditional Chinese medicine membrane processing, and forecast the application prospects.
Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera
2003-11-01
Information explosion and globalization make great demands on keeping pace with the new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not satisfactory for efficient exploitation of modern information technology in diagnostics, therapy and nursing. The present contribution describes the application of two basic problem solving techniques (brainstorming, SWOT analysis) to develop a project aimed at information literacy enhancement.
Wronkiewicz, Mark; Larson, Eric; Lee, Adrian Kc
2016-10-01
Brain-computer interface (BCI) technology allows users to generate actions based solely on their brain signals. However, current non-invasive BCIs generally classify brain activity recorded from surface electroencephalography (EEG) electrodes, which can hinder the application of findings from modern neuroscience research. In this study, we use source imaging, a neuroimaging technique that projects EEG signals onto the surface of the brain, in a BCI classification framework. This allowed us to incorporate prior research from functional neuroimaging to target activity from a cortical region involved in auditory attention. Classifiers trained to detect attention switches performed better with source imaging projections than with EEG sensor signals. Within source imaging, including subject-specific anatomical MRI information (instead of using a generic head model) further improved classification performance. This source-based strategy also reduced accuracy variability across three dimensionality reduction techniques, a major design choice in most BCIs. Our work shows that source imaging provides clear quantitative and qualitative advantages to BCIs and highlights the value of incorporating modern neuroscience knowledge and methods into BCI systems.
A Survey of Architectural Techniques For Improving Cache Power Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
Modern processors are using increasingly large on-chip caches. Also, with each CMOS technology generation, there has been a significant increase in their leakage energy consumption. For this reason, cache power management has become a crucial research issue in modern processor design. To address this challenge and also meet the goals of sustainable computing, researchers have proposed several techniques for improving energy efficiency of cache architectures. This paper surveys recent architectural techniques for improving cache power efficiency and also presents a classification of these techniques based on their characteristics. For providing an application perspective, this paper also reviews several real-world processor chips that employ cache energy saving techniques. The aim of this survey is to enable engineers and researchers to get insights into the techniques for improving cache power efficiency and motivate them to invent novel solutions for enabling low-power operation of caches.
Concepts and algorithms for terminal-area traffic management
NASA Technical Reports Server (NTRS)
Erzberger, H.; Chapel, J. D.
1984-01-01
The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.
Application of evolutionary computation in ECAD problems
NASA Astrophysics Data System (ADS)
Lee, Dae-Hyun; Hwang, Seung H.
1998-10-01
Design of modern electronic system is a complicated task which demands the use of computer- aided design (CAD) tools. Since a lot of problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve those problems. We have applied evolutionary computation techniques to solve ECAD problems such as technology mapping, microcode-bit optimization, data path ordering and peak power estimation, where their benefits are well observed. This paper presents experiences and discusses issues in those applications.
2011-06-01
[Fragmentary entry from the proceedings of the 2011 DoD High Performance Computing Modernization Program Users Group Conference (HPCMP UGC 2011), June 20–23, 2011.] The Web-based AGeS system described in this paper is a computationally-efficient and scalable system for high-throughput genome… A method for protecting web services involves making them more resilient to attack using autonomic computing techniques. This paper presents our initial…
ERIC Educational Resources Information Center
Ivanov, Anisoara; Neacsu, Andrei
2011-01-01
This study describes the possibility and advantages of utilizing simple computer codes to complement the teaching techniques for high school physics. The authors have begun working on a collection of open source programs which allow students to compare the results and graphics from classroom exercises with the correct solutions and furthermore to…
ERIC Educational Resources Information Center
Litofsky, Joshua; Viswanathan, Rama
2015-01-01
Matrix diagonalization, the key technique at the heart of modern computational chemistry for the numerical solution of the Schrödinger equation, can be easily introduced in the physical chemistry curriculum in a pedagogical context using simple Hückel molecular orbital theory for π bonding in molecules. We present details and results of…
Modern quantitative schlieren techniques
NASA Astrophysics Data System (ADS)
Hargather, Michael; Settles, Gary
2010-11-01
Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow (once the sole purview of interferometry) without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
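The quantitative pipeline the abstract describes can be summarized in two relations (a sketch in standard notation; k is the gas-specific Gladstone-Dale coefficient, and a one-dimensional integration path is assumed for simplicity):

```latex
% Integrate the measured refractive-index gradient, then apply Gladstone-Dale:
n(x) = n_{\mathrm{ref}} + \int_{x_{\mathrm{ref}}}^{x} \frac{\partial n}{\partial x'}\,\mathrm{d}x' ,
\qquad
n - 1 = k\,\rho .
```

Temperature then follows from the density field through an equation of state such as the ideal gas law, as the abstract notes.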
Structural biology computing: Lessons for the biomedical research sciences.
Morin, Andrew; Sliz, Piotr
2013-11-01
The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is in the process of extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
Generalized likelihood ratios for quantitative diagnostic test scores.
Tandberg, D; Deely, J J; O'Malley, A J
1997-11-01
The reduction of quantitative diagnostic test scores to the dichotomous case is a wasteful and unnecessary simplification in the era of high-speed computing. Physicians could make better use of the information embedded in quantitative test results if modern generalized curve estimation techniques were applied to the likelihood functions of Bayes' theorem. Hand calculations could be completely avoided and computed graphical summaries provided instead. Graphs showing posttest probability of disease as a function of pretest probability with confidence intervals (POD plots) would enhance acceptance of these techniques if they were immediately available at the computer terminal when test results were retrieved. Such constructs would also provide immediate feedback to physicians when a valueless test had been ordered.
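The underlying arithmetic is the odds form of Bayes' theorem: posttest odds equal pretest odds times the likelihood ratio. A minimal sketch follows; the likelihood-ratio value is hypothetical, and the paper's generalized curve estimation of LR as a function of the test score is not reproduced here.

```python
# Odds form of Bayes' theorem for a quantitative test result.
def posttest_probability(pretest_prob, likelihood_ratio):
    """Convert pretest probability to posttest probability via odds."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# e.g., a score whose estimated likelihood ratio at the observed value is 4.2
print(posttest_probability(0.30, 4.2))   # ~0.64
```

A POD plot as described would simply evaluate this function (with confidence bounds on the estimated likelihood ratio) over the full range of pretest probabilities.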
New Trends in Computer Assisted Language Learning and Teaching.
ERIC Educational Resources Information Center
Perez-Paredes, Pascual, Ed.; Cantos-Gomez, Pascual, Ed.
2002-01-01
Articles in this special issue include the following: "ICT and Modern Foreign Languages: Learning Opportunities and Training Needs" (Graham Davies); "Authoring, Pedagogy and the Web: Expectations Versus Reality" (Paul Bangs); "Web-based Instructional Environments: Tools and Techniques for Effective Second Language…
Execution environment for intelligent real-time control systems
NASA Technical Reports Server (NTRS)
Sztipanovits, Janos
1987-01-01
Modern telerobot control technology requires the integration of symbolic and non-symbolic programming techniques, different models of parallel computations, and various programming paradigms. The Multigraph Architecture, which has been developed for the implementation of intelligent real-time control systems is described. The layered architecture includes specific computational models, integrated execution environment and various high-level tools. A special feature of the architecture is the tight coupling between the symbolic and non-symbolic computations. It supports not only a data interface, but also the integration of the control structures in a parallel computing environment.
Cicero, Raúl; Criales, José Luis; Cardoso, Manuel
2009-01-01
The impressive development of computed tomography (CT) techniques, such as three-dimensional helical CT, produces spatial images of the thoracic skeleton. At the beginning of the 16th century, Leonardo da Vinci drew the thorax oseum with great precision. These drawings show an outstanding similarity to the images obtained by three-dimensional helical CT. The cumbersome task of the Renaissance genius is a prime example of the careful study of human anatomy. Modern imaging techniques require perfect anatomic knowledge of the human body in order to generate exact interpretations of images. Leonardo's example is alive for anybody devoted to modern imaging studies.
NASA Astrophysics Data System (ADS)
Bird, Robert; Nystrom, David; Albright, Brian
2017-10-01
The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite breakthroughs in the areas of mini-app development, performance portability, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.
A performance model for GPUs with caches
Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...
2014-06-24
To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, the Radeon HD 6970, the model estimates with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
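As a conceptual illustration of the sampling-based linear model (not the paper's actual construction, whose sampling points come from an analysis of the GPU's scheduling policy), one can measure a kernel at a few small workload sizes, fit runtime as a linear function of size, and extrapolate to the full workload. All numbers below are hypothetical.

```python
import numpy as np

# Hypothetical sampled workload sizes and measured kernel runtimes.
workgroups = np.array([64, 128, 256, 512])      # sampled sizes
runtime_ms = np.array([1.9, 3.1, 5.4, 10.2])    # measured runtimes (ms)

# Least-squares fit: runtime ~ slope * workgroups + intercept.
A = np.vstack([workgroups, np.ones_like(workgroups)]).T
slope, intercept = np.linalg.lstsq(A, runtime_ms, rcond=None)[0]

full_size = 16384
print(slope * full_size + intercept)  # predicted runtime at the full workload
```

The paper's ML-based refinement augments exactly this kind of baseline with compiler statistics and hardware performance counters to capture coalescing and cache effects a straight line cannot.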
Fox, W Christopher; Park, Min S; Belverud, Shawn; Klugh, Arnett; Rivet, Dennis; Tomlin, Jeffrey M
2013-04-01
To follow the progression of neuroimaging as a means of non-invasive evaluation of mild traumatic brain injury (mTBI) in order to provide recommendations based on reproducible, defined imaging findings. A comprehensive literature review and analysis of contemporary published articles was performed to study the progression of neuroimaging findings as a non-invasive 'biomarker' for mTBI. Multiple imaging modalities exist to support the evaluation of patients with mTBI, including ultrasound (US), computed tomography (CT), single photon emission computed tomography (SPECT), positron emission tomography (PET), and magnetic resonance imaging (MRI). These techniques continue to evolve with the development of fractional anisotropy (FA), fiber tractography (FT), and diffusion tensor imaging (DTI). Modern imaging techniques, when applied in the appropriate clinical setting, may serve as a valuable tool for diagnosis and management of patients with mTBI. An understanding of modern neuroanatomical imaging will enhance our ability to analyse injury and recognize the manifestations of mTBI.
Brønsted acidity of protic ionic liquids: a modern ab initio valence bond theory perspective.
Patil, Amol Baliram; Mahadeo Bhanage, Bhalchandra
2016-09-21
Room temperature ionic liquids (ILs), especially protic ionic liquids (PILs), are used in many areas of the chemical sciences. Ionicity, the extent of proton transfer, is a key parameter which determines many physicochemical properties and in turn the suitability of PILs for various applications. The spectrum of computational chemistry techniques applied to investigate ionic liquids includes classical molecular dynamics, Monte Carlo simulations, ab initio molecular dynamics, density functional theory (DFT), CCSD(T), etc. At the other end of the spectrum is another computational approach: modern ab initio valence bond theory (VBT). VBT differs from molecular orbital theory based methods in the expression of the molecular wave function. The molecular wave function in the valence bond ansatz is expressed as a linear combination of valence bond structures. These structures explicitly include covalent and ionic structures. Modern ab initio valence bond theory calculations of representative primary and tertiary ammonium protic ionic liquids indicate that modern ab initio valence bond theory can be employed to assess the acidity and ionicity of protic ionic liquids a priori.
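For reference, the valence bond ansatz the abstract mentions has the schematic form below (notation mine; for a protic acid-base pair, the relative weights of the covalent and ion-pair structures are what such an ionicity assessment would examine):

```latex
\Psi_{\mathrm{VB}} \;=\; \sum_{K} c_{K}\,\Phi_{K}
\;=\; c_{\mathrm{cov}}\,\Phi_{\mathrm{cov}} \;+\; c_{\mathrm{ion}}\,\Phi_{\mathrm{ion}} \;+\; \cdots
```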
WE-D-303-00: Computational Phantoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, John; Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA
2015-06-15
Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient's anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: Understand the need and requirements of computational phantoms in medical physics research Discuss the developments and applications of computational phantoms Know the promises and limitations of computational phantoms in solving complex problems.
Stochastic Feedforward Control Technique
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1990-01-01
Class of commanded trajectories modeled as stochastic process. Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center aimed at developing capabilities for increases in capacities of airports, safe and accurate flight in adverse weather conditions including shear, winds, avoidance of wake vortexes, and reduced consumption of fuel. Advances in techniques for design of modern controls and increased capabilities of digital flight computers coupled with accurate guidance information from Microwave Landing System (MLS). Stochastic feedforward control technique developed within context of ATOPS program.
NASA Astrophysics Data System (ADS)
Clementi, Enrico
2012-06-01
This is the introductory chapter to the AIP Proceedings volume "Theory and Applications of Computational Chemistry: The First Decade of the Second Millennium," in which we discuss the evolution of "computational chemistry." Very early variational computational chemistry developments are reported in Sections 1 to 7, 11, and 12, recalling some of the computational chemistry contributions by the author and his collaborators (from the late 1950s to the mid 1990s); perturbation techniques are not considered in this already extensive work. Present-day computational chemistry is partly considered in Sections 8 to 10, where more recent studies by the author and his collaborators are discussed, including the Hartree-Fock-Heitler-London method; a more general discussion of present-day computational chemistry is presented in Section 14. The following chapters of this AIP volume provide a view of modern computational chemistry. Future computational chemistry developments can be extrapolated from the chapters of this AIP volume; further, Sections 13 and 15 present an overall analysis of computational chemistry, obtained from the Global Simulation approach, by considering the evolution of scientific knowledge confronted with the opportunities offered by modern computers.
An Intelligent Systems Approach to Automated Object Recognition: A Preliminary Study
Maddox, Brian G.; Swadley, Casey L.
2002-01-01
Attempts at fully automated object recognition systems have met with varying levels of success over the years. However, none of the systems have achieved high enough accuracy rates to be run unattended. One of the reasons for this may be that they are designed from the computer's point of view and rely mainly on image-processing methods. A better solution to this problem may be to make use of modern advances in computational intelligence and distributed processing to try to mimic how the human brain is thought to recognize objects. As humans combine cognitive processes with detection techniques, such a system would combine traditional image-processing techniques with computer-based intelligence to determine the identity of various objects in a scene.
Application of contrast media in post-mortem imaging (CT and MRI).
Grabherr, Silke; Grimm, Jochen; Baumann, Pia; Mangin, Patrice
2015-09-01
The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered for the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific for each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider by using different media.
The Architecture of Information at Plateau Beaubourg
ERIC Educational Resources Information Center
Branda, Ewan Edward
2012-01-01
During the course of the 1960s, computers and information networks made their appearance in the public imagination. To architects on the cusp of architecture's postmodern turn, information technology offered new forms, metaphors, and techniques by which modern architecture's technological and utopian basis could be reasserted. Yet by the end of…
Researching for Public Speaking Classes: Requiring Various Media.
ERIC Educational Resources Information Center
Greenberg, Cindy A.
This paper presents steps for incorporating research into a community college public speaking class curriculum. It outlines, from an historical perspective, the different techniques and tools of research, including face-to-face contact, the printing press, modern machinery, and computers. It suggests discussing research surveys and taking students…
Developing and Assessing E-Learning Techniques for Teaching Forecasting
ERIC Educational Resources Information Center
Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian
2014-01-01
In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…
Low Cost Alternatives to Commercial Lab Kits for Physics Experiments
ERIC Educational Resources Information Center
Kodejška, C.; De Nunzio, G.; Kubinek, R.; Ríha, J.
2015-01-01
Conducting experiments in physics using modern measuring techniques, particularly those utilizing computers, is often much more attractive to students than conducting experiments conventionally. However, professional kits are still too expensive for many schools in the Czech Republic. The basic equipment for one student workplace…
A parallel Jacobson-Oksman optimization algorithm. [parallel processing (computers)]
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Markos, A. T.
1975-01-01
A gradient-dependent optimization technique which exploits the vector-streaming or parallel-computing capabilities of some modern computers is presented. The algorithm, derived by assuming that the function to be minimized is homogeneous, is a modification of the Jacobson-Oksman serial minimization method. In addition to describing the algorithm, conditions insuring the convergence of the iterates of the algorithm and the results of numerical experiments on a group of sample test functions are presented. The results of these experiments indicate that this algorithm will solve optimization problems in less computing time than conventional serial methods on machines having vector-streaming or parallel-computing capabilities.
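A hedged sketch of the homogeneity relation such methods exploit: if f is homogeneous of degree gamma about its minimizer x*, Euler's theorem gives gamma(f(x) - f(x*)) = (x - x*)^T grad f(x), so each sampled point contributes one linear equation in the unknowns (x*, gamma, gamma f(x*)). The independent function and gradient evaluations are what a vector-streaming machine can run concurrently. This illustrates the underlying idea only, not the paper's actual parallel algorithm.

```python
import numpy as np

def f(x):    # test function, homogeneous of degree 2 about x* = (1, 2)
    return (x[0] - 1.0)**2 + 3.0 * (x[1] - 2.0)**2

def grad(x):
    return np.array([2.0 * (x[0] - 1.0), 6.0 * (x[1] - 2.0)])

# Euler relation rearranged:  g^T x* + gamma f(x) - (gamma f*) = x^T g,
# one linear equation per sample in unknowns u = [x*, gamma, gamma*f*].
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))        # 6 sample points (evaluable in parallel)
rows, rhs = [], []
for x in X:
    g = grad(x)
    rows.append(np.concatenate([g, [f(x), -1.0]]))  # [g^T, f, -1]
    rhs.append(x @ g)                               # x^T g
u, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
x_star, gamma = u[:2], u[2]
print(x_star, gamma)   # ~ [1, 2] and 2 for this quadratic
```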
A survey of GPU-based acceleration techniques in MRI reconstructions.
Wang, Haifeng; Peng, Hanchuan; Chang, Yuchou; Liang, Dong
2018-03-01
Image reconstruction in magnetic resonance imaging (MRI) clinical applications has become increasingly complicated. However, diagnosis and treatment require very fast computational procedures. Modern competitive platforms of graphics processing units (GPUs) have been used to make high-performance parallel computations available, and attractive to common consumers, for computing massively parallel reconstruction problems at commodity prices. GPUs have also become more and more important for reconstruction computations, especially as deep learning begins to be applied to MRI reconstruction. The motivation of this survey is to review the image reconstruction schemes of GPU computing for MRI applications and provide a summary reference for researchers in the MRI community. PMID:29675361
Digital image processing for information extraction.
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1973-01-01
The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.
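As a small example of one such correction, the sketch below removes vignetting (nonuniform intensity response) by dividing a raw frame by a normalized flat-field; the synthetic arrays are stand-ins for real camera data, and a real pipeline would calibrate the flat-field from measurements of a uniformly illuminated target.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(100, 200, size=(256, 256))   # "true" scene (synthetic)

# Synthetic radial vignetting: image darkens toward the corners.
yy, xx = np.mgrid[0:256, 0:256]
r2 = ((xx - 128)**2 + (yy - 128)**2) / 128.0**2
vignette = 1.0 - 0.4 * r2

raw = scene * vignette                # what the camera records
flat = vignette / vignette.mean()     # normalized flat-field calibration
corrected = raw / flat                # vignetting removed (up to a constant)

print(np.allclose(corrected / corrected.mean(), scene / scene.mean()))
```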
Computer vision applications for coronagraphic optical alignment and image processing.
Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A
2013-05-10
Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.
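A minimal sketch of the generic feature-extraction step such alignment tasks build on: threshold the image, label connected components, and compute spot centroids. The synthetic frame stands in for a real calibration image; this is not the Gemini Planet Imager's actual pipeline.

```python
import numpy as np
from scipy import ndimage

# Synthetic frame with three Gaussian "calibration spots".
frame = np.zeros((128, 128))
yy, xx = np.mgrid[0:128, 0:128]
for cy, cx in [(30, 40), (90, 60), (64, 100)]:
    frame += np.exp(-((yy - cy)**2 + (xx - cx)**2) / 8.0)

mask = frame > 0.5 * frame.max()                 # simple global threshold
labels, nspots = ndimage.label(mask)             # connected-component labeling
centroids = ndimage.center_of_mass(frame, labels, range(1, nspots + 1))
print(nspots, centroids)                         # spot count and (y, x) centers
```

Comparing the recovered centroids against their expected positions is the basic measurement from which alignment offsets can then be computed.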
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
Analytical techniques for identification and study of organic matter in returned lunar samples
NASA Technical Reports Server (NTRS)
Burlingame, A. L.
1974-01-01
The results of geochemical research are reviewed. Emphasis is placed on the contribution of mass spectrometric data to the solution of specific structural problems. Information on the mass spectrometric behavior of compounds of geochemical interest is reviewed and currently available techniques of particular importance to geochemistry, such as gas chromatograph-mass spectrometer coupling, modern sample introduction methods, and computer application in high resolution mass spectrometry, receive particular attention.
A Survey Of Techniques for Managing and Leveraging Caches in GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
2014-09-01
Initially introduced as special-purpose accelerators for graphics applications, graphics processing units (GPUs) have now emerged as general purpose computing platforms for a wide range of applications. To address the requirements of these applications, modern GPUs include sizable hardware-managed caches. However, several factors, such as the unique architecture of the GPU, the rise of CPU–GPU heterogeneous computing, etc., demand effective management of caches to achieve high performance and energy efficiency. Recently, several techniques have been proposed for this purpose. In this paper, we survey several architectural and system-level techniques proposed for managing and leveraging GPU caches. We also discuss the importance and challenges of cache management in GPUs. The aim of this paper is to provide the readers insights into cache management techniques for GPUs and motivate them to propose even better techniques for leveraging the full potential of caches in the GPUs of tomorrow.
Modern adjuncts and technologies in microsurgery: an historical and evidence-based review.
Pratt, George F; Rozen, Warren M; Chubb, Daniel; Whitaker, Iain S; Grinsell, Damien; Ashton, Mark W; Acosta, Rafael
2010-11-01
While modern reconstructive surgery was revolutionized with the introduction of microsurgical techniques, microsurgery itself has seen the introduction of a range of technological aids and modern techniques aiming to improve dissection times, anastomotic times, and overall outcomes. These include improved preoperative planning, anastomotic aids, and earlier detection of complications with higher salvage rates. Despite the potential for substantial impact, many of these techniques have been evaluated in a limited fashion, and the evidence for each has not been universally explored. The purpose of this review was to establish and quantify the evidence for each technique. A search of relevant medical databases was performed to identify literature providing evidence for each technology. Levels of evidence were thus accumulated and applied to each technique. There is a relative paucity of evidence for many of the more recent technologies described in the field of microsurgery, with no randomized controlled trials, and most studies in the field comprising case series only. Current evidence-based suggestions include the use of computed tomographic angiography (CTA) for the preoperative planning of perforator flaps, the intraoperative use of a mechanical anastomotic coupling aid (particularly the Unilink® coupler), and postoperative flap monitoring with strict protocols using clinical bedside monitoring and/or the implantable Doppler probe. Despite the breadth of technologies introduced into the field of microsurgery, there is substantial variation in the degree of evidence presented for each, suggesting a role for much future research, particularly from emerging technologies such as robotics and modern simulators. Copyright © 2010 Wiley-Liss, Inc.
ERIC Educational Resources Information Center
Lee, Mark J. W.; Pradhan, Sunam; Dalgarno, Barney
2008-01-01
Modern information technology and computer science curricula employ a variety of graphical tools and development environments to facilitate student learning of introductory programming concepts and techniques. While the provision of interactive features and the use of visualization can enhance students' understanding and assist them in grasping…
User’s guide to SNAP for ArcGIS®: ArcGIS interface for scheduling and network analysis program
Woodam Chung; Dennis Dykstra; Fred Bower; Stephen O’Brien; Richard Abt; John Sessions
2012-01-01
This document introduces computer software named SNAP for ArcGIS®, which has been developed to streamline scheduling and transportation planning for timber harvest areas. Using modern optimization techniques, it can be used to spatially schedule timber harvests with consideration of harvesting costs, multiple products, alternative...
The use of modern measurement techniques for designing pro ecological constructions
NASA Astrophysics Data System (ADS)
Wieczorowski, Michał; Gapiński, Bartosz; Szymański, Maciej; Rękas, Artur
2017-10-01
This paper presents possible applications of modern length- and angle-metrology techniques to the design of constructions that support ecology. The paper is based on a project in which a lighter bus and train car seat was developed. Different options are presented, including static and dynamic photogrammetry, computed tomography, and thermography. Research related to the dynamic behaviour of the designed structures provided input for determining the deformation of a seat, and of passengers sitting on it, during traffic accidents. Work connected to the strength of construction elements made it possible to optimize their dimensions while maintaining proper durability. Metrological actions taken in relation to production machines and equipment enabled better recognition of phenomena that take place during the manufacturing process and correction of its parameters, which in turn also contributed to slimming down the construction.
PET/CT in Radiation Therapy Planning.
Specht, Lena; Berthelsen, Anne Kiil
2018-01-01
Radiation therapy (RT) is an important component of the management of lymphoma patients. Most lymphomas are metabolically active and accumulate 18F-fluorodeoxyglucose (FDG). Positron emission tomography with computed tomography (PET/CT) imaging using FDG is used routinely in staging and treatment evaluation. FDG-PET/CT imaging is now also used routinely for contouring the target for RT, and has been shown to change the irradiated volume significantly compared with CT imaging alone. Modern advanced imaging techniques with image fusion and motion management, in combination with modern highly conformal RT techniques, have increased the precision of RT and have made it possible to dramatically reduce the risks of long-term side effects of treatment while maintaining the high cure rates for these diseases. Copyright © 2017 Elsevier Inc. All rights reserved.
Applications of modern statistical methods to analysis of data in physical science
NASA Astrophysics Data System (ADS)
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970s, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies.

The first section of this work showcases applications of modern linear regression. Since the 1960s, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science.

The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960s and 1970s respectively, formed the basis of multivariate cluster analysis methodology for many years. However, shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic-algorithm-based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We showcase how these algorithms can be used to process multivariate data from astronomical observations.
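To illustrate the flavor of information-score-based selection of the number of mixture components, here is a minimal sketch using off-the-shelf EM and BIC; scikit-learn's GaussianMixture stands in for the thesis's genetic-algorithm-based GEM routine, and the data are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic data drawn from three Gaussian clusters.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal([0, 0], 0.5, (200, 2)),
               rng.normal([4, 1], 0.7, (150, 2)),
               rng.normal([1, 5], 0.6, (180, 2))])

# Fit mixtures with k = 1..6 components; score each fit with BIC.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 7)}
print(min(bics, key=bics.get))   # lowest BIC -> chosen component count (3 here)
```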
Computational Analysis of Behavior.
Egnor, S E Roian; Branson, Kristin
2016-07-08
In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important both to obtaining biologically relevant behavioral data and to enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.
Control Law Design in a Computational Aeroelasticity Environment
NASA Technical Reports Server (NTRS)
Newsom, Jerry R.; Robertshaw, Harry H.; Kapania, Rakesh K.
2003-01-01
A methodology for designing active control laws in a computational aeroelasticity environment is given. The methodology involves employing a systems identification technique to develop an explicit state-space model for control law design from the output of a computational aeroelasticity code. The particular computational aeroelasticity code employed in this paper solves the transonic small disturbance aerodynamic equation using a time-accurate, finite-difference scheme. Linear structural dynamics equations are integrated simultaneously with the computational fluid dynamics equations to determine the time responses of the structure. These structural responses are employed as the input to a modern systems identification technique that determines the Markov parameters of an "equivalent linear system". The Eigensystem Realization Algorithm is then employed to develop an explicit state-space model of the equivalent linear system. The Linear Quadratic Gaussian control law design technique is employed to design a control law. The computational aeroelasticity code is modified to accept control laws and perform closed-loop simulations. Flutter control of a rectangular wing model is chosen to demonstrate the methodology. Various cases are used to illustrate the usefulness of the methodology as the nonlinearity of the aeroelastic system is increased through increased angle-of-attack changes.
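The identification-realization step lends itself to a compact illustration. Below is a hedged, single-input/single-output sketch of the Eigensystem Realization Algorithm; the Markov parameters are generated from a toy discrete system rather than from an aeroelastic simulation, and the Hankel dimensions are arbitrary choices.

```python
# Minimal single-input/single-output ERA sketch: realize a discrete
# state-space model (A, B, C) from Markov parameters y_k ~ C A^(k-1) B.
import numpy as np
from scipy.linalg import hankel, svd

def era(markov, order, rows=20, cols=20):
    h0 = hankel(markov[1:rows + 1], markov[rows:rows + cols])          # H(0)
    h1 = hankel(markov[2:rows + 2], markov[rows + 1:rows + cols + 1])  # H(1)
    u, s, vt = svd(h0)
    sr = np.diag(np.sqrt(s[:order]))
    ur, vr = u[:, :order], vt[:order, :]
    a = np.linalg.inv(sr) @ ur.T @ h1 @ vr.T @ np.linalg.inv(sr)
    b = (sr @ vr)[:, :1]              # first input column
    c = (ur @ sr)[:1, :]              # first output row
    return a, b, c

# Toy "identified" system standing in for aeroelastic response data.
A = np.array([[0.9, 0.2], [-0.2, 0.9]])
B = np.array([[1.0], [0.0]])
C = np.array([[1.0, 0.0]])
markov = [0.0] + [(C @ np.linalg.matrix_power(A, k) @ B).item()
                  for k in range(60)]            # markov[k] = C A^(k-1) B
Ahat, _, _ = era(np.array(markov), order=2)
print(np.sort(np.linalg.eigvals(Ahat)), np.sort(np.linalg.eigvals(A)))
```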
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
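The scaling claim is easy to demonstrate on a toy linear problem. The sketch below is purely illustrative (a dense linear "simulation", not a CFD adjoint): a single adjoint solve produces the gradient of the output with respect to all 200 parameters, verified against a finite difference for one of them.

```python
# Toy illustration of adjoint efficiency: for R(u, p) = K u - P p = 0 and
# J(u) = w^T u, one adjoint solve K^T lam = w yields dJ/dp = lam^T P for
# every parameter at once.
import numpy as np

n, m = 50, 200                         # states, design parameters
rng = np.random.default_rng(1)
K = 2 * np.eye(n) + rng.normal(0, 0.01, (n, n))   # toy "flow" operator
P = rng.normal(size=(n, m))            # forcing depends linearly on p
w = rng.normal(size=n)
p = rng.normal(size=m)

u = np.linalg.solve(K, P @ p)          # one forward "simulation"
J = w @ u                              # simulation output of interest

lam = np.linalg.solve(K.T, w)          # one extra (adjoint) solve
grad = lam @ P                         # sensitivities w.r.t. all m parameters

# Finite differences would need one extra solve *per parameter*; check one.
eps = 1e-6
dp = np.zeros(m); dp[3] = eps
print(grad[3], (w @ np.linalg.solve(K, P @ (p + dp)) - J) / eps)
```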
Lisasi, Esther; Kulanga, Ahaz; Muiruri, Charles; Killewo, Lucy; Fadhili, Ndimangwa; Mimano, Lucy; Kapanda, Gibson; Tibyampansha, Dativa; Ibrahim, Glory; Nyindo, Mramba; Mteta, Kien; Kessi, Egbert; Ntabaye, Moshi; Bartlett, John
2014-08-01
The Kilimanjaro Christian Medical University (KCMU) College and the Medical Education Partnership Initiative (MEPI) are addressing the crisis in Tanzanian health care manpower by modernizing the college's medical education with new tools and techniques. With a $10 million MEPI grant and the participation of its partner, Duke University, KCMU is harnessing the power of information technology (IT) to upgrade tools for students and faculty. Initiatives in eLearning have included bringing fiber-optic connectivity to the campus, offering campus-wide wireless access, opening student and faculty computer laboratories, and providing computer tablets to all incoming medical students. Beyond IT, the college is also offering wet laboratory instruction for hands-on diagnostic skills, team-based learning, and clinical skills workshops. In addition, modern teaching tools and techniques address the challenges posed by increasing numbers of students. To provide incentives for instructors, a performance-based compensation plan and teaching awards have been established. Also for faculty, IT tools and training have been made available, and a medical education course management system is now being widely employed. Student and faculty responses have been favorable, and the rapid uptake of these interventions by students, faculty, and the college's administration suggests that the KCMU College MEPI approach has addressed unmet needs. This enabling environment has transformed the culture of learning and teaching at KCMU College, where a path to sustainability is now being pursued.
WE-D-303-01: Development and Application of Digital Human Phantoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Segars, P.
2015-06-15
Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient's anatomy and physiology. Imaging data can be generated from it as if it were a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: Understand the need and requirements of computational phantoms in medical physics research; Discuss the developments and applications of computational phantoms; Know the promises and limitations of computational phantoms in solving complex problems.
Parallel algorithm for computation of second-order sequential best rotations
NASA Astrophysics Data System (ADS)
Redif, Soydan; Kasap, Server
2013-12-01
Algorithms for computing an approximate polynomial matrix eigenvalue decomposition of para-Hermitian systems have emerged as a powerful, generic signal processing tool. A technique that has shown much success in this regard is the sequential best rotation (SBR2) algorithm. Proposed is a scheme for parallelising SBR2 with a view to exploiting the modern architectural features and inherent parallelism of field-programmable gate array (FPGA) technology. Experiments show that the proposed scheme can achieve low execution times while requiring minimal FPGA resources.
Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Xipeng
The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group achieved significant progress towards the goal, producing a set of novel techniques for improving the memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ‧-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
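The final step, predicting segregation energies without new physics-based runs, can be sketched generically. Everything below is a stand-in under stated assumptions: invented descriptors, synthetic energies, and a random forest where the paper does not specify the model.

```python
# Hedged sketch of the machine learning step: fit a surrogate mapping
# elemental descriptors to segregation energies so new solutes can be
# screened without further DFT supercell calculations.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_solutes = 34
# Hypothetical descriptors: size mismatch, electronegativity, cohesive
# energy, valence electron count (invented for illustration).
X = rng.normal(size=(n_solutes, 4))
y = (0.8 * X[:, 0] - 0.5 * X[:, 1] ** 2 + 0.2 * X[:, 2]
     + rng.normal(0, 0.05, n_solutes))          # synthetic energies (eV)

model = RandomForestRegressor(n_estimators=300, random_state=0)
print("cross-validated R^2:", cross_val_score(model, X, y, cv=5).mean())
model.fit(X, y)
print("prediction for a new solute:", model.predict(rng.normal(size=(1, 4)))[0])
```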
2010-03-01
The goal of this project is to develop a novel, clinically useful delivered-dose verification protocol for modern prostate VMAT using an Electronic Portal Imaging Device (EPID) and onboard Cone beam Computed Tomography (CBCT). A number of important milestones have been accomplished, including (i) a calibrated CBCT HU vs. electron density curve; (ii) …
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.
2014-08-15
Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, featuring distributed and parallel computation more heavily and seamlessly, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created and currently available analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe the internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.
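A minimal sketch of the distribution pattern described, assuming nothing about DICOMautomaton's actual API: independent dose-volume computations are farmed out to a worker pool and aggregated asynchronously as they complete.

```python
# Hedged sketch (not DICOMautomaton's API): distribute independent
# dose-volume computations across workers; results arrive out of order.
import numpy as np
from concurrent.futures import ProcessPoolExecutor, as_completed

def dose_volume_metric(args):
    patient_id, seed = args
    rng = np.random.default_rng(seed)
    dose = rng.gamma(shape=4.0, scale=12.0, size=100_000)  # stand-in dose grid (Gy)
    v20 = float((dose >= 20.0).mean())  # volume fraction receiving >= 20 Gy
    return patient_id, v20

if __name__ == "__main__":
    jobs = [(f"patient-{i:03d}", i) for i in range(32)]
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(dose_volume_metric, j) for j in jobs]
        for fut in as_completed(futures):
            pid, v20 = fut.result()
            print(pid, f"V20 = {v20:.3f}")
```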
A Stimulating Approach To Teaching, Learning and Assessing Finite Element Methods: A Case Study.
ERIC Educational Resources Information Center
Karadelis, J. N.
1998-01-01
Examines the benefits of introducing finite element methods into the curriculum of undergraduate courses. Analyzes the structure of the computer-assisted-design module and the extent to which it fulfills its main objectives. Discusses the efficiency of modern teaching and learning techniques used to develop skills for solving engineering problems;…
ERIC Educational Resources Information Center
Hacisalihoglu, Gokhan; Hilgert, Uwe; Nash, E. Bruce; Micklos, David A.
2008-01-01
Today's biology educators face the challenge of training their students in modern molecular biology techniques including genomics and bioinformatics. The Dolan DNA Learning Center (DNALC) of Cold Spring Harbor Laboratory has developed and disseminated a bench- and computer-based plant genomics curriculum for biology faculty. In 2007, a five-day…
Optics for Processes, Products and Metrology
NASA Astrophysics Data System (ADS)
Mather, George
1999-04-01
Optical physics has a variety of applications in industry, including process inspection, coatings development, vision instrumentation, spectroscopy, and many others. Optics has been used extensively in the design of solar energy collection systems and coatings, for example. Also, with the availability of good CCD cameras and fast computers, it has become possible to develop real-time inspection and metrology devices that can accommodate the high throughputs encountered in modern production processes. More recently, developments in moiré interferometry show great promise for applications in the basic metals and electronics industries. The talk will illustrate applications of optics by discussing process inspection techniques for defect detection, part dimensioning, birefringence measurement, and the analysis of optical coatings in the automotive, glass, and optical disc industries. In particular, examples of optical techniques for the quality control of CD-R, MO, and CD-RW discs will be presented. In addition, the application of optical concepts to solar energy collector design and to metrology by moiré techniques will be discussed. Finally, some of the modern techniques and instruments used for qualitative and quantitative material analysis will be presented.
Mostafavi, Kamal; Tutunea-Fatan, O Remus; Bordatchev, Evgueni V; Johnson, James A
2014-12-01
The strong advent of computer-assisted technologies in modern orthopedic surgery calls for the expansion of computationally efficient techniques built on the broad base of readily available computer-aided engineering tools. However, one of the common challenges during the current developmental phase remains the lack of reliable frameworks for fast and precise conversion of anatomical information acquired through computed tomography to a format acceptable to computer-aided engineering software. To address this, this study proposes an integrated and automatic framework capable of extracting and then postprocessing the original imaging data into a common planar, closed B-Spline representation. The core of the developed platform relies on the approximation of the discrete computed tomography data by means of an original two-step B-Spline fitting technique based on successive deformations of the control polygon. In addition to its rapidity and robustness, the developed fitting technique was validated to produce accurate representations that do not deviate by more than 0.2 mm with respect to alternate representations of the bone geometry obtained through different contact-based data acquisition or data processing methods. © IMechE 2014.
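The end product, a planar closed B-spline approximating a CT contour, can be sketched with library tools. The example below uses scipy's splprep on a synthetic contour; it stands in for, and is not, the paper's two-step control-polygon deformation technique.

```python
# Fit a closed planar B-spline to discrete contour points (synthetic data
# standing in for a CT bone-slice contour) and measure the deviation.
import numpy as np
from scipy.interpolate import splprep, splev

theta = np.linspace(0, 2 * np.pi, 80, endpoint=False)
r = 20 + 2 * np.sin(3 * theta)                 # wavy closed contour (mm)
x, y = r * np.cos(theta), r * np.sin(theta)
x, y = np.append(x, x[0]), np.append(y, y[0])  # close the contour explicitly

tck, _ = splprep([x, y], s=1.0, per=True)      # periodic => closed B-spline
xf, yf = splev(np.linspace(0, 1, 400), tck)    # dense evaluation of the fit

err = np.hypot(xf[:, None] - x, yf[:, None] - y).min(axis=0)
print("max deviation from input points (mm):", err.max())
```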
Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R
2004-11-21
Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have now become a vital component of modern computational immunovaccinology. Historically, such approaches have been built around semi-qualitative classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-Db, H2-Kb and H2-Kk. As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online (http://www.jenner.ac.uk/MHCPred).
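The additive idea can be sketched as a linear model over one-hot residue indicators, where each coefficient is the binding contribution of one amino acid at one position. The data below are random stand-ins, not real affinity measurements, and ridge regression substitutes for the method's actual regression machinery.

```python
# Illustrative additive-style QSAR: one-hot peptide encoding + linear model.
import numpy as np
from sklearn.linear_model import Ridge

AA = "ACDEFGHIKLMNPQRSTVWY"
rng = np.random.default_rng(3)

def one_hot(peptide):
    v = np.zeros(9 * 20)                    # 9-mer positions x 20 amino acids
    for pos, aa in enumerate(peptide):
        v[pos * 20 + AA.index(aa)] = 1.0
    return v

true_w = rng.normal(0, 0.3, 9 * 20)         # hidden additive contributions
peptides = ["".join(rng.choice(list(AA), 9)) for _ in range(400)]
X = np.array([one_hot(p) for p in peptides])
y = X @ true_w + rng.normal(0, 0.1, len(peptides))   # synthetic pIC50 values

model = Ridge(alpha=1.0).fit(X, y)
print("predicted affinity:", model.predict([one_hot("SIINFEKLV")])[0])
```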
NASA Astrophysics Data System (ADS)
Stock, Michala K.; Stull, Kyra E.; Garvin, Heather M.; Klales, Alexandra R.
2016-10-01
Forensic anthropologists are routinely asked to estimate a biological profile (i.e., age, sex, ancestry and stature) from a set of unidentified remains. In contrast to the abundance of collections and techniques associated with adult skeletons, there is a paucity of modern, documented subadult skeletal material, which limits the creation and validation of appropriate forensic standards. Many are forced to use antiquated methods derived from small sample sizes which, given documented secular changes in the growth and development of children, are not appropriate for application in the medico-legal setting. Therefore, the aim of this project is to use multi-slice computed tomography (MSCT) data from a large, diverse sample of modern subadults to develop new methods to estimate subadult age and sex for practical forensic applications. The research sample will consist of over 1,500 full-body MSCT scans of modern subadult individuals (aged birth to 20 years) obtained from two U.S. medical examiners' offices. Statistical analysis of epiphyseal union scores, long bone osteometrics, and os coxae landmark data will be used to develop modern subadult age and sex estimation standards. This project will result in a database of information gathered from the MSCT scans, as well as the creation of modern, statistically rigorous standards for skeletal age and sex estimation in subadults. Furthermore, the research and methods developed in this project will be applicable to dry bone specimens, MSCT scans, and radiographic images, thus providing both tools and continued access to data for forensic practitioners in a variety of settings.
An introduction to real-time graphical techniques for analyzing multivariate data
NASA Astrophysics Data System (ADS)
Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner
1987-08-01
Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".
Preliminary design data package, appendices C1 and C2
NASA Technical Reports Server (NTRS)
1980-01-01
The HYBRID2 program, which computes the fuel and energy consumption of a hybrid vehicle with a bi-modal control strategy over specified component driving cycles, is described. Fuel and energy consumption are computed separately for the two modes of operation. The program also computes yearly average fuel and energy consumption using a composite driving cycle which varies as a function of daily travel. The modelling techniques are described, and subroutines and their functions are given. The composition of modern automobiles is discussed, along with the energy required to manufacture an American automobile. The energy required to scrap and recycle automobiles is also discussed.
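The bi-modal accounting reduces to a simple weighted calculation, sketched below with invented constants; this is not HYBRID2's actual model, and an electric-first control strategy plus a four-bin travel distribution are assumed purely for illustration.

```python
# Toy composite-cycle calculation: per-mode consumption for one day, then a
# yearly average weighted by a daily-travel distribution.
def daily_consumption(miles, electric_range,
                      wh_per_mile=300.0, gal_per_mile=1 / 35.0):
    electric = min(miles, electric_range)          # mode 1: battery
    engine = max(miles - electric_range, 0.0)      # mode 2: heat engine
    return electric * wh_per_mile, engine * gal_per_mile

# (daily miles, fraction of days) -- an invented composite driving mix.
travel_mix = [(10, 0.40), (30, 0.35), (60, 0.20), (150, 0.05)]
wh = 365 * sum(f * daily_consumption(mi, 40)[0] for mi, f in travel_mix)
gal = 365 * sum(f * daily_consumption(mi, 40)[1] for mi, f in travel_mix)
print(f"yearly average: {wh / 1000:.0f} kWh electricity, {gal:.0f} gal fuel")
```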
Human-machine interface for a VR-based medical imaging environment
NASA Astrophysics Data System (ADS)
Krapichler, Christian; Haubner, Michael; Loesch, Andreas; Lang, Manfred K.; Englmeier, Karl-Hans
1997-05-01
Modern 3D scanning techniques like magnetic resonance imaging (MRI) or computed tomography (CT) produce high-quality images of the human anatomy. Virtual environments open new ways to display and to analyze those tomograms. Compared with today's inspection of 2D image sequences, physicians are empowered to recognize spatial coherencies and examine pathological regions more easily, and diagnosis and therapy planning can be accelerated. For that purpose a powerful human-machine interface is required, which offers a variety of tools and features to enable both exploration and manipulation of the 3D data. Man-machine communication has to be intuitive and efficacious to avoid long learning periods and to enhance familiarity with and acceptance of the interface. Hence, interaction capabilities in virtual worlds should be comparable to those in the real world to allow utilization of our natural experiences. In this paper the integration of hand gestures and visual focus, two important aspects in modern human-computer interaction, into a medical imaging environment is shown. With the presented human-machine interface, including virtual reality displaying and interaction techniques, radiologists can be supported in their work. Further, virtual environments can even facilitate communication between specialists from different fields or in educational and training applications.
Proton Upset Monte Carlo Simulation
NASA Technical Reports Server (NTRS)
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment, based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
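The simulation loop can be caricatured in a few lines. The sketch below is a toy Monte Carlo in the spirit of the description, not PROPSET's physics: the interaction probability, fragment energy distribution, and upset threshold are all invented constants.

```python
# Toy proton-upset Monte Carlo: sample nuclear interactions and count the
# fragments depositing enough energy in a sensitive volume to flip a bit.
import numpy as np

rng = np.random.default_rng(42)
n_protons = 1_000_000
p_interact = 1e-5              # per-proton nuclear interaction chance (toy)
critical_mev = 5.0             # deposited energy needed for an upset (toy)

interactions = int(rng.binomial(n_protons, p_interact))
deposited = rng.exponential(scale=3.0, size=interactions)  # fragment energy
upsets = int((deposited > critical_mev).sum())
print(f"{interactions} interactions, {upsets} upsets "
      f"-> {upsets / n_protons:.2e} upsets per proton")
```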
Chen, Xiaodong; Ren, Liqiang; Zheng, Bin; Liu, Hong
2013-01-01
Conventional optical microscopes have been used widely in scientific research and in clinical practice. Modern digital microscopic devices combine the power of optical imaging with computerized analysis, archiving, and communication techniques. They have great potential in pathological examinations for improving the efficiency and accuracy of clinical diagnosis. This chapter reviews the basic optical principles of conventional microscopes, fluorescence microscopes and electron microscopes. The recent developments and future clinical applications of advanced digital microscopic imaging methods and computer-assisted diagnosis schemes are also discussed.
Business continuity strategies for cyber defence: battling time and information overload.
Streufert, John
2010-11-01
Can the same numbers and letters which are the lifeblood of modern business and government computer systems be harnessed to protect computers from attack against known information security risks? For the past seven years, Foreign Service officers and technicians of the US Government have sought to maintain diplomatic operations in the face of rising cyber attacks and to test the hypothesis that an ounce of prevention is worth a pound of cure. As eight out of ten attacks leverage known computer security vulnerabilities or configuration setting weaknesses, a pound of cure would seem to be easy to come by. Yet modern security tools present an unusually consequential threat to business continuity: too much rather than too little information on cyber problems is presented, harking back to a phenomenon cited by social scientists in the 1960s called 'information overload'. Experience indicates that the longer the most serious cyber problems go untreated, the wider the attack surface adversaries can find. One technique used at the Department of State, called 'risk scoring', resulted in an 89 per cent overall reduction in measured risk over 12 months for the Department of State's servers and personal computers. Later refinements of risk scoring enabled technicians to correct unique security threats with unprecedented speed. This paper explores how the use of metrics, special care in presenting information to technicians and executives alike, as well as tactical use of organisational incentives can result in stronger cyber defences protecting modern organisations.
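As a hedged sketch of the risk-scoring idea (the actual Department of State weighting scheme is not given here, so the severities and aging rate below are invented), untreated findings accrue score with age so that the most urgent problems surface first rather than drowning technicians in undifferentiated alerts.

```python
# Toy risk scoring: base severity grows the longer a finding stays open,
# and hosts are ranked so technicians see the worst problems first.
from dataclasses import dataclass

@dataclass
class Finding:
    host: str
    severity: float            # CVSS-like base score, 0-10 (illustrative)
    days_open: int

def score(f: Finding, aging_per_day: float = 0.1) -> float:
    # Older untreated problems widen the attack surface, so age adds risk.
    return f.severity * (1.0 + aging_per_day * f.days_open)

findings = [Finding("mail-01", 9.8, 2), Finding("web-07", 5.0, 90),
            Finding("db-03", 7.5, 30)]
for f in sorted(findings, key=score, reverse=True):
    print(f"{f.host:8s} risk={score(f):6.1f}")
```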
Using machine learning to accelerate sampling-based inversion
NASA Astrophysics Data System (ADS)
Valentine, A. P.; Sambridge, M.
2017-12-01
In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods, such as the Neighbourhood Algorithm, and bridges the gap between prior- and posterior-sampling frameworks.
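A compressed sketch of the idea under stated assumptions follows: a one-parameter toy forward operator replaces the seismogram calculation, a Gaussian Process from scikit-learn serves as the cheap surrogate inside a Metropolis sampler, and the surrogate is periodically refined with exact evaluations.

```python
# GP surrogate accelerating a toy sampling-based inversion (illustrative).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def forward(m):                        # stand-in for a costly simulation
    return np.sin(3 * m) + 0.5 * m ** 2

obs, noise = forward(0.7), 0.05        # "data" generated at m = 0.7
X = np.linspace(-2, 2, 8).reshape(-1, 1)     # initial design: 8 exact runs
y = forward(X).ravel()
gp = GaussianProcessRegressor(RBF(length_scale=1.0), alpha=1e-6).fit(X, y)

def log_post(m):                       # surrogate likelihood, flat prior
    return -0.5 * ((gp.predict([[m]])[0] - obs) / noise) ** 2

rng, m, samples = np.random.default_rng(0), 0.0, []
for i in range(3000):
    prop = m + 0.2 * rng.normal()
    if abs(prop) <= 2 and np.log(rng.random()) < log_post(prop) - log_post(m):
        m = prop
    if i % 500 == 0:                   # refine surrogate with exact physics
        X = np.vstack([X, [[m]]])
        y = np.append(y, forward(m))
        gp.fit(X, y)
    samples.append(m)
print("posterior mean:", np.mean(samples[1000:]))   # near 0.7 if unimodal
```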
Vectorization with SIMD extensions speeds up reconstruction in electron tomography.
Agulleiro, J I; Garzón, E M; García, I; Fernández, J J
2010-06-01
Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable, so high performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors, in combination with other single-processor optimization techniques. This approach succeeds in producing full resolution tomograms with an important reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with processors' SIMD extensions in the field of 3D electron microscopy.
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
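The stated relationship is a direct sum and transcribes to one line of code; the inputs in the example are arbitrary illustrative figures, not values from the patent.

```python
# Future facility conditions per the claimed formula: the sum of three
# time-period-specific terms.
def future_facility_conditions(maintenance_cost: float,
                               modernization_factor: float,
                               backlog_factor: float) -> float:
    return maintenance_cost + modernization_factor + backlog_factor

print(future_facility_conditions(1.2e6, 3.5e5, 8.0e5))   # example dollars
```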
Computer generated hologram from point cloud using graphics processor.
Chen, Rick H-Y; Wilkinson, Timothy D
2009-12-20
Computer generated holography is an extremely demanding and complex task when it comes to providing realistic reconstructions with full parallax, occlusion, and shadowing. We present an algorithm designed for data-parallel computing on modern graphics processing units to alleviate the computational burden. We apply Gaussian interpolation to create a continuous surface representation from discrete input object points. The algorithm maintains a potential occluder list for each individual hologram plane sample to keep the number of visibility tests to a minimum. We experimented with two approximations that simplify and accelerate occlusion computation. It is observed that letting several neighboring hologram plane samples share visibility information on object points leads to significantly faster computation without causing noticeable artifacts in the reconstructed images. Computing a reduced sample set via nonuniform sampling is also found to be an effective acceleration technique.
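One ingredient named above, Gaussian interpolation of discrete object points into a continuous representation, is easy to sketch; the occlusion bookkeeping and hologram-plane sampling of the full algorithm are deliberately omitted, and the grid size and kernel width are arbitrary.

```python
# Gaussian splatting of a point cloud onto a 2-D grid: each object point
# contributes a Gaussian, yielding a continuous surface-like field.
import numpy as np

rng = np.random.default_rng(5)
pts = rng.uniform(0, 1, size=(200, 2))         # object points (x, y)
amp = rng.uniform(0.5, 1.0, size=200)          # per-point amplitudes

gx, gy = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
sigma = 0.02                                   # interpolation kernel width
field = np.zeros_like(gx)
for (px, py), a in zip(pts, amp):              # accumulate Gaussian splats
    field += a * np.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2 * sigma ** 2))
print("continuous field range:", field.min(), field.max())
```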
Computational discovery of extremal microstructure families
Chen, Desai; Skouras, Mélina; Zhu, Bo; Matusik, Wojciech
2018-01-01
Modern fabrication techniques, such as additive manufacturing, can be used to create materials with complex custom internal structures. These engineered materials exhibit a much broader range of bulk properties than their base materials and are typically referred to as metamaterials or microstructures. Although metamaterials with extraordinary properties have many applications, designing them is very difficult and is generally done by hand. We propose a computational approach to discover families of microstructures with extremal macroscale properties automatically. Using efficient simulation and sampling techniques, we compute the space of mechanical properties covered by physically realizable microstructures. Our system then clusters microstructures with common topologies into families. Parameterized templates are eventually extracted from families to generate new microstructure designs. We demonstrate these capabilities on the computational design of mechanical metamaterials and present five auxetic microstructure families with extremal elastic material properties. Our study opens the way for the completely automated discovery of extremal microstructures across multiple domains of physics, including applications reliant on thermal, electrical, and magnetic properties. PMID:29376124
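The clustering step can be sketched on synthetic property vectors; real inputs would come from the homogenization simulations described above, and k-means here merely stands in for the paper's topology-aware grouping.

```python
# Group simulated microstructures into families by bulk properties and
# report per-family property ranges as crude "templates" (synthetic data).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(11)
# (Young's modulus, Poisson's ratio) samples from three invented families,
# one of them auxetic (negative Poisson's ratio).
props = np.vstack([
    rng.normal([1.0, 0.30], [0.05, 0.02], (100, 2)),
    rng.normal([0.4, 0.45], [0.05, 0.02], (100, 2)),
    rng.normal([0.2, -0.50], [0.05, 0.05], (100, 2)),
])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(props)
for k in range(3):
    fam = props[labels == k]
    print(f"family {k}: E in [{fam[:, 0].min():.2f}, {fam[:, 0].max():.2f}], "
          f"nu in [{fam[:, 1].min():.2f}, {fam[:, 1].max():.2f}]")
```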
Specht, Lena; Yahalom, Joachim; Illidge, Tim; Berthelsen, Anne Kiil; Constine, Louis S; Eich, Hans Theodor; Girinsky, Theodore; Hoppe, Richard T; Mauch, Peter; Mikhaeel, N George; Ng, Andrea
2014-07-15
Radiation therapy (RT) is the most effective single modality for local control of Hodgkin lymphoma (HL) and an important component of therapy for many patients. These guidelines have been developed to address the use of RT in HL in the modern era of combined modality treatment. The role of reduced volumes and doses is addressed, integrating modern imaging with 3-dimensional (3D) planning and advanced techniques of treatment delivery. The previously applied extended field (EF) and original involved field (IF) techniques, which treated larger volumes based on nodal stations, have now been replaced by the use of limited volumes, based solely on detectable nodal (and extranodal extension) involvement at presentation, using contrast-enhanced computed tomography, positron emission tomography/computed tomography, magnetic resonance imaging, or a combination of these techniques. The International Commission on Radiation Units and Measurements concepts of gross tumor volume, clinical target volume, internal target volume, and planning target volume are used for defining the targeted volumes. Newer treatment techniques, including intensity modulated radiation therapy, breath-hold, image guided radiation therapy, and 4-dimensional imaging, should be implemented when their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control. The highly conformal involved node radiation therapy (INRT), recently introduced for patients for whom optimal imaging is available, is explained. A new concept, involved site radiation therapy (ISRT), is introduced as the standard conformal therapy for the scenario, commonly encountered, wherein optimal imaging is not available. There is increasing evidence that RT doses used in the past are higher than necessary for disease control in this era of combined modality therapy. The use of INRT and of lower doses in early-stage HL is supported by available data. Although the use of ISRT has not yet been validated in a formal study, it is more conservative than INRT, accounting for suboptimal information and appropriately designed for safe local disease control. The goal of modern smaller field radiation therapy is to reduce both treatment volume and treatment dose while maintaining efficacy and minimizing acute and late sequelae. This review is a consensus of the International Lymphoma Radiation Oncology Group (ILROG) Steering Committee regarding the modern approach to RT in the treatment of HL, outlining a new concept of ISRT in which reduced treatment volumes are planned for the effective control of involved sites of HL. Nodal and extranodal non-Hodgkin lymphomas (NHL) are covered separately by ILROG guidelines. Copyright © 2014 Elsevier Inc. All rights reserved.
Characterization and Developmental History of Problem Solving Methods in Medicine
Harbort, Robert A.
1980-01-01
The central thesis of this paper is the importance of the framework in which information is structured. It is technically important in the design of systems; it is also important in guaranteeing that systems are usable by clinicians. Progress in medical computing depends on our ability to develop a more quantitative understanding of the role of context in our choice of problem solving techniques. This in turn will help us to design more flexible and responsive computer systems. The paper contains an overview of some models of knowledge and problem solving methods, a characterization of modern diagnostic techniques, and a discussion of skill development in medical practice. Diagnostic techniques are examined in terms of how they are taught, what problem solving methods they use, and how they fit together into an overall theory of interpretation of the medical status of a patient.
Egas Moniz: 90 Years (1927-2017) from Cerebral Angiography.
Artico, Marco; Spoletini, Marialuisa; Fumagalli, Lorenzo; Biagioni, Francesca; Ryskalin, Larisa; Fornai, Francesco; Salvati, Maurizio; Frati, Alessandro; Pastore, Francesco Saverio; Taurone, Samanta
2017-01-01
In June 2017 we celebrate the 90th anniversary of the pioneering discovery of cerebral angiography, the seminal imaging technique used for visualizing cerebral blood vessels and vascular alterations as well as other intracranial disorders. Egas Moniz (1874-1955) was the first to describe the use of this revolutionary technique which, until 1975 (when the computed tomography, CT, scan was introduced into clinical practice), was the sole diagnostic tool to provide imaging of cerebral vessels and therefore of alterations due to intracranial pathology. Moniz introduced this fundamental and important diagnostic tool into clinical practice. The present contribution pays tribute to the Portuguese neurosurgeon, who was also a distinguished neurologist and statesman. Despite his tremendous contribution to modern brain imaging, Egas Moniz was awarded the Nobel Prize in Physiology or Medicine in 1949 for prefrontal leucotomy, a neurosurgical intervention nowadays considered unacceptable; he should rather be remembered for his key contribution to modern brain imaging.
Recent advances in hypersonic technology
NASA Technical Reports Server (NTRS)
Dwoyer, Douglas L.
1990-01-01
This paper will focus on recent advances in hypersonic aerodynamic prediction techniques. Current capabilities of existing numerical methods for predicting high Mach number flows will be discussed and shortcomings will be identified. Physical models available for inclusion into modern codes for predicting the effects of transition and turbulence will also be outlined and their limitations identified. Chemical reaction models appropriate to high-speed flows will be addressed, and the impact of their inclusion in computational fluid dynamics codes will be discussed. Finally, the problem of validating predictive techniques for high Mach number flows will be addressed.
Evaluating Application Resilience with XRay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Sui; Bronevetsky, Greg; Li, Bin
2015-05-07
The rising count and shrinking feature size of transistors within modern computers are making them increasingly vulnerable to various types of soft faults. This problem is especially acute in high-performance computing (HPC) systems used for scientific computing, because these systems include many thousands of compute cores and nodes, all of which may be utilized in a single large-scale run. The increasing vulnerability of HPC applications to errors induced by soft faults is motivating extensive work on techniques to make these applications more resilient to such faults, ranging from generic techniques such as replication or checkpoint/restart to algorithm-specific error detection and tolerance techniques. Effective use of such techniques requires a detailed understanding of how a given application is affected by soft faults to ensure that (i) efforts to improve application resilience are spent in the code regions most vulnerable to faults and (ii) the appropriate resilience technique is applied to each code region. This paper presents XRay, a tool to view application vulnerability to soft errors, and illustrates how XRay can be used in the context of a representative application. In addition to providing actionable insights into application behavior, XRay automatically selects the number of fault injection experiments required to provide an informative view of application behavior, ensuring that the information is statistically well-grounded without performing unnecessary experiments.
cudaMap: a GPU accelerated program for gene expression connectivity mapping.
McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong
2013-10-11
Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hermann, Raphael P
2017-01-01
This most comprehensive and unrivaled compendium in the field provides an up-to-date account of the chemistry of solids, nanoparticles and hybrid materials. Following a valuable introductory chapter reviewing important synthesis techniques, the handbook presents a series of contributions by about 150 leading international experts, the "Who's Who" of solid state science. Clearly structured in six volumes, it collates the knowledge available on solid state chemistry, starting from synthesis and modern methods of structure determination. Understanding and measuring the physical properties of bulk solids and the theoretical basis of modern computational treatments of solids are given ample space, as are such modern trends as nanoparticles, surface properties and heterogeneous catalysis. Emphasis is placed throughout not only on the design and structure of solids but also on practical applications of these novel materials in real chemical situations.
Advanced 3D Characterization and Reconstruction of Reactor Materials FY16 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fromm, Bradley; Hauch, Benjamin; Sridharan, Kumar
2016-12-01
A coordinated effort to link advanced materials characterization methods and computational modeling approaches is critical to future success in understanding and predicting the behavior of reactor materials that operate at extreme conditions. The difficulty and expense of working with nuclear materials have inhibited the use of modern characterization techniques on this class of materials. Likewise, mesoscale simulation efforts have been impeded due to insufficient experimental data necessary for initialization and validation of the computer models. The objective of this research is to develop methods to integrate advanced materials characterization techniques developed for reactor materials with state-of-the-art mesoscale modeling and simulation tools. Research to develop broad-ion beam sample preparation, high-resolution electron backscatter diffraction, and digital microstructure reconstruction techniques, and methods for integration of these techniques into mesoscale modeling tools, is detailed. Results for both irradiated and un-irradiated reactor materials are presented for FY14 - FY16 and final remarks are provided.
Current Developments in Machine Learning Techniques in Biological Data Mining.
Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail
2017-01-01
This supplement focuses on the use of machine learning techniques to generate meaningful information from biological data. Published under Bioinformatics and Biology Insights, it aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts. Advances in the field of biology have generated massive opportunities for the implementation of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved into an indispensable tool applied to a wide spectrum of bioinformatics applications. They are broadly used to investigate the underlying mechanisms leading to specific diseases, as well as in the biomarker discovery process. With growth in this area of science comes the need for up-to-date, high-quality scholarly articles that leverage the knowledge of scientists and researchers in the various applications of machine learning techniques in mining biological data.
Towards quantum chemistry on a quantum computer.
Lanyon, B P; Whitfield, J D; Gillett, G G; Goggin, M E; Almeida, M P; Kassal, I; Biamonte, J D; Mohseni, M; Powell, B J; Barbieri, M; Aspuru-Guzik, A; White, A G
2010-02-01
Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and basis set size. A solution is to move to a radically different model of computing by building a quantum computer, which is a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.
Virtopsy: postmortem imaging of laryngeal foreign bodies.
Oesterhelweg, Lars; Bolliger, Stephan A; Thali, Michael J; Ross, Steffen
2009-05-01
Death from corpora aliena in the larynx is a well-known entity in forensic pathology. The correct diagnosis of this cause of death is difficult without an autopsy, and misdiagnoses by external examination alone are common. To determine the postmortem usefulness of modern imaging techniques in the diagnosis of foreign bodies in the larynx, multislice computed tomography, magnetic resonance imaging, and postmortem full-body computed tomography-angiography were performed. Three decedents with a suspected foreign body in the larynx underwent the 3 different imaging techniques before medicolegal autopsy. Multislice computed tomography has a high diagnostic value in the noninvasive localization of a foreign body and abnormalities in the larynx. The differentiation between neoplasm or soft foreign bodies (eg, food) is possible, but difficult, by unenhanced multislice computed tomography. By magnetic resonance imaging, the discrimination of the soft tissue structures and soft foreign bodies is much easier. In addition to the postmortem multislice computed tomography, the combination with postmortem angiography will increase the diagnostic value. Postmortem, cross-sectional imaging methods are highly valuable procedures for the noninvasive detection of corpora aliena in the larynx.
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors for faults during address generation have not been widely researched (especially in the context of indexing large arrays). We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to place detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
DICOMGrid: a middleware to integrate PACS and EELA-2 grid infrastructure
NASA Astrophysics Data System (ADS)
Moreno, Ramon A.; de Sá Rebelo, Marina; Gutierrez, Marco A.
2010-03-01
Medical images provide a wealth of information for physicians, but the huge amount of data produced by medical imaging equipment in a modern health institution has not yet been explored to its full potential. Nowadays medical images are used in hospitals mostly as part of routine activities, while their intrinsic value for research is underestimated. Medical images can be used for the development of new visualization techniques, new algorithms for patient care, and new image processing techniques. These research areas usually require the use of huge volumes of data to obtain significant results, along with enormous computing capabilities. Such qualities are characteristic of grid computing systems such as the EELA-2 infrastructure. Grid technologies allow the sharing of data on a large scale in a safe and integrated environment and offer high computing capabilities. In this paper we describe DICOMGrid, a middleware to store and retrieve medical images, properly anonymized, that can be used by researchers to test new processing techniques using the computational power offered by grid technology. A prototype of DICOMGrid is under evaluation; it permits the submission of jobs into the EELA-2 grid infrastructure while offering a simple interface that requires minimal understanding of the grid operation.
Quantum Computing Using Superconducting Qubits
2006-04-01
…(iii) dynamically modifying (pulsating) this potential by controlling the motion of the A particles. This allows easy… superconductors with periodic pinning arrays. We show that sample heating by moving vortices produces negative differential resistivity (NDR) of both N- and S… efficient (i.e., using one two-bit operation) QC circuits using modern microfabrication techniques… scheme for this design [1,3] to achieve conditional…
Development of seismic tomography software for hybrid supercomputers
NASA Astrophysics Data System (ADS)
Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton
2015-04-01
Seismic tomography is a technique for computing a velocity model of a geologic structure from first-arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of the development of seismic monitoring systems and the increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and a software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and the software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on an assumed velocity model of the geologic structure being analyzed. In order to solve the linearized inverse problem, a tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.
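The linearized inversion step of this scheme can be illustrated with a small, self-contained sketch. The tomographic matrix and residual vector below are synthetic stand-ins, and the damped least-squares call is one plausible choice of regularized solver, not the authors' implementation.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

# Illustrative stand-ins: G maps slowness adjustments to travel-time
# residuals; d holds observed-minus-predicted times from an eikonal
# solver. Both are synthetic here.
rng = np.random.default_rng(0)
G = sparse_random(500, 200, density=0.05, random_state=0)  # tomographic matrix
d = rng.normal(size=500)                                   # travel-time residuals

# Damped (Tikhonov-regularized) least squares:
#   minimize ||G m - d||^2 + damp^2 ||m||^2
m_update = lsqr(G, d, damp=0.1)[0]
print(m_update.shape)   # slowness-model adjustment, one value per cell
```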
Impedance computations and beam-based measurements: A problem of discrepancy
NASA Astrophysics Data System (ADS)
Smaluk, Victor
2018-04-01
High intensity of particle beams is crucial for high-performance operation of modern electron-positron storage rings, both colliders and light sources. The beam intensity is limited by the interaction of the beam with self-induced electromagnetic fields (wake fields) proportional to the vacuum chamber impedance. For a new accelerator project, the total broadband impedance is computed by element-wise wake-field simulations using computer codes. For a machine in operation, the impedance can be measured experimentally using beam-based techniques. In this article, a comparative analysis of impedance computations and beam-based measurements is presented for 15 electron-positron storage rings. The measured data and the predictions based on the computed impedance budgets show a significant discrepancy. Three possible reasons for the discrepancy are discussed: interference of the wake fields excited by a beam in adjacent components of the vacuum chamber, effect of computation mesh size, and effect of insufficient bandwidth of the computed impedance.
Double-negative metamaterial for mobile phone application
NASA Astrophysics Data System (ADS)
Hossain, M. I.; Faruque, M. R. I.; Islam, M. T.
2017-01-01
In this paper, a new design and analysis of a metamaterial and its applications to modern handsets are presented. The proposed metamaterial unit-cell design consists of two connected square spiral structures, which increases the effective media ratio. The finite integration technique based on Computer Simulation Technology Microwave Studio is utilized in this investigation, and the measurement is taken in an anechoic chamber. Good agreement is observed between simulated and measured results. The results indicate that the proposed metamaterial can successfully cover cellular phone frequency bands. Moreover, the use of the proposed metamaterial in modern handset antennas is also analyzed. The results reveal that the proposed metamaterial attachment significantly reduces specific absorption rate values without reducing the antenna performance.
Overview of computational structural methods for modern military aircraft
NASA Technical Reports Server (NTRS)
Kudva, J. N.
1992-01-01
Computational structural methods are essential for designing modern military aircraft. This briefing deals with computational structural methods (CSM) currently used. First a brief summary of modern day aircraft structural design procedures is presented. Following this, several ongoing CSM related projects at Northrop are discussed. Finally, shortcomings in this area, future requirements, and summary remarks are given.
Civil and mechanical engineering applications of sensitivity analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Komkov, V.
1985-07-01
In this largely tutorial presentation, the historical development of optimization theories is outlined as applied to mechanical and civil engineering design, and the development of modern sensitivity techniques over the last 20 years is traced. Some of the difficulties, and the progress made in overcoming them, are outlined. Several recently developed theoretical methods are stressed to indicate their importance to computer-aided design technology.
Cache Energy Optimization Techniques For Modern Processors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
2013-01-01
Modern multicore processors employ large last-level caches; for example, Intel's E7-8800 processor uses a 24 MB L3 cache. Further, with each CMOS technology generation, leakage energy has been increasing dramatically, and hence leakage is expected to become a major source of energy dissipation, especially in last-level caches (LLCs). Conventional cache energy saving schemes either aim at saving dynamic energy or are based on properties specific to first-level caches, and thus have limited utility for last-level caches. Further, several other techniques require offline profiling or per-application tuning and hence are not suitable for production systems. In this book, we present novel cache leakage energy saving schemes for single-core and multicore systems; desktop, QoS, real-time and server systems. We also present cache energy saving techniques for caches designed with both conventional SRAM devices and emerging non-volatile devices such as STT-RAM (spin-torque transfer RAM). We present software-controlled, hardware-assisted techniques which use dynamic cache reconfiguration to configure the cache to the most energy-efficient configuration while keeping the performance loss bounded. To profile and test a large number of potential configurations, we utilize low-overhead micro-architectural components, which can be easily integrated into modern processor chips. We adopt a system-wide approach to saving energy, ensuring that cache reconfiguration does not increase the energy consumption of other components of the processor. We have compared our techniques with the state of the art and have found that ours outperform it in terms of energy efficiency and other relevant metrics. The techniques presented in this book have important applications in improving the energy efficiency of higher-end embedded, desktop, QoS, real-time and server processors and multitasking systems. This book is intended to be a valuable guide for both newcomers and veterans in the field of cache power management. It will help graduate students, CAD tool developers and designers understand the need for energy efficiency in modern computing systems, and will be useful for researchers seeking insights into algorithms and techniques for micro-architectural and system-level energy optimization using dynamic cache reconfiguration. We sincerely believe that the "food for thought" presented in this book will inspire readers to develop even better ideas for designing "green" processors of tomorrow.
Digital signal conditioning for flight test instrumentation
NASA Technical Reports Server (NTRS)
Bever, Glenn A.
1991-01-01
An introduction to digital measurement processes on aircraft is provided. Flight test instrumentation systems are rapidly evolving from analog-intensive to digital-intensive systems, including the use of onboard digital computers. The topics include measurements that are digital in origin, as well as sampling, encoding, transmitting, and storing data. Particular emphasis is placed on modern avionic data bus architectures and what to be aware of when extracting data from them. Examples of data extraction techniques are given. Tradeoffs between digital logic families, trends in digital development, and design testing techniques are discussed. An introduction to digital filtering is also covered.
The Weather Forecast Using Data Mining Research Based on Cloud Computing.
NASA Astrophysics Data System (ADS)
Wang, ZhanJie; Mazharul Mujib, A. B. M.
2017-10-01
Weather forecasting is an important application of meteorology and one of the most scientifically and technologically challenging problems around the world. In this study, we analyze the use of data mining techniques in forecasting weather. This paper proposes a modern method to develop a service-oriented architecture for weather information systems that forecast weather using these data mining techniques, which can be carried out by applying artificial neural network and decision tree algorithms to meteorological data collected over a specific time period. The algorithm presented the best results for generating classification rules for the mean weather variables. The results show that these data mining techniques can be sufficient for weather forecasting.
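As a rough illustration of the decision-tree branch of such a system (synthetic data, features and labels invented for the example; this is not the paper's dataset or pipeline):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-in for meteorological records: columns are mean
# temperature, humidity and pressure; label is "rain" vs "dry".
rng = np.random.default_rng(1)
X = rng.normal(loc=[20.0, 60.0, 1013.0], scale=[8.0, 20.0, 8.0], size=(500, 3))
y = np.where((X[:, 1] > 70) & (X[:, 2] < 1010), "rain", "dry")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print(tree.score(X_te, y_te))                 # held-out accuracy
print(export_text(tree, feature_names=["temp", "humidity", "pressure"]))
```

The printed tree is exactly the kind of human-readable classification rule set the abstract refers to.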
NASA Astrophysics Data System (ADS)
Shao, Meiyue; Aktulga, H. Metin; Yang, Chao; Ng, Esmond G.; Maris, Pieter; Vary, James P.
2018-01-01
We describe a number of recently developed techniques for improving the performance of large-scale nuclear configuration interaction calculations on high-performance parallel computers. We show the benefit of using a preconditioned block iterative method to replace the Lanczos algorithm that has traditionally been used to perform this type of computation. The rapid convergence of the block iterative method is achieved by a proper choice of starting guesses for the eigenvectors and the construction of an effective preconditioner. These acceleration techniques take advantage of the special structure of the nuclear configuration interaction problem, which we discuss in detail. The use of a block method also allows us to improve the concurrency of the computation and take advantage of the memory hierarchy of modern microprocessors to increase the arithmetic intensity of the computation relative to data movement. We also discuss the implementation details that are critical to achieving high performance on massively parallel multi-core supercomputers, and demonstrate that the new block iterative solver is two to three times faster than the Lanczos-based algorithm for problems of moderate size on a Cray XC30 system.
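A minimal sketch of the contrast the abstract draws, using SciPy's generic solvers on a synthetic sparse matrix; the real code uses physics-informed starting vectors and a problem-specific preconditioner, neither of which is reproduced here.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh, lobpcg, LinearOperator

# Synthetic sparse symmetric matrix standing in for the CI Hamiltonian.
n = 2000
rng = np.random.default_rng(2)
A = sp.random(n, n, density=1e-3, random_state=2)
A = (A + A.T) + sp.diags(np.arange(1.0, n + 1.0))  # symmetric, diag-dominant

# Lanczos-style reference (ARPACK): lowest 5 eigenvalues.
vals_ref = eigsh(A, k=5, which="SA", return_eigenvectors=False)

# Block iterative solver: a block of starting guesses plus a simple
# Jacobi (diagonal) preconditioner, playing the role of the paper's
# physics-informed guesses and more elaborate preconditioner.
X = rng.normal(size=(n, 5))
M = LinearOperator((n, n), matvec=lambda x: x / A.diagonal())
vals, vecs = lobpcg(A, X, M=M, tol=1e-6, maxiter=500, largest=False)
print(np.sort(vals_ref), np.sort(vals))
```

The block solver applies the matrix to several vectors at once, which is where the improved concurrency and arithmetic intensity mentioned above come from.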
Guest editorial: Introduction to the special issue on modern control for computer games.
Argyriou, Vasileios; Kotsia, Irene; Zafeiriou, Stefanos; Petrou, Maria
2013-12-01
A typical gaming scenario, as developed over the past 20 years, involves a player interacting with a game using a specialized input device such as a joystick, a mouse, or a keyboard. Recent technological advances and new sensors (for example, low-cost commodity depth cameras) have enabled the introduction of more elaborate approaches in which the player is now able to interact with the game using his body pose, facial expressions, actions, and even his physiological signals. A new era of games has already started, employing computer vision techniques, brain-computer interface systems, and haptic and wearable devices. The future lies in games that will be intelligent enough not only to extract the player's commands provided by his speech and gestures but also his behavioral cues and emotional states, and to adjust their game plot accordingly in order to ensure a more realistic and satisfactory gameplay experience. This special issue on modern control for computer games discusses several interdisciplinary factors that influence a user's input to a game, something directly linked to the gaming experience. These include, but are not limited to, the following: behavioral affective gaming, user satisfaction and perception, motion capture and scene modeling, and complete software frameworks that address several challenges arising in such scenarios.
Brenke, Christopher; Lassel, Elke A; Terris, Darcey; Kurt, Aysel; Schmieder, Kirsten; Schoenberg, Stefan O; Weisser, Gerald
2014-05-01
A significant proportion of acute care neurosurgical patients present to hospital outside regular working hours. The objective of our study was to evaluate the structure of neurosurgical on-call services in Germany, the use of modern communication devices and teleradiology services, and the personal acceptance of modern technologies by neurosurgeons. A nationwide survey of all 141 neurosurgical departments in Germany was performed. The questionnaire consisted of two parts: one for neurosurgical departments and one for individual neurosurgeons. The questionnaire, available online and mailed in paper form, included 21 questions about on-call service structure; the availability and use of communication devices, teleradiology services, and other information services; and neurosurgeons' personal acceptance of modern technologies. The questionnaire return rate from departments was 63.1% (89/141), whereas 187 individual neurosurgeons responded. For 57.3% of departments, teleradiology services were available and were frequently used by 62.2% of neurosurgeons. A further 23.6% of departments described using smartphone screenshots of computed tomography (CT) images transmitted by multimedia messaging service (MMS), and 8.6% of images were described as sent by unencrypted email. Although 47.0% of neurosurgeons reported owning a smartphone, only 1.1% used their phone for on-call image communication. Teleradiology services were observed to be widely used by on-call neurosurgeons in Germany. Nevertheless, a significant number of departments appear to use outdated techniques or techniques that leave patient data unprotected. On-call neurosurgeons in Germany report a willingness to adopt more modern approaches, utilizing readily available smartphones or tablet technology. Georg Thieme Verlag KG Stuttgart · New York.
SIMD Optimization of Linear Expressions for Programmable Graphics Hardware
Bajaj, Chandrajit; Ihm, Insung; Min, Jungki; Oh, Jinsang
2009-01-01
The increased programmability of graphics hardware allows efficient graphics processing unit (GPU) implementations of a wide range of general computations on commodity PCs. An important factor in such implementations is how to fully exploit the SIMD computing capacities offered by modern graphics processors. Linear expressions in the form of ȳ = Ax̄ + b̄, where A is a matrix and x̄, ȳ and b̄ are vectors, constitute one of the most basic operations in many scientific computations. In this paper, we propose a SIMD code optimization technique that enables efficient shader codes to be generated for evaluating linear expressions. It is shown that performance can be improved considerably by efficiently packing arithmetic operations into four-wide SIMD instructions through reordering of the operations in linear expressions. We demonstrate that the presented technique can be used effectively for programming both vertex and pixel shaders for a variety of mathematical applications, including integrating differential equations and solving a sparse linear system of equations using iterative methods. PMID:19946569
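A CPU-side NumPy analogy of the four-wide packing idea (purely illustrative; the paper generates GPU shader code, not Python, and its gains come from reordering operations to fill SIMD lanes):

```python
import numpy as np

# NumPy analogy of four-wide packing: evaluate y = A x + b four output
# components per step, mimicking one SIMD instruction per block.
def linear_expr_four_wide(A, x, b, width=4):
    n = A.shape[0]
    pad = (-n) % width                       # pad rows to a multiple of 4
    A4 = np.pad(A, ((0, pad), (0, 0)))
    b4 = np.pad(b, (0, pad))
    y = np.empty(n + pad)
    for i in range(0, n + pad, width):       # one "instruction" per block
        y[i:i + width] = A4[i:i + width] @ x + b4[i:i + width]
    return y[:n]

A = np.arange(12.0).reshape(3, 4)
x = np.ones(4)
b = np.zeros(3)
print(linear_expr_four_wide(A, x, b))        # matches A @ x + b
```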
Accelerating EPI distortion correction by utilizing a modern GPU-based parallel computation.
Yang, Yao-Hao; Huang, Teng-Yi; Wang, Fu-Nien; Chuang, Tzu-Chao; Chen, Nan-Kuei
2013-04-01
The combination of phase demodulation and field mapping is a practical method to correct echo planar imaging (EPI) geometric distortion. However, since phase dispersion accumulates in each phase-encoding step, the calculation complexity of phase demodulation is Ny-fold higher than that of conventional image reconstruction. Thus, correcting EPI images via phase demodulation is generally a time-consuming task. Parallel computing employing general-purpose calculations on graphics processing units (GPU) can accelerate scientific computing if the algorithm is parallelized. This study proposes a method that incorporates the GPU-based technique into phase demodulation calculations to reduce computation time. The proposed parallel algorithm was applied to a PROPELLER-EPI diffusion tensor data set. The GPU-based phase demodulation method correctly reduced the EPI distortion and accelerated the computation. The total reconstruction time for the 16-slice PROPELLER-EPI diffusion tensor images with a matrix size of 128 × 128 was reduced from 1,754 seconds to 101 seconds by utilizing the parallelized 4-GPU program. GPU computing is a promising method to accelerate EPI geometric correction. The resulting reduction in the computation time of phase demodulation should accelerate postprocessing for studies performed with EPI, and should make the PROPELLER-EPI technique practical for clinical use. Copyright © 2011 by the American Society of Neuroimaging.
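The per-phase-encoding-step demodulation that makes the problem Ny-fold more expensive, and at the same time embarrassingly parallel, can be sketched as follows. The phase model, echo spacing and field map below are assumed for illustration, not taken from the paper; swapping `numpy` for `cupy` (whose array API mirrors NumPy's) moves the same code onto a GPU.

```python
import numpy as np
# import cupy as cp   # swapping np -> cp runs the same array code on a GPU

# Assumed-form sketch of per-line phase demodulation: each phase-encoding
# step ky accumulates a phase error proportional to the field map and the
# time of that step, so each k-space line is reconstructed and demodulated
# separately -- the Ny full transforms the abstract refers to.
ny, nx = 128, 128
rng = np.random.default_rng(3)
kspace = rng.normal(size=(ny, nx)) + 1j * rng.normal(size=(ny, nx))
fieldmap = rng.normal(scale=10.0, size=(ny, nx))   # off-resonance [Hz], hypothetical
esp = 0.5e-3                                       # echo spacing [s], hypothetical

image = np.zeros((ny, nx), dtype=complex)
for ky in range(ny):                               # one pass per PE step
    t = ky * esp
    line_img = np.fft.ifft2(np.where(np.arange(ny)[:, None] == ky, kspace, 0))
    image += line_img * np.exp(-2j * np.pi * fieldmap * t)
print(abs(image).mean())
```

Each loop iteration is independent, which is why distributing the lines across GPU threads (or across 4 GPUs, as in the paper) gives a near-linear speedup.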
New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases
NASA Astrophysics Data System (ADS)
Brescia, Massimo
2012-11-01
Data mining, or Knowledge Discovery in Databases (KDD), as the main methodology for extracting the scientific information contained in Massive Data Sets (MDS), must tackle crucial problems, since it has to orchestrate complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has brought about a huge production of data, whose warehouse management, together with the need to optimize analysis and mining procedures, has led to a change of concept in modern science. Classical data exploration, based on the user's own local data storage and limited computing infrastructure, is no longer efficient in the case of MDS spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context, modern experimental and observational science requires a good understanding of computer science, network infrastructures, data mining, etc., i.e. of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools based on Web 2.0 technologies and machine learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations are described in detail in the chapter. Moreover, examples of modern applications offering a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms are also introduced.
GPU accelerated FDTD solver and its application in MRI.
Chi, J; Liu, F; Jin, J; Mason, D G; Crozier, S
2010-01-01
The finite difference time domain (FDTD) method is a popular technique for computational electromagnetics (CEM). The large computational power often required, however, has been a limiting factor for its applications. In this paper, we present a graphics processing unit (GPU)-based parallel FDTD solver and its successful application to the investigation of a novel B1 shimming scheme for high-field magnetic resonance imaging (MRI). The optimized shimming scheme exhibits considerably improved transmit B1 profiles. The GPU implementation dramatically shortened the runtime of the FDTD simulation of the electromagnetic field compared with its CPU counterpart. This acceleration in runtime has made such investigations possible, and will pave the way for other studies of large-scale computational electromagnetic problems in modern MRI which were previously impractical.
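For readers unfamiliar with FDTD, the kernel being accelerated is a simple leapfrog stencil update. A minimal 1-D version in normalized units (illustrative only; the paper's solver is 3-D and GPU-resident):

```python
import numpy as np

# Minimal 1-D FDTD leapfrog update in normalized units -- the kind of
# stencil kernel that maps well onto GPU threads.
nx, nt = 200, 400
S = 0.5                                  # Courant number (stable for S <= 1)
ez = np.zeros(nx)                        # E field on integer grid points
hy = np.zeros(nx - 1)                    # H field on staggered half points

for n in range(nt):
    hy += S * np.diff(ez)                # update H from the curl of E
    ez[1:-1] += S * np.diff(hy)          # update E from the curl of H
    ez[nx // 2] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source

print(float(np.abs(ez).max()))
```

Because every grid point is updated from only its nearest neighbors, the two update lines parallelize across thousands of GPU threads with no data dependencies inside a time step.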
Egas Moniz: 90 Years (1927–2017) from Cerebral Angiography
Artico, Marco; Spoletini, Marialuisa; Fumagalli, Lorenzo; Biagioni, Francesca; Ryskalin, Larisa; Fornai, Francesco; Salvati, Maurizio; Frati, Alessandro; Pastore, Francesco Saverio; Taurone, Samanta
2017-01-01
In June 2017 we celebrate the 90th anniversary of the pioneering discovery of cerebral angiography, the seminal imaging technique used for visualizing cerebral blood vessels and vascular alterations as well as other intracranial disorders. Egas Moniz (1874–1955) was the first to describe the use of this revolutionary technique which, until 1975 (when the computed tomography (CT) scan was introduced into clinical practice), was the sole diagnostic tool providing imaging of cerebral vessels, and therefore of alterations due to intracranial pathology. Moniz introduced this fundamental and important diagnostic tool into clinical practice. The present contribution pays tribute to the Portuguese neurosurgeon, who was also a distinguished neurologist and statesman. Despite his tremendous contribution to modern brain imaging, Egas Moniz was awarded the Nobel Prize in Physiology or Medicine in 1949 for prefrontal leucotomy, a neurosurgical intervention that is unacceptable nowadays; he should rather be remembered for his key contribution to modern brain imaging. PMID:28974927
Magnetic resonance imaging of granular materials
NASA Astrophysics Data System (ADS)
Stannarius, Ralf
2017-05-01
Magnetic Resonance Imaging (MRI) has become one of the most important tools to screen humans in medicine; virtually every modern hospital is equipped with a Nuclear Magnetic Resonance (NMR) tomograph. The potential of NMR in 3D imaging tasks is by far greater, but there is only "a handful" of MRI studies of particulate matter. The method is expensive, time-consuming, and requires a deep understanding of pulse sequences, signal acquisition, and processing. We give a short introduction into the physical principles of this imaging technique, describe its advantages and limitations for the screening of granular matter, and present a number of examples of different application purposes, from the exploration of granular packing, via the detection of flow and particle diffusion, to real dynamic measurements. Probably, X-ray computed tomography is preferable in most applications, but fast imaging of single slices with modern MRI techniques is unmatched, and the additional opportunity to retrieve spatially resolved flow and diffusion profiles without particle tracking is a unique feature.
NASA Astrophysics Data System (ADS)
Hadade, Ioan; di Mare, Luca
2016-08-01
Modern multicore and manycore processors exhibit multiple levels of parallelism through a wide range of architectural features such as SIMD for data-parallel execution or threads for core parallelism. The exploitation of multi-level parallelism is therefore crucial for achieving superior performance on current and future processors. This paper presents the performance tuning of a multiblock CFD solver on Intel SandyBridge and Haswell multicore CPUs and the Intel Xeon Phi Knights Corner coprocessor. Code optimisations have been applied to two computational kernels exhibiting different computational patterns: the update of flow variables and the evaluation of the Roe numerical fluxes. We discuss at great length the code transformations required for achieving efficient SIMD computations for both kernels across the selected devices, including SIMD shuffles and transpositions for flux stencil computations and global memory transformations. Core parallelism is expressed through threading based on a number of domain decomposition techniques together with optimisations pertaining to alleviating NUMA effects found in multi-socket compute nodes. Results are correlated with the Roofline performance model in order to assess their efficiency for each distinct architecture. We report significant speedups for single thread execution across both kernels: 2-5X on the multicore CPUs and 14-23X on the Xeon Phi coprocessor. Computations at full node and chip concurrency deliver a factor of three speedup on the multicore processors and up to 24X on the Xeon Phi manycore coprocessor.
Comparison of Acceleration Techniques for Selected Low-Level Bioinformatics Operations
Langenkämper, Daniel; Jakobi, Tobias; Feld, Dustin; Jelonek, Lukas; Goesmann, Alexander; Nattkemper, Tim W.
2016-01-01
In recent years, clock rates of modern processors have stagnated while the demand for computing power continues to grow. This applies particularly to the fields of life sciences and bioinformatics, where new technologies keep creating rapidly growing piles of raw data with increasing speed. The number of cores per processor has increased in an attempt to compensate for the slight increments in clock rates. This technological shift demands changes in software development, especially in the field of high-performance computing, where parallelization techniques are gaining in importance due to the pressing issue of large datasets generated by, e.g., modern genomics. This paper presents an overview of state-of-the-art manual and automatic acceleration techniques and lists applications employing these in different areas of sequence informatics. Furthermore, we provide examples of automatic acceleration of two use cases to show typical problems and gains when transforming a serial application into a parallel one. The paper should aid the reader in choosing a suitable technique for the problem at hand. We compare four different state-of-the-art automatic acceleration approaches (OpenMP, PluTo-SICA, PPCG, and OpenACC); their performance as well as their applicability for selected use cases is discussed. While optimizations targeting the CPU worked better in the complex k-mer use case, optimizers for graphics processing units (GPUs) performed better in the matrix multiplication example, though performance is only superior above a certain problem size due to data migration overhead. We show that automatic code parallelization is feasible with current compiler software and yields significant increases in execution speed. Automatic optimizers for CPUs are mature, and usually no additional manual adjustment is required. In contrast, some automatic parallelizers targeting GPUs still lack maturity and are limited to simple statements and structures. PMID:26904094
Nonequilibrium flow computations. 1: An analysis of numerical formulations of conservation laws
NASA Technical Reports Server (NTRS)
Liu, Yen; Vinokur, Marcel
1988-01-01
Modern numerical techniques employing properties of flux Jacobian matrices are extended to general nonequilibrium flows. Generalizations of the Beam-Warming scheme, the Steger-Warming and van Leer flux-vector splittings, and Roe's approximate Riemann solver are presented for 3-D, time-varying grids. The analysis is based on a thermodynamic model that includes the most general thermal and chemical nonequilibrium flow of an arbitrary gas. Various special cases are also discussed.
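The flavor of flux-vector splitting can be shown on the simplest possible case, linear advection, where the flux f(u) = au is split into forward- and backward-moving parts that are each differenced in their upwind direction (a toy sketch, not the paper's generalized nonequilibrium formulation):

```python
import numpy as np

# Scalar illustration of Steger-Warming-style flux-vector splitting:
# f(u) = a*u is split into a+ and a- contributions, each differenced
# upwind, on a periodic 1-D grid.
nx, nt = 100, 80
a = 1.0
dx, dt = 1.0 / nx, 0.5 / nx                  # CFL = a*dt/dx = 0.5
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200 * (x - 0.3) ** 2)            # initial Gaussian pulse

ap, am = 0.5 * (a + abs(a)), 0.5 * (a - abs(a))   # a = a+ + a-
for _ in range(nt):
    fp, fm = ap * u, am * u
    u = u - dt / dx * (fp - np.roll(fp, 1)) \
          - dt / dx * (np.roll(fm, -1) - fm)      # periodic upwind update
print(float(u.max()))
```

The paper's contribution is constructing the analogous split Jacobians when the gas is in thermal and chemical nonequilibrium, where the eigenstructure is far more involved than this scalar case.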
A volumetric technique for fossil body mass estimation applied to Australopithecus afarensis.
Brassey, Charlotte A; O'Mahoney, Thomas G; Chamberlain, Andrew T; Sellers, William I
2018-02-01
Fossil body mass estimation is a well-established practice within the field of physical anthropology. Previous studies have relied upon traditional allometric approaches, in which the relationship between one or several skeletal dimensions and body mass in a range of modern taxa is used in a predictive capacity. The lack of relatively complete skeletons has thus far limited the potential application of alternative mass estimation techniques, such as volumetric reconstruction, to fossil hominins. Yet across vertebrate paleontology more broadly, novel volumetric approaches are yielding predicted values for fossil body mass very different from those estimated by traditional allometry. Here we present a new digital reconstruction of Australopithecus afarensis (A.L. 288-1; 'Lucy') and a convex hull-based volumetric estimate of body mass. The technique relies upon identifying a predictable relationship between the 'shrink-wrapped' volume of the skeleton and known body mass in a range of modern taxa, and its subsequent application to an articulated model of the fossil taxon of interest. Our calibration dataset comprises whole-body computed tomography (CT) scans of 15 species of modern primate. The resulting predictive model is characterized by a high correlation coefficient (r² = 0.988) and a percentage standard error of 20%, and performs well when applied to modern individuals of known body mass. Application of the convex hull technique to A. afarensis results in a relatively low body mass estimate of 20.4 kg (95% prediction interval 13.5-30.9 kg). A sensitivity analysis on the articulation of the chest region highlights the sensitivity of our approach to the reconstruction of the trunk, and the incomplete nature of the preserved ribcage may explain the low values for predicted body mass here. We suggest that the heaviest of previous estimates would require the thorax to be expanded to an unlikely extent, yet this can only be properly tested when more complete fossils are available. Copyright © 2017 Elsevier Ltd. All rights reserved.
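The calibration-and-prediction workflow can be mimicked in a few lines with SciPy's convex hull routine; all "specimens", volumes and masses below are invented for illustration and bear no relation to the paper's data.

```python
import numpy as np
from scipy.spatial import ConvexHull

# Toy version of convex-hull mass estimation: regress log(mass) on
# log(hull volume) for modern calibration "skeletons", then predict
# the fossil. Every number here is synthetic.
rng = np.random.default_rng(4)

def hull_volume(points):
    return ConvexHull(points).volume

sizes = np.linspace(0.5, 2.0, 15)                       # 15 modern taxa
volumes = np.array([hull_volume(s * rng.normal(size=(100, 3))) for s in sizes])
masses = 35.0 * volumes ** 0.95 * rng.lognormal(0, 0.1, 15)   # synthetic truth

slope, intercept = np.polyfit(np.log(volumes), np.log(masses), 1)
fossil_volume = hull_volume(1.1 * rng.normal(size=(100, 3)))
print(np.exp(intercept + slope * np.log(fossil_volume)))  # predicted mass
```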
Preparing Students for Careers in Science and Industry with Computational Physics
NASA Astrophysics Data System (ADS)
Florinski, V. A.
2011-12-01
Funded by an NSF CAREER grant, the University of Alabama in Huntsville (UAH) has launched a new graduate program in Computational Physics. It is universally accepted that today's physics is done on a computer. The program blurs the boundary between physics and computer science by teaching students modern, practical techniques for solving difficult physics problems using diverse computational platforms. Currently consisting of two courses first offered in the Fall of 2011, the program will eventually include 5 courses covering methods for fluid dynamics, particle transport via stochastic methods, and hybrid and PIC plasma simulations. UAH's unique location allows courses to be shaped through discussions with faculty, NASA/MSFC researchers and local R&D business representatives, i.e., potential employers of the program's graduates. Students currently participating in the program have all begun their research careers in space and plasma physics; many are presenting their research at this meeting.
Geospace simulations using modern accelerator processor technology
NASA Astrophysics Data System (ADS)
Germaschewski, K.; Raeder, J.; Larson, D. J.
2009-12-01
OpenGGCM (Open Geospace General Circulation Model) is a well-established numerical code simulating the Earth's space environment. The most computing-intensive part is the MHD (magnetohydrodynamics) solver that models the plasma surrounding Earth and its interaction with Earth's magnetic field and the solar wind flowing in from the sun. Like other global magnetosphere codes, OpenGGCM's realism is currently limited by computational constraints on grid resolution. OpenGGCM has been ported to make use of the added computational power of modern accelerator-based processor architectures, in particular the Cell processor. The Cell architecture is a novel inhomogeneous multicore architecture capable of achieving up to 230 GFlops on a single chip. The University of New Hampshire recently acquired a PowerXCell 8i based computing cluster, and here we report initial performance results of OpenGGCM. Realizing the high theoretical performance of the Cell processor is a programming challenge, though. We implemented the MHD solver using a multi-level parallelization approach: on the coarsest level, the problem is distributed to processors based upon the usual domain decomposition approach. Then, on each processor, the problem is divided into 3D columns, each of which is handled by the memory-limited SPEs (synergistic processing elements) slice by slice. Finally, SIMD instructions are used to fully exploit the SIMD FPUs in each SPE. Memory management needs to be handled explicitly by the code, using DMA to move data from main memory to the per-SPE local store and vice versa. We use a modern technique, automatic code generation, which shields the application programmer from having to deal with all of the implementation details just described, keeping the code much more easily maintainable. Our preliminary results indicate excellent performance: a speed-up of a factor of 30 compared to the unoptimized version.
The Development of Sociocultural Competence with the Help of Computer Technology
ERIC Educational Resources Information Center
Rakhimova, Alina E.; Yashina, Marianna E.; Mukhamadiarova, Albina F.; Sharipova, Astrid V.
2017-01-01
The article describes the process of developing sociocultural knowledge and competences using computer technologies. On the whole, the development of modern computer technologies allows teachers to broaden trainees' sociocultural outlook and trace their progress online. Observation of modern computer technologies and estimation…
Pre- and postprocessing techniques for determining goodness of computational meshes
NASA Technical Reports Server (NTRS)
Oden, J. Tinsley; Westermann, T.; Bass, J. M.
1993-01-01
Research in error estimation, mesh conditioning, and solution enhancement for finite element, finite difference, and finite volume methods has been incorporated into AUDITOR, a modern, user-friendly code, which operates on 2D and 3D unstructured neutral files to improve the accuracy and reliability of computational results. Residual error estimation capabilities provide local and global estimates of solution error in the energy norm. Higher order results for derived quantities may be extracted from initial solutions. Within the X-MOTIF graphical user interface, extensive visualization capabilities support critical evaluation of results in linear elasticity, steady state heat transfer, and both compressible and incompressible fluid dynamics.
Propulsion system/flight control integration for supersonic aircraft
NASA Technical Reports Server (NTRS)
Reukauf, P. J.; Burcham, F. W., Jr.
1976-01-01
Digital integrated control systems are studied. Such systems allow minimization of undesirable interactions while maximizing performance at all flight conditions. One such program is the YF-12 cooperative control program. The existing analog air data computer, autothrottle, autopilot, and inlet control systems are converted to digital systems by using a general purpose airborne computer and interface unit. Existing control laws are programed and tested in flight. Integrated control laws, derived using accurate mathematical models of the airplane and propulsion system in conjunction with modern control techniques, are tested in flight. Analysis indicates that an integrated autothrottle autopilot gives good flight path control and that observers are used to replace failed sensors.
Denoised Wigner distribution deconvolution via low-rank matrix completion
Lee, Justin; Barbastathis, George
2016-08-23
Wigner distribution deconvolution (WDD) is a decades-old method for recovering phase from intensity measurements. Although the technique offers an elegant linear solution to the quadratic phase retrieval problem, it has seen limited adoption due to its high computational/memory requirements and the fact that the technique often exhibits high noise sensitivity. Here, we propose a method for noise suppression in WDD via low-rank noisy matrix completion. Our technique exploits the redundancy of an object's phase space to denoise its WDD reconstruction. We show in model calculations that our technique outperforms other WDD algorithms as well as modern iterative methods for phase retrieval such as ptychography. Our results suggest that a class of phase retrieval techniques relying on regularized direct inversion of ptychographic datasets (instead of iterative reconstruction techniques) can provide accurate quantitative phase information in the presence of high levels of noise.
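The role low-rank structure plays in the denoising can be illustrated with the simpler cousin of matrix completion, truncated-SVD rank reduction (a sketch under the assumption of a fully observed matrix; the paper's method additionally handles missing entries):

```python
import numpy as np

# Simplified stand-in for the denoising step: a phase-space matrix that
# is approximately low rank can be cleaned by truncating small singular
# values. Real matrix *completion* also fills in unobserved entries;
# this sketch only does rank truncation on synthetic data.
rng = np.random.default_rng(5)
n, r = 200, 5
truth = rng.normal(size=(n, r)) @ rng.normal(size=(r, n))   # rank-r matrix
noisy = truth + 0.5 * rng.normal(size=(n, n))

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
denoised = (U[:, :r] * s[:r]) @ Vt[:r]                      # keep top-r modes

err = lambda M: np.linalg.norm(M - truth) / np.linalg.norm(truth)
print(err(noisy), err(denoised))                            # denoised is closer
```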
Advanced techniques in placental biology -- workshop report.
Nelson, D M; Sadovsky, Y; Robinson, J M; Croy, B A; Rice, G; Kniss, D A
2006-04-01
Major advances in placental biology have been realized as new technologies have been developed and existing methods have been refined in many areas of biological research. Classical anatomy and whole-organ physiology tools once used to analyze placental structure and function have been supplanted by more sophisticated techniques adapted from molecular biology, proteomics, and computational biology and bioinformatics. In addition, significant refinements in morphological study of the placenta and its constituent cell types have improved our ability to assess form and function in a highly integrated manner. To offer an overview of modern technologies used by investigators to study the placenta, this workshop, Advanced Techniques in Placental Biology, assembled experts who discussed fundamental principles and real-time examples of four separate methodologies. Y. Sadovsky presented the principles of microRNA function as an endogenous mechanism of gene regulation. J. Robinson demonstrated the utility of correlative microscopy, in which light-level and transmission electron microscopy are combined to provide cellular and subcellular views of placental cells. A. Croy provided a lecture on the use of microdissection techniques, which are invaluable for isolating very small subsets of cell types for molecular analysis. Finally, G. Rice presented an overview of methods for profiling complex protein mixtures within tissue and/or fluid samples that, when refined, will offer databases to underpin a systems approach to modern trophoblast biology.
cudaMap: a GPU accelerated program for gene expression connectivity mapping
2013-01-01
Background Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. Results cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Conclusion Emerging ‘omics’ technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap. PMID:24112435
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spong, D.A.
The design techniques and physics analysis of modern stellarator configurations for magnetic fusion research rely heavily on high performance computing and simulation. Stellarators, which are fundamentally 3-dimensional in nature, offer significantly more design flexibility than more symmetric devices such as the tokamak. By varying the outer boundary shape of the plasma, a variety of physics features, such as transport, stability, and heating efficiency can be optimized. Scientific visualization techniques are an important adjunct to this effort as they provide a necessary ergonomic link between the numerical results and the intuition of the human researcher. The authors have developed a variety of visualization techniques for stellarators which both facilitate the design optimization process and allow the physics simulations to be more readily understood.
The role of modern diagnostic imaging in diagnosing and differentiating kidney diseases in children.
Maliborski, Artur; Zegadło, Arkadiusz; Placzyńska, Małgorzata; Sopińska, Małgorzata; Lichosik, Marianna; Jobs, Katarzyna
2018-01-01
Urinary tract diseases are among the most commonly diagnosed medical conditions in pediatric patients. Many diseases with different etiologies are accompanied by pain, fever, hematuria, or urinary tract dysfunction. The most common in children are urinary tract infections and congenital malformations, but these symptoms can also represent tumors or changes caused by systemic diseases. Clinical tests, and even more often additional imaging studies, are required to make a proper diagnosis of urinary tract diseases. Just a few decades ago, urography, cystography and voiding cystourethrography were the main methods in diagnostic imaging of the urinary tract. Today's imaging methods, supported by digital radiographic and fluoroscopy systems, high-sensitivity detectors with quantum detection, advanced algorithms eliminating motion artifacts, and modern medical imaging monitors with resolutions of three or even eight megapixels, differ significantly from conventional radiographic methods. The methods most commonly performed today are computed tomography, magnetic resonance imaging, isotopic methods and ultrasonography using elastography and new solutions in Doppler imaging. Modern techniques currently focus on reducing radiation exposure while offering better imaging capabilities. The development of these techniques has made them an essential diagnostic aid in nephrological and urological practice. The aim of this paper is to present the latest solutions currently used in the diagnostic imaging of urinary tract diseases.
ERIC Educational Resources Information Center
Berg, A. I.; And Others
Five articles which were selected from a Russian language book on cybernetics and then translated are presented here. They deal with the topics of: computer-developed computers, heuristics and modern sciences, linguistics and practice, cybernetics and moral-ethical considerations, and computer chess programs. (Author/JY)
LaRiviere, Michael J.; Gross, Robert E.
2016-01-01
Epilepsy is a common, disabling illness that is refractory to medical treatment in approximately one-third of patients, particularly among those with mesial temporal lobe epilepsy. While standard open mesial temporal resection is effective, achieving seizure freedom in most patients, efforts to develop safer, minimally invasive techniques have been underway for over half a century. Stereotactic ablative techniques, in particular, radiofrequency (RF) ablation, were first developed in the 1960s, with refinements in the 1990s with the advent of modern computed tomography and magnetic resonance-based imaging. In the past 5 years, the most recent techniques have used MRI-guided laser interstitial thermotherapy (LITT), the development of which began in the 1980s, saw refinements in MRI thermal imaging through the 1990s, and was initially used primarily for the treatment of intracranial and extracranial tumors. The present review describes the original stereotactic ablation trials, followed by modern imaging-guided RF ablation series for mesial temporal lobe epilepsy. The developments of LITT and MRI thermometry are then discussed. Finally, the two currently available MRI-guided LITT systems are reviewed for their role in the treatment of mesial temporal lobe and other medically refractory epilepsies. PMID:27995127
NASA Astrophysics Data System (ADS)
Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert
2016-05-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~20,000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise from the simple averaging method agree well with those from the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
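The simpler of the two analysis paths, score-weighted averaging, reduces to a few lines; the misfit scores, sea-level contributions and score-to-weight mapping below are synthetic placeholders, not the paper's values.

```python
import numpy as np

# Sketch of score-weighted ensemble averaging. All numbers are synthetic;
# exp(-misfit) is one assumed way to turn a misfit score into a weight.
rng = np.random.default_rng(6)
n_runs = 625
slr = rng.normal(3.3, 1.0, n_runs)            # equivalent sea-level rise [m]
misfit = rng.gamma(2.0, 1.0, n_runs)          # aggregate model-data misfit

weights = np.exp(-misfit)
weights /= weights.sum()
mean = np.sum(weights * slr)
var = np.sum(weights * (slr - mean) ** 2)
print(f"weighted mean = {mean:.2f} m, weighted std = {np.sqrt(var):.2f} m")
```

The Bayesian path replaces this one-shot weighting with an emulator of the model response and MCMC sampling of the parameter posterior, which is what makes it robust to sparser parameter sampling.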
Advances in numerical and applied mathematics
NASA Technical Reports Server (NTRS)
South, J. C., Jr. (Editor); Hussaini, M. Y. (Editor)
1986-01-01
This collection of papers covers some recent developments in numerical analysis and computational fluid dynamics. Some of these studies are of a fundamental nature. They address basic issues such as intermediate boundary conditions for approximate factorization schemes, existence and uniqueness of steady states for time dependent problems, and pitfalls of implicit time stepping. The other studies deal with modern numerical methods such as total variation diminishing schemes, higher order variants of vortex and particle methods, spectral multidomain techniques, and front tracking techniques. There is also a paper on adaptive grids. The fluid dynamics papers treat the classical problems of incompressible flows in helically coiled pipes, vortex breakdown, and transonic flows.
Yu, Daxiong; Ma, Ruijie; Fang, Jianqiao
2015-05-01
There are many eminent acupuncture masters of modern times in the regions of Zhejiang province, who have developed acupuncture schools with numerous distinctive characteristics and exerted important influence at home and abroad. Through collection of the literature on the acupuncture schools in Zhejiang and interviews with the parties involved, it has been discovered that the acupuncture manipulation techniques of these modern masters are distinctively featured. The techniques were developed on the basis of Neijing (Internal Classic), Jinzhenfu (Ode to the Gold Needle) and Zhenjiu Dacheng (Great Compendium of Acupuncture and Moxibustion). Whether following the old maxims or studying on his own, every master lays emphasis on the research and interpretation of the classical theories and integrates the traditional with the modern. In this paper, the acupuncture manipulation techniques of the modern Zhejiang acupuncture masters are described from four aspects, namely the needling techniques of the Internal Classic, the feijingzouqi needling technique, the penetrating needling technique, and innovations in acupuncture manipulation.
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
Effects of rotation on coolant passage heat transfer. Volume 1: Coolant passages with smooth walls
NASA Technical Reports Server (NTRS)
Hajek, T. J.; Wagner, J. H.; Johnson, B. V.; Higgins, A. W.; Steuber, G. D.
1991-01-01
An experimental program was conducted to investigate heat transfer and pressure loss characteristics of rotating multipass passages, for configurations and dimensions typical of modern turbine blades. The immediate objective was the generation of a data base of heat transfer and pressure loss data required to develop heat transfer correlations and to assess computational fluid dynamic techniques for rotating coolant passages. Experiments were conducted in a smooth wall large scale heat transfer model.
[Application of Finite Element Method in Thoracolumbar Spine Traumatology].
Zhang, Min; Qiu, Yong-gui; Shao, Yu; Gu, Xiao-feng; Zeng, Ming-wei
2015-04-01
The finite element method (FEM) is a mathematical technique that uses modern computer technology for stress analysis. It has gradually been adopted for simulating human body structures in biomechanics, and is now widely used in research on thoracolumbar spine traumatology. This paper reviews the establishment of thoracolumbar spine FEMs, their verification, and the status of thoracolumbar FEM research in different fields, and discusses their prospects and value in forensic thoracolumbar traumatology.
Fast matrix multiplication and its algebraic neighbourhood
NASA Astrophysics Data System (ADS)
Pan, V. Ya.
2017-11-01
Matrix multiplication is among the most fundamental operations of modern computation. By 1969 it was still commonly believed that the classical algorithm was optimal, although the experts already knew that this was not so. Worldwide interest in matrix multiplication exploded in 1969, when Strassen decreased the exponent 3 of cubic time to 2.807. Everyone then expected to see matrix multiplication performed in quadratic or nearly quadratic time very soon. Further progress, however, turned out to be capricious. The field remained at a stalemate for almost a decade; then a combination of surprising techniques (completely independent of Strassen's original ones and much more advanced) enabled new decreases of the exponent in 1978-1981 and again in 1986, to 2.376. By 2017 the exponent had still not passed the barrier of 2.373, but most disturbing was the curse of recursion: even the decrease of exponents below 2.7733 required numerous recursive steps, each of which squared the problem size. As a result, all algorithms supporting such exponents supersede the classical algorithm only for inputs of immense size, far beyond any potential interest to the user. We survey the long study of fast matrix multiplication, focusing on neglected algorithms for feasible matrix multiplication. We comment on their design, the techniques involved, implementation issues, the impact of their study on the modern theory and practice of Algebraic Computations, and perspectives for fast matrix multiplication. Bibliography: 163 titles.
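Strassen's 1969 scheme trades one block multiplication for extra additions, giving the exponent log2 7 ≈ 2.807. A minimal Python sketch for power-of-two sizes follows; the cutoff and test sizes are arbitrary illustrative choices.

    import numpy as np

    def strassen(A, B, cutoff=64):
        """Multiply square matrices whose size is a power of two using
        Strassen's seven-product recursion; falls back to NumPy below cutoff."""
        n = A.shape[0]
        if n <= cutoff:
            return A @ B
        h = n // 2
        A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
        B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
        # Seven recursive products instead of the classical eight.
        M1 = strassen(A11 + A22, B11 + B22, cutoff)
        M2 = strassen(A21 + A22, B11, cutoff)
        M3 = strassen(A11, B12 - B22, cutoff)
        M4 = strassen(A22, B21 - B11, cutoff)
        M5 = strassen(A11 + A12, B22, cutoff)
        M6 = strassen(A21 - A11, B11 + B12, cutoff)
        M7 = strassen(A12 - A22, B21 + B22, cutoff)
        C = np.empty_like(A)
        C[:h, :h] = M1 + M4 - M5 + M7
        C[:h, h:] = M3 + M5
        C[h:, :h] = M2 + M4
        C[h:, h:] = M1 - M2 + M3 + M6
        return C

    A = np.random.rand(128, 128); B = np.random.rand(128, 128)
    assert np.allclose(strassen(A, B), A @ B)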
NASA Astrophysics Data System (ADS)
Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.
2017-02-01
Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates using a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter were achieved for the demonstration cases and target HPC systems employed.
Computational Chemistry Using Modern Electronic Structure Methods
ERIC Educational Resources Information Center
Bell, Stephen; Dines, Trevor J.; Chowdhry, Babur Z.; Withnall, Robert
2007-01-01
Various modern electronic structure methods are nowadays used to teach computational chemistry to undergraduate students. Such quantum calculations can now easily be applied even to large molecules.
The Analog Revolution and Its On-Going Role in Modern Analytical Measurements.
Enke, Christie G
2015-12-15
The electronic revolution in analytical instrumentation began when we first exceeded the two-digit resolution of panel meters and chart recorders and then took the first steps into automated control. It started with the first uses of operational amplifiers (op amps) in the analog domain, 20 years before the digital computer entered the analytical lab. Their application greatly increased both accuracy and precision in chemical measurement, and they provided an elegant means for the electronic control of experimental quantities. Later, laboratory and personal computers provided unlimited readout resolution and enabled programmable control of instrument parameters as well as storage and computation of acquired data. However, digital computers did not replace the op amp's critical role of converting the analog sensor's output to a robust and accurate voltage; rather, they added a new role: converting that voltage into a number. These analog operations are generally the limiting portions of our computerized instrumentation systems. Operational amplifier performance in gain, input current and resistance, offset voltage, and rise time has improved by a remarkable 3-4 orders of magnitude since their first implementations. Each 10-fold improvement has opened the doors for the development of new techniques in all areas of chemical analysis. Along with some interesting history, the multiple roles op amps play in modern instrumentation are described, together with a number of examples of new areas of analysis that have been enabled by these improvements.
CSciBox: An Intelligent Assistant for Dating Ice and Sediment Cores
NASA Astrophysics Data System (ADS)
Finlinson, K.; Bradley, E.; White, J. W. C.; Anderson, K. A.; Marchitto, T. M., Jr.; de Vesine, L. R.; Jones, T. R.; Lindsay, C. M.; Israelsen, B.
2015-12-01
CSciBox is an integrated software system for the construction and evaluation of age models of paleo-environmental archives. It incorporates a number of data-processing and visualization facilities, ranging from simple interpolation to reservoir-age correction and 14C calibration via the Calib algorithm, as well as a number of firn and ice-flow models. It employs modern database technology to store paleoclimate proxy data and analysis results in an easily accessible and searchable form, and offers the user access to those data and computational elements via a modern graphical user interface (GUI). In the case of truly large data or computations, CSciBox is parallelizable across modern multi-core processors, or clusters, or even the cloud. The code is open source and freely available on github, as are one-click installers for various versions of Windows and Mac OSX. The system's architecture allows users to incorporate their own software in the form of computational components that can be built smoothly into CSciBox workflows, taking advantage of CSciBox's GUI, data importing facilities, and plotting capabilities. To date, BACON and StratiCounter have been integrated into CSciBox as embedded components. The user can manipulate and compose all of these tools and facilities as she sees fit. Alternatively, she can employ CSciBox's automated reasoning engine, which uses artificial intelligence techniques to explore the gamut of age models and cross-dating scenarios automatically. The automated reasoning engine captures the knowledge of expert geoscientists, and can output a description of its reasoning.
Shao, Meiyue; Aktulga, H. Metin; Yang, Chao; ...
2017-09-14
In this paper, we describe a number of recently developed techniques for improving the performance of large-scale nuclear configuration interaction calculations on high performance parallel computers. We show the benefit of using a preconditioned block iterative method to replace the Lanczos algorithm that has traditionally been used to perform this type of computation. The rapid convergence of the block iterative method is achieved by a proper choice of starting guesses of the eigenvectors and the construction of an effective preconditioner. These acceleration techniques take advantage of special structure of the nuclear configuration interaction problem which we discuss in detail. The use of a block method also allows us to improve the concurrency of the computation, and take advantage of the memory hierarchy of modern microprocessors to increase the arithmetic intensity of the computation relative to data movement. Finally, we also discuss the implementation details that are critical to achieving high performance on massively parallel multi-core supercomputers, and demonstrate that the new block iterative solver is two to three times faster than the Lanczos based algorithm for problems of moderate sizes on a Cray XC30 system.
NASA Astrophysics Data System (ADS)
Gama Goicochea, A.; Balderas Altamirano, M. A.; Lopez-Esparza, R.; Waldo-Mendoza, Miguel A.; Perez, E.
2015-09-01
The connection between fundamental interactions acting in molecules in a fluid and macroscopically measured properties, such as the viscosity between colloidal particles coated with polymers, is studied here. The role that hydrodynamic and Brownian forces play in colloidal dispersions is also discussed. It is argued that many-body systems in which all these interactions take place can be accurately solved using computational simulation tools. One of those modern tools is the technique known as dissipative particle dynamics, which incorporates Brownian and hydrodynamic forces, as well as basic conservative interactions. A case study is reported, as an example of the applications of this technique, which consists of the prediction of the viscosity and friction between two opposing parallel surfaces covered with polymer chains, under the influence of a steady flow. This work is intended to serve as an introduction to the subject of colloidal dispersions and computer simulations, for final-year undergraduate students and beginning graduate students who are interested in beginning research in soft matter systems. To that end, a computational code is included that students can use right away to study complex fluids in equilibrium.
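The three DPD pairwise forces referred to above can be sketched as follows: a generic reduced-unit implementation with textbook parameter values, not the code included with the paper.

    import numpy as np

    def dpd_pair_force(rij, vij, a=25.0, gamma=4.5, kT=1.0, rc=1.0, dt=0.04):
        """Pairwise DPD force on particle i from particle j (reduced units).
        rij = r_i - r_j, vij = v_i - v_j. Returns the zero vector beyond rc."""
        r = np.linalg.norm(rij)
        if r >= rc or r == 0.0:
            return np.zeros(3)
        e = rij / r
        w = 1.0 - r / rc                       # weight function w(r)
        sigma = np.sqrt(2.0 * gamma * kT)      # fluctuation-dissipation relation
        f_c = a * w * e                        # conservative (soft repulsion)
        f_d = -gamma * w**2 * np.dot(e, vij) * e                # dissipative (friction)
        f_r = sigma * w * np.random.normal() * e / np.sqrt(dt)  # random (thermal)
        return f_c + f_d + f_r

    print(dpd_pair_force(np.array([0.5, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])))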
Computer-assisted learning in critical care: from ENIAC to HAL.
Tegtmeyer, K; Ibsen, L; Goldstein, B
2001-08-01
Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been the attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate its use into the modern curriculum. The cost and complexity that once limited computers have consistently decreased since their introduction, making it increasingly feasible to incorporate computers into medical education; simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education, with special reference to critical care medicine. In addition, we examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer.
NASA Technical Reports Server (NTRS)
Nguyen, D. T.; Watson, Willie R. (Technical Monitor)
2005-01-01
The overall objectives of this research are to formulate and validate efficient parallel algorithms, and to design and implement computer software for solving large-scale acoustic problems arising from unified finite element frameworks. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high-performance computing platforms. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures is evaluated by solving a series of structural and acoustic (symmetric and unsymmetric) problems on different computing platforms. Comparisons with existing commercial and/or public-domain software are also included, whenever possible.
Computational Studies of Snake Venom Toxins
Ojeda, Paola G.; Caballero, Julio; Kaas, Quentin; González, Wendy
2017-01-01
Most snake venom toxins are proteins, and they contribute to envenomation through a diverse array of bioactivities, including bleeding, inflammation, pain, and cytotoxic, cardiotoxic or neurotoxic effects. The venom of a single snake species contains hundreds of toxins, and the venoms of the 725 species of venomous snakes represent a large pool of potentially bioactive proteins. Despite considerable discovery efforts, most snake venom toxins remain uncharacterized. Modern bioinformatics tools have recently been developed to mine snake venoms, helping focus experimental research on the most potentially interesting toxins. Some computational techniques predict toxin molecular targets and the binding modes to these targets. This review gives an overview of current knowledge on the ~2200 sequences and more than 400 three-dimensional structures of snake toxins deposited in public repositories, as well as of molecular modeling studies of the interaction between these toxins and their molecular targets. We also describe how modern bioinformatics has been used to study the snake venom protein phospholipase A2, the small basic myotoxin Crotamine, and the three-finger peptide Mambalgin. PMID:29271884
Atomdroid: a computational chemistry tool for mobile platforms.
Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M
2012-04-23
We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
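As an illustration of the kind of Monte Carlo functionality described, here is a minimal Metropolis step over a Lennard-Jones energy; this is a generic sketch, not Atomdroid's actual routines, and all parameter values are arbitrary.

    import numpy as np

    def lj_energy(coords, eps=0.2, sigma=3.4):
        """Total Lennard-Jones energy of a set of atoms (a minimal force field)."""
        E = 0.0
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                r = np.linalg.norm(coords[i] - coords[j])
                E += 4 * eps * ((sigma / r)**12 - (sigma / r)**6)
        return E

    def metropolis_step(coords, kT=0.6, step=0.3):
        """One Metropolis move: perturb a random atom, accept with
        probability min(1, exp(-dE/kT))."""
        trial = coords.copy()
        i = np.random.randint(len(coords))
        trial[i] += np.random.uniform(-step, step, 3)
        dE = lj_energy(trial) - lj_energy(coords)
        if dE <= 0 or np.random.rand() < np.exp(-dE / kT):
            return trial
        return coords

    coords = np.random.rand(8, 3) * 10.0   # 8 atoms in a 10 Angstrom box
    for _ in range(100):
        coords = metropolis_step(coords)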
Rich Language Analysis for Counterterrorism
NASA Astrophysics Data System (ADS)
Guidère, Mathieu; Howard, Newton; Argamon, Shlomo
Accurate and relevant intelligence is critical for effective counterterrorism. Too much irrelevant information is as bad or worse than not enough information. Modern computational tools promise to provide better search and summarization capabilities to help analysts filter and select relevant and key information. However, to do this task effectively, such tools must have access to levels of meaning beyond the literal. Terrorists operating in context-rich cultures like fundamentalist Islam use messages with multiple levels of interpretation, which are easily misunderstood by non-insiders. This chapter discusses several kinds of such “encryption” used by terrorists and insurgents in the Arabic language, and how knowledge of such methods can be used to enhance computational text analysis techniques for use in counterterrorism.
Nonmetastatic Castration-resistant Prostate Cancer: A Modern Perspective.
Cancian, Madeline; Renzulli, Joseph F
2018-06-01
Nonmetastatic castration-resistant prostate cancer (nmCRPC) presents a challenge to urologists, as there are currently no Food and Drug Administration-approved therapies. However, there are new imaging modalities, including fluciclovine positron emission tomography-computed tomography and Ga-PSMA (prostate-specific membrane antigen) positron emission tomography-computed tomography, which are improving the accuracy of diagnosis. With improved imaging, we are better able to target therapy. Today there are 3 ongoing clinical trials studying second-generation antiandrogens in nmCRPC, which hold the promise of a new treatment paradigm. In this article, we review the new imaging techniques and the rationale behind novel treatment modalities in nmCRPC. Copyright © 2018 Elsevier Inc. All rights reserved.
Pape, G; Raiss, P; Kleinschmidt, K; Schuld, C; Mohr, G; Loew, M; Rickert, M
2010-12-01
Loosening of the glenoid component is one of the major causes of failure in total shoulder arthroplasty. Possible risk factors for loosening of cemented components include eccentric loading, poor bone quality, inadequate cementing technique and insufficient cement penetration. The application of a modern cementing technique has become an established procedure in total hip arthroplasty. The goal of modern cementing techniques in general is to improve cement penetration into the cancellous bone. Modern cementing techniques include vacuum-mixing of the cement, retrograde filling of the cement under pressurisation and the use of a pulsatile lavage system. The main purpose of this study was to analyse cement penetration into the glenoid bone when using modern cementing techniques and to investigate the relationship between bone mineral density (BMD) and cement penetration. Furthermore, we measured the temperature at the glenoid surface before and after jet-lavage in different patients during total shoulder arthroplasty. It is known that the surrounding temperature of the bone affects the polymerisation of the cement; data from this experiment provide the temperature setting for the in-vitro study. The glenoid surface temperature was measured in 10 patients with a hand-held non-contact temperature measurement device. Bone mineral density was measured by DEXA. Eight paired cadaver scapulae were allocated (n = 16); each pair comprised two scapulae from one donor (matched-pair design). Two different glenoid components were used, one with pegs and the other with a keel. The glenoids for the in-vitro study were prepared with the bone compaction technique by the same surgeon in all cases. Pulsatile lavage was used to clean the glenoid of blood and bone fragments. Low-viscosity bone cement was applied retrogradely into the glenoid using a syringe, and a constant pressure was applied with a modified force-sensor impactor. Micro-computed tomography scans were used to analyse cement penetration into the cancellous bone. The mean temperature during in-vivo glenoid arthroplasty was 29.4 °C (27.2-31 °C) before and 26.2 °C (25-27.5 °C) after jet-lavage. The overall peak BMD was 0.59 (range 0.33-0.99) g/cm². Mean cement penetration was 107.9 (range 67.6-142.3) mm² in the peg group and 128.3 (range 102.6-170.8) mm² in the keel group. The thickness of the cement layer varied from 0 to 2.1 mm in the pegged group and from 0 to 2.4 mm in the keeled group. A strong negative correlation between BMD and mean cement penetration was found for the peg group (r = -0.834; p < 0.01) and for the keel group (r = -0.727; p < 0.041). Micro-CT showed an inhomogeneous dispersion of the cement within the cancellous bone. Data from the in-vivo temperature measurements indicate that the temperature at the glenohumeral surface during the operation differs from body core temperature and should be considered in further in-vitro studies with human specimens. Bone mineral density is negatively correlated with cement penetration in the glenoid. The application of a modern cementing technique in the glenoid provides sufficient cement penetration, although the dispersion of the cement is inhomogeneous. The findings of this study should be considered in further discussions of cementing technique and cement penetration into the cancellous bone of the glenoid. © Georg Thieme Verlag KG Stuttgart · New York.
Plane-Wave DFT Methods for Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bylaska, Eric J.
A detailed description of modern plane-wave DFT methods and software (contained in the NWChem package) is presented, covering both geometry optimization and ab initio molecular dynamics simulations. Significant emphasis is placed on aspects of these methods that are of interest to computational chemists and useful for simulating chemistry, including techniques for calculating charged systems, exact exchange (i.e., hybrid DFT methods), and highly efficient AIMD/MM methods. Sample applications to the structure of the goethite+water interface and the hydrolysis of nitroaromatic molecules are described.
Special opportunities in helicopter aerodynamics
NASA Technical Reports Server (NTRS)
Mccroskey, W. J.
1983-01-01
Aerodynamic research relating to modern helicopters includes the study of three-dimensional, unsteady, nonlinear flow fields. A selective review is made of some of the phenomena that hamper the development of satisfactory engineering prediction techniques but provide a rich source of research opportunities: flow separation, compressibility effects, complex vortical wakes, and aerodynamic interference between components. Several examples of work in progress are given, including dynamic stall alleviation, the development of computational methods for transonic flow, rotor-wake predictions, and blade-vortex interactions.
A New Three Dimensional Based Key Generation Technique in AVK
NASA Astrophysics Data System (ADS)
Banerjee, Subhasish; Dutta, Manash Pratim; Bhunia, Chandan Tilak
2017-08-01
In the modern era, ensuring a high order of security has become a central objective of computer networks. Over the last few decades, many researchers have contributed to achieving secrecy over the communication channel. Shannon did the pioneering work with his perfect secrecy theorem, showing that the secrecy of shared information can be maintained if the key is variable in nature rather than static. In this regard, a key generation technique has been proposed in which the key changes every time a new block of data is exchanged. In our scheme, the keys vary not only in bit sequence but also in size. An experimental study is included in this article to demonstrate the correctness and effectiveness of the proposed technique.
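One simple realization of such a time-variant key, in the spirit of the scheme described, derives each new key from the previous key and the previous plaintext block; the exact construction and the key-size variation of the paper are not reproduced here, and the sample key and blocks are arbitrary.

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def avk_encrypt(blocks, initial_key):
        """Encrypt equal-size blocks with an automatic variable key: each new
        key is derived from the previous key and previous plaintext block
        (a simple sketch of the time-variant-key idea, not the paper's exact scheme)."""
        key, ciphertext = initial_key, []
        for block in blocks:
            ciphertext.append(xor_bytes(block, key))
            key = xor_bytes(key, block)   # key varies with every block exchanged
        return ciphertext

    def avk_decrypt(blocks, initial_key):
        """Receiver regenerates the same key sequence from recovered plaintext."""
        key, plaintext = initial_key, []
        for block in blocks:
            p = xor_bytes(block, key)
            plaintext.append(p)
            key = xor_bytes(key, p)
        return plaintext

    msg = [b"ATTACKAT", b"DAWNDAWN"]
    k0 = b"\x13\x37\x42\x99\xab\xcd\xef\x01"
    assert avk_decrypt(avk_encrypt(msg, k0), k0) == msg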
A Comparison of Source Code Plagiarism Detection Engines
NASA Astrophysics Data System (ADS)
Lancaster, Thomas; Culwin, Fintan
2004-06-01
Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years, and there are many available engines and services. This paper reviews the literature on the major modern detection engines, comparing them on the metrics and techniques they deploy. Generally, the most common and effective techniques involve tokenising student submissions and then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still areas where further research would be useful, particularly where visual support of the investigation process is possible.
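A minimal sketch of the paired structural metric described above: tokenise two submissions (collapsing identifiers, and incidentally keywords, so that renaming cannot hide copying) and measure their longest common token substring with Python's difflib. The tokenizer is a crude stand-in for the real lexers these engines use.

    import difflib, re

    def tokenize(src: str):
        """Crude lexer: identifiers (and keywords) collapse to one token class
        so that renaming variables does not hide copied structure."""
        tokens = []
        for tok in re.findall(r"[A-Za-z_]\w*|\d+|\S", src):
            if re.match(r"[A-Za-z_]\w*$", tok):
                tokens.append("ID")
            elif tok.isdigit():
                tokens.append("NUM")
            else:
                tokens.append(tok)
        return tokens

    def longest_common_run(a_src: str, b_src: str) -> int:
        a, b = tokenize(a_src), tokenize(b_src)
        m = difflib.SequenceMatcher(None, a, b, autojunk=False)
        return m.find_longest_match(0, len(a), 0, len(b)).size

    s1 = "total = 0\nfor x in data: total += x"
    s2 = "acc = 0\nfor item in values: acc += item"
    print(longest_common_run(s1, s2))   # a long run suggests structural similarity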
Architectural Techniques For Managing Non-volatile Caches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
As chip power dissipation becomes a critical challenge in scaling processor performance, computer architects are forced to fundamentally rethink the design of modern processors and hence, the chip-design industry is now at a major inflection point in its hardware roadmap. The high leakage power and low density of SRAM poses serious obstacles in its use for designing large on-chip caches and for this reason, researchers are exploring non-volatile memory (NVM) devices, such as spin torque transfer RAM, phase change RAM and resistive RAM. However, since NVMs are not strictly superior to SRAM, effective architectural techniques are required for making them a universal memory solution. This book discusses techniques for designing processor caches using NVM devices. It presents algorithms and architectures for improving their energy efficiency, performance and lifetime. It also provides both qualitative and quantitative evaluation to help the reader gain insights and motivate them to explore further. This book will be highly useful for beginners as well as veterans in computer architecture, chip designers, product managers and technical marketing professionals.
Peng, Yun; Miller, Brandi D; Boone, Timothy B; Zhang, Yingchun
2018-02-12
Weakened pelvic floor support is believed to be the main cause of various pelvic floor disorders. Modern theories of pelvic floor support stress the structural and functional integrity of multiple structures and their interplay to maintain normal pelvic floor functions. Connective tissues provide passive pelvic floor support while pelvic floor muscles provide active support through voluntary contraction. Advanced modern medical technologies allow us to comprehensively and thoroughly evaluate the interaction of supporting structures and assess both active and passive support functions. The pathophysiology of various pelvic floor disorders associated with pelvic floor weakness is now under scrutiny from the combination of (1) morphological, (2) dynamic (through computational modeling), and (3) neurophysiological perspectives. This topical review surveys newly emerged studies assessing pelvic floor support function in these three categories. A literature search was performed with emphasis on (1) medical imaging studies that assess pelvic floor muscle architecture, (2) subject-specific computational modeling studies that address new topics such as modeling muscle contractions, and (3) pelvic floor neurophysiology studies that report novel devices or findings such as high-density surface electromyography techniques. We found that recent computational modeling studies feature more realistic soft-tissue constitutive models (e.g., active muscle contraction) as well as an increasing interest in simulating surgical interventions (e.g., artificial sphincter). Diffusion tensor imaging provides a useful non-invasive tool to characterize pelvic floor muscles at the microstructural level, which can potentially be used to improve the accuracy of the simulation of muscle contraction. Studies using high-density surface electromyography anal and vaginal probes in large patient cohorts have recently been reported. The influence of vaginal delivery on the distribution of innervation zones of the pelvic floor muscles has been clarified, providing useful guidance for better protection of women during delivery. We are now in a period of transition to advanced diagnostic and predictive pelvic floor medicine. Our findings highlight the application of diffusion tensor imaging, computational models with consideration of active pelvic floor muscle contraction, high-density surface electromyography, and their potential integration, as tools to push the boundary of our knowledge in pelvic floor support and better shape current clinical practice.
Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo
2018-02-01
The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has attained increasing importance. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have traditionally been exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU; multicore, GPU and space adaptivity; multicore, GPU, space adaptivity and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, and 3D simulations on a ventricular mouse mesh, i.e., complex geometry, under sinus-rhythm and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by a factor of approximately 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.
Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio
2009-12-01
Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.
Model-based phase-shifting interferometer
NASA Astrophysics Data System (ADS)
Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian
2015-10-01
A model-based phase-shifting interferometer (MPI) is developed in which a novel calculation technique replaces the traditional complicated system structure, to achieve versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in ZYGO interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing reconstruction (ROR) method is employed for retrace-error correction in the non-null test, as well as for figure-error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical tests are presented to validate the flexibility and accuracy. The test results are compared with those of a Zygo interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI holds great potential in modern optical shop testing.
NASA Astrophysics Data System (ADS)
Spitznagel, J. A.; Wood, Susan
1988-08-01
The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the Department of Defense (DOD). It was chartered by the Undersecretary of Defense for Research and Engineering on June 15, 1984. The SEI was established and is operated by Carnegie Mellon University (CMU) under contract F19628-C-0003, which was competitively awarded on December 28, 1984, by the Air Force Electronic Systems Division. The mission of the SEI is to provide the means to bring the ablest minds and the most effective technology to bear on the rapid improvement of the quality of operational software in mission-critical computer systems; to accelerate the reduction to practice of modern software engineering techniques and methods; to promulgate the use of modern techniques and methods throughout the mission-critical systems community; and to establish standards of excellence for the practice of software engineering. This report provides a summary of the programs and projects, staff, facilities, and service accomplishments of the Software Engineering Institute during 1987.
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
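A generic statement of the nested RBDO problem discussed above can be written (notation mine, not necessarily the thesis's) as

\min_{d} \; f(d) \quad \text{subject to} \quad P\big[ g_i(d, X) \le 0 \big] \le \Phi(-\beta_i^{t}), \qquad i = 1, \dots, m,

where d are the design variables, X the random variables, g_i the limit-state functions, \beta_i^{t} the target reliability indices, and \Phi the standard normal cumulative distribution function. In the nested formulation each probabilistic constraint is evaluated by an inner reliability-analysis optimization (e.g., FORM); the unilevel idea is, broadly, to replace that inner loop with its first-order optimality conditions imposed as constraints of a single outer problem.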
NASA Technical Reports Server (NTRS)
Soeder, J. F.
1983-01-01
As turbofan engines become more complex, the development of controls necessitates the use of multivariable control techniques. A control developed for the F100-PW-100(3) turbofan engine using linear quadratic regulator theory and other modern multivariable control synthesis techniques is described. The assembly language implementation of this control on an SEL 810B minicomputer is described. This implementation was then evaluated using a real-time hybrid simulation of the engine. The control software was modified to run with a real engine. These modifications, in the form of sensor and actuator failure checks and control executive sequencing, are discussed. Finally, recommendations for control software implementations are presented.
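A small sketch of the linear-quadratic-regulator synthesis referred to above, using SciPy's continuous-time algebraic Riccati solver on a toy two-state model; the matrices are illustrative, not the F100 engine model.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Toy two-state plant: xdot = A x + B u, with u = -K x minimizing
    # J = integral of (x'Qx + u'Ru) dt.
    A = np.array([[-1.0, 0.5],
                  [0.0, -2.0]])
    B = np.array([[1.0],
                  [0.5]])
    Q = np.diag([10.0, 1.0])   # state weighting
    R = np.array([[1.0]])      # control weighting

    P = solve_continuous_are(A, B, Q, R)   # algebraic Riccati equation
    K = np.linalg.solve(R, B.T @ P)        # optimal state-feedback gain
    print("LQR gain K =", K)
    print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))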
Leveraging Social Computing for Personalized Crisis Communication using Social Media.
Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli
2016-03-24
The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters have demonstrated, people tend to converge on social media during and following emergencies. Authorities can then use these media and other computational methods to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. The current review presents a conceptual framework for studying psychological aspects of crisis and risk communication on social media through social computing. Advanced analytical tools can be integrated into the processes and objectives of crisis communication. The availability of these computational techniques can improve communication with the public through a process of Hyper-Targeted Crisis Communication. The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media can facilitate more sensitive and personalized emergency communication.
Recent CFD Simulations of Shuttle Orbiter Contingency Abort Aerodynamics
NASA Technical Reports Server (NTRS)
Papadopoulos, Periklis; Prabhu, Dinesh; Wright, Michael; Davies, Carol; McDaniel, Ryan; Venkatapathy, Ethiraj; Wersinski, Paul; Gomez, Reynaldo; Arnold, Jim (Technical Monitor)
2001-01-01
Modern Computational Fluid Dynamics (CFD) techniques were used to compute aerodynamic forces and moments of the Space Shuttle Orbiter in specific portions of contingency abort trajectory space. The trajectory space covers a Mach number range of 3.5-15, an angle-of-attack range of 20-60 degrees, an altitude range of 100-190 kft, and several different settings of the control surfaces (elevons, body flap, and speed brake). While approximately 40 cases have been computed, only a sampling of the results is presented here. The computed results, in general, are in good agreement with the Orbiter Operational Aerodynamic Data Book (OADB) data (i.e., within the uncertainty bands) for almost all the cases. However, in a limited number of high angle-of-attack cases (at Mach 15), there are significant differences between the computed results, especially the vehicle pitching moment, and the OADB data. A preliminary analysis of the data from the CFD simulations at Mach 15 shows that these differences can be attributed to real-gas/Mach number effects.
The influence of surface finishing methods on touch-sensitive reactions
NASA Astrophysics Data System (ADS)
Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.
2017-02-01
This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and finishing techniques are appearing. Today's information-oriented society heightens the visual aesthetics of new jewelry forms, decoration techniques (depth and surface) and the synthesis of different materials, all of which reveal a bias towards the positive effects of visual design. The jewelry industry now includes not only traditional techniques but also computer-assisted design, 3D prototyping and other alternative means of achieving an updated level of jewelry material processing. The authors present the specific features of ornamental pattern design and decoration types (depth and surface), together with a comparative analysis of different approaches to surface finishing. The appearance and effect of jewelry are assessed using proposed evaluation criteria, on the premise that advanced visual aesthetics rests on touch-sensitive responses.
Current role of modern radiotherapy techniques in the management of breast cancer
Ozyigit, Gokhan; Gultekin, Melis
2014-01-01
Breast cancer is the most common type of malignancy in females. Advances in systemic therapies and radiotherapy (RT) have provided long survival in breast cancer patients, and RT has a major role in the management of the disease. During the past 15 years several developments have taken place in imaging and irradiation techniques, intensity-modulated RT, hypofractionation and partial-breast irradiation. Currently, improvements in RT technology allow a substantial decrease in treatment-related complications, such as fibrosis and long-term cardiac toxicity, while improving loco-regional control rates and cosmetic results. Thus, it is crucial that modern radiotherapy techniques be carried out with maximum care and efficiency. Several randomized trials provide evidence for the feasibility of modern radiotherapy techniques in the management of breast cancer; nevertheless, their role will continue to be defined by the mature results of these trials. The current review provides up-to-date, evidence-based data on the role of modern radiotherapy techniques in the management of breast cancer. PMID:25114857
Psycho-informatics: Big Data shaping modern psychometrics.
Markowetz, Alexander; Błaszkiewicz, Konrad; Montag, Christian; Switala, Christina; Schlaepfer, Thomas E
2014-04-01
For the first time in history, it is possible to study human behavior on a great scale and in fine detail simultaneously. Online services and ubiquitous computational devices, such as smartphones and modern cars, record our everyday activity. The resulting Big Data offers unprecedented opportunities for tracking and analyzing behavior. This paper examines the applicability and impact of Big Data technologies in the context of psychometrics, for both research and clinical applications. It first outlines the state of the art, including the severe shortcomings with respect to quality and quantity of the resulting data. It then presents a technological vision comprised of (i) numerous data sources such as mobile devices and sensors, (ii) a central data store, and (iii) an analytical platform employing techniques from data mining and machine learning. To further illustrate the dramatic benefits of the proposed methodologies, the paper then outlines two current projects logging and analyzing smartphone usage: one attempts thereby to quantify the severity of major depression dynamically; the other investigates (mobile) Internet addiction. Finally, the paper addresses some of the ethical issues inherent to Big Data technologies. In summary, the proposed approach is about to induce the single biggest methodological shift since the beginning of psychology or psychiatry. The resulting range of applications will dramatically shape the daily routines of researchers and medical practitioners alike. Indeed, transferring techniques from computer science to psychiatry and psychology is about to establish Psycho-Informatics, an entire research direction of its own. Copyright © 2013 Elsevier Ltd. All rights reserved.
Trace: a high-throughput tomographic reconstruction engine for large-scale datasets.
Bicer, Tekin; Gürsoy, Doğa; Andrade, Vincent De; Kettimuthu, Rajkumar; Scullin, William; Carlo, Francesco De; Foster, Ian T
2017-01-01
Modern synchrotron light sources and detectors produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. We present Trace, a data-intensive computing engine that we have developed to enable high-performance implementation of iterative tomographic reconstruction algorithms for parallel computers. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called replicated reconstruction object to maximize application performance. We also present the optimizations that we apply to the replicated reconstruction objects and evaluate them using tomography datasets collected at the Advanced Photon Source. Our experimental evaluations show that our optimizations and parallelization techniques can provide 158× speedup using 32 compute nodes (384 cores) over a single-core configuration and decrease the end-to-end processing time of a large sinogram (with 4501 × 1 × 22,400 dimensions) from 12.5 h to <5 min per iteration. The proposed tomographic reconstruction engine can efficiently process large-scale tomographic data using many compute nodes and minimize reconstruction times.
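The replicated-reconstruction-object idea can be sketched as follows: each worker accumulates into a private copy of the reconstruction grid, and the replicas are reduced once at the end. This is a toy Python illustration with a placeholder kernel and synthetic data, not Trace's actual implementation.

    import numpy as np
    from multiprocessing import Pool

    N = 64                            # toy N x N reconstruction grid
    rows = np.random.rand(384, N)     # stand-in for sinogram rows (not real CT data)

    def partial_update(chunk):
        """Each worker accumulates its chunk into a private ('replicated')
        reconstruction object, avoiding locks on a shared array."""
        local = np.zeros((N, N))
        for row in chunk:
            local += np.outer(row, row)   # placeholder for a backprojection kernel
        return local

    if __name__ == "__main__":
        chunks = np.array_split(rows, 8)      # process-level decomposition
        with Pool(8) as pool:
            partials = pool.map(partial_update, chunks)
        recon = np.sum(partials, axis=0)      # reduce the replicas once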
Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers
Wang, Bei; Ethier, Stephane; Tang, William; ...
2017-06-29
The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
Precision cosmological parameter estimation
NASA Astrophysics Data System (ADS)
Fendt, William Ashton, Jr.
2009-09-01
Experimental efforts of the last few decades have brought a golden age to mankind's endeavor to understand the physical properties of the Universe throughout its history. Recent measurements of the cosmic microwave background (CMB) provide strong confirmation of the standard big bang paradigm, as well as introducing new mysteries unexplained by current physical models. In the coming decades, even more ambitious scientific endeavours will begin to shed light on the new physics by looking at the detailed structure of the Universe at both very early and recent times. Modern data have allowed us to begin to test inflationary models of the early Universe, and the near future will bring higher precision data and much stronger tests. Cracking the codes hidden in these cosmological observables is a difficult and computationally intensive problem. The challenges will continue to increase as future experiments bring larger and more precise data sets. Because of the complexity of the problem, we are forced to use approximate techniques and make simplifying assumptions to ease the computational workload. While this has been reasonably sufficient until now, hints of the limitations of our techniques have begun to come to light. For example, the likelihood approximation used for analysis of CMB data from the Wilkinson Microwave Anisotropy Probe (WMAP) satellite was shown to have shortfalls, leading to premature conclusions about current cosmological theories. It can also be shown that an approximate method used by all current analysis codes to describe the recombination history of the Universe will not be sufficiently accurate for future experiments. With a new CMB satellite scheduled for launch in the coming months, it is vital that we develop techniques to improve the analysis of cosmological data. This work develops a novel technique that both avoids the use of approximate computational codes and allows the application of new, more precise analysis methods. These techniques will help in understanding the new physics contained in current and future data sets and will benefit the research efforts of the cosmology community. Our idea is to shift the computationally intensive pieces of the parameter estimation framework to a parallel training step. We then provide a machine learning code that uses this training set to learn the relationship between the underlying cosmological parameters and the functions we wish to compute. This code is very accurate and simple to evaluate, and it can provide incredible speedups of parameter estimation codes. For some applications this provides the convenience of obtaining results faster, while in other cases it allows the use of codes that would be impossible to apply in the brute-force setting. In this thesis we provide several examples where our method allows more accurate computation of functions important for data analysis than is currently possible. As the techniques developed in this work are very general, there are no doubt a wide array of applications both inside and outside of cosmology. We have already seen this interest, as other scientists have presented ideas for using our algorithm to improve their computational work, indicating its importance as modern experiments push forward. In fact, our algorithm will play an important role in the parameter analysis of Planck, the next generation CMB space mission.
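The train-then-evaluate idea described above can be sketched in miniature: fit a cheap surrogate to expensive function evaluations offline, then query only the surrogate during parameter estimation. Polynomial regression stands in for the thesis's machine-learning code, and the "expensive" function here is a stand-in, not a real cosmology code.

    import numpy as np

    def expensive_observable(theta):
        """Stand-in for a costly cosmology code (e.g., a Boltzmann solver)."""
        return np.sin(3 * theta) + 0.5 * theta**2

    # Offline, parallelizable training step: sample parameters, run the slow code.
    theta_train = np.linspace(-1, 1, 50)
    y_train = expensive_observable(theta_train)

    # Learn the parameter -> observable map with a cheap surrogate.
    emulator = np.poly1d(np.polyfit(theta_train, y_train, deg=9))

    # Fast evaluation inside an MCMC loop would call the emulator, not the code.
    theta_test = np.linspace(-1, 1, 201)
    err = np.max(np.abs(emulator(theta_test) - expensive_observable(theta_test)))
    print(f"max emulation error on test grid: {err:.2e}")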
Visualizing functional motions of membrane transporters with molecular dynamics simulations.
Shaikh, Saher A; Li, Jing; Enkavi, Giray; Wen, Po-Chao; Huang, Zhijian; Tajkhorshid, Emad
2013-01-29
Computational modeling and molecular simulation techniques have become an integral part of modern molecular research. Various areas of molecular sciences continue to benefit from, indeed rely on, the unparalleled spatial and temporal resolutions offered by these technologies, to provide a more complete picture of the molecular problems at hand. Because of the continuous development of more efficient algorithms harvesting ever-expanding computational resources, and the emergence of more advanced and novel theories and methodologies, the scope of computational studies has expanded significantly over the past decade, now including much larger molecular systems and far more complex molecular phenomena. Among the various computer modeling techniques, the application of molecular dynamics (MD) simulation and related techniques has particularly drawn attention in biomolecular research, because of the ability of the method to describe the dynamical nature of the molecular systems and thereby to provide a more realistic representation, which is often needed for understanding fundamental molecular properties. The method has proven to be remarkably successful in capturing molecular events and structural transitions highly relevant to the function and/or physicochemical properties of biomolecular systems. Herein, after a brief introduction to the method of MD, we use a number of membrane transport proteins studied in our laboratory as examples to showcase the scope and applicability of the method and its power in characterizing molecular motions of various magnitudes and time scales that are involved in the function of this important class of membrane proteins.
Visualizing Functional Motions of Membrane Transporters with Molecular Dynamics Simulations
2013-01-01
Computational modeling and molecular simulation techniques have become an integral part of modern molecular research. Various areas of molecular sciences continue to benefit from, indeed rely on, the unparalleled spatial and temporal resolutions offered by these technologies, to provide a more complete picture of the molecular problems at hand. Because of the continuous development of more efficient algorithms harvesting ever-expanding computational resources, and the emergence of more advanced and novel theories and methodologies, the scope of computational studies has expanded significantly over the past decade, now including much larger molecular systems and far more complex molecular phenomena. Among the various computer modeling techniques, the application of molecular dynamics (MD) simulation and related techniques has particularly drawn attention in biomolecular research, because of the ability of the method to describe the dynamical nature of the molecular systems and thereby to provide a more realistic representation, which is often needed for understanding fundamental molecular properties. The method has proven to be remarkably successful in capturing molecular events and structural transitions highly relevant to the function and/or physicochemical properties of biomolecular systems. Herein, after a brief introduction to the method of MD, we use a number of membrane transport proteins studied in our laboratory as examples to showcase the scope and applicability of the method and its power in characterizing molecular motions of various magnitudes and time scales that are involved in the function of this important class of membrane proteins. PMID:23298176
PandaEPL: a library for programming spatial navigation experiments.
Solway, Alec; Miller, Jonathan F; Kahana, Michael J
2013-12-01
Recent advances in neuroimaging and neural recording techniques have enabled researchers to make significant progress in understanding the neural mechanisms underlying human spatial navigation. Because these techniques generally require participants to remain stationary, computer-generated virtual environments are used. We introduce PandaEPL, a programming library for the Python language designed to simplify the creation of computer-controlled spatial-navigation experiments. PandaEPL is built on top of Panda3D, a modern open-source game engine. It allows users to construct three-dimensional environments that participants can navigate from a first-person perspective. Sound playback and recording and also joystick support are provided through the use of additional optional libraries. PandaEPL also handles many tasks common to all cognitive experiments, including managing configuration files, logging all internal and participant-generated events, and keeping track of the experiment state. We describe how PandaEPL compares with other software for building spatial-navigation experiments and walk the reader through the process of creating a fully functional experiment.
PandaEPL: A library for programming spatial navigation experiments
Solway, Alec; Miller, Jonathan F.
2013-01-01
Recent advances in neuroimaging and neural recording techniques have enabled researchers to make significant progress in understanding the neural mechanisms underlying human spatial navigation. Because these techniques generally require participants to remain stationary, computer-generated virtual environments are used. We introduce PandaEPL, a programming library for the Python language designed to simplify the creation of computer-controlled spatial-navigation experiments. PandaEPL is built on top of Panda3D, a modern open-source game engine. It allows users to construct three-dimensional environments that participants can navigate from a first-person perspective. Sound playback and recording and also joystick support are provided through the use of additional optional libraries. PandaEPL also handles many tasks common to all cognitive experiments, including managing configuration files, logging all internal and participant-generated events, and keeping track of the experiment state. We describe how PandaEPL compares with other software for building spatial-navigation experiments and walk the reader through the process of creating a fully functional experiment. PMID:23549683
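PandaEPL wraps Panda3D rather than replacing it. The snippet below is the canonical minimal Panda3D scene, included only to show the layer PandaEPL builds on; it deliberately avoids guessing at PandaEPL's own API:

```python
from direct.showbase.ShowBase import ShowBase

class MinimalWorld(ShowBase):
    def __init__(self):
        ShowBase.__init__(self)
        # Load a model shipped with Panda3D and attach it to the scene graph;
        # PandaEPL adds first-person navigation, logging and state management
        # on top of primitives like these.
        env = self.loader.loadModel("models/environment")
        env.reparentTo(self.render)
        env.setScale(0.25)
        env.setPos(-8, 42, 0)

MinimalWorld().run()
```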
Graphical Interface for the Study of Gas-Phase Reaction Kinetics: Cyclopentene Vapor Pyrolysis
NASA Astrophysics Data System (ADS)
Marcotte, Ronald E.; Wilson, Lenore D.
2001-06-01
The undergraduate laboratory experiment on the pyrolysis of gaseous cyclopentene has been modernized to improve safety, speed, and precision and to better reflect the current practice of physical chemistry. It now utilizes virtual instrument techniques to create a graphical computer interface for the collection and display of experimental data. An electronic pressure gauge has replaced the mercury manometer formerly needed in proximity to the 500 °C pyrolysis oven. Students have much better real-time information available to them and no longer require multiple lab periods to get rate constants and acceptable Arrhenius parameters. The time saved on manual data collection is used to give the students a tour of the computer interfacing hardware and software and a hands-on introduction to gas-phase reagent preparation using a research-grade high-vacuum system. This includes loading the sample, degassing it by the freeze-pump-thaw technique, handling liquid nitrogen and working through the logic necessary for each reconfiguration of the diffusion pump section and the submanifolds.
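The data reduction the students perform is a standard Arrhenius analysis; a minimal sketch, with illustrative rather than measured numbers, is:

```python
import numpy as np

R = 8.314  # J/(mol K)
T = np.array([750.0, 775.0, 800.0, 825.0])       # K (illustrative)
k = np.array([2.1e-4, 6.0e-4, 1.6e-3, 4.0e-3])   # 1/s, first-order (illustrative)

# ln k = ln A - Ea/(R T): a linear fit of ln k against 1/T yields both parameters.
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R          # activation energy, J/mol
A = np.exp(intercept)    # pre-exponential factor, 1/s

print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, A ~ {A:.2e} 1/s")
```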
Acoustical qualification of Teatro Nuovo in Spoleto before refurbishing works
NASA Astrophysics Data System (ADS)
Cocchi, Alessandro; Cesare Consumi, Marco; Shimokura, Ryota
2004-05-01
To assess the acoustical quality of an opera house, two different approaches are now available: one is based on the responses of qualified listeners (subjective judgments) compared with objective values of selected parameters, the other on comparison tests conducted in suitable rooms and on a model of the auditory brain system (preference). On the occasion of the refurbishment of an opera house known for the Two Worlds Festival founded by the Italian composer G. C. Menotti, a large number of measurements were taken with different techniques, making it possible to compare the different methods, and also to compare the results with geometrical criteria based on the simplest rules of musical harmony, now neglected as attention is drawn to computer simulations, computer-aided measurement techniques and similar modern methods. From this work, some links between well-known acoustical parameters (not known at the time when architects sketched the shapes of ancient opera houses) and geometrical criteria (well known at the time when ancient opera houses were built) are shown.
Modern morphometry: new perspectives in physical anthropology.
Mantini, Simone; Ripani, Maurizio
2009-06-01
Over the past one hundred years, physical anthropology has had recourse to increasingly efficient methods, which have provided much new information regarding human evolution and biology. Apart from the molecular approach, the introduction of new computer-assisted techniques gave rise to a new concept of morphometry. Computed tomography and 3D imaging allow anatomical description of external and inner structures, overcoming the problems encountered with traditional morphometric methods. Furthermore, geometric morphometrics allows the creation of geometric models to investigate morphological variation in terms of evolution, ontogeny and variability. The integration of these new tools gave rise to virtual anthropology and to a new image of the anthropologist, in which anatomical, biological, mathematical, statistical and data-processing information is fused in a multidisciplinary approach.
Extending the knowledge in histochemistry and cell biology.
Heupel, Wolfgang-Moritz; Drenckhahn, Detlev
2010-01-01
Central to modern Histochemistry and Cell Biology stands the need for visualization of cellular and molecular processes. In the past several years, a variety of techniques has emerged that bridges traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from a bird's-eye down to a worm's-eye view, focusing on tissues, cells, proteins or, finally, single molecules. Applications of new approaches in combination with well-established traditional techniques of mRNA, DNA or protein analysis have led to enlightening and prudent studies which have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles that represent the progress made in "histo-biochemical" techniques and their manifold applications.
On the Impact of Execution Models: A Case Study in Computational Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram
2015-05-25
Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has comparable performance to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Using this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe that these lessons will be beneficial to researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.
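The gap between static scheduling and dynamic schemes such as work stealing is easy to reproduce in miniature. The sketch below substitutes a single shared work queue, a centralized simplification of true work stealing, and uses made-up task costs skewed the way irregular kernels often are:

```python
import time
from queue import Queue, Empty
from concurrent.futures import ThreadPoolExecutor

def task(cost):
    time.sleep(cost)  # stand-in for a work unit of varying size

costs = [0.001] * 60 + [0.05] * 4  # mostly cheap tasks, a few expensive ones
NWORKERS = 4

def static_run():
    # Each worker receives a fixed contiguous chunk up front; whichever chunk
    # holds the expensive tasks becomes the straggler.
    step = len(costs) // NWORKERS
    chunks = [costs[i * step:(i + 1) * step] for i in range(NWORKERS)]
    with ThreadPoolExecutor(NWORKERS) as pool:
        list(pool.map(lambda chunk: [task(c) for c in chunk], chunks))

def dynamic_run():
    # Workers pull the next task from a shared queue whenever they go idle.
    q = Queue()
    for c in costs:
        q.put(c)
    def worker(_):
        while True:
            try:
                task(q.get_nowait())
            except Empty:
                return
    with ThreadPoolExecutor(NWORKERS) as pool:
        list(pool.map(worker, range(NWORKERS)))

for run in (static_run, dynamic_run):
    t0 = time.perf_counter()
    run()
    print(f"{run.__name__}: {time.perf_counter() - t0:.3f} s")
```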
Parallel and Portable Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Lee, S. R.; Cummings, J. C.; Nolen, S. D.; Keen, N. D.
1997-08-01
We have developed a multi-group, Monte Carlo neutron transport code in C++ using object-oriented methods and the Parallel Object-Oriented Methods and Applications (POOMA) class library. This transport code, called MC++, currently computes k and α eigenvalues of the neutron transport equation on a rectilinear computational mesh. It is portable to and runs in parallel on a wide variety of platforms, including MPPs, clustered SMPs, and individual workstations. It contains appropriate classes and abstractions for particle transport and, through the use of POOMA, for portable parallelism. Current capabilities are discussed, along with physics and performance results for several test problems on a variety of hardware, including all three Accelerated Strategic Computing Initiative (ASCI) platforms. Current parallel performance indicates the ability to compute α-eigenvalues in seconds or minutes rather than days or weeks. Current and future work on the implementation of a general transport physics framework (TPF) is also described. This TPF employs modern C++ programming techniques to provide simplified user interfaces, generic STL-style programming, and compile-time performance optimization. Physics capabilities of the TPF will be extended to include continuous energy treatments, implicit Monte Carlo algorithms, and a variety of convergence acceleration techniques such as importance combing.
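The generation-based k-eigenvalue estimator underlying such codes fits in a toy one-group, bare-slab model; all probabilities below are invented, and MC++ of course tracks real multigroup physics on a mesh:

```python
import numpy as np

rng = np.random.default_rng(1)
L = 10.0           # slab thickness in mean free paths
P_FISSION = 0.35   # probability that a collision is a fission
P_CAPTURE = 0.5    # probability that a non-fission collision absorbs
NU = 2.5           # mean neutrons released per fission
N_GEN, N_PER_GEN = 15, 4000

positions = rng.uniform(0.0, L, N_PER_GEN)
for gen in range(N_GEN):
    births = []
    for x in positions:
        while True:
            x += rng.choice([-1.0, 1.0]) * rng.exponential(1.0)  # next flight
            if x < 0.0 or x > L:
                break                                  # leakage ends the history
            if rng.random() < P_FISSION:
                births.extend([x] * rng.poisson(NU))   # fission site -> offspring
                break
            if rng.random() < P_CAPTURE:
                break                                  # sterile capture
            # otherwise the neutron scatters and keeps flying
    print(f"generation {gen:2d}: k ~ {len(births) / len(positions):.3f}")
    if births:
        positions = rng.choice(births, N_PER_GEN)      # renormalize population
```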
Computational protein design-the next generation tool to expand synthetic biology applications.
Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel
2018-05-02
One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near-future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.
Evaluation of a Multicore-Optimized Implementation for Tomographic Reconstruction
Agulleiro, Jose-Ignacio; Fernández, José Jesús
2012-01-01
Tomography allows elucidation of the three-dimensional structure of an object from a set of projection images. In life sciences, electron microscope tomography is providing invaluable information about the cell structure at a resolution of a few nanometres. Here, large images are required to combine wide fields of view with high resolution requirements. The computational complexity of the algorithms along with the large image size then turns tomographic reconstruction into a computationally demanding problem. Traditionally, high-performance computing techniques have been applied to cope with such demands on supercomputers, distributed systems and computer clusters. In the last few years, the trend has turned towards graphics processing units (GPUs). Here we present a detailed description and a thorough evaluation of an alternative approach that relies on exploitation of the power available in modern multicore computers. The combination of single-core code optimization, vector processing, multithreading and efficient disk I/O operations succeeds in providing fast tomographic reconstructions on standard computers. The approach turns out to be competitive with the fastest GPU-based solutions thus far. PMID:23139768
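The coarse-grained part of such a strategy, reconstructing independent 2-D slices on separate cores, can be sketched with a thread pool and a placeholder per-slice kernel; this shows only the parallel pattern, not the authors' optimized reconstruction code:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def reconstruct_slice(sinogram):
    """Placeholder per-slice kernel: a ramp filter applied in Fourier space
    (real code would follow with backprojection or iterate to convergence)."""
    f = np.fft.rfft(sinogram, axis=1)
    f *= np.arange(f.shape[1])          # ramp filter weights
    return np.fft.irfft(f, n=sinogram.shape[1], axis=1)

# One sinogram per slice of the volume: 180 angles x 512 detector bins each.
sinograms = [np.random.rand(180, 512) for _ in range(64)]

# Independent slices on independent cores; numpy's heavy kernels release the
# GIL, so a thread pool yields real concurrency on a multicore machine.
with ThreadPoolExecutor(max_workers=8) as pool:
    recon = list(pool.map(reconstruct_slice, sinograms))
print(f"{len(recon)} slices processed")
```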
The Center for Computational Biology: resources, achievements, and challenges
Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2011-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221
The Center for Computational Biology: resources, achievements, and challenges.
Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2012-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.
NASA Astrophysics Data System (ADS)
2014-10-01
The active involvement of young researchers in scientific processes and the acquisition of scientific experience by gifted youth are currently of great value for the development of science. One of the research activities of National Research Tomsk Polytechnic University aimed at preparing and forming the next generation of scientists is the International Conference of Students and Young Scientists "Modern Techniques and Technologies", held in 2014 for the twentieth time. Great experience in the organization of scientific events has been acquired through the years of holding the conference. All the necessary resources are in place: a team of organizers drawn from the employees of Tomsk Polytechnic University, premises equipped with modern office and presentation equipment, leading scientists among the professors of TPU, and the status of the university as a leading research university in Russia. In this way the conference is able to attract world-leading scientists for collaboration. Over the previous years the conference has proven itself to be a major scientific event at the international level, attracting more than 600 students and young scientists from Russia, the CIS and other countries. The conference comprises oral plenary and section reports, and is organized around lectures in which leading Russian and foreign scientists deliver plenary presentations to young audiences. An important indicator of this scientific event is the breadth of its coverage of scientific fields: energy, heat and power, instrument making, engineering, systems and devices for medical purposes, electromechanics, materials science, computer science and control in technical systems, nanotechnologies and nanomaterials, physical methods in science and technology, control and quality management, and design and technology of artistic materials processing. The main issues considered by young researchers at the conference related to the analysis of contemporary problems using new techniques and the application of new technologies.
Applications of Mapping and Tomographic Techniques in Gem Sciences
NASA Astrophysics Data System (ADS)
Shen, A. H.
2014-12-01
Gem Sciences are the scientific study of gemstones: their genesis, provenance, synthesis, enhancement, treatment and identification. As high-quality forms of specific minerals, gemstones exhibit unusual physical properties that are usually unseen in their ordinary counterparts. Most gemstones are colored by trace elements incorporated in the crystal lattice during various growth stages, forming coloration zones of various scales. Studying the spectral and chemical contrast across color zones helps elucidate the origins of color. These studies are done with UV-visible spectrometers fitted with microscopes, and with LA-ICPMS, in modern gemological laboratories. In the case of diamonds, colored zones arise from various structural defects incorporated in different growth zones and are studied with FTIR spectrometers equipped with IR microscopes and with laser photoluminescence spectrometers. Advances in modern synthetic techniques such as chemical vapor deposition (CVD) have created identification problems. Some exploratory experiments in carbon isotope mapping were done on diamonds using SIMS. The most important issue with pearls is to determine whether a particular pearl is cultured or natural; the price difference can be enormous. Classically, such identification is done by X-ray radiographs, which clearly show the bead and the nacre. Modern pearl culturing has eliminated the need for an artificial bead, using a small piece of tissue instead. Nowadays, computer X-ray tomography (CT) scanning devices are used to produce a clear image of the interior of a pearl. In the Chinese jade market, filling fissures with epoxy and/or wax is very commonly seen. We are currently exploring the Magnetic Resonance Imaging (MRI) technique to map the distribution of artificial resin within polycrystalline aggregates.
van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W
2016-10-01
Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVM) and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC value, 0.757), followed by RF and support vector machine models (median validated AUC value, 0.735 and 0.732, respectively). With each predictor set, the classification and regression trees models showed poor performance (median validated AUC value, <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
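The leave-one-study-out external validation loop can be sketched with scikit-learn; the data frame and column names below are synthetic stand-ins for the pooled 15-study data set:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
predictors = ["age", "motor_score", "pupillary_reactivity"]

# Synthetic stand-in for the pooled data (column names are hypothetical).
n = 3000
df = pd.DataFrame({
    "study": rng.integers(0, 15, n),
    "age": rng.uniform(18, 90, n),
    "motor_score": rng.integers(1, 7, n),
    "pupillary_reactivity": rng.integers(0, 3, n),
})
logit = -4 + 0.04 * df["age"] - 0.3 * df["motor_score"]
df["mortality"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# Develop on one study, validate on each of the others: 15 x 14 = 210
# validations per model (the paper's 630 span its three predictor sets).
aucs = []
for dev in df["study"].unique():
    model = LogisticRegression(max_iter=1000).fit(
        df.loc[df["study"] == dev, predictors],
        df.loc[df["study"] == dev, "mortality"])
    for val in df["study"].unique():
        if val == dev:
            continue
        part = df[df["study"] == val]
        p = model.predict_proba(part[predictors])[:, 1]
        aucs.append(roc_auc_score(part["mortality"], p))

print(f"median validated AUC: {np.median(aucs):.3f}")
```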
Scalability and Validation of Big Data Bioinformatics Software.
Yang, Andrian; Troup, Michael; Ho, Joshua W K
2017-01-01
This review examines two important aspects that are central to modern big data bioinformatics analysis - software scalability and validity. We argue that not only are the issues of scalability and validation common to all big data bioinformatics analyses, they can be tackled by conceptually related methodological approaches, namely divide-and-conquer (scalability) and multiple executions (validation). Scalability is defined as the ability for a program to scale based on workload. It has always been an important consideration when developing bioinformatics algorithms and programs. Nonetheless the surge of volume and variety of biological and biomedical data has posed new challenges. We discuss how modern cloud computing and big data programming frameworks such as MapReduce and Spark are being used to effectively implement divide-and-conquer in a distributed computing environment. Validation of software is another important issue in big data bioinformatics that is often ignored. Software validation is the process of determining whether the program under test fulfils the task for which it was designed. Determining the correctness of the computational output of big data bioinformatics software is especially difficult due to the large input space and complex algorithms involved. We discuss how state-of-the-art software testing techniques that are based on the idea of multiple executions, such as metamorphic testing, can be used to implement an effective bioinformatics quality assurance strategy. We hope this review will raise awareness of these critical issues in bioinformatics.
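Metamorphic testing can be illustrated in a few lines: without knowing the correct output for a large real-world input, one can still assert a necessary relation between two executions. The toy pipeline and the permutation relation below are illustrative choices, not a specific tool's test suite:

```python
import random
from collections import Counter

def count_reads_per_gene(reads):
    """Toy stand-in for a big-data counting pipeline."""
    return Counter(gene for gene, _seq in reads)

reads = [("BRCA1", "ACGT"), ("TP53", "GGCA"), ("BRCA1", "TTAC")] * 1000

baseline = count_reads_per_gene(reads)
shuffled = reads[:]
random.shuffle(shuffled)
followup = count_reads_per_gene(shuffled)

# Metamorphic relation: permuting the input reads must not change the counts.
assert baseline == followup, "metamorphic relation violated"
print("permutation-invariance relation holds:", dict(baseline))
```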
Methods for simulation-based analysis of fluid-structure interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barone, Matthew Franklin; Payne, Jeffrey L.
2005-10-01
Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations points to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
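The snapshot-POD/Galerkin recipe itself is compact; a sketch on synthetic snapshots and a stand-in linear operator (the aeroelastic case adds the coupling terms reviewed above) is:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_snap, r = 200, 40, 5

# Synthetic snapshots with hidden low-rank structure standing in for saved
# flow states; columns are states at different times/parameters.
modes_true = rng.standard_normal((n, r))
X = modes_true @ rng.standard_normal((r, n_snap)) \
    + 0.01 * rng.standard_normal((n, n_snap))

# POD basis: leading left singular vectors of the snapshot matrix.
U, s, _ = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :r]
print(f"retained singular-value energy: {s[:r].sum() / s.sum():.1%}")

# Galerkin projection of a (stand-in) linear operator dx/dt = A x:
A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))
A_r = Phi.T @ A @ Phi                    # r x r instead of n x n
print("reduced operator shape:", A_r.shape)
```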
Building a Data Science capability for USGS water research and communication
NASA Astrophysics Data System (ADS)
Appling, A.; Read, E. K.
2015-12-01
Interpreting and communicating water issues in an era of exponentially increasing information requires a blend of domain expertise, computational proficiency, and communication skills. The USGS Office of Water Information has established a Data Science team to meet these needs, providing challenging careers for diverse domain scientists and innovators in the fields of information technology and data visualization. Here, we detail the experience of building a Data Science capability as a bridging element between traditional water resources analyses and modern computing tools and data management techniques. This approach includes four major components: 1) building reusable research tools, 2) documenting data-intensive research approaches in peer reviewed journals, 3) communicating complex water resources issues with interactive web visualizations, and 4) offering training programs for our peers in scientific computing. These components collectively improve the efficiency, transparency, and reproducibility of USGS data analyses and scientific workflows.
Experimental and Computational Sonic Boom Assessment of Lockheed-Martin N+2 Low Boom Models
NASA Technical Reports Server (NTRS)
Cliff, Susan E.; Durston, Donald A.; Elmiligui, Alaa A.; Walker, Eric L.; Carter, Melissa B.
2015-01-01
Flight at speeds greater than the speed of sound is not permitted over land, primarily because of the noise and structural damage caused by sonic boom pressure waves of supersonic aircraft. Mitigation of sonic boom is a key focus area of the High Speed Project under NASA's Fundamental Aeronautics Program. The project is focusing on technologies to enable future civilian aircraft to fly efficiently with reduced sonic boom, engine and aircraft noise, and emissions. A major objective of the project is to improve both computational and experimental capabilities for design of low-boom, high-efficiency aircraft. NASA and industry partners are developing improved wind tunnel testing techniques and new pressure instrumentation to measure the weak sonic boom pressure signatures of modern vehicle concepts. In parallel, computational methods are being developed to provide rapid design and analysis of supersonic aircraft with improved meshing techniques that provide efficient, robust, and accurate on- and off-body pressures at several body lengths from vehicles with very low sonic boom overpressures. The maturity of these critical parallel efforts is necessary before low-boom flight can be demonstrated and commercial supersonic flight can be realized.
Wielpütz, Mark O.; Kauczor, Hans-Ulrich
2012-01-01
From the first measurements of the distribution of pulmonary blood flow using radioactive tracers by West and colleagues (J Clin Invest 40: 1–12, 1961) allowing gravitational differences in pulmonary blood flow to be described, the imaging of pulmonary blood flow has made considerable progress. The researcher employing modern imaging techniques now has the choice of several techniques, including magnetic resonance imaging (MRI), computerized tomography (CT), positron emission tomography (PET), and single photon emission computed tomography (SPECT). These techniques differ in several important ways: the resolution of the measurement, the type of contrast or tag used to image flow, and the amount of ionizing radiation associated with each measurement. In addition, the techniques vary in what is actually measured, whether it is capillary perfusion such as with PET and SPECT, or larger vessel information in addition to capillary perfusion such as with MRI and CT. Combined, these issues affect quantification and interpretation of data as well as the type of experiments possible using different techniques. The goal of this review is to give an overview of the techniques most commonly in use for physiological experiments along with the issues unique to each technique. PMID:22604884
Innovative Teaching Practice: Traditional and Alternative Methods (Challenges and Implications)
ERIC Educational Resources Information Center
Nurutdinova, Aida R.; Perchatkina, Veronika G.; Zinatullina, Liliya M.; Zubkova, Guzel I.; Galeeva, Farida T.
2016-01-01
The relevance of the present issue is caused by the strong need for alternative methods of learning a foreign language and the need for language training and retraining for modern professionals. The aim of the article is to identify the basic techniques and skills in using various modern techniques in the context of modern educational tasks. The…
A Case Study in Flight Computer Software Redesign
NASA Astrophysics Data System (ADS)
Shimoni, R.; Ben-Zur, Y.
2004-06-01
Historically, many real-time systems were developed using technologies that are now obsolete. There is a need for upgrading these systems, and a good development process is essential to achieve a well-designed software product. We, at MLM, a subsidiary of Israel Aircraft Industries, faced a similar situation in the Flight Mission Computer (Main Airborne Computer, MAC) of the SHAVIT launcher. It was necessary to upgrade the computer hardware, and we decided to update the software as well. During the last two years, we have designed and implemented a new version of the MAC software, to be run on a new and more powerful target platform. We undertook to create a new version of the MAC program using modern software development techniques. The process included object-oriented design using a CASE tool suitable for embedded real-time systems. We have partially implemented the ROPES development process. In this article we present the difficulties and challenges we faced in the software development process.
Web-based services for drug design and discovery.
Frey, Jeremy G; Bird, Colin L
2011-09-01
Reviews of the development of drug discovery through the 20th century recognised the importance of chemistry and increasingly bioinformatics, but had relatively little to say about the importance of computing and networked computing in particular. However, the design and discovery of new drugs is arguably the most significant single application of bioinformatics and cheminformatics to have benefitted from the increases in the range and power of the computational techniques since the emergence of the World Wide Web, commonly now referred to as simply 'the Web'. Web services have enabled researchers to access shared resources and to deploy standardized calculations in their search for new drugs. This article first considers the fundamental principles of Web services and workflows, and then explores the facilities and resources that have evolved to meet the specific needs of chem- and bio-informatics. This strategy leads to a more detailed examination of the basic components that characterise molecules and the essential predictive techniques, followed by a discussion of the emerging networked services that transcend the basic provisions, and the growing trend towards embracing modern techniques, in particular the Semantic Web. In the opinion of the authors, the issues that require community action are: increasing the amount of chemical data available for open access; validating the data as provided; and developing more efficient links between the worlds of cheminformatics and bioinformatics. The goal is to create ever better drug design services.
Orientational imaging of a single plasmonic nanoparticle using dark-field hyperspectral imaging
NASA Astrophysics Data System (ADS)
Mehta, Nishir; Mahigir, Amirreza; Veronis, Georgios; Gartia, Manas Ranjan
2017-08-01
The orientation of plasmonic nanostructures is an important feature in many nanoscale applications such as catalysis, biosensors, DNA interactions, protein detection, hotspots for surface-enhanced Raman spectroscopy (SERS), and fluorescence resonance energy transfer (FRET) experiments. However, due to the diffraction limit, it is challenging to obtain the exact orientation of a nanostructure using a standard optical microscope. Hyperspectral imaging microscopy is a state-of-the-art visualization technology that combines modern optics with hyperspectral imaging and a computer system to provide identification and quantitative spectral analysis of nano- and microscale structures. In this work, we initially use the transmitted dark-field imaging technique to locate a single nanoparticle on a glass substrate. We then employ the hyperspectral imaging technique at the same spot to investigate the orientation of the single nanoparticle. No special tagging or staining of the nanoparticle is needed, as is typically required in traditional microscopy techniques. Different orientations were identified by carefully understanding and calibrating the shift in spectral response from each orientation of similarly sized nanoparticles. Wavelengths were recorded between 300 nm and 900 nm. The orientations measured by hyperspectral microscopy were validated using finite-difference time-domain (FDTD) electrodynamics calculations and scanning electron microscopy (SEM) analysis. The combination of high-resolution nanometer-scale imaging techniques and modern numerical modeling capabilities thus enables a meaningful advance in our knowledge of manipulating and fabricating shaped nanostructures. This work will advance our understanding of the behavior of small nanoparticle clusters useful for sensing, nanomedicine, and surface sciences.
Master of Puppets: Cooperative Multitasking for In Situ Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-01-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukic, Zarija
2016-04-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. The design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.
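The control-flow idea, cooperative multitasking between a simulation and its analysis, can be modeled with Python generators; Henson itself couples compiled, position-independent executables, which this sketch does not attempt:

```python
def simulation(n_steps):
    """Toy time-stepping loop that yields control after every step."""
    state = 0.0
    for step in range(n_steps):
        state += 1.0                  # stand-in for advancing the physics
        yield step, state             # hand the in-memory state to the analysis

def analysis(sim):
    """Consumes states as they are produced, resuming the simulation each time."""
    for step, state in sim:
        if step % 2 == 0:             # in situ processing, no disk round-trip
            print(f"step {step}: mean field = {state:.1f}")

analysis(simulation(6))
```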
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magee, Glen I.
Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
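The paper's specific optimizations are not listed here, but a classic example of the kind used to accelerate Reed-Solomon arithmetic is replacing Galois-field multiplication with log/antilog table lookups. The field polynomial 0x11D below is the common choice and an assumption, not a detail taken from the AURA encoder:

```python
# Antilog/log tables for GF(2^8) under the primitive polynomial 0x11D.
GF_POLY = 0x11D
EXP = [0] * 512
LOG = [0] * 256
value = 1
for i in range(255):
    EXP[i] = value
    LOG[value] = i
    value <<= 1
    if value & 0x100:
        value ^= GF_POLY          # reduce modulo the field polynomial
for i in range(255, 512):
    EXP[i] = EXP[i - 255]         # doubled table removes a modulo from the hot loop

def gf_mul(a, b):
    """Field multiplication as three table lookups instead of shift-and-XOR."""
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

assert gf_mul(2, 2) == 4 and gf_mul(7, 1) == 7
```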
NASA Astrophysics Data System (ADS)
Sextos, Anastasios G.
2014-01-01
This paper presents the structure of an undergraduate course entitled 'programming techniques and the use of specialised software in structural engineering', which is offered to the fifth (final) year students of the Civil Engineering Department of Aristotle University Thessaloniki in Greece. The aim of this course is to demonstrate the use of new information technologies in the field of structural engineering and to teach modern programming and finite element simulation techniques that the students can in turn apply in both research and everyday design of structures. The course also focuses on the physical interpretation of structural engineering problems, so that the students become familiar with the concept of computational tools without losing perspective on the engineering problem studied. For this purpose, a wide variety of structural engineering problems are studied in class, involving structural statics, dynamics, earthquake engineering, design of reinforced concrete and steel structures, as well as data and information management. The main novelty of the course is that it is taught and examined solely in the computer laboratory, ensuring that each student can accomplish the prescribed 'hands-on' training on a dedicated computer, on a strict 1:1 student-to-computer ratio. Significant effort has also been made to ensure that modern educational techniques and tools are utilised to offer the course in an essentially paperless mode. This involves electronic educational material, video tutorials, student information in real time, and exams given and assessed electronically through a purpose-built, personalised electronic system. The positive feedback received from the students reveals that the concept of a paperless course is not only applicable in real academic conditions but is also a promising approach that significantly increases student productivity and engagement. The question, however, is whether such an investment in educational technology is indeed timely during an economic recession, when academic priorities are rapidly changing. In the light of this unfavourable and unstable financial environment, a critical overview of the strengths, the weaknesses, the opportunities and the threats of this effort is presented herein, hopefully contributing to the discussion on the future of higher education in a time of crisis.
A tale of two species: neural integration in zebrafish and monkeys
Joshua, Mati; Lisberger, Stephen G.
2014-01-01
Selection of a model organism creates a tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called “neural integration” that occurs in the brainstems of larval zebrafish, non-human primates, and species “in between”. We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for the details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. PMID:24797331
A tale of two species: Neural integration in zebrafish and monkeys.
Joshua, M; Lisberger, S G
2015-06-18
Selection of a model organism creates tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called "neural integration" that occurs in the brainstems of larval zebrafish, primates, and species "in between". We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.
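What "neural integration" means computationally can be shown with a single leaky unit whose positive feedback nearly cancels its leak, so that brief velocity commands are summed into a persistent signal; the parameters below are illustrative, not fits to either species:

```python
import numpy as np

dt, tau, w = 0.001, 0.01, 0.999   # time step, intrinsic leak, feedback weight
inputs = np.zeros(3000)
inputs[500:520] = 50.0            # brief velocity burst (a saccade-like command)

rate, trace = 0.0, []
for u in inputs:
    # dr/dt = (-(1 - w) r + u) / tau: the effective time constant tau / (1 - w)
    # is 10 ms without feedback but 10 s with w = 0.999, i.e., a memory.
    rate += dt / tau * (-(1.0 - w) * rate + u)
    trace.append(rate)

print(f"rate ~2.5 s after the burst: {trace[-1]:.1f} (decays in ~10 ms if w = 0)")
```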
Optimizing Engineering Tools Using Modern Ground Architectures
2017-12-01
…Traditional computing architectures were not capable of processing the data efficiently, or in some cases, could not process the… This thesis investigates how these modern computing architectures could be leveraged by industry and academia to improve the performance and capabilities of…
Computational study of noise in a large signal transduction network.
Intosalmi, Jukka; Manninen, Tiina; Ruohonen, Keijo; Linne, Marja-Leena
2011-06-21
Biochemical systems are inherently noisy due to the discrete reaction events that occur in a random manner. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. In order to understand the role of noise better, its quality must be studied in a quantitative manner. Computational analysis and modeling play an essential role in this demanding endeavor. We implemented a large nonlinear signal transduction network combining protein kinase C, mitogen-activated protein kinase, phospholipase A2, and β isoform of phospholipase C networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domain. In order to perform simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different kinds of low-frequency fluctuations. In the simulations, the power of noise decreased on all frequencies when the system volume was increased. We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to have an important role in biochemical systems and its properties can be numerically studied by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies. © 2011 Intosalmi et al; licensee BioMed Central Ltd.
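The direct-method SSA at the core of such studies is short; a minimal birth-death sketch with invented rate constants (the paper's network has many more reactions, but the loop is the same) is:

```python
import numpy as np

rng = np.random.default_rng(42)
k1, k2 = 10.0, 0.1          # production and degradation rates (made up)
x, t, t_end = 0, 0.0, 100.0
times, counts = [t], [x]

while t < t_end:
    a1, a2 = k1, k2 * x      # reaction propensities
    a0 = a1 + a2
    t += rng.exponential(1.0 / a0)          # waiting time to the next event
    if rng.random() < a1 / a0:              # choose which reaction fires
        x += 1                              # production
    else:
        x -= 1                              # degradation
    times.append(t)
    counts.append(x)

# Fluctuations around the mean k1/k2 are exactly the "noise" being analyzed.
print(f"final copy number: {x}, steady-state mean ~ {k1 / k2:.0f}")
```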
The application of LQR synthesis techniques to the turboshaft engine control problem
NASA Technical Reports Server (NTRS)
Pfeil, W. H.; De Los Reyes, G.; Bobula, G. A.
1984-01-01
A power turbine governor was designed for a recent-technology turboshaft engine coupled to a modern, articulated rotor system using Linear Quadratic Regulator (LQR) and Kalman Filter (KF) techniques. A linear, state-space model of the engine and rotor system was derived for six engine power settings from flight idle to maximum continuous. An integrator was appended to the fuel flow input to reduce the steady-state governor error to zero. Feedback gains were calculated for the system states at each power setting using the LQR technique. The main rotor tip speed state is not measurable, so a Kalman Filter of the rotor was used to estimate this state. The crossover of the system was increased to 10 rad/s compared to 2 rad/s for a current governor. Initial computer simulations with a nonlinear engine model indicate a significant decrease in power turbine speed variation with the LQR governor compared to a conventional governor.
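The LQR gain computation itself is a one-liner given the Riccati solver in SciPy; the matrices below are an illustrative two-state system, not the engine/rotor model from the paper:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])     # state dynamics (stand-in)
B = np.array([[0.0],
              [1.0]])            # control input distribution (stand-in)
Q = np.diag([10.0, 1.0])         # state weighting
R = np.array([[1.0]])            # control weighting

# Solve the continuous-time algebraic Riccati equation, then K = R^-1 B' P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

print("feedback gains:", K.ravel())
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```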
FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation
NASA Astrophysics Data System (ADS)
Veltri, M.
2016-09-01
This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue-critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue-critical areas can drive a simplification of the problem size, leading to a considerable improvement in solution time and model handling while allowing the critical areas to be processed in higher detail. The proposed technique is applied to a real-life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.
Cardone, Daniela; Merla, Arcangelo
2017-01-01
Thermal infrared imaging has been proposed, and is now used, as a tool for the non-contact and non-invasive computational assessment of human autonomic nervous activity and psychophysiological states. Thanks to a new generation of high sensitivity infrared thermal detectors and the development of computational models of the autonomic control of the facial cutaneous temperature, several autonomic variables can be computed through thermal infrared imaging, including localized blood perfusion rate, cardiac pulse rate, breath rate, and sudomotor and stress responses. In fact, all of these parameters impact on the control of the cutaneous temperature. The physiological information obtained through this approach could then be used to make inferences about a variety of psychophysiological or emotional states, as proved by the increasing number of psychophysiology or neuroscience studies that use thermal infrared imaging. This paper presents a review of the principal achievements of thermal infrared imaging in computational psychophysiology, focusing on the capability of the technique for providing ubiquitous and unwired monitoring of psychophysiological activity and affective states. It also presents a summary of the modern, up-to-date infrared sensor technology. PMID:28475155
Cardone, Daniela; Merla, Arcangelo
2017-05-05
Thermal infrared imaging has been proposed, and is now used, as a tool for the non-contact and non-invasive computational assessment of human autonomic nervous activity and psychophysiological states. Thanks to a new generation of high sensitivity infrared thermal detectors and the development of computational models of the autonomic control of the facial cutaneous temperature, several autonomic variables can be computed through thermal infrared imaging, including localized blood perfusion rate, cardiac pulse rate, breath rate, and sudomotor and stress responses. In fact, all of these parameters impact on the control of the cutaneous temperature. The physiological information obtained through this approach could then be used to make inferences about a variety of psychophysiological or emotional states, as proved by the increasing number of psychophysiology or neuroscience studies that use thermal infrared imaging. This paper presents a review of the principal achievements of thermal infrared imaging in computational psychophysiology, focusing on the capability of the technique for providing ubiquitous and unwired monitoring of psychophysiological activity and affective states. It also presents a summary of the modern, up-to-date infrared sensor technology.
Leveraging Social Computing for Personalized Crisis Communication using Social Media
Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli
2016-01-01
Introduction: The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters demonstrated, people tend to converge on social media during and following emergencies. Authorities can then use these media and other computational methods to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. Methods: The current review presents a conceptual framework for studying psychological aspects of crisis and risk communication through social media using social computing. Results: Advanced analytical tools can be integrated into the processes and objectives of crisis communication. The availability of these computational techniques can improve communication with the public through a process of Hyper-Targeted Crisis Communication. Discussion: The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media can facilitate more sensitive and personalized emergency communication. PMID:27092290
Automatic Blocking Of QR and LU Factorizations for Locality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yi, Q; Kennedy, K; You, H
2004-03-26
QR and LU factorizations for dense matrices are important linear algebra computations that are widely used in scientific applications. To efficiently perform these computations on modern computers, the factorization algorithms need to be blocked when operating on large matrices to effectively exploit the deep cache hierarchy prevalent in today's computer memory systems. Because both QR (based on Householder transformations) and LU factorization algorithms contain complex loop structures, few compilers can fully automate the blocking of these algorithms. Though linear algebra libraries such as LAPACK provide manually blocked implementations of these algorithms, automatically generating blocked versions of the computations offers additional benefits, such as automatic adaptation of different blocking strategies. This paper demonstrates how to apply an aggressive loop transformation technique, dependence hoisting, to produce efficient blockings for both QR and LU with partial pivoting. We present different blocking strategies that can be generated by our optimizer and compare the performance of auto-blocked versions with manually tuned versions in LAPACK, both using reference BLAS, ATLAS BLAS and native BLAS specially tuned for the underlying machine architectures.
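The flavor of the blocking being automated can be seen in a hand-blocked, right-looking LU; this numpy sketch omits the pivoting the paper handles and illustrates the locality idea, not the compiler-generated code:

```python
import numpy as np

def lu_blocked(A, nb=32):
    """Right-looking blocked LU without pivoting, factoring A in place so that
    most flops land in one large, cache-friendly trailing matrix multiply."""
    A = A.copy()
    n = A.shape[0]
    for k in range(0, n, nb):
        e = min(k + nb, n)
        # Unblocked factorization of the small diagonal block.
        for j in range(k, e):
            A[j + 1:e, j] /= A[j, j]
            A[j + 1:e, j + 1:e] -= np.outer(A[j + 1:e, j], A[j, j + 1:e])
        if e < n:
            L11 = np.tril(A[k:e, k:e], -1) + np.eye(e - k)
            U11 = np.triu(A[k:e, k:e])
            A[k:e, e:] = np.linalg.solve(L11, A[k:e, e:])          # U12 panel
            A[e:, k:e] = np.linalg.solve(U11.T, A[e:, k:e].T).T    # L21 panel
            A[e:, e:] -= A[e:, k:e] @ A[k:e, e:]                   # GEMM update
    return A

M = np.random.rand(96, 96) + 96 * np.eye(96)  # diagonally dominant: safe without pivoting
F = lu_blocked(M)
L, U = np.tril(F, -1) + np.eye(96), np.triu(F)
assert np.allclose(L @ U, M)
```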
NASA Astrophysics Data System (ADS)
Pollard, D.; Chang, W.; Haran, M.; Applegate, P.; DeConto, R.
2015-11-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~ 20 000 years. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree quite well with the more advanced techniques, but only for a large ensemble with full factorial parameter sampling. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds. Each run is extended 5000 years into the "future" with idealized ramped climate warming. In the majority of runs with reasonable scores, this produces grounding-line retreat deep into the West Antarctic interior, and the analysis provides sea-level-rise envelopes with well defined parametric uncertainty bounds.
A synthetic design environment for ship design
NASA Technical Reports Server (NTRS)
Chipman, Richard R.
1995-01-01
Rapid advances in computer science and information system technology have made possible the creation of synthetic design environments (SDE) that use virtual prototypes to increase the efficiency and agility of the design process. This next generation of computer-based design tools will rely heavily on simulation and advanced visualization techniques to enable integrated product and process teams to concurrently conceptualize, design, and test a product and its fabrication processes. This paper summarizes a successful demonstration of the feasibility of using a simulation-based design environment in the shipbuilding industry. As computer science and information science technologies have evolved, there have been many attempts to apply and integrate the new capabilities into systems that improve the design process. We see the benefits of those efforts in the abundance of highly reliable, technologically complex products and services in the modern marketplace. Furthermore, computer-based technologies have been so cost effective that the improvements embodied in modern products have been accompanied by lowered costs. Today the state of the art in computerized design has advanced so dramatically that the focus is no longer on merely improving design methodology; rather, the goal is to revolutionize the entire process by which complex products are conceived, designed, fabricated, tested, deployed, operated, maintained, refurbished, and eventually decommissioned. By concurrently addressing all life-cycle issues, the basic decision-making process within an enterprise will be improved dramatically, leading to new levels of quality, innovation, efficiency, and customer responsiveness. By integrating functions and people within an enterprise, such systems will change the fundamental way American industries are organized, creating companies that are more competitive, creative, and productive.
Ruppin, Eytan; Papin, Jason A; de Figueiredo, Luis F; Schuster, Stefan
2010-08-01
With the advent of modern omics technologies, it has become feasible to reconstruct (quasi-) whole-cell metabolic networks and characterize them in more and more detail. Computer simulations of the dynamic behavior of such networks are difficult due to a lack of kinetic data and to computational limitations. In contrast, network analysis based on appropriate constraints such as the steady-state condition (constraint-based analysis) is feasible and allows one to derive conclusions about the system's metabolic capabilities. Here, we review methods for the reconstruction of metabolic networks, modeling techniques such as flux balance analysis and elementary flux modes and current progress in their development and applications. Game-theoretical methods for studying metabolic networks are discussed as well. Copyright © 2010 Elsevier Ltd. All rights reserved.
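As a concrete illustration of constraint-based analysis, the sketch below poses flux balance analysis as a linear program with SciPy. The three-reaction toy network, its stoichiometric matrix, and the flux bounds are hypothetical placeholders, not a reconstructed network.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network (hypothetical): v0 imports metabolite A, v1 converts
# A -> B, v2 drains B into biomass. Steady state requires S v = 0.
S = np.array([
    [1, -1,  0],   # metabolite A: produced by v0, consumed by v1
    [0,  1, -1],   # metabolite B: produced by v1, consumed by v2
])
bounds = [(0, 10), (0, None), (0, None)]  # uptake v0 capped at 10 units

# linprog minimizes, so negate the objective to maximize biomass flux v2.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal flux distribution:", res.x)  # expected [10, 10, 10]
```

The same structure scales to genome-scale models, where S has thousands of reactions and the bounds encode uptake limits and reaction reversibility.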
Semantic segmentation of 3D textured meshes for urban scene analysis
NASA Astrophysics Data System (ADS)
Rouhani, Mohammad; Lafarge, Florent; Alliez, Pierre
2017-01-01
Classifying 3D measurement data has become a core problem in photogrammetry and 3D computer vision, since the rise of modern multiview geometry techniques, combined with affordable range sensors. We introduce a Markov Random Field-based approach for segmenting textured meshes generated via multi-view stereo into urban classes of interest. The input mesh is first partitioned into small clusters, referred to as superfacets, from which geometric and photometric features are computed. A random forest is then trained to predict the class of each superfacet as well as its similarity with the neighboring superfacets. Similarity is used to assign the weights of the Markov Random Field pairwise-potential and to account for contextual information between the classes. The experimental results illustrate the efficacy and accuracy of the proposed framework.
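A minimal sketch of the classification step, with synthetic placeholder features: a random forest is trained on per-superfacet descriptors and its class posteriors are converted into unary potentials for the Markov Random Field. The feature set, class labels, and sizes are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder descriptors standing in for the geometric and photometric
# statistics computed per superfacet (e.g. verticality, planarity,
# height above ground, mean colour).
X = rng.random((1000, 6))
y = rng.integers(0, 4, 1000)   # e.g. roof, facade, ground, vegetation

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
proba = clf.predict_proba(X)   # per-class posteriors per superfacet

# Negative log posteriors serve as MRF unary potentials; the pairwise
# terms would come from the learned inter-superfacet similarity.
unary = -np.log(np.clip(proba, 1e-9, None))
```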
Techniques of EMG signal analysis: detection, processing, classification and applications
Hussain, M.S.; Mohd-Yasin, F.
2006-01-01
Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers with a good understanding of the EMG signal and its analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
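As a concrete example of the processing stage, the sketch below computes a few classic time-domain EMG features commonly used for classification; the window, noise threshold, and feature set are illustrative choices, not a prescription from the paper.

```python
import numpy as np

def emg_features(x, thresh=0.01):
    """Classic time-domain features for one EMG analysis window."""
    mav = np.mean(np.abs(x))            # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))      # root mean square
    wl = np.sum(np.abs(np.diff(x)))     # waveform length
    # Zero crossings, counted only above a noise threshold.
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(np.diff(x)) > thresh))
    return mav, rms, wl, zc

# Example: a noisy burst standing in for a 256-sample EMG window.
rng = np.random.default_rng(1)
window = rng.normal(0, 0.1, 256) * np.hanning(256)
print(emg_features(window))
```

Feature vectors of this kind typically feed a downstream classifier for tasks such as grasp recognition or prosthetic hand control.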
Recent advances in imaging technologies in dentistry.
Shah, Naseem; Bansal, Nikhil; Logani, Ajay
2014-10-28
Dentistry has witnessed tremendous advances in all its branches over the past three decades. With these advances, more precise diagnostic tools, especially imaging methods, have become mandatory. Beyond simple intra-oral periapical X-rays, advanced imaging techniques such as computed tomography, cone beam computed tomography, magnetic resonance imaging, and ultrasound have found a place in modern dentistry. Changing from analogue to digital radiography has not only made the process simpler and faster but has also made image storage, manipulation (brightness/contrast, image cropping, etc.), and retrieval easier. Three-dimensional imaging has made the complex cranio-facial structures more accessible for examination, enabling early and accurate diagnosis of deep-seated lesions. This paper reviews current advances in imaging technology and their uses in different disciplines of dentistry. PMID:25349663
Flexible use and technique extension of logistics management
NASA Astrophysics Data System (ADS)
Xiong, Furong
2011-10-01
As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and expanded in China; this is the commonly recognized historical track of modern logistics development. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will develop rapidly and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of China's modern logistics management techniques, which has practical and guiding significance.
Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel
2017-06-15
Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
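For flavour, here is a minimal sketch of two of the psychometric tools named above: the two-parameter logistic item response model and the maximum-information rule often used in computerised adaptive testing to pick the next item. The item parameters and ability estimate are made-up values.

```python
import numpy as np

def p_correct(theta, a, b):
    """Two-parameter logistic IRT model: P(correct | ability theta),
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1 - p)

a = np.array([1.2, 0.8, 2.0])    # illustrative item discriminations
b = np.array([-1.0, 0.0, 0.5])   # illustrative item difficulties
theta_hat = 0.3                  # current ability estimate
print("next item:", np.argmax(item_information(theta_hat, a, b)))
```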
Corradini, Stefanie; Ballhausen, Hendrik; Weingandt, Helmut; Freislederer, Philipp; Schönecker, Stephan; Niyazi, Maximilian; Simonetto, Cristoforo; Eidemüller, Markus; Ganswindt, Ute; Belka, Claus
2018-03-01
Modern breast cancer radiotherapy techniques, such as respiratory-gated radiotherapy in deep-inspiration breath-hold (DIBH) or volumetric-modulated arc radiotherapy (VMAT), have been shown to reduce the high dose exposure of the heart in left-sided breast cancer. The aim of the present study was to comparatively estimate the excess relative and absolute risks of radiation-induced secondary lung cancer and ischemic heart disease for different modern radiotherapy techniques. Four different treatment plans were generated for ten computed tomography data sets of patients with left-sided breast cancer, using either three-dimensional conformal radiotherapy (3D-CRT) or VMAT, in free-breathing (FB) or DIBH. Dose-volume histograms were used for organ equivalent dose (OED) calculations using linear, linear-exponential, and plateau models for the lung. A linear model was applied to estimate the long-term risk of ischemic heart disease, as motivated by epidemiologic data. Excess relative risk (ERR) and 10-year excess absolute risk (EAR) for radiation-induced secondary lung cancer and ischemic heart disease were estimated for different representative baseline risks. The DIBH maneuver resulted in a significant reduction of the ERR and estimated 10-year excess absolute risk for major coronary events compared to FB in 3D-CRT plans (p = 0.04). In VMAT plans, the mean predicted risk reduction through DIBH was less pronounced and not statistically significant (p = 0.44). The risk of radiation-induced secondary lung cancer was mainly influenced by the radiotherapy technique, with no beneficial effect through DIBH. VMAT plans correlated with an increase in 10-year EAR for radiation-induced lung cancer as compared to 3D-CRT plans (DIBH p = 0.007; FB p = 0.005, respectively). However, the EARs were affected more strongly by nonradiation-associated risk factors, such as smoking, than by the choice of treatment technique. The results indicate that 3D-CRT plans in DIBH pose the lowest risk for both major coronary events and secondary lung cancer.
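The organ-equivalent-dose step can be sketched as follows from a differential dose-volume histogram; the model forms follow the common OED formalism, and the parameter values and toy DVH are illustrative assumptions rather than this study's fitted inputs.

```python
import numpy as np

def oed(dose_bins, volume_fractions, model="linear", alpha=0.044, delta=0.139):
    """Organ equivalent dose from a differential DVH (doses in Gy).
    alpha and delta are organ-specific model parameters; the defaults
    here are placeholders, not values taken from this study."""
    d = np.asarray(dose_bins, float)
    v = np.asarray(volume_fractions, float)
    if model == "linear":
        return np.sum(v * d)                        # reduces to mean dose
    if model == "linear-exponential":
        return np.sum(v * d * np.exp(-alpha * d))   # cell-kill damping
    if model == "plateau":
        return np.sum(v * (1 - np.exp(-delta * d)) / delta)
    raise ValueError(model)

# Toy DVH: 40% of the organ at 2 Gy, 10% at 10 Gy, half unirradiated.
print(oed([2, 10, 0], [0.4, 0.1, 0.5], model="linear-exponential"))
```

A risk coefficient derived from epidemiological data, multiplied by the OED and adjusted for baseline risk, then yields the excess risk estimate.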
Simulation training tools for nonlethal weapons using gaming environments
NASA Astrophysics Data System (ADS)
Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry
2006-05-01
Modern simulation techniques have a growing role in evaluating new technologies and developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstrating concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real time. Computer games now allow multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherently intensive calculations required for complex simulation scenarios. The main components of the leading game engines have been released for user modification, enabling game enthusiasts and amateur programmers to advance the state of the art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users learn what a device does and does not do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game engine, for the purpose of evaluating the latest non-lethal weapon devices and for developing training tools for such devices.
The place of human psychophysics in modern neuroscience.
Read, J C A
2015-06-18
Human psychophysics is the quantitative measurement of our own perceptions. In essence, it is simply a more sophisticated version of what humans have done since time immemorial: noticed and reflected upon what we can see, hear, and feel. In the 21st century, when hugely powerful techniques are available that enable us to probe the innermost structure and function of nervous systems, is human psychophysics still relevant? I argue that it is, and that in combination with other techniques, it will continue to be a key part of neuroscience for the foreseeable future. I discuss these points in detail using the example of binocular stereopsis, where human psychophysics, in combination with physiology and computational vision, has made a substantial contribution. Copyright © 2014 The Author. Published by Elsevier Ltd. All rights reserved.
[Imaging and the new fabric of the human body].
Moulin, Anne-Marie; Baulieu, Jean-Louis
2010-11-01
A short historical survey recalls the main techniques of medical imaging, based on modern physico-chemistry and computer science. Imagery has provided novel visions of the inside of the body, which are not self-evident but require a training of the gaze. Yet these new images have permeated the contemporary mind and inspired esthetic ventures. The popularity of these images may be related to their ambiguous status, between real and virtual. The images, reminiscent of Vesalius' De humani corporis fabrica, crosslink art, science and society in a specific way: what role will they play in the "empowerment" of the patient of tomorrow?
NASA Technical Reports Server (NTRS)
Kim, Jonnathan H.
1995-01-01
Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automation tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor- and time-intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).
Linear Calibration of Radiographic Mineral Density Using Video-Digitizing Methods
NASA Technical Reports Server (NTRS)
Martin, R. Bruce; Papamichos, Thomas; Dannucci, Greg A.
1990-01-01
Radiographic images can provide quantitative as well as qualitative information if they are subjected to densitometric analysis. Using modern video-digitizing techniques, such densitometry can be readily accomplished using relatively inexpensive computer systems. However, such analyses are made more difficult by the fact that the density values read from the radiograph have a complex, nonlinear relationship to bone mineral content. This article derives the relationship between these variables from the nature of the intermediate physical processes, and presents a simple mathematical method for obtaining a linear calibration function using a step wedge or other standard.
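A minimal sketch of the calibration idea, assuming a single-exponential film response and made-up step-wedge readings: a log transform linearizes the gray-level/thickness relation so that ordinary least squares yields the calibration function.

```python
import numpy as np

# Hypothetical step-wedge calibration: known mineral-equivalent step
# thicknesses (mm) and the gray levels read by the video digitizer.
thickness = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
gray = np.array([212.0, 174.0, 143.0, 118.0, 97.0, 80.0])

# Assume gray = G0 * exp(-k * t); taking logs gives a straight line
# that a least-squares fit can handle.
slope, intercept = np.polyfit(thickness, np.log(gray), 1)
k, lnG0 = -slope, intercept

def gray_to_thickness(g):
    """Map a digitized gray level to mineral-equivalent thickness (mm)."""
    return (lnG0 - np.log(g)) / k

print(gray_to_thickness(gray))  # should roughly recover the step values
```

The article's actual derivation accounts for the full chain of intermediate physical processes, but the step-wedge fit plays the same role of anchoring gray levels to known mineral equivalents.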
[The development of hospital medical supplies information management system].
Cao, Shaoping; Gu, Hongqing; Zhang, Peng; Wang, Qiang
2010-05-01
To manage medical materials information with modern computer technology, in order to improve the efficiency of medical supplies consumption and hospital logistics, and to develop a new technological approach to hospital management and material support. C#.NET and Java were used to develop the hospital material management information system, establishing the various management modules, generating statistical reports, and standardizing operating procedures. The system is convenient, functionally complete, and provides fluent statistical functions. It allows staff to grasp dynamic information on hospital supplies at any time, serving as a modern and effective tool for hospital materials management.
A genetic algorithm for replica server placement
NASA Astrophysics Data System (ADS)
Eslami, Ghazaleh; Toroghi Haghighat, Abolfazl
2012-01-01
Modern distribution systems use replication to improve the communication delay experienced by their clients. Several techniques have been developed for web server replica placement. One previous approach is the Greedy algorithm proposed by Qiu et al., which requires knowledge of the network topology. In this paper, we first introduce a genetic algorithm for web server replica placement. Second, we compare our algorithm with the Greedy algorithm of Qiu et al. and with an optimal algorithm. We found that our approach achieves better results than the Greedy algorithm, although its computational time is higher.
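A toy sketch of the genetic-algorithm idea, under simplifying assumptions: a chromosome is a set of k replica sites, fitness is the total client-to-nearest-replica latency, and the crossover/mutation/repair operators are generic choices, not necessarily those of the paper.

```python
import random

def fitness(placement, demand, dist):
    """Total weighted latency: each client uses its nearest replica."""
    return sum(w * min(dist[c][s] for s in placement)
               for c, w in enumerate(demand))

def ga_placement(n_sites, k, demand, dist, pop_size=30, generations=100):
    pop = [set(random.sample(range(n_sites), k)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, demand, dist))
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            genes = list(a | b)
            child = set(random.sample(genes, min(k, len(genes))))  # crossover
            if random.random() < 0.2:                              # mutation
                child.discard(random.choice(list(child)))
            while len(child) < k:                                  # repair
                child.add(random.randrange(n_sites))
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: fitness(p, demand, dist))

# Example: 12 candidate sites on a line, 3 replicas, uniform demand.
n = 12
dist = [[abs(i - j) for j in range(n)] for i in range(n)]
print(sorted(ga_placement(n, 3, [1.0] * n, dist)))
```

Note that nothing here requires explicit knowledge of the network topology beyond a client-to-site delay estimate, which is the claimed advantage over the Greedy approach.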
Yang, Guang-Fu; Huang, Xiaoqin
2006-01-01
Over forty years have elapsed since Hansch and Fujita published their pioneering work on quantitative structure-activity relationships (QSAR). Following the introduction of Comparative Molecular Field Analysis (CoMFA) by Cramer in 1988, other three-dimensional QSAR methods have been developed. Currently, the combination of classical QSAR with other computational techniques at the three-dimensional level is of great interest and widely used in modern drug discovery and design. Over the last several decades, a number of different methodologies incorporating a range of molecular descriptors and different statistical regression approaches have been proposed and successfully applied in the development of new drugs. QSAR methods have thus proven indispensable, not only for the reliable prediction of specific properties of new compounds, but also for helping to elucidate the possible molecular mechanisms of receptor-ligand interactions. Here, we review recent developments in QSAR and their applications in rational drug design, focusing on the reasonable selection of novel molecular descriptors and the construction of predictive QSAR models with the help of advanced computational techniques.
Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics
NASA Astrophysics Data System (ADS)
Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.
2006-06-01
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.
State of the art in treatment of facial paralysis with temporalis tendon transfer.
Sidle, Douglas M; Simon, Patrick
2013-08-01
Temporalis tendon transfer is a technique for dynamic facial reanimation. Since its inception, nearly 80 years ago, it has undergone a wealth of innovation to produce the modern operation. The purpose of this review is to update the literature as to the current techniques and perioperative management of patients undergoing temporalis tendon transfer. The modern technique focuses on the minimally invasive approaches and aesthetic refinements to enhance the final product of the operation. The newest techniques as well as preoperative assessment and postoperative rehabilitation are discussed. When temporalis tendon transfer is indicated for facial reanimation, the modern operation offers a refined technique that produces an aesthetically acceptable outcome. Preoperative smile assessment and postoperative smile rehabilitation are necessary and are important adjuncts to a successful operation.
Securing Secrets and Managing Trust in Modern Computing Applications
ERIC Educational Resources Information Center
Sayler, Andy
2016-01-01
The amount of digital data generated and stored by users increases every day. In order to protect this data, modern computing systems employ numerous cryptographic and access control solutions. Almost all of such solutions, however, require the keeping of certain secrets as the basis of their security models. How best to securely store and control…
ERIC Educational Resources Information Center
Zhamanov, Azamat; Yoo, Seong-Moo; Sakhiyeva, Zhulduz; Zhaparov, Meirambek
2018-01-01
Students nowadays are difficult to motivate with traditional teaching methods. Computers, smartphones, tablets and other smart devices distract students' attention. Nevertheless, those smart devices can be used as auxiliary tools for modern teaching methods. In this article, the authors review two popular modern teaching methods:…
Contemporary machine learning: techniques for practitioners in the physical sciences
NASA Astrophysics Data System (ADS)
Spears, Brian
2017-10-01
Machine learning is the science of using computers to find relationships in data without explicitly knowing or programming those relationships in advance. Often without realizing it, we employ machine learning every day as we use our phones or drive our cars. Over the last few years, machine learning has found increasingly broad application in the physical sciences. This most often involves building a model relationship between a dependent, measurable output and an associated set of controllable, but complicated, independent inputs. The methods are applicable both to experimental observations and to databases of simulated output from large, detailed numerical simulations. In this tutorial, we will present an overview of current tools and techniques in machine learning - a jumping-off point for researchers interested in using machine learning to advance their work. We will discuss supervised learning techniques for modeling complicated functions, beginning with familiar regression schemes, then advancing to more sophisticated decision trees, modern neural networks, and deep learning methods. Next, we will cover unsupervised learning and techniques for reducing the dimensionality of input spaces and for clustering data. We'll show example applications from both magnetic and inertial confinement fusion. Along the way, we will describe methods for practitioners to help ensure that their models generalize from their training data to as-yet-unseen test data. We will finally point out some limitations to modern machine learning and speculate on some ways that practitioners from the physical sciences may be particularly suited to help. This work was performed by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
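In that spirit, here is a minimal sketch of the supervised-learning workflow described above, with a held-out test set as the basic guard that a model generalizes beyond its training data. The synthetic input-output relationship stands in for experimental or simulation data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

# Hypothetical surrogate-modeling setup: five controllable inputs,
# one measurable output with a nonlinear dependence plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 5))
y = np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=2000)

# Hold out data the model never sees during training.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print("train R^2:", r2_score(y_tr, model.predict(X_tr)))
print("test  R^2:", r2_score(y_te, model.predict(X_te)))  # the honest number
```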
On-line range images registration with GPGPU
NASA Astrophysics Data System (ADS)
Będkowski, J.; Naruniec, J.
2013-03-01
This paper concerns the implementation of algorithms for two important aspects of modern 3D data processing: data registration and segmentation. The solution proposed for the first topic is based on 3D space decomposition, while the latter relies on image processing and local neighbourhood search. Data processing is implemented using NVIDIA compute unified device architecture (CUDA) parallel computation. The result of the segmentation is a coloured map in which different colours correspond to different objects, such as walls, floor and stairs. The research is related to the problem of collecting 3D data with an RGB-D camera mounted on a rotated head, to be used in mobile robot applications. The data registration algorithm is designed for on-line processing. The iterative closest point (ICP) approach is chosen as the registration method. Computations are based on a parallel fast nearest-neighbour search. This procedure decomposes 3D space into cubic buckets and, therefore, the matching time is deterministic. The first segmentation technique uses accelerometers integrated with the RGB-D sensor to obtain rotation compensation and an image processing method for defining prerequisites of the known categories. The second technique uses the adapted nearest-neighbour search procedure to obtain normal vectors for each range point.
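A CPU-side sketch of the cubic-bucket decomposition that bounds the nearest-neighbour cost during ICP matching; the cell size is a tunable assumption, and the CUDA version would evaluate many such queries in parallel.

```python
import numpy as np
from collections import defaultdict

def build_buckets(points, cell):
    """Hash each 3D point into a cubic bucket of side `cell`."""
    grid = defaultdict(list)
    for i, p in enumerate(points):
        grid[tuple((p // cell).astype(int))].append(i)
    return grid

def nearest(query, points, grid, cell):
    """Search only the query's bucket and its 26 neighbours, so the
    per-match cost is bounded (the deterministic-time property).
    Returns -1 if no point lies within the neighbouring buckets."""
    cx, cy, cz = (query // cell).astype(int)
    best, best_d = -1, np.inf
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for i in grid.get((cx + dx, cy + dy, cz + dz), ()):
                    d = np.sum((points[i] - query) ** 2)
                    if d < best_d:
                        best, best_d = i, d
    return best

pts = np.random.rand(10000, 3)
grid = build_buckets(pts, cell=0.05)
print(nearest(np.array([0.5, 0.5, 0.5]), pts, grid, cell=0.05))
```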
Fast ray-tracing of human eye optics on Graphics Processing Units.
Wei, Qi; Patkar, Saket; Pai, Dinesh K
2014-05-01
We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective for modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shape, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth-of-field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show application of the technique to patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Real-time multiple human perception with color-depth cameras on a mobile robot.
Zhang, Hao; Reardon, Christopher; Parker, Lynne E
2013-10-01
The ability to perceive humans is an essential requirement for safe and efficient human-robot interaction. In real-world applications, the need for a robot to interact in real time with multiple humans in a dynamic, 3-D environment presents a significant challenge. The recent availability of commercial color-depth cameras allows for the creation of a system that makes use of the depth dimension, enabling a robot to observe its environment and perceive in 3-D space. Here we present a system for 3-D multiple human perception in real time from a moving robot equipped with a color-depth camera and a consumer-grade computer. Our approach reduces computation time to achieve real-time performance through a unique combination of new ideas and established techniques. We remove the ground and ceiling planes from the 3-D point cloud input to separate candidate point clusters. We introduce a novel information concept, depth of interest, which we use to identify candidates for detection and thereby avoid the computationally expensive scanning-window methods of other approaches. We utilize a cascade of detectors to distinguish humans from objects, in which we make intelligent reuse of intermediary features in successive detectors to improve computation. Because of the high computational cost of some methods, we represent our candidate tracking algorithm with a decision directed acyclic graph, which allows us to use the most computationally intense techniques only where necessary. We detail the successful implementation of our novel approach on a mobile robot and examine its performance in scenarios with real-world challenges, including occlusion, robot motion, nonupright humans, humans leaving and reentering the field of view (i.e., the reidentification challenge), and human-object and human-human interaction. We conclude with the observation that by incorporating depth information and using modern techniques in new ways, we are able to create an accurate system for real-time 3-D perception of humans by a mobile robot.
Hathaway, John C.
1971-01-01
The purpose of the data file presented below is twofold: the first purpose is to make available in printed form the basic data relating to the samples collected as part of the joint U.S. Geological Survey - Woods Hole Oceanographic Institution program of study of the Atlantic continental margin of the United States; the second purpose is to maintain these data in a form that is easily retrievable by modern computer methods. With the data in such form, repeated manual transcription for statistical or similar mathematical treatment becomes unnecessary. Manual plotting of information or derivatives from the information may also be eliminated. Not only is handling of data by the computer considerably faster than manual techniques, but a fruitful source of errors, transcription mistakes, is eliminated.
NASA Astrophysics Data System (ADS)
Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.
2016-05-01
In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
Real-time colouring and filtering with graphics shaders
NASA Astrophysics Data System (ADS)
Vohl, D.; Fluke, C. J.; Barnes, D. G.; Hassan, A. H.
2017-11-01
Despite the popularity of the Graphics Processing Unit (GPU) for general purpose computing, one should not forget about the practicality of the GPU for fast scientific visualization. As astronomers have increasing access to three-dimensional (3D) data from instruments and facilities like integral field units and radio interferometers, visualization techniques such as volume rendering offer means to quickly explore spectral cubes as a whole. As most 3D visualization techniques have been developed in fields of research like medical imaging and fluid dynamics, many transfer functions are not optimal for astronomical data. We demonstrate how transfer functions and graphics shaders can be exploited to provide new astronomy-specific explorative colouring methods. We present 12 shaders, including four novel transfer functions specifically designed to produce intuitive and informative 3D visualizations of spectral cube data. We compare their utility to classic colour mapping. The remaining shaders highlight how common computation like filtering, smoothing and line ratio algorithms can be integrated as part of the graphics pipeline. We discuss how this can be achieved by utilizing the parallelism of modern GPUs along with a shading language, letting astronomers apply these new techniques at interactive frame rates. All shaders investigated in this work are included in the open source software shwirl (Vohl 2017).
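To illustrate what a transfer function does, here is a CPU stand-in for the per-sample mapping a fragment shader would evaluate: scalar voxel values to RGBA, with opacity rising with intensity so faint emission stays translucent. The colour ramp is a generic 'hot'-style example, not one of the twelve shaders from the paper.

```python
import numpy as np

def hot_transfer(values, vmin, vmax):
    """Map scalar voxel values to RGBA with a 'hot' colour ramp."""
    t = np.clip((values - vmin) / (vmax - vmin), 0.0, 1.0)
    r = np.clip(3 * t, 0, 1)       # red ramps up first
    g = np.clip(3 * t - 1, 0, 1)   # then green
    b = np.clip(3 * t - 2, 0, 1)   # then blue
    a = t ** 2                     # quadratic opacity keeps faint data see-through
    return np.stack([r, g, b, a], axis=-1)

voxels = np.random.rand(64, 64, 64)   # stand-in for a spectral cube
rgba = hot_transfer(voxels, voxels.min(), voxels.max())
```

On the GPU, the same few arithmetic operations run per sample inside the volume-rendering loop, which is what makes interactive frame rates possible.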
Improving robustness and computational efficiency using modern C++
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paterno, M.; Kowalkowski, J.; Green, C.
2014-01-01
For nearly two decades, the C++ programming language has been the dominant programming language for experimental HEP. The publication of ISO/IEC 14882:2011, the current version of the international standard for the C++ programming language, makes available a variety of language and library facilities for improving the robustness, expressiveness, and computational efficiency of C++ code. However, much of the C++ written by the experimental HEP community does not take advantage of the features of the language to obtain these benefits, either due to lack of familiarity with these features or concern that these features must somehow be computationally inefficient. In this paper, we address some of the features of modern C++, and show how they can be used to make programs that are both robust and computationally efficient. We compare and contrast simple yet realistic examples of some common implementation patterns in C, currently-typical C++, and modern C++, and show (when necessary, down to the level of generated assembly language code) the quality of the executable code produced by recent C++ compilers, with the aim of allowing the HEP community to make informed decisions on the costs and benefits of the use of modern C++.
Sawall, Mathias; Kubis, Christoph; Börner, Armin; Selent, Detlef; Neymeyr, Klaus
2015-09-03
Modern computerized spectroscopic instrumentation can produce high volumes of spectroscopic data. Such accurate measurements raise special computational challenges for multivariate curve resolution techniques, since pure component factorizations are often solved via constrained minimization problems. The computational costs of these calculations grow rapidly with increased time or frequency resolution of the spectral measurements. The key idea of this paper is to define, for the given high-dimensional spectroscopic data, a sequence of coarsened subproblems with reduced resolutions. The multiresolution algorithm first computes a pure component factorization for the coarsest problem with the lowest resolution. The factorization results are then used as initial values for the next problem with a higher resolution. Good initial values result in a fast solution on the next refined level. This procedure is repeated, and finally a factorization is determined for the highest level of resolution. The described multiresolution approach allows a considerable convergence acceleration. The computational procedure is analyzed and tested on experimental spectroscopic data from the rhodium-catalyzed hydroformylation together with various soft and hard models. Copyright © 2015 Elsevier B.V. All rights reserved.
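A schematic of the coarse-to-fine loop, with `factorize` standing in for any constrained MCR solver; the solver itself, the coarsening-by-subsampling scheme, and the interpolation are illustrative assumptions.

```python
import numpy as np

def multiresolution_mcr(D, factorize, levels=4):
    """Coarse-to-fine pure-component factorization of data matrix D.

    `factorize(D, init)` is a placeholder for a constrained solver
    returning factors (C, S) with D ~= C @ S.
    """
    # Coarsen by keeping every 2**k-th spectral channel; solve the
    # coarsest (cheapest) problem first.
    pyramid = [D[:, ::2 ** k] for k in range(levels)][::-1]
    C = S = None
    for Dk in pyramid:
        if S is not None:
            # Interpolate the previous spectra onto the finer channel
            # grid so they can warm-start the next optimization.
            old = np.linspace(0, 1, S.shape[1])
            new = np.linspace(0, 1, Dk.shape[1])
            S = np.array([np.interp(new, old, s) for s in S])
        C, S = factorize(Dk, init=(C, S))
    return C, S
```

Because each refined level starts close to its solution, most of the optimization effort is spent on the small coarse problems, which is the source of the reported acceleration.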
van der Ploeg, Tjeerd; Austin, Peter C; Steyerberg, Ewout W
2014-12-22
Modern modelling techniques may potentially provide more accurate predictions of binary outcomes than classical techniques. We aimed to study the predictive performance of different modelling techniques in relation to the effective sample size ("data hungriness"). We performed simulation studies based on three clinical cohorts: 1282 patients with head and neck cancer (with 46.9% 5 year survival), 1731 patients with traumatic brain injury (22.3% 6 month mortality) and 3181 patients with minor head injury (7.6% with CT scan abnormalities). We compared three relatively modern modelling techniques: support vector machines (SVM), neural nets (NN), and random forests (RF) and two classical techniques: logistic regression (LR) and classification and regression trees (CART). We created three large artificial databases with 20 fold, 10 fold and 6 fold replication of subjects, where we generated dichotomous outcomes according to different underlying models. We applied each modelling technique to increasingly larger development parts (100 repetitions). The area under the ROC-curve (AUC) indicated the performance of each model in the development part and in an independent validation part. Data hungriness was defined by plateauing of AUC and small optimism (difference between the mean apparent AUC and the mean validated AUC <0.01). We found that a stable AUC was reached by LR at approximately 20 to 50 events per variable, followed by CART, SVM, NN and RF models. Optimism decreased with increasing sample sizes and the same ranking of techniques. The RF, SVM and NN models showed instability and a high optimism even with >200 events per variable. Modern modelling techniques such as SVM, NN and RF may need over 10 times as many events per variable to achieve a stable AUC and a small optimism than classical modelling techniques such as LR. This implies that such modern techniques should only be used in medical prediction problems if very large data sets are available.
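A minimal sketch of the optimism diagnostic used here: fit on a development sample, compare apparent and validated AUC, and watch the gap shrink as the development sample (and thus events per variable) grows. The synthetic logistic data-generating model is an illustrative assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def optimism(model, X_dev, y_dev, X_val, y_val):
    """Optimism = apparent AUC (development data) - validated AUC."""
    model.fit(X_dev, y_dev)
    auc_app = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
    auc_val = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    return auc_app - auc_val

rng = np.random.default_rng(0)
X = rng.normal(size=(20000, 10))
y = (X @ rng.normal(size=10) + rng.logistic(size=20000) > 0).astype(int)

# Growing development sets against a fixed validation set.
for n in (200, 1000, 5000):
    model = LogisticRegression(max_iter=1000)
    print(n, round(optimism(model, X[:n], y[:n],
                            X[10000:], y[10000:]), 3))
```

Swapping in a random forest or SVM for the logistic regression reproduces the paper's qualitative finding: flexible models need far larger samples before their optimism becomes negligible.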
Computing in Hydraulic Engineering Education
NASA Astrophysics Data System (ADS)
Duan, J. G.
2011-12-01
Civil engineers, pioneers of our civilization, are rarely perceived as leaders and innovators in modern society because the profession has lagged in technology innovation. This crisis has resulted in the decline of the prestige of the civil engineering profession, reduction of federal funding for deteriorating infrastructure, and problems with attracting the most talented high-school students. Infusing cutting-edge computer technology and stimulating creativity and innovation are therefore the critical challenges for civil engineering education. To better prepare our graduates to innovate, this paper discusses the adoption of problem-based collaborative learning techniques and the integration of civil engineering computing into a traditional civil engineering curriculum. Three interconnected courses: Open Channel Flow, Computational Hydraulics, and Sedimentation Engineering, were developed with emphasis on computational simulations. In Open Channel Flow, the focus is on principles of free-surface flow and the application of computational models. This prepares students for the second course, Computational Hydraulics, which introduces the fundamental principles of computational hydraulics, including finite difference and finite element methods. This course complements Open Channel Flow by providing students with an in-depth understanding of computational methods. The third course, Sedimentation Engineering, covers the fundamentals of sediment transport and river engineering, so students can apply the knowledge and programming skills gained from previous courses to develop computational models for simulating sediment transport. These courses effectively equip students with important skills and knowledge for completing thesis and dissertation research.
NASA Astrophysics Data System (ADS)
Rabemananajara, Tanjona R.; Horowitz, W. A.
2017-09-01
To make predictions for particle physics processes, one has to compute the cross section of the specific process, as this is what can be measured in a modern collider experiment such as the Large Hadron Collider (LHC) at CERN. Computing scattering amplitudes with conventional Feynman methods has proven extremely difficult. Calculations with Feynman diagrams are realizations of a perturbative expansion: one must set up all topologically distinct diagrams for a given process up to a given order in the coupling of the theory. This quickly makes the calculation of scattering amplitudes intractable. Fortunately, one can simplify calculations by considering the helicity amplitudes for Maximally Helicity Violating (MHV) configurations. This can be extended to the formalism of on-shell recursion, which derives, in a much simpler way, the expression for a higher-order scattering amplitude from lower orders.
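For reference, the tree-level MHV amplitude with negative-helicity gluons i and j takes the famously compact Parke-Taylor form, quoted here (up to coupling, colour, and momentum-conservation factors) as a standard result:

\[
A_n\bigl(1^+,\dots,i^-,\dots,j^-,\dots,n^+\bigr)
  = \frac{\langle i\,j\rangle^{4}}{\langle 1\,2\rangle\,\langle 2\,3\rangle\cdots\langle n\,1\rangle},
\]

where \(\langle a\,b\rangle\) denotes the spinor-helicity inner product of massless momenta a and b. A single rational expression thus replaces a factorially growing set of Feynman diagrams, and on-shell recursion builds higher-point amplitudes from such building blocks.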
Global Magnetohydrodynamic Simulation Using High Performance FORTRAN on Parallel Computers
NASA Astrophysics Data System (ADS)
Ogino, T.
High Performance Fortran (HPF) is one of the modern, common techniques for achieving high-performance parallel computation. We translated a 3-dimensional magnetohydrodynamic (MHD) simulation code of the Earth's magnetosphere from VPP Fortran to HPF/JA on the Fujitsu VPP5000/56 vector-parallel supercomputer; the MHD code was fully vectorized and fully parallelized in VPP Fortran. The overall performance and capability of the HPF MHD code were shown to be almost comparable to those of the VPP Fortran version. A 3-dimensional global MHD simulation of the Earth's magnetosphere was performed at a speed of over 400 Gflops, with an efficiency of 76.5% on the VPP5000/56 in vector and parallel computation, permitting comparison with catalog values. We conclude that fluid and MHD codes that are fully vectorized and fully parallelized in VPP Fortran can be translated with relative ease to HPF/JA, and a code in HPF/JA may be expected to perform comparably to the same code written in VPP Fortran.
Costa, Paulo R; Caldas, Linda V E
2002-01-01
This work presents the development and evaluation of modern techniques for calculating radiation protection barriers in clinical radiographic facilities. Our methodology uses realistic primary and scattered spectra. The primary spectra were computer simulated using a waveform generalization and a semiempirical model (the Tucker-Barnes-Chakraborty model). The scattered spectra were obtained from published data. An analytical function was used to produce attenuation curves for polychromatic radiation at specified kVp, waveform, and filtration. The results of this analytical function are given in ambient dose equivalent units. The attenuation curves were obtained by applying Archer's model to the computer simulation data. The parameters giving the best fit to the model, using primary and secondary radiation data from different radiographic procedures, were determined, resulting in an optimized shielding-calculation model for any radiographic room. The shielding costs were about 50% lower than those calculated using the traditional method based on Report No. 49 of the National Council on Radiation Protection and Measurements.
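Archer's three-parameter broad-beam transmission model, to which the attenuation curves are fit, can be sketched and inverted for the required barrier thickness as follows; the parameter values below are placeholders, not this study's fitted results.

```python
import numpy as np

def transmission(x, alpha, beta, gamma):
    """Archer model: broad-beam transmission B through thickness x.
    alpha, beta, gamma are fitted, kVp- and material-specific."""
    return ((1 + beta / alpha) * np.exp(alpha * gamma * x)
            - beta / alpha) ** (-1 / gamma)

def required_thickness(B, alpha, beta, gamma):
    """Closed-form inversion: the thickness giving transmission B."""
    return np.log((B ** -gamma + beta / alpha)
                  / (1 + beta / alpha)) / (alpha * gamma)

# Placeholder lead-like parameters; B = 1e-3 is a toy design goal.
print(required_thickness(1e-3, alpha=2.5, beta=15.3, gamma=0.76))
```

Fitting alpha, beta, and gamma to the simulated polychromatic attenuation curves, expressed in ambient dose equivalent, is what tailors the barrier calculation to each radiographic room.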
On the Emergence of Modern Humans
ERIC Educational Resources Information Center
Amati, Daniele; Shallice, Tim
2007-01-01
The emergence of modern humans with their extraordinary cognitive capacities is ascribed to a novel type of cognitive computational process (sustained non-routine multi-level operations) required for abstract projectuality, held to be the common denominator of the cognitive capacities specific to modern humans. A brain operation (latching) that…
Through Kazan ASPERA to Modern Projects
NASA Astrophysics Data System (ADS)
Gusev, Alexander; Kitiashvili, Irina; Petrova, Natasha
The European Union is now forming the Sixth Framework Programme. One of the objectives of the EU Programme is the opening of national research and training programmes. Russian PhD students and young astronomers face business and financial difficulties in accessing modern databases and astronomical projects, and so they have not been included in the European overview of priorities. Modern requirements for organizing observational projects on powerful telescopes assume painstaking computational preparation of the application. Rigid competition for observation time requires preliminary computer modeling of the target object for the application to succeed. Kazan AstroGeoPhysics Partnership
Modern Radar Techniques for Geophysical Applications: Two Examples
NASA Technical Reports Server (NTRS)
Arokiasamy, B. J.; Bianchi, C.; Sciacca, U.; Tutone, G.; Zirizzotti, A.; Zuccheretti, E.
2005-01-01
The last decade of the evolution of radar was heavily influenced by the rapid increase in information processing capabilities. Advances in solid-state HF radio devices, digital technology, computing architectures and software allow designers to develop very efficient radars. In designing modern radars, the emphasis goes towards the simplification of the system hardware and the reduction of overall power, which is compensated by coding and real-time signal processing techniques. Radars are commonly employed in geophysical radio soundings such as probing the ionosphere, stratosphere-mesosphere measurement, weather forecasting, ground-penetrating radar (GPR) and radio-glaciology. In the Laboratorio di Geofisica Ambientale of the Istituto Nazionale di Geofisica e Vulcanologia (INGV), Rome, Italy, we developed two pulse compression radars. The first is an HF radar called AIS-INGV (Advanced Ionospheric Sounder), designed both for research purposes and for the routine service of HF radio wave propagation forecasting. The second is a VHF radar called GLACIORADAR, which will replace the high-power envelope radar used by the Italian glaciological group. It will be employed in studying the subglacial structures of Antarctica, giving information about layering, the bedrock, and subglacial lakes if present. These are low-power radars that rely heavily on advanced hardware and powerful real-time signal processing. Additional information is included in the original extended abstract.
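The pulse-compression principle itself is compact enough to sketch: correlate the received signal with the transmitted chirp (a matched filter) to recover range resolution from a long, low-power pulse. The waveform parameters are illustrative, not those of AIS-INGV or GLACIORADAR.

```python
import numpy as np

fs, T, B = 1e6, 500e-6, 100e3    # sample rate, pulse width, sweep bandwidth
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t ** 2)   # linear FM pulse

rx = np.zeros(4096, complex)
rx[1200:1200 + chirp.size] += 0.01 * chirp      # weak echo at sample 1200
rx += 0.005 * (np.random.randn(4096) + 1j * np.random.randn(4096))  # noise

# Matched filter: convolve with the time-reversed conjugate chirp.
compressed = np.abs(np.convolve(rx, chirp[::-1].conj(), mode="valid"))
print("echo detected at sample:", np.argmax(compressed))  # ~1200
```

The processing gain (roughly the time-bandwidth product, here 50) is what lets a low-power solid-state transmitter match the detection performance of a high-power envelope radar.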
Detection and monitoring of cardiotoxicity-what does modern cardiology offer?
Jurcut, Ruxandra; Wildiers, Hans; Ganame, Javier; D'hooge, Jan; Paridaens, Robert; Voigt, Jens-Uwe
2008-05-01
With new anticancer therapies, many patients can have a long life expectancy. Treatment-related comorbidities become an issue for cancer survivors. Cardiac toxicity remains an important side effect of anticancer therapies. Myocardial dysfunction can become apparent early or long after the end of therapy and may be irreversible. Detection of cardiac injury is crucial since it may facilitate early therapeutic measures. Traditionally, chemotherapy-induced cardiotoxicity has been detected by measuring changes in left ventricular ejection fraction. This parameter is, however, insensitive to the subtle changes in myocardial function that occur in early cardiotoxicity. This review will discuss conventional and modern cardiologic approaches to assessing myocardial function. It will focus on Doppler myocardial imaging, a method that allows sensitive measurement of myocardial function parameters such as myocardial velocity, deformation (strain), and deformation rate (strain rate), and that has been shown to reliably detect abnormalities in both regional and global myocardial function at an early stage. Other newer echocardiographic function estimators are based on automated border detection algorithms and ultrasonic integrated backscatter analysis. A further technique to be discussed is dobutamine stress echocardiography. The use of new biomarkers such as B-type natriuretic peptide and troponin, and less often used imaging techniques such as magnetic resonance imaging and computed tomography, will also be mentioned.
Borresen, Jon; Lynch, Stephen
2012-01-01
In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower-power computers continues, transistors are themselves approaching their theoretical limit and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory. PMID:23173034
An Introduction to Modern Missing Data Analyses
ERIC Educational Resources Information Center
Baraldi, Amanda N.; Enders, Craig K.
2010-01-01
A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous compared with traditional techniques (e.g. deletion and mean imputation techniques) because they require less stringent assumptions and mitigate the pitfalls of traditional…
Computational strategies to address chromatin structure problems
NASA Astrophysics Data System (ADS)
Perišić, Ognjen; Schlick, Tamar
2016-06-01
While the genetic information is contained in double helical DNA, gene expression is a complex multilevel process that involves various functional units, from nucleosomes to fully formed chromatin fibers accompanied by a host of various chromatin binding enzymes. The chromatin fiber is a polymer composed of histone protein complexes upon which DNA wraps, like yarn upon many spools. The nature of chromatin structure has been an open question since the beginning of modern molecular biology. Many experiments have shown that the chromatin fiber is a highly dynamic entity with pronounced structural diversity that includes properties of idealized zig-zag and solenoid models, as well as other motifs. This diversity can produce a high packing ratio and thus inhibit access to a majority of the wound DNA. Despite much research, chromatin's dynamic structure has not yet been fully described. Long stretches of chromatin fibers exhibit puzzling dynamic behavior that requires interpretation in the light of gene expression patterns in various tissues and organisms. The properties of the chromatin fiber can be investigated with experimental techniques, such as in vitro biochemistry, in vivo imaging, and high-throughput chromosome capture technology. Those techniques provide useful insights into the fiber's structure and dynamics, but they are limited in resolution and scope, especially regarding compact fibers and chromosomes in the cellular milieu. Complementary but specialized modeling techniques are needed to handle large floppy polymers such as the chromatin fiber. In this review, we discuss current approaches in the chromatin structure field with an emphasis on modeling, such as molecular dynamics and coarse-grained computational approaches. Combinations of these computational techniques complement experiments and address many relevant biological problems, as we will illustrate with special focus on epigenetic modulation of chromatin structure.
NASA Astrophysics Data System (ADS)
MacDonald, J.
This paper looks at the use of astronomical programmes and the development of new media modeling techniques as a means to better understand archaeoastronomy. The paper also suggests that these new methods and technologies are a means of furthering public perceptions of archaeoastronomy and the important role that 'astronomy' played in the history and development of human culture. This discussion is rooted in a computer simulation of Stonehenge and its land- and skyscape. The integration of the astronomy software allows viewing horizon astronomical alignments in relation to digitally recreated Neolithic/Early Bronze Age (EBA) monumental architecture. This work shows how modern virtual modelling techniques can be a tool for testing archaeoastronomical hypotheses, as well as a demonstrative tool for teaching and promoting archaeoastronomy in mainstream media.
New materials and structures for photovoltaics
NASA Astrophysics Data System (ADS)
Zunger, Alex; Wagner, S.; Petroff, P. M.
1993-01-01
Despite the fact that over the years crystal chemists have discovered numerous semiconducting substances, and that modern epitaxial growth techniques are able to produce many novel atomic-scale architectures, current electronic and opto-electronic technologies are based on only a handful (~10) of traditional semiconductor core materials. This paper surveys a number of yet-unexploited classes of semiconductors, pointing to the much-needed research in screening, growing, and characterizing promising members of these classes. In light of the unmanageably large number of a priori possibilities, we emphasize the role that structural chemistry and modern computer-aided design must play in screening potentially important candidates. The basic classes of materials discussed here include nontraditional alloys, such as non-isovalent and heterostructural semiconductors; materials at reduced dimensionality, including superlattices, zeolite-caged nanostructures and organic semiconductors; spontaneously ordered alloys; interstitial semiconductors; filled tetrahedral structures; ordered vacancy compounds; and compounds based on d and f electron elements. A collaborative effort among material predictor, material grower, and material characterizer holds the promise for a successful identification of new and exciting systems.
Decision support systems for clinical radiological practice — towards the next generation
Stivaros, S M; Gledson, A; Nenadic, G; Zeng, X-J; Keane, J; Jackson, A
2010-01-01
The huge amount of information that needs to be assimilated in order to keep pace with the continued advances in modern medical practice can form an insurmountable obstacle to the individual clinician. Within radiology, the recent development of quantitative imaging techniques, such as perfusion imaging, and the development of imaging-based biomarkers in modern therapeutic assessment has highlighted the need for computer systems to provide the radiological community with support for academic as well as clinical/translational applications. This article provides an overview of the underlying design and functionality of radiological decision support systems with examples tracing the development and evolution of such systems over the past 40 years. More importantly, we discuss the specific design, performance and usage characteristics that previous systems have highlighted as being necessary for clinical uptake and routine use. Additionally, we have identified particular failings in our current methodologies for data dissemination within the medical domain that must be overcome if the next generation of decision support systems is to be implemented successfully. PMID:20965900
Ross, William N; Miyazaki, Kenichi; Popovic, Marko A; Zecevic, Dejan
2015-04-01
Dynamic calcium and voltage imaging is a major tool in modern cellular neuroscience. Since the beginning of their use over 40 years ago, there have been major improvements in indicators, microscopes, imaging systems, and computers. While cutting edge research has trended toward the use of genetically encoded calcium or voltage indicators, two-photon microscopes, and in vivo preparations, it is worth noting that some questions still may be best approached using more classical methodologies and preparations. In this review, we highlight a few examples in neurons where the combination of charge-coupled device (CCD) imaging and classical organic indicators has revealed information that has so far been more informative than results using the more modern systems. These experiments take advantage of the high frame rates, sensitivity, and spatial integration of the best CCD cameras. These cameras can respond to the faster kinetics of organic voltage and calcium indicators, which closely reflect the fast dynamics of the underlying cellular events.
Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics
NASA Astrophysics Data System (ADS)
Doronin, Alexander; Meglinski, Igor
2012-09-01
In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, results of the adding-doubling method, and other GPU-based MC techniques developed in the past. The best speedup in processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing, while double-precision computing for floating-point arithmetic operations provides higher accuracy.
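To make the core algorithm concrete, here is a minimal single-threaded Python sketch of the photon random walk that such MC codes parallelize across GPU threads. The optical coefficients are illustrative assumptions, scattering is taken as isotropic for brevity (production codes sample an anisotropic Henyey-Greenstein phase function), and refractive-index mismatch at the surface is ignored.

```python
import numpy as np

def diffuse_reflectance(n_photons=20_000, mu_a=0.1, mu_s=10.0, seed=0):
    """Minimal Monte Carlo of photon migration in a semi-infinite
    turbid medium.  mu_a, mu_s are absorption/scattering coefficients
    in 1/mm.  Returns the escaping fraction of launched photon weight
    (total diffuse reflectance)."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    reflected = 0.0
    for _ in range(n_photons):
        z = 0.0          # depth below the surface
        uz = 1.0         # direction cosine with respect to +z (into medium)
        weight = 1.0
        while weight > 1e-4:                      # terminate faint photons
            step = -np.log(rng.random()) / mu_t   # sampled free path length
            z += step * uz
            if z < 0.0:                           # crossed the surface: escapes
                reflected += weight
                break
            weight *= albedo                      # deposit the absorbed fraction
            uz = 2.0 * rng.random() - 1.0         # isotropic new direction
    return reflected / n_photons

print(diffuse_reflectance())
```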
Computations of Axisymmetric Flows in Hypersonic Shock Tubes
NASA Technical Reports Server (NTRS)
Sharma, Surendra P.; Wilson, Gregory J.
1995-01-01
A time-accurate two-dimensional fluid code is used to compute test times in shock tubes operated at supersonic speeds. Unlike previous studies, this investigation resolves the finer temporal details of the shock-tube flow by making use of modern supercomputers and state-of-the-art computational fluid dynamic solution techniques. The code, besides solving the time-dependent fluid equations, also accounts for the finite-rate chemistry in the hypersonic environment. The flowfield solutions are used to estimate relevant shock-tube parameters for laminar flow, such as test times, and to predict density and velocity profiles. Boundary-layer parameters such as $\bar{\delta}_u$, $\bar{\delta}^*$, and $\bar{\tau}_w$, and test-time parameters such as $\bar{\tau}$ and the particle time of flight $t_f$, are computed and compared with those evaluated by using Mirels' correlations. This article then discusses in detail the effects of flow nonuniformities on particle time of flight behind the normal shock and, consequently, on the interpretation of shock-tube data. This article concludes that for accurate interpretation of shock-tube data, a detailed analysis of flowfield parameters, using a computer code such as the one used in this study, must be performed.
The application of virtual reality systems as a support of digital manufacturing and logistics
NASA Astrophysics Data System (ADS)
Golda, G.; Kampa, A.; Paprocka, I.
2016-08-01
Modern trends in the development of computer-aided techniques are heading toward the integration of the design of competitive products and so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle should be aided and managed by advanced product lifecycle management software packages: starting from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, distribution to customers and after-sale service, up to its recycling or utilization. Important problems in providing the efficient flow of materials in supply chain management across the whole product lifecycle, using computer simulation, are described in this paper. The authors pay attention to the processes of acquiring relevant information and correct data, necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possible uses and applications of virtual reality software for modeling and simulating production and logistics processes in an enterprise in different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics, show modeled and programmed examples and solutions, pay attention to development trends, and show options of the applications that extend beyond a single enterprise.
Generative models for clinical applications in computational psychiatry.
Frässle, Stefan; Yao, Yu; Schöbi, Dario; Aponte, Eduardo A; Heinzle, Jakob; Stephan, Klaas E
2018-05-01
Despite the success of modern neuroimaging techniques in furthering our understanding of cognitive and pathophysiological processes, translation of these advances into clinically relevant tools has been virtually absent until now. Neuromodeling represents a powerful framework for overcoming this translational deadlock, and the development of computational models to solve clinical problems has become a major scientific goal over the last decade, as reflected by the emergence of clinically oriented neuromodeling fields like Computational Psychiatry, Computational Neurology, and Computational Psychosomatics. Generative models of brain physiology and connectivity in the human brain play a key role in this endeavor, striving for computational assays that can be applied to neuroimaging data from individual patients for differential diagnosis and treatment prediction. In this review, we focus on dynamic causal modeling (DCM) and its use for Computational Psychiatry. DCM is a widely used generative modeling framework for functional magnetic resonance imaging (fMRI) and magneto-/electroencephalography (M/EEG) data. This article reviews the basic concepts of DCM, revisits examples where it has proven valuable for addressing clinically relevant questions, and critically discusses methodological challenges and recent methodological advances. We conclude this review with a more general discussion of the promises and pitfalls of generative models in Computational Psychiatry and highlight the path that lies ahead of us. This article is categorized under: Neuroscience > Computation Neuroscience > Clinical Neuroscience. © 2018 Wiley Periodicals, Inc.
Evaluating Modern Defenses Against Control Flow Hijacking
2015-09-01
Fragmentary excerpts from the thesis: "...unsound and could introduce false negatives (opening up another possible set of attacks). CFG construction using DSA: we next evaluate the precision of CFG..." Master of Science thesis in Computer Science and Engineering by Ulziibayar Otgonbaatar, submitted to the Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology.
Translations on Eastern Europe, Scientific Affairs, Number 542.
1977-04-18
transplanting human tissue has not as yet been given a final juridical approval like euthanasia, artificial insemination, abortion, birth control, and others... and data teleprocessing. This computer may also be used as a satellite computer for complex systems. The IZOT 310 has a large instruction... a well-known truth that modern science is using the most modern and leading technical facilities—from bathyscaphes to satellites, from gigantic
Resiliency in Future Cyber Combat
2016-04-04
including the Internet, telecommunications networks, computer systems, and embedded processors and controllers."6 One important point emerging from the... definition is that while the Internet is part of cyberspace, it is not all of cyberspace. Any computer processor capable of communicating with a... central processor on a modern car are all part of cyberspace, although only some of them are routinely connected to the Internet. Most modern
Yoga and mental health: A dialogue between ancient wisdom and modern psychology
Vorkapic, Camila Ferreira
2016-01-01
Background: Many yoga texts make reference to the importance of mental health and the use of specific techniques in the treatment of mental disorders. Different concepts utilized in modern psychology may not stem from contemporary ideas; instead, they seem to share a common root with ancient wisdom. Aims: The goal of this perspective article is to correlate modern techniques used in psychology and psychiatry with yogic practices, in the treatment of mental disorders. Materials and Methods: The current article presents a dialogue between the yogic approach to the treatment of mental disorders and concepts used in modern psychology, such as meta-cognition, disidentification, deconditioning and interoceptive exposure. Conclusions: Contemplative research has found that modern interventions in psychology might not come from modern concepts after all, but share great similarity with ancient yogic knowledge, giving us the opportunity to integrate the psychological wisdom of both East and West. PMID:26865774
Computational membrane biophysics: From ion channel interactions with drugs to cellular function.
Miranda, Williams E; Ngo, Van A; Perissinotti, Laura L; Noskov, Sergei Yu
2017-11-01
The rapid development of experimental and computational techniques has fundamentally changed our understanding of cellular-membrane transport. The advent of powerful computers and refined force-fields for proteins, ions, and lipids has expanded the applicability of Molecular Dynamics (MD) simulations. A myriad of cellular responses is modulated through the binding of endogenous and exogenous ligands (e.g. neurotransmitters and drugs, respectively) to ion channels. Deciphering the thermodynamics and kinetics of the ligand binding processes to these membrane proteins is at the heart of modern drug development. The ever-increasing computational power has already provided insightful data on the thermodynamics and kinetics of drug-target interactions, free energies of solvation, and partitioning into lipid bilayers for drugs. This review aims to provide a brief summary about modeling approaches to map out crucial binding pathways with intermediate conformations and free-energy surfaces for drug-ion channel binding mechanisms that are responsible for multiple effects on cellular functions. We will discuss post-processing analysis of simulation-generated data, which are then transformed to kinetic models to better understand the molecular underpinning of the experimental observables under the influence of drugs or mutations in ion channels. This review highlights crucial mathematical frameworks and perspectives on bridging different well-established computational techniques to connect the dynamics and timescales from all-atom MD and free energy simulations of ion channels to the physiology of action potentials in cellular models. This article is part of a Special Issue entitled: Biophysics in Canada, edited by Lewis Kay, John Baenziger, Albert Berghuis and Peter Tieleman. Copyright © 2017 Elsevier B.V. All rights reserved.
ESIF 2016: Modernizing Our Grid and Energy System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Becelaere, Kimberly
This 2016 annual report highlights work conducted at the Energy Systems Integration Facility (ESIF) in FY 2016, including grid modernization, high-performance computing and visualization, and INTEGRATE projects.
Computer Technology: State of the Art.
ERIC Educational Resources Information Center
Withington, Frederic G.
1981-01-01
Describes the nature of modern general-purpose computer systems, including hardware, semiconductor electronics, microprocessors, computer architecture, input output technology, and system control programs. Seven suggested readings are cited. (FM)
Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts
ERIC Educational Resources Information Center
Nunnery, Christopher Edward
2011-01-01
Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…
Future of Assurance: Ensuring that a System is Trustworthy
NASA Astrophysics Data System (ADS)
Sadeghi, Ahmad-Reza; Verbauwhede, Ingrid; Vishik, Claire
Significant efforts are put in defining and implementing strong security measures for all components of the computing environment. It is equally important to be able to evaluate the strength and robustness of these measures and establish trust among the components of the computing environment based on parameters and attributes of these elements and best practices associated with their production and deployment. Today the inventory of techniques used for security assurance and to establish trust -- audit, security-conscious development process, cryptographic components, external evaluation -- is somewhat limited. These methods have their indisputable strengths and have contributed significantly to the advancement in the area of security assurance. However, shorter product and technology development cycles and the sheer complexity of modern digital systems and processes have begun to decrease the efficiency of these techniques. Moreover, these approaches and technologies address only some aspects of security assurance and, for the most part, evaluate assurance in a general design rather than an instance of a product. Additionally, various components of the computing environment participating in the same processes enjoy different levels of security assurance, making it difficult to ensure adequate levels of protection end-to-end. Finally, most evaluation methodologies rely on the knowledge and skill of the evaluators, making reliable assessments of trustworthiness of a system even harder to achieve. The paper outlines some issues in security assurance that apply across the board, with the focus on the trustworthiness and authenticity of hardware components, and evaluates current approaches to assurance.
The Helicopter Antenna Radiation Prediction Code (HARP)
NASA Technical Reports Server (NTRS)
Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.
1990-01-01
The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to aid the user in describing the helicopter geometry, selecting the method of computation, constructing the desired high- or low-frequency model, and displaying the results.
Metaheuristic Algorithms for Convolution Neural Network
Rere, L M Rasdi; Fanany, Mohamad Ivan; Arymurthy, Aniati Murni
2016-01-01
A typical modern optimization technique is usually either heuristic or metaheuristic. This technique has managed to solve some optimization problems in the research area of science, engineering, and industry. However, implementation strategy of metaheuristic for accuracy improvement on convolution neural networks (CNN), a famous deep learning method, is still rarely investigated. Deep learning relates to a type of machine learning technique, where its aim is to move closer to the goal of artificial intelligence of creating a machine that could successfully perform any intellectual tasks that can be carried out by a human. In this paper, we propose the implementation strategy of three popular metaheuristic approaches, that is, simulated annealing, differential evolution, and harmony search, to optimize CNN. The performances of these metaheuristic methods in optimizing CNN on classifying MNIST and CIFAR dataset were evaluated and compared. Furthermore, the proposed methods are also compared with the original CNN. Although the proposed methods show an increase in the computation time, their accuracy has also been improved (up to 7.14 percent). PMID:27375738
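As an illustration of the first of the three metaheuristics, the following is a minimal Python sketch of simulated annealing. The objective function here is a toy stand-in for a CNN's validation error, not the paper's actual training setup, and all hyperparameters are illustrative.

```python
import math
import random

def simulated_annealing(loss, x0, step=0.1, t0=1.0, cooling=0.95, iters=500):
    """Generic simulated annealing over a real-valued parameter vector.
    `loss` would be, e.g., a CNN's validation error as a function of
    its tunable parameters (a toy surrogate is used below)."""
    random.seed(42)
    x, fx = list(x0), loss(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + random.gauss(0.0, step) for xi in x]   # random neighbour
        fc = loss(cand)
        # always accept improvements; accept degradations with
        # probability exp(-delta / T), which shrinks as T cools
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# toy stand-in objective with a known minimum at (1, -2)
print(simulated_annealing(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                          [0.0, 0.0]))
```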
NASA Technical Reports Server (NTRS)
Pfeil, W. H.; De Los Reyes, G.; Bobula, G. A.
1985-01-01
A power turbine governor was designed for a recent-technology turboshaft engine coupled to a modern, articulated rotor system using Linear Quadratic Regulator (LQR) and Kalman Filter (KF) techniques. A linear, state-space model of the engine and rotor system was derived for six engine power settings from flight idle to maximum continuous. An integrator was appended to the fuel flow input to reduce the steady-state governor error to zero. Feedback gains were calculated for the system states at each power setting using the LQR technique. The main rotor tip speed state is not measurable, so a Kalman Filter of the rotor was used to estimate this state. The crossover of the system was increased to 10 rad/s, compared to 2 rad/s for a current governor. Initial computer simulations with a nonlinear engine model indicate a significant decrease in power turbine speed variation with the LQR governor compared to a conventional governor.
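For readers unfamiliar with the two design steps, the sketch below shows how an LQR gain and a steady-state Kalman gain are obtained from continuous algebraic Riccati equations using SciPy. The two-state model and all numbers are illustrative assumptions, not the paper's identified turboshaft/rotor model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical 2-state plant (illustrative numbers only):
# x = [power-turbine speed error, fuel-system state]
A = np.array([[-0.5, 1.0],
              [ 0.0, -2.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])        # only the speed error is measured

# LQR state feedback u = -K x minimizing integral of x'Qx + u'Ru
Q = np.diag([10.0, 1.0])
R = np.array([[1.0]])
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # K = R^{-1} B' P

# Steady-state Kalman gain for the unmeasured state, via the dual
# Riccati equation with process noise W and measurement noise V
W = np.diag([0.1, 0.1])
V = np.array([[0.01]])
S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(V)

print("LQR gain K =", K)
print("Kalman gain L =", L.ravel())
```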
The Use of Field Programmable Gate Arrays (FPGA) in Small Satellite Communication Systems
NASA Technical Reports Server (NTRS)
Varnavas, Kosta; Sims, William Herbert; Casas, Joseph
2015-01-01
This paper will describe the use of digital Field Programmable Gate Arrays (FPGA) to contribute to advancing the state-of-the-art in software defined radio (SDR) transponder design for the emerging SmallSat and CubeSat industry and to provide advances for NASA as described in the TAO5 Communication and Navigation Roadmap (Ref 4). The use of software defined radios (SDR) has been around for a long time. A typical implementation of the SDR is to use a processor and write software to implement all the functions of filtering, carrier recovery, error correction, framing etc. Even with modern high speed and low power digital signal processors, high speed memories, and efficient coding, the compute intensive nature of digital filters, error correcting and other algorithms is too much for modern processors to get efficient use of the available bandwidth to the ground. By using FPGAs, these compute intensive tasks can be done in parallel, pipelined fashion and more efficiently use every clock cycle to significantly increase throughput while maintaining low power. These methods will implement digital radios with significant data rates in the X and Ka bands. Using these state-of-the-art technologies, unprecedented uplink and downlink capabilities can be achieved in a 1/2 U sized telemetry system. Additionally, modern FPGAs have embedded processing systems, such as ARM cores, integrated inside the FPGA allowing mundane tasks such as parameter commanding to occur easily and flexibly. Potential partners include other NASA centers, industry and the DOD. These assets are associated with small satellite demonstration flights, LEO and deep space applications. MSFC currently has an SDR transponder test-bed using Hardware-in-the-Loop techniques to evaluate and improve SDR technologies.
A numerical analysis of the aortic blood flow pattern during pulsed cardiopulmonary bypass.
Gramigna, V; Caruso, M V; Rossi, M; Serraino, G F; Renzulli, A; Fragomeni, G
2015-01-01
In the modern era, stroke remains a leading cause of morbidity after cardiac surgery despite continuing improvements in cardiopulmonary bypass (CPB) techniques. The aim of the current work was to numerically investigate the blood flow in the aorta and epiaortic vessels during standard and pulsed CPB, the latter obtained with the intra-aortic balloon pump (IABP). A multi-scale model, realized by coupling a 3D computational fluid dynamics study with a 0D model, was developed and validated with in vivo data. The presence of the IABP improved the flow pattern directed towards the epiaortic vessels, with a mean flow increase of 6.3%, and reduced flow vorticity.
Eigensystem realization algorithm user's guide for VAX/VMS computers: Version 931216
NASA Technical Reports Server (NTRS)
Pappa, Richard S.
1994-01-01
The eigensystem realization algorithm (ERA) is a multiple-input, multiple-output, time domain technique for structural modal identification and minimum-order system realization. Modal identification is the process of calculating structural eigenvalues and eigenvectors (natural vibration frequencies, damping, mode shapes, and modal masses) from experimental data. System realization is the process of constructing state-space dynamic models for modern control design. This user's guide documents VAX/VMS-based FORTRAN software developed by the author since 1984 in conjunction with many applications. It consists of a main ERA program and 66 pre- and post-processors. The software provides complete modal identification capabilities and most system realization capabilities.
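A minimal sketch of the core ERA computation, assuming a single-input, single-output sequence of Markov (impulse-response) parameters; the real software generalizes this to multiple inputs and outputs and adds accuracy indicators.

```python
import numpy as np

def era(markov, n_states, p=10, q=10):
    """Eigensystem Realization Algorithm sketch: identify a discrete
    state-space model (A, B, C) from Markov parameters y[k] = C A^(k-1) B.
    `markov` is a list of scalars (SISO) starting at y[1]."""
    H0 = np.array([[markov[i + j]     for j in range(q)] for i in range(p)])
    H1 = np.array([[markov[i + j + 1] for j in range(q)] for i in range(p)])
    U, s, Vt = np.linalg.svd(H0)
    U, S, Vt = U[:, :n_states], np.diag(s[:n_states]), Vt[:n_states]
    Sh = np.sqrt(S)                                  # balanced square root
    A = np.linalg.inv(Sh) @ U.T @ H1 @ Vt.T @ np.linalg.inv(Sh)
    B = (Sh @ Vt)[:, :1]                             # first input column
    C = (U @ Sh)[:1, :]                              # first output row
    return A, B, C

# impulse response of a damped oscillation; ERA should recover 2 states
k = np.arange(40)
y = (0.9 ** k) * np.sin(0.5 * k)
A, B, C = era(list(y[1:]), n_states=2)               # y[1] = CB, y[2] = CAB, ...
print("identified eigenvalues:", np.linalg.eig(A)[0])  # about 0.9*exp(+-0.5j)
```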
Diffraction-geometry refinement in the DIALS framework
Waterman, David G.; Winter, Graeme; Gildea, Richard J.; ...
2016-03-30
Rapid data collection and modern computing resources provide the opportunity to revisit the task of optimizing the model of diffraction geometry prior to integration. A comprehensive description is given of new software that builds upon established methods by performing a single global refinement procedure, utilizing a smoothly varying model of the crystal lattice where appropriate. This global refinement technique extends to multiple data sets, providing useful constraints to handle the problem of correlated parameters, particularly for small wedges of data. Examples of advanced uses of the software are given and the design is explained in detail, with particular emphasis on the flexibility and extensibility it entails.
Beuthien-Baumann, B
2018-05-01
Positron emission tomography (PET) is a procedure in nuclear medicine, which is applied predominantly in oncological diagnostics. In the form of modern hybrid machines, such as PET computed tomography (PET/CT) and PET magnetic resonance imaging (PET/MRI) it has found wide acceptance and availability. The PET procedure is more than just another imaging technique, but a functional method with the capability for quantification in addition to the distribution pattern of the radiopharmaceutical, the results of which are used for therapeutic decisions. A profound knowledge of the principles of PET including the correct indications, patient preparation, and possible artifacts is mandatory for the correct interpretation of PET results.
A hybrid Gerchberg-Saxton-like algorithm for DOE and CGH calculation
NASA Astrophysics Data System (ADS)
Wang, Haichao; Yue, Weirui; Song, Qiang; Liu, Jingdan; Situ, Guohai
2017-02-01
The Gerchberg-Saxton (GS) algorithm is widely used in various disciplines of modern science and technology where phase retrieval is required. However, this legendary algorithm tends to stagnate after a few iterations. Many efforts have been made to improve this situation. Here we propose to introduce the strategy of gradient descent and a weighting technique into the GS algorithm, and demonstrate it using two examples: design of a diffractive optical element (DOE) to achieve off-axis illumination in lithographic tools, and design of a computer-generated hologram (CGH) for holographic display. Both numerical simulation and optical experiments are carried out for demonstration.
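A minimal numpy sketch of the baseline GS iteration that the paper modifies; propagation between the DOE/hologram plane and the observation plane is modelled here by a plain FFT (far-field approximation), and the proposed gradient-descent and weighting extensions are not included.

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, iters=200, seed=0):
    """Classical GS iteration: find a phase profile such that a field
    with amplitude `source_amp` propagates (modelled by an FFT, i.e.
    Fourier-plane propagation) to a field with amplitude close to
    `target_amp`."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, source_amp.shape)
    for _ in range(iters):
        field = source_amp * np.exp(1j * phase)        # impose source amplitude
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
        phase = np.angle(np.fft.ifft2(far))            # keep only the phase
    return phase

# toy example: uniform illumination -> off-axis bright square
n = 64
src = np.ones((n, n))
tgt = np.zeros((n, n))
tgt[8:16, 8:16] = 1.0
# scale target so total energy is consistent with the unnormalized FFT
tgt *= np.sqrt((src ** 2).sum() * n * n / (tgt ** 2).sum())
phi = gerchberg_saxton(src, tgt)
err = np.abs(np.abs(np.fft.fft2(src * np.exp(1j * phi))) - tgt).mean()
print("mean amplitude error:", err)
```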
Coprocessors for quantum devices
NASA Astrophysics Data System (ADS)
Kay, Alastair
2018-03-01
Quantum devices, from simple fixed-function tools to the ultimate goal of a universal quantum computer, will require high-quality, frequent repetition of a small set of core operations, such as the preparation of entangled states. These tasks are perfectly suited to realization by a coprocessor or supplementary instruction set, as is common practice in modern CPUs. In this paper, we present two quintessentially quantum coprocessor functions: production of a Greenberger-Horne-Zeilinger state and implementation of optimal universal (asymmetric) quantum cloning. Both are based on the evolution of a fixed Hamiltonian. We introduce a technique for deriving the parameters of these Hamiltonians based on the numerical integration of Toda-like flows.
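For reference, the target of the first coprocessor function can be written down directly. The sketch below prepares an n-qubit GHZ state vector with the textbook Hadamard-plus-CNOT-chain circuit, rather than the fixed-Hamiltonian evolution the paper derives.

```python
import numpy as np

def ghz_state(n_qubits):
    """Build the n-qubit GHZ state (|00...0> + |11...1>)/sqrt(2) by
    applying H on qubit 0 and then a chain of CNOTs to the state
    vector.  Qubit 0 is the most significant bit of the index."""
    dim = 2 ** n_qubits
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0                                   # |00...0>
    # Hadamard on qubit 0: op = H (x) I (x) ... (x) I
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    op = h
    for _ in range(n_qubits - 1):
        op = np.kron(op, np.eye(2))
    state = op @ state
    # CNOT chain: qubit k controls qubit k+1
    for k in range(n_qubits - 1):
        new = np.zeros_like(state)
        for idx in range(dim):
            if (idx >> (n_qubits - 1 - k)) & 1:      # control bit set
                new[idx ^ (1 << (n_qubits - 2 - k))] += state[idx]
            else:
                new[idx] += state[idx]
        state = new
    return state

print(np.round(ghz_state(3), 3))   # amplitude 0.707 at |000> and |111>
```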
CUDA Optimization Strategies for Compute- and Memory-Bound Neuroimaging Algorithms
Lee, Daren; Dinov, Ivo; Dong, Bin; Gutman, Boris; Yanovsky, Igor; Toga, Arthur W.
2011-01-01
As neuroimaging algorithms and technology continue to grow faster than CPU performance in complexity and image resolution, data-parallel computing methods will be increasingly important. The high performance, data-parallel architecture of modern graphical processing units (GPUs) can reduce computational times by orders of magnitude. However, its massively threaded architecture introduces challenges when GPU resources are exceeded. This paper presents optimization strategies for compute- and memory-bound algorithms for the CUDA architecture. For compute-bound algorithms, the registers are reduced through variable reuse via shared memory and the data throughput is increased through heavier thread workloads and maximizing the thread configuration for a single thread block per multiprocessor. For memory-bound algorithms, fitting the data into the fast but limited GPU resources is achieved through reorganizing the data into self-contained structures and employing a multi-pass approach. Memory latencies are reduced by selecting memory resources whose cache performance are optimized for the algorithm's access patterns. We demonstrate the strategies on two computationally expensive algorithms and achieve optimized GPU implementations that perform up to 6× faster than unoptimized ones. Compared to CPU implementations, we achieve peak GPU speedups of 129× for the 3D unbiased nonlinear image registration technique and 93× for the non-local means surface denoising algorithm. PMID:21159404
Kazakis, Georgios; Kanellopoulos, Ioannis; Sotiropoulos, Stefanos; Lagaros, Nikos D
2017-10-01
The construction industry has a major impact on the environment in which we spend most of our lives. Therefore, it is important that the outcome of architectural intuition performs well and complies with the design requirements. Architects usually describe as "optimal design" their choice among a rather limited set of design alternatives, dictated by their experience and intuition. However, modern design of structures requires accounting for a great number of criteria derived from multiple disciplines, often of conflicting nature. Such criteria derive from structural engineering, eco-design, bioclimatic and acoustic performance. The resulting vast number of alternatives enhances the need for computer-aided architecture in order to increase the possibility of arriving at a more preferable solution. Therefore, the incorporation of smart, automatic tools in the design process, able to further guide the designer's intuition, becomes even more indispensable. The principal aim of this study is to present possibilities for integrating automatic computational techniques related to topology optimization into the intuition phase of the design of civil structures, as part of computer-aided architectural design. In this direction, different aspects of a new computer-aided architectural era are covered herein: the interpretation of the optimized designs, difficulties resulting from the increased computational effort, and 3D printing capabilities.
A comparison of acceleration methods for solving the neutron transport k-eigenvalue problem
NASA Astrophysics Data System (ADS)
Willert, Jeffrey; Park, H.; Knoll, D. A.
2014-10-01
Over the past several years a number of papers have been written describing modern techniques for numerically computing the dominant eigenvalue of the neutron transport criticality problem. These methods fall into two distinct categories. The first category of methods rewrite the multi-group k-eigenvalue problem as a nonlinear system of equations and solve the resulting system using either a Jacobian-Free Newton-Krylov (JFNK) method or Nonlinear Krylov Acceleration (NKA), a variant of Anderson Acceleration. These methods are generally successful in significantly reducing the number of transport sweeps required to compute the dominant eigenvalue. The second category of methods utilize Moment-Based Acceleration (or High-Order/Low-Order (HOLO) Acceleration). These methods solve a sequence of modified diffusion eigenvalue problems whose solutions converge to the solution of the original transport eigenvalue problem. This second class of methods is, in our experience, always superior to the first, as most of the computational work is eliminated by the acceleration from the LO diffusion system. In this paper, we review each of these methods. Our computational results support our claim that the choice of which nonlinear solver to use, JFNK or NKA, should be secondary. The primary computational savings result from the implementation of a HOLO algorithm. We display computational results for a series of challenging multi-dimensional test problems.
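The fixed-point iteration that all of these methods accelerate is ordinary power iteration. The sketch below applies it to a one-group, one-dimensional diffusion surrogate of the transport problem; the cross sections and geometry are illustrative assumptions.

```python
import numpy as np

def k_eigenvalue_power_iteration(n=50, length=10.0, D=1.0,
                                 sigma_a=0.07, nu_sigma_f=0.08, tol=1e-8):
    """Unaccelerated power iteration for the k-eigenvalue problem
    M phi = (1/k) F phi on a 1-group, 1-D diffusion surrogate with
    zero-flux boundaries (illustrative cross sections)."""
    h = length / (n + 1)
    # loss operator: -D d2/dx2 + sigma_a, standard 3-point stencil
    main = 2 * D / h ** 2 + sigma_a
    off = -D / h ** 2
    M = (np.diag(np.full(n, main))
         + np.diag(np.full(n - 1, off), 1)
         + np.diag(np.full(n - 1, off), -1))
    phi, k = np.ones(n), 1.0
    for _ in range(10_000):
        phi_new = np.linalg.solve(M, nu_sigma_f * phi / k)   # one "sweep"
        k_new = k * (nu_sigma_f * phi_new).sum() / (nu_sigma_f * phi).sum()
        if abs(k_new - k) < tol:
            return k_new, phi_new / np.linalg.norm(phi_new)
        phi, k = phi_new, k_new
    raise RuntimeError("power iteration did not converge")

k_eff, flux = k_eigenvalue_power_iteration()
print("k_eff =", round(k_eff, 6))   # ~ nu_sigma_f / (sigma_a + D*(pi/L)^2)
```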
Kriz, J; Baues, C; Engenhart-Cabillic, R; Haverkamp, U; Herfarth, K; Lukas, P; Schmidberger, H; Marnitz-Schulze, S; Fuchs, M; Engert, A; Eich, H T
2017-02-01
Field designs in radiotherapy (RT) of Hodgkin's lymphoma (HL) have changed substantially, from extended-field RT (EF-RT) to involved-field RT (IF-RT) and now to involved-node RT (IN-RT) and involved-site RT (IS-RT), as have treatment techniques. The purpose of this article is to demonstrate the establishment of a quality assurance program (QAP) covering modern RT techniques and field designs within the German Hodgkin Study Group (GHSG). In the era of modern conformal RT, this QAP had to be fundamentally adapted, and a new evaluation process has been intensively discussed by the radiotherapeutic expert panel of the GHSG. The expert panel developed guidelines and criteria to analyse "modern" field designs and treatment techniques. This work is based on a dataset of 11 patients treated within the sixth study generation (HD16-17). To develop a QAP of "modern" RT, the expert panel defined criteria for analysing current RT procedures. The consensus on a modified QAP in ongoing and future trials is presented. With this schedule, the QAP of the GHSG could serve as a model for other study groups.
Operational forecasting with the subgrid technique on the Elbe Estuary
NASA Astrophysics Data System (ADS)
Sehili, Aissa
2017-04-01
Modern remote sensing technologies can deliver very detailed land surface height data that should be considered for more accurate simulations. In that case, and even if some compromise is made with regard to the grid resolution of an unstructured grid, simulations will still require large grids, which can be computationally very demanding. The subgrid technique, first published by Casulli (2009), is based on the idea of making use of the available detailed subgrid bathymetric information while performing computations on relatively coarse grids permitting large time steps. Consequently, accuracy and efficiency are drastically enhanced compared to the classical linear method, where the underlying bathymetry is solely discretized by the computational grid. The algorithm guarantees rigorous mass conservation and nonnegative water depths for any time step size. Computational grid cells are permitted to be wet, partially wet or dry, and no drying threshold is needed. The subgrid technique is used in an operational forecast model for water level, current velocity, salinity and temperature of the Elbe estuary in Germany. Comparison is performed with the comparatively highly resolved classical unstructured-grid model UnTRIM. The daily meteorological forcing data are delivered by the German Weather Service (DWD) using the ICON-EU model. Open boundary data are delivered by the coastal model BSHcmod of the German Federal Maritime and Hydrographic Agency (BSH). Comparison of predicted water levels between the classical and subgrid models shows a very good agreement. The speedup in computational performance due to the use of the subgrid technique is about a factor of 20. A typical daily forecast can be carried out within less than 10 minutes on standard PC-like hardware. The model is capable of permanently delivering highly resolved temporal and spatial information on water level, current velocity, salinity and temperature for the whole estuary. The model also offers the possibility to recalculate any previous situation, which can be helpful to figure out, for instance, the context in which a certain event, such as an accident, occurred. In addition to measurement, the model can be used to improve navigability by adjusting the tidal transit schedule for container vessels that depend on the tide to approach or leave the port of Hamburg.
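The heart of the subgrid technique can be stated compactly: coarse-cell volumes and wet areas are nonlinear functions of the free-surface elevation, evaluated from the fine bathymetry rather than from a single cell-mean depth. A minimal sketch, with illustrative numbers:

```python
import numpy as np

def wet_volume_and_area(eta, subgrid_z, dA):
    """Core idea of the subgrid approach (after Casulli, 2009): within
    one coarse cell, evaluate the wet volume V(eta) and wet area A(eta)
    from high-resolution bed elevations z_b, so the cell can be wet,
    partially wet or dry without any drying threshold.

    eta       : free-surface elevation in the coarse cell
    subgrid_z : array of subgrid bed elevations inside that cell
    dA        : plan area of one subgrid pixel
    """
    depth = np.maximum(eta - subgrid_z, 0.0)   # nonnegative by construction
    volume = depth.sum() * dA                  # exact w.r.t. fine bathymetry
    wet_area = (depth > 0.0).sum() * dA
    return volume, wet_area

# illustrative cell: a channel cross-section rising onto a tidal flat
z_b = np.linspace(-5.0, 1.0, 100)              # fine bed elevations (m)
for eta in (-2.0, 0.0, 2.0):
    V, A = wet_volume_and_area(eta, z_b, dA=4.0)
    print(f"eta={eta:+.1f} m -> volume={V:9.1f} m^3, wet area={A:7.1f} m^2")
```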
DOE Office of Scientific and Technical Information (OSTI.GOV)
French, W.S.
1993-02-01
The exploration industry is changing, exploration technology is changing and the explorationist's job is changing. Resource companies are diversifying internationally and their central organizations are providing advisors rather than services. As a result, the relationship between the resource company and the contractor is changing. Resource companies are promoting standards so that all contract services in all parts of the world will look the same to their advisors. Contractors, for competitive reasons, want to look "different" from other contractors. The resource companies must encourage competition between contractors to insure the availability of new technology but must also resist the current trend of burdening the contractor with more and more of the risk involved in exploration. It is becoming more and more obvious that geophysical expenditures represent the best "value added" expenditures in exploration and development budgets. As a result, seismic-related contractors represent the growth component of our industry. The predominant growth is in 3-D seismic technology, and this growth is being further propelled by the computational power of the new generation of massively parallel computers and by recent advances in computer graphic techniques. Interpretation of seismic data involves the analysis of wavelet shapes and amplitudes prior to stacking the data. Thus, modern interpretation involves understanding compressional waves, shear waves, and propagating modes which create noise and interference. Modern interpretation and processing are carried out simultaneously, iteratively, and interactively and involve many physics-related concepts. These concepts are not merely tools for the interpretation, they are the interpretation. Explorationists who do not recognize this fact are going the way of the dinosaurs.
Natural Products for Drug Discovery in the 21st Century: Innovations for Novel Drug Discovery.
Thomford, Nicholas Ekow; Senthebane, Dimakatso Alice; Rowe, Arielle; Munro, Daniella; Seele, Palesa; Maroyi, Alfred; Dzobo, Kevin
2018-05-25
The therapeutic properties of plants have been recognised since time immemorial. Many pathological conditions have been treated using plant-derived medicines. These medicines are used as concoctions or concentrated plant extracts without isolation of active compounds. Modern medicine, however, requires the isolation and purification of one or two active compounds. There remain many global health challenges, with diseases such as cancer, degenerative diseases, HIV/AIDS and diabetes, for which modern medicine has struggled to provide cures. In many cases, the isolation of the "active compound" has rendered the compound ineffective. Drug discovery is a multidimensional problem in which several parameters of both natural and synthetic compounds, such as safety, pharmacokinetics and efficacy, must be evaluated during drug candidate selection. The advent of the latest technologies that enhance drug design hypotheses, such as Artificial Intelligence and the use of 'organ-on-chip' and microfluidics technologies, means that automation has become part of drug discovery. This has increased the speed of drug discovery and of evaluating the safety, pharmacokinetics and efficacy of candidate compounds, while allowing novel ways of drug design and synthesis based on natural compounds. Recent advances in analytical and computational techniques have opened new avenues to process complex natural products and to use their structures to derive new and innovative drugs. Indeed, we are in the era of computational molecular design as applied to natural products. Predictive computational software has contributed to the discovery of molecular targets of natural products and their derivatives. In future, the use of quantum computing, computational software and databases in modelling molecular interactions and predicting features and parameters needed for drug development, such as pharmacokinetics and pharmacodynamics, will result in fewer false-positive leads in drug development. This review discusses plant-based natural product drug discovery and how innovative technologies play a role in next-generation drug discovery.
Effective approach to spectroscopy and spectral analysis techniques using Matlab
NASA Astrophysics Data System (ADS)
Li, Xiang; Lv, Yong
2017-08-01
With the development of electronic information, computer and network technologies, modern education technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and the basic technical means of spectral analysis and testing. Students then learn the principles and technology of spectroscopy to study the structure and state of materials and the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language. A proprietary programming language developed by MathWorks, MATLAB allows matrix manipulations and plotting of functions and data. Based on teaching practice, this paper summarizes the application of Matlab to the teaching of spectroscopy, an approach suitable for most current multimedia-assisted teaching in schools.
Computer assisted screening, correction, and analysis of historical weather measurements
NASA Astrophysics Data System (ADS)
Burnette, Dorian J.; Stahle, David W.
2013-04-01
A computer program, Historical Observation Tools (HOB Tools), has been developed to facilitate many of the calculations used by historical climatologists to develop instrumental and documentary temperature and precipitation datasets and to make them readily accessible to other researchers. The primitive methodology used by the early weather observers makes the application of standard techniques difficult. HOB Tools provides a step-by-step framework to visually and statistically assess, adjust, and reconstruct historical temperature and precipitation datasets. These routines include the ability to check for undocumented discontinuities, adjust temperature data for poor thermometer exposures and diurnal averaging, and assess and adjust daily precipitation data for undercount. This paper provides an overview of the Visual Basic.NET program and a demonstration of how it can assist in the development of extended temperature and precipitation datasets using modern and early instrumental measurements from the United States.
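As a flavour of the screening step, the following is a deliberately simplified stand-in for a homogeneity check: it locates the split point with the largest mean shift in an annual series. HOB Tools' actual routines are more elaborate; all names and numbers here are illustrative.

```python
import numpy as np

def screen_discontinuity(series):
    """Very simple screening for an undocumented step change (e.g. a
    station move) in an annual series: find the split point that
    maximizes the shift between the means of the two segments."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    best_k, best_shift = None, 0.0
    for k in range(5, n - 5):                  # require >= 5 values per side
        shift = abs(x[:k].mean() - x[k:].mean())
        if shift > best_shift:
            best_k, best_shift = k, shift
    return best_k, best_shift

# synthetic annual means with a +1.2 degree jump after year 60
rng = np.random.default_rng(1)
temps = rng.normal(12.0, 0.5, 100)
temps[60:] += 1.2
k, shift = screen_discontinuity(temps)
print(f"suspected break after index {k}, mean shift {shift:.2f} deg")
```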
Spatiotemporal video deinterlacing using control grid interpolation
NASA Astrophysics Data System (ADS)
Venkatesan, Ragav; Zwart, Christine M.; Frakes, David H.; Li, Baoxin
2015-03-01
With the advent of progressive format display and broadcast technologies, video deinterlacing has become an important video-processing technique. Numerous approaches exist in the literature to accomplish deinterlacing. While most earlier methods were simple linear filtering-based approaches, the emergence of faster computing technologies and even dedicated video-processing hardware in display units has allowed higher quality but also more computationally intense deinterlacing algorithms to become practical. Most modern approaches analyze motion and content in video to select different deinterlacing methods for various spatiotemporal regions. We introduce a family of deinterlacers that employs spectral residue to choose between and weight control grid interpolation based spatial and temporal deinterlacing methods. The proposed approaches perform better than the prior state-of-the-art based on peak signal-to-noise ratio, other visual quality metrics, and simple perception-based subjective evaluations conducted by human viewers. We further study the advantages of using soft and hard decision thresholds on the visual performance.
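The general spatiotemporal idea, of which the proposed deinterlacers are a refinement, can be sketched as a motion-adaptive blend of a temporal and a spatial estimate. Control grid interpolation and the spectral-residue weighting themselves are not reproduced here; this is a generic illustration with a crude motion cue.

```python
import numpy as np

def deinterlace_frame(prev_field, cur_field, next_field, top=True):
    """Motion-adaptive deinterlacing sketch: fill the missing lines by
    blending a temporal estimate (average of the neighbouring
    opposite-parity fields) and a spatial estimate (average of the
    lines above and below), weighted by a local motion measure."""
    h, w = cur_field.shape[0] * 2, cur_field.shape[1]
    frame = np.empty((h, w))
    known = slice(0, h, 2) if top else slice(1, h, 2)
    frame[known] = cur_field
    temporal = 0.5 * (prev_field + next_field)       # good where static
    # lines above/below the missing line (wraps at the bottom edge)
    spatial = 0.5 * (cur_field + np.roll(cur_field, -1, axis=0))
    motion = np.abs(prev_field - next_field)         # crude motion cue
    alpha = np.clip(motion / (motion.max() + 1e-9), 0.0, 1.0)
    missing = slice(1, h, 2) if top else slice(0, h, 2)
    frame[missing] = (1 - alpha) * temporal + alpha * spatial
    return frame

# usage on a synthetic sequence: fields of shape (height/2, width)
rng = np.random.default_rng(0)
f0, f1, f2 = (rng.random((120, 160)) for _ in range(3))
full = deinterlace_frame(f0, f1, f2, top=True)
print(full.shape)   # (240, 160)
```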
Periodic control of the individual-blade-control helicopter rotor. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Mckillip, R. M., Jr.
1984-01-01
Results of an investigation into methods of controller design for an individual helicopter rotor blade in the high forward-flight speed regime are described. This operating condition poses a unique control problem in that the perturbation equations of motion are linear with coefficients that vary periodically with time. The design of a control law was based on extensions to modern multivariable synthesis techniques and incorporated a novel approach to the reconstruction of the missing system state variables. The controller was tested both on an electronic analog computer simulation of the out-of-plane flapping dynamics and on a four-foot-diameter single-bladed model helicopter rotor in the M.I.T. 5x7 subsonic wind tunnel at high levels of advance ratio. It is shown that modal control using the IBC concept is possible over a large range of advance ratios with only a modest amount of computational power required.
Application of near-infrared image processing in agricultural engineering
NASA Astrophysics Data System (ADS)
Chen, Ming-hong; Zhang, Guo-ping; Xia, Hongxing
2009-07-01
Recently, with the development of computer technology, the application field of near-infrared (NIR) image processing has become much wider. In this paper, the technical characteristics and development of modern NIR imaging and NIR spectroscopic analysis are introduced. Applications and studies of NIR image processing techniques in agricultural engineering in recent years are summarized, based on the application principles and development characteristics of near-infrared imaging. NIR imaging is very useful for the nondestructive inspection of the external and internal quality of agricultural products. The application of near-infrared spectroscopy is also important for detecting stored-grain insects. Computer vision detection based on NIR imaging can help manage food logistics, and the application of NIR imaging has promoted the quality management of agricultural products. Finally, advice and prospects are put forward for further application of NIR imaging in agricultural engineering.
Experimental validation of docking and capture using space robotics testbeds
NASA Technical Reports Server (NTRS)
Spofford, John; Schmitz, Eric; Hoff, William
1991-01-01
This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulation of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to optimize numerically a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to derive symbolically dynamic and kinematic equations.
The Fourier analysis of biological transients.
Harris, C M
1998-08-31
With modern computing technology the digital implementation of the Fourier transform is widely available, mostly in the form of the fast Fourier transform (FFT). Although the FFT has become almost synonymous with the Fourier transform, it is a fast numerical technique for computing the discrete Fourier transform (DFT) of a finite sequence of sampled data. The DFT is not directly equivalent to the continuous Fourier transform of the underlying biological signal, which becomes important when analyzing biological transients. Although this distinction is well known by some, for many it leads to confusion in how to interpret the FFT of biological data, and in how to precondition data so as to yield a more accurate Fourier transform using the FFT. We review here the fundamentals of Fourier analysis with emphasis on the analysis of transient signals. As an example of a transient, we consider the human saccade to illustrate the pitfalls and advantages of various Fourier analyses.
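The practical upshot, that the FFT output must be scaled by the sample interval (and the record must contain essentially the whole transient) before it approximates the continuous Fourier transform, can be checked numerically. A minimal sketch with an analytically known transform:

```python
import numpy as np

# Approximating the continuous Fourier transform (CTFT) of a transient
# with the FFT: scale the DFT by dt and make the record long enough to
# contain the whole transient.  Example: x(t) = exp(-a t) for t >= 0,
# whose exact CTFT is X(f) = 1 / (a + i 2 pi f).
a, dt, n = 5.0, 1e-3, 4096
t = np.arange(n) * dt
x = np.exp(-a * t)                       # transient fully inside the record
X_fft = np.fft.rfft(x) * dt              # dt factor: Riemann sum of the CTFT
f = np.fft.rfftfreq(n, dt)
X_exact = 1.0 / (a + 2j * np.pi * f)
print("max |error|:", np.abs(X_fft - X_exact).max())   # small, O(dt)
```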
NASA Technical Reports Server (NTRS)
Savely, Robert T.; Loftin, R. Bowen
1990-01-01
Training is a major endeavor in all modern societies. Common training methods include training manuals, formal classes, procedural computer programs, simulations, and on-the-job training. NASA's training approach has focussed primarily on on-the-job training in a simulation environment for both crew and ground based personnel. NASA must explore new approaches to training for the 1990's and beyond. Specific autonomous training systems are described which are based on artificial intelligence technology for use by NASA astronauts, flight controllers, and ground based support personnel that show an alternative to current training systems. In addition to these specific systems, the evolution of a general architecture for autonomous intelligent training systems that integrates many of the features of traditional training programs with artificial intelligence techniques is presented. These Intelligent Computer Aided Training (ICAT) systems would provide much of the same experience that could be gained from the best on-the-job training.
Application of CFD to the analysis and design of high-speed inlets
NASA Technical Reports Server (NTRS)
Rose, William C.
1995-01-01
Over the past seven years, efforts under the present Grant have been aimed at being able to apply modern Computational Fluid Dynamics to the design of high-speed engine inlets. In this report, a review of previous design capabilities (prior to the advent of functioning CFD) was presented and the example of the NASA 'Mach 5 inlet' design was given as the premier example of the historical approach to inlet design. The philosophy used in the Mach 5 inlet design was carried forward in the present study, in which CFD was used to design a new Mach 10 inlet. An example of an inlet redesign was also shown. These latter efforts were carried out using today's state-of-the-art, full computational fluid dynamics codes applied in an iterative man-in-the-loop technique. The potential usefulness of an automated machine design capability using an optimizer code was also discussed.
The Computational Ecologist’s Toolbox
Computational ecology, nestled in the broader field of data science, is an interdisciplinary field that attempts to improve our understanding of complex ecological systems through the use of modern computational methods. Computational ecology is based on a union of competence in...
Professional Competence of a Teacher in Higher Educational Institution
ERIC Educational Resources Information Center
Abykanova, Bakytgul; Tashkeyeva, Gulmira; Idrissov, Salamat; Bilyalova, Zhupar; Sadirbekova, Dinara
2016-01-01
Modern realities bring certain corrections to the understanding of the forms and methods of teaching various courses in higher educational institutions. A special role among the educational techniques and means in the college educational environment is taken by modern technologies, such as using the techniques, means and ways which are aimed at…
An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques
ERIC Educational Resources Information Center
de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.
2017-01-01
This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…
ERIC Educational Resources Information Center
Fitzgerald, Mary
2017-01-01
This article reflects on the ways in which socially engaged arts practices can contribute to reconceptualizing the contemporary modern dance technique class as a powerful site of social change. Specifically, the author considers how incorporating socially engaged practices into pedagogical models has the potential to foster responsible citizenship…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Y.; Cameron, K.W.
1998-11-24
Workload characterization has been proven an essential tool to architecture design and performance evaluation in both scientific and commercial computing areas. Traditional workload characterization techniques include FLOPS rate, cache miss ratios, CPI (cycles per instruction or IPC, instructions per cycle) etc. With the complexity of sophisticated modern superscalar microprocessors, these traditional characterization techniques are not powerful enough to pinpoint the performance bottleneck of an application on a specific microprocessor. They are also incapable of immediately demonstrating the potential performance benefit of any architectural or functional improvement in a new processor design. To solve these problems, many people rely on simulators, which have substantial constraints especially on large-scale scientific computing applications. This paper presents a new technique of characterizing applications at the instruction level using hardware performance counters. It has the advantage of collecting instruction-level characteristics in a few runs virtually without overhead or slowdown. A variety of instruction counts can be utilized to calculate some average abstract workload parameters corresponding to microprocessor pipelines or functional units. Based on the microprocessor architectural constraints and these calculated abstract parameters, the architectural performance bottleneck for a specific application can be estimated. In particular, the analysis results can provide some insight to the problem that only a small percentage of processor peak performance can be achieved even for many very cache-friendly codes. Meanwhile, the bottleneck estimation can provide suggestions about viable architectural/functional improvement for certain workloads. Eventually, these abstract parameters can lead to the creation of an analytical microprocessor pipeline model and memory hierarchy model.
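A schematic of how such abstract parameters are derived from raw counter values; the counter field names below are illustrative, not those of any particular counter API, and the sample numbers are invented for demonstration.

```python
def workload_characterization(counters, peak_issue_width=4):
    """Derive simple instruction-level workload parameters (CPI, IPC,
    miss ratios, issue-rate fraction) from raw hardware performance
    counter values supplied as a plain dict."""
    cycles = counters["cycles"]
    instrs = counters["instructions"]
    return {
        "CPI": cycles / instrs,
        "IPC": instrs / cycles,
        "fraction of peak issue rate": (instrs / cycles) / peak_issue_width,
        "L1 miss ratio": counters["l1_misses"] / counters["loads"],
        "FLOPS per cycle": counters["fp_ops"] / cycles,
    }

# e.g. a cache-friendly code can still sit far below the peak issue width
sample = dict(cycles=8.0e9, instructions=6.0e9,
              l1_misses=1.2e7, loads=2.0e9, fp_ops=1.5e9)
for name, value in workload_characterization(sample).items():
    print(f"{name:>28s}: {value:.4f}")
```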
Fluid/Structure Interaction Studies of Aircraft Using High Fidelity Equations on Parallel Computers
NASA Technical Reports Server (NTRS)
Guruswamy, Guru; VanDalsem, William (Technical Monitor)
1994-01-01
Aeroelasticity, which involves strong coupling of fluids, structures and controls, is an important element in designing an aircraft. Computational aeroelasticity using low fidelity methods, such as the linear aerodynamic flow equations coupled with the modal structural equations, is well advanced. Though these low fidelity approaches are computationally less intensive, they are not adequate for the analysis of modern aircraft such as the High Speed Civil Transport (HSCT) and Advanced Subsonic Transport (AST), which can experience complex flow/structure interactions. HSCT can experience vortex-induced aeroelastic oscillations, whereas AST can experience transonic buffet associated structural oscillations. Both aircraft may experience a dip in the flutter speed in the transonic regime. For accurate aeroelastic computations in these complex fluid/structure interaction situations, high fidelity equations such as the Navier-Stokes for fluids and the finite elements for structures are needed. Computations using these high fidelity equations require large computational resources in both memory and speed. Conventional supercomputers have reached their limitations in both memory and speed. As a result, parallel computers have evolved to overcome the limitations of conventional computers. This paper will address the transition that is taking place in computational aeroelasticity from conventional computers to parallel computers. The paper will address special techniques needed to take advantage of the architecture of new parallel computers. Results will be illustrated from computations made on iPSC/860 and IBM SP2 computers using the ENSAERO code, which directly couples the Euler/Navier-Stokes flow equations with high resolution finite-element structural equations.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require comparative analysis for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used, but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are also prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools that stem from alignment-free methods based on statistical analysis of word frequencies. We provide several clear examples to demonstrate applications and interpretations over several different areas of alignment-free analysis, such as base–base correlations, feature frequency profiles, compositional vectors, improved string composition and the D2 statistic. Additionally, we provide a detailed discussion and an example of analysis using Lempel–Ziv techniques from data compression. PMID:23904502
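To make the word-frequency approach concrete, here is a minimal sketch of the D2 statistic: the inner product of two sequences' k-mer count vectors. The toy sequences and the choice k=3 are illustrative only:

```python
from collections import Counter

def kmer_counts(seq, k):
    """Count all overlapping words (k-mers) of length k in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=3):
    """D2 statistic: inner product of the two k-mer count vectors."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

a = "ACGTACGTGACG"
b = "ACGTTGCAACGT"
print(d2(a, b, k=3))   # larger values indicate more shared word content
```

No alignment is ever computed, which is why such measures sidestep both the cost of dynamic programming and synteny-related artifacts.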
The trustworthy digital camera: Restoring credibility to the photographic image
NASA Technical Reports Server (NTRS)
Friedman, Gary L.
1994-01-01
The increasing sophistication of computers has made digital manipulation of photographic images, as well as of other digitally-recorded artifacts such as audio and video, incredibly easy to perform and increasingly difficult to detect. Today, every picture appearing in newspapers and magazines has been digitally altered to some degree, with the severity varying from the trivial (cleaning up 'noise' and removing distracting backgrounds) to the point of deception (articles of clothing removed, heads attached to other people's bodies, and the complete rearrangement of city skylines). As the power, flexibility, and ubiquity of image-altering computers continue to increase, the well-known adage that 'the photograph doesn't lie' will continue to become an anachronism. A solution to this problem comes from the concept of digital signatures, which incorporates modern cryptographic techniques to authenticate electronic mail messages. 'Authenticate' in this case means one can be sure that the message has not been altered, and that the sender's identity has not been forged. The technique can serve not only to authenticate images, but also to help the photographer retain and enforce copyright protection when the concept of an 'electronic original' is no longer meaningful.
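A minimal sketch of the sign-and-verify idea, using the third-party `cryptography` package (assumed installed) with Ed25519 signatures; in a trustworthy camera the private key would live in tamper-resistant hardware, whereas here both keys are generated in software purely for illustration:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()   # would stay sealed inside the camera
public_key = private_key.public_key()        # published for anyone to verify against

image_bytes = b"...raw sensor data..."       # stands in for a captured image
signature = private_key.sign(image_bytes)    # produced at capture time

# Any later recipient can check that the image was not altered:
try:
    public_key.verify(signature, image_bytes)
    print("image authentic")
except InvalidSignature:
    print("image has been altered")
```

Changing even one byte of `image_bytes` after signing makes verification fail, which is exactly the tamper-evidence property the article proposes for photographs.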
Achieving energy efficiency during collective communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundriyal, Vaibhav; Sosonkina, Masha; Zhang, Zhao
2012-09-13
Energy consumption has become a major design constraint in modern computing systems. With the advent of petaflops architectures, power-efficient software stacks have become imperative for scalability. Techniques such as dynamic voltage and frequency scaling (DVFS) and CPU clock modulation (throttling) are often used to reduce the power consumption of compute nodes. To avoid significant performance losses, these techniques should be applied judiciously during parallel application execution. For example, an application's communication phases may be good candidates for DVFS and CPU throttling without incurring a considerable performance loss. Collective communication operations are often considered indivisible, and little attention has been devoted to the energy saving potential of their algorithmic steps. In this work, two important collective communication operations, all-to-all and allgather, are investigated as to their augmentation with energy saving strategies on a per-call basis. The experiments prove the viability of such a fine-grain approach. They also validate a theoretical power consumption estimate for multicore nodes proposed here. While keeping the performance loss low, the obtained energy savings were always significantly higher than those achieved when DVFS or throttling were switched on across the entire application run.
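A minimal sketch of per-call frequency scaling around a collective, assuming a Linux cpufreq "userspace" governor, root privileges, and mpi4py; the sysfs path, frequencies, and placement of the scaling calls are illustrative, and this is only the general idea of the paper, not its implementation:

```python
from mpi4py import MPI
import numpy as np

# Linux cpufreq interface; writing requires the "userspace" governor and root.
CPUFREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_setspeed"

def set_frequency(khz):
    with open(CPUFREQ, "w") as f:
        f.write(str(khz))

comm = MPI.COMM_WORLD
n = comm.Get_size()
send = np.arange(n, dtype="i")   # one element destined for each rank
recv = np.empty(n, dtype="i")

set_frequency(1_200_000)         # scale down: this phase is network-bound
comm.Alltoall(send, recv)        # the energy-relevant collective operation
set_frequency(2_400_000)         # restore nominal frequency for compute phases
```

The paper's finer-grained contribution is applying such scaling inside the algorithmic steps of the collective itself, rather than only around the whole call as sketched here.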
Multiple hypothesis tracking for cluttered biological image sequences.
Chenouard, Nicolas; Bloch, Isabelle; Olivo-Marin, Jean-Christophe
2013-11-01
In this paper, we present a method for simultaneously tracking thousands of targets in biological image sequences, which is of major importance in modern biology. The complexity and inherent randomness of the problem lead us to propose a unified probabilistic framework for tracking biological particles in microscope images. The framework includes realistic models of particle motion and existence and of fluorescence image features. For the track extraction process per se, the very cluttered conditions motivate the adoption of a multiframe approach that enforces tracking decision robustness to poor imaging conditions and to random target movements. We tackle the large-scale nature of the problem by adapting the multiple hypothesis tracking algorithm to the proposed framework, resulting in a method with a favorable tradeoff between the model complexity and the computational cost of the tracking procedure. When compared to the state-of-the-art tracking techniques for bioimaging, the proposed algorithm is shown to be the only method providing high-quality results despite the critically poor imaging conditions and the dense target presence. We thus demonstrate the benefits of advanced Bayesian tracking techniques for the accurate computational modeling of dynamical biological processes, which is promising for further developments in this domain.
Ribot, Miguel Angel; Botteron, Cyril; Farine, Pierre-André
2016-01-01
The use of reflected Global Navigation Satellite Systems' (GNSS) signals in Earth observation applications, referred to as GNSS reflectometry (GNSS-R), has already been studied for more than two decades. However, the estimation precision that can be achieved by GNSS-R sensors in some particular scenarios is still not fully understood. In an effort to partially fill this gap, in this paper, we compute the Cramér–Rao bound (CRB) for the specific case of static ground-based GNSS-R receivers and scenarios where the coherent component of the reflected signal is dominant. We compute the CRB for GNSS signals with different modulations: GPS L1 C/A and GPS L5 I/Q, which use binary phase-shift keying, and Galileo E1 B/C and E5, which use the binary offset carrier. The CRB for these signals is evaluated as a function of the receiver bandwidth and different scenario parameters, such as the height of the receiver or the properties of the reflection surface. The CRB computation presented considers observation times of up to several tens of seconds, during which the satellite elevation angle changes significantly. Finally, the results obtained show the theoretical benefit of using modern GNSS signals with GNSS-R techniques over long observation times, such as the interference pattern technique. PMID:27929388
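For reference, the generic form of the Cramér–Rao bound invoked in studies like this one (the paper specializes it to its GNSS-R signal model, which is not reproduced in this abstract) is the inverse of the Fisher information matrix:

```latex
\operatorname{cov}(\hat{\boldsymbol{\theta}}) \succeq \mathbf{I}(\boldsymbol{\theta})^{-1},
\qquad
[\mathbf{I}(\boldsymbol{\theta})]_{ij}
  = -\,\mathbb{E}\!\left[
      \frac{\partial^{2} \ln p(\mathbf{x};\boldsymbol{\theta})}
           {\partial \theta_{i}\,\partial \theta_{j}}
    \right]
```

where p(x; θ) is the likelihood of the observations x and the matrix inequality holds for the covariance of any unbiased estimator of θ.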
Adaptive wall technology for minimization of wall interferences in transonic wind tunnels
NASA Technical Reports Server (NTRS)
Wolf, Stephen W. D.
1988-01-01
Modern experimental techniques to improve free air simulations in transonic wind tunnels by use of adaptive wall technology are reviewed. Considered are the significant advantages of adaptive wall testing techniques with respect to wall interferences, Reynolds number, tunnel drive power, and flow quality. The application of these testing techniques relies on making the test section boundaries adjustable and using a rapid wall adjustment procedure. A historical overview shows how the disjointed development of these testing techniques, since 1938, is closely linked to available computer support. An overview of Adaptive Wall Test Section (AWTS) designs shows a preference for use of relatively simple designs with solid adaptive walls in 2- and 3-D testing. Operational aspects of AWTS's are discussed with regard to production type operation where adaptive wall adjustments need to be quick. Both 2- and 3-D data are presented to illustrate the quality of AWTS data over the transonic speed range. Adaptive wall technology is available for general use in 2-D testing, even in cryogenic wind tunnels. In 3-D testing, more refinement of the adaptive wall testing techniques is required before more widespread use can be planned.
Computational ecology as an emerging science
Petrovskii, Sergei; Petrovskaya, Natalia
2012-01-01
It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been much less appreciated, however, that the context of modelling and simulations in ecology is essentially different from that in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that the application of sophisticated computational methods usually requires a clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have a mathematically accurate and unambiguous description, and available field data are often very noisy; hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling, such as convergence and stability, become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336
ERIC Educational Resources Information Center
Prosise, Jeff
This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…
Fingerprinting sea-level variations in response to continental ice loss: a benchmark exercise
NASA Astrophysics Data System (ADS)
Barletta, Valentina R.; Spada, Giorgio; Riva, Riccardo E. M.; James, Thomas S.; Simon, Karen M.; van der Wal, Wouter; Martinec, Zdenek; Klemann, Volker; Olsson, Per-Anders; Hagedoorn, Jan; Stocchi, Paolo; Vermeersen, Bert
2013-04-01
Understanding the response of the Earth to the waxing and waning ice sheets is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements to projections of future sea level trends in response to climate change. All the processes accompanying Glacial Isostatic Adjustment (GIA) can be described by solving the so-called Sea Level Equation (SLE), an integral equation that accounts for the interactions between the ice sheets, the solid Earth, and the oceans. Modern approaches to the SLE are based on various techniques that range from purely analytical formulations to fully numerical methods. Here we present the results of a benchmark exercise of independently developed codes designed to solve the SLE. The study involves predictions of current sea level changes due to present-day ice mass loss. In spite of the differences in the methods employed, the comparison shows that a significant number of GIA modellers can reproduce their sea-level computations within 2% for well defined, large-scale present-day ice mass changes. Smaller and more detailed loads need further, dedicated benchmarking and high-resolution computation. This study shows how the details of the implementation and the input specifications are an important, and often underappreciated, aspect. Hence this represents a step toward assessing the reliability of sea level projections obtained with benchmarked SLE codes.
Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures
2015-09-01
Analysis of Defenses Against Code Reuse Attacks on Modern and New Architectures, by Isaac Noah Evans. Submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Engineering in Electrical Engineering and Computer Science. From the thesis: a control-flow-graph (CFG) analysis may lack soundness or completeness; an incomplete analysis will produce extra edges in the CFG that might allow an attacker to slip through, while an unsound analysis...
Body metaphors--reading the body in contemporary culture.
Skara, Danica
2004-01-01
This paper addresses the linguistic reframing of the human body in contemporary culture. Our aim is to provide a linguistic description of the ways in which the body is represented in the modern English language. First, we focus on body metaphors in general, drawing on a collected sample of 300 words and phrases functioning as body metaphors in modern English. Reading the symbolism of the body, we are witnessing changes in the basic metaphorical structuring of the human body. The results show that new vocabulary binds different fields of knowledge associated with machines and human beings according to a shared textual frame: the human-as-computer and computer-as-human metaphor. Humans are almost blended with computers and vice versa. This metaphorical use of the human body and its parts reveals not only currents of unconscious thought but also the structures of modern society and culture.
Federal Aviation Administration : challenges in modernizing the agency
DOT National Transportation Integrated Search
2000-02-01
FAA's efforts to implement initiatives in five key areas-air traffic control modernization, procurement and personnel reform, aviation safety, aviation and computer security, and financial management-have met with limited success. For example, FAA ha...
ERIC Educational Resources Information Center
Sliva, William R.
1977-01-01
The size of modern dairy plant operations has led to extreme specialization in product manufacturing, milk processing, microbiological analysis, chemical and mathematical computations. Morrisville Agricultural and Technical College, New York, has modernized its curricula to meet these changes. (HD)
Addison's Disease Revisited in Poland: Year 2008 versus Year 1990
Kasperlik-Zaluska, Anna A.; Czarnocka, Barbara; Jeske, Wojciech; Papierska, Lucyna
2010-01-01
This study aimed at comparing two groups of patients with Addison's disease: group A, comprising 180 patients described in 1991, and group B, consisting of 138 patients registered since 1991. The incidence of coexisting autoimmune disorders was evaluated and etiological factors were analyzed. Immunological and imaging studies (computed tomography in group B) were performed. Adrenal autoantibodies were examined by an indirect immunofluorescence technique in group A, and by an assay measuring autoantibodies against steroid 21-hydroxylase in group B. Adrenal autoantibodies were revealed in 37% of patients examined by the immunofluorescence method and in 63% investigated by the modern technique. Tuberculosis was found in 52 patients in group A and in two patients in group B; metastatic infiltration of the adrenals was detected by CT in 16 patients. Probable autoimmune Addison's disease was diagnosed in 125/180 patients (69%) in group A and in 116/138 patients (84%) in group B. PMID:21188237
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zingone, Gaetano; Licata, Vincenzo; Calogero, Cucchiara
2008-07-08
The present work fits into the theme of seismic prevention for the protection of the monumental patrimony made up of churches with drum domes. Specifically, for a church in the historic area of Catania, chosen as a monument exemplifying the typology examined, the seismic behavior is analyzed in the linear field using modern dynamic identification techniques. The dynamically identified computational model made it possible to identify the macro-element most at risk, the dome-drum system. For this system the behavior in the nonlinear field is analyzed through dynamic tests on large-scale models in the presence of various types of improving reinforcement. The results are used to appraise the contribution afforded by each of them and to choose the most suitable type of reinforcement, optimizing the stiffness/ductility ratio of the system.
NASA's program on icing research and technology
NASA Technical Reports Server (NTRS)
Reinmann, John J.; Shaw, Robert J.; Ranaudo, Richard J.
1989-01-01
NASA's program in aircraft icing research and technology is reviewed. The program relies heavily on computer codes and modern applied physics technology in seeking icing solutions on a finer scale than those offered in earlier programs. Three major goals of this program are to offer new approaches to ice protection, to improve our ability to model the response of an aircraft to an icing encounter, and to provide improved techniques and facilities for ground and flight testing. This paper reviews the following program elements: (1) new approaches to ice protection; (2) numerical codes for deicer analysis; (3) measurement and prediction of ice accretion and its effect on aircraft and aircraft components; (4) special wind tunnel test techniques for rotorcraft icing; (5) improvements of icing wind tunnels and research aircraft; (6) ground de-icing fluids used in winter operation; (7) fundamental studies in icing; and (8) droplet sizing instruments for icing clouds.
Ridge Regression Signal Processing
NASA Technical Reports Server (NTRS)
Kuhl, Mark R.
1990-01-01
The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
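As background for the estimator discussed above, a minimal sketch of the basic (non-recursive) ridge estimate is shown below; the near-collinear design matrix mimics "poor geometry", and all data are synthetic and illustrative:

```python
import numpy as np

def ridge_estimate(X, y, lam):
    """Ridge regression: beta = (X^T X + lam * I)^{-1} X^T y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
X[:, 2] = X[:, 1] + 1e-3 * rng.normal(size=100)   # near-collinear columns
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)

print("OLS:  ", ridge_estimate(X, y, 0.0))   # unstable under collinearity
print("ridge:", ridge_estimate(X, y, 1.0))   # biased but better conditioned
```

The regularization term trades a small bias for a large reduction in variance, which is the same stabilizing effect the report exploits in its linearized recursive ridge estimator.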
Coherent Anti-Stokes Raman Spectroscopic Thermometry in a Supersonic Combustor
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Danehy, P. M.; Springer, R. R.; OByrne, S.; Capriotti, D. P.; DeLoach, R.
2003-01-01
An experiment has been conducted to acquire data for the validation of computational fluid dynamics codes used in the design of supersonic combustors. The flow in a supersonic combustor, consisting of a diverging duct with a single downstream-angled wall injector, is studied. The combustor entrance Mach number is 2 and the enthalpy nominally corresponds to Mach 7 flight. The primary measurement technique is coherent anti-Stokes Raman spectroscopy, but surface pressures and temperatures have also been acquired. Modern design-of-experiments techniques have been used to maximize the quality of the data set (for the given level of effort) and to minimize systematic errors. Temperature maps are obtained at several planes in the flow for a case in which the combustor is piloted by injecting fuel upstream of the main injector and one case in which it is not piloted. Boundary conditions and uncertainties are characterized.
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
ERIC Educational Resources Information Center
Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.
2014-01-01
Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…
Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics
ERIC Educational Resources Information Center
Ciampa, Mark
2013-01-01
Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…
Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria
Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...
Modern Electronic Devices: An Increasingly Common Cause of Skin Disorders in Consumers.
Corazza, Monica; Minghetti, Sara; Bertoldi, Alberto Maria; Martina, Emanuela; Virgili, Annarosa; Borghi, Alessandro
2016-01-01
The modern conveniences and enjoyment brought about by electronic devices bring with them some health concerns. In particular, personal electronic devices are responsible for rising cases of several skin disorders, including pressure- and friction-related lesions, contact dermatitis, and other physical dermatitides. The universal use of such devices, whether for work or recreational purposes, will probably increase the occurrence of polymorphous skin manifestations over time. It is important for clinicians to consider electronics as potential sources of dermatological ailments for proper patient management. We performed a literature review on skin disorders associated with the personal use of modern technology, including personal computers and laptops, personal computer accessories, mobile phones, tablets, video games, and consoles.
Wood, T J; Moore, C S; Stephens, A; Saunderson, J R; Beavis, A W
2015-09-01
Given the increasing use of computed tomography (CT) in the UK over the last 30 years, it is essential to ensure that all imaging protocols are optimised to keep radiation doses as low as reasonably practicable, consistent with the intended clinical task. However, the complexity of modern CT equipment can make this task difficult to achieve in practice. Recent results of local patient dose audits have shown discrepancies between two Philips CT scanners that use the DoseRight 2.0 automatic exposure control (AEC) system in the 'automatic' mode of operation. The use of this system can result in drifting dose and image quality performance over time as it is designed to evolve based on operator technique. The purpose of this study was to develop a practical technique for configuring examination protocols on four CT scanners that use the DoseRight 2.0 AEC system in the 'manual' mode of operation. This method used a uniform phantom to generate reference images which form the basis for how the AEC system calculates exposure factors for any given patient. The results of this study have demonstrated excellent agreement in the configuration of the CT scanners in terms of average patient dose and image quality when using this technique. This work highlights the importance of CT protocol harmonisation in a modern Radiology department to ensure both consistent image quality and radiation dose. Following this study, the average radiation dose for a range of CT examinations has been reduced without any negative impact on clinical image quality.
Liao, Wei-Ching; Chuang, Min-Chieh; Ho, Ja-An Annie
2013-12-15
Genetic modification (GM), one of the modern biomolecular engineering technologies, has been deemed a profitable strategy in the fight against global starvation. Yet rapid and reliable analytical methods to evaluate the quality and potential risk of the resulting GM products are lacking. We herein present a biomolecular analytical system constructed with distinct biochemical activities to expedite the computational detection of genetically modified organisms (GMOs). The computational mechanism provides an alternative to the complex procedures commonly involved in the screening of GMOs. Given that the bioanalytical system is capable of processing promoter, coding and species genes, affirmative interpretations succeed in identifying a specified GM event in both electrochemical and optical fashions. The biomolecular computational assay exhibits detection capability for genetically modified DNA below the sub-nanomolar level and is found to be interference-free in the abundant coexistence of non-GM DNA. This bioanalytical system, furthermore, operates in an array fashion, allowing multiplex screening against variable GM events. Such a biomolecular computational assay and biosensor holds great promise for rapid, cost-effective, and high-fidelity screening of GMOs. Copyright © 2013 Elsevier B.V. All rights reserved.
Ultra-Structure database design methodology for managing systems biology data and analyses
Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C
2009-01-01
Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
Vids: Version 2.0 Alpha Visualization Engine
2018-04-25
… fidelity than existing efforts. Vids is a project aimed at producing more dynamic and interactive visualization tools using modern computer game … move through and interact with the data to improve informational understanding. The Vids software leverages off-the-shelf modern game development … analysis and correlations. Recently, an ARL-pioneered project named Virtual Reality Data Analysis Environment (VRDAE) used VR and a modern game engine …
NASA Astrophysics Data System (ADS)
Rakvic, Ryan N.; Ives, Robert W.; Lira, Javier; Molina, Carlos
2011-01-01
General purpose computer designers have recently begun adding cores to their processors in order to increase performance. For example, Intel has adopted a homogeneous quad-core processor as a base for general purpose computing. PlayStation3 (PS3) game consoles contain a multicore heterogeneous processor known as the Cell, which is designed to perform complex image processing algorithms at a high level. Can modern image-processing algorithms utilize these additional cores? On the other hand, modern advancements in configurable hardware, most notably field-programmable gate arrays (FPGAs) have created an interesting question for general purpose computer designers. Is there a reason to combine FPGAs with multicore processors to create an FPGA multicore hybrid general purpose computer? Iris matching, a repeatedly executed portion of a modern iris-recognition algorithm, is parallelized on an Intel-based homogeneous multicore Xeon system, a heterogeneous multicore Cell system, and an FPGA multicore hybrid system. Surprisingly, the cheaper PS3 slightly outperforms the Intel-based multicore on a core-for-core basis. However, both multicore systems are beaten by the FPGA multicore hybrid system by >50%.
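A minimal multicore sketch of the iris-matching kernel the paper parallelizes: fractional Hamming distance between a probe iris code and a gallery, split across worker processes. The 2048-bit code length follows common practice in iris recognition and, like all the data here, is an assumption rather than a detail from the paper:

```python
import numpy as np
from multiprocessing import Pool

def match_chunk(args):
    """Fractional Hamming distance between one probe and a gallery chunk."""
    probe, gallery_chunk = args
    # XOR then count differing bits, normalized by the code length.
    return np.count_nonzero(gallery_chunk ^ probe, axis=1) / probe.size

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    probe = rng.integers(0, 2, 2048, dtype=np.uint8)
    gallery = rng.integers(0, 2, (100_000, 2048), dtype=np.uint8)

    chunks = np.array_split(gallery, 4)          # one chunk per core
    with Pool(4) as pool:
        dists = np.concatenate(
            pool.map(match_chunk, [(probe, c) for c in chunks]))
    print("best match:", dists.argmin(), "distance:", dists.min())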
Identification of key ancestors of modern germplasm in a breeding program of maize.
Technow, F; Schrag, T A; Schipprack, W; Melchinger, A E
2014-12-01
Probabilities of gene origin computed from the genomic kinship matrix can accurately identify key ancestors of modern germplasm. Identifying the key ancestors of modern plant breeding populations can provide valuable insights into the history of a breeding program and provide reference genomes for next-generation whole genome sequencing. In an animal breeding context, a method was developed that employs probabilities of gene origin, computed from the pedigree-based additive kinship matrix, for identifying key ancestors. Because reliable and complete pedigree information is often not available in plant breeding, we replaced the additive kinship matrix with the genomic kinship matrix. As a proof of concept, we applied this approach to simulated data sets with known ancestries. The relative contribution of the ancestral lines to later generations could be determined with high accuracy, with and without selection. Our method was subsequently used to identify the key ancestors of the modern Dent germplasm of the public maize breeding program of the University of Hohenheim. We found that the modern germplasm can be traced back to six or seven key ancestors, with one or two of them having a disproportionately large contribution. These results largely corroborated conjectures based on early records of the breeding program. We conclude that probabilities of gene origin computed from the genomic kinship matrix can be used to identify key ancestors in breeding programs and to estimate the proportion of genes contributed by them.
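A minimal sketch of building a genomic kinship matrix from marker data, using the common VanRaden-style construction; this is one standard choice and not necessarily the exact matrix the authors used, and the marker data below are simulated:

```python
import numpy as np

def genomic_kinship(M):
    """VanRaden-style genomic kinship from a (lines x markers) 0/1/2 matrix."""
    p = M.mean(axis=0) / 2.0                 # allele frequency per marker
    Z = M - 2.0 * p                          # center by expected genotype
    return (Z @ Z.T) / (2.0 * np.sum(p * (1.0 - p)))

rng = np.random.default_rng(2)
M = rng.integers(0, 3, size=(8, 500))        # 8 lines, 500 SNP markers
K = genomic_kinship(M)
print(np.round(K[:3, :3], 2))                # pairwise genomic relationships
```

Such a matrix substitutes for the pedigree-based additive kinship matrix in the probabilities-of-gene-origin computation when pedigrees are incomplete.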
Web-Based Computational Chemistry Education with CHARMMing I: Lessons and Tutorial
Miller, Benjamin T.; Singh, Rishi P.; Schalk, Vinushka; Pevzner, Yuri; Sun, Jingjun; Miller, Carrie S.; Boresch, Stefan; Ichiye, Toshiko; Brooks, Bernard R.; Woodcock, H. Lee
2014-01-01
This article describes the development, implementation, and use of web-based “lessons” to introduce students and other newcomers to computer simulations of biological macromolecules. These lessons, i.e., interactive step-by-step instructions for performing common molecular simulation tasks, are integrated into the collaboratively developed CHARMM INterface and Graphics (CHARMMing) web user interface (http://www.charmming.org). Several lessons have already been developed with new ones easily added via a provided Python script. In addition to CHARMMing's new lessons functionality, web-based graphical capabilities have been overhauled and are fully compatible with modern mobile web browsers (e.g., phones and tablets), allowing easy integration of these advanced simulation techniques into coursework. Finally, one of the primary objections to web-based systems like CHARMMing has been that “point and click” simulation set-up does little to teach the user about the underlying physics, biology, and computational methods being applied. In response to this criticism, we have developed a freely available tutorial to bridge the gap between graphical simulation setup and the technical knowledge necessary to perform simulations without user interface assistance. PMID:25057988
NASA Astrophysics Data System (ADS)
Shadid, J. N.; Smith, T. M.; Cyr, E. C.; Wildey, T. M.; Pawlowski, R. P.
2016-09-01
A critical aspect of applying modern computational solution methods to complex multiphysics systems of relevance to nuclear reactor modeling is the assessment of the predictive capability of specific proposed mathematical models. In this respect, understanding numerical error and the sensitivity of the solution to parameters associated with input data, boundary condition uncertainty, and mathematical models is critical. Additionally, the ability to evaluate and/or approximate the model efficiently, to allow development of a reasonable level of statistical diagnostics of the mathematical model and the physical system, is of central importance. In this study we report on initial efforts to apply integrated adjoint-based computational analysis and automatic differentiation tools to begin to address these issues. The study is carried out in the context of a Reynolds averaged Navier-Stokes approximation to turbulent fluid flow and heat transfer using a particular spatial discretization based on implicit fully-coupled stabilized FE methods. Initial results are presented that show the promise of these computational techniques in the context of nuclear reactor relevant prototype thermal-hydraulics problems.
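A minimal sketch of the automatic-differentiation ingredient, using JAX reverse-mode (adjoint-mode) differentiation; the scalar "response" below is a toy stand-in for a discretized quantity of interest, not the paper's RANS solver, and the parameter names are invented for illustration:

```python
import jax
import jax.numpy as jnp

def response(params):
    # Toy quantity of interest depending on two uncertain model inputs.
    viscosity, inflow = params
    return jnp.sin(inflow) / viscosity + inflow ** 2

params = jnp.array([0.9, 1.3])
sensitivities = jax.grad(response)(params)   # reverse-mode (adjoint) derivatives
print("dQ/d(viscosity), dQ/d(inflow):", sensitivities)
```

The attraction of the adjoint approach is that the cost of obtaining all parameter sensitivities is roughly one extra solve, independent of the number of parameters, which is what makes statistical diagnostics of large models tractable.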
NASA Astrophysics Data System (ADS)
Meitzler, Thomas J.
The field of computer vision interacts with fields such as psychology, vision research, machine vision, psychophysics, mathematics, physics, and computer science. The focus of this thesis is new algorithms and methods for the computation of the probability of detection (Pd) of a target in a cluttered scene. The scene can be either a natural visual scene such as one sees with the naked eye (visual), or a scene displayed on a monitor with the help of infrared sensors. The relative clutter and the temperature difference between the target and background (ΔT) are defined and then used to calculate a relative signal-to-clutter ratio (SCR), from which the Pd is calculated for a target in a cluttered scene. It is shown how this definition can subsume many previous definitions of clutter and ΔT. Next, fuzzy and neuro-fuzzy techniques are used to calculate the Pd, and it is shown how these methods can give results that correlate well with experiment. The experimental design for measuring the Pd of a target by observers is described. Finally, wavelets are applied to the calculation of clutter, and it is shown how this new wavelet-based definition of clutter can be used to compute the Pd of a target.
NASA Astrophysics Data System (ADS)
Mitchell, Justin Chadwick
2011-12-01
Using light to probe the structure of matter is as natural as opening our eyes. Modern physics and chemistry have turned this art into a rich science, measuring the delicate interactions possible at the molecular level. Perhaps the most commonly used tool in computational spectroscopy is matrix diagonalization. While this is invaluable for calculating everything from molecular structure and energy levels to dipole moments and dynamics, the process of numerical diagonalization is an opaque one. This work applies symmetry and semi-classical techniques to elucidate numerical spectral analysis for high-symmetry molecules. Semi-classical techniques, such as Potential Energy Surfaces, have long been used to help understand molecular vibronic and rovibronic spectra and dynamics. This investigation focuses on newer semi-classical techniques that apply Rotational Energy Surfaces (RES) to rotational energy level clustering effects in high-symmetry molecules. Such clusters exist in rigid rotor molecules as well as deformable spherical tops. This study begins by using the simplicity of rigid symmetric top molecules to clarify the classical-quantum correspondence of RES semi-classical analysis and then extends it to a more precise and complete theory of modern high-resolution spectra. RES analysis is extended to molecules having more complex and higher rank tensorial rotational and rovibrational Hamiltonians than were possible to understand before. Such molecules are shown to produce an extraordinary range of rotational level clusters, corresponding to a panoply of symmetries ranging from C4v to C2 and C1 (no symmetry) with a corresponding range of new angular momentum localization and J-tunneling effects. Using RES topography analysis and the commutation duality relations between symmetry group operators in the lab-frame and those in the body-frame, it is shown how to better describe and catalog complex splittings found in rotational level clusters. Symmetry character analysis is generalized to give analytic eigensolutions. An appendix provides vibrational analogies. For the first time, interactions between molecular vibrations (polyads) are described semi-classically by multiple RES. This is done for the ν3/2ν4 dyad of CF4. The nine-surface RES topology of the U(9) dyad agrees with both computational and experimental work. A connection between this and a simpler U(2) example is detailed in an Appendix.
ERIC Educational Resources Information Center
de Mestre, Neville
2004-01-01
Computers were invented to help mathematicians perform long and complicated calculations more efficiently. By the time that a computing area became a familiar space in primary and secondary schools, the initial motivation for computer use had been submerged in the many other functions that modern computers now accomplish. Not only the mathematics…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J
Organizations maintaining mainframe legacy software can benefit from code modernization and the incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with the MITRE Corporation on a demonstration project to compute the behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase: defining functional semantics for IBM Assembly instructions and conducting behavior computation experiments.
NASA Astrophysics Data System (ADS)
Esquef, Paulo A. A.
The first reproducible recording of the human voice was made in 1877 on a tinfoil cylinder phonograph devised by Thomas A. Edison. Since then, much effort has been expended to find better ways to record and reproduce sounds. By the mid-1920s, the first electrical recordings had appeared and gradually took over from purely acoustic recordings. The development of electronic computers, in conjunction with the ability to record data onto magnetic or optical media, culminated in the standardization of the compact disc format in 1980. Nowadays, digital technology is applied to several audio applications, not only to improve the quality of modern and old recording/reproduction techniques, but also to trade off sound quality for less storage space and less taxing transmission capacity requirements.
How do particle physicists learn the programming concepts they need?
NASA Astrophysics Data System (ADS)
Kluth, S.; Pia, M. G.; Schoerner-Sadenius, T.; Steinbach, P.
2015-12-01
The ability to read, use and develop code efficiently and successfully is a key ingredient in modern particle physics. We report the experience of a training program, identified as “Advanced Programming Concepts”, that introduces the software concepts, methods and techniques needed to work effectively on a daily basis in a HEP experiment or other programming-intensive fields. This paper illustrates the principles, motivations and methods that shape the “Advanced Programming Concepts” training program, the knowledge base that it conveys, an analysis of the feedback received so far, and the integration of these concepts into the software development process of the experiments, as well as its applicability to a wider audience.
Modular, bluetooth enabled, wireless electroencephalograph (EEG) platform.
Lovelace, Joseph A; Witt, Tyler S; Beyette, Fred R
2013-01-01
A design for a modular, compact, and accurate wireless electroencephalograph (EEG) system is proposed. EEG is the only non-invasive measure of neuronal function of the brain. Using a number of digital signal processing (DSP) techniques, these neuronal signals can be acquired and processed into meaningful representations of brain activity. The system described here uses Bluetooth to wirelessly transmit the digitized brain signal for use by an end application. In this way, the system is portable, and modular in terms of the devices to which it can interface. The Brain Computer Interface (BCI) has become a popular extension of EEG systems in modern research. This design serves as a platform for applications using BCI capability.
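A minimal sketch of the receiving side of such a platform: reading samples over a Bluetooth serial link and band-pass filtering them to an EEG band of interest. The port name, baud rate, sampling rate, and 16-bit little-endian packet format are all assumptions for illustration, not details of the paper's hardware:

```python
import numpy as np
import serial                      # pyserial, assumed installed
from scipy.signal import butter, lfilter

FS = 250                           # assumed sampling rate, Hz
b, a = butter(4, [8, 13], btype="bandpass", fs=FS)   # alpha band, 8-13 Hz

# Bluetooth RFCOMM link exposed as a serial port (Linux naming assumed).
with serial.Serial("/dev/rfcomm0", 115200, timeout=1) as port:
    raw = port.read(2 * FS * 2)    # ~two seconds of 16-bit samples

samples = np.frombuffer(raw, dtype="<i2").astype(float)
alpha = lfilter(b, a, samples)     # filtered trace for the end application
print("alpha-band RMS:", np.sqrt(np.mean(alpha ** 2)))
```

Keeping the DSP on the receiving device is one way such a platform stays modular: the headset only digitizes and streams, while any host can swap in its own processing.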
Computational overlay metrology with adaptive data analytics
NASA Astrophysics Data System (ADS)
Schmitt-Weaver, Emil; Subramony, Venky; Ullah, Zakir; Matsunobu, Masazumi; Somasundaram, Ravin; Thomas, Joel; Zhang, Linmiao; Thul, Klaus; Bhattacharyya, Kaustuve; Goossens, Ronald; Lambregts, Cees; Tel, Wim; de Ruiter, Chris
2017-03-01
With photolithography as the fundamental patterning step in the modern nanofabrication process, every wafer within a semiconductor fab will pass through a lithographic apparatus multiple times. With more than 20,000 sensors producing more than 700 GB of data per day across multiple subsystems, the combination of a light source and lithographic apparatus provides a massive amount of information for data analytics. This paper outlines how data analysis tools and techniques that extend insight into data traditionally considered unmanageably large, known as adaptive analytics, can be used to detect small process-dependent wafer-to-wafer changes in overlay from data collected before the wafer is exposed.
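One simple way to flag small wafer-to-wafer drifts in a noisy metric is an EWMA control chart; the sketch below is a generic illustration with simulated overlay values and arbitrary tuning constants, not the paper's analytics pipeline:

```python
import numpy as np

def ewma_alarm(x, lam=0.2, k=3.0):
    """Flag drift when an EWMA of the metric leaves a k-sigma band."""
    mu, sigma = x[:20].mean(), x[:20].std()   # baseline from early wafers
    limit = k * sigma * np.sqrt(lam / (2 - lam))  # asymptotic EWMA std dev
    z, alarms = mu, []
    for xi in x:
        z = lam * xi + (1 - lam) * z          # exponentially weighted average
        alarms.append(abs(z - mu) > limit)
    return np.array(alarms)

rng = np.random.default_rng(3)
overlay = rng.normal(0.0, 1.0, 200)           # simulated per-wafer overlay metric
overlay[120:] += 1.5                          # small injected process shift
print("first alarm at wafer:", np.argmax(ewma_alarm(overlay)))
```

The EWMA's memory (lam) trades sensitivity to small sustained shifts against false alarms from single noisy wafers, which is the essential tension in pre-exposure drift detection.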
Eigenspace techniques for active flutter suppression
NASA Technical Reports Server (NTRS)
Garrard, W. L.
1982-01-01
Mathematical models to be used in the control system design were developed. A computer program, which takes aerodynamic and structural data for the ARW-2 aircraft and converts these data into state space models suitable for use in modern control synthesis procedures, was developed. Reduced order models of inboard and outboard control surface actuator dynamics and a second order vertical wind gust model were developed. An analysis of the rigid body motion of the ARW-2 was conducted. The deletion of the aerodynamic lag states in the rigid body modes resulted in more accurate values for the eigenvalues associated with the plunge and pitch modes than were obtainable if the lag states were retained.
Astronomical Optical Interferometry. I. Methods and Instrumentation
NASA Astrophysics Data System (ADS)
Jankov, S.
2010-12-01
The previous decade has seen the achievement of large interferometric projects including 8-10 m telescopes and 100 m class baselines. Modern computer and control technology has enabled the interferometric combination of light from separate telescopes also in the visible and infrared regimes. Imaging with milli-arcsecond (mas) resolution and astrometry with micro-arcsecond (μas) precision have thus become reality. Here, I review the methods and instrumentation corresponding to the current state of the field of astronomical optical interferometry. First, this review summarizes the development from the pioneering works of Fizeau and Michelson. Next, the fundamental observables are described, followed by a discussion of the basic design principles of modern interferometers. The basic interferometric techniques such as speckle and aperture masking interferometry, aperture synthesis and nulling interferometry are discussed as well. Using the experience of past and existing facilities to illustrate important points, I consider particularly the new generation of large interferometers that has recently been commissioned (most notably, the CHARA, Keck, VLT and LBT Interferometers). Finally, I discuss the longer-term future of optical interferometry, including the possibilities of new large-scale ground-based projects and prospects for space interferometry.
Bol'shakov, O P
2008-01-01
Modern data on the studying and teaching of topographic and clinical anatomy in Russia and in foreign countries at the boundary between the XX and XXI centuries are analyzed. Definitions of some concepts are given; methodological and organizational bases of studying topographic and clinical anatomy are examined in a historical aspect. Various approaches to the teaching and studying of these disciplines in different countries are demonstrated. Special attention is given to the use of new technologies in teaching; the experience of virtual modes of studying applied anatomy and surgical techniques is critically evaluated. The article presents the author's own opinion and analyzes the views of foreign authors on the necessity of a rational combination of computer and other modern technologies with traditional methods of work using biological materials and experiments on laboratory animals. The longstanding experience of the departments of operative surgery and clinical anatomy is summarized and the benefits of the national system of teaching applied (topographic and clinical) anatomy are shown. Modern tendencies and priorities in the development of topographic and clinical anatomy are demonstrated.
New developments in supra-threshold perimetry.
Henson, David B; Artes, Paul H
2002-09-01
To describe a series of recent enhancements to supra-threshold perimetry. Computer simulations were used to develop an improved algorithm (HEART) for the setting of the supra-threshold test intensity at the beginning of a field test, and to evaluate the relationship between various pass/fail criteria and the test's performance (sensitivity and specificity) and how they compare with modern threshold perimetry. Data were collected in optometric practices to evaluate HEART and to assess how the patient's response times can be analysed to detect false positive response errors in visual field test results. The HEART algorithm shows improved performance (reduced between-eye differences) over current algorithms. A pass/fail criterion of '3 stimuli seen of 3-5 presentations' at each test location reduces test/retest variability and combines high sensitivity and specificity. A large percentage of false positive responses can be detected by comparing their latencies to the average response time of a patient. Optimised supra-threshold visual field tests can perform as well as modern threshold techniques. Such tests may be easier to perform for novice patients, compared with the more demanding threshold tests.
SoAx: A generic C++ Structure of Arrays for handling particles in HPC codes
NASA Astrophysics Data System (ADS)
Homann, Holger; Laenen, Francois
2018-03-01
The numerical study of physical problems often requires integrating the dynamics of a large number of particles evolving according to a given set of equations. Particles are characterized by the information they carry, such as an identity, a position and other properties. There are, generally speaking, two different possibilities for handling particles in high performance computing (HPC) codes. The concept of an Array of Structures (AoS) is in the spirit of the object-oriented programming (OOP) paradigm in that the particle information is implemented as a structure. Here, an object (realization of the structure) represents one particle and a set of many particles is stored in an array. In contrast, using the concept of a Structure of Arrays (SoA), a single structure holds several arrays, each representing one property (such as the identity) of the whole set of particles. The AoS approach is often implemented in HPC codes due to its handiness and flexibility. For a class of problems, however, it is known that the performance of SoA is much better than that of AoS. We confirm this observation for our particle problem. Using a benchmark we show that on modern Intel Xeon processors the SoA implementation is typically several times faster than the AoS one. On Intel's MIC co-processors the performance gap even attains a factor of ten. The same is true for GPU computing, using both computational and multi-purpose GPUs. Combining performance and handiness, we present the library SoAx, which has optimal performance (on CPUs, MICs, and GPUs) while providing the same handiness as AoS. For this, SoAx uses modern C++ design techniques such as template metaprogramming, which allows code to be generated automatically for user-defined heterogeneous data structures.
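The AoS/SoA distinction the abstract draws can be illustrated outside C++; the following minimal Python/NumPy sketch (not the SoAx library itself, and with hypothetical particle fields) contrasts the two layouts:

    import numpy as np
    import time

    N = 1_000_000

    # Array of Structures: one object per particle (a dict, for brevity).
    aos = [{"id": i, "x": 0.0, "v": 1.0} for i in range(N)]

    # Structure of Arrays: one contiguous array per property.
    soa = {"id": np.arange(N), "x": np.zeros(N), "v": np.ones(N)}

    def advance_aos(particles, dt):
        # Touches every field of every object; poor cache utilization.
        for p in particles:
            p["x"] += p["v"] * dt

    def advance_soa(s, dt):
        # Contiguous, vectorizable access to exactly the fields needed.
        s["x"] += s["v"] * dt

    t0 = time.perf_counter(); advance_aos(aos, 0.1); t_aos = time.perf_counter() - t0
    t0 = time.perf_counter(); advance_soa(soa, 0.1); t_soa = time.perf_counter() - t0
    print(f"AoS: {t_aos:.3f}s  SoA: {t_soa:.3f}s")

The speed gap here comes from interpreter overhead as well as memory layout, but the access-pattern contrast is the same one the paper benchmarks on Xeons, MICs, and GPUs.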
Combining Distributed and Shared Memory Models: Approach and Evolution of the Global Arrays Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nieplocha, Jarek; Harrison, Robert J.; Kumar, Mukul
2002-07-29
Both shared memory and distributed memory models have advantages and shortcomings. The shared memory model is much easier to use but it ignores data locality/placement. Given the hierarchical nature of the memory subsystems in modern computers, this characteristic might have a negative impact on performance and scalability. Various techniques, such as code restructuring to increase data reuse and introducing blocking in data accesses, can address the problem and yield performance competitive with message passing [Singh], however at the cost of compromising the ease of use. Distributed memory models such as message passing or one-sided communication offer performance and scalability but they compromise ease of use. In this context, the message-passing model is sometimes referred to as "assembly programming for scientific computing". The Global Arrays toolkit [GA1, GA2] attempts to offer the best features of both models. It implements a shared-memory programming model in which data locality is managed explicitly by the programmer. This management is achieved by explicit calls to functions that transfer data between a global address space (a distributed array) and local storage. In this respect, the GA model has similarities to the distributed shared-memory models that provide an explicit acquire/release protocol. However, the GA model acknowledges that remote data is slower to access than local data and allows data locality to be explicitly specified and hence managed. The GA model exposes to the programmer the hierarchical memory of modern high-performance computer systems, and by recognizing the communication overhead for remote data transfer, it promotes data reuse and locality of reference. This paper describes the characteristics of the Global Arrays programming model, capabilities of the toolkit, and discusses its evolution.
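As a rough illustration of the programming model described above (not the actual Global Arrays API), the following toy Python sketch mimics the explicit global-to-local transfer discipline on a single process; the class and method names are hypothetical:

    import numpy as np

    class ToyGlobalArray:
        """Toy stand-in for the GA model: a 'global' 1-D array split into
        per-rank blocks, accessed only through explicit get/put transfers."""
        def __init__(self, n, nranks):
            self.blocks = np.array_split(np.zeros(n), nranks)

        def get(self, lo, hi):
            # Gather [lo, hi) from whichever blocks own it into local storage.
            flat = np.concatenate(self.blocks)
            return flat[lo:hi].copy()

        def put(self, lo, data):
            # Scatter a local buffer back into the owning blocks.
            flat = np.concatenate(self.blocks)
            flat[lo:lo + len(data)] = data
            sizes = [len(b) for b in self.blocks]
            self.blocks = np.split(flat, np.cumsum(sizes)[:-1])

    ga = ToyGlobalArray(16, nranks=4)
    local = ga.get(4, 8)        # explicit transfer: global -> local
    local += 1.0                # compute on the local copy
    ga.put(4, local)            # explicit transfer: local -> global

The point of the sketch is the discipline, not the mechanics: all access to shared data goes through explicit get/put calls, which is how the GA model keeps locality visible to the programmer.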
NASA Astrophysics Data System (ADS)
Tucker, G. E.
1997-05-01
This NSF supported program, emphasizing hands-on learning and observation with modern instruments, is described in its pilot phase, prior to being launched nationally. A group of 14 year old students are using a small (21 cm) computer controlled telescope and CCD camera to do: (1) a 'sky survey' of brighter celestial objects, finding, identifying, and learning about them, and accumulating a portfolio of images, (2) photometry of variable stars, reducing the data to get a light curve, and (3) learning modern computer-based communication/dissemination skills by posting images and data to a Web site they are designing (http://www.javanet.com/ sky) and contributing data to archives (e.g. AAVSO) via the Internet. To attract more interest to astronomy and science in general and have a wider impact on the school and surrounding community, peer teaching is used as a pedagogical technique and families are encouraged to participate. Students teach, e.g., astronomy, software and computers, the Internet, instrumentation, and observing to other students, parents and the community by means of daytime presentations of their results (images and data) and evening public viewing at the telescope, operating the equipment themselves. Students can contribute scientifically significant data and experience the 'discovery' aspect of science through observing projects where a measurement is made. Their 'informal education' activities also help improve the perception of science in general and astronomy in particular in society at large. This program could benefit from collaboration with astronomers wanting to organize geographically distributed observing campaigns coordinated over the Internet and willing to advise on promising observational programs for small telescopes in the context of current science.
STORM WATER MANAGEMENT MODEL (SWMM) MODERNIZATION
The U.S. Environmental Protection Agency's Water Supply and Water Resources Division has partnered with the consulting firm CDM to redevelop and modernize the Storm Water Management Model (SWMM). In the initial phase of this project EPA rewrote SWMM's computational engine usi...
Torres, Anna; Staśkiewicz, Grzegorz J; Lisiecka, Justyna; Pietrzyk, Łukasz; Czekajlo, Michael; Arancibia, Carlos U; Maciejewski, Ryszard; Torres, Kamil
2016-05-06
A wide variety of medical imaging techniques pervade modern medicine, and the changing portability and performance of tools like ultrasound imaging have brought these medical imaging techniques into the everyday practice of many specialties outside of radiology. However, proper interpretation of ultrasonographic and computed tomographic images requires the practitioner to not only hone certain technical skills, but to command an excellent knowledge of sectional anatomy and an understanding of the pathophysiology of the examined areas as well. Yet throughout many medical curricula there is often a large gap between traditional anatomy coursework and clinical training in imaging techniques. The authors present a radiological anatomy course developed to teach sectional anatomy with particular emphasis on ultrasonography and computed tomography, while incorporating elements of medical simulation. To assess students' overall opinions about the course and to examine its impact on their self-perceived improvement in their knowledge of radiological anatomy, anonymous evaluation questionnaires were provided to the students. The questionnaires were prepared using standard survey methods. A five-point Likert scale was applied to evaluate agreement with statements regarding the learning experience. The majority of students considered the course very useful and beneficial in terms of improving three-dimensional and cross-sectional knowledge of anatomy, as well as for developing practical skills in ultrasonography and computed tomography. The authors found that a small-group, hands-on teaching model in radiological anatomy was perceived as useful both by the students and the clinical teachers involved in their clinical education. In addition, the model was introduced using relatively few resources and only two faculty members. Anat Sci Educ 9: 295-303. © 2015 American Association of Anatomists.
CFD Simulations in Support of Shuttle Orbiter Contingency Abort Aerodynamic Database Enhancement
NASA Technical Reports Server (NTRS)
Papadopoulos, Periklis E.; Prabhu, Dinesh; Wright, Michael; Davies, Carol; McDaniel, Ryan; Venkatapathy, E.; Wercinski, Paul; Gomez, R. J.
2001-01-01
Modern Computational Fluid Dynamics (CFD) techniques were used to compute aerodynamic forces and moments of the Space Shuttle Orbiter in specific portions of contingency abort trajectory space. The trajectory space covers a Mach number range of 3.5-15, an angle-of-attack range of 20°-60°, an altitude range of 100-190 kft, and several different settings of the control surfaces (elevons, body flap, and speed brake). Presented here are details of the methodology and comparisons of computed aerodynamic coefficients against the values in the current Orbiter Operational Aerodynamic Data Book (OADB). While approximately 40 cases have been computed, only a sampling of the results is provided here. The computed results, in general, are in good agreement with the OADB data (i.e., within the uncertainty bands) for almost all the cases. However, in a limited number of high angle-of-attack cases (at Mach 15), there are significant differences between the computed results, especially the vehicle pitching moment, and the OADB data. A preliminary analysis of the data from the CFD simulations at Mach 15 shows that these differences can be attributed to real-gas/Mach number effects. The aerodynamic coefficients and detailed surface pressure distributions of the present simulations are being used by the Shuttle Program in the evaluation of the capabilities of the Orbiter in contingency abort scenarios.
A History of Computer Numerical Control.
ERIC Educational Resources Information Center
Haggen, Gilbert L.
Computer numerical control (CNC) has evolved from the first significant counting method--the abacus. Babbage had perhaps the greatest impact on the development of modern day computers with his analytical engine. Hollerith's functioning machine with punched cards was used in tabulating the 1890 U.S. Census. In order for computers to become a…
Super-resolution imaging applied to moving object tracking
NASA Astrophysics Data System (ADS)
Swalaganata, Galandaru; Ratna Sulistyaningrum, Dwi; Setiyono, Budi
2017-10-01
Moving object tracking in a video is a method used to detect and analyze changes that occur in an object being observed. High visual quality and precise localization of the tracked target are desired in modern tracking systems. Because the tracked object does not always appear clearly, the tracking result is often imprecise; causes include low-quality video, system noise, small object size, and other factors. In order to improve the precision of the tracked object, especially for small objects, we propose a two-step solution that integrates a super-resolution technique into the tracking approach. The first step applies super-resolution imaging to the frame sequence, cropping several frames or all of them. The second step tracks the super-resolved images. Super-resolution is a technique for obtaining high-resolution images from low-resolution ones; here a single-frame super-resolution technique is proposed for the tracking approach, since it has the advantage of fast computation time. The method used for tracking is Camshift, whose advantages are a simple calculation based on an HSV color histogram and robustness under some conditions where the color of the object varies. The computational complexity and large memory requirements of super-resolution and tracking were reduced, and the precision of the tracked target was good. Experiments showed that integrating super-resolution imaging into the tracking technique can track the object precisely across various backgrounds, shape changes of the object, and good lighting conditions.
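The tracking half of the pipeline can be sketched with OpenCV's standard Camshift calls; the minimal Python example below follows the usual histogram back-projection recipe (the file name, initial window, and the placement of the super-resolution step are hypothetical):

    import cv2
    import numpy as np

    cap = cv2.VideoCapture("input.avi")          # hypothetical video path
    ok, frame = cap.read()
    x, y, w, h = 200, 150, 40, 40                # hypothetical initial window

    # Hue histogram of the target region, as Camshift expects.
    roi = frame[y:y + h, x:x + w]
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    window = (x, y, w, h)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # (A super-resolution step on `frame` would go here, per the paper.)
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        box, window = cv2.CamShift(back, window, term)
        pts = cv2.boxPoints(box).astype(int)
        cv2.polylines(frame, [pts], True, 255, 2)   # draw rotated track box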
Computed tomography automatic exposure control techniques in 18F-FDG oncology PET-CT scanning.
Iball, Gareth R; Tout, Deborah
2014-04-01
Computed tomography (CT) automatic exposure control (AEC) systems are now used in all modern PET-CT scanners. A collaborative study was undertaken to compare AEC techniques of the three major PET-CT manufacturers for fluorine-18 fluorodeoxyglucose half-body oncology imaging. An audit of 70 patients was performed for half-body CT scans taken on a GE Discovery 690, Philips Gemini TF and Siemens Biograph mCT (all 64-slice CT). Patient demographic and dose information was recorded and image noise was calculated as the SD of Hounsfield units in the liver. A direct comparison of the AEC systems was made by scanning a Rando phantom on all three systems for a range of AEC settings. The variation in dose and image quality with patient weight was significantly different for all three systems, with the GE system showing the largest variation in dose with weight and Philips the least. Image noise varied with patient weight in Philips and Siemens systems but was constant for all weights in GE. The z-axis mA profiles from the Rando phantom demonstrate that these differences are caused by the nature of the tube current modulation techniques applied. The mA profiles varied considerably according to the AEC settings used. CT AEC techniques from the three manufacturers yield significantly different tube current modulation patterns and hence deliver different doses and levels of image quality across a range of patient weights. Users should be aware of how their system works and of steps that could be taken to optimize imaging protocols.
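The study's noise metric is simple enough to state in code; a minimal sketch (synthetic slice, hypothetical ROI coordinates) computes it as the SD of Hounsfield units in a liver region:

    import numpy as np

    def image_noise(ct_slice_hu, roi):
        """Image noise as the standard deviation of Hounsfield units
        inside a liver ROI, following the study's definition."""
        r0, r1, c0, c1 = roi
        return float(np.std(ct_slice_hu[r0:r1, c0:c1]))

    # Hypothetical slice and ROI coordinates, for illustration only.
    ct = np.random.normal(60, 12, size=(512, 512))   # liver is roughly 60 HU
    print(f"noise = {image_noise(ct, (200, 260, 200, 260)):.1f} HU")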
ERIC Educational Resources Information Center
Papadopoulos, Ioannis
2010-01-01
The issue of the area of irregular shapes is absent from the modern mathematical textbooks in elementary education in Greece. However, there exists a collection of books written for educational purposes by famous Greek scholars dating from the eighteenth century, which propose certain techniques concerning the estimation of the area of such…
Tag-to-Tag Interference Suppression Technique Based on Time Division for RFID.
Khadka, Grishma; Hwang, Suk-Seung
2017-01-01
Radio-frequency identification (RFID) is a tracking technology that enables immediate automatic object identification and rapid data sharing for a wide variety of modern applications, using radio waves for data transmission from a tag to a reader. RFID is already well established in technical areas, and many companies have developed corresponding standards and measurement techniques. In the construction industry, effective monitoring of materials and equipment is an important task, and RFID helps to improve monitoring and controlling capabilities, in addition to enabling automation for construction projects. However, on construction sites there are many tagged objects, and multiple RFID tags may interfere with each other's communications. This reduces the reliability and efficiency of the RFID system. In this paper, we propose an anti-collision algorithm for communication between multiple tags and a reader. In order to suppress interference signals from multiple neighboring tags, the proposed algorithm employs the time-division (TD) technique, in which tags in the interrogation zone are assigned specific time slots so that, at every instant in time, the reader communicates with a tag in its own slot. We present representative computer simulation examples to illustrate the performance of the proposed anti-collision technique for multiple RFID tags.
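A toy Python simulation conveys the scheduling idea; the slot-assignment rule here is hypothetical, not the paper's algorithm:

    # Toy time-division schedule: each tag in the interrogation zone gets
    # a dedicated slot, so at most one tag responds at any instant.
    def td_schedule(tag_ids, n_slots):
        # Hypothetical assignment: slot index = tag id modulo slot count.
        return {tag: tag % n_slots for tag in tag_ids}

    def read_cycle(schedule, n_slots):
        for slot in range(n_slots):
            responders = [t for t, s in schedule.items() if s == slot]
            # With one tag per slot there is no collision to resolve.
            print(f"slot {slot}: reader hears {responders}")

    tags = [101, 102, 103, 104]
    read_cycle(td_schedule(tags, n_slots=4), n_slots=4)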
Ettinger, Kyle S; Alexander, Amy E; Arce, Kevin
2018-04-10
Virtual surgical planning (VSP), computer-aided design and computer-aided modeling, and 3-dimensional printing are 3 distinct technologies that have become increasingly used in head and neck oncology and microvascular reconstruction. Although each of these technologies has long been used for treatment planning in other surgical disciplines, such as craniofacial surgery, trauma surgery, temporomandibular joint surgery, and orthognathic surgery, its widespread use in head and neck reconstructive surgery remains a much more recent event. In response to the growing trend of VSP being used for the planning of fibular free flaps in head and neck reconstruction, some surgeons have questioned the technology's implementation based on its inadequacy in addressing other reconstructive considerations beyond hard tissue anatomy. Detractors of VSP for head and neck reconstruction highlight its lack of capability in accounting for multiple reconstructive factors, such as recipient vessel selection, vascular pedicle reach, need for dead space obliteration, and skin paddle perforator location. It is with this premise in mind that the authors report on a straightforward technique for anatomically localizing peroneal artery perforators during VSP for osteocutaneous fibular free flaps in which bone and a soft tissue skin paddle are required for ablative reconstruction. The technique allows for anatomic perforator localization during the VSP session based solely on data existent at preoperative computed tomographic angiography (CTA); it does not require any modifications to preoperative clinical workflows. It is the authors' presumption that many surgeons in the field are unaware of this planning capability within the context of modern VSP for head and neck reconstruction. The primary purpose of this report is to introduce and further familiarize surgeons with the technique of CTA perforator localization as a method of improving intraoperative fidelity for VSP of osteocutaneous fibular free flaps. Copyright © 2018. Published by Elsevier Inc.
Image analysis and machine learning for detecting malaria.
Poostchi, Mahdieh; Silamut, Kamolrat; Maude, Richard J; Jaeger, Stefan; Thoma, George
2018-04-01
Malaria remains a major burden on global health, with roughly 200 million cases worldwide and more than 400,000 deaths per year. Besides biomedical research and political efforts, modern information technology is playing a key role in many attempts at fighting the disease. One of the barriers toward a successful mortality reduction has been inadequate malaria diagnosis in particular. To improve diagnosis, image analysis software and machine learning methods have been used to quantify parasitemia in microscopic blood slides. This article gives an overview of these techniques and discusses the current developments in image analysis and machine learning for microscopic malaria diagnosis. We organize the different approaches published in the literature according to the techniques used for imaging, image preprocessing, parasite detection and cell segmentation, feature computation, and automatic cell classification. Readers will find the different techniques listed in tables, with the relevant articles cited next to them, for both thin and thick blood smear images. We also discuss the latest developments in sections devoted to deep learning and smartphone technology for future malaria diagnosis. Published by Elsevier Inc.
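As a minimal sketch of the quantity these pipelines ultimately report, parasitemia can be computed from per-cell classifier output; the labels below are hypothetical:

    def parasitemia(cell_labels):
        """Parasitemia as the fraction of infected red blood cells among
        all segmented cells, the quantity the reviewed systems estimate."""
        infected = sum(1 for label in cell_labels if label == "infected")
        return infected / len(cell_labels)

    # Hypothetical per-cell labels from an automatic cell classifier.
    labels = ["healthy"] * 95 + ["infected"] * 5
    print(f"parasitemia = {parasitemia(labels):.1%}")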
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chironis, N.P.
This book contains a wealth of valuable information carefully selected and compiled from recent issues of Coal Age magazine. Much of the source material has been gathered by Coal Age Editors during their visits to coal mines, research establishments, universities and technical symposiums. Equally important are the articles and data contributed by over 50 top experts, many of whom are well known to the mining industry. Specifically, this easy-to-use handbook is divided into eleven key areas of underground mining. Here you will find the latest information on continuous mining techniques, longwall and shortwall methods and equipment, specialized mining and boring systems, continuous haulage techniques, improved roof control and ventilation methods, mine communications and instrumentation, power systems, fire control methods, and new mining regulations. There is also a section on engineering and management considerations, including the modern use of computer terminals, practical techniques for picking leaders and for encouraging more safety consciousness in employees, factors affecting absenteeism, and some highly important financial considerations. All of this valuable information has been thoroughly indexed to provide immediate access to the specific data needed by the reader.
Ultrasonic Evaluation and Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, Susan L.; Anderson, Michael T.; Diaz, Aaron A.
2015-10-01
Ultrasonic evaluation of materials for material characterization and flaw detection is as simple as manually moving a single-element probe across a specimen and looking at an oscilloscope display in real time or as complex as automatically (under computer control) scanning a phased-array probe across a specimen and collecting encoded data for immediate or off-line data analyses. The reliability of the results in the second technique is greatly increased because of a higher density of measurements per scanned area and measurements that can be more precisely related to the specimen geometry. This chapter will briefly discuss applications of the collection of spatially encoded data and focus primarily on the off-line analyses in the form of data imaging. Pacific Northwest National Laboratory (PNNL) has been involved with assessing and advancing the reliability of inservice inspections of nuclear power plant components for over 35 years. Modern ultrasonic imaging techniques such as the synthetic aperture focusing technique (SAFT), phased-array (PA) technology and sound field mapping have undergone considerable improvements to effectively assess and better understand material constraints.
Bio-inspired color sketch for eco-friendly printing
NASA Astrophysics Data System (ADS)
Safonov, Ilia V.; Tolstaya, Ekaterina V.; Rychagov, Michael N.; Lee, Hokeun; Kim, Sang Ho; Choi, Donchul
2012-01-01
Saving toner/ink is an important task in modern printing devices, with a positive ecological and social impact. We propose a technique for converting print-job pictures to recognizable and pleasant color sketches. Drawing a "pencil sketch" from a photo relates to a special area in image processing and computer graphics: non-photorealistic rendering. We describe a new approach for automatic sketch generation which allows us to create well-recognizable sketches and to partly preserve the colors of the initial picture. Our sketches contain significantly fewer color dots than the initial images, and this helps to save toner/ink. Our bio-inspired approach is based on a sophisticated edge detection technique for mask creation, followed by multiplication of the source image, with increased contrast, by this mask. To construct the mask we use DoG edge detection, which results from blending the initial image with its blurred copy through an alpha-channel created from a Saliency Map according to the pre-attentive human vision model. Measurement of the percentage of saved toner and a user study prove the effectiveness of the proposed technique for toner saving in an eco-friendly printing mode.
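A stripped-down sketch of the masking idea (without the saliency-map alpha channel, and with hypothetical file names and thresholds) might look like this in Python/OpenCV:

    import cv2
    import numpy as np

    img = cv2.imread("photo.jpg")                       # hypothetical input file
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Approximate DoG response: image minus its Gaussian-blurred copy
    # (the paper additionally weights the blend by a saliency map).
    blur = cv2.GaussianBlur(gray, (0, 0), sigmaX=3.0)
    dog = gray - blur
    mask = (np.abs(dog) > 8).astype(np.float32)         # hypothetical threshold

    # Multiply a contrast-boosted source by the mask: ink lands only near
    # edges, preserving some color while leaving most of the page blank.
    boosted = cv2.convertScaleAbs(img, alpha=1.3, beta=0)
    sketch = (boosted * mask[..., None]).astype(np.uint8)
    cv2.imwrite("sketch.png", sketch)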
High performance flight computer developed for deep space applications
NASA Technical Reports Server (NTRS)
Bunker, Robert L.
1993-01-01
The development of an advanced space flight computer for real time embedded deep space applications, which embodies the lessons learned on Galileo and modern computer technology, is described. The requirements are listed and the design implementation that meets those requirements is described. The development of SPACE-16 (Spaceborne Advanced Computing Engine) (where 16 designates the databus width) was initiated to support the MM2 (Mariner Mark 2) project. The computer is based on a radiation hardened emulation of a modern 32 bit microprocessor and its family of support devices, including a high performance floating point accelerator. Additional custom devices, which include a coprocessor to improve input/output capabilities, a memory interface chip, and an additional support chip that provides management of all fault tolerant features, are described. Detailed supporting analyses and rationale which justify specific design and architectural decisions are provided. The six chip types were designed and fabricated. Testing and evaluation of a brassboard was initiated.
ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING
The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...
Evaluation of Turbulence-Model Performance in Jet Flows
NASA Technical Reports Server (NTRS)
Woodruff, S. L.; Seiner, J. M.; Hussaini, M. Y.; Erlebacher, G.
2001-01-01
The importance of reducing jet noise in both commercial and military aircraft applications has made jet acoustics a significant area of research. A technique for jet noise prediction commonly employed in practice is the MGB approach, based on the Lighthill acoustic analogy. This technique requires as aerodynamic input mean flow quantities and turbulence quantities like the kinetic energy and the dissipation. The purpose of the present paper is to assess existing capabilities for predicting these aerodynamic inputs. Two modern Navier-Stokes flow solvers, coupled with several modern turbulence models, are evaluated by comparison with experiment for their ability to predict mean flow properties in a supersonic jet plume. Potential weaknesses are identified for further investigation. Another comparison with similar intent is discussed by Barber et al. The ultimate goal of this research is to develop a reliable flow solver applicable to the low-noise, propulsion-efficient, nozzle exhaust systems being developed in NASA focused programs. These programs address a broad range of complex nozzle geometries operating in high temperature, compressible, flows. Seiner et al. previously discussed the jet configuration examined here. This convergent-divergent nozzle with an exit diameter of 3.6 inches was designed for an exhaust Mach number of 2.0 and a total temperature of 1680 F. The acoustic and aerodynamic data reported by Seiner et al. covered a range of jet total temperatures from 104 F to 2200 F at the fully-expanded nozzle pressure ratio. The aerodynamic data included centerline mean velocity and total temperature profiles. Computations were performed independently with two computational fluid dynamics (CFD) codes, ISAAC and PAB3D. Turbulence models employed include the k-epsilon model, the Gatski-Speziale algebraic-stress model and the Girimaji model, with and without the Sarkar compressibility correction. Centerline values of mean velocity and mean temperature are compared with experimental data.
Feasibility of Tactical Air Delivery Resupply Using Gliders
2016-12-01
... the "Sparrow," using modern design and manufacturing techniques including AutoCAD, 3D printing, laser cutting and CorelDraw, and conducting field testing and ... the desired point(s) of impact due to the atmospheric three-dimensional (3D) wind and density field encountered by the descending load under canopy
Modern Education in China. Bulletin, 1919, No. 44
ERIC Educational Resources Information Center
Edmunds, Charles K.
1919-01-01
The Chinese conception of life's values is so different from that of western peoples that they have failed to develop modern technique and scientific knowledge. Now that they have come to see the value of these, rapid and fundamental changes are taking place. When modern scientific knowledge is added to the skill which the Chinese already have in…
Interaction techniques for radiology workstations: impact on users' productivity
NASA Astrophysics Data System (ADS)
Moise, Adrian; Atkins, M. Stella
2004-04-01
As radiologists progress from reading images presented on film to modern computer systems with images presented on high-resolution displays, many new problems arise. Although the digital medium has many advantages, the radiologist's job becomes cluttered with many new tasks related to image manipulation. This paper presents our solution for supporting radiologists' interpretation of digital images by automating image presentation during sequential interpretation steps. Our method supports scenario-based interpretation, which groups data temporally, according to the mental paradigm of the physician. We extended current hanging protocols with support for "stages". A stage reflects the presentation of digital information required to complete a single step within a complex task. We demonstrated the benefits of staging in a user study with 20 lay subjects involved in a visual conjunctive search for targets, similar to a radiology task of identifying anatomical abnormalities. We designed a task and a set of stimuli which allowed us to simulate the interpretation workflow from a typical radiology scenario: reading a chest computed radiography exam when a prior study is also available. The simulation was possible by abstracting the radiologist's task and the basic workstation navigation functionality. We introduced "Stages," an interaction technique attuned to the radiologist's interpretation task. Compared to the traditional user interface, Stages generated a 14% reduction in average interpretation time.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
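Among the word-frequency measures the review covers, the D2 statistic is simply the inner product of two k-mer count vectors; a short Python sketch (with toy sequences) makes that concrete:

    from collections import Counter

    def kmer_counts(seq, k):
        """Count all overlapping words of length k in a sequence."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def d2(seq_a, seq_b, k=4):
        """D2 statistic: inner product of k-mer count vectors, an
        alignment-free similarity measure discussed in the review."""
        ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
        return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

    print(d2("ACGTACGTAC", "ACGTTTGTAC", k=3))

Because no positional alignment is computed, the cost is linear in sequence length, which is the practical advantage over dynamic programming that the review emphasizes.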
NASA Astrophysics Data System (ADS)
Andisheh-Tadbir, Mehdi; Orfino, Francesco P.; Kjeang, Erik
2016-04-01
Modern hydrogen powered polymer electrolyte fuel cells (PEFCs) utilize a micro-porous layer (MPL) consisting of carbon nanoparticles and polytetrafluoroethylene (PTFE) to enhance the transport phenomena and performance while reducing cost. However, the underlying mechanisms are not yet completely understood due to a lack of information about the detailed MPL structure and properties. In the present work, the 3D phase segregated nanostructure of an MPL is revealed for the first time through the development of a customized, non-destructive procedure for monochromatic nano-scale X-ray computed tomography visualization. Utilizing this technique, it is discovered that PTFE is situated in conglomerated regions distributed randomly within connected domains of carbon particles; hence, it is concluded that PTFE acts as a binder for the carbon particles and provides structural support for the MPL. Exposed PTFE surfaces are also observed that will aid the desired hydrophobicity of the material. Additionally, the present approach uniquely enables phase segregated calculation of effective transport properties, as reported herein, which is particularly important for accurate estimation of electrical and thermal conductivity. Overall, the new imaging technique and associated findings may contribute to further performance improvements and cost reduction in support of fuel cell commercialization for clean energy applications.
Python for Information Theoretic Analysis of Neural Data
Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano
2008-01-01
Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
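To illustrate the limited-sampling bias the authors describe, here is a plug-in entropy estimator in Python on synthetic responses; the deficit relative to the true entropy is the bias their correction methods address:

    import numpy as np

    def plugin_entropy(responses):
        """Plug-in (maximum likelihood) entropy estimate in bits. With few
        trials this is biased downward, which is the limited-sampling
        problem discussed in the article."""
        _, counts = np.unique(responses, return_counts=True)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))

    rng = np.random.default_rng(0)
    trials = rng.integers(0, 8, size=20)    # 20 trials, 8 response words
    print(f"H_plugin = {plugin_entropy(trials):.2f} bits (true H = 3.00)")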
GPU-BSM: A GPU-Based Tool to Map Bisulfite-Treated Reads
Manconi, Andrea; Orro, Alessandro; Manca, Emanuele; Armano, Giuliano; Milanesi, Luciano
2014-01-01
Cytosine DNA methylation is an epigenetic mark implicated in several biological processes. Bisulfite treatment of DNA is acknowledged as the gold standard technique to study methylation. This technique introduces changes in the genomic DNA by converting cytosines to uracils while 5-methylcytosines remain nonreactive. During PCR amplification 5-methylcytosines are amplified as cytosine, whereas uracils and thymines as thymine. To detect the methylation levels, reads treated with the bisulfite must be aligned against a reference genome. Mapping these reads to a reference genome represents a significant computational challenge mainly due to the increased search space and the loss of information introduced by the treatment. To deal with this computational challenge we devised GPU-BSM, a tool based on modern Graphics Processing Units. Graphics Processing Units are hardware accelerators that are increasingly being used successfully to accelerate general-purpose scientific applications. GPU-BSM is a tool able to map bisulfite-treated reads from whole genome bisulfite sequencing and reduced representation bisulfite sequencing, and to estimate methylation levels, with the goal of detecting methylation. Due to the massive parallelization obtained by exploiting graphics cards, GPU-BSM aligns bisulfite-treated reads faster than other cutting-edge solutions, while outperforming most of them in terms of unique mapped reads. PMID:24842718
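The loss of information introduced by the treatment is what the mapper must work around; one common trick, sketched below under simple assumptions (toy sequences, C-to-T strand only), is to collapse read and reference into the same reduced alphabet before alignment:

    def bisulfite_read_space(seq, strand="C2T"):
        """In-silico conversion used before mapping bisulfite reads:
        collapse C->T (or G->A for the reverse strand) in both read and
        reference so converted and unconverted cytosines align alike."""
        return seq.upper().replace("C", "T") if strand == "C2T" \
            else seq.upper().replace("G", "A")

    read = "TTGAATGTTCGA"       # bisulfite-treated read (most Cs read as T)
    ref  = "TTGAACGTTCGA"       # reference genome window
    print(bisulfite_read_space(read) == bisulfite_read_space(ref))  # True

The collapsed alphabet enlarges the set of candidate mapping locations, which is exactly the increased search space that motivates the GPU acceleration in GPU-BSM.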
NASA Astrophysics Data System (ADS)
Altıparmak, Hamit; Al Shahadat, Mohamad; Kiani, Ehsan; Dimililer, Kamil
2018-04-01
Robotic agriculture requires smart and practical techniques to substitute machine intelligence for human intelligence. Strawberry is an important Mediterranean product, and enhancing its productivity requires modern, machine-based methods. Whereas a human identifies disease-infected leaves by eye, the machine should also be capable of vision-based disease identification. The objective of this paper is to practically verify the applicability of a new computer-vision method for discrimination between healthy and disease-infected strawberry leaves which requires neither a neural network nor time-consuming training. The proposed method was tested under outdoor lighting conditions using a regular DSLR camera without any particular lens. Since the type and degree of infection are approximated as a human brain would, a fuzzy decision maker classifies the leaves in the images captured on-site, having the same properties as human vision. Optimizing the fuzzy parameters for a typical strawberry production area at a summer mid-day in Cyprus produced 96% accuracy for segmented iron deficiency and 93% accuracy for segmentation using a typical human instant classification approximation as the benchmark, a higher accuracy than a human eye identifier.
Combining Feature Extraction Methods to Assist the Diagnosis of Alzheimer's Disease.
Segovia, F; Górriz, J M; Ramírez, J; Phillips, C
2016-01-01
Neuroimaging data such as (18)F-FDG PET is widely used to assist the diagnosis of Alzheimer's disease (AD). Looking for regions with hypoperfusion/hypometabolism, clinicians may predict or corroborate the diagnosis of the patients. Modern computer aided diagnosis (CAD) systems based on the statistical analysis of whole neuroimages are more accurate than classical systems based on quantifying the uptake of some predefined regions of interest (ROIs). In addition, these new systems allow determining new ROIs and take advantage of the huge amount of information comprised in neuroimaging data. A major branch of modern CAD systems for AD is based on multivariate techniques, which analyse a neuroimage as a whole, considering not only the voxel intensities but also the relations among them. In order to deal with the vast dimensionality of the data, a number of feature extraction methods have been successfully applied. In this work, we propose a CAD system based on the combination of several feature extraction techniques. First, some commonly used feature extraction methods based on the analysis of variance (such as principal component analysis), on the factorization of the data (such as non-negative matrix factorization) and on classical magnitudes (such as Haralick features) were simultaneously applied to the original data. These feature sets were then combined by means of two different combination approaches: i) using a single classifier and a multiple kernel learning approach and ii) using an ensemble of classifiers and selecting the final decision by majority voting. The proposed approach was evaluated using a labelled neuroimaging database along with a cross validation scheme. In conclusion, the proposed CAD system performed better than approaches using only one feature extraction technique. We also provide a fair comparison (using the same database) of the selected feature extraction methods.
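The ensemble variant (ii) can be sketched with scikit-learn in a few lines; the data here are synthetic and the raw-feature branch merely stands in for the Haralick features, so this is an illustration of the combination scheme rather than the authors' implementation:

    import numpy as np
    from sklearn.decomposition import NMF, PCA
    from sklearn.ensemble import VotingClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    # Hypothetical data: 40 subjects x 500 voxel intensities, kept
    # non-negative so that NMF applies, with binary diagnostic labels.
    rng = np.random.default_rng(0)
    X = np.abs(rng.normal(size=(40, 500)))
    y = rng.integers(0, 2, size=40)

    # One classifier per feature-extraction method, combined by majority
    # voting; "raw" stands in for the Haralick-feature branch.
    ensemble = VotingClassifier([
        ("pca", make_pipeline(PCA(n_components=10), SVC())),
        ("nmf", make_pipeline(NMF(n_components=10, max_iter=500), SVC())),
        ("raw", SVC()),
    ], voting="hard")
    ensemble.fit(X, y)
    print(ensemble.predict(X[:5]))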
Detecting buried explosive hazards with handheld GPR and deep learning
NASA Astrophysics Data System (ADS)
Besaw, Lance E.
2016-05-01
Buried explosive hazards (BEHs), including traditional landmines and homemade improvised explosives, have proven difficult to detect and defeat during and after conflicts around the world. Despite their various sizes, shapes and construction material, ground penetrating radar (GPR) is an excellent phenomenology for detecting BEHs due to its ability to sense localized differences in electromagnetic properties. Handheld GPR detectors are common equipment for detecting BEHs because of their flexibility (in part due to the human operator) and effectiveness in cluttered environments. With modern digital electronics and positioning systems, handheld GPR sensors can sense and map variation in electromagnetic properties while searching for BEHs. Additionally, large-scale computers have demonstrated an insatiable appetite for ingesting massive datasets and extracting meaningful relationships. This is no more evident than the maturation of deep learning artificial neural networks (ANNs) for image and speech recognition now commonplace in industry and academia. This confluence of sensing, computing and pattern recognition technologies offers great potential to develop automatic target recognition techniques to assist GPR operators searching for BEHs. In this work deep learning ANNs are used to detect BEHs and discriminate them from harmless clutter. We apply these techniques to a multi-antennae, handheld GPR with centimeter-accurate positioning system that was used to collect data over prepared lanes containing a wide range of BEHs. This work demonstrates that deep learning ANNs can automatically extract meaningful information from complex GPR signatures, complementing existing GPR anomaly detection and classification techniques.
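A minimal stand-in for the pattern-recognition step (a small fully-connected network on synthetic GPR-like feature vectors, far shallower than the deep networks the paper uses) can be sketched with scikit-learn:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Hypothetical dataset: one row per GPR scan window (e.g. flattened
    # A-scan features); label 1 = buried target, 0 = harmless clutter.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(600, 64))
    X[:300, 20:28] += 1.5                      # synthetic "target" signature
    y = np.r_[np.ones(300), np.zeros(300)]

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                        random_state=0)
    net.fit(Xtr, ytr)
    print(f"holdout accuracy: {net.score(Xte, yte):.2f}")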
Computational Skills for Biology Students
ERIC Educational Resources Information Center
Gross, Louis J.
2008-01-01
This interview with Distinguished Science Award recipient Louis J. Gross highlights essential computational skills for modern biology, including: (1) teaching concepts listed in the Math & Bio 2010 report; (2) illustrating to students that jobs today require quantitative skills; and (3) resources and materials that focus on computational skills.
A profile of the demographics and training characteristics of professional modern dancers.
Weiss, David S; Shah, Selina; Burchette, Raoul J
2008-01-01
Modern dancers are a unique group of artists, performing a diverse repertoire in dance companies of various sizes. In this study, 184 professional modern dancers in the United States (males N=49, females N=135), including members of large and small companies as well as freelance dancers, were surveyed regarding their demographics and training characteristics. The mean age of the dancers was 30.1 +/- 7.3 years, and they had danced professionally for 8.9 +/- 7.2 years. The average Body Mass Index (BMI) was 23.6 +/- 2.4 for males and 20.5 +/- 1.7 for females. Females had started taking dance class earlier (age 6.5 +/- 4.2 years) as compared to males (age 15.6 +/- 6.2 years). Females were more likely to have begun their training in ballet, while males more often began with modern classes (55% and 51% respectively, p < 0.0001). The professional modern dancers surveyed spent 8.3 +/- 6.0 hours in class and 17.2 +/- 12.6 hours in rehearsal each week. Eighty percent took modern technique class and 67% reported that they took ballet technique class. The dancers who specified what modern technique they studied (N=84) reported between two and four different techniques. The dancers also participated in a multitude of additional exercise regimens for a total of 8.2 +/- 6.6 hours per week, with the most common types being Pilates, yoga, and upper body weightlifting. The dancers wore many different types of footwear, depending on the style of dance being performed. For modern dance alone, dancers wore 12 different types of footwear. Reflecting the diversity of the dancers and companies surveyed, females reported performing for 23.3 +/- 14.0 weeks (range: 2-52 weeks) per year; males reported performing 20.4 +/- 13.9 weeks (range: 1-40) per year. Only 18% of the dancers did not have any health insurance, with 54% having some type of insurance provided by their employer. However, 23% of the dancers purchased their own insurance, and 22% had insurance provided by their families. Only 16% of dancers reported that they had Workers' Compensation coverage, despite the fact that they were all professionals, including many employed by major modern dance companies across the United States. It is concluded that understanding the training profile of the professional modern dancer should assist healthcare providers in supplying appropriate medical care for these performers.
NASA Astrophysics Data System (ADS)
Burnett, W.
2016-12-01
The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and passing connecting data between ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the Navy's Hybrid Coordinate Ocean Model (HYCOM) scalability - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology, Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6 Petabyte system, of which Naval Oceanography receives 15% of operational computational system use, or approximately 1 Petabyte of the processing capability. The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.
Looking ahead in systems engineering
NASA Technical Reports Server (NTRS)
Feigenbaum, Donald S.
1966-01-01
Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.
Intraoperative computed tomography.
Tonn, J C; Schichor, C; Schnell, O; Zausinger, S; Uhl, E; Morhard, D; Reiser, M
2011-01-01
Intraoperative computed tomography (iCT) has gained increasing impact among modern neurosurgical techniques. Multislice CT with a sliding gantry in the OR provides excellent diagnostic image quality in the visualization of vascular lesions as well as bony structures including skull base and spine. Due to short acquisition times and a high spatial and temporal resolution, various modalities such as iCT-angiography, iCT-cerebral perfusion and the integration of intraoperative navigation with automatic re-registration after scanning can be performed. This allows a variety of applications, e.g. intraoperative angiography, intraoperative cerebral perfusion studies, update of cerebral and spinal navigation, stereotactic procedures as well as resection control in tumour surgery. Its versatility promotes its use in a multidisciplinary setting. Radiation exposure is comparable to standard CT systems outside the OR. For neurosurgical purposes, however, new hardware components (e.g. a radiolucent headholder system) had to be developed. Having a different range of applications compared to intraoperative MRI, it is an attractive modality for intraoperative imaging being comparatively easy to install and cost efficient.
Gauge backgrounds and zero-mode counting in F-theory
NASA Astrophysics Data System (ADS)
Bies, Martin; Mayrhofer, Christoph; Weigand, Timo
2017-11-01
Computing the exact spectrum of charged massless matter is a crucial step towards understanding the effective field theory describing F-theory vacua in four dimensions. In this work we further develop a coherent framework to determine the charged massless matter in F-theory compactified on elliptic fourfolds, and demonstrate its application in a concrete example. The gauge background is represented, via duality with M-theory, by algebraic cycles modulo rational equivalence. Intersection theory within the Chow ring allows us to extract coherent sheaves on the base of the elliptic fibration whose cohomology groups encode the charged zero-mode spectrum. The dimensions of these cohomology groups are computed with the help of modern techniques from algebraic geometry, which we implement in the software gap. We exemplify this approach in models with an Abelian and non-Abelian gauge group and observe jumps in the exact massless spectrum as the complex structure moduli are varied. An extended mathematical appendix gives a self-contained introduction to the algebro-geometric concepts underlying our framework.
Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications
NASA Astrophysics Data System (ADS)
Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu
2007-11-01
Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of high-quality RNG is essential for secure communications. In this report, a new RNG, which utilizes single-electron phenomena, is proposed. A room-temperature operating silicon single-electron transistor (SET) having nearby an electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET, and result in giant random telegraphic signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small-size and low-power properties, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.
Mass Storage and Retrieval at Rome Laboratory
NASA Technical Reports Server (NTRS)
Kann, Joshua L.; Canfield, Brady W.; Jamberdino, Albert A.; Clarke, Bernard J.; Daniszewski, Ed; Sunada, Gary
1996-01-01
As the speed and power of modern digital computers continues to advance, the demands on secondary mass storage systems grow. In many cases, the limitations of existing mass storage reduce the overall effectiveness of the computing system. Image storage and retrieval is one important area where improved storage technologies are required. Three dimensional optical memories offer the advantage of large data density, on the order of 1 Tb/cm^3, and faster transfer rates because of the parallel nature of optical recording. Such a system allows for the storage of multiple-Gbit sized images, which can be recorded and accessed at reasonable rates. Rome Laboratory is currently investigating several techniques to perform three-dimensional optical storage including holographic recording, two-photon recording, persistent spectral-hole burning, multi-wavelength DNA recording, and the use of bacteriorhodopsin as a recording material. In this paper, the current status of each of these on-going efforts is discussed. In particular, the potential payoffs as well as possible limitations are addressed.
TauFactor: An open-source application for calculating tortuosity factors from tomographic data
NASA Astrophysics Data System (ADS)
Cooper, S. J.; Bertei, A.; Shearing, P. R.; Kilner, J. A.; Brandon, N. P.
TauFactor is a MatLab application for efficiently calculating the tortuosity factor, as well as volume fractions, surface areas and triple phase boundary densities, from image based microstructural data. The tortuosity factor quantifies the apparent decrease in diffusive transport resulting from convolutions of the flow paths through porous media. TauFactor was originally developed to improve the understanding of electrode microstructures for batteries and fuel cells; however, the tortuosity factor has been of interest to a wide range of disciplines for over a century, including geoscience, biology and optics. It is still common practice to use correlations, such as that developed by Bruggeman, to approximate the tortuosity factor, but in recent years the increasing availability of 3D imaging techniques has spurred interest in calculating this quantity more directly. This tool provides a fast and accurate computational platform applicable to the big datasets (>10^8 voxels) typical of modern tomography, without requiring high computational power.
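For contrast with the direct calculation TauFactor performs, the Bruggeman-type correlation mentioned above reduces to a one-liner; the Python sketch below (with a random toy voxel array) uses the common tau = eps^(-1/2) form for spherical particles, an assumption on the exact exponent:

    import numpy as np

    def bruggeman_tau(volume_fraction):
        """Bruggeman correlation tau = eps^(-1/2), the approximation the
        paper says direct calculation on tomographic data replaces."""
        return volume_fraction ** -0.5

    # Volume fraction straight from a (hypothetical) binary voxel array.
    voxels = np.random.default_rng(0).random((100, 100, 100)) < 0.4
    eps = voxels.mean()
    print(f"eps = {eps:.3f}, Bruggeman tau = {bruggeman_tau(eps):.3f}")

The correlation ignores the actual geometry of the flow paths, which is precisely why direct computation on big tomographic datasets is preferable when imaging data are available.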
High Performance Radiation Transport Simulations on TITAN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Christopher G; Davidson, Gregory G; Evans, Thomas M
2012-01-01
In this paper we describe the Denovo code system. Denovo solves the six-dimensional, steady-state, linear Boltzmann transport equation, of central importance to nuclear technology applications such as reactor core analysis (neutronics), radiation shielding, nuclear forensics and radiation detection. The code features multiple spatial differencing schemes, state-of-the-art linear solvers, the Koch-Baker-Alcouffe (KBA) parallel-wavefront sweep algorithm for inverting the transport operator, a new multilevel energy decomposition method scaling to hundreds of thousands of processing cores, and a modern, novel code architecture that supports straightforward integration of new features. In this paper we discuss the performance of Denovo on the 10-20 petaflop ORNL GPU-based system, Titan. We describe algorithms and techniques used to exploit the capabilities of Titan's heterogeneous compute node architecture and the challenges of obtaining good parallel performance for this sparse hyperbolic PDE solver containing inherently sequential computations. Numerical results demonstrating Denovo performance on early Titan hardware are presented.
With Great Measurements Come Great Results
NASA Astrophysics Data System (ADS)
Williams, Carl
Measurements are the foundation for science and modern life. Technologies we take for granted every day depend on them: cell phones, CAT scans, pharmaceuticals, even sports equipment. Metrology, or measurement science, determines what industry can make reliably and what it cannot. At the National Institute of Standards and Technology (NIST) we specialize in making world-class measurements that an incredibly wide range of industries use to continually improve their products: computer chips with nanoscale components, atomic clocks that you can hold in your hand, lasers for both super-strong welds and delicate eye surgeries. Think of all the key technologies developed over the last 100 years, and better measurements, standards, or analysis techniques played a role in making them possible. NIST works collaboratively with industry researchers on the advanced metrology for tomorrow's technologies. A new kilogram based on electromagnetic force, cars that weigh half as much but are just as strong, quantum computers, personalized medicine, single-atom devices: it's all happening in our labs now. This talk will focus on how metrology creates the future.
Silva-Lopes, Victor W; Monteiro-Leal, Luiz H
2003-07-01
The development of new technology and the possibility of fast information delivery by either Internet or Intranet connections are changing education. Microanatomy education depends basically on the correct interpretation of microscopy images by students. Modern microscopes coupled to computers enable the presentation of these images in digital form through the creation of image databases. However, access to this new technology is restricted entirely to those living in cities and towns with an Information Technology (IT) infrastructure. This study describes the creation of a free Internet histology database composed of high-quality images and also presents an inexpensive way to supply it to a greater number of students through Internet/Intranet connections. By using state-of-the-art scientific instruments, we developed a Web page (http://www2.uerj.br/~micron/atlas/atlasenglish/index.htm) that, in association with a multimedia microscopy laboratory, intends to help reduce the IT educational gap between developed and underdeveloped regions. Copyright 2003 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Aboona, Bassam; Holt, Jeremy
2017-09-01
Chiral effective field theory provides a modern framework for understanding the structure and dynamics of nuclear many-body systems. Recent works have had much success in applying the theory to describe the ground- and excited-state properties of light and medium-mass atomic nuclei when combined with ab initio numerical techniques. Our aim is to extend the application of chiral effective field theory to describe the nuclear equation of state required for supercomputer simulations of core-collapse supernovae. Given the large range of densities, temperatures, and proton fractions probed during stellar core collapse, microscopic calculations of the equation of state require large computational resources on the order of one million CPU hours. We investigate the use of graphics processing units (GPUs) to significantly reduce the computational cost of these calculations, which will enable a more accurate and precise description of this important input to numerical astrophysical simulations. Cyclotron Institute at Texas A&M, NSF Grant: PHY 1659847, DOE Grant: DE-FG02-93ER40773.
Adaptive Strategies for Controls of Flexible Arms. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Yuan, Bau-San
1989-01-01
An adaptive controller for a modern manipulator has been designed based on asymptotic stability via the Lyapunov criterion, with the output error between the system and a reference model used as the actuating control signal. Computer simulations were carried out to test the design. The combination of the adaptive controller and a system vibration and mode shape estimator shows that the flexible arm can move along a pre-defined trajectory with high-speed motion and fast vibration settling time. An existing computer-controlled prototype two-link manipulator, RALF (Robotic Arm, Large Flexible), with a parallel mechanism driven by hydraulic actuators, was used to verify the mathematical analysis. The experimental results illustrate that assumed modes found from finite element techniques can be used to derive the equations of motion with acceptable accuracy. The robust adaptive (modal) control is implemented to compensate for unmodelled modes and nonlinearities and is compared with joint feedback control in additional experiments. Preliminary results show promise for the experimental control algorithm.
[Aerobic methylobacteria as promising objects of modern biotechnology].
Doronina, N V; Toronskava, L; Fedorov, D N; Trotsenko, Yu A
2015-01-01
The experimental data of the past decade concerning the metabolic peculiarities of aerobic methylobacteria and the prospects for their use in different fields of modern biotechnology, including genetic engineering techniques, have been summarized.
Adeshina, A M; Hashim, R
2017-03-01
Diagnostic radiology is a core and integral part of modern medicine, paving the way for primary care physicians in disease diagnosis, treatment and therapy management. All recent standard healthcare procedures have benefitted immensely from contemporary information technology revolutions, which have transformed approaches to acquiring, storing and sharing diagnostic data for efficient and timely diagnosis of diseases. The connected health network was introduced as an alternative to the ageing traditional concept of the healthcare system, improving hospital-physician connectivity and clinical collaboration. Undoubtedly, this modern approach to medicine has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, but the challenge of successfully encrypting both image and textual data persists. Furthermore, keeping the encryption-decryption time of medical datasets within a considerably lower computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secured radiology-diagnostic data framework for connected health networks using a high-performance GPU-accelerated Advanced Encryption Standard. The study was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patients' notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's strength in encrypting and decrypting textual data in the form of medical reports. Significantly, the framework is not only able to accurately encrypt and decrypt medical image datasets, but it also successfully encrypts and decrypts textual data in Microsoft Word, Microsoft Excel and Portable Document Formats, which are the conventional formats for documenting medical records. The entire encryption and decryption procedure was achieved at a lower computational cost using regular hardware and software resources, without compromising either the quality of the decrypted data or the security level of the algorithms.
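The paper's GPU kernels are not reproduced here, but the encrypt/decrypt round trip it describes can be sketched with a standard CPU implementation of AES; the Python example below uses the `cryptography` package's AES-GCM mode (the helper names and the placeholder record bytes are invented for illustration):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_record(key, plaintext, associated_data=b"patient-record"):
        # AES-GCM gives confidentiality plus integrity; the 96-bit nonce
        # must be unique per message and stored alongside the ciphertext.
        nonce = os.urandom(12)
        return nonce + AESGCM(key).encrypt(nonce, plaintext, associated_data)

    def decrypt_record(key, blob, associated_data=b"patient-record"):
        nonce, ct = blob[:12], blob[12:]
        return AESGCM(key).decrypt(nonce, ct, associated_data)

    key = AESGCM.generate_key(bit_length=256)
    blob = encrypt_record(key, b"MR slice bytes or report text...")
    assert decrypt_record(key, blob) == b"MR slice bytes or report text..."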
Knowledge-based computer systems for radiotherapy planning.
Kalet, I J; Paluszynski, W
1990-08-01
Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.
De Sanctis, Bianca; Krukov, Ivan; de Koning, A P Jason
2017-09-19
Determination of the age of an allele based on its population frequency is a well-studied problem in population genetics, for which a variety of approximations have been proposed. We present a new result that, surprisingly, allows the expectation and variance of allele age to be computed exactly (within machine precision) for any finite absorbing Markov chain model in a matter of seconds. This approach makes none of the classical assumptions (e.g., weak selection, reversibility, infinite sites), exploits modern sparse linear algebra techniques, integrates over all sample paths, and is rapidly computable for Wright-Fisher populations up to Ne = 100,000. With this approach, we study the joint effect of recurrent mutation, dominance, and selection, and demonstrate new examples of "selective strolls" where the classical symmetry of allele age with respect to selection is violated by weakly selected alleles that are older than neutral alleles at the same frequency. We also show evidence for a strong age imbalance, where rare deleterious alleles are expected to be substantially older than advantageous alleles observed at the same frequency when population-scaled mutation rates are large. These results highlight the under-appreciated utility of computational methods for the direct analysis of Markov chain models in population genetics.
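The exact-computation claim rests on standard absorbing-Markov-chain algebra: with Q the transient-to-transient block of the transition matrix, the expected time to absorption solves the sparse linear system (I - Q)t = 1. A minimal Python sketch with SciPy, on a toy five-state random walk rather than the paper's Wright-Fisher model:

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    def expected_absorption_time(Q):
        # Fundamental-matrix identity t = (I - Q)^{-1} 1, solved as a
        # sparse linear system rather than by explicit matrix inversion.
        n = Q.shape[0]
        A = sp.identity(n, format="csc") - Q.tocsc()
        return spla.spsolve(A, np.ones(n))

    # Toy random walk on {0..4}, absorbing at both ends; Q covers the
    # transient states 1, 2, 3 with jump probability 1/2 each way.
    Q = sp.csr_matrix([[0.0, 0.5, 0.0],
                       [0.5, 0.0, 0.5],
                       [0.0, 0.5, 0.0]])
    print(expected_absorption_time(Q))   # [3. 4. 3.]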
Impact of macroeconomic factors on university budgeting in the US and Russia
NASA Astrophysics Data System (ADS)
Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly
2017-10-01
This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in the areas of data science and machine learning have made it possible to utilise automated techniques to address problems ranging from genetic engineering and particle physics to sociology and economics. This paper is the first step toward creating a robust toolkit which will help universities withstand macroeconomic challenges utilising modern predictive analytics techniques.
Hastings, Janna; Chepelev, Leonid; Willighagen, Egon; Adams, Nico; Steinbeck, Christoph; Dumontier, Michel
2011-01-01
Cheminformatics is the application of informatics techniques to solve chemical problems in silico. There are many areas in biology where cheminformatics plays an important role in computational research, including metabolism, proteomics, and systems biology. One critical aspect of the application of cheminformatics in these fields is the accurate exchange of data, which is increasingly accomplished through the use of ontologies. Ontologies are formal representations of objects and their properties using a logic-based ontology language. Many such ontologies are currently being developed to represent objects across all the domains of science. Ontologies enable the definition, classification, and support for querying objects in a particular domain, enabling intelligent computer applications to be built which support the work of scientists both within the domain of interest and across interrelated neighbouring domains. Modern chemical research relies on computational techniques to filter and organise data to maximise research productivity. The objects which are manipulated in these algorithms and procedures, as well as the algorithms and procedures themselves, enjoy a kind of virtual life within computers. We will call these information entities. Here, we describe our work in developing an ontology of chemical information entities, with a primary focus on data-driven research and the integration of calculated properties (descriptors) of chemical entities within a semantic web context. Our ontology distinguishes algorithmic, or procedural, information from declarative, or factual, information, and places particular importance on annotating the provenance of calculated data. The Chemical Information Ontology is being developed as an open collaborative project. More details, together with a downloadable OWL file, are available at http://code.google.com/p/semanticchemistry/ (license: CC-BY-SA). PMID:21991315
Computational Thermochemistry of Jet Fuels and Rocket Propellants
NASA Technical Reports Server (NTRS)
Crawford, T. Daniel
2002-01-01
The design of new high-energy density molecules as candidates for jet and rocket fuels is an important goal of modern chemical thermodynamics. The NASA Glenn Research Center is home to a database of thermodynamic data for over 2000 compounds related to this goal, in the form of least-squares fits of heat capacities, enthalpies, and entropies as functions of temperature over the range 300-6000 K. The Chemical Equilibrium with Applications (CEA) program, written and maintained by researchers at NASA Glenn over the last fifty years, makes use of this database for modeling the performance of potential rocket propellants. During its long history, the NASA Glenn database has been developed based on experimental results and data published in the scientific literature, such as the standard JANAF tables. The recent development of efficient computational techniques based on quantum chemical methods provides an alternative source of information for the expansion of such databases. For example, it is now possible to model dissociation or combustion reactions of small molecules to high accuracy using techniques such as coupled cluster theory or density functional theory. Unfortunately, the current applicability of reliable computational models is limited to relatively small molecules containing only around a dozen (non-hydrogen) atoms. We propose to extend the applicability of coupled cluster theory, often referred to as the 'gold standard' of quantum chemical methods, to molecules containing 30-50 non-hydrogen atoms. The centerpiece of this work is the concept of local correlation, in which the description of the electron interactions, known as electron correlation effects, is reduced to only its most important localized components. Such an advance has the potential to greatly expand the current reach of computational thermochemistry and thus to have a significant impact on the theoretical study of jet and rocket propellants.
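As an illustration of the least-squares fits mentioned above, the classic 7-coefficient NASA polynomial form for heat capacity and enthalpy can be evaluated as in the Python sketch below (the coefficients shown are placeholders, not real database values; CEA's newer fits use nine coefficients, but the idea is the same):

    import numpy as np

    def cp_over_R(T, a):
        # Classic 7-coefficient NASA polynomial:
        # Cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4
        return a[0] + a[1]*T + a[2]*T**2 + a[3]*T**3 + a[4]*T**4

    def h_over_RT(T, a):
        # H/(R*T) from the same coefficients; a6 is the integration constant.
        return (a[0] + a[1]*T/2 + a[2]*T**2/3 + a[3]*T**3/4
                + a[4]*T**4/5 + a[5]/T)

    # Placeholder coefficients, not real database values.
    a = np.array([3.5, 1e-4, -1e-8, 0.0, 0.0, -1000.0, 5.0])
    T = np.linspace(300.0, 6000.0, 5)
    print(cp_over_R(T, a))
    print(h_over_RT(T, a))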
Towards prediction of correlated material properties using quantum Monte Carlo methods
NASA Astrophysics Data System (ADS)
Wagner, Lucas
Correlated electron systems offer a richness of physics far beyond noninteracting systems. If we would like to pursue the dream of designer correlated materials or, to set a more modest goal, to explain in detail the properties and effective physics of known materials, then accurate simulation methods are required. Using modern computational resources, quantum Monte Carlo (QMC) techniques offer a way to directly simulate electron correlations. I will show some recent results on a few extremely challenging materials including the metal-insulator transition of VO2, the ground state of the doped cuprates, and the pressure dependence of magnetic properties in FeSe. By using a relatively simple implementation of QMC, at least some properties of these materials can be described truly from first principles, without any adjustable parameters. Using the QMC platform, we have developed a way of systematically deriving effective lattice models from the simulation. This procedure is particularly attractive for correlated electron systems because the QMC methods treat the one-body and many-body components of the wave function and Hamiltonian on completely equal footing. I will show some examples of using this downfolding technique and the high accuracy of QMC to connect our intuitive ideas about interacting electron systems with high fidelity simulations. The work in this presentation was supported in part by NSF DMR 1206242, the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program under Award Number FG02-12ER46875, and the Center for Emergent Superconductivity, Department of Energy Frontier Research Center under Grant No. DEAC0298CH1088. Computing resources were provided by a Blue Waters Illinois grant and INCITE PhotSuper and SuperMatSim allocations.
Identification of Microorganisms by Modern Analytical Techniques.
Buszewski, Bogusław; Rogowska, Agnieszka; Pomastowski, Paweł; Złoch, Michał; Railean-Plugaru, Viorica
2017-11-01
Rapid detection and identification of microorganisms is a challenging and important aspect in a wide range of fields, from medical to industrial, affecting human lives. Unfortunately, classical methods of microorganism identification are based on time-consuming and labor-intensive approaches. Screening techniques require the rapid and cheap grouping of bacterial isolates; however, modern bioanalytics demand comprehensive bacterial studies at a molecular level. Modern approaches for the rapid identification of bacteria use molecular techniques, such as 16S ribosomal RNA gene sequencing based on polymerase chain reaction or electromigration, especially capillary zone electrophoresis and capillary isoelectric focusing. However, there are still several challenges with the analysis of microbial complexes using electromigration technology, such as uncontrolled aggregation and/or adhesion to the capillary surface. Thus, an approach using capillary electrophoresis of microbial aggregates with UV and matrix-assisted laser desorption ionization time-of-flight MS detection is presented.
On important precursor of singular optics (tutorial)
NASA Astrophysics Data System (ADS)
Polyanskii, Peter V.; Felde, Christina V.; Bogatyryova, Halina V.; Konovchuk, Alexey V.
2018-01-01
The rise of singular optics is usually associated with the seminal paper by J. F. Nye and M. V. Berry [Proc. R. Soc. Lond. A, 336, 165-189 (1974)]. Intense development of this area of modern photonics started in the early eighties of the XX century due to the invention of the interference technique for detection and diagnostics of phase singularities, such as optical vortices in complex speckle-structured light fields. The next powerful incentive for the formation of singular optics into a separate area of the science of light was connected with the discovery of a very practical technique for creating singular optical beams of various kinds on the basis of computer-generated holograms. In the eighties and nineties of the XX century, singular optics evolved almost entirely under the approximation of complete coherence of the light field. Only at the threshold of the XXI century was it comprehended that singular-optics approaches can be fruitfully extended to partially spatially coherent, partially polarized and polychromatic light fields supporting singularities of new kinds, which resulted in the establishment of correlation singular optics. Here we show that correlation singular optics has much deeper roots, ascending to the "pre-singular" and even pre-laser epoch and associated with the concepts of partial coherence and polarization. It is remarkable that correlation singular optics in its present interpretation forestalled standard coherent singular optics. This paper is timed to the sixtieth anniversary of the most profound precursor of modern correlation singular optics [J. Opt. Soc. Am., 47, 895-902 (1957)].
NASA Astrophysics Data System (ADS)
Afanasyev, A. P.; Bazhenov, R. I.; Luchaninov, D. V.
2018-05-01
The main purpose of the research is to develop techniques for defining the best technical and economic trajectories of cables in urban power systems. The proposed algorithms for calculating cable-laying routes take into consideration the topological, technical and economic features of the cabling. A discrete variant of the Fast Marching Method is applied as the calculating tool. It has certain advantages compared to other approaches; in particular, the algorithm is not iterative and is therefore computationally inexpensive. The resulting cable-laying trajectories are optimal from the point of view of technical and economic criteria and correspond to the present rules of modern urban development.
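A minimal Python sketch of the underlying idea, using the scikit-fmm implementation of the Fast Marching Method (the grid size, cost values and source location are invented for illustration): an arrival-time field is computed from a cost map, and steepest descent on that field traces the cheapest cable route:

    import numpy as np
    import skfmm   # scikit-fmm: pip install scikit-fmm

    # Zero contour of phi marks the cable origin (e.g., a substation).
    phi = np.ones((200, 200))
    phi[20, 20] = -1

    # Speed map: fast (cheap) along open ground, slow (costly) through
    # an obstacle block; the values here are invented for illustration.
    speed = np.ones((200, 200))
    speed[80:120, 50:150] = 0.1

    # Arrival-time field: minimal cumulative cost from the origin to
    # every cell; steepest descent on t (e.g., via np.gradient) traces
    # the optimal cable trajectory to any target.
    t = skfmm.travel_time(phi, speed)
    print(t[150, 180])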
Advances in analytical chemistry
NASA Technical Reports Server (NTRS)
Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.
1991-01-01
Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
Ovarian Failure and the Menopause
McEwen, Donald C.
1965-01-01
Long-term replacement therapy for ovarian deficiency or failure, including the menopause, has been widely debated. In the past, treatment, if indicated, was reassurance, sedation, and occasionally short-term estrogen therapy. Today, because of recent advances in steroid synthesis, the consequences of ovarian deficiency may be preventable. Ovarian function and failure are discussed, the apparent physiological renaissance of nine patients documented, and the methods of treatment detailed. Thirty-three patients, who took part in this program during the past two years, have continued treatment with enthusiasm, without major problems in management, and have shown evidence of improved physical and emotional well-being. Further unbiased long-term research, possibly using modern computer techniques, is needed to decide between traditional and replacement therapy for the menopause. PMID:14289145
A Prospectus for the Future Development of a Speech Lab: Hypertext Applications.
ERIC Educational Resources Information Center
Berube, David M.
This paper presents a plan for the next generation of speech laboratories which integrates technologies of modern communication in order to improve and modernize the instructional process. The paper first examines the application of intermediate technologies including audio-video recording and playback, computer assisted instruction and testing…
MODERN LINGUISTICS, ITS DEVELOPMENT AND SCOPE.
ERIC Educational Resources Information Center
LEVIN, SAMUEL R.
THE DEVELOPMENT OF MODERN LINGUISTICS STARTED WITH JONES' DISCOVERY IN 1786 THAT SANSKRIT IS CLOSELY RELATED TO THE CLASSICAL, GERMANIC, AND CELTIC LANGUAGES, AND HAS ADVANCED TO INCLUDE THE APPLICATION OF COMPUTERS IN LANGUAGE ANALYSIS. THE HIGHLIGHTS OF LINGUISTIC RESEARCH HAVE BEEN DE SAUSSURE'S DISTINCTION BETWEEN THE DIACHRONIC AND THE…
Carr, J Jeffrey
2012-01-01
The ability to quantify subclinical disease to assess cardiovascular disease is greatly enhanced by modern medical imaging techniques that incorporate concepts from biomedical engineering and computer science. These techniques' numerical results, known as quantitative phenotypes, can be used to help us better understand both health and disease states. In this report, we describe our efforts in using the latest imaging technologies to assess cardiovascular disease risk by quantifying subclinical disease of participants in the Jackson Heart Study. The CT and MRI exams of the Jackson Heart Study have collected detailed information from approximately 3,000 participants. Analyses of the images from these exams provide information on several measures including the amount of plaque in the coronary arteries and the ability of the heart to pump blood. These measures can then be added to the wealth of information on JHS participants to understand how these conditions, as well as how clinical events, such as heart attacks and heart failure, occur in African Americans.
Traditional Tracking with Kalman Filter on Parallel Architectures
NASA Astrophysics Data System (ADS)
Cerati, Giuseppe; Elmer, Peter; Lantz, Steven; MacNeill, Ian; McDermott, Kevin; Riley, Dan; Tadel, Matevž; Wittich, Peter; Würthwein, Frank; Yagil, Avi
2015-05-01
Power density constraints are limiting the performance improvements of modern CPUs. To address this, we have seen the introduction of lower-power, multi-core processors, but the future will be even more exciting. In order to stay within the power density limits but still obtain Moore's Law performance/price gains, it will be necessary to parallelize algorithms to exploit larger numbers of lightweight cores and specialized functions like large vector units. Example technologies today include Intel's Xeon Phi and GPGPUs. Track finding and fitting is one of the most computationally challenging problems for event reconstruction in particle physics. At the High Luminosity LHC, for example, this will be by far the dominant problem. The most common track finding techniques in use today are, however, those based on the Kalman Filter. Significant experience has been accumulated with these techniques on real tracking detector systems, both in the trigger and offline. We report the results of our investigations into the potential and limitations of these algorithms on the new parallel hardware.
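For readers unfamiliar with the technique, a Kalman filter track fit alternates predict and update steps; the toy Python sketch below fits a 1D straight-line track to four hits (the matrices and numbers are illustrative, not any experiment's tracking code):

    import numpy as np

    def kf_predict(x, P, F, Q):
        # Propagate the track state x and covariance P to the next layer.
        return F @ x, F @ P @ F.T + Q

    def kf_update(x, P, z, H, R):
        # Fold in a hit measurement z with model z = H x + noise(R).
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    # 1D toy track: state (position, slope); hits measure position only.
    F = np.array([[1.0, 1.0], [0.0, 1.0]])      # straight-line propagation
    H = np.array([[1.0, 0.0]])
    x, P = np.zeros(2), np.eye(2) * 10.0
    for z in [0.9, 2.1, 2.9, 4.2]:              # hits on successive layers
        x, P = kf_predict(x, P, F, Q=np.eye(2) * 1e-3)
        x, P = kf_update(x, P, np.array([z]), H, R=np.array([[0.25]]))
    print(x)                                     # fitted position and slope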
Development of an oximeter for neurology
NASA Astrophysics Data System (ADS)
Aleinik, A.; Serikbekova, Z.; Zhukova, N.; Zhukova, I.; Nikitina, M.
2016-06-01
Cerebral desaturation can occur during surgical manipulation while other parameters vary insignificantly. Prolonged intervals of cerebral anoxia can cause serious damage to the nervous system. A commonly used method for measuring cerebral blood flow relies on invasive catheters. Other techniques include single photon emission computed tomography (SPECT), positron emission tomography (PET) and magnetic resonance imaging (MRI). Tomographic methods frequently require isotope administration, which may result in anaphylactic reactions to contrast media and associated nerve diseases. Moreover, the high cost and the need for continuous monitoring make it difficult to apply these techniques in clinical practice. Cerebral oximetry is a method for measuring oxygen saturation using infrared spectrometry; in addition, reflectance pulse oximetry can detect sudden changes in sympathetic tone. For this purpose, a reflectance pulse oximeter for use in neurology has been developed. A reflectance oximeter has a definite advantage in that it can be used to measure oxygen saturation in any part of the body. Preliminary results indicate that the device has good resolution and high reliability, and its modern circuitry gives it improved characteristics compared with existing devices.
Challenge Paper: Validation of Forensic Techniques for Criminal Prosecution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert F.; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.
2007-04-10
Abstract: As in many domains, there is increasing agreement in the user and research community that digital forensics analysts would benefit from the extension, development and application of advanced techniques in performing large-scale and heterogeneous data analysis. Modern digital forensics analysis of cyber-crimes and cyber-enabled crimes often requires scrutiny of massive amounts of data. For example, a case involving network compromise across multiple enterprises might require forensic analysis of numerous sets of network logs and computer hard drives, potentially involving hundreds of gigabytes of heterogeneous data, or even terabytes or petabytes of data. Also, the goal for forensic analysis is to not only determine whether the illicit activity being considered is taking place, but also to identify the source of the activity and the full extent of the compromise or impact on the local network. Even after this analysis, there remains the challenge of using the results in subsequent criminal and civil processes.
CARS Thermometry in a Supersonic Combustor for CFD Code Validation
NASA Technical Reports Server (NTRS)
Cutler, A. D.; Danehy, P. M.; Springer, R. R.; DeLoach, R.; Capriotti, D. P.
2002-01-01
An experiment has been conducted to acquire data for the validation of computational fluid dynamics (CFD) codes used in the design of supersonic combustors. The primary measurement technique is coherent anti-Stokes Raman spectroscopy (CARS), although surface pressures and temperatures have also been acquired. Modern design-of-experiment techniques have been used to maximize the quality of the data set (for the given level of effort) and minimize systematic errors. The combustor consists of a diverging duct with a single downstream-angled wall injector. The nominal entrance Mach number is 2, and the enthalpy nominally corresponds to Mach 7 flight. Temperature maps are obtained at several planes in the flow for two cases: in one case the combustor is piloted by injecting fuel upstream of the main injector, in the second it is not. Boundary conditions and uncertainties are adequately characterized. Accurate CFD calculation of the flow will ultimately require accurate modeling of the chemical kinetics and turbulence-chemistry interactions as well as accurate modeling of the turbulent mixing.
Expanding the frontiers of waveform imaging with Salvus
NASA Astrophysics Data System (ADS)
Afanasiev, M.; Boehm, C.; van Driel, M.; Krischer, L.; Fichtner, A.
2017-12-01
Mechanical waves are natural harbingers of information. From medical ultrasound to the normal modes of the Sun, wave motion is often our best window into the character of some underlying continuum. For over a century, geophysicists have been using this window to peer deep into the Earth, developing techniques that have gone on to underlie much of the world's energy economy. As computers and numerical techniques have become more powerful over the last several decades, seismologists have begun to scale back classical simplifying approximations of wave propagation physics. As a result, we are now approaching the ideal of 'full-waveform inversion': maximizing the aperture of our window by taking the full complexity of wave motion into account. Salvus is a modern high-performance software suite which aims to bring recent developments in geophysical waveform inversion to new and exciting domains. In this short presentation we will look at the connections between these applications, with examples from non-destructive testing, medical imaging, seismic exploration, and (extra-)planetary seismology.
Trajectory Optimization for Missions to Small Bodies with a Focus on Scientific Merit.
Englander, Jacob A; Vavrina, Matthew A; Lim, Lucy F; McFadden, Lucy A; Rhoden, Alyssa R; Noll, Keith S
2017-01-01
Trajectory design for missions to small bodies is tightly coupled both with the selection of targets for a mission and with the choice of spacecraft power, propulsion, and other hardware. Traditional methods of trajectory optimization have focused on finding the optimal trajectory for an a priori selection of destinations and spacecraft parameters. Recent research has expanded the field of trajectory optimization to multidisciplinary systems optimization that includes spacecraft parameters. The logical next step is to extend the optimization process to include target selection based not only on engineering figures of merit but also on scientific value. This paper presents a new technique to solve the multidisciplinary mission optimization problem for small-body missions, including classical trajectory design, the choice of spacecraft power and propulsion systems, and the scientific value of the targets. This technique, when combined with modern parallel computers, enables a holistic view of the small-body mission design process that previously required iteration among several different design processes.
Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny
2017-10-21
Modern molecular-dynamics-based techniques are extremely powerful tools for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
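As a simplified stand-in for Perron Cluster Cluster Analysis (which uses a more careful transformation of the eigenvector simplex), the Python sketch below groups Markov states into metastable sets by clustering their coordinates in the dominant eigenvectors of a transition matrix; the 4-state, two-well matrix is a toy example:

    import numpy as np
    from sklearn.cluster import KMeans

    def metastable_clusters(T, n_clusters):
        # Cluster states by their coordinates in the dominant right
        # eigenvectors of the row-stochastic transition matrix T.
        w, v = np.linalg.eig(T)
        order = np.argsort(-w.real)
        chi = v[:, order[:n_clusters]].real   # slowest dynamical modes
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
        return km.fit_predict(chi)

    # Toy 4-state chain with two metastable wells: {0, 1} and {2, 3}.
    T = np.array([[0.95, 0.05, 0.00, 0.00],
                  [0.05, 0.94, 0.01, 0.00],
                  [0.00, 0.01, 0.94, 0.05],
                  [0.00, 0.00, 0.05, 0.95]])
    print(metastable_clusters(T, 2))   # e.g. [0 0 1 1]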
Quantitative Investigation of the Technologies That Support Cloud Computing
ERIC Educational Resources Information Center
Hu, Wenjin
2014-01-01
Cloud computing is dramatically shaping modern IT infrastructure. It virtualizes computing resources, provides elastic scalability, serves as a pay-as-you-use utility, simplifies the IT administrators' daily tasks, enhances the mobility and collaboration of data, and increases user productivity. We focus on providing generalized black-box…
Graphical User Interface Programming in Introductory Computer Science.
ERIC Educational Resources Information Center
Skolnick, Michael M.; Spooner, David L.
Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…
ERIC Educational Resources Information Center
Rollo, J. Michael; Marmarchev, Helen L.
1999-01-01
The explosion of computer applications in the modern workplace has required student affairs professionals to keep pace with technological advances for office productivity. This article recommends establishing administrative computer user groups, utilizing coordinated web site development, and enhancing working relationships as ways of dealing…
Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views
ERIC Educational Resources Information Center
Aksela, Maija; Lundell, Jan
2008-01-01
Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…
Computer Problem-Solving Coaches for Introductory Physics: Design and Usability Studies
ERIC Educational Resources Information Center
Ryan, Qing X.; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Mason, Andrew
2016-01-01
The combination of modern computing power, the interactivity of web applications, and the flexibility of object-oriented programming may finally be sufficient to create computer coaches that can help students develop metacognitive problem-solving skills, an important competence in our rapidly changing technological society. However, no matter how…
Introduction to the theory of machines and languages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weidhaas, P. P.
1976-04-01
This text is intended to be an elementary "guided tour" through some basic concepts of modern computer science. Various models of computing machines and formal languages are studied in detail. Discussions center around questions such as, "What is the scope of problems that can or cannot be solved by computers?"
Computer-Aided Instruction in Automated Instrumentation.
ERIC Educational Resources Information Center
Stephenson, David T.
1986-01-01
Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…
Visions of the Future - the Changing Role of Actors in Data-Intensive Science
NASA Astrophysics Data System (ADS)
Schäfer, L.; Klump, J. F.
2013-12-01
Around the world, scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated by sensors and scientific instruments, digital recordings, social-science surveys or drawn from the World Wide Web. All areas of the scientific economy are affected by this rapid growth in data, from the logging of digs in Archaeology, to telescope observations of distant galaxies in Astrophysics, to data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in future? Currently the traditional forms of publication, articles for conferences and journals, still carry greater weight. But will this still be the case in future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was the provisioning of fast hardware, whereas now everything revolves around the topic of data and services. Finally, how about the researchers themselves? Not so long ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. To what extent, then, will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments, both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it and to consider steps that can achieve a positive development for the future.
Tegegn, Masresha; Arefaynie, Mastewal; Tiruye, Tenaw Yimer
2017-01-01
The contraceptive use of women in the extended postpartum period is usually different from other times in a woman's life cycle due to additional roles and the presence of emotional changes. However, there is a lack of evidence regarding women's contraceptive need during this period and the extent to which that need is met. Therefore, the objective of this study was to assess unmet need for modern contraceptives and associated factors among women in the extended postpartum period in Dessie Town, northeast Ethiopia, in December 2014. A community-based cross-sectional study was conducted among women who gave birth in the year before the study period. A systematic random sampling technique was employed to recruit a total of 383 study participants. For data collection, a structured and pretested standard questionnaire was used. Descriptive statistics were computed to characterize the study population using different variables. Bivariate and multiple logistic regression models were fitted to control for confounding factors. Odds ratios with 95% confidence intervals were computed to identify factors associated with unmet need. This study revealed that 44% of women in the extended postpartum period had an unmet need for modern contraceptives, of which 57% was for spacing and 43% for limiting. Education of women (being illiterate) (AOR (adjusted odds ratio) = 3.37, 95% CI (confidence interval) 1.22-7.57), antenatal care service (no) (AOR = 2.41, 95% CI 1.11-5.79), postnatal care service (no) (AOR = 3.63, CI 2.13-6.19) and knowledge of the lactational amenorrhea method (AOR = 7.84, 95% CI 4.10-15.02) were the factors positively associated with unmet need for modern contraceptives in the extended postpartum period. The unmet need for modern contraception is high in the study area. There is a need to improve the quality of maternal health services, girls' education, and information on the postpartum risk of pregnancy and on recommended postpartum contraceptives, to enable mothers to make informed choices.
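For illustration, adjusted odds ratios with 95% confidence intervals of the kind reported above are obtained by exponentiating multiple logistic regression coefficients; the Python sketch below uses statsmodels on simulated data (the predictors and effect sizes are invented, not the study's data):

    import numpy as np
    import statsmodels.api as sm

    # Simulated data: y = 1 marks unmet need; binary predictors stand in
    # for illiteracy, no ANC, no PNC, and LAM knowledge (invented effects).
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(383, 4)).astype(float)
    logit = X @ np.array([1.2, 0.9, 1.3, 2.0]) - 2.0
    y = (rng.random(383) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

    model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    aor = np.exp(model.params)        # adjusted odds ratios
    ci = np.exp(model.conf_int())     # 95% confidence intervals
    print(np.column_stack([aor, ci]))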
Fuzzy logic based robotic controller
NASA Technical Reports Server (NTRS)
Attia, F.; Upadhyaya, M.
1994-01-01
Existing Proportional-Integral-Derivative (PID) robotic controllers rely on an inverse kinematic model to convert user-specified cartesian trajectory coordinates to joint variables. These joints experience friction, stiction, and gear backlash effects. Due to the lack of proper linearization of these effects, modern control theory based on state space methods cannot provide adequate control for robotic systems. In the presence of loads, the dynamic behavior of robotic systems is complex and nonlinear, especially where mathematical models must be evaluated in real time. Fuzzy Logic Control is a fast-emerging alternative to conventional control systems in situations where it may not be feasible to formulate an analytical model of the complex system. Fuzzy logic techniques track a user-defined trajectory without requiring the host computer to explicitly solve the nonlinear inverse kinematic equations. The goal is to provide a rule-based approach, which is closer to human reasoning. The approach used expresses end-point error, location of manipulator joints, and proximity to obstacles as fuzzy variables. The resulting decisions are based upon linguistic and non-numerical information. This paper presents a solution for the conventional robot controller which is independent of computationally intensive kinematic equations. Computer simulation results of this approach as obtained from a software implementation are also discussed.
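A minimal Python sketch of the rule-based idea (one fuzzy input, three triangular membership sets, max-min inference and centroid defuzzification; the universes and rules are invented for illustration, not the paper's controller):

    import numpy as np

    def trimf(x, a, b, c):
        # Triangular membership: rises from a to peak at b, falls to c.
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    SETS = [(-2.0, -1.0, 0.0), (-1.0, 0.0, 1.0), (0.0, 1.0, 2.0)]  # NEG, ZERO, POS

    def fuzzy_torque(error):
        # Rules: error NEG -> torque NEG, ZERO -> ZERO, POS -> POS.
        u = np.linspace(-1.0, 1.0, 201)                  # output universe
        fire = [trimf(error, a, b, c) for a, b, c in SETS]
        clipped = [np.minimum(f, trimf(u, a, b, c))      # max-min inference
                   for f, (a, b, c) in zip(fire, SETS)]
        out = np.maximum.reduce(clipped)
        return np.sum(u * out) / np.sum(out)             # centroid defuzzification

    print(fuzzy_torque(0.4))   # small positive corrective torque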
Róg, T; Murzyn, K; Hinsen, K; Kneller, G R
2003-04-15
We present a new implementation of the program nMoldyn, which has been developed for the computation and decomposition of neutron scattering intensities from Molecular Dynamics trajectories (Comp. Phys. Commun. 1995, 91, 191-214). The new implementation extends the functionality of the original version, provides a much more convenient user interface (both graphical/interactive and batch), and can be used as a tool set for implementing new analysis modules. This was made possible by the use of a high-level language, Python, and of modern object-oriented programming techniques. The quantities that can be calculated by nMoldyn are the mean-square displacement, the velocity autocorrelation function as well as its Fourier transform (the density of states) and its memory function, the angular velocity autocorrelation function and its Fourier transform, the reorientational correlation function, and several functions specific to neutron scattering: the coherent and incoherent intermediate scattering functions with their Fourier transforms, the memory function of the coherent scattering function, and the elastic incoherent structure factor. The ability to compute memory functions is a new and powerful feature that allows one to relate simulation results to theoretical studies. Copyright 2003 Wiley Periodicals, Inc. J Comput Chem 24: 657-667, 2003
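Two of the quantities listed, the velocity autocorrelation function and its Fourier transform (the density of states), can be sketched in a few lines of Python using FFT-based correlation (a random toy trajectory stands in for real MD output; this is not nMoldyn's code):

    import numpy as np

    def vacf(vel):
        # FFT-based velocity autocorrelation, averaged over time origins
        # and atoms; vel has shape (n_steps, n_atoms, 3).
        n = vel.shape[0]
        f = np.fft.rfft(vel, n=2 * n, axis=0)      # zero-pad against wrap-around
        acf = np.fft.irfft(f * np.conj(f), axis=0)[:n].sum(axis=(1, 2))
        return acf / (n - np.arange(n)) / vel.shape[1]

    def density_of_states(c, dt):
        # Density of states as the Fourier transform of the normalized VACF.
        return np.fft.rfftfreq(len(c), d=dt), np.abs(np.fft.rfft(c / c[0]))

    vel = np.random.default_rng(1).normal(size=(1000, 10, 3))  # toy "trajectory"
    freqs, dos = density_of_states(vacf(vel), dt=1e-15)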
Hydrodynamic Simulations of Protoplanetary Disks with GIZMO
NASA Astrophysics Data System (ADS)
Rice, Malena; Laughlin, Greg
2018-01-01
Over the past several decades, the field of computational fluid dynamics has rapidly advanced as the range of available numerical algorithms and computationally feasible physical problems has expanded. The development of modern numerical solvers has provided a compelling opportunity to reconsider previously obtained results in search of yet undiscovered effects that may be revealed through longer integration times and more precise numerical approaches. In this study, we compare the results of past hydrodynamic disk simulations with those obtained from modern numerical tools. We focus our study on the GIZMO code (Hopkins 2015), which uses meshless methods to solve the homogeneous Euler equations of hydrodynamics while eliminating problems arising from advection between grid cells. By comparing modern simulations with prior results, we hope to provide an improved understanding of the impact of fluid mechanics on the evolution of protoplanetary disks.
Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
NASA Astrophysics Data System (ADS)
Chernoded, Andrey; Dudko, Lev; Myagkov, Igor; Volkov, Petr
2017-10-01
Most modern analyses in high energy physics use signal-versus-background classification techniques based on machine learning methods, and on neural networks in particular. The deep learning neural network is the most promising modern technique for separating signal from background, and nowadays it can be widely and successfully implemented as part of a physics analysis. In this article we compare deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.
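As a generic illustration of signal-versus-background classification (not the authors' networks), a small feed-forward classifier can be set up with scikit-learn on toy "events":

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Toy "events": 5 kinematic features; signal shifted against background.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (5000, 5)),    # background
                   rng.normal(0.7, 1.0, (5000, 5))])   # signal
    y = np.r_[np.zeros(5000), np.ones(5000)]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
    # clf.predict_proba(X_te)[:, 1] is the per-event signal score used
    # for selection cuts in an analysis.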
Web-GIS platform for monitoring and forecasting of regional climate and ecological changes
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.
2012-12-01
The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the support of integrated scientific research in the Earth sciences an urgent and important task (Gordov et al, 2012; van der Wel, 2005). It should be considered that the inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, decreasing the reliability of analysis results. However, modern geophysical data processing techniques allow different technological solutions to be combined when organizing such information resources. Nowadays it is generally accepted that an information-computational infrastructure should rely on the potential of combined usage of web and GIS technologies for creating applied information-computational web systems (Titov et al, 2009; Gordov et al, 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches for the development of internet-accessible thematic information-computational systems, and arranging data and knowledge interchange between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation. An experimental software and hardware platform is presented which supports the operation of a web-oriented production and research center for regional climate change investigations, combining a modern Web 2.0 approach, GIS functionality and capabilities for running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis and the education of undergraduate and postgraduate students. The platform software developed (Shulgina et al, 2012; Okladnikov et al, 2012) includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, execution and visualization of results for the WRF and «Planet Simulator» models integrated into the platform are also provided. All functions of the center are accessible to a user through a web portal using a common graphical web browser in the form of an interactive graphical user interface which provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom) and data layer manipulation (order, enable/disable, feature extraction). The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts (Shulgina et al, 2011). Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified graphical web interface.
Petascale Many Body Methods for Complex Correlated Systems
NASA Astrophysics Data System (ADS)
Pruschke, Thomas
2012-02-01
Correlated systems constitute an important class of materials in modern condensed matter physics. Correlations among electrons are at the heart of all ordering phenomena and of many intriguing novel aspects, such as quantum phase transitions or topological insulators, observed in a variety of compounds. Yet theoretically describing these phenomena is still a formidable task, even if one restricts the models used to the smallest possible set of degrees of freedom. Here, modern computer architectures play an essential role, and the joint effort to devise efficient algorithms and implement them on state-of-the-art hardware has become an extremely active field in condensed-matter research. To tackle this task single-handed is quite obviously not possible. The NSF-OISE funded PIRE collaboration "Graduate Education and Research in Petascale Many Body Methods for Complex Correlated Systems" is a successful initiative to bring together leading experts around the world to form a virtual international organization for addressing these emerging challenges and educating the next generation of computational condensed matter physicists. The collaboration includes research groups developing novel theoretical tools to reliably and systematically study correlated solids, experts in the efficient computational algorithms needed to solve the emerging equations, and those able to use modern heterogeneous computer architectures to make them working tools for the growing community.
Advancing Creative Visual Thinking with Constructive Function-Based Modelling
ERIC Educational Resources Information Center
Pasko, Alexander; Adzhiev, Valery; Malikova, Evgeniya; Pilyugin, Victor
2013-01-01
Modern education technologies are destined to reflect the realities of a modern digital age. The juxtaposition of real and synthetic (computer-generated) worlds as well as a greater emphasis on visual dimension are especially important characteristics that have to be taken into account in learning and teaching. We describe the ways in which an…
Locality-Aware CTA Clustering For Modern GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ang; Song, Shuaiwen; Liu, Weifeng
2017-04-08
In this paper, we proposed a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrated the capability of the existing GPU hardware to exploit such locality, both spatially and temporally, on L1 or L1/Tex unified cache. To verify the potential of this locality, we quantified its existence in a broad spectrum of applications and discussed its sources of origin. Based on these insights, we proposed the concept of CTA-Clustering and its associated software techniques. Finally, we evaluated these techniques on all modern generations of NVIDIA GPU architectures. The experimental results showed that our proposed clustering techniques could significantly improve on-chip cache performance.
Ptolemy's Britain and Ireland: A New Digital Reconstruction
NASA Astrophysics Data System (ADS)
Abshire, Corey; Durham, Anthony; Gusev, Dmitri A.; Stafeyev, Sergey K.
2018-05-01
In this paper, we extend the application of our mathematical methods for translating ancient coordinates from Claudius Ptolemy's classical Geography into modern coordinates, previously applied to India and Arabia, to Britain and Ireland, historically important islands on the periphery of the ancient Roman Empire. The methods include triangulation and flocking with subsequent Bayesian correction. The results of our work can be conveniently visualized in modern GIS tools, such as ArcGIS, QGIS, and Google Earth. The enhancements we have made include a novel technique for handling tentatively identified points. We compare the precision of reconstruction achieved for Ptolemy's Britain and Ireland with the precisions that we had computed earlier for his India before the Ganges and three provinces of Arabia. We also provide improved validation and comparison amongst the methods applied. We compare our results with prior work, while utilizing knowledge from such important ancient sources as the Antonine Itinerary, Tabula Peutingeriana, and the Ravenna Cosmography. The new digital reconstruction of Claudius Ptolemy's Britain and Ireland presented in this paper, along with the accompanying linguistic analysis of ancient toponyms, contributes to an improved understanding of our cultural cartographic heritage by making it easier to study the ancient world using popular and accessible GIS programs.
Kassiopeia: a modern, extensible C++ particle tracking package
NASA Astrophysics Data System (ADS)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus; Babutzka, Martin; Barrett, John P.; Behrens, Jan; Buzinsky, Nicholas; Corona, Thomas; Enomoto, Sanshiro; Erhard, Moritz; Formaggio, Joseph A.; Glück, Ferenc; Harms, Fabian; Heizmann, Florian; Hilk, Daniel; Käfer, Wolfgang; Kleesiek, Marco; Leiber, Benjamin; Mertens, Susanne; Oblath, Noah S.; Renschler, Pascal; Schwarz, Johannes; Slocum, Penny L.; Wandkowsky, Nancy; Wierman, Kevin; Zacher, Michael
2017-05-01
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle’s state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
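To make concrete the kind of computation such a tracking framework performs, the sketch below integrates the Lorentz-force equation of motion m dv/dt = q(E + v x B) with a classical fourth-order Runge-Kutta step. This is a minimal illustration in Python/NumPy, not Kassiopeia code; the field values, step size, and initial conditions are arbitrary assumptions.

    import numpy as np

    q, m = -1.602e-19, 9.109e-31            # electron charge [C] and mass [kg]
    E = np.array([0.0, 0.0, 1.0e2])         # uniform electric field [V/m] (assumed)
    B = np.array([0.0, 0.0, 1.0e-3])        # uniform magnetic field [T] (assumed)

    def deriv(state):
        # state = (x, y, z, vx, vy, vz); acceleration from the Lorentz force
        v = state[3:]
        a = (q / m) * (E + np.cross(v, B))
        return np.concatenate([v, a])

    def rk4_step(state, h):
        # Classical fourth-order Runge-Kutta integration step
        k1 = deriv(state)
        k2 = deriv(state + 0.5 * h * k1)
        k3 = deriv(state + 0.5 * h * k2)
        k4 = deriv(state + h * k3)
        return state + (h / 6.0) * (k1 + 2*k2 + 2*k3 + k4)

    state = np.concatenate([np.zeros(3), np.array([1.0e5, 0.0, 0.0])])
    h = 1.0e-12                              # time step [s], well below the cyclotron period
    for _ in range(1000):
        state = rk4_step(state, h)
    print(state[:3])                         # final position of the spiralling electron

A production tracker such as Kassiopeia adds adaptive step control, exact-field solvers, and the stochastic in-flight and surface processes described above on top of this basic integration loop.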
Zakhia, Frédéric; de Lajudie, Philippe
2006-03-01
Taxonomy is the science that studies the relationships between organisms. It comprises classification, nomenclature, and identification. Modern bacterial taxonomy is polyphasic, meaning that it is based on several molecular techniques, each one retrieving information at a different cellular level (proteins, fatty acids, DNA...). The results obtained are combined and analysed to reach a "consensus taxonomy" for a microorganism. Until 1970, only a small number of classification techniques were available to microbiologists, and characterization was mainly phenotypic (for example, the ability of a Rhizobium to nodulate a given legume species). With the development of characterization techniques based on the polymerase chain reaction, bacterial taxonomy has undergone great changes. In particular, the classification of the legume-nodulating bacteria has been repeatedly modified over the last 20 years. We present here a review of the molecular techniques currently used in bacterial characterization, with examples of the application of these techniques to the study of legume-nodulating bacteria.
Strategies for Fermentation Medium Optimization: An In-Depth Review
Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.
2017-01-01
Optimization of the production medium is required to maximize metabolite yield. This can be achieved using a wide range of techniques, from the classical "one-factor-at-a-time" approach to modern statistical and mathematical techniques such as artificial neural networks (ANN) and genetic algorithms (GA). Every technique comes with its own advantages and disadvantages, and despite their drawbacks some techniques are applied to obtain the best results; using several optimization techniques in combination can also provide the desired results. In this article an attempt has been made to review the media optimization techniques currently applied during fermentation processes for metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been made, and a logical basis for selecting a fermentation medium design is given. Overall, this review provides the rationale for selecting a suitable optimization technique for media design in the fermentation process of metabolite production. PMID:28111566
Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich
2017-04-01
Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input-, output-, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply a conceptually close coupling, whereas software quality asks for decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR interface is used to highlight the techniques' benefits. This use case is chosen as a prototypical example of complex architectures with multiple interacting subsystems found in many VR, AR, and MR architectures. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
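For readers unfamiliar with the entity-component-system pattern the article extends, the following is a minimal sketch of a plain ECS in Python. It is not the authors' architecture: the semantic-query layer they describe is approximated here by a simple query over component types, and all names are illustrative.

    class World:
        def __init__(self):
            self.next_id = 0
            self.components = {}                # component type -> {entity_id: instance}

        def entity(self):
            # Entities are bare identifiers; all state lives in components
            self.next_id += 1
            return self.next_id

        def attach(self, eid, comp):
            self.components.setdefault(type(comp), {})[eid] = comp

        def query(self, *types):
            # Entities possessing all requested component types
            ids = set.intersection(*(set(self.components.get(t, {})) for t in types))
            return [(e, *(self.components[t][e] for t in types)) for e in ids]

    class Position:
        def __init__(self, x, y): self.x, self.y = x, y

    class Velocity:
        def __init__(self, dx, dy): self.dx, self.dy = dx, dy

    world = World()
    e = world.entity()
    world.attach(e, Position(0.0, 0.0))
    world.attach(e, Velocity(1.0, 0.5))

    # A "system" iterates over matching entities and updates their state
    for eid, pos, vel in world.query(Position, Velocity):
        pos.x += vel.dx
        pos.y += vel.dy

The deficit the article targets is visible even here: any system may freely read and write any component, so state access is implicit and uncontrolled; the proposed semantic techniques add a grounded, queryable layer on top of this raw pattern.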
NASA Astrophysics Data System (ADS)
Chan, Hau P.; Bao, Nai-Keng; Kwok, Wing O.; Wong, Wing H.
2002-04-01
The application of the Digital Pixel Hologram (DPH) as an anti-counterfeiting technology for products such as commercial goods, credit cards, identity cards, and paper banknotes is growing in importance nowadays. It offers many advantages over other anti-counterfeiting tools, including a high diffraction effect, high resolving power, resistance to photocopying with two-dimensional copiers, and the potential for mass production of patterns at very low cost. Recently, we succeeded in fabricating high-definition DPHs with resolution higher than 2500 dpi for anti-counterfeiting purposes by applying modern optical diffraction theory to computer pattern generation with the assistance of electron beam lithography (EBL). In this paper, we introduce five levels of encryption techniques that can be embedded in the design of such DPHs to further improve their anti-counterfeiting performance at negligible added cost. The techniques involved, in ascending order of decryption complexity, are Gray-level Encryption, Pattern Encryption, Character Encryption, Image Modification Encryption, and Codebook Encryption. A Hong Kong Special Administrative Region (HKSAR) DPH emblem was fabricated at a resolution of 2540 dpi using the facilities housed in our Optoelectronics Research Center. This emblem will be used as an illustration to discuss each encryption idea in detail during the conference.
Factors Affecting Teachers' Adoption of Educational Computer Games: A Case Study
ERIC Educational Resources Information Center
Kebritchi, Mansureh
2010-01-01
Even though computer games hold considerable potential for engaging and facilitating learning among today's children, the adoption of modern educational computer games is still meeting significant resistance in K-12 education. The purpose of this paper is to inform educators and instructional designers on factors affecting teachers' adoption of…
Let Me Share a Secret with You! Teaching with Computers.
ERIC Educational Resources Information Center
de Vasconcelos, Maria
The author describes her experiences teaching a computer-enhanced Modern Poetry course. The author argues that using computers enhances the concept of the classroom as learning community. It was the author's experience that students' postings on the discussion board created an atmosphere that encouraged student involvement, as opposed to the…
The State of the Art in Information Handling. Operation PEP/Executive Information Systems.
ERIC Educational Resources Information Center
Summers, J. K.; Sullivan, J. E.
This document explains recent developments in computer science and information systems of interest to the educational manager. A brief history of computers is included, together with an examination of modern computers' capabilities. Various features of card, tape, and disk information storage systems are presented. The importance of time-sharing…
Computer-Mediated Communication as an Autonomy-Enhancement Tool for Advanced Learners of English
ERIC Educational Resources Information Center
Wach, Aleksandra
2012-01-01
This article examines the relevance of modern technology for the development of learner autonomy in the process of learning English as a foreign language. Computer-assisted language learning and computer-mediated communication (CMC) appear to be particularly conducive to fostering autonomous learning, as they naturally incorporate many elements of…
NASA Astrophysics Data System (ADS)
Roussel, Marc R.
1999-10-01
One of the traditional obstacles to learning quantum mechanics is the relatively high level of mathematical proficiency required to solve even routine problems. Modern computer algebra systems are now sufficiently reliable that they can be used as mathematical assistants to alleviate this difficulty. In the quantum mechanics course at the University of Lethbridge, the traditional three lecture hours per week have been replaced by two lecture hours and a one-hour computer-aided problem solving session using a computer algebra system (Maple). While this somewhat reduces the number of topics that can be tackled during the term, students have a better opportunity to familiarize themselves with the underlying theory with this course design. Maple is also available to students during examinations. The use of a computer algebra system expands the class of feasible problems during a time-limited exercise such as a midterm or final examination. A modern computer algebra system is a complex piece of software, so some time needs to be devoted to teaching the students its proper use. However, the advantages to the teaching of quantum mechanics appear to outweigh the disadvantages.
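As a hedged illustration of how a computer algebra system alleviates routine quantum-mechanics algebra (the course used Maple; the open-source SymPy library is substituted here purely for accessibility), consider normalizing a particle-in-a-box eigenfunction and computing expectation values symbolically:

    import sympy as sp

    x, L = sp.symbols('x L', positive=True)
    n = sp.symbols('n', positive=True, integer=True)

    # Unnormalized particle-in-a-box eigenfunction and its normalization constant
    psi = sp.sin(n * sp.pi * x / L)
    N = sp.sqrt(1 / sp.integrate(psi**2, (x, 0, L)))   # evaluates to sqrt(2/L)
    psi_n = N * psi

    # Expectation values <x> and <x^2> for the n-th stationary state
    ex = sp.simplify(sp.integrate(x * psi_n**2, (x, 0, L)))
    ex2 = sp.simplify(sp.integrate(x**2 * psi_n**2, (x, 0, L)))
    print(ex)    # L/2
    print(ex2)   # L**2/3 - L**2/(2*pi**2*n**2)

The integrals here are exactly the sort of routine but error-prone manipulations that, done by hand, distract students from the underlying theory.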
A new model to compute the desired steering torque for steer-by-wire vehicles and driving simulators
NASA Astrophysics Data System (ADS)
Fankem, Steve; Müller, Steffen
2014-05-01
This paper deals with the control of the hand wheel actuator in steer-by-wire (SbW) vehicles and driving simulators (DSs). A novel model for the computation of the desired steering torque is presented. The introduced steering torque computation does not aim only to generate a realistic steering feel, meaning that the driver should not miss the basic steering functionality of a modern conventional steering system, such as electric power steering (EPS) or hydraulic power steering (HPS), in any driving situation. In addition, the modular structure of the steering torque computation, combined with suitably selected tuning parameters, is intended to offer a high degree of customisability of the steering feel and thus to provide each driver with his preferred steering feel in a very intuitive manner. The task and the tuning of each module are first described. Then the steering torque computation is parameterised such that the steering feel of a series EPS system is reproduced. For this purpose, experiments are conducted in a hardware-in-the-loop environment, where a test EPS is mounted on a steering test bench coupled with a vehicle simulator, and parameter identification techniques are applied. Subsequently, how closely the steering torque computation mimics the test EPS system is objectively evaluated with respect to criteria concerning the steering torque level and gradient, the feedback behaviour, and the steering return ability. Finally, the intuitive tuning of the modular steering torque computation is demonstrated by deriving a sportier steering feel configuration.
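A minimal Python sketch of what a modular steering-torque computation with tuning parameters might look like is given below. The module names, functional forms, and gain values are invented for illustration and are not the model proposed in the paper.

    import numpy as np

    def steering_torque(v, delta, delta_dot, ay, gains):
        # Toy modular desired-torque model: each term is one "module"
        # with its own tuning parameters (all hypothetical).
        t_align = gains['k_align'] * np.tanh(ay / gains['ay_ref'])    # self-aligning feel
        t_center = gains['k_center'] * delta                          # centring spring
        t_damp = gains['k_damp'] * delta_dot                          # rate damping
        t_fric = gains['k_fric'] * np.tanh(delta_dot / gains['eps'])  # smooth friction
        # Speed-dependent scaling loosely mimics a power-assist map
        assist = 1.0 / (1.0 + gains['k_assist'] * np.exp(-v / gains['v_ref']))
        return assist * (t_align + t_center + t_damp + t_fric)

    gains = dict(k_align=3.0, ay_ref=4.0, k_center=1.5, k_damp=0.15,
                 k_fric=0.4, eps=0.05, k_assist=2.0, v_ref=15.0)
    print(steering_torque(v=20.0, delta=0.1, delta_dot=0.5, ay=3.0, gains=gains))

The point of such a structure is the one the abstract makes: retuning a "sportier" feel means changing a few gains in individual modules rather than re-deriving the whole torque model.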
Applications of Computer Technology in Complex Craniofacial Reconstruction.
Day, Kristopher M; Gabrick, Kyle S; Sargent, Larry A
2018-03-01
To demonstrate our use of advanced 3-dimensional (3D) computer technology in the analysis, virtual surgical planning (VSP), 3D modeling (3DM), and treatment of complex congenital and acquired craniofacial deformities. We present a series of craniofacial defects treated at a tertiary craniofacial referral center utilizing state-of-the-art 3D computer technology. All patients treated at our center using computer-assisted VSP, prefabricated custom-designed 3DMs, and/or 3D printed custom implants (3DPCI) in the reconstruction of craniofacial defects were included in this analysis. We describe the use of 3D computer technology to precisely analyze, plan, and reconstruct 31 craniofacial deformities/syndromes caused by: Pierre-Robin (7), Treacher Collins (5), Apert's (2), Pfeiffer (2), Crouzon (1) Syndromes, craniosynostosis (6), hemifacial microsomia (2), micrognathia (2), multiple facial clefts (1), and trauma (3). In select cases where the available bone was insufficient for skeletal reconstruction, 3DPCIs were fabricated using 3D printing. We used VSP in 30, 3DMs in all 31, distraction osteogenesis in 16, and 3DPCIs in 13 cases. Utilizing these technologies, the above complex craniofacial defects were corrected without significant complications and with excellent aesthetic results. Modern 3D technology allows the surgeon to better analyze complex craniofacial deformities, precisely plan surgical correction with computer simulation of results, customize osteotomies, plan distractions, and print 3DPCI, as needed. The use of advanced 3D computer technology can be applied safely and potentially improve aesthetic and functional outcomes after complex craniofacial reconstruction. These techniques warrant further study and may be reproducible in various centers of care.
A view of Kanerva's sparse distributed memory
NASA Technical Reports Server (NTRS)
Denning, P. J.
1986-01-01
Pentti Kanerva is working on a new class of computers, which are called pattern computers. Pattern computers may close the gap between capabilities of biological organisms to recognize and act on patterns (visual, auditory, tactile, or olfactory) and capabilities of modern computers. Combinations of numeric, symbolic, and pattern computers may one day be capable of sustaining robots. The overview of the requirements for a pattern computer, a summary of Kanerva's Sparse Distributed Memory (SDM), and examples of tasks this computer can be expected to perform well are given.
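A toy sketch of Kanerva's sparse distributed memory in Python/NumPy follows: patterns are written into counters at all hard locations within a Hamming radius of the address and read back by thresholded summation. The dimensions, radius, and location count are arbitrary assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    N_LOC, DIM, RADIUS = 1000, 256, 111   # hard locations, word length, Hamming radius

    # Fixed random hard-location addresses; counters accumulate stored data
    addresses = rng.integers(0, 2, size=(N_LOC, DIM))
    counters = np.zeros((N_LOC, DIM), dtype=int)

    def activated(addr):
        # All hard locations within Hamming distance RADIUS of the address
        return np.where((addresses != addr).sum(axis=1) <= RADIUS)[0]

    def write(addr, data):
        # Increment counters for 1-bits, decrement for 0-bits, at every activated location
        counters[activated(addr)] += 2 * data - 1

    def read(addr):
        # Pool counters of activated locations and threshold at zero
        return (counters[activated(addr)].sum(axis=0) > 0).astype(int)

    pattern = rng.integers(0, 2, DIM)
    write(pattern, pattern)                  # autoassociative storage
    noisy = pattern.copy()
    noisy[:20] ^= 1                          # corrupt 20 of 256 bits
    print((read(noisy) == pattern).mean())   # fraction of bits recovered

The distributed write and pooled read are what give the memory its pattern-computer character: recall degrades gracefully with address noise instead of failing on an exact-match lookup.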
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley W.
2009-01-01
Supporting the Aeronautics Research Mission Directorate guidelines, the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center is developing a multidisciplinary design, analysis, and optimization (MDAO) tool. This tool will leverage existing tools and practices and allow the easy integration and adoption of new state-of-the-art software. The design of modern aircraft at transonic speeds is a challenging task due to the computation time required for unsteady aeroelastic analysis using a computational fluid dynamics (CFD) code. Design approaches in this speed regime are mainly based on manual trial and error. Because of the time required for unsteady CFD computations in the time domain, the whole design process is considerably slowed down, as these analyses are usually performed repeatedly to optimize the final design. As a result, there is considerable motivation to be able to perform aeroelastic calculations more quickly and inexpensively. This paper describes the development of an unsteady transonic aeroelastic design methodology for design optimization using a reduced modeling method and unsteady aerodynamic approximation. The method requires that the unsteady transonic aerodynamics be represented in the frequency or Laplace domain. A dynamically linear assumption is used for creating aerodynamic influence coefficient (AIC) matrices in the transonic speed regime. Unsteady CFD computations are needed only for the important columns of an AIC matrix, which correspond to the primary flutter modes. Order reduction techniques, such as Guyan reduction and the improved reduction system, are used to reduce the size of the problem, and transonic flutter can then be found by classic methods such as rational function approximation, p-k, p, and root-locus. Such a methodology could be incorporated into an MDAO tool for design optimization at a reasonable computational cost. The proposed technique is verified using the Aerostructures Test Wing 2, which was designed, built, and tested at NASA Dryden Flight Research Center. The results from the full-order model and the approximate reduced-order model are analyzed and compared.
Real-time dynamic display of registered 4D cardiac MR and ultrasound images using a GPU
NASA Astrophysics Data System (ADS)
Zhang, Q.; Huang, X.; Eagleson, R.; Guiraudon, G.; Peters, T. M.
2007-03-01
In minimally invasive image-guided surgical interventions, different imaging modalities, such as magnetic resonance imaging (MRI), computed tomography (CT), and real-time three-dimensional (3D) ultrasound (US), can provide complementary, multi-spectral image information. Multimodality dynamic image registration is a well-established approach that permits real-time diagnostic information to be enhanced by placing lower-quality real-time images within a high quality anatomical context. For the guidance of cardiac procedures, it would be valuable to register dynamic MRI or CT with intraoperative US. However, in practice, either the high computational cost prohibits such real-time visualization of volumetric multimodal images in a real-world medical environment, or else the resulting image quality is not satisfactory for accurate guidance during the intervention. Modern graphics processing units (GPUs) provide the programmability, parallelism and increased computational precision to begin to address this problem. In this work, we first outline our research on dynamic 3D cardiac MR and US image acquisition, real-time dual-modality registration and US tracking. Then we describe image processing and optimization techniques for 4D (3D + time) cardiac image real-time rendering. We also present our multimodality 4D medical image visualization engine, which directly runs on a GPU in real-time by exploiting the advantages of the graphics hardware. In addition, techniques such as multiple transfer functions for different imaging modalities, dynamic texture binding, advanced texture sampling and multimodality image compositing are employed to facilitate the real-time display and manipulation of the registered dual-modality dynamic 3D MR and US cardiac datasets.
Sun, Chao; Feng, Wenquan; Du, Songlin
2018-01-01
As multipath is one of the dominating error sources for high accuracy Global Navigation Satellite System (GNSS) applications, multipath mitigation approaches are employed to minimize this hazardous error in receivers. Binary offset carrier (BOC) modulation, a modernized signal structure, is adopted to achieve significant performance enhancement. However, because of its multi-peak autocorrelation function, conventional multipath mitigation techniques for binary phase shift keying (BPSK) signals are not optimal. Currently, non-parametric and parametric approaches have been studied specifically for multipath mitigation in BOC signals. Non-parametric techniques, such as Code Correlation Reference Waveforms (CCRW), usually offer good feasibility with simple structures but suffer from low universal applicability across different BOC signals. Parametric approaches can thoroughly eliminate multipath error by estimating multipath parameters; the problems with this category lie in high computational complexity and vulnerability to noise. To tackle the problem, we present a practical parametric multipath estimation method in the frequency domain for BOC signals. The received signal is transferred to the frequency domain to separate out the multipath channel transfer function for multipath parameter estimation. During this process, we apply segmentation and averaging to reduce both the noise effect and the computational load. The performance of the proposed method is evaluated and compared with previous work in three scenarios. Results indicate that the proposed averaging-Fast Fourier Transform (averaging-FFT) method achieves good robustness in severe multipath environments with lower computational load for both low-order and high-order BOC signals. PMID:29495589
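A hedged sketch of the general segment-and-average idea in Python/NumPy appears below; it illustrates averaging segment-wise frequency-domain channel estimates to suppress noise, not the authors' exact estimator. The deconvolution form and the regularization constant are assumptions.

    import numpy as np

    def estimate_channel(rx, ref, n_seg):
        # rx: received samples; ref: local replica of the transmitted code.
        # Segment both signals, estimate the channel transfer function per
        # segment, and average the estimates to reduce the noise variance.
        L = len(rx) // n_seg
        acc = np.zeros(L, dtype=complex)
        for k in range(n_seg):
            R = np.fft.fft(rx[k*L:(k+1)*L])     # received-segment spectrum
            S = np.fft.fft(ref[k*L:(k+1)*L])    # replica-segment spectrum
            # Regularized deconvolution R/S (epsilon avoids division by ~0 bins)
            acc += R * np.conj(S) / (np.abs(S)**2 + 1e-12)
        return acc / n_seg                       # averaged transfer function

The inverse FFT of the averaged transfer function approximates the channel impulse response, whose peaks indicate the delays and amplitudes of the line-of-sight and multipath components; working segment-by-segment is also what keeps the per-FFT size, and hence the computational load, low.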
NASA Astrophysics Data System (ADS)
Sheikholeslami, R.; Hosseini, N.; Razavi, S.
2016-12-01
Modern earth and environmental models are usually characterized by a large parameter space and high computational cost. These two features prevent effective implementation of sampling-based analyses such as sensitivity and uncertainty analysis, which require running these computationally expensive models many times to adequately explore the parameter/problem space. Therefore, developing efficient sampling techniques that scale with the size of the problem, the computational budget, and users' needs is essential. In this presentation, we propose an efficient sequential sampling strategy, called Progressive Latin Hypercube Sampling (PLHS), which provides increasingly improved coverage of the parameter space while satisfying pre-defined requirements. The original Latin hypercube sampling (LHS) approach generates the entire sample set in one stage; in contrast, PLHS generates a series of smaller sub-sets (also called 'slices') such that: (1) each sub-set is a Latin hypercube and achieves maximum stratification in any one-dimensional projection; (2) the progressive addition of sub-sets remains a Latin hypercube; and thus (3) the entire sample set is a Latin hypercube. It therefore preserves the intended sampling properties throughout the sampling procedure. PLHS is deemed advantageous over existing methods, particularly because it nearly avoids over- or under-sampling. Through different case studies, we show that PLHS has multiple advantages over one-stage sampling approaches, including improved convergence and stability of the analysis results with fewer model runs. In addition, PLHS can help to minimize the total simulation time by running only the simulations necessary to achieve the desired level of quality (e.g., accuracy and convergence rate).
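The slice-wise PLHS construction itself is beyond a short sketch, but the Latin hypercube property that every sub-set (and their union) must satisfy is easy to demonstrate. Below is a minimal one-stage LHS in Python/NumPy together with a check of maximum one-dimensional stratification; PLHS, as described above, adds successive slices while preserving exactly this property.

    import numpy as np

    def lhs(n, d, rng):
        # One-stage Latin hypercube: each 1-D projection hits each of
        # the n equal-probability bins exactly once.
        u = rng.random((n, d))                        # jitter within each bin
        perms = np.argsort(rng.random((n, d)), axis=0)  # random permutation per column
        return (perms + u) / n

    rng = np.random.default_rng(1)
    X = lhs(8, 2, rng)

    # Verify maximum 1-D stratification: every one of the 8 bins occupied once
    print(np.sort((X[:, 0] * 8).astype(int)))   # -> [0 1 2 3 4 5 6 7]

The difficulty PLHS solves is that naively concatenating two independent Latin hypercubes of this kind does not yield a larger Latin hypercube; the slices must be constructed jointly so the check above keeps passing as the sample grows.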
Technology and the Modern Library.
ERIC Educational Resources Information Center
Boss, Richard W.
1984-01-01
Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…
The Effect of Systematic Error in Forced Oscillation Testing
NASA Technical Reports Server (NTRS)
Williams, Brianne Y.; Landman, Drew; Flory, Isaac L., IV; Murphy, Patrick C.
2012-01-01
One of the fundamental problems in flight dynamics is the formulation of aerodynamic forces and moments acting on an aircraft in arbitrary motion. Classically, conventional stability derivatives are used for the representation of aerodynamic loads in the aircraft equations of motion. However, for modern aircraft with highly nonlinear and unsteady aerodynamic characteristics undergoing maneuvers at high angle of attack and/or angular rates the conventional stability derivative model is no longer valid. Attempts to formulate aerodynamic model equations with unsteady terms are based on several different wind tunnel techniques: for example, captive, wind tunnel single degree-of-freedom, and wind tunnel free-flying techniques. One of the most common techniques is forced oscillation testing. However, the forced oscillation testing method does not address the systematic and systematic correlation errors from the test apparatus that cause inconsistencies in the measured oscillatory stability derivatives. The primary objective of this study is to identify the possible sources and magnitude of systematic error in representative dynamic test apparatuses. Sensitivities of the longitudinal stability derivatives to systematic errors are computed, using a high fidelity simulation of a forced oscillation test rig, and assessed using both Design of Experiments and Monte Carlo methods.
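For context, the oscillatory stability derivatives discussed here are commonly extracted from a model of roughly the following textbook form (stated schematically; not necessarily the exact model used in this study). For a forced pitch oscillation $\alpha(t) = \alpha_0 + \alpha_A \sin\omega t$, the measured pitching-moment coefficient is written as

    C_m(t) = C_{m_0} + C_{m_\alpha}\,\alpha_A \sin\omega t
             + \left(C_{m_q} + C_{m_{\dot\alpha}}\right)\frac{\omega \bar{c}}{2V}\,\alpha_A \cos\omega t,

so the in-phase component of the response yields the static derivative $C_{m_\alpha}$ and the out-of-phase component yields the damping sum $C_{m_q} + C_{m_{\dot\alpha}}$; systematic errors in the test apparatus bias both components and hence the extracted derivatives.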
Brain tumor classification using AFM in combination with data mining techniques.
Huml, Marlene; Silye, René; Zauner, Gerald; Hutterer, Stephan; Schilcher, Kurt
2013-01-01
Although classification of astrocytic tumors is standardized by the WHO grading system, which is mainly based on microscopy-derived, histomorphological features, there is great interobserver variability. The main causes are thought to be the complexity of morphological details varying from tumor to tumor and from patient to patient, variations in technical histopathological procedures such as staining protocols, and finally the individual experience of the diagnosing pathologist. Thus, to raise astrocytoma grading to a more objective standard, this paper proposes a methodology based on atomic force microscopy (AFM) derived images made from histopathological samples in combination with data mining techniques. By comparing AFM images with corresponding light microscopy images of the same area, the progressive formation of cavities due to cell necrosis was identified as a typical morphological marker for a computer-assisted analysis. Using genetic programming as a tool for feature analysis, a best model was created that achieved 94.74% classification accuracy in distinguishing grade II tumors from grade IV ones. By utilizing modern image analysis techniques, AFM may become an important tool in astrocytic tumor diagnosis. In this way, patients suffering from grade II tumors, which carry a lower risk of malignant transformation, can be identified unambiguously and would benefit from early adjuvant therapies.
Bolliger, Stephan A; Thali, Michael J; Ross, Steffen; Buck, Ursula; Naether, Silvio; Vock, Peter
2008-02-01
The transdisciplinary research project Virtopsy is dedicated to implementing modern imaging techniques into forensic medicine and pathology in order to augment current examination techniques or even to offer alternative methods. Our project relies on three pillars: three-dimensional (3D) surface scanning for the documentation of body surfaces, and both multislice computed tomography (MSCT) and magnetic resonance imaging (MRI) to visualise the internal body. Three-dimensional surface scanning has delivered remarkable results in the past in the 3D documentation of patterned injuries and of objects of forensic interest as well as whole crime scenes. Imaging of the interior of corpses is performed using MSCT and/or MRI. MRI, in addition, is also well suited to the examination of surviving victims of assault, especially choking, and helps visualise internal injuries not seen at external examination of the victim. Apart from the accuracy and three-dimensionality that conventional documentations lack, these techniques allow for the re-examination of the corpse and the crime scene even decades later, after burial of the corpse and liberation of the crime scene. We believe that this virtual, non-invasive or minimally invasive approach will improve forensic medicine in the near future.
Thompson, Randall C; Allam, Adel H; Zink, Albert; Wann, L Samuel; Lombardi, Guido P; Cox, Samantha L; Frohlich, Bruno; Sutherland, M Linda; Sutherland, James D; Frohlich, Thomas C; King, Samantha I; Miyamoto, Michael I; Monge, Janet M; Valladolid, Clide M; El-Halim Nur El-Din, Abd; Narula, Jagat; Thompson, Adam M; Finch, Caleb E; Thomas, Gregory S
2014-06-01
Although atherosclerosis is widely thought to be a disease of modernity, computed tomographic evidence of atherosclerosis has been found in the bodies of a large number of mummies. This article reviews the findings of atherosclerotic calcifications in the remains of ancient people: humans who lived across a very wide span of human history and over most of the inhabited globe. These people had a wide range of diets and lifestyles, and traditional modern risk factors do not thoroughly explain the presence and easy detectability of this disease. Nontraditional risk factors such as the inhalation of cooking fire smoke and chronic infection or inflammation might have been important atherogenic factors in ancient times. Study of the genetic and environmental risk factors for atherosclerosis in ancient people may offer insights into this common modern disease. Copyright © 2014 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Fairlie, Robert W.; Robinson, Jonathan
2013-01-01
Computers are an important part of modern education, yet large segments of the population--especially low-income and minority children--lack access to a computer at home. Does this impede educational achievement? We test this hypothesis by conducting the largest-ever field experiment involving the random provision of free computers for home use to…
A Computer Model for Red Blood Cell Chemistry
1996-10-01
There is a growing need for interactive computational tools for medical education and research. The most exciting paradigm for interactive education is simulation. Fluid Mod is a simulation-based computational tool developed in the late sixties and early seventies at ... to a modern Windows, object-oriented interface. This development will provide students with a useful computational tool for learning. More important...
Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators
Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew
2014-01-01
Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as the Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved a substantial speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation ran faster on GPUs than both the sequential implementation and a parallelized OpenMP implementation. An OpenMP implementation on the Intel MIC coprocessor provided speedups with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
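As a hedged illustration of the kind of kernel being parallelized (not the authors' cardiac model), the Python/NumPy sketch below advances an explicit finite-difference update for a simple 2D wave equation; the grid size, initial condition, and coefficients are arbitrary assumptions. The stencil update is the hot spot that OpenACC pragmas, OpenCL kernels, or OpenMP threads would offload.

    import numpy as np

    nx = ny = 256
    c, dx, dt = 1.0, 1.0, 0.5        # wave speed, grid step, time step (CFL-stable)
    u_prev = np.zeros((nx, ny))
    u = np.zeros((nx, ny))
    u[nx // 2, ny // 2] = 1.0        # point excitation at the grid center

    r2 = (c * dt / dx) ** 2
    for step in range(500):
        # Five-point Laplacian over the interior: this loop nest over grid
        # points is what an accelerator version parallelizes
        lap = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
               - 4.0 * u[1:-1, 1:-1])
        u_next = u.copy()
        u_next[1:-1, 1:-1] = (2.0 * u[1:-1, 1:-1] - u_prev[1:-1, 1:-1]
                              + r2 * lap)
        u_prev, u = u, u_next        # rotate time levels

Every grid point in the stencil update is independent within a time step, which is why a directive-based tool like OpenACC can parallelize it from a few annotations on the loop nest.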
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
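To illustrate the operator-overloading approach discussed above (forward mode, for brevity; the adjoint corresponds to reverse mode, and source transformation instead generates new derivative code from the original source), here is a toy dual-number class in Python. It is a pedagogical sketch, not the API of any of the tools compared.

    class Dual:
        # Each value carries its derivative; arithmetic propagates both,
        # which is the essence of operator-overloading AD.
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__

        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            # Product rule applied at every multiplication
            return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x + 1      # f'(x) = 6x + 2

    x = Dual(2.0, 1.0)                    # seed the derivative dx/dx = 1
    y = f(x)
    print(y.val, y.dot)                   # 17.0 14.0

The trade-off the study quantifies follows from this picture: overloading intercepts every operation at run time, which handles modern language features naturally but costs orders of magnitude in speed compared with derivative code generated once by source transformation.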
DOE Office of Scientific and Technical Information (OSTI.GOV)
Illidge, Tim, E-mail: Tim.Illidge@ics.manchester.ac.uk; Specht, Lena; Yahalom, Joachim
2014-05-01
Radiation therapy (RT) is the most effective single modality for local control of non-Hodgkin lymphoma (NHL) and is an important component of therapy for many patients. Many of the historic concepts of dose and volume have recently been challenged by the advent of modern imaging and RT planning tools. The International Lymphoma Radiation Oncology Group (ILROG) has developed these guidelines after multinational meetings and analysis of available evidence. The guidelines represent an agreed consensus view of the ILROG steering committee on the use of RT in NHL in the modern era. The roles of reduced volume and reduced doses are addressed, integrating modern imaging with 3-dimensional planning and advanced techniques of RT delivery. In the modern era, in which combined-modality treatment with systemic therapy is appropriate, the previously applied extended-field and involved-field RT techniques that targeted nodal regions have now been replaced by limiting the RT to smaller volumes based solely on detectable nodal involvement at presentation. A new concept, involved-site RT, defines the clinical target volume. For indolent NHL, often treated with RT alone, larger fields should be considered. Newer treatment techniques, including intensity modulated RT, breath holding, image guided RT, and 4-dimensional imaging, should be implemented, and their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control.
ExpoCast: Exposure Science for Prioritization and Toxicity Testing (T)
The US EPA National Center for Computational Toxicology (NCCT) has a mission to integrate modern computing and information technology with molecular biology to improve Agency prioritization of data requirements and risk assessment of chemicals. Recognizing the critical need for ...
Mathematics in science progress report, June 1, 1973-May 31, 1974
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bellman, R.
1974-01-01
The purpose of the mathematical biosciences group is to use the conceptual, analytic, and computational methods of modern mathematics to treat biomedical and environmental problems. We are employing a systems approach to research, beginning with experiment and medical practice at one end of the scale, continuing through the intermediary of mathematical models and computer techniques, and culminating in clinical applications developed while working closely with teams of doctors. We are pursuing the application of biostatistical methods to a number of medical questions, as well as a thoroughgoing use of operations research and systems analysis in hospital practice. The overall objective is to make the Medicare program operational and effective, as well as cheap. Pattern recognition and other aspects of artificial intelligence are important here for patient screening. Major efforts are devoted to nuclear medicine, radiotherapy, and neurophysiology using the mathematical theory of control and decision processes (dynamic programming and invariant imbedding). Important savings have been made in the time required for tumor scanning using techniques of nuclear medicine. Major mathematical breakthroughs have been made in the treatment of large-scale systems and parameter identification processes. In the field of mental health, we have developed and extended computerized simulation processes using graphics, which are versatile tools for research and training in human interaction processes, particularly in the initial psychotherapy interview.
The Jinn and the Computer. Consumption and Identity in Arabic Children's Magazines
ERIC Educational Resources Information Center
Peterson, Mark Allen
2005-01-01
One of the fundamental problems facing middle-class Egyptian parents is the problem of how to ensure that their children are simultaneously modern and Egyptian. Arabic children's magazines offer a window into the processes by which consumption links childhood and modernity in the social imaginations of children and their parents as they construct…
Adaptationism and intuitions about modern criminal justice.
Petersen, Michael Bang
2013-02-01
Research indicates that individuals have incoherent intuitions about particular features of the criminal justice system. This could be seen as an argument against the existence of adapted computational systems for counter-exploitation. Here, I outline how the model developed by McCullough et al. readily predicts the production of conflicting intuitions in the context of modern criminal justice issues.
Qu, Hai-bin; Cheng, Yi-yu; Wang, Yue-sheng
2003-10-01
Based on a review of engineering problems in developing a modern production industry for Traditional Chinese Medicine (TCM), the differences between the TCM production industries in China and abroad are pointed out. Accelerating the application and extension of high technology and computer-integrated manufacturing systems (CIMS) is suggested as a way to promote the technological advancement of the TCM industry.
Topography of Cells Revealed by Variable-Angle Total Internal Reflection Fluorescence Microscopy.
Cardoso Dos Santos, Marcelina; Déturche, Régis; Vézy, Cyrille; Jaffiol, Rodolphe
2016-09-20
We propose an improved version of variable-angle total internal reflection fluorescence microscopy (vaTIRFM) adapted to modern TIRF setups. This technique involves recording a stack of TIRF images while gradually increasing the incident angle of the light beam on the sample. A comprehensive theory was developed to extract the membrane/substrate separation distance from fluorescently labeled cell membranes. A straightforward image-processing procedure was then established to compute the topography of cells with a nanometric axial resolution, typically 10-20 nm. To highlight the new opportunities offered by vaTIRFM to quantify the adhesion process of motile cells, the adhesion of MDA-MB-231 cancer cells on glass substrates coated with fibronectin was examined. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
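For reference, vaTIRFM distance mapping rests on the standard evanescent-field relations (stated here schematically; the paper's comprehensive theory refines them):

    I(z, \theta) = I_0(\theta)\, e^{-z/d(\theta)}, \qquad
    d(\theta) = \frac{\lambda}{4\pi}\left(n_1^2 \sin^2\theta - n_2^2\right)^{-1/2},

where $z$ is the fluorophore height above the substrate, $\lambda$ the excitation wavelength, and $n_1$, $n_2$ the refractive indices of the substrate and the medium. Recording the intensity $I$ at several incident angles $\theta$ beyond the critical angle allows $z$ to be fitted pixel by pixel, which is what yields the nanometric cell topography.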
Diamond Thin-Film Thermionic Generator
NASA Astrophysics Data System (ADS)
Clewell, J. M.; Ordonez, C. A.; Perez, J. M.
1997-03-01
Since the eighteen-hundreds scientists have sought to develop the highest thermal efficiency in heat engines such as thermionic generators. Modern research in the emerging diamond film industry has indicated the work functions of diamond thin-films can be much less than one electron volt, compelling fresh investigation into their capacity as thermionic generators and inviting new methodology for determining that efficiency. Our objective is to predict the efficiency of a low-work-function, degenerate semiconductor (diamond film) thermionic generator operated as a heat engine between two constant-temperature thermal reservoirs. Our presentation will focus on a theoretical model which predicts the efficiency of the system by employing a Monte Carlo computational technique from which we report results for the thermal efficiency and the thermionic current densities of diamond thin-films.
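The connection between work function and emitted current invoked here is captured by the standard Richardson-Dushman relation (quoted as background; it is not the Monte Carlo model itself):

    J = A\, T^2 \exp\!\left(-\frac{\phi}{k_B T}\right), \qquad
    A \approx 120\ \mathrm{A\,cm^{-2}\,K^{-2}},

so a sub-electron-volt work function $\phi$, as reported for diamond thin films, yields usable emission current densities $J$ at far lower emitter temperatures $T$ than conventional metallic emitters, which is what motivates re-examining their efficiency as heat engines.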
Echocardiography in the Era of Multimodality Cardiovascular Imaging
Shah, Benoy Nalin
2013-01-01
Echocardiography remains the most frequently performed cardiac imaging investigation and is an invaluable tool for detailed and accurate evaluation of cardiac structure and function. Echocardiography, nuclear cardiology, cardiac magnetic resonance imaging, and cardiovascular-computed tomography comprise the subspeciality of cardiovascular imaging, and these techniques are often used together for a multimodality, comprehensive assessment of a number of cardiac diseases. This paper provides the general cardiologist and physician with an overview of state-of-the-art modern echocardiography, summarising established indications as well as highlighting advances in stress echocardiography, three-dimensional echocardiography, deformation imaging, and contrast echocardiography. Strengths and limitations of echocardiography are discussed as well as the growing role of real-time three-dimensional echocardiography in the guidance of structural heart interventions in the cardiac catheter laboratory. PMID:23878804
NASA Astrophysics Data System (ADS)
Pan, Jun-Yang; Xie, Yi
2015-02-01
With tremendous advances in modern techniques, Einstein's general relativity has become an inevitable part of deep space missions. We investigate the relativistic algorithm for time transfer between the proper time τ of the onboard clock and the Geocentric Coordinate Time, which extends some previous works by including the effects of propagation of electromagnetic signals. In order to evaluate the implicit algebraic equations and integrals in the model, we take an analytic approach to work out their approximate values. This analytic model might be used in an onboard computer because of its limited capability to perform calculations. Taking an orbiter like Yinghuo-1 as an example, we find that the contributions of the Sun, the ground station and the spacecraft dominate the outcomes of the relativistic corrections to the model.
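Schematically, such time-transfer algorithms rest on the leading-order relation between the proper time $\tau$ of the onboard clock and the geocentric coordinate time $t$ (written here in textbook form; the paper's model adds, among other things, the signal-propagation terms):

    \frac{d\tau}{dt} = 1 - \frac{1}{c^2}\left(U(\mathbf{x}) + \frac{v^2}{2}\right) + \mathcal{O}(c^{-4}),

where $U$ is the gravitational potential at the clock's position $\mathbf{x}$ and $v$ its coordinate velocity. The analytic approach described replaces the integral of this rate along the orbit by closed-form approximations cheap enough for a computationally limited onboard computer.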
A guide to missing data for the pediatric nephrologist.
Larkins, Nicholas G; Craig, Jonathan C; Teixeira-Pinto, Armando
2018-03-13
Missing data is an important and common source of bias in clinical research. Readers should be alert to and consider the impact of missing data when reading studies. Beyond preventing missing data in the first place, through good study design and conduct, there are different strategies available to handle data containing missing observations. Complete case analysis is often biased unless data are missing completely at random. Better methods of handling missing data include multiple imputation and models using likelihood-based estimation. With advancing computing power and modern statistical software, these methods are within the reach of clinician-researchers under guidance of a biostatistician. As clinicians reading papers, we need to continue to update our understanding of statistical methods, so that we understand the limitations of these techniques and can critically interpret literature.
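As a minimal sketch of multiple imputation (using scikit-learn's IterativeImputer in Python; the data, the missingness mechanism, and the pooled quantity are invented for illustration):

    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    X[:, 2] += X[:, 0]                       # a correlated column informs imputation
    X[rng.random(200) < 0.2, 2] = np.nan     # 20% of one column missing

    # Multiple imputation: draw several completed datasets, analyze each,
    # then pool the per-dataset estimates
    estimates = []
    for m in range(5):
        imp = IterativeImputer(sample_posterior=True, random_state=m)
        X_complete = imp.fit_transform(X)
        estimates.append(X_complete[:, 2].mean())
    print(np.mean(estimates))                # pooled point estimate

In practice, Rubin's rules also pool the within- and between-imputation variances so that standard errors honestly reflect the extra uncertainty introduced by the missing observations, which a single-imputation or complete-case analysis understates.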
Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool
NASA Technical Reports Server (NTRS)
Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark
2011-01-01
A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) was developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to follow guidance commands as closely as possible while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with the required design specifications. "Tower clearance" and "load relief" designs have been achieved for the liftoff and maximum dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in frequency-domain Monte Carlo analysis using ASAT.
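A heavily simplified sketch of the constrained-optimization idea is given below (Python/SciPy). The cost and constraint surrogates are invented stand-ins; a real tool such as ASAT evaluates frequency-domain stability margins and load responses of flex-body models instead.

    import numpy as np
    from scipy.optimize import minimize

    def tracking_cost(k):
        # Hypothetical surrogate: penalize sluggish command following
        # (small proportional gain) and excessive damping effort
        kp, kd = k
        return 1.0 / (kp + 1e-6) + 0.1 * kd**2

    def stability_margin(k):
        # Hypothetical surrogate for a stability-margin requirement (>= 0
        # when satisfied); a real design would compute gain/phase margins
        kp, kd = k
        return kd * kp - 0.5

    res = minimize(tracking_cost, x0=[1.0, 1.0],
                   constraints=[{'type': 'ineq', 'fun': stability_margin}],
                   bounds=[(0.01, 10.0), (0.01, 10.0)])
    print(res.x)   # gains that minimize cost subject to the margin constraint

The structure mirrors the approach in the abstract: performance objectives (command following, load relief) enter the cost, while stability requirements enter as constraints, so multiple control designs can be tuned simultaneously against explicit specs.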
An Advanced simulation Code for Modeling Inductive Output Tubes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thuc Bui; R. Lawrence Ives
2012-04-27
During the Phase I program, CCR completed several major building blocks for a 3D, large-signal, inductive output tube (IOT) code using modern computer languages and programming techniques. These included a 3D, Helmholtz, time-harmonic field solver with a fully functional graphical user interface (GUI), automeshing, and adaptivity. Other building blocks included an improved electrostatic Poisson solver with temporal boundary conditions to provide temporal fields for the time-stepping particle pusher, as well as the self electric field caused by time-varying space charge. The magnetostatic field solver was also updated to solve for the self magnetic field caused by time-changing current density in the output cavity gap. The goal function to optimize an IOT cavity was also formulated, and the optimization methodologies were investigated.
Injuries in students of three different dance techniques.
Echegoyen, Soledad; Acuña, Eugenia; Rodríguez, Cristina
2010-06-01
As with any athlete, the dancer has a high risk for injury. Most studies carried out relate to classical and modern dance; however, there is a lack of reports on injuries involving other dance techniques. This study is an attempt to determine the differences in the incidence, the exposure-related rates, and the kind of injuries in three different dance techniques. A prospective study about dance injuries was carried out between 2004 and 2007 on students of modern, Mexican folkloric, and Spanish dance at the Escuela Nacional de Danza. A total of 1,168 injuries were registered in 444 students; the injury rate was 4 injuries/student for modern dance and 2 injuries/student for Mexican folkloric and Spanish dance. The rate per training hours was 4 for modern, 1.8 for Mexican folkloric, and 1.5 injuries/1,000 hr of training for Spanish dance. The lower extremity is the most frequent structure injured (70.47%), and overuse injuries comprised 29% of the total. The most frequent injuries were strain, sprain, back pain, and patellofemoral pain. This study has a consistent medical diagnosis of the injuries and is the first attempt in Mexico to compare the incidence of injuries in different dance techniques. To decrease the frequency of student injury, it is important to incorporate prevention programs into dance program curricula. More studies are necessary to define causes and mechanisms of injury, as well as an analysis of training methodology, to decrease the incidence of the muscle imbalances resulting in injury.
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.